8 comments

  • cpgxiii 25 minutes ago
    > The hardware is built around a stackable 10×10cm compute module with two ARM Cortex-A55 SBCs — one for ROS 2 navigation/EKF localisation, one dedicated to vision/YOLO inference — connected via a single ethernet cable.

    I will preface this by saying that I have nothing against ARM per se, that my employer/team supported a good chunk of the work for making ROS 2 actually work on arm64, and that there is some good hardware out there.

    I really don't understand why startups and research projects keep using weird ARM SBCs for their robots. The best of these SBCs is still vastly shittier in terms of software support and stability than any random Chinese Intel ADL-N box. The only reasons to use ARM SBCs in robots are that either (1) you are using a Jetson for Jetson things (i.e. Nvidia libraries), or (2) you have a product which requires serious cost optimization to be produced at a large scale. Otherwise you are just committing yourselves and your users/customers to a future of terrible-to-nonexistent support and adding significantly to the amount of work you need to bring up the new system and port existing tools to it.

    • schaefer 12 minutes ago
      > The only reasons to use ARM SBCs in robots are...

      Obviously, anyone can have their own opinion on this. I work in robotics, and we are quite happy with our A53 and M4. Though we use a SOM, not an SBC, if you feel like splitting hairs.

    • Sabrees 22 minutes ago
      If you can send me an open-hardware Intel box or Jetson, I'd happily use it.

      Part of the point of this for me is to see what's possible with open hardware (down to the chip level, at least).

  • sgillen 56 minutes ago
    Very cool! Shameless self-promotion, but check out greenwave-monitor[1] for the 'Diagnostics TUI'. I'll get it into the buildfarm soon.

    [1] https://github.com/NVIDIA-ISAAC-ROS/greenwave_monitor

    • Sabrees 49 minutes ago
      Nice, thanks! Looks like a good one.
  • jvanderbot 1 hour ago
    What's your payload? Where are the seeds? How are they deposited?

    Recommend going to a farm right now to see how this works in production. For the most part, you can autonomously sow using GPS. But the farmer just rides along.

  • dylan604 2 hours ago
    From a video somewhere in the page: "The aim is to make food production more sustainable and efficient" — yet it requires a web app. I'd hope that you can run the server side on a local machine and not require cloud connectivity.
    • Sabrees 2 hours ago
      The web app runs locally from the robot. No cloud. Once we reach autonomy (still some way away) you shouldn't have to use that much either.
  • MoonWalk 2 hours ago
    Great name, if nothing else!
  • dheera 2 hours ago
    I highly encourage you to go visit farms sooner rather than later, especially during the rainy seasons and winter when farmers are really at work preparing for the next season. The kind of conditions robots need to deal with in that environment is no joke.

    I also notice you're using the BNO055 -- if you need a C++ I2C ROS driver for it, I wrote one (https://github.com/dheera/ros-imu-bno055). I think the one in the ROS apt-get repository is written in Python, but they claimed the package name before I did.
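
    (Aside for anyone wiring up a BNO055 by hand: per the Bosch datasheet, the fused quaternion registers hold signed 16-bit little-endian values scaled so that 1.0 == 2^14 LSB. This is a minimal sketch of just that conversion, independent of any particular driver — register addresses and I2C plumbing omitted:)

    ```cpp
    #include <cassert>
    #include <cstdint>
    #include <cstdio>

    // BNO055 quaternion components: signed 16-bit, LSB-first,
    // scaled so that 1.0 quaternion unit == 2^14 LSB (Bosch datasheet).
    constexpr double kQuatScale = 1.0 / (1 << 14);

    // Assemble a signed 16-bit value from two register bytes (LSB first).
    int16_t toInt16(uint8_t lsb, uint8_t msb) {
        return static_cast<int16_t>((msb << 8) | lsb);
    }

    // Convert one raw quaternion component to a double in [-2, 2).
    double quatToDouble(uint8_t lsb, uint8_t msb) {
        return toInt16(lsb, msb) * kQuatScale;
    }

    int main() {
        // Raw bytes 0x00 0x40 -> 0x4000 == 16384 LSB == 1.0
        assert(quatToDouble(0x00, 0x40) == 1.0);
        // Raw bytes 0x00 0xC0 -> -16384 LSB == -1.0
        assert(quatToDouble(0x00, 0xC0) == -1.0);
        printf("w = %.4f\n", quatToDouble(0x00, 0x40));
        return 0;
    }
    ```

    Getting that sign extension and byte order right is most of the fiddly part; the rest is standard Linux i2c-dev reads.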

  • abraae 2 hours ago
    This is the future, good luck to you