The goal of the Nomad Autonomous Robot Project is to build and program a robotic platform which can navigate and interact with its environment without intervention or direct control from human operators. The final robot will be able to:
- Perceive its surroundings
- Navigate to its objectives
- Avoid static and non-static obstacles in its path
- Perform predetermined tasks
In addition to these physical characteristics, the robot platform will:
- Provide a base platform from which to build additional robots
- Provide an understanding of mobile robotics
- Serve as an experimental platform for further studies in autonomous systems
- Be expandable to increase capabilities
The initial inspiration and intended implementation is to compete in the annual Autonomous Vehicle Challenge (AVC) put on by Sparkfun Electronics in the Denver, Colorado area. The intent is to compete in the 2016 competition. At this time the actual date of the competition has not been announced, but based on past events it is anticipated to be June 18th, 2016.
The Nomad autonomous robot project was originally taken on as a challenge by hobbyist and novice roboticist Jeff Cicolani. Jeff had always been interested in robotics but didn't pursue it until much later in life. In the summer of 2013 he joined The Robot Group in Austin, Texas, to find some like-minded individuals and get a little guidance on this new path.
His goal was to build a fully autonomous photography robot to hire out for events. Not knowing all of the difficulties in autonomous robotics, he began building 'bots. In the fall of 2014 he had the opportunity to write a product review of the Nomad robot chassis, part of the Actobotics line of parts and kits from Servocity.com. At the tail end of the review he decided this might be a great platform to begin his research into autonomous robotics. The article appeared in the March 2015 issue of Servo Magazine, and the project was displayed at SXSW Create 2015 as part of The Robot Group's booth. But it needed a name. Having little time and even less imagination, Jeff simply named the project after the chassis from which it originated: Nomad.
Having built the initial frame and named the robot, Jeff decided that the target date for the completion of the robot, or at least its debut, would be the Autonomous Vehicle Challenge (AVC) put on by Sparkfun Electronics every year. However, it was a little late in the season to enter the 2015 competition, so he targeted 2016.
Originally Nomad was going to use an old Samsung Galaxy S3 phone running the Android operating system as the primary processor. The plan was to use the phone's onboard sensors to control the robot through an Arduino-based controller: the Arduino device would be developed as an accessory to the phone, connected via USB, and would in turn drive a motor controller. The contest, however, awards more points for not using GPS navigation. Since the GPS could not be removed from the phone, and it would be difficult to prove it was disabled or unused during the contest, this plan was abandoned.
An Intel Edison was then selected as the processor. The carrier board was, supposedly, Arduino-compatible, so a sensor shield could be connected directly to it, eliminating the need for a separate Arduino board. On top of that, the Roboclaw motor controller from Ion Motion Control could be driven over USB, further distributing the processing. The next step was to determine the control software.
It was at this point that Jeff met and recruited Bill Tyler, an embedded systems programmer, and Jarred Jasper, a systems engineer. With their help he was able to get ROS installed on the Edison and begin programming Nomad proper. Unfortunately, they ran into challenges early on. To demo Nomad during development, they wanted to be able to drive it with a simple game controller, similar to an Xbox controller. This would let them develop the code for the motor controller and other systems while driving the robot around. However, it turned out the drivers necessary to use a controller were not available for the Edison, and after several weeks of looking for a solution, the Edison was abandoned for the better-documented, and proven, Raspberry Pi.
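The joystick-to-motor mapping behind that kind of teleop can be sketched independently of any driver or ROS plumbing. Below is a minimal "arcade drive" mixer in Python; the function name and scaling are illustrative assumptions, not code from the Nomad project:

```python
def arcade_mix(throttle, steer):
    """Mix a gamepad's throttle (-1..1) and steer (-1..1) axes into
    left/right motor commands for a differential-drive robot.
    Hypothetical helper for illustration -- not the Nomad codebase."""
    left = throttle + steer
    right = throttle - steer
    # Normalize so neither wheel command exceeds full power (magnitude 1.0).
    biggest = max(1.0, abs(left), abs(right))
    return left / biggest, right / biggest

# Full forward, no steering: both wheels at full power.
print(arcade_mix(1.0, 0.0))   # -> (1.0, 1.0)
# No throttle, full right stick: spin in place.
print(arcade_mix(0.0, 1.0))   # -> (1.0, -1.0)
```

The normalization step is the important design choice: without it, pushing both axes at once would ask a wheel for more than 100% power and distort the turn.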
The Raspberry Pi worked wonderfully for the initial experiments, but it would not be powerful enough to handle computer vision and run the robot at the same time. A second Pi was added to the stack, this time a more powerful Pi 2 (the Pi 3 would not be released for a few more months). Processing would be distributed across both boards, a simple task with ROS, and the Pi 2 would be dedicated to vision processing. For the initial experimentation and development, an Xbox Kinect would be used for vision, since it was on hand and produces both a depth map, for 3D mapping, and an RGB image for object recognition. Over the following months Jeff worked on getting the Kinect running, but it never did: the ROS packages for the Kinect would not compile properly on the Pi 2. A new solution would need to be found.
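Distributing ROS nodes across two boards like this mostly comes down to pointing both machines at a single ROS master. A minimal sketch of the environment setup, assuming hypothetical hostnames `nomad-pi1` and `nomad-pi2` (the names are illustrative, not from the project):

```shell
# On the first Pi (runs the master plus the drive/control nodes):
export ROS_MASTER_URI=http://nomad-pi1:11311
export ROS_HOSTNAME=nomad-pi1
roscore &

# On the second Pi (dedicated to vision processing):
export ROS_MASTER_URI=http://nomad-pi1:11311   # same master, on pi1
export ROS_HOSTNAME=nomad-pi2
# Nodes launched here publish and subscribe across the network transparently;
# ROS topics do not care which machine a node runs on.
```

This is why the split was "a simple task with ROS": the publish/subscribe layer hides which board a node actually runs on.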
It was at SXSW Create 2016 that the solution would be solidified. In a fortuitous turn of events, The Robot Group was placed very near the booth of Nvidia, the makers of graphics processors. At their display they had an autonomous robot powered by the Nvidia Jetson TX1, using a Zed stereo camera from Stereo Labs for vision. The Jetson proved ample enough to process the stereo images into 3D maps and power a neural network. Jeff was sold. After SXSW he ordered a Jetson and a Zed camera for Nomad. The carrier board was significantly larger than the Pi stack and required a new enclosure, but that was a fairly simple issue to overcome.
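At its core, turning a pair of stereo images into a 3D map rests on triangulation: depth is inversely proportional to the disparity between the two views. A toy sketch of that relation in Python (the focal length and baseline numbers below are made up for illustration and are not the Zed's actual specifications):

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Classic pinhole stereo relation: Z = f * B / d.
    disparity_px: horizontal pixel shift of a feature between the two cameras
    focal_px:     focal length expressed in pixels
    baseline_m:   distance between the two camera centers, in meters"""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Illustrative numbers only: 700 px focal length, 12 cm baseline.
print(depth_from_disparity(42.0, 700.0, 0.12))  # -> 2.0 (meters)
```

Note the inverse relationship: nearby objects shift a lot between the two images (large disparity, small depth), while distant ones barely move, which is why stereo depth gets noisier with range.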
As of the launch of this site, and the writing of this article, the Jetson board has arrived and the Zed is on its way (they were a little back-ordered). This site is all about the updates from here.