
Field Robot 2014

The FRE2014 contest is over. If you want to see the blog about our preparation and the results with videos, please check the Czech version of this article (or use Google Translate). The English presentation is different; its content should be similar to what you will find in the FRE proceedings.

This is our answer to the organizers' request:
Dear Field Robot Event participants, we hope that you enjoyed the FRE 2014, we did our best to make it as successful as possible. We would like to leave an informative footprint of the event through the "FRE 2014 Proceedings", so that researchers, future FRE participants and the general public could have access to our experience allowing the development of better prototypes.

Eduro Team

1 Introduction

Eduro Team has participated in the Field Robot Event several times already ([1]), so for 2014 we planned to present something new and use FRE as a demo/show opportunity. We planned to have a walking robot (FireAnt) and/or a flying robot (Heidi, a Parrot AR Drone 2) do the standard FRE tasks 1, 2 and 3. But shortly before the contest we changed our mind due to the updated requirements: it was mandatory to carry a heavy (more than 1 kg) GPS receiver in the Professional Task, and the same robot had to be used for all of tasks 1, 2 and 3. Our fall-back was "Eduro Maxi HD", a platform proven in the 2010, 2012 and 2013 competitions. Nevertheless, there was still space left to present FireAnt and Heidi in tasks 4 and 5.

2 Mechanics

Eduro Maxi HD is a prototype three-wheel outdoor robot with a differential drive. It is a modular robotic platform for education, research and contests. It weighs about 15 kg and measures 38x60x56 cm. The HD version uses SMAC (Stepper Motor - Adaptive Control) drives with belt transmission. Power is supplied by two 12 V/8 Ah lead-acid batteries. The brain of the robot is a single-board computer with an AMD Geode CPU, 256 MB RAM, a compact flash card, Wi-Fi, 3 Ethernet, 1 RS232 and 2 USB ports. The RS232 port connects to the CAN bus via an RS232-CAN converter. Most of the sensors, actuators and other modules (display, beeper, power management etc.) are connected to the CAN bus, which forms the backbone of the robot. The sensors with higher data-rate demands are connected directly to the PC via the Ethernet interface. Two main sensors were used for the Field Robot Event: weed detection was provided by an IP camera with a fish-eye lens, and for obstacle detection a SICK LMS100 laser range finder was used. The robot is further equipped with sonar, GPS and a compass, but these sensors were not used in the main tasks during the Field Robot Event.
The hexapod FireAnt and the quadcopter Parrot AR Drone 2 are commercially available platforms. FireAnt is a construction kit from Orion Robotics ([2]). It has 25 force-feedback servos, controlled by an Arduino board combined with a servo shield.
The Parrot AR Drone 2 ([3]) is a very successful quadcopter platform remotely controlled over a WiFi connection. The manual control can be replaced by autonomous behavior running on a standard notebook.

3 Sensors/3.1 Hardware

The main sensor on the Eduro platform is the SICK LMS-100 laser range finder. The scanning plane is 30 cm above the ground and slightly tilted, so at a distance of 2 meters it sees the ground (on a flat surface). There was an alternative navigation solution for the case of very low plants, using an IP camera from Arecont Vision equipped with a wide-angle lens. Finally, Eduro has very good odometry, and thanks to its three-wheel construction it is useful even in outdoor field terrain.
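The tilt of the scanning plane follows directly from these two numbers (a small sanity-check calculation, not taken from the robot's code):

```python
import math

# Scanner mounted 0.30 m above the ground; the beam meets the ground
# at a 2.0 m distance, so the downward tilt of the scanning plane is:
tilt_rad = math.atan2(0.30, 2.0)
tilt_deg = math.degrees(tilt_rad)  # roughly 8.5 degrees
```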
The Eduro is controlled via a CAN bus accessible through a bridge to RS232. The main computer is an older router running an embedded Linux OS. Note that Eduro has more sensors (magnetometer, sonar) whose data were collected but not used by the algorithms.
FireAnt had only touch/force sensors this year. Each servo reports its own position and the currently applied force at a 25 Hz rate. The development was temporarily paused, but you will hopefully see it at FRE2015 again.
Heidi (our Parrot AR Drone 2) is a robot platform surprisingly rich in sensors (taking into account its price of around $300). It has two cameras (front and bottom), a 3D IMU (accelerometers, gyroscopes and magnetometer), a sonar for height measurement, and temperature and pressure sensors. The quadcopter uses 1000 mAh, 11.1 V LiPo batteries.

3.2 Software and strategy

The software is written in the Python programming language and is freely available on GitHub ([4]). The Eduro code is very similar to the old code from 2010. The main changes were improved direction estimation (for the Advanced Task with holes in the plant rows), navigation at the end of rows, and camera and GPS processing in the Professional Task.
The primary navigation sensor was the laser range finder. The array of 541 readings, corresponding to a 270-degree range, was first grouped into 5-degree bins. In every group the shortest valid reading was used to represent the given angle (zero readings, meaning no reflection or strong sunlight, were rejected). Finally, an experimentally verified threshold of 1 meter was applied to decide whether there was an obstacle/plant at a given angle.
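The pre-processing above can be sketched as follows (a minimal illustration, not the team's actual code; the function name and millimeter units are our assumptions):

```python
def scan_to_obstacles(scan_mm, threshold_mm=1000, group_deg=5):
    """Reduce a raw laser scan (readings in mm over 270 degrees) to a
    list of per-bin obstacle flags, one flag per 5-degree group."""
    readings_per_deg = len(scan_mm) / 270.0      # 541 readings -> ~2 per degree
    group = int(round(group_deg * readings_per_deg))
    obstacles = []
    for i in range(0, len(scan_mm), group):
        # 0 means no reflection or strong sunlight -> reject the reading
        valid = [r for r in scan_mm[i:i + group] if r > 0]
        nearest = min(valid) if valid else None
        obstacles.append(nearest is not None and nearest < threshold_mm)
    return obstacles
```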
The algorithm's task was to find a gap of a given size. The processing was almost context-free, except for a variable holding the previous position of the found center. The new direction was set to the middle of the gap. If the gap was too wide (missing plants or end of field), an alternative strategy was used (see below).
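The gap search over the per-bin flags might look like this (a sketch under our own naming; the only state carried over is the previous center, as described above):

```python
def find_gap_center(obstacles, gap_size, prev_center=None):
    """Find free runs of at least `gap_size` bins in a boolean obstacle
    list and return the center (bin index) of the run closest to the
    previously found center, or None when no gap is wide enough."""
    candidates = []
    run_start = None
    for i, blocked in enumerate(obstacles + [True]):  # sentinel closes last run
        if not blocked and run_start is None:
            run_start = i
        elif blocked and run_start is not None:
            if i - run_start >= gap_size:
                candidates.append((run_start + i - 1) / 2.0)  # run center
            run_start = None
    if not candidates:
        return None
    if prev_center is None:
        return candidates[0]
    return min(candidates, key=lambda c: abs(c - prev_center))
```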
During testing we found out that the data from odometry/encoders are superior to compass/magnetometer readings. Because of that we deleted the code using the compass azimuth as a reference and replaced it with a "dynamic reference point" delayed by 2 seconds (see self.poseHistory in the code). This was mainly used in cases when the gap did not have the desired width or when more than one option was available.
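The delayed-reference idea can be sketched like this (the poseHistory name comes from the source; the class, pose tuples and update interface are our illustration, not the actual implementation):

```python
import math

class DelayedReference:
    """Keep a short pose history and derive a heading reference from the
    pose roughly `delay` seconds old, instead of a noisy compass azimuth."""

    def __init__(self, delay=2.0):
        self.delay = delay
        self.poseHistory = []  # list of (time, x, y)

    def update(self, t, x, y):
        self.poseHistory.append((t, x, y))
        # keep the oldest retained pose about `delay` seconds behind `t`
        while len(self.poseHistory) > 1 and t - self.poseHistory[1][0] >= self.delay:
            self.poseHistory.pop(0)

    def reference_heading(self):
        if len(self.poseHistory) < 2:
            return None
        t0, x0, y0 = self.poseHistory[0]   # delayed reference point
        t1, x1, y1 = self.poseHistory[-1]  # current pose
        return math.atan2(y1 - y0, x1 - x0)
```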
The second major change was the navigation between rows at the headland. The idea was to "follow a wall" on the left or right side (depending on the first turn) and count row openings. This could work for long transitions but was not reliable due to the uncertainty of the robot's initial position. Also, the plants were hardly visible from the robot's side (they were smaller than 30 cm, at least on the semi-damaged testing field). So in the end we used our former odometry-only code with a better calibrated transition to the 2nd and 3rd rows.
Quite a challenge was the integration of the RTK GPS receiver. It is very difficult to get it working, parse the data, integrate it into the main code, and test it all in 20 minutes! But it can be done. It is true that we had code tested with a standard GPS and an already verified USB-serial converter. We managed to do two tests in the field: the first one was just to log data, and in the second run the receiver was already integrated into the system.
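The "parse the data" step typically means reading NMEA sentences from the serial line. A minimal GGA parser might look like this (our illustration, not the team's code; a real implementation should also verify the checksum):

```python
def parse_gga(sentence):
    """Parse an NMEA GGA sentence into (lat, lon) in decimal degrees,
    or return None for other sentence types or an empty fix."""
    parts = sentence.split(',')
    if not parts[0].endswith('GGA') or parts[2] == '':
        return None
    # latitude field is ddmm.mmmm, longitude is dddmm.mmmm
    lat = int(parts[2][:2]) + float(parts[2][2:]) / 60.0
    if parts[3] == 'S':
        lat = -lat
    lon = int(parts[4][:3]) + float(parts[4][3:]) / 60.0
    if parts[5] == 'W':
        lon = -lon
    return lat, lon
```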
We should also mention the code for yellow ball recognition. Yellow is not an easily distinguishable color (you can see it on dry leaves, for example). We classified each pixel separately and computed the number of yellow pixels, their mass point (centroid) and variance. While other teams modified their robots because of the strong sunlight, we made changes only in software: basically, instead of yellow we were looking for white (well, we did not realize that at the start position we would see the white start line). We set all the filters to extremes to reduce false detections: yellow was detected only in a small region, there was both a lower and an upper limit on the number of yellow pixels, and the pixels had to be very, very bright.
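A per-pixel classification with count/centroid/variance statistics and count limits, as described above, can be sketched as follows (the thresholds and the RGB yellow test are illustrative assumptions, not the tuned values from the contest):

```python
def find_yellow_ball(pixels, lo=1, hi=5000, brightness=220):
    """Classify each RGB pixel independently as 'yellow' (bright red and
    green, low blue), then return (count, centroid, variance), or None
    when the pixel count is outside the [lo, hi] limits."""
    ys = [(x, y) for y, row in enumerate(pixels)
          for x, (r, g, b) in enumerate(row)
          if r > brightness and g > brightness and b < brightness]
    if not (lo <= len(ys) <= hi):
        return None  # reject: too few pixels (noise) or too many (false hit)
    n = float(len(ys))
    cx = sum(x for x, _ in ys) / n         # mass point (centroid)
    cy = sum(y for _, y in ys) / n
    var = sum((x - cx) ** 2 + (y - cy) ** 2 for x, y in ys) / n
    return len(ys), (cx, cy), var
```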

4 Conclusion

Eduro Maxi HD was not the fastest robot, but it was one of the most precise and most reliable. With a bit of luck this was enough to score well in all the main FRE2014 tasks: it reached 2nd place in the Basic Task, 2nd place in the Advanced Task, and 1st place in the Professional Task. As a consequence we won, together with the Phaeton team, the overall FRE2014 contest.

5 References