
SICK Robot Day 2012

A Eurobot-like game on a large playground

Fifteen teams participated in the contest organized by the German manufacturer SICK. The surprising fact was that this year seven of them were from the Czech Republic! The playground was similar to the one in 2010, i.e. an arena 15 m in diameter. The floor was covered with many white, yellow and green balls. The robots competed in triplets and their task was to collect balls of their own color and drop them in their home bases. And how did it go?

This article is a compilation of contributions from several participating teams. The content may evolve over time as new texts arrive or some parts are translated between Czech and English. Please note that the Czech and English versions may currently contain information from different teams!

1st place: Redbeard Button, Parma

Team Redbeard Button from the University of Parma developed their robot on a MobileRobots Pioneer 3DX platform.

Robot Hardware

The robot is equipped with SICK LMS100 and TiM300 range finders placed at two different heights, a Logitech C270 camera and a forklift for picking up, carrying and dropping balls. Several alternative solutions were considered for ball transportation.
The forklift has the advantages of avoiding sensor occlusions and of caging the ball during navigation, although approaching, raising and dropping the target require careful software management.

Control Architecture

The control architecture consists of independent modules, each devoted to a specific task. During development and testing, components were added, removed or linked differently until the proper final configuration was reached. The ball detection components merge laser scanner and camera data to estimate ball position and color. Target tracking is provided to cope with intermittent observations and failures of the detection algorithm. The localization and mapping modules detect the pen location in the laser data and use this information to estimate and correct the robot pose.
Since the ball positions are ephemeral and may change during the race, the pens and the border of the arena represent the only stable landmarks that can be exploited to localize the robot. The navigation component is responsible for the robot's behavior in the different situations. It primarily guarantees motion safety by avoiding collisions with other robots, obstacles and balls. Furthermore, it handles the different phases of the competition: random exploration of free space, approach and detection of the target balls, navigation to the pen, and dropping of the ball.
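The article does not show the team's code, but the tracking idea can be illustrated with a minimal Python sketch: each ball hypothesis survives a few missed detections before it is discarded. The class names and thresholds below are our own illustration, not Redbeard Button's implementation.

```python
import math

class BallTrack:
    """One tracked ball hypothesis in the world frame."""
    def __init__(self, x, y, color):
        self.x, self.y, self.color = x, y, color
        self.misses = 0

class BallTracker:
    """Keeps ball hypotheses alive across intermittent detections."""
    MATCH_DIST = 0.3   # meters; detections closer than this update a track
    MAX_MISSES = 10    # drop a track after this many missed updates

    def __init__(self):
        self.tracks = []

    def update(self, detections):
        """detections: list of (x, y, color) from laser+camera fusion."""
        for t in self.tracks:
            t.misses += 1
        for x, y, color in detections:
            best = min(self.tracks,
                       key=lambda t: math.hypot(t.x - x, t.y - y),
                       default=None)
            if best and math.hypot(best.x - x, best.y - y) < self.MATCH_DIST:
                # Smooth the stored position and refresh the track
                best.x, best.y = 0.7 * best.x + 0.3 * x, 0.7 * best.y + 0.3 * y
                best.color, best.misses = color, 0
            else:
                self.tracks.append(BallTrack(x, y, color))
        # Forget hypotheses that have not been re-observed for a while
        self.tracks = [t for t in self.tracks if t.misses <= self.MAX_MISSES]
```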

Discussion

The effectiveness of the developed robotic system was proved by the 7 green balls picked up and brought to the pen in the first round of SICK Robot Day. The robot approached and caged only the green balls, even when balls of another color were in contact with the target balls. Furthermore, the robot successfully reached the pen by exploiting the localization system.
The main pitfall became apparent in the second round, when the robot was unable to reliably detect the white balls due to reflections and the difficulty of segmenting that color. The lack of sensor data in the ball-dropping phase was the second limitation, which resulted in collisions with the border and penalties: when the forklift is lowered, the laser scanner is partially occluded, preventing obstacle detection. A closer integration of sensor data could overcome these problems.
Nonetheless, the experience was positive, since the team successfully addressed several design and development problems and learned to iteratively improve the system based on experimental results.

Ball Collector 3000 (Osnabrück)

Our robot "Ball Collector 3000" was built and programmed by a joint team from the University of Applied Sciences Osnabrück and the University of Osnabrück. We used one of our Kurt differential drive robots as a base. A SICK LMS 100 laser scanner was mounted at a height just above the balls, so that only the boundary including the pens and the other robots were visible in the scan. The laser scanner was used for localization (based on the pens) and obstacle avoidance.
For ball recognition, we used an Asus Xtion Pro RGB-D camera. The depth information was used to remove all points below a certain height threshold; also, all points close to a laser scan point were removed to filter out robots and boundary points. The result was an RGB image which only contained the balls, on which a simple color blob detector was run. When a ball was identified, we approached it reactively and picked it up using our custom-built parallel gripper.
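A minimal sketch of that filtering pipeline might look like the following Python fragment. The function name, the z-up point layout and all thresholds are our illustration, not the team's code.

```python
import numpy as np
from scipy.spatial import cKDTree

def ball_candidate_image(rgb, points, laser_points,
                         min_height=0.02, laser_margin=0.15):
    """Black out everything in the RGB image except ball candidates.

    rgb          : HxWx3 color image from the RGB-D camera
    points       : HxWx3 array of 3D points (x, y, z up) in the robot frame
    laser_points : Nx2 array of (x, y) laser scan points
    """
    mask = points[:, :, 2] > min_height            # drop floor points
    # Drop points close to any laser return (robots, arena boundary)
    dist, _ = cKDTree(laser_points).query(points[:, :, :2].reshape(-1, 2))
    mask &= (dist >= laser_margin).reshape(mask.shape)
    out = rgb.copy()
    out[~mask] = 0                                 # only balls remain colored
    return out                                     # input for the blob detector
```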
During the competition, our system worked quite well, although we had some problems in both runs. In the first run, we forgot to activate the laser scan filter, which meant that our robot ignored many balls just before picking them up. In the second run, we had fixed that bug, so our robot successfully picked up some balls. Unfortunately, the localization which had worked so well in the first run failed us this time, so our robot was relying solely on odometry; after successfully picking up some balls, it lost track of its own pen and started putting the balls down next to it. All in all, we are still quite happy with our robot's performance and with the 4th place we got!

Team HECTOR

Introduction

Team HECTOR (Heterogeneous Cooperating Team of Robots, http://www.gkmm.tu-darmstadt.de/rescue/) from TU Darmstadt focuses on navigation and control of autonomous vehicles in disaster environments. The RoboCup Rescue League competition is used as an evaluation scenario, with the team being the highest-scoring team focusing on robot autonomy in recent competitions (for instance, 2nd place overall plus the Best in Class Autonomy and Innovation Award at RoboCup 2012 in Mexico).
As a contribution to the community, stable and proven software modules are made available as ROS packages. The hector_slam system (http://www.ros.org/wiki/hector_slam), for example, was successfully used by top teams at RoboCup 2012 and by many other researchers around the world.

Hardware

For the SICK Robot Day 2012 competition, a modified TurtleBot was used. This platform is highly reliable and already provides navigation capabilities for environments like the one encountered at SICK Robot Day 2012 (flat ground, the ability to plan/move around dynamic obstacles).
The modifications to the original TurtleBot are:
  • Use of a Hokuyo UTM-30LX LIDAR as a sensor for localization/mapping/obstacle avoidance
  • Use of an Asus Xtion Pro Live camera as a sensor for ball detection
  • Addition of a ball manipulator using 4 Robotis AX-12 servos and glass fiber reinforced plastic grippers

Software

For providing basic navigation capabilities, existing ROS tools (move_base, gmapping), already well tuned for the TurtleBot platform, are used. Ball detection is performed using a Hough-transform-based approach on the Asus sensor's images. Classification is performed by sampling points on detected balls. The ball percepts are handed off to an EKF-based object tracker that keeps track of them in the world frame, thus memorizing the positions of all balls.
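The outline of such a detector can be sketched with OpenCV's Hough circle transform. This is a hedged illustration only: the parameters and the simple dominant-channel color rule below are our assumptions, not the team's actual classifier, and OpenCV 3+ is assumed.

```python
import cv2
import numpy as np

def detect_balls(bgr):
    """Return (x, y, radius, color) for circular ball candidates."""
    gray = cv2.medianBlur(cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY), 5)
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.5, minDist=30,
                               param1=100, param2=40,
                               minRadius=10, maxRadius=80)
    balls = []
    if circles is not None:
        for x, y, r in np.round(circles[0]).astype(int):
            # Sample pixels inside the circle to classify the ball color
            patch = bgr[max(0, y - r // 2):y + r // 2,
                        max(0, x - r // 2):x + r // 2]
            if patch.size == 0:
                continue
            b, g, red = patch.reshape(-1, 3).mean(axis=0)
            if g > 1.2 * red and g > 1.2 * b:
                color = 'green'
            elif red > 1.2 * b and g > 1.2 * b:
                color = 'yellow'
            else:
                color = 'white'
            balls.append((x, y, r, color))
    return balls
```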
A state machine is used to visit pre-defined waypoints in the arena, instantly switching to a "grab ball" behavior once a ball of the correct color has been detected. After grabbing the ball, it is carried to the robot's own goal and released, and the approach to the previously interrupted waypoint is resumed.
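The described behavior switching boils down to a few states; a minimal sketch, reconstructed from the description above rather than taken from HECTOR's code, could be:

```python
class MissionStateMachine:
    """EXPLORE -> GRAB -> DELIVER -> EXPLORE, as described in the text."""
    def __init__(self):
        self.state = 'EXPLORE'       # following pre-defined waypoints

    def step(self, ball_of_own_color_seen, ball_grabbed, at_own_goal):
        if self.state == 'EXPLORE' and ball_of_own_color_seen:
            self.state = 'GRAB'      # interrupt waypoint following
        elif self.state == 'GRAB' and ball_grabbed:
            self.state = 'DELIVER'   # carry the ball to the own goal
        elif self.state == 'DELIVER' and at_own_goal:
            self.state = 'EXPLORE'   # release, resume interrupted waypoint
        return self.state
```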

Results

The SICK Robot Day competition is very challenging, as only a very short time for testing is available at the location of the competition. High-level behaviors had been tested extensively in simulation beforehand (the simulated SICK Robot Day 2012 arena is available here: http://www.ros.org/wiki/hector_gazebo_worlds), but the lighting conditions made adjustments to the vision code necessary at the competition venue.
In the first run, the size-based confirmation of ball detections was misconfigured, so the robot did not detect any balls and instead followed the preset waypoints as intended.
In the second run, ball detection worked, but the work on the vision setup meant that no time was left to properly configure the ball-grabbing behavior. This led to the robot not being able to grab balls, losing them whenever it tried. The robot nevertheless successfully returned 3 "phantom balls" to its own goal, demonstrating the working high-level decision making and navigation.

ARSControl team

University of Modena and Reggio Emilia, Dept. of Science and Methods for Engineering, Reggio Emilia, Italy http://www.arscontrol.org
The robot was built on a MobileRobots Pioneer P3dx platform. The choice of a commercial platform was mainly motivated by the better reliability it offers compared to custom solutions. The platform was, however, customized in order to fulfill the task.
The sensor system was composed of:
  • a Microsoft Kinect RGB-D camera, mainly used for ball recognition
  • a Logitech webcam used for pen recognition
  • a SICK TiM310 laser scanner, used for obstacle detection
Moreover, the ball-catching device was developed from scratch: it is an aluminum ring, moved up and down by means of a servo motor controlled by a custom-designed microcontroller board.
The software architecture was also developed specifically for the competition. It is designed as an interconnection of modular elements, each of which is in charge of one task, e.g. navigation, obstacle avoidance, ball catching, ball recognition, etc. A supervisor handles the interaction among the modules. The software architecture was implemented in ROS, with one node per software module.
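In ROS terms, such a supervisor is typically a small node that arbitrates which behavior module is currently active. A minimal rospy sketch follows; the topic names and message contents are our assumptions, not ARSControl's actual interfaces.

```python
#!/usr/bin/env python
import rospy
from std_msgs.msg import String

class Supervisor(object):
    """Enables one behavior module at a time and reacts to events."""
    def __init__(self):
        # Each functional module runs as its own ROS node; the supervisor
        # only publishes which behavior is currently allowed to drive.
        self.mode_pub = rospy.Publisher('active_behavior', String,
                                        queue_size=1, latch=True)
        self.set_mode('explore')
        rospy.Subscriber('ball_detected', String, self.on_ball)

    def on_ball(self, msg):
        if self.mode == 'explore' and msg.data == 'own_color':
            self.set_mode('catch_ball')

    def set_mode(self, mode):
        self.mode = mode
        self.mode_pub.publish(String(data=mode))

if __name__ == '__main__':
    rospy.init_node('supervisor')
    Supervisor()
    rospy.spin()
```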
Unfortunately, during the SICK Robot Day we had some issues with a USB-serial converter, which made it impossible for the robot to move away from the pen.

KZS/Eduro Team

Martin Dlouhý

SICK Robot Day is a really difficult contest, because there is no time for basic testing: all you can do is tune or tweak some of the parameters, and that's it. While in the home environment odometry was sufficient for version 0, where our robot was able to collect even four balls without losing track, in Waldkirch, due to the slippery floor and the many, many balls, this did not work, and navigation to the home base via laser became critical.
This was the first mistake: integrating an insufficiently debugged home base detection. One error in a transformation, which appeared only when the robot approached the home base at a wider angle, and all the preparation work turned out to be useless.
The second serious mistake was in the color analysis algorithm. The color was classified into only three categories: white, yellow and green. The missing case was "other", which would mean the robot had seen the floor instead of a ball. On the other hand, the floor had a color similar to our yellow and white, so such a detector would probably have failed as well.
The third error was in the reactive navigation towards a ball. It showed up in the second round, when Eduro circled around a green ball for several minutes before the Italian robot "rescued" it by presenting an obstacle and forcing a switch to obstacle avoidance mode.
The fourth weakness was caused by the deformed holder of the pot. With a correctly working localisation the robot would never have gotten that close to the border of the playground, but … in the game the localisation was so bad that it set goals outside the arena and the safety margin was not sufficient.
It is true that, at least in the first round, Eduro's actions looked more interesting than its competitors'. The second round was pure resignation: the correction near the home base was intentionally turned off and the consequences were as expected.
Well, enough of complaining, now the positive part …
The basic Eduro construction was based on the robotic arm from Field Robot Event 2012 in Holland (Floriade), where Eduro was the only robot to finish the Professional Task (autonomous search for a selected rose, picking it and returning to the start with it). This worked so well that we decided to recycle the mechanism. We even kept the idea of the pot, just a bit bigger and mounted upside-down.
The very first experiments took place two months earlier (see the YouTube video). It looked promising and the idea remained de facto unchanged. During the first bigger tests with the MART team we observed several situations where the robotic arm got stuck against a ball. We later solved this with a loose attachment of the pot on the upper side instead of a fixed attachment at the bottom: the pot would tilt, and in the limit case the balls slipped into it.
more flexible attachment of the pot
This solution, as spectators could see, worked quite well, so the software watchdog for this operation was never triggered. What could be improved in this maneuver is the direction of the subsequent rotation. If we changed it to anticlockwise, we could capture more of the only partially covered balls. The now obsolete reason for the clockwise direction was to minimize the space required during rotation, because the arm sticks out (and there is not enough room in the kitchen where most of the first tests took place).
The development of Eduro moved forward, and the major challenge was the integration of two lasers and a camera while the system was still running on a 500 MHz Linux router with 256 MB RAM. The TiM310 sensor uses USB and thus requires more processor power. It was necessary to lower the sampling frequency down to 5 Hz (the program asked for new data as soon as it was ready to process the next step), and the same held for the LMS100 and the IP camera. We were not going to analyse the whole picture (and after seeing the yellow and white balls on the yellow-white floor I am glad for that decision); the plan was to check the color in the image at a particular place once a ball was detected in a trigger area.
That is easy to say, but much harder to implement. In particular, you need very good synchronization between the USB laser and the camera. The system was so slow that if you merely loaded a picture (while all the other processing was running) you would process one image per second! Yes, as Tomas from our team would surely point out, we could have used only an image sub-area and achieved a much higher sampling rate … but on the other hand we would have lost the overview information useful for debugging, and we could not have created videos like this one. In the end we decided to collect images without further processing, and only after a laser trigger did we take "a photo" which was then analysed. Note that in the video we added an extra image with a red square, which is the area used for color detection.
The most frequent error in this approach was that the region for color detection (a separate average of all RGB values) contained the ball only partially. If the region was made too small, you would see interesting things; in particular, white balls would sometimes turn out to be blue! How is this possible? Well, all the balls have the same blue SICK label on them!
The color classification algorithm was quite simple: if the green value is dominant (i.e. 1.2 times bigger than R and B), the ball is green. If both red and green are dominant, i.e. blue is 1.2 times smaller than both R and G, the ball is yellow. In all other cases it is white. Simple, but it had some weaknesses. In particular, green had a slight blue tint, and with the blue label the G > 1.2*B constraint failed completely. So we simplified the formula to G > 1.2*R only. It would have been nice to also filter out the color of the floor, but that was quite random and sometimes even came with green (!) stripes …
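The final rule, as described above, fits in a few lines of Python (a direct transcription of the text, with the simplified green test):

```python
def classify_ball_color(r, g, b):
    """Classify the average RGB of the detection region.

    The G > 1.2*B part of the green test was dropped, because the blue
    SICK label on the balls broke it completely.
    """
    if g > 1.2 * r:                      # simplified green test
        return 'green'
    if r > 1.2 * b and g > 1.2 * b:      # blue 1.2x smaller than R and G
        return 'yellow'
    return 'white'
```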
We used the VFH (Vector Field Histogram) algorithm for obstacle avoidance. It worked quite well, but again it required computational power. We turned it on only when an obstacle was closer than 1 m, which was too late in the case of the playground border. To distinguish between obstacles and balls we used a hardware solution this time: two lasers mounted in different parallel planes above the ground. Before the robot started to navigate towards a detected ball, it first verified that the object was not present in the IP laser scan. This worked quite well as long as Eduro did not change orientation too fast; otherwise the scans would desynchronize and we could see phantom balls near the start area (this showed up in some tests in Prague).
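The two-plane check itself is simple: a detection from the lower laser counts as a ball only if the upper laser sees nothing at a similar range at the same bearing. A hedged Python sketch follows; the function name, parameters and tolerance are our assumptions.

```python
def looks_like_ball(bearing, distance, upper_scan, angle_min, angle_step,
                    tolerance=0.2):
    """True if a lower-scan detection has no counterpart in the upper scan.

    upper_scan : list of ranges [m] from the laser mounted above ball height
    bearing    : angle [rad] of the detection; distance in meters
    """
    index = int(round((bearing - angle_min) / angle_step))
    if not 0 <= index < len(upper_scan):
        return False
    # A wall or a robot returns a similar range in both planes; a ball
    # leaves the upper plane free (range much larger) at that bearing.
    return upper_scan[index] - distance > tolerance
```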
I would conclude with the Achilles' heel, i.e. home base detection. Although it looks quite trivial in the laser scan, robust detection is not trivial at all. In theory you should see a Voronoi edge going through the home gate. This is still not fully implemented, so we used something similar to Voronoi: a growing wave. The search started by reducing the scan to 5° steps (the nearest obstacle in each interval). Then we grew the wave in 5 cm steps until a gap of the expected width and depth was detected. The search for the gate seemed to work fine in reality, but the major flaw was in the robot position correction, as mentioned earlier.
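The growing-wave idea can be sketched as follows. This is a simplified reconstruction from the description: the real implementation also checked the depth of the gap, and the bin counts and limits here are our assumptions.

```python
def find_gate(bins, step=0.05, max_range=8.0,
              min_gap_bins=3, max_gap_bins=8):
    """Grow a circular wave outwards until a gap of plausible width opens.

    bins : nearest obstacle distance [m] per 5-degree sector
    Returns (bearing in degrees from scan start, range in m) of the candidate.
    """
    radius = step
    while radius < max_range:
        passed = [d > radius for d in bins]   # sectors the wave got through
        i = 0
        while i < len(passed):
            if passed[i]:
                j = i
                while j < len(passed) and passed[j]:
                    j += 1
                width = j - i                 # width of the free run, in bins
                if min_gap_bins <= width <= max_gap_bins:
                    return (i + width / 2.0) * 5.0, radius
                i = j
            else:
                i += 1
        radius += step                        # grow the wave by 5 cm
    return None                               # no gate-like gap found
```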
Conclusion: after the emotions calmed down, after watching the videos and reading about the troubles of the other teams, I would say it was not that bad. I will not repeat the phrase "one more day of testing would have been enough", because the only relevant answer to it is "well, you should have started testing one day earlier". The contest was perfectly prepared from the organizational point of view and it was nice to see so many enthusiastic roboteers again.


RoBohemia TEAM

Ondřej Vožda

  • Department of Measurement and Control, Brno University of Technology
  • Lukáš Otava, Tomáš Grepl, Martin Mališ, Martin Tříska, Milan Papež, Ondřej Vožda
Our participation in this competition started completely by accident. We had chosen this assignment as our semester project in a robotics course at our university. Given that we are proud students longing to represent our Alma Mater, we definitely could not miss this opportunity.
Since the beginning of the project we had to face many difficulties. In the end we were given a time-proven four-wheeled chassis developed by the department of robotics of our faculty. Significant financial help was provided by our generous sponsor, TIPA s.r.o.
Our robot (familiarly called The Terminator, certainly the most original name for a robot ever) was intended to be controlled by a Toradex Robin, a system with an Intel Atom CPU. The lower (real-time) layer was implemented on an STM32 F4 Discovery kit. To be able to sense its surroundings, the robot carried two SICK laser scanners and one webcam.
As the summer went by and the remaining time grew shorter, we had to revise our plans. Despite our best efforts, the mapping algorithm did not map, the computer vision did not see and the regulators did not regulate. The only positive thing during the hot summer was therefore the sample balls provided by SICK, which can be used as water sports equipment.
After many sleepless nights spent in the lab, we finally finished our work and hit the road to Waldkirch. The road itself was full of amusing stories, but in the end we managed to arrive at our hotel in Winden im Elztal. The local lounge was transformed into a testing arena and the last night was used for final adjustments.
D-Day came, and after an abundant breakfast we moved to the Stadthalle in Waldkirch, where the warm-up run was taking place. With significant help from our companion František Burian a few last modifications were made, and we were ready for the first run.
We figured out that, despite being a very powerful tool, the HSV color space has a slight problem with detecting a yellow ball on a beige floor lit by fluorescent tubes. A very thrilling (and, for onlookers, funny) situation came about right before the start of the first run. Our notebook (substituting for the Toradex board) probably could not withstand all its responsibility and displayed a beautiful blue screen of death. After a quick restart everything seemed to be in order and the robot completed its run. However, it was not successful, because the robot did not manage to collect even a single ball.
Unfortunately, the second run was no better. Despite another two hours spent tuning the algorithms, the Terminator failed to collect any balls again. On the other hand, this run was much smoother than the first one and the robot did not get stuck anywhere.
The whole competition was a great experience for all team members. We have learned many things that we hope will be useful when working on our diploma theses. I would certainly recommend that our faculty participate again, though, since we are fifth-year students, probably with a different team.

If you have any question or comment — contact us.