
SubT STIX in Ten Weeks

Robotika.cz team

DARPA organizes training rounds called STIX (SubT Integration Exercise); the first one takes place in April 2019 in Colorado. Although STIX is for the System Track only, we plan to work simultaneously also on the Virtual Track. This is an English blog for the next three months … Update: 14/2/2019 — Return strategy ver1

Week 1

23rd January 2019 — official announcement

Last night there was an official DARPA announcement of qualified teams for STIX (SubT Integration Exercise):
So now it is a suitable time to start the blog and report status. Note that there was a relatively long „prelude” period, which you can read about at Rules and Prelude (both in Czech).
The Robotika.cz team is based on several former Robotour teams:
  • ND Team
  • Eduro Team
  • Cogito Team
  • Short Circuits Prague
and members of robotika.cz. We expect the team to evolve, as the DARPA SubT Challenge is scheduled to run for several years …
For STIX in Denver/Colorado we plan to use robots Robík and Eduro:

24th January 2019 — Štola Josef

Today we visited Štola Josef (the Josef tunnel) as a potential place for testing and preparation for STIX in April. It was a „joint event” with the opponent team CRAS (Center for Robotics and Autonomous Systems), the second Czech team qualified for the DARPA SubT Challenge STIX. The goal today was to discuss the possibilities and conditions with the tunnel owner.
It was definitely worth it, although there were many failures: a dead battery in the car, a bad contact number, a dead battery in my camera (I am used to that one, so I had a spare), and also a dead Eduro computer, which I really did not expect and for which I had no backup. Next time. It was also freezing and the entrance was decorated with jaw-like stalactites.
The tunnel was very nice and could surely imitate the Edgar Mine in Colorado. It was actually too nice (during summer it is open to the public): the rails were below the floor surface or covered by stones, and everything was clean and dry. We learned that dripping water is typical, as well as mud and small pools.

27th January 2019 — STIX Artifacts Specification

The teams for STIX in April 2019 have already received the list of five selected artifacts that will be used. The artifacts are grouped into ones common to all three events (Tunnel, Urban, Cave) and event-specific ones.
STIX Artifacts: Drill, Fire extinguisher, Backpack, Cell phone, Survivor
This selection is good for us — our qualification video used a trained network for the backpack, person/survivor and fire extinguisher. The processing was relatively slow, so we may start with „detect red pixels” as our first submission to the Virtual Track.
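A minimal sketch of that fallback, assuming OpenCV and a BGR image; the thresholds and the file name are illustrative guesses, not tuned parameters:

import cv2
import numpy as np

def count_red_pixels(img_bgr, min_red=100, dominance=1.5):
    # thresholds are illustrative guesses, not our tuned values
    b = img_bgr[:, :, 0].astype(int)
    g = img_bgr[:, :, 1].astype(int)
    r = img_bgr[:, :, 2].astype(int)
    mask = (r > min_red) & (r > dominance * g) & (r > dominance * b)
    return int(np.count_nonzero(mask))

img = cv2.imread('snapshot.jpg')  # hypothetical camera snapshot
print(count_red_pixels(img))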

28th January 2019 — Virtual Track - narrow passage

There is a very narrow passage in the virtual environment of the qualification world. It is on the left side behind the fire extinguisher (our first successfully reported artifact). Yesterday I placed robot X2 at -x 158 -y 139 -z -14.8 in order to examine the LIDAR data and check if the passage is passable with a wheeled machine … and it is!
The robot actually went through, turned right and followed the wall until it saw the red backpack, and because I did not change the strategy, it turned 180 degrees and tried to return to the original start position (by measuring the distance traveled). The way back did not work. After 3.5 hours of simulation I could see on the terminal that the pose did not change:
…
3:29:00.023172 (3.8 32.2 0.0) [('pose2d', 452), ('rot', 453), ('scan', 181)]
3:30:00.016049 (3.8 32.2 0.0) [('pose2d', 458), ('rot', 457), ('scan', 183)]
3:31:00.060268 (3.8 32.2 0.0) [('pose2d', 448), ('rot', 448), ('scan', 179)]
After reviewing the output (time, (X, Y, Z) position and statistics of OSGAR messages received in the last minute) I could see that the artifact was found after 2 hours:
…
1:59:00.000973 (54.7 61.0 0.0) [('pose2d', 418), ('rot', 418), ('scan', 167)]
2:00:00.041863 (55.6 61.0 0.0) [('pose2d', 419), ('rot', 419), ('scan', 168)]
Published 106
Going HOME
2:00:02.348407 turn 90.0
2:00:18.360198 stop at 0:00:00.389300
2:00:18.360198 turn 90.0
2:00:34.706420 stop at 0:00:00.670223
2:01:00.038746 (55.2 61.2 0.0) [('artf', 1), ('pose2d', 405), ('rot', 405), ('scan', 162)]
2:02:00.028133 (54.3 61.2 0.0) [('pose2d', 419), ('rot', 419), ('scan', 168)]
2:03:00.039347 (53.5 61.4 0.0) [('pose2d', 426), ('rot', 425), ('scan', 170)]
…
and the robot was stuck 18 minutes later:
…
2:14:00.054387 (3.9 45.2 0.0) [('pose2d', 400), ('rot', 401), ('scan', 160)]
2:15:00.018687 (3.9 37.4 0.0) [('pose2d', 388), ('rot', 388), ('scan', 155)]
2:16:00.010633 (3.8 32.7 0.0) [('pose2d', 406), ('rot', 405), ('scan', 162)]
2:17:00.040365 (3.8 32.2 0.0) [('pose2d', 423), ('rot', 424), ('scan', 170)]
2:18:00.053812 (3.8 32.2 0.0) [('pose2d', 457), ('rot', 457), ('scan', 183)]
…
The position was at the entrance to the narrow passage, but on the other side. It seems to me that the passage is asymmetric, with a different slope on the left and on the right side of the entrance.
Another reason could be the simulation timeout (20 minutes of simulated time), because I had to change the distance to the wall from 1.5 meters to 0.8 meters, which automatically made the robot move slower.
Finally, a detail I realized this morning … I did not change the wall-following parameter for the way back, so it would not pass through anyway … I guess.
Note that the surface is variable 3D terrain, so properly deciding what to do based on a 2D lidar scan alone is a challenge:


Week 2


1st February 2019 — Eduro and Robík testing at home

We did a couple of tests of the Eduro and Robík robots at home. I took Eduro to a dark basement with many storage places and narrow passages, and PavelS repeated tests with the osgar.go1m code to verify OSGAR driver functionality. Both log sets are available for download: robik-190130-hallway.zip and eduro-190129-basement.zip (my bad with the zip file name for Robík — it should be 190129, as the log files inside indicate).
If you would like to visualize the data, you can use osgar.tools.lidarview, which now provides visualization of scan, pose2d and lidar data (#132).
It did not work very well (probably no surprise). The entrances were too narrow, so Eduro decided to turn around instead. What was worse, it stopped in a corner without further activity — that has to be fixed (we probably have a similar issue in the Virtual Track).
The test of Robík was slightly better. We learned that it is necessary to wait for the LIDAR until its rotation speed stabilizes and data with orientation are ready. Also, speed ramping was not working properly in the low-level SW, so we may need to add a wrapper in the OSGAR driver. Tonight we plan to test and integrate the OpenCV camera driver (#134) and then we should be ready to use the same high-level SW for both our SubT robots.
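If the low-level SW keeps misbehaving, the wrapper could be as simple as limiting how much the commanded speed may change per control tick. A sketch with made-up names and limits (this is not existing OSGAR API):

class SpeedRamp:
    """Limit the change of commanded speed per control tick."""
    def __init__(self, max_step=0.05):  # m/s per tick, illustrative limit
        self.max_step = max_step
        self.current = 0.0

    def update(self, desired):
        # clamp the requested change to +/- max_step
        step = max(-self.max_step, min(self.max_step, desired - self.current))
        self.current += step
        return self.current

The driver would then send ramp.update(desired_speed) to the robot every cycle instead of the raw desired speed.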

3rd February 2019 — STIX Group Assignments

Yesterday we received the e-mail STIX Group Assignments, where my first reaction was „proč zrovna my jsme Béčko” (why exactly are we the B-team), inspired by Jára Cimrman/Posel z Liptákova. We decided to buy plane tickets on 31st January, because we were warned that the Lufthansa promotional price would change on 1st February … and it went up by 5000 CZK, but … never mind. I no longer complain that we will test in Colorado with the second Czech team CRAS … it could be fun.

Group 1

  • CSIRO
  • Pluto (Pennsylvania Laboratory for Underground Tunnel Operations)
  • CERBERUS (CollaborativE walking & flying RoBots for autonomous ExploRation in Underground Settings)

Group 2

  • MARBLE (Multi-agent Autonomy with Radar-Based Localization for Exploration)
  • CRAS
  • Robotika

Group 3

  • CoSTAR, JPL/Caltech/MIT team (Collaborative SubTerranean Autonomous Resilient Robots)
  • Explorer
  • CRETISE
p.s. as I was looking through the list of teams to find some URLs, „funded by DARPA” appeared often, so I checked the older announcement DARPA Selects Teams to Explore Underground Domain in Subterranean Challenge (2018-09-26):
DARPA has selected seven teams to compete in the funded track of the Systems competition:
  • Carnegie Mellon University (Explorer)
  • Commonwealth Scientific and Industrial Research Organisation, Australia (CSIRO)
  • iRobot Defense Holdings, Inc. dba Endeavor Robotics (CRETISE)
  • Jet Propulsion Laboratory, California Institute of Technology (CoStar)
  • University of Colorado, Boulder (MARBLE)
  • University of Nevada, Reno (CERBERUS)
  • University of Pennsylvania (Pluto)
… so it is 7 sponsored teams + 2 Czech teams … hmm … interesting.
The Carnegie Mellon team, including a key member from Oregon State University, is one of seven teams that will receive up to $4.5 million from DARPA to develop the robotic platforms, sensors and software necessary to accomplish these unprecedented underground missions. (source)


Week 3


7th February 2019 — OSGAR v0.2.0 Release

Yesterday we released OSGAR v0.2.0. It contains all the HW drivers necessary for both the Eduro and Robík robots and the new osgar.tools.lidarview for log file analysis and debugging. There are also several small features, like the base class Node for easier creation of new nodes and drivers, the environment variable OSGAR_LOGS for defining where logs are stored, and a launcher for faster application prototyping.
From the programmer's point of view, LogReader is now directly an iterator, so it can be used like file open():
from osgar.logger import LogReader  # filename and LIDAR_ID defined elsewhere

for timestamp, stream_id, data in LogReader(filename, only_stream_id=LIDAR_ID):
    print(timestamp, len(data))
There were also a couple of tweaks motivated by the SubT Challenge, primarily by the Virtual Track. As a simulation can take several hours, there is a fix for timestamp overflow (the resolution is in microseconds and the limit used to be 1 hour). The TCP driver now also supports various options necessary for communication with ROS. The release of the ROSProxy node implementation was postponed to the next release, mainly to simplify the necessary user configuration.
The most visible progress is on lidarview:
If you would like to replay this test, the log file is available here and you can visualize it with:
python3 -m osgar.tools.lidarview subt-190204_192711.log --lidar rosmsg_laser.scan --camera rosmsg_image.image --poses app.pose2d
You can use 'c' to flip the camera on/off, +/- for zooming and SPACE for pause. A simple map built from the point cloud is displayed when the camera is off.

9th February 2019 — Virtual Track - two X2 robots

Last night we ran a simulation with two X2 robots (small and fast ground vehicles). Small tweaks were needed to run the OSGAR ROS Proxy in two instances (extra ports), but in the end it worked! And it was definitely more interesting to watch than TV … but everything is, so no big deal.
The strategy was simple: one robot (X2L) followed the left wall while the other (X2R) followed the right wall. The code was identical to version 0 — follow the wall until you find an artifact, turn 180 degrees and follow the other wall for the previously measured distance. This way each robot should return close to the starting area and the report to the Base Station should succeed.
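In code, the version 0 logic boils down to a two-state machine; this is a self-contained sketch of the idea (the names and the step interface are ours for illustration, the real OSGAR application is structured around nodes and a message bus):

from enum import Enum

class State(Enum):
    EXPLORE = 1
    RETURN = 2

class Version0:
    """Follow one wall out; after an artifact, follow the other wall back."""
    def __init__(self):
        self.state = State.EXPLORE
        self.turn_distance = None  # distance traveled when the artifact was seen

    def step(self, distance_traveled, artifact_seen):
        if self.state is State.EXPLORE:
            if artifact_seen:
                self.turn_distance = distance_traveled
                self.state = State.RETURN
                return 'turn 180'
            return 'follow wall'
        # RETURN: stop once the outbound distance has been repeated
        if distance_traveled >= 2 * self.turn_distance:
            return 'stop'
        return 'follow other wall'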
After a few screenshots I went to bed … one robot finished the simulation in almost 3 hours, the other in 5 hours. Both were convinced they had reached the start, which was not quite true in reality … and now it is time to review the log files.
Well, the left robot did fine, except that when it reached the start area the entrance was detected as a wall (maybe a bug in the world?) and it backed off a couple of meters. But the base station was close, so the report was delivered (theoretically).
The right robot „overlooked” the valve artifact and continued to the red backpack. But it did not make it back, probably due to the 20-minute simulation time timeout.
It was interesting to see on the generated map how difficult it is to judge whether a tunnel is blocked (a dead end) or the LIDAR view is only temporarily blocked by robot tilt.
On the generated map you can see that the integrated error is not too big, approximately 1 meter, and thus sufficient for a report within the 5-meter precision radius.
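One way to make the dead-end judgment above more robust would be to require the „blocked” verdict to persist over several consecutive scans, so that a short tilt-induced dropout does not trigger a turn. A sketch only, not what is currently in our code (it assumes scan distances in millimeters with 0 meaning no return):

def is_dead_end(recent_scans, limit_mm=1500, required=5):
    """Blocked only if the last `required` scans all see an obstacle
    in the frontal sector; a single tilted scan is not enough."""
    if len(recent_scans) < required:
        return False
    for scan in recent_scans[-required:]:
        third = len(scan) // 3
        front = [d for d in scan[third:2 * third] if d > 0]  # 0 = no return (assumed)
        if not front or min(front) > limit_mm:
            return False  # this scan still sees free space ahead
    return True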
BTW why are we „wasting time” with the Virtual Track when we have a hard System Track deadline?! Well, we will probably use the same or a very similar strategy, and because the robots are not here at the moment, we can at least prepare for the Virtual Track qualification (locate 8 artifacts within 20 minutes of simulation time by the end of April 2019).
p.s. after Jirka's parameter tuning both robots returned to the start area, but … somehow it looks like they „hit the wall”
Note that the X2L simulation took only 47 minutes to return with the artifact position, while X2R timed out after 6 hours!? Well, X2L went straight until it classified the other robot as an artifact and returned home.


Week 4


12th February 2019 — JPEG Red Alert!

We played with artifact classification over the weekend, and when we tried to replay the detection of the fire extinguisher it failed:
M:\git\osgar\examples\subt>python -m osgar.replay -module detector subt-190208_200349.log
b"['subt.py', 'run', 'subt-x2-right.json', '-note', '2x X2_SENSOR_CONFIG_1, rig
ht, commented out asserts for name size']"
['app.desired_speed', 'app.pose2d', 'app.artf_xyz', 'detector.artf', 'ros.emerge
ncy_stop', 'ros.pose2d', 'ros.cmd_vel', 'ros.imu_data', 'ros.imu_data_addr', 'ro
s.laser_data', 'ros.laser_data_addr', 'ros.image_data', 'ros.image_data_addr', '
ros.odom_data', 'ros.odom_data_addr', 'timer.tick', 'tcp_cmd_vel.raw', 'tcp_imu_
data.raw', 'tcp_laser_data.raw', 'tcp_image_data.raw', 'tcp_odom_data.raw', 'ros
msg_imu.rot', 'rosmsg_laser.scan', 'rosmsg_image.image', 'rosmsg_odom.pose2d']
['image'] ['artf']
{24: 'image'}
{4: 'artf'}
Exception in thread Thread-1:
Traceback (most recent call last):
  File "D:\WinPython-64bit-3.6.2.0Qt5\python-3.6.2.amd64\lib\threading.py", line
 916, in _bootstrap_inner
    self.run()
  File "M:\git\osgar\osgar\node.py", line 40, in run
    self.update()
  File "M:\git\osgar\examples\subt\artifacts.py", line 42, in update
    channel = super().update()  # define self.time
  File "M:\git\osgar\osgar\node.py", line 32, in update
    timestamp, channel, data = self.bus.listen()
  File "M:\git\osgar\osgar\bus.py", line 81, in listen
    channel = self.inputs[stream_id]
KeyError: 4
Hmm, that is really strange — there was no artifact reported at that time?! Note that record & replay is the key feature of OSGAR: you record a logfile as the robot navigates and later, typically on your notebook, you can analyze various „what happened?” situations. The prerequisite is that the replay is identical to the original run. In this particular example the SubT Virtual Track simulation ran on a different notebook with Ubuntu and ROS.
The simplified artifact detection is split into two phases:
  • search for sufficiently red objects
  • if you find a local red maximum, run the classification
So what happened was that the redness of the JPEG-compressed image was different on each machine. If we classify pixels as RED, then their count differs!
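A quick way to confirm this is to decode the same file on each machine and print the OpenCV version together with a hash of the decoded pixels (the filename below is a placeholder); if the hashes differ, any threshold-based red count will differ too:

import hashlib
import cv2

img = cv2.imread('fire_extinguisher.jpg')  # the same file copied to each machine
print(cv2.__version__, img.shape, hashlib.md5(img.tobytes()).hexdigest())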
It is yet another detail you should check before you deploy the code on the target machine. Surprisingly, the newly added unit tests for counting red pixels in a decompressed JPEG image pass on the machine with GPU support (our primary simulation platform).
This has to be investigated and we surely have to check also installations on Eduro and Robík robots.
Fire Extinguisher
Backpack
And one joke at the end: this is a recording from the run when the two X2 robots met. You would guess that they are blue, but they are obviously not.
p.s. note that these are 1:1 output images from the ROS simulation, topic /X2/camera_front/image_raw/compressed with X2_SENSOR_CONFIG_1 and resolution 320x240.
p.s.2 Loading jpg files gives different results in OpenCV 3.4.0 and 3.4.2 … and yes, the versions are different: 3.4.4 on the ROS notebook, 3.4.0 on the simulation machine with GPU, and 3.3.0 on my old notebook …

14th February 2019 — Return strategy ver1

Yesterday we tried to complete the Virtual Track with two X2 robots returning after detection of the first artifact. If I ignore a typo detail, both robots returned and sent these reports:
TYPE_VALVE 231.74 -60.89 -14.80
TYPE_EXTINGUISHER 157.37 138.87 -15.09
But the Base Station did not accept them for some reason, so it is still a work in progress, again.
Last night we also discussed a faster return strategy. The current version 0 simply follows the wall back for the given distance traveled. This means that if there is a dead-end branch, the robot will go into it and repeat the same trajectory.
In version 1 we plan to use shortcuts: there must be a pose recorded on the way in (if not, you are already at the start), so pick the nearest/oldest one within a given radius (say 2 meters). The odometry corrected by the IMU should be good enough (it must be, for the basic mapping of the artifacts), so it can help the robot navigate back. Moreover, we may use VFH (Vector Field Histogram) navigation to avoid potential drift, and the old poses would define the direction HOME along the shortest path, right?
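A sketch of the pose-picking step under those assumptions; trail is a plain list of (x, y) poses recorded on the way in, oldest first (the names are ours, nothing here is existing OSGAR code):

import math

def pick_return_target(trail, position, radius=2.0):
    """Return the index and pose of the oldest recorded pose within
    `radius` of the current position; heading for it skips any
    dead-end loop stored later in the trail."""
    x, y = position
    for i, (px, py) in enumerate(trail):
        if math.hypot(px - x, py - y) <= radius:
            return i, (px, py)
    return None  # no recorded pose nearby

Repeating this while trimming the trail to the poses before the picked index should walk the robot HOME along the shortest recorded path, with VFH handling local obstacle avoidance.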
Do you like it? Do you see some weak point? Well, do not worry, we will share it with you as we „hit the wall” …
p.s. note that this strategy can be used for both the Virtual and the System Track