A four-wheeler for the SubT Challenge
Freyja is a four-wheel differentially driven robot that turns by skidding.
It is the work of the Swiss Cogito Team, one of the "organisations"
involved in the SubT Challenge Tunnel Circuit. Freyja grew from nothing in three
months, built by a single person, which makes it almost unbelievable that it can be
done … so it would be a pity not to document it!
It is necessary to keep in mind the very limited time frame in which Freyja was created:
the decision was made after our return from the SubT
STIX event in Idaho Springs, Colorado,
in April, and by the end of June it had to
qualify (demonstrate an autonomous ride, show that the manual and remote
EMERGENCY STOP work, and demonstrate passage through a narrow space). The competition itself begins
on 15 August 2019.
Freyja: as of 27.5.
Freyja was born. Its weight without batteries is 25 kg. A little can be shaved off by
removing some electronics (PC, stereo camera) and placing them into a backpack (see attachment).
The design is mostly complete. A few components, such as the SICK lidars, are missing, but
they are not required for homologation. The cabling is not ready. Batteries have not been procured yet.
The good news is that it can withstand the weight of a person. The bad news is that
the combination of the selected tires with skid-steering was not thought through. So now
I am afraid not just of the robot ripping itself apart, but also of turning. I am almost tempted to redesign it
as an articulated robot like Kloubák,
but given the time constraints there is no real risk of that.
Also, it wouldn't fit into my luggage.
Zbyněk: Wow. That's pretty impressive! Have you sent the link where you bought its
PavelS: It is a pity that no one has learned from Robík -
I wrote and spoke a lot about the problems with skid-steering,
and most of the participants saw them with their own eyes ...
"Nobody learned" is a bit of an extreme statement. I took note of it and made design changes, for example the wheels are rounder in the transverse direction.
The frame is designed for excess strength. Hence the fear of tearing. And I regret that due to the luggage contraints the robot can't be wider to have better grip.
On the other hand, despite repeated attempts to find out what exactly the
problems are, I was only able to learn that "I would not design a
skid-steering robot again", that "Eduro had no chance because of
the caster", and other advice on what NOT to build. Just yesterday: "I wouldn't
(yet) duplicate Kloubák". I can combine all this knowledge with my own
experience, for example that designing a suspension for a heavy robot is not
easy. In such circumstances I have two options: do nothing, or take a guess and
choose a variant that may be functional and that I may be able to build.
After all, I drove both the "mob" and the six-wheeler, so I have some experience with skid-steering robots. Which is why my previous designs didn't involve skid-steering.
If I guessed wrong, "do nothing" may still very well be the path I ultimately take.
I'll live with it.
MD: Well, this is starting to look like a flame war ... :(
As far as I am concerned, Robík was the only robot that was able to navigate more complex terrain -
and that is still true. Further, it is still the "simplest design" - I would stick with that assessment too. Other teams chose this design as well, so it is not completely misplaced (CMU, Colorado / Husky, now
CRAS). Even after a month, Kloubák does not drive in a straight line, and it is quite possible that it will end up being rebuilt into another Robík.
I disagree with PavelS (we have heard it many times, but in practice it
used to work better) ... and for the reasons I saw back then I would choose it again (see my
older emails proposing to buy it). I will change my mind once someone builds an alternative platform that can handle the terrain in the tunnel within 30 days.
So for the time being I have no problems, just fears, which makes the problem hard to describe.
One concern is that at this point the grip on the smooth floor is so good that no skidding
will take place. Not at all.
The best way to describe it would probably be a video, shot from the perspective of an outside observer, of
a situation where a skid-steering robot is in trouble. Otherwise, as he says, it is quite a vague description.
A somewhat agitated debate where we clarify our views and assumptions is useful. If we were only to scream at each other and repeat the same thing over and over again,
then there would be no point.
At the moment I see it pretty much like Martin — it was the only platform
which had some achievements and has a non-zero chance of repeating them. And when tanks use this steering method,
it must be possible to make it work. I don't have time for inventions. At the same time I am aware of the risks, and
therefore my participation is conditioned on the robot being able to "run a meter and then turn". If the robot can do this,
it will be able to do the other things too. If not, it was fun and I have a lot of new parts for my old …
Freyja: as of 2.6.
The lights are on, the wheels are spinning (in the air). I.e. the basic cabling is complete. Including the …
Scheduled for next week: driving it manually with a joystick. And if I am really lucky, getting the remote stop to work.
MD: When you finally place the robot on the ground (I am sure you've done so already), could you just shoot a video with the STOP button - manual and remote (Ctrl + C)?
If you upload it as unlisted with a title like "Robotika / Freyja SubT Tunnel
Circuit Qualification", I can forward it directly.
No, I haven't put it on the ground yet; it is too menacing for that. But driving it with the joystick while it is in the air seems to work. It is a bit disappointing that I can't (yet) feed the controllers faster than 20 Hz. The pleasant surprise is that ODrive (already?) supports a
watchdog, so once I implement it on my side, the motors should stop maintaining their speed after CTRL + C is pressed.
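The watchdog idea is simple to state; below is a generic sketch of the pattern in pure Python (an illustration only, not the ODrive firmware or its API):

```python
import time

class Watchdog:
    """Stop the motors when no command arrives within `timeout` seconds.

    A generic sketch of the pattern a motor controller watchdog implements:
    the control loop must keep "feeding" the dog, otherwise the motors
    should stop holding their commanded speed.
    """
    def __init__(self, timeout=0.5, clock=time.monotonic):
        self.timeout = timeout
        self.clock = clock
        self.last_feed = clock()

    def feed(self):
        # Called every time a fresh velocity command is sent.
        self.last_feed = self.clock()

    def expired(self):
        # True once the sender has gone silent for longer than `timeout`.
        return self.clock() - self.last_feed > self.timeout
```

In the drive loop, every velocity command also calls `feed()`; a separate check stops the motors whenever `expired()` becomes true, e.g. after CTRL + C kills the command sender.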
To qualify, do you really just need the red button and Ctrl + C, or do you need
the remote e-stop as well? I assume that joydrive is acceptable for homologation,
given that some of the teams do nothing else in the tunnel. Do we need to pretend object detection?
MD: No need to pretend to detect an object for sure (we've already done that at STIX,
and once per team is enough). The e-stop is debatable, but I will tell them that we use
unified SW, and if they insist, they can send two samples to Switzerland
(I suppose they won't insist after that). From what Zbyněk and I discussed, the kill switch is probably primarily for drones and "dangerous monsters", and Freyja
looks friendly. But the earlier I send them the "homologation videos", the sooner
I will be sure that they won't bother us over the e-stop.
Freyja - first videos (milestone set) (7/6/2019)
Martin, tell me if the first video suits you.
The second one shows that on four-wheel vehicles it will probably be better to measure odometry from at least one
encoder per side instead of from the average of all of them. This seems obvious, but it is implemented wrongly in the code. The PID is not tuned and the current limit is low, so I wouldn't draw big conclusions from the second video.
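The one-encoder-per-side suggestion amounts to standard differential-drive odometry; here is a minimal sketch (a hypothetical helper, not the actual Freyja code):

```python
import math

def update_pose(x, y, heading, d_left, d_right, track_width):
    """One differential-drive odometry step.

    d_left / d_right: distance travelled by the left / right side since the
    last update, e.g. from a single encoder per side rather than the average
    of all wheels. track_width: distance between left and right wheel lines.
    """
    d = (d_left + d_right) / 2.0               # forward distance of the center
    dtheta = (d_right - d_left) / track_width  # heading change in radians
    # Integrate along the arc using the mid-point heading.
    x += d * math.cos(heading + dtheta / 2.0)
    y += d * math.sin(heading + dtheta / 2.0)
    return x, y, heading + dtheta
```

Driving straight gives a pure translation; opposite wheel distances give a turn in place.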
I even saw it make a turn (milestone reached!), but because of the rubber and the weak PID it
was either a very uncontrolled slide on the smooth floor, or a really wide turn
outdoors on asphalt. I also rode on it for a meter.
MD: :) I was just startled by the darkness in the first half of the first video -
the lights go off when you press the STOP button. Hopefully they will understand that the second STOP
is from the keyboard (you may have commented on it; I watched it without sound). I will send it and
we will see.
Good idea, but I did not comment on it. "Remote stop" is in the caption below the video.
PavelJ: Nice :-) I just can't believe it's only 45 cm wide. It appears more massive.
46 cm is the more accurate number. But almost half of that is wheels, which is why
it looks a little wild. The body itself is narrow; 24 cm could be the width without the wheels, right?
And now I'm going to reinforce the wheel attachments, because they came loose :-/
Maybe this is my "low week". Paradoxically so, because the development is going exactly
as expected :-) No miracles are happening.
Freyja: as of 16.6.
Driving with the joystick works, including logging data from the cameras and stereo cameras.
USB is reaching the limit of its capabilities, or has just crossed it. Each device requests a
certain transmission bandwidth, and each of them asks for the upper bound of what it may need. With four
cameras, two stereo cameras, a joystick and the wifi module, it all adds up, and USB then refuses to
cooperate. The Kayeton cameras
simply request the maximum bandwidth available, which is the final straw :-/
The fix is a custom fork of the uvcvideo driver which, for cameras with a compressed stream (i.e. the
Kayetons), makes its own estimate of the data transferred, whereas for uncompressed streams
(the MYNT EYE stereo camera) it respects the value reported by the device. I also attached the
joystick and the wifi in various ways to make sure they end up on a different USB bus.
I haven't yet connected the IMU and Franta's LoRa module.
They also attach via USB :-/
Possible future solutions:
- Get rid of the USB wifi dongle and use a wifi router instead of the current dumb Ethernet switch. Or try to get a virtual wifi interface going on the integrated wifi module.
- Maybe add a Raspberry Pi or something similar as a USB proxy, but I don't really want to, among other things because it would not fit.
- Reduce the camera frame rate from the current 30 Hz to 15 Hz.
Another unfortunate consequence:
it looks like the USB controller heats up so much that even the CPU next to it complains of
high temperature. And I haven't started any processing yet.
I log about 3 GB per minute. I.e. the disk I use can store only data from two hours of driving, and even the largest disk I have lying around does not hold impressively much more.
MD: good god! :).
3 GB/min - do you have some "--stat" showing how much each process uses up? I assume this is
without lidar, right? 4 cameras, 2 stereo cameras?
Excellent question. I added logging of statistics to the logviewer. It looks like roughly
half of the data is the depth map from the front stereo camera. So when I include the rear one, I'm screwed. The good news is that the depth is currently stored in
meters as floats. When I change it to centimeters or millimeters and protobuf saves it as a varint, it should shrink drastically (to about half?), at the price of more
demanding serialization / deserialization. The bad news is that the rear stereo camera
has almost twice the framerate. Maybe we need to look for compression
algorithms for depth maps.
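The meters-to-millimeters saving is easy to quantify with a small numpy experiment on synthetic data (not the actual Freyja logs), ignoring the extra varint gain protobuf would add on top:

```python
import numpy as np

# A synthetic 640x480 depth map in meters (float32), values between 0.5 and 5 m.
rng = np.random.default_rng(0)
depth_m = rng.uniform(0.5, 5.0, size=(480, 640)).astype(np.float32)

# Store millimeters in uint16 instead: 2 bytes per sample instead of 4,
# range up to ~65 m, quantization error bounded by 0.5 mm.
depth_mm = np.round(depth_m * 1000.0).astype(np.uint16)

print(depth_m.nbytes)   # 1228800 bytes
print(depth_mm.nbytes)  # 614400 bytes, i.e. exactly half

# Round-trip error stays within the quantization step.
err = float(np.abs(depth_mm.astype(np.float32) / 1000.0 - depth_m).max())
```

With protobuf varints there is a further gain for small values: centimeter values below 128 would fit in a single byte each.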
Four minute log (hm, so it's more than 3 GB/min outdoors):
.configuration: 1.83 kB
camera_back.jpeg: 1.34 GB
camera_front.jpeg: 1.57 GB
camera_left.jpeg: 1.54 GB
camera_right.jpeg: 1.32 GB
freyja.buttons: 163.86 kB
freyja.pose2d_info: 78.62 kB
freyja.velocity2d_info: 69.56 kB
heartbeat.heart_beat: 154.78 kB
joydrive.beep: 217.01 kB
joydrive.velocity2d_cmd: 239.72 kB
joystick.joystick: 683.92 kB
selfcheck.LoggingInfo: 392 B
stereo_back.jpeg: 1.04 GB
stereo_front.depth_map: 6.20 GB
stereo_front.jpeg: 545.89 MB
Is it zipped output, or an unpacked array? (Working with the Husky and on the virtual track, I couldn't survive handling the depth maps without compression.) :(
PS: I assume you use your own code for all the logging, right?
Unpacked array. I wouldn't expect much from zip compression - you found it helpful? If I store integers and each subsequent value is stored as a difference from the previous one, then thanks to the high "flatness" there could be only a small number of
distinct values and combinations, and zip compression could then work well.
And yes, it is all protocol buffers.
Zbyněk: Yeah, but zipping it isn't a good idea CPU-wise. The only thing that
worked somewhat reasonably was lz4. Unless you have an unused core that
could take care of it.
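The "store differences" idea is easy to test; a sketch on synthetic flat data (illustrative only, real depth rows will differ):

```python
import gzip

import numpy as np

rng = np.random.default_rng(1)
# A smooth ramp plus small noise, in millimeters: neighboring values differ
# only slightly, mimicking the "flatness" of depth data.
depth = (np.linspace(1000, 3000, 100_000)
         + rng.normal(0, 2, 100_000)).astype(np.int32)

# Delta encoding: keep the first value, then store differences to the
# previous one (the first delta is 0 because we prepend depth[0]).
delta = np.diff(depth, prepend=depth[:1]).astype(np.int32)

gz_raw = gzip.compress(depth.tobytes(), 9)
gz_delta = gzip.compress(delta.tobytes(), 9)
# The delta stream consists mostly of tiny values and compresses much better.
```

Decoding is a cumulative sum: `depth[0] + np.cumsum(delta)` restores the original array.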
$ ls -lh /tmp/depth*
-rw-r--r-- 1 jirka jirka 6.2G Jun 17 18:57 /tmp/depth.log
-rw-r--r-- 1 jirka jirka 898M Jun 17 19:11 /tmp/depth.log.gz
That is with gzip -9.
So there would be room for improvement. Regarding CPU, it may be worth considering
logging the data uncompressed and packing it afterwards.
And logging in centimeters makes it even smaller:
$ ls -lh /tmp/depth_cm.log*
-rw-r--r-- 1 jirka jirka 2.0G Jun 17 20:12 /tmp/depth_cm.log
-rw-r--r-- 1 jirka jirka 522M Jun 17 20:32 /tmp/depth_cm.log.gz
Freyja: as of 24.6.
I was slowed down by a cold. And my daughter keeps throwing up. And on top of all that, we have some 60 people coming to visit us at work from all over the world :-/
In theory, I have programmed driving along the right wall based on the stereo camera. That should be sufficient for homologation.
In practice, it has not driven more than three meters on the balcony, where there is not much space.
At home, the stereo camera has problems with the poorly textured, highly reflective floor. It is possible that the test polygon (i.e. the basement) will be similar.
MD: Do you have a lidar involved? Wouldn't that be easier? OSGAR on Freyja is probably still
not available, right?
The lidar is not involved and will not be by the end of the week. And yes, it would make driving along the wall easier.
SubT Qualification - Robotika Robots Freyja and Kloubak (28/6/2019)
Thanks for sending, Martin. We'll go ahead and accept these videos as
sufficient for qualifying the two additional platforms. We would still be
interested in seeing any other videos, but nothing else is required for
bringing your platforms to the Tunnel Circuit and competing with them.
Martin, can you send them a new video?
@ "piece of cake": You don't even know how I froze when the next turn would not succeed
Freyja: as of 30.6.
What was the problem:
- Sometimes (I suspect high CPU or motherboard temperature) the ODrive does not start and the log contains "not enough bandwidth" messages. I filmed the homologation videos with the cameras off.
- Even when everything runs (including the cameras), the drive seems sluggish.
- Driving around a corner without a map is tricky, because the stereo camera looks only forward and cannot see around the corner.
- The compass sometimes jumps sixty degrees at a time. For now I ignore such jumps, but they should not occur.
What worked nice:
- Freyja's forceful turns no longer loosen the wheel mounts.
What I have planned for the next week:
- Not much. My parents are visiting.
- If I find some time, I will be either solving some of the known problems or I will look into the suitability of the Jupyter Notebook as an environment for the operator. Running Python on a robot and accessing it from a browser seems like an interesting option.
Freyja: as of 7.7.
- The SICKs are attached and connected. They're producing (somewhat strange) data.
- I added a 1.8 TB hard disk formatted as ZFS with compression. An estimated 2.5-3.5 TB of logs (~13-19 hours of driving) should fit on it.
- The SSD is so fast that the ZFS driver can't keep up with the data inflow, crashes and takes the system down with it. There were no error messages I could use :-/ The solution was to switch ZFS from asynchronous to synchronous mode. I don't know why that helps. Lucky guess.
- I can change a SICK's IP address even without the magic applications.
- SICK SOPAS and AppStudio for Linux do not exist and do not run under Wine. Without AppStudio, I cannot properly package their Lua scripts. I don't plan to worry about this.
- The measured distances from both SICKs somehow "pulsate": the whole scan oscillates about ten centimeters back and forth. If you've seen this before, let me know what to do about it.
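If the pulsation is roughly symmetric around the true distance, a per-beam temporal median may tame it; a sketch assuming the scan arrives as a flat list of distances (a hypothetical helper, not Freyja's pipeline):

```python
from collections import deque

import numpy as np

class ScanSmoother:
    """Per-beam median over the last `n` lidar scans.

    Suppresses a back-and-forth pulsation of the measured distances.
    Only valid while the robot moves slowly relative to the scan rate;
    otherwise the history mixes different geometry and smears obstacles.
    """
    def __init__(self, n=5):
        self.history = deque(maxlen=n)

    def update(self, scan):
        """scan: sequence of distances, one per beam; returns smoothed scan."""
        self.history.append(np.asarray(scan, dtype=float))
        return np.median(np.stack(self.history), axis=0)
```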
Remains to be added:
- Critical hw: Remote shutdown.
- Noncritical hw:
- Better side boxes with on/off switches, an emergency stop button, cameras and lights. The cameras and lights stick out and get scraped when the robot is carried around or bumps into walls.
- Lora. Although I still don't know what we're going to do with it.
- Sw: Everything.
- Organization: plane tickets, batteries for USA.
Franta: Re LoRa - of course it could be used for some coordinated
exploration (swarm, etc.), but even without that, the robots should at least
broadcast found artifacts plus position, speed, orientation, etc., and be on the receiving end of commands like
stop, turn, return to base, go to x/y coordinates.
Sounds nice. But it has a few weak spots:
- We don't have artifact detection, and we're only just talking about setting up a coordinate system, so there is nothing to report.
- The boxes are with you, so no one can integrate them into either Osgar or Erro, so there is nothing to report *with*.
- We don't have a control station and nobody is programming one, so there is nowhere to report *to*.
- The operator doesn't know the robot's status, so *why* would he call it back?
- *Who* will program all this, and *when*?
If there is a plan for these things, I want to hear it; I am not aware of one. If it is
a long-term vision rather than a short-term plan for Pittsburgh, that's different: then
I would postpone this whole discussion until after Pittsburgh and resolve it properly.
Freyja can drive through a door (open)
A lot of sensors are not yet integrated, so a big part of the debugging
video is black. Once I start using the IMU, I should be able to detect obstacles
closer to the ground in stereo. I also removed the side lamps because they were in the way;
I want to redo them. And I will have to shade the front and rear lights to keep them from shining
directly into the cameras.
However, I expected that my first big robot to successfully pass through a door
would be Clementine. I never managed it with Eduro, if I remember correctly.
PS: It can probably drive through closed doors too
Freyja: as of 14.7.
- I redesigned the side camera mounts and the lights so they don't get damaged so easily.
- I replaced the plastic spacer screws with metal ones, which should prevent cross-threading.
- I replaced the board under the PC whose corner I broke off in the middle of the call on Wednesday.
I'm starting to put the SW together:
- The local planner over lidar data (no mapping) works. See the video "Passing through the Door" from this week. The stereo cameras are not yet integrated into it.
- Global localization: I have a prototype with MRPT rbpf-slam (https://www.mrpt.org/). It's basically what Pavel tried, but without ROS. Again, only over the lidar data, and (so far) only in 2D. It is quite irritating that even after ten years of attempts I cannot write my own custom SLAM :-(
Catching up on organizational things:
Freyja: as of 21.7. The Good, The Bad, The Ugly
- SLAM is integrated. If the robot feels like it, it's not too far away, and there is a full moon, the robot returns back to the start.
- I wrote off one SICK. Freyja *can* climb a wall, but it can't *stay* there :-/
- For some reason the relatively trivial processing of lidar data slows down and messages pile up. When not driving, it can handle almost 10k messages per second; during
a drive, data from the 15 FPS lidar sometimes pile up. The first suspect was CPU
throttling due to overheating, but after increasing the speed of the cooling fan it no longer overheats and the problem persists. Another suspect is the Linux task scheduler, but
according to "perf sched record" et al. everything is fine and nothing is throttled.
According to htop the CPU load is around 20%. The accumulation of messages results in driving based on data that is several seconds old, so the robot gets lost and crashes into walls. If it doesn't climb them. I'm starting to feel hopeless.
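A common mitigation for piled-up messages is a one-slot queue that keeps only the newest item, so a slow consumer always sees fresh data; a sketch of the pattern (not OSGAR's actual mechanism):

```python
import queue

class LatestOnly:
    """A one-slot channel: the consumer always receives the newest message.

    Old lidar scans are dropped instead of accumulating, so the driver never
    acts on data that is several seconds old. Assumes a single producer.
    """
    def __init__(self):
        self.q = queue.Queue(maxsize=1)

    def put(self, msg):
        while True:
            try:
                self.q.put_nowait(msg)
                return
            except queue.Full:
                try:
                    self.q.get_nowait()  # drop the stale message
                except queue.Empty:
                    pass  # consumer won the race; retry the put

    def get(self, timeout=None):
        return self.q.get(timeout=timeout)
```

The trade-off: intermediate messages are lost, which is fine for "latest state" streams like lidar scans or poses, but wrong for command or event streams.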
So I managed to link the slow processing to high CPU temperature (>= 75
°C). The trouble is that it gets this warm even if I leave the robot open, so
it's not due to insufficient ventilation. I probably need to install liquid nitrogen tanks.
MD: 1) How do you read the SICK data - do you use sleep to get to 15 Hz,
or did you force it into "stream" mode? By default, it is happy to send you the same data 10 times.
Yes, I'm streaming. So unless I misremember, or measured something else,
it runs at 15 Hz. I can check … Confirmed.
MD: 2) Do you plan to temporarily replace the TIM with your old one (TIM551?)
Yes, my TIM551 is now screwed in, and I carefully run after the robot …
I needed such an uncle sooner. And there were times, not so long ago, when I couldn't write such an email :-/
Context for others: a SICK replacement is on its way.
The problem was / is quite complicated. A little like an airplane disaster:
1) The computer overheats and the CPU responds by slowing down.
2) With the individual modules slowed down, messages begin to accumulate in various places instead of reaching their destinations.
3) A watchdog runs on the ODrive: if it does not receive a control signal for a while, it stops.
4) Such a stop can only be recovered from by rebooting the ODrive, which the code attempts.
5) A reboot of the ODrive takes about 20 seconds. (It takes a long time for the ODrive and the PC to negotiate the current protocol and to set up all the RPCs.)
6) In the case of a reboot after a watchdog stop, I unfortunately managed to trigger a blocking call that waits until the ODrive wakes up.
7) During the blocking call, more messages accumulated.
8) After the ODrive restarted, there was half a minute's worth of buffered control messages.
9) A lot of confused commands followed, because they had been calculated from delayed measurements.
10) The ODrive executed them and rushed off to climb a wall.
11) That 20 s wait, together with a console message claiming the robot had already returned to base, was the reason why I did not respond fast enough.
PavelS: Hi, just an idea - heatpipe cooling could be more effective, and
the radiator could be placed outside the cooled compartment.
I was wondering whether anyone would respond to that nonsense of mine (or not). Thanks.
First, I will try software solutions, such as setting the BIOS fan speed to
maximum, and possibly disabling Turbo Boost. Intel P-state, the Linux
control of the CPU frequency, demonstrates the typical properties of Intel software: it is a
piece of crap. I'll try to reconfigure it, turn it off, or replace it. For example,
when I switch from "performance" to "ondemand", it takes *much* longer for the CPU to overheat again: "ondemand" holds it just below 75 °C much more aggressively, without
anything freezing. I.e. in "performance" mode I get worse performance. Ahem.
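For reference, the usual knobs live in sysfs; a config-style sketch (exact paths depend on the kernel and the cpufreq driver in use, and with intel_pstate the governors are typically only "performance" and "powersave", so an "ondemand" governor suggests the acpi-cpufreq driver):

```shell
# Run as root. Paths assume a standard Linux cpufreq setup.

# Disable Intel Turbo Boost with the intel_pstate driver:
echo 1 > /sys/devices/system/cpu/intel_pstate/no_turbo

# With the acpi-cpufreq driver, turbo maps to the "boost" knob instead:
# echo 0 > /sys/devices/system/cpu/cpufreq/boost

# Switch the governor on every core (e.g. to "ondemand"):
for g in /sys/devices/system/cpu/cpu*/cpufreq/scaling_governor; do
    echo ondemand > "$g"
done

# Inspect the result:
cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor
grep "cpu MHz" /proc/cpuinfo
```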
Zbyněk: Yeah, turning off Turbo Boost also occurred to me; if you figure out how to do it,
let us know. A while back I was looking for a case that acts as a
cooler, with the processor connected by a heatpipe - basically what
Pavel suggested. At work I used to use such a large CPU cooler to cool the warm side of a Peltier element, and it comfortably dissipated over 100 W.
Freyja: as of 28.7. (Own SLAM 5/5)
Overheating problems resolved: I turned off Turbo Boost in the BIOS, turned the fans up to
100%, and I significantly subsample the data from the Kayeton cameras.
RBPF SLAM a) took a long time to think and b) got lost. ICP SLAM behaves
better, but it also gets lost. So finally, with the help of Ceres, I wrote my own hybrid
between ICP and GraphSLAM, using the ORB-SLAM trick (keyframes). It is written
with real-time requirements in mind and is customized for the specific task (the rotation information
is absolute thanks to the compass; a "closed loop" is usually a special case,
because the robot returns home following its own tracks). It should also make it easier
to integrate detected AprilTags.
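In 2D, the keyframe trick reduces to adding a new node to the pose graph only after enough motion; a minimal sketch with made-up thresholds (not the actual SLAM code):

```python
import math

def is_new_keyframe(last_kf, pose, min_dist=0.5, min_turn=math.radians(20)):
    """Decide whether to add `pose` as a new keyframe.

    Poses are (x, y, heading) tuples. A new keyframe is created only when
    the robot has translated more than `min_dist` meters or turned more than
    `min_turn` radians since the last keyframe, which keeps scan matching
    and graph optimization bounded for real-time use.
    """
    dx = pose[0] - last_kf[0]
    dy = pose[1] - last_kf[1]
    # Shortest angular distance, since headings wrap around.
    dturn = math.atan2(math.sin(pose[2] - last_kf[2]),
                       math.cos(pose[2] - last_kf[2]))
    return math.hypot(dx, dy) > min_dist or abs(dturn) > min_turn
```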
The result is five successful returns home in five consecutive trips.
Of those five rides, it came back three times butt-first and twice front-first. Long live symmetrical robots.
Also, in almost two hours of testing, the robot only lightly touched a wall twice. No
wild bumps. This is new, and it means that for now it is processing the data fast enough.
A new SICK has arrived, but I'm still driving with my own old one.
Concerns for the future: i) There is little time and a lot of work to do. ii) I have no representative
test environment. iii) SLAM may not work with the same parameters in different
environments. iv) I obviously operate close to the thermal / computing / power limits,
and I am still missing one stereo camera and artifact detection from the four cameras. v) In a narrow space or in a corner, Freyja can get into a situation where it is afraid to go
both forward and backward.
MD: :) congratulations :) 5/5 good enough!
Another gotcha scenario I can think of: can you handle a sloping area / garage ramp? And what about obstacles that do not show up on the lidar?
There is nothing better than a regular Freyja update --- it always gives me hope :)
Well, I got depressed in the middle of the week. The PC overheated and resisted my attempts to fix it, the messages
were delayed and it wasn't possible to debug, SLAM didn't work, and I knew that
writing a functional one myself was something I had never managed to do.
And you are right, "it's 2D" should be on the list of risks. In addition to the whole inclined-surface problem, I would also include "the robot drove one wheel onto a brick". That's what I meant by "I don't have a representative test space." However, I do have a garage ramp; I could try that.
Freyja: as of 4.8.
- Freyja got its suitcase (see appendix).
- Starting / stopping the program and displaying and replaying data in the browser
work reasonably well (see the next appendix). Control via the browser is not finished yet.
Lesson learned: ipywidgets do not support mouse clicks on an image, and I did not manage to make ipyevents work.
- The soldered power cord fell off one of the encoders. It is in an unfortunate place, which made the repair difficult. The repair is done. I would hate to see something like this happen during the competition.
- I am getting message delays again. The current main suspect is Erro itself,
in the place where I store logs. If I don't flush after every message, it gets better.
If I don't log at all, it gets even better. Not logging to a compressed
file system also helps. But I haven't found a solution that works to my complete satisfaction.