Field Robot 2025
Milano, Italy
Field Robot Event 2025 (FRE2025) will take place in Milano, Italy, from June 9 to June 12, 2025, at Agriturismo da Pippo. This competition is open to universities worldwide, offering a unique opportunity to push the boundaries of autonomous farming and agricultural robotics. Blog update: 28/04/2025 — CULS Robotics Team Blog (week14)
Official announcement
Competition Challenges FRE2025:
- 🚜 Task 1: Autonomous Maize Field Navigation
- 🌽 Task 2: Autonomous Maize Field Navigation with Cobs Detection
- 🍎 Task 3: Fruit Mapping
- 🍄 Task 4: Bioluminescent Fungi Discovery
- ✨ Task 5: Freestyle
💡 Want to compete? Registration opens on March 3, 2025, and closes on April 25, 2025. The registration fee per team will be announced at the opening of registration.
🤖 Seeking sponsorship? New teams can apply for sponsorship to support their participation, with funding up to €1500 per team by CLAAS-Stiftung. More details and application link: https://www.claas-foundation.com/students/field-robot-event/application.
📢 Have questions about the rules? Join our FRE2025 Discord server now to get updates, connect with teams, and participate in the Q&A discussion: https://discord.gg/56unzZVBqn.
Read the rules now!
🌐 Website Update: Further event details will soon be available on our website: https://fieldrobot.nl/event/
📍 Getting here: The event venue is 12 km from Milano Linate Airport and 17 km from Milano Central Station (Google Maps).
CULS Robotics Team Blog (CRT-B)
CULS Robotics is a new Czech student team for Field Robot 2025! Yes, it took only 10 years to find a group of excited students and "start again"!
The team of three students was founded at the Czech University of Life Sciences Prague. The members are all in the last year of their studies, so their priorities collide a bit with
details like finishing diploma theses or final exams … but their self-motivation is still strong, so hopefully we will see them all competing in Italy in June.
CRT will compete with the robot Matty M02 (the green one); the red one is a backup or will be used for the freestyle discipline. The robot is equipped only with
an OAK-D Pro camera and is programmed in Python using the OSGAR framework.
Team members:
- Simona (team lead)
- Ondra (software)
- Franta (hardware)
Content
- w18 - Matty GO!
- w17 - Fake it until you make it (matty-go-cam.json, GitHub, Rules 2.0)
- w16 - NumPy and obstacle detection
- w15 - Camera holder upgrade, osgar.replay, NumPy mask
- w14 - Row navigation
Week 18 (18/2) — Matty GO!
This was the very first time we met with Franta and Ondra at the university. It was great that they already had some basic experience with OSGAR and
had the development environment prepared. The only remaining detail was updating the Matty driver to the latest version.
Two robots for two students looked like a good match.
Unfortunately, Franta's Win11 did not recognize the ESP USB-serial adapter, so only Ondra could connect his laptop directly
to the robot via USB. Nevertheless, the goal was to use the onboard ODroid, and for that PuTTY was enough.
The first lesson learned: try screen -dr first, in case somebody is already using the robot. Otherwise:

```
screen
workon osgar
cd git/osgar
python -m osgar.record --duration 10 config/matty-go.json --params app.dist=0.5 app.steering_deg=-20
```

… and this will drive for 0.5 m with steering 20 degrees to the right.
Week 17 (25/2) — Fake it until you make it (matty-go-cam.json, github, Rules 2.0)
The second robotics round was again together with Franta and Ondra. Franta welded (!) a new holder for the OAK-D camera:
… so it already made sense to collect some basic data. Ondra combined the OSGAR configuration files for driving and for collecting the
depth and image streams from the OAK-D camera. In the end he also managed to push his changes to GitHub, and thus resolved the
issues with keys etc. Here you can see the proof: matty-go-cam.json.
Finally, Simona shared with us the latest rules, FRE2025_RULES_v2_0.pdf. It is kind of funny that even now (2025-04-11) the old version v1.0 is still on the official website … (and the location there is still Germany rather than Italy).
Week 16 (4/3) — NumPy and obstacle detection
This time the whole team finally met: Ondra, Franta and Simona. The goal was to get started with OSGAR programming (see the corresponding Czech notes
here). The plan was to stop the robot in case there is an obstacle in front of it. The input is depth data from the OAK-D camera,
available as a 2D array of uint16 (unsigned 16-bit integers) corresponding to distances in millimeters.
Step one: pick the distance in the middle of the image (a single pixel) and stop if it is smaller than 330 (I do not remember where they picked that value).
It turned out that 0 is often present, but it actually means I DO NOT KNOW, i.e. the distance is not valid. What to do in such a case? If the previous
measurement was a long distance, then it is probably OK to continue moving. If it was a short distance, it would be safer to stop. And if you have been receiving
zeros for a long time, then you do not know anything and it would also be better to stop. Yes, such complex logic for a single pixel reading!
This week's Python lesson was that division with two / characters means integer division, which is needed for addressing arrays. So the value
of interest was depth[400//2][640//2].
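The single-pixel logic described above could be sketched roughly like this (the 330 mm threshold and the zero-handling rules come from the text; the function name, the returned state, and the MAX_UNKNOWN limit are hypothetical):

```python
import numpy as np

STOP_MM = 330        # stop threshold in millimeters (value from the text)
MAX_UNKNOWN = 10     # hypothetical: stop after this many consecutive zeros

def should_stop(depth, last_valid_mm, unknown_count):
    """Decide whether to stop based on the center pixel of a 400x640 depth image.

    Returns (stop, last_valid_mm, unknown_count) so the caller can keep state
    between frames.
    """
    # integer division (//) is required because array indices must be integers
    center = int(depth[400 // 2][640 // 2])
    if center == 0:
        # 0 means "distance unknown" in the OAK-D depth stream
        unknown_count += 1
        if unknown_count >= MAX_UNKNOWN:
            return True, last_valid_mm, unknown_count  # blind too long -> stop
        # otherwise trust the previous valid measurement
        return last_valid_mm <= STOP_MM, last_valid_mm, unknown_count
    return center <= STOP_MM, center, 0

depth = np.full((400, 640), 1500, dtype=np.uint16)  # fake frame, all 1.5 m
stop, last, cnt = should_stop(depth, last_valid_mm=1500, unknown_count=0)
print(stop)  # False - the obstacle is far away
```

The state passing is one possible way to remember the last valid reading; on the real robot this would live in the application object.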
Week 15 (11/3) — Camera holder upgrade, osgar.replay, NumPy mask
The most visible part this week is a new camera holder. It is 3D printed, allows two degrees of freedom, and the
camera can be tilted and/or shifted:
Note that there are also serious changes inside: a properly mounted ODrive, batteries etc.
Coding was this time on Ondra, so he learned how to use python -m osgar.replay --module app for debugging and for checking that identical code was
running on the robot and on his notebook. Once there is a new feature/fix, you can use this replay utility for syntax checking with the parameter -F (force), which
will replay the logfile without checking the output values.
The single pixel in the center of the depth image (data[400//2][640//2]) was replaced by a rectangle. There the zeros (unknown distance)
were not as frequent, but they still had to be handled. This was the introduction of NumPy masks:
```
>>> import numpy as np
>>> a = np.array([[0, 1, 2], [3, 4, 5]])
>>> a
array([[0, 1, 2],
       [3, 4, 5]])
>>> a.shape
(2, 3)
>>> a == 0
array([[ True, False, False],
       [False, False, False]])
>>> a[1]
array([3, 4, 5])
>>> a[1][1:]
array([4, 5])
>>> a[1][1:3]
array([4, 5])
>>> a[1][1:2]
array([4])
```
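Applied to the depth data, the same masking idea can drop the unknown zeros inside a center rectangle before taking the minimum. A minimal sketch, with made-up rectangle bounds:

```python
import numpy as np

def min_valid_distance(depth, r0=180, r1=220, c0=300, c1=340):
    """Minimum valid (non-zero) depth inside a rectangle, or None if all zeros.

    The rectangle bounds are illustrative, not the team's actual values.
    """
    rect = depth[r0:r1, c0:c1]   # comma indexing selects a 2D sub-block
    valid = rect[rect != 0]      # boolean mask drops the unknown zeros
    if valid.size == 0:          # everything unknown -> no answer
        return None
    return int(valid.min())

depth = np.zeros((400, 640), dtype=np.uint16)
print(min_valid_distance(depth))  # None - only zeros in the rectangle
depth[200, 320] = 450
print(min_valid_distance(depth))  # 450
```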
Yeah, maybe it is time to check the NumPy introduction for absolute beginners.
The artificial fruits and mushrooms needed for Tasks 2-5 should be ordered soon. The team size is now 6 = 3 students + [Milan, Jakub, me].
New bumpers were brought for Matty M02, and Franta should mount them
next week. Finally, Milan promised to cut a new set of corn/maize plants, 40-50 cm ("nice to have").
p.s. we also started to use git branches for new development, so feature/task1 was created.
Week 14 (18/3) — Row navigation
Robot Matty M02 autonomously navigated a curved row for the first time! Do not expect a super-smart algorithm, but …
it worked (video).
Algorithm:
- if the obstacles to the left and to the right are at roughly the same distance, go straight
- otherwise, if the left obstacle is closer, turn right
- otherwise (i.e. the right obstacle is closer), turn left
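The three rules above could be sketched as a small decision function (the tolerance for "roughly the same distance" is made up for illustration):

```python
def steer(left_mm, right_mm, tolerance_mm=50):
    """Return a steering command from left/right obstacle distances in mm.

    tolerance_mm is a hypothetical band for "roughly the same distance".
    """
    if abs(left_mm - right_mm) <= tolerance_mm:
        return 'straight'   # both sides at roughly the same distance
    if left_mm < right_mm:
        return 'right'      # left obstacle closer -> steer away to the right
    return 'left'           # right obstacle closer -> steer away to the left

print(steer(500, 520))  # straight
print(steer(300, 700))  # right
print(steer(700, 300))  # left
```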
There was an update to the NumPy mask usage, as min/max of an empty array is not defined:
```
>>> max([1, 2, 3])
3
>>> max([])
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ValueError: max() arg is an empty sequence
```
… and the same situation can happen with the non-zero mask (i.e. only valid depth data)
```
>>> import numpy as np
>>> arr = np.array([0, 0, 0])
>>> mask = arr != 0
>>> mask
array([False, False, False])
>>> arr[mask].max()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\SDK\WinPython-64bit-3.6.2.0Qt5\python-3.6.2.amd64\lib\site-packages\numpy\core\_methods.py", line 26, in _amax
    return umr_maximum(a, axis, None, out, keepdims)
ValueError: zero-size array to reduction operation maximum which has no identity
```
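A defensive pattern that avoids this exception is to check the size of the masked array before reducing. A minimal sketch:

```python
import numpy as np

def safe_max(depth_row):
    """Max of the valid (non-zero) depths, or None if nothing is valid."""
    valid = depth_row[depth_row != 0]   # keep only valid measurements
    if valid.size == 0:                 # empty array -> max() has no identity
        return None
    return int(valid.max())

print(safe_max(np.array([0, 0, 0], dtype=np.uint16)))     # None
print(safe_max(np.array([0, 120, 80], dtype=np.uint16)))  # 120
```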
Another update: instead of a line in the depth image, we now take into account bigger rectangles: one for the left obstacle, one for the right obstacle, and one just in front of the robot to STOP and avoid a collision.
Also note that to address a rectangle you need to specify the limits with a comma: data[row_begin:row_end, column_begin:column_end].
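The comma matters: chained indexing like data[a:b][c:d] slices rows twice instead of selecting the intended 2D block. A small demonstration:

```python
import numpy as np

data = np.arange(12).reshape(3, 4)
# array([[ 0,  1,  2,  3],
#        [ 4,  5,  6,  7],
#        [ 8,  9, 10, 11]])

block = data[0:2, 1:3]   # rows 0-1, columns 1-2 -> the intended rectangle
print(block)             # [[1 2] [5 6]]

wrong = data[0:2][1:3]   # slices rows, then slices rows of that result again
print(wrong)             # [[4 5 6 7]] - just row 1, all columns
```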
There was also an issue with numpy.dtype. In particular, the depth is an unsigned 16-bit integer, so the evaluation of "left and right obstacles are almost equal" overflowed in half of the cases.
The easiest solution was to cast the result to a normal Python integer (fix).
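The overflow is easy to reproduce: uint16 subtraction wraps around when the result would be negative, while casting to plain Python int first gives the signed value. A minimal sketch:

```python
import numpy as np

# depth values arrive as uint16 millimeters, e.g. from the OAK-D depth stream
left = np.array([300], dtype=np.uint16)
right = np.array([500], dtype=np.uint16)

# uint16 arithmetic wraps: 300 - 500 becomes 65336 instead of -200
wrapped = (left - right)[0]
print(wrapped)  # 65336

# casting to plain Python int first gives the expected signed result
diff = int(left[0]) - int(right[0])
print(diff)     # -200
```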
The newly mounted bumpers were tested with new firmware and they worked fine. We will see whether the new black extenders collide with plants, but let's leave it for now.
Accidentally we also tested the battery alarm. The ODroid battery reached 14.5 V, and yes, the sound was well audible even through the closed box. The good news is that the PC worked without any problem,
although the official minimum limit is 15 V.