
Field Robot 2025

Milano, Italy

Field Robot Event 2025 (FRE2025) will take place in Milano, Italy, from June 9 to June 12, 2025, at Agriturismo da Pippo. This competition is open to universities worldwide, offering a unique opportunity to push the boundaries of autonomous farming and agricultural robotics. Blog update: 14/04/2025 — CULS Robotics Team Blog (week16)



Official announcement

Competition Challenges FRE2025:
  • 🚜 Task 1: Autonomous Maize Field Navigation
  • 🌽 Task 2: Autonomous Maize Field Navigation with Cobs Detection
  • 🍎 Task 3: Fruit Mapping
  • 🍄 Task 4: Bioluminescent Fungi Discovery
  • ✨ Task 5: Freestyle
💡 Want to compete? Registration opens on March 3, 2025, and closes on April 25, 2025. The registration fee per team will be announced at the opening of registration.
🤖 Seeking sponsorship? New teams can apply for sponsorship to support their participation, with funding up to €1500 per team by CLAAS-Stiftung. More details and application link: https://www.claas-foundation.com/students/field-robot-event/application.
📢 Have questions about the rules? Join our FRE2025 Discord server now to get updates, connect with teams, and participate in the Q&A discussion: https://discord.gg/56unzZVBqn. Read the rules now!
🌐 Website Update: Further event details will soon be available on our website: https://fieldrobot.nl/event/
📍 Getting here: The event venue is 12 km from Milano Linate Airport and 17 km from Milano Central Station (Google Maps).



CULS Robotics Team Blog (CRT-B)

CULS Robotics is a new Czech student team for Field Robot 2025! Yes, it took only 10 years to find a group of excited students and "start again"! The team of 3 students was founded at the Czech University of Life Sciences Prague. The members are all in the last year of their studies, so their priorities collide a bit with details like finishing a diploma thesis or final exams … but their self-motivation is still strong, so hopefully we will see them all competing in Italy in June.
CRT will compete with the robot Matty M02 (the green one); the red one is a backup, or it will be used for the freestyle discipline. The robot is equipped only with an OAK-D Pro camera and is programmed in Python using the OSGAR framework.
Team members:
  • Simona (team lead)
  • Ondra (software)
  • Franta (hardware)



Week 18 (18/2) — Matty GO!

This was the very first time we met with Franta and Ondra at the university. It was great that they already had some basic experience with OSGAR and the development environment already prepared. The only missing detail was updating the Matty driver to the latest version. Two robots for two students looked like a good match.
Unfortunately Franta's Win11 did not recognize the ESP USB-serial, so only Ondra could connect his laptop directly to the robot via USB. Nevertheless, the goal was to use the onboard ODroid, and for that PuTTY was enough.
The first lesson learned: try screen -dr first, just in case somebody is already using the robot; otherwise:
screen
workon osgar
cd git/osgar
python -m osgar.record --duration 10 config/matty-go.json --params app.dist=0.5 app.steering_deg=-20
… and this will drive the robot 0.5 m with the steering set 20 degrees to the right.

Week 17 (25/2) — Fake it until you make it (matty-go-cam.json, github, Rules 2.0)

The second robotics round was again together with Franta and Ondra. Franta welded (!) a new holder for the OAK-D camera:
… so it already made sense to collect some basic data. Ondra combined the OSGAR configuration files for driving and for collecting the depth and image streams from the OAK-D camera. In the end he also managed to push his changes to GitHub, resolving the issues with keys etc. Here you can see the proof: matty-go-cam.json.
Finally, Simona shared with us the latest rules, FRE2025_RULES_v2_0.pdf. It is kind of funny that even now (2025-04-11) the old version v1.0 is still on the official website (and the location there is still Germany, not Italy).

Week 16 (4/3) — NumPy and obstacle detection

This time the whole team finally met: Ondra, Franta and Simona. The goal was to get started with OSGAR programming (see the corresponding Czech notes here). The plan was to stop the robot in case there is an obstacle in front of it. The input is depth data from the OAK-D camera, available as a 2D array of uint16 (unsigned 16-bit integers) corresponding to distances in millimeters.
Step one — pick the distance in the middle of the image (a single pixel) and stop if it is smaller than 330 (I do not remember where they picked that value). It turned out that 0 is often present, but it rather means I DO NOT KNOW, i.e. the distance is not valid. What to do in such a case? If the previous measurement was a long distance, then it is probably OK to continue moving. If it was a short distance before, it would be safer to stop. And if you have been receiving zeros for a long time, then you do not know anything and it is also better to stop. Yes, such complex logic for a single pixel reading!
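The zero-handling logic above can be sketched roughly as follows. This is only an illustration, not the team's actual code: the class name, the helper structure and the MAX_UNKNOWN_COUNT threshold are invented for the example; only the 330 mm stop distance and the "0 = invalid" convention come from the text.

```python
import numpy as np

STOP_DISTANCE_MM = 330   # threshold mentioned in the text
MAX_UNKNOWN_COUNT = 10   # assumption: stop after this many consecutive zeros

class ObstacleGuard:
    """Hypothetical sketch of the single-pixel stop logic (0 = unknown)."""

    def __init__(self):
        self.last_valid = None   # last non-zero distance seen (mm)
        self.unknown_count = 0   # consecutive invalid readings

    def should_stop(self, depth):
        """depth: 2D uint16 array of distances in millimeters."""
        # pick the single pixel in the middle of the image
        dist = int(depth[depth.shape[0] // 2][depth.shape[1] // 2])
        if dist == 0:
            # invalid reading — decide based on history
            self.unknown_count += 1
            if self.unknown_count >= MAX_UNKNOWN_COUNT:
                return True  # blind for too long -> safer to stop
            # otherwise trust the last valid reading, if any
            return self.last_valid is not None and self.last_valid < STOP_DISTANCE_MM
        self.unknown_count = 0
        self.last_valid = dist
        return dist < STOP_DISTANCE_MM
```

With a far reading followed by zeros the robot keeps moving; with a near reading followed by zeros it stops, exactly mirroring the reasoning above.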
This week's Python lesson was that division written with two / characters (//) means integer division, which is needed for addressing the array. So the value we were interested in was depth[400//2][640//2].
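A tiny illustration of why // matters here; the 400×640 depth shape is assumed from the indices in the text:

```python
import numpy as np

# depth image: 400 rows x 640 columns of uint16 millimeter values
depth = np.zeros((400, 640), dtype=np.uint16)

center = depth[400 // 2][640 // 2]   # row 200, column 320 -> works
# depth[400 / 2][640 / 2] would fail: 400 / 2 is the float 200.0,
# and NumPy refuses floats as array indices (IndexError)
```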