Vratislav Beneš

Digital Twin of Strawberry - part 1

We would like to share with you the ongoing results of our joint project "Digital Twin of Plants for AI Training using Synthetic Datasets for Disease and Pest Detection in Hydroponic Greenhouses".

 

We are collaborating on the project with the Space Agri Technologies Mendel University (SATMendelu) laboratory in Brno, led by Libor Lenža. Our common goal is to develop an innovative environment for generating synthetic datasets crucial for training convolutional neural networks. Autumn 2023 was marked by the preparation of models, greenhouses, and initial tests. The real 'rodeo' begins in April when the first strawberries bloom at Farm Ráječek. But there is already much to see.



The SATMendelu team grows strawberries in the laboratory and lovingly cares for pests such as thrips, aphids, and agents of bacterial and fungal diseases. The captured diseases and pests then serve as valuable inputs for creating 3D models in Blender. In the NVIDIA Omniverse, the 3D plants, including those with simulated diseases and pests, are cultivated in virtual planters in a virtual greenhouse.


Model of strawberries in Omniverse


Application in virtual greenhouse


The virtual world also includes a digital twin of our robot, whose movement can be simulated. Validation occurs in combination with the real greenhouse at Ráječek and using images from SATMendelu's laboratory greenhouses.


The 3D models are created parametrically, allowing dynamic changes in the shape, position, number, and textures of objects. Practically, this means that a small strawberry plant with unripe fruit can instantly become a fully mature one, or a healthy fruit can turn into an unsightly one infected with mold.
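To illustrate the parametric idea, here is a minimal sketch in Python. A single plant description is driven by a handful of parameters that can be re-sampled to yield a different variant; in the real pipeline such values would drive geometry and texture generation in Blender. All names (`PlantParams`, `sample_variant`) and value ranges are our own illustration, not the project's actual API.

```python
import random
from dataclasses import dataclass

@dataclass
class PlantParams:
    maturity: float        # 0.0 = unripe green fruit, 1.0 = fully ripe
    leaf_count: int        # number of leaves to generate
    fruit_count: int       # number of fruits on the plant
    mold_coverage: float   # fraction of fruit surface with mold, 0.0 = healthy

def sample_variant(rng: random.Random) -> PlantParams:
    """Draw one random plant variant (illustrative ranges only)."""
    return PlantParams(
        maturity=rng.uniform(0.0, 1.0),
        leaf_count=rng.randint(4, 12),
        fruit_count=rng.randint(0, 6),
        # roughly half the variants are healthy, half show some mold
        mold_coverage=rng.choice([0.0, rng.uniform(0.05, 0.6)]),
    )

rng = random.Random(42)
variants = [sample_variant(rng) for _ in range(3)]
for v in variants:
    print(v)
```

Because everything is parameter-driven, turning an unripe plant into a ripe one is just a matter of setting `maturity` close to 1.0 and regenerating the geometry.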



Thus, it's easy to vary the position and size of disease occurrences on leaves, the angle of a leaf relative to the camera, or the position of the sun, greatly increasing the variability of the training data.
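This kind of variation is often called domain randomization. A hedged sketch of what one sampled scene configuration might look like, with our own illustrative names and ranges (not the project's actual parameters):

```python
import random

def sample_scene(rng: random.Random) -> dict:
    """Randomize one synthetic image: sun, camera, and disease placement."""
    sun_elevation = rng.uniform(10.0, 80.0)   # degrees above the horizon
    sun_azimuth = rng.uniform(0.0, 360.0)     # degrees
    camera_tilt = rng.uniform(-30.0, 30.0)    # leaf angle vs. camera, degrees
    # disease spots as (u, v, radius) in leaf texture coordinates
    spots = [(rng.random(), rng.random(), rng.uniform(0.01, 0.08))
             for _ in range(rng.randint(0, 5))]
    return {
        "sun": (sun_elevation, sun_azimuth),
        "camera_tilt": camera_tilt,
        "disease_spots": spots,
    }

rng = random.Random(7)
scenes = [sample_scene(rng) for _ in range(100)]
```

Each dictionary would then configure one render in the virtual greenhouse, so that no two training images share the same lighting, viewpoint, and disease layout.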

 

The primary goal of this phase was to create a parametric 3D model. The next phase of the project (January-June 2024) addresses accurate object detection. The texture library will be expanded, bringing greater variability to the training dataset; this work will be carried out in close cooperation with colleagues from SATMendelu.


Preliminary verification of accuracy was experimentally conducted using a network trained purely on the synthetic dataset. The limiting factor was the range of textures used in this testing phase. The results were as expected, and we present them here for interest: some objects are detected well, others less so.
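To give a sense of how such a detector is typically scored on real images, here is a minimal sketch that matches predicted bounding boxes to ground-truth boxes by IoU (intersection over union). The boxes and the 0.5 threshold are illustrative values, not the project's actual evaluation code or results.

```python
def iou(a, b):
    """IoU of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def match_detections(preds, truths, thresh=0.5):
    """Count true positives; each ground-truth box may be claimed once."""
    used, tp = set(), 0
    for p in preds:
        for i, t in enumerate(truths):
            if i not in used and iou(p, t) >= thresh:
                used.add(i)
                tp += 1
                break
    return tp

truths = [(10, 10, 50, 50), (60, 60, 90, 90)]   # annotated objects
preds = [(12, 11, 52, 49), (200, 200, 220, 220)]  # detector output
print(match_detections(preds, truths))  # → 1 (one hit, one false positive)
```

A detector trained purely on synthetic data will typically score well on object types whose synthetic textures resemble the real crop, and worse where the texture library is still thin, which matches the behavior described above.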





Testing on real data



Also worth mentioning is the successful implementation of the robot model in Isaac Sim, which allows simulating the robot's movements and behavior at the PLC, motion, or vision level, and then applying that to the real robot. Or, conversely, mapping the behavior of the real robot onto the model and controlling it from the model. But more about that will come from colleague Martin Juříček in the next post.
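The sim/real interchange can be pictured as control code talking to an abstract interface, so the same motion commands drive either the digital twin or the physical robot. The sketch below is our own illustration of that pattern; the class and method names are hypothetical, and Isaac Sim's actual Python API differs.

```python
from abc import ABC, abstractmethod

class RobotBackend(ABC):
    """Common interface for the simulated twin and the real robot."""
    @abstractmethod
    def move_joints(self, targets: list) -> None: ...
    @abstractmethod
    def joint_positions(self) -> list: ...

class SimulatedRobot(RobotBackend):
    """Stand-in for the digital twin; instantly reaches each target."""
    def __init__(self, n_joints: int):
        self._pos = [0.0] * n_joints

    def move_joints(self, targets):
        self._pos = list(targets)

    def joint_positions(self):
        return list(self._pos)

def run_trajectory(robot: RobotBackend, waypoints):
    """Drive the robot through waypoints; works for sim or real backend."""
    for wp in waypoints:
        robot.move_joints(wp)  # on real hardware this would go to the PLC
    return robot.joint_positions()

twin = SimulatedRobot(n_joints=3)
final = run_trajectory(twin, [[0.1, 0.0, 0.2], [0.4, 0.3, 0.1]])
print(final)  # → [0.4, 0.3, 0.1]
```

Swapping in a `RealRobot` backend with the same interface would replay the identical trajectory on the physical machine, which is the essence of the sim-to-real (and real-to-sim) workflow described above.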

 




The project is supported by the TREND programme of the Technology Agency of the Czech Republic. I would like to highlight the enormous contribution of such a project to a startup like FRAVEBOT. It allows us to draw on resources that would be hard to find otherwise, and pre-financing makes everything even simpler. What's also great is that the entire project is administratively very manageable, and we handle it ourselves without extra manpower.



