1 June 2023
Jelgavas iela 3
Europe/Riga timezone

MITESENS - R&D on Navigation State Estimation, Flight Control and Georeferencing Tasks for an autonomous UAV System for Mites Detection in the Era of Digital Horticulture 4.0

1 Jun 2023, 13:30
20m
Alfa/110 (Jelgavas iela 3)

Geodynamics and Geospatial Research 2023

Speaker

Prof. Reiner Jäger (Karlsruhe University of Applied Sciences (HKA), Laboratory for GNSS and Navigation)

Description

The aim of the MITESENS project (www.h-ka.de/iaf/mitesens) in the Laboratory for GNSS and Navigation is a UAS-based monitoring system (UAS = Unmanned Aerial System with a UAV as carrier platform) for the early detection of spider mite infestations of plant leaves in greenhouse cultivations.
The autonomous UAV flight over the plant stand is controlled with a newly developed MITESENS UAS flight control (FC). The respective hardware development for the multisensory and IEEE 1588 time-synchronized GNSS/MEMS/Optics FC box is also part of the R&D project. The FC determines, out- and indoors, the 18-parameter navigation state vector y(t) (3D-position, 3D-velocity, 3D-acceleration, 3D-attitude, 3D-rotation rates, 3D-rotation rate changes), which is georeferenced in the ETRF89, or generally in the ITRFyyyy.zzzz.mm, optionally using DGNSS and PPP.
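As a rough illustration of the state vector composition above, the following Python sketch stacks the six parameter triples into an 18-element vector. The class and field names are hypothetical and do not reflect the MITESENS software interface.

# Illustrative sketch only: a minimal container for the 18-parameter navigation
# state y(t); names and units are assumptions, not the MITESENS implementation.
from dataclasses import dataclass
import numpy as np

@dataclass
class NavState:
    position: np.ndarray              # 3D position [m], e.g. in ETRF89 / ITRF frame
    velocity: np.ndarray              # 3D velocity [m/s]
    acceleration: np.ndarray          # 3D acceleration [m/s^2]
    attitude: np.ndarray              # 3D attitude (roll, pitch, yaw) [rad]
    rotation_rate: np.ndarray         # 3D rotation rates [rad/s]
    rotation_rate_change: np.ndarray  # 3D rotation rate changes [rad/s^2]

    def as_vector(self) -> np.ndarray:
        # Stack the six triples into the 18 x 1 navigation state vector y(t).
        return np.concatenate([
            self.position, self.velocity, self.acceleration,
            self.attitude, self.rotation_rate, self.rotation_rate_change,
        ])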
In a first instance, y(t) is the essential component of the FC controlling the desired trajectory and attitude. As FC types, a PID control and a model predictive control (MPC) are developed. The PID control is used to set up the georeferenced building information model (BIM) of a greenhouse from the UAV optics of the camera and/or the lidar sensorics, by operating the UAV with a remote control. The MPC is used further for the general mission of mites detection and monitoring by the UAS, which is additionally equipped with a multispectral camera and flies autonomously out- and indoors. Indoors, the UAS uses the digital twin of the greenhouse BIM as an AI-based feature-recognition vision component of the FC, which provides the position information for the navigation state estimation y(t) and the MPC control from the ETRF89 or ITRFyyyy.zzzz.mm referenced BIM instead of from GNSS.
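The PID principle mentioned above can be illustrated with the following minimal Python sketch; the single-axis formulation, gains and names are assumptions for illustration only and not the MITESENS FC implementation. In practice, one such loop would run per controlled axis, fed by the estimated state y(t), whereas the MPC replaces these loops by an optimization over a prediction horizon.

# Generic single-axis PID controller (illustrative sketch, assumed interface).
class PID:
    def __init__(self, kp: float, ki: float, kd: float):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint: float, measurement: float, dt: float) -> float:
        # Control output for one axis (e.g. altitude or yaw) per time step dt.
        error = setpoint - measurement
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative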
Conceptually, the navigation state y(t) is not only used as a reference for the FC; y(t) also forms the common core for all computational operations, so all images contain y(t) as metadata. A first further task of the UAS is the generation of an ETRF89 or ITRFyyyy.zzzz.mm georeferenced 3D voxel model of the plants by a bundle block adjustment of the RGB camera (ZED 2i) data with known exterior orientation taken as component parts of y(t). Further, y(t) in the image metadata is used for the georeferencing of the acquired spectral image data.

The acquired hyperspectral image information covers wavelengths between 500 nm and 900 nm. This image information is evaluated using AI (XGBoost classification based on a decision-tree algorithm) to divide the infestation probability of the leaves into three classes (green, yellow and red), with a prediction accuracy for the spider mite infestation above 85%. The classified hyperspectral metadata images are then calculated back pixelwise to the ETRF89 or ITRFyyyy.zzzz.mm referenced 3D plant stand using y(t), creating in that way a classified 3D voxel model. In order to realize a simple and spatially clear representation of the recorded spider mite infestation for horticultural practice, the classified 3D voxel data are converted into a 2D plan map view of the greenhouse infestation situation.
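To make the classification step concrete, the following Python sketch trains a three-class XGBoost classifier on placeholder spectral features. The feature layout, band count and random training data are assumptions; only the classifier type and the three classes follow the abstract.

# Hedged sketch of the three-class leaf-infestation classification with XGBoost.
import numpy as np
from xgboost import XGBClassifier

# Placeholder per-pixel (or per-leaf) spectral feature vectors and labels:
# 0 = green (healthy), 1 = yellow (suspect), 2 = red (infested).
X_train = np.random.rand(300, 40)        # e.g. 40 bands sampled between 500 and 900 nm
y_train = np.random.randint(0, 3, 300)

clf = XGBClassifier(n_estimators=200, max_depth=4)  # decision-tree based booster
clf.fit(X_train, y_train)

# Per-pixel class probabilities, later projected back onto the 3D voxel model.
proba = clf.predict_proba(np.random.rand(5, 40))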
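Similarly, the final conversion of the classified 3D voxel model into a 2D plan map can be sketched as a column-wise reduction over the height axis; the grid layout, class coding and the "worst class wins" rule are assumptions for illustration, not the MITESENS processing chain.

# Collapse a classified voxel grid into a 2D plan map of the greenhouse.
import numpy as np

# Classified voxel grid: -1 = empty, 0 = green, 1 = yellow, 2 = red.
voxels = -np.ones((100, 80, 30), dtype=int)   # nx, ny, nz voxels
voxels[40:45, 20:25, 10:15] = 2               # example infested region

# For each (x, y) column take the worst class over height, so a single red
# voxel marks the whole plant position as infested in the plan view.
plan_map = voxels.max(axis=2)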
The complete MITESENS UAS is presented. The focus is on the mathematical models, algorithms and software of the out- and indoor navigation state y(t) estimation and SLAM, the MPC-based control for the autonomous flight, the 3D voxel model generation, and the back projection of the classified images to obtain the classified 3D voxel model of the mite infestation. The results of the MITESENS UAS development are shown.

Primary author

Prof. Reiner Jäger (Karlsruhe University of Applied Sciences (HKA), Laboratory for GNSS and Navigation)

Co-author

Felix Vortisch

Presentation materials

There are no materials yet.