This paper presents the development of a small-scale agricultural robotic platform that addresses a gap in current agricultural phenotyping platforms, which lack the size and sensor resolution needed to study hard-to-reach under-canopy row crops. The AgBug carries a sensor suite consisting of a LiDAR and an RGB camera for crop monitoring on a 12″ × 9″ footprint. The main design challenge is not only the platform’s compact size and portability, but also its ability to navigate and to obtain geo-referenced, time-tagged data in the GNSS-denied environment under the crop canopy, where the geo-referencing of LiDAR and RGB data typically relies on GNSS inputs. A new approach was therefore developed that fuses direct feedback from the RGB camera with robot odometry supplied by a visual-inertial tracking camera. Indoor and outdoor tests demonstrate the AgBug’s ability to perform in-row, under-canopy crop monitoring and the efficacy of the sensor-fusion approach.
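The abstract does not detail the fusion scheme, but the idea of combining visual-inertial odometry with direct camera feedback for GNSS-denied row navigation can be illustrated generically. The sketch below is not the authors' implementation: it shows a hypothetical complementary fusion in which a heading from the tracking camera's odometry is blended with a heading implied by crop-row detection in the RGB image, and the fused heading drives planar dead reckoning. The weight `w_vio` and both function names are illustrative assumptions.

```python
import math

def fuse_heading(vio_heading, row_heading, w_vio=0.7):
    """Weighted circular mean of two heading estimates (radians).

    vio_heading: heading from visual-inertial odometry
    row_heading: heading implied by RGB crop-row detection
    w_vio:       trust in the VIO estimate (hypothetical tuning value)
    """
    # Average on the unit circle to handle angle wrap-around correctly.
    x = w_vio * math.cos(vio_heading) + (1.0 - w_vio) * math.cos(row_heading)
    y = w_vio * math.sin(vio_heading) + (1.0 - w_vio) * math.sin(row_heading)
    return math.atan2(y, x)

def dead_reckon(pose, v, omega, dt):
    """Propagate a planar pose (x, y, theta) with unicycle kinematics."""
    x, y, th = pose
    return (x + v * math.cos(th) * dt,
            y + v * math.sin(th) * dt,
            th + omega * dt)

# Example: VIO says 0 rad, row detection says pi/2; equal trust splits the difference.
fused = fuse_heading(0.0, math.pi / 2, w_vio=0.5)  # → pi/4
pose = dead_reckon((0.0, 0.0, fused), v=0.2, omega=0.0, dt=1.0)
```

In practice the row-detection term keeps the robot centered between crop rows while the odometry term bridges frames where the rows are occluded; a full system would fuse positions as well, e.g. with a Kalman filter.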
