
Study Improves Agricultural Robotics with 2D Lidar Technology

In an article published in the journal AgriEngineering, researchers addressed the problem of crop row tracking using a four-wheel-steering mobile robot designed to straddle the crops.


Study: Autonomous Vineyard Tracking Using a Four-Wheel-Steering Mobile Robot and a 2D LiDAR. Image Credit: MrDDK/Shutterstock.com

A navigation approach based on a 2D LiDAR laser scanner was used for crop row tracking, with 3D perception arising from the motion of the four-wheel-steering mobile robot. This made it possible to define a reference trajectory, which was then followed using two different control strategies. The primary application of the novel agricultural robotics approach was navigation in vineyards to carry out operations such as monitoring, cropping, or precision spraying. The first section of the study detailed a row tracking approach based on a 2D LiDAR angled in front of the four-wheel-steering mobile robot, which fit a specified shape of the vine row in the robot's frame of reference.

A control architecture for the four-wheel-steering mobile robot was proposed in the second section. Two strategies were examined: one based on a backstepping approach, and one that independently regulated the positions of the front and rear steering axles. The outcomes of these control laws were then compared within an extensive simulation framework built on a 3D reconstruction of realistic vineyards in various seasons.

Agricultural Robotics in the Present Context

The autonomous navigation of off-road robots is a vital issue for the ecological transformation of agriculture, particularly in vine production. By reducing the quantity of chemicals required and enabling bio-control solutions, field tasks can be completed more accurately and more frequently. The development of autonomous robots is further encouraged by labor scarcity and the arduousness of field work, particularly for workers in vineyards.

Farms now employ agricultural robotics extensively to assist farmers by reducing the strain of some manual activities. Agricultural robotics also plays various other roles, including improving agricultural productivity, lowering worker exposure to chemicals sprayed on plantations, and enabling more precise and efficient farming. Navigation in vineyards requires centimeter-level accuracy relative to the vegetation, especially for straddle robots.

For reactive navigation in the vineyard, a robust 2D LiDAR laser-scanner-based navigation approach was adopted in this research, accounting for weather conditions, lighting fluctuations, and vegetation growth. The authors were interested in the generalizability of a reactive LiDAR-based navigation technique throughout the year. Since the task was related to vegetation development rather than precise positioning, a GPS-free solution was proposed.

The laser-scanner-based navigation method identified the anticipated shape of the vine structure in each scan. Successive scans were combined into a global map in the vicinity of the robot. The trajectory was determined by accumulating points in 3D space as the robot moved. According to the proposed navigation algorithm, a four-wheel-steering mobile robot could work in vineyards regardless of how the vegetation changes with the seasons.
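
The scan-aggregation step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name `scan_to_world`, the example poses, and the detected points are all hypothetical, and the odometry pose is assumed to be available as a planar (x, y, heading) estimate.

```python
import math

def scan_to_world(robot_pose, scan_points):
    """Transform 2D points detected in the robot frame into the world frame.

    robot_pose  -- (x, y, heading) estimated from odometry
    scan_points -- list of (x, y) row points extracted from one LiDAR scan,
                   expressed in the robot frame
    """
    x, y, theta = robot_pose
    c, s = math.cos(theta), math.sin(theta)
    # Standard 2D rigid-body transform: rotate by heading, then translate.
    return [(x + c * px - s * py, y + s * px + c * py) for px, py in scan_points]

# Accumulate the successively detected row points along the robot's motion
# into one local map; the reference trajectory is read off this point set.
trajectory_map = []
for pose, scan in [((0.0, 0.0, 0.0), [(1.0, 0.5)]),
                   ((0.5, 0.0, 0.0), [(1.0, 0.5)])]:
    trajectory_map.extend(scan_to_world(pose, scan))
```

In practice each new scan would be filtered against the expected row shape before being merged, and old points far behind the robot would be discarded to keep the map local.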

Two distinct control schemes were presented to demonstrate the effectiveness of the navigation algorithm. The first method used a backstepping approach to regulate the longitudinal and lateral controls separately, decoupling the management of the robot's position and orientation. The second technique instead split control between the two axles, so that the front and the rear axle followed the same path.
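
The second strategy, in which each axle is servoed to the path on its own, can be illustrated with a toy proportional law. This is not the paper's backstepping controller: the function `axle_steering`, the gains `k_lat` and `k_head`, and the error values below are illustrative assumptions only.

```python
def axle_steering(lateral_error, heading_error, k_lat=0.8, k_head=1.2, max_angle=0.5):
    """Toy proportional steering law for one axle.

    lateral_error -- signed distance (m) of the axle from the reference path
    heading_error -- signed angle (rad) between axle direction and the path
    Returns a steering angle (rad), saturated at +/- max_angle.
    """
    angle = -(k_lat * lateral_error + k_head * heading_error)
    return max(-max_angle, min(max_angle, angle))

# Front and rear axles regulated independently, each from its own error,
# so that both track the same reference path:
delta_front = axle_steering(lateral_error=0.10, heading_error=0.05)
delta_rear = axle_steering(lateral_error=-0.02, heading_error=0.05)
```

The point of the split is that the rear axle does not simply mirror the front command; it corrects its own tracking error, which is what allows both axles to pass over the same ground line between the vine rows.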

The Procedural Scheme

The initial step in the row tracking technique outlined in this study was to use the LiDAR laser-scanner-based navigation framework to locate crop shapes that fit an anticipated model. Thus, the objective was to use an autonomous four-wheel-steering mobile robot equipped with an angled 2D LiDAR laser-scanner-based navigation system to follow a trajectory established by the vineyard structure.

The strategy then entailed identifying the anticipated shape in each laser-scanner-based navigation scan and combining the subsequent scans to create an overall map. Finally, the trajectory was determined by collecting points in 3D space as the robot moved. 

A four-wheel-steering mobile robot is typically employed to improve a vehicle's motion capabilities and maneuverability, particularly in confined environments such as warehouses or agricultural settings like vineyards. The four-wheel-steering mobile robot was modeled as a bicycle, i.e., reduced to two wheels, one representing the front axle and the other the rear axle, with steering angles denoted δF and δR, respectively.
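
A standard kinematic form of this two-steering-axle bicycle model can be integrated as below. This is a generic textbook model offered as a sketch, not the exact equations of the paper; the wheelbase value and time step are arbitrary assumptions.

```python
import math

def step_4ws_bicycle(state, v, delta_f, delta_r, wheelbase=1.5, dt=0.05):
    """One Euler step of a four-wheel-steering (two-axle) bicycle model.

    state   -- (x, y, theta): rear-axle position and vehicle heading
    v       -- rear-wheel linear speed (m/s)
    delta_f -- front steering angle δF (rad)
    delta_r -- rear steering angle δR (rad)
    """
    x, y, theta = state
    # The rear axle moves along its own steering direction.
    x += v * math.cos(theta + delta_r) * dt
    y += v * math.sin(theta + delta_r) * dt
    # Heading rate depends on the difference of the two steering angles.
    theta += v * math.cos(delta_r) * (math.tan(delta_f) - math.tan(delta_r)) / wheelbase * dt
    return (x, y, theta)

# With equal front and rear angles the heading rate is zero and the robot
# translates diagonally ("crab" motion) -- useful between narrow vine rows.
pose = step_4ws_bicycle((0.0, 0.0, 0.0), v=1.0, delta_f=0.1, delta_r=0.1)
```

Setting δR = 0 recovers the ordinary front-steered bicycle model, which is why the four-wheel-steering case can be treated as a generalization of it in the control design.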

A sophisticated simulation testbed was developed to lower the costs and risks for both the four-wheel-steering mobile robot and the vineyard. The virtual vineyard needed to reflect reality accurately for the robot to behave as it would in a real vineyard. The proposed autonomous navigation was therefore tested in various field configurations, particularly those affecting plant development. Despite some regions of missing vegetation, the method remained accurate to within a few centimeters.

Four-Wheel-Steering Mobile Robot and the Future of Agricultural Robotics

This work made several contributions concerning the use of agricultural robotics for row tracking in a viticultural setting. The authors specifically addressed crop row tracking using a four-wheel-steering mobile robot that straddled the crops. A row tracking method based on a 2D LiDAR inclined in front of the robot was presented, matching the predetermined shape of the vine row in the robot's frame of reference.

Through system odometry, the successively recognized zones of interest were aggregated along the direction of the robot's local motion. This aggregation made it possible to determine the four-wheel-steering mobile robot's local trajectory. A control architecture enabling control of the four-wheel-steering mobile robot was also proposed.

A limitation of the proposed method was the obligatory blind phase at the start of navigation, caused by the need for a minimum number of points before a regression can be estimated and a pertinent control computed. Because a 3D LiDAR could shorten this blind phase, future study would primarily focus on extending the proposed navigation algorithm into 3D space.
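
The regression step behind this blind phase can be sketched with an ordinary least-squares line fit over the accumulated row points. The function `fit_row_line`, the threshold `min_points`, and the sample data are illustrative assumptions, not the paper's actual estimator.

```python
def fit_row_line(points, min_points=5):
    """Least-squares fit of y = a*x + b through accumulated row points.

    Returns (a, b), or None while too few points have been gathered --
    during that 'blind' start-up phase no row-relative control is available.
    """
    if len(points) < min_points:
        return None  # not enough points yet to estimate the row direction
    n = len(points)
    sx = sum(p[0] for p in points)
    sy = sum(p[1] for p in points)
    sxx = sum(p[0] ** 2 for p in points)
    sxy = sum(p[0] * p[1] for p in points)
    # Closed-form ordinary least squares for a line in the plane.
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# Collinear sample row points: the fit recovers slope 0.2 and intercept 1.0.
pts = [(x * 0.5, 0.2 * (x * 0.5) + 1.0) for x in range(6)]
line = fit_row_line(pts)
```

A denser 3D sensor would reach `min_points` almost immediately, which is the intuition behind the authors' plan to move the algorithm into 3D.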

Future studies would also focus on controlling the four-wheel-steering mobile robot while simultaneously controlling an embedded tool, such as a sprayer. They would also include using a reconstructed digital terrain model for obstacle identification, traversability analysis, and obstacle avoidance from 3D data. Although validation was carried out on a sophisticated simulation testbed built from a digitalized vineyard, further work would concentrate on experiments in actual vineyards.

Reference

Iberraken, D., Gaurier, F., Roux, J.C., Chaballier, C., Lenain, R. (2022) Autonomous Vineyard Tracking Using a Four-Wheel-Steering Mobile Robot and a 2D LiDAR. AgriEngineering, 4(4), 826-846. https://www.mdpi.com/2624-7402/4/4/53



Written by

Pritam Roy

Pritam Roy is a science writer based in Guwahati, India. He has his B. E in Electrical Engineering from Assam Engineering College, Guwahati, and his M. Tech in Electrical & Electronics Engineering from IIT Guwahati, with a specialization in RF & Photonics. Pritam’s master's research project was based on wireless power transfer (WPT) over the far field. The research project included simulations and fabrications of RF rectifiers for transferring power wirelessly.

Citations

Please use the following format to cite this article in your essay, paper or report:

  • APA

    Roy, Pritam. (2022, September 27). Study Improves Agricultural Robotics with 2D Lidar Technology. AZoOptics. https://www.azooptics.com/News.aspx?newsID=27954.

