Implementation of Simultaneous Localization and Mapping (SLAM) on a Wheeled Mobile Robot Deployed on a Cassava Farm
Student: Michael Gabriel Egwim (Project, 2025)
Department of Mechatronics
University of Nigeria, Nsukka, Enugu State
Abstract
Designing a wheeled robot for autonomous navigation through a cassava farm involves challenging factors, including irregular terrain, steep slopes, poor traction, and environmental disturbances. This study focuses on developing an autonomous robot capable of efficient movement within a cassava plantation while avoiding damage to crops and ensuring precise navigation. The robot integrates an advanced localization technique, Simultaneous Localization and Mapping (SLAM), to enhance positioning accuracy on a scaled farm. Sensor fusion, incorporating LiDAR, ultrasonic sensors, and computer vision, is employed to detect and avoid obstacles, recognize plant rows, and adjust the navigation path dynamically.
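As a minimal illustration of the sensor-fusion idea described above, the sketch below fuses a LiDAR range and an ultrasonic range by taking the more conservative (smaller) reading and flags an obstacle when it falls below a stopping threshold. The function names and the 0.5 m threshold are hypothetical choices for illustration, not values from the project itself.

```cpp
#include <algorithm>

// Conservative fusion: trust whichever sensor reports the nearer return,
// since either sensor may miss an obstacle the other one sees.
double fuseRanges(double lidar_m, double ultrasonic_m) {
    return std::min(lidar_m, ultrasonic_m);
}

// Flag an obstacle when the fused range drops below a stopping threshold
// (0.5 m here is an assumed safety margin).
bool obstacleAhead(double lidar_m, double ultrasonic_m,
                   double threshold_m = 0.5) {
    return fuseRanges(lidar_m, ultrasonic_m) < threshold_m;
}
```

In a fuller system this decision would feed the path-replanning layer rather than simply stopping the robot.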
The mechanical design involves the integration of sensors, DC motors, and robust traction and suspension systems to ensure stability on uneven farm surfaces. Motion-coordination systems are based on artificial intelligence and machine learning, with adaptive decision-making allowing the robot to navigate efficiently. A robotic arm is integrated to extract cassava tubers accurately without damaging the crop. Experimental simulations validate the robot's ability to operate effectively in real-world farm conditions, demonstrating its potential for widespread agricultural application. The Arduino development environment was used in this design because it offered an accessible and efficient way to program and control the components of the robot.
Movement control systems based on kinematics and dynamics optimize stability and maneuverability, allowing the robot to adapt to changing environmental conditions. The robot must translate its understanding of the environment into precise wheel movements, allowing it to traverse spaces smoothly and efficiently. The sensor suite works to create detailed maps of the environment while simultaneously determining the robot's position relative to key landmarks and features within that space. This involves complex calculations that account for the robot's physical characteristics and sensor feedback, ensuring accurate and safe navigation even when encountering unexpected obstacles or changes in the environment.
For the full publication, please contact the author directly at: gabriel.egwim.241120@unn.edu.ng