Off-Road Autonomous Driving Tech Focuses on Stereo Vision for Space, Agriculture
Southwest Research Institute has developed off-road autonomous driving tools that focus on stealth for the military and agility for space and agriculture clients. The vision-based system pairs stereo cameras with novel algorithms, eliminating the need for lidar and other active sensors.
SwRI engineers developed a suite of tools known as Vision for Off-Road Autonomy (VORA), a passive system that perceives objects, builds environment models, and simultaneously localizes and maps while navigating off-road terrain.
“We reflected on the toughest machine vision challenges and then focused on achieving dense, robust modeling for off-road navigation,” said Abe Garza, a research engineer in SwRI’s Intelligent Systems Division.
The VORA team envisioned a camera system as a passive sensing alternative to lidar, a light detection and ranging sensor that emits laser pulses to probe objects and calculate depth and distance. Lidar sensors produce light that can be detected by hostile forces, and radar's radio waves are likewise detectable. GPS navigation can be jammed, and its signals are often blocked in canyons and mountains, limiting its usefulness for agricultural automation.
“For our defense clients, we wanted to develop better passive sensing capabilities but discovered that these new computer vision tools could benefit agriculture and space research,” said Meera Towler, an SwRI assistant program manager who led the project.
The researchers also designed VORA with planetary surface exploration in mind. In space applications, autonomous robots are constrained by power, payload capacity, and intermittent connectivity, which makes low-power cameras a better fit than power-hungry lidar systems.
To overcome these challenges, the team developed new software that uses stereo camera data for high-precision tasks traditionally accomplished with lidar, including localization, perception, mapping, and world modeling.
Based on this research, the team developed a deep learning stereo matcher (DLSM) tool that uses a recurrent neural network to create dense, accurate disparity maps from stereo imagery. A disparity map encodes the per-pixel offset between corresponding points in the left and right images, which is inversely proportional to depth.
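The article does not publish the DLSM itself, but the disparity-to-depth relationship it exploits can be sketched with a classical stereo matcher. The example below uses OpenCV's StereoSGBM as a stand-in for the learned matcher; the file names and calibration values (focal length, baseline) are hypothetical, and VORA's recurrent network would replace the block-matching step to produce denser, more robust disparity maps.

```python
import cv2
import numpy as np

# Load a rectified stereo pair (hypothetical file names).
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Classical semi-global block matching as a stand-in for the DLSM.
matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # fixed-point -> pixels

# Depth follows from the calibrated camera geometry:
#   depth = focal_length_px * baseline_m / disparity_px
fx, baseline = 700.0, 0.12  # hypothetical calibration values (pixels, meters)
valid = disparity > 0
depth = np.zeros_like(disparity)
depth[valid] = fx * baseline / disparity[valid]
```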
To aid in localization and mapping, a factor graph algorithm intelligently combines sparse data from stereo image features, landmarks, inertial measurement unit (IMU) readings, and wheel encoders to produce highly accurate pose and map estimates. Autonomous systems use factor graphs, probabilistic graphical models that encode the relationships among measurements and unknown variables, to infer the most likely vehicle state from all of the available sensor data, as sketched below.
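As a minimal illustration of how a factor graph fuses measurements into a trajectory estimate, the sketch below builds a small 2D pose graph with the GTSAM library. The article does not name a specific library or dimensionality; the prior and between factors here stand in for the stereo-feature, IMU, and wheel-encoder constraints described above, and all numeric values are illustrative.

```python
import numpy as np
import gtsam

# Factor graph: one prior on the start pose plus two odometry-style
# constraints, standing in for fused stereo, IMU, and wheel-encoder data.
graph = gtsam.NonlinearFactorGraph()
prior_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.3, 0.3, 0.1]))
odom_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.2, 0.2, 0.1]))

graph.add(gtsam.PriorFactorPose2(1, gtsam.Pose2(0.0, 0.0, 0.0), prior_noise))
graph.add(gtsam.BetweenFactorPose2(1, 2, gtsam.Pose2(2.0, 0.0, 0.0), odom_noise))
graph.add(gtsam.BetweenFactorPose2(2, 3, gtsam.Pose2(2.0, 0.0, 0.0), odom_noise))

# Noisy initial guesses, e.g. from dead reckoning.
initial = gtsam.Values()
initial.insert(1, gtsam.Pose2(0.5, 0.0, 0.2))
initial.insert(2, gtsam.Pose2(2.3, 0.1, -0.2))
initial.insert(3, gtsam.Pose2(4.1, 0.1, 0.1))

# Nonlinear optimization returns the most probable trajectory given all factors.
result = gtsam.LevenbergMarquardtOptimizer(graph, initial).optimize()
print(result)
```

In practice, each additional sensor simply contributes more factors to the same graph, which is what lets the approach combine sparse, heterogeneous measurements into one consistent estimate.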
SwRI plans to integrate VORA technology into other autonomy systems and test it on an off-road course at SwRI’s San Antonio campus.
SwRI has made safety and security a priority in the development of autonomous vehicles and automated driving systems as the technology reaches advanced levels of readiness for commercial and governmental use.
To learn more about this project, visit Autonomous Vehicle Research & Testing, or contact Robert Crowe, (210) 522-4630, Communications Department, Southwest Research Institute, 6220 Culebra Road, San Antonio, TX 78238-5166.