Autonomous vehicles are becoming increasingly common across many industries and around the world. From self-driving cars to automated guided vehicles to autonomous robots and machines, the concept has proven its tremendous value. However, the future of this technology is still somewhat up in the air, as it continues to be researched and improved upon.
One of the biggest concerns surrounding this technology is its safety and reliability. Tony Zarola, general manager at Analog Devices (ADI), offers insight into what the future holds for autonomous vehicles and how sensors, specifically inertial measurement units, play the biggest role in ensuring the success of this technology.
What is the role of an IMU in autonomous vehicles?
Several modalities need to work in near-perfect synchrony to enable fully functional, fully autonomous machines. Among these, perception technologies such as radar, lidar, and cameras receive much of the credit for advancing the autonomous movement. However, the unsung hero, and the critical linchpin on which these modalities hinge for both image stabilization and vehicle navigation, is the inertial measurement unit, or IMU.
IMUs are based on multi-axis combinations of precision gyroscopes, accelerometers, and in some cases magnetometers. This plug-and-play technology reliably senses and processes multiple degrees of freedom, even in highly complex applications and under dynamic conditions, providing a consistent signal when other modalities are compromised. IMUs feature full factory calibration, embedded compensation, sensor processing, and a simple programmable interface.
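To make "multiple degrees of freedom" concrete, the minimal Python sketch below models one six-degree-of-freedom sample: three angular rates from the gyroscopes and three linear accelerations from the accelerometers. The scale factors and sample values are hypothetical illustrations, not figures from any ADI datasheet.

```python
from dataclasses import dataclass

@dataclass
class ImuSample:
    """One six-degree-of-freedom sample: three angular rates plus three accelerations."""
    gyro_dps: tuple   # (x, y, z) angular rate in degrees per second
    accel_g: tuple    # (x, y, z) linear acceleration in g

# Hypothetical scale factors for a 16-bit, factory-calibrated part;
# real values come from the specific device's datasheet.
GYRO_LSB_PER_DPS = 65.5
ACCEL_LSB_PER_G = 4000.0

def convert(raw_gyro, raw_accel):
    """Convert raw signed register counts into physical units."""
    return ImuSample(
        gyro_dps=tuple(c / GYRO_LSB_PER_DPS for c in raw_gyro),
        accel_g=tuple(c / ACCEL_LSB_PER_G for c in raw_accel),
    )

# A vehicle turning gently while sitting level: small yaw rate, roughly 1 g on the z axis.
print(convert(raw_gyro=(7, -3, 131), raw_accel=(24, -16, 4000)))
```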
Let’s step back from full autonomy for a second and talk about something more immediate in terms of market pull. If you consider advancing autopilot, also known as hands-off driving, you may be surprised to learn that the critical specification is system latency. While it takes an IMU mere microseconds to update and output a relative measure of location, cameras, lidar, and radar take milliseconds, which is a lifetime when you consider highway speeds. Combined with environmental noise that creates perception blind spots, these longer update intervals leave the fusion engine with information lags and gaps. It is during these critical ‘flying blind’ moments that the machine receives and relies on the speedier data supplied by the IMU. Beyond that, IMUs are the only sensing mechanism that cannot be degraded, and potentially rendered useless, by external conditions. Adverse weather conditions, excess heat or cold, wavelength restrictions, glare, and many other external factors can impede the reliability and functioning of other sensing modalities, making IMUs essential when those conditions arise.
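As a rough illustration of that latency gap, here is a short Python sketch (illustrative only; the 2 kHz IMU rate, 20 Hz perception rate, and motion values are assumptions, not specifications) that dead-reckons a simple 2-D pose from gyro and accelerometer samples between two perception frames, showing how far a highway-speed vehicle travels on IMU data alone before the next camera, lidar, or radar update arrives.

```python
import math

# Assumed rates for illustration only: a high-rate IMU at 2 kHz versus a
# camera/lidar/radar stack updating at 20 Hz.
IMU_DT = 1.0 / 2000.0          # 0.5 ms between IMU samples
PERCEPTION_DT = 1.0 / 20.0     # 50 ms between perception updates

def dead_reckon(x, y, heading, speed, gyro_z, accel_fwd, dt):
    """Propagate a simple 2-D pose with one gyro/accelerometer sample."""
    heading += gyro_z * dt                 # integrate yaw rate into heading
    speed += accel_fwd * dt                # integrate forward acceleration into speed
    x += speed * math.cos(heading) * dt    # integrate speed into position
    y += speed * math.sin(heading) * dt
    return x, y, heading, speed

# Highway scenario: 30 m/s (~108 km/h), gentle curve, constant speed.
x = y = heading = 0.0
speed = 30.0
steps_per_gap = int(PERCEPTION_DT / IMU_DT)   # IMU samples per perception frame

for _ in range(steps_per_gap):
    # In a real system these values come from the IMU; here they are synthetic.
    x, y, heading, speed = dead_reckon(x, y, heading, speed,
                                       gyro_z=0.05,     # yaw rate, rad/s
                                       accel_fwd=0.0,   # forward acceleration, m/s^2
                                       dt=IMU_DT)

print(f"{steps_per_gap} IMU updates fill one 50 ms perception gap")
print(f"vehicle moved {math.hypot(x, y):.2f} m before the next camera/lidar frame")
```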
What other autonomous applications are currently in use?
As autonomous technologies continue to develop and undergo refinements, many industries are applying customized autonomous solutions to supplement a range of functions. New investments are pouring into markets that have the potential for a large ROI as the economy becomes increasingly integrated and data-driven.
Construction and agriculture stand out as sectors working toward more autonomous practices, maximizing efficiency and safety to better manage worksites and crop fields. Last-mile delivery is another area that has seen a spike in its use of autonomous machines. With the COVID-19 outbreak adding to an already high-pressure ecosystem, development activity around automated guided vehicles (AGVs), drones, and driverless trucks for delivering goods has increased sharply over the last few months.
Advances in automation are driving factories’ shift into the fourth industrial revolution, also known as Industry 4.0, enhancing their processes and operations through data and machine learning. Robots and cobots that perform tasks within an assembly line are obvious applications of increased autonomy, but a growing number of industrial facilities are also introducing AGVs that can move goods around within a geofenced area, allowing orders to be filled faster at a delivery facility, for example.
Zeroing in on autonomous vehicles, what are the specific challenges for this fast-growing and highly visible industry?
Autonomous vehicles are an exciting extension of autonomous machines. However, while we’ve all been enamored with the concept, potential implications, and benefits of a driverless future, truly autonomous, personally owned cars are not yet technically or financially viable. Autonomous vehicles will impact the safety and lives of millions of people who use our roads on any given day, and there are vast, nuanced challenges that need to be overcome before the underlying technology can be introduced in non-controlled environments.
Although it is common to see vehicles that operate in a limited, speed-constrained autopilot mode or can self-park, the challenge ahead of us is to add speed without losing safety, getting these semi-autonomous vehicles onto roadways without relying on an alert driver. New technologies, including high-performance IMUs, will help expand the speed envelope. There are many other hurdles to overcome: the system will have to account for an unconstrained environment, and mapping the entire world (including every lamppost, streetlight, and stop sign), not to mention accounting for pedestrians and unpredictable encounters with wildlife, is a gargantuan task.
Original equipment manufacturers are faced with two options. They can advance existing technologies such as automatic emergency braking, valet parking, and autopilot, or invest in future technologies that seek to move the needle faster toward fully autonomous driving. However, ultimately, they need people to buy their vehicles, and considerable investments in technologies that make cars too expensive for the average consumer could result in a diminished or negative ROI.
What role will IMUs possibly play in the future of autonomous vehicles?
Each sensing modality has its strengths and weaknesses, but by leveraging sensor fusion techniques, the strengths can complement one another and the weaknesses can be compensated for. Cameras can get dirty and are susceptible to glare. Radar can miss smaller objects and cannot produce a very clear image. Lidar is expensive and offers only limited usefulness in adverse weather conditions. And as noted earlier, all these modalities update more slowly than IMUs, with the time lag potentially creating a dangerous situation for occupants of the car and others on the road. These machines need trusted sensors and architectures, along with fail-safe backup systems. In this endeavor, high-performance IMUs can take on the role of vehicle arbitrator, helping to guide the vehicle when other sensing modalities fail.
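One simple way to picture that arbitrator role is a confidence-weighted fusion in which each modality reports an estimate and a confidence, and the IMU-propagated estimate dominates whenever the others are degraded. The sketch below is illustrative Python under those assumptions, not a description of any production fusion stack or ADI software.

```python
from dataclasses import dataclass

@dataclass
class SensorEstimate:
    name: str
    position_m: float      # 1-D position estimate along the lane, for simplicity
    confidence: float      # 0.0 (unusable) .. 1.0 (fully trusted)

def fuse(estimates):
    """Confidence-weighted average; degraded sensors contribute proportionally less."""
    usable = [e for e in estimates if e.confidence > 0.0]
    if not usable:
        raise ValueError("no usable sensor estimates")
    total = sum(e.confidence for e in usable)
    return sum(e.position_m * e.confidence for e in usable) / total

# Adverse-weather scenario: camera blinded by glare, lidar degraded by rain,
# radar coarse but alive, IMU dead-reckoned estimate unaffected.
readings = [
    SensorEstimate("camera", 0.0,   confidence=0.0),   # glare: rejected entirely
    SensorEstimate("lidar",  102.4, confidence=0.1),
    SensorEstimate("radar",  101.0, confidence=0.3),
    SensorEstimate("imu",    101.8, confidence=0.9),
]

print(f"fused position: {fuse(readings):.2f} m")
```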
The reason that IMUs work when all other modalities fail is their robustness to environmental effects. Beyond being unaffected by changes in temperature, IMUs such as those from Analog Devices feature vibration rejection, which eliminates noise (such as the vibration generated by driving on rough roads) from the signal. Without this feature, vibration from a moving vehicle would come through as signal and corrupt the coordinates and direction, and therefore the safety of the car. This ability to reject such conditions means the output of the sensor is uncompromised. Yes, a drawback of IMUs is the drift inherent in all gyroscopes, but with the right IMU, one that offers a greater degree of environmental immunity, engineers can predict error growth and incorporate that knowledge into the algorithm to maintain the reliability of the IMU.
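As a hedged sketch of what predicting error growth can look like: for a gyroscope whose error is dominated by angle random walk and in-run bias, the expected heading error after a period of pure inertial dead reckoning grows in a well-characterized way with time, so a navigation filter can bound how long the IMU alone can be trusted. The coefficients below are placeholders for illustration, not specifications for any particular ADI part.

```python
import math

# Placeholder error coefficients (not specs for any real device):
ARW_DEG_PER_RT_HR = 0.3      # angle random walk, deg/sqrt(hour)
BIAS_DEG_PER_HR = 2.0        # uncompensated in-run bias, deg/hour

def heading_error_deg(t_seconds):
    """Predicted 1-sigma heading error after t seconds of pure gyro integration."""
    t_hr = t_seconds / 3600.0
    random_walk = ARW_DEG_PER_RT_HR * math.sqrt(t_hr)   # grows with sqrt(time)
    bias_drift = BIAS_DEG_PER_HR * t_hr                 # grows linearly with time
    return math.sqrt(random_walk**2 + bias_drift**2)    # combine independent terms

for t in (0.05, 1.0, 10.0, 60.0):
    print(f"after {t:6.2f} s of dead reckoning: ~{heading_error_deg(t):.4f} deg heading error")
```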
Ultimately, the future of autonomy comes down to reliability. What modalities will navigate this autonomous vehicle, drone, or robot as quickly, safely, and efficiently as possible? What will be reliable when a split-second decision could be the difference between life and serious injury (or even death)?