Fun Innovations Friday: Thermal Imaging Innovation Creates Broad Daylight In Pitch Black
Traditional machine vision and perception, while still advancing, has hit something of a plateau in industrial use. Yes, there are countless cameras, sensors, and AI components that deliver exceptional machine vision results, but there are still areas where machine vision hasn't caught up yet.
Machine vision uses cameras to capture visual information from the surrounding environment, processes those images using a combination of hardware and software, and then prepares the information for use in various applications. Specialized cameras usually acquire the images so that their characteristics can be processed, analyzed, and measured in fine detail.
For example, a machine vision application in a manufacturing process can analyze a certain characteristic of a part on an assembly line, determine whether the part meets product quality criteria, and reject it if it does not.
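To make that example concrete, here is a minimal sketch of such an inspection step in Python with OpenCV. The file name, threshold, and area tolerances are illustrative assumptions, not details from any specific system.

```python
import cv2

# Load a grayscale image of a part captured by an inline camera
# (hypothetical file name).
part = cv2.imread("part_0001.png", cv2.IMREAD_GRAYSCALE)
if part is None:
    raise FileNotFoundError("part_0001.png not found")

# Separate the part from the background with a simple fixed threshold
# (assumed value of 128).
_, mask = cv2.threshold(part, 128, 255, cv2.THRESH_BINARY)

# Measure one characteristic of the part -- here, its visible area in pixels.
area = cv2.countNonZero(mask)

# Compare against an assumed quality criterion and flag failures for rejection.
MIN_AREA, MAX_AREA = 45_000, 55_000
print("PASS" if MIN_AREA <= area <= MAX_AREA else "REJECT", area)
```

A real inspection system would calibrate lighting and optics and use far more robust measurements, but the capture, process, decide loop is the same.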
The advancements we've seen in machine vision's ability to catch defects are amazing, but what happens when you pivot those capabilities to other applications, such as automated cars and helper robots, especially when they're navigating in the dark?
Researchers at Purdue University are advancing the world of robotics and autonomy with their patent-pending method that improves traditional machine vision and perception, specifically nighttime machine vision.
Zubin Jacob, the Elmore Associate Professor of Electrical and Computer Engineering in the Elmore Family School of Electrical and Computer Engineering, and research scientist Fanglin Bao have developed HADAR, or heat-assisted detection and ranging.
“Each of these agents will collect information about its surrounding scene through advanced sensors to make decisions without human intervention,” Jacob said, referring to agents such as automated cars and helper robots. “However, simultaneous perception of the scene by numerous agents is fundamentally prohibitive.”
HADAR combines thermal physics, infrared imaging, and machine learning to pave the way to fully passive and physics-aware machine perception. The patent-pending innovation sees texture and depth and perceives the physical attributes of people and environments as they would appear in broad daylight.
“Our work builds the information-theoretic foundations of thermal perception to show that pitch darkness carries the same amount of information as broad daylight. Evolution has made human beings biased toward the daytime. Machine perception of the future will overcome this long-standing dichotomy between day and night,” Jacob said.
Bao said, “HADAR vividly recovers the texture from the cluttered heat signal and accurately disentangles temperature, emissivity, and texture, or TeX, of all objects in a scene. It sees texture and depth through the darkness as if it were day and also perceives physical attributes beyond RGB, or red, green, and blue, visible imaging or conventional thermal sensing. It is surprising that it is possible to see through pitch darkness like broad daylight.”
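To get a feel for why that disentanglement is hard, here is a simplified Python sketch of the forward model behind a single thermal measurement, using the standard emitted-plus-reflected radiance relation. The notation, function names, and numbers (emissivity, temperatures, the 10-micrometer band) are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

# Physical constants for Planck's law (spectral radiance vs. wavelength).
H, C, KB = 6.62607015e-34, 2.99792458e8, 1.380649e-23

def planck(wavelength_m: float, temperature_k: float) -> float:
    """Blackbody spectral radiance B(lambda, T), in W / (m^2 sr m)."""
    x = H * C / (wavelength_m * KB * temperature_k)
    return (2 * H * C**2 / wavelength_m**5) / np.expm1(x)

def thermal_signal(emissivity: float, temperature_k: float,
                   environment_radiance: float, wavelength_m: float) -> float:
    """Simplified model of the radiance a thermal camera records at one pixel:
    what the object emits itself plus what it reflects from its surroundings."""
    emitted = emissivity * planck(wavelength_m, temperature_k)
    reflected = (1.0 - emissivity) * environment_radiance
    return emitted + reflected

# Example: a surface at 305 K with emissivity 0.9, viewed at 10 micrometers,
# in surroundings that radiate roughly like a 295 K blackbody (assumed numbers).
env = planck(10e-6, 295.0)
print(thermal_signal(0.9, 305.0, env, 10e-6))
```

What HADAR tackles is the inverse problem: recovering temperature, emissivity, and the scattered texture term from many such spectral measurements, the very detail that this kind of mixing washes out of a raw thermal image.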
Traditional thermal imaging is a fully passive sensing method that collects invisible heat radiation originating from all objects in a scene. It can sense through darkness, inclement weather, and solar glare. However, it's not without some fundamental challenges that hinder its use more broadly.
Traditional active sensors like LiDAR, or light detection and ranging, radar, and sonar emit signals and subsequently receive them to collect 3D information about a scene. These methods have drawbacks that increase as they are scaled up, including signal interference and risks to people’s eye safety. In comparison, video cameras that rely on sunlight or other sources of illumination are advantageous, but low-light conditions such as nighttime, fog, or rain present a serious impediment.
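For context, the 3D information those active sensors collect comes from round-trip timing of the emitted signal. Here is a tiny sketch with made-up numbers:

```python
# Round-trip ranging as used by active sensors such as lidar and radar.
C = 299_792_458.0  # speed of light, m/s

def range_from_echo(round_trip_seconds: float) -> float:
    """Distance to a target from the time an emitted pulse takes to return."""
    return C * round_trip_seconds / 2.0

# Example: an echo that returns 200 nanoseconds after emission
# puts the target roughly 30 meters away.
print(range_from_echo(200e-9))  # ~29.98
```

HADAR, by contrast, emits nothing; it infers depth and texture from the thermal radiation the scene already gives off.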
“Objects and their environment constantly emit and scatter thermal radiation, leading to textureless images famously known as the ‘ghosting effect,’” Bao said.
“Thermal pictures of a person’s face show only contours and some temperature contrast; there are no features, making it seem like you have seen a ghost. This loss of information, texture, and features is a roadblock for machine perception using heat radiation.”
The team tested HADAR TeX vision using an off-road nighttime scene. “HADAR TeX vision recovered textures and overcame the ghosting effect,” Bao said. “It recovered fine textures such as water ripples, bark wrinkles, and culverts in addition to details about the grassy land.”
Ongoing work on HADAR focuses on reducing the size of the hardware and increasing the data collection speed.
“The current sensor is large and heavy since HADAR algorithms require many colors of invisible infrared radiation,” Bao said. “To apply it to self-driving cars or robots, we need to bring down the size and price while also making the cameras faster. The current sensor takes around one second to create one image, but for autonomous cars we need a frame rate of around 30 to 60 hertz, or frames per second.”
HADAR TeX vision’s initial applications are automated vehicles and robots that interact with humans in complex environments. The technology could be further developed for agriculture, defense, geosciences, health care, and wildlife monitoring applications.
Jacob and Bao disclosed HADAR TeX to the Purdue Innovates Office of Technology Commercialization, which has applied for a patent on the intellectual property. Industry partners seeking to further develop the innovation should contact Dipak Narula, [email protected], about 2020-JACO-68773.
Jacob and Bao have received funding from DARPA to support their research. The Office of Technology Commercialization awarded Jacob $50,000 through its Trask Innovation Fund to further develop the research.