Purdue researchers enable robots to see in pitch darkness as well as in daylight


Researchers at Purdue University have developed a patent-pending vision method that improves on traditional machine vision and perception. The system, called HADAR, or heat-assisted detection and ranging, allows robots to see in the dark as well as they do in sunlight.

The Purdue research team included Zubin Jacob, the Elmore Associate Professor of Electrical and Computer Engineering in the Elmore Family School of Electrical and Computer Engineering, and research scientist Fanglin Bao. The team’s research was recently featured on the cover of Nature.

HADAR combines thermal physics, infrared imaging, and machine learning to create fully passive and physics-aware machine perception. It fills a gap left by traditional thermal sensing methods, which collect the invisible heat radiation originating from all objects in a scene.

Traditional thermal methods do have some advantages over other vision systems: active sensors like LiDAR, radar, and sonar, which emit signals and receive their reflections to collect 3D information about a scene, and passive cameras.

LiDAR, radar, and sonar, for example, have drawbacks that increase when they’re scaled up, including signal interference and risks to people’s eyes. Cameras don’t have these drawbacks, but they don’t work well in low light, fog, or rain. 

While thermal imaging methods don’t have these drawbacks, they do typically provide less information than LiDAR, radar, sonar, and cameras. 

“Objects and their environment constantly emit and scatter thermal radiation, leading to textureless images famously known as the ‘ghosting effect,’” Bao said. “Thermal pictures of a person’s face show only contours and some temperature contrast; there are no features, making it seem like you have seen a ghost. This loss of information, texture and features is a roadblock for machine perception using heat radiation.”

“HADAR vividly recovers the texture from the cluttered heat signal and accurately disentangles temperature, emissivity and texture, or TeX, of all objects in a scene,” Bao said. “It sees texture and depth through the darkness as if it were day and also perceives physical attributes beyond RGB, or red, green and blue, visible imaging or conventional thermal sensing. It is surprising that it is possible to see through pitch darkness like broad daylight.”
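The decomposition Bao describes can be illustrated with a toy gray-body model. To be clear, this is only an illustrative sketch, not the published HADAR algorithm, which also recovers per-object texture from scattered radiation and relies on trained neural networks. In the sketch below, each infrared band's measured signal is modeled as direct blackbody emission scaled by emissivity plus reflected environmental radiation, and a brute-force least-squares fit recovers temperature and emissivity from multi-band measurements. The function names, band choices, and search ranges are all hypothetical.

```python
import math

# Physical constants (SI units)
H = 6.626e-34    # Planck constant (J*s)
C = 2.998e8      # speed of light (m/s)
KB = 1.381e-23   # Boltzmann constant (J/K)

def planck_radiance(wavelength_m, temp_k):
    """Spectral radiance of an ideal blackbody (Planck's law)."""
    a = 2.0 * H * C ** 2 / wavelength_m ** 5
    b = H * C / (wavelength_m * KB * temp_k)
    return a / (math.exp(b) - 1.0)

# Hypothetical sensing bands in the 8-14 micrometer atmospheric window.
BANDS = [8e-6, 10e-6, 12e-6, 14e-6]

def observed_signal(temp_k, emissivity, env_temp_k=300.0):
    """Toy gray-body signal model, per band:
        S = e * B(T) + (1 - e) * B(T_env)
    i.e. direct emission plus reflected environmental radiation
    (Kirchhoff's law links reflectance to 1 - emissivity)."""
    return [emissivity * planck_radiance(w, temp_k)
            + (1.0 - emissivity) * planck_radiance(w, env_temp_k)
            for w in BANDS]

def recover_tex(signal, env_temp_k=300.0):
    """Brute-force search for the (temperature, emissivity) pair whose
    modeled multi-band signal best matches the measurement."""
    best_t, best_e, best_err = None, None, float("inf")
    for t in range(250, 401):                 # 250 K .. 400 K, 1 K steps
        for e100 in range(1, 101):            # emissivity 0.01 .. 1.00
            e = e100 / 100.0
            model = observed_signal(float(t), e, env_temp_k)
            err = sum((m - s) ** 2 for m, s in zip(model, signal))
            if err < best_err:
                best_t, best_e, best_err = float(t), e, err
    return best_t, best_e
```

For example, a simulated measurement of a 310 K object with emissivity 0.95 (`observed_signal(310.0, 0.95)`) is disentangled back into roughly those two values by `recover_tex`. This also hints at why Bao cites the need for "many colors of invisible infrared radiation": a single band cannot separate a hot, dull object from a cool, shiny one.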

The research team tested HADAR TeX vision using an off-road nighttime scene. During testing, they found that HADAR TeX was able to pick up on textures, even fine textures like water ripples, bark wrinkles, and culverts. 

While the results are encouraging so far, there are still some important improvements the team wants to make to HADAR, particularly reducing the size of its hardware and increasing its data collection speed.

“The current sensor is large and heavy since HADAR algorithms require many colors of invisible infrared radiation,” Bao said. “To apply it to self-driving cars or robots, we need to bring down the size and price while also making the cameras faster. The current sensor takes around one second to create one image, but for autonomous cars we need around a 30- to 60-hertz frame rate, or frames per second.”

Jacob and Bao disclosed HADAR TeX to the Purdue Innovates Office of Technology Commercialization, which has applied for a patent on the intellectual property. 
