Researchers at Purdue University have made a significant advance in robotics, machine vision, and perception. Their approach offers a marked improvement over conventional sensing techniques, promising a future in which machines perceive their surroundings more effectively and safely than ever before.
Introducing HADAR: A Revolutionary Leap in Machine Perception
Zubin Jacob, the Elmore Associate Professor of Electrical and Computer Engineering, in collaboration with research scientist Fanglin Bao, introduced a pioneering method named HADAR, short for heat-assisted detection and ranging. Their innovation has drawn substantial attention, heightening anticipation about HADAR’s potential applications across a range of sectors.
Traditionally, machine perception has depended on active sensors such as LiDAR, radar, and sonar, which emit signals to gather three-dimensional information about their surroundings. These methods present challenges, however, especially when scaled up: they are prone to signal interference and can even pose risks to human safety. The limitations of video cameras in low-light conditions and the ‘ghosting effect’ in conventional thermal imaging have further complicated machine perception.
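For context, the “active” part of these sensors is a simple round-trip measurement: the device times the echo of a pulse it emits itself. The sketch below illustrates the basic time-of-flight calculation; the echo delays are invented example values and the snippet is not tied to any particular sensor.

```python
# Illustrative time-of-flight ranging, as used conceptually by active
# sensors such as LiDAR (light pulses) and sonar (sound pulses).
# The echo delays passed in below are made-up example values.

SPEED_OF_LIGHT = 2.998e8   # m/s, for LiDAR/radar pulses
SPEED_OF_SOUND = 343.0     # m/s in air, for sonar pulses

def range_from_echo(delay_s: float, wave_speed: float) -> float:
    """Distance to a target from the round-trip delay of an emitted pulse."""
    return wave_speed * delay_s / 2.0

print(range_from_echo(200e-9, SPEED_OF_LIGHT))  # ~30 m for a LiDAR echo
print(range_from_echo(0.1, SPEED_OF_SOUND))     # ~17 m for a sonar echo
```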
HADAR seeks to address these challenges. “Objects and their environment constantly emit and scatter thermal radiation, leading to textureless images famously known as the ‘ghosting effect,’” Bao elaborated. He continued, “Thermal pictures of a person’s face show only contours and some temperature contrast; there are no features, making it seem like you have seen a ghost. This loss of information, texture, and features is a roadblock for machine perception using heat radiation.”
HADAR’s solution is a combination of thermal physics, infrared imaging, and machine learning, enabling fully passive and physics-aware machine perception. Jacob emphasized the paradigm shift that HADAR brings about, stating, “Our work builds the information theoretic foundations of thermal perception to show that pitch darkness carries the same amount of information as broad daylight. Evolution has made human beings biased toward the daytime. Machine perception of the future will overcome this long-standing dichotomy between day and night.”
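The article does not detail the algorithm, but the general idea of “physics-aware” passive sensing can be sketched: fit a thermal-physics model to each pixel of a many-band infrared image, attribute what the model explains to temperature and emissivity, and treat the residual as texture-like scattered signal. The following is a minimal sketch under assumed simplifications (a gray-body Planck model and a coarse temperature grid search); it is not the authors’ TeX algorithm.

```python
# Minimal sketch of a "physics-aware" decomposition of passive thermal
# data. Assumptions (not from the article): a gray-body emission model,
# a coarse grid search over temperature, and per-pixel spectra measured
# at several infrared wavelengths. HADAR's actual TeX algorithms are
# more sophisticated; this only conveys the flavor of the idea.
import numpy as np

H = 6.626e-34    # Planck constant, J*s
C = 2.998e8      # speed of light, m/s
KB = 1.381e-23   # Boltzmann constant, J/K

def planck_radiance(wl_m, temp_k):
    """Black-body spectral radiance B(lambda, T)."""
    return (2 * H * C**2 / wl_m**5) / (np.exp(H * C / (wl_m * KB * temp_k)) - 1)

def fit_pixel(spectrum, wavelengths_m, temps_k=np.arange(250.0, 340.0, 0.5)):
    """Return (temperature, emissivity, residual) for one pixel's spectrum.

    Temperature comes from a grid search; emissivity is the least-squares
    scale between the measured spectrum and the black-body curve; the
    residual is whatever the gray-body model cannot explain (loosely, the
    scattered signal that plain thermal imaging washes out).
    """
    best_t, best_eps, best_err = None, None, np.inf
    for t in temps_k:
        bb = planck_radiance(wavelengths_m, t)
        eps = np.clip(spectrum @ bb / (bb @ bb), 0.0, 1.0)
        err = float(np.sum((spectrum - eps * bb) ** 2))
        if err < best_err:
            best_t, best_eps, best_err = t, eps, err
    return best_t, best_eps, best_err

# Example: a synthetic pixel at 300 K with emissivity 0.9 in the LWIR band.
wl = np.linspace(8e-6, 14e-6, 10)
print(fit_pixel(0.9 * planck_radiance(wl, 300.0), wl)[:2])  # ~(300.0, 0.9)
```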
Practical Implications and Future Directions
The effectiveness of HADAR was underscored by its ability to recover textures in an off-road nighttime scenario. “HADAR TeX vision recovered textures and overcame the ghosting effect,” Bao noted. It accurately delineated intricate patterns like water ripples and bark wrinkles, showcasing its superior sensory capabilities.
However, before HADAR can be integrated into real-world applications such as self-driving cars or robots, there are challenges to address. Bao remarked, “The current sensor is large and heavy since HADAR algorithms require many colors of invisible infrared radiation. To apply it to self-driving cars or robots, we need to bring down the size and price while also making the cameras faster.” The goal is to raise the frame rate of the sensor, which currently produces one image per second, to meet the demands of autonomous vehicles.
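To put the one-frame-per-second figure in perspective, a quick back-of-envelope calculation shows how far a vehicle moves between frames. The 1 fps number comes from the article; the vehicle speed and the faster rates below are assumed purely for illustration.

```python
# Distance a vehicle travels between consecutive frames at various frame
# rates. The 1 fps figure comes from the article; the speed and the
# faster rates are assumed for illustration.
speed_m_per_s = 30.0  # roughly 108 km/h, an assumed highway speed
for fps in (1, 10, 30):
    print(f"{fps:>2} fps -> {speed_m_per_s / fps:5.1f} m between frames")
# At 1 fps the scene changes by ~30 m between frames, which is why the
# team wants faster cameras before deployment on autonomous vehicles.
```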
In terms of applications, while HADAR TeX vision is currently tailored for automated vehicles and robots, its potential extends much further. From agriculture and defense to health care and wildlife monitoring, the possibilities are vast.
In recognition of their groundbreaking work, Jacob and Bao have secured funding from DARPA and were awarded $50,000 from the Office of Technology Commercialization’s Trask Innovation Fund. The duo has disclosed their innovation to the Purdue Innovates Office of Technology Commercialization, taking the initial steps to patent their creation.
This transformative research from Purdue University is set to redefine the boundaries of machine perception, making way for a safer, more efficient future in robotics and beyond.