Nighttime is when cameras struggle, but a sensing method called HADAR, short for heat-assisted detection and ranging, lets machines see detail in darkness.
In tests it recovered texture, distance, and material information at night with accuracy similar to stereo cameras in daylight.
That kind of night vision could change how automated cars, drones, and helper robots move through the world.
Researchers at Purdue University and Michigan State University (MSU) designed HADAR for situations where millions of machines may someday share the same roads and skies.
Why machines struggle in the dark
The work was led by Zubin Jacob, an Elmore Professor of Electrical and Computer Engineering at Purdue University in Indiana. His research focuses on using light and thermal radiation to invent new sensors and imaging methods for autonomous systems.
Modern robots rely on machine perception, computer-based sensing of the environment that feeds their decision making.
To understand the world, many prototypes mix cameras with sonar, radar, and laser rangefinders that send out energy and watch what returns.
Among these, LiDAR, a laser-based distance sensing method for 3D mapping, is especially important for navigation.
But when many vehicles and robots carry active sensors, their signals can interfere with one another, and laser power is capped by strict eye-safety rules, making the approach hard to scale.
Passive thermal cameras take a very different approach, recording thermal radiation, invisible heat energy emitted by every object above absolute zero.
Studies of infrared imaging show that these cameras work in darkness and fog, but their pictures lack contrast and detail.
What makes HADAR different
One major limitation is the ghosting effect, a loss of image texture in heat pictures. Because objects and their surroundings constantly emit and scatter thermal radiation at the same time, surfaces blur into smooth, featureless glows. Later research showed that this thermal blur comes not just from lenses but from a physics-driven loss of texture. The researchers modeled how heat-based pictures wash out geometric features and proposed new algorithms that recover sharper thermal scenes.
HADAR attacks ghosting by collecting many wavelengths of thermal infrared light and feeding them into physics-aware algorithms instead of simple camera filters.
From that information the system estimates each object’s temperature and emissivity, a measure of how efficiently a surface emits thermal radiation, along with fine-scale texture.
Researchers call this trio of temperature, emissivity, and texture TeX, and it gives the computer a richer view than brightness alone. Scientists reported that the system can clearly pull out texture from a complex heat signal.
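The physics behind separating temperature from emissivity can be sketched with a toy gray-body model. This is only an illustration of the underlying idea, not the authors' actual algorithm: the wavelengths, constant emissivity, and grid-search fit below are all assumptions. Measuring a pixel at several infrared wavelengths constrains both its temperature, which sets the spectral shape through Planck's law, and its emissivity, which scales the amplitude:

```python
import numpy as np

# Physical constants (SI units)
H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
KB = 1.381e-23  # Boltzmann constant, J/K

def planck(lam, T):
    """Black-body spectral radiance B(lam, T) in W / (m^2 * sr * m)."""
    return (2 * H * C**2 / lam**5) / np.expm1(H * C / (lam * KB * T))

# Simulate one gray-body pixel: true temperature 300 K, emissivity 0.7,
# sampled at several long-wave infrared wavelengths (8-14 micrometres).
lams = np.linspace(8e-6, 14e-6, 7)
true_T, true_eps = 300.0, 0.7
measured = true_eps * planck(lams, true_T)

# Grid-search the temperature; for each candidate T the best gray-body
# emissivity has the closed-form least-squares solution
# eps = sum(L * B) / sum(B * B).
best = None
for T in np.arange(250.0, 350.0, 0.1):
    B = planck(lams, T)
    eps = np.dot(measured, B) / np.dot(B, B)
    err = np.sum((measured - eps * B) ** 2)
    if best is None or err < best[0]:
        best = (err, T, eps)

_, T_hat, eps_hat = best
print(f"recovered T = {T_hat:.1f} K, emissivity = {eps_hat:.2f}")
```

With a single wavelength, temperature and emissivity are entangled: a warm dull surface and a cool shiny one can emit the same radiance. Sampling many wavelengths fixes the spectral shape and breaks that ambiguity, which is why HADAR records a full heat spectrum at every pixel rather than one brightness value.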
Texture, depth, and distance at night
In their first outdoor tests, the team aimed HADAR at a rough off road nighttime scene with grass, water, tree trunks, and concrete structures.
Instead of the bright blobs that standard thermal cameras produce, the new system revealed bark wrinkles, water ripples, culvert edges, and subtle ground patterns.
“Pitch darkness carries the same amount of information as broad daylight,” said Jacob. He argues that future machine perception will treat night and day as equally informative, removing a bias built into human vision.
After computing TeX for every pixel, HADAR infers how far away each region is and builds a three-dimensional map for navigation.
That map comes from heat alone, without shining light or sound into the environment, so machines could watch the same area without interference.
In one demonstration, the researchers arranged a dark-colored car, a human driver, and a cardboard cutout of Albert Einstein on a nighttime track.
Conventional cameras and LiDAR struggled to tell the person from the cardboard, but HADAR separated skin, fabric, and material types.
How HADAR will advance robotics
For automated vehicles, robust sensing in poor weather and at night is vital because tiny mistakes can quickly lead to serious crashes.
Existing radar, LiDAR, and camera systems work well in many cases but each can fail under glare, heavy rain, or low-contrast lighting.
Because HADAR is passive, it does not add signals to crowded streets or skies, which reduces interference risks as fleets of automated machines grow.
It also sees properties like temperature and emissivity, so it can distinguish a pedestrian from a statue when they look similar in visible light.
This ability to read heat, texture, and distance without extra light could help farmers monitor crop health at night or detect animals in vegetation.
Hospitals and firefighters might one day use HADAR tools to spot temperature patterns that signal infection, hidden people, or smoldering hotspots in smoky environments.
Right now the HADAR camera is bulky and slow, because the algorithms need many infrared colors and a lot of signal processing.
Study author Fanglin Bao notes that the current sensor takes about one second per image, while self-driving cars need about 30 to 60 frames per second.
Engineers must raise the frame rate, the number of images a camera captures per second, and shrink the optics so the system fits into cars and robots.
They are also working on faster data pipelines and more efficient computing hardware to crunch the incoming heat data in real-time.
HADAR is an early prototype, yet it shows that thermal signals carry structure that lets machines read the night as clearly as the day.
If researchers solve the engineering challenges, future robots and vehicles could treat darkness not as a hazard but as another source of useful information.
The study is published in Nature.