When Australian Automotive looked at sensors for autonomous vehicles a couple of years ago, we concentrated mainly on LiDAR. This highly capable technology, combined with ultrasonic sensing, radar and visual images from cameras, was seen as the sensing quartet that would safely guide us into the future. And it’s true, as long as we don’t have to drive into blinding sun, driving rain, heavy fog or maybe even smoke.
In current hands-off, semi-autonomous developmental systems, challenging conditions like those mentioned usually call for a human driver to take control. That’s a long way from full autonomy. What’s more, humans are often next to useless in those conditions anyway.
We all know how difficult it is to see into direct, late-day sun. Also, suddenly asking a human to snap out of a relaxed, distracted state, make split-second assessments of a highly dynamic situation and then act on them safely is plainly ridiculous, particularly when the situation is beyond human physical capabilities to begin with. Dealing with these conditions reliably is one of the great challenges facing full Level 5 vehicle autonomy.
Thermal imaging, based on the detection of infrared radiation, is a technology that can help provide an effective solution to these problems. Imaging in the infrared makes visible all sorts of things normally invisible to the human eye and, importantly, does so under a wide range of conditions, both day and night.
It’s sometimes thought that night-vision systems and thermal imaging systems are the same, but they’re not. Night vision systems rely on the amplification of light that’s being reflected from a scene. Even if there’s very little light available, a high-quality night vision system will capture it, enhance it and turn it into one of those green images so prized by spies in relatively recent times. But if there’s no light, no image is possible.
Thermal imaging has nothing to do with reflected light. A thermal imaging system is sensitive to thermal energy emitted in the form of infrared radiation. This covers the widest range of objects because anything with a temperature above absolute zero (0 kelvin, or minus 273 degrees Celsius) emits thermal radiation. That covers basically everything, and certainly anything relevant to a moving vehicle. Even ice cubes emit infrared radiation.
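To put a rough number on that claim: the Stefan-Boltzmann law gives the power an ideal (black-body) surface radiates per square metre at absolute temperature T. Treating an ice cube as an ideal emitter (real surfaces radiate somewhat less), it still gives off roughly 300 watts per square metre:

```latex
% Stefan-Boltzmann law: radiant exitance of an ideal (black-body) surface
M = \sigma T^{4}, \qquad \sigma \approx 5.67 \times 10^{-8}\ \mathrm{W\,m^{-2}\,K^{-4}}
% Worked example: an ice cube at T = 273\ \mathrm{K} (0 degrees Celsius)
M \approx 5.67 \times 10^{-8} \times 273^{4} \approx 315\ \mathrm{W\,m^{-2}}
```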
The hotter an object, the higher the frequency of the radiation it emits. A thermal camera converts the differences in the frequencies (that is, the temperatures) of the objects in a scene into a picture. Such cameras are used in a wide range of industrial applications. Excessive heat, for instance, is often a precursor to component failure, so observing the thermal signature of, say, a bearing can indicate whether it’s likely to fail, and preventative maintenance is much less expensive than a failed component. Heat leaks from systems can also be found and remedied.
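As a rough illustration of that temperature-to-picture step (a minimal sketch only, not how any real FLIR core works; the function names and the 80-degree alarm threshold are invented for the example), the idea is simply to map measured temperatures onto grey levels and flag anything unusually hot, much like the bearing-monitoring case above:

```python
# Illustrative sketch: turn a grid of scene temperatures into an 8-bit
# greyscale "thermal image" and flag pixels hot enough to suggest, say,
# an overheating bearing. Not any camera vendor's actual pipeline.
import numpy as np

def temperatures_to_image(temps_c: np.ndarray) -> np.ndarray:
    """Linearly map a 2-D array of temperatures (deg C) to 0-255 grey levels."""
    t_min, t_max = temps_c.min(), temps_c.max()
    span = max(t_max - t_min, 1e-6)          # avoid divide-by-zero on a flat scene
    return ((temps_c - t_min) / span * 255).astype(np.uint8)

def hot_spots(temps_c: np.ndarray, threshold_c: float = 80.0) -> np.ndarray:
    """Return the (row, col) coordinates of pixels above the alarm threshold."""
    return np.argwhere(temps_c > threshold_c)

# Toy scene: a 20 C background with a 95 C "bearing" in the middle.
scene = np.full((8, 8), 20.0)
scene[3:5, 3:5] = 95.0

image = temperatures_to_image(scene)   # brightest pixels = hottest objects
alarms = hot_spots(scene)              # the four pixels above the 80 C threshold
print(image.max(), len(alarms))        # -> 255 4
```

A real thermal camera does a great deal more (calibration, emissivity correction, noise reduction), but turning temperature differences into image contrast is the core of it.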
The applications of thermal imaging are many but we’re interested in just one – the ability to distinguish one object from another, and how such differentiation can be used to help guide an autonomous car.
It’s true that the four technologies mentioned above have just about everything covered. But because thermal imaging responds to the infrared radiation objects emit rather than the intensity of reflected visible light, it isn’t affected by the blinding glare of oncoming headlights or a low, late-day sun. It could well be the technology that bridges the gaps between the four main sensing systems and finally allows the removal of humans from the vehicle control loop once and for all.
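To make that bridging argument concrete, here is a deliberately simplified, hypothetical sketch (the sensor names, confidence numbers and take-the-strongest-vote rule are invented for illustration and don’t describe any manufacturer’s fusion logic) of how a thermal channel could keep an obstacle "visible" to the control system when glare washes out the camera:

```python
# Hedged sketch of the "bridging" idea only -- not a production fusion stack.
# Each sensor reports a detection and a confidence; when glare or fog degrades
# the visible camera, the thermal channel keeps overall confidence up.
from dataclasses import dataclass

@dataclass
class SensorReading:
    name: str
    detected: bool      # did this sensor see an obstacle ahead?
    confidence: float   # 0.0 (useless) to 1.0 (fully trusted) in current conditions

def obstacle_confidence(readings: list[SensorReading]) -> float:
    """Combine sensors: take the strongest single vote that an obstacle is present."""
    votes = [r.confidence for r in readings if r.detected]
    return max(votes, default=0.0)

# Driving into low sun: the camera is blinded, but the thermal channel still
# senses the pedestrian's emitted long-wave infrared.
readings = [
    SensorReading("camera",  detected=False, confidence=0.1),  # washed out by glare
    SensorReading("radar",   detected=True,  confidence=0.4),  # weak return
    SensorReading("lidar",   detected=True,  confidence=0.3),  # degraded by glare
    SensorReading("thermal", detected=True,  confidence=0.9),  # unaffected by glare
]
print(obstacle_confidence(readings))   # -> 0.9: the thermal channel carries the decision
```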
Infrared imaging is not a new technology. As might be expected, some of the earliest uses were military. In the ’60s and later, for instance, infrared cameras allowed lumbering B-52 bombers to hug the ground to avoid radar detection.
Imaging systems of this sort are called FLIR (Forward-Looking Infrared), as opposed to SLIR (Side-Looking Infrared). A typical SLIR application is a surveillance aircraft flying along another country’s border; FLIR systems, of course, are the type used for navigation.
Given the well-established use of FLIR in guiding military aircraft in mission-critical, life-critical applications, the technology would seem a natural fit for automotive use. A number of manufacturers have offered this basic technology, but it’s usually badged as night vision and seen as a way to augment drivers’ vision in darkness. Indeed, these systems present images on screens for direct interpretation by drivers. It’s a great idea and a definite safety benefit, but not of much use for an autonomous system with no human input.
Using thermal imaging to plug the gaps in the four main sensing technologies for full autonomy doesn’t seem to be part of the plan from the major players in this area.
Last year Aptiv, Audi, BMW, Volkswagen, Intel, Continental and others published a paper called Safety First For Automated Driving. It seeks to assist in “working toward the industry-wide standardization of automated driving.” From the scope section of the publication: “The goal of this publication is to provide an overview of and guidance about the generic steps for developing and validating a safe automated driving system.” Infrared or thermal imaging isn’t mentioned despite the fact that some of these manufacturers have offered thermal imaging night vision systems on some models.
It’s true that every extra sensing system added to a vehicle adds considerable cost and increases pre-market development time, and companies have to draw the line somewhere. Tesla is a prime example of this kind of production pragmatism: as has been widely publicised, the company says it doesn’t even need all four of the basic sensors utilised by other manufacturers.
Read Hot stuff (Part 2). Alternatively, view the full article and accompanying imagery in the April 2020 issue of Australian Automotive, page 38.