In conditions with clear visibility, autonomous vehicles can now navigate well-marked roads with no surprises.
But developing a system that drives safely at night, in challenging lighting conditions, in adverse weather, and in low-visibility environmental conditions has proven far more difficult. That’s one of the major reasons why we haven’t seen mass adoption of self-driving trucks and cars despite their clear advantages.
Virtually every heavy-duty commercial truck manufacturer and many shippers and motor carriers are now experimenting with autonomous trucks. They recognize how the industry and the economy would benefit from increased automation, such as the use of advanced driver assistance systems (ADAS) and fully self-driving trucks.
THE DUST PROBLEM
Although designers are striving to develop automation systems that increase truck safety and expand the range of situations in which autonomous vehicles can safely operate, they still face the challenge of building a system that can see through dust.
“Dust vision” is one of many situations in which infrared thermal cameras increase vehicle safety. Thermal cameras are an option in luxury vehicles today, and they are increasingly playing a role in the sensor suites of autonomous vehicles. These cameras detect heat and can “see” just as well in daylight as in total darkness. Visible cameras struggle in dusty environments, such as on dirt roads or in dust storms, because light reflects off the dust particles suspended in the air.
In response, ADAS-equipped and autonomous vehicles today usually employ radar sensors as a redundant sensing technology alongside visible cameras. Radar works reasonably well in adverse weather, but no radar system today offers sufficient resolution to reliably classify objects the way a camera system can. Normally radar, and lidar, work in concert with a camera system to determine the range, and in some cases the speed, of objects. However, radar and lidar are not well suited to distinguishing a living thing, which drivers most want to avoid, from an inanimate object. That classification capability is normally left to visible cameras. In dusty environments and other challenging lighting situations where visible cameras struggle to see, thermal cameras fill the gap.
VISION THROUGH DUST WITH THERMAL CAMERAS
Various companies, institutions, and government-funded programs have demonstrated that thermal cameras can improve perception in dusty driving conditions compared to visible cameras. A large part of their effectiveness in these challenging conditions is that thermal cameras detect long-wave infrared (LWIR) energy and do not require sunlight or headlights to illuminate an object. A thermal camera “sees” by detecting the heat an object emits. In dusty conditions, both thermal cameras and radar usually hold a performance edge over visible cameras and lidar sensors.
Furthermore, dusty conditions and dust storms aren’t always predictable. In dry regions, such as along U.S. Interstate 10 through Arizona, strong winds can blow dust into the air and partially obstruct both the driver’s view and a visible-camera-based ADAS’s ability to see what’s ahead or behind. Thermal cameras also help in many environmental conditions besides dust, including most fog, darkness, sun or headlight glare, dark tunnels, rain, snow, and other low-contrast situations.
To further help drivers and ADAS systems, stereo imaging can improve the perception accuracy of thermal cameras. Stereo imaging compares a scene’s information from two vantage points, much as human eyesight does. Stereo-paired thermal cameras improve perception through dust, measuring distance by triangulating between the two camera positions and objects in the field of view.
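The triangulation behind stereo ranging reduces to a simple relationship: range equals focal length times camera baseline divided by disparity (the pixel shift of an object between the two images). A minimal sketch, with purely illustrative focal-length, baseline, and disparity values rather than figures from any real camera system:

```python
def range_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Estimate range Z (meters) to an object from stereo disparity.

    Z = f * B / d, where f is the focal length in pixels, B is the
    baseline (distance between the two cameras) in meters, and d is
    the disparity (pixel shift of the object between the two images).
    """
    if disparity_px <= 0:
        raise ValueError("object must show positive disparity")
    return focal_px * baseline_m / disparity_px

# Illustrative values: 640-px focal length, 0.5 m camera baseline,
# and a vehicle appearing 8 px apart in the two thermal images.
print(range_from_disparity(640.0, 0.5, 8.0))  # 40.0 (meters)
```

Note that nearer objects produce larger disparities, so range resolution is best up close; this is one reason careful calibration and rigid mounting of the stereo pair matter so much.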
Stereoscopic thermal imaging provides better perception in dusty environments than visible cameras, as the side-by-side images above show: data captured by a thermal stereo pair appears on the left, and data from a visible-camera stereo pair on the right. The visible cameras fail to detect, classify, and range the vehicle ahead, whereas the thermal cameras succeed. Some companies have focused on stereo pairs, making the calibration and mounting of such systems much easier so they can reliably provide range data.
THE PROMISING FUTURE OF AUTONOMOUS TRUCKS
In addition to providing improved driver vision in dusty environments, thermal cameras can be combined with artificial intelligence (AI), typically in the form of a trained convolutional neural network (CNN). CNNs enable camera systems to detect and classify objects in software. A CNN can be used with visible or thermal cameras to analyze images in real time for the characteristics or shapes the system has been trained to recognize. Common object classes include pedestrians, cars, trucks, dogs, and bicycles. When paired with stereo vision or radar, objects are classified along with their respective distances, allowing the ADAS or AV system to make appropriate decisions in an array of situations.
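The fusion step described above can be sketched as follows. This is a hypothetical illustration, not any vendor's actual pipeline: the `Detection` record stands in for a CNN's output (class label plus bounding box) merged with a range estimate from stereo or radar, and the priority rule simply mirrors the article's point that living things are what drivers most want to avoid:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    label: str          # object class from the CNN, e.g. "pedestrian"
    box_center_x: int   # horizontal pixel center of the bounding box
    range_m: float      # distance from stereo disparity or radar

LIVING = frozenset({"pedestrian", "dog", "bicycle"})

def highest_priority(detections: list) -> Optional[Detection]:
    """Pick the nearest living obstacle first; otherwise the nearest
    object of any class. Returns None if nothing was detected."""
    living_hits = [d for d in detections if d.label in LIVING]
    pool = living_hits or detections
    return min(pool, key=lambda d: d.range_m) if pool else None

dets = [Detection("car", 300, 45.0),
        Detection("pedestrian", 410, 60.0)]
print(highest_priority(dets).label)  # "pedestrian" (living outranks nearer car)
```

A production system would of course weigh many more factors (closing speed, trajectory, detection confidence), but the sketch shows why classification and range together enable decisions that neither sensor modality supports alone.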
For commercial truckers and similar industries, the cost-versus-benefit of adding thermal cameras to a vehicle is easy to justify and compute. Commercial trucks benefit greatly when they can operate longer and in more types of conditions. In one study, a company's implementation of thermal cameras on its autonomous trucks afforded it significant productivity increases of up to 30 percent. Adding two thermal cameras in stereo further increases capability and safety. In dust, blowing sand, and other low-visibility conditions, thermal imaging can play that crucial role and make roadways safer for all, even in adverse weather.