Autonomous vehicles (AVs) have been shown to work well in ideal conditions with clear skies, well-marked roads, and predictable routes. Creating a system that can handle all weather and environmental conditions, however, is significantly more challenging. These challenges become more pronounced in the development of autonomous and unmanned ground vehicles designed for off-road use, especially in the vital industrial sectors that power the global economy, including resource extraction, agriculture, and construction. One of the key hurdles in harsh environments is accounting for particulate matter in the air, especially dust.
Existing sensor technologies found on vehicles equipped with advanced driver assistance systems (ADAS) and on today's AVs are degraded in dusty environments, which reduce the visibility and range of visible-light (RGB) cameras. Radar can effectively penetrate adverse conditions, including dust, but it captures only a partial picture of a scene: it cannot classify objects or distinguish between a living thing (the thing drivers least want to hit) and an inanimate object, a critical capability for improving transportation-related workplace safety. Radar also suffers from a high false-positive rate, which limits its ability to operate independently without fusion with other sensors, typically mono cameras.
Through testing conducted by Teledyne FLIR and by government-funded programs, long-wave infrared (LWIR) imaging, which detects electromagnetic radiation in the 8 μm to 14 μm spectral band, has proven effective at detecting objects where particulate matter suspended in the atmosphere reduces visibility. Thermal imaging is especially effective for pedestrian detection at night compared with RGB, near-infrared, and short-wave infrared camera systems.
To further improve the perception accuracy of thermal imaging cameras, pairing two of them stereoscopically adds a third dimension to the captured data. Stereo-paired thermal cameras can provide crucial depth perception through dust, including the ability to measure distances by triangulating between the two camera views of objects in the field of view.
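To make the triangulation idea concrete, here is a minimal sketch of the standard depth-from-disparity relation used by rectified stereo pairs: depth = focal length × baseline ÷ disparity. The focal length, baseline, and disparity values below are hypothetical, not specifications of any particular thermal camera system.

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Estimate the distance to a point seen by both cameras of a rectified stereo pair.

    focal_px     -- focal length expressed in pixels
    baseline_m   -- distance between the two camera centers, in meters
    disparity_px -- horizontal pixel shift of the point between the two images
    """
    if disparity_px <= 0:
        # Zero disparity corresponds to a point at infinity; negative is invalid.
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px


# Hypothetical example: 600 px focal length, 0.5 m baseline, 10 px disparity
print(depth_from_disparity(600.0, 0.5, 10.0))  # 30.0 meters
```

The formula also shows why calibration matters: depth error grows as disparity shrinks, so small errors in camera alignment translate into large range errors for distant objects.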
That depth information, combined with a convolutional neural network, enables the system both to identify objects and to estimate their distances, so it can make the most appropriate decisions for the situation in conjunction with the other sensor modalities mentioned above.
Furthermore, stereoscopic thermal imaging has recently become significantly more effective thanks to automotive vision systems innovator Foresight Automotive, part of the Thermal by FLIR® program. Its industry-first, patent-pending automatic calibration system is designed to ensure that stereo cameras remain calibrated regardless of their configuration or position, creating accurate and continuous depth perception.