Because thermal imaging cameras detect long-wave infrared (LWIR) radiation, or heat, they perform equally well in daylight, in total darkness, and in blinding sun glare. They also work significantly better than visible-light cameras in smoke and in inclement weather such as fog. But what if thermal imaging and visible (RGB) imaging could be fused into a single, comprehensive image for improved awareness?
That’s now possible thanks to QuadSight, a multi-spectral vision solution from Foresight, an innovator in automotive vision systems. The QuadSight imaging system combines data from two FLIR Boson thermal imaging cameras and two RGB cameras (hence “quad”) using a ground-breaking stereo imaging method that merges the strengths of each vision technology. All four cameras are synchronized and run at 30 frames per second, and a unique, patent-pending technology calibrates them to a common coordinate axis, providing accurate pixel fusion between the images received from the RGB and thermal cameras.
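Foresight has not published the details of its patent-pending calibration, but the general idea of cross-spectral pixel fusion can be illustrated with a pre-computed homography that maps a thermal-camera pixel into the RGB camera’s coordinate frame. The matrix values and function below are purely hypothetical, a minimal sketch of the alignment step rather than QuadSight’s actual method:

```python
# Hypothetical sketch: map a thermal pixel into the RGB image frame
# using a pre-computed 3x3 homography H. QuadSight's real calibration
# is proprietary; this only shows the planar-alignment idea behind
# fusing pixels from two differently positioned cameras.

def warp_pixel(H, x, y):
    """Apply homography H (3x3 nested list) to pixel (x, y).

    The pixel is lifted to homogeneous coordinates (x, y, 1),
    multiplied by H, then divided by the resulting w component.
    """
    u = H[0][0] * x + H[0][1] * y + H[0][2]
    v = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return u / w, v / w

# With the identity homography, pixels map to themselves.
identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(warp_pixel(identity, 120, 80))  # (120.0, 80.0)

# A simple translation homography shifts every pixel by (5, -3),
# as a stand-in for a real calibrated mapping.
shift = [[1, 0, 5], [0, 1, -3], [0, 0, 1]]
print(warp_pixel(shift, 120, 80))  # (125.0, 77.0)
```

In a production system this mapping would come from a full calibration (intrinsics, extrinsics, and lens distortion for all four cameras), but once it exists, every thermal measurement can be attached to the corresponding RGB pixel.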
By using both thermal and RGB cameras, QuadSight creates three-dimensional (3D) awareness of its surroundings across two parts of the electromagnetic spectrum, serving complementary and redundant roles alongside other sensing modalities, including LIDAR, radar, and ultrasonic. Within this context, QuadSight not only determines the distance and shape of objects in front of the vehicle, it also measures the heat of everything in view, enabling the system to more accurately classify whether a detected object is a living thing. After all, living things such as people and other large mammals are the objects people least want to hit.
Stereo vision works similarly to human vision in that it’s based on triangulation of rays, in this case through the left and right thermal and visible camera pairs rather than our left and right eyes. In this way, QuadSight provides depth perception by computing the distances to different objects in a scene in real time. This is achieved by finding corresponding pixels between each stereo camera pair and then triangulating the distance measurements through image processing and deep learning algorithms.
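The triangulation step described above rests on the textbook pinhole relationship between disparity and depth: for a rectified stereo pair, depth Z = f × B / d, where f is the focal length in pixels, B is the baseline between the cameras, and d is the pixel disparity. The function below is a minimal sketch of that formula, not Foresight’s actual algorithm, and the example numbers are invented:

```python
# Textbook stereo depth from disparity for a rectified camera pair.
# Not QuadSight's implementation -- just the geometric relationship
# its triangulation builds on.

def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Return depth in meters: Z = f * B / d.

    disparity_px    -- horizontal pixel offset of the same scene point
                       between the left and right images
    focal_length_px -- focal length expressed in pixels
    baseline_m      -- distance between the two camera centers, meters
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a point in front of the cameras")
    return focal_length_px * baseline_m / disparity_px

# Invented example: 640-px focal length, 0.3 m baseline, 12-px disparity
# -> the matched point is 16 m ahead of the cameras.
print(depth_from_disparity(12, 640, 0.3))  # 16.0
```

Note how depth is inversely proportional to disparity: distant objects produce small disparities, which is why accurate sub-pixel matching matters for long-range detection.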
Stereo thermal and RGB cameras can be valuable for advanced driver assistance systems (ADAS) that already exist today, such as automatic emergency braking (AEB), through to Level 4 and Level 5 autonomous systems currently in development, making each level more reliable and safer. Not only could the QuadSight system tell the vehicle to slow down upon detecting any type of object on the road in any lighting condition, it could also provide distance data redundant with the vehicle’s LIDAR and radar systems, helping ensure the vehicle makes the most appropriate decision for the circumstance.
This capability is especially crucial in cluttered urban environments. Because both visible and thermal cameras collect data passively, they can bridge information gaps caused by interference between competing active LIDAR or radar systems, making up for the resulting loss of fidelity. They offer the same backup in poor weather such as heavy rain, where LIDAR may be further challenged.
Ultimately, to achieve true autonomy, vehicles of the future will need multiple sensor types, including stereo thermal and visible cameras, that provide multiple redundancies to ensure that in any given situation, no matter how rare or unlikely, the vehicle performs effectively to maintain safety. That safety extends to the passengers inside the vehicle as well as passengers in other vehicles, pedestrians, cyclists, animals, and anything else sharing the roadway.