Augmented reality head-up displays - navigating the next-gen driving experience


24 Jun, 2022

Road obstacle detection with machine learning algorithms (left) and 3D AR HUD navigating through public roads (right). Credit: Department of Engineering, University of Cambridge


Augmented reality (AR) head-up displays (HUDs) are widely considered to be the future of connected vehicles, but more human-centred studies are needed to assess the impact of AR on the driver while operating a vehicle on public roads, say Cambridge researchers.

In a new review of the efficiency, usability, safety and security of automotive holographic HUDs, lead author Jana Skirnewskaja and her PhD supervisor, Professor Tim Wilkinson, say that while the technology can help provide safer and more inclusive transportation, its impact on driver comfort and on overall road safety and security is not yet fully understood, and must be addressed to ensure that modern vehicles can be operated safely. The findings are reported in the journal Advanced Materials.

HUDs work by projecting a transparent 2D or 3D digital image of information such as navigation cues and hazard warnings onto the windscreen of the vehicle. These projected images merge with the driver’s view of the road ahead, so the driver does not need to shift their gaze away from the road to see relevant, timely information. This keeps the driver’s attention on the road, rather than requiring them to look down at the dashboard or a navigation system.
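As a rough, hedged illustration of how an overlay can be aligned with the scene ahead, the short Python sketch below uses a simple pinhole-style perspective projection to map a 3D point in front of the car onto a 2D display plane; the function name, focal length and coordinates are illustrative assumptions, not the rendering pipeline described in the review.

# Minimal sketch: placing a cue for an object ahead of the vehicle onto a
# 2D display plane, assuming a simple pinhole-style projection.

def project_to_display(x: float, y: float, z: float, focal_length: float = 1.0):
    """Project a point x metres right, y metres up and z metres ahead (z > 0)."""
    if z <= 0:
        raise ValueError("point must lie in front of the viewer (z > 0)")
    return (focal_length * x / z, focal_length * y / z)

# A hazard 2 m to the right, 0.5 m below eye level and 40 m ahead lands
# close to the centre of the display, where the driver is already looking.
u, v = project_to_display(2.0, -0.5, 40.0)
print(u, v)  # (0.05, -0.0125)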

Technological advances in this area have led to HUDs with holographic displays and 3D AR. This added depth perception makes it possible to project computer-generated virtual objects into the driver’s field of view in real time to warn, inform or entertain the user. Shorter obstacle visualisation times increase the driver’s alertness to road obstacles, while eye strain and driving stress are reduced.

“Holographic HUDs are paramount if we are to explore the possibilities of augmented and mixed reality for road safety,” said Jana, who conducts her PhD research at the EPSRC Centre for Doctoral Training (CDT) in Connected Electronic and Photonic Systems, a joint centre with the University of Cambridge and UCL. “Holographic HUDs can project 3D objects directly into the retina to achieve an AR experience. The technology is seen as the next new addition to tomorrow’s connected vehicles, but managing an abundance of information while driving in this modern set-up requires the driver to multitask, leading to cognitive overload – this increases the likelihood of a collision while driving.”

According to their review, there are important challenges to be tackled in implementing AR HUDs, with the following deemed the most important: creating a multifocal display with a large viewing area without compromising the field of view; ensuring optimal positioning on the windscreen; and minimising invasiveness while driving by ensuring the accurate identification of hazards on the road.

“Further studies on driver distraction are required,” said Jana. “This could include analysis of the placement of content on the windshield and the placement of AR projections in the driver’s field of view via behavioural, neurodiversity and system engineering studies.”

The Cambridge researchers' vision of the LiDAR-derived AR HUD applied in a car setting. Credit: Department of Engineering, University of Cambridge.

Developments in this field have led to LiDAR (light detection and ranging) data being trialled to create ultra-high-definition holographic representations of road objects that are beamed directly to the driver’s eyes. LiDAR is commonly used in agriculture, archaeology and geography, but it is also being trialled in autonomous vehicles for obstacle detection. It is a remote sensing method that works by sending out a laser pulse to measure the distance between the scanner and an object. Meanwhile, the integration of machine learning algorithms (for gesture recognition and automatic obstacle detection) into HUDs, as well as metamaterials – an emerging class of artificially engineered ultralight materials with extreme functional properties – could, say the researchers, lead to customisable, transformative image projection capabilities designed to enhance road safety. 
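To make the time-of-flight principle concrete, the following Python sketch converts a laser pulse’s round-trip time into a range estimate (distance = speed of light × time ÷ 2); the echo times are invented for illustration and do not come from the Cambridge system.

# Minimal sketch of LiDAR time-of-flight ranging (illustrative values only).
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def range_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to the reflecting object: the pulse travels out and back."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# Hypothetical echo times (seconds) for three returns in one scan.
echo_times = [66.7e-9, 333.6e-9, 1.001e-6]
print([round(range_from_round_trip(t), 1) for t in echo_times])
# -> roughly [10.0, 50.0, 150.0] metres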

“Holographic AR HUDs have the potential to increase safety and security in transportation,” said Jana. “In the future, this technology can increase the interconnectedness of vehicles with the interactive urban environment to reduce traffic accidents.”

In their review, the researchers say there is a need to develop an “inclusive strategy” for the driver where, for example, certain objects could be prioritised and placed into an area of personal preference by the driver based on their needs. They add that focus is also needed on developing the technology for those who are visually and/or movement impaired, for example, the elderly and those affected by disability. This is where machine learning could play a central role in ‘intelligent’ collision avoidance, visual enhancements, and support for people with neurodiverse conditions such as attention deficit hyperactivity disorder (ADHD), autism and dyslexia.

While the researchers point out that holographic HUDs can improve road safety, they also advise that future legislative requirements be developed for safe automotive holographic video displays (which perceive road obstacles in full depth and within a 360° field of view) and AR HUDs.

“This is a crucial step toward a full obstacle assessment for safety reasons and to prevent the driver from suffering from fatigue due to changing views resulting from holographic video displays and AR HUDs,” added Jana.

Another important point for consideration, they say, is ensuring secure information management by, for example, combining 3D hologram projection techniques with real-time encryption-decryption to generate colour holographic videos. Vehicle and driver data can then be shared with the smart urban environment.
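As a hedged sketch of that encrypt-then-share step, the Python example below uses the widely available cryptography package’s Fernet symmetric encryption on a mock telemetry payload; the payload fields and key handling are assumptions made for illustration rather than the scheme proposed in the review.

# Illustrative only: encrypting vehicle/driver data before it is shared with
# smart-city infrastructure. Requires the third-party 'cryptography' package.
import json
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice the key would be provisioned and managed securely
cipher = Fernet(key)

payload = json.dumps({"vehicle_id": "demo-001", "speed_kmh": 48, "hazard": "pedestrian ahead"})
token = cipher.encrypt(payload.encode())   # ciphertext safe to transmit

# The receiving infrastructure, holding the same key, recovers the data.
recovered = json.loads(cipher.decrypt(token).decode())
print(recovered["hazard"])    # pedestrian ahead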

Reference:
Jana Skirnewskaja and Timothy D. Wilkinson. ‘Automotive Holographic Head-Up Displays.’ Advanced Materials (2022). DOI: 10.1002/adma.202110463
