At CES 2021, Panasonic Automotive Systems of America introduced an augmented reality (AR) head-up display (HUD) that builds on its expertise in projection systems. The system combines the company's latest advances in optics, volume optimization, and imaging with artificial intelligence (AI) technology from its SPYDR cockpit domain controller. Together, these render near-field and far-field content for vehicle information such as speed, object and pedestrian detection, and mapping/route guidance, for a more engaged and informed driver experience.
“The HUD market is one of the fastest-growing categories in mobility, but traditional HUDs only cover a small section of the road,” said Scott Kirchner, President, Panasonic Automotive, and Executive Director, Panasonic Smart Mobility. “Panasonic’s AR HUD solutions cover more of the roadway, with traditional cluster content like speed and fuel in the near field as well as 3D overlays in the far field, showing navigation and other critical driver data mapping spatially to the road ahead. And in a future with more self-driving vehicles, our AR HUD could provide an important added level of comfort and assurance for AV passengers as well.”
Panasonic’s AR HUD system projects 3D, AI-driven key information into the driver’s line of sight to help reduce driver distraction and increase safety. The development follows a “PRIZM” process that addresses five aspects of user needs: Placement (optimal image positioning); Reflection (AI-driven smart optical graphic road overlays for object/sign detection); Intuitive (discriminating and prioritizing what the driver should focus on ahead, e.g., is that a deer or a box in the road?); Zonal (a UX-optimized field of view that organizes displayed objects along the road); and Mission control (dynamic imaging that brings visibility and the roadway together).
Among the key features of the HUD is eye-tracking technology, which projects information at the driver’s level of sight based on the driver’s eye position, eliminating a potential mismatch between the projected image and the road scene when the driver moves their head.
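To illustrate the geometry involved, the sketch below shows a simplified parallax correction, assuming a flat virtual image plane and a single tracked eye point; the function name and model are illustrative assumptions, not Panasonic's implementation.

```python
def parallax_shift(eye_offset_m: float,
                   image_dist_m: float,
                   object_dist_m: float) -> float:
    """Lateral shift (meters, in the virtual image plane) needed to keep
    an overlay registered on a real object when the eye moves sideways.

    Simplified pinhole geometry with the eye, the virtual image plane,
    and the object along one axis; a hypothetical model, not Panasonic's
    algorithm.
    """
    # A ray from the displaced eye to the object crosses the image plane
    # at a point shifted by eye_offset * (1 - d_image / d_object).
    return eye_offset_m * (1.0 - image_dist_m / object_dist_m)

# If the eye moves 5 cm sideways, an overlay at 10 m tagging a car 40 m
# ahead must shift ~3.75 cm in the image plane to stay locked on the car.
print(f"{parallax_shift(0.05, 10.0, 40.0) * 100:.2f} cm")
```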
Advanced optical design techniques provide an expanded field of view (beyond 10 by 4 degrees) at a virtual image distance of 10 m or greater. Enhanced low-light and nighttime views support detection of pedestrians and objects. Tilted virtual image planes adjust the visibility of objects in the driver’s field of view, and an embedded camera system allows discreet monitoring of the driver’s eye location.
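For a sense of scale, the quoted field-of-view and distance figures translate into a physical image size by simple trigonometry; the sketch below assumes a symmetric field of view centered on the line of sight.

```python
import math

def virtual_image_size(fov_h_deg: float, fov_v_deg: float,
                       distance_m: float) -> tuple[float, float]:
    """Width and height (m) of a virtual image spanning the given field
    of view at the given virtual image distance."""
    width = 2 * distance_m * math.tan(math.radians(fov_h_deg / 2))
    height = 2 * distance_m * math.tan(math.radians(fov_v_deg / 2))
    return width, height

# A 10 x 4 degree field of view at 10 m is roughly a 1.75 m x 0.70 m
# image floating over the road ahead.
w, h = virtual_image_size(10, 4, 10)
print(f"{w:.2f} m x {h:.2f} m")
```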
AI-driven AR navigation technology provides multi-color 3D navigation graphics that adjust to the moving vehicle’s surroundings, displaying information such as lane markers, GPS arrows where turns will occur, and sudden changes such as a possible collision or a cyclist in the projected path.
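Placing a turn arrow “on” the road ahead amounts to projecting a world-space point into the driver’s viewport. The sketch below uses a standard pinhole projection; the vehicle-frame coordinates, viewport size, and focal parameters are illustrative assumptions, not values from Panasonic’s system.

```python
import numpy as np

def project_to_view(point_vehicle: np.ndarray,
                    f_px: float = 1200.0,
                    cx: float = 960.0, cy: float = 540.0):
    """Project a point in the vehicle frame (x right, y down, z forward,
    in meters) onto a virtual 1920x1080 viewport via pinhole projection.
    Returns pixel coordinates, or None if the point is behind the viewer."""
    x, y, z = point_vehicle
    if z <= 0:
        return None
    u = f_px * x / z + cx
    v = f_px * y / z + cy
    return u, v

# A turn point 30 m ahead and 3 m to the right lands here on the viewport:
print(project_to_view(np.array([3.0, 0.5, 30.0])))
```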
Panasonic’s proprietary camera image-stability algorithm enables AR icons to lock onto the driving environment regardless of the bumpiness of the road. For real-time situational awareness, ADAS, AI, and AR environment information updates in less than 300 ms. The 3D imaging radar captures sensor data across 180 degrees of forward vision, up to 90 m (295 ft) and across about three traffic lanes. Advanced laser and holography technology delivers crisp, bright 4K resolution, with static cluster information in the near field and a far-field image plane for AR graphic overlays. The system’s compact packaging is engineered to fit any vehicle configuration.
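Panasonic’s stabilization algorithm is proprietary, but the underlying idea of keeping AR icons world-locked despite body pitch can be sketched with a complementary (high-pass) filter on IMU pitch: the transient bump component is estimated and canceled out of the icon’s vertical placement. The class, gains, and focal value below are assumptions for illustration only.

```python
import math

class BumpCompensator:
    """Removes transient pitch (road bumps) from icon placement.
    A toy complementary filter, not Panasonic's proprietary algorithm."""

    def __init__(self, f_px: float = 1200.0, alpha: float = 0.98):
        self.f_px = f_px        # focal length of the virtual viewport, px
        self.alpha = alpha      # smoothing factor for the slow pitch trend
        self.slow_pitch = 0.0   # low-passed (road-grade) pitch, rad

    def correction_px(self, pitch_rad: float) -> float:
        """Vertical pixel offset to apply so the icon stays road-locked."""
        # Track slow pitch changes (hills); treat the residual as a bump.
        self.slow_pitch = (self.alpha * self.slow_pitch
                           + (1.0 - self.alpha) * pitch_rad)
        bump = pitch_rad - self.slow_pitch
        # A bump of `bump` radians moves the scene by ~f * tan(bump) px;
        # shift the icon the opposite way to cancel it.
        return -self.f_px * math.tan(bump)

comp = BumpCompensator()
for pitch in (0.0, 0.02, -0.01, 0.0):   # simulated IMU pitch samples, rad
    print(f"{comp.correction_px(pitch):+.1f} px")
```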
Panasonic’s strategic collaborations with emerging tech innovators add depth and breadth to the data-driven visuals in its HUD technology.
The dual-plane, high-resolution laser holography comes from Envisics, a developer of a patent-protected dynamic holographic platform that enables true holography across multiple mobility applications, led by founder, CEO, and CTO Dr. Jamieson Christmas. Envisics’ Dynamic Holography Platform, with its patented algorithms and high-magnification designs, redistributes light to precisely where and when it is needed; every light-control element on the holographic modulator contributes to every point in the image it creates. Its durability and longevity make it a key technology for meeting current and future automotive requirements.
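Envisics’ algorithms are patented and unpublished, but the textbook route to a phase-only hologram on a spatial light modulator is Gerchberg-Saxton iteration, which also makes concrete why every modulator element contributes to every image point: the far-field image is the Fourier transform of the entire phase pattern. The sketch below is that classic method, not Envisics’ platform.

```python
import numpy as np

def gerchberg_saxton(target: np.ndarray, iters: int = 50) -> np.ndarray:
    """Compute a phase-only hologram whose far-field intensity
    approximates `target`. Classic Gerchberg-Saxton iteration;
    illustrative only, not Envisics' patented algorithms."""
    rng = np.random.default_rng(0)
    amp = np.sqrt(target)                        # desired field amplitude
    phase = rng.uniform(0, 2 * np.pi, target.shape)  # random start phase
    for _ in range(iters):
        # Back-propagate the desired image field to the modulator plane,
        # keeping only the phase (the modulator cannot shape amplitude).
        slm_phase = np.angle(np.fft.ifft2(amp * np.exp(1j * phase)))
        # Forward-propagate the phase-only pattern and keep its phase,
        # re-imposing the target amplitude on the next pass.
        phase = np.angle(np.fft.fft2(np.exp(1j * slm_phase)))
    return slm_phase

# Target: a bright square on a dark field (a stand-in for an AR icon).
target = np.zeros((64, 64))
target[24:40, 24:40] = 1.0
hologram = gerchberg_saxton(target)
print(hologram.shape, hologram.min(), hologram.max())
```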
The 3D localization technology and AI navigation and situational awareness analytics come from Phiar, a developer of a patent-protected spatial-AI AR navigation platform, led by CEO Chen-Ping Yu and VP of Strategy and Business Development Nasser Iravani. Phiar’s deep-learning AI runs on automotive infotainment systems, detecting and analyzing the driver’s surroundings in real time and combining that information with 3D localization of the vehicle to provide augmented guidance and safety information. Phiar uses map and navigation data from leading map platforms to offer live visual navigation with traffic and other contextual data.
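Combining perception with localization can be illustrated with a one-dimensional fusion step: a noisy GPS-derived lateral position is corrected by the camera’s measured offset from the detected lane center. This is a minimal sketch under assumed noise values; nothing here reflects Phiar’s actual pipeline.

```python
def fuse_lateral(gps_lat_m: float, gps_var: float,
                 cam_lat_m: float, cam_var: float) -> tuple[float, float]:
    """One scalar Kalman-style update: fuse a GPS lateral-position estimate
    with a camera lane-offset measurement. Hypothetical values, not
    Phiar's pipeline."""
    k = gps_var / (gps_var + cam_var)       # weight toward the better sensor
    fused = gps_lat_m + k * (cam_lat_m - gps_lat_m)
    fused_var = (1.0 - k) * gps_var
    return fused, fused_var

# GPS says 1.2 m left of lane center (variance 4 m^2); the camera measures
# 0.3 m (variance 0.01 m^2). The fused estimate lands close to the camera.
print(fuse_lateral(1.2, 4.0, 0.3, 0.01))
```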