Panasonic Automotive Systems of America has introduced its new Augmented Reality (AR) HUD at CES 2021.
A leading player in projection technology, the company combines its latest advances in optics, volume optimization, and imaging with AI technology.
Driven by its SPYDR cockpit domain controller, Panasonic renders near-field and far-field content covering vehicle information (such as speed), object and pedestrian detection, mapping, and route guidance, for a seamless, more engaged, and informed driver experience.
Scott Kirchner, president of Panasonic Automotive and executive director of Panasonic Smart Mobility, said, “The HUD market is one of the fastest growing categories in mobility, but traditional HUDs only cover a small section of the road.”
He added: “Panasonic’s AR HUD solutions cover more of the roadway, with traditional cluster content like speed and fuel in the near field as well as 3D overlays in the far field, showing navigation and other critical driver data mapping spatially to the road ahead.
And in a future with more self-driving vehicles, our AR HUD could provide an important added level of comfort and assurance for AV passengers as well.”
The AR HUD system projects 3D, AI-driven key information into the driver’s line of sight to help reduce driver distraction and potentially increase safety on the road. Development of the AR HUD follows Panasonic’s PRIZM process to address all aspects of users’ needs today, tomorrow, and in the future.
Key features:
The key features of the new AR HUD from Panasonic Automotive include eye-tracking technology, which projects information at the driver’s line of sight based on eye position, eliminating any potential mismatch in the projected image when the driver moves their head.
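Panasonic has not published its eye-tracking algorithm, but the head-motion compensation described above can be sketched as a simple parallax correction: shift the projected image in proportion to the eye's offset from the nominal eyebox center. The function, distances, and scaling below are illustrative assumptions, not Panasonic's actual method.

```python
def parallax_shift(eye_pos_mm, eyebox_center_mm=(0.0, 0.0),
                   virtual_image_dist_m=10.0, combiner_dist_m=0.8):
    """Estimate how far (in mm) to shift the projected image so a virtual
    image at virtual_image_dist_m stays registered with the road when the
    driver's eye moves away from the nominal eyebox center.

    Illustrative sketch only -- not Panasonic's proprietary algorithm.
    """
    # Similar triangles: an eye offset of d requires an image shift of
    # roughly d * (combiner distance / virtual image distance) to keep a
    # far overlay aligned with the same point on the road.
    scale = combiner_dist_m / virtual_image_dist_m
    dx = (eye_pos_mm[0] - eyebox_center_mm[0]) * scale
    dy = (eye_pos_mm[1] - eyebox_center_mm[1]) * scale
    return dx, dy
```

In practice the measured eye position would come from the embedded driver-monitoring camera, updated every frame.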
Another feature is advanced optics, which provides an expanded field of view at a virtual image distance of 10 m or greater. The technology detects pedestrians and objects through enhanced low-light and nighttime views, and tilted virtual image planes adjust the visibility of objects in the driver’s field of view. Meanwhile, an embedded camera system permits discreet monitoring of the driver’s eye location.
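The trade-off between virtual image distance and field of view follows basic geometry: the angle a virtual image subtends depends on its width and its distance from the eye. The example below is my own illustration of that relationship (the 2 m image width is an assumed value, not a Panasonic specification).

```python
import math

def horizontal_fov_deg(image_width_m, virtual_image_dist_m):
    """Angular horizontal field of view (in degrees) subtended by a
    virtual image of the given width at the given distance from the eye."""
    return math.degrees(2 * math.atan(image_width_m / (2 * virtual_image_dist_m)))

# e.g. an assumed 2 m wide virtual image at the 10 m distance cited above
fov = horizontal_fov_deg(2.0, 10.0)  # roughly 11.4 degrees
```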
Yet another feature is AI navigation accuracy: AI-driven AR navigation technology detects the moving vehicle’s surroundings and renders multi-color 3D navigation graphics that adjust to them. It also displays information such as lane markers, GPS arrows where turns will occur, and sudden changes such as collisions or cyclists in one’s path.
The vibration control feature uses a proprietary camera image stabilization algorithm, enabling AR icons to lock onto the driving environment regardless of the bumpiness of the road.
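The stabilization algorithm itself is proprietary, but the effect described here, AR icons staying locked to the scene despite road bumps, can be illustrated with a simple low-pass filter on an icon's screen position. The class and smoothing factor below are assumptions for illustration only.

```python
class IconStabilizer:
    """Illustrative exponential-smoothing filter that damps high-frequency
    jitter in an AR icon's screen position (not Panasonic's proprietary
    stabilization method)."""

    def __init__(self, alpha=0.2):
        self.alpha = alpha  # smoothing factor: lower = stronger damping
        self.pos = None     # filtered (x, y) position, None until first sample

    def update(self, raw_xy):
        """Blend the new raw position into the filtered position."""
        if self.pos is None:
            self.pos = raw_xy
        else:
            self.pos = tuple(self.alpha * r + (1 - self.alpha) * p
                             for r, p in zip(raw_xy, self.pos))
        return self.pos
```

With a small `alpha`, a sudden one-frame jolt in the raw camera estimate moves the displayed icon only slightly, so the overlay appears to stay anchored to the road.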
With the real-time situational awareness feature, driving environment updates occur in real time: ADAS, AI, and AR environment information refreshes in less than 300 milliseconds.
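The sub-300 ms figure can be read as a latency budget for one full update cycle. A minimal sketch of checking a pipeline against that budget might look like the following (the stage list and function are hypothetical, not part of Panasonic's system):

```python
import time

LATENCY_BUDGET_S = 0.300  # the <300 ms update figure cited above

def run_update_cycle(stages):
    """Run one ADAS/AI/AR update cycle and report whether the total time
    stayed under the latency budget. `stages` is a list of zero-argument
    callables standing in for hypothetical pipeline stages."""
    start = time.monotonic()
    for stage in stages:
        stage()
    elapsed = time.monotonic() - start
    return elapsed, elapsed <= LATENCY_BUDGET_S
```

A real system would enforce this per stage (perception, fusion, rendering) rather than only at the end, so an over-budget stage can be detected and degraded gracefully.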
The company’s strategic collaborations with emerging tech innovators add depth and breadth to the data-driven visuals in Panasonic’s AR HUD. The striking dual-plane, high-resolution laser holography comes from Envisics, developers of a patent-protected dynamic holographic platform that enables true holography across multiple mobility applications, while the 3D localization technology and AI navigation and situational awareness analytics come from Phiar, developers of a patent-protected spatial-AI AR navigation platform.
(With inputs from Automotive Lead Research Team)