Intelligent Cockpit ADAS and HUD Convergence System Design
2025-04-18


HUDs (head-up displays) were first used in aircraft cockpits to assist pilots. A pilot has to keep looking ahead to monitor changes in the external environment, and the HUD lets the pilot receive important information without looking down.

 

Applied to the automotive intelligent cockpit, the HUD lets the driver obtain driving information without looking down while keeping attention on the road ahead, which can effectively reduce the accident rate.

Currently, HUD technology has evolved from the second-generation W-HUD (Windshield HUD) to the third-generation AR-HUD (Augmented Reality HUD). Compared with its predecessor, AR-HUD provides drivers with richer and more intuitive driving information.

 

1. AR-HUD key technologies

 

To realize its striking augmented reality effect, AR-HUD must make breakthroughs in three key aspects: perception, display, and fusion. Compared with W-HUD, AR-HUD imposes higher requirements on display smoothness, optical quality, position tracking, and system stability.

 

First, perception is the cornerstone of AR-HUD technology. The system needs to perceive the road environment in real time, including lane position, intersection position, and other key information. It also needs to accurately track the driver's eye position, as well as the vehicle's speed, acceleration, steering angle, and other dynamic data. Based on this sensing information, AR-HUD adjusts the optical projection position in real time so that the virtual information matches the real environment.
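
To make the inputs concrete, here is a minimal sketch, assuming illustrative field names and calibration gains, of how such perception signals might be gathered per frame and turned into a projection correction; it is not a production interface.

```python
# Minimal sketch of the perception inputs an AR-HUD fuses each frame.
# All field names and the correction formula are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class PerceptionFrame:
    lane_center_m: float       # lateral offset of lane center ahead, meters
    target_distance_m: float   # distance to the nearest lead target, meters
    eye_height_m: float        # driver eye height above the eyebox center
    speed_mps: float           # vehicle speed, m/s
    steering_angle_rad: float  # steering wheel angle, radians

def projection_offset(frame: PerceptionFrame) -> tuple[float, float]:
    """Shift the virtual image so it stays anchored to the road scene.

    A higher eye point moves the overlay down in the image; steering shifts
    lane-anchored graphics laterally. The gains are placeholder values that
    would come from optical calibration in a real system.
    """
    k_eye, k_steer = 0.8, 0.5  # assumed calibration gains
    dx = k_steer * frame.steering_angle_rad \
        + frame.lane_center_m / max(frame.target_distance_m, 1.0)
    dy = -k_eye * frame.eye_height_m
    return dx, dy
```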

 

Second, display is the core of AR-HUD technology. Relying on the results of physical-space perception and advanced 3D light-field display technology, the system renders in real time whatever virtual objects need to be projected into the real world. These virtual objects should not only look lifelike but also align with the real environment in position, providing drivers with an immersive driving experience.
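
As a minimal geometric sketch of that positioning step: under a simple pinhole model, a point in the vehicle frame projects onto the HUD's virtual image plane by similar triangles. The axis convention (x forward, y left, z up), the eye position, and the 7.5 m virtual image distance are assumed values, not figures from this article.

```python
import numpy as np

def project_to_hud(point_vehicle: np.ndarray,
                   eye_pos: np.ndarray,
                   virtual_image_distance_m: float = 7.5) -> tuple[float, float]:
    """Project a 3D point (vehicle frame, meters) onto the HUD virtual image.

    The virtual image is modeled as a plane at a fixed distance in front of
    the driver's eye; similar triangles give the image-plane coordinates.
    """
    v = point_vehicle - eye_pos              # ray from the eye to the point
    scale = virtual_image_distance_m / v[0]  # similar triangles along x
    u = v[1] * scale                         # lateral position on the plane
    w = v[2] * scale                         # vertical position on the plane
    return u, w

# Example: anchor a lane-keeping marker on the road 30 m ahead.
eye = np.array([0.0, -0.35, 1.2])            # assumed driver eye position
marker = np.array([30.0, 0.0, 0.0])
print(project_to_hud(marker, eye))
```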

 

Finally, fusion is the key to AR-HUD technology. It requires high-precision registration of the real road view and the virtual image so that the real world and the virtual imagery match in position. The system must also keep latency at the millisecond level, so that the driver perceives changes in the virtual information in real time and can make accurate judgments and decisions.
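
One hedged illustration of why millisecond-level latency matters, and a common mitigation: extrapolate each tracked target forward by the pipeline's end-to-end latency before drawing. The constant-velocity model and the 50 ms budget below are assumptions for illustration.

```python
def compensate_latency(pos_m: float, vel_mps: float, latency_ms: float) -> float:
    """Predict where a tracked target will be when the frame actually lights up.

    Constant-velocity extrapolation over the end-to-end pipeline latency
    (sense -> render -> display). At 25 m/s closing speed, a 50 ms budget
    already shifts a target by 1.25 m, enough for an uncompensated overlay
    to visibly 'swim' against the real scene.
    """
    return pos_m + vel_mps * (latency_ms / 1000.0)

# Example: lead vehicle 40 m ahead, closing at -25 m/s relative speed,
# 50 ms total latency -> draw the overlay at ~38.75 m, not 40 m.
print(compensate_latency(40.0, -25.0, 50.0))
```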

 

2. AR-HUD displays two categories of information: in-vehicle and out-of-vehicle

 

In-vehicle information. This information is obtained through the vehicle bus and shows the vehicle's status. It usually includes core information such as speed, RPM, fuel or battery level, and mileage; alert information such as gear position, fog lamps, turn signals, high beams, instantaneous fuel consumption, and interior temperature; early-warning information such as tire pressure, door status, and driver takeover requests; as well as additional information such as driving mode and seat status.
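
As a sketch of what "obtained through the vehicle bus" can look like, the code below decodes a hypothetical 8-byte CAN payload into HUD-ready fields. Real signal layouts, scaling factors, and CAN IDs are OEM-specific (typically defined in a DBC file), so every value here is an assumption.

```python
import struct

# Hypothetical CAN payload layout for illustration only.
# Assumed 8-byte frame: uint16 speed (0.01 km/h/bit), uint16 rpm (1 rpm/bit),
# uint8 gear, uint8 fuel percent, int16 interior temp (0.1 degC/bit).
def decode_vehicle_status(payload: bytes) -> dict:
    speed_raw, rpm, gear, fuel, temp_raw = struct.unpack(">HHBBh", payload)
    return {
        "speed_kmh": speed_raw * 0.01,
        "rpm": rpm,
        "gear": gear,
        "fuel_pct": fuel,
        "interior_temp_c": temp_raw * 0.1,
    }

# Example frame: 120.00 km/h, 2400 rpm, gear 4, 63% fuel, 21.5 degC.
frame = struct.pack(">HHBBh", 12000, 2400, 4, 63, 215)
print(decode_vehicle_status(frame))
```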

 

Out-of-vehicle information. This information is usually obtained through network interaction with the outside world. It includes travel information such as navigation and driver assistance; safety information such as speed warnings and collision warnings; intelligent-office information such as WeChat, e-mail, and video conferencing; and infotainment such as music, radio, and audio/video playback.

3. ADAS and AR-HUD integration design

 

The Advanced Driver Assistance System (ADAS), as one of the more important modules in the intelligent cockpit domain, undertakes an indispensable task in the development of cockpit intelligence. In ADAS, sensors collect data of various dimensions inside and outside the vehicle in real time; the system computes and analyzes the data and, based on AI models, anticipates various situations and possible response strategies.

 

ADAS can be divided into three main categories by function: active control, early warning, and assistance (a classification sketch in code follows the three descriptions below).

 

Active control category. These systems ensure driving safety by actively controlling the vehicle while it is in motion, such as the ADAS functions for cruise control, lane keeping, and emergency braking.

 

Early warning category. These systems generally do not actively control the vehicle; instead they warn the driver early, prompting the appropriate action to prevent danger, for example ADAS functions that warn of collisions, pedestrians, fatigued driving, and lane departure.

 

Assistance category. Unlike the two categories above, these systems focus on improving driving comfort, including blind spot monitoring, adaptive high beams, night vision, parking assistance, surround-view parking, attention detection, head-up display, traffic sign recognition, and pedestrian detection.
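
A minimal sketch of this three-way classification as a lookup table that a supervisory layer might consult, for example to gate which functions are allowed to actuate the vehicle; the function names are generic labels rather than a standard taxonomy.

```python
from enum import Enum

class AdasCategory(Enum):
    ACTIVE_CONTROL = "active control"  # acts on the vehicle directly
    EARLY_WARNING = "early warning"    # alerts the driver, no actuation
    ASSISTANCE = "assistance"          # improves comfort and visibility

# Illustrative mapping of the functions named above to the three categories.
ADAS_FUNCTIONS = {
    "adaptive_cruise_control": AdasCategory.ACTIVE_CONTROL,
    "lane_keeping": AdasCategory.ACTIVE_CONTROL,
    "emergency_braking": AdasCategory.ACTIVE_CONTROL,
    "forward_collision_warning": AdasCategory.EARLY_WARNING,
    "lane_departure_warning": AdasCategory.EARLY_WARNING,
    "fatigue_warning": AdasCategory.EARLY_WARNING,
    "blind_spot_monitoring": AdasCategory.ASSISTANCE,
    "traffic_sign_recognition": AdasCategory.ASSISTANCE,
    "parking_assist": AdasCategory.ASSISTANCE,
}

def may_actuate(function: str) -> bool:
    """Only active-control functions are allowed to command the vehicle."""
    return ADAS_FUNCTIONS[function] is AdasCategory.ACTIVE_CONTROL
```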

 

Sensors, as the senses of ADAS, can likewise be broadly divided into three main categories: environment perception, driving-intent perception, and vehicle state sensors (a small lookup-table sketch follows the three descriptions below).

 

Environment perception category. Millimeter-wave radar, ultrasonic radar, cameras, infrared sensors, light sensors, and the like belong to this class. While the vehicle is moving, the surroundings contain various environmental elements, including roads, pedestrians, obstacles, and traffic signs, and these sensors are responsible for perceiving them.

 

Driving-intent perception category. These sensors obtain the driver's operating information in real time, from which the driver's intent can be analyzed; sensors on the steering wheel, brake pedal, and accelerator pedal are driving-intent sensors.

 

Vehicle state category. These sensors obtain the vehicle's attitude information in real time; speed sensors, wheel speed sensors, and body height sensors are vehicle state sensors.
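
The same three sensor groups, expressed as a small lookup table such as a data-routing layer might use; all sensor names are illustrative.

```python
# The three sensor groups from the text as a routing lookup table.
SENSOR_CATEGORIES = {
    "environment": ["mmwave_radar", "ultrasonic_radar", "camera",
                    "infrared_sensor", "light_sensor"],
    "driving_intent": ["steering_wheel_angle", "brake_pedal",
                       "accelerator_pedal"],
    "vehicle_state": ["speed_sensor", "wheel_speed_sensor",
                      "body_height_sensor"],
}

def category_of(sensor: str) -> str:
    for category, sensors in SENSOR_CATEGORIES.items():
        if sensor in sensors:
            return category
    raise KeyError(f"unknown sensor: {sensor}")

print(category_of("brake_pedal"))  # -> driving_intent
```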

 

Realizing the AR-HUD system's function requires several steps: first, the front-view camera collects road-condition data ahead of the vehicle; analytical modeling then extracts basic data on vehicles, pedestrians, and other targets, including object size, position, and distance; finally, the data to be displayed by the HUD is projected onto the corresponding area.
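
A compact sketch of the final projection step under simplifying assumptions: given one detected target's position in the vehicle frame, check whether it falls inside the HUD's drawable field of view and, if so, compute the viewing angles at which to draw the overlay. The field-of-view values are placeholders, not real AR-HUD specifications.

```python
import math

HUD_H_FOV_DEG = 10.0  # assumed horizontal field of view of the virtual image
HUD_V_FOV_DEG = 4.0   # assumed vertical field of view

def overlay_angles(forward_m: float, lateral_m: float, height_m: float):
    """Return (azimuth, elevation) in degrees for a target, or None if it
    falls outside the HUD's drawable field of view."""
    az = math.degrees(math.atan2(lateral_m, forward_m))
    el = math.degrees(math.atan2(height_m, forward_m))
    if abs(az) > HUD_H_FOV_DEG / 2 or abs(el) > HUD_V_FOV_DEG / 2:
        return None  # visible to the camera but not drawable by the HUD
    return az, el

# Pedestrian 25 m ahead, 1.5 m to the right, roughly at eye height.
print(overlay_angles(25.0, 1.5, 0.0))
```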

 

To accomplish these tasks, ADAS needs strong computing power behind it. The application of domain controllers points a direction for ADAS, which is beginning to develop toward integrated active safety systems. Different ADAS functions can share sensors and control systems for data sharing and interconnection, which not only ensures efficient data processing but also saves cost.
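
A minimal sketch of that sensor sharing, assuming a toy in-process publish/subscribe bus: one front-camera feed is fanned out to several ADAS functions inside a domain controller. A production system would use automotive middleware (such as SOME/IP or DDS) rather than this simplification.

```python
from collections import defaultdict
from typing import Callable

class SensorBus:
    """Toy fan-out bus: one sensor topic, many ADAS consumers."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, data) -> None:
        for handler in self._subscribers[topic]:
            handler(data)

bus = SensorBus()
bus.subscribe("front_camera", lambda f: print("lane keeping got", f))
bus.subscribe("front_camera", lambda f: print("collision warning got", f))
bus.publish("front_camera", "frame_0001")  # one sensor, two consumers
```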
