AR-HUD: augmented reality head-up display technology

To create a new form of dialogue between driver and vehicle, the international automotive supplier Continental has developed an augmented reality head-up display (AR-HUD). Through an internal optical system, the AR-HUD precisely overlays graphical information onto the driver's view of the actual traffic situation, augmenting the driver's perception of the driving environment. AR-HUD technology is therefore a strong candidate to become the most innovative direction in automotive human-machine interface (HMI) development.

The AR-HUD has now reached an advanced stage of pre-production development. A demonstration vehicle serves on the one hand to prove feasibility and on the other to gather valuable experience for series development. Continental expects the system to reach mass production in 2017.

AR-HUD: what you see is what you know

“The AR-HUD optical system superimposes the status of the driver assistance systems, together with the meaning of that information, directly into the driver's field of vision,” said Dr. Pablo Richter, HUD technical expert at Continental's Body Electronics Division. “To this end, the AR-HUD, as a new member of the human-machine interface, has been closely networked during the pre-production phase with the environmental sensors of the driver assistance systems as well as with GPS data, map data, and driving dynamics data.”

On the demonstration vehicle, Continental is showing the advanced development state of this new technology for the first time. The selected vehicle is equipped with a radar sensor in the front bumper and a CMOS mono camera at the rear-view mirror. The advanced driver assistance systems (ADAS) selected for the AR-HUD application are the Lane Departure Warning system (LDW), Adaptive Cruise Control (ACC), and the route guidance of the navigation system.

“If a driver assistance system recognizes an important situation, the AR-HUD presents the warning to the driver as a virtual image,” Richter explained. “In addition to the direct safety benefit, this form of dialogue is a key technology for automated driving in the future. This enhanced information display will make it easier for drivers to build trust in new automated driving functions.”

Projection surfaces at two different projection distances, based on two different imaging units

Continental's AR-HUD produces projection surfaces at two different projection distances: a near, or status, projection surface and a far, or augmentation, projection surface. The near projection appears to the driver at the end of the hood and displays status information selected by the driver, such as the current speed, applicable restrictions such as no-overtaking zones and speed limits, or the current ACC settings. To view this information, the driver needs to lower the line of sight by only about 6°. The status projection has a field of view of 5° x 1° (equivalent to 210 mm x 42 mm) at a projection distance of 2.4 m. It corresponds to the virtual image of a "traditional" head-up display based on mirror optics and a picture generation unit (PGU). That PGU consists of a TFT (thin-film transistor) display whose content is made visible by a strong backlight, and it is integrated very compactly in the upper part of the AR-HUD module. The mirror optics magnify the display content into the virtual image; this is achieved with curved mirrors.

To realize two projection surfaces at different projection distances, Continental uses a sophisticated optical design in which the light paths of the two surfaces partly overlap inside the unit. The optical path of the near projection surface uses only the upper edge region of the large AR-HUD mirror (an aspherical surface) and requires no further mirror. This part of the AR-HUD system corresponds to the current technology of Continental's second-generation HUD in series-production vehicles.
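The stated figures for the status surface are consistent with simple trigonometry: the physical extent of a virtual image follows from its angular field of view and the projection distance. A quick check (function name illustrative, not from the article):

```python
import math

def virtual_image_size(distance_m, fov_h_deg, fov_v_deg):
    """Approximate width/height of a HUD virtual image from its
    angular field of view and its projection distance."""
    width = distance_m * math.tan(math.radians(fov_h_deg))
    height = distance_m * math.tan(math.radians(fov_v_deg))
    return width, height

# Status (near) projection surface: 5 deg x 1 deg at 2.4 m
w, h = virtual_image_size(2.4, 5.0, 1.0)
print(f"{w * 1000:.0f} mm x {h * 1000:.0f} mm")  # prints "210 mm x 42 mm"
```

The result matches the 210 mm x 42 mm given for the 5° x 1° field at 2.4 m.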

Augmenting the car with cinema technology

“The star of the AR-HUD is of course the augmentation projection surface. It appears at a projection distance of 7.5 m in front of the driver and projects augmentation symbols directly onto the road, blending them into the current traffic situation,” says Richter. The content of the far projection surface is generated by a new picture generation unit that Continental first exhibited at IAA 2013. As in a digital cinema projector, the image elements are generated by a digital micromirror device (DMD). At the heart of this PGU is an optical semiconductor carrying a matrix of hundreds of thousands of tiny mirrors, each of which can be tilted individually by an electrostatic field. The micromirror matrix is illuminated rapidly and alternately by three colored LEDs (red, green, and blue) in time sequence. Deflection mirrors with a filter function (dichroic filters) ensure that the three color paths are collimated (made parallel); these special mirrors transmit or reflect light depending on its color. Synchronized with the currently illuminated color, all micromirrors belonging to that color's image content are tilted so that the incoming light is reflected through a lens, forming the image points of that color on the downstream imaging screen. The procedure is the same for all three colors. The human eye then merges the three color fields on the imaging screen into a full-color image.

From the imaging screen onward, the subsequent optical path resembles that of a conventional head-up display: the image on the imaging screen is reflected by a first folding mirror onto a second, larger mirror (the AR-HUD mirror), and from there onto the windshield. The exit aperture of the augmentation optics is almost DIN A4 sized. The result is an augmentation projection surface with a field of view of 10° x 4.8°, which corresponds to an augmentation area about 130 cm wide and 63 cm high in the driver's direct view. To read information on this far projection surface, the driver needs to lower the line of sight by only about 2.4°. The picture generation units of both projection surfaces (status and augmentation) adjust the display brightness to the ambient light, and the luminance can reach 10,000 cd/m² or more, so the display remains clearly legible under almost all strong ambient-light conditions.
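The same trigonometry applies to the far surface. Computing it shows the stated height exactly and a width of 132 cm rather than 130 cm, which suggests the published width is simply rounded:

```python
import math

def image_size_cm(distance_m, fov_h_deg, fov_v_deg):
    # Physical extent, in cm, of a virtual image subtending the given
    # field of view at the given projection distance.
    to_cm = lambda deg: distance_m * math.tan(math.radians(deg)) * 100
    return to_cm(fov_h_deg), to_cm(fov_v_deg)

# Augmentation (far) projection surface: 10 deg x 4.8 deg at 7.5 m
w, h = image_size_cm(7.5, 10.0, 4.8)
print(f"{w:.0f} cm x {h:.0f} cm")  # prints "132 cm x 63 cm"
```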

In the test vehicle, Continental's AR-HUD plays to the strengths of its two projection surfaces at different projection distances: in most traffic situations, the 7.5 m far projection surface augments content directly on the road, while the 2.4 m near projection surface displays status information.

AR-Creator fuses data and generates images

Extensive simulations and Continental's real-world tests have shown that drivers find the augmented display comfortable when it appears to begin approximately 18 to 20 m in front of the vehicle and extends over a route of roughly 100 m. Peter Giegerich, who is responsible for the development of the AR-Creator, explains how the image cues are generated and correctly positioned: “The AR-Creator is a challenging new development. This control unit has to evaluate a large number of data streams in order to position the image elements on the imaging screen so precisely that they appear at exactly the right place in the driver's AR-HUD field of view. Some of this requires considerable computation.”
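The 18 to 20 m figure is plausible from flat-ground geometry: a sight line lowered by angle θ below the horizontal meets the road at roughly eye height divided by tan θ. The eye height below is an assumed value, not from the article, and the 2.4° look-down is taken as the upper edge of the 4.8°-high far field, which is also an assumption:

```python
import math

EYE_HEIGHT_M = 1.3  # assumed driver eye height above the road, not from the article

def ground_distance(look_down_deg):
    """Distance at which a sight line lowered by the given angle
    meets flat ground."""
    return EYE_HEIGHT_M / math.tan(math.radians(look_down_deg))

# If the far surface spans look-down angles from 2.4° (top) to 7.2° (bottom),
# the augmentation band on the road runs roughly from:
near_edge = ground_distance(2.4 + 4.8)  # bottom image edge, closest road point
far_edge = ground_distance(2.4)         # top image edge, farthest road point
print(f"{near_edge:.0f} m to {far_edge:.0f} m")  # prints "10 m to 31 m"
```

Under these assumptions, the comfortable 18 to 20 m starting distance quoted above falls inside the band the far surface can reach.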

The AR-Creator fuses data from three sources. The mono camera supplies the road geometry, in the form of a clothoid curve, a mathematical description of the curvature of the lane ahead of the vehicle. The size, position, and distance of objects in front of the vehicle are obtained by fusing the radar sensor data with the camera data. Finally, Continental's eHorizon provides the map framework into which the live sensor data are compiled. On the demonstration vehicle, eHorizon is still static and uses only navigation data; Continental's highly dynamic, connected eHorizon products, which have already reached series maturity, can process data from many sources (vehicle-to-vehicle communication, traffic control centers, and so on) for display on the AR-HUD. The vehicle is positioned on the digital map by fusing driving dynamics data, camera data, and GPS data.
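A clothoid is a curve whose curvature changes linearly with distance; camera-based lane models commonly approximate the lane's lateral position ahead of the car with a third-order polynomial in these parameters. The sketch below illustrates that idea only; the names and the approximation are assumptions, not Continental's actual interface:

```python
from dataclasses import dataclass

@dataclass
class Clothoid:
    offset: float   # lateral offset of the lane at the vehicle (m)
    heading: float  # heading angle relative to the lane (rad)
    c0: float       # curvature at the vehicle (1/m)
    c1: float       # curvature change rate (1/m^2)

    def lateral_offset(self, x):
        """Approximate lateral position of the lane at distance x ahead,
        using the common third-order clothoid approximation."""
        return (self.offset + self.heading * x
                + self.c0 * x**2 / 2 + self.c1 * x**3 / 6)

# A gentle, constant bend with 1 km radius (curvature 0.001 1/m):
lane = Clothoid(offset=0.0, heading=0.0, c0=1e-3, c1=0.0)
print(f"{lane.lateral_offset(100):.2f} m")  # prints "5.00 m" of lateral drift over 100 m
```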

From the fused data, the AR-Creator calculates the geometry of the road ahead as seen from the driver's position. This is possible because the position of the driver's eyes is known: on the demonstration vehicle, the driver adjusts the position of the eyebox correctly before setting off. This could also be done automatically by an interior camera that detects the position of the driver's eyes and guides the positioning of the eyebox. The eyebox is a rectangular area whose width and height correspond to a notional "window": as long as the driver looks at the road through this window, the complete AR-HUD image is visible. Passengers cannot see the HUD or AR-HUD content.
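The core positioning task can be sketched as simple projective geometry: knowing the eye position, convert a point on the road into horizontal and vertical viewing angles and test whether it falls inside the 10° x 4.8° augmentation field, which begins about 2.4° below the horizon. All names and the eye position are assumptions for illustration:

```python
import math

EYE = (0.0, 1.3, 0.0)  # assumed eye position: (lateral, height, longitudinal) in m

def viewing_angles(point):
    """Viewing angles of a road point (same coordinate convention as EYE)
    as seen from the driver's eye."""
    x, y, z = (p - e for p, e in zip(point, EYE))
    azimuth = math.degrees(math.atan2(x, z))    # positive = right of center
    elevation = math.degrees(math.atan2(y, z))  # negative = below horizon
    return azimuth, elevation

def in_augmentation_fov(point):
    """True if the point lies inside the 10 deg x 4.8 deg far field,
    taken here to span 2.4 deg to 7.2 deg below the horizon."""
    az, el = viewing_angles(point)
    return abs(az) <= 5.0 and -7.2 <= el <= -2.4

print(in_augmentation_fov((0.0, 0.0, 20.0)))  # road point 20 m ahead: True
print(in_augmentation_fov((0.0, 0.0, 5.0)))   # too close, below the field: False
```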

“From the adjusted position of the eyebox, the AR-Creator knows where the driver's eyes are and at what angle the traffic situation and surroundings are being viewed,” explains Giegerich. When a driver assistance system detects a relevant event, the corresponding virtual image cue is generated and projected to the correct position on the AR-HUD imaging screen.

Less is more

According to human-machine interface (HMI) developer Stephan Cieler, most of the development work went into the design of the virtual cues. “After countless design studies and real-world tests, the motto we arrived at for the AR-HUD was: less is more.” An early idea, laying a transparent colored carpet over the lane, was therefore quickly discarded. “We want to give the driver only the minimum of image cues necessary, covering as little as possible of the real view of the road.”

For example, as a navigation aid, angled arrows can be shown lying "flat" in the driver's own lane and then standing upright, like a signpost, pointing toward the new direction of travel at a turn. This design can also provide a virtual cue at a sharp bend, where true augmentation is impossible because the road beyond the bend is not yet visible.

Warnings at lane boundaries are likewise used sparingly: the AR-HUD of the demonstration vehicle highlights the lane markings only when the driver unintentionally drifts across them.

If eHorizon receives an incident report in advance, a highly conspicuous hazard symbol appears in the driver's field of vision. “Once this new form of dialogue with the driver is established, the HMI can offer many ways of helping the driver understand and anticipate the driving situation, as if driver and car were holding a silent conversation,” Richter said.
