
ISO/NP 26738 Road vehicles — Visibility — Viewing quality evaluation for Head-up displays (HUD)

Source:
ISO
Committee:
AUE/0 - International work on road vehicles
Categories:
Information management | Standardization. General rules

Scope

This standard provides methods for assessing the driver’s viewing quality of AR images presented within the forward field of view. AR images displayed by a head-up display (HUD) are superimposed onto real-world objects such as preceding vehicles and road surfaces. This document covers both 2D and 3D HUD systems, regardless of the underlying technology, but excludes wearable-type HUDs.

Considering actual driving environments, the driver’s viewing quality is assessed from the following perspectives:

- To define the factors that affect visual comfort when viewing AR images, and to specify their measurement methods;

- To assess AR images in terms of depth perception;

- To assess spatial resolution, including the relationship between the physical resolution and the perceived resolution of AR images.

Purpose

1) Identification of key viewing-quality factors of AR images in real driving environments, and the need to develop in-vehicle evaluation methods

⚫ The driver views HUD AR images superimposed on the road surface or the vehicle ahead in environments ranging from daytime to nighttime.

- For example, changes in the distance and surface properties (e.g., texture and reflectance) of real background objects — such as the road surface or the license plate and rear window of the preceding vehicle — can influence the driver’s visual comfort during AR image viewing.

⚫ In such real-world conditions, the driver’s visual perception is influenced by varying ambient illuminance, contrast with real objects, depth alignment between virtual and real objects, and continuous accommodation–convergence interactions.

- The continuous accommodation–convergence interaction refers to the ongoing coupling between focal adjustment and eye convergence as the driver continuously shifts gaze between objects at different distances. 

⚫ Therefore, beyond component-level performance evaluation, it is essential to assess the viewing quality of AR images as perceived by the driver under real driving conditions using in-vehicle HUD systems.
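As a rough numerical illustration of two of the factors named above, the sketch below (Python; all luminance and distance values are illustrative assumptions, not taken from this document) computes the Weber contrast of an AR symbol against two real backgrounds, and the vergence and accommodation demand for two gaze distances:

```python
import math

def weber_contrast(symbol_l: float, background_l: float) -> float:
    """Weber contrast of an AR symbol against its background (luminances in cd/m^2)."""
    return (symbol_l - background_l) / background_l

def vergence_angle_deg(distance_m: float, ipd_m: float = 0.063) -> float:
    """Binocular vergence angle for a target at distance_m (IPD assumed to be 63 mm)."""
    return math.degrees(2.0 * math.atan((ipd_m / 2.0) / distance_m))

def accommodation_demand_d(distance_m: float) -> float:
    """Accommodation demand in diopters (1 / distance in metres)."""
    return 1.0 / distance_m

# The same 400 cd/m^2 symbol over dark asphalt vs. a bright license plate
# (illustrative luminances): contrast drops sharply over the brighter background.
contrast_asphalt = weber_contrast(400.0, 30.0)
contrast_plate = weber_contrast(400.0, 200.0)

# Gaze shift between a virtual image at 2.5 m and a preceding vehicle at 30 m
# (illustrative distances): both vergence and accommodation must change.
shift_vergence_deg = vergence_angle_deg(2.5) - vergence_angle_deg(30.0)
shift_accommodation_d = accommodation_demand_d(2.5) - accommodation_demand_d(30.0)
```

The drop in contrast over the brighter background, and the vergence/accommodation change on every gaze shift, are examples of the quantities that in-vehicle evaluation methods would need to measure.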

2) The need for standard development considering the technological advancements of AR HUD systems and their hardware differences from conventional flat panel displays

⚫ AR HUD technologies are evolving toward True AR HUD, enabling virtual images to be overlaid in real time within the driver’s field of view and spatially aligned with real-world elements such as pedestrians and surrounding vehicles. This evolution is driven by the increasing adoption of AR HUD systems to enhance driving safety and situational awareness.

- In addition, AI-based contextual recognition technologies enable adaptive content filtering, selectively presenting essential information — such as collision warnings and navigation guidance — within AR HUD images. As hardware and software technologies advance, HUD systems are capable of delivering increasingly diverse and dynamic content.

- These developments highlight the need for systematic evaluation of AR image viewing quality. In particular, visual comfort is a fundamental determinant of viewing quality for both static and dynamic AR images. Therefore, key factors affecting visual comfort in real driving environments should be identified, and appropriate evaluation methods should be established.

⚫ From a hardware perspective, the significant difference between conventional flat panel displays and HUD systems is the introduction of an optical system — such as lenses for magnification and mirrors for light path redirection — to present virtual images beyond the windshield.

- As a result of this difference, AR HUD systems exhibit viewing-quality characteristics that differ from those of conventional flat panel displays, including (1) sharpness, (2) focus/depth, (3) geometric distortion, (4) luminance uniformity, and (5) ghost images. Physical evaluation methods for characteristics (3) to (5) are defined in ISO/TS 21957.

- The optical performance evaluation of the HUD component is described in ISO/TS 21957. This standard, on the other hand, deals with viewing quality evaluation from the driver's perspective when the HUD is integrated into the actual vehicle.

- Furthermore, resolution encompasses both physical pixel-based resolution and perceptual resolution required for interpreting text and graphical information, both of which determine perceived (1) sharpness. In contrast to flat panel displays with a fixed viewing distance, AR images in HUD systems may be presented at varying distances. Therefore, the (2) distance relationship between AR images and corresponding real-world objects should be evaluated.
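The dependence of perceived sharpness on viewing distance can be sketched as a pixels-per-degree calculation (Python; the pixel pitch and virtual image distances below are assumed illustrative values, not specified in this document):

```python
import math

def pixels_per_degree(pixel_pitch_m: float, virtual_image_distance_m: float) -> float:
    """Pixels subtended per degree of visual angle, for a virtual image with the
    given effective pixel pitch (measured at the image plane) and distance."""
    deg_per_pixel = math.degrees(math.atan(pixel_pitch_m / virtual_image_distance_m))
    return 1.0 / deg_per_pixel

# Assumed values: 0.5 mm effective pixel pitch, virtual image at 2.5 m.
ppd_near = pixels_per_degree(0.0005, 2.5)

# A virtual image plane at 7.5 m with the same pitch at the image plane:
# each pixel subtends a smaller angle, so angular resolution is higher
# even though the physical pixel count is unchanged.
ppd_far = pixels_per_degree(0.0005, 7.5)
```

This is why a fixed physical resolution does not translate into a fixed perceived resolution for HUD images, unlike flat panel displays viewed at a fixed distance.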
