Eyewear display - Part 400-20: Introduction to sensing functions - 3D sensing

IEC TR 63145-400-20:2026 is a Technical Report that provides general information on the main features and applications of 3D sensing used for eyewear displays, and clarifies the normative aspects of standardization in this technology area.
The 3D sensing techniques mentioned in this document are mainly based on optical, non-contact principles.

General Information

Status
Published
Publication Date
01-Apr-2026
Technical Committee
TC 110 - Electronic displays
Drafting Committee
TC 110/WG 12
Current Stage
PPUB - Publication issued
Start Date
02-Apr-2026
Completion Date
24-Apr-2026

Overview

IEC TR 63145-400-20:2026 is a Technical Report developed by the International Electrotechnical Commission (IEC) under the title Eyewear display - Part 400-20: Introduction to sensing functions - 3D sensing. This document provides comprehensive information on 3D sensing technologies used in eyewear displays, focusing on optical, non-contact methods. The report outlines key features, types, and practical applications of 3D sensing, clarifies standardization considerations, and emphasizes the importance of calibration and measurement accuracy. This report is pivotal for advancing augmented reality (AR) and virtual reality (VR) solutions in wearable displays, ensuring accurate depth perception and spatial awareness.

Key Topics

Types of 3D Sensing Technologies:

  • Laser Scanning: Employs laser beams to scan objects and capture precise 3D coordinates, widely used in industrial assembly and terrain modeling. Laser scanning ensures high-resolution data, although its implementation in eyewear is limited by stability requirements.
  • Structured Light: Utilizes projection and cameras to map object surfaces based on pattern distortion. Effective for gesture and facial recognition in indoor environments due to its high accuracy at close distances.
  • Time of Flight (ToF): Measures the time light pulses take to return after hitting an object, providing real-time depth mapping. ToF is suited for mobile, AR, and VR applications, balancing cost and performance.
  • Binocular Stereo Vision: Uses two cameras to analyze images from different viewpoints, allowing for detailed 3D information extraction. Crucial for applications needing high resolution and geometric accuracy.
  • Monocular Vision & Photogrammetry: Estimates depth from single images or through the analysis of multiple images captured over time. These low-cost, flexible solutions can be achieved with advanced algorithms and minimal hardware.

Standardization Criteria:

IEC TR 63145-400-20:2026 identifies critical performance metrics for 3D sensing in eyewear displays, including:

  • Deviation (absolute and relative): Impact on calibration and precision.
  • Dynamic Range: Effective measurement across diverse depths.
  • Colour and Reflectivity Response: Performance across varying object surfaces.
  • Multipath & Moving Object Response: Adaptability to reflections and dynamic scenes.
  • Lateral Resolution: Fineness of spatial detail captured.
  • Flying Spot Noise: Presence of unreliable depth data.
  • Portrait Integrity: Completeness of human subject capture at various distances.
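As a rough illustration of the first metric (not drawn from the report itself), absolute and relative deviation can be computed per pixel against a ground-truth depth map; the sample readings below are hypothetical:

```python
def depth_deviation(measured, reference):
    """Mean absolute and mean relative deviation between a measured
    depth map and a ground-truth reference (flat lists, in metres)."""
    abs_dev = [abs(m - r) for m, r in zip(measured, reference)]
    rel_dev = [a / r for a, r in zip(abs_dev, reference)]  # relative to true depth
    n = len(abs_dev)
    return sum(abs_dev) / n, sum(rel_dev) / n

# Hypothetical sensor readings that run slightly long at 1 m and 2 m:
mean_abs, mean_rel = depth_deviation([1.02, 2.05], [1.00, 2.00])
```

Relative deviation matters because many sensors' error grows with distance, so an absolute figure alone can hide poor far-field performance.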

Applications

3D Sensing in Eyewear Displays:

  • 3D Registration: Aligns digital content accurately within real-world environments for seamless AR experiences.
  • Gesture Interaction: Enables intuitive, natural interaction with virtual objects through precise hand and body tracking.
  • 3D Reconstruction: Builds accurate digital models of real environments, supporting immersive simulations and analytics.
  • Simultaneous Localization and Mapping (SLAM): Merges sensor input to create real-time spatial maps essential for navigation, robotics, and AR applications.

These capabilities are vital in fields such as:

  • Medical and Surgical Assistance: Accurate mapping and overlay for complex procedures.
  • Industrial Inspection and Maintenance: Spatial awareness for remote support and precision tasks.
  • Consumer Electronics: Enhanced AR/VR gaming and productivity applications.
  • Cultural Heritage and Architecture: 3D reconstruction of artifacts and structures.

Related Standards

For a consistent and interoperable approach to 3D sensing technologies, several related standards are referenced:

  • IEC 63145 Series: Covers additional aspects of eyewear displays.
  • ISO 10360-13: Pertains to optical 3D coordinate measuring systems, useful for calibration and validation.
  • ISO/IEC JTC1: Standards for information technology relevant to sensor integration.
  • ISO/TC172: Focused on optics and photonics; applicable in calibration and measurement techniques.
  • ASTM E57: Directed at 3D imaging systems with guidance for data quality and system evaluation.
  • IEEE Electron Devices Society: Involvement in electronic device standards impacting sensor development.

By adhering to IEC TR 63145-400-20:2026 and related standards, manufacturers and developers ensure the reliability, quality, and interoperability of 3D sensing technologies in eyewear display systems, driving innovation in AR, VR, and mixed reality ecosystems.

Buy Documents

Technical report

IEC TR 63145-400-20:2026 - Eyewear display - Part 400-20: Introduction to sensing functions - 3D sensing

ISBN: 978-2-8327-1160-6
Release Date: 02-Apr-2026
English language (20 pages)


Frequently Asked Questions

IEC TR 63145-400-20:2026 is a technical report published by the International Electrotechnical Commission (IEC). Its full title is "Eyewear display - Part 400-20: Introduction to sensing functions - 3D sensing". The report provides general information on the main features and applications of 3D sensing used for eyewear displays, and clarifies the normative aspects of standardization in this technology area. The 3D sensing techniques it covers are mainly based on optical, non-contact principles.


IEC TR 63145-400-20:2026 is classified under the following ICS (International Classification for Standards) categories: 17.180.99 - Other standards related to optics and optical measurements; 31.120 - Electronic display devices. The ICS classification helps identify the subject area and facilitates finding related standards.

IEC TR 63145-400-20:2026 is available in PDF format for immediate download after purchase. The document can be added to your cart and obtained through the secure checkout process. Digital delivery ensures instant access to the complete standard document.

Standards Content (Sample)


IEC TR 63145-400-20
Edition 1.0 2026-04
TECHNICAL REPORT
Eyewear display -
Part 400-20: Introduction to sensing functions - 3D sensing
ICS 17.180.99; 31.120
ISBN 978-2-8327-1160-6

All rights reserved. Unless otherwise specified, no part of this publication may be reproduced or utilized in any form or
by any means, electronic or mechanical, including photocopying and microfilm, without permission in writing from either
IEC or IEC's member National Committee in the country of the requester. If you have any questions about IEC copyright
or have an enquiry about obtaining additional rights to this publication, please contact the address below or your local
IEC member National Committee for further information.

IEC Secretariat
3, rue de Varembé
CH-1211 Geneva 20
Switzerland
Tel.: +41 22 919 02 11
info@iec.ch
www.iec.ch
About the IEC
The International Electrotechnical Commission (IEC) is the leading global organization that prepares and publishes
International Standards for all electrical, electronic and related technologies.

About IEC publications
The technical content of IEC publications is kept under constant review by the IEC. Please make sure that you have the latest edition; a corrigendum or an amendment might have been published.

IEC publications search - webstore.iec.ch/advsearchform
The advanced search enables finding IEC publications by a variety of criteria (reference number, text, technical committee, …). It also gives information on projects, and on replaced and withdrawn publications.

IEC Products & Services Portal - products.iec.ch
Discover our powerful search engine and read freely all the publications previews, graphical symbols and the glossary. With a subscription you will always have access to up-to-date content tailored to your needs.

IEC Just Published - webstore.iec.ch/justpublished
Stay up to date on all new IEC publications. Just Published details all new publications released. Available online and once a month by email.

Electropedia - www.electropedia.org
The world's leading online dictionary on electrotechnology, containing more than 22 500 terminological entries in English and French, with equivalent terms in 25 additional languages. Also known as the International Electrotechnical Vocabulary (IEV) online.

IEC Customer Service Centre - webstore.iec.ch/csc
If you wish to give us your feedback on this publication or need further assistance, please contact the Customer Service Centre: sales@iec.ch.
CONTENTS
FOREWORD . 3
1 Scope . 5
2 Normative references . 5
3 Terms, definitions and abbreviated terms . 5
3.1 Terms and definitions . 5
3.2 Abbreviated terms. 5
4 3D sensing technologies. 5
4.1 General introduction. 5
4.2 Types of 3D sensing . 6
4.2.1 Laser scanning . 6
4.2.2 Structured light . 7
4.2.3 Time of flight (ToF) . 8
4.2.4 Binocular stereo vision . 8
4.2.5 Monocular vision . 9
4.2.6 Photogrammetry . 10
4.3 Summary . 11
5 3D sensing application in eyewear displays . 11
5.1 General . 11
5.2 3D registration. 12
5.3 Gesture interaction . 12
5.4 3D reconstruction . 12
5.5 Simultaneous localization and mapping (SLAM) . 12
6 Possible standardization items for 3D sensing . 13
6.1 General . 13
6.2 Deviation . 14
6.3 Dynamic range . 14
6.4 Colour response . 14
6.5 Reflectivity response . 14
6.6 Multipath response . 15
6.7 Moving object response . 15
6.8 Lateral resolution . 16
6.9 Flying spot noise . 16
6.10 Portrait integrity . 16
6.11 Summary . 17
7 Related work in other standardization organizations . 17
7.1 ISO/IEC/JTC1 (Information technology). 17
7.2 ISO/TC172 (Optics and photonics) . 18
7.3 ASTM E57 (3D imaging systems) . 18
7.4 IEEE Electron Devices Society . 18
Bibliography . 19

Figure 1 – Comparison of 3D sensing and 2D sensing . 6
Figure 2 – Schematic diagram of laser scanning . 7
Figure 3 – Schematic diagram of structured light . 7
Figure 4 – Schematic diagram of ToF . 8
Figure 5 – Schematic diagram of binocular stereo vision . 9
Figure 6 – Example of monocular depth estimation . 10
Figure 7 – Example of calibration for simple augmented reality application . 11
Figure 8 – Markers for a single hand . 12
Figure 9 – 3D reconstruction of the real scene . 12
Figure 10 – Fusion of virtual and reality . 13
Figure 11 – Example pipeline of 3D sensing . 13
Figure 12 – Example of colour response . 14
Figure 13 – Example of reflectivity response . 15
Figure 14 – Example of multipath response . 15
Figure 15 – Example of moving object response . 15
Figure 16 – Example of depth map for lateral resolution . 16
Figure 17 – Example of flying spot noise . 16
Figure 18 – Example of depth map at different distance . 17

INTERNATIONAL ELECTROTECHNICAL COMMISSION
____________
Eyewear display -
Part 400-20: Introduction to sensing functions - 3D sensing

FOREWORD
1) The International Electrotechnical Commission (IEC) is a worldwide organization for standardization comprising
all national electrotechnical committees (IEC National Committees). The object of IEC is to promote international
co-operation on all questions concerning standardization in the electrical and electronic fields. To this end and
in addition to other activities, IEC publishes International Standards, Technical Specifications, Technical Reports,
Publicly Available Specifications (PAS) and Guides (hereafter referred to as "IEC Publication(s)"). Their
preparation is entrusted to technical committees; any IEC National Committee interested in the subject dealt with
may participate in this preparatory work. International, governmental and non-governmental organizations liaising
with the IEC also participate in this preparation. IEC collaborates closely with the International Organization for
Standardization (ISO) in accordance with conditions determined by agreement between the two organizations.
2) The formal decisions or agreements of IEC on technical matters express, as nearly as possible, an international
consensus of opinion on the relevant subjects since each technical committee has representation from all
interested IEC National Committees.
3) IEC Publications have the form of recommendations for international use and are accepted by IEC National
Committees in that sense. While all reasonable efforts are made to ensure that the technical content of IEC
Publications is accurate, IEC cannot be held responsible for the way in which they are used or for any
misinterpretation by any end user.
4) In order to promote international uniformity, IEC National Committees undertake to apply IEC Publications
transparently to the maximum extent possible in their national and regional publications. Any divergence between
any IEC Publication and the corresponding national or regional publication shall be clearly indicated in the latter.
5) IEC itself does not provide any attestation of conformity. Independent certification bodies provide conformity
assessment services and, in some areas, access to IEC marks of conformity. IEC is not responsible for any
services carried out by independent certification bodies.
6) All users should ensure that they have the latest edition of this publication.
7) No liability shall attach to IEC or its directors, employees, servants or agents including individual experts and
members of its technical committees and IEC National Committees for any personal injury, property damage or
other damage of any nature whatsoever, whether direct or indirect, or for costs (including legal fees) and
expenses arising out of the publication, use of, or reliance upon, this IEC Publication or any other IEC
Publications.
8) Attention is drawn to the Normative references cited in this publication. Use of the referenced publications is
indispensable for the correct application of this publication.
9) IEC draws attention to the possibility that the implementation of this document may involve the use of (a)
patent(s). IEC takes no position concerning the evidence, validity or applicability of any claimed patent rights in
respect thereof. As of the date of publication of this document, IEC had not received notice of (a) patent(s), which
may be required to implement this document. However, implementers are cautioned that this may not represent
the latest information, which may be obtained from the patent database available at https://patents.iec.ch. IEC
shall not be held responsible for identifying any or all such patent rights.
IEC TR 63145-400-20 has been prepared by IEC technical committee 110: Electronic displays.
It is a Technical Report.
The text of this Technical Report is based on the following documents:

  Draft: 110/1819/DTR
  Report on voting: 110/1839/RVDTR

Full information on the voting for its approval can be found in the report on voting indicated above.
The language used for the development of this Technical Report is English.
This document was drafted in accordance with ISO/IEC Directives, Part 2, and developed in
accordance with ISO/IEC Directives, Part 1 and ISO/IEC Directives, IEC Supplement, available
at www.iec.ch/members_experts/refdocs. The main document types developed by IEC are
described in greater detail at www.iec.ch/publications.
A list of all parts in the IEC 63145 series, published under the general title Eyewear display,
can be found on the IEC website.
The committee has decided that the contents of this document will remain unchanged until the
stability date indicated on the IEC website under webstore.iec.ch in the data related to the
specific document. At this date, the document will be
– reconfirmed,
– withdrawn, or
– revised.
1 Scope
This part of IEC 63145, which is a Technical Report, provides general information on the main features and applications of 3D sensing used for eyewear display, and clarifies the normative aspects of standardization in this technology area.
The 3D sensing techniques mentioned in this document are mainly based on optical, non-
contact principles.
2 Normative references
There are no normative references in this document.
3 Terms, definitions and abbreviated terms
3.1 Terms and definitions
For the purposes of this document, the following terms and definitions apply.
ISO and IEC maintain terminology databases for use in standardization at the following
addresses:
– IEC Electropedia: available at https://www.electropedia.org/
– ISO Online browsing platform: available at https://www.iso.org/obp
3.1.1
3D sensing
functional unit that can detect the spatial information of the scene additionally by depicting the
depth information with a coloured code
3.2 Abbreviated terms
AR augmented reality
DOF degrees of freedom
FoV field of view
LiDAR light detection and ranging
MDE monocular depth estimation
MR mixed reality
SLAM simultaneous localization and mapping
ToF time of flight
VR virtual reality
4 3D sensing technologies
4.1 General introduction
The conventional optical 2D sensing provides local brightness information in a scene by
capturing images from a sensor location. Such images do not contain depth information. 3D
sensing can detect and add spatial information of the scene, for example, by depicting the depth
information with a coloured code. z-axis data can be added to the x, y-axis data for each pixel of the image captured by the camera module, after data processing and ideally with a minimum of latency. Most methods can only generate depth or distance data with a time delay, during which the 2D data are converted into 3D data (Figure 1).
Figure 1 – Comparison of 3D sensing and 2D sensing
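The "coloured code" depiction of depth mentioned above can be sketched as a simple per-pixel mapping; the near/far range limits here are illustrative choices, not values from the report:

```python
def depth_to_colour(depth, near=0.5, far=5.0):
    """Map a depth value (metres) to a simple red-to-blue colour code:
    near objects red, far objects blue. Depths outside [near, far] are
    clamped, as a real depth sensor has a finite working range."""
    t = (depth - near) / (far - near)
    t = max(0.0, min(1.0, t))
    red = int(round(255 * (1.0 - t)))
    blue = int(round(255 * t))
    return (red, 0, blue)  # (R, G, B)
```

Applying this mapping to every pixel of a depth map produces the familiar false-colour depth images such as the one compared with 2D sensing in Figure 1.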
A distinction is made between passive and active 3D sensing:
Passive 3D sensing techniques use available means or hardware, such as one or more embedded cameras. The main advantage of these techniques is that the eyewear display is not burdened with additional weight or volume. All calculations for generating z-axis data from camera images can be performed remotely and the results then sent to the eyewear display.
Active 3D sensing generally requires additional built-in or external hardware that projects beams onto the observed scene using a random, fixed or modulated pattern. Any integrated or autonomous projection device means additional weight, volume and energy storage capacity, rather than additional computational power. Most sensing principles have latency when collecting sequential data, which can be minimized to allow near real-time applications.
It is easy to produce a 3D impression or qualitative information when the data are displayed or monitored in the eyewear device. For quantitative results and a precise correlation between the displayed information and the shape or position of real objects within the observed scene, however, calibration is essential. This also includes tracking the position of the image-recording sensor as well as of the viewer or the eyewear display. This is fundamentally important for applications like remote maintenance and virtually assisted production or service, and critical for applications in the medical field such as surgery.
4.2 Types of 3D sensing
4.2.1 Laser scanning
3D laser scanning uses a laser beam to capture the physical shape of the subject and produces a digital image or point cloud, based on the principle of laser ranging and the method of high-speed laser scanning measurement. High-precision three-dimensional coordinate data and extensive spatial point information, including (x, y, z) coordinates and reflectivity, can be obtained from the surface of the measured object. These large-scale, high-resolution data enable the creation of high-precision, high-resolution 3D models of real objects and digital terrain models.
As indicated in Figure 2, either the laser beam or the object moves while the images are recorded and processed; otherwise the distorted laser line only provides the object-height variations along the projected line. To create 3D information, the exact angle must be known and kept stable during measurement and calibration. Established standards for precise optical 3D coordinate measuring systems, such as ISO 10360-13 [1], can be a good reference for laser scanning techniques.
___________
Numbers in square brackets refer to the Bibliography.
Figure 2 – Schematic diagram of laser scanning
3D laser scanning technology has a wide range of applications, including cultural relics protection, architecture, planning, civil engineering, factory renovation, interior design and building monitoring. It plays a major role in industrial assembly and the positioning of parts, especially in the automotive sector, and also in controlling the path of laser welding. However, this type of 3D sensing is difficult to implement in an eyewear display because of the difficulty of maintaining high stability between the projection and the observation.
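The stability requirement above follows from the triangulation geometry: the computed range depends steeply on the projection angle. A minimal sketch, assuming a simple right-triangle setup in which the camera is offset from the emitter by a known baseline (all values below are illustrative):

```python
import math

def laser_range(baseline_m, beam_angle_deg):
    """Distance to the laser spot by triangulation: the camera sits one
    baseline away from the emitter and observes the spot at a known angle.
    For the right-triangle geometry assumed here, z = b * tan(theta)."""
    return baseline_m * math.tan(math.radians(beam_angle_deg))

# A 1 degree calibration drift at a 10 cm baseline shifts the range noticeably:
z_nominal = laser_range(0.10, 80.0)
z_drifted = laser_range(0.10, 81.0)
```

With these hypothetical numbers, a 1° angular drift shifts the computed range by more than 6 cm at roughly half a metre, which illustrates why a head-worn, constantly moving device struggles to keep projection and observation stable enough.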
4.2.2 Structured light
Structured light is a set of systems consisting of a projector and a camera (Figure 3). The projector projects specific light with encoded information onto the surface of the object, and the camera captures the reflected pattern distortion. From the change in the light signal caused by the object, information such as the position and depth of the object is calculated.

Figure 3 – Schematic diagram of structured light
Structured light is mostly used for robotics and industrial inspection of known surfaces, such as car bodies and formed sheet glass. For monochrome surfaces, a black-and-white fringe pattern is sufficient and reduces processing time. Because of its low deviation at close distance, it is also applied to face and gesture recognition, as well as to measuring the exact shape of human body parts for precisely fitted shoes and prostheses. Again, deviation depends on the calibration, which is based on a pre-defined measurement volume and the angular setup. In order to cover a wide depth of field or quantify steep surfaces, the local frequency and orientation of the projected pattern can be varied and shadows avoided. Finally, since parallax has a substantial influence and the baseline is limited on an eyewear device, the distance of the observed scene is limited as a function of the tolerated deviation.
Since the measurement is easily disturbed by strong outdoor light, this technique is usually used indoors. Structured light can be very efficient when the object and situation repeat; otherwise it requires more recording and processing time and adds volume and weight. The same standards for precise optical 3D coordinate measuring systems as for laser scanning, for example ISO 10360-13 [1], can apply here. Performance evaluation tests and analyses of systematic errors in structured light systems have already been reported [2].
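One common decoding step for fringe-pattern structured light (an illustration, not a scheme prescribed by this report) is four-step phase shifting: four images with the fringe shifted by 90° each yield the wrapped phase, and hence the surface deformation, at every pixel:

```python
import math

def wrapped_phase(i1, i2, i3, i4):
    """Wrapped phase (radians, in (-pi, pi]) at one pixel from four
    intensity samples with the projected fringe shifted by 0, 90, 180
    and 270 degrees:
        I_k = A + B * cos(phi + k * pi/2),  k = 0..3
    so that I4 - I2 = 2B*sin(phi), I1 - I3 = 2B*cos(phi), and
    phi = atan2(I4 - I2, I1 - I3), independent of ambient level A
    and fringe contrast B."""
    return math.atan2(i4 - i2, i1 - i3)
```

Because the ambient term A and contrast B cancel out, the recovered phase is robust to uniform lighting changes, which is one reason fringe projection achieves its low deviation at close range.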
4.2.3 Time of flight (ToF)
ToF is a set of subsystems which consists of an emitter and a receiver. It is mostly used in laser
rangefinders and speed meters (Figure 4). The emitted, pulsed laser light is controlled by a
variable optical "gate", which is then backscattered from the object or scene to the receiver.
When ToF is not limited to single beams but applied to a whole scene, by using a mechanically
or optically scanned beam, it is also called "LiDAR" (Light Detection and Ranging). LiDAR is a
remote sensing technology that uses laser light to measure distances and create detailed 3D
representations of objects and landscapes. It works by emitting laser pulses and measuring the
time it takes for them to return to the sensor after bouncing off objects. The shorter light paths
reveal object details closer to the receiver, whereas the longer light paths indicate longer
distances within the observed scene. Knowing the speed, pulse width and phase of the
projected light, the distance between object points and the sensor as well as accurate 3D maps
and models can be calculated according to the time or phase difference between the emitted
light and the received light.
Figure 4 – Schematic diagram of ToF
ToF has the advantages of good real-time performance and a simple algorithm, but also the disadvantages of low image resolution, high cost and high power consumption, since gated illumination drastically reduces the amount of backscattered light. The distance between the emitter and the receiver in a ToF module should be kept as small as possible, so that the error caused by different paths is smaller. ToF can be applied to mobile phones, AR and VR, and other portable devices.
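The time-difference relation described above reduces to d = c·Δt/2, since the pulse travels the emitter-object path twice. A minimal sketch:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(round_trip_s):
    """Distance from a pulsed time-of-flight measurement: the light
    travels to the object and back, so the one-way distance is half
    the measured path length."""
    return C * round_trip_s / 2.0

# A 10 ns round trip corresponds to roughly 1.5 m:
d = tof_distance(10e-9)
```

The scale of these numbers shows why ToF electronics are demanding: resolving 1 cm of depth requires timing (or phase) resolution on the order of 67 ps.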
4.2.4 Binocular stereo vision
Binocular stereo vision (Figure 5), which is based on the principle of parallax, is a set of systems
consisting of two cameras to capture two images of the measured object from different positions.
The three-dimensional geometric information of the object is obtained by calculating the
positional deviation between the corresponding points in the two images [3].
Figure 5 – Schematic diagram of binocular stereo vision
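The positional deviation between corresponding points (the disparity) yields depth through the standard rectified-stereo relation Z = f·B/d; the focal length and baseline below are illustrative values, not figures from the report:

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth from binocular stereo vision for rectified cameras:
    Z = f * B / d, where f is the focal length in pixels, B the
    camera baseline in metres and d the horizontal disparity in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# 700 px focal length, 6 cm baseline, 30 px disparity -> 1.4 m
z = stereo_depth(700.0, 0.06, 30.0)
```

Note that depth resolution degrades with distance: at large Z a one-pixel disparity change corresponds to a large depth step, which is why the short baseline available on an eyewear device constrains the usable measurement range.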
Binocular stereo vision has high imaging resolution, whereas the quality of the image depends
on the quality of the imaging lenses and cameras. Basically, only a minimum of two images is
sufficient to a
...

Loading comments...