ISO 23150:2023
Road vehicles - Data communication between sensors and data fusion unit for automated driving functions - Logical interface
This document is applicable to road vehicles with automated driving functions. The document specifies the logical interface between in-vehicle environmental perception sensors (for example, radar, lidar, camera, ultrasonic) and the fusion unit which generates a surround model and interprets the scene around the vehicle based on the sensor data. The interface is described in a modular and semantic representation and provides information on object level (for example, potentially moving objects, road objects, static objects) as well as information on feature and detection levels based on sensor technology specific information. Further supportive information is available. This document does not provide electrical and mechanical interface specifications. Raw data interfaces are also excluded.
Véhicules routiers — Communication de données entre capteurs et unité de fusion de données pour les fonctions de conduite automatisée — Interface logique
General Information
- Status
- Published
- Publication Date
- 30-May-2023
- Technical Committee
- ISO/TC 22/SC 31 - Data communication
- Drafting Committee
- ISO/TC 22/SC 31 - Data communication
- Current Stage
- 90.92 - International Standard to be revised
- Start Date
- 24-Jul-2025
- Completion Date
- 13-Dec-2025
Relations
- Effective Date
- 06-Jun-2022
Overview
ISO 23150:2023 - Road vehicles: Data communication between sensors and data fusion unit for automated driving functions - Logical interface defines a logical, semantic interface between in‑vehicle environmental perception sensors (radar, lidar, camera, ultrasonic, etc.) or sensor clusters and the data fusion unit that builds a surround model for automated driving. The standard is modular and layered, describing object level, feature level, and detection level representations, plus supportive sensor interfaces. It applies to road vehicles with automated driving functions and excludes electrical/mechanical specifications and raw data interfaces. ISO 23150:2023 is the second edition and replaces the 2021 edition.
Key Topics
- Logical interface structure: standardized headers, signals, interface groupings and profiles to ensure consistent semantic exchange between sensors and fusion units.
- Layered data model:
- Object level - provides object‑level entities (e.g., potentially moving objects, road objects, static objects, free space areas).
- Feature level - sensor cluster and modality‑specific features (e.g., camera, ultrasonic features).
- Detection level - sensor detections by technology (radar, lidar, camera, ultrasonic).
- Supportive interfaces: sensor performance, sensor health, and sensor input interfaces to help interpret detection confidence and operational status.
- Profiles and versioning: mechanisms for interface version uniqueness, options, constraints and profiles tailored to use cases and sensor technologies.
- Normative annexes: detailed interface signals, options and constraints, and bibliographic references.
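The layered structure above (a generic header plus level-specific entities) can be sketched as plain data types. This is a minimal illustration of the idea, not the standard's normative signal catalogue; every class and field name below is an assumption:

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Tuple


class InterfaceLevel(Enum):
    """Hypothetical tags for the three representation levels."""
    OBJECT = "object"
    FEATURE = "feature"
    DETECTION = "detection"


@dataclass
class InterfaceHeader:
    """Generic interface header (illustrative names only)."""
    interface_version: str   # supports the versioning-uniqueness profile
    timestamp_us: int        # time of the measurement cycle
    sensor_id: int           # originating sensor or sensor cluster


@dataclass
class ObjectEntity:
    """One object-level entity, e.g. a potentially moving object."""
    object_id: int
    classification: str          # e.g. "vehicle", "pedestrian"
    position_m: Tuple[float, float]  # (x, y) in the vehicle coordinate system


@dataclass
class ObjectLevelInterface:
    """Object-level message from a sensor/cluster to the fusion unit."""
    header: InterfaceHeader
    level: InterfaceLevel = InterfaceLevel.OBJECT
    entities: List[ObjectEntity] = field(default_factory=list)


msg = ObjectLevelInterface(
    header=InterfaceHeader("2.0.0", 1_700_000_000, sensor_id=3),
    entities=[ObjectEntity(1, "vehicle", (12.5, -1.8))],
)
print(len(msg.entities))  # 1
```

Feature-level and detection-level messages would follow the same header-plus-entities pattern, with sensor-technology-specific fields in place of the object attributes.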
Applications and Who Uses It
ISO 23150:2023 is practical for organizations involved in designing, integrating, validating and maintaining automated driving systems:
- OEMs and vehicle system architects standardize sensor-to-fusion communication to support multi‑sensor perception.
- Sensor manufacturers and Tier‑1 suppliers implement logical output formats for interoperable fusion.
- ADAS/AD software developers & integrators use semantic object, feature and detection data for perception, tracking and decision functions.
- Validation, test labs and certification bodies rely on consistent interfaces for test cases and interoperability checks.
- Fleet operators and mobility service providers benefit indirectly via improved sensor fusion reliability and portability.
Practical benefits include improved interoperability, clearer semantics for object and detection data, easier integration of heterogeneous sensors, and reduced ambiguity in fusion inputs - accelerating robust, scalable automated driving development.
Related Standards
- ISO 23150:2023 was prepared by ISO/TC 22/SC 31 (Data communication).
- Related domains include functional safety and system safety standards (e.g., ISO 26262, ISO 21448 SOTIF) and software/architecture frameworks used in automotive integration.
Frequently Asked Questions
ISO 23150:2023 is a standard published by the International Organization for Standardization (ISO). Its full title is "Road vehicles - Data communication between sensors and data fusion unit for automated driving functions - Logical interface". The standard is applicable to road vehicles with automated driving functions. It specifies the logical interface between in-vehicle environmental perception sensors (for example, radar, lidar, camera, ultrasonic) and the fusion unit which generates a surround model and interprets the scene around the vehicle based on the sensor data. The interface is described in a modular and semantic representation and provides information on object level (for example, potentially moving objects, road objects, static objects) as well as information on feature and detection levels based on sensor technology specific information. Further supportive information is available. The document does not provide electrical and mechanical interface specifications; raw data interfaces are also excluded.
ISO 23150:2023 is classified under the following ICS (International Classification for Standards) categories: 43.040.15 - Car informatics. On board computer systems. The ICS classification helps identify the subject area and facilitates finding related standards.
ISO 23150:2023 has the following relationships with other standards: it cancels and replaces ISO 23150:2021. Understanding these relationships helps ensure you are using the most current and applicable version of the standard.
You can purchase ISO 23150:2023 directly from iTeh Standards. The document is available in PDF format and is delivered instantly after payment. Add the standard to your cart and complete the secure checkout process. iTeh Standards is an authorized distributor of ISO standards.
Standards Content (Sample)
INTERNATIONAL STANDARD ISO 23150
Second edition
2023-05
Road vehicles — Data communication
between sensors and data fusion unit
for automated driving functions —
Logical interface
Véhicules routiers — Communication de données entre capteurs et
unité de fusion de données pour les fonctions de conduite automatisée
— Interface logique
© ISO 2023
All rights reserved. Unless otherwise specified, or required in the context of its implementation, no part of this publication may
be reproduced or utilized otherwise in any form or by any means, electronic or mechanical, including photocopying, or posting on
the internet or an intranet, without prior written permission. Permission can be requested from either ISO at the address below
or ISO’s member body in the country of the requester.
ISO copyright office
CP 401 • Ch. de Blandonnet 8
CH-1214 Vernier, Geneva
Phone: +41 22 749 01 11
Email: copyright@iso.org
Website: www.iso.org
Published in Switzerland
Contents Page
Foreword . vi
Introduction . viii
1 Scope . 1
2 Normative references . 1
3 Terms and definitions . 1
3.1 Architectural components . 1
3.2 Terms for logical interface layers . 2
3.3 Structure terms . 3
3.4 Measurement terms . 3
3.5 Requirement level terms . 6
3.6 Road user relevant entity types . 7
3.7 Axis and coordinate system terms . 9
4 Abbreviated terms . 15
5 Structure of the interface description . 16
5.1 General . 16
5.2 Signal . 17
5.3 Interface . 17
5.4 Specific signal grouping. 18
5.5 Profile . 19
6 Logical interface from a sensor as well as a sensor cluster to a fusion unit . 19
6.1 General . 19
6.2 Generic interface header . 22
6.3 Generic interface entity . 22
6.4 Profile: Uniqueness of interface versioning . 23
7 Object level . 23
7.1 General . 23
7.2 Generic object level interface . 24
7.2.1 Generic object level header . 24
7.2.2 Generic object level entity . 25
7.3 Potentially moving object interface . 25
7.3.1 Potentially moving object header . 34
7.3.2 Potentially moving object entity . 35
7.3.3 Profile: Motion type . 36
7.3.4 Profile: Motion state vector . 37
7.4 Road object interface . 38
7.4.1 Road object header . 49
7.4.2 Road object entity . 51
7.4.3 Profile: Colour model for RDOI . 53
7.5 Static object interface . 53
7.5.1 Static object header . 81
7.5.2 Static object entity . 83
7.5.3 Profile: Colour model for SOI . 91
7.5.4 Profile: Detection references for 3D detections . 91
7.6 Free space area object interface . 94
7.6.1 Free space area object header . 99
7.6.2 Free space area object entity . 100
8 Feature level . 101
8.1 General . 101
8.2 Generic sensor cluster feature interface . 102
8.2.1 Generic sensor cluster feature header . 102
8.2.2 Generic sensor cluster feature entity . 103
8.3 Camera feature interface . 103
8.3.1 Camera feature header . 107
8.3.2 Camera feature entity . 108
8.3.3 Profile: Colour model for CFI . 109
8.4 Ultrasonic feature interface . 110
8.4.1 Ultrasonic feature header . 113
8.4.2 Ultrasonic feature entity . 114
9 Detection level . 114
9.1 General . 114
9.2 Generic sensor detection interface . 115
9.2.1 Generic sensor detections header . 115
9.2.2 Generic sensor detections entity . 116
9.3 Radar detection interface . 117
9.3.1 Radar detections header . 119
9.3.2 Radar detections entity . 121
9.3.3 Profile: Radar ambiguity . 121
9.4 Lidar detection interface . 121
9.4.1 Lidar detection header . 124
9.4.2 Lidar detection entity . 125
9.5 Camera detection interface . 125
9.5.1 Camera detection header . 130
9.5.2 Camera detection entity . 131
9.5.3 Profile: Colour model for CDI . 132
9.6 Ultrasonic detection interface . 132
9.6.1 Ultrasonic detection header . 135
9.6.2 Ultrasonic detection entity . 136
9.6.3 Profile: Ultrasonic sensor cluster . 137
10 Supportive sensor interfaces . 139
10.1 General . 139
10.2 Generic supportive sensor interface . 140
10.2.1 Generic supportive sensor header . 140
10.2.2 Generic supportive sensor entity . 141
10.3 Sensor performance interface . 141
10.3.1 Sensor performance header . 145
10.3.2 Sensor performance entity . 147
10.3.3 Profile: Uniqueness of interface versioning of SPIs . 147
10.4 Sensor health information interface . 147
10.4.1 Sensor health information header . 150
10.4.2 Sensor health information entity . 151
11 Sensor input interface . 151
11.1 General . 151
11.2 Generic sensor input interface . 151
11.2.1 Generic sensor inputs header . 151
11.2.2 Generic sensor inputs entity . 152
11.2.3 Profile: Uniqueness of interface versioning of SII . 152
11.3 Common sensor input interface . 152
11.3.1 Common sensor input header . 156
11.3.2 Common sensor input entity . 156
Annex A (normative) Interface signals . 158
Annex B (normative) Options and constraints . 328
Bibliography . 344
Foreword
ISO (the International Organization for Standardization) is a worldwide federation of national
standards bodies (ISO member bodies). The work of preparing International Standards is normally
carried out through ISO technical committees. Each member body interested in a subject for which a
technical committee has been established has the right to be represented on that committee.
International organizations, governmental and non-governmental, in liaison with ISO, also take part in
the work. ISO collaborates closely with the International Electrotechnical Commission (IEC) on all
matters of electrotechnical standardization.
The procedures used to develop this document and those intended for its further maintenance are
described in the ISO/IEC Directives, Part 1. In particular, the different approval criteria needed for the
different types of ISO documents should be noted. This document was drafted in accordance with the
editorial rules of the ISO/IEC Directives, Part 2 (see www.iso.org/directives).
Attention is drawn to the possibility that some of the elements of this document may be the subject of
patent rights. ISO shall not be held responsible for identifying any or all such patent rights. Details of
any patent rights identified during the development of the document will be in the Introduction and/or
on the ISO list of patent declarations received (see www.iso.org/patents).
Any trade name used in this document is information given for the convenience of users and does not
constitute an endorsement.
For an explanation of the voluntary nature of standards, the meaning of ISO specific terms and
expressions related to conformity assessment, as well as information about ISO's adherence to the
World Trade Organization (WTO) principles in the Technical Barriers to Trade (TBT), see
www.iso.org/iso/foreword.html.
This document was prepared by Technical Committee ISO/TC 22, Road vehicles, Subcommittee SC 31,
Data communication.
This second edition cancels and replaces the first edition (ISO 23150:2021), which has been technically
revised.
The main changes are as follows:
— extension of the potentially moving object interface at object level (extension of the logical
signal group person);
— update of the road object interface at object level (extension of the road marking sign);
— extension of the static object interface at object level (extension of the entity type traffic sign;
addition of new entity type traffic sign board);
— addition of new free space area interface at object level;
— extension of the camera detection interface at detection level (addition of new entity type point;
update of existing entity type shape);
— addition of new interfaces group/layer – sensor input interface (addition of new generic sensor
input interface; addition of new common sensor input interface);
— extension of the error model (addition of covariances, cross-covariances and time series as
error model implementation);
— refinement of the terms, for example, value as measured-, tracked- and predicted quantity
value;
— new measures to link signals with their origin, that means linking signals at object level with, for
example, detection entities;
— harmonisation of the document, for example, to achieve a better readability;
— update and add figures for clarification.
Any feedback or questions on this document should be directed to the user’s national standards body. A
complete listing of these bodies can be found at www.iso.org/members.html.
Introduction
(Highly-)automated driving (AD) functions for road vehicles require a situation awareness of the
surroundings of the vehicle and, preferably, a comprehensive scene understanding. For the fast and
reliable recognition of real-world objects, a sensor suite is necessary to provide information for the
fusion unit. Utilisation of different sensor technologies like radar, lidar, camera and ultrasonic with
different detection capabilities is indispensable to ensure both complementary and redundant
information. The fusion unit analyses and evaluates the different sensor signals and finally generates a
dynamic surround model with sufficient scene understanding.
While current partly-automated functions utilise only particular objects (for example, vehicles,
pedestrians, road markings) to generate a simple surround model, it is necessary for future highly-
automated driving functions to merge not only the recognised objects but also to include other sensor-
specific properties and characteristics of these objects for the generation of a coherent surround model
of the surroundings. To minimise the development efforts for the sensors and the fusion unit and to
maximise the reusability of development and validation efforts for the different functions on the sensor
and fusion unit side, a standardised logical interface layer between the sensor suite and the fusion unit
and a standardised logical interface layer to the sensor suite are worthwhile and beneficial for both the
sensor supplier and the system supplier.
Key
1 logical interface layer between the fusion unit and AD functions
2 logical interface layer between a single sensor as well as a single sensor cluster and the fusion unit
3 interface layer on raw data level of a sensor’s sensing element(s) and its processing
Figure 1 — Architecture: sensors/sensor clusters – fusion unit – AD functions
The logical interface layer between a single sensor as well as a single sensor cluster and the fusion unit
(see Figure 1, key 2) addresses the encapsulation of technical complexity as well as objects including
free space areas, features and detections to enable object-level, feature-level and detection-level fusion.
Additional supportive information of the sensor as well as the sensor cluster will supplement the data
for the fusion unit.
Key
1 logical interface layer between other in-vehicle electronic control units (ECUs) (for example, odometry) and a single
sensor or a single sensor cluster
Figure 2 — Architecture: ECUs’ sensor input – sensors/sensor clusters
The logical interface layer between an electronic control unit and a single sensor as well as a single
sensor cluster (see Figure 2, key 1) addresses the input of a single sensor as well as a single sensor
cluster.
INTERNATIONAL STANDARD ISO 23150:2023(E)
Road vehicles — Data communication between sensors
and data fusion unit for automated driving functions —
Logical interface
1 Scope
This document is applicable to road vehicles with automated driving functions. The document specifies
the logical interface between in-vehicle environmental perception sensors (for example, radar, lidar,
camera, ultrasonic) and the fusion unit which generates a surround model and interprets the scene
around the vehicle based on the sensor data. The interface is described in a modular and semantic
representation and provides information on object level (for example, potentially moving objects, road
objects, static objects) as well as information on feature and detection levels based on sensor
technology specific information. Further supportive information is available.
This document does not provide electrical and mechanical interface specifications. Raw data interfaces
are also excluded.
2 Normative references
There are no normative references in this document.
3 Terms and definitions
For the purposes of this document, the following terms and definitions apply.
ISO and IEC maintain terminology databases for use in standardization at the following addresses:
— ISO Online browsing platform: available at https://www.iso.org/obp
— IEC Electropedia: available at http://www.electropedia.org/
3.1 Architectural components
3.1.1
fusion
act of uniting signals (3.3.1) from two or more sensors (3.1.5) as well as sensor clusters (3.1.6) to create a
surround model (3.1.7)
3.1.2
fusion unit
computing unit where the fusion (3.1.1) of sensor (3.1.5) data as well as a sensor cluster (3.1.6) data is
performed
3.1.3
interface
shared boundary between two functional units, defined by various characteristics pertaining to the
functions, physical interconnections, signal (3.3.1) exchanges and other characteristics of the units, as
appropriate
[SOURCE: ISO/IEC 2382:2015, 2124351, modified — Notes to entry have been removed.]
3.1.4
logical interface
interface (3.1.3) between a sensor (3.1.5) as well as a sensor cluster (3.1.6) and the fusion unit (3.1.2),
defined by logical characteristics
Note 1 to entry: Logical means a semantic description of the interface.
Note 2 to entry: Mechanical and electrical interfaces are excluded.
Note 3 to entry: This document uses the term interface as a shortcut for the term logical interfaces.
3.1.5
sensor
in-vehicle unit which detects entities external of the vehicle with pre-processing capabilities serving at
least one logical interface (3.1.4)
Note 1 to entry: A sensor may use one or more sensing elements.
3.1.6
sensor cluster
group of sensors (3.1.5) of the same technology serving common logical interfaces (3.1.4)
Note 1 to entry: A sensor cluster can exceptionally consist of only one sensor.
EXAMPLE A stereo camera, a surround-view camera, an ultrasonic sensor array, a corner radar system.
3.1.7
surround model
representation of the real world adjacent to the ego-vehicle
3.1.8
in-vehicle communication
communication network used in vehicles to connect devices to exchange information
Note 1 to entry: An in-vehicle communication network connects, for example, electronic control units and sensors (3.1.5)
with each other.
3.2 Terms for logical interface layers
3.2.1
detection
sensor technology specific entity represented in the sensor coordinate system (3.7.18) based on a single
measurement (3.4.5) of a sensor (3.1.5)
Note 1 to entry: A small amount of history can be used for some detection signals (3.3.1), for example, model-
free filtering may be used in track-before-detect algorithms.
3.2.2
detection level
set of logical interfaces (3.1.4) that provides detections (3.2.1)
3.2.3
feature
sensor technology specific entity represented in the vehicle coordinate system (3.7.16) based on
multiple measurements (3.4.5)
Note 1 to entry: Multiple measurements can originate from a sensor cluster (3.1.6).
Note 2 to entry: Multiple measurements can originate from multiple measurement cycles (3.4.1).
Note 3 to entry: The term feature is used in this document not as function or group of functions as specified in
ISO/SAE PAS 22736:2021.
3.2.4
feature level
set of logical interfaces (3.1.4) that provides features (3.2.3)
3.2.5
object
representation of a real-world entity with defined boundaries and characteristics in the vehicle
coordinate system (3.7.16)
Note 1 to entry: The geometric description of the object is in the vehicle coordinate system.
Note 2 to entry: Object signals (3.3.1) are basically sensor technology independent. Sensor technology specific
signals may extend the object signals.
EXAMPLE A potentially moving object (3.6.1), a road object (3.6.2), a static object (3.6.3), a free space area object
(3.6.4).
3.2.6
object level
set of logical interfaces (3.1.4) that provides objects (3.2.5)
3.2.7
sensor input
data received by a sensor (3.1.5) or a sensor cluster (3.1.6) via the in-vehicle communication (3.1.8)
3.3 Structure terms
3.3.1
signal
entity consisting of one or more values and which is part of a logical interface (3.1.4)
3.3.2
logical signal group
grouping of signals (3.3.1) that has a logical relationship and a name for the grouping
3.3.3
classification
attribute-based differentiation
Note 1 to entry: An attribute is defined by a list of enumerators.
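The structure terms above compose naturally: a signal (3.3.1) carries one or more values, and a logical signal group (3.3.2) is a named grouping of related signals. A minimal sketch, with all identifiers and values invented for illustration:

```python
from dataclasses import dataclass
from typing import List, Union

Number = Union[int, float]


@dataclass
class Signal:
    """A signal: an identifier plus one or more values (3.3.1)."""
    identifier: str
    values: List[Number]


@dataclass
class LogicalSignalGroup:
    """Named grouping of logically related signals (3.3.2)."""
    name: str
    signals: List[Signal]


# Hypothetical velocity group: two related signals under one name.
velocity = LogicalSignalGroup(
    name="velocity",
    signals=[
        Signal("vx_mps", [13.2]),
        Signal("vy_mps", [-0.4]),
    ],
)
print(velocity.name, len(velocity.signals))  # velocity 2
```

A classification (3.3.3) would then be an attribute whose values are drawn from a defined list of enumerators rather than free numbers.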
3.4 Measurement terms
3.4.1
measurement cycle
time period from the start of a data acquisition event to the start of the next data acquisition event
Note 1 to entry: A measurement cycle of one sensor (3.1.5) is a consistent view of an observed scene and not
overlapping in time.
3.4.2
measured quantity value
value of a quantity resulting from a measurement (3.4.5)
3.4.3
tracked quantity value
value of a quantity determined from observed sequential changes, using information related to the
same characteristic
3.4.4
predicted quantity value
value of a quantity assessed before it is actually observable, using information related to the same
characteristic
EXAMPLE Related information can be recent and previous measured quantity values (3.4.2), tracked quantity
values (3.4.3) and state variables.
[SOURCE: IEV 192-13-02, modified — EXAMPLE has been added and the word "quantity" has been
added to the term.]
3.4.5
measurement
processing result of a measurement cycle (3.4.1)
3.4.6
tracking
computation process used to calculate the tracked quantity value (3.4.3) of a quantity
3.4.7
prediction
computation process used to obtain the predicted quantity value (3.4.4) of a quantity
[SOURCE: IEV 192-11-01]
3.4.8
error
discrepancy between a measured quantity value (3.4.2), tracked quantity value (3.4.3) or predicted
quantity value (3.4.4) or condition, and the true, specified or theoretically correct reference quantity
value or condition
Note 1 to entry: An error within a system can be caused by failure of one or more of its components, or by the
activation of a systematic fault.
[SOURCE: IEV 192-03-02, modified — “computed, observed or measured value” was modified to
“measured quantity value, tracked quantity value or predicted quantity value”, “value” was modified to
“reference quantity value”, Note 1 to entry has been adapted and Note 2 to entry was deleted.]
3.4.9
accuracy
closeness of agreement between a measured quantity value (3.4.2), tracked quantity value (3.4.3) or
predicted quantity value (3.4.4) and a true quantity value
Note 1 to entry: The concept accuracy is not a quantity and is not given a numerical quantity value. A
measurement (3.4.5), tracking (3.4.6) or prediction (3.4.7) is said to be more accurate when it offers a smaller error
(3.4.8).
Note 2 to entry: The term accuracy should not be used for trueness (3.4.10) and the term precision (3.4.11)
should not be used for accuracy, which, however, is related to both these concepts.
Note 3 to entry: Accuracy is sometimes understood as closeness of agreement between measured, tracked or
predicted quantity values that are being attributed to the measurand.
[SOURCE: ISO/IEC Guide 99:2007, 2.13, modified — The terms "measurement accuracy" and "accuracy
of measurement" were deleted, definition was extended for tracked or predicted quantity values
and the Notes to entry have been adapted.]
3.4.10
trueness
closeness of agreement between the average of an infinite number of replicated measured quantity
values (3.4.2), tracked quantity values (3.4.3) or predicted quantity values (3.4.4) and a reference
quantity value
Note 1 to entry: Trueness is not a quantity and thus cannot be expressed numerically, but measures for
closeness of agreement are given in the ISO 5725 series.
Note 2 to entry: Trueness is inversely related to systematic error but is not related to random error.
Note 3 to entry: The term accuracy (3.4.9) should not be used for trueness.
[SOURCE: ISO/IEC Guide 99:2007, 2.14, modified — The terms "measurement trueness" and "trueness
of measurement" were deleted, definition was extended for tracked or predicted quantity values and
the Notes to entry have been adapted.]
3.4.11
precision
closeness of agreement between indications or measured quantity values (3.4.2), tracked quantity value
(3.4.3) or predicted quantity values (3.4.4) obtained by replicate measurements (3.4.5), tracking (3.4.6)
or prediction (3.4.7) on the same or similar measurands under specified conditions
Note 1 to entry: Precision is usually expressed numerically by measures, trackings or predictions of
imprecision, such as standard deviation, variance or coefficient of variation under the specified conditions of
measurement, tracking or prediction.
Note 2 to entry: The specified conditions can be, for example, repeatability conditions of measurement,
intermediate precision conditions of measurement or reproducibility conditions of measurement (see
ISO 5725-1:1994).
Note 3 to entry: Precision is used to define measurement, tracking or prediction repeatability, intermediate
measurement or prediction precision and measurement, tracking or prediction reproducibility.
Note 4 to entry: Sometimes precision is erroneously used to mean accuracy (3.4.9).
Note 5 to entry: Precision is inversely related to random error but is not related to systematic error.
[SOURCE: ISO/IEC Guide 99:2007, 2.15, modified — The term "measurement precision" was deleted,
the word “objects” was replaced by “measurands”, definition was extended for tracked or predicted
quantity values, the Notes to entry have been adapted and Note 5 to entry has been added.]
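The distinction drawn in 3.4.8 to 3.4.11 between error, trueness (systematic error) and precision (random error) can be made concrete with replicate measured quantity values compared against a reference quantity value. The numbers below are invented for illustration:

```python
from statistics import mean, stdev

reference = 10.0                       # reference quantity value
measured = [10.2, 10.1, 10.3, 10.2]    # replicate measured quantity values

# Measurement error (3.4.12): measured quantity value minus reference value.
errors = [m - reference for m in measured]

# Trueness (3.4.10) relates to systematic error: the bias of the average.
bias = mean(measured) - reference

# Precision (3.4.11) relates to random error: the spread of the replicates.
spread = stdev(measured)

print(round(bias, 2))    # 0.2 -> a systematic offset, i.e. poor trueness
print(round(spread, 3))  # 0.082 -> small spread, i.e. good precision
```

As Note 2 to 3.4.9 warns, neither quantity alone should be called "accuracy": a measurement can be precise yet biased, as in this sample.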
3.4.12
measurement error
measured quantity value (3.4.2) minus a reference quantity value
Note 1 to entry: The concept of measurement error can be used both:
a) when there is a single reference quantity value to refer to, which occurs if a calibration is made by means
of a measurement standard with a measured quantity value having a negligible measurement uncertainty or
if a conventional quantity value is given, in which case the error (3.4.8) is known, and
b) if a measurand is supposed to be represented by a unique true quantity value or a set of true quantity
values of negligible range, in which case the error is not known.
Note 2 to entry: Measurement error should not be confused with production error or mistake.
[SOURCE: ISO/IEC Guide 99:2007, 2.16, modified — The terms "error" and "error of measurement"
were deleted and the Notes to entry have been adapted.]
3.4.13
tracking error
quantitative statement about the tracked quantity value (3.4.3) and the reference quantity value
3.4.14
prediction error
quantitative statement about the predicted quantity value (3.4.4) and the reference quantity value
3.4.15
error model
model used to estimate the error (3.4.8)
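The foreword lists covariances and cross-covariances as error model implementations added in this edition. A minimal sketch of such a model, estimating a 2x2 sample covariance over hypothetical position errors (all sample data invented):

```python
from statistics import mean

# Hypothetical (x, y) position errors in metres, from repeated comparisons
# of tracked quantity values against reference quantity values.
errors = [(0.1, -0.2), (0.0, -0.1), (0.2, -0.3), (0.1, -0.2)]


def covariance_2x2(samples):
    """Sample covariance matrix of 2-D error samples (error model sketch)."""
    mx = mean(s[0] for s in samples)
    my = mean(s[1] for s in samples)
    n = len(samples) - 1
    cxx = sum((s[0] - mx) ** 2 for s in samples) / n
    cyy = sum((s[1] - my) ** 2 for s in samples) / n
    # Off-diagonal term: the cross-covariance between x and y errors.
    cxy = sum((s[0] - mx) * (s[1] - my) for s in samples) / n
    return [[cxx, cxy], [cxy, cyy]]


cov = covariance_2x2(errors)
```

A fusion unit could weight each sensor's contribution by such a matrix; the standard itself only defines the signals for conveying it, not how it is computed.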
3.4.16
fixation
short temporal holds of movements that keep alignment of the eyes to a particular point within an area
of interest which falls on the fovea (the middle of the retina responsible for our central, sharpest vision)
for a given time period
[SOURCE: ISO 15007:2020, 3.1.4, modified — Notes to entry have been deleted.]
3.5 Requirement level terms
3.5.1
conditional
required under certain specified conditions
Note 1 to entry: One of three obligation statuses applied to a requirement level (3.5.4) of a logical interface
(3.1.4) specification, indicating the conditions under which the signal (3.3.1) or logical signal group (3.3.2) is
required. In other cases, the signal or logical signal group is optional. See also mandatory (3.5.2) and optional
(3.5.3).
[SOURCE: ISO/IEC 11179-3:2023, 3.2.77, modified — Note 1 to entry has been adapted and Note 2 to
entry has been removed.]
3.5.2
mandatory
always required
Note 1 to entry: One of three obligation statuses applied to a requirement level (3.5.4) of a logical interface
(3.1.4) specification, indicating the conditions under which the signal (3.3.1) or logical signal group (3.3.2) is
required. See also conditional (3.5.1) and optional (3.5.3).
[SOURCE: ISO/IEC 11179-3:2023, 3.2.75, modified — Note 1 to entry has been adapted and Note 2 to
entry has been removed.]
3.5.3
optional
permitted but not required
Note 1 to entry: One of three obligation statuses applied to a requirement level (3.5.4) of a logical interface
(3.1.4) specification, indicating the conditions under which the signal (3.3.1) or logical signal group (3.3.2) is
required. See also conditional (3.5.1) and mandatory (3.5.2).
[SOURCE: ISO/IEC 11179-3:2023, 3.2.76, modified — Note 1 to entry has been adapted and Note 2 to
entry has been removed.]
3.5.4
requirement level
definition of the obligation status of a logical interface's (3.1.4) logical signal group (3.3.2) or signal (3.3.1), as well as of a signal's identifier or enumerator
Note 1 to entry: Each requirement level entry has one of three possible obligation statuses applied: conditional
(3.5.1), mandatory (3.5.2) or optional (3.5.3).
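The three obligation statuses defined in 3.5.1 to 3.5.3 can be illustrated with a short sketch. The enum and function names below are hypothetical and not part of this document; they merely encode the behaviour described in the definitions:

```python
from enum import Enum


class RequirementLevel(Enum):
    """Obligation statuses per 3.5.1 to 3.5.3 (illustrative names only)."""
    CONDITIONAL = "conditional"  # required under certain specified conditions
    MANDATORY = "mandatory"      # always required
    OPTIONAL = "optional"        # permitted but not required


def is_required(level: RequirementLevel, condition_met: bool = False) -> bool:
    """Return whether a signal or logical signal group with the given
    requirement level must be present in an interface instance."""
    if level is RequirementLevel.MANDATORY:
        return True
    if level is RequirementLevel.CONDITIONAL:
        # Required only when the specified condition holds;
        # otherwise the signal is treated as optional (see 3.5.1).
        return condition_met
    return False  # OPTIONAL: permitted but never required
```

For example, `is_required(RequirementLevel.CONDITIONAL, condition_met=True)` evaluates to `True`, while the same level without the condition evaluates to `False`.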
3.6 Road user relevant entity types
3.6.1
potentially moving object
real-world entity which can potentially move and is relevant for driving situations
Note 1 to entry: A representation of a potentially moving object is part of logical interfaces (3.1.4) on object level
(3.2.6).
EXAMPLE A vehicle, a bicycle, a pedestrian, an obstacle.
3.6.2
road object
marking or structure of a road which is relevant for driving situations
Note 1 to entry: A representation of a road object is part of logical interfaces (3.1.4) on object level (3.2.6).
EXAMPLE A road marking (3.6.2.1), a road boundary (3.6.2.2), the road surface (3.6.2.3).
3.6.2.1
road marking
line, symbol or other mark on the surface of a road or a structure intended to limit, regulate, warn,
guide or inform road users
Note 1 to entry: Other marks could be text, numbers, arrows or combinations.
EXAMPLE A lane marking, Botts' dots.
[SOURCE: ISO 6707-1:2020, 3.3.5.80, modified — "user" was modified to "road users", "limit" was added and the Note 1 to entry and EXAMPLE have been added.]
3.6.2.2
road boundary
structure that limits the road
EXAMPLE A curb stone, a guard rail, the end of the surface of the road.
3.6.2.3
road surface
surface supporting the tyre and providing friction necessary to generate shear forces in the road plane
(3.7.6)
Note 1 to entry: The surface may be flat, curved, undulated or of other shape.
[SOURCE: ISO 8855:2011, 2.6]
3.6.3
static object
real-world stationary entity which can be used for information and/or localisation
Note 1 to entry: A representation of a static object is part of logical interfaces (3.1.4) on object level (3.2.6).
EXAMPLE A general landmark (3.6.3.1), a traffic sign (3.6.3.2), a traffic sign board (3.6.3.3), a traffic light
(3.6.3.4).
3.6.3.1
general landmark
real-world stationary entity which can be used for localisation
Note 1 to entry: A stationary traffic sign (3.6.3.2) or traffic light (3.6.3.4) is also regarded as a general landmark.
EXAMPLE A building, a tunnel, a bridge, a sign gantry structure, a tree.
3.6.3.2
traffic sign
traffic relevant, authorised sign that limits, regulates, warns, guides or informs road users
Note 1 to entry: One traffic sign usually consists of one main sign (3.6.3.2.1) and none, one or several
supplementary signs (3.6.3.2.2).
EXAMPLE A speed limit which is restricted for trucks.
3.6.3.2.1
main sign
traffic sign (3.6.3.2) which gives a general message, obtained by a combination of colour and geometric
shape and which, by the addition of a graphical symbol or text, gives a particular message for road users
[SOURCE: ISO 3864-1:2011, 3.12, modified — The original term was "safety sign", "sign" has been
replaced by "traffic sign" and the phrases "or text" and "for road users" have been added to the
definition.]
3.6.3.2.2
supplementary sign
traffic sign (3.6.3.2) that is supportive of a main sign (3.6.3.2.1) and the main purpose of which is to
provide additional clarification
[SOURCE: ISO 3864-1:2011, 3.14, modified — "traffic sign" now replaces "sign" and "main sign"
replaces "traffic sign".]
3.6.3.3
traffic sign board
traffic relevant, authorised sign board that limits, regulates, warns, guides or informs road users by
abstract representations of lanes and streets ahead to control traffic on the road
3.6.3.4
traffic light
traffic relevant, official light signal
Note 1 to entry: One traffic light consists of one or several light spots with different light colours and/or shapes.
EXAMPLE A pedestrian traffic light.
3.6.4
free space area object
area entity of the road surface (3.6.2.3) that can be driven on and is free of obstacles that can limit
driving
Note 1 to entry: A representation of a free space area object is part of logical interfaces (3.1.4) on object level
(3.2.6).
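The object-level entity types of 3.6 can be sketched as a minimal data model. All class and field names below are illustrative assumptions and are not defined by this document:

```python
from dataclasses import dataclass
from enum import Enum, auto


class EntityCategory(Enum):
    """Object-level entity categories per 3.6 (illustrative names only)."""
    POTENTIALLY_MOVING_OBJECT = auto()  # 3.6.1: vehicle, bicycle, pedestrian
    ROAD_OBJECT = auto()                # 3.6.2: road marking, road boundary, road surface
    STATIC_OBJECT = auto()              # 3.6.3: landmark, traffic sign, traffic light
    FREE_SPACE_AREA_OBJECT = auto()     # 3.6.4: drivable, obstacle-free area


@dataclass
class ObjectLevelEntity:
    """Minimal object-level record; the fields are hypothetical and only
    hint at what a fusion unit might receive on the object level (3.2.6)."""
    entity_id: int
    category: EntityCategory
    existence_probability: float  # assumed range 0.0 .. 1.0
```

A static object such as a traffic sign would then be represented as `ObjectLevelEntity(1, EntityCategory.STATIC_OBJECT, 0.9)`.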
3.7 Axis and coordinate system terms
3.7.1
reference frame
geometric environment in which all points remain fixed with respect to each other at all times
[SOURCE: ISO 8855:2011, 2.1]
3.7.2
axis system
set of direction vectors
3.7.3
coordinate system
numbering convention used to assign a unique ordered trio of values to each point in a reference frame
(3.7.1) and which consists of an axis system (3.7.2) plus an origin point
[SOURCE: ISO 8855:2011, 2.4, modified — "(x, y, z)" has been removed from the definition.]
3.7.4
cartesian coordinate system
set of numerical coordinates (x, y, z), which are the signed distances to the YZ-, ZX- and XY-planes using
a set of three orthogonal directions associated with X, Y and Z axes
Note 1 to entry: A right-handed axis system (3.7.2) is assumed throughout this document, where Z⃗ = X⃗ × Y⃗.
3.7.5
spherical coordinate system
set of one distance vector and two angles associated with radial distance, azimuth and elevation
Note 1 to entry: The azimuth angle is the angle in XY-plane [of a cartesian coordinate system (3.7.4)] of the axis
system (3.7.2) counted from the X-axis. The elevation angle is the angle from the azimuth direction in the XY-plane
of the axis system towards the direction of the distance vector; that is, the XY-plane has an elevation angle of 0 rad.
Note 2 to entry: The angles of the spherical coordinate system have increasing values in counterclockwise
direction.
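The conventions of 3.7.4 and 3.7.5 can be illustrated with a short sketch, assuming the right-handed axis system and the azimuth/elevation conventions described in the notes above. The function names are hypothetical, not part of this document:

```python
import math


def cross(a, b):
    """Cross product of two 3-vectors; in a right-handed axis system
    (3.7.4, Note 1), X × Y yields Z."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])


def spherical_to_cartesian(r, azimuth, elevation):
    """Convert (radial distance, azimuth, elevation) to (x, y, z).

    Follows 3.7.5: azimuth is measured in the XY-plane from the X-axis,
    elevation from the XY-plane towards the distance vector, both
    increasing counterclockwise; elevation = 0 rad lies in the XY-plane.
    """
    x = r * math.cos(elevation) * math.cos(azimuth)
    y = r * math.cos(elevation) * math.sin(azimuth)
    z = r * math.sin(elevation)
    return (x, y, z)
```

Under these conventions a point at radial distance 2 with azimuth π/2 and elevation 0 lies on the Y-axis at (0, 2, 0).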
3.7.6
road plane
plane representing the road surface (3.6.2.3) within the front tyre contact patches and the vehicle road-level reference point (3.7.13)
Note 1 to entry: See Figure 3.
Note 2 to entry: For tyre contact patches, see ISO 8855:2011, 4.1.5.
[Figure 3 — Road plane (graphic not reproduced). Key: 1 vehicle front; 2a vehicle's front tyre contact patches; 2b vehi…]