ISO/IEC 23090-28:2026
Information technology — Coded representation of immersive media — Part 28: Interchangeable scene-based media representations
This document specifies an architecture and data model for interchange of scene-based media for use in systems and applications that employ renderers for the presentation of immersive media, including audio, visual, tactile and other representations of media types. The architecture and data model specified in this document applies to use cases where there is a need for interchange of scene-based media for presentation by systems with 3D render-based technologies. One specific application of this document is immersive display technologies that utilize a real-time renderer of 3D media to create their visual presentations, as opposed to a video decoder and renderer of 2D raster-based media. Another application of this document is for the distribution of media assets for use across a large variety of applications comprising the Metaverse.
Technologies de l'information — Représentation codée de média immersifs — Partie 28: Représentations de médias basés sur des scènes interchangeables
General Information
- Status
- Published
- Publication Date
- 12-Feb-2026
- Current Stage
- 6060 - International Standard published
- Start Date
- 13-Feb-2026
- Due Date
- 30-Sep-2025
- Completion Date
- 13-Feb-2026
Overview
ISO/IEC 23090-28:2026, titled "Information technology — Coded representation of immersive media — Part 28: Interchangeable scene-based media representations," is an international standard developed by ISO/IEC JTC 1/SC 29. It specifies the architecture and data models necessary for the interchange of scene-based media, particularly targeting immersive media applications that employ 3D renderers. With the rapid growth of immersive technologies, including holographic displays and metaverse platforms, this standard facilitates interoperability and efficient distribution of complex 3D media content.
The standard introduces the concept of the independent mapping space (IMS), a metadata vocabulary that annotates scene graphs (collections of nodes representing objects and their attributes in a 3D scene). The IMS enables consistent translation and mapping between various scene-based media formats, addressing the challenges posed by the variety of proprietary and widely used 3D graphics formats.
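The annotation idea can be sketched in a few lines. This is a hypothetical illustration, not the standard's encoding: the label strings, the `ims` key, and the scene-graph representation as Python dicts are all assumptions made for the example.

```python
# Hypothetical sketch of IMS-style annotation: attach a dot-notation
# vocabulary label to nodes of a minimal scene graph so that a downstream
# translator can recognize each node's role independently of the source
# format. Label values here are illustrative, not taken from the standard.

def annotate(node: dict, ims_label: str) -> dict:
    """Return a copy of the node carrying an IMS metadata annotation."""
    annotated = dict(node)
    annotated["ims"] = ims_label
    return annotated

scene = [
    {"name": "hero_mesh", "kind": "geometry"},
    {"name": "key_light", "kind": "light"},
]
annotated_scene = [
    annotate(scene[0], "ims.logical.geometry"),
    annotate(scene[1], "ims.logical.lighting"),
]
```

Because the annotation is plain metadata attached alongside the node, the original scene data is left untouched; a renderer that does not understand the IMS can simply ignore the extra key.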
Key Topics
Scene-Based Media Representations
The standard defines architectures for organizing immersive media content as scene graphs, where each node contains spatial, logical, or temporal information. This supports media types including audio, visual, and haptic data for immersive presentations.
Independent Mapping Space (IMS)
The IMS is a central component designed to standardize metadata annotations across scene graph formats. It provides a common vocabulary to describe nodes, attributes, media types, and rendering instructions, facilitating interoperability.
Architecture and Translation Mechanisms
ISO/IEC 23090-28 specifies architectural concepts for nesting scenes, annotating scene graphs using the IMS, and translating between different scene graph representations. This enables seamless interchange and rendering across diverse platforms.
Support for Emerging Display Technologies
The standard supports immersive display technologies such as holographic and 3D aerial image displays that require dense, detailed scene descriptions beyond traditional 2D raster formats.
Terminology and Definitions
Key terms such as scene graph, node, container, renderer, serialization, and asset are clearly defined, ensuring consistency in communication and implementation.
Applications
Immersive Display Systems
Enables next-generation holographic displays and other immersive technologies to ingest and render detailed scene-based media for photorealistic experiences.
Metaverse Content Distribution
Facilitates interoperability of 3D assets across varied metaverse applications such as gaming, social media, and e-commerce by simplifying asset translation between formats.
3D Graphics Production and Workflows
Assists creators and developers in managing scene-based media assets with improved consistency and reusability across different rendering engines and platforms.
Real-Time Rendering Engines
Supports real-time 3D renderers by providing a standardized format and metadata annotation system that enhances cross-platform compatibility and visual fidelity.
Multimedia and Hypermedia Coding
Integrates with broader multimedia coding standards to enhance scene-based audio-visual-haptic media representation in digital ecosystems.
Related Standards
ISO/IEC 23090 Series
Part of a comprehensive set of standards for coded representation of immersive media. ISO/IEC TR 23090-27 provides extended architectural context relevant to this part.
Immersive Technology Media Format (ITMF)
IMS annotations leverage specifications defined in the ITMF suite for encoding complex scene attributes and metadata.
glTF 2.0
Annexes in the standard provide normative mappings and annotation processes for glTF 2.0, a widely adopted 3D asset format, highlighting its interoperability with the IMS.
ITU-R BT.2100 and Related Raster Formats
These standards, focused on 2D raster media, serve as a contrast to the richer scene description capabilities enabled by ISO/IEC 23090-28.
Open Shading Language (OSL)
The standard integrates with shading and rendering technologies such as OSL, supporting advanced material and surface descriptions in scene-based media.
ISO/IEC 23090-28 is essential for stakeholders aiming to develop or utilize interoperable, high-fidelity immersive media systems. By unifying scene-based media representations and enabling efficient format translation, it supports photorealistic rendering in immersive and metaverse experiences and advances the international landscape of coded immersive media technology.
ISO/IEC 23090-28:2026 is classified under ICS (International Classification for Standards) 35.040.40, Coding of audio, video, multimedia and hypermedia information.
Standards Content (Sample)
International Standard
ISO/IEC 23090-28
First edition
2026-02
Information technology — Coded representation of immersive media —
Part 28:
Interchangeable scene-based media representations
Technologies de l'information — Représentation codée de média immersifs —
Partie 28: Représentations de médias basés sur des scènes interchangeables
© ISO/IEC 2026
All rights reserved. Unless otherwise specified, or required in the context of its implementation, no part of this publication may
be reproduced or utilized otherwise in any form or by any means, electronic or mechanical, including photocopying, or posting on
the internet or an intranet, without prior written permission. Permission can be requested from either ISO at the address below
or ISO’s member body in the country of the requester.
ISO copyright office
CP 401 • Ch. de Blandonnet 8
CH-1214 Vernier, Geneva
Phone: +41 22 749 01 11
Email: copyright@iso.org
Website: www.iso.org
Published in Switzerland
Contents
Foreword
Introduction
1 Scope
2 Normative references
3 Terms and definitions
3.1 General
3.2 Abbreviated terms
4 Architectures for scene-based media
4.1 General
4.2 Architecture for scene graph representations
4.3 Scenes nested within scenes
4.4 Architecture for annotated scene graph using IMS
4.5 Architecture for translation between scene graph representations using IMS
5 Independent mapping space (IMS)
5.1 General
5.2 Architecture of IMS
5.3 IMS systems
5.3.1 General
5.3.2 IMS logical organization component
5.3.3 IMS physical organization component
5.3.4 IMS processing directions component
Annex A (normative) Annotation using IMS and ITMF
Annex B (normative) Mapping of IMS to glTF 2.0
Annex C (normative) Annotation process for glTF
Bibliography
Foreword
ISO (the International Organization for Standardization) and IEC (the International Electrotechnical
Commission) form the specialized system for worldwide standardization. National bodies that are
members of ISO or IEC participate in the development of International Standards through technical
committees established by the respective organization to deal with particular fields of technical activity.
ISO and IEC technical committees collaborate in fields of mutual interest. Other international organizations,
governmental and non-governmental, in liaison with ISO and IEC, also take part in the work.
The procedures used to develop this document and those intended for its further maintenance are described
in the ISO/IEC Directives, Part 1. In particular, the different approval criteria needed for the different types
of document should be noted. This document was drafted in accordance with the editorial rules of the ISO/
IEC Directives, Part 2 (see www.iso.org/directives or www.iec.ch/members_experts/refdocs).
ISO and IEC draw attention to the possibility that the implementation of this document may involve the
use of (a) patent(s). ISO and IEC take no position concerning the evidence, validity or applicability of any
claimed patent rights in respect thereof. As of the date of publication of this document, ISO and IEC had not
received notice of (a) patent(s) which may be required to implement this document. However, implementers
are cautioned that this may not represent the latest information, which may be obtained from the patent
database available at www.iso.org/patents and https://patents.iec.ch. ISO and IEC shall not be held
responsible for identifying any or all such patent rights.
Any trade name used in this document is information given for the convenience of users and does not
constitute an endorsement.
For an explanation of the voluntary nature of standards, the meaning of ISO specific terms and expressions
related to conformity assessment, as well as information about ISO's adherence to the World Trade
Organization (WTO) principles in the Technical Barriers to Trade (TBT) see www.iso.org/iso/foreword.html.
In the IEC, see www.iec.ch/understanding-standards.
This document was prepared by Joint Technical Committee ISO/IEC JTC 1, Information technology,
Subcommittee SC 29, Coding of audio, picture, multimedia and hypermedia information.
A list of all parts in the ISO/IEC 23090 series can be found on the ISO website.
Any feedback or questions on this document should be directed to the user’s national standards body. A
complete listing of these bodies can be found at www.iso.org/members.html.
Introduction
Emerging display technologies capable of creating fully formed holograms, e.g. 3D aerial images comprised
of waveforms emitted from a display panel surface, require input source media that are sufficiently dense in
information such that each of the objects in a holographic visual scene is described in terms of its complete
and explicit geometry, and surface properties to characterize how the surface responds to the presence of
light. Such visual information can furthermore facilitate display technologies to produce visual scenes of
objects that cannot be distinguished from the same objects when viewed in the natural world; i.e., the visual
information of the media is sufficiently dense such that the display is enabled to produce a photorealistic
result.
Commonly used raster media formats such as those specified in ITU-R BT.601, ITU-R BT.709, ITU-R BT.2020,
ITU-R BT.2100 were originally developed to efficiently provide information sufficient for 2D displays to
produce 2D visual presentations, albeit with varying degrees of spatial resolutions, frame rates, and pixel
depth. These raster formats, however, do not provide sufficient information, in either a practical or an efficient manner, to enable a holographic display to produce holograms. As an alternative to raster media, 3D
graphics formats used in digital content creation tools and production workflows offer a solution to provide
sufficiently dense information to enable photorealistic results where raster formats do not. The use of 3D
graphics technologies hence serves as a fundamental technology in enabling immersive displays to create
their optimal visual presentations.
To date, immersive displays provide a media input interface that is capable of ingesting 3D graphics formats
for presentation by the display. Such interfaces can be enabled and shipped as part of the onboard processing
features in the display by use of game engine software development kits. However, 3D scene media that
are targeted for processing by the display’s choice of renderer must first be translated into a format that
is consistent with the internal scene representations of the renderer. To facilitate the translation and
distribution of 3D scene media into the renderer interface provided by the display, an independent mapping
space (IMS) is specified as a part of the ISO/IEC 23090 series.
At the core of the IMS is a vocabulary of terms and definitions that can be used to annotate a scene graph
or scene asset with metadata, i.e. to facilitate the translation of media from one format to another. Such
translation from one format to another format is a commonly encountered challenge in the distribution of
scene-based media, as a number of formats have emerged to gain popularity in various application areas,
e.g., gaming, social media, e-commerce, and content production. The corresponding vocabulary of terms
and definitions describes nodes, media formats, media attributes, and render processing instructions of
commonly-used scene graph formats and media representations. Elements from the IMS can be used to
annotate a scene graph or media format so that its translation to another format is more straightforward
and consistently performed. Thus, the IMS facilitates the distribution and interoperability of a variety of
scene-based media into render applications that are limited in the types of formats that they are designed to
support.
Another application of this document is to guide the translation of media assets for use across a diverse
set of applications comprising the Metaverse, e.g., gaming, social media, retail applications. In the absence
of a common format that specifies media for use across the variety of Metaverse applications, the IMS can
facilitate the translation from one asset representation to another.
International Standard ISO/IEC 23090-28:2026(en)
Information technology — Coded representation of
immersive media —
Part 28:
Interchangeable scene-based media representations
1 Scope
This document specifies an architecture and data model for interchange of scene-based media for use in
systems and applications that employ renderers for the presentation of immersive media, including audio,
visual, tactile and other representations of media types.
The architecture and data model specified in this document applies to use cases where there is a need for
interchange of scene-based media for presentation by systems with 3D render-based technologies. One
specific application of this document is immersive display technologies that utilize a real-time renderer of
3D media to create their visual presentations, as opposed to a video decoder and renderer of 2D raster-
based media. Another application of this document is for the distribution of media assets for use across a
large variety of applications comprising the Metaverse.
2 Normative references
There are no normative references in this document.
3 Terms and definitions
3.1 General
For the purposes of this document, the following terms and definitions apply.
ISO and IEC maintain terminology databases for use in standardization at the following addresses:
— ISO Online browsing platform: available at https:// www .iso .org/ obp
— IEC Electropedia: available at https:// www .electropedia .org/
3.1.1
scene graph
general data structure commonly used by vector-based graphics editing applications and modern computer
games, that arranges the logical and often (but not necessarily) spatial representation of a graphical scene; a
collection of nodes and vertices in a graph structure
3.1.2
immersive technology media format
ITMF
suite of specifications developed by the Immersive Digital Experiences Alliance
3.1.3
node
fundamental element of the scene graph comprised of information related to the logical or spatial or
temporal representation of visual or audio information
3.1.4
node graph
collection of nodes that can be positioned or modified as a single unit and linked to the remainder of the
scene graph by linker nodes
3.1.5
node type
general category of node that performs a certain function, for example, a camera node type can be a
panoramic lens camera node, an open shading language (OSL) node, or other types of cameras
3.1.6
attribute
metadata associated with a node used to describe a particular characteristic or feature of a node either in a
canonical or more complex form (e.g. in terms of another node)
3.1.7
container
serialized format to store and exchange information to represent all natural, all synthetic, or a mixture of
synthetic and natural scenes including a scene graph and all of the media resources that are required for
rendering the scene
3.1.8
serialization
process of translating data structures or object state into a format that can be stored (for example, in a file
or memory buffer) or transmitted (for example, across a network connection link) and reconstructed later
(possibly in a different computer environment)
Note 1 to entry: When the resulting series of bits is reread according to the serialization format, it can be used to
create a semantically identical clone of the original object.
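The round-trip described in this definition and its note can be shown concretely. JSON is used here only as one possible serialization format; this entry does not mandate any particular format.

```python
# Round-trip sketch of the serialization definition in 3.1.8: a data
# structure is translated into a storable/transmittable form and later
# reconstructed as a semantically identical clone of the original object.

import json

original = {"node": "camera", "fov_degrees": 45}
serialized = json.dumps(original)   # form that can be stored or transmitted
clone = json.loads(serialized)      # reconstructed later, possibly elsewhere

# The clone is semantically identical to, but a distinct object from,
# the original.
```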
3.1.9
renderer
software-based application or process, based on a selective mixture of disciplines related to: acoustic
physics, light physics, visual perception, audio perception, mathematics, and software development, that,
given an input scene graph and asset container, emits a visual and/or audio signal suitable for presentation
on a targeted device or conforming to the desired properties as specified by attributes of a render target
node in the scene graph
Note 1 to entry: For visual-based media assets, a renderer may emit a visual signal suitable for a targeted display, or
for storage as an intermediate asset (e.g. repackaged into another container i.e. used in a series of rendering processes
in a graphics pipeline); for audio-based media assets, a renderer may emit an audio signal for presentation in a multi-
channel loudspeaker and/or binauralized headphones, or for repackaging into another (output) container.
3.1.10
scene-based media
audio, visual, haptic, and other primary types of media and media-related information organized logically
and spatially by a use of a scene graph
3.1.11
RGB
additive colour model in which red, green, and blue are the primary colours
Note 1 to entry: The abbreviation RGB is derived from the first letter of each of the primary colours.
3.1.12
asset
one of potentially multiple objects comprising the media for a scene in scene-based media; each object fully
formed and packaged such that it can be rendered separately from other objects in the scene
Note 1 to entry: Individual assets may be shared, e.g. reused across multiple scenes.
3.1.13
pin
input parameter to a node, or an output from a node
3.1.14
this document
referring to this International Standard
3.1.15
independent mapping space label
IMS label
sequence of alphanumeric characters using dot notation comprised of an IMS prefix followed by a node label, which may optionally be followed by labels derived from the ITMF Data Encoding Specification[4]
3.1.16
independent mapping space prefix
IMS prefix
sequence of alphanumeric characters using dot notation comprised of "ims" followed by an alphanumeric
label to denote the IMS component, subsequently followed by an alphanumeric label to denote the IMS
system for a particular node
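The label structure in 3.1.15 and 3.1.16 can be sketched as a small parser. The field names follow the prose ("ims", component, system, node, optional ITMF-derived labels); the concrete component and system values in the usage line are illustrative assumptions, not values defined by this document.

```python
# Minimal parser for the IMS label structure described in 3.1.15/3.1.16:
# "ims.<component>.<system>.<node>[.<itmf-derived>...]".

from typing import NamedTuple, Tuple

class IMSLabel(NamedTuple):
    component: str           # IMS architectural component
    system: str              # IMS system within that component
    node: str                # node label
    itmf_suffix: Tuple[str, ...]  # optional ITMF-derived labels

def parse_ims_label(label: str) -> IMSLabel:
    parts = label.split(".")
    if len(parts) < 4 or parts[0] != "ims":
        raise ValueError(f"not an IMS label: {label!r}")
    return IMSLabel(parts[1], parts[2], parts[3], tuple(parts[4:]))

# Illustrative label; the component/system names are assumptions.
parsed = parse_ims_label("ims.logical.lighting.pointlight")
```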
3.2 Abbreviated terms
2D two dimensional
3D three dimensional
AOV arbitrary output variables
IDEA Immersive Digital Experiences Alliance
ITMF Immersive Technology Media Format
IMS independent mapping space
OSL Open Shading Language
RGB red, green, blue
USD Universal Scene Description
glTF graphics language Transmission Format
4 Architectures for scene-based media
4.1 General
This clause illustrates architectures related to the organization of scene-based media and how such media
can be annotated by a process that stores IMS metadata into the media. Furthermore, this clause provides a
hypothetical architecture for a process that translates one scene-based media format to another.
A more in-depth description of these architectures is provided in ISO/IEC TR 23090-27.[2]
NOTE This document references only some of the architectures in ISO/IEC TR 23090-27[2] for the purposes of providing additional context.
4.2 Architecture for scene graph representations
Figure 1 illustrates an architecture for the physical representation of scene graphs for scene-based
immersive media.
NOTE The same or closely similar figure can also be drawn to illustrate a logical representation of scene-based
media.
Key
1 binary container for immersive media
2 human-readable or binary scene description
3 media asset #1
4 media asset #2
5 media asset #N
6 network
7 media asset #K
8 media asset #M
9 network-based reference from scene description to asset #K
10 local reference from scene description to asset #N stored within binary container
11 local reference from scene description to asset #1 stored within binary container
12 local reference from scene description to asset #2 stored within binary container
13 external storage reference from scene description to asset #M
Figure 1 — Architecture for scene graph representations of scene-based immersive media
In this illustration of a scene graph architecture, the description of the scene (2), including spatial, temporal, or logical relationships between objects, is stored in a human-readable format (e.g. XML[7] or JSON[13]). This human-readable portion of the scene graph comprises the information that may be annotated with metadata from the IMS. The scene is further comprised of binary assets that are stored adjacent to the human-readable description in a “container” (1), such as files that are combined, stored, and compressed into a
single archive file format (similar to files with a zip archive extension), or assets that are stored externally
from the container, e.g., either in a local or remote access network.
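The container arrangement of Figure 1 can be sketched with an ordinary zip archive: the human-readable description sits alongside locally stored binary assets, while other assets are only referenced by URI. The file names and the description schema below are illustrative assumptions, not a format defined by this document.

```python
# Sketch of the Figure 1 container: a scene description stored together
# with in-container binary assets in one archive, plus a network
# reference to an asset kept outside the container.

import io
import json
import zipfile

description = {
    "nodes": [
        {"name": "asset1", "ref": "assets/asset1.bin"},   # local reference
        {"name": "assetK", "ref": "https://example.org/assetK.bin"},  # network reference
    ]
}

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as container:
    container.writestr("scene.json", json.dumps(description))
    container.writestr("assets/asset1.bin", b"\x00\x01\x02")  # stored in-container

# Reopen the archive as a reader would.
with zipfile.ZipFile(io.BytesIO(buf.getvalue())) as container:
    names = set(container.namelist())
```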
4.3 Scenes nested within scenes
An important aspect of scene graph architectures is that there is no explicit restriction within this document
that prohibits an individual asset from comprising an entirely self-described scene as depicted in Figure 1.
For example, referring to the architecture depicted in Figure 1, assets #1, #2, #K, #M, and #N may themselves
be scenes stored in individual containers. Furthermore, these assets may likewise be annotated with IMS
metadata.
4.4 Architecture for annotated scene graph using IMS
Figure 2 illustrates an example of a scene graph that is annotated with metadata from the IMS. In the figure,
items 1A, 1B, 1C, 1D, and 1E refer to individual portions of the human readable description of the scene. Each
such portion may provide information related to the geometry or structure of the scene, and is separately
annotated with IMS metadata for that particular part of the scene. The figure also illustrates that the scene
contains four binary assets labelled: 2A, 2B, 2C, and 2D. The assets themselves may or may not be separately
annotated with IMS metadata. Furthermore, the assets themselves may be individual scenes.
Key
1A portion of human-readable scene description, e.g. XML or JSON, annotated with IMS metadata
1B portion of human-readable scene description, e.g. XML or JSON, annotated with IMS metadata
1C portion of human-readable scene description, e.g. XML or JSON, annotated with IMS metadata
1D portion of human-readable scene description, e.g. XML or JSON, annotated with IMS metadata
1E portion of human-readable scene description, e.g. XML or JSON, annotated with IMS metadata
2A binary asset A optionally annotated with IMS metadata
2B binary asset B optionally annotated with IMS metadata
2C binary asset C optionally annotated with IMS metadata
2D binary asset D optionally annotated with IMS metadata
Figure 2 — Architecture for scene graph annotated with IMS metadata
4.5 Architecture for translation between scene graph representations using IMS
A hypothetical architecture for translating one scene graph format to another scene graph format is
illustrated in Figure 3.
Key
1A portion of human-readable scene description 1 annotated with IMS metadata
1B portion of human-readable scene description 1 annotated with IMS metadata
1C portion of human-readable scene description 1 annotated with IMS metadata
1D portion of human-readable scene description 1 annotated with IMS metadata
1E portion of human-readable scene description 1 annotated with IMS metadata
2A portion of human-readable scene description 2 translated with IMS metadata from 1A
2B portion of human-readable scene description 2 translated with IMS metadata from 1B
2C portion of human-readable scene description 2 translated with IMS metadata from 1C
2D portion of human-readable scene description 2 translated with IMS metadata from 1D
2E portion of human-readable scene description 2 translated with IMS metadata from 1E
3 scene description 1 annotated with IMS metadata
4 scene description 2 translated from scene description 1 and IMS metadata
5A binary asset A associated with both scene description 1 and scene description 2
5B binary asset B associated with both scene description 1 and scene description 2
5C binary asset C associated with both scene description 1 and scene description 2
5D binary asset D associated with both scene description 1 and scene description 2
6 translation process
Figure 3 — Architecture for scene graph translation
In this figure, scene description 2 is derived via a translation process (6) from scene description 1 and its
corresponding IMS metadata. Scene description 1 is comprised of five parts labelled 1A, 1B, 1C, 1D, and
1E. Each part is annotated with corresponding IMS metadata. Associated with scene description 1 are four
binary assets, each individually labelled as: 5A, 5B, 5C, and 5D. Assets 5A, 5B, 5C, and 5D may be optionally
annotated. The combined IMS metadata and parts comprising scene description 1 are labelled as 3.
Component 3 serves as input to a translation process (6) that is guided by the IMS metadata to produce
output 4 which is scene description 2. Scene description 2 is also comprised of five parts, although there
does not need to be a one-to-one correspondence between the number of input scene description parts to
the number of output scene description parts.
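The translation process of Figure 3 can be sketched as a rule table keyed by IMS label: each annotated part of scene description 1 is mapped to the corresponding construct of scene description 2. The labels, rules, and both "formats" below are illustrative stand-ins, not normative mappings.

```python
# Hypothetical IMS-guided translation in the spirit of Figure 3: IMS
# metadata on each input part selects the rule that produces the
# corresponding output part.

RULES = {
    "ims.logical.geometry": lambda part: {"type": "Mesh", "src": part["name"]},
    "ims.logical.lighting": lambda part: {"type": "Light", "src": part["name"]},
}

def translate(description1: list) -> list:
    """Translate annotated parts; the part count need not be preserved 1:1."""
    out = []
    for part in description1:
        rule = RULES.get(part["ims"])
        if rule is not None:     # parts with no mapping are dropped here
            out.append(rule(part))
    return out

description2 = translate([
    {"name": "hero", "ims": "ims.logical.geometry"},
    {"name": "sun", "ims": "ims.logical.lighting"},
])
```

Note that, as the text observes, nothing forces a one-to-one correspondence: a rule could emit several output parts, or none.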
5 Independent mapping space (IMS)
5.1 General
The IMS defines a vocabulary and corresponding set of labels that can be used to describe scene graphs to
aid in the translation of a single scene graph format into another single scene graph format, or to aid in the
distribution of a scene graph to a particular rendering interface. While descriptions and terms that comprise
the IMS are sufficiently dense and robust to describe a scene graph capable of representing photorealistic
scenes, the IMS may also be used to describe scene representations that are not designed to represent
photorealistic visualizations.
5.2 Architecture of IMS
The IMS is comprised of the following three architectural components that describe:
— the logical organization of geometric objects and their properties within the scene;
NOTE 1 In general, immersive media can comprise media created to stimulate any of the human senses, e.g., the
visual and auditory senses. Scenes are likewise organized according to the types of media that stimulate human
senses, i.e., visual and auditory scenes. While there are different types of scenes used to create immersive media,
this document is currently limited to defining an IMS for visual scenes.
— the physical organization of the scene, e.g., for the purposes of access and storage of binary media within
the scene;
— processing directions for the presentation engine, e.g., for rendering or animation.
NOTE 2 A single scene graph can contain multiple sets of render processing directions, each set relevant to the
interface for a particular renderer.
Each of the IMS architectural components comprises systems that describe a particular system node
type, for which multiple subtypes can be further defined.
NOTE 3 For example, a “lighting” system node type can define multiple subtypes of “lighting” each of which can
satisfy the requirements of another system node that requires information about lighting.
Each system node type may be regarded as a process that is executed by a presentation engine. Each of such
processes produces an output that has a type consistent with its definition, e.g., lighting. Likewise, some of
the node types have input requirements that are described in terms of other node types. As an example, a
render instruction node type requires input information of node type “geometry” and node type of “camera,”
at a minimum, in order to produce a rendered output.
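The processing model above, in which each system node type is a process with typed inputs and a typed output, can be sketched as follows. This is an informal illustration, not normative: the class, the node-type names, and the helper are all assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class NodeType:
    name: str                    # e.g. "renderInstruction"
    output_type: str             # type of the output the process produces
    required_inputs: tuple = ()  # other node types required as inputs

# A render instruction node type requires, at a minimum, inputs of node
# type "geometry" and node type "camera" to produce a rendered output.
RENDER = NodeType("renderInstruction", "renderedOutput",
                  required_inputs=("geometry", "camera"))

def can_execute(node: NodeType, available: set) -> bool:
    """A presentation engine can run the process only when every required
    input node type is available."""
    return all(t in available for t in node.required_inputs)

print(can_execute(RENDER, {"geometry"}))            # camera input missing
print(can_execute(RENDER, {"geometry", "camera"}))  # all inputs present
```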
5.3 IMS systems
5.3.1 General
This subclause specifies the individual systems that comprise the IMS architectural components for logical
organization, physical organization, and processing directions. Associated with each system is a type, label,
description, and hypothetical rendering semantics, i.e., a processing model, for how a renderer can interpret
a node belonging to each system.
NOTE These systems are designed to mirror the structure, representation, and encoding of scene-based media
as specified by the ITMF Scene Graph Specification,[3] ITMF Data Encoding Specification, and ITMF Container
Specification.[5]
Associated with each system is one or more nodes that belong to the system. Figure 4 provides a canonical
representation of how the IMS systems relate to each other.
Key
1 canonical IMS values (e.g., integer, enumeration, float, …)
2 IMS systems for logical organization, physical organization, and processing directions
3 input or output pin
4 attribute
5 node graph
6 logical organization of geometry
7 processing directions
8 physical organization of scene graph
9 presentation engine (not a system in the IMS)
10 rendered output (not a system in the IMS)
Figure 4 — Canonical representation of IMS geometric objects within the scene
An IMS value (item 1) represents an input value from the value node system to any one of the other IMS
systems (item 2) including systems designated for logical organization, physical organization, and
processing directions for the scene.
Input or output relationships between any of the system nodes are established with IMS data pins (item
3). Attributes represent other IMS system nodes or value nodes that are considered to be immutable (not
changeable by rendering processes). A collection of system nodes is designated to describe the logical
organization of the geometry (item 6); processing directions (item 7); and physical organization (item 8)
in the scene. A node graph (item 5) represents a collection of other IMS system nodes that may also include
attributes or render-specific metadata (not in scope of this document). Inputs to a presentation engine (item
9) include information about the logical organization, processing directions, and physical organization (not
shown in the figure) to produce a rendered output (item 10).
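The relationships in Figure 4 can be modelled informally as follows. The class, pin names, and attribute names below are hypothetical, introduced only for illustration; the one point taken from the text is that attributes are immutable with respect to rendering processes, while pins carry input/output relationships between nodes.

```python
class Node:
    """Illustrative IMS system node (assumption, not normative)."""

    def __init__(self, system_label, attributes=None):
        self.system_label = system_label          # e.g. "ims.logical.geometry"
        # Attributes (item 4) are not changeable by rendering processes;
        # here they are copied once and never written to again.
        self.attributes = dict(attributes or {})
        self.inputs = {}                          # pin name -> upstream Node

    def connect(self, pin, source):
        """Establish an input/output relationship via a data pin (item 3)."""
        self.inputs[pin] = source

# A value node (item 1) feeding a geometry node through a pin, as in Figure 4
scale = Node("ims.logical.sysVal.float", {"value": 1.5})
geometry = Node("ims.logical.geometry")
geometry.connect("scale", scale)
```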
5.3.2 IMS logical organization component
IMS systems used to describe the logical organization component of a scene are identified by a label with a
prefix of "ims.logical". Table 1 specifies the IMS systems that comprise the logical organization component.
Table 1 — Summary of IMS systems for logical organization of the scene

System node type | IMS system label | Description | Hypothetical renderer semantics
system value node | ims.logical.sysVal | An input value (e.g., integers, single-precision floats, spatial coordinates, vectors) to an associated node. | Values provided by system value nodes are used by renderers for processing of other nodes to which the values are associated.
material node | ims.logical.material | Surface properties of individual geometric objects or groups of geometric objects in the scene. | Materials provide characteristics (e.g., porousness, bumps, metallic properties) for the surfaces of individual objects that are rendered in the scene.
texture node | ims.logical.texture | Colour and brightness for a surface corresponding to a geometric object or group of geometric objects in the scene. | Textures define individual colours and brightness for surfaces of individual objects that are rendered in the scene.
geometry node | ims.logical.geometry | Geometric shape of an object or group of objects in a scene or geometry archive. | Geometry defines the 3D shape or volume of individual objects that are rendered in the scene.
surface (medium) node | ims.logical.surface | A description of how light is reflected, refracted, or absorbed by an object. | The behaviour of individual rays in ray-tracing algorithms is characterized by formulae provided in surface nodes.
transformation node | ims.logical.transform | A description of how geometric objects are scaled, translated, or rotated. | The parameters are used to transform individual geometric objects within a scene.
connection pins | ims.logical.pins.input (input pin), ims.logical.pins.output (output pin) | Parameters used as input to a node, or a single output from a node, i.e., input data pin(s) or output data pin. | One or more data pins are inputs to a node, or a single data pin is an output from a node.
data attributes | ims.logical.attribute | Non-mutable characteristics, features, or values of or for an object. | The renderer does not allow other nodes to alter attribute values during serialization.
scene object | ims.logical.sceneObject | A geometric object, possibly animated, created using a digital content creation tool, and included in a composited scene. | Denotes use of Alembic,[9] Universal Scene Description (USD), glTF, and Filmbox or other formats as individual geometry assets, possibly for a larger scene.
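The dot-notation structure of the labels in Table 1 (see also 3.1.15 and 3.1.16) can be illustrated with a small parsing helper. The function below is hypothetical, not part of this document; it merely splits a label into the "ims" marker, the component, the system, and any remaining node label.

```python
def parse_ims_label(label: str) -> dict:
    """Split an IMS label written in dot notation into its parts.
    Illustrative helper; not a normative validator."""
    parts = label.split(".")
    if len(parts) < 3 or parts[0] != "ims":
        raise ValueError(f"not an IMS label: {label!r}")
    return {
        "component": parts[1],  # e.g. "logical", "physical", "processing"
        "system": parts[2],     # e.g. "material", "texture", "geometry"
        # Remaining segments name a node within the system; a bare system
        # label such as "ims.logical.material" has no node part.
        "node": ".".join(parts[3:]) or None,
    }

print(parse_ims_label("ims.logical.material.diffuse"))
```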
5.3.2.1 Value nodes system
Table 2 specifies the types of values and labels that comprise the value nodes system within the IMS. IMS
labels for value nodes for use within the logical organization of the scene are identified by an IMS prefix of
‘ims.logical.sysVal’. The ITMF Data Encoding Specification provides additional information regarding usage
of the value nodes system. Annex A of this document provides information about how to use the ITMF Data
Encoding Specification with the IMS.
Table 2 — Nodes for value node system for logical organization component
Node   ITMF node code point   Node label   IMS label
float value 6 float ims.logical.sysVal.float
int value 9 integer ims.logical.sysVal.integer
boolean value 11 boolean ims.logical.sysVal.boolean
image resolution 12 imageResolution ims.logical.sysVal.imageResolution
sun direction 30 sunDirection ims.logical.sysVal.sunDirection
enumeration 57 enumeration ims.logical.sysVal.enumeration
annotation 68 annotation ims.logical.sysVal.annotation
string 84 string ims.logical.sysVal.string
file 88 file ims.logical.sysVal.file
directory 101 directoryName ims.logical.sysVal.directoryName
bit mask 132 bitMask ims.logical.sysVal.bitMask
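The correspondence in Table 2 between ITMF node code points and IMS value-node labels can be expressed as a simple lookup. The dictionary below copies the rows of Table 2; the helper function name and its error handling are assumptions for illustration only.

```python
# Rows of Table 2: ITMF code point -> IMS value-node label
ITMF_CODE_TO_IMS_VALUE = {
    6: "ims.logical.sysVal.float",
    9: "ims.logical.sysVal.integer",
    11: "ims.logical.sysVal.boolean",
    12: "ims.logical.sysVal.imageResolution",
    30: "ims.logical.sysVal.sunDirection",
    57: "ims.logical.sysVal.enumeration",
    68: "ims.logical.sysVal.annotation",
    84: "ims.logical.sysVal.string",
    88: "ims.logical.sysVal.file",
    101: "ims.logical.sysVal.directoryName",
    132: "ims.logical.sysVal.bitMask",
}

def ims_value_label(code_point: int) -> str:
    """Return the IMS value-node label for an ITMF code point."""
    try:
        return ITMF_CODE_TO_IMS_VALUE[code_point]
    except KeyError:
        raise KeyError(
            f"ITMF code point {code_point} is not a value node") from None

print(ims_value_label(84))  # ims.logical.sysVal.string
```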
5.3.2.2 Material nodes system
Table 3 specifies the node names and corresponding labels that comprise the material nodes system within
the IMS. IMS labels for material nodes are identified by an IMS prefix of "ims.logical.material". The ITMF
Data Encoding Specification provides additional information regarding usage of the material nodes system.
Annex A of this document provides information about how to use the ITMF Data Encoding Specification with
the IMS.
Table 3 — Nodes for material nodes system
Node   ITMF node code point   Node label   IMS label
material map 2 map ims.logical.material.map
glossy material 16 glossy ims.logical.material.glossy
diffuse material 17 diffuse ims.logical.material.diffuse
specular material 18 specular ims.logical.material.specular
mix material 19 mix ims.logical.material.mix
portal material 20 portal ims.logical.material.portal
metallic material 120 metallic ims.logical.material.metallic
toon material 121 toon ims.logical.material.toon
toon ramp material 122 toonRamp ims.logical.material.toonRamp
universal material 130 universal ims.logical.material.universal
composite material 138 composite ims.logical.material.composite
specular material layer 139 specularMaterialLayer ims.logical.material.specularMaterialLayer
diffuse material layer 140 diffuseMaterialLayer ims.logical.material.diffuseMaterialLayer
metallic material layer 141 metallicMaterialLayer ims.logical.material.metallicMaterialLayer
sheen material layer 142 sheenMaterialLayer ims.logical.material.sheenMaterialLayer
layered material 143 layered ims.logical.material.layered
material layer group 144 materialLayerGroup ims.logical.material.materialLayerGroup
shadow catcher 145 shadowCatcher ims.logical.material.shadowCatcher
hair material 147 hair ims.logical.material.hair
null material 159 null ims.logical.material.null
clipping material 178 clipping ims.logical.material.clipping
5.3.2.3 Texture nodes system
Table 4 specifies the node names and corresponding labels that comprise the texture nodes system within
the IMS. IMS labels for texture nodes are identified by an IMS prefix of ‘ims.logical.texture’. The ITMF Data
Encoding Specification provides additional information regarding usage of the texture nodes system.
Annex A of this document provides information about how to use the ITMF Data Encoding Specification with
the IMS.
Table 4 — Nodes for texture nodes system
Node   ITMF node code point   Node label   IMS label
turbulence texture 22 turbulence ims.logical.texture.turbulence
grayscale colour 31 grayscaleColour ims.logical.texture.grayscaleColour
gaussian spectrum 32 gaussianSpectrum ims.logical.texture.gaussianSpectrum
RGB colour 33 rgbColour ims.logical.texture.rgbColour
RGB image 34 rgbImage ims.logical.texture.rgbImage
alpha image 35 alphaImage ims.logical.texture.alphaImage
grayscale image 36 grayscaleImage ims.logical.texture.grayscaleImage
mix texture 38 mix ims.logical.texture.mix
multiply texture 39 multiply ims.logical.texture.multiply
cosine mix texture 40 cosineMix ims.logical.texture.cosineMix
clamp texture 41 clamp ims.logical.texture.clamp
saw wave texture 42 sawWave ims.logical.texture.sawWave
triangle wave texture 43 triangleWave ims.logical.texture.triangleWave
sine wave texture 44 sineWave ims.logical.texture.sineWave
checks texture 45 checks ims.logical.texture.checks
invert texture 46 invert ims.logical.texture.invert
marble texture 47 marble ims.logical.texture.marble
rigid fractal texture 48 rigidFractal ims.logical.texture.rigidFractal
gradient map texture 49 gradientMap ims.logical.texture.gradientMap
falloff map 50 falloffMap ims.logical.texture.falloffMap
colour correction 51 colourCorrection ims.logical.texture.colourCorrection
dirt texture 63 dirt ims.logical.texture.dirt
cylindrical 74 cylindrical ims.logical.texture.cylindrical
linear XYZ to UVW 75 linearXYZtoUVW ims.logical.texture.linearXYZtoUVW
perspective 76 perspective ims.logical.texture.perspective
spherical 77 spherical ims.logical.texture.spherical
mesh UV 78 meshUV ims.logical.texture.meshUV
box 79 box ims.logical.texture.box
displacement 80 displacement ims.logical.texture.displacement
random colour texture 81 randomColour ims.logical.texture.randomColour
noise texture 87 noise ims.logical.texture.noise
polygon side 89 polygonSide ims.logical.texture.polygonSide
vertex displacement 97 vertexDisplacement ims.logical.texture.vertexDisplacement
W-coordinate 104 wCoordinate ims.logical.texture.wCoordinate
add texture 106 add ims.logical.texture.add
comparison 107 comparison ims.logical.texture.comparison
subtract texture 108 subtract ims.logical.texture.subtract
triplanar map 109 triplanarMap ims.logical.texture.triplanarMap
triplanar 111 triplanar ims.logical.texture.triplanar
instance colour 113 instanceColour ims.logical.texture.instanceColour
instance range 114 instanceRange ims.logical.texture.instanceRange
baking texture 115 baking ims.logical.texture.baking
OSL texture 117 oslTexture ims.logical.texture.oslTexture
UVW transform 118 uvwTransform ims.logical.texture.uvwTransform
OSL 125 osl ims.logical.texture.osl
OSL delayed UV 127 oslDelayedUV ims.logical.texture.oslDelayedUV
image tiles 131 imageTiles ims.logical.texture.imageTiles
colour vertex attribute texture 135 colourVertexAttribute ims.logical.texture.colourVertexAttribute
grayscale colour vertex attribute texture 136 grayscaleColourVertexAttribute ims.logical.texture.grayscaleColourVertexAttribute
vertex displacement mixer 151 vertexDisplacementMixer ims.logical.texture.vertexDisplacementMixer
spotlight distribution 158 spotlightDistribution ims.logical.texture.spotlightDistribution
Cinema4D noise texture 162 cinema4dNoise ims.logical.texture.cinema4dNoise
chaos texture 170 chaos ims.logical.texture.chaos
channel picker 171 channelPicker ims.logical.texture.channelPicker
channel merger 172 channelMerger ims.logical.texture.channelMerger
ray switch 173 raySwitch ims.logical.texture.raySwitch
channel inverter 174 channelInverter ims.logical.texture.channelInverter
channel mapper 175 channelMapper ims.logical.texture.channelMapper
composite texture 176 composite ims.logical.texture.composite
mesh UV 177 meshUV ims.logical.texture.meshUV
iridescent texture 187 iridescent ims.logical.texture.iridescent
volume to texture 256 volumeToTexture ims.logical.texture.volumeToTexture
colour to UVW 258 colourToUVW ims.logical.texture.colourToUVW
smooth Voronoi contours texture 260 smoothVoronoiContours ims.logical.texture.smoothVoronoiContours
tile patterns texture 261 tilePatterns ims.logical.texture.tilePatterns
procedural effects texture 262 proceduralEffects ims.logical.texture.proceduralEffects
chainmail texture 263 chainmail ims.logical.texture.chainmail
Moire mosaic texture 264 moireMosaic ims.logical.texture.moireMosaic
colour squares texture 265 colourSquares ims.logical.texture.colourSquares
stripes texture 266 stripes ims.logical.texture.stripes
flakes texture 267 flakes ims.logical.texture.flakes
fractal texture 268 fractal ims.logical.texture.fractal
hagelslag texture 269 hagelslag ims.logical.texture.hagelslag
glowing circle texture 270 glowingCircle ims.logical.texture.glowingCircle
curvature texture 271 curvature ims.logical.texture.curvature
cell noise texture 275 cellNoise ims.logical.texture.cellNoise
sample position to UV 317 samplePositionToUV ims.logical.texture.samplePositionToUV
composite texture layer 318 compositeTextureLayer ims.logical.texture.compositeTextureLayer
floats to colour 320 floatsToColour ims.logical.texture.floatsToColour
float3 to colour 321 float3ToColour ims.logical.texture.float3ToColour
distorted mesh UV 322 distortedMeshUV ims.logical.texture.distortedMeshUV
capture to custom AOV 323 captureToCustomAOV ims.logical.texture.captureToCustomAOV
float to greyscale 324 floatToGreyscale ims.logical.texture.floatToGreyscale
ray direction 326 rayDirection ims.logical.texture.rayDirection
normal texture 327 normal ims.logical.texture.normal
position texture 328 position ims.logical.texture.position
relative distance 329 relativeDistance ims.logical.texture.relativeDistance
UV coordinate 330 uvCoordinate ims.logical.texture.uvCoordinate
Z depth 331 zDepth ims.logical.texture.zDepth
gradient generator texture 332 gradientGenerator ims.logical.texture.gradientGenerator
random map 333 randomMap ims.logical.texture.randomMap
range texture 334 range ims.logical.texture.range
surface tangent uPdu 335 surfaceTangentUPdu ims.logical.texture.surfaceTangentUPdu
surface tangent uPdv 336 surfaceTangentUPdv ims.logical.texture.surfaceTangentUPdv
sample position 337 samplePosition ims.logical.texture.samplePosition
matcap 338 matcap ims.logical.texture.matcap
binary math opera
...
FINAL DRAFT
International Standard
ISO/IEC FDIS 23090-28

ISO/IEC JTC 1/SC 29
Secretariat: JISC
Voting begins on: 2025-11-13
Voting terminates on: 2026-01-08

Information technology — Coded representation of immersive media —
Part 28: Interchangeable scene-based media representations

Technologies de l'information — Représentation codée de média immersifs —
Partie 28: Représentations de médias basés sur des scènes interchangeables

RECIPIENTS OF THIS DRAFT ARE INVITED TO SUBMIT, WITH THEIR COMMENTS, NOTIFICATION OF ANY RELEVANT PATENT RIGHTS OF WHICH THEY ARE AWARE AND TO PROVIDE SUPPORTING DOCUMENTATION.

IN ADDITION TO THEIR EVALUATION AS BEING ACCEPTABLE FOR INDUSTRIAL, TECHNOLOGICAL, COMMERCIAL AND USER PURPOSES, DRAFT INTERNATIONAL STANDARDS MAY ON OCCASION HAVE TO BE CONSIDERED IN THE LIGHT OF THEIR POTENTIAL TO BECOME STANDARDS TO WHICH REFERENCE MAY BE MADE IN NATIONAL REGULATIONS.

Reference number: ISO/IEC FDIS 23090-28:2025(en)
© ISO/IEC 2025
© ISO/IEC 2025
All rights reserved. Unless otherwise specified, or required in the context of its implementation, no part of this publication may be reproduced or utilized otherwise in any form or by any means, electronic or mechanical, including photocopying, or posting on the internet or an intranet, without prior written permission. Permission can be requested from either ISO at the address below or ISO’s member body in the country of the requester.

ISO copyright office
CP 401 • Ch. de Blandonnet 8
CH-1214 Vernier, Geneva
Phone: +41 22 749 01 11
Email: copyright@iso.org
Website: www.iso.org

Published in Switzerland
Contents

Foreword
Introduction
1 Scope
2 Normative references
3 Terms and definitions
3.1 General
3.2 Abbreviated terms
4 Architectures for scene-based media
4.1 General
4.2 Architecture for scene graph representations
4.3 Scenes nested within scenes
4.4 Architecture for annotated scene graph using IMS
4.5 Architecture for translation between scene graph representations using IMS
5 Independent mapping space (IMS)
5.1 General
5.2 Architecture of IMS
5.3 IMS systems
5.3.1 General
5.3.2 IMS logical organization component
5.3.3 IMS physical organization component
5.3.4 IMS processing directions component
Annex A (normative) Annotation using IMS and ITMF
Annex B (normative) Mapping of IMS to glTF 2.0
Annex C (normative) Annotation process for glTF
Bibliography
Foreword
ISO (the International Organization for Standardization) and IEC (the International Electrotechnical
Commission) form the specialized system for worldwide standardization. National bodies that are
members of ISO or IEC participate in the development of International Standards through technical
committees established by the respective organization to deal with particular fields of technical activity.
ISO and IEC technical committees collaborate in fields of mutual interest. Other international organizations,
governmental and non-governmental, in liaison with ISO and IEC, also take part in the work.
The procedures used to develop this document and those intended for its further maintenance are described
in the ISO/IEC Directives, Part 1. In particular, the different approval criteria needed for the different types
of document should be noted. This document was drafted in accordance with the editorial rules of the ISO/
IEC Directives, Part 2 (see www.iso.org/directives or www.iec.ch/members_experts/refdocs).
ISO and IEC draw attention to the possibility that the implementation of this document may involve the
use of (a) patent(s). ISO and IEC take no position concerning the evidence, validity or applicability of any
claimed patent rights in respect thereof. As of the date of publication of this document, ISO and IEC had not
received notice of (a) patent(s) which may be required to implement this document. However, implementers
are cautioned that this may not represent the latest information, which may be obtained from the patent
database available at www.iso.org/patents and https://patents.iec.ch. ISO and IEC shall not be held
responsible for identifying any or all such patent rights.
Any trade name used in this document is information given for the convenience of users and does not
constitute an endorsement.
For an explanation of the voluntary nature of standards, the meaning of ISO specific terms and expressions
related to conformity assessment, as well as information about ISO's adherence to the World Trade
Organization (WTO) principles in the Technical Barriers to Trade (TBT) see www.iso.org/iso/foreword.html.
In the IEC, see www.iec.ch/understanding-standards.
This document was prepared by Joint Technical Committee ISO/IEC JTC 1, Information Technology,
Subcommittee SC 29, Coding of audio, picture, multimedia and hypermedia information.
A list of all parts in the ISO/IEC 23090 series can be found on the ISO website.
Any feedback or questions on this document should be directed to the user’s national standards body. A
complete listing of these bodies can be found at www.iso.org/members.html.
Introduction
Emerging display technologies capable of creating fully formed holograms, e.g. 3D aerial images comprised
of waveforms emitted from a display panel surface, require input source media that are sufficiently dense in
information such that each of the objects in a holographic visual scene is described in terms of its complete and
explicit geometry, and surface properties to characterize how the surface responds to the presence of light.
Such visual information can furthermore enable display technologies to produce visual scenes of objects that
cannot be distinguished from the same objects when viewed in the natural world; i.e., the visual information
of the media is sufficiently dense that the display can produce a photorealistic result.
Commonly used raster media formats such as those specified in ITU-R BT.601, ITU-R BT.709, ITU-R BT.2020,
ITU-R BT.2100 were originally developed to efficiently provide information sufficient for 2D displays to
produce 2D visual presentations, albeit with varying degrees of spatial resolutions, frame rates, and pixel
depth. These raster formats, however, do not provide sufficient information, in a practical or efficient
manner, to enable a holographic display to produce holograms. As an alternative to raster media, 3D
graphics formats used in digital content creation tools and production workflows offer a solution to provide
sufficiently dense information to enable photorealistic results where raster formats do not. The use of 3D
graphics technologies hence serves as a fundamental technology in enabling immersive displays to create
their optimal visual presentations.
To date, immersive displays provide a media input interface that is capable of ingesting 3D graphics formats
for presentation by the display. Such interfaces can be enabled and shipped as part of the onboard processing
features in the display by use of game engine software development kits. However, 3D scene media that
are targeted for processing by the display’s choice of renderer must first be translated into a format that
is consistent with the internal scene representations of the renderer. To facilitate the translation and
distribution of 3D scene media into the renderer interface provided by the display, an independent mapping
space (IMS) is specified as a part of the ISO/IEC 23090 series.
At the core of the IMS is a vocabulary of terms and definitions that can be used to annotate a scene graph
or scene asset with metadata in order to facilitate the translation of media from one format to another. Such
translation from one format to another format is a commonly encountered challenge in the distribution of
scene-based media, as a number of formats have emerged to gain popularity in various application areas,
e.g., gaming, social media, e-commerce, and content production. The corresponding vocabulary of terms
and definitions describes nodes, media formats, media attributes, and render processing instructions of
commonly-used scene graph formats and media representations. Elements from the IMS can be used to
annotate a scene graph or media format so that its translation to another format is more straightforward
and consistently performed. Thus, the IMS facilitates the distribution and interoperability of a variety of
scene-based media into render applications that are limited in the types of formats that they are designed to
support.
Another application of this document is to guide the translation of media assets for use across a diverse
set of applications comprising the Metaverse, e.g., gaming, social media, retail applications. In the absence
of a common format that specifies media for use across the variety of Metaverse applications, the IMS can
facilitate the translation from one asset representation to another.
Information technology — Coded representation of
immersive media —
Part 28:
Interchangeable scene-based media representations
1 Scope
This document specifies an architecture and data model for interchange of scene-based media for use in
systems and applications that employ renderers for the presentation of immersive media, including audio,
visual, tactile and other representations of media types.
The architecture and data model specified in this document apply to use cases where there is a need for
interchange of scene-based media for presentation by systems with 3D render-based technologies. One
specific application of this document is immersive display technologies that utilize a real-time renderer of
3D media to create their visual presentations, as opposed to a video decoder and renderer of 2D raster-
based media. Another application of this document is for the distribution of media assets for use across a
large variety of applications comprising the Metaverse.
2 Normative references
There are no normative references in this document.
3 Terms and definitions
3.1 General
For the purposes of this document, the following terms and definitions apply.
ISO and IEC maintain terminology databases for use in standardization at the following addresses:
— ISO Online browsing platform: available at https://www.iso.org/obp
— IEC Electropedia: available at https://www.electropedia.org/
3.1.1
scene graph
general data structure commonly used by vector-based graphics editing applications and modern computer
games, that arranges the logical and often (but not necessarily) spatial representation of a graphical scene; a
collection of nodes and vertices in a graph structure
3.1.2
immersive technology media format
ITMF
suite of specifications developed by the Immersive Digital Experiences Alliance
3.1.3
node
fundamental element of the scene graph comprised of information related to the logical or spatial or
temporal representation of visual or audio information
3.1.4
node graph
collection of nodes that can be positioned or modified as a single unit and linked to the remainder of the
scene graph by linker nodes
3.1.5
node type
general category of node that performs a certain function, for example, a camera node type can be a
panoramic lens camera node, an open shading language (OSL) node, or other types of cameras
3.1.6
attribute
metadata associated with a node used to describe a particular characteristic or feature of a node either in a
canonical or more complex form (e.g. in terms of another node)
3.1.7
container
serialized format to store and exchange information to represent all natural, all synthetic, or a mixture of
synthetic and natural scenes including a scene graph and all of the media resources that are required for
rendering the scene
3.1.8
serialization
process of translating data structures or object state into a format that can be stored (for example, in a file
or memory buffer) or transmitted (for example, across a network connection link) and reconstructed later
(possibly in a different computer environment)
Note 1 to entry: When the resulting series of bits is reread according to the serialization format, it can be used to
create a semantically identical clone of the original object.
3.1.9
renderer
software-based application or process, based on a selective mixture of disciplines related to: acoustic
physics, light physics, visual perception, audio perception, mathematics, and software development, that,
given an input scene graph and asset container, emits a visual and/or audio signal suitable for presentation
on a targeted device or conforming to the desired properties as specified by attributes of a render target
node in the scene graph
Note 1 to entry: For visual-based media assets, a renderer may emit a visual signal suitable for a targeted display, or
for storage as an intermediate asset (e.g. repackaged into another container i.e. used in a series of rendering processes
in a graphics pipeline); for audio-based media assets, a renderer may emit an audio signal for presentation in a multi-
channel loudspeaker and/or binauralized headphones, or for repackaging into another (output) container.
3.1.10
scene-based media
audio, visual, haptic, and other primary types of media and media-related information organized logically
and spatially by a use of a scene graph
3.1.11
RGB
additive colour model in which red, green, and blue are the primary colours
Note 1 to entry: The abbreviation RGB is derived from the first letter of each of the primary colours.
3.1.12
asset
one of potentially multiple objects comprising the media for a scene in scene-based media; each object fully
formed and packaged such that it can be rendered separately from other objects in the scene
Note 1 to entry: Individual assets may be shared, e.g. reused across multiple scenes.
3.1.13
pin
input parameter to a node, or an output from a node
3.1.14
this document
referring to this International Standard
3.1.15
independent mapping space label
IMS label
sequence of alphanumeric characters using dot notation comprised of an IMS prefix followed by a node label,
which may optionally be followed by labels derived from the ITMF Data Encoding Specification[4]
3.1.16
independent mapping space prefix
IMS prefix
sequence of alphanumeric characters using dot notation comprised of "ims" followed by an alphanumeric
label to denote the IMS component, subsequently followed by an alphanumeric label to denote the IMS
system for a particular node
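The following informative example (not part of this specification) sketches in Python how an IMS label can be composed from an IMS prefix and a node label using dot notation, as described in 3.1.15 and 3.1.16. The helper function names are illustrative assumptions, not defined by this document.

```python
# Illustrative sketch only: composing IMS labels per 3.1.15/3.1.16.
# An IMS prefix is "ims" + component label + system label; an IMS label
# appends a node label and, optionally, further labels.

def make_ims_prefix(component: str, system: str) -> str:
    """Build an IMS prefix such as 'ims.logical.texture'."""
    for part in (component, system):
        if not part.isalnum():
            raise ValueError(f"IMS prefix parts must be alphanumeric: {part!r}")
    return ".".join(("ims", component, system))

def make_ims_label(prefix: str, node_label: str, *extra: str) -> str:
    """Append a node label (and optional further labels) to an IMS prefix."""
    return ".".join((prefix, node_label) + extra)

prefix = make_ims_prefix("logical", "texture")
label = make_ims_label(prefix, "marble")
print(label)  # ims.logical.texture.marble
```

The resulting label matches the entries listed in the tables of Clause 5 (e.g. ims.logical.texture.marble in Table 4).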
3.2 Abbreviated terms
2D two dimensional
3D three dimensional
AOV arbitrary output variables
IDEA Immersive Digital Experiences Alliance
ITMF Immersive Technology Media Format
IMS independent mapping space
OSL Open Shading Language
RGB red, green, blue
USD Universal Scene Description
glTF graphics language Transmission Format
4 Architectures for scene-based media
4.1 General
This clause illustrates architectures related to the organization of scene-based media and how such media
can be annotated by a process that stores IMS metadata into the media. Furthermore, this clause provides a
hypothetical architecture for a process that translates one scene-based media format to another.
A more in-depth description of these architectures is provided in ISO/IEC TR 23090-27.[2]
NOTE This document references only some of the architectures in ISO/IEC TR 23090-27[2] for the purposes of
providing additional context.
4.2 Architecture for scene graph representations
Figure 1 illustrates an architecture for the physical representation of scene graphs for scene-based
immersive media.
NOTE The same or closely similar figure can also be drawn to illustrate a logical representation of scene-based media.
Key
1 binary container for immersive media
2 human-readable or binary scene description
3 media asset #1
4 media asset #2
5 media asset #N
6 network
7 media asset #K
8 media asset #M
9 network-based reference from scene description to asset #K
10 local reference from scene description to asset #N stored within binary container
11 local reference from scene description to asset #1 stored within binary container
12 local reference from scene description to asset #2 stored within binary container
13 external storage reference from scene description to asset #M
Figure 1 — Architecture for scene graph representations of scene-based immersive media
In this illustration of a scene graph architecture, the description of the scene (2), including spatial, temporal,
or logical relationships between objects, is stored in a human-readable format (e.g. XML[7] or JSON[13]). This
human-readable portion of the scene graph comprises the information that may be annotated with metadata
from the IMS. The scene is further comprised of binary assets that are stored adjacent to the human-readable
description in a "container" (1), such as files that are combined, stored, and compressed into a
single archive file format (similar to files with a zip archive extension), or assets that are stored externally
from the container, e.g. in a local or remote access network.
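The container arrangement described above can be sketched as follows. This is an informative illustration only: the file names, the JSON layout, and the use of a zip archive are assumptions for the example, not a normative container format.

```python
# Illustrative sketch: a scene description packaged with local binary assets
# in a single zip archive, while other assets are referenced by network URI,
# mirroring the local and network references of Figure 1.
import io
import json
import zipfile

scene_description = {
    "assets": [
        {"id": "asset1", "uri": "assets/asset1.bin"},           # local reference
        {"id": "assetK", "uri": "https://example.com/assetK"},  # network reference
    ]
}

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as container:
    container.writestr("scene.json", json.dumps(scene_description))
    container.writestr("assets/asset1.bin", b"\x00\x01")  # binary asset stored locally

with zipfile.ZipFile(buf) as container:
    names = container.namelist()
print(names)  # ['scene.json', 'assets/asset1.bin']
```

Assets referenced by network URI remain outside the archive and are resolved at access time, as indicated by references 9 and 13 in Figure 1.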
4.3 Scenes nested within scenes
An important aspect of scene graph architectures is that there is no explicit restriction within this document
that prohibits an individual asset from comprising an entirely self-described scene as depicted in Figure 1.
For example, referring to the architecture depicted in Figure 1, assets 1, 2, K, M, and N may themselves
be scenes stored in individual containers. Furthermore, these assets may likewise be annotated with IMS
metadata.
4.4 Architecture for annotated scene graph using IMS
Figure 2 illustrates an example of a scene graph that is annotated with metadata from the IMS. In the figure,
items 1A, 1B, 1C, 1D, and 1E refer to individual portions of the human-readable description of the scene. Each
such portion may provide information related to the geometry or structure of the scene, and is separately
annotated with IMS metadata for that particular part of the scene. The figure also illustrates that the scene
contains four binary assets labelled: 2A, 2B, 2C, and 2D. The assets themselves may or may not be separately
annotated with IMS metadata. Furthermore, the assets themselves may be individual scenes.
Key
1A portion of human-readable scene description, e.g. XML or JSON, annotated with IMS metadata
1B portion of human-readable scene description, e.g. XML or JSON, annotated with IMS metadata
1C portion of human-readable scene description, e.g. XML or JSON, annotated with IMS metadata
1D portion of human-readable scene description, e.g. XML or JSON, annotated with IMS metadata
1E portion of human-readable scene description, e.g. XML or JSON, annotated with IMS metadata
2A binary asset A optionally annotated with IMS metadata
2B binary asset B optionally annotated with IMS metadata
2C binary asset C optionally annotated with IMS metadata
2D binary asset D optionally annotated with IMS metadata
Figure 2 — Architecture for scene graph annotated with IMS metadata
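The annotation of scene description portions shown in Figure 2 can be sketched as follows. This informative example assumes a JSON scene description and an "imsLabel" key for carrying the annotation; this document does not mandate a particular annotation syntax for every scene graph format, and the geometry entry uses only the system prefix because the individual geometry node labels are outside this excerpt.

```python
# Illustrative sketch: attaching IMS labels as metadata to portions of a
# human-readable (JSON) scene description, as in Figure 2.

scene = {
    "nodes": [
        {"name": "floorMaterial", "type": "diffuse"},
        {"name": "floorShape", "type": "mesh"},
    ]
}

# Map each portion's native type to the IMS label describing its role.
annotations = {
    "diffuse": "ims.logical.material.diffuse",  # from Table 3
    "mesh": "ims.logical.geometry",             # system prefix (assumption)
}
for node in scene["nodes"]:
    node["imsLabel"] = annotations[node["type"]]

print(scene["nodes"][0]["imsLabel"])  # ims.logical.material.diffuse
```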
4.5 Architecture for translation between scene graph representations using IMS
A hypothetical architecture for translating one scene graph format to another scene graph format is
illustrated in Figure 3.
Key
1A portion of human-readable scene description 1 annotated with IMS metadata
1B portion of human-readable scene description 1 annotated with IMS metadata
1C portion of human-readable scene description 1 annotated with IMS metadata
1D portion of human-readable scene description 1 annotated with IMS metadata
1E portion of human-readable scene description 1 annotated with IMS metadata
2A portion of human-readable scene description 2 translated with IMS metadata from 1A
2B portion of human-readable scene description 2 translated with IMS metadata from 1B
2C portion of human-readable scene description 2 translated with IMS metadata from 1C
2D portion of human-readable scene description 2 translated with IMS metadata from 1D
2E portion of human-readable scene description 2 translated with IMS metadata from 1E
3 scene description 1 annotated with IMS metadata
4 scene description 2 translated from scene description 1 and IMS metadata
5A binary asset A associated with both scene description 1 and scene description 2
5B binary asset B associated with both scene description 1 and scene description 2
5C binary asset C associated with both scene description 1 and scene description 2
5D binary asset D associated with both scene description 1 and scene description 2
6 translation process
Figure 3 — Architecture for scene graph translation
In this figure, scene description 2 is derived via a translation process (6) from scene description 1 and its
corresponding IMS metadata. Scene description 1 is comprised of five parts labelled 1A, 1B, 1C, 1D, and
1E. Each part is annotated with corresponding IMS metadata. Associated with scene description 1 are four
binary assets, individually labelled 5A, 5B, 5C, and 5D. Assets 5A, 5B, 5C, and 5D may optionally be
annotated. The combined IMS metadata and parts comprising scene description 1 are labelled as 3.
Component 3 serves as input to a translation process (6) that is guided by the IMS metadata to produce
output 4, which is scene description 2. Scene description 2 is also comprised of five parts, although there
need not be a one-to-one correspondence between the number of input scene description parts and the
number of output scene description parts.
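The translation process (6) can be sketched as below. This is an informative illustration under the assumption that each input part carries an IMS label and that the target format supplies its own mapping from IMS labels to native constructs; the mapping entries and the "LambertMaterial"/"ColorTexture" names are hypothetical.

```python
# Illustrative sketch of the IMS-guided translation process (6) in Figure 3.

# Hypothetical mapping from IMS labels to constructs of output format 2.
target_mapping = {
    "ims.logical.material.diffuse": {"format2Type": "LambertMaterial"},
    "ims.logical.texture.rgbColour": {"format2Type": "ColorTexture"},
}

def translate(parts):
    """Translate annotated parts of scene description 1 into scene description 2."""
    out = []
    for part in parts:
        native = target_mapping.get(part["imsLabel"])
        if native is None:
            # No counterpart; a real translator would report or approximate.
            continue
        out.append({**native, "source": part["id"]})
    return out

scene1_parts = [
    {"id": "1A", "imsLabel": "ims.logical.material.diffuse"},
    {"id": "1B", "imsLabel": "ims.logical.texture.rgbColour"},
]
scene2_parts = translate(scene1_parts)
print(scene2_parts[0]["format2Type"])  # LambertMaterial
```

As noted above, the number of output parts need not equal the number of input parts; here a part whose label has no counterpart in the target mapping is simply skipped.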
5 Independent mapping space (IMS)
5.1 General
The IMS defines a vocabulary and corresponding set of labels that can be used to describe scene graphs to
aid in the translation of one scene graph format into another, or to aid in the distribution of a scene graph
to a particular rendering interface. While the descriptions and terms that comprise the IMS are sufficiently
dense and robust to describe a scene graph capable of representing photorealistic scenes, the IMS may also
be used to describe scene representations that are not designed to represent photorealistic visualizations.
5.2 Architecture of IMS
The IMS is comprised of the following three architectural components that describe:
— the logical organization of geometric objects and their properties within the scene;
NOTE 1 In general, immersive media may be comprised of media that is created to stimulate any of the human
senses, e.g. the visual and auditory senses of humans. Scenes are likewise organized according to the types of media
that stimulate human senses, i.e. visual and auditory scenes. While there are different types of scenes that are
used to create immersive media, this document is currently limited to defining an IMS for visual scenes.
— the physical organization of the scene, e.g., for the purposes of access and storage of binary media within
the scene;
— processing directions for the presentation engine, e.g., for rendering or animation.
NOTE 2 A single scene graph can contain multiple sets of render processing directions, each set relevant to the
interface for a particular renderer.
Each of the IMS architectural components is comprised of systems, each of which describes a particular
system node type for which multiple subtypes can be further defined.
NOTE 3 For example, a “lighting” system node type can define multiple subtypes of “lighting” each of which can
satisfy the requirements of another system node that requires information about lighting.
Each system node type may be regarded as a process that is executed by a presentation engine. Each such
process produces an output that has a type consistent with its definition, e.g. lighting. Likewise, some of
the node types have input requirements that are described in terms of other node types. As an example, a
render instruction node type requires input information of node type "geometry" and node type "camera",
at a minimum, in order to produce a rendered output.
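The input-requirement rule described above can be sketched as a simple check. This is an informative, assumed processing model: the node type name "renderInstruction" and the function are illustrative, not normative identifiers.

```python
# Illustrative sketch of the processing model in 5.2: each node type is a
# process with typed inputs; a render instruction node requires, at a minimum,
# inputs of node type "geometry" and node type "camera".

REQUIRED_INPUTS = {"renderInstruction": {"geometry", "camera"}}

def can_execute(node_type: str, available_input_types: set) -> bool:
    """True if all required input node types for node_type are satisfied."""
    return REQUIRED_INPUTS.get(node_type, set()) <= available_input_types

print(can_execute("renderInstruction", {"geometry"}))            # False
print(can_execute("renderInstruction", {"geometry", "camera"}))  # True
```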
5.3 IMS systems
5.3.1 General
This subclause specifies the individual systems that comprise the IMS architectural components for logical
organization, physical organization, and processing directions. Associated with each system is a type, label,
description, and hypothetical rendering semantics, i.e., a processing model, for how a renderer can interpret
a node belonging to each system.
NOTE These systems are designed to mirror the structure, representation, and encoding of scene-based media
as specified by the ITMF Scene Graph Specification,[3] ITMF Data Encoding Specification,[4] and ITMF Container
Specification.[5]
Associated with each system are one or more nodes that belong to the system. Figure 4 provides a canonical
representation of how the IMS systems relate to each other.
Key
1 canonical IMS values (e.g., integer, enumeration, float, …)
2 IMS systems for logical organization, physical organization, and processing directions
3 input or output pin
4 attribute
5 node graph
6 logical organization of geometry
7 processing directions
8 physical organization of scene graph
9 presentation engine (not a system in the IMS)
10 rendered output (not a system in the IMS)
Figure 4 — Canonical representation of IMS geometric objects within the scene
An IMS value (item 1) represents an input value from the value node system to any one of the other IMS
systems (item 2) including systems designated for logical organization, physical organization, and
processing directions for the scene.
Input or output relationships between any of the system nodes are established with IMS data pins (item
3). Attributes represent other IMS system nodes or value nodes that are considered to be immutable (not
changeable by rendering processes). A collection of system nodes is designated to describe the logical
organization of the geometry (item 6); processing directions (item 7); and physical organization (item 8)
in the scene. A node graph (item 5) represents a collection of other IMS system nodes that may also include
attributes or render-specific metadata (not in scope of this document). Inputs to a presentation engine (item
9) include information about the logical organization, processing directions, and physical organization (not
shown in the figure) to produce a rendered output (item 10).
5.3.2 IMS logical organization component
IMS systems used to describe the logical organization component of a scene are identified by a label with a
prefix of "ims.logical". Table 1 specifies the IMS systems that comprise the logical organization component.
Table 1 — Summary of IMS systems for logical organization of the scene
| System node type | IMS system label | Description | Hypothetical renderer semantics |
| --- | --- | --- | --- |
| system value node | ims.logical.sysVal | An input value (e.g. integers, single-precision floats, spatial coordinates, vectors) to an associated node. | Values provided by system value nodes are used by renderers for processing of other nodes to which the values are associated. |
| material node | ims.logical.material | Surface properties of individual geometric objects or groups of geometric objects in the scene. | Materials provide characteristics (e.g. porousness, bumps, metallic properties) for the surfaces of individual objects that are rendered in the scene. |
| texture node | ims.logical.texture | Colour and brightness for a surface corresponding to a geometric object or group of geometric objects in the scene. | Textures define individual colours and brightness for surfaces of individual objects that are rendered in the scene. |
| geometry node | ims.logical.geometry | Geometric shape of an object or group of objects in a scene or geometry archive. | Geometry defines the 3D shape or volume of individual objects that are rendered in the scene. |
| surface (medium) node | ims.logical.surface | A description of how light is reflected, refracted, or absorbed by an object. | The behaviour of individual rays in ray-tracing algorithms is characterized by formulae provided in surface nodes. |
| transformation node | ims.logical.transform | A description of how geometric objects are scaled, translated, or rotated. | The parameters are used to transform individual geometric objects within a scene. |
| connection pins | ims.logical.pins.input (input pin); ims.logical.pins.output (output pin) | Parameters used as input to a node, or a single output from a node, i.e. input data pin(s) or output data pin. | One or more data pins are inputs to a node, or a single data pin is an output from a node. |
| data attributes | ims.logical.attribute | Non-mutable characteristics, features, or values of or for an object. | The renderer does not allow other nodes to alter attribute values during serialization. |
| scene object | ims.logical.sceneObject | A geometric object, possibly animated, created using a digital content creation tool, and included in a composited scene. | Denotes use of Alembic,[9] Universal Scene Description (USD), glTF, Filmbox, or other formats as individual geometry assets, possibly for a larger scene. |
5.3.2.1 Value nodes system
Table 2 specifies the types of values and labels that comprise the value nodes system within the IMS. IMS
labels for value nodes for use within the logical organization of the scene are identified by an IMS prefix of
"ims.logical.sysVal". The ITMF Data Encoding Specification[4] provides additional information regarding usage
of the value nodes system. Annex A of this document provides information about how to use the ITMF Data
Encoding Specification with the IMS.
Table 2 — Nodes for value node system for logical organization component
Node | ITMF node code point | Node label | IMS label
float value 6 float ims.logical.sysVal.float
int value 9 integer ims.logical.sysVal.integer
boolean value 11 boolean ims.logical.sysVal.boolean
image resolution 12 imageResolution ims.logical.sysVal.imageResolution
sun direction 30 sunDirection ims.logical.sysVal.sunDirection
enumeration 57 enumeration ims.logical.sysVal.enumeration
annotation 68 annotation ims.logical.sysVal.annotation
string 84 string ims.logical.sysVal.string
file 88 file ims.logical.sysVal.file
directory 101 directoryName ims.logical.sysVal.directoryName
bit mask 132 bitMask ims.logical.sysVal.bitMask
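The correspondence in Table 2 between ITMF node code points and IMS labels can be used as a direct lookup. The following informative sketch transcribes a subset of the table; the table itself remains the authoritative source, and the function name is illustrative.

```python
# Illustrative lookup from ITMF node code points to IMS labels for the value
# node system (subset transcribed from Table 2).

VALUE_NODE_LABELS = {
    6: "ims.logical.sysVal.float",
    9: "ims.logical.sysVal.integer",
    11: "ims.logical.sysVal.boolean",
    57: "ims.logical.sysVal.enumeration",
    84: "ims.logical.sysVal.string",
}

def ims_label_for_code_point(code_point: int) -> str:
    """Return the IMS label registered for an ITMF value node code point."""
    return VALUE_NODE_LABELS[code_point]

print(ims_label_for_code_point(9))  # ims.logical.sysVal.integer
```

An annotation tool can use such a table-driven lookup when it stores IMS metadata alongside ITMF-encoded nodes, as outlined in Annex A.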
5.3.2.2 Material nodes system
Table 3 specifies the node names and corresponding labels that comprise the material nodes system within
the IMS. IMS labels for material nodes are identified by an IMS prefix of "ims.logical.material". The ITMF
Data Encoding Specification provides additional information regarding usage of the material nodes system.
Annex A of this document provides information about how to use the ITMF Data Encoding Specification with
the IMS.
Table 3 — Nodes for material nodes system
Node | ITMF node code point | Node label | IMS label
material map 2 map ims.logical.material.map
glossy material 16 glossy ims.logical.material.glossy
diffuse material 17 diffuse ims.logical.material.diffuse
specular material 18 specular ims.logical.material.specular
mix material 19 mix ims.logical.material.mix
portal material 20 portal ims.logical.material.portal
metallic material 120 metallic ims.logical.material.metallic
toon material 121 toon ims.logical.material.toon
toon ramp material 122 toonRamp ims.logical.material.toonRamp
universal material 130 universal ims.logical.material.universal
composite material 138 composite ims.logical.material.composite
specular material layer 139 specularMaterialLayer ims.logical.material.specularMaterialLayer
diffuse material layer 140 diffuseMaterialLayer ims.logical.material.diffuseMaterialLayer
metallic material layer 141 metallicMaterialLayer ims.logical.material.metallicMaterialLayer
sheen material layer 142 sheenMaterialLayer ims.logical.material.sheenMaterialLayer
layered material 143 layered ims.logical.material.layered
material layer group 144 materialLayerGroup ims.logical.material.materialLayerGroup
shadow catcher 145 shadowCatcher ims.logical.material.shadowCatcher
hair material 147 hair ims.logical.material.hair
null material 159 null ims.logical.material.null
clipping material 178 clipping ims.logical.material.clipping
5.3.2.3 Texture nodes system
Table 4 specifies the node names and corresponding labels that comprise the texture nodes system within
the IMS. IMS labels for texture nodes are identified by an IMS prefix of ‘ims.logical.texture’. The ITMF Data
Encoding Specification provides additional information regarding usage of the texture nodes system.
Annex A of this document provides information about how to use the ITMF Data Encoding Specification with
the IMS.
Table 4 — Nodes for texture nodes system
Node | ITMF node code point | Node label | IMS label
turbulence texture 22 turbulence ims.logical.texture.turbulence
grayscale colour 31 grayscaleColour ims.logical.texture.grayscaleColour
gaussian spectrum 32 gaussianSpectrum ims.logical.texture.gaussianSpectrum
RGB colour 33 rgbColour ims.logical.texture.rgbColour
RGB image 34 rgbImage ims.logical.texture.rgbImage
alpha image 35 alphaImage ims.logical.texture.alphaImage
grayscale image 36 grayscaleImage ims.logical.texture.grayscaleImage
mix texture 38 mix ims.logical.texture.mix
multiply texture 39 multiply ims.logical.texture.multiply
cosine mix texture 40 cosineMix ims.logical.texture.cosineMix
clamp texture 41 clamp ims.logical.texture.clamp
saw wave texture 42 sawWave ims.logical.texture.sawWave
triangle wave texture 43 triangleWave ims.logical.texture.triangleWave
sine wave texture 44 sineWave ims.logical.texture.sineWave
checks texture 45 checks ims.logical.texture.checks
invert texture 46 invert ims.logical.texture.invert
marble texture 47 marble ims.logical.texture.marble
rigid fractal texture 48 rigidFractal ims.logical.texture.rigidFractal
gradient map texture 49 gradientMap ims.logical.texture.gradientMap
falloff map 50 falloffMap ims.logical.texture.falloffMap
colour correction 51 colourCorrection ims.logical.texture.colourCorrection
dirt texture 63 dirt ims.logical.texture.dirt
cylindrical 74 cylindrical ims.logical.texture.cylindrical
linear XYZ to UVW 75 linearXYZtoUVW ims.logical.texture.linearXYZtoUVW
perspective 76 perspective ims.logical.texture.perspective
spherical 77 spherical ims.logical.texture.spherical
mesh UV 78 meshUV ims.logical.texture.meshUV
box 79 box ims.logical.texture.box
displacement 80 displacement ims.logical.texture.displacement
random colour texture 81 randomColour ims.logical.texture.randomColour
noise texture 87 noise ims.logical.texture.noise
polygon side 89 polygonSide ims.logical.texture.polygonSide
vertex displacement 97 vertexDisplacement ims.logical.texture.vertexDisplacement
W-coordinate 104 wCoordinate ims.logical.texture.wCoordinate
add texture 106 add ims.logical.texture.add
comparison 107 comparison ims.logical.texture.comparison
subtract texture 108 subtract ims.logical.texture.subtract
triplanar map 109 triplanarMap ims.logical.texture.triplanarMap
triplanar 111 triplanar ims.logical.texture.triplanar
instance colour 113 instanceColour ims.logical.texture.instanceColour
instance range 114 instanceRange ims.logical.texture.instanceRange
baking texture 115 baking ims.logical.texture.baking
OSL texture 117 oslTexture ims.logical.texture.oslTexture
UVW transform 118 uvwTransform ims.logical.texture.uvwTransform
OSL 125 osl ims.logical.texture.osl
OSL delayed UV 127 oslDelayedUV ims.logical.texture.oslDelayedUV
image tiles 131 imageTiles ims.logical.texture.imageTiles
colour vertex attribute texture 135 colourVertexAttribute ims.logical.texture.colourVertexAttribute
grayscale colour vertex attribute texture 136 grayscaleColourVertexAttribute ims.logical.texture.grayscaleColourVertexAttribute
vertex displacement mixer 151 vertexDisplacementMixer ims.logical.texture.vertexDisplacementMixer
spotlight distribution 158 spotlightDistribution ims.logical.texture.spotlightDistribution
Cinema4D noise texture 162 cinema4dNoise ims.logical.texture.cinema4dNoise
chaos texture 170 chaos ims.logical.texture.chaos
channel picker 171 channelPicker ims.logical.texture.channelPicker
channel merger 172 channelMerger ims.logical.texture.channelMerger
ray switch 173 raySwitch ims.logical.texture.raySwitch
channel inverter 174 channelInverter ims.logical.texture.channelInverter
channel mapper 175 channelMapper ims.logical.texture.channelMapper
composite texture 176 composite ims.logical.texture.composite
mesh UV 177 meshUV ims.logical.texture.meshUV
iridescent texture 187 iridescent ims.logical.texture.iridescent
volume to texture 256 volumeToTexture ims.logical.texture.volumeToTexture
colour to UVW 258 colourToUVW ims.logical.texture.colourToUVW
smooth Voronoi contours texture 260 smoothVoronoiContours ims.logical.texture.smoothVoronoiContours
tile patterns texture 261 tilePatterns ims.logical.texture.tilePatterns
procedural effects texture 262 proceduralEffects ims.logical.texture.proceduralEffects
chainmail texture 263 chainmail ims.logical.texture.chainmail
Moire mosaic texture 264 moireMosaic ims.logical.texture.moireMosaic
colour squares texture 265 colourSquares ims.logical.texture.colourSquares
stripes texture 266 stripes ims.logical.texture.stripes
flakes texture 267 flakes ims.logical.texture.flakes
fractal texture 268 fractal ims.logical.texture.fractal
hagelslag texture 269 hagels
...
ISO/IEC FDIS 23090-28
ISO/IEC JTC 1/SC 29
Secretariat: JISC
Date: 2025-10-30
Information technology — Coded representation of immersive
media —
Part 28:
Interchangeable scene-based media representations
Technologies de l'information — Représentation codée de média immersifs —
Partie 28: Représentations de médias basés sur des scènes interchangeables
FDIS stage
ISO/IEC FDIS 23090-28:2025(en)
© ISO/IEC 2025
All rights reserved. Unless otherwise specified, or required in the context of its implementation, no part of this publication
may be reproduced or utilized otherwise in any form or by any means, electronic or mechanical, including photocopying,
or posting on the internet or an intranet, without prior written permission. Permission can be requested from either ISO
at the address below or ISO’s member body in the country of the requester.
ISO copyright office
CP 401 • Ch. de Blandonnet 8
CH-1214 Vernier, Geneva
Phone: + 41 22 749 01 11
E-mail: copyright@iso.org
Website: www.iso.org
Published in Switzerland
Contents
Foreword
Introduction
1 Scope
2 Normative references
3 Terms and definitions
3.1 General
3.2 Abbreviated terms
4 Architectures for scene-based media
4.1 General
4.2 Architecture for scene graph representations
4.3 Scenes nested within scenes
4.4 Architecture for annotated scene graph using IMS
4.5 Architecture for translation between scene graph representations using IMS
5 Independent mapping space (IMS)
5.1 General
5.2 Architecture of IMS
5.3 IMS systems
Annex A (normative) Annotation using IMS and ITMF
Annex B (normative) Mapping of IMS to glTF 2.0
Annex C (normative) Annotation process for glTF
Bibliography
Foreword
ISO (the International Organization for Standardization) and IEC (the International Electrotechnical
Commission) form the specialized system for worldwide standardization. National bodies that are members of
ISO or IEC participate in the development of International Standards through technical committees
established by the respective organization to deal with particular fields of technical activity. ISO and IEC
technical committees collaborate in fields of mutual interest. Other international organizations, governmental
and non-governmental, in liaison with ISO and IEC, also take part in the work.

The procedures used to develop this document and those intended for its further maintenance are described
in the ISO/IEC Directives, Part 1. In particular, the different approval criteria needed for the different types of
document should be noted. This document was drafted in accordance with the editorial rules of the ISO/IEC
Directives, Part 2 (see www.iso.org/directives or www.iec.ch/members_experts/refdocs).

ISO and IEC draw attention to the possibility that the implementation of this document may involve the use
of (a) patent(s). ISO and IEC take no position concerning the evidence, validity or applicability of any claimed
patent rights in respect thereof. As of the date of publication of this document, ISO and IEC had not received
notice of (a) patent(s) which may be required to implement this document. However, implementers are
cautioned that this may not represent the latest information, which may be obtained from the patent database
available at www.iso.org/patents and https://patents.iec.ch. ISO and IEC shall not be held responsible for
identifying any or all such patent rights.

Any trade name used in this document is information given for the convenience of users and does not
constitute an endorsement.

For an explanation of the voluntary nature of standards, the meaning of ISO specific terms and expressions
related to conformity assessment, as well as information about ISO's adherence to the World Trade
Organization (WTO) principles in the Technical Barriers to Trade (TBT), see www.iso.org/iso/foreword.html.
In the IEC, see www.iec.ch/understanding-standards.

This document was prepared by Joint Technical Committee ISO/IEC JTC 1, Information technology,
Subcommittee SC 29, Coding of audio, picture, multimedia and hypermedia information.

A list of all parts in the ISO/IEC 23090 series can be found on the ISO website.

Any feedback or questions on this document should be directed to the user's national standards body. A
complete listing of these bodies can be found at www.iso.org/members.html.
Introduction
Emerging display technologies capable of creating fully formed holograms, e.g. 3D aerial images comprised of
waveforms emitted from a display panel surface, require input source media that are sufficiently dense in
information such that each of the objects in a holographic visual scene is described in terms of its complete
and explicit geometry, and surface properties to characterize how the surface responds to the presence of
light. Such visual information can furthermore enable display technologies to produce visual scenes of
objects that cannot be distinguished from the same objects when viewed in the natural world; i.e. the visual
information of the media is sufficiently dense that the display can produce a photorealistic result.
Commonly used raster media formats such as those specified in ITU-R BT.601, ITU-R BT.709, ITU-R BT.2020,
ITU-R BT.2100 were originally developed to efficiently provide information sufficient for 2D displays to
produce 2D visual presentations, albeit with varying degrees of spatial resolutions, frame rates, and pixel
depth. These raster formats, however, do not provide sufficient information, in a practical or efficient
manner, to enable a holographic display to produce holograms. As an alternative to raster media, 3D graphics
formats used in digital content creation tools and production workflows offer a solution to provide sufficiently
dense information to enable photorealistic results where raster formats do not. The use of 3D graphics
technologies hence serves as a fundamental technology in enabling immersive displays to create their optimal
visual presentations.
To date, immersive displays provide a media input interface that is capable of ingesting 3D graphics formats
for presentation by the display. Such interfaces can be enabled and shipped as part of the onboard
processing features in the display by use of game engine software development kits. However, 3D scene media
that are targeted for processing by the display’s choice of renderer must first be translated into a format that
is consistent with the internal scene representations of the renderer. To facilitate the translation and
distribution of 3D scene media into the renderer interface provided by the display, an independent mapping
space (IMS) is specified as a part of the ISO/IEC 23090 series.
At the core of the IMS is a vocabulary of terms and definitions that can be used to annotate a scene graph or
scene asset with metadata, i.e., to facilitate the translation of media from one format to another. Such
translation from one format to another format is a commonly encountered challenge in the distribution of
scene-based media, as a number of formats have emerged to gain popularity in various application areas, e.g.,
gaming, social media, e-commerce, and content production. The corresponding vocabulary of terms and
definitions describes nodes, media formats, media attributes, and render processing instructions of
commonly-used scene graph formats and media representations. Elements from the IMS can be used to
annotate a scene graph or media format so that its translation to another format is more straightforward and
consistently performed. Thus, the IMS facilitates the distribution and interoperability of a variety of scene-
based media into render applications that are limited in the types of formats that they are designed to support.
Another application of this document is to guide the translation of media assets for use across a diverse set of
applications comprising the Metaverse, e.g., gaming, social media, retail applications. In the absence of a
common format that specifies media for use across the variety of Metaverse applications, the IMS can facilitate the translation from one asset representation to another.
© ISO/IEC 2025 – All rights reserved
ISO/IEC FDIS 23090-28:2025(en)
Information technology — Coded representation of immersive media —
Part 28:
Interchangeable scene-based media representations
1 Scope
This document specifies an architecture and data model for interchange of scene-based media for use in
systems and applications that employ renderers for the presentation of immersive media, including audio,
visual, tactile and other representations of media types.
The architecture and data model specified in this document apply to use cases where there is a need for
interchange of scene-based media for presentation by systems with 3D render-based technologies. One
specific application of this document is immersive display technologies that utilize a real-time renderer of 3D
media to create their visual presentations, as opposed to a video decoder and renderer of 2D raster-based
media. Another application of this document is for the distribution of media assets for use across a large
variety of applications comprising the Metaverse.
2 Normative references
There are no normative references in this document.
3 Terms and definitions
3.1 General
For the purposes of this document, the following terms and definitions apply.
ISO and IEC maintain terminology databases for use in standardization at the following addresses:
— ISO Online browsing platform: available at https://www.iso.org/obp
— IEC Electropedia: available at https://www.electropedia.org/
3.1.1
scene graph
general data structure commonly used by vector-based graphics editing applications and modern computer
games, that arranges the logical and often (but not necessarily) spatial representation of a graphical scene; a
collection of nodes and vertices in a graph structure
3.1.2
immersive technology media format
ITMF
suite of specifications for the Immersive Technology Media Format developed by the Immersive Digital Experiences Alliance
3.1.3
node
fundamental element of the scene graph comprised of information related to the logical or spatial or temporal
representation of visual or audio information
3.1.4
node graph
collection of nodes that can be positioned or modified as a single unit and linked to the remainder of the scene
graph by linker nodes
3.1.5
node type
general category of node that performs a certain function; for example, a camera node type can be a panoramic lens camera node, an open shading language (OSL) node, or another type of camera
3.1.6
attribute
metadata associated with a node used to describe a particular characteristic or feature of a node either in a
canonical or more complex form (e.g. in terms of another node)
3.1.7
container
serialized format to store and exchange information to represent all natural, all synthetic, or a
mixture of synthetic and natural scenes including a scene graph and all of the media resources that are
required for rendering of the scene
3.1.8
serialization
process of translating data structures or object state into a format that can be stored (for example, in a file or
memory buffer) or transmitted (for example, across a network connection link) and reconstructed later
(possibly in a different computer environment)
Note 1 to entry: When the resulting series of bits is reread according to the serialization format, it can be
used to create a semantically identical clone of the original object.
3.1.9
renderer
software-based application or process, based on a selective mixture of disciplines related to acoustic physics, light physics, visual perception, audio perception, mathematics, and software development, that, given an input scene graph and asset container, emits a visual and/or audio signal suitable for presentation on a targeted device or conforming to the desired properties as specified by attributes of a render target node in the scene graph
Note 1 to entry: For visual-based media assets, a renderer may emit a visual signal suitable for a targeted display, or for storage as an intermediate asset (e.g. repackaged into another container for use in a series of rendering processes in a graphics pipeline); for audio-based media assets, a renderer may emit an audio signal for presentation via multi-channel loudspeakers and/or binauralized headphones, or for repackaging into another (output) container.
3.1.10
scene-based media
audio, visual, haptic, and other primary types of media and media-related information organized logically and spatially by use of a scene graph
3.1.11
RGB
referring to an additive colour model in which red, green, and blue are the primary colours
Note 1 to entry: The abbreviation RGB is derived from the first letter of each of the primary colours.
3.1.12
asset
one of potentially multiple objects comprising the media for a scene in scene-based media; each object fully
formed and packaged such that it can be rendered separately from other objects in the scene
Note 1 to entry: Individual assets may be shared, e.g. reused across multiple scenes.
3.1.13
pin
input parameter to a node, or an output from a node
3.1.14
this document
referring to this International Standard
3.1.15
independent mapping space label
IMS label
sequence of alphanumeric characters using dot notation comprised of an IMS prefix followed by a node label, which may optionally be followed by labels derived from the ITMF Data Encoding Specification [4]
3.1.16
independent mapping space prefix
IMS prefix
sequence of alphanumeric characters using dot notation comprised of "ims" followed by an alphanumeric
label to denote the IMS component, subsequently followed by an alphanumeric label to denote the IMS system
for a particular node
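The dot-notation grammar of 3.1.15 and 3.1.16 can be sketched in a few lines. The helper name build_ims_label and the set of component names below are illustrative assumptions (only the "logical" component appears in the labels specified in this document); they are not defined by this document.

```python
# Illustrative sketch of the IMS label grammar (3.1.15, 3.1.16).
# An IMS prefix is "ims" + component label + system label; an IMS label
# appends a node label, optionally followed by ITMF-derived labels.

IMS_COMPONENTS = {"logical", "physical", "processing"}  # assumed component names

def build_ims_label(component: str, system: str, node: str, *itmf_labels: str) -> str:
    """Assemble a dot-notation IMS label from its parts (hypothetical helper)."""
    if component not in IMS_COMPONENTS:
        raise ValueError(f"unknown IMS component: {component}")
    parts = ["ims", component, system, node, *itmf_labels]
    if not all(p.isalnum() for p in parts):
        raise ValueError("IMS label parts are alphanumeric")
    return ".".join(parts)

print(build_ims_label("logical", "material", "glossy"))  # ims.logical.material.glossy
```

For example, build_ims_label("logical", "sysVal", "float") yields "ims.logical.sysVal.float", matching the value-node labels of Table 2.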
3.2 Abbreviated terms
2D two dimensional
3D three dimensional
AOV arbitrary output variables
IDEA Immersive Digital Experiences Alliance
ITMF Immersive Technology Media Format
IMS independent mapping space
OSL Open Shading Language
RGB red, green, blue
USD Universal Scene Description
glTF graphics language Transmission Format
4 Architectures for scene-based media
4.1 General
This clause illustrates architectures related to the organization of scene-based media and how such media can be annotated by a process that stores IMS metadata into the media. Furthermore, this clause provides a hypothetical architecture for a process that translates one scene-based media format to another.
A more in-depth description of these architectures is provided in ISO/IEC TR 23090-27 [2].
NOTE This document references only some of the architectures in ISO/IEC TR 23090-27 [2] for the purposes of providing additional context.
4.2 Architecture for scene graph representations
Figure 1 illustrates an architecture for the physical representation of scene graphs for scene-based immersive media.
NOTE The same or closely similar figure can also be drawn to illustrate a logical representation of scene-based media.
Key
1 binary container for immersive media
2 human-readable or binary scene description
3 media asset #1
4 media asset #2
5 media asset #N
6 network
7 media asset #K
8 media asset #M
9 network-based reference from scene description to asset #K
10 local reference from scene description to asset #N stored within binary container
11 local reference from scene description to asset #1 stored within binary container
12 local reference from scene description to asset #2 stored within binary container
13 external storage reference from scene description to asset #M
Figure 1 — Architecture for scene graph representations of scene-based immersive media
In this illustration of a scene graph architecture, the description of the scene (2), including spatial, temporal, or logical relationships between objects, is stored in a human-readable format (e.g. XML [7] or JSON [13]). This human-readable portion of the scene graph comprises the information that may be annotated with metadata from the IMS. The scene is further comprised of binary assets that are stored adjacent to the human-readable description in a "container" (1), such as files that are combined, stored, and compressed into a single archive file format (similar to files with a zip archive extension), or assets that are stored externally from the container, e.g. either in a local or remote access network.
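The container layout described above, a human-readable scene description packaged alongside binary assets in a zip-like archive, with some assets referenced over a network, can be sketched as follows. The file names, the JSON shape, and the use of zip are illustrative assumptions; this document does not mandate a particular archive or carrier syntax.

```python
# Sketch of the container layout in Figure 1: a human-readable scene
# description stored with locally packaged binary assets, plus a
# network-referenced asset resolved separately by the reader.
import io
import json
import zipfile

scene_description = {
    "nodes": [
        {"name": "asset1", "uri": "assets/asset1.bin"},              # local reference
        {"name": "assetK", "uri": "https://example.com/assetK.bin"}, # network reference
    ],
}

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as container:
    container.writestr("scene.json", json.dumps(scene_description))
    container.writestr("assets/asset1.bin", b"\x00\x01")  # binary asset stored in archive

# A reader resolves local references inside the archive; network
# references are fetched out of band.
with zipfile.ZipFile(buf) as container:
    desc = json.loads(container.read("scene.json"))
    local = [n for n in desc["nodes"] if not n["uri"].startswith("http")]
```

Nested scenes (see 4.3) fit the same sketch: any entry under assets/ could itself be another archive of this shape.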
4.3 Scenes nested within scenes
An important aspect of scene graph architectures is that there is no explicit restriction within this document that prohibits an individual asset from comprising an entirely self-described scene as depicted in Figure 1. For example, referring to the architecture depicted in Figure 1, assets A, B, K, M, and N may themselves be scenes stored in individual containers. Furthermore, these assets may likewise be annotated with IMS metadata.
4.4 Architecture for annotated scene graph using IMS
Figure 2 illustrates an example of a scene graph that is annotated with metadata from the IMS. In the figure, items 1A, 1B, 1C, 1D, and 1E refer to individual portions of the human-readable description of the scene.
Each such portion may provide information related to the geometry or structure of the scene, and is separately
annotated with IMS metadata for that particular part of the scene. The figure also illustrates that the scene
contains four binary assets labelled: 2A, 2B, 2C, and 2D. The assets themselves may or may not be separately
annotated with IMS metadata. Furthermore, the assets themselves may be individual scenes.
Key
1A portion of human-readable scene description, e.g. XML or JSON, annotated with IMS metadata
1B portion of human-readable scene description, e.g. XML or JSON, annotated with IMS metadata
1C portion of human-readable scene description, e.g. XML or JSON, annotated with IMS metadata
1D portion of human-readable scene description, e.g. XML or JSON, annotated with IMS metadata
1E portion of human-readable scene description, e.g. XML or JSON, annotated with IMS metadata
2A binary asset A optionally annotated with IMS metadata
2B binary asset B optionally annotated with IMS metadata
2C binary asset C optionally annotated with IMS metadata
2D binary asset D optionally annotated with IMS metadata
Figure 2 — Architecture for scene graph annotated with IMS metadata
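The annotation pattern of Figure 2 can be sketched with a JSON-shaped structure in which each portion of the scene description, and optionally each binary asset, carries an IMS label. The "ims" key and the overall shape are illustrative assumptions; this document does not prescribe how IMS metadata is carried in a given format.

```python
# Sketch of Figure 2: portions of a human-readable scene description
# annotated with IMS metadata; binary assets optionally annotated.
# The "ims" key and the JSON shape are illustrative only.
import json

scene = {
    "portions": {
        "1A": {"type": "mesh",     "ims": "ims.logical.geometry"},
        "1B": {"type": "material", "ims": "ims.logical.material.glossy"},
        "1C": {"type": "texture",  "ims": "ims.logical.texture.rgbImage"},
    },
    "assets": {
        "2A": {"uri": "assets/a.bin", "ims": "ims.logical.sceneObject"},  # annotated
        "2B": {"uri": "assets/b.bin"},                                    # not annotated
    },
}

# Which binary assets carry IMS annotations?
annotated = [name for name, asset in scene["assets"].items() if "ims" in asset]
print(annotated)  # ['2A']
```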
4.5 Architecture for translation between scene graph representations using IMS
A hypothetical architecture for translating one scene graph format to another scene graph format is illustrated in Figure 3.
Key
1A portion of human-readable scene description 1 annotated with IMS metadata
1B portion of human-readable scene description 1 annotated with IMS metadata
1C portion of human-readable scene description 1 annotated with IMS metadata
1D portion of human-readable scene description 1 annotated with IMS metadata
1E portion of human-readable scene description 1 annotated with IMS metadata
2A portion of human-readable scene description 2 translated with IMS metadata from 1A
2B portion of human-readable scene description 2 translated with IMS metadata from 1B
2C portion of human-readable scene description 2 translated with IMS metadata from 1C
2D portion of human-readable scene description 2 translated with IMS metadata from 1D
2E portion of human-readable scene description 2 translated with IMS metadata from 1E
3 scene description 1 annotated with IMS metadata
4 scene description 2 translated from scene description 1 and IMS metadata
5A binary asset A associated with both scene description 1 and scene description 2
5B binary asset B associated with both scene description 1 and scene description 2
5C binary asset C associated with both scene description 1 and scene description 2
5D binary asset D associated with both scene description 1 and scene description 2
6 translation process
Figure 3 — Architecture for scene graph translation
In this figure, scene description 2 is derived via a translation process (6) from scene description 1 and its corresponding IMS metadata. Scene description 1 is comprised of five parts labelled 1A, 1B, 1C, 1D, and 1E. Each part is annotated with corresponding IMS metadata. Associated with scene description 1 are four binary assets, individually labelled 5A, 5B, 5C, and 5D. Assets 5A, 5B, 5C, and 5D may optionally be annotated. The combined IMS metadata and parts comprising scene description 1 are labelled as 3. Component 3 serves as input to a translation process (6) that is guided by the IMS metadata to produce output 4, which is scene description 2. Scene description 2 is also comprised of five parts, although there does not need to be a one-to-one correspondence between the number of input scene description parts and the number of output scene description parts.
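The IMS-guided translation process (6) can be sketched as a lookup: each annotated portion of scene description 1 is mapped, via its IMS label, to a node of the target format. The mapping table and the target node names below are invented for illustration; a real translation depends entirely on the destination format.

```python
# Sketch of the translation process (6) in Figure 3: IMS labels attached
# to portions of scene description 1 select target-format nodes for
# scene description 2. Mapping entries and target names are hypothetical.

TARGET_NODE_FOR_IMS = {
    "ims.logical.geometry":        "Mesh",
    "ims.logical.material.glossy": "PbrMaterial",
    "ims.logical.texture.rgbImage": "Texture2D",
}

def translate(portions: dict) -> dict:
    """Produce target-format portions from IMS-annotated source portions."""
    out = {}
    for name, portion in portions.items():
        target = TARGET_NODE_FOR_IMS.get(portion["ims"])
        if target is None:
            continue  # no IMS-guided mapping; needs format-specific handling
        out[name] = {"node": target, **{k: v for k, v in portion.items() if k != "ims"}}
    return out

desc2 = translate({"1A": {"ims": "ims.logical.geometry", "src": "cube"}})
print(desc2)  # {'1A': {'node': 'Mesh', 'src': 'cube'}}
```

Note that nothing in this sketch forces a one-to-one correspondence between input and output parts, consistent with the text above.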
5 The independent mapping space (IMS)
5.1 General
The IMS defines a vocabulary and corresponding set of labels that can be used to describe scene graphs to aid
in the translation of a single scene graph format into another single scene graph format, or to aid in the
distribution of a scene graph to a particular rendering interface. While descriptions and terms that comprise
the IMS are sufficiently dense and robust to describe a scene graph capable of representing photorealistic
scenes, the IMS may also be used to describe scene representations that are not designed to represent
photorealistic visualizations.
5.2 Architecture of IMS
The IMS is comprised of the following three architectural components that describe:
— the logical organization of geometric objects and their properties within the scene;
NOTE 1 In general, immersive media can be comprised of media that is created to stimulate any of the human senses, e.g. the visual and auditory senses. Scenes are likewise organized according to the types of media that stimulate human senses, i.e. visual and auditory scenes. While there are different types of scenes that are used to create immersive media, this document is currently limited to defining an IMS for visual scenes.
— the physical organization of the scene, e.g. for the purposes of access and storage of binary media within the scene;
— processing directions for the presentation engine, e.g. for rendering or animation.
NOTE 2 A single scene graph can contain multiple sets of render processing directions, each set relevant to the interface for a particular renderer.
Each of the IMS architectural components is comprised of systems that describe a particular system node type, for which multiple subtypes can be further defined.
NOTE 3 For example, a "lighting" system node type can define multiple subtypes of "lighting", each of which can satisfy the requirements of another system node that requires information about lighting.
Each system node type may be regarded as a process that is executed by a presentation engine. Each of such
processes produces an output that has a type consistent with its definition, e.g., lighting. Likewise, some of the
node types have input requirements that are described in terms of other node types. As an example, a render instruction node type requires, at a minimum, input information of node type "geometry" and node type "camera" in order to produce a rendered output.
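The input-requirement model described above (each system node type as a process with typed inputs and a typed output) can be sketched with a small registry. The registry contents and helper name are illustrative; only the geometry-plus-camera minimum for a render instruction comes from the text.

```python
# Sketch of the node-type processing model: a node type can execute only
# when every input type it requires is available. The registry below is
# illustrative, not normative.

REQUIRED_INPUTS = {
    "renderInstruction": {"geometry", "camera"},  # minimum stated in the text
    "lighting": set(),                            # assumed: no required inputs
}

def can_execute(node_type: str, available_inputs: set) -> bool:
    """True if every required input type for the node type is available."""
    return REQUIRED_INPUTS.get(node_type, set()) <= available_inputs

print(can_execute("renderInstruction", {"geometry"}))            # False
print(can_execute("renderInstruction", {"geometry", "camera"}))  # True
```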
5.3 IMS systems
5.3.1 General
This subclause specifies the individual systems that comprise the IMS architectural components for logical
organization, physical organization, and processing directions. Associated with each system is a type, label,
description, and hypothetical rendering semantics, i.e. a processing model, for how a renderer can
interpret a node belonging to each system.
NOTE These systems are designed to mirror the structure, representation, and encoding of scene-based media as specified by the ITMF Scene Graph Specification [3], ITMF Data Encoding Specification, and ITMF Container Specification [5].
Associated with each system is one or more nodes that belong to the system. Figure 4 provides a canonical representation of how the IMS systems relate to each other.
Key
1 canonical IMS values (e.g., integer, enumeration, float, …)
2 IMS systems for logical organization, physical organization, and processing directions
3 input or output pin
4 attribute
5 node graph
6 logical organization of geometry
7 processing directions
8 physical organization of scene graph
9 presentation engine (not a system in the IMS)
10 rendered output (not a system in the IMS)
Figure 4 — Canonical representation of IMS geometric objects within the scene
An IMS value (item 1) represents an input value from the value node system to any one of the other IMS
systems (item 2) including systems designated for logical organization, physical organization, and processing
directions for the scene.
Input or output relationships between any of the system nodes are established with IMS data pins (item 3).
Attributes represent other IMS system nodes or value nodes that are considered to be immutable (not
changeable by rendering processes). A collection of system nodes is designated to describe the logical
organization of the geometry (item 6); processing directions (item 7); and physical organization (item 8) in
the scene. A node graph (item 5) represents a collection of other IMS system nodes that may also include
attributes or render-specific metadata (not in scope of this document). Inputs to a presentation engine (item
9) include information about the logical organization, processing directions, and physical organization
(not shown in the figure) to produce a rendered output (item 10).
5.3.2 IMS logical organization component
IMS systems used to describe the logical organization component of a scene are identified by a label with a prefix of "ims.logical". Table 1 specifies the IMS systems that comprise the logical organization component.
Table 1 — Summary of IMS systems for logical organization of the scene

system value node (ims.logical.sysVal)
Description: An input value (e.g. integers, single-precision floats, spatial coordinates, vectors) to an associated node.
Hypothetical renderer semantics: Values provided by system value nodes are used by renderers for processing of other nodes to which the values are associated.

material node (ims.logical.material)
Description: Surface properties of individual geometric objects or groups of geometric objects in the scene.
Hypothetical renderer semantics: Materials provide characteristics (e.g. porousness, bumps, metallic properties) for the surfaces of individual objects that are rendered in the scene.

texture node (ims.logical.texture)
Description: Colour and brightness for a surface corresponding to a geometric object or group of geometric objects in the scene.
Hypothetical renderer semantics: Textures define individual colours and brightness for surfaces of individual objects that are rendered in the scene.

geometry node (ims.logical.geometry)
Description: Geometric shape of an object or group of objects in a scene or geometry archive.
Hypothetical renderer semantics: Geometry defines the 3D shape or volume of individual objects that are rendered in the scene.

surface (medium) node (ims.logical.surface)
Description: A description of how light is reflected, refracted, or absorbed by an object.
Hypothetical renderer semantics: The behaviour of individual rays in ray-tracing algorithms is characterized by formulae provided in surface nodes.

transformation node (ims.logical.transform)
Description: A description of how geometric objects are scaled, translated, or rotated.
Hypothetical renderer semantics: The parameters are used to transform individual geometric objects within a scene.

connection pins (ims.logical.pins.input for input pins, ims.logical.pins.output for output pins)
Description: Parameters used as input to a node, or a single output from a node, i.e. input data pin(s) or output data pin.
Hypothetical renderer semantics: One or more data pins are inputs to a node, or a single data pin is an output from a node.

data attributes (ims.logical.attribute)
Description: Non-mutable characteristics, features, or values of or for an object.
Hypothetical renderer semantics: The renderer does not allow other nodes to alter attribute values during serialization.

scene object (ims.logical.sceneObject)
Description: A geometric object, possibly animated, created using a digital content creation tool, and included in a composited scene.
Hypothetical renderer semantics: Denotes use of Alembic [9], Universal Scene Description (USD), glTF, Filmbox, or other formats as individual geometry assets, possibly for a larger scene.
5.3.2.1 Value nodes system
Table 2 specifies the types of values and labels that comprise the value nodes system within the IMS. IMS labels for value nodes for use within the logical organization of the scene are identified by an IMS prefix of "ims.logical.sysVal". The ITMF Data Encoding Specification provides additional information regarding usage of the value nodes system. Annex A of this document provides information about how to use the ITMF Data Encoding Specification with the IMS.
Table 2 — Nodes for value node system for logical organization component
Node | ITMF node code point | Node label | IMS label
float value 6 float ims.logical.sysVal.float
int value 9 integer ims.logical.sysVal.integer
boolean value 11 boolean ims.logical.sysVal.boolean
image resolution 12 imageResolution ims.logical.sysVal.imageResolution
sun direction 30 sunDirection ims.logical.sysVal.sunDirection
enumeration 57 enumeration ims.logical.sysVal.enumeration
annotation 68 annotation ims.logical.sysVal.annotation
string 84 string ims.logical.sysVal.string
file 88 file ims.logical.sysVal.file
directory 101 directoryName ims.logical.sysVal.directoryName
bit mask 132 bitMask ims.logical.sysVal.bitMask
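The ITMF node code points in Table 2 can serve as lookup keys into the IMS value-node labels. The following sketch simply transcribes Table 2 into a dictionary; the helper name is illustrative and not part of this document.

```python
# Lookup from ITMF node code point (Table 2) to IMS value-node label.
# Entries transcribed from Table 2; helper name is hypothetical.
from typing import Optional

VALUE_NODE_BY_CODE_POINT = {
    6:   "ims.logical.sysVal.float",
    9:   "ims.logical.sysVal.integer",
    11:  "ims.logical.sysVal.boolean",
    12:  "ims.logical.sysVal.imageResolution",
    30:  "ims.logical.sysVal.sunDirection",
    57:  "ims.logical.sysVal.enumeration",
    68:  "ims.logical.sysVal.annotation",
    84:  "ims.logical.sysVal.string",
    88:  "ims.logical.sysVal.file",
    101: "ims.logical.sysVal.directoryName",
    132: "ims.logical.sysVal.bitMask",
}

def ims_label_for_code_point(code_point: int) -> Optional[str]:
    """Return the IMS value-node label for an ITMF code point, if known."""
    return VALUE_NODE_BY_CODE_POINT.get(code_point)

print(ims_label_for_code_point(84))  # ims.logical.sysVal.string
```

The same pattern applies to the material and texture node tables (Tables 3 and 4).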
5.3.2.2 Material nodes system
Table 3 specifies the node names and corresponding labels that comprise the material nodes system within the IMS. IMS labels for material nodes are identified by an IMS prefix of "ims.logical.material". The ITMF Data Encoding Specification provides additional information regarding usage of the material nodes system. Annex A of this document provides information about how to use the ITMF Data Encoding Specification with the IMS.
Table 3 — Nodes for material nodes system
Node | ITMF node code point | Node label | IMS label
material map 2 map ims.logical.material.map
glossy material 16 glossy ims.logical.material.glossy
diffuse material 17 diffuse ims.logical.material.diffuse
specular material 18 specular ims.logical.material.specular
mix material 19 mix ims.logical.material.mix
portal material 20 portal ims.logical.material.portal
metallic material 120 metallic ims.logical.material.metallic
toon material 121 toon ims.logical.material.toon
toon ramp material 122 toonRamp ims.logical.material.toonRamp
universal material 130 universal ims.logical.material.universal
composite material 138 composite ims.logical.material.composite
specular material layer 139 specularMaterialLayer ims.logical.material.specularMaterialLayer
diffuse material layer 140 diffuseMaterialLayer ims.logical.material.diffuseMaterialLayer
metallic material layer 141 metallicMaterialLayer ims.logical.material.metallicMaterialLayer
sheen material layer 142 sheenMaterialLayer ims.logical.material.sheenMaterialLayer
layered material 143 layered ims.logical.material.layered
material layer group 144 materialLayerGroup ims.logical.material.materialLayerGroup
shadow catcher 145 shadowCatcher ims.logical.material.shadowCatcher
hair material 147 hair ims.logical.material.hair
null material 159 null ims.logical.material.null
clipping material 178 clipping ims.logical.material.clipping
5.3.2.3 Texture nodes system
Table 4 specifies the node names and corresponding labels that comprise the texture nodes system within the IMS. IMS labels for texture nodes are identified by an IMS prefix of "ims.logical.texture". The ITMF Data Encoding Specification provides additional information regarding usage of the texture nodes system. Annex A of this document provides information about how to use the ITMF Data Encoding Specification with the IMS.
Table 4 — Nodes for texture nodes system
Node | ITMF node code point | Node label | IMS label
turbulence texture 22 turbulence ims.logical.texture.turbulence
grayscale colour 31 grayscaleColour ims.logical.texture.grayscaleColour
gaussian spectrum 32 gaussianSpectrum ims.logical.texture.gaussianSpectrum
RGB colour 33 rgbColour ims.logical.texture.rgbColour
RGB image 34 rgbImage ims.logical.texture.rgbImage
alpha image 35 alphaImage ims.logical.texture.alphaImage
grayscale image 36 grayscaleImage ims.logical.texture.grayscaleImage
mix texture 38 mix ims.logical.texture.mix
multiply texture 39 multiply ims.logical.texture.multiply
cosine mix texture 40 cosineMix ims.logical.texture.cosineMix
clamp texture 41 clamp ims.logical.texture.clamp
saw wave texture 42 sawWave ims.logical.texture.sawWave
triangle wave texture 43 triangleWave ims.logical.texture.triangleWave
sine wave texture 44 sineWave ims.logical.texture.sineWave
checks texture 45 checks ims.logical.texture.checks
invert texture 46 invert ims.logical.texture.invert
marble texture 47 marble ims.logical.texture.marble
rigid fractal texture 48 rigidFractal ims.logical.texture.rigidFractal
gradient map texture 49 gradientMap ims.logical.texture.gradientMap
falloff map 50 falloffMap ims.logical.texture.falloffMap
colour correction 51 colourCorrection ims.logical.texture.colourCorrection
dirt texture 63 dirt ims.logical.texture.dirt
cylindrical 74 cylindrical ims.logical.texture.cylindrical
linear XYZ to UVW 75 linearXYZtoUVW ims.logical.texture.linearXYZtoUVW
perspective 76 perspective ims.logical.texture.perspective
spherical 77 spherical ims.logical.texture.spherical
mesh UV 78 meshUV ims.logical.texture.meshUV
box 79 box ims.logical.texture.box
displacement 80 displacement ims.logical.texture.displacement
random colour texture 81 randomColour ims.logical.texture.randomColour
noise texture 87 noise ims.logical.texture.noise
polygon side 89 polygonSide ims.logical.texture.polygonSide
vertex displacement 97 vertexDisplacement ims.logical.texture.vertexDisplacement
W-coordinate 104 wCoordinate ims.logical.texture.wCoordinate
add texture 106 add ims.logical.texture.add
comparison 107 comparison ims.logical.texture.comparison
subtract texture 108 subtract ims.logical.texture.subtract
triplanar map 109 triplanarMap ims.logical.texture.triplanarMap
triplanar 111 triplanar ims.logical.texture.triplanar
instance colour 113 instanceColour ims.logical.texture.instanceColour
instance range 114 instanceRange ims.logical.texture.instanceRange
baking texture 115 baking ims.logical.texture.baking
OSL texture 117 oslTexture ims.logical.texture.oslTexture
UVW transform 118 uvwTransform ims.logical.texture.uvwTransform
OSL 125 osl ims.logical.texture.osl
OSL delayed UV 127 oslDelayedUV ims.logical.texture.oslDelayedUV
image tiles 131 imageTiles ims.logical.texture.imageTiles
colour vertex attribute texture 135 colourVertexAttribute ims.logical.texture.colourVertexAttribute
grayscale colour vertex attribute texture 136 grayscaleColourVertexAttribute ims.logical.texture.grayscaleColourVertexAttribute
vertex displacement mixer 151 vertexDisplacementMixer ims.logical.texture.vertexDisplacementMixer
spotlight distribution 158 spotlightDistribution ims.logical.texture.spotlightDistribution
Cinema4D noise texture 162 cinema4dNoise ims.logical.texture.cinema4dNoise
chaos texture 170 chaos ims.logical.texture.chaos
channel picker 171 channelPicker ims.logical.texture.channelPicker
channel merger 172 channelMerger ims.logical.texture.channelMerger
ray switch 173 raySwitch ims.logical.texture.raySwitch
channel inverter 174 channelInverter ims.logical.texture.channelInverter
channel mapper 175 channelMapper ims.logical.texture.channelMapper
composite texture 176 composite ims.logical.texture.composite
mesh UV 177 meshUV ims.logical.texture.meshUV
iridescent texture 187 iridescent ims.logical.texture.iridescent
volume to texture 256 volumeToTexture ims.logical.texture.volumeToTexture
colour to UVW 258 colourToUVW ims.logical.texture.colourToUVW
smooth Voronoi contours texture 260 smoothVoronoiContours ims.logical.texture.smoothVoronoiContours
tile patterns texture 261 tilePatterns ims.logical.texture.tilePatterns
procedural effects texture 262 proceduralEffects ims.logical.texture.proceduralEffects
chainmail texture 263 chainmail ims.logical.texture.chainmail
Moire mosaic texture 264 moireMosaic ims.logical.texture.moireMosaic
colour squares texture 265 colourSquares ims.logical.texture.colourSquares
stripes texture 266 stripes ims.logical.texture.stripes
flakes texture 267 flakes ims.logical.texture.flakes
fractal texture 268 fractal ims.logical.texture.fractal
hagelslag texture 269 hagelslag ims.logical.texture.hagelslag
glowing circle texture 270 glowingCircle ims.logical.texture.glowingCircle
curvature texture 271 curvature ims.logical.texture.curvature
cell noise texture 275 cellNoise ims.logical.texture.cellNoise
sample position to UV 317 samplePositionToUV ims.logical.texture.samplePositionToUV
composite texture layer 318 compositeTextureLayer ims.logical.texture.compositeTextureLayer
floats to colour 320 floatsToColour ims.logical.texture.floatsToColour
float3 to colour 321 float3ToColour ims.logical.texture.float3ToColour
distorted mesh UV 322 distortedMeshUV ims.logical.texture.distortedMeshUV
capture to custom AOV 323 captureToCustomAOV ims.logical.texture.captureToCustomAOV
float to greyscale 324 floatToGreyscale ims.logical.texture.floatToGreyscale
ray direction 326 rayDirection ims.logical.texture.rayDirection
normal texture 327 normal ims.logical.texture.normal
position texture 328 position ims.logical.texture.position
relative distance 329 relativeDistance ims.logical.texture.relativeDistance
UV coordinate 330 uvCoordinate ims.logical.texture.uvCoordinate
Z depth 331 zDepth ims.logical.texture.zDepth
gradient generator texture 332 gradientGenerator ims.logical.texture.gradientGenerator
random map 333 randomMap ims.logical.texture.randomMap
range t
...
FINAL DRAFT
International Standard
ISO/IEC FDIS 23090-28
ISO/IEC JTC 1/SC 29
Secretariat: JISC
Voting begins on: 2025-11-13
Voting terminates on: 2026-01-08
Information technology — Coded representation of immersive media —
Part 28: Interchangeable scene-based media representations
Technologies de l'information — Représentation codée de média immersifs —
Partie 28: Représentations de médias basés sur des scènes interchangeables
RECIPIENTS OF THIS DRAFT ARE INVITED TO SUBMIT, WITH THEIR COMMENTS, NOTIFICATION OF ANY RELEVANT PATENT RIGHTS OF WHICH THEY ARE AWARE AND TO PROVIDE SUPPORTING DOCUMENTATION.
IN ADDITION TO THEIR EVALUATION AS BEING ACCEPTABLE FOR INDUSTRIAL, TECHNOLOGICAL, COMMERCIAL AND USER PURPOSES, DRAFT INTERNATIONAL STANDARDS MAY ON OCCASION HAVE TO BE CONSIDERED IN THE LIGHT OF THEIR POTENTIAL TO BECOME STANDARDS TO WHICH REFERENCE MAY BE MADE IN NATIONAL REGULATIONS.
Reference number: ISO/IEC FDIS 23090-28:2025(en)
© ISO/IEC 2025
© ISO/IEC 2025
All rights reserved. Unless otherwise specified, or required in the context of its implementation, no part of this publication may be reproduced or utilized otherwise in any form or by any means, electronic or mechanical, including photocopying, or posting on the internet or an intranet, without prior written permission. Permission can be requested from either ISO at the address below or ISO’s member body in the country of the requester.
ISO copyright office
CP 401 • Ch. de Blandonnet 8
CH-1214 Vernier, Geneva
Phone: +41 22 749 01 11
Email: copyright@iso.org
Website: www.iso.org
Published in Switzerland
Contents
Foreword ... iv
Introduction ... v
1 Scope ... 1
2 Normative references ... 1
3 Terms and definitions ... 1
3.1 General ... 1
3.2 Abbreviated terms ... 3
4 Architectures for scene-based media ... 3
4.1 General ... 3
4.2 Architecture for scene graph representations ... 4
4.3 Scenes nested within scenes ... 5
4.4 Architecture for annotated scene graph using IMS ... 5
4.5 Architecture for translation between scene graph representations using IMS ... 5
5 Independent mapping space (IMS) ... 7
5.1 General ... 7
5.2 Architecture of IMS ... 7
5.3 IMS systems ... 7
5.3.1 General ... 7
5.3.2 IMS logical organization component ... 9
5.3.3 IMS physical organization component ... 18
5.3.4 IMS processing directions component ... 23
Annex A (normative) Annotation using IMS and ITMF ... 32
Annex B (normative) Mapping of IMS to glTF 2.0 ... 37
Annex C (normative) Annotation process for glTF ... 39
Bibliography ... 45
Foreword
ISO (the International Organization for Standardization) and IEC (the International Electrotechnical
Commission) form the specialized system for worldwide standardization. National bodies that are
members of ISO or IEC participate in the development of International Standards through technical
committees established by the respective organization to deal with particular fields of technical activity.
ISO and IEC technical committees collaborate in fields of mutual interest. Other international organizations,
governmental and non-governmental, in liaison with ISO and IEC, also take part in the work.
The procedures used to develop this document and those intended for its further maintenance are described
in the ISO/IEC Directives, Part 1. In particular, the different approval criteria needed for the different types
of document should be noted. This document was drafted in accordance with the editorial rules of the ISO/
IEC Directives, Part 2 (see www.iso.org/directives or www.iec.ch/members_experts/refdocs).
ISO and IEC draw attention to the possibility that the implementation of this document may involve the
use of (a) patent(s). ISO and IEC take no position concerning the evidence, validity or applicability of any
claimed patent rights in respect thereof. As of the date of publication of this document, ISO and IEC had not
received notice of (a) patent(s) which may be required to implement this document. However, implementers
are cautioned that this may not represent the latest information, which may be obtained from the patent
database available at www.iso.org/patents and https://patents.iec.ch. ISO and IEC shall not be held
responsible for identifying any or all such patent rights.
Any trade name used in this document is information given for the convenience of users and does not
constitute an endorsement.
For an explanation of the voluntary nature of standards, the meaning of ISO specific terms and expressions
related to conformity assessment, as well as information about ISO's adherence to the World Trade
Organization (WTO) principles in the Technical Barriers to Trade (TBT) see www.iso.org/iso/foreword.html.
In the IEC, see www.iec.ch/understanding-standards.
This document was prepared by Joint Technical Committee ISO/IEC JTC 1, Information technology,
Subcommittee SC 29, Coding of audio, picture, multimedia and hypermedia information.
A list of all parts in the ISO/IEC 23090 series can be found on the ISO website.
Any feedback or questions on this document should be directed to the user’s national standards body. A
complete listing of these bodies can be found at www.iso.org/members.html.
Introduction
Emerging display technologies capable of creating fully formed holograms, e.g. 3D aerial images comprised
of waveforms emitted from a display panel surface, require input source media that are sufficiently dense in
information such that each of the objects in a holographic visual scene is described in terms of its complete and
explicit geometry, and surface properties to characterize how the surface responds to the presence of light.
Such visual information can furthermore facilitate display technologies to produce visual scenes of objects that
cannot be distinguished from the same objects when viewed in the natural world; i.e., the visual information
of the media is sufficiently dense such that the display is enabled to produce a photorealistic result.
Commonly used raster media formats such as those specified in ITU-R BT.601, ITU-R BT.709, ITU-R BT.2020,
ITU-R BT.2100 were originally developed to efficiently provide information sufficient for 2D displays to
produce 2D visual presentations, albeit with varying spatial resolutions, frame rates, and pixel depths. These raster formats, however, provide sufficient information in neither a practical nor an efficient manner to enable a holographic display to produce holograms. As an alternative to raster media, 3D
graphics formats used in digital content creation tools and production workflows offer a solution to provide
sufficiently dense information to enable photorealistic results where raster formats do not. The use of 3D
graphics technologies hence serves as a fundamental technology in enabling immersive displays to create
their optimal visual presentations.
To date, immersive displays provide a media input interface that is capable of ingesting 3D graphics formats
for presentation by the display. Such interfaces can be enabled and shipped as part of the onboard processing
features in the display by use of game engine software development kits. However, 3D scene media that
are targeted for processing by the display’s choice of renderer must first be translated into a format that
is consistent with the internal scene representations of the renderer. To facilitate the translation and
distribution of 3D scene media into the renderer interface provided by the display, an independent mapping
space (IMS) is specified as a part of the ISO/IEC 23090 series.
At the core of the IMS is a vocabulary of terms and definitions that can be used to annotate a scene graph
or scene asset with metadata, i.e., to facilitate the translation of media from one format to another. Such
translation from one format to another format is a commonly encountered challenge in the distribution of
scene-based media, as a number of formats have emerged to gain popularity in various application areas,
e.g., gaming, social media, e-commerce, and content production. The corresponding vocabulary of terms
and definitions describes nodes, media formats, media attributes, and render processing instructions of
commonly-used scene graph formats and media representations. Elements from the IMS can be used to
annotate a scene graph or media format so that its translation to another format is more straightforward
and consistently performed. Thus, the IMS facilitates the distribution and interoperability of a variety of
scene-based media into render applications that are limited in the types of formats that they are designed to
support.
Another application of this document is to guide the translation of media assets for use across a diverse
set of applications comprising the Metaverse, e.g., gaming, social media, retail applications. In the absence
of a common format that specifies media for use across the variety of Metaverse applications, the IMS can
facilitate the translation from one asset representation to another.
Information technology — Coded representation of
immersive media —
Part 28:
Interchangeable scene-based media representations
1 Scope
This document specifies an architecture and data model for interchange of scene-based media for use in
systems and applications that employ renderers for the presentation of immersive media, including audio,
visual, tactile and other representations of media types.
The architecture and data model specified in this document apply to use cases where there is a need for
interchange of scene-based media for presentation by systems with 3D render-based technologies. One
specific application of this document is immersive display technologies that utilize a real-time renderer of
3D media to create their visual presentations, as opposed to a video decoder and renderer of 2D raster-
based media. Another application of this document is for the distribution of media assets for use across a
large variety of applications comprising the Metaverse.
2 Normative references
There are no normative references in this document.
3 Terms and definitions
3.1 General
For the purposes of this document, the following terms and definitions apply.
ISO and IEC maintain terminology databases for use in standardization at the following addresses:
— ISO Online browsing platform: available at https:// www .iso .org/ obp
— IEC Electropedia: available at https:// www .electropedia .org/
3.1.1
scene graph
general data structure commonly used by vector-based graphics editing applications and modern computer
games, that arranges the logical and often (but not necessarily) spatial representation of a graphical scene; a
collection of nodes and vertices in a graph structure
3.1.2
immersive technology media format
ITMF
suite of specifications developed by the Immersive Digital Experiences Alliance
3.1.3
node
fundamental element of the scene graph comprised of information related to the logical or spatial or
temporal representation of visual or audio information
3.1.4
node graph
collection of nodes that can be positioned or modified as a single unit and linked to the remainder of the
scene graph by linker nodes
3.1.5
node type
general category of node that performs a certain function, for example, a camera node type can be a
panoramic lens camera node, an open shading language (OSL) node, or other types of cameras
3.1.6
attribute
metadata associated with a node used to describe a particular characteristic or feature of a node either in a
canonical or more complex form (e.g. in terms of another node)
3.1.7
container
serialized format to store and exchange information to represent all natural, all synthetic, or a mixture of
synthetic and natural scenes including a scene graph and all of the media resources that are required for
rendering the scene
3.1.8
serialization
process of translating data structures or object state into a format that can be stored (for example, in a file
or memory buffer) or transmitted (for example, across a network connection link) and reconstructed later
(possibly in a different computer environment)
Note 1 to entry: When the resulting series of bits is reread according to the serialization format, it can be used to
create a semantically identical clone of the original object.
3.1.9
renderer
software-based application or process, based on a selective mixture of disciplines related to: acoustic
physics, light physics, visual perception, audio perception, mathematics, and software development, that,
given an input scene graph and asset container, emits a visual and/or audio signal suitable for presentation
on a targeted device or conforming to the desired properties as specified by attributes of a render target
node in the scene graph
Note 1 to entry: For visual-based media assets, a renderer may emit a visual signal suitable for a targeted display, or
for storage as an intermediate asset (e.g. repackaged into another container i.e. used in a series of rendering processes
in a graphics pipeline); for audio-based media assets, a renderer may emit an audio signal for presentation in a multi-
channel loudspeaker and/or binauralized headphones, or for repackaging into another (output) container.
3.1.10
scene-based media
audio, visual, haptic, and other primary types of media and media-related information organized logically
and spatially by the use of a scene graph
3.1.11
RGB
additive colour model in which red, green, and blue are the primary colours
Note 1 to entry: The abbreviation RGB is derived from the first letter of each of the primary colours.
3.1.12
asset
one of potentially multiple objects comprising the media for a scene in scene-based media; each object fully
formed and packaged such that it can be rendered separately from other objects in the scene
Note 1 to entry: Individual assets may be shared, e.g. reused across multiple scenes.
3.1.13
pin
input parameter to a node, or an output from a node
3.1.14
this document
referring to this International Standard
3.1.15
independent mapping space label
IMS label
sequence of alphanumeric characters using dot notation comprised of an IMS prefix followed by a node label, which may optionally be followed by labels derived from the ITMF Data Encoding Specification[4]
3.1.16
independent mapping space prefix
IMS prefix
sequence of alphanumeric characters using dot notation comprised of "ims" followed by an alphanumeric
label to denote the IMS component, subsequently followed by an alphanumeric label to denote the IMS
system for a particular node
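As an informative sketch (not part of this document's provisions), the dot-notation structure described in 3.1.15 and 3.1.16 can be illustrated as follows; the helper function names are hypothetical.

```python
# Hypothetical sketch of the IMS label structure of 3.1.15/3.1.16:
# an IMS prefix ("ims" + component + system) followed by a node label,
# optionally followed by further labels.

def make_ims_label(component: str, system: str, node_label: str, *extra: str) -> str:
    """Compose an IMS label such as 'ims.logical.texture.fractal'."""
    return ".".join(["ims", component, system, node_label, *extra])

def split_ims_label(label: str):
    """Split an IMS label into (IMS prefix, node label, optional extra labels)."""
    parts = label.split(".")
    if len(parts) < 4 or parts[0] != "ims":
        raise ValueError(f"not an IMS label: {label!r}")
    prefix = ".".join(parts[:3])  # e.g. 'ims.logical.texture'
    return prefix, parts[3], parts[4:]

label = make_ims_label("logical", "texture", "fractal")
assert label == "ims.logical.texture.fractal"
assert split_ims_label(label) == ("ims.logical.texture", "fractal", [])
```

The labels used in the example are taken from the tables in Clause 5; the decomposition into prefix and node label follows the definitions above.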
3.2 Abbreviated terms
2D two dimensional
3D three dimensional
AOV arbitrary output variables
IDEA Immersive Digital Experiences Alliance
ITMF Immersive Technology Media Format
IMS independent mapping space
OSL Open Shading Language
RGB red, green, blue
USD Universal Scene Description
glTF graphics language Transmission Format
4 Architectures for scene-based media
4.1 General
This clause illustrates architectures related to the organization of scene-based media and how such media
can be annotated by a process that stores IMS metadata into the media. Furthermore, this clause provides a
hypothetical architecture for a process that translates one scene-based media format to another.
A more in-depth description of these architectures is provided in ISO/IEC TR 23090-27.[2]
NOTE This document references only some of the architectures in ISO/IEC TR 23090-27[2] for the purposes of providing additional context.
4.2 Architecture for scene graph representations
Figure 1 illustrates an architecture for the physical representation of scene graphs for scene-based
immersive media.
NOTE The same or closely similar figure can also be drawn to illustrate a logical representation of scene-based media.
Key
1 binary container for immersive media
2 human-readable or binary scene description
3 media asset #1
4 media asset #2
5 media asset #N
6 network
7 media asset #K
8 media asset #M
9 network-based reference from scene description to asset #K
10 local reference from scene description to asset #N stored within binary container
11 local reference from scene description to asset #1 stored within binary container
12 local reference from scene description to asset #2 stored within binary container
13 external storage reference from scene description to asset #M
Figure 1 — Architecture for scene graph representations of scene-based immersive media
In this illustration of a scene graph architecture, the description of the scene (1), including spatial, temporal, or logical relationships between objects, is stored in a human-readable format (e.g. XML[7] or JSON[13]). This
human-readable portion of the scene graph comprises the information that may be annotated with metadata
from the IMS. The scene is further comprised of binary assets that are stored adjacent to the human-
readable description in a “container” (13), such as files that are combined, stored, and compressed into a
single archive file format (similar to files with a zip archive extension), or assets that are stored externally
from the container, e.g., either in a local or remote access network.
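The container layout of Figure 1 can be sketched informatively as a zip-like archive holding a human-readable description plus binary assets; the file names (`scene.json`, `assets/`) and reference syntax below are illustrative assumptions, not formats specified by this document.

```python
# Hypothetical sketch of the Figure 1 container layout: a zip-like archive
# holding a JSON scene description plus binary assets, where the description
# references assets stored locally (in the archive) or remotely (by URL).
import io
import json
import zipfile

def write_container(buf, description: dict, assets: dict) -> None:
    """Pack a scene description and its local binary assets into one archive."""
    with zipfile.ZipFile(buf, "w") as z:
        z.writestr("scene.json", json.dumps(description))
        for name, data in assets.items():
            z.writestr(f"assets/{name}", data)

def read_container(buf):
    """Return (description, {asset name: bytes}) from an archive."""
    with zipfile.ZipFile(buf) as z:
        description = json.loads(z.read("scene.json"))
        assets = {n.split("/", 1)[1]: z.read(n)
                  for n in z.namelist() if n.startswith("assets/")}
    return description, assets

buf = io.BytesIO()
write_container(
    buf,
    {"nodes": [{"ref": "local:assets/mesh.bin"},          # local reference
               {"ref": "https://example.org/tex.bin"}]},  # network reference
    {"mesh.bin": b"\x00\x01"},
)
desc, assets = read_container(buf)
assert assets["mesh.bin"] == b"\x00\x01"
```

The sketch mirrors the two reference styles shown in the figure key: local references resolve inside the archive, while network or external-storage references resolve outside it.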
4.3 Scenes nested within scenes
An important aspect of scene graph architectures is that there is no explicit restriction within this document
that prohibits an individual asset from comprising an entirely self-described scene as depicted in Figure 1.
For example, referring to the architecture depicted in Figure 1, assets 1, 2, K, M, and N may themselves
be scenes stored in individual containers. Furthermore, these assets may likewise be annotated with IMS
metadata.
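Because an asset may itself be a complete scene, scene resolution is naturally recursive. The following informative sketch (using an invented dictionary shape, not a format defined by this document) illustrates the nesting described above.

```python
# Hypothetical sketch of 4.3: an asset entry may itself be a complete scene
# stored in its own container, so counting scenes is a recursive walk.
def count_scenes(scene: dict) -> int:
    """Count this scene plus every scene nested within its assets."""
    return 1 + sum(count_scenes(a["scene"])
                   for a in scene.get("assets", []) if "scene" in a)

outer = {"assets": [
    {"uri": "a.bin"},                           # ordinary binary asset
    {"scene": {"assets": [{"uri": "b.bin"}]}},  # asset that is itself a scene
]}
assert count_scenes(outer) == 2
```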
4.4 Architecture for annotated scene graph using IMS
Figure 2 illustrates an example of a scene graph that is annotated with metadata from the IMS. In the figure,
items 1A, 1B, 1C, 1D, and 1E refer to individual portions of the human readable description of the scene. Each
such portion may provide information related to the geometry or structure of the scene, and is separately
annotated with IMS metadata for that particular part of the scene. The figure also illustrates that the scene
contains four binary assets labelled: 2A, 2B, 2C, and 2D. The assets themselves may or may not be separately
annotated with IMS metadata. Furthermore, the assets themselves may be individual scenes.
Key
1A portion of human-readable scene description, e.g. XML or JSON, annotated with IMS metadata
1B portion of human-readable scene description, e.g. XML or JSON, annotated with IMS metadata
1C portion of human-readable scene description, e.g. XML or JSON, annotated with IMS metadata
1D portion of human-readable scene description, e.g. XML or JSON, annotated with IMS metadata
1E portion of human-readable scene description, e.g. XML or JSON, annotated with IMS metadata
2A binary asset A optionally annotated with IMS metadata
2B binary asset B optionally annotated with IMS metadata
2C binary asset C optionally annotated with IMS metadata
2D binary asset D optionally annotated with IMS metadata
Figure 2 — Architecture for scene graph annotated with IMS metadata
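The annotation pattern of Figure 2 can be sketched informatively as follows; the JSON shape and the `imsLabel` key are illustrative assumptions made for this example and are not defined by this document.

```python
# Hypothetical illustration of Figure 2: portions of a human-readable scene
# description (1A..1E) each carry IMS metadata, while binary assets (2A..2D)
# are only optionally annotated. The "imsLabel" key name is an assumption.
scene = {
    "portions": [
        {"id": "1A", "imsLabel": "ims.logical.geometry"},
        {"id": "1B", "imsLabel": "ims.logical.material.glossy"},
    ],
    "assets": [
        {"id": "2A", "uri": "assets/a.bin"},                 # not annotated
        {"id": "2B", "uri": "assets/b.bin",
         "imsLabel": "ims.logical.sceneObject"},             # optionally annotated
    ],
}

def annotated_portions(doc: dict):
    """Collect the id of every element that carries IMS metadata."""
    return [p["id"] for group in ("portions", "assets")
            for p in doc.get(group, []) if "imsLabel" in p]

assert annotated_portions(scene) == ["1A", "1B", "2B"]
```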
4.5 Architecture for translation between scene graph representations using IMS
A hypothetical architecture for translating one scene graph format to another scene graph format is
illustrated in Figure 3.
Key
1A portion of human-readable scene description 1 annotated with IMS metadata
1B portion of human-readable scene description 1 annotated with IMS metadata
1C portion of human-readable scene description 1 annotated with IMS metadata
1D portion of human-readable scene description 1 annotated with IMS metadata
1E portion of human-readable scene description 1 annotated with IMS metadata
2A portion of human-readable scene description 2 translated with IMS metadata from 1A
2B portion of human-readable scene description 2 translated with IMS metadata from 1B
2C portion of human-readable scene description 2 translated with IMS metadata from 1C
2D portion of human-readable scene description 2 translated with IMS metadata from 1D
2E portion of human-readable scene description 1 translated with IMS metadata from 1E
3 scene description 1 annotated with IMS metadata
4 scene description 2 translated from scene description 1 and IMS metadata
5A binary asset A associated with both scene description 1 and scene description 2
5B binary asset B associated with both scene description 1 and scene description 2
5C binary asset C associated with both scene description 1 and scene description 2
5D binary asset D associated with both scene description 1 and scene description 2
6 translation process
Figure 3 — Architecture for scene graph translation
In this figure, scene description 2 is derived via a translation process (6) from scene description 1 and its
corresponding IMS metadata. Scene description 1 is comprised of five parts labelled 1A, 1B, 1C, 1D, and
1E. Each part is annotated with corresponding IMS metadata. Associated with scene description 1 are four
binary assets, each individually labelled as: 5A, 5B, 5C, and 5D. Assets 5A, 5B, 5C, and 5D may be optionally
annotated. The combined IMS metadata and parts comprising scene description 1 are labelled as 3.
Component 3 serves as input to a translation process (6) that is guided by the IMS metadata to produce
output 4 which is scene description 2. Scene description 2 is also comprised of five parts, although there
does not need to be a one-to-one correspondence between the number of input scene description parts to
the number of output scene description parts.
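The translation process (6) described above can be sketched informatively as a dispatch on IMS labels; the emitter table and the target-format portion shapes below are invented for illustration and are not specified by this document.

```python
# Hypothetical sketch of the Figure 3 translation process (6): IMS metadata
# attached to scene description 1 selects how each annotated part is emitted
# into scene description 2. The emitter table is invented for illustration.
TARGET_EMITTERS = {
    # IMS label -> function producing zero or more target-format portions
    "ims.logical.geometry": lambda p: [{"kind": "mesh", "src": p["id"]}],
    "ims.logical.material.glossy": lambda p: [
        {"kind": "material", "src": p["id"]},
        {"kind": "shader", "src": p["id"]},   # one input part may yield several
    ],
}

def translate(portions):
    """Emit target-format portions for every annotated input portion."""
    out = []
    for p in portions:
        emit = TARGET_EMITTERS.get(p.get("imsLabel"))
        if emit:                     # unannotated/unknown portions are skipped
            out.extend(emit(p))
    return out

parts = [{"id": "1A", "imsLabel": "ims.logical.geometry"},
         {"id": "1B", "imsLabel": "ims.logical.material.glossy"}]
result = translate(parts)
assert len(result) == 3   # no one-to-one input/output correspondence required
```

Note how two input parts yield three output parts, consistent with the statement that the number of input and output scene description parts need not correspond one-to-one.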
5 Independent mapping space (IMS)
5.1 General
The IMS defines a vocabulary and corresponding set of labels that can be used to describe scene graphs to
aid in the translation of a single scene graph format into another single scene graph format, or to aid in the
distribution of a scene graph to a particular rendering interface. While descriptions and terms that comprise
the IMS are sufficiently dense and robust to describe a scene graph capable of representing photorealistic
scenes, the IMS may also be used to describe scene representations that are not designed to represent
photorealistic visualizations.
5.2 Architecture of IMS
The IMS is comprised of the following three architectural components that describe:
— the logical organization of geometric objects and their properties within the scene;
NOTE 1 In general, immersive media may be comprised of media that is created to stimulate any of the human
senses, e.g., visual and auditory senses of humans. Scenes are likewise organized according to the types of media
that stimulate human senses, i.e., visual and auditory scenes. While there are different types of scenes that are
used to create immersive media, this document is currently limited to defining an IMS for visual scenes.
— the physical organization of the scene, e.g., for the purposes of access and storage of binary media within
the scene;
— processing directions for the presentation engine, e.g., for rendering or animation.
NOTE 2 A single scene graph can contain multiple sets of render processing directions, each set relevant to the
interface for a particular renderer.
Each of the IMS architectural components is comprised of systems that describe a particular system node
type, for which multiple subtypes can be further defined.
NOTE 3 For example, a “lighting” system node type can define multiple subtypes of “lighting” each of which can
satisfy the requirements of another system node that requires information about lighting.
Each system node type may be regarded as a process that is executed by a presentation engine. Each of such
processes produces an output that has a type consistent with its definition, e.g., lighting. Likewise, some of
the node types have input requirements that are described in terms of other node types. As an example, a
render instruction node type requires input information of node type “geometry” and node type of “camera,”
at a minimum, in order to produce a rendered output.
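As an informative sketch, the typed input requirements described above can be modelled as follows; the class and the example type names are hypothetical, except that the minimum inputs of a render instruction node ("geometry" and "camera") follow the text above.

```python
# Hypothetical model of 5.2: each system node type is a process with typed
# inputs and a typed output; a render instruction node requires at least
# inputs of type "geometry" and "camera" to produce a rendered output.
from dataclasses import dataclass

@dataclass
class NodeType:
    name: str
    output_type: str
    required_inputs: tuple = ()

RENDER_INSTRUCTION = NodeType("renderInstruction", "image",
                              required_inputs=("geometry", "camera"))

def can_execute(node: NodeType, available: dict) -> bool:
    """True if every required input type has a connected upstream output."""
    return all(t in available for t in node.required_inputs)

assert can_execute(RENDER_INSTRUCTION, {"geometry": "teapot", "camera": "panoramic"})
assert not can_execute(RENDER_INSTRUCTION, {"geometry": "teapot"})
```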
5.3 IMS systems
5.3.1 General
This subclause specifies the individual systems that comprise the IMS architectural components for logical
organization, physical organization, and processing directions. Associated with each system is a type, label,
description, and hypothetical rendering semantics, i.e., a processing model, for how a renderer can interpret
a node belonging to each system.
NOTE These systems are designed to mirror the structure, representation, and encoding of scene-based media as specified by the ITMF Scene Graph Specification,[3] ITMF Data Encoding Specification, and ITMF Container Specification.[5]
Associated with each system is one or more nodes that belong to the system. Figure 4 provides a canonical
representation of how the IMS systems relate to each other.
Key
1 canonical IMS values (e.g., integer, enumeration, float, …)
2 IMS systems for logical organization, physical organization, and processing directions
3 input or output pin
4 attribute
5 node graph
6 logical organization of geometry
7 processing directions
8 physical organization of scene graph
9 presentation engine (not a system in the IMS)
10 rendered output (not a system in the IMS)
Figure 4 — Canonical representation of IMS geometric objects within the scene
An IMS value (item 1) represents an input value from the value node system to any one of the other IMS
systems (item 2) including systems designated for logical organization, physical organization, and
processing directions for the scene.
Input or output relationships between any of the system nodes are established with IMS data pins (item
3). Attributes represent other IMS system nodes or value nodes that are considered to be immutable (not
changeable by rendering processes). A collection of system nodes is designated to describe the logical
organization of the geometry (item 6); processing directions (item 7); and physical organization (item 8)
in the scene. A node graph (item 5) represents a collection of other IMS system nodes that may also include
attributes or render-specific metadata (not in scope of this document). Inputs to a presentation engine (item
9) include information about the logical organization, processing directions, and physical organization (not
shown in the figure) to produce a rendered output (item 10).
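The pin and attribute relationships of Figure 4 can be sketched informatively as follows; the class is hypothetical and only illustrates that pins establish mutable input/output links while attributes remain immutable with respect to rendering processes.

```python
# Hypothetical sketch of Figure 4 relationships: nodes exchange values through
# input/output pins (item 3), while attributes (item 4) are immutable inputs
# that rendering processes are not allowed to change.
class Node:
    def __init__(self, name, attributes=None):
        self.name = name
        self._attributes = dict(attributes or {})  # immutable by convention
        self.inputs = {}                           # pin name -> upstream node

    def attribute(self, key):
        return self._attributes[key]               # read-only access

    def connect(self, pin, upstream):
        self.inputs[pin] = upstream                # establish a data-pin link

tex = Node("fractal", attributes={"octaves": 4})
mat = Node("glossy")
mat.connect("baseColour", tex)  # texture output feeds a material input pin
assert mat.inputs["baseColour"].attribute("octaves") == 4
```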
5.3.2 IMS logical organization component
IMS systems used to describe the logical organization component of a scene are identified by a label with a
prefix of "ims.logical". Table 1 specifies the IMS systems that comprise the logical organization component.
Table 1 — Summary of IMS systems for logical organization of the scene

system value node (ims.logical.sysVal)
Description: An input value (e.g., integers, single-precision floats, spatial coordinates, vectors) to an associated node.
Hypothetical renderer semantics: Values provided by system value nodes are used by renderers for processing of other nodes to which the values are associated.

material node (ims.logical.material)
Description: Surface properties of individual geometric objects or groups of geometric objects in the scene.
Hypothetical renderer semantics: Materials provide characteristics (e.g., porousness, bumps, metallic properties) for the surfaces of individual objects that are rendered in the scene.

texture node (ims.logical.texture)
Description: Colour and brightness for a surface corresponding to a geometric object or group of geometric objects in the scene.
Hypothetical renderer semantics: Textures define individual colours and brightness for surfaces of individual objects that are rendered in the scene.

geometry node (ims.logical.geometry)
Description: Geometric shape of an object or group of objects in a scene or geometry archive.
Hypothetical renderer semantics: Geometry defines the 3D shape or volume of individual objects that are rendered in the scene.

surface (medium) node (ims.logical.surface)
Description: A description of how light is reflected, refracted, or absorbed by an object.
Hypothetical renderer semantics: The behaviour of individual rays in ray-tracing algorithms is characterized by formulae provided in surface nodes.

transformation node (ims.logical.transform)
Description: A description of how geometric objects are scaled, translated, or rotated.
Hypothetical renderer semantics: The parameters are used to transform individual geometric objects within a scene.

connection pins (ims.logical.pins.input for an input pin; ims.logical.pins.output for an output pin)
Description: Parameters used as input to a node, or a single output from a node, i.e., input data pin(s) or an output data pin.
Hypothetical renderer semantics: One or more data pins are inputs to a node, or a single data pin is an output from a node.

data attributes (ims.logical.attribute)
Description: Non-mutable characteristics, features, or values of or for an object.
Hypothetical renderer semantics: The renderer does not allow other nodes to alter attribute values during serialization.

scene object (ims.logical.sceneObject)
Description: A geometric object, possibly animated, created using a digital content creation tool, and included in a composited scene.
Hypothetical renderer semantics: Denotes use of Alembic,[9] Universal Scene Description (USD), glTF, Filmbox, or other formats as individual geometry assets, possibly for a larger scene.
5.3.2.1 Value nodes system
Table 2 specifies the types of values and labels that comprise the value nodes system within the IMS. IMS
labels for value nodes for use within the logical organization of the scene are identified by an IMS prefix of
‘ims.logical.sysVal’. The ITMF Data Encoding Specification provides additional information regarding usage
of the value nodes system. Annex A of this document provides information about how to use the ITMF Data
Encoding Specification with the IMS.
Table 2 — Nodes for value node system for logical organization component
Node ITMF node Node label IMS label
code point
float value 6 float ims.logical.sysVal.float
int value 9 integer ims.logical.sysVal.integer
boolean value 11 boolean ims.logical.sysVal.boolean
image resolution 12 imageResolution ims.logical.sysVal.imageResolution
sun direction 30 sunDirection ims.logical.sysVal.sunDirection
enumeration 57 enumeration ims.logical.sysVal.enumeration
annotation 68 annotation ims.logical.sysVal.annotation
string 84 string ims.logical.sysVal.string
file 88 file ims.logical.sysVal.file
directory 101 directoryName ims.logical.sysVal.directoryName
bit mask 132 bitMask ims.logical.sysVal.bitMask
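The labelling pattern in Table 2 can be illustrated with a short informative sketch (Python is used purely for illustration and is not part of this document). The dictionary reproduces a few rows of Table 2, and the helper composes each IMS label from the IMS prefix and node label by dot notation, as defined in 3.1.15 and 3.1.16; the function name is hypothetical.

```python
# Illustrative sketch only: a few rows of Table 2, keyed by ITMF node code point.
VALUE_NODE_LABELS = {
    6: "float",
    9: "integer",
    11: "boolean",
    12: "imageResolution",
    30: "sunDirection",
}

# IMS prefix for value nodes within the logical organization component.
IMS_VALUE_PREFIX = "ims.logical.sysVal"

def ims_label_for_value_node(code_point: int) -> str:
    """Compose the IMS label for a value node from its ITMF code point."""
    return f"{IMS_VALUE_PREFIX}.{VALUE_NODE_LABELS[code_point]}"
```

For example, code point 6 yields the IMS label "ims.logical.sysVal.float" from the first row of Table 2.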
5.3.2.2 Material nodes system
Table 3 specifies the node names and corresponding labels that comprise the material nodes system within
the IMS. IMS labels for material nodes are identified by an IMS prefix of "ims.logical.material". The ITMF
Data Encoding Specification provides additional information regarding usage of the material nodes system.
Annex A of this document provides information about how to use the ITMF Data Encoding Specification with
the IMS.
Table 3 — Nodes for material nodes system
Node ITMF node Node label IMS label
code point
material map 2 map ims.logical.material.map
glossy material 16 glossy ims.logical.material.glossy
diffuse material 17 diffuse ims.logical.material.diffuse
specular material 18 specular ims.logical.material.specular
mix material 19 mix ims.logical.material.mix
portal material 20 portal ims.logical.material.portal
metallic material 120 metallic ims.logical.material.metallic
toon material 121 toon ims.logical.material.toon
toon ramp material 122 toonRamp ims.logical.material.toonRamp
universal material 130 universal ims.logical.material.universal
composite material 138 composite ims.logical.material.composite
specular material layer 139 specularMaterialLayer ims.logical.material.specularMaterialLayer
diffuse material layer 140 diffuseMaterialLayer ims.logical.material.diffuseMaterialLayer
metallic material layer 141 metallicMaterialLayer ims.logical.material.metallicMaterialLayer
sheen material layer 142 sheenMaterialLayer ims.logical.material.sheenMaterialLayer
layered material 143 layered ims.logical.material.layered
material layer group 144 materialLayerGroup ims.logical.material.materialLayerGroup
shadow catcher 145 shadowCatcher ims.logical.material.shadowCatcher
Table 3 (continued)
Node ITMF node Node label IMS label
code point
hair material 147 hair ims.logical.material.hair
null material 159 null ims.logical.material.null
clipping material 178 clipping ims.logical.material.clipping
5.3.2.3 Texture nodes system
Table 4 specifies the node names and corresponding labels that comprise the texture nodes system within
the IMS. IMS labels for texture nodes are identified by an IMS prefix of ‘ims.logical.texture’. The ITMF Data
Encoding Specification provides additional information regarding usage of the texture nodes system.
Annex A of this document provides information about how to use the ITMF Data Encoding Specification with
the IMS.
Table 4 — Nodes for texture nodes system
Node ITMF node Node label IMS label
code point
turbulence texture 22 turbulence ims.logical.texture.turbulence
grayscale colour 31 grayscaleColour ims.logical.texture.grayscaleColour
gaussian spectrum 32 gaussianSpectrum ims.logical.texture.gaussianSpectrum
RGB colour 33 rgbColour ims.logical.texture.rgbColour
RGB image 34 rgbImage ims.logical.texture.rgbImage
alpha image 35 alphaImage ims.logical.texture.alphaImage
grayscale image 36 grayscaleImage ims.logical.texture.grayscaleImage
mix texture 38 mix ims.logical.texture.mix
multiply texture 39 multiply ims.logical.texture.multiply
cosine mix texture 40 cosineMix ims.logical.texture.cosineMix
clamp texture 41 clamp ims.logical.texture.clamp
saw wave texture 42 sawWave ims.logical.texture.sawWave
triangle wave texture 43 triangleWave ims.logical.texture.triangleWave
sine wave texture 44 sineWave ims.logical.texture.sineWave
checks texture 45 checks ims.logical.texture.checks
invert texture 46 invert ims.logical.texture.invert
marble texture 47 marble ims.logical.texture.marble
rigid fractal texture 48 rigidFractal ims.logical.texture.rigidFractal
gradient map texture 49 gradientMap ims.logical.texture.gradientMap
falloff map 50 falloffMap ims.logical.texture.falloffMap
colour correction 51 colourCorrection ims.logical.texture.colourCorrection
dirt texture 63 dirt ims.logical.texture.dirt
cylindrical 74 cylindrical ims.logical.texture.cylindrical
linear XYZ to UVW 75 linearXYZtoUVW ims.logical.texture.linearXYZtoUVW
perspective 76 perspective ims.logical.texture.perspective
spherical 77 spherical ims.logical.texture.spherical
mesh UV 78 meshUV ims.logical.texture.meshUV
box 79 box ims.logical.texture.box
displacement 80 displacement ims.logical.texture.displacement
random colour texture 81 randomColour ims.logical.texture.randomColour
Table 4 (continued)
Node ITMF node Node label IMS label
code point
noise texture 87 noise ims.logical.texture.noise
polygon side 89 polygonSide ims.logical.texture.polygonSide
vertex displacement 97 vertexDisplacement ims.logical.texture.vertexDisplacement
W-coordinate 104 wCoordinate ims.logical.texture.wCoordinate
add texture 106 add ims.logical.texture.add
comparison 107 comparison ims.logical.texture.comparison
subtract texture 108 subtract ims.logical.texture.subtract
triplanar map 109 triplanarMap ims.logical.texture.triplanarMap
triplanar 111 triplanar ims.logical.texture.triplanar
instance colour 113 instanceColour ims.logical.texture.instanceColour
instance range 114 instanceRange ims.logical.texture.instanceRange
baking texture 115 baking ims.logical.texture.baking
OSL texture 117 oslTexture ims.logical.texture.oslTexture
UVW transform 118 uvwTransform ims.logical.texture.uvwTransform
OSL 125 osl ims.logical.texture.osl
OSL delayed UV 127 oslDelayedUV ims.logical.texture.oslDelayedUV
image tiles 131 imageTiles ims.logical.texture.imageTiles
colour vertex attribute texture 135 colourVertexAttribute ims.logical.texture.colourVertexAttribute
grayscale colour vertex attribute texture 136 grayscaleColourVertexAttribute ims.logical.texture.grayscaleColourVertexAttribute
vertex displacement mixer 151 vertexDisplacementMixer ims.logical.texture.vertexDisplacementMixer
spotlight distribution 158 spotlightDistribution ims.logical.texture.spotlightDistribution
Cinema4D noise texture 162 cinema4dNoise ims.logical.texture.cinema4dNoise
chaos texture 170 chaos ims.logical.texture.chaos
channel picker 171 channelPicker ims.logical.texture.channelPicker
channel merger 172 channelMerger ims.logical.texture.channelMerger
ray switch 173 raySwitch ims.logical.texture.raySwitch
channel inverter 174 channelInverter ims.logical.texture.channelInverter
channel mapper 175 channelMapper ims.logical.texture.channelMapper
composite texture 176 composite ims.logical.texture.composite
mesh UV 177 meshUV ims.logical.texture.meshUV
iridescent texture 187 iridescent ims.logical.texture.iridescent
volume to texture 256 volumeToTexture ims.logical.texture.volumeToTexture
colour to UVW 258 colourToUVW ims.logical.texture.colourToUVW
smooth Voronoi contours texture 260 smoothVoronoiContours ims.logical.texture.smoothVoronoiContours
tile patterns texture 261 tilePatterns ims.logical.texture.tilePatterns
procedural effects texture 262 proceduralEffects ims.logical.texture.proceduralEffects
chainmail texture 263 chainmail ims.logical.texture.chainmail
Moire mosaic texture 264 moireMosaic ims.logical.texture.moireMosaic
colour squares texture 265 colourSquares ims.logical.texture.colourSquares
stripes texture 266 stripes ims.logical.texture.stripes
Table 4 (continued)
Node ITMF node Node label IMS label
code point
flakes texture 267 flakes ims.logical.texture.flakes
fractal texture 268 fractal ims.logical.texture.fractal
hagelslag texture 269 hagels
...
ISO/IEC FDIS 23090-28
ISO/IEC JTC 1/SC 29
Secretariat: JISC
Date: 2025-10-30
Information technology — Coded representation of immersive
media —
Part 28:
Interchangeable scene-based media representations
Technologies de l'information — Représentation codée de média immersifs —
Partie 28: Représentations de médias basés sur des scènes interchangeables
FDIS stage
ISO/IEC FDIS 23090-28:2025(en)
© ISO/IEC 2025
All rights reserved. Unless otherwise specified, or required in the context of its implementation, no part of this publication
may be reproduced or utilized otherwise in any form or by any means, electronic or mechanical, including photocopying,
or posting on the internet or an intranet, without prior written permission. Permission can be requested from either ISO
at the address below or ISO’s member body in the country of the requester.
ISO copyright office
CP 401 • Ch. de Blandonnet 8
CH-1214 Vernier, Geneva
Phone: + 41 22 749 01 11
E-mail: copyright@iso.org
Website: www.iso.org
Published in Switzerland
Contents
Foreword . iv
Introduction . v
1 Scope . 1
2 Normative references . 1
3 Terms and definitions . 1
3.1 General . 1
3.2 Abbreviated terms . 3
4 Architectures for scene-based media . 4
4.1 General . 4
4.2 Architecture for scene graph representations . 4
4.3 Scenes nested within scenes . 5
4.4 Architecture for annotated scene graph using IMS . 5
4.5 Architecture for translation between scene graph representations using IMS . 6
5 Independent mapping space (IMS) . 7
5.1 General . 7
5.2 Architecture of IMS . 7
5.3 IMS systems . 8
Annex A (normative) Annotation using IMS and ITMF . 35
Annex B (normative) Mapping of IMS to glTF 2.0 . 40
Annex C (normative) Annotation process for glTF . 42
Bibliography . 49
Foreword
ISO (the International Organization for Standardization) and IEC (the International Electrotechnical Commission) form the specialized system for worldwide standardization. National bodies that are members of ISO or IEC participate in the development of International Standards through technical committees established by the respective organization to deal with particular fields of technical activity. ISO and IEC technical committees collaborate in fields of mutual interest. Other international organizations, governmental and non-governmental, in liaison with ISO and IEC, also take part in the work.

The procedures used to develop this document and those intended for its further maintenance are described in the ISO/IEC Directives, Part 1. In particular, the different approval criteria needed for the different types of document should be noted. This document was drafted in accordance with the editorial rules of the ISO/IEC Directives, Part 2 (see www.iso.org/directives or www.iec.ch/members_experts/refdocs).

ISO and IEC draw attention to the possibility that the implementation of this document may involve the use of (a) patent(s). ISO and IEC take no position concerning the evidence, validity or applicability of any claimed patent rights in respect thereof. As of the date of publication of this document, ISO and IEC had not received notice of (a) patent(s) which may be required to implement this document. However, implementers are cautioned that this may not represent the latest information, which may be obtained from the patent database available at www.iso.org/patents and https://patents.iec.ch. ISO and IEC shall not be held responsible for identifying any or all such patent rights.

Any trade name used in this document is information given for the convenience of users and does not constitute an endorsement.

For an explanation of the voluntary nature of standards, the meaning of ISO specific terms and expressions related to conformity assessment, as well as information about ISO's adherence to the World Trade Organization (WTO) principles in the Technical Barriers to Trade (TBT), see www.iso.org/iso/foreword.html. In the IEC, see www.iec.ch/understanding-standards.

This document was prepared by Joint Technical Committee ISO/IEC JTC 1, Information technology, Subcommittee SC 29, Coding of audio, picture, multimedia and hypermedia information.

A list of all parts in the ISO/IEC 23090 series can be found on the ISO website.

Any feedback or questions on this document should be directed to the user's national standards body. A complete listing of these bodies can be found at www.iso.org/members.html.
Introduction
Emerging display technologies capable of creating fully formed holograms, e.g. 3D aerial images comprised of
waveforms emitted from a display panel surface, require input source media that are sufficiently dense in
information such that each of the objects in a holographic visual scene is described in terms of its complete
and explicit geometry, and surface properties to characterize how the surface responds to the presence of
light. Such visual information can furthermore facilitate display technologies to produce visual scenes of
objects that cannot be distinguished from the same objects when viewed in the natural world; i.e., the visual
information of the media is sufficiently dense such that the display is enabled to produce a photorealistic
result.
Commonly used raster media formats such as those specified in ITU-R BT.601, ITU-R BT.709, ITU-R BT.2020,
ITU-R BT.2100 were originally developed to efficiently provide information sufficient for 2D displays to
produce 2D visual presentations, albeit with varying degrees of spatial resolutions, frame rates, and pixel
depth. These raster formats, however, neither provide sufficient information in a practical nor efficient
manner, to enable a holographic display to produce holograms. As an alternative to raster media, 3D graphics
formats used in digital content creation tools and production workflows offer a solution to provide sufficiently
dense information to enable photorealistic results where raster formats do not. The use of 3D graphics
technologies hence serves as a fundamental technology in enabling immersive displays to create their optimal
visual presentations.
To date, immersive displays provide a media input interface that is capable of ingesting 3D graphics formats
for presentation by the display. Such interfaces can be enabled and shipped as part of the onboard
processing features in the display by use of game engine software development kits. However, 3D scene media
that are targeted for processing by the display’s choice of renderer must first be translated into a format that
is consistent with the internal scene representations of the renderer. To facilitate the translation and
distribution of 3D scene media into the renderer interface provided by the display, an independent mapping space (IMS) is specified as a part of the ISO/IEC 23090 series.
At the core of the IMS is a vocabulary of terms and definitions that can be used to annotate a scene graph or
scene asset with metadata, i.e., to facilitate the translation of media from one format to another. Such
translation from one format to another format is a commonly encountered challenge in the distribution of
scene-based media, as a number of formats have emerged to gain popularity in various application areas, e.g.,
gaming, social media, e-commerce, and content production. The corresponding vocabulary of terms and
definitions describes nodes, media formats, media attributes, and render processing instructions of
commonly-used scene graph formats and media representations. Elements from the IMS can be used to
annotate a scene graph or media format so that its translation to another format is more straightforward and
consistently performed. Thus, the IMS facilitates the distribution and interoperability of a variety of scene-
based media into render applications that are limited in the types of formats that they are designed to support.
Another application of this document is to guide the translation of media assets for use across a diverse set of
applications comprising the Metaverse, e.g., gaming, social media, retail applications. In the absence of a
common format that specifies media for use across the variety of Metaverse applications, the IMS can facilitate the translation from one asset representation to another.
Information technology — Coded representation of immersive media —
Part 28:
Interchangeable scene-based media representations
1 Scope
This document specifies an architecture and data model for interchange of scene-based media for use in
systems and applications that employ renderers for the presentation of immersive media, including audio,
visual, tactile and other representations of media types.
The architecture and data model specified in this document apply to use cases where there is a need for
interchange of scene-based media for presentation by systems with 3D render-based technologies. One
specific application of this document is immersive display technologies that utilize a real-time renderer of 3D
media to create their visual presentations, as opposed to a video decoder and renderer of 2D raster-based
media. Another application of this document is for the distribution of media assets for use across a large
variety of applications comprising the Metaverse.
2 Normative references
There are no normative references in this document.
3 Terms and definitions
3.1 General
For the purposes of this document, the following terms and definitions apply.
ISO and IEC maintain terminology databases for use in standardization at the following addresses:
— ISO Online browsing platform: available at https://www.iso.org/obp
— IEC Electropedia: available at https://www.electropedia.org/
3.1.1
scene graph
general data structure commonly used by vector-based graphics editing applications and modern computer
games, that arranges the logical and often (but not necessarily) spatial representation of a graphical scene; a
collection of nodes and vertices in a graph structure
3.1.2
immersive technology media format
ITMF
suite of specifications for the Immersive Technology Media Format developed by the Immersive Digital
Experiences Alliance
3.1.3
node
fundamental element of the scene graph comprised of information related to the logical or spatial or temporal
representation of visual or audio information
3.1.4
node graph
collection of nodes that can be positioned or modified as a single unit and linked to the remainder of the scene
graph by linker nodes
3.1.5
node type
general category of node that performs a certain function, for example, a camera node type can be a panoramic
lens camera node, an open shading language (OSL) node, or other types of cameras
3.1.6
attribute
metadata associated with a node used to describe a particular characteristic or feature of a node either in a
canonical or more complex form (e.g. in terms of another node)
3.1.7
container
serialized format to store and exchange information to represent all natural, all synthetic, or a
mixture of synthetic and natural scenes including a scene graph and all of the media resources that are
required for rendering of the scene
3.1.8
serialization
process of translating data structures or object state into a format that can be stored (for example, in a file or
memory buffer) or transmitted (for example, across a network connection link) and reconstructed later
(possibly in a different computer environment)
Note 1 to entry: When the resulting series of bits is reread according to the serialization format, it can be
used to create a semantically identical clone of the original object.
3.1.9
renderer
software-based application or process, based on a selective mixture of disciplines related to acoustic physics,
light physics, visual perception, audio perception, mathematics, and software development, that, given an
input scene graph and asset container, emits a visual and/or audio signal suitable for presentation on a
targeted device or conforming to the desired properties as specified by attributes of a render target node in
the scene graph
Note 1 to entry: For visual-based media assets, a renderer may emit a visual signal suitable for a targeted display, or for
storage as an intermediate asset (e.g. repackaged into another container i.e. used in a series of rendering processes in a
graphics pipeline); for audio-based media assets, a renderer may emit an audio signal for presentation in a multi-channel
loudspeaker and/or binauralized headphones, or for repackaging into another (output) container.
3.1.10
scene-based media
audio, visual, haptic, and other primary types of media and media-related information organized logically and
spatially by a use of a scene graph
3.1.11
RGB
referring to an additive colour model in which red, green, and blue are the primary colours
Note 1 to entry: The abbreviation RGB is derived from the first letter of each of the primary colours.
3.1.12
asset
one of potentially multiple objects comprising the media for a scene in scene-based media; each object fully
formed and packaged such that it can be rendered separately from other objects in the scene
Note 1 to entry: Individual assets may be shared, e.g. reused across multiple scenes.
3.1.13
pin
input parameter to a node, or an output from a node
3.1.14
this document
referring to this International Standard
3.1.15
independent mapping space label
IMS label
sequence of alphanumeric characters using dot notation comprised of an IMS prefix followed by a node label, which may optionally be followed by labels derived from the ITMF Data Encoding Specification [4]
3.1.16
independent mapping space prefix
IMS prefix
sequence of alphanumeric characters using dot notation comprised of "ims" followed by an alphanumeric
label to denote the IMS component, subsequently followed by an alphanumeric label to denote the IMS system
for a particular node
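The structure defined in 3.1.15 and 3.1.16 — "ims", followed by a component label, a system label, and a node label, optionally followed by further labels — can be sketched informatively as a well-formedness check (Python is used only for illustration). The alphanumeric character class and the function name are assumptions; this document does not specify a formal grammar for label segments.

```python
import re

# Hypothetical validator for the IMS label structure of 3.1.15/3.1.16:
#   "ims" . <component> . <system> . <node label> [ . <further labels> ... ]
# The [A-Za-z0-9] segment class is an assumption, not a normative grammar.
IMS_LABEL_PATTERN = re.compile(r"^ims(\.[A-Za-z0-9]+){3,}$")

def is_well_formed_ims_label(label: str) -> bool:
    """True when the label has an IMS prefix plus at least a node label."""
    return IMS_LABEL_PATTERN.match(label) is not None
```

Under this sketch, "ims.logical.material.diffuse" is well formed, while a label lacking the "ims" prefix or a node label is not.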
3.2 Abbreviated terms
2D two dimensional
3D three dimensional
AOV arbitrary output variables
IDEA Immersive Digital Experiences Alliance
ITMF Immersive Technology Media Format
IMS independent mapping space
OSL Open Shading Language
RGB red, green, blue
USD Universal Scene Description
glTF graphics language Transmission Format
4 Architectures for scene-based media
4.1 General
This clause illustrates architectures related to the organization of scene-based media and how such media
can be annotated by a process that stores IMS metadata into the media. Furthermore, this clause provides
a hypothetical architecture for a process that translates one scene-based media format to another.
A more in-depth description of these architectures is provided in ISO/IEC TR 23090-27 [2].
NOTE This document references only some of the architectures in ISO/IEC TR 23090-27 [2] for the purposes of providing additional context.
4.2 Architecture for scene graph representations
Figure 1 illustrates an architecture for the physical representation of scene graphs for scene-based
immersive media.
NOTE The same or closely similar figure can also be drawn to illustrate a logical representation of scene-based
media.
Key
1 binary container for immersive media
2 human-readable or binary scene description
3 media asset #1
4 media asset #2
5 media asset #N
6 network
7 media asset #K
8 media asset #M
9 network-based reference from scene description to asset #K
10 local reference from scene description to asset #N stored within binary container
11 local reference from scene description to asset #1 stored within binary container
12 local reference from scene description to asset #2 stored within binary container
13 external storage reference from scene description to asset #M
Figure 1 — Architecture for scene graph representations of scene-based immersive media
In this illustration of a scene graph architecture, the description of the scene (2), including spatial, temporal, or logical relationships between objects, is stored in a human-readable format (e.g. XML [7] or JSON [13]). This human-readable portion of the scene graph comprises the information that may be annotated with metadata from the IMS. The scene is further comprised of binary assets that are either stored adjacent to the human-readable description in a "container" (1), such as files that are combined, stored, and compressed into a single archive file format (similar to files with a zip archive extension), or stored externally from the container, e.g., in a local or remote access network.
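The container arrangement of Figure 1 can be sketched informatively as follows (Python is used only for illustration). A single archive holds a human-readable scene description, here JSON, plus a locally stored binary asset, while another asset is referenced by URI and stored outside the container; the file names, the description schema, and the example URI are all hypothetical.

```python
import io
import json
import zipfile

# Hypothetical scene description: one locally stored asset, one network asset.
scene_description = {
    "assets": [
        {"id": "asset1", "ref": "local", "path": "assets/asset1.bin"},
        {"id": "assetK", "ref": "network", "uri": "https://example.com/assetK.bin"},
    ]
}

# Build the container as a single compressed archive (cf. a zip archive).
buffer = io.BytesIO()
with zipfile.ZipFile(buffer, "w", zipfile.ZIP_DEFLATED) as container:
    container.writestr("scene.json", json.dumps(scene_description))
    container.writestr("assets/asset1.bin", b"\x00\x01\x02")  # local binary asset

# Reopen the archive: only the local asset lives inside the container;
# the network asset is reachable solely via its URI in the description.
with zipfile.ZipFile(buffer) as container:
    names = container.namelist()
```

The design point illustrated is that local references (keys 10 to 12 in Figure 1) resolve inside the archive, while network and external-storage references (keys 9 and 13) resolve outside it.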
4.3 Scenes nested within scenes
An important aspect of scene graph architectures is that there is no explicit restriction within this document
that prohibits an individual asset from comprising an entirely self-described scene as depicted in
Figure 1. For example, referring to the architecture depicted in Figure 1, assets A, B, K, M,
and N may themselves be scenes stored in individual containers. Furthermore, these assets may likewise be
annotated with IMS metadata.
4.4 Architecture for annotated scene graph using IMS
Figure 2 illustrates an example of a scene graph that is annotated with metadata from the IMS. In the
figure, items 1A, 1B, 1C, 1D, and 1E refer to individual portions of the human readable description of the scene.
Each such portion may provide information related to the geometry or structure of the scene, and is separately
annotated with IMS metadata for that particular part of the scene. The figure also illustrates that the scene
contains four binary assets labelled: 2A, 2B, 2C, and 2D. The assets themselves may or may not be separately
annotated with IMS metadata. Furthermore, the assets themselves may be individual scenes.
Key
1A portion of human-readable scene description, e.g. XML or JSON, annotated with IMS metadata
1B portion of human-readable scene description, e.g. XML or JSON, annotated with IMS metadata
1C portion of human-readable scene description, e.g. XML or JSON, annotated with IMS metadata
1D portion of human-readable scene description, e.g. XML or JSON, annotated with IMS metadata
1E portion of human-readable scene description, e.g. XML or JSON, annotated with IMS metadata
2A binary asset A optionally annotated with IMS metadata
2B binary asset B optionally annotated with IMS metadata
2C binary asset C optionally annotated with IMS metadata
2D binary asset D optionally annotated with IMS metadata
Figure 2 — Architecture for scene graph annotated with IMS metadata
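The annotation pattern of Figure 2 can be sketched informatively as follows (Python is used only for illustration). Each portion of the human-readable scene description carries an IMS label alongside its native content; the key names ("portions", "imsLabel") and the sample content are hypothetical, since this document leaves the serialization of annotations to Annex A (ITMF) and Annex C (glTF).

```python
# Hypothetical annotated scene: portions 1A..1E of Figure 2 (two shown),
# each carrying native content plus an IMS label from the logical component.
annotated_scene = {
    "portions": {
        "1A": {"content": {"mesh": "chair"}, "imsLabel": "ims.logical.sceneObject"},
        "1B": {"content": {"scale": [1, 1, 1]}, "imsLabel": "ims.logical.transform"},
    },
    # Binary assets 2A..2D may or may not be separately annotated.
    "binaryAssets": ["2A", "2B", "2C", "2D"],
}

def labels_used(scene: dict) -> set:
    """Collect the IMS labels attached to the scene description portions."""
    return {p["imsLabel"] for p in scene["portions"].values()}
```

A consumer of the annotated scene can inspect `labels_used` to discover which IMS node systems the description relies on before attempting a translation.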
4.5 Architecture for translation between scene graph representations using IMS
A hypothetical architecture for translating one scene graph format to another scene graph format is illustrated
in Figure 3.
Key
1A portion of human-readable scene description 1 annotated with IMS metadata
1B portion of human-readable scene description 1 annotated with IMS metadata
1C portion of human-readable scene description 1 annotated with IMS metadata
1D portion of human-readable scene description 1 annotated with IMS metadata
1E portion of human-readable scene description 1 annotated with IMS metadata
2A portion of human-readable scene description 2 translated with IMS metadata from 1A
2B portion of human-readable scene description 2 translated with IMS metadata from 1B
2C portion of human-readable scene description 2 translated with IMS metadata from 1C
2D portion of human-readable scene description 2 translated with IMS metadata from 1D
2E portion of human-readable scene description 1 translated with IMS metadata from 1E
3 scene description 1 annotated with IMS metadata
4 scene description 2 translated from scene description 1 and IMS metadata
5A binary asset A associated with both scene description 1 and scene description 2
5B binary asset B associated with both scene description 1 and scene description 2
5C binary asset C associated with both scene description 1 and scene description 2
5D binary asset D associated with both scene description 1 and scene description 2
6 translation process
Figure 3 — Architecture for scene graph translation
In this figure, scene description 2 is derived via a translation process (6) from scene description 1 and its corresponding IMS metadata. Scene description 1 is comprised of five parts labelled 1A, 1B, 1C, 1D, and 1E. Each part is annotated with corresponding IMS metadata. Associated with scene description 1 are four binary assets, individually labelled 5A, 5B, 5C, and 5D. Assets 5A, 5B, 5C, and 5D may be optionally annotated. The combined IMS metadata and parts comprising scene description 1 are labelled as 3. Component 3 serves as input to a translation process (6) that is guided by the IMS metadata to produce output 4, which is scene description 2. Scene description 2 is also comprised of five parts, although there does not need to be a one-to-one correspondence between the number of input scene description parts and the number of output scene description parts.
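The translation step (6) of Figure 3 can be sketched informatively as a label-driven mapping (Python is used only for illustration). The IMS label attached to each portion of scene description 1 selects the corresponding construct in the target format; the target-format names below are invented for this sketch, whereas the actual mapping of IMS to glTF 2.0 is specified in Annex B.

```python
# Hypothetical mapping from IMS labels to constructs of a target scene format.
# The right-hand names are illustrative only.
TARGET_FORMAT_MAPPING = {
    "ims.logical.sceneObject": "mesh",
    "ims.logical.transform": "node.matrix",
}

def translate_portion(portion: dict) -> dict:
    """Translate one annotated portion (1A..1E) into a target-format portion."""
    target_kind = TARGET_FORMAT_MAPPING[portion["imsLabel"]]
    return {"kind": target_kind, "content": portion["content"]}

# One portion of scene description 1, annotated with IMS metadata.
source_portion = {"imsLabel": "ims.logical.transform", "content": {"scale": [2, 2, 2]}}
translated = translate_portion(source_portion)
```

Because the mapping is keyed on IMS labels rather than on the source format's own node names, the same translation process can accept input from any format whose portions have been annotated with IMS metadata.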
5 Independent mapping space (IMS)
5.1 General
The IMS defines a vocabulary and corresponding set of labels that can be used to describe scene graphs to aid
in the translation of a single scene graph format into another single scene graph format, or to aid in the
distribution of a scene graph to a particular rendering interface. While descriptions and terms that comprise
the IMS are sufficiently dense and robust to describe a scene graph capable of representing photorealistic
scenes, the IMS may also be used to describe scene representations that are not designed to represent
photorealistic visualizations.
5.2 Architecture of IMS
The IMS is comprised of the following three architectural components that describe:
— the logical organization of geometric objects and their properties within the scene;
NOTE 1 In general, immersive media may be comprised of media that is created to stimulate any of the human senses, e.g., the visual and auditory senses. Scenes are likewise organized according to the types of media that stimulate human senses, i.e., visual and auditory scenes. While there are different types of scenes that are used to create immersive media, this document is currently limited to defining an IMS for visual scenes.
— — the physical organization of the scene, e.g., for the purposes of access and storage of binary media
within the scene;
— — processing directions for the presentation engine, e.g., for rendering or animation.
NOTE 2 – A single scene graph can contain multiple sets of render processing directions, each set relevant to the
interface for a particular renderer.
Each of the IMS architectural components is comprised of systems, each of which describes a particular
system node type for which multiple subtypes can be further defined.
NOTE 3 – For example, a “lighting” system node type can define multiple subtypes of “lighting”, each of which
can satisfy the requirements of another system node that requires information about lighting.
Each system node type may be regarded as a process that is executed by a presentation engine. Each of such
processes produces an output that has a type consistent with its definition, e.g., lighting. Likewise, some of the
node types have input requirements that are described in terms of other node types. As an example, a render
instruction node type requires input information of node type “geometry” and node type of “camera,” at a
minimum, in order to produce a rendered output.
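The processing model above can be illustrated with a short, non-normative sketch. Nothing here is specified by this document: the class shape, the `can_execute` helper, and the node-type names are assumptions for illustration only; the sketch simply shows a process being executable only when its typed input requirements (here, "geometry" and "camera" for a hypothetical render instruction) are satisfied.

```python
# Non-normative sketch: IMS system node types modelled as processes with
# typed inputs and a typed output. All names are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class NodeType:
    name: str                    # e.g. "geometry", "camera", "renderInstruction"
    output_type: str             # type of the value the process produces
    required_inputs: tuple = ()  # node types this process consumes as input


def can_execute(node: NodeType, available: set) -> bool:
    """A process is executable only when every required input type is available."""
    return all(req in available for req in node.required_inputs)


# A hypothetical render-instruction node requires, at a minimum,
# "geometry" and "camera" inputs to produce a rendered output.
render_instruction = NodeType("renderInstruction", "renderedOutput",
                              ("geometry", "camera"))

print(can_execute(render_instruction, {"geometry"}))            # False: camera missing
print(can_execute(render_instruction, {"geometry", "camera"}))  # True: inputs satisfied
```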
© ISO/IEC 2025 – All rights reserved
ISO/IEC FDIS 23090-28:2025(en)
5.3 IMS systems
5.3.1 General
This subclause specifies the individual systems that comprise the IMS architectural components for logical
organization, physical organization, and processing directions. Associated with each system is a type, label,
description, and hypothetical rendering semantics, i.e., a processing model, for how a renderer can
interpret a node belonging to each system.
NOTE – These systems are designed to mirror the structure, representation, and encoding of scene-based media as
specified by the ITMF Scene Graph Specification [3], the ITMF Data Encoding Specification, and the ITMF
Container Specification [5].
Associated with each system are one or more nodes that belong to the system. Figure 4 provides a
canonical representation of how the IMS systems relate to each other.
Key
1 canonical IMS values (e.g., integer, enumeration, float, …)
2 IMS systems for logical organization, physical organization, and processing directions
3 input or output pin
4 attribute
5 node graph
6 logical organization of geometry
7 processing directions
8 physical organization of scene graph
9 presentation engine (not a system in the IMS)
10 rendered output (not a system in the IMS)
Figure 4 — Canonical representation of IMS geometric objects within the scene
An IMS value (item 1) represents an input value from the value node system to any one of the other IMS
systems (item 2) including systems designated for logical organization, physical organization, and processing
directions for the scene.
Input or output relationships between any of the system nodes are established with IMS data pins (item 3).
Attributes (item 4) represent other IMS system nodes or value nodes that are considered to be immutable (not
changeable by rendering processes). A collection of system nodes is designated to describe the logical
organization of the geometry (item 6), processing directions (item 7), and physical organization (item 8) in
the scene. A node graph (item 5) represents a collection of other IMS system nodes that may also include
attributes or render-specific metadata (not in scope of this document). Inputs to a presentation engine (item
9) include information about the logical organization, processing directions, and physical organization
(not shown in the figure) to produce a rendered output (item 10).
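The relationships just described (nodes connected through input/output data pins, with attributes that rendering processes cannot alter) can be sketched in a non-normative way. The `Node` class, method names, and the example attribute values below are assumptions for illustration; only the IMS labels are taken from this document.

```python
# Non-normative sketch of the Figure 4 relationships: system nodes connected
# via named data pins, with attributes exposed read-only (immutable).
class Node:
    def __init__(self, system_label: str, **attributes):
        self.system_label = system_label
        self._attributes = dict(attributes)  # immutable: no setter is exposed
        self.inputs = {}                     # pin name -> upstream Node

    def attribute(self, name):
        """Read-only access; rendering processes cannot alter attribute values."""
        return self._attributes[name]

    def connect(self, pin: str, upstream: "Node"):
        """Establish an input/output relationship via a named data pin."""
        self.inputs[pin] = upstream


# Hypothetical example: a transform node consuming a geometry node.
geometry = Node("ims.logical.geometry", shape="mesh")
transform = Node("ims.logical.transform", scale=2.0)
transform.connect("geometry", geometry)

print(transform.inputs["geometry"].attribute("shape"))  # mesh
```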
5.3.2 IMS logical organization component
IMS systems used to describe the logical organization component of a scene are identified by a label with a
prefix of "ims.logical". Table 1 specifies the IMS systems that comprise the logical organization
component.
Table 1 — Summary of IMS systems for logical organization of the scene
System node type | IMS system label | Description | Hypothetical renderer semantics
system value node | ims.logical.sysVal | An input value (e.g., integers, single-precision floats, spatial coordinates, vectors) to an associated node. | Values provided by system value nodes are used by renderers for processing of other nodes to which the values are associated.
material node | ims.logical.material | Surface properties of individual geometric objects or groups of geometric objects in the scene. | Materials provide characteristics (e.g., porousness, bumps, metallic properties) for the surfaces of individual objects that are rendered in the scene.
texture node | ims.logical.texture | Colour and brightness for a surface corresponding to a geometric object or group of geometric objects in the scene. | Textures define individual colours and brightness for surfaces of individual objects that are rendered in the scene.
geometry node | ims.logical.geometry | Geometric shape of an object or group of objects in a scene or geometry archive. | Geometry defines the 3D shape or volume of individual objects that are rendered in the scene.
surface (medium) node | ims.logical.surface | A description of how light is reflected, refracted, or absorbed by an object. | The behaviour of individual rays in ray-tracing algorithms is characterized by formulae provided in surface nodes.
transformation node | ims.logical.transform | A description of how geometric objects are scaled, translated, or rotated. | The parameters are used to transform individual geometric objects within a scene.
connection pins (input pin, output pin) | ims.logical.pins.input (input pin), ims.logical.pins.output (output pin) | Parameters used as input to a node, or a single output from a node, i.e., input data pin(s) or output data pin. | One or more data pins are inputs to a node, or a single data pin is an output from a node.
data attributes | ims.logical.attribute | Non-mutable characteristics, features, or values of or for an object. | The renderer does not allow other nodes to alter attribute values during serialization.
scene object | ims.logical.sceneObject | A geometric object, possibly animated, created using a digital content creation tool, and included in a composited scene. | Denotes use of Alembic [9], Universal Scene Description (USD), glTF, Filmbox, or other formats as individual geometry assets, possibly for a larger scene.
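The labels in Table 1 follow a dotted scheme: the "ims" prefix, the architectural component (e.g., "logical"), the system (e.g., "material"), and, where applicable, a node within that system. A non-normative sketch of splitting a label into those parts is shown below; the helper name and return shape are assumptions, not part of this document.

```python
# Non-normative sketch: splitting an IMS label into its architectural
# component, system, and optional node parts. Helper name is an assumption.
def parse_ims_label(label: str) -> dict:
    parts = label.split(".")
    if parts[0] != "ims" or len(parts) < 3:
        raise ValueError(f"not an IMS label: {label}")
    return {
        "component": parts[1],                 # e.g. "logical"
        "system": parts[2],                    # e.g. "material"
        "node": ".".join(parts[3:]) or None,   # e.g. "glossy", when present
    }


print(parse_ims_label("ims.logical.material"))
# {'component': 'logical', 'system': 'material', 'node': None}
print(parse_ims_label("ims.logical.sysVal.float"))
# {'component': 'logical', 'system': 'sysVal', 'node': 'float'}
```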
5.3.2.1 Value nodes system
Table 2 specifies the types of values and labels that comprise the value nodes system within the IMS.
IMS labels for value nodes for use within the logical organization of the scene are identified by an IMS
prefix of "ims.logical.sysVal". The ITMF Data Encoding Specification provides additional information
regarding usage of the value nodes system. Annex A of this document provides information about how to use
the ITMF Data Encoding Specification with the IMS.
Table 2 — Nodes for value node system for logical organization component
Node ITMF node Node label IMS label
code point
float value 6 float ims.logical.sysVal.float
int value 9 integer ims.logical.sysVal.integer
boolean value 11 boolean ims.logical.sysVal.boolean
image resolution 12 imageResolution ims.logical.sysVal.imageResolution
sun direction 30 sunDirection ims.logical.sysVal.sunDirection
enumeration 57 enumeration ims.logical.sysVal.enumeration
annotation 68 annotation ims.logical.sysVal.annotation
string 84 string ims.logical.sysVal.string
file 88 file ims.logical.sysVal.file
directory 101 directoryName ims.logical.sysVal.directoryName
bit mask 132 bitMask ims.logical.sysVal.bitMask
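When translating an ITMF-encoded scene into IMS terms, Table 2 functions as a lookup from ITMF code point to IMS label. The dictionary below is a non-normative sketch reproducing only a few rows of Table 2 for illustration; the table itself remains the authoritative mapping, and the helper name is an assumption.

```python
# Non-normative sketch: lookup from ITMF value-node code points (Table 2)
# to their IMS labels. Only a subset of Table 2 is reproduced here.
ITMF_TO_IMS_VALUE_NODES = {
    6:  "ims.logical.sysVal.float",
    9:  "ims.logical.sysVal.integer",
    11: "ims.logical.sysVal.boolean",
    57: "ims.logical.sysVal.enumeration",
    84: "ims.logical.sysVal.string",
}


def ims_label_for(code_point: int) -> str:
    try:
        return ITMF_TO_IMS_VALUE_NODES[code_point]
    except KeyError:
        raise KeyError(f"code point {code_point} not in value-node table") from None


print(ims_label_for(6))   # ims.logical.sysVal.float
print(ims_label_for(84))  # ims.logical.sysVal.string
```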
5.3.2.2 Material nodes system
Table 3 specifies the node names and corresponding labels that comprise the material nodes system
within the IMS. IMS labels for material nodes are identified by an IMS prefix of "ims.logical.material".
The ITMF Data Encoding Specification provides additional information regarding usage of the material nodes
system. Annex A of this document provides information about how to use the ITMF Data Encoding
Specification with the IMS.
Table 3 — Nodes for material nodes system
Node ITMF node Node label IMS label
code point
material map 2 map ims.logical.material.map
glossy material 16 glossy ims.logical.material.glossy
diffuse material 17 diffuse ims.logical.material.diffuse
specular material 18 specular ims.logical.material.specular
mix material 19 mix ims.logical.material.mix
portal material 20 portal ims.logical.material.portal
metallic material 120 metallic ims.logical.material.metallic
toon material 121 toon ims.logical.material.toon
toon ramp material 122 toonRamp ims.logical.material.toonRamp
universal material 130 universal ims.logical.material.universal
composite material 138 composite ims.logical.material.composite
specular material layer 139 specularMaterialLayer ims.logical.material.specularMaterialLayer
diffuse material layer 140 diffuseMaterialLayer ims.logical.material.diffuseMaterialLayer
metallic material layer 141 metallicMaterialLayer ims.logical.material.metallicMaterialLayer
sheen material layer 142 sheenMaterialLayer ims.logical.material.sheenMaterialLayer
layered material 143 layered ims.logical.material.layered
material layer group 144 materialLayerGroup ims.logical.material.materialLayerGroup
shadow catcher 145 shadowCatcher ims.logical.material.shadowCatcher
hair material 147 hair ims.logical.material.hair
null material 159 null ims.logical.material.null
clipping material 178 clipping ims.logical.material.clipping
5.3.2.3 Texture nodes system
Table 4 specifies the node names and corresponding labels that comprise the texture nodes system
within the IMS. IMS labels for texture nodes are identified by an IMS prefix of "ims.logical.texture".
The ITMF Data Encoding Specification provides additional information regarding usage of the texture nodes
system. Annex A of this document provides information about how to use the ITMF Data Encoding
Specification with the IMS.
Table 4 — Nodes for texture nodes system
Node ITMF node Node label IMS label
code point
turbulence texture 22 turbulence ims.logical.texture.turbulence
grayscale colour 31 grayscaleColour ims.logical.texture.grayscaleColour
gaussian spectrum 32 gaussianSpectrum ims.logical.texture.gaussianSpectrum
RGB colour 33 rgbColour ims.logical.texture.rgbColour
RGB image 34 rgbImage ims.logical.texture.rgbImage
alpha image 35 alphaImage ims.logical.texture.alphaImage
grayscale image 36 grayscaleImage ims.logical.texture.grayscaleImage
mix texture 38 mix ims.logical.texture.mix
multiply texture 39 multiply ims.logical.texture.multiply
cosine mix texture 40 cosineMix ims.logical.texture.cosineMix
clamp texture 41 clamp ims.logical.texture.clamp
saw wave texture 42 sawWave ims.logical.texture.sawWave
triangle wave texture 43 triangleWave ims.logical.texture.triangleWave
sine wave texture 44 sineWave ims.logical.texture.sineWave
checks texture 45 checks ims.logical.texture.checks
invert texture 46 invert ims.logical.texture.invert
marble texture 47 marble ims.logical.texture.marble
rigid fractal texture 48 rigidFractal ims.logical.texture.rigidFractal
gradient map texture 49 gradientMap ims.logical.texture.gradientMap
falloff map 50 falloffMap ims.logical.texture.falloffMap
colour correction 51 colourCorrection ims.logical.texture.colourCorrection
dirt texture 63 dirt ims.logical.texture.dirt
cylindrical 74 cylindrical ims.logical.texture.cylindrical
linear XYZ to UVW 75 linearXYZtoUVW ims.logical.texture.linearXYZtoUVW
perspective 76 perspective ims.logical.texture.perspective
spherical 77 spherical ims.logical.texture.spherical
mesh UV 78 meshUV ims.logical.texture.meshUV
box 79 box ims.logical.texture.box
displacement 80 displacement ims.logical.texture.displacement
random colour texture 81 randomColour ims.logical.texture.randomColour
noise texture 87 noise ims.logical.texture.noise
polygon side 89 polygonSide ims.logical.texture.polygonSide
vertex displacement 97 vertexDisplacement ims.logical.texture.vertexDisplacement
W-coordinate 104 wCoordinate ims.logical.texture.wCoordinate
add texture 106 add ims.logical.texture.add
comparison 107 comparison ims.logical.texture.comparison
subtract texture 108 subtract ims.logical.texture.subtract
triplanar map 109 triplanarMap ims.logical.texture.triplanarMap
triplanar 111 triplanar ims.logical.texture.triplanar
instance colour 113 instanceColour ims.logical.texture.instanceColour
instance range 114 instanceRange ims.logical.texture.instanceRange
baking texture 115 baking ims.logical.texture.baking
OSL texture 117 oslTexture ims.logical.texture.oslTexture
UVW transform 118 uvwTransform ims.logical.texture.uvwTransform
OSL 125 osl ims.logical.texture.osl
OSL delayed UV 127 oslDelayedUV ims.logical.texture.oslDelayedUV
image tiles 131 imageTiles ims.logical.texture.imageTiles
colour vertex attribute texture 135 colourVertexAttribute ims.logical.texture.colourVertexAttribute
grayscale colour vertex attribute texture 136 grayscaleColourVertexAttribute ims.logical.texture.grayscaleColourVertexAttribute
vertex displacement mixer 151 vertexDisplacementMixer ims.logical.texture.vertexDisplacementMixer
spotlight distribution 158 spotlightDistribution ims.logical.texture.spotlightDistribution
Cinema4D noise texture 162 cinema4dNoise ims.logical.texture.cinema4dNoise
chaos texture 170 chaos ims.logical.texture.chaos
channel picker 171 channelPicker ims.logical.texture.channelPicker
channel merger 172 channelMerger ims.logical.texture.channelMerger
ray switch 173 raySwitch ims.logical.texture.raySwitch
channel inverter 174 channelInverter ims.logical.texture.channelInverter
channel mapper 175 channelMapper ims.logical.texture.channelMapper
composite texture 176 composite ims.logical.texture.composite
mesh UV 177 meshUV ims.logical.texture.meshUV
iridescent texture 187 iridescent ims.logical.texture.iridescent
volume to texture 256 volumeToTexture ims.logical.texture.volumeToTexture
colour to UVW 258 colourToUVW ims.logical.texture.colourToUVW
smooth Voronoi contours texture 260 smoothVoronoiContours ims.logical.texture.smoothVoronoiContours
tile patterns texture 261 tilePatterns ims.logical.texture.tilePatterns
procedural effects texture 262 proceduralEffects ims.logical.texture.proceduralEffects
chainmail texture 263 chainmail ims.logical.texture.chainmail
Moire mosaic texture 264 moireMosaic ims.logical.texture.moireMosaic
colour squares texture 265 colourSquares ims.logical.texture.colourSquares
stripes texture 266 stripes ims.logical.texture.stripes
flakes texture 267 flakes ims.logical.texture.flakes
fractal texture 268 fractal ims.logical.texture.fractal
hagelslag texture 269 hagelslag ims.logical.texture.hagelslag
glowing circle texture 270 glowingCircle ims.logical.texture.glowingCircle
curvature texture 271 curvature ims.logical.texture.curvature
cell noise texture 275 cellNoise ims.logical.texture.cellNoise
sample position to UV 317 samplePositionToUV ims.logical.texture.samplePositionToUV
composite texture layer 318 compositeTextureLayer ims.logical.texture.compositeTextureLayer
floats to colour 320 floatsToColour ims.logical.texture.floatsToColour
float3 to colour 321 float3ToColour ims.logical.texture.float3ToColour
distorted mesh UV 322 distortedMeshUV ims.logical.texture.distortedMeshUV
capture to custom AOV 323 captureToCustomAOV ims.logical.texture.captureToCustomAOV
float to greyscale 324 floatToGreyscale ims.logical.texture.floatToGreyscale
ray direction 326 rayDirection ims.logical.texture.rayDirection
normal texture 327 normal ims.logical.texture.normal
position texture 328 position ims.logical.texture.position
relative distance 329 relativeDistance ims.logical.texture.relativeDistance
UV coordinate 330 uvCoordinate ims.logical.texture.uvCoordinate
Z depth 331 zDepth ims.logical.texture.zDepth
gradient generator texture 332 gradientGenerator ims.logical.texture.gradientGenerator
random map 333 randomMap ims.logical.texture.randomMap
range t
...