Space data and information transfer systems — Digital motion imagery


General Information

Status: Published
Publication date: 29-Jun-2021
Current stage: 6060 - International Standard published
Start date: 30-Jun-2021
Due date: 14-Sep-2020
Completion date: 30-Jun-2021


Standards Content (Sample)


INTERNATIONAL STANDARD ISO 21077
Second edition
2021-06

Space data and information transfer systems — Digital motion imagery

© ISO 2021
All rights reserved. Unless otherwise specified, or required in the context of its implementation, no part of this publication may
be reproduced or utilized otherwise in any form or by any means, electronic or mechanical, including photocopying, or posting
on the internet or an intranet, without prior written permission. Permission can be requested from either ISO at the address
below or ISO’s member body in the country of the requester.
ISO copyright office
CP 401 • Ch. de Blandonnet 8
CH-1214 Vernier, Geneva
Phone: +41 22 749 01 11
Email: copyright@iso.org
Website: www.iso.org
Published in Switzerland

Foreword
ISO (the International Organization for Standardization) is a worldwide federation of national
standards bodies (ISO member bodies). The work of preparing International Standards is normally
carried out through ISO technical committees. Each member body interested in a subject for which a
technical committee has been established has the right to be represented on that committee.
International organizations, governmental and non-governmental, in liaison with ISO, also take part
in the work. ISO collaborates closely with the International Electrotechnical Commission (IEC) on all
matters of electrotechnical standardization.
The procedures used to develop this document and those intended for its further maintenance are
described in the ISO/IEC Directives, Part 1. In particular, the different approval criteria needed for
the different types of ISO documents should be noted (see www.iso.org/directives).
Attention is drawn to the possibility that some of the elements of this document may be the subject
of patent rights. ISO shall not be held responsible for identifying any or all such patent rights. Details
of any patent rights identified during the development of the document will be in the Introduction
and/or on the ISO list of patent declarations received (see www.iso.org/patents).
Any trade name used in this document is information given for the convenience of users and does not
constitute an endorsement.
For an explanation of the voluntary nature of standards, the meaning of ISO specific terms and
expressions related to conformity assessment, as well as information about ISO's adherence to the
World Trade Organization (WTO) principles in the Technical Barriers to Trade (TBT) see
www.iso.org/iso/foreword.html.
This document was prepared by the Consultative Committee for Space Data Systems (CCSDS) (as
CCSDS 766.1-B-2, August 2016) and drafted in accordance with its editorial rules. It was assigned to
Technical Committee ISO/TC 20, Space vehicles, Subcommittee SC 13, Space data and information
transfer systems and adopted under the “fast-track procedure”.
This second edition cancels and replaces the first edition (ISO 21077:2016), which has been
technically revised.
The main changes compared to the previous edition are as follows:
— support added for MPEG-4 recording and JPEG 2000 transmission.
Any feedback or questions on this document should be directed to the user’s national standards body.
A complete listing of these bodies can be found at www.iso.org/members.html.
CONTENTS

1 INTRODUCTION
1.1 PURPOSE AND SCOPE
1.2 APPLICABILITY
1.3 NOMENCLATURE
1.4 REFERENCES

2 OVERVIEW

3 SPECIFICATION
3.1 OVERVIEW
3.2 GENERAL
3.3 INTERFACE STANDARDS
3.4 VIDEO FORMAT AND CHARACTERISTICS
3.5 AUDIO
3.6 REAL-TIME VIDEO ENCAPSULATION AND TRANSMISSION
3.7 RECORDED VIDEO AND AUDIO
3.8 DISTRIBUTION OF VIDEO DATA

ANNEX A PROTOCOL IMPLEMENTATION CONFORMANCE STATEMENT (PICS) PROFORMA (NORMATIVE)
ANNEX B SECURITY, SANA, AND PATENT CONSIDERATIONS (INFORMATIVE)
ANNEX C DTN BUNDLE PROTOCOL FOR VIDEO TRANSMISSION (INFORMATIVE)
ANNEX D INFORMATIVE REFERENCES (INFORMATIVE)
ANNEX E ABBREVIATIONS (INFORMATIVE)

Figures
3-1 Video System Elements—Non-Compressed Video Design
3-2 Video System Elements—Compressed Video Design

1 INTRODUCTION
1.1 PURPOSE AND SCOPE
The purpose of this document is to provide a common reference and framework of standards
for digital motion video and imagery, and to provide recommendations for utilization of
international standards for sharing or distributing motion video and imagery between
spacecraft elements and ground systems.
The scope of this document includes traditional real-time streaming video and television,
including human and robotic spacecraft-to-spacecraft and spacecraft-to-ground systems, as well
as video recorded and distributed later, either as a real-time stream or as a file transfer. In this
context, real-time streaming includes all modes where video is sent from a spacecraft in a
continuous stream and is intended for immediate use when received, regardless of the latency
of the transmission path. Other specialized motion imagery applications, such as high-speed
scientific motion imagery and multi-spectral motion imagery, are not addressed in this
document. However, if a specialized imagery camera system has a requirement to interface to
spacecraft systems in a video mode, it would be required to match these interfaces.
Ground-systems-to-ground-systems video distribution is obviously a key component of the
entire video system. However, this is not the primary focus of this document. Currently,
there are significant differences in the ways mission video products are exchanged between
the various space agencies on the ground. This is the result of differences in network
topologies between space agencies, and agreements for video sharing. Those differences
preclude there being a standard methodology for delivering video imagery between agencies.
Prior to the commencement of video transmission between space agencies, system design
reviews and performance testing should be done between the ground systems in use to assure
operability when video imagery comes from spacecraft.
1.2 APPLICABILITY
This document is a CCSDS Recommended Standard. It is intended for all missions that
produce, consume, or transcode video imagery from low-bandwidth video such as web
streaming through high-bandwidth video such as high-definition television imagery.
1.3 NOMENCLATURE
1.3.1 NORMATIVE TEXT
The following conventions apply for the normative specifications in this Recommended
Standard:
a) the words ‘shall’ and ‘must’ imply a binding and verifiable specification;
b) the word ‘should’ implies an optional, but desirable, specification;
c) the word ‘may’ implies an optional specification;
d) the words ‘is’, ‘are’, and ‘will’ imply statements of fact.
NOTE – These conventions do not imply constraints on diction in text that is clearly
informative in nature.
1.3.2 INFORMATIVE TEXT
In the normative sections of this document, informative text is set off from the normative
specifications either in notes or under one of the following subsection headings:
– Overview;
– Background;
– Rationale;
– Discussion.
1.4 REFERENCES
The following publications contain provisions which, through reference in this text,
constitute provisions of this document. At the time of publication, the editions indicated
were valid. All publications are subject to revision, and users of this document are
encouraged to investigate the possibility of applying the most recent editions of the
publications indicated below. The CCSDS Secretariat maintains a register of currently valid
CCSDS publications.
[1] Studio Encoding Parameters of Digital Television for Standard 4:3 and Wide Screen
16:9 Aspect Ratios. ITU-R BT.601-7. Geneva: ITU, 2011.
[2] Television—SDTV Digital Signal/Data—Serial Digital Interface. SMPTE ST
259:2008. White Plains, New York: SMPTE, 2008.
[3] Digital Interfaces for HDTV Studio Signals. ITU-R BT.1120-8. Geneva: ITU, 2012.
[4] 1.5 Gb/s Signal/Data Serial Interface. SMPTE ST 292-1:2012. White Plains, New
York: SMPTE, 2012.
[5] High-Definition Multimedia Interface Specification. Version 1.4. Sunnyvale,
California: HDMI Licensing, LLC, 2009.
[6] Electrical Characteristics of Low Voltage Differential Signaling (LVDS) Interface
Circuits. Revision A. TIA/EIA-644-A. Arlington, Virginia: TIA, February 2001.
[7] Serial Digital Interface-Based Transport Interface for Compressed Television Signals
in Networked Television Production Based on Recommendation ITU-R BT.1120. ITU-
R BT.1577. Geneva: ITU, 2002.
[8] Television—Serial Data Transport Interface (SDTI). SMPTE ST 305:2005. White
Plains, New York: SMPTE, 2005.
[9] Teletext Systems. ITU-R BT.653-3. Geneva: ITU, 1998.
[10] Television—Time and Control Code. SMPTE ST 12-1:2008. White Plains, New York:
SMPTE, 2008.
[11] Television—Transmission of Time Code in the Ancillary Data Space. SMPTE ST 12-
2:2008. White Plains, New York: SMPTE, 2008.
[12] Ancillary Data Packet and Space Formatting. SMPTE ST 291:2011. White Plains,
New York: SMPTE, 2011.
[13] Vertical Ancillary Data Mapping of Caption Data and Other Related Data. SMPTE
ST 334-1:2007. White Plains, New York: SMPTE, 2007.
[14] Metadata Element Dictionary Structure. SMPTE ST 335:2012. White Plains, New
York: SMPTE, 2012.
[15] Metadata Dictionary Registry of Metadata Element Descriptions. SMPTE RP
210.10:2007. White Plains, New York: SMPTE, 2007.
[16] Ultra High Definition Television—Mapping into Single-link or Multi-link 10 Gb/s
Serial Signal/Data Interface. SMPTE ST 2036-3:2010. White Plains, New York:
SMPTE, 2010.
[17] 1280×720, 16:9 Progressively-Captured Image Format for Production and
International Programme Exchange in the 60 Hz Environment. ITU-R BT.1543.
Geneva: ITU, 2001.
[18] 1280 x 720 Progressive Image 4:2:2 and 4:4:4 Sample Structure—Analog and Digital
Representation and Analog Interface. SMPTE ST 296:2012. White Plains, New York:
SMPTE, 2012.
[19] Parameter Values for the HDTV Standards for Production and International
Programme Exchange. ITU-R BT.709-5. Geneva: ITU, 2002.
[20] Television—1920 x 1080 Image Sample Structure, Digital Representation and Digital
Timing Reference Sequences for Multiple Picture Rates. SMPTE ST 274:2008. White
Plains, New York: SMPTE, 2008.
[21] Dual Link 1.5 Gb/s Digital Interface for 1920 x 1080 and 2048 x 1080 Picture Frames.
SMPTE ST 372:2011. White Plains, New York: SMPTE, 2011.
[22] Television—3 Gb/s Signal/Data Serial Interface. SMPTE ST 424:2006. White Plains,
New York: SMPTE, 2006.
[23] Ultra High Definition Television—Image Parameter Values for Program Production.
SMPTE ST 2036-1:2009. White Plains, New York: SMPTE, 2009.
[24] Ultra High Definition Television—Audio Characteristics and Audio Channel Mapping for
Program Production. SMPTE ST 2036-2:2008. White Plains, New York: SMPTE, 2008.
[25] 2048 × 1080 and 4096 × 2160 Digital Cinematography Production Image Formats
FS/709. SMPTE ST 2048-1:2011. White Plains, New York: SMPTE, 2011.
[26] 2048 × 1080 Digital Cinematography Production Image FS/709 Formatting for Serial
Digital Interface. SMPTE ST 2048-2:2011. White Plains, New York: SMPTE, 2011.
[27] Parameter Values for Ultra-High Definition Television Systems for Production and
International Programme Exchange. ITU-R BT.2020-1. Geneva: ITU, 2014.
[28] Information Technology—Coding of Audio-Visual Objects—Part 10: Advanced Video
Coding. 8th ed. International Standard, ISO/IEC 14496-10:2014. Geneva: ISO, 2014.
[29] Advanced Video Coding for Generic Audiovisual Services. ITU-T H.264. Geneva:
ITU, 2012.
[30] Data Services in Digital Television Broadcasting. ITU-R BT.1301-1. Geneva: ITU, 2011.
[31] Interface for Digital Component Video Signals in 525-Line and 625-Line Television
Systems Operating at the 4:2:2 Level of Recommendation ITU-R BT.601. ITU-R
BT.656-5. Geneva: ITU, 2007.
[32] Information Technology—JPEG 2000 Image Coding System: Motion JPEG 2000. 2nd
ed. International Standard, ISO/IEC 15444-3:2007. Geneva: ISO, 2007.
[33] Information Technology—Generic Coding of Moving Pictures and Associated Audio
Information—Part 7: Advanced Audio Coding (AAC). 4th ed. International Standard,
ISO/IEC 13818-7:2006. Geneva: ISO, 2006.
[34] Digital Audio Interface—Part 3: Consumer Applications. Edition 3.1 (2009-12-10).
IEC 60958-3:2006+AMD1:2009 CSV. Geneva: IEC, 2009.
[35] IP over CCSDS Space Links. Issue 1. Recommendation for Space Data System Standards
(Blue Book), CCSDS 702.1-B-1. Washington, D.C.: CCSDS, September 2012.
[36] J. Postel. User Datagram Protocol. STD 6. Reston, Virginia: ISOC, August 1980.
[37] CCSDS File Delivery Protocol (CFDP). Issue 4. Recommendation for Space Data
System Standards (Blue Book), CCSDS 727.0-B-4. Washington, D.C.: CCSDS,
January 2007.
[38] Transport of JPEG 2000 Broadcast Profile Video in MPEG-2 TS over IP.
VSF TR-01 2013-04-15. New Jersey: Video Services Forum, April 15, 2013.
2 OVERVIEW
In the early days of human spaceflight, motion imagery was accomplished with motion
picture film cameras, set at varying frame rates depending on lighting conditions. Upon safe
return the film was processed and eventually shared with the world via documentaries or
television. Inevitably live video became operationally desirable for situational awareness
and to satisfy the public’s interest in high-profile events such as the Moon landings or the
Apollo-Soyuz test project. Compromises were made with those first video systems to fit
within the constraints of bandwidth, avionics, and transmission systems. Even in the modern
era, video systems on spacecraft are a hybrid of analog and digital systems, typically made to
work within the existing spacecraft’s avionics, telemetry, and command/control systems.
With the advent of digital cameras, encoding algorithms, and modulation techniques, it is
desirable to treat video as data and to utilize commercially available technologies to capture
and transmit live and recorded motion imagery, possibly in High Definition (HD) or even
better. Thus the Recommended Standard addresses:
– Video Interfaces and Characteristics
– Video Formats and Characteristics
Video data has a number of characteristics which need specification such as frame
rate, aspect ratio, bandwidth and compression standards, color sampling, the inclusion
of audio, etc.
– Encapsulation and Transmission Protocols
Video data needs to be encapsulated, transported, and distributed. Although the
choice of mechanisms and protocols may not be specific to video data, certain aspects
need addressing because of the high bandwidth typically required for video. Thus
this part will address encapsulation schemes (e.g., IP), transport protocols, and use of
CCSDS Encapsulation Packets.
– Interoperability of Standards
Future Human Spaceflight endeavors are expected to be collaborations between many
agencies, with complex interactions between spacecraft and non-Earth surface
systems, with intermediate locations (EVA crew, habitats, etc.) requiring the ability
to view video generated by another agency’s systems. Therefore interoperability
between these systems will be essential to mission success and in some cases crew
safety. Such interoperability will only be achieved by use of common references and
joint agreement on international standards, either commercial or CCSDS or a
combination of the two.
This Recommended Standard does not cover video quality. The intention of this document is
to provide a framework of standards to ensure interoperability, not to define a level of
quality. What is acceptable video quality varies widely with the application and
requirements of users. A science experiment, for example, may have video quality
requirements beyond what is available, or practical, within a spacecraft avionics system. The
science team for that experiment might elect to record video on board at high quality and
transfer that video as a digital file after the conclusion of the experiment run. They might
elect to do that and have a real-time downlink of lesser quality as a confirmation the
experiment is working properly. A requirement for real-time video to support a docking
event might sacrifice spatial resolution to lower the latency of the real-time video feed.
Within the parameters listed in this document and the capabilities of any given spacecraft,
users and controllers can determine how equipment should be configured for the best match
to requirements.
3 SPECIFICATION
3.1 OVERVIEW
There are many system configurations that can be implemented in spacecraft video systems.
Choices of interface standards, resolutions, and frame rates are based on the application, user
requirements, available equipment, and spacecraft capability. There are multiple ways for
signals to flow from the image source through to the spacecraft avionics system and on to the
ground (see figures 3-1 and 3-2). Application of this Recommended Standard limits the
overall number of options by limiting the interfaces to those that are in most common use. It
should be noted that, while scientific imaging systems are excluded from this Recommended
Standard, should a scientific imaging system need to interface to the spacecraft video system,
the same interfaces would apply to them. It would be the responsibility of the user to provide
a matching interface from the user’s imaging system.
3.2 GENERAL
Users shall select from the following interfaces and standards when designing and
implementing new video systems for spacecraft.
3.3 INTERFACE STANDARDS
3.3.1 NON-COMPRESSED STANDARD DEFINITION TELEVISION SIGNALS
The interface for non-compressed Standard Definition (SD) television signals shall be Serial
Digital Interface (SDI), conforming to
– ITU-R BT.601-7 (reference [1]);
– SMPTE ST 259:2008 (reference [2]).
3.3.2 NON-COMPRESSED HIGH DEFINITION TELEVISION SIGNALS
The interface used for non-compressed high definition television signals shall be one of the
following:
– High Definition-Serial Digital Interface (HD-SDI), conforming to
• ITU-R BT.1120-8 (reference [3]);
• SMPTE ST 292-1:2012 (reference [4]);
– High Definition Multimedia Interface (HDMI) 1.4 or higher, as defined by the HDMI
Founders and licensed by HDMI Licensing, LLC (reference [5]);
– Camera Link Low Voltage Differential Signaling (LVDS) Interface Standard, as
defined by the Camera Link Participating Companies (reference [6]).
3.3.3 COMPRESSED DIGITAL TELEVISION SIGNALS
The interface used for compressed digital television signals shall be Digital Video
Broadcasting-Asynchronous Interface (DVB-ASI), conforming to
– ITU-R BT.1577 (reference [7]);
– SMPTE ST 305:2005 (reference [8]).
NOTE – DVB-ASI would be used with compressed digital video while still in the serial
digital domain. For interfacing to spacecraft systems, Internet Protocol (IP) (see
3.6) is the preferred interface.
3.3.4 TELEVISION TIME CODE AND METADATA
3.3.4.1 Television time code and metadata may be inserted in non-compressed video. If
time codes and/or metadata are inserted into non-compressed video, one of the following
standards shall be used:
– ITU-R BT.653-3 (reference [9]);
– SMPTE ST 12-1:2008 (reference [10]);
– SMPTE ST 12-2:2008 (reference [11]);
– SMPTE ST 291:2011 (reference [12]);
– SMPTE ST 292-1:2012 (reference [4]);
– SMPTE ST 334-1:2007 (reference [13]);
– SMPTE ST 335:2012 (reference [14]);
– SMPTE RP 210.10:2007 (reference [15]);
– SMPTE ST 2036-3:2010 (reference [16]).
NOTE – The standards listed above are primarily concerned with the serial digital
standard-definition and high-definition interfaces listed in 3.3.1 and 3.3.2.
Metadata inserted at a camera using an HDMI or Camera Link interface conforms
to the serial digital interface standards when those signals are converted.
3.3.4.2 Compressed video signals in 3.3.3, per the standards listed in 3.3.3, shall carry all
television time code and metadata information inserted into a non-compressed video stream.
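For illustration only, the following Python sketch shows how a frame count maps onto the HH:MM:SS:FF label carried by the SMPTE ST 12-1 time code cited above; the function name and the 30 FPS non-drop-frame assumption are examples introduced here, not part of this Recommended Standard.

```python
def frames_to_timecode(frame_index: int, fps: int = 30) -> str:
    """Convert a frame count to a non-drop-frame HH:MM:SS:FF label.

    Illustrative only: drop-frame counting (29.97 FPS) and the bit-level
    layout defined in SMPTE ST 12-1 are not reproduced here.
    """
    frames = frame_index % fps
    total_seconds = frame_index // fps
    hours = total_seconds // 3600
    minutes = (total_seconds // 60) % 60
    seconds = total_seconds % 60
    return f"{hours:02d}:{minutes:02d}:{seconds:02d}:{frames:02d}"

# Example: frame 108 010 at 30 FPS corresponds to 01:00:00:10
print(frames_to_timecode(108_010, fps=30))
```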
3.4 VIDEO FORMAT AND CHARACTERISTICS
3.4.1 VIDEO RESOLUTIONS
3.4.1.1 Overview
Traditionally, video resolution has been categorized as low resolution, standard definition,
high definition, or high resolution. Low resolution is generally defined as less than
640 × 480, standard definition as 640 × 480 and 768 × 576, high definition as 1280 × 720
and 1920 × 1080, and high resolution as anything beyond 1920 × 1080 such as 4K and 8K
resolutions. Low resolution was used for streamed Internet video. Standard definition was
used for broadcast (pre-HD) and security camera systems. High definition was limited to
high-end television broadcast. High resolution was practically non-existent unless it was
film based. Now, however, the distinctions are less clear. Laptop computer cameras are now
often high definition, with options to stream from 320 × 240 up to 1280 × 720. Standard
definition is now in limited use for broadcast television, web streaming, and monitoring
applications. High definition has become the norm for broadcast and cable television. High
resolution or ultra-high-definition cameras are replacing 35mm motion picture film for
imaging requirements beyond HD. Therefore it is more difficult to classify video in terms of
resolutions than in terms of application. A given application can have a broad range of
resolutions, depending upon the requirements of the user, available equipment, and
bandwidth constraints. The specifications below reflect the diversity of choices available for
video systems. Higher resolution applications (e.g., ‘public affairs’, critical operations) can
be used to fulfill lower resolution applications (e.g., ‘personal video conferencing’).
3.4.1.2 Personal Video Conferencing
Personal video conferencing video resolution should be selected from the following range:
– 320 × 240 to 1280 × 720, progressive scan.
NOTE – Selection of resolution is dependent on immediate requirement and available
bandwidth.
3.4.1.3 Medical Conferencing
Medical conferencing video resolution should be selected from the following range:
– 320 × 240 to 1280 × 720, bandwidth-dependent progressive or interlace scan:
• standard definition legacy systems may be 525 or 576 interlace;
• 640 × 480 and 768 × 576 systems shall conform to ITU-R BT.601-7
(reference [1]) or SMPTE ST 259:2008 (reference [2]).
NOTE – Lower resolution personal video conferencing and medical conferencing
applications are most likely to be performed using a personal computer or tablet-
type device. All video encoding would be handled internally. Connection to the
spacecraft avionics system would be through wired or wireless data connections
independent of any video systems. Transmission to the ground would also be
handled as part of standard data protocols and also independent of dedicated
video transmission.
3.4.1.4 Situational Awareness
Situational awareness video resolution should be selected from the following range:
– 640 × 480 to 1280 × 720, bandwidth dependent:
• interlace scan for legacy SD systems shall conform to
▫ ITU-R BT.601-7 (reference [1]); or
▫ SMPTE ST 259:2008 (reference [2]);
• progressive scan for HD systems shall conform to
▫ ITU-R BT.1543 (reference [17]); or
▫ SMPTE ST 296:2012 (reference [18]).
NOTE – Situational awareness may be required in situations where only low-bandwidth
transmission is available, such as S-Band, which would likely limit resolution to
as low as 320 × 240. In cases such as this, best effort is acceptable. The
requirement to have visual confirmation of events may be higher than a specific
resolution. This should be considered the exception and not the norm.
3.4.1.5 Public Affairs
3.4.1.5.1 Public affairs video resolution should be selected from the following range:
– 640 × 480 to 1280 × 720, bandwidth dependent:
• Interlace scan for legacy SD systems shall conform to
▫ ITU-R BT.601-7 (reference [1]); or
▫ SMPTE ST 259:2008 (reference [2]);
• Progressive scan for HD systems shall conform to
▫ ITU-R BT.1543 (reference [17]); or
▫ SMPTE ST 296:2012 (reference [18]).
3.4.1.5.2 Multiple resolutions may be used to accommodate mission requirements.
NOTE – There are situations where HD formats are not required. It saves considerable
bandwidth to use SD systems. Whether these are US or European standard
resolutions and frame rates is not an issue. Regardless of the actual video format,
the interface standards allow virtually all current equipment to route and encode
the video. Once encoded and packetized, it is not an issue for spacecraft avionics,
as the video is carried as compatible data packets. That part of the system is format
agnostic. Regardless of the interface chosen for a particular spacecraft, routing
and encoding utilize the same components.
3.4.1.6 High Resolution Digital Imaging
3.4.1.6.1 High resolution digital imaging video resolution should have a minimum
resolution of 1920 × 1080, progressive scan:
– 1080 HD systems shall conform to
• ITU-R BT.709-5 (reference [19]); or
• SMPTE ST 274:2008 (reference [20]);
– Up to 30 FPS systems shall conform to
• ITU-R BT.1120-8 (reference [3]); or
• SMPTE ST 292-1:2012 (reference [4]);
– Above 30 FPS shall conform to
• ITU-R BT.1120-8 (reference [3]); or
• SMPTE ST 372:2011 (reference [21]); or
• SMPTE ST 424:2006 (reference [22]).
3.4.1.6.2 Systems above 1920 × 1080 shall conform to
– SMPTE ST 2036 Standards Suite, ST 2036-1 through ST 2036-3:
• ST 2036-1:2009 Image Parameter Values for Program Production—Ultra High
Definition Television (reference [23]);
• ST 2036-2:2008 Ultra High Definition Television—Audio Characteristics and
Audio Channel Mapping for Program Production (reference [24]);
• ST 2036-3:2010 Mapping into Single-link or Multi-link 10 Gb/s—Ultra High
Definition Television Serial Signal/Data Interface (reference [16]);
– SMPTE ST 2048 Standards Suite, ST 2048-1 and ST 2048-2:
• ST 2048-1:2011 2048 × 1080 and 4096 × 2160 Digital Cinematography
Production Image Formats FS/709 (reference [25]);
• ST 2048-2:2011 2048 × 1080 Cinematography Production Image FS/709
Formatting for Serial Digital Interface (reference [26]);
– ITU-R BT.2020-1 (06/2014) Parameter Values for Ultra-High Definition Television
Systems for Production and International Programme Exchange (reference [27]).
NOTE – 1920 × 1080 and above is to accommodate users with special requirements.
Typically, these systems will have on-board recording and downlink video as file
transfers. Any real-time use will require that the video system provide a signal
compatible with the spacecraft video systems.
3.4.1.7 Spacecraft to Spacecraft
Spacecraft-to-spacecraft video resolution should follow 3.4.1.2–3.4.1.6.
NOTE – Selection of spacecraft-to-spacecraft video resolution is dependent on mission
requirements.
3.4.2 FRAME RATE
3.4.2.1 Video frame rates shall be selected from the following ranges for the following
applications:
a) personal video conferencing: 10 – 60 Frames Per Second (FPS);
b) medical video: 10 – 60 FPS;
c) situational awareness: 25 – 60 FPS;
d) public affairs: 24, 25, or 60 FPS;
e) high resolution digital imaging: 24 – 120 FPS.
NOTE – These are considered optimum frame rates for these applications. However,
bandwidth constraints may not allow even the lower frame rates to be utilized. In
these cases, best effort should be made to accommodate the recommendations
based on available bandwidth for imaging applications.
3.4.2.2 Spacecraft-to-spacecraft frame rates should be selected from 3.4.2.1 a)–e), above,
depending on application.
NOTES
1 Spacecraft-to-spacecraft frame rates are dependent on mission requirements.
2 The listing of specific video applications above does not necessarily imply discrete
equipment sets dedicated for each application. Most cameras, for example, can be
used at multiple resolutions and frame rates allowing them to be used for multiple
applications.
3.4.3 ASPECT RATIO
Aspect ratio of original material shall be maintained from origination through delivery to end
user.
NOTE – By definition within industry standards, HDTV resolution video has an aspect
ratio of 16:9.
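As an informal illustration of maintaining aspect ratio per 3.4.3, the Python sketch below computes how a source raster can be fitted into a destination raster without stretching (pillarboxing or letterboxing as needed); the function name and example values are assumptions of this illustration, not part of this Recommended Standard.

```python
def fit_preserving_aspect(src_w: int, src_h: int, dst_w: int, dst_h: int):
    """Scale a source raster into a destination raster without changing its
    aspect ratio; returns (scaled_w, scaled_h, pad_left, pad_top)."""
    scale = min(dst_w / src_w, dst_h / src_h)
    scaled_w, scaled_h = round(src_w * scale), round(src_h * scale)
    return scaled_w, scaled_h, (dst_w - scaled_w) // 2, (dst_h - scaled_h) // 2

# 768 x 576 SD material placed in a 1920 x 1080 HD raster is pillarboxed:
print(fit_preserving_aspect(768, 576, 1920, 1080))  # (1440, 1080, 240, 0)
```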
3.4.4 VIDEO COMPRESSION
3.4.4.1 Overview
The two compression standards listed below have different applications. MPEG-4 Part 10 is
primarily intended for real-time applications where live, or nearly live, video needs to be
monitored at a ground location during an event or experiment. MPEG-4 may also be used
for recording applications where the quality level is determined to be sufficient. JPEG2000
is intended for applications that require higher quality or where each individual frame needs to be
maintained intact. The data rate required for JPEG2000 would normally preclude its use
for live transmission; the normal operating mode is to record the video and downlink it
later as a data file. However, if the bandwidth is available, live transmission of JPEG2000
offers very low latency and may be preferable for operations where low video latency is critical.
3.4.4.2 Compression Standards
The following video compression standards shall be used as indicated:
– MPEG-4 Part 10 (references [28] and [29]) for real-time transmission and recording:
• 0.5 to 25 Mb/s—application and user requirement-driven data rates;
• 8-bit sampling;
• constant bit rate or variable bit rate acceptable—defined by interface to spacecraft
system;
• Group of Pictures (GOP) from 1 – 30—defined by user requirement;
• Constrained Baseline Profile for conferencing type applications (Personal and
Medical Video Conferencing);
• Main Profile for SD applications;
• High Profile for HD applications;
• metadata as required by user:
▫ shall conform to
▪ ITU-R BT.1301-1 (reference [30]); or
▪ SMPTE ST 291:2011 (reference [12]); or
▪ ITU-R BT.656-5 (reference [31]) for ancillary data; or
▪ SMPTE ST 335:2012 (reference [14]); or
▪ SMPTE RP 210.10:2007 (reference [15]);
▫ shall be read and passed by encode/decode systems;
▫ may include system status and control feedback data;
▫ may include embedded television time code conforming to
▪ ITU-R BT.1301-1 (reference [30]); or
▪ SMPTE ST 12-1:2008 (reference [10]); or
▪ SMPTE ST 12-2:2008 (reference [11]).
NOTE – Per specification and established practice, embedded television time
code is used as the time reference for the MPEG transport stream time
code value.
– JPEG2000 (reference [32]) for analysis and high-quality recording requirements, for
video stored and transferred as files, and for real-time transmission:
• 45 to 140+ Mb/s—application and user requirement-driven;
• 10-bit (or greater) Sampling;
• metadata as required by user:
▫ shall conform to
▪ SMPTE ST 291:2011 (reference [12]); or
▪ ITU-R BT.653-3 (reference [9]) for ancillary data; or
▪ SMPTE ST 335:2012 (reference [14]); or
▪ SMPTE RP 210.10:2007 (reference [15]);
▫ shall be read and passed by encode/decode systems;
▫ may include system status and control feedback data;
▫ may include embedded television time code.
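The following Python sketch is one hedged illustration of how the MPEG-4 Part 10 parameters in 3.4.4.2 might be mapped onto a widely available software encoder (ffmpeg with libx264); the choice of tool, the 15 Mb/s rate, and the file names are assumptions of this example, not requirements of this Recommended Standard.

```python
import subprocess

# Illustrative mapping of the 3.4.4.2 MPEG-4 Part 10 parameters onto ffmpeg/libx264.
# The input/output names, the 15 Mb/s rate, and the use of ffmpeg itself are assumptions.
cmd = [
    "ffmpeg",
    "-i", "camera_capture.mov",        # hypothetical captured HD source
    "-c:v", "libx264",
    "-profile:v", "high",              # High Profile for HD applications
    "-pix_fmt", "yuv420p",             # 8-bit sampling, 4:2:0 color sampling (see 3.4.5)
    "-b:v", "15M", "-maxrate", "15M",
    "-bufsize", "30M",                 # constant bit rate within the 0.5-25 Mb/s range
    "-g", "30",                        # Group of Pictures length of 30
    "-c:a", "aac", "-b:a", "192k",     # AAC audio per 3.5.1
    "downlink_hd.ts",                  # MPEG-2 transport stream container (see 3.6.1.1)
]
subprocess.run(cmd, check=True)
```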
3.4.5 COLOR SAMPLING
Color sampling should be as follows:
– 4:2:0 for real-time requirements;
– 4:2:2 for high resolution digital imaging:
• science and engineering;
• production and digital cinema applications;
– 4:4:4 for special applications.
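To make the bandwidth implications of these color-sampling choices concrete, the following sketch computes the uncompressed payload data rate for a given raster, frame rate, bit depth, and chroma structure; the example values are illustrative only.

```python
# Average samples per pixel for the chroma structures listed above:
CHROMA_SAMPLES_PER_PIXEL = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def uncompressed_rate_mbps(width, height, fps, bit_depth, chroma):
    """Uncompressed video payload rate in Mb/s (blanking and ancillary data ignored)."""
    samples_per_frame = width * height * CHROMA_SAMPLES_PER_PIXEL[chroma]
    return samples_per_frame * bit_depth * fps / 1e6

# 1920 x 1080 at 30 FPS: 4:2:0 8-bit (real-time) versus 4:2:2 10-bit (high-quality recording)
print(uncompressed_rate_mbps(1920, 1080, 30, 8, "4:2:0"))   # about 746 Mb/s
print(uncompressed_rate_mbps(1920, 1080, 30, 10, "4:2:2"))  # about 1244 Mb/s
```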
3.4.6 DISCUSSION—VIDEO SYSTEM BLOCK DIAGRAMS
The diagrams below illustrate typical video system connectivity and what interfaces are
associated with each stage in the system for a typical human spaceflight video system. These
diagrams assume separate components for each of these functions. While the same functions
occur with the use of a laptop- or tablet-based video system used for medical or personal
video conferencing, they are internal with an IP connection to the spacecraft avionics system
for transmission.
Figure 3-1: Video System Elements—Non-Compressed Video Design
(Figure 3-1 shows the signal flow for a non-compressed design: the image source (camera) feeds non-compressed video switching; SDI interfaces carry the signal to video recording, video encoding, and video monitoring; DVB-ASI or IP carries it to the spacecraft avionics; the space-to-ground communications link, ground distribution, video decoding/recording, and video display are defined outside this Recommended Standard by the spacecraft and communications system design.)
Figure 3-2: Video System Elements—Compressed Video Design
(Figure 3-2 shows the signal flow for a compressed design: the image source (camera) feeds video encoding over SDI; DVB-ASI or IP carries the compressed signal to video recording, video decode/monitoring, video display, and the spacecraft avionics; the space-to-ground communications link, ground distribution, and video decoding/recording are defined outside this Recommended Standard by the spacecraft and communications system design.)
3.5 AUDIO
3.5.1 AUDIO AS PART OF VIDEO STREAM
Audio as part of a video stream should conform to the following standards:
– Advanced Audio Codec (AAC) (reference [33]) for compressed audio;
– AES/EBU-3 (reference [34]) for uncompressed audio.
NOTE – Generally, audio as part of a compressed bitstream for live applications
(MPEG-4) will be compressed. There are several different audio codecs
available. AAC was chosen as the highest quality, most widely used audio codec
for this application. Non-compressed AES/EBU-3 is another possibility
supported by a number of audio/video codecs. It is less common, but might be a
requirement based on user needs.
3.5.2 DISCUSSION—AUDIO SEPARATE FROM VIDEO STREAM
For those applications where audio is distributed separately from video, audio/video
synchronization (lip sync) is handled on the ground. Time stamps in the audio stream,
corresponding to time stamps in the video stream, are recommended to aid in synchronization.
NOTE – It is anticipated that the Voice Working Group standard will address this issue
with standard practices to be employed for audio/video synchronization.
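As an informal sketch of the timestamp-based approach recommended above, the following Python fragment computes the audio/video offset from presentation time stamps carried on a common 90 kHz clock (as in MPEG transport streams); the clock assumption and function name are illustrative.

```python
PTS_CLOCK_HZ = 90_000  # MPEG transport stream presentation time stamp clock (assumed)

def av_offset_ms(video_pts: int, audio_pts: int) -> float:
    """Signed audio-minus-video presentation offset in milliseconds."""
    return (audio_pts - video_pts) * 1000.0 / PTS_CLOCK_HZ

# Example: an audio sample stamped 4 500 ticks after the matching video frame
print(av_offset_ms(video_pts=900_000, audio_pts=904_500))  # 50.0 ms
```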
3.6 REAL-TIME VIDEO ENCAPSULATION AND TRANSMISSION
3.6.1 INTERNET PROTOCOL TRANSPORT STREAM
3.6.1.1 MPEG-4-encoded video shall be formatted as Transport Stream (TS) with Packet
Identification (PID) for transport in IP datagrams.
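As a hedged illustration of 3.6.1.1, the Python sketch below receives transport stream packets carried in UDP/IP datagrams and extracts the Packet Identification (PID) field from each 188-byte TS packet; the port number and the common practice of seven TS packets per datagram are assumptions of this example.

```python
import socket

TS_PACKET_SIZE = 188
SYNC_BYTE = 0x47

# Hypothetical receive port; real systems would use the address agreed for the link.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 5000))

while True:
    datagram, _addr = sock.recvfrom(65535)  # commonly 7 x 188 = 1316 bytes of TS payload
    for offset in range(0, len(datagram) - TS_PACKET_SIZE + 1, TS_PACKET_SIZE):
        packet = datagram[offset:offset + TS_PACKET_SIZE]
        if packet[0] != SYNC_BYTE:
            continue  # alignment lost; a real receiver would resynchronize
        # The PID is the low 13 bits of TS header bytes 1 and 2.
        pid = ((packet[1] & 0x1F) << 8) | packet[2]
        print(f"TS packet with PID 0x{pid:04x}")
```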
3.6.1.2 JPEG2000 for transmission shall adhere to Video Services Forum (VSF) TR-01
Transport of JPEG2000 Broadcast Profile video in MPEG-2 TS over IP (reference [38]).
NOTE – TR-01 maps JPEG2000 encoding to an MPEG2 Transport Stream. This makes
JPEG2000 compatible with the same IP connections used for MPEG-4
compression.
3.6.1.3 IP datagrams shall be encapsulated for transmission over the CCSDS space link as
specified in reference [35].
NOTE – Delay Tolerant Networking (DTN) is being standardized as an internetworking
layer for CCSDS missions. Future missions may want to consider transmitting real-time
video encoded with MPEG-4, or file-based video encoded with JPEG 2000, via DTN Bundle
Protocol (reference [D9]) bundles. Annex C presents a narrative of the classes of
real-time video transmission still under development by the DTN working group.
3.6.2 ELEMENTARY STREAM
3.6.2.1 Real-time video and audio elementary streams may be transmitted via User
Datagram Protocol (UDP) (reference [36]).
3.6.2.2 IP datagrams containing User Datagrams with real-time video and audio elementary
streams shall be encapsulated for transmission over the CCSDS space link as specified in
reference [35].
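For illustration of 3.6.2.1 only, the following Python sketch sends a recorded elementary stream in UDP datagrams; the destination address, port, payload size, and file name are assumptions, and a real sender would pace the datagrams to the encoded bit rate.

```python
import socket

DEST = ("192.0.2.10", 5004)   # documentation address and port, assumptions of this example
PAYLOAD_SIZE = 1316

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
with open("camera_feed.h264", "rb") as stream:   # hypothetical recorded elementary stream
    while chunk := stream.read(PAYLOAD_SIZE):
        sock.sendto(chunk, DEST)                 # a real sender would pace these datagrams
```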
3.6.3 JITTER AND BIT ERROR RATES
Real-time video delivery jitter and Bit Error Rate (BER) shall be limited as follows:
– jitter (packet delay variation) not to exceed 10 ms (assumes 300 ms decoder buffer);
– BER not to exceed 1 × 10⁻⁶.
NOTES
1 Use of elementary streams is possible for lower bandwidth video applications, such as
personal video conferencing. However, commercial hardware decoders do not
recognize elementary streams, so transport stream should be used exclusively for
video systems interfacing directly to spacecraft avionics. Also, audio cannot be
embedded with video when using elementary stream for compressed video transport.
Audio and video in an elementary stream are not synchronized for transmission.
2 Real-time Transport Protocol (RTP) and Hypertext Transfer Protocol (HTTP) are commonly used
for video transmission. HTTP is common for computer-based applications, such as
family video conferencing. In this application it is not only acceptable, but may be the
only methodology available on that type of platform. However, for higher bandwidth
video transmission where transport streams are utilized, HTTP is not efficient. RTP is
acceptable, but UDP offers better performance over space-based networks. The use of
TCP for video over space-based networks is also not recommended.
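As one possible, non-normative way to monitor the 10 ms packet-delay-variation limit in 3.6.3 on the ground, the following Python sketch compares the inter-arrival times of successive datagrams against their nominal spacing; the measurement approach and example values are assumptions of this illustration.

```python
JITTER_LIMIT_S = 0.010  # 10 ms packet delay variation limit from 3.6.3

def monitor_jitter(arrival_times, expected_interval_s):
    """Yield the delay variation for each consecutive pair of datagram arrivals."""
    previous = None
    for arrival in arrival_times:
        if previous is not None:
            variation = abs((arrival - previous) - expected_interval_s)
            if variation > JITTER_LIMIT_S:
                print(f"warning: delay variation {variation * 1000:.1f} ms exceeds limit")
            yield variation
        previous = arrival

# Synthetic example: 1316-byte datagrams at 10 Mb/s arrive about every 1.05 ms
timestamps = [0.0, 0.00105, 0.00210, 0.00330]  # the last gap (1.20 ms) adds 0.15 ms of variation
print([round(v * 1000, 2) for v in monitor_jitter(timestamps, expected_interval_s=0.00105)])
```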
3.7 RECORDED VIDEO AND AUDIO
3.7.1 ACQUISITION AND STORAGE OF VIDEO DATA
3.7.1.1 Recordings shall be file based.
NOTE – This is required to allow for transfer of recorded video data via established file
transfer methodologies. This standard does not dictate how an application might
create a video file, and specific file formats will vary based on systems being used.
3.7.1.2 Encoding shall be MPEG-4 or JPEG 2000, dependent upon the application.
3.7.1.3 Specific file formats will change based on systems being used. Data rates to be
used for recording shall be determined by user requirements.
3.7.2 FILE TRANSFER OF RECORDED VIDEO
3.7.2.1 Recorded File Transmission
Recorded video shall be transmitted as files via the CCSDS File Delivery Protocol (CFDP)
Class 1 or Class 2 (reference [37]).
3.7.2.2 Discussion—CFDP
CFDP supports four classes that are distinct from the three video classes:
– Class 1—Unreliable CFDP Transfer;
– Class 2—Reliable CFDP Transfer;
– Class 3—Unreliable Transfer Via One Or More Waypoints In Series;
– Class 4—Reliable Transfer Via One Or More Waypoints In Series.
When using CFDP to transfer video, one of the following mechanisms is used:
– Class 1—Unreliable CFDP transfer over a reliable UT layer;
– Class 2—Reliable CFDP.
3.8 DISTRIBUTION OF VIDEO DATA
3.8.1 REAL-TIME DISTRIBUTION
Real-time video shall be distributed as MPEG-4 program streams with resolution and frame
rate dependent on available bandwidth and user requirements.
3.8.2 DELAYED DISTRIBUTION
Video files should be distributed on the ground via established file distribution methodologies.
NOTE – For file transfer of video, this will likely be recorded JPEG 2000, as MPEG-4
will be used for real-time distribution.
3.8.3 END USER CAPABILITIES
End user capabilities should include:
– decoding
...
