ASTM E2566-17a
Standard Test Method for Evaluating Response Robot Sensing: Visual Acuity
SIGNIFICANCE AND USE
5.1 Various levels of visual acuity are essential when remotely operating robots in unstructured and often hazardous environments. Missions typically include establishing situational awareness, finding available paths, maneuvering through obstacles, identifying objects of interest, and performing detailed inspections. This test method measures robot system far-field and near-field visual acuity which are applicable to virtually every mission. These quantitative measures of performance provide a common language that allows robot users to better understand and express their own requirements and improve the way visual sensing capabilities are specified.
5.2 Multiple cameras could be incorporated into remotely operated robotic systems since a single camera is unlikely to be effective for all aspects of a mission. For example, cameras with zoom lenses are often used for far-field tasks. Cameras with close focus capabilities are often used for near-field tasks. Wide-angle lenses are often used for driving and obstacle avoidance. This test method characterizes each onboard camera to understand overall system capabilities.
5.3 This test method provides a way to unambiguously specify robot requirements in terms of the related measures of visual acuity and field of view. This helps quantify the trade-offs and general usefulness of optical versus digital zoom cameras and fixed versus variable focus lenses. The visual acuity charts can also help provide quantitative measures of performance within other test methods and training scenarios. See Figs. 2-4 for illustrations.
FIG. 2 This Baseline Image is Used for Purposes of Comparisons Below
FIG. 3 Three Images of the Same Scene with the Same Image Resolution. Top Row Shows Field of View Increasing from Left to Right (the image “zooms out”) While Bottom Row Shows Acuity Decreasing (features of the same size become harder to clearly observe)
FIG. 4 Three Images of the Same Scene with the Same Field of View. The Top Row ...
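The field-of-view and acuity trade-offs illustrated in Figs. 2-4 come down to basic trigonometry: the angle subtended by a target of known size at a known distance. The following sketch is illustrative only; the chart widths used are hypothetical readings, not values prescribed by the standard.

```python
import math

def field_of_view_deg(visible_extent_mm: float, distance_mm: float) -> float:
    """Angular field of view spanned by the visible extent of a target
    at a known distance, using the standard pinhole geometry."""
    return math.degrees(2 * math.atan(visible_extent_mm / (2 * distance_mm)))

# Hypothetical example: the camera image spans 2.2 m horizontally and
# 1.24 m vertically of a wall 6 m away.
h_fov = field_of_view_deg(2200, 6000)   # horizontal FOV, roughly 20.8 degrees
v_fov = field_of_view_deg(1240, 6000)   # vertical FOV, roughly 11.8 degrees
aspect = 2200 / 1240                    # roughly 1.77, i.e. about 16:9
```

For a fixed image resolution, widening the field of view spreads the same number of pixels over a larger angle, which is why acuity drops as the image "zooms out" in Fig. 3.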
SCOPE
1.1 The purpose of this test method is to specify the apparatuses, procedures, and performance metrics necessary to quantitatively measure a robot’s visual acuity as displayed to a remote operator or vision algorithm. The primary performance metric for this test method shall be a robot’s possession of such a capability with a specified statistical significance level.
1.2 Secondary performance metrics are the robot’s field of view and aspect ratio.
1.3 This test method can also be used to measure the operator proficiency in performing the specified task. The corresponding performance metric may be the number of completed task repetitions per minute over an assigned time period ranging from 10 to 30 minutes.
1.4 This test method is a part of the sensing suite of response robot test methods, but this test method is stand-alone and complete. This test method applies to systems operated remotely from a standoff distance appropriate for the intended mission. The system includes a remote operator in control of all functionality and any assistive features or autonomous behaviors that improve the effectiveness or efficiency of the overall system.
1.5 The apparatus, specified in Section 6, can only test a limited range of a robot’s capabilities. When the robot has been tested through the limit or limits of the apparatus, a note shall be associated with the results indicating that the robot’s actual capability may be outside of the limit or limits imposed by the test apparatus. For example, the robot could exceed the capabilities of the printing process used to create the charts used in the apparatus.
1.6 Performing Location—This test method may be performed anywhere the specified apparatuses and environmental conditions can be implemented.
1.7 Units—The values stated in either SI units or inch-pound units are to be regarded separately as standard. The values stated in each system may not be exact equivalents; the...
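Clause 1.1 ties the primary metric to a specified statistical significance level. As an illustration only (the standard's actual acceptance criteria are defined in its full text), a one-sided binomial test against chance guessing among the eight possible Landolt C gap orientations could look like this:

```python
from math import comb

def p_value_at_chance(correct: int, trials: int, chance: float = 1 / 8) -> float:
    """One-sided binomial p-value: the probability of identifying at least
    `correct` of `trials` Landolt C gap orientations by guessing alone,
    where chance performance is 1-in-8."""
    return sum(
        comb(trials, k) * chance**k * (1 - chance) ** (trials - k)
        for k in range(correct, trials + 1)
    )

# Identifying 9 of 10 orientations is far beyond chance (p well below 0.05),
# while 2 of 10 is consistent with guessing.
p_strong = p_value_at_chance(9, 10)
p_weak = p_value_at_chance(2, 10)
```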
General Information
- Status
- Published
- Publication Date
- 31-Aug-2017
- Technical Committee
- E54 - Homeland Security Applications
- Drafting Committee
- E54.09 - Response Robots
Relations
- Effective Dates (this and related editions): 01-Sep-2017, 01-Jan-2016, 01-Oct-2007, 01-Aug-2007, 01-Feb-2007
Overview
ASTM E2566-17a: Standard Test Method for Evaluating Response Robot Sensing: Visual Acuity establishes procedures and metrics to quantitatively measure the visual acuity of camera systems used in remotely operated robots. Visual acuity is a critical parameter for robots deployed in hazardous or unstructured environments, such as urban search and rescue, where precise identification, navigation, and inspection capabilities are essential. This standard provides a common framework to evaluate and communicate robot visual performance, ensuring system capability aligns with deployment requirements.
Key Topics
- Visual Acuity Measurement: This standard specifies methods to assess both far-field and near-field camera resolution as displayed to a remote operator or processed by a vision algorithm. Metrics are based on resolving specific standardized symbols at defined distances.
- Field of View and Aspect Ratio: Alongside acuity, the test measures the camera's field of view (FOV) and aspect ratio, important for navigation and situational awareness.
- Multiple Camera Configurations: Recognizes that diverse camera types (zoom, close-focus, wide-angle) may be integrated to address different operational tasks.
- Quantitative Performance Metrics: Provides statistically significant measures, enabling clear specification of performance in procurement and deployment.
- Operator Proficiency: Includes optional assessment of an operator’s proficiency during test tasks, enhancing understanding of human-centered system performance.
- Testing Conditions: Visual performance is evaluated under various lighting conditions, simulating illumination challenges in real-world environments.
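As a rough illustration of how a resolved symbol size maps to an acuity figure: the constants below follow the common optometric convention that 20/20 vision corresponds to resolving a 1-arcminute gap; they are not prescribed by this standard, which reports the metric directly as the smallest resolvable size in millimeters.

```python
import math

def visual_angle_arcmin(gap_mm: float, distance_mm: float) -> float:
    """Angular size, in arcminutes, of a Landolt C gap seen from a given distance."""
    return math.degrees(2 * math.atan(gap_mm / (2 * distance_mm))) * 60

def snellen_denominator(gap_mm: float, distance_mm: float) -> float:
    """Snellen-style score: 20/20 corresponds to a 1-arcminute gap (optometric
    convention, used here for illustration only)."""
    return 20 * visual_angle_arcmin(gap_mm, distance_mm)

# At the 6 m far-field distance, a 1.75 mm gap subtends about 1 arcminute,
# i.e. roughly the 20/20 line on a conventional eye chart.
angle = visual_angle_arcmin(1.75, 6000)
```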
Applications
ASTM E2566-17a is widely applicable to any remotely operated robotic system, particularly those used for:
- Urban Search and Rescue (USR): Ensures robots can detect and identify objects, signage, hazards, or persons at required distances and under suboptimal light.
- Hazardous Environment Operations: Measures and assures robot vision performance where human access is dangerous or impossible.
- Military and Homeland Security: Facilitates procurement and deployment of ground or aerial unmanned vehicles capable of reconnaissance, route finding, and surveillance.
- System Benchmarking & Development: Enables comparison of robot models and fosters technical innovation by providing reliable, repeatable tests of optical systems.
- Operator Training and Assessment: Used as a repeatable training tool to improve and evaluate operator skill and familiarity with the visual interface.
Implementing this test method benefits manufacturers, researchers, and user organizations by aligning technical development with well-defined operational requirements and supporting informed procurement and field deployment decisions.
Related Standards
Organizations implementing ASTM E2566-17a should also be aware of these closely related standards:
- ASTM E2521: Terminology for Evaluating Response Robot Capabilities - provides definitions for terms used across response robot assessment methods.
- ASTM E2592: Practice for Evaluating Response Robot Capabilities: Logistics: Packaging for Urban Search and Rescue Task Force Equipment Caches.
- ISO 12233: Photography - Electronic Still Picture Imaging - Resolution and Spatial Frequency Responses.
- ISO 8596:2009: Ophthalmic Optics - Visual Acuity Testing - Standard Symbol and Its Presentation, relevant for optotype design in vision test charts.
- ISO/IEC 18004:2015: Information Technology - Automatic Identification and Data Capture Techniques - QR Code Bar Code Symbology Specification, referenced for machine-readable acuity targets.
Practical Value
Applying ASTM E2566-17a ensures objective, quantitative evaluation of robot vision systems, supporting:
- Informed procurement and system selection
- Standardized comparison of camera configurations
- Enhanced reliability and operational safety
- Focused operator training and assessment
- Continual improvement in response robot design and deployment
With its flexible, repeatable methodology, ASTM E2566-17a is an essential standard for advancing the effectiveness and efficiency of remotely operated robots in challenging environments.
Frequently Asked Questions
ASTM E2566-17a is a standard published by ASTM International. Its full title is "Standard Test Method for Evaluating Response Robot Sensing: Visual Acuity". The standard's Significance and Use and Scope sections are reproduced in full earlier on this page.
ASTM E2566-17a is classified under the following ICS (International Classification for Standards) categories: 13.200 - Accident and disaster control. The ICS classification helps identify the subject area and facilitates finding related standards.
ASTM E2566-17a has the following relationships with other standards: it links to ASTM E2566-17, ASTM E2592-16, ASTM E2592-07, ASTM E2521-07a, ASTM E2521-07, ASTM E2854/E2854M-21, and ASTM E2853/E2853M-22. Understanding these relationships helps ensure you are using the most current and applicable version of the standard.
ASTM E2566-17a is available in PDF format for immediate download after purchase. The document can be added to your cart and obtained through the secure checkout process. Digital delivery ensures instant access to the complete standard document.
Standards Content (Sample)
This international standard was developed in accordance with internationally recognized principles on standardization established in the Decision on Principles for the
Development of International Standards, Guides and Recommendations issued by the World Trade Organization Technical Barriers to Trade (TBT) Committee.
Designation: E2566 − 17a
Standard Test Method for
Evaluating Response Robot Sensing: Visual Acuity
This standard is issued under the fixed designation E2566; the number immediately following the designation indicates the year of original adoption or, in the case of revision, the year of last revision. A number in parentheses indicates the year of last reapproval. A superscript epsilon (´) indicates an editorial change since the last revision or reapproval.
INTRODUCTION
The robotics community needs ways to measure whether a particular robot is capable of performing specific missions in unstructured and often hazardous environments. These missions decompose into elemental robot tasks that can be represented individually as standard test methods and practices. The associated test apparatuses and performance metrics provide a tangible language to communicate various mission requirements. They also enable repeatable testing to establish the reliability of essential robot capabilities.

The ASTM International Standards Committee on Homeland Security Applications (E54) specifies standard test methods and practices for evaluating individual robot capabilities. These standards facilitate comparisons across diverse models or multiple configurations of a single model. The standards support robot researchers, manufacturers, and user organizations in different ways. Researchers use the standards to understand mission requirements, encourage innovation, and demonstrate break-through capabilities. Manufacturers use the standards to evaluate design decisions, integrate emerging technologies, and harden developed systems. User organizations leverage the resulting robot capabilities data to guide purchasing decisions, align deployment objectives, and focus training with standard measures of operator proficiency. Associated usage guides describe how such standards can be applied to support these various objectives.

The overall suite of standards addresses critical subsystems of remotely operated response robots, including maneuvering, mobility, dexterity, sensing, energy, communications, durability, proficiency, autonomy, logistics, safety, and terminology. This test method is part of the Sensing test suite and addresses the visual acuity of onboard cameras.
1. Scope

1.1 The purpose of this test method is to specify the apparatuses, procedures, and performance metrics necessary to quantitatively measure a robot's visual acuity as displayed to a remote operator or vision algorithm. The primary performance metric for this test method shall be a robot's possession of such a capability with a specified statistical significance level.

1.2 Secondary performance metrics are the robot's field of view and aspect ratio.

1.3 This test method can also be used to measure the operator proficiency in performing the specified task. The corresponding performance metric may be the number of completed task repetitions per minute over an assigned time period ranging from 10 to 30 minutes.

1.4 This test method is a part of the sensing suite of response robot test methods, but this test method is stand-alone and complete. This test method applies to systems operated remotely from a standoff distance appropriate for the intended mission. The system includes a remote operator in control of all functionality and any assistive features or autonomous behaviors that improve the effectiveness or efficiency of the overall system.

1.5 The apparatus, specified in Section 6, can only test a limited range of a robot's capabilities. When the robot has been tested through the limit or limits of the apparatus, a note shall be associated with the results indicating that the robot's actual capability may be outside of the limit or limits imposed by the test apparatus. For example, the robot could exceed the capabilities of the printing process used to create the charts used in the apparatus.

This test method is under the jurisdiction of ASTM Committee E54 on Homeland Security Applications and is the direct responsibility of Subcommittee E54.09 on Response Robots. Current edition approved Sept. 1, 2017. Published November 2017. Originally approved in 2008. Last previous edition approved in 2017 as E2566-17. DOI: 10.1520/E2566-17A.
Copyright © ASTM International, 100 Barr Harbor Drive, PO Box C700, West Conshohocken, PA 19428-2959. United States
E2566 − 17a
1.6 Performing Location—This test method may be per- 3.2.1 aspect ratio, n—the ratio of width to height of the
formed anywhere the specified apparatuses and environmental image produced by a camera system.
conditions can be implemented.
3.2.2 camera system, n—a specific camera with its associ-
ated lighting, compression, interface, display, and operator
1.7 Units—The values stated in either SI units or inch-
station settings that may be required to work together on the
pound units are to be regarded separately as standard. The
hosting robot to display the image.
values stated in each system may not be exact equivalents;
therefore,eachsystemshallbeusedindependentlyoftheother.
3.2.3 dynamic range, n—a measure of the ability of a
Combiningvaluesfromthetwosystemsmayresultinnoncon-
camera system to, in a single scene, simultaneously observe
formance with the standard. Both units are referenced to
details on objects in the dark environment and in the environ-
facilitate acquisition of materials internationally and minimize
ment or environments with certain light intensity.
fabrication costs.
3.2.4 focal length, n—a measure of how wide or narrow a
1.8 This standard does not purport to address all of the
camera’s field of view is. A longer focal length provides a
safety concerns, if any, associated with its use. It is the
narrower field of view than a shorter one.
responsibility of the user of this standard to establish appro-
3.2.5 foveated vision (system), n—a camera system that has
priate safety, health, and environmental practices and deter-
higher resolution (provides more information) at the center
mine the applicability of regulatory limitations prior to use.
than the edges of the image.
1.9 This international standard was developed in accor-
3.2.6 framerate, n—a measure of the temporal resolution of
dance with internationally recognized principles on standard-
a camera system and refers to the number of complete images
ization established in the Decision on Principles for the
per second displayed on a remote system interface.
Development of International Standards, Guides and Recom-
3.2.7 image, n—a two-dimensional matrix of values with
mendations issued by the World Trade Organization Technical
Barriers to Trade (TBT) Committee. each of the two dimensions representing angular deviation
(possibly non-linear) in orthogonal direction from the sensor’s
2. Referenced Documents
optical axis.
2.1 ASTM Standards: 3.2.8 imager, n—a sensor, or system of sensors, that pro-
E2521Terminology for Evaluating Response Robot Capa-
duces an image.
bilities
3.2.9 image acuity (or acuity), n—a measure of the resolv-
E2592Practice for Evaluating Response Robot Capabilities:
ing capability of the robot’s camera system.
Logistics: Packaging for Urban Search and Rescue Task
3.2.10 image field of view (or field of view), n—ameasureof
Force Equipment Caches
the extent of the robot’s environment that may be observed in
2.2 Additional Standards:
a single visual image, measured in terms of degrees in the
ISO 12233Photography–Electronic Still Picture Imag-
horizontal and vertical directions.
ing–Resolution and Spatial Frequency Responses
3.2.11 image resolution, n—a measure of the amount of
ISO 8596:2009Ophthalmic Optics–Visual Acuity Test-
visual information that the robot’s camera system is capable of
ing–Standard Symbol and Its Presentation
conveying to the operator (regardless of the field of viewof the
ISO/IEC 18004:2015Information–Automatic Identifica-
system)andismeasuredasthenumberofblackandwhitelines
tion and Data Capture Techniques–QR Code Barcode
thatcanbeclearlyresolved,perimageheight,inthehorizontal,
Symbology Specification
vertical and diagonal directions. Reporting relative to image
height normalizes for variations in aspect ratio.
3. Terminology
3.2.12 Landolt Ring or Landolt C, n—a symbolconsistingof
3.1 Thefollowingtermsareusedinthistestmethodandare
ablackcircularringwithawhitegap,bothwithspecifiedsizes,
defined in Terminology E2521: abstain, administrator or test
as defined in ISO 8596:2009.
administrator, emergency response robot or response robot,
fault condition, operator, operator station, remote control,
3.2.13 QR Code (or Quick Response Code), n—a pattern of
repetition, robot, teleoperation, test event or event, test form,
black squares that encodes digital information and is designed
test sponsor, test suite, testing target or target, testing task or
to be read by a computer.
task, and trial or test trial.
3.2.14 resolution wedge, n—aseriesoflinesthatdecreasein
3.2 Definitions: The following terms are used in this test size and spacing and are used to measure the image resolution
method and defined below. For further discussion, please refer of a camera system. The point along the resolution wedge at
to Appendix X1. which the lines are no longer distinct indicates the image
resolution of the camera system.
3.2.15 resolve, v—the act of discerning the presence of a
For referenced ASTM standards, visit the ASTM website, www.astm.org, or
marking or object.
contact ASTM Customer Service at service@astm.org. For Annual Book of ASTM
Standards volume information, refer to the standard’s Document Summary page on
the ASTM website. 4. Summary of Test Method
Available from International Organization for Standardization (ISO), ISO
4.1 Thistestmethodusesstandardsymbolsofincrementally
Central Secretariat, BIBC II, Chemin de Blandonnet 8, CP 401, 1214 Vernier,
Geneva, Switzerland, http://www.iso.org. small sizes viewed by a robot from specified distances to
E2566 − 17a
measure the far-field and near-field visual acuity of each avoidance. This test method characterizes each onboard cam-
onboard camera as displayed on a remote operator interface. era to understand overall system capabilities.
The metric is the size of the smallest object that can be
5.3 This test method provides a way to unambiguously
resolved, in millimeters, at the far-field and near-field. Attri-
specify robot requirements in terms of the related measures of
butes of aspect ratio and field of view are also measured for
visual acuity and field of view. This helps quantify the
each camera.
trade-offsandgeneralusefulnessofopticalversusdigitalzoom
4.2 The apparatuses required to perform the measurements
cameras and fixed versus variable focus lenses. The visual
are visual acuity test charts displayed at a far-field distance of
acuity charts can also help provide quantitative measures of
6 m [20 ft] and a near-field distance of 40 cm [16 in.] from the
performance within other test methods and training scenarios.
robot. The remote operator identifies Landolt C symbols with
See Figs. 2-4 for illustrations.
gaps in any of eight different orientations. Autonomous sys-
tems with image processing capabilities identify machine
5.4 This test method helps evaluate the effect of illumina-
readable symbols known as quick response codes (QR codes).
tion on visual acuity. In dark environments, robots typically
need to illuminate the scene to be effective. Far-field objects
4.3 The conditions include lighted and darkened room or hallway of sufficient length and width to accommodate the robot and the charts at the specified distances (see Fig. 1). Illumination from the robot is allowed. A light meter shall be available to measure the light conditions.

5. Significance and Use

5.1 Various levels of visual acuity are essential when remotely operating robots in unstructured and often hazardous environments. Missions typically include establishing situational awareness, finding available paths, maneuvering through obstacles, identifying objects of interest, and performing detailed inspections. This test method measures robot system far-field and near-field visual acuity which are applicable to virtually every mission. These quantitative measures of performance provide a common language that allows robot users to better understand and express their own requirements and improve the way visual sensing capabilities are specified.

5.2 Multiple cameras could be incorporated into remotely operated robotic systems since a single camera is unlikely to be effective for all aspects of a mission. For example, cameras with zoom lenses are often used for far-field tasks. Cameras with close focus capabilities are often used for near-field tasks. Wide-angle lenses are often used for driving and obstacle avoidance. This test method characterizes each onboard camera to understand overall system capabilities.

5.3 This test method provides a way to unambiguously specify robot requirements in terms of the related measures of visual acuity and field of view. This helps quantify the trade-offs and general usefulness of optical versus digital zoom cameras and fixed versus variable focus lenses. The visual acuity charts can also help provide quantitative measures of performance within other test methods and training scenarios. See Figs. 2-4 for illustrations.

Far-field objects downrange require much greater light intensity than near-field objects close to the robot. Variable illumination helps ensure the scene is neither too dark nor overwhelmingly lighted so as to thwart the camera's ability to discern visual details (so-called "washout" of the image). Variable illumination is especially important when quickly transitioning from far-field to near-field and back again.

5.5 Key features of response robots are that they are remotely operated from safe standoff distances, deployable at operational tempos, capable of operating in complex environments, sufficiently hardened against harsh environments, reliable and field serviceable, durable or cost-effectively disposable, and equipped with operational safeguards. As such, a major advantage of using robots in response operations is to enhance the safety and effectiveness of responders or soldiers.

5.6 This test method aligns user expectations with actual capabilities to understand the inherent trade-offs in deployable systems at any given cost. For example, an increase in image resolution typically results in improved field of view or acuity, but not necessarily both. An increase in both may not be possible for robots of a desired weight, endurance, or cost.
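The resolution, field-of-view, and acuity trade-off described above can be sketched with a simple geometric model. This is illustrative only and not part of the test method; the camera parameters and the two-pixels-per-feature (Nyquist) assumption are hypothetical.

```python
import math

def smallest_resolvable_gap_mm(h_resolution_px, h_fov_deg, distance_mm,
                               px_per_feature=2.0):
    """Estimate the smallest feature a camera could resolve at a given
    distance, assuming the sensor grid is the only limit and that a
    feature needs about two pixels (Nyquist) to be discerned."""
    px_per_degree = h_resolution_px / h_fov_deg
    # Angle subtended by one just-resolvable feature, in degrees.
    feature_angle_deg = px_per_feature / px_per_degree
    return 2 * distance_mm * math.tan(math.radians(feature_angle_deg) / 2)

# Same pixel count, wider field of view -> larger (worse) resolvable gap.
narrow = smallest_resolvable_gap_mm(1280, 30, 6000)  # 30 deg FOV at 6 m
wide = smallest_resolvable_gap_mm(1280, 90, 6000)    # 90 deg FOV at 6 m
print(round(narrow, 2), round(wide, 2))
```

Under this model, tripling the field of view at fixed resolution triples the smallest resolvable gap, which is why an increase in one typically comes at the expense of the other.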
FIG. 1 (A) An example of a far-field visual acuity chart as viewed by a robot 6 m [20 ft] away and displayed on a remote operator interface. The chart contains lines of ten standard symbols in incrementally smaller sizes along with items of interest to highlight the applicability of different levels of acuity (road signs, hazmat placards, shipping labels, etc.). (B) The standard symbols to identify are called Landolt Rings, with gaps in any of eight different orientations. Concentrically displayed Landolt Rings contain increasingly small symbols within each other. (C) Similarly, increasingly small QR codes are used to evaluate the acuity of autonomous systems with image processing capabilities.
E2566 − 17a
FIG. 2 This Baseline Image is Used for Purposes of Comparisons Below
FIG. 3 Three Images of the Same Scene with the Same Image Resolution. Top Row Shows Field of View Increasing from Left to Right
(the image “zooms out”) While Bottom Row Shows Acuity Decreasing (features of the same size become harder to clearly observe)
Appropriate levels of understanding can help ensure that requirement specifications are articulated within the limit of current capabilities.

5.7 This test method provides a tangible representation of essential robot capabilities with quantifiable measures of performance. When considered with other related test methods in the suite, it facilitates communication among communities of robot users and manufacturers. As such, this test method can be used to:

5.7.1 Inspire technical innovation and guide manufacturers toward implementing combinations of capabilities necessary to perform essential mission tasks.

5.7.2 Measure and compare essential robot capabilities. This test method can establish the reliability of the system to perform specified tasks, highlight break-through capabilities, and encourage hardening of developmental systems.

5.7.3 Inform purchasing decisions, conduct acceptance testing, and align deployment objectives with statistically significant robot capabilities data captured through repeated testing and comparison of quantitative results.

FIG. 4 Three Images of the Same Scene with the Same Field of View. The Top Row Shows the Field of View is Unchanged While Bottom Row Shows Both Resolution and Acuity Increasing (features become clearer)
5.7.4 Focus operator training and measure proficiency as a repeatable practice task that exercises actuators, sensors, and operator interfaces. The test method can be embedded into training scenarios to capture and compare quantitative scores even within uncontrolled environmental variables. This can help develop, maintain, measure, and track very perishable skills over time and enable comparisons across squads, regions, or national averages.
5.8 Although this test method was developed as part of a suite of sensing tests for response robots, it may be applicable to other domains. Different user communities can set their own thresholds of acceptable performance within the test method for various mission requirements.
5.9 It is recommended that users of this test method consider their particular robot requirements when interpreting the test results. The capability evaluated in this test method alone shall be interpreted according to the scope of this test method and shall not be considered as an overall indication of the capability of the robot's sensing subsystem nor of the entire robotic system. A single test method only captures the specified single aspect of a robot's capabilities. A more complete characterization of a robot's capabilities requires test results from a wider set of test methods.
6. Apparatus

6.1 The components required to perform this test method are visual acuity test charts described below and devices to hold the test charts at specified distances from the robot. For lighted conditions, outdoor daylight testing is preferable. Indoor testing requires a room or hallway of sufficient length and width to accommodate the robot and the charts at the far-field distance and lighting equipment. For darkened conditions, the same room can be used if windowless. A light meter is required to measure both lighting conditions. Fig. 5 shows an example of the apparatus.

FIG. 5 An Example of a Robot Positioned in Front of the Visual Acuity Test Apparatus. The LED Studio Lights to the Right are Set Up to Provide an Even Illumination Across the Chart. For Space Reasons, Only 10 or 20 Symbols are Visible Here. Multiple Charts are Used to Achieve the Prescribed 30 Symbols

6.2 For robotic systems with camera images displayed on a remote operator interface, human-readable symbols shall be used to measure visual acuity. Landolt Ring symbols are used
as described in ISO 8596 (see Figs. 6 and 7). Each symbol consists of a black ring with an outer diameter equal to five times the ring thickness, displayed on a white background to maximize contrast. The ring contains a gap with parallel edges equal to the ring thickness. The size of the gap represents the smallest discernible feature when measuring visual acuity and is reported as the metric. The gap appears in one of eight radial orientations around the ring at 45° intervals starting from straight up. Correctly identifying a series of randomly oriented symbols within defined error rates provides statistically significant measures of visual acuity.

6.2.1 Correctly discerning the gap orientations of multiple symbols with a specified error rate represents the ability to resolve features the size of the gap. The gap orientations shall be identified relative to the top of the camera image in terms of compass directions as follows: 0°/North (N), 45°/Northeast (NE), 90°/East (E), 135°/Southeast (SE), 180°/South (S), 225°/Southwest (SW), 270°/West (W), and 315°/Northwest (NW).

6.2.2 Although the test is performed with a single sequence of 30 symbols at a selected size, multiple sizes may be provided on one chart to facilitate repeated testing at larger or smaller sizes.

6.2.3 The chart may be scaled to provide for various levels of acuity for robots under test depending on testing requirements. Typical charts will have symbols for testing at a scale of 0.25 to 5 mm [0.01 to 0.2 in.] for testing at the near-field distance of 40 cm [16 in.] and 0.5 to 25 mm [0.02 to 1 in.] for testing at the far-field distance of 6 m [20 ft].

6.2.4 The incremental size of each scale shall be no more than 20 % smaller or larger than the next closest scale. The symbols shall range from easily readable at the largest scale to unambiguously unreadable at the smallest scale. All symbols shall be printed with sufficient resolution to maintain smoothly contoured symbols. The metric used shall be in millimeters (inches).

6.2.5 At each scale, 30 randomly oriented symbols shall be identified with at least one vertical, one horizontal, and one diagonal. One or more test charts may contain sets of multiple symbols at various scales. Alternatively, a single symbol at each scale can be rotated and identified sequentially.

6.2.6 For testing near-field visual acuity at a distance of 40 cm [16 in.], high quality conventional office printers or commercially available charts shall be used to ensure small symbols with features as small as 0.25 mm are printed with sufficient resolution and contrast.

6.2.7 For testing far-field visual acuity at a distance of 6 m [20 ft], much larger symbols may be required. Large-format printers or commercially available charts shall be used to ensure smoothly contoured symbols with sufficient resolution and contrast.

6.2.8 Eq 1 may be used to compute the appropriate visual acuity for a particular size symbol viewed from a given distance and may be reported as a convenience. Table 1 provides the conversion between metric ratios (6/6), imperial ratios (20/20 ft), decimal (1.0), and percentage of average human vision (100 %), along with examples of objects that may be approximately discernible from the given distances.

%HumanVision = 100 × [ChartDistance × tan(1⁄60°)] / LandoltRingGapSize   (1)

6.2.9 For robotic systems with image processing capabilities, a visual acuity test chart with machine-readable symbols is used to measure visual acuity. Increasingly small symbols called "quick response codes" or "QR codes" are used, which are two-dimensional bar codes described in ISO/IEC 18004:2015 (see Fig. 8). Each QR code consists of a grid of black and white squares arranged in a pattern that encodes information. The ability to successfully read a QR code requires resolving features equal to the size of the individual squares in the grid. It is the size of the individual black and white squares that is reported as the metric.

6.2.10 Grids can range from 11×11, 21×21, up to 57×57 or larger that encode more information and contain embedded error checking. Smaller 11×11 grids suffice for the purpose of testing visual acuity because they encode sufficient information to identify the acuity measurement and some additional identification words, such as the chart name or number, for example. At each scale, 30 randomly oriented symbols shall be identified with at least one vertical and one diagonal. One or more test charts may contain sets of multiple symbols at
FIG. 6 (A) The relative dimensions of the human-readable symbol used to measure visual acuity of systems as displayed on a remote
operator interface. (B) A concentric set of increasingly small symbols enables measurement of different levels of acuity within a com-
pact size that is particularly useful for embedding into training scenarios. But multiple concentric sets are required to achieve the
granularity desired for testing acuity. This chart can be rotated to produce all eight orientations of the symbol.
FIG. 7 An Example Visual Acuity Chart with Increasingly Small Sets of 10 Randomly Oriented Symbols at Each Scale to Capture Statis-
tically Significant Evaluations Based on Defined Error Rates. The Scale of Each Line Must be Proportionally Adapted for either the Far-
Field or Near-Field Distance to Correctly Identify the Percentage of Average Human Vision and Visual Acuity Ratio
TABLE 1 Example Levels of Acuity with Relative Percentages of Average Human Vision at Various Distances

Columns: Acuity in Metric (Imperial) | % of Average Human Vision | Fixed Distance: able to resolve alternating black and white lines of the given spacing when observed from 6 m [20 ft] | same, when observed from 0.4 m [16 in.] | Example Object: maximum distance to reliably read a typical vehicle license plate | maximum distance to reliably read typical newsprint or a return address label
6/150 (20/500) 4 % 45 mm [1.772 in.] 3 mm [0.12 in.] 2.67 m [8.75 ft] 0.06 m [0.197 ft]
6/120 (20/400) 5 % 36 mm [1.417 in.] 2.4 mm [0.096 in.] 3.33 m [10.9 ft] 0.075 m [0.246 ft]
6/90 (20/300) 7 % 26 mm [1.024 in.] 1.7 mm [0.069 in.] 4.67 m [15.3 ft] 0.105 m [0.345 ft]
6/70 (20/240) 9 % 20 mm [0.787 in.] 1.3 mm [0.053 in.] 6 m [19.7 ft] 0.135 m [0.443 ft]
6/60 (20/200) 10 % 18 mm [0.709 in.] 1.2 mm [0.048 in.] 6.67 m [21.9 ft] 0.15 m [0.492 ft]
6/50 (20/160) 12 % 15 mm [0.591 in.] 1 mm [0.040 in.] 8 m [26.3 ft] 0.18 m [0.591 ft]
6/40 (20/125) 15 % 12 mm [0.472 in.] 0.8 mm [0.032 in.] 10 m [32.8 ft] 0.225 m [0.738 ft]
6/30 (20/100) 20 % 8.7 mm [0.343 in.] 0.58 mm [0.023 in.] 13.3 m [43.8 ft] 0.3 m [0.984 ft]
6/24 (20/80) 25 % 7 mm [0.276 in.] 0.47 mm [0.019 in.] 16.7 m [54.7 ft] 0.375 m [1.23 ft]
6/19 (20/63) 32 % 5.5 mm [0.217 in.] 0.37 mm [0.015 in.] 21.3 m [70 ft] 0.48 m [1.57 ft]
6/15 (20/50) 40 % 4.4 mm [0.173 in.] 0.29 mm [0.012 in.] 26.7 m [87.5 ft] 0.6 m [1.97 ft]
6/12 (20/40) 50 % 3.5 mm [0.138 in.] 0.23 mm [0.0093 in.] 33.3 m [109 ft] 0.75 m [2.46 ft]
6/9.5 (20/30) 63 % 2.8 mm [0.110 in.] 0.19 mm [0.0074 in.] 42 m [138 ft] 0.945 m [3.1 ft]
6/7.5 (20/25) 80 % 2.2 mm [0.087 in.] 0.15 mm [0.0058 in.] 53.3 m [175 ft] 1.2 m [3.94 ft]
6/6.0 (20/20) 100 % 1.8 mm [0.071 in.] 0.12 mm [0.0048 in.] 66.7 m [219 ft] 1.5 m [4.92 ft]
6/4.8 (20/16) 125 % 1.4 mm [0.055 in.] 0.09 mm [0.0037 in.] 83.3 m [273 ft] 1.88 m [6.15 ft]
6/3.8 (20/13) 158 % 1.1 mm [0.043 in.] 0.07 mm [0.0029 in.] 105 m [346 ft] 2.37 m [7.78 ft]
6/3.0 (20/10) 200 % 0.87 mm [0.034 in.] 0.06 mm [0.0023 in.] 133 m [437 ft] 3 m [9.84 ft]
6/2.4 (20/8.0) 250 % 0.7 mm [0.028 in.] 0.05 mm [0.0019 in.] 167 m [547 ft] 3.75 m [12.3 ft]
6/1.9 (20/6.3) 316 % 0.55 mm [0.022 in.] 0.04 mm [0.0015 in.] 211 m [691 ft] 4.74 m [15.6 ft]
6/1.5 (20/5.0) 400 % 0.44 mm [0.017 in.] 0.03 mm [0.0012 in.] 267 m [875 ft] 6 m [19.7 ft]
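As a worked example of Eq 1, the conversion from chart distance and Landolt Ring gap size to percentage of average human vision can be computed directly. This sketch assumes Eq 1 as given above; small differences from Table 1 reflect the table's rounded gap sizes.

```python
import math

def percent_human_vision(chart_distance_mm, gap_size_mm):
    """Eq 1: %HumanVision = 100 * ChartDistance * tan(1/60 deg) / gap size.
    A 'standard vision' observer resolves features subtending 1 minute of arc."""
    return 100 * chart_distance_mm * math.tan(math.radians(1 / 60)) / gap_size_mm

# Cross-check against Table 1: 6/6 (20/20) corresponds to a 1.8 mm gap
# at the 6 m far-field distance, and 6/60 (20/200) to an 18 mm gap.
print(round(percent_human_vision(6000, 1.8)))   # close to 100 %
print(round(percent_human_vision(6000, 18)))    # close to 10 %
```

The exact gap subtending one minute of arc at 6 m is about 1.745 mm, so the 1.8 mm table entry evaluates to roughly 97 % rather than exactly 100 %.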
various scales. Alternatively, multiple symbols may be arranged in a "flip chart" layout and identified sequentially.

6.2.11 Auxiliary image processing devices, such as QR code scanners available on many mobile phones, can be used to identify these symbols as displayed on a remote operator interface. This can provide a consistent indication of system acuity without introducing potential errors due to operator eyesight with or without corrective lenses.

6.3 A ruler or other object of known size of approximately 90 to 120 cm [3 to 4 ft] is used to measure the field of view of each camera. It is recommended that this be an ISO 12233 chart, which can also be used to independently measure the image resolution of each camera. The chart contains aspect ratio indicators to determine the field of view of the camera (see Fig. 9). The chart must fill the vertical frame of the camera while being in the optimal range of focus, so different size charts are typically needed to ensure both objectives are achieved simultaneously. A chart stand with variable orientation and elevation from ground level to the topmost camera on the robot is needed to quickly adjust the chart relative to all the onboard cameras.

6.4 Other Apparatuses Include:
6.4.1 A measuring tape to determine the distance between the robot and the charts.
6.4.2 A device for displaying the charts at fixed distances from the robot, including wall hanging, tripods, or other such devices.
6.4.3 Equipment for marking a line at the prescribed distance from the charts (40 cm [16 in.] or 6 m [20 ft]). In an indoor environment, colored adhesive tape may be used.
6.4.4 A light meter with range from 0.01 to 1000 lux ±5 %.
6.4.5 For lighted conditions, an outdoor area with indirect sunlight or a room with sufficient length and width to accommodate the robot and the charts at the far-field distance 6 m [20 ft] and near-field distance of 40 cm [16 in.]. Either way, the environment should be evenly lighted to 150 to 300 lux or more using a smooth spectrum, non-flickering light source, such as an incandescent lamp or studio LED panel.

6.5 For darkened conditions, a windowless room or hallway with illumination of no more than 0.1 lux. The room shall be of sufficient length and width to accommodate the robot and the charts at the far-field distance 6 m [20 ft] and near-field
FIG. 8 An example of a QR code using a 21×21 square grid. This symbol encodes a line of text that would be read as "TestCode1". The ability to successfully read this code represents the ability to resolve features equal in size to the individual squares that make up the code (or 1/21st of the size of the whole code). Note that QR codes require more space to encode a similar size acuity symbol: in this case the symbol is 21 times larger than the acuity measure, where a Landolt Ring symbol is five times larger than its acuity measurement feature. An alternative layout combines the Landolt Ring symbols and QR Code symbols on one vision test chart; the ability to resolve the QR Code at a given size is equivalent to reading the adjacent set of Landolt Ring symbols.
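The relationship in the caption between QR symbol size, grid dimension, and the reported acuity metric can be sketched as follows. The 37.8 mm symbol size is a hypothetical example, not a dimension from the standard.

```python
def qr_acuity_metric_mm(symbol_size_mm, grid_n):
    """The acuity metric for a QR symbol is the size of one black or
    white square (module): symbol size divided by grid dimension."""
    return symbol_size_mm / grid_n

# A hypothetical 21x21 QR code 37.8 mm wide has 1.8 mm modules, so
# reading it is roughly equivalent to resolving a Landolt Ring with a
# 1.8 mm gap -- but that ring is only five times its gap (9 mm wide),
# versus the QR symbol's 21 times (37.8 mm wide).
module = qr_acuity_metric_mm(37.8, 21)
landolt_outer = 5 * module
print(round(module, 2), round(landolt_outer, 1))
```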
FIG. 9 (A) A chart suitable for determining field-of-view and image resolution is an ISO 12233 chart, as implemented here by Cornell University. However, any vision test chart that contains field-of-view markings and resolution wedges (highlighted) will suffice. (B) An example view showing a 4:3 aspect ratio as indicated by the arrows in the corners and some loss of resolution shown in the resolution wedges.
distance of 40 cm [16 in.]. When testing camera systems that are sensitive to near-infrared light (for example, "night vision" cameras), ensure there are no additional sources of infrared light in the scene. Such sources may include surveillance cameras with infrared lights.

6.6 A test sponsor is authorized to specify other test apparatus sizes, distances, and conditions to suit particular deployment requirements.

7. Hazards

7.1 In addition to 1.8, users of this test method shall also

7.3 While the robot is active and the emergency stop button is disengaged, avoid:
7.3.1 the areas directly in front of and behind the robot,
7.3.2 the reachable radius of the robot's manipulator, as equipped, and
7.3.3 touching the robot other than to engage the emergency stop button.

8. Procedure

8.1 Ensure that the apparatus and environmental conditions are set up properly according to the Apparatus Section
...
This document is not an ASTM standard and is intended only to provide the user of an ASTM standard an indication of what changes have been made to the previous version. Because
it may not be technically possible to adequately depict all changes accurately, ASTM recommends that users consult prior editions as appropriate. In all cases only the current version
of the standard as published by ASTM is to be considered the official document.
Designation: E2566 − 17a
Standard Test Method for
Evaluating Response Robot Sensing: Visual Acuity
This standard is issued under the fixed designation E2566; the number immediately following the designation indicates the year of
original adoption or, in the case of revision, the year of last revision. A number in parentheses indicates the year of last reapproval. A
superscript epsilon (´) indicates an editorial change since the last revision or reapproval.
INTRODUCTION
The robotics community needs ways to measure whether a particular robot is capable of performing
specific missions in unstructured and often hazardous environments. These missions decompose into
elemental robot tasks that can be represented individually as standard test methods and practices. The
associated test apparatuses and performance metrics provide a tangible language to communicate
various mission requirements. They also enable repeatable testing to establish the reliability of
essential robot capabilities.
The ASTM International Standards Committee on Homeland Security Applications (E54) specifies standard test methods and practices for evaluating individual robot capabilities. These standards facilitate comparisons across diverse models or multiple configurations of a single model. The standards support robot researchers, manufacturers, and user organizations in different ways. Researchers use the standards to understand mission requirements, encourage innovation, and demonstrate break-through capabilities. Manufacturers use the standards to evaluate design decisions, integrate emerging technologies, and harden developed systems. User organizations leverage the resulting robot capabilities data to guide purchasing decisions, align deployment objectives, and focus training with standard measures of operator proficiency. Associated usage guides describe how such standards can be applied to support these various objectives.

The overall suite of standards addresses critical subsystems of remotely operated response robots, including maneuvering, mobility, dexterity, sensing, energy, communications, durability, proficiency, autonomy, logistics, safety, and terminology. This test method is part of the Sensing test suite and addresses the visual acuity of onboard cameras.
1. Scope
1.1 The purpose of this test method is to specify the apparatuses, procedures, and performance metrics necessary to quantitatively measure a robot's visual acuity as displayed to a remote operator or vision algorithm. The primary performance metric for this test method shall be a robot's possession of such a capability with a specified statistical significance level.
1.2 Secondary performance metrics are the robot’s field of view and aspect ratio.
This test method is under the jurisdiction of ASTM Committee E54 on Homeland Security Applications and is the direct responsibility of Subcommittee E54.09 on
Response Robots.
Current edition approved Sept. 1, 2017. Published November 2017. Originally approved in 2008. Last previous edition approved in 2017 as E2566 – 17. DOI: 10.1520/E2566-17A.
Copyright © ASTM International, 100 Barr Harbor Drive, PO Box C700, West Conshohocken, PA 19428-2959. United States
1.3 This test method can also be used to measure the operator proficiency in performing the specified task. The corresponding performance metric may be the number of completed task repetitions per minute over an assigned time period ranging from 10 to 30 minutes.
1.4 This test method is a part of the sensing suite of response robot test methods, but this test method is stand-alone and complete. This test method applies to systems operated remotely from a standoff distance appropriate for the intended mission. The system includes a remote operator in control of all functionality and any assistive features or autonomous behaviors that improve the effectiveness or efficiency of the overall system.
1.5 The apparatus, specified in Section 6, can only test a limited range of a robot’s capabilities. When the robot has been tested
through the limit or limits of the apparatus, a note shall be associated with the results indicating that the robot’s actual capability
may be outside of the limit or limits imposed by the test apparatus. For example, the robot could exceed the capabilities of the
printing process used to create the charts used in the apparatus.
1.6 Performing Location—This test method may be performed anywhere the specified apparatuses and environmental conditions can be implemented.
1.7 Units—The values stated in either SI units or inch-pound units are to be regarded separately as standard. The values stated in each system may not be exact equivalents; therefore, each system shall be used independently of the other. Combining values from the two systems may result in nonconformance with the standard. Both units are referenced to facilitate acquisition of materials internationally and minimize fabrication costs.
1.8 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility
of the user of this standard to establish appropriate safety, health, and environmental practices and determine the applicability of
regulatory limitations prior to use.
1.9 This international standard was developed in accordance with internationally recognized principles on standardization
established in the Decision on Principles for the Development of International Standards, Guides and Recommendations issued
by the World Trade Organization Technical Barriers to Trade (TBT) Committee.
2. Referenced Documents
2.1 ASTM Standards:
E2521 Terminology for Evaluating Response Robot Capabilities
E2592 Practice for Evaluating Response Robot Capabilities: Logistics: Packaging for Urban Search and Rescue Task Force
Equipment Caches
2.2 Additional Standards:
ISO 12233 Photography – Electronic Still Picture Imaging – Resolution and Spatial Frequency Responses
ISO 8596:2009 Ophthalmic Optics – Visual Acuity Testing – Standard Symbol and Its Presentation
ISO/IEC 18004:2015 Information Technology – Automatic Identification and Data Capture Techniques – QR Code Bar Code Symbology Specification
3. Terminology
3.1 The following terms are used in this test method and are defined in Terminology E2521: abstain, administrator or test
administrator, emergency response robot or response robot, fault condition, operator, operator station, remote control, repetition,
robot, teleoperation, test event or event, test form, test sponsor, test suite, testing target or target, testing task or task, and trial or
test trial.
3.2 Definitions: The following terms are used in this test method and defined below. For further discussion, please refer to
Appendix X1.
3.2.1 aspect ratio, n—the ratio of width to height of the image produced by a camera system.
For referenced ASTM standards, visit the ASTM website, www.astm.org, or contact ASTM Customer Service at service@astm.org. For Annual Book of ASTM Standards volume information, refer to the standard's Document Summary page on the ASTM website.
Available from International Organization for Standardization (ISO), ISO Central Secretariat, BIBC II, Chemin de Blandonnet 8, CP 401, 1214 Vernier, Geneva, Switzerland, http://www.iso.org.
3.2.2 camera system, n—a specific camera with its associated lighting, compression, interface, display, and operator station
settings that may be required to work together on the hosting robot to display the image.
3.2.3 dynamic range, n—a measure of the ability of a camera system to, in a single scene, simultaneously observe details on objects in the dark environment and in the environment or environments with certain light intensity.
3.2.4 focal length, n—a measure of how wide or narrow a camera's field of view is. A longer focal length provides a narrower field of view than a shorter one.
3.2.5 foveated vision (system), n—a camera system that has higher resolution (provides more information) at the center than the
edges of the image.
3.2.6 framerate, n—a measure of the temporal resolution of a camera system and refers to the number of complete images per second displayed on a remote system interface.
3.2.7 image, n—a two-dimensional matrix of values with each of the two dimensions representing angular deviation (possibly non-linear) in orthogonal direction from the sensor's optical axis.
3.2.8 imager, n—a sensor, or system of sensors, that produces an image.
3.2.9 image acuity (or acuity), n—a measure of the resolving capability of the robot’s camera system.
3.2.10 image field of view (or field of view), n—a measure of the extent of the robot’s environment that may be observed in a
single visual image, measured in terms of degrees in the horizontal and vertical directions.
3.2.11 image resolution, n—the amount of visual information that the robot's camera system is capable of conveying to the operator (regardless of the field of view of the system) and is measured as the number of black and white lines that can be clearly resolved, per image height, in the horizontal, vertical and diagonal directions. Reporting relative to image height normalizes for variations in aspect ratio.
3.2.12 Landolt Ring or Landolt C, n—a symbol consisting of a black circular ring with a white gap, both with specified sizes,
as defined in ISO 8596:2009.
3.2.13 QR Code (or Quick Response Code), n—a pattern of black squares that encodes digital information and is designed to
be read by a computer.
3.2.14 resolution wedge, n—a series of lines that decrease in size and spacing and are used to measure the image resolution of
a camera system. The point along the resolution wedge at which the lines are no longer distinct indicates the image resolution of
the camera system.
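For illustration, the field of view defined in 3.2.10 and measured with the known-size object of 6.3 follows from simple trigonometry. The object size and distance below are hypothetical examples, not values prescribed by the standard.

```python
import math

def field_of_view_deg(object_size_mm, distance_mm):
    """Angular field of view subtended when a known-size object (such as
    the ruler of 6.3) exactly fills the image frame at a given distance."""
    return math.degrees(2 * math.atan(object_size_mm / (2 * distance_mm)))

# A 1000 mm ruler filling the vertical frame from 1 m away subtends
# about 53 degrees of vertical field of view.
print(round(field_of_view_deg(1000, 1000), 1))
```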
3.2.15 resolve, v—the act of discerning the presence of a marking or object.
3. Units for Reporting Visual Acuity
3.1 The commonly used distance for measuring visual acuity is 20 ft in the United States. This leads to the “Snellen fraction”
as the common measure of visual acuity: 20/20, 20/40, and so on. The Snellen fraction is also used in England, referring to 6 m
as the standard measurement distance (6/6, 6/12, etc.), while the rest of Europe generally uses the decimal fraction equivalent:
20/20 = 6/6 = 1.0; 20/40 = 6/12 = 0.5; etc. Measurements may be taken at any distance and the result scaled to the common distance.
3.2 The meaning of 6/12 (20/40 or 0.5) is that features that can be resolved at 6 m (20 ft) by the test subject are of a size such
that a person with “standard” visual acuity could resolve them at 12 m (40 ft). The characters on the 6/12 (20/40, 0.5) line of an
eye chart are twice the size of the characters on the 6/6 (20/20, 1.0) line. The best human vision is not 6/6 (20/20, 1.0), resolving
1 min of arc (1/60° ≈ 0.017°), but more like 6/3.6 (20/12, 1.7), resolving about 0.01°.
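The conversions above can be sketched in a few lines of code. This is an illustrative sketch, not part of the standard; the function names are ours. It converts a Snellen fraction to its decimal equivalent and computes the smallest feature a viewer with that acuity can resolve at a given distance, using the 1-arcminute definition of standard (6/6, 20/20, 1.0) vision.

```python
import math

def snellen_to_decimal(numerator: float, denominator: float) -> float:
    """6/12 -> 0.5, 20/20 -> 1.0, etc."""
    return numerator / denominator

def smallest_resolvable_mm(distance_mm: float, decimal_acuity: float) -> float:
    """Feature size subtending 1 arcminute at distance_mm, scaled by acuity."""
    one_arcmin = math.radians(1 / 60)  # 1/60 of a degree, in radians
    return distance_mm * math.tan(one_arcmin) / decimal_acuity

print(snellen_to_decimal(6, 12))                          # 0.5
print(round(smallest_resolvable_mm(6000, 1.0), 2))        # 1.75 (mm at 6 m)
```

For 6/12 (0.5) vision at the 6 m test distance this gives about 3.5 mm, which is consistent with the chart scaling described later in this test method.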
4. Summary of Test Method
4.1 This test method uses standard symbols of incrementally small sizes viewed by a robot from specified distances to measure
the far-field and near-field visual acuity of each onboard camera as displayed on a remote operator interface. The metric is the size
of the smallest object that can be resolved, in millimeters, at the far-field and near-field. Attributes of aspect ratio and field of view
are also measured for each camera.
4.2 The apparatuses required to perform the measurements are visual acuity test charts displayed at a far-field distance of 6 m
[20 ft] and a near-field distance of 40 cm [16 in.] from the robot. The remote operator identifies Landolt C symbols with gaps in
any of eight different orientations. Autonomous systems with image processing capabilities identify machine readable symbols
known as quick response codes (QR codes).
4.3 The conditions include lighted and darkened room or hallway of sufficient length and width to accommodate the robot and
the charts at the specified distances (see Fig. 1). Illumination from the robot is allowed. A light meter shall be available to measure
the light conditions.
5. Significance and Use
5.1 Various levels of visual acuity are essential when remotely operating robots in unstructured and often hazardous
environments. Missions typically include establishing situational awareness, finding available paths, maneuvering through
obstacles, identifying objects of interest, and performing detailed inspections. This test method measures robot system far-field and
near-field visual acuity which are applicable to virtually every mission. These quantitative measures of performance provide a
common language that allows robot users to better understand and express their own requirements and improve the way visual
sensing capabilities are specified.
FIG. 1 (A) An example of a far-field visual acuity chart as viewed by a robot 6 m [20 ft] away and displayed on a remote operator
interface. The chart contains lines of ten standard symbols in incrementally small sizes along with items of interest to highlight the
applicability of different levels of acuity (road signs, hazmat placards, shipping labels, etc.). (B) The standard symbols to identify are
called Landolt Rings with gaps in any of eight different orientations. Concentrically displayed Landolt Rings contain increasingly
small symbols within each other. (C) Similarly, increasingly small QR codes are used to evaluate the acuity of autonomous systems
with image processing capabilities.
5.2 Multiple cameras could be incorporated into remotely
operated robotic systems since a single camera is unlikely to be effective for all aspects of a mission. For example, cameras with
zoom lenses are often used for far-field tasks. Cameras with close focus capabilities are often used for near-field tasks. Wide-angle
lenses are often used for driving and obstacle avoidance. This test method characterizes each onboard camera to understand overall
system capabilities.
5.3 This test method provides a way to unambiguously specify robot requirements in terms of the related measures of visual
acuity and field of view. This helps quantify the trade-offs and general usefulness of optical versus digital zoom cameras and fixed
versus variable focus lenses. The visual acuity charts can also help provide quantitative measures of performance within other test
methods and training scenarios. See Figs. 2-4 for illustrations.
5.4 This test method helps evaluate the effect of illumination on visual acuity. In dark environments, robots typically need to
illuminate the scene to be effective. Far-field objects downrange require much greater light intensity than near-field objects close
to the robot. Variable illumination helps ensure the scene is neither too dark nor overwhelmingly lighted so as to thwart the
camera’s ability to discern visual details (so-called “washout” of the image). Variable illumination is especially important when
quickly transitioning from far-field to near-field and back again.
5.5 Key features of response robots are that they are remotely operated from safe standoff distances, deployable at operational
tempos, capable of operating in complex environments, sufficiently hardened against harsh environments, reliable and field
serviceable, durable or cost-effectively disposable, and equipped with operational safeguards. As such, a major advantage of using
robots in response operations is to enhance the safety and effectiveness of responders or soldiers.
5.6 Field of View is important in terms of the ability of the operator to drive the robot. Looking at the world through a zoom
lens is like “looking through a soda straw.” Looking with a 30 or 40° field of view lens is like “driving with blinders on.” On the
other hand, with a very wide field of view lens (120 or 150°), the operator’s use of optic flow to cue depth perception is severely
degraded and navigating in a tight environment is very difficult. This test method aligns user expectations with actual capabilities
to understand the inherent trade-offs in deployable systems at any given cost. For example, an increase in image resolution typically
results in improved field of view or acuity, but not necessarily both. An increase in both may not be possible for robots of a desired
weight, endurance, or cost. Appropriate levels of understanding can help ensure that requirement specifications are articulated
within the limit of current capabilities.
FIG. 2 This Baseline Image is Used for Purposes of Comparisons Below
FIG. 3 Three Images of the Same Scene with the Same Image Resolution. Top Row Shows Field of View Increasing from Left to Right
(the image “zooms out”) While Bottom Row Shows Acuity Decreasing (features of the same size become harder to clearly observe)
FIG. 4 Three Images of the Same Scene with the Same Field of View. The Top Row Shows the Field of View is Unchanged While
Bottom Row Shows Both Resolution and Acuity Increasing (features become clearer)
5.7 This test method provides a tangible representation of essential robot capabilities with quantifiable measures of
performance. When considered with other related test methods in the suite, it facilitates communication among communities of
robot users and manufacturers. As such, this test method can be used to:
5.7.1 Inspire technical innovation and guide manufacturers toward implementing combinations of capabilities necessary to
perform essential mission tasks.
5.7.2 Measure and compare essential robot capabilities. This test method can establish the reliability of the system to perform
specified tasks, highlight break-through capabilities, and encourage hardening of developmental systems.
5.7.3 Inform purchasing decisions, conduct acceptance testing, and align deployment objectives with statistically significant
robot capabilities data captured through repeated testing and comparison of quantitative results.
5.7.4 Focus operator training and measure proficiency as a repeatable practice task that exercises actuators, sensors, and
operator interfaces. The test method can be embedded into training scenarios to capture and compare quantitative scores even
within uncontrolled environmental variables. This can help develop, maintain, measure, and track very perishable skills over time
and enable comparisons across squads, regions, or national averages.
5.8 Although this test method was developed as part of a suite of sensing tests for response robots, it may be applicable to other
domains. Different user communities can set their own thresholds of acceptable performance within the test method for various
mission requirements.
5.9 It is recommended that users of this test method consider their particular robot requirements when interpreting the test
results. The capability evaluated in this test method alone shall be interpreted according to the scope of this test method and shall
not be considered as an overall indication of the capability of the robot’s mobility subsystem nor of the entire robotic system. A
single test method only captures the specified single aspect of a robot’s capabilities. A more complete characterization of a robot’s
capabilities requires test results from a wider set of test methods.
6. Apparatus
6.1 The components required to perform this test method are visual acuity test charts described below and devices to hold the
test charts at specified distances from the robot. For lighted conditions, outdoor daylight testing is preferable. Indoor testing
requires a room or hallway of sufficient length and width to accommodate the robot and the charts at the far-field distance and
lighting equipment. For darkened conditions, the same room can be used if windowless. A light meter is required to measure both
lighting conditions. Fig. 5 shows an example of the apparatus.
6.2 For robotic systems with camera images displayed on a remote operator interface, human-readable symbols shall be used
to measure visual acuity. Landolt Ring symbols are used as described in ISO 8596 (see Figs. 6 and 7). Each symbol consists of
a black ring with an outer diameter equal to five times the ring thickness displayed on a white background to maximize contrast.
The ring contains a gap with parallel edges equal to the ring thickness. The size of the gap represents the smallest discernible
feature when measuring visual acuity and is reported as the metric. The gap appears in one of eight radial orientations around the
ring at 45° intervals starting from straight up. Correctly identifying a series of randomly oriented symbols within defined error rates
provides statistically significant measures of visual acuity.
FIG. 5 An Example of a Robot Positioned in Front of the Visual Acuity Test Apparatus. The LED Studio Lights to the Right are Set
Up to Provide an Even Illumination Across the Chart. For Space Reasons, Only 10 or 20 Symbols are Visible Here. Multiple Charts
are Used to Achieve the Prescribed 30 Symbols
FIG. 6 (A) The relative dimensions of the human-readable symbol used to measure visual acuity of systems as displayed on a remote
operator interface. (B) A concentric set of increasingly small symbols enables measurement of different levels of acuity within a com-
pact size that is particularly useful for embedding into training scenarios. But multiple concentric sets are required to achieve the
granularity desired for testing acuity. This chart can be rotated to produce all eight orientations of the symbol.
FIG. 7 An Example Visual Acuity Chart with Increasingly Small Sets of 10 Randomly Oriented Symbols at Each Scale to Capture Statis-
tically Significant Evaluations Based on Defined Error Rates. The Scale of Each Line Must be Proportionally Adapted for either the Far-
Field or Near-Field Distance to Correctly Identify the Percentage of Average Human Vision and Visual Acuity Ratio
6.2.1 Correctly discerning the gap orientations of multiple symbols with a specified error rate represents the ability to resolve
features the size of the gap. The gap orientations shall be identified relative to the top of the camera image in terms of compass
directions as follows: 0°/North (N), 45°/Northeast (NE), 90°/East (E), 135°/Southeast (SE), 180°/South (S), 225°/Southwest (SW),
270°/West (W), and 315°/Northwest (NW).
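The angle-to-compass reporting convention in 6.2.1 can be sketched as follows. This is an illustrative helper (the names are ours, not from the standard) that snaps a gap angle, measured clockwise from the top of the camera image, to the nearest 45° step and returns its compass label.

```python
# Eight labels at 45-degree intervals, starting from straight up (0 degrees).
COMPASS = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]

def gap_orientation_label(angle_deg: float) -> str:
    """Snap an angle to the nearest 45-degree step and return its label."""
    step = round(angle_deg / 45) % 8  # modulo 8 wraps angles near 360 back to N
    return COMPASS[step]

print(gap_orientation_label(0))     # N
print(gap_orientation_label(135))   # SE
print(gap_orientation_label(350))   # N (nearest step wraps around)
```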
6.2.2 Although the test is performed with a single sequence of 30 symbols at a selected size, multiple sizes may be provided
on one chart to facilitate repeated testing at larger or smaller sizes.
6.2.3 The chart may be scaled to provide for various levels of acuity for robots under test depending on testing requirements.
Typical charts will have symbols for testing at a scale of 0.25 to 5 mm [0.01 to 0.2 in.] for testing at the near field distance of 40
cm [16 in.] and 0.5 to 25 mm [0.02 to 1 in.] for testing at the far field distance of 6 m [20 ft].
6.2.4 The incremental size of each scale shall be no more than 20 % smaller or larger than the next closest scale. The symbols
shall range from easily readable at the largest scale to unambiguously unreadable at the smallest scale. All symbols shall be printed
with sufficient resolution to maintain smoothly contoured symbols. The metric used shall be in millimeters (inches).
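The sizing rule in 6.2.4 can be illustrated with a short sketch (our own helper, under the assumption that each scale is exactly 20 % smaller than the previous, the maximum increment the clause allows). It generates a series of symbol scales spanning the near-field range of 0.25 to 5 mm given in 6.2.3.

```python
def scale_series(largest_mm: float, smallest_mm: float, step: float = 0.8):
    """Geometric series of symbol sizes; step=0.8 means each is 20 % smaller."""
    sizes = []
    size = largest_mm
    while size >= smallest_mm:
        sizes.append(round(size, 3))
        size *= step  # next scale is 20 % smaller than the last
    return sizes

print(scale_series(5.0, 0.25))
```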
6.2.5 At each scale, 30 randomly oriented symbols shall be identified with at least one vertical, one horizontal, and one diagonal.
One or more test charts may contain sets of multiple symbols at various scales. Alternatively, a single symbol at each scale can
be rotated and identified sequentially.
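A chart-generation step satisfying 6.2.5 could be sketched as below. This is an assumption-laden illustration (function name and rejection-sampling approach are ours): it draws 30 random orientations and re-draws until the sequence contains at least one vertical, one horizontal, and one diagonal symbol.

```python
import random

ORIENTATIONS = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]

def chart_sequence(n=30, seed=None):
    """Random orientation sequence with at least one vertical, horizontal, diagonal."""
    rng = random.Random(seed)
    while True:
        seq = [rng.choice(ORIENTATIONS) for _ in range(n)]
        has_vertical = any(s in ("N", "S") for s in seq)
        has_horizontal = any(s in ("E", "W") for s in seq)
        has_diagonal = any(s in ("NE", "SE", "SW", "NW") for s in seq)
        if has_vertical and has_horizontal and has_diagonal:
            return seq

seq = chart_sequence(seed=1)
print(len(seq), seq[:5])
```

A re-draw is almost never needed in practice: the chance that 30 random symbols miss an entire orientation class is well under 0.1 %.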
6.2.6 For testing near-field visual acuity at a distance of 40 cm [16 in.], high quality conventional office printers or commercially
available charts shall be used to ensure small symbols with features as small as 0.25 mm are printed with sufficient resolution and
contrast.
6.2.7 For testing far-field visual acuity at a distance of 6 m [20 ft], much larger symbols may be required. Large-format printers
or commercially available charts shall be used to ensure smoothly contoured symbols with sufficient resolution and contrast.
6.2.8 Eq 1 may be used to compute the appropriate visual acuity for a particular size symbol viewed from a given distance and
may be reported as a convenience. Table 1 provides the conversion between metric ratios (6/6), imperial ratios (20/20 ft), decimal
(1.0), and percentage of average human vision (100 %), along with examples of objects that may be approximately discernible
from the given distances.
%Human Vision = 100 × [Chart Distance × tan(1⁄60°)] / Landolt Ring gap size   (1)
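Eq 1 translates directly into code. The sketch below (function name is ours) computes the percentage of average human vision from the chart distance and the Landolt Ring gap size, both in the same units:

```python
import math

def percent_human_vision(chart_distance_mm: float, gap_size_mm: float) -> float:
    """Eq 1: 100 x (distance x tan(1/60 degree)) / gap size."""
    one_arcmin = math.radians(1 / 60)
    return 100 * chart_distance_mm * math.tan(one_arcmin) / gap_size_mm

# A ~1.75 mm gap read at the 6 m far-field distance is 100 % (6/6, 20/20):
print(round(percent_human_vision(6000, 1.745)))   # 100
```

Resolving a 3.5 mm gap at 6 m gives about 50 %, matching the 6/12 (20/40) row of Table 1.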
TABLE 1 Example Levels of Acuity with Relative Percentages of Average Human Vision at Various Distances
Columns, left to right:
(1) Acuity in Metric (Imperial)
(2) % of Average Human Vision
(3–4) Fixed distance: able to resolve alternating black and white lines of the following spacing when observed from a distance of
6 m [20 ft] and 0.4 m [16 in.], respectively
(5–6) Example object: must be no more than the following distance away to reliably read a typical vehicle license plate and typical
newsprint or a return address label, respectively
6/150 (20/500) 4 % 45 mm [1.772 in.] 3 mm [0.12 in.] 2.67 m [8.75 ft] 0.06 m [0.197 ft]
6/120 (20/400) 5 % 36 mm [1.417 in.] 2.4 mm [0.096 in.] 3.33 m [10.9 ft] 0.075 m [0.246 ft]
6/90 (20/300) 7 % 26 mm [1.024 in.] 1.7 mm [0.069 in.] 4.67 m [15.3 ft] 0.105 m [0.345 ft]
6/70 (20/240) 9 % 20 mm [0.787 in.] 1.3 mm [0.053 in.] 6 m [19.7 ft] 0.135 m [0.443 ft]
6/60 (20/200) 10 % 18 mm [0.709 in.] 1.2 mm [0.048 in.] 6.67 m [21.9 ft] 0.15 m [0.492 ft]
6/50 (20/160) 12 % 15 mm [0.591 in.] 1 mm [0.040 in.] 8 m [26.3 ft] 0.18 m [0.591 ft]
6/40 (20/125) 15 % 12 mm [0.472 in.] 0.8 mm [0.032 in.] 10 m [32.8 ft] 0.225 m [0.738 ft]
6/30 (20/100) 20 % 8.7 mm [0.343 in.] 0.58 mm [0.023 in.] 13.3 m [43.8 ft] 0.3 m [0.984 ft]
6/24 (20/80) 25 % 7 mm [0.276 in.] 0.47 mm [0.019 in.] 16.7 m [54.7 ft] 0.375 m [1.23 ft]
6/19 (20/63) 32 % 5.5 mm [0.217 in.] 0.37 mm [0.015 in.] 21.3 m [70 ft] 0.48 m [1.57 ft]
6/15 (20/50) 40 % 4.4 mm [0.173 in.] 0.29 mm [0.012 in.] 26.7 m [87.5 ft] 0.6 m [1.97 ft]
6/12 (20/40) 50 % 3.5 mm [0.138 in.] 0.23 mm [0.0093 in.] 33.3 m [109 ft] 0.75 m [2.46 ft]
6/9.5 (20/30) 63 % 2.8 mm [0.110 in.] 0.19 mm [0.0074 in.] 42 m [138 ft] 0.945 m [3.1 ft]
6/7.5 (20/25) 80 % 2.2 mm [0.087 in.] 0.15 mm [0.0058 in.] 53.3 m [175 ft] 1.2 m [3.94 ft]
6/6.0 (20/20) 100 % 1.8 mm [0.071 in.] 0.12 mm [0.0048 in.] 66.7 m [219 ft] 1.5 m [4.92 ft]
6/4.8 (20/16) 125 % 1.4 mm [0.055 in.] 0.09 mm [0.0037 in.] 83.3 m [273 ft] 1.88 m [6.15 ft]
6/3.8 (20/13) 158 % 1.1 mm [0.043 in.] 0.07 mm [0.0029 in.] 105 m [346 ft] 2.37 m [7.78 ft]
6/3.0 (20/10) 200 % 0.87 mm [0.034 in.] 0.06 mm [0.0023 in.] 133 m [437 ft] 3 m [9.84 ft]
6/2.4 (20/8.0) 250 % 0.7 mm [0.028 in.] 0.05 mm [0.0019 in.] 167 m [547 ft] 3.75 m [12.3 ft]
6/1.9 (20/6.3) 316 % 0.55 mm [0.022 in.] 0.04 mm [0.0015 in.] 211 m [691 ft] 4.74 m [15.6 ft]
6/1.5 (20/5.0) 400 % 0.44 mm [0.017 in.] 0.03 mm [0.0012 in.] 267 m [875 ft] 6 m [19.7 ft]
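The 6 m column of Table 1 can be cross-checked with a short sketch (our own helper, not part of the standard): for an acuity of 6/X, the resolvable line spacing at 6 m is X/6 times the 1-arcminute feature size at that distance.

```python
import math

def spacing_at_6m_mm(denominator: float) -> float:
    """Line spacing resolvable at 6 m for an acuity of 6/denominator."""
    one_arcmin = math.radians(1 / 60)
    return 6000 * math.tan(one_arcmin) * (denominator / 6)

print(round(spacing_at_6m_mm(60), 1))   # 17.5 mm, tabulated as 18 mm for 6/60
print(round(spacing_at_6m_mm(12), 1))   # 3.5 mm, matching the 6/12 row
```

Small discrepancies against the table (for example, 17.5 mm computed versus 18 mm tabulated) reflect rounding in the published values.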
6.2.9 For robotic systems with image processing capabilities, a visual acuity test chart with machine-readable symbols is used
to measure visual acuity. Increasingly small symbols called “quick response codes” or “QR codes” are used, which are
two-dimensional bar codes described in ISO/IEC 18004:2015 (see Fig. 8). Each QR code consists of a grid of black and white
squares arranged in a pattern that encodes information. The ability to successfully read a QR code requires resolving features equal
to the size of the individual squares in the grid. It is the size of the individual black and white squares that is reported as the metric.
6.2.10 Grids can range from 11×11, 21×21, up to 57×57 or larger, which encode more information and contain embedded error
checking. Smaller 11×11 grids suffice for the purpose of testing visual acuity because they encode sufficient information to identify
the acuity measurement and some additional identification words such as the chart name or number, for example. At each scale,
30 randomly oriented symbols shall be identified with at least one vertical and one diagonal. One or more test charts may contain
sets of multiple symbols at various scales. Alternatively, multiple symbols may be arranged in a “flip chart” layout and identified
sequentially.
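Because the reported metric for a QR symbol is the size of one module (grid square), the conversion from printed symbol size to acuity feature size is a simple division. A minimal sketch (function name is ours):

```python
def module_size_mm(printed_size_mm: float, grid: int) -> float:
    """Size of one QR module: the acuity feature the camera must resolve."""
    return printed_size_mm / grid

# An 11x11 code printed at 33 mm has 3 mm modules, so reading it demonstrates
# roughly the acuity of resolving a 3 mm Landolt Ring gap:
print(module_size_mm(33, 11))   # 3.0
print(module_size_mm(21, 21))   # 1.0
```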
6.2.11 Auxiliary image processing devices, such as QR code scanners available on many mobile phones, can be used to identify
these symbols as displayed on a remote operator interface. This can provide a consistent indication of system acuity without
introducing potential errors due to operator eyesight with or without corrective lenses.
FIG. 8 (A) An example of a QR code using a 21×21 square grid. This symbol encodes a line of text that would be read as “TestCode1”.
The ability to successfully read this code represents the ability to resolve features equal in size to the individual squares that make
up the code (or 1/21st of the size of the whole code). Note that QR codes require more space to encode a similar size acuity symbol,
in this case 21 times larger than the acuity measure, whereas a Landolt Ring symbol is five times larger than its acuity measurement
feature. (B) An alternative layout that combines the Landolt Ring symbols and QR Code symbols on one vision test chart. The ability
to resolve the QR Code at a given size is equivalent to reading the adjacent set of Landolt Ring symbols.
6.3 A ruler or other object of known size of approximately 90 to 120 cm [3 to 4 ft] is used to measure the field of view of each
camera. It is recommended that this be an ISO 12233 chart, which can also be used to independently measure the image resolution
of each camera. The chart contains aspect ratio indicators to determine the field of view of the camera (see Fig. 9). The chart must
fill the vertical frame of the camera while being in the optimal range of focus so different size charts are typically needed to ensure
both objectives are achieved simultaneously. A chart stand with variable orientation and elevation from ground level to the top most
camera on the robot is needed to quickly adjust the chart relative to all the onboard cameras.
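The field-of-view geometry implied above can be sketched with basic trigonometry (an illustrative helper, not from the standard): if a chart or ruler of known width exactly spans the camera frame at a known distance, the field of view follows from the half-angle.

```python
import math

def field_of_view_deg(span_mm: float, distance_mm: float) -> float:
    """Angle subtended by an object of width span_mm at distance_mm."""
    return math.degrees(2 * math.atan((span_mm / 2) / distance_mm))

# A 1000 mm span filling the frame at 1000 mm distance:
print(round(field_of_view_deg(1000, 1000), 1))   # 53.1
```

The same relation works in reverse for sizing charts: a wider-angle camera needs a proportionally wider chart, or a shorter distance, to fill the frame.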
6.4 Other Apparatuses Include:
6.4.1 A measuring tape to determine the distance between the robot and the charts.
6.4.2 A device for displaying the charts at fixed distances from the robot, including wall hanging, tripods, or other such devices.
6.4.3 Equipment for marking a line at the prescribed distance from the charts (40 cm [16 in.] or 6 m [20 ft]). In an indoor
environment, colored adhesive tape may be used.
6.4.4 A light meter with range from 0.01 to 1000 lux ± 5 %.
6.4.5 For lighted conditions, an outdoor area with indirect sunlight or a room with sufficient length and width to accommodate
the robot and the charts at the far-field distance 6 m [20 ft] and near-field distance of 40 cm [16 in.]. Either way, the environment
should be evenly lighted to 150 to 300 lux or more using a smooth spectrum, non-flickering light source, such as an incandescent
lamp or studio LED panel.
6.5 For darkened conditions, a windowless room or hallway with illumination of no more than 0.1 lux. The room shall be of
sufficient length and width to accommodate the robot and the charts at the far-field distance 6 m [20 ft] and near-field distance of
40 cm [16 in.]. When testing camera systems that are sensitive to near-infrared light (for example, “night vision” cameras), ensure
there are no additional sources of infrared light in the scene. Such sources may include surveillance cameras with infrared light.
6.6 A test sponsor is authorized to specify other test apparatus sizes, distances, and conditions to suit particular deployment
requirements.
7. Hazards
7.1 In addition to 1.8, users of this test method shall also address equipment preservation and human-robot coexistence concerns.
Safety setups such as belays and containment walls shall be used when there are such concerns. Environmental conditions, such
as high or low temperatures, excessive moisture, and rough terrain can be stressful, exceed the respective ranges within which the
robot is built to properly operate, or damage robotic components. These conditions can also cause unexpected robot motions that,
in turn, can have negative effects on the humans that are nearby or on the robot itself.
7.2 Identify all the emergency stop button(s) on the robot chassis and the OCU before operating or interacting with the robot.
7.3 While the robot is active and the emergency stop button is disengaged, avoid:
7.3.1 the areas directly in front of and behind the robot,
7.3.2 the reachable radius of the robot’s manipulator, as equipped, and
7.3.3 touching the robot other than to engage the emergency stop button.
8. Procedure
8.1 Ensure that the apparatus and environmental conditions are set up properly according to the Apparatus Section 6.
8.2 Ensure that the robot system configuration has been identified and documented.
8.2.1 The particular system configuration to be tested shall be identified and uniquely named by using the make, model, and
applicable configuration name as provided by the manufacturer. This identification process includes measuring and documenting
dimensions and weights of all the subsystems, components, and as-shipped packaging. These include the robot, OCU, and other
FIG. 9 (A) A chart suitable for determining field-of-view and image resolution is an ISO 12233 chart, as implemented here by Cornell
University. However, any vision test chart that contains field-of-view markings and resolution wedges (highlighted) will suffice. (B) An
example view showing a 4:3 aspect ratio as indicated by the arrows in the corners and some loss of resolution shown in the resolu-
tion wedges.
sustainment and maintenance items such as power sources and spare parts. Also list subsystems, payloads, and items in the
field-maintenance kit. These include tools and consumable items such as duct tape, cable ties, and other items. Documentation shall
also include detailed photographs of all of the above as well as videos of routine maintenance tasks (for example, battery change).
The system configuration shall remain the same for all relevant tests to enable direct comparison of performance and to identify
capability trade-offs between different configurations. Any number of identified system configurations can be subjected to testing.
8.3 Determine the number of required repetitions if measuring system capabilities.
8.3.1 Test trials shall produce enough successful repetitions to demonstrate the reliability of the system or operator necessary
for the envisioned mission. The higher the ratio of successful repetitions to faults, the more reliable the system or operator. The
more repetitions completed with that ratio, the more confidence may be placed in that reliability. The calculated reliability and
confidence levels can be determined from statistical tables. Some missions may require higher reliability. Others may be more
resilient to failure and can accommodate lower reliability. A test trial of 30 repetitions or more is recommended to establish a
system’s capability. Operator proficiency trials are typically time limited as specified in 8.3.4.
8.3.2 A reasonable starting threshold may be at least 80 % reliability with 80 % confidence. This can be achieved by performing
30 repetitions with 27 or more successes. When 30 repetitions is not feasible, this reliability and confidence may still be achieved
if the first test trial includes 20 repetitions with 19 or more successes or 10 successful repetitions.
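The 80 % reliability with 80 % confidence criterion in 8.3.2 can be checked numerically. The sketch below (function name is ours) treats the confidence as one minus the probability of seeing that many or more successes by chance when the true reliability is exactly the target value:

```python
from math import comb

def demo_confidence(successes: int, trials: int, p0: float = 0.8) -> float:
    """Confidence that reliability >= p0, given the observed success count."""
    # Binomial upper tail: chance of this many or more successes if p == p0.
    tail = sum(comb(trials, k) * p0**k * (1 - p0)**(trials - k)
               for k in range(successes, trials + 1))
    return 1 - tail

print(round(demo_confidence(27, 30), 2))   # 0.88, above the 0.80 target
print(round(demo_confidence(24, 30), 2))   # well below 0.80
```

This confirms the clause: 27 or more successes in 30 repetitions demonstrates at least 80 % reliability at better than 80 % confidence, while 24 of 30 (exactly the 80 % success rate) does not.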
8.3.3 Multiple trials are allowed to improve the performance of the same tested robot configuration. If this is the case, the latest
30 consecutive repetitions from across the multiple trials shall be considered together when determining the ratio of successful
repetitions to faults.
8.3.4 Time limited trials, in minutes, shall be used to measure operator proficiency. A reasonable timer increment may be the
estimated time to complete five or more successful repetitions. When measuring and comparing operator proficiency, it is important
to equalize the elapsed time of operation for “expert” and “novice” operators to normalize for fatigue. However, shorter elapsed
times resulting in fewer successful repetitions reduces confidence in the measured reliability of the operator to perform the task.
8.4 Select an operator to perform the test.
8.4.1 To measure system capabilities, the operator or operators shall be designated by the manufacturer in order to align interests
and to ensure the best possible robot performance. The best results obtained with the manufacturer-designated operator shall be
used as the 100th percentile of operator proficiency for the given system. Any other operator can measure her or his proficiency
as a percentage. Example levels of operator proficiency may be “Novice” (0 % to 39 %), “Proficient” (40 % to 79 %), and “Expert”
(80 % to 100 %). Operator proficiencies may be compared across regional or national averages.
8.4.2 Practice is optional before testing. The operator shall be familiar with the test procedure, the apparatus settings, and the
environmental conditions enough to be tested.
8.4.3 During a test trial, the operator shall be located remotely at a station that is out of direct sight and sound of the robot in
the test apparatus while still maintaining communications with the robot.
8.4.4 During a test trial, any human communication to the operator about the robot’s status within the test apparatus shall be
considered a fault. However, this should not deter communications regarding the safety of the robot or personnel.
8.5 Place or drive the robot to the start point within the apparatus as specified in Section 6.
8.6 Perform the specified task and repeat until the desired number of symbols have been read.
8.6.1 The operator shall read each symbol on the chart, in turn, at the chosen size and distance, through the operator interface.
It is recommended that each symbol take no more than 5 to 10 seconds to be read.
8.6.2 For scoring purposes, a successful repetition shall be awarded when t
...







