Modality Independent Point Cloud (MIPC)
Design and Implementation Description Document (MIPCDIDD)
NGA.STND.0055-01
Volume 1
Version: 1.00
Date: 2016-06-08
CONTACTS
The following point of contact is provided for assistance in understanding the contents of
this implementation profile:
NGA/TAEA
Office of the Chief Information Officer, Information Technology Directorate (CIO-T)
7500 GEOINT Drive
Springfield, VA 22150
CHANGE LOG
TBR/TBD LOG
Executive Summary
The Modality Independent Point Cloud (MIPC) specification establishes a data and meta-
data standard for three-dimensional (3-D) geospatial point clouds developed from any remote
sensing modality or any sensed phenomenology. It typically applies to a composite point
cloud derived from multiple source data sets. This Volume 1 of the MIPC standard pro-
vides both the conceptual design of the MIPC data model and the detailed implementation
descriptions. The corresponding Volume 2 of the MIPC standard provides the HDF5 Imple-
mentation Profile for a MIPC dataset in the recommended file format.
The main objective of MIPC is to create a general purpose point cloud file for storage and
transmission within the National System for Geospatial-Intelligence (NSG) in a standard
form that will maximize interoperability and data fusion. A related standard, Sensor In-
dependent Point Cloud (SIPC), standardizes a single point cloud created by a single Light
Detection and Ranging (LIDAR) sensor in a single pass,1 and therefore includes sufficient
LIDAR metadata to enable custom processing by image scientists as well as exploitation by
image analysts. The MIPC specification can be considered a downstream product, gener-
alizing and abstracting the 3-D point cloud data structure, while still maintaining critical
metadata from the originating modalities. As such, the MIPC standard is appropriate for
merging LIDAR point clouds from different collections2 or point clouds derived from multi-
ple two-dimensional (2-D) images. Therefore, the MIPC standard can accommodate point
clouds created from other modalities besides LIDAR, such as Electro-Optical (EO) imagery,
Radio Detection and Ranging (RADAR), and Wide Area Motion Imagery (WAMI) systems.
Since MIPC files can accommodate temporal, radiometric, and spectral attributes per point,
MIPC can be considered a multi-dimensional point cloud for greater than three dimensions.
Non-LIDAR point clouds are in the early stages of development, and they vary greatly in metadata content, signal content, and file format. No standards currently exist across these systems for any of these areas. The MIPC imposes standards on these modalities while
also generalizing the point cloud structure to be modality independent. At this time, there is
no common metadata dictionary for point clouds agnostic of their collection modality. The
MIPC standardizes signal units and metadata terms when they exist within the remote sens-
ing community. Documents referenced in the development of MIPC include the Conceptual
Model and Metadata Dictionary (CMMD), the Sensor Independent Derived Data (SIDD)
specifications, the NSG Metadata Foundation (NMF) specifications, the Open Geospatial
Consortium (OGC) schemas, and the Department of Defense Discovery Metadata Specifica-
tion (DDMS).3
1 This is typically referred to as a Level 2 (L2) LIDAR product or Enterprise Level 3 in CMMD terminology.
2 This is sometimes referred to as a Level 3 (L3) product in traditional LIDAR image chains.
3 http://metadata.ces.mil/dse/irs/DDMS/DDMS_5_0_overview.html
Contents
1 Introduction
  1.1 Scope
  1.2 Applicable Documents
  1.3 Product Design and Definition
    1.3.1 Conceptual Model
    1.3.2 Approach
    1.3.3 Assumptions
    1.3.4 Requirements
2 Design
  2.1 Groups
    2.1.1 File
    2.1.2 Scene
    2.1.3 Mission
    2.1.4 References
    2.1.5 Modality
    2.1.6 Sensor
    2.1.7 Metadata
    2.1.8 Look
    2.1.9 Cloud
    2.1.10 Positions
    2.1.11 Tilemap
  2.2 Additional Concepts
    2.2.1 Channel
  2.3 Conventions
    2.3.1 Groups
    2.3.2 Datasets
    2.3.3 Indices
    2.3.4 Data Types
    2.3.5 Units
3 Implementation
  3.1 Groups
  3.2 Datasets
  3.3 Dataset Descriptions
    3.3.1 Channels
    3.3.2 Spectral Bands
      3.3.2.1 Spectral Bands for EO Systems
      3.3.2.2 Spectral Bands for RADAR Systems
    3.3.3 Polarization
      3.3.3.1 Polarization for EO Systems
      3.3.3.2 Polarization for RADAR Systems
    3.3.4 Channel Descriptions
4 Attributes
  4.1 Standardized Attributes
    4.1.1 Time
    4.1.2 Noise
    4.1.3 Class
    4.1.4 Intensity
      4.1.4.1 Original Intensities
  4.2 User Defined Attributes
5 Geopositioning Error
4 Attributes 46
4.1 Standardized Attributes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
4.1.1 Time . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
4.1.2 Noise . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
4.1.3 Class . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48
4.1.4 Intensity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48
4.1.4.1 Original Intensities . . . . . . . . . . . . . . . . . . . . . . . . . . 50
4.2 User Defined Attributes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 50
5 Geopositioning Error 51
List of Figures
1 MIPC Groups and their Relationships
2 Translated and Rotated ECEF Coordinate System
3 Local Origin Shared Between Tiles
List of Tables
1 Applicable Documents
2 Enumeration Table for ModalityTypes
3 Placeholder Index Variables used in this Document
4 MIPC Data Types
5 MIPC Units
6 MIPC Groups and Hierarchical Levels
7 MIPC Requirements for Groups and Datasets
8 MIPC Dataset Dimensions
9 Row and Column Definitions for Array Objects
10 WorldView-3 VNIR Bands
11 Enumeration Table for SarPolarization
12 Enumeration Table for SpectralRegions
13 Enumeration Table for ProcessingMethods
14 Enumeration Table for CoordinateSystems
15 Enumeration Table for Datums
16 Enumeration Table for Classifiers
17 Enumeration Table for RadiometricDimensions
18 Enumeration Table for RadiometricUnits
1 Introduction
1.1 Scope
This Volume 1 of the specification provides a Design and Implementation Description for the
Modality Independent Point Cloud (MIPC) standard. The corresponding Volume 2 specifi-
cation provides the Hierarchical Data Format 5 (HDF5) Implementation Profile for a MIPC
dataset in the recommended standardized file format. The HDF5 was selected as the file
format for MIPC based on lessons learned from the development of the Sensor Independent
Point Cloud (SIPC) standard and the associated SIPC File Format Trade Study [1] con-
ducted in 2012.4
The MIPC defines a standard for point clouds, where a point is defined, at minimum, in
three-dimensional (3-D) geospatial coordinates and generically represented as x, y, z data,
but where these parameters can refer to any geospatial reference frame consisting of a co-
ordinate system along with horizontal and vertical datums. The term cloud refers to the
fact that the data structure is a set of points, typically irregularly sampled and amorphous
in extent. Geospatial intelligence also requires a date and time value associated with data
over a given location, but the temporal resolution will vary greatly across remote sensing
modalities. Time per point is ideal for fusion and change detection, but not available in some
data sets at this stage in the image chain. MIPC point clouds must, at a minimum, contain
the earliest and latest collection times that are involved in the product. The temporal extent
of a point cloud must be handled with care and awareness during analysis.
The term modality refers to a specific remote sensing technology, such as Electro-Optical
(EO) Panchromatic (Pan), Synthetic Aperture RADAR (SAR), Light Detection and Rang-
ing (LIDAR), Wide Area Motion Imagery (WAMI), Infrared (IR), Polarimetric Imaging (PI),
Full Motion Video (FMV), Multi-Spectral Imagery (MSI), Hyper-Spectral Imagery (HSI), or
Overhead Persistent Infrared (OPIR). Different modalities measure different phenomenolo-
gies of interest. Therefore, MIPC needs to handle these variations in the meaning and units
of an intensity value per point across different remote sensing modalities, and this aspect
must be handled carefully in fusion and analysis processes. The term intensity, then, merely refers to a general signal amplitude. See § 4.1.4 for the MIPC definitions.
The term independent refers to the fact that the format can handle point cloud data from
any modality. The format will still include the metadata specific to each modality necessary
for exploitation and analysis. However, the wide band signal data, and most of the meta-
data, have been structured in a manner to be modality agnostic to the extent possible. At
this time, there is still a need to accommodate some modality-specific metadata within this
high level product. Similar to the National Imagery Transmission Format (NITF), MIPC is
a data format for transmission, exploitation, and analysis. It has not been determined if the
format is optimal for processing, but these products are typically generated towards the end
of the image chain after sensor specific processing has completed. Whether it is the optimal format for storage within an imagery archive or other database has not yet been determined.

4 The SIPC file format trade study should not be confused with similar trade studies recently conducted by NGA for video data.
1.2 Applicable Documents

Table 1: Applicable Documents (Title, Version)

• CMMD Level 2, LIDAR Data and Metadata: Conceptual Model and Metadata Dictionary for Enterprise Level 2 Volume Return Products (Ver. 1.0)
• Generic Point Cloud error model5 (Ver. 1.0)
• Intelligence Community (IC), Information Security Marking (ISM) Metadata Specification6 (Ver. 13)
• NGA, SIPC Volume 1, Sensor Independent Point Cloud, Design and Implementation Description Document (Ver. 1.02)
• NGA, Sensor Independent Derived Data (SIDD), Vol. 1, Design and Implementation Description Document, NGA.STND.0025-1 1.0 (01 Aug 2011)
• NITFS, The Compendium of Controlled Extensions (CE) for the National Imagery Transmission Format (NITFS), STDI-0002, Appendix E, Airborne Support Data Extensions (ASDE) (Ver. 2.1, 16 Nov 2000)
• NITFS, The Compendium of Controlled Extensions (CE) for the National Imagery Transmission Format (NITFS), STDI-0002, Appendix L, HISTOA Extension (Ver. 1.0, 01 Aug 2007)
• NITFS, The Compendium of Controlled Extensions (CE) for the National Imagery Transmission Format (NITFS), STDI-0002, Appendix O, Multi-image Scene (MiS) Table of Contents (MITOCA) Tagged Record Extension (TRE) (Ver. 1.0, 31 Mar 2006)
• NITFS, The Compendium of Controlled Extensions (CE) for the National Imagery Transmission Format (NITFS), STDI-0002, Appendix X, General Purpose Band Parameters (BANDSB) Tagged Record Extension (TRE) (Ver. 1.0, 30 Sep 2004)
• NMF Part 1, NSG Metadata Foundation, Core (Ver. 2.1)
• W3C, XML Schema Part 2: Datatypes Second Edition7 (Accessed 13 Jul 2015)

5 https://nsgreg.nga.mil/doc/view?i=1799
6 http://www.dni.gov/index.php/about/organization/chief-information-officer/information-security-marking-metadata
7 http://www.w3.org/TR/xmlschema-2
1.3 Product Design and Definition

1.3.1 Conceptual Model

The MIPC standard has been optimized for composite point clouds, those generated from
a combination of separate images, looks, or scans of the same scene. In addition, it was
determined that there are benefits to accommodating a more complex structure that in-
cludes separate coincident point clouds from different sensors or processing algorithms. For
a given point in x, y, z, any signal data or metadata for that point should be understood as
a composite value from the source data that contributed to that point. For example, the
intensity value for a given point may be a mean intensity from all the separate image pixels
that contributed to that point. This specification does not dictate how that composite value
is calculated, or how 3-D point clouds are developed in general. This is a broad topic for
many research endeavors. This specification standardizes the container for that information
once it has been derived.
In order to contain separate coincident point clouds from different sensors, the MIPC ac-
counts for different collections of x, y, z points with different attributes per point. As a
consequence, the MIPC standard will be able to accommodate different modalities within
the same file.
Example data products appropriate for MIPC include, but are not limited to:
• point clouds derived from ray tracing two or more imagery data sets from one or more
sensors of the same modality8 over the same scene at varying collection angles
• point clouds developed from imagery or video using Structure from Motion (SfM)
techniques
• aggregated point clouds9 from two or more LIDAR SIPC10 data sets
• point clouds produced from interferometric SAR techniques collected from two or more
passes / flights over the target
8 Preferably of similar spectral response.
9 This would be considered an L3 point cloud in traditional LIDAR terminology.
10 SIPC is optimized for Level 2, Enterprise Level 3, LIDAR point clouds.
1.3.2 Approach
The development of the MIPC standard included the following processes:
1. define the conceptual definition of a modality independent point cloud, along with
assumptions and requirements
2. examine data structures and metadata of existing point cloud products to be subsumed
by MIPC, to include LIDAR and non-LIDAR sources
3. leverage modality agnostic metadata from the upstream point cloud standard, SIPC
[2] [3], which is based on the Conceptual Model and Metadata Dictionary (CMMD) [4]
4. accommodate metadata not covered in the SIPC standard by leveraging the Sensor
Independent Derived Data (SIDD) standard
5. for any remaining metadata, examine the NITF, NSG Metadata Foundation (NMF),
and Open Geospatial Consortium (OGC) standards for options
6. develop and discuss focused questions on non-LIDAR point clouds with relevant stake-
holders to identify data and metadata that is needed but cannot be accommodated in
the current formats; stakeholders include modality Subject Matter Experts (SMEs),
point cloud developers, and point cloud users
7. analyze trades between options for the overall structure and present most feasible
options to Government for review and guidance
8. implement appropriate comments and corrections from review by the NITF Technical Board (NTB)
9. generate sample data within the format options and determine limitations
10. develop converters for legacy point cloud products to MIPC for experimentation and transition
11. help develop system integration strategies for incorporation of MIPC into NSG programs
1.3.3 Assumptions
The development of the MIPC standard was conducted under the following assumptions:
1. The minimal, required signal data for a geospatial point cloud is a collection of points
with 3-D coordinates x, y, z in some geospatial reference frame (coordinate system and
datums)
2. Non-LIDAR point clouds are developed by multiple looks at the same scene at different
times and typically from different angles
3. A current unified standard does not exist for non-LIDAR point clouds from different
modalities with sufficient metadata
4. It is not yet appropriate to synthesize single points from images collected from differ-
ent remote sensing modalities since they measure different phenomenologies; however,
those resulting point clouds could be, and should be, combined for a fused, multi-
dimensional representation of a scene of interest
1.3.4 Requirements
A successful MIPC standard must accomplish the following minimal requirements.
1. The standard shall be generic enough to accommodate any 3-D geospatial cloud of
points
2. The standard shall not impose a pre-defined limit on the number of attributes per
point (advantage over LAS)
3. The standard shall accommodate points, attributes, and metadata in various user-
determined units, data types, sizes, and bit depths (advantage over binary point for-
mat)
4. The standard shall accommodate the superset of metadata relevant to the point clouds
contained in the current products that characterize the MIPC product
5. The standard shall accommodate additional metadata that the product developers and
modality experts state are required in the point cloud product
6. The standard shall accommodate additional metadata that analysts state are required
in the point cloud product
7. The standard shall accommodate error metadata based on the Generic Point-Cloud
Model (GPM)
8. The standard shall accommodate multiple collections of points created from the same
sensors in a single file
9. The standard shall accommodate multiple collections of points from different modalities
in the same file, provided each collection of points is derived from a single modality
(but potentially from different sensors within that modality)
10. The standard shall accommodate multiple collections of points across the scene that
possess a different list of attributes per point
The last three requirements greatly impact the complexity of the MIPC standard, but are
necessary given the lack of standards in the upstream file formats for the non-LIDAR point
clouds. The overall structure of a MIPC file has been designed with these additional require-
ments in mind.
2 Design
The following sections present the overall design philosophy of the MIPC standard.
2.1 Groups
A MIPC file is organized in a hierarchical structure using an object-oriented approach. The
hierarchical structure allows software to drill down to the level of complexity desired along
specific branches in the tree. This allows a single MIPC file to contain multiple point clouds
from different sensors and even different modalities. The hierarchical structure also reduces
redundant information.
The high level collections of information content are represented as groups in this design.
The actual data elements are stored as datasets under these groups. These groups are
described in the following subsections; then the implementation details are provided in § 3.
Figure 1 provides a Unified Modeling Language (UML) class diagram of the relationships
of these groups. The composition arrows indicate which groups are contained within other groups. Mandatory leaf groups are written inside a group, while non-mandatory
subgroups, or mandatory subgroups with children, are drawn outside to show their additional
relationships.
2.1.1 File
This group includes datasets that uniquely identify and scope the file, its source, and security
classification information.
2.1.2 Scene
This group summarizes the contiguous technical content that can be considered a single
scene within this file. In addition to the geospatial extents, this information includes the
temporal, spectral, and radiometric coverage of the entire file. Therefore, there is only a
single scene in a MIPC that summarizes the entire data set in the file. A MIPC file may
contain multiple point clouds from multiple sensors, but these individual point clouds are
assembled together in the same MIPC product when they share some common geographical extent.
2.1.3 Mission
This group describes the purpose of the file and the relevant target identifiers, such as Basic
Encyclopedia (BE) numbers, within the scene.
2.1.4 References
This group contains datasets where each one serves as a Look-Up Table (LUT) for enu-
merated lists of values for any of the finite, standardized string fields within the MIPC
specification. Those fields then store an integer as an index into one of these tables. This
technique is used for any standardized values to prevent variations in spellings and enforce
common definitions. Some file formats force the user to look up the information in an In-
terface Control Document (ICD). This method puts the ICD tables within the file for easy
programmatic reference. This technique also makes it easy to expand the list of values as
needed at any time with little or no modifications to any MIPC software. These tables
should appear in every file when fields using that table are included; in practice, it is expected that MIPC generation software will simply include all tables in all cases.
These tables add a negligible number of bytes to the file.
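As an illustration of this lookup convention (a sketch only, assuming the HDF5 profile of Volume 2; the REFERENCES/Datums table name is inferred from Table 15), resolving a standardized field reduces to an array lookup:

    import h5py
    import numpy as np

    # Illustrative sketch: exact HDF5 paths are defined in Volume 2. The names used
    # here (REFERENCES/Datums, SCENE/SPATIAL/Datums) follow the conventions in this volume.
    with h5py.File("example_mipc.h5", "r") as f:
        lut = f["REFERENCES/Datums"][...]                       # enumerated list (LUT)
        stored = np.atleast_1d(f["SCENE/SPATIAL/Datums"][...])  # integer index (or indices)
        idx = int(stored[0])
        name = lut[idx]
        if isinstance(name, bytes):                             # HDF5 strings may read back as bytes
            name = name.decode("utf-8")
        print(f"Datum index {idx} resolves to '{name}'")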
2.1.5 Modality
This term refers to a specific sensing technology, as listed in Table 2. In this context, EO
includes Pan, MSI, HSI, IR, WAMI, and FMV since the data and metadata structures do
not change for these systems. A single MIPC file can contain data from multiple modalities
that are covering the same scene. The GENERIC modality allows for any other system
that generates a 3-D geospatial point cloud, but for which modality-specific metadata may
not yet be accommodated, such as Sound Navigation and Ranging (SONAR). Bathymetric
LIDAR and SONAR will be addressed in the next iteration of the MIPC standard.
2.1.6 Sensor
A MIPC file may contain multiple point clouds derived from multiple sensors, even for a single
modality. This group contains information about the sensor(s) and the sensor products as
the individual looks from which the point cloud(s) are constructed. Sensor-level metadata
is separate from point-cloud data in the MIPC hierarchy because a MIPC file may contain
point clouds generated from a combination of different sensors, but anticipated to be from
the same modality.
2.1.7 Metadata
Although much of the information in the file can be considered metadata, this group contains the modality-specific information. While the group also contains information about the sensor, the list of potential fields is modality specific, with some minor overlap across modalities so that each group is self-sufficient. The LIDAR, SAR, and EO groups are therefore conditional.
As an example, a fictional LIDAR system named THX1139 with a Pulse Repetition Fre-
quency (PRF) of 10 [kHz] would have fields populated as follows:

MODALITY_0.Type = 1
2.1.8 Look
The term look comes from Appendix O of the NITF specification that describes data sets
synthesized through the aggregation of multiple input images. Since this is almost always
the case for MIPC, this term is used here as well. For non-LIDAR point clouds, MIPC clouds
are created from multiple 2-D images or frames over the same scene, so each input image
is a single look at a scene. For LIDAR point clouds, each scan or pass over the scene can
be provided as a Level 2 product in compliance with the SIPC standard. Consequently, a
MIPC point cloud for LIDAR can be used to store the multiple look (Level 3), co-registered
products.
2.1.9 Cloud
Any unique set of points as (x, y, z) triplets represents a separate point cloud in MIPC.
Each point cloud in a MIPC file must come from a single modality, but may be developed
by data from multiple sensors. Cloud-level metadata provides information pertaining to a
single point cloud within the file, including references to the source look images used to
generate the point cloud. Note that in the case of only one point cloud in the MIPC file,
some cloud-level metadata will be redundant with the scene-level metadata.
2.1.10 Positions
This refers to the actual 3-D point data. The 3-D point positions provide the coordinates
for all the attribute data.
2.1.11 Tilemap
Within an individual point cloud, data is internally organized into groups of points called
tiles. TILEMAP metadata enables the tremendous advantages that MIPC provides over
more monolithic data structures, particularly the capability to query, discover, and extract
subsets of points based on multiple parametric criteria without opening, ingesting, sorting,
and searching the entire dataset. TILEMAP metadata falls below CLOUD metadata in the
hierarchy because the tiles are expected to be unique for each point cloud and not shared
between point clouds in the same MIPC dataset.
A common basis for tiling in a MIPC file will be geospatial tiling. This method groups points
together into tiles based on their geographic proximity. Resolution-based tiling schemes are
also possible, such as quadtrees and octrees. A specific tiling strategy is not mandated in the
MIPC standard since the format is not affected by any chosen method. In order for ground
spatial queries to benefit from tiling, each tile must include metadata describing its bounding
extents in geodetic Latitude-Longitude-Altitude (LLA) coordinates11 , even though the point
positions are in the Earth-Centered, Earth-Fixed (ECEF) system. Tiling in yet a different
coordinate system is strongly discouraged, and it is recommended to either define tile cuts
within the coordinate system of the points (ECEF) or that of the tile metadata (LLA).
Point data must be organized into at least one tile per cloud. Tiles are referenced through a unique name for each tile within a cloud. The naming convention for tile objects is Tile_d, where d is the sequential integer tile index, starting from 0. However, tiles could technically have any name. The tile names are stored in a dataset indexed by the tile index (d).
2.3 Conventions
2.3.1 Groups
The term group refers to large organizations of data within the hierarchical structure. In
the HDF5 format, these items are also called Groups, and are similar to tags in XML or
directories in a file system.
11 Altitude is the height above the defined vertical datum.
The convention is to write the names of these with all UPPERCASE characters and no
spaces. Groups that can have multiple instances, such as MODALITY, will have the name
followed by an underscore and a number, e.g., MODALITY_0. When describing a group path in this document, letters will be used as placeholders for numbers, such as

MODALITY_a.CLOUD_c

with the understanding that these letters (a, c) will be replaced with actual numbers in an actual file. Since any Application Programming Interface (API) for advanced file formats, such as HDF5, provides for the discovery of datasets, it is not necessary for the user to know
the number or name of datasets a priori, or for the numbering in the name to be assigned a
fixed length.
2.3.2 Datasets
The term dataset is used in this standard to refer to the individual elements that contain
data. The names of these datasets will be written in CamelCase text. The hierarchical path
to a dataset will be written in this document with groups, subgroups, and datasets separated
by a period (.), such as

MODALITY_0.CLOUD_3.POINT.POSITIONS.Tile_17

where Tile_17 is a dataset (numerical array) containing the points for the 18th tile within the 4th cloud under the 1st modality.
2.3.3 Indices
There are two types of indexing within the MIPC standard. The first type refers to multiple
or repeated instances of the same group or dataset. The name of the group or dataset is
followed by an underscore "_" and an integer index. There is no need to zero-pad these numbers to a fixed length. An example would be:

MODALITY_0
MODALITY_1
⋮
MODALITY_10, etc.
Within this document, standard letters are used as variable placeholders for the indices of
specific data items, as listed in Table 3.
The second type of indexing is for array subscripts within a dataset. MIPC supports datasets
as 1-D, 2-D, and 3-D arrays. The indexing is row major. In all cases, indices are 0-based
since this is the standard approach within HDF5 files and most programming languages. Ar-
ray locations within datasets can be written with array subscripting in a standard manner as:
SOMEGROUP.MyDataset(i, j, k)
where
i is the row index,
j is the column index, and
k is the page index.
As an example,
SOMEGROUP_4.MyDataset(0, 2, 3)
refers to the value at the 1st row, 3rd column, and 4th page in a 3-D table called MyDataset
under the 5th instance of the group called SOMEGROUP.
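The following sketch shows how the dotted path notation and the 0-based subscripts map onto an HDF5 file (h5py assumed; the group and dataset names are the illustrative ones used above, not mandated paths):

    import h5py

    with h5py.File("example_mipc.h5", "r") as f:
        # 18th tile of the 4th cloud under the 1st modality (all indices 0-based).
        tile = f["MODALITY_0/CLOUD_3/POINT/POSITIONS/Tile_17"]
        print(tile.shape)                        # dimensions are discoverable from the file

        # Value at the 1st row, 3rd column, 4th page of a 3-D dataset.
        value = f["SOMEGROUP_4/MyDataset"][0, 2, 3]
        print(value)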
2.3.4 Data Types

Another key enabler of MIPC is the ability to use word sizes tailored to balance precision
and file size. This is supported by the HDF5 file format and API. Because the API can
determine the data type for any dataset, it is not necessary for a user of MIPC to know data
types a priori or declare what type is to be read when extracting data from the file. The
API user simply provides the name of the dataset and the API adapts to the format of the
data accordingly.
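For example, a minimal sketch (h5py assumed; the path is illustrative) of discovering a dataset's stored type and shape at read time:

    import h5py

    with h5py.File("example_mipc.h5", "r") as f:
        dset = f["MODALITY_0/CLOUD_0/POINT/POSITIONS/Tile_0"]
        print(dset.dtype, dset.shape)   # e.g. uint16, (n, 3); no a priori declaration needed
        data = dset[...]                # the returned array already carries the stored type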
2.3.5 Units
All units shall be in the SI system in the Meter-Kilogram-Second (MKS) range. Through-
out the MIPC standard, the unit conventions in Table 5 will be followed unless otherwise
specified.
Timestamp values should use the datetime type, which represents a POSIX/UNIX datetime
in decimal seconds elapsed since 01-Jan-1970. This is because most of the current
LIDAR data holdings contain either POSIX time, GPS time, or no time data at all. It should
be noted that POSIX/UNIX time does not include leap seconds. Adjustments should be
made as necessary.
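As a small illustration (not part of the standard), a UTC calendar time converts to the POSIX datetime value described above as follows:

    from datetime import datetime, timezone

    # 2016-06-08 12:00:00 UTC expressed as decimal seconds since 1970-01-01.
    # POSIX time ignores leap seconds, so GPS-derived timestamps must be adjusted
    # by the GPS-UTC offset before being stored this way.
    t = datetime(2016, 6, 8, 12, 0, 0, tzinfo=timezone.utc)
    posix_seconds = t.timestamp()   # 1465387200.0
    print(posix_seconds)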
3 Implementation
3.1 Groups
Groups are used to capture the high level collections of data described above. The object-
oriented approach allows the metadata and data to be organized by these groups so that the
process of locating the appropriate data and metadata is intuitive. The hierarchical list of
groups is given in Table 6. Although some file formats, such as HDF5 and XML, allow for
additional attributes and metadata to be attached to a group, this feature is not mandated
in MIPC since there is no clear advantage over placing all data within array datasets, and
it may impose additional complexities when searching for information.
3.2 Datasets
Datasets are fields containing the actual data, typically in a numerical array. Although there
are several manners in which this data could be organized, the following philosophies were
followed where possible:
2. Since there are several specifications from which to select existing dataset names, the
best name was selected in the spirit of philosophy 1
3. Although arrays are efficient, an array is formed only when the information belongs
together, and all columns are the same data type
4. Standardized strings are stored as integers that reference an enumerated list in the
REFERENCES group
The resulting datasets under each group are given in the tables that follow. Table 7 provides
a complete list of the datasets under each group. The Req. column provides the require-
ment status, identical to that provided in Figure 1, indicating which groups and datasets
are Mandatory (M), Optional (O), or Conditional (C). This requirement is hierarchical. If
a parent is O or C and a child is M, it means that the child is only mandatory if the parent
is present.
Table 8 lists the datasets under each group, and the dimensions of each dataset, along with
the physical units and data type as defined in Table 4. When the unit starts with
E: SomeList
it means the value is an integer index into an enumerated list in the REFERENCES group.
The integer is the row in a table that is a dataset named SomeList under the REFERENCES
group.
For those datasets that are arrays, the meanings of the rows and columns are given in Table
9 if they are not obvious. For items that will be numbered sequentially from 0 to n, the row
position will be used as the index, so no additional index column is needed. For example,
the enumerated lists will use this row index (0 to n) as the integer stored in the dataset field.
The various enumerated tables are also provided in this document as additional tables in
§ 3.
3.3 Dataset Descriptions

3.3.1 Channels
Any point belonging to a MIPC point cloud may have one or more intensity values associated
with it. The physical meaning of these intensities varies depending upon the modality. For
EO systems, including LIDAR, each intensity value represents light within a defined spectral
band, after a defined polarization filter, or as a combination of both a spectral band and
polarization filter. The interpretation of SAR intensity differs somewhat, as described below.
Within MIPC, the term Channel does not refer to sensor channels, but rather to a dimension in dataspace for the intensity based on spectral band and polarization state.
Many sensors will produce single-channel data, meaning that they provide only one channel of intensity per point; examples include single-wavelength LIDAR, panchromatic imagery, and single-pol SAR. Multiple-channel point clouds include those derived from multi-
spectral, hyperspectral, or polarimetric imagery.
It is recognized that in many cases the intensity data is not the result of a single measurement
but is determined during processing as a composite value derived from multiple measure-
ments, as is the case for point clouds generated from SfM techniques.
The sensor-level metadata within a MIPC file contains information describing the sensor's spectral bands and polarization filters. Each point cloud contains cloud-level metadata
that refers back to those sensor-level descriptions, informing the user which spectral band
and polarization filters apply to each intensity value.
3.3.2 Spectral Bands

3.3.2.1 Spectral Bands for EO Systems

EO systems include passive imaging systems operating in the optical regime of the electro-
magnetic spectrum, such as Pan, MSI, HSI, IR, FMV, and WAMI; and active systems, such
as the various forms of LIDAR. Because the primary purpose of MIPC is 3-D geometry and
not spectral analysis, the precise definition of band cutoffs is left to the data producer. As an
example, the WorldView-3 satellite has eight (8) spectral bands in the VNIR multispectral
payload.
If WorldView-3 multispectral imagery were a sensor under the EO modality, the actual wavelength range for each sensor band would be stored in the MIPC file as:

MODALITY_a.SENSOR_b.EO.BandWavelengths

and this dataset would be an 8 x 2 floating point array with the following values:
[ 4.00E-7   4.50E-7 ]
[ 4.50E-7   5.10E-7 ]
[ 5.10E-7   5.80E-7 ]
[ 5.85E-7   6.25E-7 ]
[ 6.30E-7   6.90E-7 ]
[ 7.05E-7   7.45E-7 ]
[ 7.70E-7   8.95E-7 ]
[ 8.60E-7   1.04E-6 ]
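A sketch of how a data producer might build and store this array (h5py assumed; the group numbering is illustrative and the path follows the convention described in § 3.3.4):

    import h5py
    import numpy as np

    # WorldView-3 VNIR band edges in meters (min, max per row), as tabulated above.
    band_wavelengths = np.array([
        [4.00e-7, 4.50e-7],   # Coastal
        [4.50e-7, 5.10e-7],   # Blue
        [5.10e-7, 5.80e-7],   # Green
        [5.85e-7, 6.25e-7],   # Yellow
        [6.30e-7, 6.90e-7],   # Red
        [7.05e-7, 7.45e-7],   # Red Edge
        [7.70e-7, 8.95e-7],   # NIR1
        [8.60e-7, 1.04e-6],   # NIR2
    ], dtype=np.float64)

    with h5py.File("example_mipc.h5", "a") as f:
        # Illustrative path; the actual group numbering depends on the file contents.
        f.create_dataset("MODALITY_0/SENSOR_0/EO/BandWavelengths", data=band_wavelengths)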
A point cloud built from a monochromatic or panchromatic sensor would contain only one
band, and in this case

MODALITY_a.SENSOR_b.EO.BandWavelengths

would be a 1 x 2 floating point array. In the case of WorldView's panchromatic sensor, those
values would be
[ 4.50E-7   8.00E-7 ]
3.3.2.2 Spectral Bands for RADAR Systems

Spectral Bands for RADAR systems are defined by the frequency of the waveform. However,
the precise frequency is not as meaningful for RADAR as it is in the optical regime. Other
parameters such as how the waveform is modulated, the signal bandwidth, and the length
of the synthesized aperture, are equally, if not more, important. The choice of frequency
band is to first order based on balancing factors such as the slant range, which improves at
lower frequencies, as well as antenna gain and target brightness, which improve at higher
frequencies. Frequency reveals less about the phenomenology at the target than it does in
the optical regime. For RADAR data, the actual frequency min and max values per operat-
ing band are provided in MIPC under:

MODALITY_a.SENSOR_b.SAR.BandFrequencies
The dataset can store a row for each band if needed. Column 0 is the min frequency and
column 1 is the max frequency for that band.
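For example (illustrative only; the sensor is hypothetical and uses the nominal X-band limits), a single-band system would populate one row:

    import numpy as np

    # Hypothetical X-band SAR sensor: one operating band, min/max frequency in Hz.
    # Column 0 is the minimum frequency, column 1 the maximum, one row per band.
    band_frequencies = np.array([[8.0e9, 12.0e9]], dtype=np.float64)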
3.3.3 Polarization
Some point clouds are built from sensors that filter and measure the polarization of incoming
electromagnetic radiation. Knowledge of how the reflective target polarizes the incoming
light can provide information on the nature of the reflective surface or volume. In order to
retain the exploitation value from these point clouds, MIPC carries metadata that can be
used to determine the polarization state of electromagnetic radiation.
3.3.3.1 Polarization for EO Systems

The light source for passive EO systems is assumed to be unpolarized. LIDAR is unique in
that it is an active electro-optical sensor in which the polarization state of the light source
may also be known, so a LIDAR system can directly measure how interaction with the target
changes the laser light polarization.
The polarization-measuring sensor is assumed to have one or more polarization filters, after
which the intensities of light are separately measured. The polarization state of incident light
(and outgoing light in the case of LIDAR) can be represented by Stokes parameters, and the
effects of the polarizing filters can be represented by Mueller matrices. By recording Stokes
properties of the detected light after it passes through the polarizing filters, one can recover
information regarding the polarization state of the incoming (at aperture) light through the
inverse of the Mueller matrices. For LIDAR, comparing the polarization state of the outgo-
ing light with the detected light can directly measure how the target changed the polarization.
It is important to note that in MIPC, the polarization filter information is defined entirely
within a sensor frame of reference whose orientation is not known and in turn not referenced
to the ground space coordinate system in which the point cloud is stored. In most cases
the orientation of this sensor reference frame to the ground space coordinate system would
be time-varying. Therefore, it is possible to measure the polarization components of incom-
ing light and derive parameters such as the degrees of polarization, linear polarization, or
circular polarization, which are useful parameters for analysis. However, it is not possible
to determine precisely the orientation of the polarization ellipse in ground space using the
metadata in MIPC.
Consider for example a LIDAR system which performs a simple linear depolarization mea-
surement. In this type of system, linearly polarized laser light is transmitted and two channels
record the amount of reflected light collected: one with a linear polarization parallel to that
of the laser, and one with a linear polarization orthogonal to that of the laser. This type of
system measures the amount of depolarization that occurs, which can be useful for target
discrimination, but it cannot determine all the Stokes components to characterize fully the
polarization of the incoming light.
In this case, a data producer would probably define either vertical or horizontal as the po-
larization of the outgoing laser. Assuming they pick horizontal and that this is an ideal case
where the laser is perfectly polarized, then under the laser information they would put S =
[1, 1, 0, 0], which is stored in the corresponding MIPC dataset.
Then they would define a polarization filter for each of the two measurement channels. The filter for the parallel (horizontal) channel would be:
[ 1  1  0  0 ]
[ 1  1  0  0 ]
[ 0  0  0  0 ]
[ 0  0  0  0 ]
And the one in the orthogonal (vertical) channel would be:
[  1  -1   0   0 ]
[ -1   1   0   0 ]
[  0   0   0   0 ]
[  0   0   0   0 ]
These Mueller matrix entries are stored as floating point numbers in a MIPC array as:

MODALITY_a.SENSOR_b.LIDAR.PolFilters
Each row of the array corresponds to a different filter, and has 16 columns which correspond
to the Mueller matrix components for that filter in row-major order, for this example:
[ 1  1  0  0  1  1  0  0  0  0  0  0  0  0  0  0 ]
[ 1 -1  0  0 -1  1  0  0  0  0  0  0  0  0  0  0 ]
For any channel where no polarization filter is used, the Mueller matrix is just a 4 x 4 identity
matrix, so the corresponding row is
[ 1 0 0 0 0 1 0 0 0 0 1 0 0 0 0 1 ]
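A sketch of flattening the two 4 x 4 Mueller matrices from the example above into row-major PolFilters rows (NumPy assumed):

    import numpy as np

    # Ideal parallel (horizontal) and orthogonal (vertical) filter Mueller matrices,
    # unscaled, exactly as given in the example above.
    m_parallel = np.array([[1, 1, 0, 0],
                           [1, 1, 0, 0],
                           [0, 0, 0, 0],
                           [0, 0, 0, 0]], dtype=np.float64)
    m_orthogonal = np.array([[ 1, -1, 0, 0],
                             [-1,  1, 0, 0],
                             [ 0,  0, 0, 0],
                             [ 0,  0, 0, 0]], dtype=np.float64)

    # One row per filter, 16 columns in row-major order.
    pol_filters = np.stack([m_parallel.ravel(), m_orthogonal.ravel()])
    assert pol_filters.shape == (2, 16)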
3.3.3.2 Polarization for RADAR Systems

Polarization information for RADAR systems must be handled differently in MIPC than
polarization information for EO systems. Although the Stokes vector and Mueller ma-
trix constructs are equally valid for electromagnetic radiation in the Radio Frequency (RF)
regime, they are seldom used in the SAR community. Instead, SAR transmitter and receiver
polarizations are generalized to several different states. The polarizations of the transmitter and receiver channels are stored in the following datasets:

MODALITY_a.SENSOR_b.SAR.TxPol
MODALITY_a.SENSOR_b.SAR.RxPol
These datasets store UInt8 values that are enumerations for the reference table SarPolariza-
tion defined in Table 11.
A single-pol system would have the same value, either V or H, for both TxPol and RxPol.
Some SAR systems (e.g., RADARSAT-2) operate in a multi-pol mode. A standard dual-pol
linear system transmits in H or V and receives in both H and V. Other systems transmit at a
polarization that is oriented at a 45-degree angle (π/4) to both the H and V and then receive
in both H and V. A Circular Transmit Linear Receive (CTLR) system transmits a right or
left circular polarization by transmitting both H and V signals that are shifted in phase by
90 degrees, and receiving in H and V. Finally, a quad-pol system requires alternating the
polarization between H and V between pulses and receiving in both H and V, thereby giving
all HH, VV, VH, HV. So RxPol, if included, will always have the value of 0, 1, or 6.
3.3.4 Channel Descriptions

For EO systems, including LIDAR, Channels is a g x 3 integer array, where g is the number of intensity channels in this cloud. It is essentially a look-up table of indices. Each row represents the corresponding intensity channel i = 0 to g-1 and contains 3 indices in the columns (j = 0 to 2) encoded as follows:
• The first index (j = 0) is the sensor number under the current modality
• The second index (j = 1) for EO indicates the spectral band
– As a row index into MODALITY_a.SENSOR_b.EO.BandWavelengths
• The second index (j = 1) for LIDAR refers to the laser number
– As a row index into LIDAR.BandWavelengths
• The third index (j = 2) is the polarization index, referring to the row which contains the Mueller matrix values
– As a row index into MODALITY_a.SENSOR_b.LIDAR.PolFilters for LIDAR
– As a row index into MODALITY_a.SENSOR_b.EO.PolFilters for EO
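As an illustration of this indexing (a sketch only; the location of the Channels dataset under the cloud group and the group numbering are assumed here), decoding the first intensity channel of an EO cloud might look like:

    import h5py

    with h5py.File("example_mipc.h5", "r") as f:
        # Assumed location of the cloud-level channel look-up table.
        channels = f["MODALITY_0/CLOUD_0/Channels"][...]          # g x 3 integer array
        sensor, band, pol = (int(v) for v in channels[0])         # channel i = 0

        eo = f[f"MODALITY_0/SENSOR_{sensor}/EO"]
        wl_min, wl_max = eo["BandWavelengths"][band]              # spectral band edges [m]
        mueller_row = eo["PolFilters"][pol]                       # 16 Mueller components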
For RADAR systems, Channels is a g x 4 integer array, where g is the number of intensity channels in this cloud. It is essentially a look-up table of indices. Each row represents the corresponding intensity channel i = 0 to g-1 and contains 4 indices in the columns (j = 0 to 3) encoded as follows:
• The first index (j = 0) is the sensor number within the SAR modality
• The second index (j = 1) indicates the spectral band
– As a row index into MODALITY_a.SENSOR_b.SAR.BandFrequencies
• The third index (j = 2) indicates the transmit polarization
– As a row index into MODALITY_a.SENSOR_b.SAR.TxPol
• The fourth index (j = 3) indicates the receive polarization
– As a row index into MODALITY_a.SENSOR_b.SAR.RxPol
The values stored in TxPol and RxPol are themselves indices into the enumerated REFERENCES.SarPolarization values.
3.3.8 Algorithm
To capture information about how this particular point cloud was created, the algorithm
used for that final process is captured in a dedicated algorithm group. The field Algorithm is a single string containing the official or common name of the algorithm.
There is a separate field for the Version of the algorithm, and an array for the various input
parameters. The Algorithm field is a single string because it is only intended to capture the
algorithm used to form the point cloud itself, not all processing algorithms used in the chain.
3.4 Points
Often referred to as wideband data or signal data, this content refers to the 3-D geolo-
cated points, and any additional point-wise attributes. The structure and definitions of these data are given in Tables 8 and 9.
Wideband point position information can be stored in MIPC as either UInt16 or UInt32.
Within a specific cloud, all position data must be of the same type. This feature allows the
data to reflect the true precision of the sensor while not consuming unnecessary bytes with
meaningless precision. This represents a fundamental improvement over legacy file formats
which are rigid in their data types. Point data are stored as integers to maximize compres-
sion. The positions are converted to floating point when applying the scales and offsets.
Spatial tiling of point data also helps to reduce data size. The dynamic range of the position
data need only cover the spatial extent of a single tile, not the entire cloud. It is often pos-
sible to capture the inherent precision of the data with fewer bits over the smaller tile area.
For high-density point clouds, tiles that are 65 meters on each side can represent positions
with precision better than a millimeter when values are stored as 16 bit scaled integers. For
wide area mapping applications, tiles that are 1 km on each side can represent positions with
approximately 1.5 cm of precision when positions are stored as 16 bit integers. In either case,
this consumes half the storage space per point of LAS or Binary Point Format (BPF). While
it is true that each tile adds metadata overhead, the overhead is exceedingly small when
compared to the savings in storage for tens of thousands of points in a tile.
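These precision figures follow directly from dividing the tile extent by the integer range; a small worked check (illustrative only):

    # Quantization step of a scaled integer over a tile extent.
    def scaled_int_precision(tile_extent_m: float, bits: int = 16) -> float:
        return tile_extent_m / (2 ** bits)

    print(scaled_int_precision(65.0))        # ~0.00099 m: better than a millimeter
    print(scaled_int_precision(1000.0))      # ~0.0153 m: roughly 1.5 cm
    print(scaled_int_precision(4.3e6, 32))   # ~0.001 m over a 4300 km tile with 32 bits
    print(scaled_int_precision(400.0, 32))   # ~9.3e-8 m: sub-micron over a 400 m tile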
A cloud may also contain, on a point-by-point basis, one or more intensity values, each of
which corresponds to a defined channel. There may be additional attributes associated with
each point sorted in the same manner as the position data so that they can both be indexed
by their row number.
There are various high level methods to create points, and these are enumerated. An integer indicating the method used is stored as an index into the reference table:

REFERENCES.ProcessingMethods

The coordinate system used for the point position data is indicated in:

SCENE.SPATIAL.PointCoordinateSystem

which stores an index into the enumerated list of coordinate systems defined in Table 14.
This information is needed to interpret the (x, y, z) position values found in the POSITIONS
group with respect to the Earth and based on the horizontal and vertical datums indicated in:
SCENE.SPATIAL.Datums
The limited number of options for coordinate systems and datums is intentional in order to
limit variability across data sets and data producers. Regardless of which system is used,
point cloud position data is stored as unsigned integers and translated to (x, y, z) positions
in meters through linear translations governed by the following metadata:
• Origin data: specifies the origin of the local Cartesian coordinate system in meters
within the ECEF frame on a tile-by-tile basis (not used for UTM). Found in:
– MODALITY_a.CLOUD_c.TILEMAP.ECEFTranslate
• Alignment data: unit vectors that specify the direction of the local Cartesian coordinate
systems in the ECEF frame on a tile-by-tile basis (not used for UTM). Found in:
– MODALITY_a.CLOUD_c.TILEMAP.ECEFRotate
• Position: scaled integer representations of the point positions in the local frame. Found
in:
– MODALITY_a.CLOUD_c.POINT.POSITIONS.Tile_d
• Scale metadata: provides scale factor (multiplier) applied to the integer position values
in the same linear translation.
If 16-bit position data will be inadequate to capture the inherent precision of the sensor using
a managed tile size, 32-bit scaled integers allow for millimeter precision within a single tile
4300 km in extent (roughly the distance from San Francisco to New York), or sub-micron
precision in a single tile 400 m in extent. The HDF5 API knows the way that datasets are
stored in the MIPC file, so it is not necessary to declare explicitly within the dataset how
many bytes are used for position storage.
This section and related fields indicate the coordinate system used for the point position data.
There are other metadata fields within the MIPC structure that may use other coordinate
systems for convenience or speed.
3.4.1.1 ECEF
MIPC uses the geocentric ECEF system to express the geolocation of point cloud data from
individual tiles in a common reference frame, free of projections. This provides a convenient
3-D Cartesian system that is practical for computing 3-D distance, but not geodesic distance
along the surface of the Earth ellipsoid.
One disadvantage to ECEF is that the axis orientation is not convenient for assessing height
above ground in small areas. To address this, MIPC provides an optional translation and
rotation of the ECEF frame to create an ECEF-based coordinate frame, which is more intu-
itive in terms of horizontal and vertical directions, but is still tied to an unambiguous ECEF
geolocation reference in three dimensions.
Figure 2 provides an illustration of how a local Cartesian frame is referenced to ECEF, and
illustrates the meaning of the important Translate, Rotate, and Offset metadata items.
To define a local Cartesian frame referenced to the ECEF system, it is necessary only to
specify the origin of the local Cartesian frame in ECEF space (equivalent to a translation
in X, Y, and Z), and the direction of the three coordinate axes in ECEF space. Points can
then be converted back and forth through simple translation and rotation operations. There
is no projection involved in this process.
An origin for the translated Cartesian frame should be selected in close proximity to or
within the range of the points themselves. Selecting an origin near the centroid of the points
maximizes the benefit.
The rotation is based entirely on the location of the translated frame origin. The z-axis in
local Cartesian space is defined as the vertical direction at the origin point, defined as normal
to the ellipsoid. The local Cartesian x-axis, pointing in the east direction at the origin, is
orthogonal to both the local z-axis and the ECEF z-axis (i.e., it is parallel to the ECEF X-Y
plane). The local Cartesian y-axis in the north direction can then be found by taking the
cross product of the local z and x axes. The origin and therefore rotation must be the same
for all tiles in a MIPC dataset.
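A sketch of this axis construction (illustrative only; Volume 2 defines the normative procedure and the exact storage convention of ECEFRotate):

    import numpy as np

    def local_axes_ecef(lat_deg: float, lon_deg: float) -> np.ndarray:
        """Rows are the local x (east), y (north), z (up) unit vectors in ECEF."""
        lat, lon = np.radians(lat_deg), np.radians(lon_deg)
        # Local z: ellipsoid normal (vertical) at the origin point.
        z = np.array([np.cos(lat) * np.cos(lon), np.cos(lat) * np.sin(lon), np.sin(lat)])
        # Local x: east, orthogonal to both local z and the ECEF Z axis.
        x = np.cross([0.0, 0.0, 1.0], z)
        x /= np.linalg.norm(x)                # degenerate polar cases are handled per the text
        # Local y: north, the cross product of the local z and x axes.
        y = np.cross(z, x)
        return np.vstack([x, y, z])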
The vectors defining the local Cartesian axis directions in ECEF are stored in

MODALITY_a.CLOUD_c.TILEMAP.ECEFRotate

which is a 3 x 3 floating point array per tile, equivalent to the 3 x 3 rotation operator. This
makes the complete set a 3-D array where each page in the 3rd dimension is for a different
tile, 0 to d-1, where d is the number of tiles in the cloud.
The ECEF coordinates of the shared local origin are stored for each tile in

MODALITY_a.CLOUD_c.TILEMAP.ECEFTranslate

which is a 1 x 3 array per tile equivalent to specifying the origin of a new 3-D Cartesian coordinate system. For the sake of consistency with ECEFRotate, each tile will be a page in the 3rd dimension from 0 to d-1, where d is the number of tiles in the cloud.
In the degenerate case at the North Pole where local z is aligned with ECEF Z, the local x
vector will be in the direction of the positive ECEF X axis. Conversely, for the degenerate
case at the South Pole, when local z is aligned with the negative ECEF Z axis, the local x
will align with the negative ECEF X axis.
Volume 2 of the MIPC standard provides detailed procedures and mathematical formulas
for defining the local 3-D Cartesian frames and translating and rotating point positions into
these coordinate systems.
A principal consideration when selecting the origin location for a local Cartesian frame is
that for any point not at the origin, the z-axis direction will depart slightly from local vertical
and the x and y axis directions will depart slightly from local horizontal. This effect will
increase with distance from the origin but can be mitigated partially by placing the common
origin near the centroid of the dataset, as shown in Figure 3. In this case some tiles will
have positive offsets and some will have negative offsets, depending on their location relative
to the shared origin. If preferred, the origin could have just as easily been placed below the
ellipsoid surface to produce only positive z offset values. As long as coordinate conversion is
performed correctly when required, this is not a source of error because the positions of the
points are still properly placed in 3-D space without any projection distortion.
Offsets in three dimensions are defined for each tile. They are expressed in the rotated local reference frame and must be added to the scaled integer point positions found in the records. Offsets should be selected for each tile in order to optimize the dynamic range of the position data; otherwise, dynamic range is wasted on regions where there are no points.
Use of the translated and rotated ECEF frame is not mandatory; the points may be stored directly in ECEF coordinates. If this option is used, the translation should be set to (0, 0, 0). Offsets should still be set for each tile, corresponding to the minimum ECEF X, Y, and Z coordinates found in that tile.
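As a brief sketch of the bookkeeping described above, and assuming a per-tile scale factor defined elsewhere in this volume, a writer might choose offsets from each tile's coordinate minima while a reader recovers real-valued positions as offset plus scale times the stored integer. The helper names are hypothetical.

    import numpy as np

    def choose_tile_offsets(tile_xyz):
        """Per-tile offsets chosen as the coordinate minima, so stored values start at
        zero and no dynamic range is wasted on regions without points."""
        return np.asarray(tile_xyz, dtype=float).min(axis=0)

    def decode_positions(int_records, scale, offsets):
        """Recover positions in the tile's reference frame from scaled integer records:
        position = offset + scale * record."""
        records = np.asarray(int_records, dtype=float)
        return np.asarray(offsets, dtype=float) + np.asarray(scale, dtype=float) * records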
3.4.1.2 UTM
Point clouds in legacy file formats are commonly stored in the UTM system. There are compromises and limitations in storing point cloud data in UTM. Most notably, UTM is a projection, meaning that distortions are incurred in representing coordinates from the curved Earth surface on a flat plane. Moreover, the zone system in UTM causes discontinuities if the dataset spans zone boundaries. UTM can only be used for a limited range of latitudes, beyond which special polar projections must be used.
Despite these limitations, it is recognized that the UTM system is currently a widely-
implemented standard for storing point cloud data, and that many users and their tools
are accustomed to working with data in UTM space. In recognition of these realities, MIPC
includes the provision for UTM storage.
It should be noted that the x and y position values represent easting and northing values in a UTM zone, not displacements from the origin in a Cartesian frame. The z value represents elevation above the vertical datum. Accordingly, the rotation (ECEFRotate) fields should not be populated in MIPC when UTM is used, because there is no single consistent direction vector that represents an axis alignment in ECEF across a UTM zone. Similarly, because the UTM system defines the zone origin at the specific location where the zone's central meridian crosses the equator, it is not necessary to populate the translation.
These fields should not be included when UTM is used. MIPC includes a dataset:
SCENE.SPATIAL.Zone
which provides the UTM zone as a signed integer, where negative values indicate the Southern
hemisphere. The UTM zone origin defines the location on the equator to which the x and y
(easting and northing) values are referenced. Only one option, UTM or ECEF, is allowed in
a single MIPC file.
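The signed-integer zone convention can be illustrated with a short sketch; the helper name is hypothetical, and SCENE.SPATIAL.Zone simply stores the resulting integer.

    def encode_utm_zone(zone_number, southern_hemisphere):
        """Signed UTM zone: positive in the Northern hemisphere, negative in the Southern."""
        if not 1 <= zone_number <= 60:
            raise ValueError("UTM zone numbers run from 1 to 60")
        return -zone_number if southern_hemisphere else zone_number

    # For example, zone 33 South is stored as -33 and zone 18 North as 18.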
In addition, ground control points, also known as Truth Points, can be provided as a separate cloud. A Ground Control Point (GCP) is useful for registration, calibration, or for assessing geolocation accuracy and detection performance. The associated dataset stores a UInt8 indicating the type of data, based on the reference table REFERENCES.ProcessingMethods.
4 Attributes
4.1 Standardized Attributes
The MIPC standard specifies the structure and storage of data such that it may accommo-
date any number of attributes per point. The standard is optimized to maintain integrity of
the data values, range, precision, and units. How the data is rendered within a visualization
application should be addressed in the application, not in the file format. For example, in
previous formats, signal data may have been encoded in the R, G, B values of the LAS file
so that the tool would know how to display the points. However, this is really a limitation
of the application being built around the LAS format, and not of a generic point cloud data model. Ideally, an application should allow the user to select which of the attributes available in the data to visualize, and how. For example, the tool could allow the user to scale the R, G, B color channels, or apply a composite LUT, to the elevation, the intensity data, the time, or some other attribute.
Although MIPC may accommodate any attribute, there are some common GEOINT point
cloud attributes that should be standardized to consistent terminology, form, precision, and
units. These data elements are described below.
4.1.1 Time
When a date and time are needed together, they shall always be referenced in a common
manner as a single value. Tools will then be able to convert this value to different formats as
needed. The common value will be decimal POSIX time, that is, seconds since midnight UTC on 01 January 1970. Seconds will be stored as a Float64 value, which should be sufficient for most GEOINT applications. POSIX time does not include leap seconds; they should be added when appropriate.
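For example, the following sketch converts a UTC date and time to the decimal POSIX value described above; like POSIX time itself, Python's timestamp() excludes leap seconds.

    from datetime import datetime, timezone

    def to_mipc_time(dt):
        """Decimal POSIX time: Float64 seconds since 1970-01-01T00:00:00 UTC.
        Naive datetimes are assumed to already be expressed in UTC."""
        if dt.tzinfo is None:
            dt = dt.replace(tzinfo=timezone.utc)
        return dt.timestamp()

    # to_mipc_time(datetime(2016, 6, 8, 12, 0, 0)) returns 1465387200.0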
4.1.2 Noise
Similar to the SIPC specification, this term refers to the probability that the point is a false detection as opposed to a detection of a real surface. There are many noise sources for LIDAR, especially Geiger-mode systems. In order to generalize this concept to any remote sensing modality that may generate a 3-D point cloud, MIPC stores the noise score for each point as the probability that the point is a false detection. This implies that point cloud generation algorithms should convert whatever residual metrics they generate into a probability score in order to be modality agnostic. If an algorithm produces some sort of confidence value instead, the noise would be 1 − confidence. The value shall be stored in decimal form on a scale from 0.00 to 1.00. Given the maturity of current algorithms, a Float32
type should provide sufficient bit depth for this information. Noise data is stored per point, in tile blocks within each cloud, under the corresponding group, where each tile is a p x 1 array of noise scores: the row is the point index, and the value in column 0 is the noise score as a decimal probability.
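A minimal sketch of the conversion described above, assuming an algorithm that reports a per-point confidence value (the function name is hypothetical):

    import numpy as np

    def confidence_to_noise(confidence):
        """Per-point noise score: probability of a false detection, i.e. 1 - confidence,
        returned as a p x 1 Float32 array on a 0.00 to 1.00 scale."""
        noise = 1.0 - np.asarray(confidence, dtype=np.float32)
        return np.clip(noise, 0.0, 1.0).reshape(-1, 1)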
4.1.3 Class
This group represents terrain classification codes per point, not to be confused with security
classification. This is similar to the classification field in the LAS point record. The CLASS
group includes a single String field, Source, to describe how the classification was performed,
such as manual editing or the name of an automated application. The group also includes an integer value per
point by tile. The integer is an index into REFERENCES.Classifiers. This table is currently
based on those values defined by American Society of Photogrammetry and Remote Sensing
(ASPRS) in the LAS specification, with additional room for custom values as needed up to
256 values (0 to 255). These values are given in Table 16.
Index Value
0 Never Classified
1 Unclassified
2 Ground
3 Low Vegetation
4 Medium Vegetation
5 High Vegetation
6 Building
7 Low Point (noise)
8 Model Key-Point (mass point)
9 Water
10 Reserved for ASPRS
11 Reserved for ASPRS
12 Overlap
13 to 31 Reserved for ASPRS
32 to 255 Openly available
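An illustrative lookup that mirrors Table 16 is sketched below; the dictionary and function names are hypothetical, and the authoritative mapping is the REFERENCES.Classifiers table carried in the file itself.

    CLASS_LABELS = {
        0: "Never Classified", 1: "Unclassified", 2: "Ground",
        3: "Low Vegetation", 4: "Medium Vegetation", 5: "High Vegetation",
        6: "Building", 7: "Low Point (noise)", 8: "Model Key-Point (mass point)",
        9: "Water", 12: "Overlap",
    }

    def class_label(index):
        """Map a per-point class index (0 to 255) to a human-readable label."""
        if index in CLASS_LABELS:
            return CLASS_LABELS[index]
        if 10 <= index <= 31:
            return "Reserved for ASPRS"
        return "Openly available (custom)"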
4.1.4 Intensity
Although intensity has a specific radiometric meaning, it is a commonly used term to refer
to the signal strength used to modulate the brightness within an image display system. In
this context, across different remote sensing imaging modalities, intensity can have different
physical meanings. Furthermore, some modalities, such as HSI, may need to store both radiance and reflectivity values for intensity. However, the brightness should be rendered by the Electronic Light Table (ELT) and not defined within the file. Ideally, an ELT should allow a user to specify which attribute to render as the brightness; the ELT should then scale that value to a suitable viewing range (e.g., 0 to 255), and perhaps also control the shape of the scale (e.g., linear or logarithmic). MIPC provides the ability to differentiate between these physical meanings rather than storing a single intensity value. While LAS stores a single intensity value, MIPC differentiates among the physical phenomenologies to permit any combination of intensity data. This is accomplished by having separate INTENSITY groups labeled INTENSITY e, where e is a unique integer for each type of intensity data.
Each INTENSITY e group will store intensity data in a p x g array, where p is the number of points in the tile and g is the number of channels in the cloud.
The group will also include datasets for Dimension and Units. Each stores a single UInt8 value for the enumerations given in Tables 17 and 18, respectively. These lists are identical to those defined in the BANDSB Tagged Record Extension (TRE) in Appendix X of the NITF specification [5], specifically the fields RADIOMETRIC QUANTITY and RADIOMETRIC QUANTITY UNIT.
As an example, the Dimension and Units datasets under the 1st intensity group (INTENSITY 0) of the 3rd cloud of the 2nd modality might indicate the dimension Radiance and the units L, which should be interpreted as [W/(m² sr)]; the corresponding per-tile array entry at row 12, column 7 of the 6th tile then contains the intensity value for point 12 (the 13th point) and channel 7 (the 8th channel).
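Expressed as array indexing, with an assumed tile size, channel count, and data type purely for illustration, the per-tile intensity storage behaves as follows.

    import numpy as np

    p, g = 1000, 16                              # assumed points per tile and channels per cloud
    tile_5 = np.zeros((p, g), dtype=np.float32)  # p x g intensity array for the 6th tile (index 5)
    value = tile_5[12, 7]                        # point 12 (13th point), channel 7 (8th channel)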
If there is a standard visualization scale, or one generated by the ground processing system, it can be stored with each INTENSITY e group in the DefaultRemap dataset, where the row index is the display value (0 to 255) and column 0 is the minimum pixel value for that step in the data, stored as any appropriate type (e.g., Float32). This is convenient, for example, for systems that remap to a stretched logarithmic scale.
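One plausible reading of the DefaultRemap table is sketched below, assuming each row holds the minimum data value mapped to that display level; the function name is hypothetical.

    import numpy as np

    def apply_default_remap(values, remap_minima):
        """Map data values to display levels: level d covers values from remap_minima[d]
        up to, but not including, remap_minima[d + 1]."""
        minima = np.asarray(remap_minima, dtype=float)
        levels = np.searchsorted(minima, np.asarray(values, dtype=float), side="right") - 1
        return np.clip(levels, 0, len(minima) - 1)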
It is not believed at this time that the benefit of including pixel intensities from the originating source data would outweigh the significant impact on file size, considering that the data can be found within the source images.
5 Geopositioning Error
The MIPC standard utilizes the GPM standard as the error model. All errors shall be in
ground-space so as not to rely on analysts or applications to calculate error based on a sensor
model. This approach is even more important in MIPC compared to SIPC since the sensor
model is no longer relevant to the composite points. The GPM data will be specific to a
cloud, and will provide for absolute and relative geolocation errors for a given cloud, but
will not provide relative error information for mensurations between clouds, since the ground
error can only be defined at the cloud level.
MIPC provides support for reporting geopositioning error. There are three separate options
to accommodate data sets where different types of error information are available.
• Error case 0: No error information is available.
• Error case 1: Only single representative values of accuracy are available to characterize the data set as a whole, such as Circular Error 90% (CE90), Linear Error 90% (LE90), and Spherical Error 90% (SE90); an illustrative computation of such values is sketched after this list.
• Error case 2: Error is reported per the GPM Standard. Both the direct storage and
indirect storage methods are accommodated. For convenience, the dataset may also
include the representative CE90, LE90, and/or SE90 values from Error case 1 so the
user does not need to compute these from the GPM data.
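For Error case 1, representative values can be summarized as in the sketch below, which uses a common percentile-based formulation of CE90, LE90, and SE90 applied to per-point ground-space error components; the glossary definitions remain the governing descriptions, and the function name is hypothetical.

    import numpy as np

    def representative_errors(dx, dy, dz, pct=90.0):
        """Percentile-based summary accuracies from per-point error components:
        CE from horizontal radial error, LE from vertical error magnitude,
        and SE from 3-D radial error."""
        dx, dy, dz = (np.asarray(v, dtype=float) for v in (dx, dy, dz))
        ce = np.percentile(np.hypot(dx, dy), pct)
        le = np.percentile(np.abs(dz), pct)
        se = np.percentile(np.sqrt(dx**2 + dy**2 + dz**2), pct)
        return ce, le, se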
There are several important constraints that are imposed by the GPM model implementation
which must be noted:
1. The GPM record is defined at the cloud level in MIPC. The GPM record is valid only for that particular cloud. If a MIPC file contains multiple clouds, the GPM record for one cloud does not apply to another cloud.
2. Within a given cloud, it is possible to compute both absolute and relative errors. It is
not possible to compute relative errors between points from different clouds.
The MIPC datasets that comprise a GPM record are stored under MODALITY a.CLOUD c.GPM.
Those familiar with the GPM standard will recognize the organization of the structure below
this level. The full list of dataset names and details on the method of storage and definitions
are found in the GPM documentation. The various subgroups include:
• MODALITY a.CLOUD c.GPM.Master
– Objects from the GPM master record including model coordinate system (MCS)
information.
– The GPM MCS should match the coordinate system used in the MIPC point
records. Both MIPC and GPM support ECEF, UTM, and the rotated and trans-
lated ECEF frame referred to as Local Space Rectangular (LSR) in GPM.
References
[1] NGA, “Sensor Independent Point Cloud (SIPC) File Format Trade Study,” Report, Nov 2012. Ver. 2.0.
[2] NGA, “Sensor Independent Point Cloud (SIPC), Design and Implementation Description Document (DIDD),” Specification, Oct 2015. Ver. 1.02.
[3] NGA, “Sensor Independent Point Cloud (SIPC), File Format Description Document (FFDD),” Specification, Oct 2015. Ver. 1.02.
[4] NGA, “LIDAR Conceptual Model and Metadata Dictionary (CMMD) for Enterprise
Level 2 Volume Return Products,” Specification. Ver. 1.0.
[5] NITF Technical Board, “The Compendium of Controlled Extensions (CE) for the Na-
tional Imagery Transmission Format Standard (NITFS), STDI-0002, Appendix X, Gen-
eral Purpose Band Parameters (BANDSB) Tagged Record Extension (TRE),” Specifica-
tion, Sep 2004. Ver. 1.0.
Glossary
attribute a data value that exists per point; it may be a scalar value, such as intensity and
time, or a vector such as surface normals. 46
binary point format a quick, binary file format for the storage of unorganized point cloud
data. BPF allows unlimited attributes per point. 6
CamelCase a syntax where words are combined without spaces or underscores, the first letter of each word is uppercase, and all other letters are lowercase. The first letter of the whole term may or may not be uppercase depending on the meaning and convention. 12
CE90 a statistical method used to represent horizontal accuracy by stating the radius of a
circle that encompasses 90% of the error. 51
direct storage a method of storing parameters needed to compute geolocation error us-
ing the Ground Space version of the Universal LIDAR Error Model (ULEM). Direct
storage involves explicit storage of covariance data between all anchor points. Error
at intermediate positions is computed by error propagation methods, involving partial
derivatives of the intermediate positions with respect to the neighboring Anchor Point
parameters. 51
ground control point a fixed, surveyed location on the ground treated as a truth point for reference, sometimes referred to as a fiducial. 46
ground-space data values are based on final ground distances within the scene so the user
does not need to factor in the sensor, processing method, or collection geometry. 51
indirect storage a method of storing parameters needed to compute geolocation error using
the Ground Space version of the Universal LIDAR Error Model (ULEM). Indirect
storage involves storage of a 3 x 3 covariance matrix for each of the anchor points,
plus a set of parameters to populate a spatial correlation decay model that is used to
compute error at intermediate positions. 51
LAS the LASER file format developed and maintained by the American Society for Pho-
togrammetry and Remote Sensing (ASPRS) for LIDAR point clouds. 6, 40, 46
LE90 a statistical method used to represent vertical accuracy by stating the height of a
cylinder that encompasses 90% of the error, where the radius of the cylinder is the
CE90. 51
look a single imaging collection of a scene with an insignificant temporal duration; some-
times referred to as a scan, this may be a single 2D image or an L2 LIDAR point cloud.
4, 10
modality a specific remote sensing technology that measures a specific set of phenomenologies, ideally standardized to consistent units. Examples include RADAR, Panchromatic imagery, HSI, LIDAR, and OPIR. iii, 1
scene a geospatially contiguous region of interest. This is the basis of a single MIPC data
set. This is sometimes referred to as a target, site, or coverage. It may include
multiple country codes. The definition may be extended to include the extents along
other dimensions of interest, such as temporal, spectral, or radiometric, as long as they
are essentially contiguous. 8
SE90 a statistical method used to represent 3-D accuracy by stating the radius of a sphere
that encompasses 90% of the error. 51
Acronyms
2-D two-dimensional. iii
BE Basic Encyclopedia. 9
IR Infrared. 1, 9, 31
LLA Latitude-Longitude-Altitude. 11
MKS Meter-Kilogram-Second. 14
Pan Panchromatic. 1, 9, 31
PI Polarimetric Imaging. 1
RF Radio Frequency. 34