MIPC Vol-2 Ver-1-00 20160608
Volume 2
Version: 1.00
Date: 2016-06-08
MIPC Vol. 2 NGA.STND.0055-02 1.0 MIPCFFDD
CONTACTS
The following point of contact is provided for assistance in understanding the contents of
this implementation profile:
NGA/TAEA
Office of the Chief Information Officer, Information Technology Directorate (CIO-T)
7500 GEOINT Drive
Springfield, VA 22150
2016-06-08 i
CHANGE LOG
TBR/TBD LOG
Executive Summary
This Modality Independent Point Cloud (MIPC) standard, Volume 2, File Format Descrip-
tion Document (FFDD), describes the conventions, methods, and processes for storing and
extracting geospatial point cloud information based on the MIPC standard for data and
metadata content established by the National Geospatial-Intelligence Agency (NGA). The
NGA has selected the Hierarchical Data Format 5 (HDF5) as the data transport layer for
MIPC. This document describes methods whereby data producers can place MIPC-compliant
data and metadata in the HDF5 file format in accordance with the MIPC standard. It also
describes methods whereby users of MIPC datasets may extract data components from the
HDF5 file. The corresponding Volume 1 of the MIPC standard provides the design and
description of the content and strucutre of MIPC data and metadata.
The main objective of MIPC is to create a general purpose point cloud file for storage and
transmission within the National System for Geospatial-Intelligence (NSG) in a standard
form that will maximize interoperability and data fusion. A related standard, Sensor In-
dependent Point Cloud (SIPC), standardizes a single point cloud created by a single Light
Detection and Ranging (LIDAR) sensor in a single pass, and therefore includes sufficient
LIDAR metadata to enable custom processing by image scientists as well as exploitation by
image analysts. The MIPC specification can be considered a downstream product, gener-
alizing and abstracting the three-dimensional (3-D) point cloud data structure, while still
maintaining critical metadata from the originating modalities. As such, the MIPC standard
is appropriate for merging LIDAR point clouds from different collections or point clouds
derived from multiple two-dimensional (2-D) images. Therefore, the MIPC standard can
accommodate point clouds created from other modalities besides LIDAR, such as Electro-
Optical (EO) imagery, Radio Detection and Ranging (RADAR), and Wide Area Motion
Imagery (WAMI) systems. Since most MIPC files accommodate temporal, radiometric, and
spectral attributes per point, MIPC can be considered a multi-dimensional point cloud.
A MIPC file is structured such that the data and metadata are designed to be modality
independent to the extent possible and appropriate. This allows all point clouds conforming
to the MIPC standard to be treated in a similar manner for archiving, visualization, and
exploitation. Consequently, a single MIPC file may contain multiple point clouds from the
same or different sensors or even from different modalities. As GEOINT data, all data within
a MIPC file pertains to a single SCENE of interest. MIPC provides an interoperable mechanism
for disseminating, exploiting, and visualizing different sensor data for that scene.
The container for MIPC content is the HDF5 file format. HDF5 is a non-proprietary, open,
industry standard format for storing complicated, scientific (numeric) data. It was chosen
by NGA as the optimum format for ingest, storage, and dissemination of exploitation-ready
point cloud data via NSG archives. Legacy point cloud data standards are limited in metadata
content, are rigid in data typing, and store wideband data as monolithic binary blocks.
These formats often lack any logical grouping or self-description mechanisms within the
dataset. The HDF5 file format, together with the MIPC metadata content and data structures,
provides efficient discovery and random access of data across multiple point clouds within a
single file.
This document provides an HDF5 profile for MIPC data. Should a future file format prove
more beneficial to MIPC, a new profile volume could be written for that format without
changing the content in Volume 1 of the MIPC standard; but that format would need to
be capable of handling the structures within MIPC. Because HDF5 is a self-discoverable
format, if the content of the MIPC Volume 1 standard changes due to future metadata and
data requirements or enhancements, this HDF5 implementation profile would require only
minor changes, with minimal impact to software tools developed against this HDF5 interface.
Contents
1 Introduction 1
1.1 Scope . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.2 Applicable Documents . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
2 File Format 4
2.1 Relevant HDF5 Aspects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
2.1.1 Physical Structure . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
2.2 Components of the API . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
2.3 Generic Code Examples . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
2.3.1 C++ . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
2.3.2 Python . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
2.3.3 IDL . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
2.4 MIPC Specific Implementations . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
2.4.1 Groups . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
2.4.2 Datasets . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
3 File Workflows 23
3.1 Process . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
3.2 Select Coordinate System and Datum . . . . . . . . . . . . . . . . . . . . . . . . 23
3.3 Assemble Point Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
3.4 Define Channels . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
3.5 Tile Point Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
3.5.1 Translate and Rotate Points . . . . . . . . . . . . . . . . . . . . . . . . . . 29
3.5.2 Convert Position Data to Scaled Integer . . . . . . . . . . . . . . . . . . 29
3.6 Error Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30
3.6.1 Global Error . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
3.6.2 GPM . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
3.6.3 Per Point Error . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
4 Extending MIPC 32
4.1 Extending Groups . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
4.2 Extending Datasets . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
4.3 Adding Imagery . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
List of Figures
1 High level groups within a sample MIPC file. . . . . . . . . . . . . . . . . . . . . 17
2 CLOUD and SENSOR indices restart counting under each MODALITY group. 18
3 Groups under a CLOUD group within a sample MIPC file. . . . . . . . . . . . 19
4 String datasets within a sample MIPC file. Data is fictional. . . . . . . . . . . . 22
5 Numeric datasets within a sample MIPC file. Data is fictional. . . . . . . . . . 23
6 ECEF-Referenced Local Coordinate System. . . . . . . . . . . . . . . . . . . . . 26
List of Tables
1 Applicable Documents . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
2 HDF Resources . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
3 MIPC Data Types . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
1 Introduction
1.1 Scope
The Modality Independent Point Cloud (MIPC) File Format Description Document (FFDD)
provides descriptions for how point cloud data sets are stored in, and read from, files conform-
ing to the MIPC standard and required file format - Hierarchical Data Format 5 (HDF5).
This document provides guidance to producers of point cloud data sets delivered to the Na-
tional Geospatial-Intelligence Agency (NGA), as well as consumers of National System for
Geospatial-Intelligence (NSG) point cloud data, regardless of the sensor modality.
In keeping with the precedent set by forerunners of the MIPC, most notably the Sensor
Independent Complex Data (SICD) and Sensor Independent Point Cloud (SIPC) standards,
this document is designated Volume 2 of the MIPC series because it describes for data
providers the placement of MIPC point cloud data and metadata in the HDF5 file format.
It also describes methods whereby users of MIPC data can read and properly extract the
data components from the HDF5 file. MIPC Volume 1, Design and Implementation Description
Document (DIDD), specifies the content of the MIPC file in terms of the data and
metadata object definitions, the hierarchical structure, data types, object naming and tag
conventions, data tiling conventions, and coordinate systems. The SICD series of documents
includes a Volume 3, Image Projections Description Document. An analogous Volume 3 is
not required for MIPC because point cloud data is inherently georeferenced and potentially
coregistered by the time it is processed to the level at which it is placed in the MIPC file.
The MIPC defines a standard for point clouds developed from any remote sensing modality.
A MIPC file is structured such that the data and metadata are designed to be modality
independent to the extent possible and appropriate. This allows all point clouds conform-
ing to the MIPC standard to be treated in a similar manner for archiving, visualization,
and exploitation. However, there is a need for a small amount of critical modality specific
metadata, and such information is in a specific group within the file, called METADATA,
separate from the rest of the modality independent content.
While the MIPC family of documents describes the content of a MIPC file and how it
organizes data, specifications and standards on the HDF5 file format itself are maintained
by an external consortium and are outside the scope of this document.
With other file formats, especially those based on a Key-Length-Value (KLV) approach,
the concept of the byte-ordered location of specific data objects within a binary data file
is extremely important. One needs to know how many bytes every particular data object
uses, as well as their precise order and quantity within the file, in order to ascertain
precisely which bytes in the binary file correspond to any particular piece of data or
metadata. For instance, to read a 4-byte data field A, one needs to compute the number of
data fields which precede it and how many bytes they consume, X, skip X bytes into the file
and read bytes X+1 to X+4. Many file format description documents analogous to this one
devote the bulk of the documentation to describing the order and placement of specific data
fields at specific byte locations in the binary file, which is necessary in order to map data
components to their locations in the file.
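The byte-offset bookkeeping described above can be sketched in Python; the field layout, the little-endian byte order, and the function name here are hypothetical, invented purely to illustrate the KLV-style access pattern and not drawn from any particular format specification:

```python
import struct

def read_field_a(path, preceding_bytes):
    """Read a hypothetical 4-byte little-endian integer field that begins
    after `preceding_bytes` bytes of earlier fields (the X in the text)."""
    with open(path, 'rb') as fh:
        fh.seek(preceding_bytes)        # skip X bytes into the file
        raw = fh.read(4)                # read bytes X+1 through X+4
    return struct.unpack('<i', raw)[0]  # interpret the raw bytes as int32
```

Every field access requires this kind of offset arithmetic, which is why byte-ordered formats devote so much documentation to field placement.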
HDF5, on the other hand, is an object-oriented system in which data objects are written
to, and read from, HDF5 files through calls to an open and publicly available application
programming interface. Each data component is treated as an object, and the Application
Programming Interface (API) encapsulates and tracks the internal locations, sizes, and types
of all the objects. These details of how each data component maps to a specific byte location
in a file are invisible and irrelevant to the HDF5 user. Consequently, it is unnecessary for
this document to specify many parameters that comprise the bulk of other similar file format
specifications and interface control documents. These irrelevant concerns for HDF5 include
issues such as the size, data type, and location of various data records, record size limitations
and overflow extensions, header and subheader locations, content, and organization.
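The contrast with the object-oriented approach can be sketched with a simple analogy. The following is plain Python, not actual HDF5 API code, and the paths and objects are invented for illustration: a self-describing store keeps each object's name, type, and shape alongside the object itself, so a reader requests data by path rather than by computing byte offsets.

```python
# Analogy only: each object carries its own metadata, so no byte
# arithmetic is needed to locate it -- only its path.
container = {
    "/SCENE/SPATIAL/data": {"dtype": "int32", "shape": (3,), "values": [1, 2, 3]},
    "/FILE/IDENTIFICATION/ProductClass": {"dtype": "str", "shape": (), "values": "MIPC"},
}

def read_object(store, path):
    """Look up an object by its absolute path."""
    return store[path]["values"]
```

This is, in miniature, what the HDF5 API does on the application's behalf for every group and dataset in the file.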
Instead, this document will present high-level workflows and API implementation examples
for reading and writing data and metadata to and from a data file that conforms to the
MIPC standard. The API examples used throughout this document are presented in a small
set of common languages in order to demonstrate concepts clearly, and are easily
translatable to other languages by proficient programmers.
The corresponding Volume 1 specification provides the design and description of the MIPC
content. The HDF5 was selected as the file format for MIPC based on lessons learned from
the development of the SIPC standard and the associated SIPC File Format Trade Study
[1] conducted in 2012.
Table 1: Applicable Documents (Title, Version)

CMMD Level 2, LIDAR Data and Metadata: Conceptual Model and Metadata Dictionary
for Enterprise Level 2 Volume Return Products (Ver. 1.0)

Generic Point Cloud error model [1] (Ver. 1.0)

Intelligence Community (IC), Information Security Marking (ISM) Metadata
Specification [2] (Ver. 13)

NGA, SIPC Volume 1, Sensor Independent Point Cloud, Design and Implementation
Description Document (Ver. 1.02)

NGA, Sensor Independent Derived Data (SIDD), Vol. 1, Design and Implementation
Description Document, NGA.STND.0025-1 1.0 (01 Aug 2011)

NITFS, The Compendium of Controlled Extensions (CE) for the National Imagery
Transmission Format (NITFS), STDI-0002, Appendix E, Airborne Support Data
Extensions (ASDE) (Ver. 2.1, 16 Nov 2000)

NITFS, The Compendium of Controlled Extensions (CE) for the National Imagery
Transmission Format (NITFS), STDI-0002, Appendix L, HISTOA Extension
(Ver. 1.0, 01 Aug 2007)

NITFS, The Compendium of Controlled Extensions (CE) for the National Imagery
Transmission Format (NITFS), STDI-0002, Appendix O, Multi-image Scene (MiS)
Table of Contents (MITOCA) Tagged Record Extension (TRE) (Ver. 1.0, 31 Mar 2006)

NITFS, The Compendium of Controlled Extensions (CE) for the National Imagery
Transmission Format (NITFS), STDI-0002, Appendix X, General Purpose Band
Parameters (BANDSB) Tagged Record Extension (TRE) (Ver. 1.0, 30 Sep 2004)

NMF Part 1, NSG Metadata Foundation, Core (Ver. 2.1)

W3C, XML Schema Part 2: Datatypes Second Edition [3] (Accessed 13 Jul 2015)

[1] https://nsgreg.nga.mil/doc/view?i=1799
[2] http://www.dni.gov/index.php/about/organization/chief-information-officer/information-security-marking-metadata
[3] http://www.w3.org/TR/xmlschema-2
2 File Format
The file format used as a container for MIPC is the HDF5 format. This format was developed
by the National Center for Supercomputing Applications (NCSA), is used extensively
by the National Aeronautics and Space Administration (NASA) for remote sensing data, and
is already in use for other GEOINT platforms supporting the NGA. HDF5 is a Geospatial
Intelligence Standards Working Group (GWG) approved standard and provides the following
benefits:
self-discoverable: HDF files inform the user of their contents (field names, dimensions,
bit depth), so a data description document is not needed to read an HDF file and
extract content; a document such as this one can, however, provide additional
supporting context

api: HDF has an extensive API with bindings in every major programming language

flexible: HDF5 can store any typical data structure, including vectors, images, video,
point clouds, and meshes, at any bit depth, defined and interpreted as needed

scalable: HDF5 has no theoretical limitations on file size or dataset size; however,
practical limitations of current computing systems recommend keeping files under
2 Terabytes (TB)

parallel: HDF5 supports parallel file access using the Message Passing Interface (MPI)

free: The basic HDF5 API code and a viewer are free software

maintained: HDF5 tools and code libraries are maintained by a dedicated working group

platform independent: HDF files can be used on any major operating system, including
Windows, Linux, Unix, and FreeBSD.
Multiple API libraries are available for use in the following programming languages:
• C
• FORTRAN
• Java
• MATLAB
• Python
• Mathematica
There are several internet sites that provide information on using HDF5 in the various
programming languages. Such sites provide documentation, modules, and code examples of
implementations. Some of these sites are listed in Table 2.
Therefore, the conceptual groups in MIPC Volume 1 are implemented as HDF5 groups within
the file structure.
Similarly, the datasets described in Volume 1 of the MIPC standard are implemented as
HDF5 datasets within the file structure. These elements are the atomic data structures con-
taining numeric and string arrays, whereas the groups are used to organize this information.
As in Volume 1, groups will be named in all UPPERCASE characters, and dataset names
will be written in CamelCase, unless the group is inherited from another specification with
a different convention, such as the NMF and Generic Point-Cloud Model (GPM) standards.
In summary, the HDF5 format allows for a physical implementation to match the conceptual
and logical design identically.
Attributes are a useful construct within the HDF5 library; however, there is only
minor advantage to using attributes over datasets for textual metadata. Although attributes
are easier to read and write than string datasets, using them would require a separate
set of API functions to access that data, as opposed to datasets. Additionally, attributes
are not compressible, and they are less salient in some HDF data viewing tools, such as
HDFView.
However, the MIPC standard does not restrict attributes either, so they remain
available for use by various programs to store additional information (meta-metadata), as
desired. Attributes are a simple mechanism for storing supplementary information that aids
the use or interpretability of the dataset to which they are assigned. Attributes should only
be used for informal notes; column names are a good example of the type of information that
could be placed in attributes. Attributes should not be used to store critical data that is not
included in the MIPC standard. This use of the term attributes within HDF5 should not be
confused with point-wise data attributes.
HDF5 also has a concept called tables, which are compound data structures that permit
different columns to be different data types, and adds a title to each column. There is also
a convenient Python module for working with these called PyTables. However, this table
concept may not translate to other formats, and might therefore be HDF5 specific. The
intention of this standard is to be flexible enough to accommodate more advanced formats
in the future, which is why Volume 1 is format agnostic. Therefore, this standard does not
require the data to be in tables, but we do not believe that using tables in specific MIPC
writers would break this standard.
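As a sketch of the alternative to compound tables, the same information can be carried as parallel, singly-typed columns that any format can store. This is plain Python with invented column names, illustrating the idea rather than a prescribed MIPC layout:

```python
# Parallel 1-D columns, one per attribute, each with a single data type.
# This layout needs no table construct, so it ports to formats that
# lack one; a "row" is simply the same index taken across all columns.
columns = {
    "X": [1.0, 2.0, 3.0],
    "Y": [4.0, 5.0, 6.0],
    "Intensity": [10, 20, 30],
}

def get_point(cols, i):
    """Rebuild point i from the parallel columns."""
    return {name: values[i] for name, values in cols.items()}
```

Each column would map naturally to its own HDF5 dataset, while a compound table would fuse them into one dataset with mixed types.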
There are additional elements available within HDF5 not discussed in this document, such as
enumerations, hyperslabs, and hard links. These additional concepts are not necessarily
precluded either. The implementation decision lies within the MIPC software tools and does
not need to be mandated by this specification. As long as the required groups and datasets
are included as specified in the two volumes of the MIPC standard, the file is compliant.
Similarly, advanced HDF5 techniques, such as chunking, compression, and Fastbit, are
not forbidden as long as the content remains within specification. Linking to information in
separate HDF5 files is also possible, but such an implementation is a systems engineering
decision.
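The compliance rule above can be sketched as a simple set check. The required-group list here is an illustrative subset invented for this sketch, not the normative list from the standard:

```python
# Hypothetical subset of required top-level groups, for illustration only.
REQUIRED_GROUPS = {"/FILE", "/MISSION", "/SCENE"}

def is_compliant(group_paths):
    """A file passes this check if every required group is present.
    Extra groups, attributes, chunking, or compression choices do not
    break compliance; only missing required content does."""
    return REQUIRED_GROUPS.issubset(set(group_paths))
```

A real validator would also check required datasets and their types, but the principle is the same: compliance is defined by what must be present, not by what else the producer chooses to add.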
The HDF5 API is designed to be an interface for tool developers. Information about
supported languages, software development environments, and the HDF5 APIs themselves
can be downloaded from the HDF Group website4. Although some languages require a
download or installation of a library, HDF5 functions are built in to the current versions
of MATLAB and the Interactive Data Language (IDL).
Once the API installation is complete, a programmer can start developing code that makes
calls to the different functions of the API. The atomic data types supported in HDF5 are
given in Table 3. Compound types (arrays of structures) are discouraged given the simplicity
and structure of MIPC.
4 http://www.hdfgroup.org
2.3.1 C++
The previous code snippet shows how to create a group under the root (the highest possible
hierarchical level) of an HDF5 file. Code Listing 2 provides an example of how to create a
group under a group, a technique that will be particularly useful in creating the multi-level
hierarchical data structure of MIPC.
Listing 2: Sample C++ code for creating child groups in an HDF5 file
/* Create new child group under an existing group
   by specifying the absolute path of the group */

new_child = H5Gcreate(file, "/PARENT/CHILD", H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);
The following code in Listing 3 demonstrates some of the functions used to store a dataset
within HDF5.
    datatype.setOrder(H5T_ORDER_LE);
    // Create a new dataset within the file
    // using defined dataspace and datatype and default dataset creation properties.
    DataSet dataset = file.createDataSet(DATASET_NAME, datatype, dataspace);
    // Write the data to the dataset
    // using default memory space, file space, and transfer properties.
    dataset.write(data, PredType::NATIVE_INT);
} // end of try

// Catch failure caused by the H5File operations
catch (FileIException error)
{
    error.printError();
    return -1;
} // end of catch
2.3.2 Python
This section provides generic example code for working with any HDF5 file in Python. The
examples are based on the h5py library, and there may be other techniques within h5py
as well as other libraries for some of these processes. First, Listing 4 demonstrates many
functions related to writing content to a new HDF5 file, including groups and datasets.
import h5py
import numpy as np

filepath = r'C:\Test\myfile.h5'

print('Creating and opening file for streamed writing...')
f = h5py.File(filepath, 'a')

print('Creating groups...')
parent = f.create_group('Parent')
print('Group Name:', parent.name)

print('Creating a subgroup...')
child = parent.create_group('Child')
print('Subgroup Name:', child.name)

print('Creating a group and subgroups directly...')
grp3 = f.create_group('A/B/C')
print(grp3.name)

print('Deleting a leaf group...')
grp4 = f['/A/B']
del grp4['C']

print('Creating a scalar dataset...')
child['data1'] = 3.41

print('Creating an array dataset, method 1...')
signal = np.arange(2000)
child['signal1'] = signal

print('Creating an array dataset, method 2...')
signal2 = np.arange(1000)
signal2set = child.create_dataset('signal2', data=signal2)

print('Creating hard links on groups...')
f['/A/B/HARDLINKED'] = f['/Parent/Child']

print('Create attributes on a dataset...')
signal2set.attrs['Source'] = 'NGA'

print('Creating a fixed length ASCII string dataset...')
ds2 = f.create_dataset("/A/FixedAscii", (100,), dtype="S10")
ds2[0] = b'ABCDEFGHIJKLM'
ds2[1] = b'NOPQRSTUVWXYZ'

print('Creating a variable length ASCII string dataset...')
db = h5py.special_dtype(vlen=bytes)
ds3 = f.create_dataset("/A/VariableAscii", (100,), dtype=db)
ds3[0] = b'ABCDEFGHIJKLM'
ds3[1] = b'NOPQRSTUVWXYZ'
print('ID:', ds3.id)
print('Value:', ds3.value)

print('Labeling dimensions...')
f['/A/LABELED'] = np.ones((4, 3, 2), 'f')
f['/A/LABELED'].dims[0].label = 'z'
f['/A/LABELED'].dims[1].label = 'y'
f['/A/LABELED'].dims[2].label = 'x'

print('Ragged arrays within cells...')
dt = h5py.special_dtype(vlen=np.dtype('int32'))
dset = f.create_dataset('variableLengthInts', (100,), dtype=dt)
dset[0] = [1, 2, 3]
dset[1] = [1, 2, 3, 4, 5]
f['variableLengthInts'].attrs['column_names'] = b'My Column Name'

print('Enumerations...')
de = h5py.special_dtype(enum=('i', {"RED": 0, "GREEN": 1, "BLUE": 2}))
print('Enumeration:', h5py.check_dtype(enum=de))
ds = f.create_dataset("/A/EnumColors", (100, 100), dtype=de)
print('Dataset kind:', ds.dtype.kind)
# Use integer values within code, and string keys written to cells
ds[0, :] = 2
ds[1, :] = 1
ds[2, :] = 0
print(ds[0, 0])
print(ds[1, 1])
print(ds[2, 2])

f.close()
Second, Listing 5 demonstrates many functions related to reading content from an existing
HDF5 file.
# Print name of all top level groups under root
for groups in f:
    print(groups)

# Test if group exists
test = '/Parent/Child' in f
print(test)

# Get group
# Use dictionary method
child = f['Parent']['Child']

# ...

sdata = sdataset[()]
print(sdata)

# Read strings from dataset

f.close()
2.3.3 IDL
This section provides generic code for working with HDF5 files within IDL. Listing 6
demonstrates how to read content from an HDF5 file programmatically.
; Close the file
H5F_CLOSE, fileid
Listing 7 shows how to call an IDL tool to display HDF5 data interactively. The tool is very
similar to HDFView with two additional capabilities. First, 2D arrays are automatically
displayed as images within the tool. Second, the tool includes a button to load any dataset
into the IDL workspace as a variable of the same name.
Listing 7: Sample IDL code for loading HDF5 data into an IDL viewer
; Display metadata and data
d = H5BROWSE(filepath)

; Manually import mydataset from H5BROWSE into IDL as a structure variable

; Then get the data from that variable structure
data = mydataset.data
The variable is imported into the workspace as a structure, with one of the fields (.data)
containing the actual dataset array, so the last line of the code example would actually be
executed within the IDL workspace.
It is anticipated that extensive software classes for MIPC systems will be developed and
will evolve to streamline data manipulation and I/O, or at least convenience functions
wrapping the common, repetitive HDF5 API calls used for MIPC. However, sufficient
capability is available in the core HDF5 API.
2.4.1 Groups
Listing 8 demonstrates simple methods for creating a MIPC file containing some of the high
level groups. In particular, repeated groups containing incremental indices are demonstrated
in this example.
Listing 8: Sample Python code for writing groups to a MIPC file with repetition indices
# -*- coding: utf-8 -*-

import h5py

# Special HDF5 data types
# Variable length ASCII strings
st = h5py.special_dtype(vlen=str)
db = h5py.special_dtype(vlen=bytes)

# Groups as lists by level
root = ['FILE', 'REFERENCES', 'MISSION', 'SCENE']
file = ['IDENTIFICATION', 'SOURCE', 'SECURITY']
mission = ['PURPOSE', 'TARGETS']
scene = ['SPATIAL', 'TEMPORAL', 'SPECTRAL', 'RADIOMETRIC']
# Dictionary of groups by list
groups = {'FILE': file, 'MISSION': mission, 'SCENE': scene}

# Constants
productClass = 'MIPC'
productType = 'PointCloud'

# Functions

# Converting a list of variable length strings to ASCII instead of Unicode
def prepStringList(stringList):
    asciiList = [n.encode("ascii", "ignore") for n in stringList]
    return asciiList

# Adding a list of variable length strings
def writeStringList(file, groupPath, stringList):
    asciiList = prepStringList(stringList)
    n = len(asciiList)
    dset = file.create_dataset(groupPath, (n,), dtype=db)
    for i in range(n):
        dset[i] = asciiList[i]
    return file

# Program

# Open file
productName = 'my_mipc_2'
filepath = 'C:\\Test\\' + productName + '.h5'
print('Creating and opening file for streamed writing...')
f = h5py.File(filepath, 'w')

print('Creating groups...')
for x in groups:
    print(x)
    temp = f.create_group(x)
    for y in groups[x]:
        temp = f[x].create_group(y)

# Adding string scalar datasets
f['FILE']['IDENTIFICATION']['ProductName'] = productName
f['FILE']['IDENTIFICATION']['ProductClass'] = productClass
f['FILE']['IDENTIFICATION']['ProductType'] = productType
datas = data[()]
print(datas)

f.close()
Figure 1 provides a screenshot of a MIPC file depicting the high-level groups as they appear in HDFView. Notice that there are multiple MODALITY, CLOUD, and SENSOR groups, each numbered with its index, starting at 0. In Volume 1 these indices are written as letter placeholders, such as MODALITY a and SENSOR b, for purposes of generic, conceptual explanation. All index numbering in MIPC is 0-based because HDF5 numbering is inherently 0-based.
The group indices restart from 0 under each parent, as demonstrated in Figure 2. This means that a group name is unique only if the entire path is included. Although it may seem intuitive that CLOUD should fall under SENSOR, recall from Volume 1 that a cloud can be made from different sensors, provided (at this time) that those sensors are from the same modality.
Figure 2: CLOUD and SENSOR indices restart counting under each MODALITY group.
Figure 3 depicts the possible groups under a CLOUD group. Shown are the group containing
GPM error information as well as the POINTS group containing the actual data points
for that cloud. There will be a separate POINTS group under each CLOUD. Recall from
Volume 1 that GPM data is per CLOUD only at this time, but may be expanded in the
future as processing algorithms mature, stabilize, and standardize.
2.4.2 Datasets
Listing 9 demonstrates writing datasets to groups within a MIPC file. The example datasets are simple, but the approach is identical for large datasets. In particular, string arrays are included in the example. String arrays are one topic in HDF5 that can be confusing, and references on the subject vary. The code listing includes convenience functions at the top for simplifying the processing of string arrays. Notice that ProductClass and ProductType are standard constants within this code.
Listing 9: Sample Python code for writing numeric and string datasets to a MIPC file
# -*- coding: utf-8 -*-

# ... (listing lines 3-14 elided) ...

print('Creating groups...')

FILE = f.create_group('FILE')
IDENTIFICATION = FILE.create_group('IDENTIFICATION')
SOURCE = FILE.create_group('SOURCE')
SECURITY = FILE.create_group('SECURITY')

REFERENCES = f.create_group('REFERENCES')

MISSION = f.create_group('MISSION')
PURPOSE = MISSION.create_group('PURPOSE')
TARGETS = MISSION.create_group('TARGETS')

modalities = [0, 1, 1]
n_modalities = len(modalities)
MODALITIES = []
for a in range(n_modalities):
    modalityName = 'MODALITY_' + str(a)
    modality = f.create_group(modalityName)
    MODALITIES.append(modality)
    sensors = ['A', 'B', 'C']
    n_sensors = len(sensors)
    for b in range(n_sensors):
        sensorName = 'SENSOR_' + str(b)
        sensor = modality.create_group(sensorName)
        description = sensor.create_group('DESCRIPTION')
        f[modalityName][sensorName]['DESCRIPTION']['SensorID'] = sensors[b]
        looks = sensor.create_group('LOOKS')
        metadata = sensor.create_group('METADATA')

    clouds = ['a', 'b', 'c']
    n_clouds = len(clouds)
    for c in range(n_clouds):
        cloudName = 'CLOUD_' + str(c)
        cloud = modality.create_group(cloudName)
        description = cloud.create_group('DESCRIPTION')
        processing = cloud.create_group('PROCESSING')
        quality = cloud.create_group('QUALITY')
        tilemap = cloud.create_group('TILEMAP')

        points = cloud.create_group('POINTS')
        positions = points.create_group('POSITIONS')
        times = points.create_group('TIMES')
        errors = points.create_group('ERROR')
        classes = points.create_group('CLASS')
        noise = points.create_group('NOISE')
        intensity0 = points.create_group('INTENSITY_0')
        intensity1 = points.create_group('INTENSITY_1')
        attribute0 = points.create_group('ATTRIBUTE_0')
        attribute1 = points.create_group('ATTRIBUTE_1')

        gpm = cloud.create_group('GPM')
        master = gpm.create_group('MASTER')
        d3c = gpm.create_group('3DC')
        ap = gpm.create_group('AP')
        umerr = gpm.create_group('UMERR')
        ppe = gpm.create_group('PPE')

f.close()
Figure 4 demonstrates the string variables within a MIPC file, specifically the fields under the FILE group. Also shown is the dataset SCENE.SPATIAL.Countries, depicting a string array of multiple country codes based on the digraphs defined in Volume 1 of the standard.
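As a sketch of how a string-array dataset such as Countries can be written and read back with h5py, using variable-length byte strings; the country digraphs below are illustrative values, not taken from a real file:

```python
import os
import tempfile

import h5py

path = os.path.join(tempfile.mkdtemp(), 'strings.h5')
codes = ['US', 'CA', 'MX']  # illustrative digraphs

with h5py.File(path, 'w') as f:
    # Variable-length byte strings avoid padding and Unicode ambiguity.
    dt = h5py.special_dtype(vlen=bytes)
    dset = f.create_dataset('SCENE/SPATIAL/Countries', (len(codes),), dtype=dt)
    for i, c in enumerate(codes):
        dset[i] = c.encode('ascii')

with h5py.File(path, 'r') as f:
    read_back = [s.decode('ascii') for s in f['SCENE/SPATIAL/Countries'][()]]

print(read_back)
```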
Figure 5 demonstrates some simple numeric datasets within a MIPC file, specifically the fields under the SCENE.SPATIAL group. The Datums and PointCoordinateSystems datasets contain integers referring to enumeration tables in the REFERENCES group as defined in Volume 1 of the standard. The Coverage dataset contains the simplified minimum and maximum geospatial extents of the data coverage over this scene from all CLOUDs.
3 File Workflows
3.1 Process
Given the hierarchical nature of the data, the file-generation process can be followed by walking down the tree. As parent objects are defined, child objects become less ambiguous. For example, the FILE and SCENE groups define information that applies to all clouds and point positions, and that information should be determinable prior to channelizing or tiling the points.
Processes for generating or writing a MIPC file are defined in the sections that follow. Processes for reading MIPC files simply employ the reverse of the subprocesses given below.
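As a sketch of the reading direction, an h5py consumer can walk the hierarchy generically and then extract the datasets it needs; the group and dataset names below are illustrative stand-ins for a real MIPC file:

```python
import os
import tempfile

import h5py

# Build a tiny stand-in file to read back (names are illustrative).
path = os.path.join(tempfile.mkdtemp(), 'walk.h5')
with h5py.File(path, 'w') as f:
    f['FILE/IDENTIFICATION/ProductClass'] = 'MIPC'
    f.create_group('MODALITY_0/CLOUD_0/POINTS')

found = []

def visit(name, obj):
    # Record every group and dataset with its full (unique) path.
    kind = 'dataset' if isinstance(obj, h5py.Dataset) else 'group'
    found.append((name, kind))

with h5py.File(path, 'r') as f:
    f.visititems(visit)

print(found)
```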
matically pure model; and all other datums, including geoids, can be derived downstream
from this reference system. This choice must apply to the entire file, and is stored within
the following datasets:
SCENE.SPATIAL.PointCoordinateSystem
SCENE.SPATIAL.Datums
SCENE.SPATIAL.EpochDate
SCENE.SPATIAL.Zone
The EpochDate dataset specifies the realization for the given Datums. The Zone dataset
is conditional on:
PointCoordinateSystem = 1
which indicates UTM. The Zone field is a signed integer, with negative values indicating
the Southern hemisphere.
• POSITIONS
• ERROR
• NOISE
• TIMES
• CLASS
• INTENSITY
POSITIONS is the only required piece of point data. Geospatial data must have some tem-
poral context. Data in the TIMES group is highly recommended, but if not available per
point, the global duration of the file should be captured under:
SCENE.TEMPORAL.Coverage
respectively, in the entire file. From LIDAR composite clouds, these would be the earliest
point and the latest point. For clouds created from imagery, this would be the collection
times of the earliest image and the latest image.
The NOISE group is intended to store the probability that a given point is not from a true surface. Some sources of noise are universal to the nature of point clouds, and some are modality specific. The metrics for uncertainty may vary across algorithms. Regardless, the result must in the end be converted to an estimate of probability so that it is modality agnostic as well as useful for human analysis, software filtering, or advanced visualization.
which is an array with g rows and 3 or 4 columns, where g is the number of channels in the
cloud. Each row is a unique channel, so the row index is the channel identification number
used in other references. Clouds from EO and LIDAR systems have 3 columns, and clouds
from SAR systems have 4 columns. The information is then populated in accordance with
§ 3.3.4 in Volume 1 of this standard.
The data can be tiled using any tiling strategy, such as a simple geospatial pattern, a quadtree, an octree, or a k-d tree. Metadata for each tile must then be computed to enable rapid searching for relevant tiles within the file. All pointwise attributes are then tiled in the same manner, so that the row index within a tile is the point-record identifier for the position and all associated attributes.
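A minimal tiling sketch with synthetic data: a fixed 2x2 geospatial grid stands in for whatever strategy is chosen, and every pointwise array is partitioned with the identical index sets, so that row i of each tile's arrays refers to the same point record.

```python
import numpy as np

rng = np.random.default_rng(0)
xyz = rng.uniform(0.0, 100.0, size=(1000, 3))  # synthetic positions (m)
intensity = rng.uniform(size=1000)             # synthetic pointwise attribute

# Assign each point to one of four geospatial tiles on a 2x2 x-y grid.
tile_id = (xyz[:, 0] >= 50.0).astype(int) * 2 + (xyz[:, 1] >= 50.0).astype(int)

tiles = {}
for d in range(4):
    idx = np.flatnonzero(tile_id == d)
    # Row i of POSITIONS and row i of INTENSITY in tile d describe the
    # same point record because both use the identical index set.
    tiles[d] = {'POSITIONS': xyz[idx], 'INTENSITY': intensity[idx]}

total = sum(len(t['POSITIONS']) for t in tiles.values())
print(total)
```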
MIPC uses the ECEF system to fix point cloud data from individual tiles to a common ref-
erence frame and to geolocate them at a specific point on the earth. Every tile in a CLOUD
has a local three-dimensional (3-D) Cartesian (L3DC) coordinate frame which is ECEF with
the option to translate and rotate. This option may be necessary for automated processing
and exploitation algorithms that prefer a local vertical axis (i.e., Z up).
2016-06-08 25
MIPC Vol. 2 NGA.STND.0055-02 1.0 MIPCFFDD
To utilize the translate and rotate options, it is necessary only to specify the location of the
origin of the local Cartesian frame in ECEF space, and the direction of the three coordinate
axes in ECEF space. Points can then be converted back and forth through simple translation
and rotation operations. Figure 6 illustrates the Local 3-D Cartesian (L3DC) system, its
principal definitions, and relationship to the ECEF frame.
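The conversion between the two frames reduces to one rotation and one translation. A sketch follows; the matrix values are hypothetical, and the convention assumed here (rows of R hold the L3DC axis unit vectors expressed in ECEF, t is the tile origin in ECEF) is one plausible reading of the ECEFRotate/ECEFTranslate layout, to be confirmed against Volume 1.

```python
import numpy as np

# Hypothetical per-tile transform: each row of R is one L3DC axis unit
# vector expressed in ECEF; t is the tile origin in ECEF (meters).
R = np.array([[0.0, 1.0, 0.0],
              [-1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0]])
t = np.array([6378137.0, 0.0, 0.0])

def l3dc_to_ecef(p_local):
    # Rotate the local vector into ECEF axes, then translate by the origin.
    return R.T @ p_local + t

def ecef_to_l3dc(p_ecef):
    # Translate to the tile origin, then project onto the local axes.
    return R @ (p_ecef - t)

p = np.array([10.0, 20.0, 30.0])
round_trip = ecef_to_l3dc(l3dc_to_ecef(p))
print(round_trip)
```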
– Scale metadata: provides the scale factor (multiplier) applied to the integer position values in the same linear translation.
• MODALITY a.CLOUD c.TILEMAP.Offsets
– Offset metadata: provides the intercept term in a linear translation of the integer position values to meters on a tile-by-tile basis.
Once the location of each tile’s origin is selected, a local Cartesian frame can be specified
where the z-axis is in a direction normal to the ellipsoid surface at the origin location, the
x-axis is in the east direction at the origin location, and the y-axis is in the north direction
at the origin location. The correspondence of the z-axis to vertical, the x-axis to east, and
the y-axis to north is only precisely true at the origin location. At locations away from the
origin, the z-direction departs from the true local up direction, and x and y depart from the true east and north directions; this effect increases with distance from the origin. As
long as the coordinate conversion is performed properly, this is not a source of error because
the positions of the points are still properly placed in 3-D space without any projection
distortion. In data sets of small spatial extent, this effect may not be appreciable. This
is not a coordinate system where the points have been projected to a local tangent plane.
Instead, all points are exactly placed in 3-D space without projection. For example, points
that lie exactly on the ellipsoid surface (i.e. height above the ellipsoid is zero) will still vary
in z and have a 3-D curvature in the shape of the ellipsoid, whereas in local East-North-Up
(ENU) or a Universal Transverse Mercator (UTM) projection, when the ellipsoid is used as
the elevation datum, these points would all lie on the x-y plane with a z value of zero.
To find the directions of the L3DC axes in ECEF space, let F (X, Y, Z) be the equation of the WGS-84 ellipsoid in ECEF. The z-axis in local Cartesian space, z′, is defined as the vertical direction at the origin point (X₀, Y₀, Z₀). The vertical direction is independent of the vertical elevation of the origin above or below the ellipsoid, and so can be determined at the surface. The surface normal is found by taking the gradient of the surface function and normalizing.
F(x, y, z) = x²/a² + y²/a² + z²/b²    (1)

z⃗′(X₀, Y₀, Z₀) = ∇F(X₀, Y₀, Z₀) = 2 (X₀/a², Y₀/a², Z₀/b²)    (2)

ẑ′(X₀, Y₀, Z₀) = ∇F(X₀, Y₀, Z₀) / ∥∇F(X₀, Y₀, Z₀)∥    (3)

a = 6378137.0 m
b = 6356752.314245 m
The local Cartesian x-axis, pointing in the east direction at the origin, is orthogonal to both the local z-axis and the ECEF Z axis (i.e., it is parallel to the ECEF X-Y plane). Therefore, x′ can be found by taking the cross product of the local vertical direction ẑ′ with the negative ECEF Z direction. The negative ECEF Z direction is used to ensure that x̂′ points in the direction of increasing longitude.

x̂′ = (ẑ′ × (−Ẑ)) / ∥ẑ′ × (−Ẑ)∥    (4)

The local Cartesian y-axis in the north direction, ŷ′, can then be found by taking the cross product of ẑ′ and x̂′:
ŷ′ = (ẑ′ × x̂′) / ∥ẑ′ × x̂′∥    (5)
There are two cases where these cross products go to zero, namely if the origin is placed precisely at the north or south pole, where the definitions of north and east become pathological. Conventions are established for MIPC and SIPC to deal with these cases. At the north pole, the ECEF X and Y directions are used for the local x and y axes, respectively; the ECEF Z axis is already the local vertical. At the south pole, the ECEF negative Z direction is the local vertical, the local Cartesian x-axis is in the positive ECEF X direction, and the local Cartesian y-axis is in the negative ECEF Y direction.
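The axis construction in equations (2) through (5), together with the pole conventions above, can be sketched in NumPy as follows; this is an illustration, not a normative implementation:

```python
import numpy as np

A = 6378137.0       # WGS-84 semi-major axis (m)
B = 6356752.314245  # WGS-84 semi-minor axis (m)

def l3dc_axes(X0, Y0, Z0):
    # Local vertical: normalized gradient of the ellipsoid function.
    grad = np.array([2.0 * X0 / A**2, 2.0 * Y0 / A**2, 2.0 * Z0 / B**2])
    z = grad / np.linalg.norm(grad)
    # East: cross the local vertical with the negative ECEF Z direction.
    cross = np.cross(z, [0.0, 0.0, -1.0])
    n = np.linalg.norm(cross)
    if n < 1e-12:  # origin at a pole: apply the stated conventions
        x = np.array([1.0, 0.0, 0.0])
        y = np.array([0.0, 1.0, 0.0]) if Z0 > 0 else np.array([0.0, -1.0, 0.0])
    else:
        x = cross / n
        y = np.cross(z, x)  # north completes the right-handed frame
    return x, y, z

# At (A, 0, 0) on the equator: up is +X, east is +Y, north is +Z.
x, y, z = l3dc_axes(A, 0.0, 0.0)
print(x, y, z)
```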
The vectors defining the local Cartesian axes directions in ECEF should now be stored in:
MODALITY a.CLOUD c.TILEMAP.ECEFRotate
in accordance with Volume 1 of this standard. Specifically, the rotation matrix is an array for each tile with 3 rows and 3 columns, where columns j = 1, 2, 3 represent the unit vector's components in the ECEF x, y, z directions, respectively. The array for each tile is then a page into a single 3-D array dataset, ECEFRotate.
in accordance with Volume 1 of this standard. Specifically, the coordinates are an array for each tile with 1 row and 3 columns, where columns j = 1, 2, 3 represent the ECEF x, y, z positions, respectively. The array for each tile is then a page into a single 3-D array dataset, ECEFTranslate.
x_SI = ⌊ (x′ − offset_x) / scale_x + 0.5 ⌋    (10)

and similarly for y and z. The SI subscript denotes the actual scaled integer value stored in the file. Notice that x_SI is the floor of the function in equation 10. N is the number of points in the tile, and,
where b = 16, 32, or 64, depending on the bit depth chosen for storage.
The offset terms provide the intercept in a linear translation of the integer position values to meters, so offset_x should be set to the minimum value of x′ found in the tile, and similarly for offset_y and offset_z.
The scale factor is the multiplier applied to the integer position values in the same linear translation. This factor's minimum value is the maximum tile dimension in X, Y, or Z (across all tiles in the channel) divided by 2¹⁶ or 2³², but it should be set considering the precision of the data, so that meaningful precision is not lost and artificial precision is not introduced.
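The scaled-integer round trip implied by equation (10) can be sketched as follows, with the offset taken as the tile minimum and an illustrative millimeter-scale factor:

```python
import numpy as np

def encode(x, offset, scale, bits=32):
    # Equation (10): scaled-integer values actually stored in the file.
    dtype = {16: np.uint16, 32: np.uint32, 64: np.uint64}[bits]
    return np.floor((x - offset) / scale + 0.5).astype(dtype)

def decode(x_si, offset, scale):
    # Inverse linear translation back to meters.
    return x_si.astype(np.float64) * scale + offset

x = np.array([100.0, 100.25, 103.999])  # synthetic coordinates (m)
offset = x.min()                        # intercept: tile minimum
scale = 0.001                           # illustrative 1 mm precision
x_back = decode(encode(x, offset, scale), offset, scale)
print(x_back)
```

The reconstruction error of the round trip is bounded by half the scale factor, which is why the scale should reflect the true precision of the data.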
respectively.
3.6.2 GPM
MIPC implements the GPM as a method for storing and calculating error in geopositioning.
If a complete GPM implementation is desired, this information is stored in the groups and
datasets under:
This group has a dataset, n_Records, indicating the number of error records, and an array dataset, Covariance, storing different covariance matrices. The Covariance dataset is an n × 6 array, where n is the error index, which allows fewer error records than points to be stored, assuming that some error records are shared by multiple points. The 6 columns are the variances in x, y, z along with the covariances in xy, xz, yz.
as a single integer index into the table of errors under GPM.PPE. The data under the ERROR group is tiled, named Tile_d, where d is the integer index identical to the same tile under the POSITIONS group. Within a tile dataset, each row corresponds to the point at the same row in the same Tile_d in the POSITIONS group.
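A sketch of the per-point error lookup: each point record carries one integer index into the shared covariance table, and the 6 stored values expand to a symmetric 3 × 3 matrix. The numbers below are hypothetical.

```python
import numpy as np

# Hypothetical covariance table: one row per error record, 6 columns
# (var_x, var_y, var_z, cov_xy, cov_xz, cov_yz).
covariance = np.array([[0.04, 0.04, 0.09, 0.00, 0.0, 0.0],
                       [0.25, 0.25, 0.49, 0.01, 0.0, 0.0]])

# One index per point record; fewer error records than points.
point_error_index = np.array([0, 1, 1, 0])

def expand(record):
    # Rebuild the full symmetric 3x3 covariance matrix from the 6 columns.
    vx, vy, vz, cxy, cxz, cyz = record
    return np.array([[vx, cxy, cxz],
                     [cxy, vy, cyz],
                     [cxz, cyz, vz]])

C = expand(covariance[point_error_index[2]])  # covariance for point 2
print(C)
```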
4 Extending MIPC
This section provides information on extending the MIPC standard. Although HDF5 is a flexible file format, a standard such as MIPC is, by definition, not flexible. Blind flexibility leads to interoperability challenges. However, MIPC is tailorable in a standard manner to meet the requirements of the implementation.
MIPC was designed to be tailorable in the following manner. The file is hierarchical so that each branch is self-sufficient. The MIPC standard defines the minimum required content and standardizes optional information. The current structure should not be changed without a formal process. However, there is room to add information in the form of groups and datasets, as described in the following sections. Additionally, HDF attributes (text that describes a group or dataset) were intentionally left out of the standard so that they remain available for use as desired by programs and systems.
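A sketch of that use of attributes with h5py; the attribute name and value here are program-specific examples, not part of the standard:

```python
import os
import tempfile

import h5py

path = os.path.join(tempfile.mkdtemp(), 'attrs.h5')
with h5py.File(path, 'w') as f:
    cloud = f.create_group('MODALITY_0/CLOUD_0')
    # Attributes are free for producers to use; this name is illustrative.
    cloud.attrs['Comment'] = 'produced by a hypothetical fusion pipeline'

with h5py.File(path, 'r') as f:
    note = f['MODALITY_0/CLOUD_0'].attrs['Comment']

print(note)
```

Because attributes are outside the standardized content, readers should treat them as optional annotations and never rely on them for required metadata.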
the images are overviews related to a specific CLOUD, then they should be included under that CLOUD. If they are quicklooks of the source imagery from a specific sensor, they should be included under LOOKS. Example paths are given below:
Glossary
application programming interface a library of software source code or executables with
supporting documentation that simplifies and encapsulates the interface to third party
components or data in order to streamline programming. 2
CamelCase a syntax where words are combined without spaces or underscores, but the
first letter of each word is uppercase, and all other letters are lowercase. The first
letter may or may not be uppercase depending on the meaning and convention. 6
channel a specific sensor configuration that modulates the intensity data in ground-space.
This will typically indicate a combination of the spectral band and the polarization
state. 25
data generally refers to the input to some process or system. For systems engineering
models of NGA CONOPS, this term refers to GEOINT signal content that have been
processed and are ready for analyst exploitation. This data can be used as input to
various analysis tasks. iii
group a data component within HDF used to organize data, similar to a directory in a file
system. 5
metadata generally refers to data about data (signal). This is information that refers to
the sensor specifics, or the manner in which the data was collected. This is sometimes
referred to as Narrow Band Data or Support Data. iii
Acronyms
2-D two-dimensional. iii, 23
ENU East-North-Up. 27
EO Electro-Optical. iii
KLV Key-Length-Value. 1
TB Terabytes. 4