
An Introduction to Multisensor Data Fusion

DAVID L. HALL, SENIOR MEMBER, IEEE, AND JAMES LLINAS

Invited Paper

Multisensor data fusion is an emerging technology applied to Department of Defense (DoD) areas such as automated target recognition, battlefield surveillance, and guidance and control of autonomous vehicles, and to non-DoD applications such as monitoring of complex machinery, medical diagnosis, and smart buildings. Techniques for multisensor data fusion are drawn from a wide range of areas including artificial intelligence, pattern recognition, and statistical estimation. This paper provides a tutorial on data fusion, introducing data fusion applications, process models, and identification of applicable techniques. Comments are made on the state-of-the-art in data fusion.

Manuscript received April 23, 1996; revised October 14, 1996.
D. L. Hall is with the Applied Research Laboratory, The Pennsylvania State University, University Park, PA 16802 USA (e-mail: dlh28@psu.edu).
J. Llinas is with the State University of New York, Buffalo, NY 14260 USA (e-mail: llinas@acsu.buffalo.edu).
Publisher Item Identifier S 0018-9219(97)00775-5.
0018–9219/97$10.00 © 1997 IEEE
PROCEEDINGS OF THE IEEE, VOL. 85, NO. 1, JANUARY 1997

I. INTRODUCTION

In recent years, multisensor data fusion has received significant attention for both military and nonmilitary applications. Data fusion techniques combine data from multiple sensors, and related information from associated databases, to achieve improved accuracies and more specific inferences than could be achieved by the use of a single sensor alone [1]–[4]. The concept of multisensor data fusion is hardly new. Humans and animals have evolved the capability to use multiple senses to improve their ability to survive. For example, it may not be possible to assess the quality of an edible substance based solely on the sense of vision or touch, but evaluation of edibility may be achieved using a combination of sight, touch, smell, and taste. Similarly, while one is unable to see around corners or through vegetation, the sense of hearing can provide advance warning of impending dangers. Thus multisensor data fusion is naturally performed by animals and humans to achieve a more accurate assessment of the surrounding environment and identification of threats, thereby improving their chances of survival.

While the concept of data fusion is not new, the emergence of new sensors, advanced processing techniques, and improved processing hardware makes real-time fusion of data increasingly possible [5], [6]. Just as the advent of symbolic processing computers (viz., the Symbolics computer and the Lambda machine) in the early 1970's provided an impetus to artificial intelligence [119], recent advances in computing and sensing have provided the ability to emulate, in hardware and software, the natural data fusion capabilities of humans and animals. Currently, data fusion systems are used extensively for target tracking, automated identification of targets, and limited automated reasoning applications. Spurred by significant expenditures by the Department of Defense (DoD), data fusion technology has rapidly advanced from a loose collection of related techniques to an emerging true engineering discipline with standardized terminology (see Fig. 1), collections of robust mathematical techniques [2]–[4], and established system design principles. Software for data fusion applications is becoming available in the commercial marketplace [16].

Applications for multisensor data fusion are widespread. Military applications include automated target recognition (e.g., for smart weapons), guidance for autonomous vehicles, remote sensing, battlefield surveillance, and automated threat recognition systems, such as identification-friend-foe-neutral (IFFN) systems [14]. Nonmilitary applications include monitoring of manufacturing processes, condition-based maintenance of complex machinery, robotics [129], and medical applications. Techniques to combine or fuse data are drawn from a diverse set of more traditional disciplines, including digital signal processing, statistical estimation, control theory, artificial intelligence, and classic numerical methods [16], [12], [54]. Historically, data fusion methods were developed primarily for military applications; however, in recent years these methods have been applied to civilian applications, and there has been bidirectional technology transfer [5]. Various annual conferences provide a forum for discussing data fusion applications and techniques [7]–[10].

In principle, fusion of multisensor data provides significant advantages over single-source data. In addition to the statistical advantage gained by combining same-source data (e.g., obtaining an improved estimate of a physical phenomenon via redundant observations), the use of multiple types of sensors may increase the accuracy with which a quantity can be observed and characterized.
Fig. 1. Table of terminology and definitions.

Fig. 2. FLIR and radar sensor data correlation.
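The error-reduction effect shown in Fig. 2 can be sketched numerically. Assuming each sensor's estimate of a common quantity (here, azimuth) is an independent Gaussian, an inverse-variance weighted combination yields a fused estimate with lower variance than either sensor alone. The sensor values below are hypothetical, not taken from the figure.

```python
def fuse_estimates(est_a, var_a, est_b, var_b):
    # Minimum-variance (inverse-variance weighted) combination of
    # two independent Gaussian estimates of the same quantity.
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused_est = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused_est, fused_var

# Hypothetical azimuth estimates, in degrees: a radar (poor angular
# accuracy) and an infrared imager (good angular accuracy).
az, var = fuse_estimates(30.5, 4.0, 29.8, 0.25)
# The fused variance is below that of the better sensor alone.
assert var < 0.25
```

Applied per axis, the same weighting produces the reduced error region of the combined location estimate: range uncertainty is dominated by the radar, angular uncertainty by the infrared imager.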

Fig. 2 [2] provides a simple example of a moving object, such as an aircraft, observed by both a pulsed radar and an infrared imaging sensor. The radar provides the ability to accurately determine the aircraft's range, but has a limited ability to determine the angular direction of the aircraft. By contrast, the infrared imaging sensor can accurately determine the aircraft's angular direction, but is unable to measure range. If these two observations are correctly associated (as shown in the central part of the figure), then the combination of the two sensors' data provides a better determination of location than could be obtained by either of the two independent sensors alone. This results in a reduced error region, as shown in the fused or combined location estimate. A similar effect may be obtained in determining the identity of an object based on observations of the object's attributes. For example, there is evidence that bats identify their prey by a combination of factors that include size, texture (based on acoustic signature), and kinematic behavior.

Fig. 3. Inference hierarchy.

The most fundamental characterization of data fusion involves a hierarchical transformation between observed energy or parameters (provided by multiple sources as input) and a decision or inference (produced by fusion estimation and/or inference processes) regarding the location, characteristics, and identity of an entity, and an interpretation of the observed entity in the context of a surrounding environment and relationships to other entities (see Fig. 3). The definition of what constitutes an entity depends upon the specific application under consideration (e.g., an enemy aircraft for a tactical air-defense application, or the location and characteristics of a tumor in a medical diagnosis application). The transformation between observed energy or parameters and a decision or inference proceeds from an observed signal to progressively more abstract concepts. In a target tracking application, for example, multisensor energy, converted to observations of angular direction, range, and range rate, may be converted in turn into an estimate of the target's position and velocity (using observations from one or more sensors). Similarly, observations of the target's attributes, such as radar cross section, infrared spectrum, and visual image, may be used to classify the target and allow a feature-based classifier to assign a label specifying target identity (e.g., F-16 aircraft). Finally, understanding the motion of the target, and its relative motion with respect to the observer, may allow a determination of the intent of the target (e.g., threat, no-threat, etc.).

The determination of the target's position and velocity from a noisy time series of measurements constitutes a classical statistical estimation problem [68], [62], [63]. Modern techniques involve the use of sequential estimation techniques such as the Kalman filter or its variants. To establish target identity, a transformation must be made between observed target attributes and a labeled identity. Methods for identity estimation involve pattern recognition techniques based on clustering algorithms, neural networks, or decision-based methods such as Bayesian inference [107], Dempster–Shafer's method [111], [130], [110], or weighted decision techniques [3]. Finally, the interpretation of the target's intent entails automated reasoning using implicit and explicit information, via knowledge-based methods such as rule-based reasoning systems [116]–[118].

Observational data may be combined, or fused, at a variety of levels: the raw data (or observation) level, the state vector level, or the decision level. Raw sensor data can be directly combined if the sensor data are commensurate (i.e., if the sensors are measuring the same physical phenomenon, such as two visual image sensors or two acoustic sensors). Techniques for raw data fusion typically involve classic detection and estimation methods. Conversely, if the sensor data are noncommensurate, then the data must be fused at a feature/state vector level or decision level.

Feature-level fusion involves the extraction of representative features from sensor data. An example of feature extraction is the use of characteristics of a human's face to represent a picture of the human; this technique is used by cartoonists and political satirists to evoke recognition of famous figures, and there is evidence that humans utilize a feature-based cognitive function to recognize objects. In feature-level fusion, features are extracted from multiple sensor observations and combined into a single concatenated feature vector, which is input to pattern recognition approaches based on neural networks, clustering algorithms, or template methods.

Finally, decision-level fusion involves fusion of sensor information after each sensor has made a preliminary determination of an entity's location, attributes, and identity. Examples of decision-level fusion methods include weighted decision methods (voting techniques), classical inference, Bayesian inference, and Dempster–Shafer's method.

Fig. 4. Example of Monte Carlo evaluation of data fusion benefits.

Qualitative advantages of data fusion for DoD systems have been cited by numerous authors. Waltz [1], for example, cites the following benefits for tactical military systems: robust operational performance, extended spatial coverage, extended temporal coverage, increased confidence (i.e., of target location and identity), reduced ambiguity, improved target detection, enhanced spatial resolution, improved system reliability, and increased dimensionality. Waltz also performed Monte Carlo numerical studies to show the quantitative utility of data fusion for improved noncooperative target recognition (see Fig. 4), leading to advantages in tactical air-to-air engagements.

Despite these qualitative notions and quantitative calculations of improved system operation by using multiple sensors and fusion processes, actual implementation of effective data fusion systems is far from simple. In practice, fusion of sensor data may actually produce worse results than could be obtained by tasking the most appropriate sensor in a sensor suite.
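The decision-level fusion methods listed above can be sketched with a minimal Bayesian-inference example: each sensor reports a likelihood for each candidate identity, and Bayes' rule combines them under an assumption of conditionally independent sensors. The class labels and numbers below are hypothetical.

```python
def bayes_fuse(prior, sensor_likelihoods):
    # Decision-level identity fusion: posterior(class) is proportional
    # to prior(class) times the product of each sensor's likelihood
    # for that class, assuming conditionally independent sensors.
    posterior = dict(prior)
    for likelihood in sensor_likelihoods:
        for c in posterior:
            posterior[c] *= likelihood[c]
    total = sum(posterior.values())
    return {c: p / total for c, p in posterior.items()}

prior = {"fighter": 0.3, "transport": 0.7}
radar = {"fighter": 0.8, "transport": 0.3}     # hypothetical values
infrared = {"fighter": 0.7, "transport": 0.4}  # hypothetical values
posterior = bayes_fuse(prior, [radar, infrared])
# Two individually weak declarations reinforce each other here;
# with conflicting or biased likelihoods the same rule can degrade
# the answer, which is the pitfall noted above.
```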

Fig. 5. DoD applications summary.

This is caused by the attempt to combine accurate (i.e., good) data with inaccurate or biased data, especially if the uncertainties or variances of the data are unknown. Quantitative evaluation of the effectiveness of a data fusion system must, in most cases, be performed by Monte Carlo simulations or covariance error analysis techniques [3], [46], [47]. Fundamental issues to be addressed in building a data fusion system for a particular application include:

1) what algorithms or techniques are appropriate and optimal for a particular application;
2) what architecture should be used (i.e., where in the processing flow should data be fused);
3) how should the individual sensor data be processed to extract the maximum amount of information;
4) what accuracy can realistically be achieved by a data fusion process;
5) how can the fusion process be optimized in a dynamic sense;
6) how does the data collection environment (i.e., signal propagation, target characteristics, etc.) affect the processing;
7) under what conditions does multisensor data fusion improve system operation?

This paper provides a brief overview of multisensor data fusion technology and its applications. An introduction to data fusion techniques is provided along with a discussion of some fundamental issues. Some projections for the future of data fusion are provided along with an assessment of the state-of-the-art and state-of-practice.

II. MILITARY APPLICATIONS OF DATA FUSION

Two broad communities have focused on data fusion for specific applications: DoD and non-DoD. We will address each of these in turn and provide examples of applications.

The DoD community focuses on problems involving the location, characterization, and identification of dynamic entities such as emitters, platforms, weapons, and military units. These dynamic data are often termed an Order-of-Battle database or Order-of-Battle display (if superimposed on a map display). Beyond achieving an Order-of-Battle database, DoD users seek higher level inferences about the enemy situation (i.e., the relationships among entities, their relationships with the environment, higher level enemy entity organizations, etc.). Examples of DoD-related applications include ocean surveillance; air-to-air defense; battlefield intelligence, surveillance, and target acquisition; and strategic warning and defense (see Fig. 5). Each of these military applications involves a particular focus, sensor suite, desired set of inferences, and a particular set of challenges.

Fig. 6. An example of multisensor ocean surveillance.

Fig. 7. Overview of non-DoD applications.
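Tracking the dynamic entities discussed in this section rests on sequential estimation, e.g., the Kalman filter mentioned earlier or its fixed-gain alpha-beta simplification. The sketch below is a minimal alpha-beta filter with illustrative gains and a noise-free constant-velocity track; it is not drawn from any specific system described in this paper.

```python
def alpha_beta_track(measurements, dt=1.0, alpha=0.5, beta=0.3):
    # Fixed-gain alpha-beta filter: predict position with the current
    # velocity estimate, then correct position and velocity by fixed
    # fractions of the measurement residual.
    x, v = measurements[0], 0.0
    for z in measurements[1:]:
        x_pred = x + v * dt          # predicted position
        residual = z - x_pred        # innovation
        x = x_pred + alpha * residual
        v = v + (beta / dt) * residual
    return x, v

# Noise-free constant-velocity track: position k at time step k.
x, v = alpha_beta_track(list(range(30)))
# The velocity estimate settles near the true value of 1.0.
```

For these gains the filter is stable and has zero steady-state lag on a constant-velocity target; a Kalman filter would instead compute time-varying gains from modeled process and measurement noise.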

Ocean surveillance systems are designed to detect, track, and identify ocean-based targets and events. Examples include antisubmarine warfare systems to support Navy tactical fleet operations (Fig. 6), and automated systems to guide autonomous vehicles. Sensor suites may include radar, sonar, electronic intelligence (ELINT), observation of communications traffic (COMINT), infrared, and synthetic aperture radar (SAR) observations [100]. The surveillance area for ocean surveillance may encompass hundreds of square nautical miles, with a focus on air, surface, and subsurface targets. Multiple surveillance platforms may also be involved, with numerous targets tracked. Challenges to ocean surveillance involve the large surveillance volume, the combination of targets and sensors, and the complex signal propagation environment, especially for underwater sonar sensing. An example of an ocean surveillance system is shown in Fig. 6.

Air-to-air and surface-to-air defense systems have been developed by the military to detect, track, and identify aircraft and anti-aircraft weapons and sensors. These defense systems use sensors such as radar, passive electronic support measures (ESM), infrared, identification-friend-foe (IFF) sensors, electro-optic image sensors, and visual (human) sightings. These systems support counter-air, order-of-battle aggregation, assignment of aircraft to raids, target prioritization, route planning, and other activities. Challenges to these data fusion systems include enemy countermeasures, the need for rapid decision making, and potentially large combinations of target-sensor pairings. A special challenge for IFF systems is the need to confidently and noncooperatively identify enemy aircraft. The proliferation of weapon systems throughout the world, and the consequent lack of relationship between the nationality of weapon origin and the combatants who use the weaponry, causes increased IFF challenges.

Another application, Battlefield Intelligence, Surveillance, and Target Acquisition, attempts to detect and identify potential ground targets. Examples include the location of land mines and automatic target recognition of high-value targets. Sensors include airborne surveillance via Moving Target Indicator (MTI) radar, synthetic aperture radar, passive electronic support measures, photo reconnaissance, ground-based acoustic sensors, remotely piloted vehicles, electro-optic sensors, and infrared sensors. Key inferences sought are information to support battlefield situation assessment, threat assessment, and course-of-action estimation.

A detailed discussion of DoD data fusion applications can be found in the collected annual Proceedings of the Data Fusion Systems Conference [7], Proceedings of the National Symposium on Sensor Fusion [9], and various strategic documents [15].

III. NONMILITARY APPLICATIONS OF DATA FUSION

A second broad community which addresses data fusion problems is the academic/commercial/industrial community. This diverse group addresses problems such as the implementation of robotics, automated control of industrial manufacturing systems, development of smart buildings, and medical applications (see Fig. 7), among other evolving applications. As with the military applications, each of these applications has particular challenges, sensor suites, and implementation environments.

Remote sensing systems have been developed to identify and locate entities and objects. Examples include systems to monitor agricultural resources (e.g., the productivity and health of crops), to locate natural resources, and to monitor weather and natural disasters.

Fig. 8. Top level data fusion process model.

These systems rely primarily on image systems using multispectral sensors. Such processing systems are dominated by automatic and multispectral image processing. The multispectral imagery employed includes the Landsat satellite system, the SPOT system, or others. A frequently used technique for multisensor image fusion involves adaptive neural networks. Multi-image data are processed on a pixel-by-pixel basis and input to a neural network to automatically classify the contents of the image. False colors are usually associated with types of crops, vegetation, or classes of objects, and the resulting false-color synthetic image is readily interpreted by human analysts. A key challenge in multi-image data fusion is interimage co-registration. This problem requires the alignment of two or more photos so that the images are overlaid in such a way that corresponding picture elements (pixels) on each picture represent the same location on earth (each pixel represents the same direction from an observer's point of view). This co-registration problem is exacerbated by the fact that image sensors are nonlinear and perform a complex transformation between observed three-dimensional (3-D) space and a two-dimensional (2-D) image plane [86].

A second application area, which spans both military and nonmilitary users, is the monitoring of complex mechanical equipment such as turbomachinery, helicopter gear trains, or industrial manufacturing equipment. For a drivetrain application, for example, available sensor data may include accelerometers, temperature gauges, oil debris monitors, acoustic sensors, and even infrared measurements. An on-line condition monitoring system would seek to combine these observations in order to identify precursors to failure, such as abnormal gear wear, shaft misalignment, or bearing failure. It is anticipated that the use of such condition-based monitoring would reduce maintenance costs, improve safety, and improve reliability [126]. Such systems are beginning to be developed for helicopters and other high-cost systems. Special difficulties for data fusion involve noncommensurate sensors and challenging signal propagation and noise environments.

A final example of a data fusion system for nonmilitary applications is the area of medical diagnosis. Increasingly sophisticated sensors are being developed for medical applications. Sensors such as nuclear magnetic resonance (NMR) devices, acoustic imaging devices, and medical tests individually provide improvements in medical diagnostic capability. The ability to fuse these data together promises to improve diagnostic capability and reduce false diagnoses. A clear challenge here is the signal propagation environment, and the difficulty of obtaining training data for adaptive techniques such as neural networks.

Military and nonmilitary communities are beginning to share information to create real technology transfer across application domains. For example, the first International Conference on Multi-Sensor Fusion and Integration for Intelligent Systems was sponsored by the IEEE and held in Las Vegas, NV, on 2–5 October 1994 [10]. Also, annual (on-going) SPIE conferences focus on non-DoD applications [8].

IV. A DATA FUSION PROCESS MODEL

One of the historical barriers to technology transfer in data fusion has been the lack of a unifying terminology which crosses application-specific boundaries. Even within military applications, related but different applications such as IFF systems, battlefield surveillance, and automatic target recognition have used different definitions for fundamental terms such as correlation and data fusion. In order to improve communications among military researchers and system developers, the Joint Directors of Laboratories (JDL) Data Fusion Working Group, established in 1986, began an effort to codify the terminology related to data fusion. The result of that effort was the creation of a process model for data fusion and a Data Fusion Lexicon [12], [11]. The top level of the JDL data fusion process model is shown in Fig. 8.
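The top-level decomposition of the JDL model can be paraphrased as a simple enumeration. This is a reading aid only, not an official schema from the JDL lexicon; the names summarize the level descriptions given in this section.

```python
from enum import Enum

class JDLLevel(Enum):
    # Processing levels of the JDL data fusion process model,
    # paraphrased from the descriptions in this section.
    SOURCE_PREPROCESSING = 0   # process assignment and pre-screening
    OBJECT_REFINEMENT = 1      # Level 1: tracks, attributes, identity
    SITUATION_REFINEMENT = 2   # Level 2: relationships among objects
    THREAT_REFINEMENT = 3      # Level 3: projection, intent, lethality
    PROCESS_REFINEMENT = 4     # Level 4: meta-process, sensor tasking

# Example: routing an incoming report by the level that consumes it.
assert JDLLevel(1) is JDLLevel.OBJECT_REFINEMENT
```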

The JDL process model is a functionally oriented model of data fusion, intended to be very general and useful across multiple application areas. While the boundaries of the data fusion process are fuzzy and case-by-case dependent, generally speaking the input boundary is usually at the post-detection, extracted-parameter level of signal processing. The output of the data fusion process is (ideally) a minimally ambiguous identification and characterization (viz., location and attributes) of individual entities, as well as a higher level interpretation of those entities in the context of the application environment.

The JDL Data Fusion Process model is a conceptual model which identifies the processes, functions, categories of techniques, and specific techniques applicable to data fusion. The model is a two-layer hierarchy. At the top level, shown in Fig. 8, the data fusion process is conceptualized by sources of information, human computer interaction, source preprocessing, Level 1 processing, Level 2 processing, Level 3 processing, and Level 4 processing. Each of these is summarized below and in Fig. 9.

• Sources of Information. The left side of Fig. 8 indicates that a number of sources of information may be available as input, including: 1) local sensors associated with a data fusion system (e.g., sensors physically associated with the data fusion system or organic sensors physically integrated with a data fusion system platform), 2) distributed sensors linked electronically to a fusion system, and 3) other data such as reference information, geographical information, etc.

• Human Computer Interaction (HCI). The right side of Fig. 8 shows the human computer interaction (HCI) function for fusion systems. HCI allows human input such as commands, information requests, human assessments of inferences, and reports from human operators. In addition, HCI is the mechanism by which a fusion system communicates results via alerts, displays, and dynamic overlays of positional and identity information on geographical displays. In general, HCI incorporates not only multimedia methods for human interaction (graphics, sound, tactile interface, etc.), but also methods to assist humans in directing their attention and in overcoming human cognitive limitations (e.g., difficulty in processing negative information).

• Source Preprocessing (Process Assignment). An initial process allocates data to appropriate processes and performs data pre-screening. Source preprocessing reduces the data fusion system load by allocating data to appropriate processes (e.g., locational and attribute data to Level 1 object refinement, alerts to Level 3 processing, etc.). Source preprocessing also forces the data fusion process to concentrate on the data most pertinent to the current situation. Extensive signal processing and detection theory may be required [41], [42]. A special case of source preprocessing is the synthesis of multiple-component sensory array data to estimate the location and velocity of a target.

• Level 1 Processing (Object Refinement). This process combines locational, parametric, and identity information to achieve refined representations of individual objects (e.g., emitters, platforms, weapons, or geographically constrained military units). Level 1 processing performs four key functions: 1) transforms sensor data into a consistent set of units and coordinates, 2) refines and extends in time estimates of an object's position, kinematics, or attributes, 3) assigns data to objects to allow the application of statistical estimation techniques, and 4) refines the estimation of an object's identity or classification.

• Level 2 Processing (Situation Refinement). Level 2 processing develops a description of current relationships among objects and events in the context of their environment. Distributions of individual objects (defined by Level 1 processing) are examined to aggregate them into operationally meaningful combat units and weapon systems. In addition, situation refinement focuses on relational information (i.e., physical proximity, communications, causal, temporal, and other relations) to determine the meaning of a collection of entities. This analysis is performed in the context of environmental information about terrain, surrounding media, hydrology, weather, and other factors. Situation refinement addresses the interpretation of data, analogous to how a human might interpret the meaning of sensor data. Both formal and heuristic techniques are used to examine, in a conditional sense, the meaning of Level 1 processing results.

• Level 3 Processing (Threat Refinement). Level 3 processing projects the current situation into the future to draw inferences about enemy threats, friendly and enemy vulnerabilities, and opportunities for operations. Threat assessment is especially difficult because it deals not only with computing possible engagement outcomes, but also with assessing an enemy's intent based on knowledge about enemy doctrine, level of training, political environment, and the current situation. The overall focus is on intent, lethality, and opportunity. Level 3 processing develops alternate hypotheses about an enemy's strategies and the effect of uncertain knowledge about enemy units, tactics, and the environment. Game-theoretic techniques are applicable for Level 3 processing.

• Level 4 Processing (Process Refinement). Level 4 processing may be considered a meta-process, i.e., a process concerned with other processes. Level 4 processing performs four key functions: 1) monitors the data fusion process performance to provide information about real-time control and long-term performance, 2) identifies what information is needed to improve the multilevel fusion product (inferences, positions, identities, etc.), 3) determines the source-specific requirements to collect relevant information (i.e., which sensor type, which specific sensor, which database), and 4) allocates and directs the sources to achieve mission goals. This latter function may be outside the domain of specific data fusion functions. Hence, Level 4 processing is shown as partially inside and partially outside the data fusion process.

12 Authorized licensed use limited to: UNIVERSITE Cote d'Azur (Nice). Downloaded on JulyPROCEEDINGS
18,2024 at 05:59:42 UTCIEEE,
OF THE from IEEE
VOL. Xplore.
85, NO. Restrictions
1, JANUARY apply.
1997
Fig. 9. JDL process model components.

database management. This collection of functions provides access to, and management of, data fusion databases, including data retrieval, storage, archiving, compression, relational queries, and data protection. Database management for data fusion systems is particularly difficult because of the large and varied data managed (i.e., images, signal data, vectors, textual data) and the data rates, both for ingestion of incoming sensor data and for rapid retrieval.

A summary of the JDL data fusion process components is shown in Fig. 9. Each of these components can be hierarchically broken down into subprocesses. The first-level decomposition and the associated applicable problem-solving techniques are shown in Fig. 10. For example, Level 1 processing is subdivided into four types of functions: data alignment; data/object correlation; object positional, kinematic, and attribute estimation; and object identity estimation. The object positional, kinematic, and attribute estimation function is further subdivided into system models, defined optimization criteria, optimization approaches, and basic processing approach. At this lowest level in the hierarchy (shown in the third column of Fig. 10), specific methods such as Kalman filters, alpha-beta filters, and multiple hypothesis trackers are identified to perform each function.

Fig. 10. Examples of data fusion algorithms and techniques.

The JDL model described here is generic and is intended merely as a basis for common understanding and discussion. The separation of processes into Levels 1–4 is an artificial partition; implementations of real data fusion systems integrate and interleave these functions into an overall processing flow. The data fusion process model is augmented by a hierarchical taxonomy which identifies categories of techniques and algorithms for performing the identified functions. In addition, an associated lexicon has been developed to provide a consistent definition of data fusion terminology [11]. The JDL model, while originally developed for military applications, is clearly applicable to nonmilitary applications. For example, in condition-based monitoring, the concept of Level 3 threat refinement can be associated with the identification of potential system mechanical faults (and their anticipated progression). Thus, the JDL model is useful for nonmilitary applications. Indeed, the JDL model terminology is beginning to experience wide utilization and acceptance throughout the data fusion technical community.

It should be noted, however, that there have been a number of extensions to the JDL model, as well as discussion about its overall utility. Waltz [86], for example, demonstrated that the JDL model does not adequately address multi-image fusion problems. Waltz described how the JDL model can be extended to include concepts of fusion of image data, especially those involving complex synthetic aperture imagery. Hall and Ogrodnik [127] extended the model further to account for complex meta sensors (e.g., sensors involving multiple components and utilization of wideband processing techniques). Bowman [128] has argued that the JDL model is useful, but does not help in developing an architecture for a real system. Bowman has developed the concept of a hierarchical data fusion tree to partition fusion problems into nodes, each conceptually involving functions such as data association, correlation, and estimation.

V. ARCHITECTURES FOR MULTISENSOR DATA FUSION

One of the key issues in developing a multisensor data fusion system is the question of where in the data flow to actually combine or fuse the data. We will focus on two situations for Level 1 fusion: 1) fusion of locational information (such as observed range, azimuth, and elevation) to determine the position and velocity of a moving object, and 2) fusion of parametric data (such as radar cross section, infrared spectra, etc.) to determine the identity of an observed object. We will discuss these two cases separately, though in an actual system, fusion of locational and parametric identity information could be performed in an integrated fashion.
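Case 1 above rests on combining independent estimates of the same physical quantity. As a hedged illustration (the sensor values and variances below are hypothetical, not from the paper), the standard inverse-variance weighting for fusing two noisy measurements can be sketched as:

```python
# Minimal sketch of locational (case 1) fusion: combine two sensors'
# measurements of the same quantity, weighting each by the inverse of
# its variance so the more accurate sensor dominates.

def fuse_measurements(z1, var1, z2, var2):
    """Fuse two noisy scalar measurements of the same quantity."""
    w1 = 1.0 / var1
    w2 = 1.0 / var2
    fused = (w1 * z1 + w2 * z2) / (w1 + w2)
    # The fused variance is always smaller than either input variance.
    fused_var = 1.0 / (w1 + w2)
    return fused, fused_var

# Hypothetical example: a radar range of 1000 m (variance 100 m^2) fused
# with a more precise laser range of 1010 m (variance 25 m^2).
est, var = fuse_measurements(1000.0, 100.0, 1010.0, 25.0)
print(est, var)  # estimate lies closer to the laser value; variance drops to 20
```

The same weighting generalizes to vectors and covariance matrices, which is the form used inside the sequential estimators discussed below.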

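For case 2 (identity fusion), one of the decision level techniques discussed in this section is Bayesian inference. A minimal, hypothetical sketch of combining two sensors' identity declarations (the class labels and numbers are invented for illustration, and conditional independence of the sensors is assumed):

```python
# Sketch of decision level identity fusion by Bayesian combination.
# Each sensor reports a likelihood over candidate target classes; the
# fused posterior is proportional to the prior times the product of the
# per-sensor likelihoods (independence assumption).

def bayes_fuse(prior, sensor_likelihoods):
    """Combine per-sensor class likelihoods into a fused posterior."""
    posterior = dict(prior)
    for likelihood in sensor_likelihoods:
        for cls in posterior:
            posterior[cls] *= likelihood[cls]
    total = sum(posterior.values())
    return {cls: p / total for cls, p in posterior.items()}

prior = {"fighter": 0.5, "bomber": 0.5}
radar = {"fighter": 0.7, "bomber": 0.3}  # e.g., radar cross section evidence
ir = {"fighter": 0.6, "bomber": 0.4}     # e.g., infrared spectrum evidence
fused = bayes_fuse(prior, [radar, ir])
print(max(fused, key=fused.get))  # prints "fighter": both sensors favor it
```

Weighted decision methods and Dempster–Shafer combination, also named in this section, replace the product rule above with other evidence-combination rules.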
Fig. 11. Generic tracker/correlator architectures.
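As a concrete, deliberately simplified sketch of the kind of single-sensor tracker feeding the architectures of Fig. 11, the following implements an alpha-beta filter, one of the Level 1 techniques named earlier; the gain values are illustrative assumptions, not recommendations from the paper:

```python
# Sketch of a single-target alpha-beta tracker (constant-velocity model).
# Gains alpha and beta are illustrative; a real design derives them from
# the measurement and process noise levels.

def alpha_beta_track(measurements, dt, alpha=0.85, beta=0.5):
    """Estimate position/velocity from noisy position measurements."""
    x, v = measurements[0], 0.0  # initialize at the first measurement
    estimates = []
    for z in measurements[1:]:
        x_pred = x + v * dt          # predict one step ahead
        residual = z - x_pred        # innovation against the measurement
        x = x_pred + alpha * residual
        v = v + (beta / dt) * residual
        estimates.append((x, v))
    return estimates

# Target moving at 2 m/s, sampled once per second (noise-free for clarity).
track = alpha_beta_track([0.0, 2.0, 4.0, 6.0, 8.0], dt=1.0)
```

A Kalman filter replaces the fixed gains with gains computed each step from the state and measurement covariances, which is why it dominates the multisensor tracking discussed below.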

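The association/correlation step discussed in this section (deciding which observations belong to which track) can also be sketched in a deliberately simple nearest-neighbor form with a fixed gate; operational systems use statistical distances and global assignment algorithms, so treat this only as an illustration with invented numbers:

```python
# Sketch of observation-to-track association: assign each observation to
# the nearest predicted track position, rejecting pairings outside a gate.

def associate(observations, track_predictions, gate=5.0):
    """Return {observation index: track index} for pairs within the gate."""
    assignments = {}
    for i, obs in enumerate(observations):
        distances = [abs(obs - pred) for pred in track_predictions]
        j = min(range(len(distances)), key=lambda k: distances[k])
        if distances[j] <= gate:  # reject implausible pairings
            assignments[i] = j
    return assignments

# Two tracks predicted at 10.0 and 50.0; three observations, one clutter.
pairs = associate([11.2, 48.7, 90.0], [10.0, 50.0])
print(pairs)  # {0: 0, 1: 1}; the observation at 90.0 falls outside every gate
```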
There are three broad alternatives to fusing locational information to determine the position and velocity of an object: 1) fusion of the raw observational data (so-called data level or report level fusion), 2) fusion of state vectors (in this case, a state vector is an optimum estimate, using an individual sensor's measurements, of the position and velocity of an observed object), and 3) a hybrid approach which allows fusion of either raw data or state vector data, as desired. These alternative fusion architectures are illustrated in Fig. 11.

The top part of the figure shows fusion of raw observational data. Data from each sensor (or each different sensor type) are aligned to transform the sensor data from sensor-based units and coordinates to convenient coordinates and units for central processing. The data are then associated/correlated to determine which sensor

observations belong together. That is, in a multisensor, multitarget environment, it is necessary to determine which observations represent observations of the same physical entity or target. Thus association/correlation determines which observation-to-observation or observation-to-existing-track pairings belong together. This association/correlation problem may be very complicated in dense target tracking environments. Nevertheless, once a determination has been made, the data are fused, typically using sequential estimation techniques such as Kalman filters. This centralized fusion approach is theoretically the most accurate way to fuse data, assuming that the association and correlation can be performed correctly. However, this approach also requires that the raw data be transmitted (via local communications networks or other mechanisms) from the sensors to the central processing facility or computer. For the case of image data, this may require a communications bandwidth which would exceed what is actually available.

The second architecture for locational fusion is distributed (or autonomous) fusion, in which each sensor performs single-source positional estimation, producing a state vector from each sensor. (That is, each sensor provides an estimate of the position and velocity of an object based only on its own single-source data.) These estimates of position and velocity (i.e., state vector estimates) are input to a data fusion process to achieve a joint or fused state vector estimate based on multiple sensors. It should be noted that the functions of data alignment and association/correlation still need to be performed, but are now performed at the state vector level rather than the data level. Distributed fusion architectures reduce the communications required between the sensors and the fusion processor (because the sensor data are compressed into a representative state vector). In addition, the association/correlation process is conceptually easier than that performed for data level fusion. However, in general, state vector fusion is not as accurate as data level fusion, because there is an information loss between the sensors and the fusion process. In particular, the original data contain information about the quality of the signal, which is only approximated by the state vector and its associated covariance matrix.

Finally, the third architecture for locational fusion is a hybrid architecture that combines data level fusion and state vector fusion. In this case, during ordinary operations, state vector fusion is performed to reduce computational workload and communications demands. Under specified circumstances (e.g., when more accuracy is required, or in dense tracking environments), data level fusion is performed. Alternatively, based on available sensors, a combination of state vectors and data may be fused. While the hybrid architecture provides the most flexibility, it also requires overhead to monitor the fusion process and to select between data and state vector fusion.

Selection from among these types of locational architectures is ultimately a system engineering problem. There is no single optimal architecture for any given data fusion application. Instead, the choice of architecture must balance computing resources, available communication bandwidth, desired accuracy, the capabilities of the sensors, and available funding.

The second type of Level 1 processing considered is identity fusion. Here, we are trying to convert multiple sensor observations of a target's attributes (such as radar cross section, infrared spectral coefficients, etc.) into a joint declaration of target identity. For identity fusion, there are several types of architectures which can be used: 1) data level fusion, 2) feature level fusion, and 3) decision level fusion. These three architectures are illustrated in Fig. 12.

The first architecture [Fig. 12(c)] performs data level fusion. Each sensor observes an object and the raw sensor data are combined. Subsequently, an identity declaration process is performed. This is commonly achieved by extracting a feature vector from the fused data and making a transformation between the feature vector and a declaration of identity. Methods for this feature-based identity declaration include neural networks, template methods, and pattern recognition methods such as cluster algorithms. In order to fuse the raw data, the original sensor data must be commensurate (i.e., must be observations of the same or similar physical quantities, such as visual images) and must be able to be properly associated. Thus, for example, if two image sensors are used, the images must be able to be co-aligned on a pixel-by-pixel basis. Analogous to locational fusion, raw data identity fusion provides the most accurate results, assuming proper sensor association and alignment.

The second architecture for identity fusion is feature level fusion. In this case, each sensor provides observational data from which a feature vector is extracted. These features are concatenated together into a single feature vector which in turn is input to an identity declaration technique such as a neural network or cluster algorithm. The output then becomes a joint or fused declaration of target identity based on the combined feature vectors from all of the sensors. The functions of data alignment and association/correlation must still be performed prior to linking the feature vectors from individual sensors into a single larger feature vector.

The third architecture [Fig. 12(b)] is decision level fusion. In this architecture, each sensor performs an identity declaration process based only on its own single-source data. That is, each sensor converts observed target attributes into a preliminary declaration of target identity. Again, this may be performed using a feature extraction/identity declaration approach involving neural networks or other feature-based pattern recognition techniques. The identity declarations provided by the individual sensors are then combined using decision level fusion techniques such as classical inference, Bayesian inference, weighted decision methods, or Dempster–Shafer's method, among others. As with the other architectures, data association and correlation are still required to ensure that the data to be fused refer to the same physical entity or target.

The selection among these architectures for a particular application is also a system engineering problem which depends upon issues such as the available communications bandwidth, the characteristics of the sensors, and the computational resources available. There is no one

Fig. 12. Alternate architectures for multisensor identity fusion.

universal architecture which is applicable to all situations or applications. The architectures shown here provide a range of possible implementations.

A. Knowledge-Based Methods for Data Fusion

Interpretation of fused data for situation assessment or threat assessment requires automated reasoning techniques drawn from the field of artificial intelligence. In particular, knowledge-based systems (KBS) or expert systems have been developed to interpret the results of Level 1 processing systems, analyzing issues such as the context in which the data are observed, the relationships among observed entities, hierarchical groupings of targets or objects, and predictions of future actions of targets or entities. Such reasoning is normally performed by humans, but may be approximated by automated reasoning techniques. The reader is referred to the artificial intelligence literature for more details. A frequently applied approach for data fusion involves the use of so-called blackboard KBS [122]. These systems partition the problem into related subproblems and use interacting reasoning techniques to solve the component problems, with an evolving solution obtained by combining the results for each subproblem. This is analogous to the way human experts might gather around a blackboard and solve a problem (hence the name of the KBS architecture). An example of a blackboard architecture is shown in Fig. 13.

Regardless of the specific KBS technique used, three elements are required: 1) one or more knowledge representation schemes, 2) an automated inference/evaluation process, and 3) control schemes. Knowledge representation schemes are techniques for representing facts, logical relationships, procedural knowledge, and uncertainty. Many techniques have been developed for knowledge representation, including production rules, frames, semantic networks, scripts, and others. For each of these techniques, uncertainty in the observed data and in the logical relationships can be represented using probability, fuzzy set theory, Dempster–Shafer evidential intervals, or other methods. The goal in building an automated reasoning system is to capture the reasoning capability of a human expert by

Fig. 13. Notational “blackboard” architectures for higher level fusion processing.

specifying the rules, frames, scripts, etc. which represent the essence of the interpretive task. In contrast to the highly numerical fusion processes at Level 1, the fusion of data and information at these higher levels of inference is largely (but not exclusively) conducted at the symbolic level. Thus, in general, applications can require a mixture of numerical and symbolic processing.

Given a knowledge base, an inference or evaluation process must be developed to utilize the knowledge. Formal schemes have been developed based on formal logic, fuzzy logic, probabilistic reasoning, template methods, case-based reasoning, and many other techniques. Each of these automated reasoning schemes has an internally consistent formalism which prescribes how to utilize the knowledge base (i.e., the rules, frames, etc.) to obtain a resulting conclusion or inference.

Finally, automated reasoning requires a control scheme to implement the reasoning process. Techniques include search methods (e.g., searching a knowledge base to identify applicable rules), reason maintenance systems, assumption-based and justification-based truth maintenance, hierarchical decomposition, and control theory. Each of these schemes involves assumptions and an approach for controlling the evolving reasoning process. Control schemes direct the search through a knowledge base in order to exploit and fuse the multisensor, dynamic data.

The combination of selected knowledge representation scheme(s), inference/evaluation process, and control scheme is used to achieve automated reasoning. Popular techniques are rule-based KBS and, more recently, fuzzy logic based techniques. There are numerous prototype expert systems for data fusion, and readily available commercial expert system development tools help the rapid prototyping of such an expert data fusion system. Key issues for developing such a system include the creation of the knowledge base (i.e., actually specifying the rules, frames, scripts, etc. via a knowledge engineering process) and the test and evaluation of such a system. Despite these difficulties, such systems are increasingly being developed for data fusion.

B. Assessment of the State-of-the-Art

The technology of multisensor data fusion is rapidly evolving. There is much concurrent ongoing research to develop new algorithms, to improve existing algorithms, and to understand how to assemble these techniques into an overall architecture to address diverse data fusion applications. A brief assessment of the state-of-the-art is provided here and summarized in Fig. 14.

The most mature area of data fusion processing is Level 1 processing: using multisensor data to determine the position, velocity, attributes, and identity of individual objects or entities. In particular, determining the position and velocity of an object based on multiple sensor observations is a relatively old problem. Gauss and Legendre developed the method of least squares for the particular problem of orbit determination for asteroids [66], [68]. Numerous mathematical techniques exist to perform coordinate transformations, to associate observations to observations or observations to tracks, and to estimate the position and velocity of a target. Multisensor target tracking is dominated by sequential estimation techniques such as the Kalman filter. Challenges in this area involve circumstances in which there is a dense target environment, rapidly maneuvering targets, or complex signal propagation environments (e.g., involving multipath propagation, co-channel interference, or clutter).

Fig. 14. Assessment of data fusion technology.

Single-target tracking in high signal-to-noise environments, for dynamically well-behaved (i.e., dynamically predictable) targets, is straightforward. Current research focuses on solving the correlation and maneuvering target problem for the more complex multisensor, multitarget cases. Techniques such as multiple hypothesis tracking (MHT), probabilistic data association methods, random set theory [81], [82], and multiple criteria optimization theory [76] are all being used to resolve these issues. Some researchers are utilizing multiple techniques simultaneously, guided by a knowledge-based system to select the appropriate solution based on algorithm performance.

A special problem in Level 1 processing is achieving robustness in the automatic identification of targets based on observed characteristics or attributes. At this time, object recognition is dominated by feature-based methods in which a feature vector (i.e., a representation of the sensor data) is mapped into feature space with the hope of identifying the target based on the location of the feature vector relative to a priori determined decision boundaries. Popular

pattern recognition techniques include neural networks and statistical classifiers. While there are numerous techniques available, the ultimate success of these methods depends upon the ability to select good features. (Good features are those which provide excellent class separability in feature space, while bad features are those which result in greatly overlapping areas in feature space for several classes of targets.) More research is needed in this area to guide the selection of features and to incorporate explicit knowledge about target classes. (For example, model-based or syntactic methods provide additional information about the makeup of a target and reduce the so-called training requirement necessary for the feature-based approaches. These model-based methods have in fact become the dominant focal point of much of the recent ID fusion algorithm research.) In addition, some limited research is proceeding to incorporate contextual information, such as target mobility with respect to terrain, to assist in target identification. Similar to the position estimation case, fused target ID estimation methods can work well in simpler cases where the target is exposed (i.e., not occluded); but when targets are partially occluded, decoys and countermeasures are used, etc., achieving dependable, automated, fusion-based ID has been elusive.

Level 2 and Level 3 fusion (situation refinement and threat refinement) are currently dominated by knowledge-based methods such as rule-based blackboard systems. These areas are relatively immature, with numerous prototypes but very few robust operational systems. The main challenge in this area is the need to establish a viable knowledge base of rules, frames, scripts, or other methods to represent the knowledge that supports situation assessment or threat assessment. Unfortunately, there exist only very primitive cognitive models for how humans accomplish these functions. Much research is needed in this area before reliable and large-scale knowledge-based systems can be developed for automated situation assessment and threat assessment. New approaches which appear promising are the use of fuzzy logic, and hybrid architectures which extend the concept of blackboard systems to hierarchical and multi-time-scale reasoning.

Finally, Level 4 processing, which assesses and improves the performance and operation of an ongoing data fusion process, has a mixed maturity. For single-sensor operations, techniques from operations research and control theory have been applied to develop effective systems, even for complex single sensors such as phased array radars. By contrast, situations that involve multiple sensors, external mission constraints, dynamic observing environments, multiple targets, etc., are more challenging. At this time, it is difficult to model and incorporate mission objectives and constraints, and to balance optimized performance against limited resources such as computing power, limited communication bandwidth (viz., between the sensors and processors), and other effects. Methods from utility theory are being applied to develop measures of system performance and measures of effectiveness. Knowledge-based systems are being developed for context-based approximate reasoning. A special area which will provide significant improvements is the advent of smart, self-calibrating sensors, which can accurately and dynamically assess their own performance.

A final comment on the discipline of data fusion is that it has suffered from a lack of rigor in the test and evaluation of algorithms, and in considerations of transitioning theory to application. The data fusion community needs to exert discipline in insisting on high standards for algorithm development, test and evaluation, creation of standard test cases, and systematic evolution of the technology to meet realistic applications. On a positive note, the introduction of the JDL process model and emerging nonmilitary applications are expected to result in increased cross-discipline communication and research. The nonmilitary research in robotics, condition-based maintenance, industrial process control, transportation, and intelligent buildings will produce innovations which will cross-fertilize the whole area of data fusion technology.

VI. SUMMARY

The data fusion community is rapidly evolving. Significant investments in DoD applications, the rapid evolution of microprocessors, advanced sensors, and new techniques have led to new capabilities to combine data from multiple sensors for improved inferences. Applications of data fusion range from DoD applications such as battlefield surveillance and automatic target recognition for smart weapons to non-DoD applications such as condition-based maintenance and improved medical diagnosis. Implementation of such systems requires an understanding of basic terminology, data fusion processing models, and architectures. This paper is intended to provide an introduction to these areas as a basis for further study and research.

REFERENCES

[1] E. Waltz, “Data fusion for C3I: A tutorial,” in Command, Control, Communications Intelligence (C3I) Handbook. Palo Alto, CA: EW Communications, 1986, pp. 217–226.
[2] J. Llinas and E. Waltz, Multisensor Data Fusion. Boston, MA: Artech House, 1990.
[3] D. Hall, Mathematical Techniques in Multisensor Data Fusion. Boston, MA: Artech House, 1992.
[4] L. A. Klein, Sensor and Data Fusion Concepts and Applications, SPIE Opt. Engineering Press, Tutorial Texts, vol. 14, 1993.
[5] D. L. Hall and J. Llinas, “A challenge for the data fusion community I: Research imperatives for improved processing,” in Proc. 7th Natl. Symp. on Sensor Fusion, Albuquerque, NM, Mar. 1994.
[6] J. Llinas and D. L. Hall, “A challenge for the data fusion community II: Infrastructure imperatives,” in Proc. 7th Natl. Symp. on Sensor Fusion, Albuquerque, NM, Mar. 1994.
[7] Proc. Data Fusion Syst. Conf., Johns Hopkins Univ., Naval Air Development Center, Warminster, PA, 1986–1994.
[8] Proc. 1991 SPIE Conf. on Sensors, Boston, MA, Nov. 1991.
[9] Proc. 1994 7th Natl. Symp. on Sensor Fusion, ERIM, Ann Arbor, MI, 1994.
[10] Proc. 1994 IEEE Conf. on Multisensor Fusion and Integration for Intell. Syst., Las Vegas, NV, Oct. 1994.
[11] Data Fusion Lexicon, Data Fusion Subpanel of the Joint Directors of Laboratories Technical Panel for C3, F. E. White, Code 4202, NOSC, San Diego, CA, 1991.
[12] Kessler et al., Functional Description of the Data Fusion Process, Tech. Rep., Office of Naval Technol., Naval Air Development Ctr., Warminster, PA, Jan. 1992.
[13] D. L. Hall and R. J. Linn, “A taxonomy of algorithms for multisensor data fusion,” in Proc. 1990 Tri-Service Data Fusion Symp., Apr. 1991, pp. 13–29.

[14] D. L. Hall, R. J. Linn, and J. Llinas, “A survey of data fusion systems,” in Proc. SPIE Conf. on Data Structure and Target Classification, vol. 1470, Orlando, FL, Apr. 1991, pp. 13–36.
[15] “Data fusion development strategies,” Tech. Rep., Office of Naval Technology, Naval Air Development Center, Warminster, PA, Jan. 1992.
[16] D. L. Hall and R. J. Linn, “Survey of commercial software for multisensor data fusion,” in Proc. SPIE Conf. Sensor Fusion and Aerospace Applications, Orlando, FL, Apr. 1991.
[17] R. L. Streit, private communication, Apr. 1995.
[18] D. L. Hall and R. J. Linn, “A comparison of association metrics for emitter classification,” in Tech. Proc. 1991 Data Fusion Syst. Conf., DFS-91, vol. 1, Appl. Physics Lab., Johns Hopkins Univ., Oct. 1991.
[19] W. D. Bair, “Tracking maneuvering targets with multiple intermittent sensors: Does more data always mean better estimates?,” in Proc. 1994 Data Fusion Symp., DFS-94, Appl. Phys. Lab., Johns Hopkins Univ., 1994.
[20] S. C. A. Thomopoulos, “Sensor selectivity and intelligent data fusion,” in Proc. 1994 Int. Conf. on Multisensor Fusion and Integration for Intell. Syst., Las Vegas, NV, 2–5 Oct. 1994, pp. 529–537.
[21] A. Weigend and N. A. Gershenfeld, Eds., Time Series Prediction: Forecasting the Future and Understanding the Past. Reading, MA: Addison-Wesley, 1994.
[22] L. Cohen, “Time-frequency distributions—A review,” Proc. IEEE, vol. 77, pp. 941–981, July 1989.
[23] P. A. Delaney and D. O. Walsh, “A bibliography of higher-order spectra and cumulants,” IEEE Signal Process. Mag., pp. 61–74, July 1994.
[24] L. G. Weiss, “Wavelets and wideband correlation processing,” IEEE Signal Process. Mag., pp. 13–32, Jan. 1994.
[25] L. G. Weiss, R. K. Young, and L. H. Sibul, “Wideband processing of acoustic data using wavelet transforms: Part I—Theory,” J. Acoust. Soc. Amer., vol. 92, no. 2, part 1, pp. 857–866, Aug. 1994.
[26] L. G. Weiss, “Wideband processing of acoustic signals using wavelet transforms, Part II: Efficient implementation and examples,” J. Acoust. Soc. Amer., vol. 96, no. 2, part 1, pp. 857–866, Aug. 1994.
[27] R. O. Schmidt, “A signal subspace approach to multiple emitter location and spectral estimation,” Ph.D. dissertation, Stanford Univ., 1981.
[28] R. H. Roy and T. Kailath, “ESPRIT—Estimation of signal parameters via rotational invariance techniques,” IEEE Trans. Acoust., Speech, Signal Process., vol. 37, June 1989.
[29] M. D. Zoltowski and D. Stavrinides, “Sensor array signal processing via a Procrustes rotation based eigenanalysis of the ESPRIT data pencil,” IEEE Trans. Acoust., Speech, Signal Process., vol. 37, June 1989.
[30] M. Kotanchek, “Stability exploitation and subspace array processing,” Ph.D. dissertation, Pennsylvania State Univ., 1995.
[31] Rubano, patent application.
[32] J. P. Burg, “The relationship between maximum entropy spectra and maximum likelihood spectra,” Geophys., vol. 37, pp. 375–376, Apr. 1972.
[33] J. Capon, “High resolution frequency wave number spectrum analysis,” Proc. IEEE, vol. 57, Aug. 1969.
[34] G. V. Borgiotti and L. J. Kaplan, “Super-resolution of uncorrelated interference sources by using adaptive array techniques,” IEEE Trans. Anten. Propagat., vol. AP-27, Nov. 1979.
[35] V. F. Pisarenko, “The retrieval of harmonics from covariance functions,” Geophys. J. Royal Astronom. Soc., vol. 33, pp. 511–531, 1973.
[36] A. J. Barabell, “Improving the resolution performance of eigenstructure-based direction-finding algorithms,” in Proc. IEEE ICASSP 83, 1983, pp. 336–339.
[37] S. S. Reddi, “Multiple source location—A digital approach,” IEEE Trans. Aerosp. Electron. Syst., vol. 15, no. 1, Jan. 1979.
[38] R. Kumaresan and D. W. Tufts, “Estimating the angles of arrival of multiple plane waves,” IEEE Trans. Aerospace Electron.
[41] H. L. Van Trees, Detection, Estimation, and Modulation Theory, Part I: Detection, Estimation, and Linear Modulation Theory. New York: Wiley, 1968.
[42] H. L. Van Trees, Detection, Estimation, and Modulation Theory, Part II: Radar-Sonar Signal Processing and Gaussian Signals in Noise. New York: Krieger, 1992.
[43] W. S. Burdic, Underwater Acoustic System Analysis. Englewood Cliffs, NJ: Prentice-Hall, 1984.
[44] R. R. Escobal, Methods of Orbit Determination. New York: Wiley, 1965.
[45] D. Willner, C. Chang, and K. Dunn, “Kalman filter configurations for multiple radar systems,” Tech. Rep. TN 1876-21, MIT Lincoln Lab., Apr. 1976.
[46] D. L. Hall and S. Waligora, “Orbit/altitude estimation with Landsat landmark data,” in Proc. GSFC Flight Mechanics/Estimation Theory Symp., Oct. 1978.
[47] E. L. Zaveleta and E. J. Smith, “Goddard trajectory determination system user’s guide,” Tech. Rep. CSC/SD-7S/6005, Computer Sci. Corp., Apr. 1975.
[48] Global Positioning System User’s Overview, GPS NAVSTAR, USD 301.2:G51, U.S. Dept. Air Force, 1991.
[49] J. O. Cappellari, C. E. Velez, and A. J. Fuchs, “Mathematical theory of the Goddard trajectory determination system,” NASA Tech. Rep. X-582-767-77, NASA Goddard Space Flight Center, Greenbelt, MD, Apr. 1976.
[50] Explanatory Supplement to the American Ephemeris and Nautical Almanac, U.S. Naval Observatory (U.S. Govt. Printing Office), 1960.
[51] R. Deutsch, Orbital Dynamics of Space Vehicles. Englewood Cliffs, NJ: Prentice-Hall, 1963.
[52] S. Blackman, Multiple Target Tracking with Radar Applications. Norwood, MA: Artech House, 1986.
[53] W. H. Press, B. P. Flannery, S. A. Teukolsky, and W. T. Vetterling, Numerical Recipes: The Art of Scientific Computing. New York: Cambridge, 1986.
[54] F. Wright, “The fusion of multi-source data,” Signal, pp. 39–43, Oct. 1980.
[55] A. M. Liebetrau, Measures of Association, in the series Quantitative Applications in the Social Sciences, Paper No. 07-044. London, U.K.: Sage, 1984.
[56] K. Fukunaga, Introduction to Statistical Pattern Recognition, 2nd ed. New York: Academic, 1990.
[57] M. S. Aldenderfer and R. K. Blashfield, Cluster Analysis, in the series Quantitative Applications in the Social Sciences, Paper No. 07-044. London, U.K.: Sage, 1984.
[58] J. Cohen, “A coefficient of agreement for nominal scales,” Educat. Psycholog. Measure., vol. 20, pp. 37–46, 1960.
[59] P. C. Fishburn, Nonlinear Preference and Utility Theory. Baltimore, MD: Johns Hopkins Univ. Press, 1988.
[60] L. A. Zadeh, “Fuzzy algorithms,” Inform. Contr., vol. 12, pp. 94–102, 1968.
[61] G. Guisewite, D. L. Hall, and D. Heinze, “Parallel implementation of a pattern marking expert system,” in Proc. SCS Eastern Multiconf., Pittsburgh, PA, Mar. 1989.
[62] C. T. Zahn, “Graph-theoretical methods for detecting and describing Gestalt clusters,” IEEE Trans. Computers, vol. C-20, Jan. 1971.
[63] Y. Bar-Shalom and E. Tse, “Tracking in a cluttered environment with probabilistic data association,” Automatica, vol. 11, pp. 451–460, Sept. 1975.
[64] T. E. Fortmann, Y. Bar-Shalom, and M. Scheffe, “Multi-target tracking using joint probabilistic data association,” in Proc. 1980 IEEE Conf. on Decision and Contr., Dec. 1980, pp. 807–812.
[65] Y. Bar-Shalom, Ed., Multi-Target, Multi-Sensor Tracking: Applications and Advances. Norwood, MA: Artech House, 1989.
[66] K. G. Gauss, Theoria Motus Corporum Celestium, 1809; Theory of Motion of the Heavenly Bodies. New York: Dover, 1963.
[67] R. A. Fisher, “On the absolute criterion for fitting frequency curves,” Messenger of Mathematics, vol. 41, p. 153, 1912.
Syst., vol. AES-9, Jan. 1983. [68] H. W. Sorenson, “Least-square estimation: From Gauss to
[39] B. H. Kwon, “New high resolution techniques and their perfor- Kalman,” IEEE Spectrum, pp. 63–68, July 1970.
mance analysis for angles-of-arrival estimation,” Ph.D. disser- [69] E. Polak, Computational methods in Optimization. New York:
tation, Polytechnic Univ., 1989. Academic, 1971.
[40] J. Chun and N. K. Bose, “A novel subspace-based approach [70] J. A. Nelder and R. Meade, Computer J., vol. 7, p. 308, 1965.
to parameter estimation,” Digital Signal Process., vol. 4, pp. [71] R. P. Brent, Algorithms for Minimization without Derivatives.
1–48, 1994. Englewood Cliffs, NJ: Prentice-Hall, 1973, ch. 7.

[72] L. Davis, Ed., Handbook of Genetic Algorithms. New York: Van Nostrand Reinhold, 1991.
[73] A. Gelb, Applied Optimal Estimation. Cambridge, MA: MIT Press, 1974.
[74] A. H. Sayed and T. Kailath, “A state-space approach to adaptive RLS filtering,” IEEE Signal Process. Mag., pp. 18–70, July 1994.
[75] G. J. Bierman, “Factorization methods for discrete sequential estimation,” in Mathematics in Science and Engineering, vol. 128. New York: Academic, 1977.
[76] A. Poore and N. Rijavec, “Partitioning multiple data sets: Multidimensional assignments and Lagrangian relaxation,” in Quadratic Assignment and Related Problems, DIMACS Series in Discrete Mathematics and Theoretical Computer Science, vol. 16, P. Pardalos and H. Wolkowicz, Eds. Providence, RI: Amer. Mathemat. Soc., 1994, pp. 1–26.
[77] A. Poore and N. Rijavec, “A numerical study of some data association problems arising in multitarget tracking,” in Large Scale Optimization: State-of-the-Art, W. W. Hager, D. W. Hearn, and P. M. Pardalos, Eds. New York: Kluwer, 1993, pp. 347–370.
[78] A. Poore, “Multi-dimensional assignment formulation of data association problems arising from multi-target and multi-sensor tracking,” Computational Optimizat. Applicat., vol. 3, pp. 27–57, 1994.
[79] R. L. Streit and T. E. Luginbuhl, “A probabilistic multi-hypothesis tracking algorithm without enumeration and pruning,” in Proc. 6th Joint Service Data Fusion Symp., Laurel, MD, June 1993.
[80] R. L. Streit, “Maximum likelihood method for probabilistic multi-hypothesis tracking,” SPIE Proc., vol. 2234-5, pp. 394–406, 1994.
[81] R. Mahler, “A unified foundation for data fusion,” in Proc. 1994 Data Fusion Syst. Conf., Applied Phys. Lab., Johns Hopkins Univ., June 1987.
[82] I. R. Goodman, “A general theory for the fusion of data,” in Proc. 1987 Joint Services Data Fusion Symp., Applied Phys. Lab., Johns Hopkins Univ., June 1987.
[83] I. R. Goodman, “Modeling and fusing of linguistic, probabilistic, and/or conditional and unconditional forms,” invited tutorial, in Proc. 5th Joint Service Data Fusion Symp., part 2, Oct. 1991.
[84] I. R. Goodman and H. T. Nguyen, Uncertainty Models for Knowledge-Based Systems. Amsterdam: North-Holland, 1985.
[85] L. I. Perlovsky and J. V. Jaskolski, Concurrent Adaptive Classification and Tracking, Low Frequency Active Tech. Rev., Office of Naval Technol., Washington, DC.
[86] E. Waltz, “The principles and practice of image and spatial data fusion,” in Proc. 8th Natl. Data Fusion Conf., Dallas, TX, 15–17 Mar. 1995.
[87] J. Kittler, “Mathematical methods of feature selection in pattern recognition,” Int. J. Man-Mach. Studies, vol. 7, pp. 609–637, 1975.
[88] C. Lee and D. A. Landgrebe, “Decision boundary feature extraction for nonparametric classification,” IEEE Trans. Syst., Man, Cybern., vol. 23, pp. 433–444, Mar./Apr. 1993.
[89] S. G. Grieneder, T. E. Luginbuhl, and R. L. Streit, “Procrustes: A feature set reduction technique,” NUWC Tech. Rep. 10,633, Newport, RI, June 1994.
[90] A. K. Jain and W. G. Waller, “On the optimal number of features in the classification of multivariate Gaussian data,” Patt. Recognition, vol. 10, pp. 365–374, 1978.
[91] P. M. Narendra and K. Fukunaga, “A branch and bound algorithm for feature subset selection,” IEEE Trans. Computers, vol. C-26, pp. 917–922, Sept. 1977.
[92] P. Sneath and R. Sokal, Numerical Taxonomy. San Francisco, CA: Freeman, 1973.
[93] R. Dubes and A. Jain, “Clustering methodologies in exploratory data analysis,” Advances in Computers, vol. 19, pp. 113–228, 1980.
[94] H. Skinner, “Dimensions and clusters: A hybrid approach to classification,” Appl. Psycholog. Measure., vol. 3, pp. 327–341, 1979.
[95] A. J. Cole and D. Wishart, “An improved algorithm for the Jardine-Sibson method of generating clusters,” Comput. J., vol. 13, pp. 156–163, 1970.
[96] R. P. Lippmann, “An introduction to computing with neural nets,” IEEE ASSP Mag., vol. 4, no. 2, pp. 4–22, Apr. 1987.
[97] B. Widrow and R. Winter, “Neural nets for adaptive filtering and adaptive pattern recognition,” IEEE Computer, vol. 21, no. 3, pp. 24–40, Mar. 1988.
[98] D. Hush and B. Horne, “Progress in supervised neural networks: What’s new since Lippmann?,” IEEE Signal Process. Mag., pp. 8–39, Jan. 1993.
[99] D. Touretzky, Ed., Advances in Neural Information Processing Systems, vol. 2. Palo Alto, CA: Morgan Kaufmann, 1990.
[100] C. Priebe and D. Marchette, “An adaptive hull-to-emitter correlator,” in Proc. 1988 Tri-Service Data Fusion Conf. DFS-88, Johns Hopkins Univ., Laurel, MD, 1988, pp. 433–436.
[101] M. Eggers and T. Khuon, “Neural network data fusion for decision making,” in Proc. 1989 Tri-Service Data Fusion Symp., 16–18 May 1989, pp. 104–118.
[102] R. L. Streit and T. E. Luginbuhl, “Maximum likelihood training of probabilistic neural networks,” IEEE Trans. Neural Networks, vol. 5, pp. 764–783, Sept. 1994.
[103] A. K. Garga and N. K. Bose, “A neural network approach to the construction of Delaunay tessellation of points in R^d,” IEEE Trans. Circ. Syst. I: Fundamental Theory and Applicat., vol. 41, pp. 611–613, Sept. 1994.
[104] N. K. Bose and A. K. Garga, “Neural network design using Voronoi diagrams,” IEEE Trans. Neural Networks, vol. 4, no. 5, pp. 778–787, Sept. 1993.
[105] R. O. Duda and P. E. Hart, Pattern Classification and Scene Analysis. New York: Wiley, 1973.
[106] R. E. Henkel, Tests of Significance. Beverly Hills, CA: Sage, 1976.
[107] G. R. Iverson, Bayesian Statistical Inference. Beverly Hills, CA: Sage, 1984.
[108] G. B. Wilson, “Some aspects of data fusion,” in Proc. 1985 Intell. Conf. in C3I, IEEE Conf. Publ. No. 247, Apr. 1985.
[109] J. Pearl, Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference. San Mateo, CA: Morgan Kaufmann, 1988.
[110] J. D. Lowrance and T. D. Garvey, “Evidential reasoning: A developing concept,” in Proc. Int. Conf. on Cybern. and Soc., Oct. 1982, pp. 6–9.
[111] A. P. Dempster, “A generalization of Bayesian inference,” J. Royal Statist. Soc., vol. 30, pp. 205–247, 1968.
[112] S. C. A. Thomopoulos, “Theories in distributed decision fusion: Comparison and generalization,” in Proc. SPIE 1990 on Sensor Fusion III: 3-D Perception and Recog., Boston, MA, 5–9 Nov. 1990, pp. 623–634.
[113] Willett and Swaszek, “Centralized performance in decentralized detection with feedback,” in Proc. Conf. on Informat. Sci. Syst., Princeton, NJ, Mar. 1992.
[114] D. L. Hall and R. J. Linn, “Comments on the use of templating for multi-sensor data fusion,” in Proc. 1989 Tri-Service Data Fusion Symp., vol. 1, pp. 345–354, May 1989.
[115] D. F. Noble, “Template-based data fusion for situation assessment,” in Proc. 1987 Tri-Service Data Fusion Symp., vol. 1, pp. 152–162, June 1987.
[116] P. Jackson, Introduction to Expert Systems. Reading, MA: Addison-Wesley, 1986.
[117] R. A. Benfer, E. E. Brent, Jr., and L. Furbee, “Expert systems,” Paper No. 77. London, U.K.: Sage Univ., 1991.
[118] J. Durkin, Expert Systems, Design, and Development. New York: Macmillan, 1994.
[119] J. Gelfand, Selective Guide to Literature on Artificial Intelligence and Expert Systems, Amer. Soc. for Engineering Education, 1992.
[120] T. Sammon, Jr., “The transient expert processor,” in Proc. 24th Asilomar Conf. on Signals, Syst., and Computers, vol. 2, pp. 612–617, Nov. 1990.
[121] R. E. Gibson, D. L. Hall, and J. A. Stover, “An autonomous fuzzy logic architecture for multi-sensor data fusion,” in Proc. 1994 IEEE Conf. on Multi-Sensor Fusion and Integration for Intell. Syst., Las Vegas, NV, Oct. 1994, pp. 143–150.
[122] J. Llinas and R. Antony, “Blackboard concepts for data fusion and command and control applications,” Int. J. on Patt. Recognit. and Artif. Intell., vol. 7, no. 2, Apr. 1993.
[123] L. I. Perlovsky, “Decision directed information fusion using MLANS,” in Information Fusion Workshop, Army Res. Office and George Mason Univ., Harpers Ferry, WV, p. 124.
[124] M. Liggins, M. Alford, and W. Berry, “Predetection fusion for adaptive thresholding and control,” in Proc. 6th Joint Service

PROCEEDINGS OF THE IEEE, VOL. 85, NO. 1, JANUARY 1997
Data Fusion Symp., Appl. Physics Lab., Johns Hopkins Univ., Laurel, MD, 14–18 June 1993, pp. 411–424.
[125] J. Llinas and D. L. Hall, “Dual-use of data fusion technology: Applying the JDL model to non-DoD applications,” in Proc. 4th Annu. 1994 IEEE Dual-Use Technologies and Applicat. Conf., SUNY Inst. Technol., Rome, NY, May 1994.
[126] R. J. Hansen, D. L. Hall, and S. K. Kurtz, “A new approach to the challenge of machinery prognostics,” Trans. ASME, J. Engineering for Gas Turbines and Power, pp. 320–325, Apr. 1995.
[127] D. L. Hall and Ogrodnik, “Passive exploitation of the electromagnetic environment for improved target tracking, situation assessment, and threat refinement,” in Proc. 9th Natl. Symp. on Sensor Fusion, Naval Postgrad. School, Monterey, CA, 12 Mar. 1996.
[128] C. Bowman, private communication to D. L. Hall, Oct. 1995.
[129] M. A. Abidi and R. C. Gonzalez, Data Fusion in Robotics and Machine Intelligence. Boston, MA: Academic, 1992.
[130] G. Shafer, A Mathematical Theory of Evidence. Princeton, NJ: Princeton Univ. Press, 1976.

David L. Hall (Senior Member, IEEE) is a Senior Scientist at The Pennsylvania State University’s Applied Research Laboratory and a Professor of Electrical Engineering at The Pennsylvania State University, University Park. He has performed research in data fusion and related technical areas for more than 20 years and lectured internationally on the topics of data fusion and artificial intelligence. In addition, he has participated in the implementation of real-time data fusion systems for several military applications. He is the author of three textbooks and more than 100 technical papers. He has worked at HRB Systems, Inc., at the Computer Sciences Corporation, and at the MIT Lincoln Laboratory.

James Llinas is an Adjunct Research Professor at the State University of New York at Buffalo. He is an expert in data fusion, co-authored the first integrated book on the subject, and has lectured internationally for about 15 years on this topic. He has been a Technical Advisor to the Defense Department’s Joint Directors of Laboratories Data Fusion Panel for the past decade. His experience in researching and applying this technology to different problem areas ranges from complex defense and intelligence-system applications to nondefense applications including intelligent transportation systems, fingerprint recognition, and medical diagnostics. His current projects include basic and applied research in automated reasoning; distributed, cooperative problem-solving; avionics information fusion architectures; scientific foundations of data correlation techniques; and infrared/radar data fusion for object recognition.

