High Quality/Low Dimension Data for Sensor Integration

Period of Performance: 07/30/2015 - 04/29/2016


Phase 1 STTR

Recipient Firm

Metron, Inc.
1818 Library Street, Suite 600
Reston, VA 20190
Principal Investigator

Research Institution

University of Rhode Island
School of Oceanography, 215 South Ferry Road
Kingstown, RI 02882
Institution POC


ABSTRACT: This project addresses the development and evaluation of high-quality, low-dimension (HQLD) sensor data selection and fusion (SDSF) algorithms. The objective is to develop algorithms that process the data from multiple sensors to reduce dimensionality while retaining the information content of the full-dimension data. An HQLD-SDSF algorithm will be developed within the context of a likelihood ratio detection and tracking (LRDT) system. LRDT is a nonlinear Bayesian filtering technique that incorporates target detection, localization, and tracking, and naturally fuses multi-sensor data via sensor likelihood functions. The exponentially embedded families (EEF) approach to probability density function (PDF) modeling and feature selection will be used to select and fuse lower-dimension sensor data in an optimal and tractable manner. The LRDT-HQLD-SDSF algorithm will take system objectives and knowledge of the current target state from the LRDT tracking system, along with an information-theoretic measure of current data quality from the EEF-constructed likelihood function, and balance these against system constraints to determine the best set of low-dimension sensor data to send to a central processor to achieve system objectives under the current situation. Performance will be demonstrated on a simulated multi-sensor system consisting of a passive radar sensor, an infrared sensor, and a visible light sensor.

BENEFIT: Many different types of sensor systems have been developed for both defense and commercial applications, including radar, sonar, electro-optical, infrared, visible light, hyperspectral, and electromagnetic sensors, to name a few. Each measures a different characteristic of the target or scene, and multiple sensors together can provide a more complete picture of the situation than any one sensor alone.
Multi-sensor data fusion algorithms seek to integrate the data from disparate sensors so that information relevant to a particular system objective is combined and distilled into a useful form. There is a need to select data from the most relevant sensors and to reduce the dimension of the available data while retaining mission-critical information, in order to simplify processing, eliminate unnecessary and possibly distracting information, and/or satisfy system constraints. The goal of this project is to develop algorithms that reduce the dimension of the data at multiple sensors in an optimal way, taking into account system objectives, system constraints, and the observed quality of the data. The benefits are improved detection, localization, and tracking systems, which have applicability in intelligence, surveillance, and reconnaissance (ISR) systems within the Department of Defense, the Department of Homeland Security, and intelligence and law enforcement agencies, as well as in numerous commercial applications, including medical, pharmaceutical, automotive, manufacturing, and robotics.
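To make the fusion and selection ideas concrete, the following is a minimal illustrative sketch, not the project's actual LRDT or EEF algorithms: independent sensor likelihoods over a discrete target-state grid are fused by a Bayes update, and a Kullback-Leibler divergence between posterior and prior serves as a simple stand-in for an information-theoretic data-quality measure used to select the most informative sensor. All function names and the KL-based selection rule are assumptions made for illustration.

```python
import numpy as np

def fuse_sensor_likelihoods(prior, likelihoods):
    """Bayes update over a discrete target-state grid:
    posterior proportional to prior times the product of the
    independent sensor likelihoods."""
    post = prior.astype(float).copy()
    for lk in likelihoods:
        post *= lk
    return post / post.sum()

def kl_information(posterior, prior):
    """KL divergence D(posterior || prior): how much the sensor
    data sharpened the target-state PDF (a simple quality proxy)."""
    mask = posterior > 0
    return float(np.sum(posterior[mask] * np.log(posterior[mask] / prior[mask])))

def select_most_informative(prior, sensor_likelihoods):
    """Return the index of the single sensor whose data yields the
    largest information gain -- an illustrative stand-in for the
    EEF-based quality measure described in the abstract."""
    gains = [kl_information(fuse_sensor_likelihoods(prior, [lk]), prior)
             for lk in sensor_likelihoods]
    return int(np.argmax(gains))

# Example: a sharply peaked likelihood (e.g., a clear detection)
# carries more information than a nearly flat one.
prior = np.full(4, 0.25)                        # uniform prior over 4 cells
sharp = np.array([0.90, 0.05, 0.03, 0.02])      # informative sensor
flat = np.array([0.30, 0.25, 0.25, 0.20])       # uninformative sensor
best = select_most_informative(prior, [flat, sharp])   # -> 1 (the sharp sensor)
```

In a full LRDT system the prior would come from the tracker's motion-updated target-state PDF and the selection would also weigh system constraints (e.g., communication bandwidth to the central processor), but the structure above captures the core fuse-then-measure loop.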