Computational Methods for Dynamic Scene Reconstruction

Period of Performance: 07/11/2016 - 05/10/2017

$79.9K

Phase 1 STTR

Recipient Firm

Near Earth Autonomy
5001 Baum Blvd., Suite 750
Pittsburgh, PA 15213
Firm POC
Principal Investigator

Research Institution

Carnegie Mellon University
5000 Forbes Avenue
Pittsburgh, PA 15213
Institution POC

Abstract

Military and civilian agencies can benefit from systems that create 4D, spatio-temporal representations of dynamic scenes for target tracking, law enforcement, crowd control, disaster recovery, and similar missions. The proliferation of imaging sensors such as surveillance cameras, car- and body-mounted cameras, and cell phones provides the opportunity to reconstruct a dynamic scene very inexpensively. Today, 3D spatial reconstruction of static scenes from multiple viewpoints is a reality, but adding the temporal dimension to account for both camera motion and scene dynamics remains elusive. Near Earth Autonomy and Carnegie Mellon University's Illumination and Imaging Laboratory propose to create software tools that solve the 4D scene reconstruction problem by leveraging imagery from static and moving cameras with varying fields-of-view and fields-of-regard, calibrated or uncalibrated. Initial versions will create accurate scene representations post facto; ultimately, the technology will evolve to allow real-time, accurate scene reconstruction. Near Earth Autonomy specializes in sensors, perception tools, and systems that enable situational awareness for improved safety, efficiency, and performance and that operate in hostile, unprepared environments. CMU's ILIM is led by Prof. Srinivasa Narasimhan, one of the world's foremost authorities on novel illumination and imaging technologies.
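The static-scene multi-view reconstruction the abstract describes as mature typically rests on triangulation: given a point's pixel coordinates in two calibrated views, its 3D position is recovered by intersecting the viewing rays. The sketch below is not part of the proposal; it is a minimal illustration of linear (DLT) triangulation with two synthetic cameras whose projection matrices and point coordinates are invented for the example.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.

    P1, P2 : 3x4 camera projection matrices.
    x1, x2 : (u, v) pixel coordinates of the same point in each view.
    """
    # Each observation contributes two linear constraints on the
    # homogeneous 3D point X: u * (P[2] @ X) = P[0] @ X, and likewise for v.
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The point is the right singular vector with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # dehomogenize

# Two synthetic cameras (hypothetical values): identity intrinsics,
# camera 1 at the origin, camera 2 translated one unit along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

# Project a known point into both views, then recover it.
X_true = np.array([0.5, 0.2, 4.0])
x1 = P1 @ np.append(X_true, 1.0); x1 = x1[:2] / x1[2]
x2 = P2 @ np.append(X_true, 1.0); x2 = x2[:2] / x2[2]

X_est = triangulate(P1, P2, x1, x2)
```

With noise-free correspondences the estimate matches the true point; the hard part the proposal targets is extending this to moving cameras observing moving scenes, where correspondences across time must also be established.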