Building Semantic Knowledge of Large Data Sets through Collaborative Visual Approaches

Period of Performance: 05/29/2012 - 11/29/2012

$149K

Phase 1 SBIR

Recipient Firm

Intelligent Models, Inc.
10303 Sweetwood Ave
Rockville, MD 20850
Principal Investigator

Abstract

Human Social, Cultural, and Behavioral (HSCB) models are increasingly used to provide critical support to US military decision making. HSCB models are highly reliant on data; they require data spanning many sources, modalities, and areas of human behavior. The resulting data streams vary in quality and change constantly. Despite these challenges, HSCB applications may need near real-time access to relevant knowledge to make rapid decisions. To promote rapid acquisition and effective use of relevant HSCB knowledge, we propose SELMA (Semantic Exploration of Large Multi-modal Archives), which automates the processes of: (1) semantic cross-modal exploration of HSCB archives (to link people, events, objects, knowledge, and actions); (2) data mining to fill gaps in information, resolve uncertainty, and classify behaviors and events; (3) mission-critical HSCB knowledge discovery; and (4) binding and visualizing the captured HSCB insights and semantic knowledge. Together, these components allow SELMA to offer a solution to HSCB data needs that: (1) collects, stores, and analyzes mission-critical HSCB insights; (2) works autonomously for extended periods of time; and (3) actively reasons over the regional human terrain. SELMA dynamically creates an HSCB knowledge meta-network, explores concepts, discovers relationships with specified properties, and carries out versatile on-demand analyses.
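The knowledge meta-network described above can be pictured as a typed graph: nodes for people, events, objects, knowledge, and actions, joined by labeled relationships that support on-demand queries. The following Python sketch is purely illustrative of that data structure; the class and identifiers are hypothetical and do not reflect SELMA's actual implementation.

```python
from collections import defaultdict

class MetaNetwork:
    """Illustrative meta-network: typed nodes linked by labeled relations."""

    def __init__(self):
        self.node_types = {}            # node id -> type (person, event, ...)
        self.edges = defaultdict(list)  # node id -> [(relation, target id)]

    def add_node(self, node_id, node_type):
        self.node_types[node_id] = node_type

    def add_edge(self, src, relation, dst):
        self.edges[src].append((relation, dst))

    def neighbors(self, node_id, node_type=None):
        """Nodes reachable from node_id, optionally filtered by type."""
        return [dst for _, dst in self.edges[node_id]
                if node_type is None or self.node_types.get(dst) == node_type]

# Toy example: an analyst linked to an event and a report.
net = MetaNetwork()
net.add_node("analyst_a", "person")
net.add_node("rally_3", "event")
net.add_node("report_7", "knowledge")
net.add_edge("analyst_a", "observed", "rally_3")
net.add_edge("analyst_a", "authored", "report_7")

print(net.neighbors("analyst_a", node_type="event"))  # -> ['rally_3']
```

A production system would add provenance and uncertainty annotations on edges to handle the variable data quality noted above; this sketch shows only the structural idea of cross-modal linking.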