Mobile Motion Capture for Human Skeletal Modeling in Natural Environments

Period of Performance: 12/03/2015 - 11/23/2017

$750K

Phase 2 SBIR

Recipient Firm

APDM, Inc.
Portland, OR 97201
Firm POC
Principal Investigator

Abstract

Analysis of human movement with motion capture systems is used in many applications in the domains of health care, military training and simulation, sports, and entertainment. The requirements of these applications have driven the development of new motion capture technologies with improved accuracy, automation, portability, and cost. No single motion capture technology is well suited to all applications. Optical motion capture systems are extremely accurate, but they are costly, time-consuming, sensitive to occlusions, non-portable, and limited to indoor spaces. Magnetic motion capture systems provide accurate orientation and position, but have limited range, low sample rates, and are sensitive to magnetic field disturbances. Inertial motion capture systems overcome some of these limitations, but cannot provide absolute position and are also sensitive to magnetic field disturbances. In Phase I, we demonstrated the feasibility of an innovative approach that fuses ranging sensors with our inertial monitors to continuously estimate orientation and position. The primary objective of this Phase II project is to use this technology to develop a complete motion capture system that can track human full-body kinematics in a wide variety of research applications that are not well suited to other motion capture technologies.

Benefit

The commercial Ruby system will be ready for research and development applications requiring biomechanics analysis, including both civilian and military applications for which current motion capture technologies are poorly suited due to their limitations. Ruby motion capture will be especially valuable in clinical and military research that requires portability, automation, or accuracy in environments with magnetic field disturbances.
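The abstract does not specify the fusion algorithm, so as an illustration only, the sketch below shows one common way ranging measurements can bound the drift of inertial dead reckoning: a 1-D Kalman filter that integrates acceleration in the prediction step and corrects position with a range measurement in the update step. The function name, noise parameters, and single-anchor geometry are all assumptions for this sketch, not a description of the Ruby system's implementation.

```python
import numpy as np

def fuse_inertial_ranging(accels, ranges, dt=0.01,
                          accel_noise=0.5, range_noise=0.05):
    """Fuse inertial acceleration with range measurements (1-D Kalman filter).

    Inertial integration alone drifts without bound; periodic range
    updates keep the position error bounded. State x = [position,
    velocity]; the anchor sits at the origin, so in this hypothetical
    1-D setup the range measurement observes position directly.
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity model
    B = np.array([0.5 * dt**2, dt])         # acceleration input mapping
    H = np.array([[1.0, 0.0]])              # range observes position only
    Q = np.outer(B, B) * accel_noise**2     # process noise from accel error
    R = np.array([[range_noise**2]])        # range measurement noise

    x = np.zeros(2)
    P = np.eye(2)
    estimates = []
    for a, r in zip(accels, ranges):
        # Predict: integrate the inertial measurement (this step drifts).
        x = F @ x + B * a
        P = F @ P @ F.T + Q
        # Update: correct the drift with the range measurement.
        y = r - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + (K @ y).ravel()
        P = (np.eye(2) - K @ H) @ P
        estimates.append(x[0])
    return np.array(estimates)
```

With a biased accelerometer, pure double integration diverges quadratically, while the fused estimate stays pinned near the true position set by the range measurements.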