BLAST: A System for Bandwidth- and Latency-Scalable Teleoperation

Period of Performance: 09/18/2015 - 09/18/2017

$1000K

Phase 2 SBIR

Recipient Firm

Neya Systems, LLC
145 Lake Drive
Wexford, PA 15090
Firm POC
Principal Investigator

Abstract

This topic addresses the problem of robustly commanding and controlling unmanned ground vehicles operating in complex, unstructured environments. Current approaches rely on dense scene reconstruction from sensor data such as LIDAR and video imagery. Scene representations are then relayed to a remote human operator, who issues commands at varying levels of supervisory control (intelligent teleoperation).

Challenges arise when the available communication link supports only low-bandwidth data transmission and/or exhibits high latency. Bandwidth limitations prevent rich scene representations from being transmitted from the vehicle to the operator in a timely manner. In addition, high latency can have a destabilizing effect, causing commands issued by the operator to produce unsafe vehicle actions; this effect is exacerbated as the frequency of command inputs increases. Novel frameworks, and associated algorithms, are therefore required to enable robust operation of autonomous unmanned ground vehicles in complex, unstructured environments over a low-bandwidth, high-latency communication link. Such approaches would give an operator sufficient information to make timely command and control decisions, even in harsh communication scenarios, and would provide contingency-based assurance of system safety in the absence of timely command and control decisions. Other functional relationships, such as sensor cost, must also be addressed, since these costs are generally proportional to the level of autonomy or intelligence.

Approaches to this problem may emphasize perception, vehicle control, or some combination of the two. In the perception domain, approaches to intelligent data compression and minimal scene representation are desired [1].
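As a concrete illustration of such bandwidth-adaptive compression, the sketch below condenses a raw 2D obstacle point set into a bit-packed occupancy grid whose cell size is chosen to fit a given byte budget. This is a minimal, hypothetical example; the function names and the one-bit-per-cell encoding are assumptions for illustration, not the delivered BLAST representation.

```python
import math

def cell_size_for_budget(area_m2, bytes_available, bits_per_cell=1):
    """Pick the finest square cell size whose grid fits the byte budget."""
    max_cells = bytes_available * 8 // bits_per_cell
    # area / cell_size^2 <= max_cells  ->  cell_size >= sqrt(area / max_cells)
    return math.sqrt(area_m2 / max_cells)

def compress_points(points_xy, extent_m, bytes_available):
    """Condense 2D obstacle points into a bit-packed occupancy grid.

    Returns (grid_dimension, cell_size_m, packed_bytes); coarser grids
    are produced automatically when less bandwidth is available.
    """
    cell = max(cell_size_for_budget(extent_m * extent_m, bytes_available), 0.01)
    n = max(int(extent_m / cell), 1)
    occupied = set()
    for x, y in points_xy:
        i, j = int(x / cell), int(y / cell)
        if 0 <= i < n and 0 <= j < n:
            occupied.add(i * n + j)
    # pack one bit per cell for transmission over the narrow link
    buf = bytearray((n * n + 7) // 8)
    for idx in occupied:
        buf[idx // 8] |= 1 << (idx % 8)
    return n, cell, bytes(buf)
```

With a 100 m x 100 m extent, a 512-byte budget yields a coarse 64 x 64 grid, while an 8 KB budget yields a finer 256 x 256 grid, illustrating how representation richness can vary dynamically with available bandwidth.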
Approaches of this kind may condense raw sensor data into compact, human-recognizable primitives that can be transmitted efficiently over low-bandwidth communication links. These methods may be optimized for particular contexts (e.g., urban operations) to enable improved data compression, and they may also dynamically vary scene-representation richness or complexity depending on available bandwidth. In the control domain, contingency-based control algorithms are desired that ensure vehicle safety in the absence of operator inputs, or when unsafe command inputs are received (perhaps due to the effects of latency). Again, such approaches may be optimized for particular contexts to enable improved performance. Methods that act as vehicle co-pilots, both ensuring vehicle safety and attempting to predict operator intent, would be particularly useful [2]. The output of this work is software to be integrated with existing autonomous vehicles to yield measurable improvements in safety and operational speed over a baseline system in the low-bandwidth, high-latency scenarios of interest. If successful, this work will have broad applications for autonomous and semi-autonomous military vehicle operations.
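The contingency-based control idea can be sketched as a simple command supervisor: it caps commanded speed so the vehicle can always brake within its sensed clear range, and it commands a stop when the link has been silent longer than a latency budget. The class name, thresholds, and kinematic stopping model (v² ≤ 2ad) are illustrative assumptions, not the delivered BLAST software.

```python
import math

class SafetySupervisor:
    """Vetoes unsafe or stale teleoperation commands (illustrative sketch)."""

    def __init__(self, comms_timeout_s=2.0, max_decel_mps2=3.0):
        self.comms_timeout_s = comms_timeout_s   # allowed silence before stopping
        self.max_decel_mps2 = max_decel_mps2     # assumed braking capability
        self.last_rx_s = None                    # time the last command arrived

    def on_command(self, now_s, speed_mps, clear_range_m):
        """Record an operator command; return the speed actually executed.

        Caps speed so the vehicle can always stop within the sensed clear
        range: v^2 <= 2 * a * d  ->  v <= sqrt(2 * a * d).
        """
        self.last_rx_s = now_s
        v_safe = math.sqrt(2.0 * self.max_decel_mps2 * max(clear_range_m, 0.0))
        return min(speed_mps, v_safe)

    def on_tick(self, now_s, current_speed_cmd_mps):
        """Periodic watchdog: command a stop if the link has gone silent."""
        if self.last_rx_s is None or now_s - self.last_rx_s > self.comms_timeout_s:
            return 0.0
        return current_speed_cmd_mps
```

For example, an operator command of 10 m/s with only 6 m of sensed clear range would be capped to 6 m/s (with 3 m/s² assumed deceleration), and if no further command arrives within the 2 s timeout, the watchdog commands zero speed, providing safety assurance when timely operator decisions are absent.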