STTR Phase I: A platform for reactive and adaptive motion generation for real-world manipulation

Period of Performance: 07/01/2016 - 12/31/2016

$225K

Phase 1 STTR

Recipient Firm

Lula Robotics Inc.
535 13th Ave E Apt. 307
Seattle, WA 98102
Firm POC, Principal Investigator

Research Institution

University of Washington
Department of Aeronautics & Astronautics, Box 352250
Seattle, WA 98195
Institution POC

Abstract

The broader impact/commercial potential of this project revolves around the democratization of extremely complex motion generation, perception, and control systems. Most roboticists today program without understanding heat transfer, motor controllers, and low-level message passing. Abstractions improve robot accessibility and enable organizations to leverage a broader range of expertise. This project will extend the modern robotics abstraction to obviate specialized knowledge of low-level task processes, such as movement and vision, across a variety of robots. Product-driven companies are best suited to develop business logic and a thoughtful customer experience, yet most struggle to acquire the in-house talent needed to remain competitive in foundational robotics techniques. This system addresses fundamental behavioral prerequisites for applications ranging from collaborative robotics in manufacturing and fulfillment to assistive robotics, caretaking, and personal home robotics, as well as Unmanned Aerial Vehicles (UAVs), autonomous vehicles, and entertainment robots. With higher-level interfaces, programmers can more easily leverage their creativity and intuition, enabling a new generation of applications unhindered by the present-day difficulties of low-level development. Beyond the commercial potential, these advantages are pervasive: researchers, students, and children alike, of all genders and ethnicities, will be empowered to interact with, program, and study robots at a substantially higher level.

This Small Business Technology Transfer (STTR) Phase I project will address the increasing need for a higher-level interface to robots that abstracts away perception, motion generation, and control. Many competing techniques for motion generation and perception populate the literature, but optimization has recently emerged as a common unifying theme, playing a prominent role in continuously running and adapting behavior with the promise of speed and generality.
This project will apply state-of-the-art continuous motion optimization together with modern real-time vision and tracking techniques to build an integrated system appropriate for use across multiple robots. These techniques draw on optimization, Riemannian geometry, low-level control, online learning, and more; many components are well suited to generalizing across robotic platforms, but questions remain regarding processing speed, robustness, and the utility of existing modeling tools. This project will assess the viability of such a higher-level system by studying reaction latencies, manipulation precision, and long-term robustness, especially in the context of collaborative robotics for manufacturing, through evaluations across multiple physical robotic platforms. The technical results will illuminate the role of optimization in motion generation and perception for real-world production systems and uncover any remaining scientific questions that must be addressed.