SBIR Phase I: Development of a Natural Language Dialogue System for the Blind and Visually Impaired to Enable Greater Efficiency in Remote Assistance

Period of Performance: 07/01/2017 - 03/31/2018


Phase 1 SBIR

Recipient Firm

Aira Tech Corp
4225 Executive Square #460
La Jolla, CA 92037
Firm POC, Principal Investigator


The broader impact/commercial potential of this Small Business Innovation Research (SBIR) Phase I project is to dramatically increase the quality of life and economic independence of the nearly 22 million blind or visually impaired people in the US. The economic benefits include a significant reduction both in the nearly $100B in annual economic losses from lower productivity due to visual impairment and in the annual cost of social services for the blind or visually impaired. This project will create low-cost, high-value tools and services that give blind and visually impaired people the same level of environmental awareness as fully sighted people. At scale, this technology will result in the direct employment of more than 20,000 people to support the service, while also providing millions of blind or visually impaired people with the tools and opportunity to join the workforce.

The proposed project will develop a real-time, life-enhancing service for the blind that combines machine learning with a remote human assistant to create a real-time, semi-virtual personal assistant matching the quality of an in-person, trained human assistant. The core innovation is a high-level machine intelligence tool created by integrating state-of-the-art Natural Language Understanding (NLU) software and Image Recognition and Analysis (IA) software with novel inter-agent routing and state-of-the-art Graceful Degradation, i.e., seamless error handling and resolution by either software or human agents. Despite the remarkable advances in NLU and IA agents, integrating multiple software agents into a single, context-specific application remains a difficult and risky development effort worthy of funding by the NSF.
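The routing-with-graceful-degradation pattern described above can be sketched in a few lines. This is a minimal illustrative sketch, not the project's actual implementation: the agent names, the confidence-threshold policy, and all function signatures here are assumptions chosen for clarity.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional

@dataclass
class AgentResult:
    answer: Optional[str]   # None when the software agent cannot respond
    confidence: float       # self-reported confidence in [0.0, 1.0]

def route(query: str,
          software_agents: List[Callable[[str], AgentResult]],
          human_agent: Callable[[str], str],
          threshold: float = 0.8) -> str:
    """Try each software agent in turn; degrade gracefully to the
    remote human agent when no automated answer is confident enough."""
    for agent in software_agents:
        result = agent(query)
        if result.answer is not None and result.confidence >= threshold:
            return result.answer
    # Graceful degradation: seamless hand-off to the human assistant
    return human_agent(query)

# Hypothetical stand-in agents for illustration only
def nlu_agent(q: str) -> AgentResult:
    if "door" in q:
        return AgentResult("the door is two steps ahead", 0.9)
    return AgentResult(None, 0.0)

def human(q: str) -> str:
    return "human agent: let me take a look for you"

print(route("where is the door?", [nlu_agent], human))  # handled by software
print(route("read this menu", [nlu_agent], human))      # falls back to human
```

The design choice illustrated here is that the hand-off is invisible at the interface level: the caller receives an answer either way, which is what makes the error handling "seamless" from the user's perspective.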
This integration will (1) enhance the user experience with richer environmental feedback; (2) increase the productivity of our human agents, allowing each agent to serve many more users and thereby creating the scalability that will make the service both self-sustaining and affordable; and (3) increase the user's sense of personal independence through interaction with machine-based tools rather than emotionally and socially charged interactions with human agents.