BlindTool – A mobile app that gives a “sense of vision” to the blind with deep learning

In 2015, the ability to run object classification locally on a cell phone made a new kind of tool for the blind possible. I developed an Android app for the blind that tells the user what the camera is looking at and vibrates based on how confident the network is. The user waves the phone around until they feel it vibrating more and more strongly, which means they are getting closer to an object the app recognizes. The app has been said to “represent the New Frontier of Assistive Technology” by the Massachusetts Association for the Blind and Visually Impaired, and Fast Company Magazine has called it “Shazam for the world around you.” The app has been downloaded by over 26k users (as of Oct 2018).
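
The confidence-to-vibration mapping is the core of the interaction. The sketch below is a minimal illustration, not the app's actual source: it assumes some classifier elsewhere delivers the top label and its softmax confidence for each camera frame, and the class name ConfidenceFeedback and method onPrediction are hypothetical stand-ins.

import android.content.Context;
import android.os.Vibrator;

// Minimal sketch: turn the network's confidence in its top prediction
// into haptic feedback. Names here are hypothetical, not the app's
// real classes.
public class ConfidenceFeedback {
    private final Vibrator vibrator;

    public ConfidenceFeedback(Context context) {
        vibrator = (Vibrator) context.getSystemService(Context.VIBRATOR_SERVICE);
    }

    // Vibrate longer as confidence grows, so sweeping the phone toward
    // a recognized object feels like "getting warmer".
    public void onPrediction(String label, float confidence) {
        if (confidence < 0.2f) return;               // ignore low-confidence noise
        long durationMs = (long) (confidence * 300); // up to ~300 ms per frame
        vibrator.vibrate(durationMs);
    }
}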

The convolutional neural network inside the app can recognize the 1000 object classes from ImageNet. The app has been tested on Nexus 4 and Nexus 5 phones. Please report issues so I can make a note of them here.
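
As a rough illustration of what happens per frame, the sketch below takes a 1000-way softmax output and picks the argmax to report. The names are hypothetical: scores and labels are assumed to come from the on-device network and its class-name file, not from the app's actual code.

// Hypothetical sketch: pick the most probable of the 1000 ImageNet
// classes from a softmax output.
public final class TopPrediction {
    public final String label;
    public final float confidence;

    public TopPrediction(String label, float confidence) {
        this.label = label;
        this.confidence = confidence;
    }

    public static TopPrediction top1(float[] scores, String[] labels) {
        int best = 0;
        for (int i = 1; i < scores.length; i++) {
            if (scores[i] > scores[best]) best = i;
        }
        return new TopPrediction(labels[best], scores[best]);
    }
}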

Cite

Joseph Paul Cohen, BlindTool – A mobile app that gives a “sense of vision” to the blind with deep learning, https://github.com/ieee8023/blindtool, 2015

A video of BlindTool in action:

BlindTool watches Star Wars:

BibTeX

@misc{Cohen2015,
  author = {Cohen, Joseph Paul},
  title = {BlindTool -- A mobile app that gives a ``sense of vision'' to the blind with deep learning},
  url = {https://github.com/ieee8023/blindtool},
  year = {2015}
}