In collaboration with the nonprofit Guiding Eyes for the Blind, Google today piloted an AI system called Project Guideline, designed to help blind and low-vision people run races independently with just a smartphone. Using an app that tracked the virtual race via GPS and a Google-designed harness that delivered audio prompts indicating the location of a prepainted line, Guiding Eyes for the Blind CEO Thomas Panek attempted to run New York Road Runners’ Virtual Run for Thanks 5K in Central Park.
According to the U.S. Centers for Disease Control and Prevention, in 2015 a total of 1.02 million people in the U.S. were blind and approximately 3.22 million had vision impairment. Technologies exist to help blind and low-vision people navigate challenging everyday environments, but those who wish to run must rely on either a guide animal or a human guide tethered to them.
“Imagine walking down a hallway in the dark with your arms outstretched. As you drift to the left, you will feel the wall with your left hand and move back to center to correct,” a Google spokesperson told VentureBeat via email. “As you drift to the right, you will feel the wall with your right hand and move back to center to correct. The same applies with Project Guideline, only you hear the boundaries to your left and right, rather than feel them.”
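The drift-and-correct behavior described above can be sketched in code. The snippet below is a minimal, hypothetical illustration (not Google's implementation): it assumes the system estimates the runner's lateral offset from the painted line and maps that offset to left/right audio gains, so sound on one side grows louder as the runner drifts toward that boundary.

```python
def audio_cue(offset_m, max_offset_m=1.0):
    """Map a runner's lateral offset from the guideline to stereo gains.

    offset_m: estimated distance from the line in meters
              (negative = drifted left, positive = drifted right).
    max_offset_m: offset at which the cue reaches full volume.
    Returns (left_gain, right_gain), each in [0, 1]: drifting left raises
    the left-channel gain, signaling the runner to move back to center,
    and vice versa. Centered on the line, both channels stay silent.
    """
    # Normalize the offset to [-1, 1], clamping at the boundaries.
    drift = max(-1.0, min(1.0, offset_m / max_offset_m))
    left_gain = max(0.0, -drift)   # louder as the runner drifts left
    right_gain = max(0.0, drift)   # louder as the runner drifts right
    return left_gain, right_gain
```

For example, a runner half a meter left of the line would hear the cue at half volume in the left channel only, mirroring the "feel the wall with your left hand" analogy in audio form.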
Beyond the pilot with Panek, Google plans to partner with organizations to help paint guidelines in different communities and provide additional feedback.
Above: Images used to train the Guideline model.
Image Credit: Google
The launch of Guideline comes after Google debuted more in-depth spoken directions for Maps, which inform users when to turn and tell them when they’re approaching an intersection so they can exercise caution when crossing. The company also continues to develop Lookout, an accessibility-focused app that can identify packaged foods using computer vision, scan documents to make it easier to review letters and mail, and more.