How Designing For Disabled People Is Giving Google An Edge

Google’s Eve Andersson tells Co.Design how today’s accessibility problems could lead to improvements in robots, Google Maps, and even YouTube.


Like Microsoft, which recently announced a computer vision-based accessibility project called Seeing AI, Google is interested in how to convey visual information to blind users through computer vision and natural language processing. And like Microsoft, Google is dealing with the same problem: How do you communicate that information without reading aloud an endless stream-of-consciousness list of everything a computer sees around itself, regardless of how trivial each item may be?

Thanks to Knowledge Graph and machine learning—the same principles that Google uses to let you search photos by content (like photos of dogs, or photos of people hugging)—Andersson tells me that Google is already good enough at identifying objects to decode them from a video stream in real time. So a blind user wearing a Google Glass-like wearable, or a body cam hooked up to a smartphone, could get real-world updates on what can be seen around them.

But again, the big accessibility problem that needs to be solved here is one of priority.
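To make the prioritization problem concrete, here is a minimal sketch of how such filtering might work. Everything in it, the labels, the weights, and the scoring formula, is an illustrative assumption, not a description of Google's actual system:

```python
# Hypothetical sketch: rank what a vision model detects and announce only
# the most relevant items, instead of everything the camera sees.
# Labels, weights, and the scoring formula are illustrative assumptions.

from dataclasses import dataclass

# Rough relevance weights: hazards and people outrank static scenery.
PRIORITY = {
    "car": 1.0,
    "person": 0.9,
    "door": 0.6,
    "sign": 0.5,
    "tree": 0.1,
}

@dataclass
class Detection:
    label: str        # what the vision model thinks it sees
    confidence: float # model confidence, 0..1
    proximity: float  # 0 (far) .. 1 (close), e.g. from bounding-box size

def announce(detections, max_items=2, threshold=0.3):
    """Return the few detections worth speaking aloud, most urgent first."""
    scored = [
        (PRIORITY.get(d.label, 0.2) * d.confidence * d.proximity, d)
        for d in detections
    ]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [d.label for score, d in scored[:max_items] if score >= threshold]

# One video frame's worth of detections: the nearby person and car are
# announced; the tree, though detected confidently, is filtered out.
frame = [
    Detection("tree", 0.95, 0.8),
    Detection("car", 0.90, 0.7),
    Detection("person", 0.85, 0.9),
]
print(announce(frame))  # -> ['person', 'car']
```

The point of the sketch is that the hard part is not detection but the scoring policy: deciding, per user and per situation, what is worth interrupting someone to say.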



Much has been made recently of Google’s advances in natural language processing, or Google’s ability to understand and transcribe human speech. Google’s accessibility efforts lean heavily upon natural language processing, particularly its latest innovation, Voice Access. But Andersson says computers need to understand more than just speech. Forget natural language processing: computers need non-language processing.


Sighted users are so used to taking directions from computers that many people (like me) can barely find their way around without first plugging an address into Waze. But moving sighted individuals from point A to point B, across well-plotted roads and highways, is navigation on a macro scale. Things get much more complicated when you’re trying to direct a blind person down a busy city street, or from one store to another inside a shopping mall. Now, you’re directing people on a micro scale, in an environment that is not as well understood or documented as roads are.


Curated by (Lifekludger)
Read full article at Source: How Designing For Disabled People Is Giving Google An Edge
