Screen Recognition On iPhone, Improving The Interface For Blind Users
This Is Definitely An Ethical Use Of AI
Apple has rolled out a new feature in iOS 14 called Screen Recognition, which improves the experience of visually impaired people using their iThangs. Apple fed thousands of screenshots of icons, buttons, and apps in use into a machine learning algorithm, training it to identify and label the GUI elements of just about any app on the fly. An iThang can then read out a description or location of each element on demand, making otherwise inaccessible apps navigable for blind users.
Many apps already ship with accessibility labels for visually impaired users, and Screen Recognition does not override those. Instead, it enhances any existing labeling, covers whatever labels the designer missed, and adds missing features such as audible descriptions of pictures.
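For developers, those existing labels come from Apple's accessibility APIs. A minimal UIKit sketch (the button and label text here are hypothetical, for illustration only) shows the kind of manual labeling Screen Recognition steps in to replace when it's absent:

```swift
import UIKit

// A developer-labeled control: VoiceOver reads "Play, button" directly.
let playButton = UIButton(type: .system)
playButton.setImage(UIImage(systemName: "play.fill"), for: .normal)

// Without these two lines, VoiceOver would have nothing to announce,
// and Screen Recognition would try to infer a label from the pixels instead.
playButton.accessibilityLabel = "Play"
playButton.accessibilityTraits = .button
```

Screen Recognition leaves labels like this alone; its machine-learned guesses only fill the gaps where a developer never set one.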
This would not have been possible a few years ago, not just because training algorithms weren't yet accurate enough at image recognition, but also because doing it in real time takes a fair amount of processing power. The difficulty of training means we won't see this on other platforms, including Macs, for a while, but for now it's a great feature for mobile Apple devices. If you know someone who would benefit from Screen Recognition, or you're simply curious about its accuracy and overall design, you should check it out.