Be My Eyes, an app that connects people who are blind or have low vision with sighted volunteers and companies through live video and AI, is now working with Microsoft to make AI models more inclusive.
By incorporating accessibility data, AI models can better serve diverse user needs, making technology more usable and beneficial for everyone, said Be My Eyes founder Hans Jørgen Wiberg, adding that this collaboration with Microsoft is the first of its kind for the app developer.
Publicly available datasets used to train AI models often lack accessibility context and can fail to reflect the lived experience of people who are blind or have low vision, he said. “Today, disability is often underrepresented or incorrectly categorised in datasets used to train AI, which can limit the utility of the technology or even magnify bias.”
Be My Eyes will provide video data collected through its platform, stripped of all user, account and identifiable personal information, to Microsoft for AI model training. The video datasets represent the lived experience of the blind and low-vision community and will be used to improve the accuracy and precision of scene understanding and descriptions, said Wiberg.