In September last year, we launched our iOS program to support startups as they developed best-in-class Augmented Reality applications. One such startup was myfinder, which uses state-of-the-art technology to support disabled people. Their iOS app acts as an AI assistant that helps blind and visually impaired people locate objects, hear text read aloud, listen to scene descriptions and even connect to a human volunteer. We chatted to myfinder co-founder Ghita El Haitmy and to disability and technology expert Ashley Shew, associate professor at Virginia Tech in the Department of Science, Technology and Society, about technology, disabilities and what the tech world can do to improve its offerings for the disabled community.
Ghita tells us that myfinder came to life thanks to the co-founders’ personal experiences: her grandmother is blind, Ghita herself has Attention Deficit/Hyperactivity Disorder and used assistive tech while at school, and her co-founder is also visually impaired. As such, disabilities had always been at the forefront of their minds. The pair met at a hackathon at the University of Oxford that asked participants to use AI to solve issues of disability, and Ghita wanted to “create a better future, make technology that will help us and others to cope better.”
They launched a beta version of the app in February 2021 and now have over 12K users. Ghita says that thanks to their participation in the hubraum program they have been able to put the app through many more iterations. “We have been able to do a lot of focus groups,” she says, “so we’re able to make sure our products really fit our users’ needs. This is crucial: when people use tools like this on a daily basis, they need to get the job done, they’re very reliant on them and you don’t want to give them something they can’t rely on.”
In the latest version of the myfinder app, a digital cane feature has been added that turns the phone into a sensor, communicating what obstacles are in the way (“whether it’s a flat surface or if there are stairs”) and conveying it all through haptic feedback. Ghita explains that they would like their device eventually to offer an alternative to a guide dog — “But we all know that when adopting innovation, sometimes it can be a slow curve, especially as the AI is still learning, much as a dog would when they’re training.” She explains that for now, it’s something that can be used alongside a guide dog or other tools in specific situations.
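To give a rough sense of how a feature in this spirit might be built on iOS, here is a minimal, purely illustrative Swift sketch (not myfinder’s actual code), assuming a LiDAR-equipped iPhone: it reads ARKit’s scene depth at the centre of the frame and maps the distance to the nearest obstacle onto the strength of a haptic tap.

```swift
import ARKit
import CoreVideo
import UIKit

// Hypothetical "digital cane" sketch: stronger haptic taps the closer an
// obstacle is straight ahead. Illustrative only; a real app would smooth
// and throttle the feedback rather than firing it on every camera frame.
final class DigitalCaneSketch: NSObject, ARSessionDelegate {
    private let session = ARSession()
    private let haptics = UIImpactFeedbackGenerator(style: .heavy)

    func start() {
        // Scene depth requires a LiDAR-equipped device.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        let config = ARWorldTrackingConfiguration()
        config.frameSemantics = .sceneDepth
        session.delegate = self
        session.run(config)
        haptics.prepare()
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthMap = frame.sceneDepth?.depthMap else { return }

        // Read the depth value (in metres) at the centre of the depth map.
        CVPixelBufferLockBaseAddress(depthMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

        guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return }
        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)

        let centreRow = base.advanced(by: (height / 2) * rowBytes)
            .assumingMemoryBound(to: Float32.self)
        let distance = centreRow[width / 2]

        // Closer obstacles produce stronger taps; beyond ~2 m, stay silent.
        if distance > 0, distance < 2.0 {
            let intensity = CGFloat(1.0 - distance / 2.0)
            haptics.impactOccurred(intensity: intensity)
        }
    }
}
```

This is only one possible approach under those assumptions; the actual app may rely on different sensors, models or feedback patterns.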
“People use it a lot at home to find their other shoe or a cup of tea they made when they don’t remember exactly where they left it.” She also points out that since the assistant can instantly connect a visually impaired person with a human volunteer, many users call a volunteer from the supermarket to quickly identify something on a packet, or at home to check the expiry date on a bottle of milk.
While these are the sort of situations the app typically gets used in, she explains that they’re trying to encourage people to use it outdoors more often. “It’s kind of slow because people don’t entirely trust the tech yet but we’re slowly trying to encourage people to develop that trust and for myfinder to gradually become a user’s AI assistant. It learns your behaviour and it adapts as you use it more.”
We ask both Ghita and Ashley what technology can do for people with disabilities. Ghita points out that apps like myfinder are useful in situations where a disabled person might require privacy — a user might not be comfortable asking another person to come into a toilet to find the toilet roll, but might feel less self-conscious using the human volunteer function on myfinder to ask for support (Ghita emphasizes that it’s possible to adjust your settings so the app doesn’t send any data or transfer any images to train the AI; all computations are done on the device and your privacy is ensured).
Ashley discusses how the pandemic led to features commonly used by disabled communities being developed or perfected at faster speeds — for example, autocaptioning, which was previously only offered by Google Meet but which Zoom adopted during the pandemic. She notes that while this doesn’t offer the same accuracy as a live captioner who can capture the nuance of live conversations, it’s still useful for people with lower levels of hearing loss or people with audio processing disorders, where making sense of spoken language takes a little longer, “so reading along can be really useful even if the words aren’t quite right.”
Ashley also notes how valuable programmable hearing aids can be: these are Bluetooth enabled and allow users to change the settings for different types of environments. “So if you’re in a noisy restaurant, you can change the settings of your hearing aids, you can play music or phone calls directly into your hearing aids.”
And where could the tech sector improve its support of disabled people? Both Ashley and Ghita talk about the prohibitively high price of digital products marketed at disabled people. “Oftentimes, when you put the word ‘assistive’ or ‘disability’ in front of something, there gets to be a huge inflation,” says Ashley. She points out that this can be especially punitive since disabled people tend to be disproportionately impoverished. Ghita cites the example of a pen which reads text from a book out loud and costs around two thousand pounds (almost 2,400 euros). “Looking at the sort of economic disparities that people with disabilities are already often prey to, you’re taxing them even more for something they need on a daily basis to operate.” She argues this was one of the reasons she wanted myfinder to combine features usually covered by separate apps: “Instead of paying for multiple subscriptions for different things, you’re just paying a small one for one app that does everything.” Ghita hopes that the cost of such products will eventually be covered by governments or insurance companies.
Another aspect Ashley hopes will change is the lifespan of products aimed at disabled people. She points out that there’s often a focus on what’s shiny and new within tech companies. “I would like to see a lot less emphasis on innovation and more on making good technologies that endure and are serviced and serviceable,” she tells us. After all, it can take money for someone to invest in a product and time to grow comfortable with using it, especially if they’re heavily reliant on it. She suggests this change of mindset to thinking about the long-term could also expand to thinking about the bigger picture, like how software updates might affect those with disabilities.
“People always used to worry about computer updates, because a lot of text-to-voice and voice-to-text software would be displaced every time there was a computer update, and there was a huge lag in getting the same access you used to have.” She cites the example of a news item from a few weeks ago about an implant that helped people with some level of blindness make sense of shapes. “The tech company stopped servicing this product and all of their implants went dead at the same time because of an update. People were walking into subway stations and using it to help navigate and it just cut out. What it means for a tech company to fail in the context where disabled people have adopted a technology is pretty serious.”
But while the responsibility is undoubtedly huge, the difference that tech can make is also notable. For Ghita, assistive tech isn’t just about making something to help people “but also reducing the isolation they feel from their environment and helping them be more integrated into society, more independent. The more we can make that gap a bit smaller, the easier it will be to include people.”
Ghita participated in our iOS Augmented Reality Program. At hubraum we are open for applications from startups with great products and apps that could make people’s lives easier: we’re currently calling for applications for our Snapdragon Spaces Program and hope to see breakthrough solutions there.