“Technology is most powerful when it empowers everyone”
As someone who takes great pride in developing apps and monitoring customer sentiment, I have always found accessibility a compelling topic. Apple’s accessibility page states: “Technology is most powerful when it empowers everyone”, a clear statement from a company that was winning accessibility awards for its mobile devices soon after it started shipping them. Implementing the features offered by the OS enables users to use your app in ways you cannot always predict. Take the Philips Hue app: at first glance it wouldn’t necessarily make sense to implement features that help blind people get the best lighting experience, now would it? Once you realize, however, that guide dogs cannot see in the dark, it makes total sense to let the user set up lighting automation via motion sensors and routines. Let’s dive into some current developments that enable everyone!
Understanding our senses
If you want to be on the cutting edge of accessibility, you need some understanding of how the brain works. For the longest time, people thought that humans needed their ears for hearing and their eyes for seeing, right up until the 1960s, when it was hypothesized that it is actually the brain that sees for you. Today we understand that your eyes and ears simply translate what they pick up into signals and send those signals to the brain. To oversimplify: if someone speaks to you, your ears turn the vibrations in the air into a specific pattern of brain signals, which your brain has learned to understand as language. For about 0.4% of the population, however, the ears don’t work at all, and that’s not counting people with other hearing impairments: about 15% of people report some kind of hearing problem. Modern technology is using the brain’s unconscious processing power to allow people with non-functioning ears to hear again. Companies like NeoSensory are developing products like “VEST”, creating their own “language” for our brains to interpret. To oversimplify again: a shirt that works as a hearing aid for deaf people. VEST listens via a microphone and translates sounds into specific vibration patterns on the wearer’s back. The wearer can learn the VEST “language” in a matter of weeks via an app, by taking classes for just a couple of minutes a day. Or take a different example, BrainPort. This mouth-worn device enables the user to “see” via stimuli on the tongue. In a sense it mimics what eyes do: a camera turns visual input into signals on the tongue, which in turn reach the brain, and voilà, people are able to “see” well enough that some are using the device to climb mountains.
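To make the sound-to-vibration idea concrete, here is a minimal sketch of that kind of encoding: split an audio frame into frequency bands and map each band’s energy to the drive level of one vibration motor. This is an illustration of the general principle only; the band edges, motor count, and naive DFT are my own assumptions, not NeoSensory’s actual algorithm.

```python
import math

def band_energies(samples, sample_rate, band_edges):
    """Naive DFT-based energy per frequency band (illustrative, not optimized)."""
    n = len(samples)
    energies = [0.0] * (len(band_edges) - 1)
    for k in range(n // 2):  # only positive frequencies
        freq = k * sample_rate / n
        # magnitude of the k-th DFT bin
        re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        im = -sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        mag = math.hypot(re, im)
        # accumulate the bin into whichever band its frequency falls in
        for b in range(len(energies)):
            if band_edges[b] <= freq < band_edges[b + 1]:
                energies[b] += mag
    return energies

def to_motor_intensities(energies):
    """Scale band energies to 0..255 drive values, one per vibration motor."""
    peak = max(energies) or 1.0
    return [round(255 * e / peak) for e in energies]
```

A pure tone would light up mostly one motor; speech would produce a shifting pattern across all of them, which is the “language” the wearer learns to read.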
The brain is plug and play
Once we understand that we can teach our brains to interpret sensory input for specific results, we can enable all kinds of new use cases, inside the accessibility world and outside it. Imagine a camera on your front porch that signals a device you’re wearing, producing a sensation in your left arm too subtle for you to consciously notice; instinctively, you would simply know there is a visitor. Or what if we used this technology to let users “see” heat? By teaching the brain a specific sensory input, you could safely grab that pan from the stove: you simply know it’s safe, because you’ve already “seen” that you won’t get burned before you even thought about grabbing it. The possibilities are endless.
Writing your own language
With technological breakthroughs happening every day, it’s important to have a clear understanding of your application’s use cases and how they could serve those who need accessibility features. By observing what companies like NeoSensory are doing, we can see that we can help our users by inventing our own language, so to speak.
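What could such a self-invented “language” look like in an app? Here is a hedged sketch: a small vocabulary that maps app events to distinct vibration patterns, expressed as millisecond on/off pairs. The event names and patterns are invented for illustration; the point is that a consistent, learnable mapping lets users recognize events without looking at a screen.

```python
# Each pattern is a list of (vibrate_ms, pause_ms) pairs.
# Vocabulary entries are illustrative, not from any real product.
VOCABULARY = {
    "visitor_at_door": [(200, 100), (200, 100)],    # two medium pulses
    "message_received": [(80, 50)],                 # one short tap
    "alarm": [(400, 100), (400, 100), (400, 100)],  # three long pulses
}

def encode(event):
    """Return the vibration pattern for an event, or a fallback 'unknown' buzz."""
    return VOCABULARY.get(event, [(50, 50)])

def total_duration_ms(pattern):
    """How long the full pattern takes to play, including pauses."""
    return sum(on + off for on, off in pattern)
```

Keeping the vocabulary small and the patterns clearly distinct is what makes it learnable, the same property that lets VEST wearers pick up its language in weeks.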
So the next time you’re asked to implement accessibility features, don’t see it as a hassle; see it as an opportunity to do some innovative work by writing your own language!
If this article piqued your interest, I recommend David Eagleman’s TED talk on this topic: https://www.ted.com/talks/david_eagleman_can_we_create_new_senses_for_humans
Jurian de Cocq van Delwijnen works for Sogeti Netherlands as part of the mobile team. He builds apps by craft, but his passion for refining the development process has taken him far as a Scrum Master. He prides himself on being able to switch effortlessly between the developer’s perspective and that of the business. After previously developing the Philips Hue app, he now works for Rabobank, the second-largest bank in the Netherlands.
More on Jurian de Cocq van Delwijnen.