Just out of curiosity, I conducted a little experiment in device accessibility:
I closed my eyes and tried to use my PC.
Of course, it was more of an exercise in futility. I’m no touch-typist, so even with the raised features on certain keys I was useless on the keyboard. With no indication of where my mouse pointer was or what it was doing, I was even more lost on the screen. I didn’t even know where or how to start.
Welcome to the world of the sensory-deprived computer user.
Now, I could implement things that would help: audio cues, vibration. Operating systems do, in fact, typically provide accessibility enhancements, though they often require someone else to set up.
And do they go far enough? Most enhancements assume mild impairments, not the total absence of one or more senses.
I’m already severely nearsighted and suffer some hearing damage, but that’s admittedly nothing compared to what others endure. The panic of misplacing eyeglasses, and groping for them through the fog between me and the world, can’t compare to the everyday ordeal of absolute blindness in a sight-centric world. But I did have a brush with sight loss that has made me very sympathetic.
Several years ago I developed an inflammatory eye disease known as uveitis. It starts as an extreme sensitivity to light and can progress to blindness. I wore a blindfold for two days to avoid the excruciating pain that even dim lights inflicted. I had to go so far as to avoid glancing at the red LEDs of my alarm clock.
I was fortunate; with treatment I recovered from that scary episode and have only had a few minor repeats since. Others aren’t so lucky… and I’m told that there’s still a possibility of completely losing my eyesight at any time.
That thought creeps in whenever I consider the intertwining topics of usability and accessibility. If I went blind, could I adapt? Design work would be out of the question… so could I still use my computer for, say, music composition?
Right now I don’t see how. The typical digital audio workstation software is complex enough for the sighted; how could it be made to work for the blind? Advancements in tablet technology tell me that, eventually, many if not all of the current hurdles could be conquered.
One drawback: there’s not as much money to be made accommodating the sensory-impaired and disabled as there is serving the mainstream. Many overtures in this area will depend more on goodwill than on a calculable return on investment. But consider that solving usability issues for the most challenging customers could produce side benefits for everyone else. Improve the experience for the hearing-impaired and sightless, and you may well solve focus and distraction problems for every user.
Nokia’s design chief, Marko Ahtisaari, thinks our personal devices are too immersive. Too demanding. Given the number of people I see texting as they drive these days, I have to believe he’s right. We need new modes of engagement.
Ahtisaari hints at techniques and technologies to solve such dilemmas, and we can guess what he may be alluding to. In addition to novel UI and UX approaches, we can take more exotic elements into consideration. Touchscreen haptics, magnetic tattoos, neural sensors: all viable tools for improving any person’s interactivity with the digital universe.
Some cities have added audio to their crosswalk signals. The pitch and/or tempo changes as the available time to cross approaches its limit. Little things like this go a long way toward making a pedestrian’s walk safer and more enjoyable, and they work for everyone. This is how we should approach user experiences in general. This is how we help everyone navigate the physical world with equal ease, via a harmonized synthesis of the conventional and electronic. Overlay the Internet on the physical world; augmented reality need not be limited to visual enhancements!
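The crosswalk idea is simple enough to sketch in code. Below is a minimal, purely illustrative mapping from remaining crossing time to a beep pitch and inter-beep gap; the specific frequencies and timings are my own assumptions, not drawn from any real signal specification.

```python
def crossing_cue(remaining_s, total_s, low_hz=440.0, high_hz=880.0,
                 slow_gap_s=1.0, fast_gap_s=0.2):
    """Map remaining crossing time to a beep pitch and inter-beep gap.

    As time runs out, the pitch rises and the beeps come faster.
    All constants here are illustrative placeholders.
    """
    frac_left = max(0.0, min(1.0, remaining_s / total_s))
    urgency = 1.0 - frac_left  # 0.0 = signal just turned, 1.0 = time's up
    pitch_hz = low_hz + (high_hz - low_hz) * urgency
    gap_s = slow_gap_s - (slow_gap_s - fast_gap_s) * urgency
    return pitch_hz, gap_s
```

With a 20-second crossing window, a pedestrian who has just started hears a low, slow beep (440 Hz, one per second); as the window closes, the cue rises and accelerates toward 880 Hz at five beeps per second. The same signal serves sighted and blind pedestrians alike, which is the point.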
The future may see our various written languages converge toward universal gestures that can be interpreted on intelligent surfaces or even in the air itself. Maybe it’s time we all learn sign language; speech may one day take a distant back seat to nonverbal communication. This could lead to great interaction equalizers for the blind and deaf; indeed, the heightened sensitivity to touch that comes with such conditions could even confer an advantage.
So my challenge to designers and developers: tie on that blindfold. Plug your ears. Pin an arm behind your back. Then project your mind forward… and start making things better.
Someday soon I want to try that experiment again, and just go to work.
By Randall "texrat" Arnold