The UX Future: “Touchscreens” Even for the Blind?

Braille bumps on touchscreens? Input and output on any surface? Companies like Nokia and Microsoft say you ain’t felt nothing yet.


Image: bobinson, Flickr Open Source Photography Group, CC BY-NC-SA 2.0 (http://www.flickr.com/photos/freemind/2853403004/)

Interacting with our handheld gadgets has always been an immersive experience. Devices like the Sony Walkman finally liberated us from our home stereo systems but, when combined with headphones, drew us into our own little mobile worlds. For some time afterward, very little changed with input and output modalities. Buttons, knobs and dials worked; why mess with that? They functioned, for the most part, equally well for the sighted and sightless.

These days seeing may still be believing, but touch, in the mobile computing world at least, is everything else. There's an industrial design arms race going on, with the goal of giving users fewer buttons to push; even one on the face of a flagship smartphone is now one too many.

Read: Microsoft to Demo New Surface Multi-touch Systems in mid-January

The ongoing rush to nothing-but-touch ignores the vision-impaired, who are surely suffering unintended consequences of this relatively new interaction paradigm. Physical keypads typically included homing bumps that helped these users navigate the previous generation of cell phones. Audio cues augment this, but on their own they only help so much. A disaster for users who depend on physical feedback?

Maybe not.

Read: A Touch Screen That Plays Sticky

Out on the hopefully not-too-distant horizon are practical haptic surfaces. Current touchscreen feedback mechanisms are limited to lightly buzzing a user's fingertips. Nokia is working on implementations that can respond to applied pressure as well. And up ahead will be surfaces that can warp into the third dimension, creating dynamic protrusions and opening up a whole new world for the vision-impaired. At some point our peripherals will move beyond vibrating game controllers to mice whose buttons emerge on demand from smooth surfaces. Texture and temperature are useful feedback modes too, and I'm sure we can expect to see their use grow in novel ways as well.

We don't necessarily need to build enhanced I/O functionality into the devices themselves. In the early 2000s we first saw the potential of turning just about any flat surface into an input controller. Projected virtual keyboards kept to traditional layouts but promised more flexibility by "painting" ghostly buttons on our desks. For whatever reason, though, they never really caught on for mainstream use.

Microsoft and Carnegie Mellon University seem to think the desktop may have been the limiting factor. Their research is taking this idea mobile:

Soon you, too, will be able to talk to the hand. A new interface created jointly by Microsoft and the Carnegie Mellon Human Computer Interaction Institute allows for interfaces to be displayed on any surface, including notebooks, body parts, and tables.  The UI is completely multitouch and the “shoulder-worn” system will locate the surface you’re working on in 3D space, ensuring the UI is always accessible.  It uses a picoprojector and a 3D scanner similar to the Kinect.
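
Curious how a worn depth scanner could turn a bare desk or forearm into a touch panel? Below is a rough, purely illustrative sketch in Python, my own guess at the general idea rather than anything from Microsoft or CMU: fit a plane to a depth frame to model the surface being projected on, then flag pixels hovering just above that plane as fingertip contacts. The frame format, thresholds and function names here are all assumptions.

# Illustrative sketch only -- NOT the Microsoft/CMU implementation.
# Assumes a generic depth camera that delivers frames as 2D NumPy arrays
# of distances in millimeters (e.g. from a Kinect-style sensor).
import numpy as np

def fit_surface_plane(depth, sample_step=8):
    """Least-squares fit of a plane z = ax + by + c to the depth frame,
    approximating the flat surface (desk, notebook, forearm) in view."""
    h, w = depth.shape
    ys, xs = np.mgrid[0:h:sample_step, 0:w:sample_step]
    zs = depth[ys, xs].astype(float)
    A = np.column_stack([xs.ravel(), ys.ravel(), np.ones(xs.size)])
    coeffs, *_ = np.linalg.lstsq(A, zs.ravel(), rcond=None)
    return coeffs  # (a, b, c)

def detect_touches(depth, plane, hover_mm=(5, 25)):
    """Mark pixels that sit slightly *above* the fitted surface: close enough
    to count as a touch, far enough to be a finger rather than sensor noise."""
    h, w = depth.shape
    xs, ys = np.meshgrid(np.arange(w), np.arange(h))
    a, b, c = plane
    expected = a * xs + b * ys + c   # depth the bare surface would have
    height = expected - depth        # positive = something between sensor and surface
    return (height > hover_mm[0]) & (height < hover_mm[1])  # blobs here ~ fingertips

if __name__ == "__main__":
    # Fake frame: a flat surface 900 mm away with a "finger" patch 12 mm above it.
    frame = np.full((240, 320), 900.0)
    frame[100:110, 150:156] -= 12.0
    mask = detect_touches(frame, fit_surface_plane(frame))
    print("touch pixels:", int(mask.sum()))

A real system would also have to cluster those pixels into fingertips, track them across frames and map them back into the projector's coordinate space, but the core trick is the same: the depth sensor tells you where the surface is, and anything hovering a few millimeters above it is treated as input.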

As long as the projections are consistent, they could even be used by sight-impaired users relying on their own bodies or belongings to interact with the world around them. Couple such a technology with a GPS-enabled camera system capable of recognizing everyday objects, as well as the usual audio cues, and even the totally blind could navigate unfamiliar territory with ease.

With full-fledged computers shrinking to ridiculously small sizes and prices, we're looking at an impending revolution in environmental information processing, one that will liberate and engage more people in the near future. I can't wait to see, and feel, what's coming.

Randall "texrat" Arnold, Editor-in-Chief. Making tech accessible since the Jurassic. Personal ramblings at texrat.net. Follow @texrat on Twitter.
