Stuck Behind the Keyboard
So here we are, almost mid-way through 2010. I'm currently typing on a keyboard based on a patent filed by Christopher Latham Sholes in 1868. That's 142 years ago! Being generous, let's count down from 1970, when keyboards were not very different from what we use today - that's still a healthy 40 years of being stuck hitting keys.
Where is the revolution? Why can't I download the latest episode of Top Gear by just thinking about it during a boring Monday lecture? Why, at this late date, do we still have to struggle with interfaces?
In many ways, input devices seem to have taken a step back. This might be subjective, but does today's membrane keyboard really match up to the tactile feedback that the not-so-old mechanical keys brought to bear?
In 40 years, all we have been offered is a wireless setup, more keys to hit by mistake, and LCD panels that tell us whether it's hot outside or if we have unread mail. The mouse does not fare any better. The fact is that our being stuck with the same interface for decades is not entirely a matter of manufacturer laziness. The human-machine interface is an actively researched field, and we are quite open to exploring options other than the keyboard and mouse. The acceptance of Apple's iPhone, the Nintendo DS and the Wii console stands as evidence of this.
So, where is the revolution?
When the Nintendo DS was introduced, it made mainstream what was until then the preserve of a few PDA-philes - the touchscreen. The first step in touchscreen technology was taken by Dr Sam Hurst while he was an instructor at the University of Kentucky. In 1971, he created a sensor called the Elograph, which was later developed into a transparent surface, similar to what we use today.
Fast forward to 1993, and Apple introduces the Newton - a touchscreen-driven, monochromatic device dubbed the Personal Digital Assistant. Real success only came to touchscreens with the arrival of the Palm Pilot in 1996. The touchscreens built into these devices accepted only a single point of input.
That changed with the introduction of the iPhone in 2007, when multi-touch became the new buzzword and gained enough traction to trickle into devices other than the iPhone and iPod Touch. The success of this interface has spawned many "me too" products, ranging from the poorly implemented HTC Touch to more recent, better implementations such as the HTC Touch Diamond. While multi-touch has been commercially successful in smaller devices, there are larger plans ahead.
Microsoft Surface, for example, was showcased not long ago, and it successfully captured the imagination of all of us bored by the tedious keyboard-and-mouse interface. The technology behind Surface, however, makes it inherently costly for widespread use. For now, it will be shipped only to commercial clients such as AT&T and T-Mobile.
While Surface seems out of our reach, there is another product from Microsoft, dubbed the TouchWall. This implementation consists of two pieces - the hardware (TouchWall) and the Vista-based software called Plex. It is quite similar to Surface in functionality, offering multi-touch interaction with the interface. The hardware consists of three infrared light sources that scan the entire surface - in this case, a 4 x 6-foot sheet of Plexiglas. When something interrupts the infrared beams, a camera notes the position and feeds the information to the software heart - Plex.
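The principle is simple enough to sketch in a few lines of code. The snippet below is a hypothetical illustration (not Microsoft's actual implementation): an infrared camera reports pixel positions where beams are interrupted, those pixels are scaled onto the surface, and nearby detections are merged into distinct touch points. The camera resolution and clustering radius are assumptions for the sake of the example.

```python
# Hypothetical sketch of TouchWall-style touch detection: a camera
# watching the Plexiglas sheet reports pixel positions where the
# infrared beams are interrupted; we map those pixels onto the
# 4 x 6-foot surface and merge nearby detections into touch points.

CAM_W, CAM_H = 640, 480      # assumed camera resolution (pixels)
WALL_W, WALL_H = 6.0, 4.0    # surface size in feet (6 ft wide, 4 ft tall)

def pixel_to_wall(px, py):
    """Scale a camera pixel coordinate to wall coordinates in feet."""
    return (px / CAM_W * WALL_W, py / CAM_H * WALL_H)

def cluster_touches(pixels, radius=0.2):
    """Merge nearby occlusion points (within `radius` feet) into touches."""
    touches = []
    for px, py in pixels:
        x, y = pixel_to_wall(px, py)
        for tx, ty in touches:
            if abs(tx - x) < radius and abs(ty - y) < radius:
                break  # duplicate detection of an existing touch
        else:
            touches.append((x, y))
    return touches

# Two detections of the same finger near the centre, plus a second finger:
detections = [(320, 240), (322, 242), (100, 100)]
print(cluster_touches(detections))  # two touch points, in feet
```

With per-frame clustering like this, the software layer (Plex, in TouchWall's case) only has to track how touch points move between frames to recognise gestures such as dragging and pinching.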
The relatively simple hardware used in TouchWall should make the product more pocket-friendly than Surface; Microsoft has suggested a price in the range of a few hundred dollars. More exciting than the multi-touch itself is the tactile interaction these products offer. The iPhone, for example, brought a revolution in the way we interact with a phone.
The Future is Exciting
What else can we expect down the years, as either an evolution or a revolution of the man-machine interface? The difficulty lies in prediction.
Voice-activated Interfaces: The staple of science fiction - talking directly to a machine to carry out your whims and fancies - is not science fiction anymore. Dictation software such as Dragon NaturallySpeaking, which jots down spoken words into your word processor, has been around for quite some time. Current OSes such as Windows Vista and Windows 7 come with rudimentary voice command interfaces. But these are baby steps: software is not yet smart enough to carry out a natural conversation with us. The major hurdle is language itself - there are so many ways in which we humans speak.
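That hurdle is easy to demonstrate. The toy sketch below (not any real speech engine) shows the kind of rigid command grammar behind today's rudimentary voice interfaces: several spoken phrasings are mapped onto one canonical command, and anything outside the fixed list is simply not understood. The phrases and command names are invented for illustration.

```python
# Toy command grammar, of the sort behind rudimentary voice interfaces:
# only exact phrasings in the table are understood; everything else fails.

COMMANDS = {
    "open mail": "OPEN_MAIL",
    "check my mail": "OPEN_MAIL",
    "show me my inbox": "OPEN_MAIL",
    "shut down": "SHUTDOWN",
    "turn off the computer": "SHUTDOWN",
}

def interpret(utterance):
    """Map a recognised utterance to a command, or None if not understood."""
    return COMMANDS.get(utterance.lower().strip())

print(interpret("Check my mail"))              # understood: OPEN_MAIL
print(interpret("would you check my mail"))    # not in the grammar: None
```

A human listener treats "check my mail" and "would you check my mail" as the same request; a lookup table does not. Closing that gap - genuine language understanding rather than phrase matching - is exactly what keeps natural conversation with a machine out of reach for now.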