Forums | India's Top Technology forum
-   Reviews & Previews
-   -   Article: Stuck Behind the Keyboard

d@rK nEmEsIs 06-04-10 10:34 PM

Stuck Behind the Keyboard
10 Attachment(s)

So here we are, almost mid-way through 2010. I'm currently typing on a keyboard based on a patent filed by Christopher Latham Sholes in 1868. That's 142 years ago! Being generous, let's count from 1970, when keyboards were not very different from what we use today - that's still a healthy 40 years of being stuck hitting keys.

Where is the revolution? Why can't I download the latest episode of Top Gear by just thinking about it during a boring Monday lecture? Why, at this late date, do we still have to struggle with interfaces?

In many ways, input devices seem to have taken a step back. This might be subjective, but does today's membrane keyboard really match up to the tactile feedback that the not-so-old mechanical keys brought to bear?

In 40 years, all we have been offered is a wireless setup, more keys to hit by mistake, and LCD panels that tell us whether it's hot outside or whether we have unread mail. The mouse does not fare any better. The fact is that our being stuck with the same interface for decades is not entirely a matter of manufacturer laziness. The human-machine interface is an actively researched field, and we are clearly open to exploring options beyond the keyboard and mouse - the acceptance of Apple's iPhone, the Nintendo DS and the Wii console stands as evidence.

So, where is the revolution?
When the Nintendo DS was introduced, it made mainstream what was until then only for a few PDA-philes - the touchscreen. The first step in touchscreen technology was taken in 1971 by Dr Sam Hurst, while he was an instructor at the University of Kentucky. In 1974, he created a sensor called the Elograph, which was later developed into a transparent surface similar to what we use today.

Fast forward to 1993, and Apple introduces the Newton - a touchscreen-driven, monochromatic device dubbed the Personal Digital Assistant. Real success only came to touchscreens with the arrival of the Palm Pilot in 1996. The touchscreens built into these devices accepted only a single point of input.

That changed with the introduction of the iPhone in 2007, when multi-touch became the new buzzword and had enough traction to trickle into devices other than the iPhone and iPod Touch. The success of this interface has spawned many "me too" products, ranging from the poorly implemented HTC Touch to more recent, better implementations such as the HTC Diamond. While multi-touch has been commercially successful in smaller devices, there are larger plans ahead.

Microsoft Surface, for example, was showcased not long ago, and it successfully captured the imagination of all of us bored by the tedious keyboard-and-mouse interface. The technology behind Surface makes it inherently costly for widespread use; for now, it will be shipped to commercial clients such as AT&T and T-Mobile.

While Surface seems out of our reach, there is another product from Microsoft dubbed the TouchWall. This implementation actually consists of two pieces - the hardware (TouchWall) and the Vista-based software, called Plex. It is quite similar to Surface in functionality, offering multi-touch interaction with the interface. The hardware consists of three infrared light sources, which scan the entire surface - in this case a 4×6-foot sheet of Plexiglas. When something interrupts the infrared beams, a camera notes the position and feeds the information to the software heart, Plex.
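The sensing step can be sketched in a few lines. This is only an illustration of the idea, not Microsoft's actual pipeline: it assumes the IR camera delivers a grayscale frame in which interrupted beams show up as bright patches, which are then grouped into touch points.

```python
# Illustrative sketch of TouchWall-style sensing: threshold an IR camera
# frame, flood-fill bright pixels into blobs, report each blob's centroid
# as one touch point. All names and values here are assumptions.

def detect_touches(frame, threshold=200):
    """frame: 2D list of 0-255 IR intensities. Returns (row, col) blob centers."""
    rows, cols = len(frame), len(frame[0])
    seen = [[False] * cols for _ in range(rows)]
    touches = []
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] >= threshold and not seen[r][c]:
                # Flood-fill one blob of connected bright pixels.
                stack, blob = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    blob.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and frame[ny][nx] >= threshold and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                # The blob's centroid is one touch point.
                cy = sum(p[0] for p in blob) / len(blob)
                cx = sum(p[1] for p in blob) / len(blob)
                touches.append((cy, cx))
    return touches

# Two fingers on a tiny 6x8 "sheet" show up as two bright patches:
frame = [[0] * 8 for _ in range(6)]
frame[1][1] = frame[1][2] = 255                # first finger
frame[4][5] = frame[4][6] = frame[5][6] = 255  # second finger
print(detect_touches(frame))  # one centroid per finger
```

A real system would of course run this per camera frame and track blobs over time to distinguish taps from drags.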

The relatively simpler hardware used for TouchWall should make the product more pocket-friendly than Surface; Microsoft has suggested a price in the range of a few hundred dollars. More exciting than the multi-touch itself is the tactile interaction offered by these products. The iPhone, for example, brought a revolution in the way we interact with a phone.

The Future is Exciting
What else can we expect down the years, as either an evolution or a revolution of the man-machine interface? The difficulty lies in prediction.
Voice-activated Interfaces: The staple of science fiction - talking directly to a machine to carry out your whims and fancies - is not science fiction anymore. Dictation software like Dragon NaturallySpeaking, which jots spoken words down into your word processor, has been around for quite some time, and current OSes such as Windows Vista and Windows 7 ship with rudimentary voice-command interfaces. But these are baby steps: software is not yet smart enough to hold a natural conversation with us, the major hurdle being language itself - there are so many ways in which we humans speak.
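A toy example makes the "language itself" hurdle concrete. This keyword grammar is entirely invented for illustration: it handles only the phrasings its author anticipated, which is exactly where rudimentary command interfaces fall down.

```python
# A deliberately rigid command grammar: an utterance matches only if it
# contains all the keywords for some command. Everything here is made up
# to illustrate why natural language is hard, not taken from any product.

COMMANDS = {
    ("open", "browser"): "launch_browser",
    ("check", "mail"): "open_mailbox",
}

def interpret(utterance):
    words = set(utterance.lower().split())
    for keywords, action in COMMANDS.items():
        if all(k in words for k in keywords):
            return action
    return None  # phrasing not anticipated by the grammar

print(interpret("please open the browser"))     # matched -> launch_browser
print(interpret("could you fire up the web"))   # same intent, unknown words -> None
```

The second utterance means the same thing to a human but shares no keywords with the grammar - multiply that by every synonym, accent and sentence structure, and the scale of the problem becomes clear.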

Brain-Machine Interfaces

This idea is still fascinating, although it has already hatched and taken its first primordial steps in laboratories the world over - the human brain directly interfaced to machines, the ultimate interface one can think of. Think, and the mail is typed and sent. Blink, and you are at your favorite site. Sneeze, and your computer... well, maybe not that last bit. Brains have been driving machines experimentally for decades now, from rats to humans. The biggest human success stories come from research done to help paraplegics.

Such machine links are generally done in one of two ways:
  1. Electrodes are linked to the surface of the cortex, in a direct brain-machine connection.
  2. Electrodes are placed on the surface of the head and run on the same brain firings that an EEG machine detects and analyzes.
Both methods have met limited success due to the technicalities involved. A more practical implementation of an EEG-driven interface was released in the form of a baseball cap: a team of researchers from Taiwan designed a brain-monitoring system inside one. The wireless, portable system can process data and provide feedback in real time, and is currently being tested.
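As a rough sketch of the kind of signal processing such an EEG cap must perform - the Taiwanese team's actual algorithms are not described here - one common approach is to estimate signal power in a frequency band and map it to a command. The 8-12 Hz alpha band, the synthetic signals and the commands below are all illustrative assumptions.

```python
# Hedged sketch of EEG band-power detection: a strong 8-12 Hz alpha rhythm
# (typical of a relaxed, eyes-closed state) is treated as a "pause" command.
# Signals, bands and commands are invented for illustration.

import math

def band_power(samples, fs, lo, hi):
    """Naive DFT power estimate for frequency bins between lo and hi Hz."""
    n = len(samples)
    power = 0.0
    for k in range(n // 2):
        freq = k * fs / n
        if lo <= freq <= hi:
            re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
            im = sum(-s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
            power += (re * re + im * im) / n
    return power

fs = 128  # samples per second
# One second of synthetic "relaxed" EEG dominated by a 10 Hz alpha wave:
relaxed = [math.sin(2 * math.pi * 10 * i / fs) for i in range(fs)]
# One second of synthetic "alert" EEG dominated by 25 Hz beta activity:
alert = [0.2 * math.sin(2 * math.pi * 25 * i / fs) for i in range(fs)]

command = "pause" if band_power(relaxed, fs, 8, 12) > band_power(alert, fs, 8, 12) else "play"
print(command)  # -> pause
```

A production system would use an FFT and artifact rejection, but the principle - turn a rhythm in the raw voltage trace into a discrete command - is the same.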

Army Mind-Control Projects

The American army is developing a project codenamed "Thought Helmets" that would one day enable direct mind-to-mind communication between soldiers in the warzone. The primary objective of the project is to enable control of military systems by thought alone. While the realization of such projects is still far off, the fact that the US government has granted a $4 million contract to a team of scientists from the University of California and the University of Maryland suggests we could see prototypes within the next decade.

MEMS-Based Robotic Probe
MEMS-based robotic probes, developed by researchers at Caltech, can be implanted directly into the brain to talk to specific neurons ('Matrix' style). The applications of this technology are mind-blowing: it could allow realistic control of prosthetic limbs and similar body parts. The software side of the device is already complete and in the final stages of testing, but the micromechanical part - the bit that actually goes into one's brain - is still in the early stages of development.

OCZ Neural Impulse Actuator
The NIA is currently one of the finest examples of a brain-computer interface. The system consists of a headband and a controller that incorporates an electromyogram, an electrooculogram and an electroencephalogram. These strange-sounding devices help the controller translate brain waves, facial-muscle movements and eye movements into computer inputs. It can be set up to work with virtually any game, but it is still far from plug-and-play.
Biometrics and Cybernetics Interfaces

Warfighter Physiological Status Monitoring

This, in simple words, is a chip embedded into clothing to monitor a soldier's physiological state. It can also be used to feed the predictive models the military uses to evaluate the success rate of its missions.
Fingerprint Scanners

Fingerprint scanners are no longer alien to us. They have been an integral part of sci-fi movies for ages, but only in recent years have they become readily available. Mostly they are used to allow or deny access to specific users - say, to a computer system, a vehicle, or a controlled-access area. Since every human being's fingerprints are unique, a fingerprint scanner serves as a nearly foolproof security measure.

Digital Paper and Digital Glass

Digital paper will be the common paper of future generations; normal paper will be overshadowed by its digital counterpart in the near future thanks to its many advantages. Digital paper is a reflective type of display that is flexible and, like normal paper, needs no backlighting. It consumes very little power, since it draws power only when changing what it displays. Digital glass, meanwhile, is simply a transparent display that resembles a standard LCD monitor.

Transparent OLED Display
This kind of display is likely to appear in MP3 players or advertising displays in the future. A prototype was shown off by Samsung, on a notebook, at CES 2010.
LG Flexible Display
Paper may one day be replaced by these flexible e-paper displays. They are as flexible as real cardboard, and thin too. LG has been one of the few companies to create prototype flexible displays; its display is built on metal foil, so it always regains its original shape.
Electronic Ink, or E-Ink as it's popularly known, is an amazing technology. We got a taste of it in India in the form of Motorola's F3 phone. While it is currently limited to grayscale, it will be available in full-color glory in the coming future. E-Ink is currently famous for its use in the Kindle and the Sony Reader.
Augmented Reality

Augmented Reality is a technique of superimposing real-world data over real-time images of that world. The most common examples of Augmented Reality are applications made specifically for the iPhone 3GS, like Le Bar Guide, Worksnug and many more. There are numerous Augmented Reality projects in the pipeline; I am listing some interesting ones below.
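The core geometric trick behind such apps can be sketched in a toy function. Every name and constant here is an assumption, not taken from any of the apps mentioned: it decides where on screen to draw a label for a point of interest, given the phone's compass heading and the camera's field of view.

```python
# Illustrative AR overlay math: map a point of interest's compass bearing
# to a horizontal screen pixel. Field of view and screen width are invented
# example values, not those of any real device.

def label_x(poi_bearing, heading, fov=60, screen_w=480):
    """Map a POI's compass bearing (degrees) to a horizontal pixel,
    or None if it lies outside the camera's field of view."""
    # Signed angle between camera direction and POI, normalized to (-180, 180].
    delta = (poi_bearing - heading + 180) % 360 - 180
    if abs(delta) > fov / 2:
        return None  # off-screen: don't draw the label
    # Linear map: -fov/2 -> left edge, +fov/2 -> right edge.
    return round((delta + fov / 2) / fov * screen_w)

print(label_x(90, 90))   # POI dead ahead -> 240 (screen center)
print(label_x(105, 90))  # 15 degrees to the right -> 360
print(label_x(270, 90))  # directly behind us -> None
```

Real apps also use GPS to compute the bearing from the user to the POI and the accelerometer to handle tilt, but this horizontal mapping is the essence of "superimposing data over the image".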

Contact Lens Embedded Augmented Reality

Imagine a display contained within a contact lens. Why use the eyes as an interface? Because the connection between the brain and the eyes is faster than any high-speed net connection. Our eyes can take in a lot more than we consciously perceive - they can detect millions of colors and minute shifts in lighting.

The University of Washington has been working on prototypes of these lenses. Prototype lenses have been embedded with an LED, and the lens can be powered wirelessly with RF waves. Eventually such lenses would contain hundreds of tiny LEDs, displaying words, images and other useful information right in front of the wearer's eyes.

Wearable Retinal Display
Do you remember the universal translator from the Star Trek universe, which made communication between the various species a reality? NEC has been working on a similar kind of device, called the Tele Scouter, that will translate foreign languages into subtitles in the user's own language, in real time. Sound interesting?

Heads-Up Display

This is one form of augmented reality that has been around for years. First seen in military applications, it gradually found its way into commercial airliners as well as (high-end) automobiles.

HUDs are quite helpful, as they eliminate the need to look away from the windshield: all information is displayed right there. The future of HUDs is bright; they will be used in synthetic vision systems, where - in layman's terms - everything the user sees on the screen is constructed from a database of information. This won't arrive in the near future, but it could revolutionize the way automobiles are designed, and make aircraft and cars safer too.
Gesture Recognition

The movements we make with our hands, feet, or any other body part are known as gestures. When these gestures are interpreted by a computer into commands, with the help of a camera, it is known as gesture recognition. The field has been popularized by its use in video games, although there are numerous other potential uses.
Acceleglove

The Acceleglove is the brainchild of George Washington University. It can identify American Sign Language gestures and convert them into text. It makes use of multiple accelerometers on each finger of the glove, along with sensors on the hands, which send electrical signals to a microprocessor that finds the text associated with the executed movement. All this conversion is done within milliseconds of the user completing a sign with the gloves on.
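The Acceleglove's actual firmware is not public, but the "find the text associated with the executed movement" step can be sketched as simple template matching against stored accelerometer readings. The template values below are made up purely for illustration.

```python
# Hedged sketch of glove-based sign matching: compare a flattened vector of
# per-finger accelerometer readings against stored templates and return the
# nearest sign. The (x, y, z) template values are invented examples.

import math

# One (x, y, z) reading per finger, flattened: 5 fingers x 3 axes.
TEMPLATES = {
    "A": [0.0, -1.0, 0.0] * 5,                     # fist: all fingers curled
    "B": [0.0, 0.0, -1.0] * 5,                     # flat hand, fingers extended
    "L": [0.0, 0.0, -1.0] + [0.0, -1.0, 0.0] * 4,  # index out, rest curled
}

def classify(reading):
    """Return the template letter nearest to `reading` (Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(TEMPLATES, key=lambda letter: dist(reading, TEMPLATES[letter]))

# A slightly noisy fist should still read as "A":
noisy_fist = [v + 0.05 for v in TEMPLATES["A"]]
print(classify(noisy_fist))  # -> A
```

Comparing against templates in a tight loop like this is cheap enough to run on a microcontroller, which is consistent with the millisecond conversion times quoted above.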
Gesture-Based Control for TVs
Controlling a TV using gesture recognition sounds a lot cooler than using the old, boring remote control. Imagine a flip of your hand changing the channel, while a swipe of your fingers increases or decreases the volume. One can expect such things as early as 2011; companies like Panasonic have already made their foray into the area, showcasing a product at CES 2009.

While Panasonic developed a finger-gesture-based TV control system, Hitachi has gone one step further and introduced a TV with a 3D depth camera to identify gestures on a much larger scale. It uses hand gestures to perform various tasks, including turning the TV on and off.
Nintendo Wii
One of the first gaming systems to adopt gesture-recognition technology. The technology used in the Wii is crude - it needs you to hold the special remote and Nunchuk for gestures to be recognized - but it is still a pioneering gaming device that has made its place in many homes, despite mammoth consoles like the PS3 and Xbox 360.

Xbox Project Natal
This is one project that could revolutionize the way we play games, much like the iPhone revolutionized the mobile user interface. It takes the Wii's gesture recognition a step into the future: no controller or remote is needed; the user directly interacts with what's on the screen, as one would in a real-world scenario. It makes games completely immersive, without any controllers.

Head and Eye Tracking

As the name suggests, this uses one's eye and head movements as control input. It is another technology that could be put to innovative use in many fields in the near future.

Gran Turismo 5
Gran Turismo is known as one of the most realistic racing simulators, and with the launch of Gran Turismo 5, head-tracking capabilities have been included. Using the PlayStation Eye camera, a player's head movements are tracked to control the view within the car's cockpit.
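A guessed-at sketch of the mapping involved: the camera reports where the face sits in the frame, and the game turns that offset into a view angle. The constants and function names here are illustrative, not taken from the game's code.

```python
# Toy head-tracking mapping: convert the detected face's horizontal pixel
# position into a cockpit-view yaw angle. Frame width, maximum yaw and the
# dead zone are invented example values.

def head_to_yaw(face_x, frame_w=640, max_yaw=45, dead_zone=0.05):
    """Map the face's x pixel to a view yaw in degrees.

    Centered head -> 0; head at a frame edge -> +/- max_yaw.
    A small dead zone keeps the view steady when the player sits still.
    """
    offset = (face_x - frame_w / 2) / (frame_w / 2)  # normalized to -1 .. 1
    if abs(offset) < dead_zone:
        return 0.0
    return offset * max_yaw

print(head_to_yaw(320))  # centered head -> 0.0
print(head_to_yaw(640))  # head at right edge -> 45.0
print(head_to_yaw(160))  # head half-way left -> -22.5
```

The hard part in practice is the face detection feeding `face_x`, plus smoothing so the view doesn't jitter with every camera frame.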

Pseudo-3D with a Normal Webcam
A developer named Chris Harrison has built a head-tracking system that works with a standard webcam. Sadly, it is available for Mac users only, though it can be used with a number of 3D interfaces.

Summing it up:

The future applications of these technologies are mind-blowing: they would allow us to control TVs, computers, and many other appliances just by thinking about it.

Multi-touch has made decent inroads into the way we interact with computing devices. Will Microsoft Surface make it to our living rooms? Will TouchWall make an appearance in our office meeting rooms? Will our laptops let us interact with our operating systems just by blinking at them, or by moving our heads? Only one thing is certain, and that is uncertainty. Predictions prove futile in this fast-moving tech world of ours. It will take only one successful product, launched by a visionary company, to completely change the way we interact with data.

The days of the tacky keyboard and mouse, and the menu and icon based user-interface that they have spawned must surely be numbered.

WolVerInE 29-04-10 10:40 PM

Re: Stuck Behind the Keyboard
nice man !

---------- Post added at 10:40 PM ---------- Previous post was at 10:39 PM ----------

3D cam reminds me of wall-e :p

Aumkar 29-04-10 10:42 PM

Re: Stuck Behind the Keyboard
Great article man!!

ManISinJpr 30-04-10 11:14 AM

Re: Stuck Behind the Keyboard
You have collected and placed together a very good article, Nice effort.

BLuEBLOODED 30-04-10 06:16 PM

Re: Stuck Behind the Keyboard
Concise Effort there mate.

d@rK nEmEsIs 30-04-10 07:26 PM

Re: Stuck Behind the Keyboard
thanks to everyone.
