This idea is still fascinating, even though it has already hatched and taken its first primordial steps in laboratories around the world: the human brain directly interfaced with machines, the ultimate interface one can imagine. Think, and your mail is typed and sent. Blink, and you are at your favorite site. Sneeze, and your computer gets a virus... okay, maybe not that last bit. Brains have been driving machines for decades now, from rats to humans, and the biggest human success stories come from research aimed at helping paraplegics.
Such machine links are generally done in one of two ways:
- Electrodes are attached directly to the surface of the cortex, in a direct brain-machine connection.
- Or electrodes are placed on the scalp and read the same brain signals that an EEG machine detects and analyzes.

Both methods have met with limited success because of the technicalities involved. A more practical implementation of an EEG-driven interface has arrived in the form of a baseball cap: a team of researchers from Taiwan designed a brain-monitoring system built into one. The wireless, portable system can process data and provide feedback in real time, and is currently undergoing testing.
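Either way, the raw signal only becomes useful once it is reduced to something a program can act on, typically the power in a frequency band such as the alpha rhythm (8-12 Hz). Here is a minimal sketch of that step, using a synthetic trace rather than a real electrode feed; the sample rate and the "select"/"idle" mapping are invented for illustration:

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Estimate power of `signal` (1-D samples at `fs` Hz) in [low, high] Hz."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= low) & (freqs <= high)
    return np.sum(np.abs(spectrum[mask]) ** 2) / len(signal)

# Synthetic 1-second "EEG" trace: a 10 Hz alpha rhythm plus a little noise.
fs = 256
t = np.arange(fs) / fs
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(fs)

alpha = band_power(eeg, fs, 8, 12)   # strong: the 10 Hz component lives here
beta = band_power(eeg, fs, 13, 30)   # weak: only noise falls in this band
command = "select" if alpha > beta else "idle"
```

A real system would do this continuously on a sliding window and calibrate the thresholds per user, but the band-power comparison is the heart of it.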
Army Mind-Control Projects
The American army is developing a project codenamed “Thought Helmets” that would one day enable direct mind-to-mind communication between soldiers in a war zone. The primary objective of the project is to enable complete control of military systems by thought alone. While the realization of such projects is still far off, the fact that the US government has awarded a $4 million contract to a team of scientists from the University of California and the University of Maryland suggests we could be seeing prototypes within the next decade.
MEMS-Based Robotic Probe
MEMS-based robotic probes can be implanted directly into the brain to talk with specific neurons ('Matrix' style). They have been developed by researchers at Caltech. The applications of this technology are mind-blowing: it could allow realistic control of prosthetic limbs and other similar body parts. The software side of the device is already complete and in the final stages of testing, but the micromechanical part that actually goes into one's brain is still in the early stages of development.
OCZ Neural Impulse Actuator
The NIA is currently one of the finest examples of a brain-computer interface. The system consists of a headband and a controller that incorporates an electromyogram, an electrooculogram, and an electroencephalogram. These strange-sounding devices help the controller translate brain waves, facial muscle movements, and eye movements into computer inputs. It can be set up to work with virtually any game, but it is still far from plug-and-play.
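Under the hood, a binding like this usually amounts to thresholding a continuous channel into discrete key events. A hypothetical sketch of that idea (the trace values, threshold, and key name are all made up for illustration, not taken from the NIA's actual profiles):

```python
def bind_channel(samples, threshold, key):
    """Emit (press/release, key) events when the signal crosses `threshold`.

    Hypothetical sketch of how a muscle-tension channel (say, a jaw clench)
    could be mapped onto a key, the way NIA-style profiles bind channels.
    """
    events, pressed = [], False
    for value in samples:
        if value >= threshold and not pressed:
            events.append(("press", key))
            pressed = True
        elif value < threshold and pressed:
            events.append(("release", key))
            pressed = False
    return events

# A brief "clench" in the middle of the trace fires one press/release pair.
trace = [0.1, 0.2, 0.9, 0.95, 0.3, 0.1]
events = bind_channel(trace, threshold=0.8, key="space")
```

The tricky part in practice is calibrating the threshold per user and filtering out noise, which is exactly why such devices are not yet plug-and-play.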
Biometrics and Cybernetics Interfaces
Warfighter Physiological Status Monitoring
The warfighter monitor, in simple words, is a chip embedded into clothing to track a soldier's physiological state. It can also be used to feed input to the predictive models the military uses to evaluate the success rates of its missions.
Fingerprint scanners are no longer an alien thing to us. They have been an integral part of sci-fi movies for ages, but they have become readily available only in recent years. Mostly they are used for allowing or denying access to specific users, say for a computer system, a vehicle, or a controlled-access area. Since every human being's fingerprints are unique, a fingerprint scanner works as a nearly foolproof security measure.
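How does a scanner decide two prints match? Real matchers compare minutiae, the points where ridges end or split, after aligning the two prints and checking ridge angles. A toy sketch of just the point-matching step, with made-up coordinates:

```python
import math

def match_score(template, candidate, tol=5.0):
    """Fraction of template minutiae with a candidate minutia within `tol` px.

    Toy illustration only: a real matcher also aligns the prints and
    compares ridge angles and minutia types, not just (x, y) positions.
    """
    if not template:
        return 0.0
    hits = 0
    for (tx, ty) in template:
        if any(math.hypot(tx - cx, ty - cy) <= tol for (cx, cy) in candidate):
            hits += 1
    return hits / len(template)

enrolled = [(10, 12), (40, 35), (70, 22), (55, 60)]   # stored template
same_finger = [(11, 13), (39, 36), (71, 21), (54, 61)]  # slightly shifted scan
other_finger = [(90, 90), (5, 80), (33, 5), (60, 44)]   # a different print

genuine = match_score(enrolled, same_finger)
impostor = match_score(enrolled, other_finger)
```

The device then grants access only when the score clears a tuned threshold, trading off false accepts against false rejects.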
Digital Paper and Digital Glass
Digital paper will be the common paper of future generations. Normal paper will be overshadowed by its digital counterpart in the near future due to its many advantages over traditional paper. Digital paper is a reflective, flexible type of display which, like normal paper, does not need any kind of backlighting. It consumes very little power, drawing current only when changing what it displays. Digital glass is simply a transparent display that resembles a standard LCD monitor.
Transparent OLED Display
This kind of display is likely to appear in MP3 players and advertising displays in the future. A prototype was shown by Samsung on a notebook at CES 2010.
LG Flexible Display
Paper may one day be replaced by these flexible e-paper displays. They are as flexible as real cardboard, and thin too. LG has been one of the few companies to create prototype flexible displays. Its display is built on metal foil, so it always regains its original shape.
E-Ink
Electronic ink, or E-Ink as it is popularly known, is an amazing technology. We got a taste of it in the form of Motorola's F3 phone in India. While it is currently limited to grayscale, it will be available in full-color glory in the coming years. E-Ink is currently best known for its use in the Kindle and the Sony Reader.
Augmented Reality
Augmented Reality is a technique of superimposing computer-generated data over real-time images of the real world. The most common examples are applications made specifically for the iPhone 3GS, such as Le Bar Guide and WorkSnug. There are numerous Augmented Reality projects in the pipeline; I am listing some interesting ones below.
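At its core, a geo-tagged AR app like these has to decide where on screen a point of interest belongs, given the phone's compass heading and the bearing to the target. A rough sketch of that projection (the 60-degree field of view and 480-pixel width are assumed values, not taken from any real app):

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing from point 1 to point 2, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360

def screen_x(heading, poi_bearing, fov=60, width=480):
    """Horizontal pixel for a POI, or None if outside the camera's view."""
    offset = (poi_bearing - heading + 180) % 360 - 180  # shortest angle
    if abs(offset) > fov / 2:
        return None
    return int(width / 2 + offset / (fov / 2) * (width / 2))

# Facing due north, a POI bearing 15 degrees east of north lands right of
# center; a POI directly behind the camera is not drawn at all.
x = screen_x(heading=0, poi_bearing=15)
hidden = screen_x(heading=0, poi_bearing=180)
```

The app then draws the POI's label at that x-position, refreshing as the compass and GPS readings change.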
Contact Lens Embedded Augmented Reality
Imagine a display contained within a contact lens. Why use the eyes as an interface? Because the connection between the brain and the eyes is faster than any high-speed net connection. Our eyes can see far more than we consciously perceive: they can detect millions of colors and minute shifts in lighting.
Researchers at the University of Washington have been working on prototypes of these lenses. They have embedded a lens with an LED and powered it wirelessly with RF waves. Future versions would contain hundreds of tiny LEDs displaying words, images, and other useful information right in front of the wearer's eyes.
Wearable Retinal Display
Do you remember the universal translator from the Star Trek universe, which made communication between various species a reality? NEC has been working on a similar kind of device, called the Tele Scouter, that will translate foreign languages into subtitles in the user's own language, in real time. Sounds interesting?
Head-Up Display (HUD)
The HUD is one form of augmented reality that has been around for years. First seen in military applications, it gradually found its way into commercial airliners as well as high-end automobiles.
HUDs are quite helpful as they eliminate the need to look away from the windshield; all the information is displayed on the glass itself. The future of HUDs is bright: they will be used in synthetic vision systems, where, in layman's terms, everything the user sees on the screen is constructed from a database of stored information. This will not appear in the near future, but it could revolutionize the way automobiles are designed and make aircraft and cars safer too.
Gesture Recognition
The movements we make with our hands, feet, or other body parts are known as gestures. When a computer interprets these gestures as commands, with the help of a camera, it is known as gesture recognition. The technique has been popularized by its use in video games, although there are numerous other potential uses.
Acceleglove
The Acceleglove is the brainchild of George Washington University researchers. It can identify American Sign Language gestures and convert them into text. It uses an accelerometer on each finger of the glove, along with sensors on the hand, which send electrical signals to a microprocessor that finds the text associated with the executed movement. All this conversion work is done within milliseconds of the user completing a sign.
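The classification step can be pictured as nearest-neighbor matching of a sensor reading against stored templates. The five-value readings and templates below are invented for illustration; the real glove works with three-axis accelerometer streams, not single flexion values:

```python
import math

# Hypothetical per-finger readings (thumb, index, middle, ring, pinky),
# where higher means more extended, for a few ASL letter handshapes.
TEMPLATES = {
    "A": [0.9, 0.1, 0.1, 0.1, 0.1],   # fist with the thumb alongside
    "B": [0.1, 0.9, 0.9, 0.9, 0.9],   # flat hand, thumb tucked
    "L": [0.9, 0.9, 0.1, 0.1, 0.1],   # thumb and index extended
}

def classify(reading):
    """Return the letter whose template is nearest (Euclidean) to `reading`."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(TEMPLATES, key=lambda letter: dist(TEMPLATES[letter], reading))

letter = classify([0.85, 0.95, 0.15, 0.05, 0.1])   # a noisy "L"-like pose
```

Because each comparison is just a handful of arithmetic operations, a microprocessor can easily complete it within milliseconds of the sign finishing.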
Gesture-Based Control for TVs
Controlling a TV with gesture recognition sounds much cooler than using the old, boring remote control. Imagine flipping your hand to change the channel, or swiping your fingers to increase or decrease the volume. One can expect such products as early as 2011; companies like Panasonic have already made their foray into the area, showcasing a product at CES 2009.
While Panasonic developed a finger-gesture-based TV control system, Hitachi has gone one step further and introduced a TV with a 3D depth camera to identify gestures on a much larger scale. It uses hand gestures to perform various tasks, including turning the TV on and off.
Nintendo Wii
The Wii was one of the first gaming systems to adopt gesture recognition technology. Although the technology it uses is crude (the player must hold the special remote and Nunchuk for gestures to be recognized), it is still a pioneering gaming device that has made its way into many homes despite mammoth consoles like the PS3 and Xbox 360.
Xbox Project Natal
This is one project that could revolutionize the way we play games, much as the iPhone revolutionized the mobile user interface. It takes the Wii's gesture recognition a step further: no controller or remote is needed, and the user interacts directly with what is on screen, just as in a real-world scenario. The result is a completely immersive, controller-free game.
Head and Eye Tracking
As the name suggests, this uses one's eye and head movements to control something on screen. It is another technology that could be applied in many fields in innovative ways in the near future.
Gran Turismo 5
Gran Turismo is known as one of the most realistic racing simulators. With the launch of Gran Turismo 5, head-tracking capabilities have been included: using the PlayStation Eye camera, the player's head movements are tracked and used to control the view within the car's cockpit.
Pseudo-3D with a Normal Webcam
A guy named Chris Harrison has developed a head-tracking system that works with a standard webcam. Sadly, it is available for Mac users only, though it can be used with a number of 3D interfaces.
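The core of such a system is simple: once a face detector has located the viewer's head in the webcam frame, its offset from center is mapped, inverted, onto the virtual camera to produce the parallax cue that makes a flat scene feel 3D. A sketch of just that mapping (the frame size and scale factor are assumed values, and face detection itself is left out):

```python
def view_offset(face_x, face_y, frame_w=640, frame_h=480, scale=2.0):
    """Map a detected face position to a virtual-camera offset.

    When the viewer's head moves left, the on-screen scene should shift
    the opposite way; that opposing motion is the parallax cue pseudo-3D
    relies on. `scale` (hypothetical) controls how strong the effect is.
    """
    # Normalize the face position to [-1, 1] around the frame center.
    nx = (face_x - frame_w / 2) / (frame_w / 2)
    ny = (face_y - frame_h / 2) / (frame_h / 2)
    return (-nx * scale, -ny * scale)

# Head centered: no shift. Head at the right edge: scene shifts left.
centered = view_offset(320, 240)
right_edge = view_offset(640, 240)
```

Run every frame against a detector's output, this one function is enough to turn an ordinary webcam into a crude head-coupled display.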
Summing It Up
The future applications of these technologies are mind-blowing. They would allow us to control TVs, computers, and many other appliances just by thinking about them.
Multi-touch has made decent inroads into the way we interact with computing devices. Will Microsoft Surface make it to our living rooms? Will TouchWall make an appearance in our office meeting rooms? Will our laptops let us interact with our operating systems just by blinking at them, or by moving our heads? Only one thing is certain, and that is uncertainty. Predictions prove futile in this fast-moving tech world of ours. It will take only one successful product, launched by a visionary company, to completely change the way we interact with data.
The days of the tacky keyboard and mouse, and the menu- and icon-based user interfaces they have spawned, must surely be numbered.