Microsoft Shows Device Control By Muscle Movement [Demo Video & Images]
January 1, 2010

How about making random finger gestures while jogging to change the track being played by your portable media player? Or opening your car by gripping the bags in your hand more tightly?! The crazy engineers at Microsoft Research have figured out a way! A prototype video and images are embedded below. Microsoft calls the approach muscle-computer interfaces (muCIs):
An interaction methodology that directly senses and decodes human muscular activity rather than relying on physical device actuation or user actions that are externally visible or audible.
While Apple is busy finding ways to bring Wii Remote-like functionality to its Apple Remote, Microsoft is busy taking another leap in computer interaction. With Project Natal, Microsoft brought the ability to interact through body movements; apparently someone at Microsoft got a bit tired of all that exercise and decided to work on a project that will let you control your devices just by flexing your muscles. Prototype image:
The idea is to figure out which finger is interacting with the device, and even the posture of the hand doing the interacting. According to a recent patent filing, the geniuses at Microsoft Research are building the project on electromyography, or EMG, a method of detecting the electrical potential created by muscle movement. A more technical definition comes from their research paper:
EMG measures the electrical signals used by the central nervous system to communicate motor intentions to muscles, as well as the electrical activity associated directly with muscle contractions.
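Microsoft hasn't published code for this, but the signal chain it describes is fairly standard for surface EMG. As a rough, hypothetical sketch in Python (the sampling rate, 20–450 Hz band, electrode count, and function names below are all assumptions, not details from the paper), the raw forearm signals would be band-pass filtered and summarized into per-electrode amplitude features:

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000  # assumed sampling rate (Hz) of the forearm sensors

def bandpass(signal, low=20.0, high=450.0, fs=FS, order=4):
    """Keep the band where surface EMG energy typically lives (assumed range)."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, signal)

def rms_features(window):
    """Root-mean-square amplitude per channel for one analysis window.

    `window` is shaped (n_samples, n_channels); the result is one feature
    per electrode, a rough proxy for how strongly each muscle contracts.
    """
    filtered = np.apply_along_axis(bandpass, 0, window)
    return np.sqrt(np.mean(filtered ** 2, axis=0))

# Example: 200 ms of placeholder data from 6 forearm electrodes.
window = np.random.randn(200, 6)
print(rms_features(window))
```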
The research is still in its early stages: sensors are fixed to your forearm, and by making certain gestures with your fingers you can control devices. The prototype can trigger functions based on (a rough sketch of how the decoding might work follows this list):
- Sensing the pressure exerted by a finger.
- Finger-specific functions.
- Posture-dependent actions, even when your finger is not in direct contact with the device.
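Turning those feature vectors into "which finger, and how hard" decisions is a classification problem. The sketch below is purely illustrative (the training data is random placeholder, and the scikit-learn SVM is my assumption of a reasonable classifier, not necessarily what Microsoft Research used):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical training set: one feature vector per recorded gesture
# (e.g. RMS features like the ones sketched above), labelled with the
# finger that pressed and whether the press was light or firm.
X = np.random.rand(120, 6)                       # 120 samples x 6 electrodes
fingers = np.random.choice(["index", "middle", "ring", "pinky"], 120)
pressure = np.random.choice(["light", "firm"], 120)

# One classifier per question: which finger, and how much pressure.
finger_clf = make_pipeline(StandardScaler(), SVC()).fit(X, fingers)
pressure_clf = make_pipeline(StandardScaler(), SVC()).fit(X, pressure)

def decode(feature_vector):
    """Map one feature vector to a (finger, pressure) decision."""
    fv = feature_vector.reshape(1, -1)
    return finger_clf.predict(fv)[0], pressure_clf.predict(fv)[0]

print(decode(np.random.rand(6)))
```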
According to the Microsoft Research team, the technology as demonstrated allows the computer to determine which finger of the hand has been used and then perform certain functions. For example, to move a photo from one frame to another on the Microsoft Surface, you can pinch the image, lift your hand while maintaining the pinch posture, and drop the image onto the other frame. It doesn't sound half as cool until you see the video.
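For the curious, that pinch-carry-drop interaction is easy to imagine as a tiny state machine sitting on top of the gesture decoder. The names and inputs below are invented for illustration; the real Surface demo surely does something more sophisticated:

```python
from enum import Enum, auto

class DragState(Enum):
    IDLE = auto()
    HOLDING = auto()   # pinch detected, photo attached to the hand

def step(state, pinch_active, over_target):
    """Advance the drag-and-drop state machine for one sensor frame.

    `pinch_active` would come from the EMG gesture decoder; `over_target`
    from whatever tracks the hand's position over the destination frame.
    Returns the new state and an optional action.
    """
    if state is DragState.IDLE and pinch_active:
        return DragState.HOLDING, "pick up photo"
    if state is DragState.HOLDING and not pinch_active:
        action = "drop photo here" if over_target else "cancel drag"
        return DragState.IDLE, action
    return state, None

# Example: pinch over the source frame, carry, release over the target frame.
state = DragState.IDLE
for pinch, over in [(True, False), (True, False), (False, True)]:
    state, action = step(state, pinch, over)
    if action:
        print(action)
```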
Here’s a demo video, courtesy of Microsoft Research:
Muscle-Computer Interfaces (muCIs) – MSR
Electromyography – Wikipedia