This article is reprinted from the "Technology and the Future" column of the July 2010 issue of New Computer CHIP.
As Microsoft introduces new technologies in the coming months, we will continue to reprint articles promptly to share these exciting developments with you. Opening a car trunk with a flex of your fingers, driving a wheelchair around the yard with your tongue, having the latest news and information pushed straight to your eyes through a contact lens: scenes that once belonged only to science fiction are already taking shape in Microsoft's labs.
Enabling people to communicate better with computers and other machines has long been a problem that engineers work on.
With the growth of the Internet, hardware performance and software functionality have advanced rapidly. Yet the keyboard and mouse, still the most important tools for communicating with a computer, have seen no substantive technical breakthrough in many years and lag far behind the pace of hardware and software development. That situation is expected to improve in the near future. Recently, CHIP once again visited Microsoft Research Asia and interviewed Desney Tan and his colleagues in the HCI (human-computer interaction) group, who demonstrated several state-of-the-art, very cool human-machine interaction technologies.

Opening the trunk with your muscles
Have you ever wondered how to control a computer when your hands are occupied? Suppose both hands are full of shopping baskets and you want to open the trunk of your car.
To achieve this, we do not need to strap a sensing device to our heads. Besides EEG, the human body offers many other output signals, some of which are more convenient and more accurate to capture than brain waves. When the brain wants the body to act, it sends signals to the muscles, and the muscles produce electrical signals as they respond; these signals are stronger than brain signals and therefore easier to capture. The researchers found that when we move a finger or flex a wrist, the resulting muscle activity can be sensed and translated into control commands for a machine, so they designed a muscle-computing product. They have already built a working sample that connects to a PC and controls games similar to Guitar Hero: simply moving a finger or flicking the wrist emulates fretting or strumming a chord on the guitar.
The muscle-computing device designed by Microsoft's experts is an armband containing 12 signal sensors; strap it on, run a simple calibration, and it is ready to use. The current armband can sense the most salient hand movements, such as squeezing the thumb against another finger or swinging the wrist from side to side. Accurately distinguishing the action of each individual finger is the hardest part of the technology, and the sensing and discrimination algorithms must be optimized to avoid false positives.
Moreover, no finger is driven by a single muscle, so a gesture can only be recognized by interpreting a whole group of muscles. And because the electrical sensors sit on top of the skin while the muscles lie beneath it, the skin itself acts as a barrier that reduces recognition accuracy. To address these issues, Microsoft's researchers use machine learning to continuously refine the recognition and calibration algorithms. The product's design also received a great deal of thought. For example, it is difficult for users to align each sensor precisely with the right muscle, so the researchers arranged the sensors in a ring around the arm: after connecting to the computer, the user simply moves a few fingers as prompted, and the software works out which sensor corresponds to which muscle. The chip used for data analysis in the latest muscle-computing armband is tiny, only 7 mm × 7 mm. Combined with the sensors, a battery, and a Wi-Fi module, the armband communicates wirelessly with a computer or other electronic device. With a battery small enough to keep the armband a reasonable size, it can already run for 4 hours, and that is without power-consumption optimization; the researchers are working on keeping the sensors switched off between transmissions to save power. Of course, this technology is not limited to capturing signals from arm muscles; it can play a role wherever there are muscles. If needed, bending over, kicking, or even wiggling your toes could be recognized as a control command.
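The article does not describe the recognition algorithm in detail, but the calibrate-then-classify workflow it sketches can be illustrated with a toy nearest-centroid classifier over per-channel signal amplitudes. Everything here is an assumption for illustration: the RMS features, the gesture names, and the synthetic data are not from Microsoft's actual system.

```python
import math
import random

NUM_CHANNELS = 12  # one feature per EMG sensor in the hypothetical armband

def synth_window(active_channels, samples=50):
    """Fake raw data: active channels carry a strong signal, the rest weak noise."""
    return [[random.gauss(0.0, 1.0 if c in active_channels else 0.1)
             for _ in range(samples)]
            for c in range(NUM_CHANNELS)]

def rms_features(window):
    """Reduce each channel's samples to a single RMS amplitude."""
    return [math.sqrt(sum(s * s for s in ch) / len(ch)) for ch in window]

class NearestCentroidGestures:
    """Per-user calibration: average the feature vectors of a few labelled
    repetitions of each gesture, then classify new windows by the nearest
    centroid in feature space."""

    def __init__(self):
        self.centroids = {}

    def calibrate(self, gesture, windows):
        feats = [rms_features(w) for w in windows]
        self.centroids[gesture] = [
            sum(f[i] for f in feats) / len(feats) for i in range(NUM_CHANNELS)
        ]

    def classify(self, window):
        f = rms_features(window)
        return min(self.centroids, key=lambda g: sum(
            (a - b) ** 2 for a, b in zip(f, self.centroids[g])))
```

In this sketch, the calibration step the article describes (moving a few fingers as prompted) corresponds to feeding a handful of labelled windows per gesture into `calibrate`; a real system would use far richer features and a learned model.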
Within Microsoft, this technology has already spawned cross-team collaboration. For example, the researchers are trying to combine it with Microsoft's Surface: your interactions with the Surface tabletop could then break free of the flat touch screen, letting you lift objects off the desktop and toss them back again.

Turning skin into a touch screen
Your skin can become a keypad: researchers at Microsoft Research are working to turn human skin into a large touch screen.
This skin-input technology, as the Microsoft researchers call it, also exists as a working sample. It is structurally similar to the muscle-computing product and likewise takes the form of an armband; with the band strapped on, tapping different spots of skin on the arm issues different commands.
In Microsoft's demo, a pico projector strapped to the upper arm projects an image onto the skin of the forearm, such as a phone dial pad; tap a projected key with a finger, and the armband senses the tap and translates it into the corresponding key press. The technology exploits the way vibration waves carry energy: when you tap the skin, footage from a high-speed camera played back in slow motion shows the energy of the tap spreading across the skin like ripples on water. Analyzing these vibrations with sensors reveals that taps at different locations on the arm return different wave signatures, so a special algorithm can work out where the tap landed. Although the principle is easy to understand, the implementation runs into many practical problems.
First, tapping the same spot with a different finger or at a different angle produces a different wave, so each arm has to be modeled and the algorithm continuously improved to raise the recognition rate. In addition, the sensing module currently cannot be too far from the tap location, because the wave weakens significantly as it travels, especially across joints. The current skin-input prototype uses 10 sensors tuned to vibration waves of different frequencies; in the future, much of the work now done by frequency-specific hardware could be completed in software, allowing a smaller armband with fewer sensors. With 20 keys projected onto the forearm by the pico projector, recognition is already completely accurate; in other words, the current skin-input technology can distinguish taps spaced 1 cm to 1.5 cm apart.
Like the muscle-computing prototype, the skin-input product must be recalibrated before each use. The researchers are trying to use the projector and camera to give the device automatic position sensing and calibration, so that no correction step is needed after putting it on. Another interesting finding is that body fat interferes with wave transmission: the fatter the arm, the weaker the signal. Actual testing shows that the sensor still works correctly even on a very overweight user, only with reduced accuracy. It seems the first step toward enjoying the human-computer interfaces of the future is keeping your weight under control.
Beyond taps on the skin, skin-input technology can also reproduce features of the muscle-computing technology; pinching two fingers together, for example, can be recognized as well.
The researchers are also trying to integrate the two technologies into a single armband for more comprehensive functionality, and in the future they plan to embed an accelerometer in the band so that movement can be sensed even better.

Controlling a wheelchair with your tongue
Using different parts of the human body to convey interaction signals can be a great help to people with physical disabilities.
Consider a paralyzed patient: even someone who cannot move the rest of the body is still likely to retain control of the tongue, which is why most completely paralyzed patients can still speak normally. Since people have such fine control over their tongues, the tongue is a natural channel for human-computer signals. To that end, a researcher at Microsoft Research launched the tongue-computing project. The principle is relatively simple: a sensing module held in the mouth contains four optical sensors that track the tongue's movements. Each sensor measures the distance from the tongue to a specific location, so moving the tongue can be translated into commands that control a device.
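The distance-to-command mapping described above can be sketched in a few lines. The sensor layout, threshold, and command names here are all illustrative assumptions, not details from the article.

```python
# Hypothetical layout: four optical distance sensors around the mouthpiece,
# each measuring how far away the tongue is. The command mapping is invented
# for illustration.
TRIGGER_CM = 0.5  # tongue closer than this fires the sensor's command
COMMANDS = {"left": "TURN_LEFT", "right": "TURN_RIGHT",
            "front": "FORWARD", "back": "STOP"}

def tongue_command(distances_cm):
    """Map the four distance readings to a wheelchair command, or None when
    the tongue is resting near the centre of the mouth."""
    nearest = min(distances_cm, key=distances_cm.get)
    if distances_cm[nearest] > TRIGGER_CM:
        return None  # tongue not close enough to any sensor: no command
    return COMMANDS[nearest]
```

A dead zone like the `None` case matters in practice, since the tongue must be able to rest without constantly issuing commands.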
With this controller, the prototype can already connect to a computer to play Tetris, or steer a wheelchair forward at a slow pace. The tongue-computing prototype is about the size of the tongue itself, runs on a built-in battery for roughly 3 hours, and transfers data wirelessly to computers and other devices over Bluetooth. Future optimization should lower its power consumption and shrink it further, perhaps even allowing it to be fitted directly onto the teeth like a dental appliance. The current prototype can sense 4 to 8 actions with 95% accuracy. The researchers hope to add more sensors so the product can recognize more tongue movements, and ultimately to sense what the wearer wants to say.
The user would not need to make a sound: by sensing tongue movements and changes in mouth shape alone, the device could know what the user wants to say, which would help many patients with speech impairments.

A display in your eye
Human-computer interaction has two sides: how people send information to computers and other devices, and how those devices deliver information back. Output technology has been developing relatively quickly; eyeglass displays and palm-sized projectors have already appeared. But Microsoft's researchers are not content to keep optimizing existing display products. Believing in starting with the hardest problem, they began researching contact-lens displays, reasoning that once this technology succeeds, the other problems will be solved along with it.
In a meeting room at Microsoft Research Asia, we saw a product called the bionic contact lens. This model, built six months ago, is indistinguishable in size from the contact lenses we wear every day; there is even a version small enough to cover only the pupil.
Although embedded with LED lights and an induction coil, the lens still uses a flexible material similar to today's contact lenses. If you did not see it with your own eyes, it would be hard to imagine that glowing LEDs and all kinds of wiring could be spread across something only 0.001 mm thick. The technology is still in its infancy and faces greater challenges than the other technologies described above. The first is manufacturing: mounting components on such a thin, flexible object is extremely difficult.
Microsoft is currently cooperating with the University of Washington, where a special laboratory produces the lenses. So far only red LEDs have been placed on a lens; the blue and green light-emitting modules are still relatively large and cannot yet fit into one. Even if the manufacturing difficulties are only a matter of time, heat is a harder problem. The latest bionic lens can already achieve a 16 × 16 dot-matrix display, and the pixel count is limited not by fabrication technology but by heat: with more pixels, the product runs hotter than the human eye can bear. In addition, because a contact lens sits directly on the eye, it is not easy for the eye to focus on content displayed on the lens. In tests with a rig that simulates the structure of the human eye, the display results were not very good: a test letter "e" appeared rather blurry. The last issue is power. A product this small cannot carry a battery, so the bionic lens must be powered wirelessly. The researchers currently use a wireless power scheme similar to the transit cards we use today: electromagnetic induction generates a current in a coil on the lens, and by controlling the transmitted power, the brightness of the lens's LEDs can be adjusted. Either way, there must be a power transmitter, and it cannot be too far from the lens; the ideal arrangement is to mount it on a pair of glasses, though that would undoubtedly cost the bionic lens much of its portability and invisibility appeal. In the future, the researchers hope to miniaturize the display LEDs and move them to the edge of the lens, reducing interference with the wearer's view of the outside world.
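The transit-card-style power scheme rests on Faraday's law: a sinusoidal current in the transmitter coil induces a voltage in the lens coil proportional to frequency, mutual inductance, and current. A back-of-envelope sketch, with every number invented for illustration (the article gives no electrical figures):

```python
import math

def peak_induced_emf_v(freq_hz, mutual_inductance_h, primary_peak_a):
    """Peak voltage induced in the receiving coil by a sinusoidal transmitter
    current I(t) = I0*sin(2*pi*f*t): |emf| = 2*pi*f * M * I0 (Faraday's law,
    emf = -M * dI/dt)."""
    return 2 * math.pi * freq_hz * mutual_inductance_h * primary_peak_a

# Illustrative numbers only: the 13.56 MHz carrier used by contactless
# transit cards, a tiny assumed mutual inductance for a lens-sized coil,
# and a modest transmitter current.
emf = peak_induced_emf_v(13.56e6, 2e-9, 0.1)
```

This also shows why the transmitter must stay close to the lens: the mutual inductance M, and with it the induced voltage, falls off steeply with coil separation.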
In addition, because the lens is in direct contact with tears, it could also collect medical information; with added sensors, it could help patients with eye diseases monitor their condition continuously. Overall, the bionic contact lens still faces many challenges, and a practical product is unlikely in the short term, but if the technology is developed successfully, it could radically change our current patterns of human-computer interaction.

The psychology of interactive interfaces
(Psychology provides important knowledge about the basic laws of human cognition and is of great significance to human-computer interaction. Researchers in the HCI group at Microsoft Research Asia share their experience in this area below. - Editor's note)

Human-computer interaction is not only about continuously improving the input and output devices of computers and other electronics. To a large extent, it requires designing interfaces that users readily accept, and the question of which interfaces are easier to use in practice is a psychological one: only an interface consistent with the normal workings of the human mind offers strong usability.
Therefore, two aspects must be considered when designing a human-computer interface. The first is how people perceive the interface; making the interface easy to recognize is fundamental, and features such as fault tolerance and reminders are very important because they ensure the interface is understood accurately. The second is feedback: a human-computer interface must be a system with feedback. Take a button press, for instance. After we press a key, the machine needs to tell us that the key has been pressed, which is why pressing a phone button produces a sound or a vibration as confirmation. Microsoft has long subjected its human-machine interfaces to systematic study; Word 2003, for example, went through at least 80 hours of usability testing.
We try to ensure that products such as Office and Windows Live require little time to learn, which shows these products already offer a good user experience. Improving the user experience is the goal of our research, and usability is one of our product development metrics, just as important as system stability: only products that reach the usability target go to market. The latest product of Microsoft Research Asia's psychology-informed work on human-computer interaction is an internal enterprise social network; the researchers hope it will give employees a better platform for sharing information and improve operational efficiency.
An enterprise social network effectively lowers the threshold for people to communicate; used properly, it can increase informal communication among employees and improve overall productivity.