‘Imaginary’ interface could replace real thing

Researchers are experimenting with a new interface system for mobile devices that could replace the screen and even the keyboard with gestures supported by our visual memory.

Called Imaginary Interfaces, the German project uses a small, chest-mounted computer and camera to detect hand movements. Unlike Tony Stark in "Iron Man," who manipulates holographic elements in his lab with his hands, users conjure up their own imaginary set of graphical interfaces. People can, for example, draw shapes by hand and select points in space that carry programmed functions, such as a power switch or a "send" key.

This interface could allow people to use gestures during phone calls, much as they do in face-to-face conversations, while eliminating traditional hardware elements.

"We definitely envision a system like this replacing all input for mobile devices," said Sean Gustafson, a research student at the Hasso Plattner Institute at Potsdam University in Germany and lead author of an upcoming study on the Imaginary Interfaces concept.

Button-pushers, screen watchers
The standard way one operates a cell phone or a computer, of course, involves using a touchpad, mouse or buttons to select options electronically displayed on a screen.

These devices cannot get much smaller, Gustafson and his co-authors contend, because screens and buttons require a minimum size to remain viewable, touchable and hence usable.

Many attempts to advance beyond keyboards and mice have focused on gestures.
Yet these gesture-based interfaces have still relied on some sort of "real" visual reference, meaning one that other people can see and that does not exist solely in a user's mind: Think of "Minority Report"-style screens that people manipulate rather like conductors of an orchestra, or gaming on a Nintendo Wii.

In place of the screens found in these setups, some interface concepts use head-mounted projectors that display imagery on a wall or a hand, say, in order to provide a frame of reference. Sixth Sense, a project out of MIT, and Brainy Hand from the University of Tokyo are two such examples.

With Imaginary Interfaces, however, there is nothing to see; short-term visual memory instead serves as the reference, and like mimes, people can mentally record and "touch" these make-believe elements.

"People are able to interact spatially without seeing what it is that they create," said Patrick Baudisch, a professor of computer science at the Hasso Plattner Institute and Gustafson's teacher.

The un-imaginary device
To generate its invisible interface, the Imaginary Interfaces device combines a camera and a computer to see and then interpret gestures.

For now, the device measures about 2 inches by 2 inches and attaches to the clothing on a user's chest. Its makers envision shrinking it to the size of an unobtrusive button.

A ring of light-emitting diodes (LEDs) around the camera beams out invisible infrared light. The camera sees this light reflected off the user's nearby, gesturing hands, while the distant background remains unlit.
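In software terms, this amounts to separating bright, LED-lit hands from a dark background. A minimal sketch of that idea, assuming an 8-bit grayscale frame from an infrared-filtered camera and the OpenCV library, might look like the following; it is an illustration of brightness thresholding, not the researchers' actual code:

```python
# Illustrative sketch: separate LED-lit hands from the dark background by brightness.
# Assumes an 8-bit grayscale frame from an infrared camera; not the project's code.
import cv2
import numpy as np

def segment_hands(ir_frame: np.ndarray, brightness_cutoff: int = 200) -> np.ndarray:
    """Return a binary mask of pixels bright enough to be reflected LED light."""
    # Nearby hands reflect the infrared LEDs strongly; the distant background stays dark.
    _, mask = cv2.threshold(ir_frame, brightness_cutoff, 255, cv2.THRESH_BINARY)
    # Remove speckle so later contour and fingertip detection is more stable.
    mask = cv2.medianBlur(mask, 5)
    return mask
```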

To operate Imaginary Interfaces, people use two basic commands. Making an 'L' shape with the non-dominant hand (typically the left) 'opens up' a two-dimensional plane where the finger-tracing interaction will take place; in this example, the L acts as the lower-left corner of the plane.

Users can 'pinch' with the dominant hand to select a point in space on this plane that can serve a function. As an easy frame of reference, users can visualize a grid whose units are the lengths of the index finger and thumb in the L gesture, serving as the 'Y' and 'X' coordinates, respectively. Pinching at approximately 3, 2 (three finger-lengths up and two thumb-lengths over) could press a virtual button.
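A rough sketch of that coordinate scheme, with made-up data structures and names (the study does not publish its code), could look like this:

```python
# Illustrative sketch: map a pinch position into the grid implied by the 'L' hand.
# The data structures and unit conventions here are assumptions for illustration.
from dataclasses import dataclass

@dataclass
class LFrame:
    origin: tuple[float, float]   # corner where the thumb meets the index finger
    thumb_len: float              # one unit along the horizontal (X) axis
    finger_len: float             # one unit along the vertical (Y) axis

def to_grid(frame: LFrame, pinch: tuple[float, float]) -> tuple[float, float]:
    """Convert a pinch point (camera coordinates) into thumb- and finger-length units."""
    x_units = (pinch[0] - frame.origin[0]) / frame.thumb_len
    y_units = (pinch[1] - frame.origin[1]) / frame.finger_len
    return x_units, y_units
```

A pinch that maps to roughly two thumb-lengths over and three finger-lengths up would then correspond to the "3, 2" example above and trigger whatever function is bound to that spot.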

Other more sophisticated methods of interfacing via one's imagination are in the works. "We are exploring how users can sketch interfaces, then use them," said Baudisch. "It has a cartoony quality to it."

Such a "draw your own interface" approach would have advantages, Gustafson said. "If the user places the user interface elements themselves, then they will remember, visually and proprioceptively, the location for later use," he said. (Proprioception refers to the sense of our body parts and their relation to one another in space.) "If they ever forget the location, they can just redraw it."
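Continuing the sketch above, a "draw your own interface" could amount to remembering where the user placed each element, in hand-length units, and matching later pinches against those spots. The names and tolerance below are assumptions for illustration, not the researchers' implementation:

```python
# Illustrative sketch: remember user-placed elements and match later pinches to them.
registered: dict[str, tuple[float, float]] = {}

def place_element(name: str, grid_point: tuple[float, float]) -> None:
    """Record an element (e.g. a 'send' key) at the grid spot the user chose."""
    registered[name] = grid_point

def find_element(pinch_point: tuple[float, float], tolerance: float = 0.5):
    """Return the name of the nearest registered element within tolerance, else None."""
    best_name, best_dist = None, tolerance
    for name, (x, y) in registered.items():
        dist = ((pinch_point[0] - x) ** 2 + (pinch_point[1] - y) ** 2) ** 0.5
        if dist <= best_dist:
            best_name, best_dist = name, dist
    return best_name
```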

Applications easy to imagine
This ability to create simple sketches on the fly opens up a range of new application scenarios, and could make phone conversations more like person-to-person interactions that often involve gestures.
"I would love to reclaim the hand gestures that are missing from normal telephone conversations," Gustafson told TechNewsDaily.

"We use our hands in conversation to, amongst other things, transmit spatial information that is hard to get across otherwise," Gustafson continued. "For example, driving directions and the like are much easier to get across with some simple hand movements. It would be wonderful to re-enable this communication channel for telephone conversations."

Imaginary Interfaces still needs work. The infrared detection of hand gestures by the camera does not function well outdoors in sunlight, for example.

And Imaginary Interfaces would not be detailed or precise enough for engineering schematics or the like. "Architectural drawings require a large amount of precision that is probably not possible without a high-res input and output channel," said Gustafson. "I like to think this system is best for 'napkin drawings' — simple visual representations of ideas that aid conversation."

A study about Imaginary Interfaces that includes several user trials will be presented at the 23rd Association for Computing Machinery Symposium on User Interface Software and Technology in New York this coming October.
