Computerworld - Increased computing power doesn't just make for better graphics; it opens up new ways to interact with computers. Take, for example, the EyePoint system developed by Stanford University doctoral researcher Manu Kumar.
EyePoint uses a four-step process that incorporates a user's hands and eyes to increase accuracy and eliminate the false positives that come from using eye movements alone. The technique also makes interaction feel more natural for a broader range of users.
"Using gaze-based interaction techniques makes the system appear to be more intelligent and intuitive to use," says Kumar. "Several users have reported that it often felt like the system was reading their mind."
The result makes eye-gaze a viable alternative to the mouse for everyday pointing and selection tasks, such as Web surfing.
Here's how it works: While looking at a screen, the user presses a hot key on the keyboard, magnifying the area being viewed. The user then looks at the link within the enlarged area and releases the hot key, thereby activating the link.
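The press-magnify-look-release sequence can be sketched in a few lines. This is not Kumar's implementation; the function names, the zoom factor, and the pixel coordinates below are all illustrative assumptions:

```python
# A minimal sketch of the EyePoint press-magnify-release flow.
# All names and values here are hypothetical, not from the actual system.

ZOOM = 4  # assumed magnification factor

def magnify_region(gaze, half=50):
    """On hot-key press: return the screen rectangle around the gaze
    point (left, top, right, bottom) that will be enlarged by ZOOM."""
    x, y = gaze
    return (x - half, y - half, x + half, y + half)

def resolve_target(region, gaze_in_zoom):
    """On hot-key release: map a gaze point inside the zoomed view back
    to true screen coordinates, where the click is delivered."""
    left, top, _, _ = region
    zx, zy = gaze_in_zoom
    return (left + zx / ZOOM, top + zy / ZOOM)

# First look: the tracker reports roughly (400, 300); enlarge that area.
region = magnify_region((400, 300))
# Second look, inside the 4x view, lands at (220, 180) of the zoomed image.
print(resolve_target(region, (220, 180)))  # -> (405.0, 295.0)
```

The design point is that any gaze error made while looking at the enlarged view shrinks by the zoom factor when mapped back to true screen coordinates, which is what lets an imprecise tracker hit a precise link.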
Eye tracking, which has been around for decades, typically uses infrared devices embedded into a headset or a monitor frame. The devices track the centers of the user's pupils and then calculate which part of the screen the user is viewing.
This method, however, has been plagued by errors, limiting its use primarily to people with disabilities that prevent them from using a keyboard and mouse.
Eye trackers are accurate to about 1 degree of visual angle. When looking at a 1,280-by-1,024-pixel, 96-dpi screen at a distance of 20 inches, this equates to a 33-pixel spread in any direction from where the user is looking. That's not accurate enough to pinpoint a link.
"What is really exciting is that the processing power of today's computers is completely changing the kinds of things we can use for computer interfaces," says Ted Selker, associate professor at the MIT Media and Arts Technology Laboratory and director of the Context Aware Computing Lab. "Things like eye tracking are using channels of communication that literally were unavailable to interface designers even five years ago."
"[Kumar's] approach -- using eye movement in a subtle, lightweight way, rather than as a direct mouse substitute -- is exactly the right way to go," says Robert Jacob, a professor of computer science at Tufts University in Medford, Mass.
Selker says eye tracking might become a standard computer interface within the next five years. For now, the primary obstacle is the high cost of eye-tracking hardware, although mass adoption of the technology would drive those costs down.