Computerworld - Increased computing power doesn't just make for better graphics; it opens up new ways to interact with computers. Take, for example, the EyePoint system developed by Stanford University doctoral researcher Manu Kumar.
EyePoint uses a four-step process that combines the user's hands and eyes to increase accuracy and eliminate the false positives that come from relying on eye movements alone. The technique also brings a more natural mode of interaction to a broader range of users.
"Using gaze-based interaction techniques makes the system appear to be more intelligent and intuitive to use," says Kumar. "Several users have reported that it often felt like the system was reading their mind."
The result makes eye gaze a viable alternative to the mouse for everyday pointing and selection tasks, such as Web surfing.
Here's how it works: While looking at a screen, the user presses a hot key on the keyboard, magnifying the area being viewed. The user then looks at the link within the enlarged area and releases the hot key, thereby activating the link.
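The press-magnify-look-release loop described above can be sketched in code. This is an illustrative sketch only: the tracker and screen interfaces below are hypothetical stand-ins, not the actual EyePoint or eye-tracker API, and the zoom factor is an assumption.

```python
class FakeTracker:
    """Stand-in for an eye tracker that reports the current gaze point."""
    def __init__(self):
        self.point = (0, 0)
    def gaze(self):
        return self.point

class FakeScreen:
    """Records magnify/activate calls so the flow can be demonstrated."""
    def __init__(self):
        self.log = []
    def magnify_region(self, x, y, factor):
        self.log.append(("magnify", x, y, factor))
    def activate_at(self, x, y):
        self.log.append(("activate", x, y))
    def restore(self):
        self.log.append(("restore",))

class EyePointFlow:
    ZOOM = 4  # assumed magnification factor, not from the article

    def __init__(self, tracker, screen):
        self.tracker, self.screen = tracker, screen
        self.zoomed = False

    def on_hotkey_down(self):
        # Hot key pressed: magnify the region around the current gaze
        # point, so the tracker's error becomes small relative to link size.
        x, y = self.tracker.gaze()
        self.screen.magnify_region(x, y, self.ZOOM)
        self.zoomed = True

    def on_hotkey_up(self):
        # Hot key released: activate whatever now lies under the refined
        # gaze point inside the enlarged view, then restore the screen.
        if self.zoomed:
            x, y = self.tracker.gaze()
            self.screen.activate_at(x, y)
            self.screen.restore()
            self.zoomed = False

tracker, screen = FakeTracker(), FakeScreen()
flow = EyePointFlow(tracker, screen)
tracker.point = (640, 400)   # user looks near a link
flow.on_hotkey_down()        # key down: region magnified
tracker.point = (655, 412)   # user fixates the link in the zoomed view
flow.on_hotkey_up()          # key up: link activated
print(screen.log)
```

The key design point is that no click ever fires on gaze alone; the hot key gates every action, which is what suppresses false positives.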
Eye tracking, which has been around for decades, typically uses infrared devices embedded into a headset or a monitor frame. The devices track the centers of the user's pupils and then calculate which part of the screen the user is viewing.
This method, however, has been plagued by errors, limiting its use primarily to people with disabilities that prevent using a keyboard and mouse.
Eye trackers are accurate to about 1 degree of visual angle. When looking at a 1,280-by-1,024-pixel, 96-dpi screen at a distance of 20 inches, this equates to a 33-pixel spread in any direction from where the user is looking. That's not accurate enough to pinpoint a link.
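The article's 33-pixel figure follows from simple trigonometry: an angular error subtends an on-screen distance of roughly the viewing distance times the tangent of the angle, converted to pixels via the screen's dpi. A short sketch (the function name is illustrative):

```python
import math

def gaze_error_pixels(accuracy_deg, viewing_distance_in, dpi):
    """Convert an eye tracker's angular accuracy into an on-screen pixel spread.

    accuracy_deg        -- tracker accuracy in degrees of visual angle
    viewing_distance_in -- eye-to-screen distance in inches
    dpi                 -- screen pixel density in pixels per inch
    """
    # On-screen size of the error cone: distance * tan(angle),
    # then inches -> pixels via dpi.
    spread_in = viewing_distance_in * math.tan(math.radians(accuracy_deg))
    return spread_in * dpi

# The article's figures: 1 degree accuracy, 20-inch distance, 96-dpi screen.
print(gaze_error_pixels(1.0, 20, 96))  # roughly 33.5 pixels
```

Since typical hyperlink text is well under 33 pixels tall, the raw gaze point cannot reliably distinguish one link from its neighbors, which is why EyePoint magnifies the region first.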
"What is really exciting is that the processing power of today's computers is completely changing the kinds of things we can use for computer interfaces," says Ted Selker, associate professor at the MIT Media and Arts Technology Laboratory and director of the Context Aware Computing Lab. "Things like eye tracking are using channels of communication that literally were unavailable to interface designers even five years ago."
"[Kumar's] approach -- using eye movement in a subtle, lightweight way, rather than as a direct mouse substitute -- is exactly the right way to go," says Robert Jacob, a professor of computer science at Tufts University in Medford, Mass.
Selker says eye tracking might become a standard computer interface within the next five years. For now, the primary obstacle is the high cost of eye-tracking hardware, although mass adoption of the technology would drive those costs down.