Give your computer the finger: Touch-screen tech is coming of age

The WIMP human-computer interface may have an uninspiring name, but Windows, Icons, Menus and Pointing devices have dominated computing for some 15 years. The keyboard, mouse and display screen have served users extraordinarily well.

But now the hegemony of WIMP may be coming to an end, say developers of technologies based on human touch and gesture. For evidence, look no further than Apple's one-year-old iPhone. From a human-interface point of view, the combined display and input capabilities of the iPhone's screen, which can be manipulated by multiple fingers in a variety of intuitive touches and gestures, are nothing short of revolutionary, researchers say.

The iPhone isn't the only commercial device to take the human-computer interface to a new level. The Microsoft Surface computer puts input and output devices in a large, table-top device that can accommodate touches and gestures and even recognize physical objects laid on it. And the DiamondTouch Table from Mitsubishi Electric Research Laboratories is a touch- and gesture-activated display that supports small-group collaboration. It can even tell who is touching it.

These devices point the way toward an upcoming era of more natural and intuitive interaction between human and machine. Robert Jacob, a computer science professor at Tufts University in Medford, Mass., says touch is just one component of a booming field of research on "post-WIMP interfaces," a broad coalition of technologies he calls "reality-based interaction."

Those technologies include virtual reality, context-aware computing, perceptual and affective computing and tangible interaction -- in which physical objects are recognized directly by a computer. This ascendance of reality-based interaction is driven by four "real-world themes," he says -- naïve physics, body awareness, environmental awareness, and social awareness. (See the sidebar "Getting real.")

"What's similar about all these interfaces is that they are more like the real world," Jacob says. For example, the iPhone "uses gestures you know how to do right away," such as touching two fingers to an image or application, then pulling them apart to zoom in or pinching them together to zoom out. (These actions have also found their way into the iPod Touch and the track pad of the new MacBook Air.)

"Just think of the brain cells you don't have to devote to remembering the syntax of the user interface. You can devote those brain cells to the job you are trying to do," Jacob adds. In particular, he says, the ability of the iPhone to handle multiple touches at once is a huge leap past the single-touch technology that dominates in traditional touch applications such as ATM machines.

The long nose

Although most people hadn't heard of multi-touch until the iPhone's debut last year, Microsoft researcher Bill Buxton says he was experimenting with multi-touch computer technology at the University of Toronto in the early 1980s.

[Photo: Bill Buxton demonstrates "The Active Desk," developed in 1992 at the University of Toronto.]

Indeed, he says, touch technology may be following a path similar to that of the mouse, which was co-invented by Douglas Engelbart in 1965 but did not reach a critical mass until the introduction of Windows 95 some 30 years later. Buxton calls these decades-long ramp-ups the "long nose of innovation," and he says they are surprisingly common.

Touch now may be where the mouse was in about 1983, Buxton says. "People now understand there is something interesting here that's different. But I don't think we yet know what that difference could lead to. Until just one or two years ago there was a real separation between input devices and output devices. A display was a display and a mouse was a mouse."

But now, he says, the idea that a screen can be bidirectional is on the cusp of catching on. "So now not only can my eye see the pixels, but the pixels can see my finger."
