
New computer interfaces challenge touch screens

Innovative interfaces offer flexibility in device interaction

By Nick Barber
November 15, 2011 01:07 PM ET

IDG News Service - Touch screens could become extinct if researchers pioneering new human-computer interfaces have anything to say about it. From brain-controlled machines to gesture-driven devices, a range of technologies in development may find their way into everyday electronic devices.

Several conferences this year have offered a glimpse of innovative interfaces and what the future may hold.

Touch screens are limited in the feedback they can give a user. The screen may vibrate when tapped, but that's about all it can do. At this year's Computer Human Interaction (CHI) conference in Vancouver, British Columbia, in May, a researcher showed a way to completely change the feel of a screen, making it slippery at some times and sticky at others.

The prototype has four actuators that vibrate the screen.

"This is actually the same technology used in many cell phones or other devices, but it runs at a higher frequency so you don't feel the vibration itself," said Vincent Levesque, a post-doctoral fellow from the University of British Columbia. "It pushes your finger away from the piece of glass, a bit like an air hockey table."

Levesque's team had a demonstration set up with basic file folders on screen. When a folder is selected, the screen becomes slippery; when the folder is dragged over another folder or the trash, the screen becomes sticky.
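A minimal sketch of how such a demo might map drag-and-drop states to friction levels is below. The set_friction() call, the event names and the friction values are hypothetical stand-ins for whatever driver interface the prototype actually exposes; this is an illustration of the idea, not the team's code.

    # Illustrative sketch: mapping drag-and-drop events to surface friction.
    # set_friction() and the event names are hypothetical placeholders.

    SLIPPERY = 0.2   # low friction: actuators run at a high, imperceptible frequency
    STICKY = 0.9     # high friction: actuators mostly off, the glass feels grippy

    def set_friction(level: float) -> None:
        """Placeholder for the actuator driver call (hypothetical)."""
        print(f"surface friction set to {level:.1f}")

    def on_drag_event(event: str) -> None:
        """Switch the screen's feel based on what the finger is doing."""
        if event == "folder_selected":
            set_friction(SLIPPERY)      # a picked-up folder glides freely
        elif event in ("over_folder", "over_trash"):
            set_friction(STICKY)        # drop targets resist the finger
        else:
            set_friction(SLIPPERY)

    # Example sequence matching the demo described above
    for e in ["folder_selected", "over_folder", "dropped"]:
        on_drag_event(e)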

The prototype occupied a sizeable section of the table on which it sat. Wires protruded and circuit boards were visible, making it too bulky to integrate into a mobile device. The system uses lasers to determine the position of the finger. As the team continues work on the project, it hopes to shrink the system and replace the lasers with a capacitive touch screen.

At the CHI conference, university students and research groups came up with most of the projects on display and shared them with potential employers who could license the technology and invest in developing it.

Texas A&M University's Interface Ecology Lab favored gestures over touch, creating a gesture-controlled system called ZeroTouch. It looks like an empty picture frame, and its edges are lined with 256 infrared sensors pointing toward the center. The frame is connected to a computer and the computer to a digital projector.

"I like to consider it an optical force field," said Jonathan Moeller, a research assistant in the lab.

When the spiderweb of light created by the sensors is broken, the computer interprets the size and depth of the break and displays it as a brushstroke. If just a pencil breaks the beam, the brushstroke will be thin. If an entire arm or head breaks the beam, the stroke will be thick.
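One way to picture the mapping is the sketch below, which turns the fraction of blocked beams into a stroke width. The linear mapping, the pixel limits and the example occlusion patterns are assumptions for illustration only, not the ZeroTouch implementation.

    # Illustrative sketch: converting an occlusion across an IR sensor frame
    # into a brush width. The mapping and constants are assumed, not ZeroTouch's.

    NUM_SENSORS = 256          # sensors lining the frame, per the article
    MIN_BRUSH_PX = 2           # thinnest stroke, e.g. a pencil tip (assumed)
    MAX_BRUSH_PX = 120         # widest stroke, e.g. a whole forearm (assumed)

    def brush_width(blocked: list[bool]) -> int:
        """Map the fraction of blocked beams to a stroke width in pixels."""
        fraction = sum(blocked) / len(blocked)
        return round(MIN_BRUSH_PX + fraction * (MAX_BRUSH_PX - MIN_BRUSH_PX))

    # A pencil might block a couple of beams; an arm might block dozens.
    pencil = [i < 2 for i in range(NUM_SENSORS)]
    arm = [i < 60 for i in range(NUM_SENSORS)]
    print(brush_width(pencil))  # narrow stroke
    print(brush_width(arm))     # broad stroke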
