CES: IBM, Emotiv show advances in virtual reality worlds
Technology allows users to control an avatar using brain signals transmitted wirelessly to a PC
Computerworld - LAS VEGAS -- Hundreds of products at the International Consumer Electronics Show (CES) here are devoted to new ways to input data to a PC or gaming console, including voice commands and gestures registered via video detection.
But another input method demonstrated at CES is the ability to wirelessly transmit the brain's electrical signals, including emotions and cognitions, from sensors on a person's head to a PC.
Emotiv Systems Inc., an IBM partner, demonstrated an alpha version of a neural input device that it plans to unveil as a consumer product at the Game Developers Conference in San Francisco next month.
Emotiv's working product name is the Emotiv Headset, which could sell for $200 to $300, similar to the cost of a high-end handheld game controller, said Patrick McGill, a spokesman for the San Francisco-based start-up.
The alpha version includes about a dozen sensors that pick up the brain's signals, which are transmitted via a 2.4-GHz wireless link, said Emotiv product engineer Marco Della Torre. He demonstrated the alpha version while wearing the sensors, which picked up his eye movements, eye blinks, smiles and frowns and relayed them to a PC and a large display at the Emotiv booth. Each facial gesture was quickly and accurately rendered on a large graphical representation of a face on the display.
In addition to the simpler facial expressions, Della Torre was able to transmit the brain's affective states, such as calm or excitement (which involves a group of facial movements), and even cognitions. The cognitions (conscious control) that Della Torre demonstrated were the ability to make an animated cube on the display move up or down or spin in space. He was able to train the software to interpret a cognition in less than 20 seconds.
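The article does not describe how Emotiv's training step works internally. A minimal sketch of the general idea, assuming a simple nearest-centroid classifier over short labeled signal windows (a common baseline for this kind of quick calibration; the labels and feature values below are entirely made up):

```python
# Illustrative sketch, NOT Emotiv's actual algorithm: calibrate on a few
# seconds of labeled feature windows, then classify new windows by which
# label's centroid (mean feature vector) they fall closest to.

def centroid(windows):
    """Element-wise mean of a list of equal-length feature vectors."""
    n = len(windows)
    return [sum(w[i] for w in windows) / n for i in range(len(windows[0]))]

def train(labeled_windows):
    """labeled_windows: {label: [feature_vector, ...]} -> {label: centroid}."""
    return {label: centroid(ws) for label, ws in labeled_windows.items()}

def classify(model, window):
    """Return the label whose centroid is closest in squared Euclidean distance."""
    def dist(label):
        return sum((a - b) ** 2 for a, b in zip(model[label], window))
    return min(model, key=dist)

# Toy calibration data: two hypothetical "cognitions" with made-up features.
model = train({
    "lift":    [[0.9, 0.1], [1.0, 0.2], [0.8, 0.0]],
    "neutral": [[0.1, 0.9], [0.0, 1.0], [0.2, 0.8]],
})
print(classify(model, [0.85, 0.15]))  # → lift
```

A calibration of this kind needs only a handful of labeled samples, which is consistent with the sub-20-second training time described above.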
While such capabilities might seem rudimentary, the control of the animated cube could eventually be extended to letting a user "think" an avatar in a virtual world into gesturing with its face or hands, shaking someone's hand, or even throwing a ball, Della Torre said. By comparison, in Second Life many avatar controls are already possible, including facial expressions, walking and even flying, but all must be entered via the keyboard.
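The contrast with keyboard-driven worlds can be sketched as a simple event dispatcher: detected headset events drive avatar actions directly instead of key bindings. The event names and avatar methods below are hypothetical illustrations, not Emotiv's or Second Life's actual APIs:

```python
# Hypothetical sketch: route detected mental/facial events straight to
# avatar actions, replacing the keyboard bindings a virtual world uses today.

class Avatar:
    def __init__(self):
        self.log = []
    def smile(self):      self.log.append("avatar smiles")
    def wave(self):       self.log.append("avatar waves")
    def throw_ball(self): self.log.append("avatar throws ball")

# Map each (made-up) detected event to an avatar action.
ACTIONS = {
    "smile_detected": Avatar.smile,
    "wave_intent":    Avatar.wave,
    "throw_intent":   Avatar.throw_ball,
}

def dispatch(avatar, event):
    """Route one detected headset event to the matching avatar action."""
    action = ACTIONS.get(event)
    if action:
        action(avatar)

a = Avatar()
for event in ["smile_detected", "throw_intent", "unknown_event"]:
    dispatch(a, event)
print(a.log)  # → ['avatar smiles', 'avatar throws ball']
```

Unrecognized events are simply ignored, which is a reasonable default when a classifier occasionally emits low-confidence or unknown labels.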