I, Coach

Someday, robots will do more than vacuum your floors. They'll train you and advise you -- and maybe even help out with the cooking.

Takeo Kanade is a roboticist, but his work extends far beyond the C-3PO-like humanoids that often come to mind when one thinks of robots. He has been a pioneer in computer vision, smart sensors, autonomous land and air vehicles, and medical robotics. Kanade, a professor of computer science and robotics at Carnegie Mellon University in Pittsburgh, recently told Computerworld that people's notions of what robots can and should do will change. Robots will serve as coaches and advisers, not so much replacing human labor as enhancing it.

What's coming in human-computer interfaces? The trend toward computer vision is clear, and it will accelerate. In 10 years, I wouldn't be surprised to see computers recognizing certain levels of emotions, expressions, gestures and behaviors, all through vision.

What is the quality-of-life technology that you are working on?

Intelligent systems [that incorporate robotics] have been developed over the past couple of decades, mostly for the military, space exploration, hazardous environments and entertainment. I think they can do better for our daily lives, especially for older people and people with disabilities. The systems range from small devices that you carry, to small mobile robots, to the whole environment: home, streets, the community. The most important thing is for these systems to understand what the human wants to do, or is about to do, and then help them accomplish those tasks.

How could a computer know someone's intent? What I'm advocating right now is what I call inside-out vision. We think of putting cameras in the environment to observe you, which I call outside-in vision. But people don't like being observed. And, technically speaking, it's difficult, because it's important to know what you are looking at in order to know what you are trying to do, and the things you are looking at up close tend to be occluded by your body.

So the idea is to do it the other way and put sensors on you, looking out. They could be small cameras and maybe sound recorders. The computer then should be able to recognize objects that you are looking at, like a door you are approaching. [Because] the computer is looking from your viewpoint, it can understand what you are trying to do. We can put 1TB of memory on a small device with all the images of your home and neighborhood, all the places you tend to go and the routes you drive.
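The place recognition Kanade describes can be thought of as matching the wearable camera's current frame against a pre-built index of imagery from the places you frequent. A minimal sketch of that idea, assuming a toy setup where each stored place is summarized by a small feature vector (such as a color histogram) and the current frame is matched to the nearest one; the place names and feature values here are purely illustrative:

```python
# Toy "inside-out" place recognition: match the current camera frame's
# features against a stored index of familiar places. In a real system
# the features would come from actual image descriptors, not hand-typed
# vectors, and the index would hold far more than three places.
import math

# Hypothetical pre-built index: place label -> feature vector
place_index = {
    "front door": [0.9, 0.1, 0.2],
    "kitchen":    [0.2, 0.8, 0.3],
    "driveway":   [0.4, 0.3, 0.9],
}

def recognize_place(frame_features, index):
    """Return the stored place whose features are closest to the frame."""
    def distance(a, b):
        # Euclidean distance between two feature vectors
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(index, key=lambda place: distance(index[place], frame_features))

# A frame whose features resemble the stored "front door" image
print(recognize_place([0.85, 0.15, 0.25], place_index))  # front door
```

With the terabyte of stored imagery Kanade mentions, the same nearest-neighbor matching would simply run over a much larger index, letting the device infer that, say, you are approaching a door from your own viewpoint.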

Where else might computer vision be applied? Imagine you train a person to assemble a product. A trainer's job is to observe the trainee's actions, point out errors and show the right way. The job requires a lot of attention and patience, and that kind of one-on-one training is expensive. I can imagine the computer looking at what the trainee is doing and then giving some advice, so the trainer can actually train more than one or two people. So the computer becomes a job coach.

Some people forget the names of their friends and relations, and they are so embarrassed that they avoid going out, and that actually accelerates the progression of the problem. But maybe the system could, as a kind of social coach, understand that this is a person you are supposed to greet and then tell you, "This is so-and-so; you may want to talk to her." You can imagine all kinds of things, all the way from those coaches to a system that understands where you want to go and drives your wheelchair.

What's next for robotics in manufacturing? Humans have excellent capabilities for evaluating some things, and for dexterity and speed. However, many car companies, for example, are beginning to realize that although the human does the assembly task very fast, there is still tedious work, like, "Bring that thing from the place it's stored to the place it's assembled," or, "Move the tool from one place to the other, and make sure each step has been done correctly and nothing forgotten." So maybe the robot is becoming more of a co-worker that helps increase the quality that the human provides; indeed, it helps the human work efficiently, reliably and economically. Many companies are thinking along the lines of, "Don't replace human labor; enhance it." It's consistent with the quality-of-life technologies.

Mitsubishi Heavy Industries' Wakamaru robot is designed to provide entertainment and information. Here, a boy plays rock-paper-scissors with a Wakamaru in a Tokyo department store.

For decades, science fiction writers have imagined homes populated with mobile robotic servants. Will we ever see that? I think so, but it may not be in the form of humanoids. They could be lightweight robots for a mixture of entertainment, information-providing and some mobility assistance, like Wakamaru [from Mitsubishi Heavy Industries]. But the home itself can be a robot, with many embedded sensors, possibly working with sensors attached to you. The home knows what a person is trying to do, like cooking, and helps.

What's the biggest challenge in developing these home robots and the quality-of-life robots? One, their capability to recognize human needs. Why are human caregivers so good? Because they understand what people want. The second thing is safety. How can we make the robot soft? Soft in two senses. One, in the physical sense: a human is a reasonably soft device, but very fast. And two, our control is soft, in the sense that we are fail-safe. When we make a mistake, most of the time it is not disastrous. We can, at the last moment, avoid disaster. Can the robot do the same? An automatic door is a robot, but the current model is dumb. It opens when you want to look outside to see if a taxi is there, but it does not open fast enough when you approach it in a hurry.

Copyright © 2007 IDG Communications, Inc.
