As researchers at Worcester Polytechnic Institute (WPI) make progress in giving their humanoid robot more autonomy, they're getting ready to get their hands on an upgraded robot this fall.
The WPI robotics team, which advanced to the finals of the DARPA Robotics Challenge, has been working 50 to 60 hours a week to prepare for that last round of competition, set for next June 5 and 6 in Pomona, Calif.
The teams are working to put their robots through a series of tasks -- climbing ladders, walking over rubble, opening doors and driving cars -- designed to ultimately lead to robots capable of working with humans after natural or man-made disasters.
The 11 finalists, who are competing for a $2 million prize, include teams from WPI, MIT, Virginia Tech and NASA's Jet Propulsion Laboratory.
The WPI robot -- a six-foot, 330-lb. humanoid device that the researchers have named Warner -- is an Atlas robot from Boston Dynamics. Several other teams in the DARPA competition also use Atlas robots, and they all will have to do without them for the month of October.
Matt DeDonato, the WPI team's technical project manager, told Computerworld that Boston Dynamics will take its robots back at the beginning of October and spend the month upgrading them from the knees up.
"It won't look much different, but a lot of the systems will change," said DeDonato. "We'll be getting two to three computers onboard the robot. Now, they're mostly off-board."
As of now, there's just one processor inside the robot. Most of the computing is done by off-board computers with data sent back and forth on a fiber optic cable tethered to the robot.
In the final challenge, there will be no fiber optic tether; the roboticists will go from a 10Gbps wired link to the robot to a wireless link that gives them only about 300Mbps.
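A rough back-of-the-envelope sketch shows why that drop matters. Only the two link speeds come from the article; the sensor frame size below is a made-up but plausible stand-in for the kind of data a robot streams to off-board computers:

```python
# Link speeds reported in the article.
TETHER_BPS = 10e9      # 10Gbps fiber optic tether
WIRELESS_BPS = 300e6   # ~300Mbps wireless link at the finals

# Assumed payload: one uncompressed 1024x768 stereo camera pair,
# 3 bytes per pixel. This frame size is an illustrative assumption.
frame_bits = 2 * 1024 * 768 * 3 * 8

t_tether = frame_bits / TETHER_BPS
t_wireless = frame_bits / WIRELESS_BPS

print(f"tether:   {t_tether * 1000:.1f} ms per frame")
print(f"wireless: {t_wireless * 1000:.1f} ms per frame")
print(f"slowdown: {t_wireless / t_tether:.0f}x")
```

Whatever the exact payload, every frame takes roughly 33 times longer over the wireless link, which is why shipping raw sensor data off-board stops being viable and the computation has to move onto the robot.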
With added onboard processors, the robot will be doing more of the calculations and decision-making itself, so there won't be as much need to send information back and forth between the machine and its handlers.
"Now, we have to migrate our software from the off-board computers to the onboard computers," said DeDonato. "The software was designed for human control. Now we'll restructure how the data flows through the system and what talks to what. The way we talk to the robot will have to be rethought."
He added that the team is considering putting all of the robot's balancing and standing algorithms on one onboard processor. "That way, the balancing can run at real time and not be bogged down with other systems taking up resources," said DeDonato.
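The idea of reserving a core so the balance loop is never "bogged down" can be sketched as pinning the control process to one CPU and checking that it holds its cycle deadline. This is a minimal illustration, not WPI's software: the core number and 500Hz rate are assumptions, and `os.sched_setaffinity` is Linux-only, so the pinning degrades to a no-op elsewhere:

```python
import os
import time

BALANCE_CORE = 0  # hypothetical core reserved for balancing
LOOP_HZ = 500     # hypothetical control rate, not from the article

def pin_to_core(core: int) -> None:
    """Pin this process to a single core so other onboard systems
    can't preempt the balance loop mid-cycle (Linux-only)."""
    if hasattr(os, "sched_setaffinity"):
        os.sched_setaffinity(0, {core})

def balance_loop(cycles: int) -> int:
    """Run a fixed-rate loop and count missed deadlines."""
    period = 1.0 / LOOP_HZ
    missed = 0
    next_deadline = time.monotonic() + period
    for _ in range(cycles):
        # ... read IMU, compute balance correction, command joints ...
        now = time.monotonic()
        if now > next_deadline:
            missed += 1                  # this cycle ran long
            next_deadline = now + period
        else:
            time.sleep(next_deadline - now)
            next_deadline += period
    return missed

pin_to_core(BALANCE_CORE)
print(f"missed deadlines: {balance_loop(100)}")
```

With the balance loop alone on its core, the deadline-miss count stays near zero even while vision and planning saturate the other processors, which is the isolation DeDonato describes.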
The team may dedicate another onboard processor to the robot's vision, because that function is so computationally intensive.