Google bringing vision to robots that need touch

Google and health-care giant Johnson & Johnson have teamed up to make robots better at helping surgeons accomplish big changes through small incisions, with as little damage as possible.

Details about what each company will contribute to the partnership – or even what its goal is – are purposely vague, but it appears that Google's contribution will be things it is already good at, not the genuinely new capabilities robot-assisted surgical systems actually need.

The deal, announced Friday with Johnson & Johnson (J&J) subsidiary Ethicon, was short on specifics, but the two companies did say they would develop a surgical platform that sounded a lot like a competitor to that of Intuitive Surgical, Inc., whose da Vinci surgical-assist systems dominate the surgical robotics market the company was instrumental in creating.

The goal of most robotic-assist systems is to apply to ever-more-complex procedures the same minimally invasive surgical techniques that let pro athletes get their knees reassembled. Pros get back on the field in half the time it used to take because surgeries are done using a tiny incision, tiny tools and a tiny camera that shows surgeons what they're doing on big monitors.

Operating with tiny tools using cameras that can throw off your perspective can be awkward, however -- like playing piano by tapping keys with a fishing pole in each hand.

So, when robotic-assist machines like Intuitive's da Vinci system showed up in 2000 – offering to take some of the strain off surgeons with power-assisted, remote-controlled, very, very tiny instruments attached to a pencil-thin probe tipped with an HD camera – sales took off.

The robots don't actually do anything on their own. They're remote-controlled instruments whose very fine motor controls give surgeons more control than they might have with traditional instruments – but at the cost of not being able to feel what they're doing, only see it on the screen.

That lack of tactile feedback has been blamed for complications, including one case in which the surgeon had to abandon the robot and slice open the patient's chest. The surgeon had accidentally cut the patient's aorta because the instrument at the end of the robotic, remote-controlled probe was too large for the space involved, and the surgeon could only see what was going on – unable to identify the feel of the aorta, or gauge resistance well enough to know how hard to cut.

A 2013 Johns Hopkins study found that surgeons and hospitals underreported the number of complications and deaths during robot-assisted surgeries. Other studies suggested that many of the problems – most of the deaths were due to excessive bleeding – might have been caused by surgeons who had learned to rely on a sense of touch making mistakes without that feedback when operating via robot. Both the complications and the tendency to overlook the robot's involvement were problems serious enough that the American College of Surgeons published a paper questioning the future of robot-assisted surgery.

The Food and Drug Administration has scheduled an industry tete-a-tete in July about the safety and effectiveness of robotically assisted surgical devices – a conference intended to set the direction for future investigation and regulation of robot-assisted surgery, focusing on both safety and cost effectiveness. No one is saying how big a role touch will play.

People have been building haptic feedback into robot-controlled instruments since at least 2006, and the pace has picked up since the Johns Hopkins report.

Haptic feedback is rare among the more than 1,400 robot-assisted surgical devices installed at U.S. hospitals, however.

And it looks as if the Ethicon/Google partnership isn't going to spend much time on it.

Ethicon doesn't sell robotic surgical systems right now, but it does license technology from Intuitive that cuts, sews, staples and manipulates tissue.

J&J's own focus in robot-assisted surgery is, essentially, on making the systems smaller, cheaper to own and operate, and easier to manage, while also increasing their functionality, according to a FierceMedicalDevices story quoting the head of the company's Global Surgery Group.

Google, which has been buying up robotics companies left and right, and which has extensive research and development experience in robotics, artificial intelligence, autonomous vehicles and a range of other relevant technologies, appears to be focused primarily on vision – already a strong point of existing systems – rather than the other senses.

Google Life Sciences officials who talked to the Wall Street Journal hinted that they wanted to expand real-time image analysis so surgeons could see the edges of nerves or tumors, add more sensors to tools, and consolidate visual information so surgeons look at one screen, rather than several, to see images from MRIs, cameras and other systems.

They didn't talk about Google's own patents in haptic feedback, or about adapting the haptic tech already being built into surgical systems in research labs, or the super-fine motor feedback that can make robotic hand controllers feel like real hands. Nor did they mention more futuristic discoveries, like using ultrasound to create a virtual-reality sense of touch in thin air – technology simple enough that Disney has built a version that creates the sensation of contact using cheap speakers stuck in a tube.

Focusing on machine vision, image analysis and visual-data management all plays to Google's strengths, but it doesn't do much about the weaknesses of surgical-assist robots.

It just seems as if Google would have a bigger impact if it could help Ethicon find ways to let surgeons know whether they're about to make a precise incision to insert a tiny stent or slash a half-inch hole in the patient's aorta, rather than improve the quality of a picture they can already see.

Copyright © 2015 IDG Communications, Inc.
