The Education of a Surgical Robot

Autonomous surgical robots can operate remotely, even when communications with a distant surgeon are poor.

by Jeff O'Heir
April 10, 2017

The RAVEN remote surgical robot, now 15 years old, is still a teenager: It has muscle and brains, but is still learning how to apply them. Its developers are helping by teaching RAVEN to think for itself, so it can overcome communications issues that have kept long-distance surgical robots from entering the mainstream.

The advanced training also improves RAVEN's accuracy and its ability to work with other surgeons.

"This is the first time in our life that we have a surgical tool that makes us better surgeons," said Douglas Boyd, a cardiothoracic surgeon and director of robotics and biosurgery at the University of California, Davis, who has contributed to RAVEN's ongoing development, which has been the subject of recent research in ASME's Journal of Mechanisms and Robotics. "This is revolutionary."

RAVEN is an open-system medical research robot marketed by Seattle-based Applied Dexterity. Globally, 18 labs use the robot to develop new surgical robot applications. It has become the go-to platform for long-distance surgical research.

Telesurgery would enable a surgeon sitting at a system console in Boston to use the four-armed robot to remove a tumor in Bulgaria, treat a soldier on a military base, or respond to an accident in space.

In fact, NASA is testing RAVEN as part of its Extreme Environment Mission Operations program in the Aquarius undersea research facility, which simulates the isolation of space. Applied Dexterity recently applied for a grant to test RAVEN aboard a NASA space mission. 

Communications Breakdown

Before RAVEN spreads its wings, however, it must improve its ability to communicate on the ground.

Today, a lead surgeon performing a remote operation would view a live video feed from RAVEN's cameras. He or she would also receive haptic feedback from the robot's arms to judge the resistance when using a scalpel or needle.

Ideally, the surgeon should receive the video information half a second before the haptic feedback. This gives him or her time to think about the next move, reach a decision, and execute it.

That half second is critical. Tissue and muscle move constantly and change shape as they are manipulated during an operation. If the video and haptic information are not aligned, the surgeon could cut the wrong spot or nick an artery.
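One simple way to keep that offset consistent is to buffer the haptic samples and release each one only after the video clock has moved the desired half second past it. The sketch below is illustrative only; the class name, the 0.5-second value (taken from the article), and the sample format are assumptions, not RAVEN's actual implementation.

```python
from collections import deque

VIDEO_LEAD_S = 0.5  # desired lead of video over haptic feedback (value from the article)

class HapticDelayBuffer:
    """Illustrative sketch: hold haptic samples until the video clock
    has advanced a fixed interval past each sample's timestamp."""

    def __init__(self, lead_s=VIDEO_LEAD_S):
        self.lead_s = lead_s
        self.buffer = deque()  # queue of (timestamp_s, force_sample) tuples

    def push(self, timestamp_s, force_sample):
        """Record a haptic sample as it arrives from the robot arm."""
        self.buffer.append((timestamp_s, force_sample))

    def pop_ready(self, video_timestamp_s):
        """Release samples that trail the current video frame by at
        least the configured lead, preserving the half-second offset."""
        ready = []
        while self.buffer and video_timestamp_s - self.buffer[0][0] >= self.lead_s:
            ready.append(self.buffer.popleft())
        return ready

buf = HapticDelayBuffer()
buf.push(0.0, 1.2)            # haptic sample captured at t = 0.0 s
buf.push(0.3, 1.4)            # another at t = 0.3 s
print(buf.pop_ready(0.4))     # video at 0.4 s: no sample trails by 0.5 s yet
print(buf.pop_ready(0.6))     # sample from t = 0.0 is now 0.6 s old, so it is released
```

Because the buffer runs on the surgeon's console, it can smooth over small network jitter, though it cannot recover the large, fluctuating delays described below.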

"If you're manipulating a robotic arm and there's lag time, you can put [the] arm through an organ or harm the patient in other ways," said Boyd, who completed the first robotic closed-chest, beating-heart bypass surgery in 1999.

A lag as short as several milliseconds can also cause mechanical instabilities in the robotic system, severely complicating the surgery.

Unfortunately, the broadband and satellite networks that link the surgeon to the robot often have time delays. While network service providers continue to reduce the latency, spikes in Internet traffic, storm damage, or electrical outages may cause latency to fluctuate during the day.

Since RAVEN's developers cannot eliminate latency, they must work around it. They hope to do this by teaching RAVEN to perform routine tasks autonomously. Allowing more local control reduces the need for broadband communications and real-time human oversight, while enabling the surgeon to focus on more important surgical tasks.

Apprenticeship Learning

"The philosophy is that humans are good at making decisions, but not as good at repetitive tasks," said Jacob Rosen, a professor of mechanical and aerospace engineering who heads UCLA's Bionics Lab.

"With automation, you bring the [robot] surgeons into a position where they are becoming more of a decision maker, [as] opposed to just moving a scalpel," said Rosen, who cofounded Applied Dexterity and helped design the original RAVEN robot.

To give RAVEN more decision-making powers, the developers turned to apprenticeship learning and supervisory control.

Apprenticeship learning involves learning from an expert. Researchers guided RAVEN through the same motions as highly rated surgeons performing specific tasks. RAVEN then used a machine-learning algorithm to smooth out those motions into an ideal trajectory and set of motions. Ultimately, the developers hope it will duplicate those trajectories, and do it faster than humanly possible.
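In its simplest form, smoothing multiple demonstrations into one reference trajectory can mean resampling each demonstration to a common length and averaging them. The toy sketch below illustrates that basic idea only; the function names and one-dimensional data are hypothetical, and RAVEN's actual learning algorithms are far more sophisticated.

```python
def resample(traj, n):
    """Linearly interpolate a 1-D trajectory onto n evenly spaced points,
    so demonstrations of different lengths can be compared point by point."""
    m = len(traj)
    out = []
    for i in range(n):
        t = i * (m - 1) / (n - 1)   # fractional index into the original trajectory
        lo = int(t)
        hi = min(lo + 1, m - 1)
        frac = t - lo
        out.append(traj[lo] * (1 - frac) + traj[hi] * frac)
    return out

def average_trajectory(demos, n):
    """Average all demonstrations after resampling each to length n,
    yielding a single smoothed reference path."""
    resampled = [resample(d, n) for d in demos]
    return [sum(col) / len(col) for col in zip(*resampled)]

# Three toy 1-D "expert demonstrations" of the same motion, recorded
# at different sampling rates:
demos = [
    [0.0, 1.0, 2.0, 3.0],
    [0.0, 1.5, 3.0],
    [0.2, 1.0, 2.2, 3.2],
]
print(average_trajectory(demos, n=4))
```

Averaging washes out the small, uncorrelated tremors of individual demonstrations while keeping the motion they share, which is the intuition behind learning an "ideal" trajectory from several expert ones.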

So far, RAVEN has developed idealized paths for suturing, tying knots, cutting, and dissecting tissue. Its range of movements includes reach and orient; grasp, hold, and cut; push; pull; and release.

The algorithms also include parameters for each surgical procedure. If RAVEN is suturing a patient, for example, the algorithms will guide the angles, entry points, and exit points of the needle to minimize tissue damage while performing the task quickly. This information is embedded locally, so RAVEN can use it even when communications are poor.
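Locally embedded task parameters of this kind might look something like the sketch below. Every name and value here is hypothetical, invented purely to illustrate how a subtask's constraints could be stored on the robot itself rather than streamed over the network.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SutureParameters:
    """Hypothetical parameter set for an autonomous suturing subtask
    (illustrative only; not RAVEN's actual data model)."""
    entry_angle_deg: float       # needle angle at tissue entry
    entry_point_mm: tuple        # entry point in the tool frame
    exit_point_mm: tuple         # exit point in the tool frame
    max_tissue_force_n: float    # force limit to minimize tissue damage

# Because the library lives on the robot, a subtask can still run when
# the link to the remote surgeon degrades:
LOCAL_TASK_LIBRARY = {
    "simple_interrupted_suture": SutureParameters(
        entry_angle_deg=90.0,
        entry_point_mm=(0.0, 2.5),
        exit_point_mm=(0.0, -2.5),
        max_tissue_force_n=1.5,
    ),
}
```

The design point is simply that the constraints travel with the robot: the surgeon selects and supervises the subtask, while the robot enforces its own stored limits without a round trip over the network.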

The surgeon retains supervisory control, engaging these subtasks through a combination of voice recognition, advanced motion control, and custom user interfaces. When operating autonomously, RAVEN can emulate a team of doctors and nurses in the surgical suite.

"Surgical teams that have worked together for years have almost a speechless process," Rosen said. "Nurses contribute to the next step by handing the right tool to a surgeon before he even asks for it. We want to create a situation where the assistant, as an artificial agent, anticipates the things you would do as the lead surgeon."

By providing RAVEN with greater autonomy, its developers hope to see RAVEN enter adulthood as a valued member of the surgical team in remote locations around the world and in space.


Jeff O’Heir is a science and technology writer based in Huntington, New York.