Sitting on a stool a few feet from a long-armed robot, Dr. Danyal Fer wrapped his fingers around two metal handles near his chest.
As he moved the handles – up and down, left and right – the robot mimicked each small motion with its own two arms. Then, when he pinched his thumb and forefinger together, one of the robot’s tiny claws did much the same. This is how surgeons like Dr. Fer have long used robots when operating on patients: they can remove a prostate from a patient while sitting at a computer console across the room.
But after this brief demonstration, Dr. Fer and his fellow researchers at the University of California, Berkeley, showed how they hope to advance the state of the art. Dr. Fer let go of the handles, and a new kind of computer software took over. As he and the other researchers looked on, the robot began to move entirely on its own.
With one claw, the machine lifted a tiny plastic ring from an equally tiny peg on the table, passed the ring from one claw to the other, moved it across the table and carefully hooked it onto a new peg. Then the robot did the same with several more rings, completing the task as quickly as it had when guided by Dr. Fer.
The training exercise was originally designed for humans: moving the rings from peg to peg is how surgeons learn to operate robots like the one in Berkeley. Now an automated robot performing the test can match or even exceed a human in dexterity, precision and speed, according to a new research paper from the Berkeley group.
The project is part of a much broader effort to bring artificial intelligence into the operating room. Using many of the same technologies that underpin self-driving cars, autonomous drones and warehouse robots, researchers are working to automate surgical robots too. These methods are still a long way from everyday use, but progress is accelerating.
“It is an exciting time,” said Russell Taylor, a professor at Johns Hopkins University and a former IBM researcher known in academia as the father of robotic surgery. “It is where I hoped we would be 20 years ago.”
The aim is not to remove surgeons from the operating room, but to ease their load and perhaps even raise success rates – where there is room for improvement – by automating particular phases of surgery.
Robots can already exceed human accuracy on some surgical tasks, such as inserting a pin into a bone (a particularly risky step during knee and hip replacements). The hope is that automated robots can bring greater precision to other tasks, like making incisions or suturing, and reduce the risks that come with overworked surgeons.
In a recent phone call, Greg Hager, a computer scientist at Johns Hopkins, said that surgical automation would progress much like the Autopilot software that was guiding his Tesla down the New Jersey Turnpike as he spoke. The car was driving on its own, he said, but his wife still had her hands on the wheel, should anything go wrong. And she would take over when it was time to exit the highway.
“We cannot automate the whole process, at least not without human oversight,” he said. “But we can start to build automation tools that make a surgeon’s life a little easier.”
Five years ago, researchers with the Children’s National Health System in Washington, D.C., designed a robot that could automatically suture a pig’s intestines during surgery. It was a notable step toward the kind of future Dr. Hager envisioned. But it came with an asterisk: the researchers had implanted tiny markers in the pig’s intestines that emitted near-infrared light and helped guide the robot’s movements.
The method is far from practical, because the markers are not easily implanted or removed. But in recent years, artificial intelligence researchers have dramatically improved the power of computer vision, which could allow robots to perform surgical tasks on their own, without such markers.
The change is driven by what are known as neural networks: mathematical systems that can learn skills by analyzing large amounts of data. By analyzing thousands of photos of cats, for example, a neural network can learn to recognize a cat. In much the same way, a neural network can learn from images captured by surgical robots.
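The learning-from-examples idea can be illustrated with a toy sketch: a one-layer "network" that learns to separate two kinds of tiny 3x3 images purely by analyzing labeled examples. Everything here is invented for illustration; real systems use deep networks trained on far larger images.

```python
# Toy illustration of a neural network learning from labeled examples.
# All data is synthetic: 3x3 "images" where the center pixel marks the class.
import math
import random

random.seed(0)

def make_image(bright_center):
    # Nine pixel values in [0, 1]; a bright center pixel signals class 1.
    img = [random.uniform(0.0, 0.4) for _ in range(9)]
    if bright_center:
        img[4] = random.uniform(0.6, 1.0)
    return img

def predict(weights, bias, img):
    # Sigmoid output: estimated probability the image has a bright center.
    z = sum(w * x for w, x in zip(weights, img)) + bias
    return 1.0 / (1.0 + math.exp(-z))

# Build a labeled training set.
data = [(make_image(True), 1.0) for _ in range(200)] + \
       [(make_image(False), 0.0) for _ in range(200)]
random.shuffle(data)

# Train with plain stochastic gradient descent on the logistic loss.
weights, bias, lr = [0.0] * 9, 0.0, 0.5
for _ in range(20):                      # 20 passes over the data
    for img, label in data:
        p = predict(weights, bias, img)
        grad = p - label                 # dLoss/dz for the logistic loss
        weights = [w - lr * grad * x for w, x in zip(weights, img)]
        bias -= lr * grad

# After training, the largest learned weight sits on the center pixel:
# the network has discovered, from data alone, which pixel matters.
print(max(range(9), key=lambda i: weights[i]))  # → 4
```

The same principle, scaled up enormously, lets a network trained on surgical video discover which visual patterns matter for a task.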
Surgical robots are equipped with cameras that record three-dimensional video of each surgery. The video streams into a viewfinder that the surgeon peers into while guiding the operation, watching from the robot’s point of view.
Afterward, these images provide a detailed road map showing how surgeries are performed. They can help new surgeons understand how to use the robots, and they can help train robots to handle tasks on their own. By analyzing images that show how surgeons guide a robot, a neural network can learn the same skills.
This is how the Berkeley researchers are working to automate their robot, which is based on the da Vinci Surgical System, a two-armed machine that helps surgeons perform more than a million procedures a year. Dr. Fer and his colleagues collected images of the robot moving the plastic rings while under human control. Then their system learned from those images, pinpointing the best ways of grabbing the rings, passing them between claws and moving them to new pegs.
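This approach is often called learning from demonstration, or behavior cloning. A minimal sketch of the idea, with an invented one-dimensional setup (the recorded "demonstrations," the gain, and the noise levels are all assumptions; the Berkeley system learns from camera images with a neural network):

```python
# Behavior cloning in miniature: recover a human demonstrator's control
# policy from recorded (observation, action) pairs. All data is synthetic.
import random

random.seed(1)

# Fake demonstration log: the human always moves the claw toward the ring,
# scaled by a fixed gain, plus a little hand tremor.
GAIN = 0.8
demos = []
for _ in range(500):
    ring = (random.uniform(-1, 1), random.uniform(-1, 1))    # observation
    action = (GAIN * ring[0] + random.gauss(0, 0.01),
              GAIN * ring[1] + random.gauss(0, 0.01))        # human command
    demos.append((ring, action))

# "Learn" the policy by least squares: find the gain that best explains
# the demonstrations (a one-parameter stand-in for training a network).
num = sum(o[0] * a[0] + o[1] * a[1] for o, a in demos)
den = sum(o[0] ** 2 + o[1] ** 2 for o, a in demos)
learned_gain = num / den

def policy(ring):
    # The cloned policy imitates the demonstrated mapping on new inputs.
    return (learned_gain * ring[0], learned_gain * ring[1])

print(round(learned_gain, 2))  # close to the demonstrator's gain of 0.8
```

The learned policy generalizes to ring positions it never saw, which is exactly what lets the robot repeat the peg-transfer task on its own.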
But this process came with its own asterisk. When the system told the robot where to move, the robot often missed the spot by millimeters. Over months and years of use, the many metal cables inside the robot’s twin arms had stretched and bent in small ways, so its movements were not as precise as they needed to be.
Human operators can compensate for this shift, unconsciously. The automated system could not. This is often the problem with automated technology: it struggles to deal with change and uncertainty. Autonomous vehicles are still far from widespread use because they are not yet nimble enough to handle all the chaos of the everyday world.
The Berkeley team decided to build a new neural network that analyzed the robot’s mistakes and learned how much precision it was losing with each passing day. “It learns how the robot’s joints evolve over time,” said Brijen Thananjeyan, a doctoral student on the team. Once the automated system could account for this change, the robot could grab and move the plastic rings, matching the performance of human operators.
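The underlying idea, stripped to a sketch: if worn cables make the arm consistently miss, a model fitted on (commanded, achieved) position pairs can learn that bias and invert it. The drift model, numbers, and arm interface below are invented for illustration; the Berkeley group used a neural network for this.

```python
# Simplified drift compensation: fit a model of how the arm actually moves,
# then invert it so commands land on target. All quantities are synthetic.
import random

random.seed(2)

TRUE_SCALE, TRUE_OFFSET = 0.95, 0.003   # hidden wear: the arm undershoots

def move_arm(command_mm):
    # Where the worn arm actually ends up (plus a little sensor noise).
    return TRUE_SCALE * command_mm + TRUE_OFFSET + random.gauss(0, 0.0005)

# Calibration: command a sweep of positions and record where the arm lands.
commands = [i * 0.5 for i in range(41)]          # 0 .. 20 mm
landed = [move_arm(c) for c in commands]

# Fit achieved = scale * command + offset by least squares.
n = len(commands)
mc, ml = sum(commands) / n, sum(landed) / n
scale = sum((c - mc) * (l - ml) for c, l in zip(commands, landed)) / \
        sum((c - mc) ** 2 for c in commands)
offset = ml - scale * mc

def corrected_command(target_mm):
    # Invert the fitted drift model so the arm lands on the target.
    return (target_mm - offset) / scale

# Uncorrected, a 10 mm command misses by roughly half a millimeter;
# corrected, the error shrinks by two orders of magnitude.
error = abs(move_arm(corrected_command(10.0)) - 10.0)
print(error < 0.01)  # → True
```

A neural network plays the same role when the drift is nonlinear and different for each joint, and it can keep re-learning the model as the hardware ages.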
Other laboratories are trying different approaches. Axel Krieger, a Johns Hopkins researcher who was part of the pig-suturing project in 2016, is working to automate a new kind of robotic arm, one with fewer moving parts that behaves more consistently than the kind of robot used by the Berkeley team. Researchers at the Worcester Polytechnic Institute are developing ways for machines to carefully guide surgeons’ hands as they perform particular tasks, such as inserting a needle for a cancer biopsy or burning into the brain to remove a tumor.
“It’s almost like a car where the lane-following is autonomous but you still control the gas and the brake,” said Greg Fischer, one of the Worcester researchers.
The scientists note that many obstacles lie ahead. Moving plastic pegs is one thing; cutting, moving and suturing flesh is another. “What happens when the camera angle changes?” said Ann Majewicz Fey, an associate professor at the University of Texas, Austin. “What happens when smoke gets in the way?”
For the foreseeable future, automation will be something that works alongside surgeons rather than replacing them. But even that could have profound effects, Dr. Fer said. Doctors could, for example, perform surgery across distances far greater than the width of the operating room – from miles away or more, perhaps, helping wounded soldiers on a distant battlefield.
Currently, the signal delay is too great to make that possible. But if a robot could handle at least some of the tasks on its own, long-distance surgery could become viable, Dr. Fer said: “You could send a high-level plan and then the robot could carry it out.”
The same technology would be essential for remote surgery across even longer distances. “When we start operating on people on the moon,” he said, “surgeons will need entirely new tools.”