US National Robotics Initiative grant will provide surgical robots with a new level of machine intelligence

This story is from the category Artificial Intelligence

Date posted: 27/10/2013

Providing surgical robots with a new kind of machine intelligence that significantly extends their capabilities and makes them much easier and more intuitive for surgeons to operate is the goal of a major new grant announced as part of the National Robotics Initiative.

The five-year, $3.6 million project, titled Complementary Situational Awareness for Human-Robot Partnerships, is a close collaboration among research teams directed by Nabil Simaan, associate professor of mechanical engineering at Vanderbilt University; Howie Choset, professor of robotics at Carnegie Mellon University; and Russell Taylor, the John C. Malone Professor of Computer Science at Johns Hopkins University.

“Our goal is to establish a new concept called complementary situational awareness,” said Simaan. “Complementary situational awareness refers to the robot’s ability to gather sensory information as it works and to use this information to guide its actions.”

“I am delighted to be working with Nabil Simaan on a medical robotics project,” Choset said. “I believe him to be a thought leader in the field.” Taylor added, “This project advances our shared vision of human surgeons, computers and robots working together to make surgery safer, less invasive and more effective.”

One of the project’s objectives is to restore the type of awareness surgeons have during open surgery – where they can directly see and touch internal organs and tissue – which they have lost with the advent of minimally invasive surgery because they must work through small incisions in a patient’s skin. Minimally invasive surgery has become increasingly common because patients experience less pain, blood loss and trauma, recover more quickly and get fewer infections, and because it is less expensive than open surgery.

Surgeons have attempted to compensate for the loss of direct sensory feedback through pre-operative imaging, where they use techniques like MRI, X-ray imaging and ultrasound to map the internal structure of the body before they operate. They have employed miniaturized lights and cameras to provide them with visual images of the tissue immediately in front of surgical probes. They have also developed methods that track the position of the probe as they operate and plot its position on pre-operative maps.
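
As a rough illustration of the probe-tracking step described above, the sketch below maps a tracked probe tip into pre-operative image coordinates using a pre-computed rigid registration. It is a minimal Python example with assumed names and values, not the project's actual software.

# Minimal sketch (illustrative only): mapping a tracked probe tip into
# pre-operative image coordinates with an assumed rigid registration.
import numpy as np

def to_image_frame(p_tracker, R, t):
    """Apply a rigid transform: x_image = R @ x_tracker + t."""
    return R @ np.asarray(p_tracker, dtype=float) + np.asarray(t, dtype=float)

# Illustrative registration: a 90-degree rotation about z plus an offset (mm).
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
t = np.array([120.0, 40.0, -15.0])

probe_tip_tracker = np.array([10.0, 5.0, 30.0])   # position reported by the tracker
probe_tip_image = to_image_frame(probe_tip_tracker, R, t)
print("Probe tip in pre-operative image coordinates (mm):", probe_tip_image)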

Simaan, Choset and Taylor intend to take these efforts to the next level by creating a system that acquires data from a number of different types of sensors as an operation is underway and integrates it with pre-operative information to produce dynamic, real-time maps that precisely track the position of the robot probe and show how the tissue in its vicinity responds to its movements.
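
To make the "update the map as you sense" idea concrete, here is a minimal sketch, under assumed names and values, of fusing an intra-operative measurement with a pre-operative estimate; a real system would use far richer sensor and tissue models.

# Minimal sketch (assumptions throughout): blend a stored pre-operative
# estimate with new intra-operative readings so the map stays current.
import numpy as np

def fuse(prior_estimate, measurement, measurement_weight=0.3):
    """Blend a stored (pre-operative) estimate with a new sensor reading."""
    prior_estimate = np.asarray(prior_estimate, dtype=float)
    measurement = np.asarray(measurement, dtype=float)
    return (1.0 - measurement_weight) * prior_estimate + measurement_weight * measurement

# Hypothetical example: a landmark on an organ surface has shifted slightly
# since pre-operative imaging; repeated fusion tracks the shift over time.
estimate = np.array([42.0, 18.0, 7.0])            # from the pre-operative map (mm)
for reading in ([42.6, 18.2, 7.4], [42.9, 18.3, 7.6], [43.0, 18.4, 7.7]):
    estimate = fuse(estimate, reading)
print("Updated landmark estimate (mm):", estimate)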

For example, adding pressure sensors to robot probes will provide real-time information on how much force the probe is exerting against the tissue surrounding it. Not only does this make it easier to work without injuring the tissue, but it can also be used to “palpate” tissue to search for hidden tumor edges, arteries and aneurysms. Such sensor data can also feed into computer simulations that predict how various body parts shift in response to the probe’s movements.
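
The palpation idea can be illustrated with a small sketch: estimate local stiffness as reaction force over indentation depth and flag spots that are unusually stiff. The data, units and threshold below are assumptions for illustration only, not measurements from the project.

# Minimal sketch with hypothetical palpation data: a simple linear-spring
# stiffness estimate used to flag possible tumor margins.
probe_samples = [
    # (location label, indentation depth in mm, measured reaction force in N)
    ("A", 2.0, 0.30),
    ("B", 2.0, 0.32),
    ("C", 2.0, 0.95),   # noticeably stiffer than its neighbours
    ("D", 2.0, 0.34),
]

STIFFNESS_THRESHOLD = 0.25  # N/mm, assumed cutoff for "suspiciously stiff"

for label, depth_mm, force_n in probe_samples:
    stiffness = force_n / depth_mm          # force change per unit indentation
    flagged = stiffness > STIFFNESS_THRESHOLD
    print(f"site {label}: {stiffness:.2f} N/mm{'  <-- stiff region' if flagged else ''}")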

To acquire sensory data during surgery, the Vanderbilt team led by Simaan will develop methods that allow surgical snake-like robots to explore the shapes and stiffness variations of internal organs and tissues. The team will generate models that estimate the locations of hidden anatomical features such as arteries and tumors and provide them to the JHU and CMU teams, who will create adaptive telemanipulation techniques that assist surgeons in carrying out various surgical procedures.
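
One simple way such assistance could work, sketched below with hypothetical names and gains rather than the teams' actual algorithms, is to attenuate the surgeon's commanded motion as the tool approaches an estimated hazard such as a hidden artery.

# Minimal sketch (assumed behaviour, not the project's method): scale the
# commanded step down as the tool nears an estimated hazard location.
import numpy as np

def assisted_step(tool_pos, commanded_step, hazard_pos, safe_radius=5.0):
    """Attenuate the commanded step when the tool is near an estimated hazard."""
    tool_pos = np.asarray(tool_pos, dtype=float)
    commanded_step = np.asarray(commanded_step, dtype=float)
    distance = np.linalg.norm(tool_pos - np.asarray(hazard_pos, dtype=float))
    gain = min(1.0, distance / safe_radius)   # full speed far away, slower close in
    return tool_pos + gain * commanded_step

tool = np.array([0.0, 0.0, 0.0])
artery_estimate = np.array([3.0, 0.0, 0.0])      # hypothetical output of the shape/stiffness model
step = np.array([1.0, 0.0, 0.0])                 # surgeon's commanded motion (mm)
for _ in range(4):
    tool = assisted_step(tool, step, artery_estimate)
    print("tool position:", tool)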

See the full story via the external site: news.vanderbilt.edu


