Robots as surgical assistants

Today, robotic systems are little more than assistants used to hold and aim surgical tools. A research team led by Jan Steinbrener and Stephan Weiss at the University of Klagenfurt is looking to develop new technological options to support surgeons in their work.

Minimally invasive operations, performed through an incision no larger than a buttonhole, offer many advantages. They result in fewer post-surgical complications, are more readily accepted by patients, and are more cost-effective. One drawback, however, is that surgeons cannot use their own eyes to guide their instruments inside the body; instead, they must rely on external imaging devices to show them precisely where they are operating at any given moment. According to Jan Steinbrener, a researcher and lecturer in the Control of Networked Systems group at the Department of Smart Systems Technologies, the first questions a supporting robot must be able to answer are these: “Where exactly am I? Where do I want to go?”

What may sound somewhat philosophical is not at all easy to implement technically, Steinbrener continues: “Let us imagine an operation on soft tissue. The simple act of breathing keeps the tissue in constant motion. Besides, we don’t all look the same on the inside; distortions can shift target regions or obscure the view of the path ahead.” This is not an easy task even for human operators, who must constantly interpret what a camera transmits to a screen. In the future, however, a robot is intended to assist with this positioning problem, known as “state estimation”. For this, the research team is drawing on findings from the field of drone research.

“We are dealing with a different imaging modality here, however. If you want a drone to localise itself with the help of objects on the ground, you can resort to geometric approaches. The drone can determine its position on the basis of fixed points within the image taken by the camera and, for example, the distance to the object. Inside the body, however, we are dealing with less clearly delineated forms. Moreover, they are in a constant state of motion”, elaborates Jan Steinbrener, who heads the project together with Stephan Weiss. The aim is to use AI methods to train a learning algorithm so that it ultimately delivers robust state estimation.
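To give a rough idea of what such a learning-based approach could look like, the sketch below shows a minimal neural network that regresses a camera pose directly from an image frame. It is purely illustrative: the architecture, the name PoseRegressor, the 6-DoF output parametrisation, and the random stand-in data are assumptions for this example, not the project’s actual model.

```python
# Illustrative sketch only: a minimal learned pose regressor, loosely inspired
# by the idea described above. All layer sizes and names are assumptions.
import torch
import torch.nn as nn

class PoseRegressor(nn.Module):
    """Maps a single RGB frame to a 6-DoF pose estimate
    (x, y, z translation plus three rotation parameters)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),          # global pooling -> fixed-size vector
        )
        self.head = nn.Linear(32, 6)          # 6-DoF pose output

    def forward(self, frame: torch.Tensor) -> torch.Tensor:
        x = self.features(frame).flatten(1)
        return self.head(x)

# One training step: the "ground truth" poses would come from an external
# reference, such as the tracking system described further below.
model = PoseRegressor()
optimiser = torch.optim.Adam(model.parameters(), lr=1e-4)
frames = torch.randn(8, 3, 128, 128)          # stand-in for camera frames
true_poses = torch.randn(8, 6)                # stand-in for reference poses
optimiser.zero_grad()
loss = nn.functional.mse_loss(model(frames), true_poses)
loss.backward()
optimiser.step()
```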

“Algorithms are often a black box: This means we only have a limited understanding of why they work, when they fail to work, and how reliably they work.”
(Jan Steinbrener)

On a day-to-day basis, this project involves a lot of work at the PC for Jan Steinbrener and his colleagues, though some of it takes place in experimental settings: “We also use the infrared tracking cameras installed in our drone hall to capture the movements of the tool being guided by the robot. We want to know very precisely where the surgical tool was, so that we can subsequently train our algorithms.” The overall aim is to achieve a greater level of “autonomous” support for surgeons, although major barriers to independent action by the robots remain, not least for ethical and legal-regulatory reasons: “We also still lack many technical solutions: artificial intelligence to ensure that images and potential image errors are interpreted correctly, and modular multi-sensor fusion algorithms that can use data gathered from all sensors, including any uncertainties they may have, to determine the position of the surgical instrument within the body.” The AIMRobot project is funded by the Austrian Research Promotion Agency (FFG) and is run jointly with iSYS Medizintechnik GmbH, an industry partner from Wattens. The team anticipates results around mid-2023.
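The core idea behind multi-sensor fusion of the kind mentioned above can be illustrated with a very simple sketch: two noisy position estimates are combined by weighting each with the inverse of its uncertainty, so the more reliable sensor counts for more. This is a generic Kalman-style building block, not the project’s algorithm; the sensor names and numbers are invented for the example.

```python
# Illustrative sketch only: inverse-covariance weighting of two position
# estimates, the basic principle behind Kalman-style multi-sensor fusion.
import numpy as np

def fuse(est_a, cov_a, est_b, cov_b):
    """Combine two 3-D position estimates and their covariance matrices.
    The less uncertain a sensor is, the more weight its estimate receives."""
    info_a = np.linalg.inv(cov_a)             # information = inverse covariance
    info_b = np.linalg.inv(cov_b)
    cov_fused = np.linalg.inv(info_a + info_b)
    est_fused = cov_fused @ (info_a @ est_a + info_b @ est_b)
    return est_fused, cov_fused

# Hypothetical readings: a noisier camera-based estimate and a second,
# more precise tracking estimate (units in millimetres).
cam_pos = np.array([10.2, 4.9, 3.1])
cam_cov = np.diag([4.0, 4.0, 4.0])
trk_pos = np.array([10.0, 5.1, 3.0])
trk_cov = np.diag([1.0, 1.0, 1.0])

pos, cov = fuse(cam_pos, cam_cov, trk_pos, trk_cov)
print("fused position:", pos)  # lies closer to the lower-uncertainty estimate
```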

Jan Steinbrener, who has a strong affinity for medicine but chose to concentrate on physics, spent five years working in the medical division at Siemens, where he helped to develop X-ray equipment. Innovations in medical technology yield significant improvements for patients. Here, Steinbrener draws a comparison with commercial aviation: “In the 1960s, pilots flew the aircraft themselves; today, they are system managers of a highly intelligent machine, working as a team. Overall, aviation has become much safer as a result.” We are still worlds away from the futuristic vision of robots operating on their own: “It is true that in diagnostics, for example, some algorithms can recognise diseases from images with a similar degree of sensitivity and specificity as specialists with years of experience. Still, algorithms are often a black box: This means we only have a limited understanding of why they work, when they fail to work, and how reliably they work.” Even as technology develops rapidly over the coming decades, the ultimate goal remains the best possible performance in the interest of the patient. Assistant robots that can alert human surgeons to potential errors could play an important role here, as humans are also susceptible to mistakes: “Even surgeons have good days and bad days.”

for ad astra: Romy Müller, translation: Karen Meehan