Using AI and visualisation to create knowledge insights
How can AI be used as a tool to train new brain surgeons? And can operating theatres become more sterile with the help of visualisation? Mario Romero’s research is multi-faceted and often surprising.
Early in his career, Romero developed BrailleTouch: a keyboard for blind smartphone users. The first version was launched in 2013, making it possible to use Braille to write text messages and emails on smartphones. The app became widespread, and smartphone manufacturers soon copied the design and functionality, making it part of their operating systems.
“We never patented our innovation because we didn’t want to make money from visually impaired people. But we wanted to make a difference, and we did. How things turned out also took us further in our academic careers,” says Romero, Associate Professor in Visualization and Interactive Graphics at KTH.
Training platforms for brain surgery in 3D visualisation
Romero has recently focused on helping the healthcare system train more brain surgeons. Passing knowledge from experienced brain surgeons to new ones is difficult: much of the learning is practical and draws on experience involving several senses. In addition, the margins of error in operations are minimal, which places very high demands on skill.
“These are the skills that take a very long time to learn; most surgeons study for at least five years before they are allowed to perform their first operation. On top of theory, their studies are based on observations and supervision by already experienced surgeons,” he says.
But experienced surgeons are in short supply, which is why hopes are being pinned on new forms of training platform based on simulation and 3D visualisation.
The research is conducted in collaboration with Karolinska University Hospital and has involved the creation of 3D models based on medical images of patients. More experienced surgeons have then used the models to simulate actual operations.
“Using sensors, we capture every movement the surgeon makes, for example, how they hold the instruments, place their hands, and even where they direct their gaze. All this in the minutest detail,” says Romero.
The result is a complete simulation of an operation that students can experience in 3D. They can stop and rehearse each movement or sequence and zoom in on various details to get a better picture. The goal is to create a platform that can complement current teaching methods.
The simulation also lets students perform operations alongside recordings of real surgeons, making it possible to compare the placement of instruments and hands directly.
The next step is to use machine learning and AI to create an artificial expert teacher. An AI teacher can be trained on the work of experienced surgeons and then observe how students perform, making it possible to adapt the learning to each student.
“We don’t want to replace experienced surgeons and experts; rather, we want to complement them by giving them more tools to share their knowledge,” says Romero.
Reducing healthcare-associated infections
Romero also runs a research project with Danderyd Hospital to visualise how particles move in the air in operating theatres. This project aims to create more sterile environments that reduce healthcare-associated infections. As much as 98 per cent of the bacteria in surgical wounds have been shown to originate from the air. Despite masks and sterile clothing, particles and bacteria are spread from exhaled air and the skin of those in the room. However, it is possible to better protect patients by adapting the ventilation, the placement of equipment, and even the movements of surgical teams.
The researchers have shown how particles move in the air using a computer model of the operating theatre. The model considers everything from the ventilation system to particle size, the room’s heating system, the positioning of equipment and the actions of the surgical team. Even body heat from surgeons and patients is included in the calculation models.
“In the longer term, the goal is to improve methods and procedures associated with surgery, for example, by improving ventilation to reduce the number of healthcare-associated infections. In future, you’ll be able to step into a real operating theatre and, with the help of AR glasses, see exactly how the air moves in the room around objects and people.”
The research has an offshoot in a project at KTH Live in Lab, where ventilation systems are being studied to establish how they should be designed to minimise the spread of viruses and bacteria.
Pushing the boundaries in using current technology
Romero’s research is about constantly pushing the boundaries of what current technology can do. How close to real time can real situations and environments be visualised?
“We give people tools and methods that increase their understanding and knowledge of reality. And we do that by focusing on their reality, extracting the right data, analysing it, and visualising the results in ways that deepen their knowledge,” Romero says.
Collaboration with the healthcare system is especially rewarding, he says.
“People who work in healthcare are constantly striving to improve. They want to continually develop to make a difference to their fellow human beings, which creates an extremely inspiring environment.”