Assessing unintended consequences in AI-based neurosurgical training


Virtual reality simulators can help learners improve their technical skills faster and with no risk to patients. In the field of neurosurgery, they allow medical students to practice complex operations before using a scalpel on a real patient. When combined with artificial intelligence, these tutoring systems can offer tailored feedback like a human instructor, identifying areas where students need to improve and suggesting how they can achieve expert performance.

A new study from the Neurosurgical Simulation and Artificial Intelligence Learning Centre at The Neuro (Montreal Neurological Institute-Hospital) of McGill University, however, shows that human instruction is still necessary to detect and compensate for unintended, and sometimes negative, changes in neurosurgeon behaviour after virtual reality AI training.

In the study, 46 medical students performed a tumour removal procedure on a virtual reality simulator. Half of them were randomly selected to receive instruction from an AI-powered intelligent tutor called the Virtual Operative Assistant (VOA), which uses a machine learning algorithm to teach surgical techniques and provide personalized feedback. The other half served as a control group by receiving no feedback. The students’ work was then compared to performance benchmarks selected by a team of established neurosurgeons.

When the results were compared, AI-tutored students had caused 55 per cent less damage to healthy tissues than the control group. AI-tutored students also showed a 59 per cent reduction in average distance between instruments in each hand and applied 46 per cent less maximum force, both important safety measures.

However, AI-tutored students also showed some negative outcomes. For example, their dominant hand movements had 50 per cent lower velocity and 45 per cent lower acceleration than those of the control group, making their operations less efficient. The speed at which they removed tumour tissue was also 29 per cent lower than in the control group.

These unintended outcomes underline the importance of human instructors in the learning process, who can promote both safety and efficiency in students.

“AI systems are not perfect,” says Ali Fazlollahi, a medical student researcher at the Neurosurgical Simulation and Artificial Intelligence Learning Centre and the study’s first author. “Achieving mastery will still require some level of apprenticeship from an expert. Programs adopting AI will enable learners to monitor their competency and focus their intraoperative learning time with instructors more efficiently and on their individual tailored learning goals. We’re currently working towards finding an optimal hybrid mode of instruction in a crossover trial.”

Fazlollahi says his findings have implications beyond neurosurgery because many of the same principles apply in other fields of skills training.

“This includes surgical education, not just neurosurgery, and also a range of other fields from aviation to military training and construction,” he says. “Using AI alone to design and run a technical skills curriculum can lead to unintended outcomes that will require oversight from human experts to ensure excellence in training and patient care.”

“Intelligent tutors powered by AI are becoming a valuable tool in the evaluation and training of the next generation of neurosurgeons,” says Dr. Rolando Del Maestro, the study’s senior author. “However, it is essential that surgical educators are an integral part of the development, application, and monitoring of these AI systems to maximize their ability to increase the mastery of neurosurgical skills and improve patient outcomes.”