AI comes with the risk that younger doctors will not develop their critical-thinking skills.
Doctors using artificial intelligence tools to take patient notes say it can make critical mistakes, but saves time.
The University of Otago surveyed nearly 200 health professionals and found 40 percent used AI for patient notes, but there were problems with accuracy, legal and ethical oversight, data security, patient consent and the impact on the doctor-patient relationship.
Bioethicist Professor Angela Ballantyne led the research and said nearly three-quarters of the doctors who used the tools found them helpful.
Benefits included saving time - between 30 minutes and two hours a day - and improving rapport with patients, because doctors did not have to take notes while patients were talking.
"On the other hand, people who had negative experiences said often the notes that were produced were too long, sometimes they include errors or hallucinations, and that sometimes it was quite subtle to pick up where those errors had popped in," Ballantyne said.
"They thought sometimes it didn't capture the essence, like what was really important in the consultation, and sometimes it missed critical findings."
Ballantyne was concerned doctors would become too reliant on the tools and warned that the use of AI came with the risk of younger general practitioners not developing their critical-thinking skills.
"I think that's a bit deceptive to refer to them as administrative tools," she said. "Actually, the process of writing a clinical note is... a cognitive process and requires critical thinking.
"The clinician's greatest tool is their brain and it's the thinking that we want from clinicians, so I think we need to just track how they're being used."
While the technology was relatively new, doctors checked and edited the notes thoroughly, but Ballantyne worried that diligence would lapse as they got used to it.
She also called for guidance and regulation around patient consent and the type of software used.
About two-thirds of those surveyed had read the software terms and conditions, and about 60 percent sought patient consent.
"When we were doing the research in 2024, it was a little bit of a wild west and, really, GPs being out on their own," Ballantyne said.
Health NZ had endorsed two AI clinical scribe tools - Heidi Health and iMedX - and guidance around consent was expected from the Medical Council later this year.
AI was unlikely to go away, so greater regulation was vital, she said.
"Health providers are time poor and overburdened at the moment, and this does have the potential to be really helpful.
"The things we want to see are more centralised regulatory and governance support, so that GPs aren't having to kind of pick and choose between tools themselves."