New ChatGPT-like AI tool to accurately generate doctors' notes: Study
New York, Nov 27: A new artificial intelligence (AI) computer programme can generate doctors' notes so well that two physicians couldn't tell the difference, according to a study, a finding that may soon open the door for AI to support health care workers with groundbreaking efficiencies.
In this proof-of-concept study, physicians reviewed patient notes -- some written by actual medical doctors and others created by the new AI programme -- and identified the correct author only 49 per cent of the time, essentially no better than chance.
A team of 19 researchers from NVIDIA and the University of Florida used supercomputers to train a new model, GatorTronGPT, which functions similarly to ChatGPT, to generate medical records.
The free versions of the GatorTron models have been downloaded more than 430,000 times from Hugging Face, an open-source AI website, and are the site's only models available for clinical research, according to lead author Yonghui Wu, from the University of Florida’s department of health outcomes and biomedical informatics.
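For readers curious how such a freely downloadable model is typically used, the minimal sketch below loads a GatorTron-style encoder through the Hugging Face transformers library. The model identifier "UFNLP/gatortron-base" and the sample sentence are assumptions for illustration, not details from the study; check the Hugging Face hub for the exact listing.

```python
# Minimal sketch: loading a GatorTron-style clinical encoder from Hugging Face.
# The model name below is an assumed identifier; verify it on the hub before use.
from transformers import AutoTokenizer, AutoModel

model_name = "UFNLP/gatortron-base"  # assumption: public GatorTron encoder listing
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

# Encode a short, invented clinical snippet and obtain contextual embeddings.
text = "Patient presents with chest pain radiating to the left arm."
inputs = tokenizer(text, return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, tokens, hidden_size)
```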
"In health care, everyone is talking about these models. GatorTron and GatorTronGPT are unique AI models that can power many aspects of medical research and health care. Yet, they require massive data and extensive computing power to build. We are grateful to have this supercomputer, HiPerGator, from NVIDIA to explore the potential of AI in healthcare," Wu said.
For this research, published in the journal npj Digital Medicine, the team developed a large language model that allows computers to mimic natural human language.
These models work well with standard writing or conversations, but medical records pose additional hurdles, such as the need to protect patients' privacy and the highly technical nature of clinical language. Digital medical records cannot be Googled or shared on Wikipedia.
To overcome these obstacles, the researchers drew on the medical records of two million patients, retaining 82 billion useful medical words.
Combining this set with another dataset of 195 billion words, they trained GatorTronGPT to analyse the medical data using the GPT-3 architecture (Generative Pre-trained Transformer), a form of neural network. That allowed GatorTronGPT to write clinical text resembling medical doctors' notes.
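To give a sense of what a GPT-style (decoder-only, causal) model does once trained, the sketch below generates text from a clinical-sounding prompt. GatorTronGPT itself is not assumed to be publicly downloadable, so the widely available GPT-2 model stands in purely for illustration; the prompt is invented.

```python
# Minimal sketch: text generation with a GPT-style causal language model.
# GPT-2 is a stand-in only; it is not GatorTronGPT and is not trained on clinical data.
from transformers import AutoTokenizer, AutoModelForCausalLM

stand_in = "gpt2"  # placeholder for a GatorTronGPT-like decoder-only model
tokenizer = AutoTokenizer.from_pretrained(stand_in)
model = AutoModelForCausalLM.from_pretrained(stand_in)

prompt = "CHIEF COMPLAINT: shortness of breath. HISTORY OF PRESENT ILLNESS:"
inputs = tokenizer(prompt, return_tensors="pt")
generated = model.generate(
    **inputs,
    max_new_tokens=40,
    do_sample=True,
    top_p=0.9,
    pad_token_id=tokenizer.eos_token_id,  # silence the missing-pad-token warning
)
print(tokenizer.decode(generated[0], skip_special_tokens=True))
```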
Of the many possible uses for a medical GPT, one idea involves replacing the tedium of documentation with notes recorded and transcribed by AI.
For an AI tool to reach such parity with human writing, programmers spend weeks training models on supercomputers, feeding them clinical vocabulary and language usage drawn from billions upon billions of words.