Daily Bulletin


Some clinicians are using AI to write health records. What do you need to know?

Written by Stacy Carter, Professor and Director, Australian Centre for Health Engagement, Evidence and Values, University of Wollongong

Imagine this. You’ve finally summoned up the courage to see a GP about an embarrassing problem. You sit down. The GP says:

before we start, I’m using my computer to record my appointments. It’s AI – it will write a summary for the notes and a letter to the specialist. Is that OK?

Wait – AI writing our medical records? Why would we want that?

Records are essential for safe and effective health care. Clinicians must make good records to keep their registration. Health services must provide good record systems to be accredited. Records are also legal documents: they can be important in insurance claims or legal actions.

But writing stuff down (or dictating notes or letters) takes time. During appointments, clinicians can have their attention divided between good record-keeping and good communication with the patient. Sometimes clinicians need to work on records after hours, at the end of an already-long day.

So there’s understandable excitement, from all kinds of health-care professionals, about “ambient AI” or “digital scribes”.

What are digital scribes?

This is not old-school transcription software, where you dictate a letter and the software types it up word for word.

Digital scribes are different. They use AI – large language models with generative capabilities – similar to ChatGPT (or sometimes, GPT-4 itself).

The application silently records the conversation between a clinician and a patient (via a phone, tablet or computer microphone, or a dedicated sensitive microphone). The AI converts the recording to a word-for-word transcript.

The AI system then uses the transcript, and the instructions it is given, to write a clinical note and/or letters for other doctors, ready for the clinician to check.
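In outline, the two-step pipeline described above (speech-to-text, then an LLM drafting a note from the transcript) can be sketched as follows. This is an illustration only: the function names, the sample transcript and the note format are hypothetical stand-ins, not any vendor's actual product or API.

```python
# Illustrative sketch of a digital-scribe pipeline. The functions below are
# hypothetical placeholders for a speech-to-text model and a large language
# model; a real product would call external AI services at these points.

def transcribe(audio: bytes) -> str:
    """Stand-in for a speech-to-text model: audio -> word-for-word transcript."""
    # A real system would pass the recording to a speech-recognition model here.
    return "GP: What brings you in today? Patient: I've had a sore throat for a week."

def summarise(transcript: str, instructions: str) -> str:
    """Stand-in for an LLM that drafts a clinical note from the transcript."""
    # A real system would send the transcript plus drafting instructions to an LLM.
    return f"DRAFT NOTE (clinician must check):\n- {transcript}"

def scribe(audio: bytes) -> str:
    """Record -> transcript -> draft note, always returned for human review."""
    transcript = transcribe(audio)
    return summarise(transcript, "Write a concise clinical note.")

print(scribe(b"...raw audio bytes..."))
```

The key design point, reflected in the article, is that the output is always a draft: the clinician remains responsible for checking and correcting it before it enters the record.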

Most clinicians know little about these technologies: they are experts in their speciality, not in AI. The marketing materials promise to “let AI take care of your clinical notes so you can spend more time with your patients.”

Put yourself in the clinician’s shoes. You might say “yes please!”

Some clinicians will welcome the chance to cut down their workload. Stephen Barnes/Shutterstock

How are they regulated?

Recently, the Australian Health Practitioner Regulation Agency released a code of practice for using digital scribes. The Royal Australian College of General Practitioners released a fact sheet. Both warn clinicians that they remain responsible for the contents of their medical records.

Some AI applications are regulated as medical devices, but many digital scribes are not. So it’s often up to health services or clinicians to work out whether scribes are safe and effective.

What does the research say so far?

There’s very limited data or real-world evidence on the performance of digital scribes.

In a big Californian hospital system, researchers followed 9,000 doctors for ten weeks in a pilot test of a digital scribe.

Some doctors liked the scribe: their work hours decreased and they communicated better with patients. Others never started using it at all.

And the scribe made mistakes – for example, recording the wrong diagnosis, or recording that a test had already been done when in fact it still needed to be done.

So what should we do about digital scribes?

The recommendations of the first Australian National Citizens’ Jury on AI in Health Care show what Australians want from health care AI, and provide a great starting point.

Building on those recommendations, here are some things to keep in mind about digital scribes the next time you head to the clinic or emergency department:

1) You should be told if a digital scribe is being used.

2) Only scribes designed for health care should be used in health care. Regular, publicly available generative AI tools (like ChatGPT or Google Gemini) should not be used in clinical care.

3) You should be able to consent, or refuse consent, for use of a digital scribe. You should have any relevant risks explained, and be able to agree or refuse freely.

4) Clinical digital scribes must meet strict privacy standards. You have a right to privacy and confidentiality in your health care. The whole transcript of an appointment may contain a lot more detail than a clinical note usually would. So ask:

  • are the transcripts and summaries of your appointments processed in Australia, or another country?
  • how are they kept secure and private (for example, are they encrypted)?
  • who can access them?
  • how are they used (for example, are they used to train AI systems)?
  • does the scribe access other data from your record to make the summary? If so, is that data ever shared?

Clinicians need to adhere to privacy standards. PeopleImages.com - Yuri A/Shutterstock

Is human oversight enough?

Generative AI systems can make things up, get things wrong, or misunderstand some patients’ accents. But they will often communicate these errors in a way that sounds very convincing. This means careful human checking is crucial.

Doctors are told by tech and insurance companies that they must check every summary or letter (and they must). But it’s not that simple. Busy clinicians might become over-reliant on the scribe and just accept the summaries. Tired or inexperienced clinicians might think their memory must be wrong, and the AI must be right (known as automation bias).

Some have suggested these scribes should also be able to create summaries for patients. We don’t own our own health records, but we usually have a right to access them. Knowing a digital scribe is in use may increase consumers’ motivation to see what is in their health record.

Clinicians have always written notes about our embarrassing problems, and have always been responsible for these notes. The privacy, security, confidentiality and quality of these records have always been important.

Maybe one day, digital scribes will mean better records and better interactions with our clinicians. But right now, we need good evidence that these tools can deliver in real-world clinics, without compromising quality, safety or ethics.

Read more https://theconversation.com/some-clinicians-are-using-ai-to-write-health-records-what-do-you-need-to-know-237762
