What to know about an AI transcription tool that ‘hallucinates’ medical interactions

Many medical centers use an AI-powered tool called Whisper to transcribe patients’ interactions with their doctors. But researchers have found that it sometimes invents text, a phenomenon known in the industry as hallucinations, raising the possibility of errors like misdiagnosis. John Yang speaks with Associated Press global investigative reporter Garance Burke to learn more.

Article Source
https://www.pbs.org/newshour/show/what-to-know-about-an-ai-transcription-tool-that-hallucinates-medical-interactions
