Many medical centers use an AI-powered tool called Whisper to transcribe patients' interactions with their doctors. But researchers have found that it sometimes invents text that was never spoken, a phenomenon known in the industry as hallucination, raising the possibility of errors such as misdiagnosis. John Yang speaks with Associated Press global investigative reporter Garance Burke to learn more.
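For technically minded readers, a brief illustration of the kind of tool at issue: the transcription products in question are built on OpenAI's Whisper model, which is also available as an open-source Python package. The sketch below assumes that package (installed via pip install openai-whisper) and a hypothetical recording file; note that any hallucinated text arrives inline in the returned transcript, indistinguishable from genuinely spoken words, which is why researchers flag it as a risk in medical settings.

    import whisper

    # Load a pretrained speech-to-text model; "base" trades accuracy for speed.
    model = whisper.load_model("base")

    # Transcribe the recording (the filename here is a hypothetical example).
    # The result is a dict with the full "text" plus timestamped "segments".
    result = model.transcribe("visit_recording.mp3")

    print(result["text"])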
Article source: https://www.pbs.org/newshour/show/what-to-know-about-an-ai-transcription-tool-that-hallucinates-medical-interactions