What to know about an AI transcription tool that 'hallucinates' medical interactions

  • Jan 25 2025
  • Length: 6 mins
  • Podcast

  • Summary

  • Many medical centers use an AI-powered tool called Whisper to transcribe patients' interactions with their doctors. Researchers have found that the tool sometimes invents text, a phenomenon known in the industry as hallucination, raising the possibility of errors such as misdiagnosis. John Yang speaks with Associated Press global investigative reporter Garance Burke to learn more. PBS News is supported by - https://www.pbs.org/newshour/about/funders
