Brilliant Achievement

Hospitals use a transcription tool powered by a hallucination-prone OpenAI model

An illustration of a woman typing on a keyboard, her face replaced with lines of code. (Image: The Verge)

A few months ago, my doctor showed off an AI transcription tool he used to record and summarize his patient meetings. In my case, the summary was fine, but researchers cited by ABC News have found that isn't always true of OpenAI's Whisper, which powers a tool many hospitals use: sometimes it just makes things up entirely.

Whisper is used by a company called Nabla for a medical transcription tool that it estimates has transcribed 7 million medical conversations, according to ABC News. More than 30,000 clinicians and 40 health systems use it, the outlet writes. Nabla is reportedly aware that Whisper can hallucinate, and is “addressing the problem.”

A group of researchers from Cornell University, the University of Washington, and…

