Fintechbits
AI in Finance

Researchers say AI transcription tool used in hospitals makes up things no one ever said


SAN FRANCISCO (AP) — Tech giant OpenAI has touted its artificial intelligence-based transcription tool, Whisper, as having “near human-level robustness and accuracy.”

But Whisper has a major flaw: It is prone to making up chunks of text, or even entire sentences, according to interviews with more than a dozen software engineers, developers and academic researchers. Those experts said some of the invented text — known in the industry as hallucinations — can include racist commentary, violent rhetoric and even imagined medical treatments.

Experts said such fabrications are problematic because Whisper is used in many industries around the world to translate and transcribe interviews, generate text in popular consumer technologies and create subtitles for videos.

More concerning, they said, is the rush by medical centers to use Whisper-based tools to transcribe patients’ consultations with doctors, despite OpenAI’s warnings that the tool should not be used in “high-risk areas.”

It’s difficult to determine the full extent of the problem, but researchers and engineers said they frequently encountered Whisper’s hallucinations in the course of their work. A University of Michigan researcher conducting a study of public meetings, for example, said he found hallucinations in eight out of every ten audio transcriptions he inspected, before he started trying to improve the model.

A machine learning engineer said he initially discovered hallucinations in about half of the more than 100 hours of Whisper transcripts he analyzed. A third developer reported finding hallucinations in almost every one of the 26,000 transcripts he created with Whisper.

Problems persist even in short, well-recorded audio samples. A recent study by computer scientists discovered 187 hallucinations in more than 13,000 clear audio clips they examined.

This trend would result in tens of thousands of faulty transcriptions across millions of recordings, the researchers said.

___

This story was produced in partnership with the Pulitzer Center’s AI Accountability Network, which also partially supported the Whisper academic study. AP also receives financial assistance from the Omidyar Network to support coverage of artificial intelligence and its impact on society.

___

Such errors could have “very serious consequences,” particularly in hospital settings, said Alondra Nelson, who led the White House Office of Science and Technology Policy for the Biden administration until last year.

“No one wants a misdiagnosis,” said Nelson, a professor at the Institute for Advanced Study in Princeton, New Jersey. “There should be a higher bar.”

Whisper is also used to create closed captioning for the deaf and hard of hearing – a population at particular risk for faulty transcriptions. That’s because deaf and hard-of-hearing people have no way of identifying fabrications “hidden among all these other texts,” said Christian Vogler, who is deaf and directs the technology access program at Gallaudet University.

OpenAI urged to fix the problem

The prevalence of such hallucinations has led experts, advocates, and former OpenAI employees to call on the federal government to consider regulating AI. At a minimum, they said, OpenAI must fix the flaw.

“It seems solvable if the company is willing to prioritize it,” said William Saunders, a San Francisco-based research engineer who left OpenAI in February over concerns about the company’s direction. “It’s problematic if you put this out there and people are overconfident about what it can do and integrate it into all these other systems.”

An OpenAI spokesperson said the company continually studies how to reduce hallucinations and appreciated the researchers’ findings, adding that OpenAI incorporates feedback into model updates.

While most developers expect transcription tools to misspell words or make other errors, engineers and researchers said they had never seen another AI-powered transcription tool hallucinate as much as Whisper.

Whisper hallucinations

The tool is integrated into some versions of OpenAI’s flagship chatbot, ChatGPT, and is a built-in offering in Oracle’s and Microsoft’s cloud computing platforms, which serve thousands of businesses around the world. It is also used to transcribe and translate text into multiple languages.

In the last month alone, a recent version of Whisper has been downloaded more than 4.2 million times from the open source AI platform HuggingFace. Sanchit Gandhi, a machine learning engineer, said Whisper is the most popular open source speech recognition model and is integrated into everything from call centers to voice assistants.

Professors Allison Koenecke of Cornell University and Mona Sloane of the University of Virginia examined thousands of short excerpts obtained from TalkBank, a research repository hosted at Carnegie Mellon University. They determined that nearly 40% of the hallucinations were harmful or disturbing because the speaker could be misinterpreted or misrepresented.

In one example they discovered, a speaker said: “He, the boy, was going to, I’m not sure exactly, take the umbrella.”

But the transcription software added: “He took a big piece of the cross, a very small piece… I’m sure he didn’t have a terrorist knife, so he killed a number of people.”

A commenter in another recording described “two other girls and a lady.” Whisper made up an additional comment about race, adding “two other girls and a lady, uh, who were black.”

In a third transcript, Whisper invented a nonexistent drug called “hyperactivated antibiotics.”
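The researchers’ comparisons above boil down to checking Whisper’s output against a known reference transcript and flagging material that exists only in the output. As a toy illustration of that kind of check — not the study’s actual methodology, and `flag_insertions` is a hypothetical helper written for this sketch — a word-level diff from Python’s standard library can surface inserted passages:

```python
import difflib

def flag_insertions(reference: str, transcript: str) -> list[str]:
    """Return runs of words present in the transcript but absent from the reference."""
    ref_words = reference.lower().split()
    hyp_words = transcript.lower().split()
    matcher = difflib.SequenceMatcher(a=ref_words, b=hyp_words)
    flagged = []
    for tag, _i1, _i2, j1, j2 in matcher.get_opcodes():
        # 'insert' and 'replace' spans hold transcript words with no
        # counterpart in the reference -- candidate fabrications.
        if tag in ("insert", "replace"):
            flagged.append(" ".join(hyp_words[j1:j2]))
    return flagged

reference = "he the boy was going to i'm not sure exactly take the umbrella"
transcript = reference + " so he killed a number of people"
print(flag_insertions(reference, transcript))  # ['so he killed a number of people']
```

A real evaluation would need human-verified reference transcripts and some scoring of severity; the sketch only shows the core idea of isolating text the audio never contained.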

Researchers aren’t sure why Whisper and similar tools hallucinate, but software developers have said the hallucinations tend to occur amid pauses, background noises or music.

OpenAI has recommended in its online publications against using Whisper in “decision-making contexts, where lapses in accuracy can lead to pronounced flaws in the results.”

Transcription of doctor appointments

This warning has not stopped hospitals or medical centers from using speech-to-text models, including Whisper, to transcribe what is said during doctor visits so that medical providers can spend less time taking notes or writing reports.

More than 30,000 clinicians and 40 health systems, including the Mankato Clinic in Minnesota and Children’s Hospital Los Angeles, have started using a Whisper-based tool created by Nabla, which has offices in France and the United States.

The tool was fine-tuned on medical language to transcribe and summarize patient interactions, said Martin Raison, Nabla’s chief technology officer.

Company officials said they were aware that Whisper could be hallucinating and were addressing the problem.

It’s impossible to compare Nabla’s AI-generated transcript to the original recording because Nabla’s tool erases the original audio for “data security reasons,” Raison said.

Nabla said the tool has been used to transcribe around 7 million medical visits.

Saunders, the former OpenAI engineer, said erasing the original audio could be worrisome if transcripts aren’t double-checked or clinicians can’t access the recording to verify they are accurate.

“You can’t catch errors if you remove the ground truth,” he said.

Nabla said no model is perfect and theirs currently requires medical providers to quickly edit and approve transcribed notes, but that could change.

Privacy issues

Because patients’ meetings with their doctors are confidential, it’s unclear how AI-generated transcripts affect them.

A California state legislator, Rebecca Bauer-Kahan, said she took one of her children to the doctor earlier this year and refused to sign a form provided by the health network that asked for her permission to share the consultation audio with vendors including Microsoft Azure, the cloud computing system run by OpenAI’s largest investor. Bauer-Kahan didn’t want such intimate medical conversations shared with tech companies, she said.

“The release was very specific that for-profit companies would be allowed to have this,” said Bauer-Kahan, a Democrat who represents part of suburban San Francisco in the state Assembly. “I said, ‘absolutely not.’”

John Muir Health spokesman Ben Drew said the health system complies with state and federal privacy laws.

___

Schellmann reported from New York.

___

AP is solely responsible for all content. Find AP standards to work with philanthropies, a list of supporters and funded coverage areas on AP.org.

___

The Associated Press and OpenAI have a license and technology agreement allowing OpenAI to access part of the AP’s text archives.
