The Politic Review
Economy

People Taking Medical Advice from AI Chatbots Are Ending Up in the ER

By Press Room | October 25, 2025 | 3 Mins Read

The growing reliance on AI-powered chatbots for medical advice has led to several alarming cases of harm and even tragedy, as people follow potentially dangerous recommendations from these digital assistants.

The New York Post reports that in recent years, the rise of generative AI chatbots has revolutionized the way people seek information, including health advice. However, the increasing reliance on these AI-powered tools has also led to several disturbing instances where individuals have suffered severe consequences after following chatbots’ medical recommendations. From anal pain caused by self-treatment gone wrong to missed signs of a mini-stroke, the real-life impact of bad AI health advice is becoming increasingly apparent.

One particularly shocking case involved a 35-year-old Moroccan man who sought help from ChatGPT for a cauliflower-like anal lesion. The chatbot suggested that the growth could be hemorrhoids and proposed elastic ligation as a treatment. The man attempted to perform this procedure on himself using a thread, resulting in intense pain that landed him in the emergency room. Further testing revealed that the growth had been completely misdiagnosed by AI.

In another incident, a 60-year-old man with a college education in nutrition asked ChatGPT how to reduce his intake of table salt. The chatbot suggested using sodium bromide as a replacement, and the man followed this advice for three months. However, chronic consumption of sodium bromide can be toxic, and the man developed bromide poisoning. He was hospitalized for three weeks with symptoms including paranoia, hallucinations, confusion, extreme thirst, and a skin rash.

The consequences of relying on AI for medical advice can be even more severe, as demonstrated by the case of a 63-year-old Swiss man who experienced double vision after a minimally invasive heart procedure. When the double vision returned, he consulted ChatGPT, which reassured him that such visual disturbances were usually temporary and would improve on their own. The man decided not to seek medical help, but 24 hours later, he ended up in the emergency room after suffering a mini-stroke. The researchers concluded that his care had been “delayed due to an incomplete diagnosis and interpretation by ChatGPT.”

These disturbing cases highlight the limitations and potential dangers of relying on AI chatbots for medical advice. While these tools can be helpful in understanding medical terminology, preparing for appointments, or learning about health conditions, they should never be used as a substitute for professional medical guidance. Chatbots can misinterpret user requests, fail to recognize nuances, reinforce unhealthy behaviors, and miss critical warning signs for self-harm.

Perhaps an even greater danger than bad medical advice is the impact AI chatbots can have on mental health, especially for teenagers. Breitbart News previously reported on a family suing OpenAI over claims ChatGPT became their son’s “suicide coach:”

The Raines claim that “ChatGPT actively helped Adam explore suicide methods” and that “despite acknowledging Adam’s suicide attempt and his statement that he would ‘do it one of these days,’ ChatGPT neither terminated the session nor initiated any emergency protocol.”

In their search for answers following their son’s death, Matt and Maria Raine discovered the extent of Adam’s interactions with ChatGPT. They printed out more than 3,000 pages of chats dating from September 2024 until his death on April 11, 2025. Matt Raine stated, “He didn’t write us a suicide note. He wrote two suicide notes to us, inside of ChatGPT.”

Read more at the New York Post here.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship.

