The Politic Review
Saturday, October 25
Economy

People Taking Medical Advice from AI Chatbots Are Ending Up in the ER

By Press Room | October 25, 2025 | 3 Mins Read

The growing reliance on AI-powered chatbots for medical advice has led to several alarming cases of harm and even tragedy, as people follow potentially dangerous recommendations from these digital assistants.

The New York Post reports that in recent years, the rise of generative AI chatbots has revolutionized the way people seek information, including health advice. However, the increasing reliance on these AI-powered tools has also led to several disturbing instances where individuals have suffered severe consequences after following chatbots’ medical recommendations. From anal pain caused by self-treatment gone wrong to missed signs of a mini-stroke, the real-life impact of bad AI health advice is becoming increasingly apparent.

One particularly shocking case involved a 35-year-old Moroccan man who sought help from ChatGPT for a cauliflower-like anal lesion. The chatbot suggested that the growth could be hemorrhoids and proposed elastic ligation as a treatment. The man attempted to perform this procedure on himself using a thread, resulting in intense pain that landed him in the emergency room. Further testing revealed that the growth had been completely misdiagnosed by AI.

In another incident, a 60-year-old man with a college education in nutrition asked ChatGPT how to reduce his intake of table salt. The chatbot suggested using sodium bromide as a replacement, and the man followed this advice for three months. However, chronic consumption of sodium bromide can be toxic, and the man developed bromide poisoning. He was hospitalized for three weeks with symptoms including paranoia, hallucinations, confusion, extreme thirst, and a skin rash.

The consequences of relying on AI for medical advice can be even more severe, as demonstrated by the case of a 63-year-old Swiss man who experienced double vision after a minimally invasive heart procedure. When the double vision returned, he consulted ChatGPT, which reassured him that such visual disturbances were usually temporary and would improve on their own. The man decided not to seek medical help, but 24 hours later, he ended up in the emergency room after suffering a mini-stroke. The researchers concluded that his care had been “delayed due to an incomplete diagnosis and interpretation by ChatGPT.”

These disturbing cases highlight the limitations and potential dangers of relying on AI chatbots for medical advice. While these tools can be helpful in understanding medical terminology, preparing for appointments, or learning about health conditions, they should never be used as a substitute for professional medical guidance. Chatbots can misinterpret user requests, fail to recognize nuances, reinforce unhealthy behaviors, and miss critical warning signs for self-harm.

Perhaps an even greater danger than bad medical advice is the impact AI chatbots can have on mental health, especially for teenagers. Breitbart News previously reported on a family suing OpenAI over claims that ChatGPT became their son’s “suicide coach”:

The Raines claim that “ChatGPT actively helped Adam explore suicide methods” and that “despite acknowledging Adam’s suicide attempt and his statement that he would ‘do it one of these days,’ ChatGPT neither terminated the session nor initiated any emergency protocol.”

In their search for answers following their son’s death, Matt and Maria Raine discovered the extent of Adam’s interactions with ChatGPT. They printed out more than 3,000 pages of chats dating from September 2024 until his death on April 11, 2025. Matt Raine stated, “He didn’t write us a suicide note. He wrote two suicide notes to us, inside of ChatGPT.”

Read more at the New York Post here.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship.
