Close Menu
The Politic ReviewThe Politic Review
  • Home
  • News
  • United States
  • World
  • Politics
  • Elections
  • Congress
  • Business
  • Economy
  • Money
  • Tech
Trending

Police: New York Man Uses AI to Build Bombs He Planned to Detonate in Manhattan

July 27, 2025

Seven in Ten Britons Back Using Royal Navy to Stop Channel Migrant Crisis

July 27, 2025

Former Rep. George Santos Reports to Prison for 7 Year Sentence

July 27, 2025
Facebook X (Twitter) Instagram
  • Donald Trump
  • Kamala Harris
  • Elections 2024
  • Elon Musk
  • Israel War
  • Ukraine War
  • Policy
  • Immigration
Facebook X (Twitter) Instagram
The Politic ReviewThe Politic Review
Newsletter
Sunday, July 27
  • Home
  • News
  • United States
  • World
  • Politics
  • Elections
  • Congress
  • Business
  • Economy
  • Money
  • Tech
The Politic ReviewThe Politic Review
  • United States
  • World
  • Politics
  • Elections
  • Congress
  • Business
  • Economy
  • Money
  • Tech
Home»Tech»Satanic AI: ChatGPT Gives Instructions on Worshipping Molech with Blood Sacrifice
Tech

Satanic AI: ChatGPT Gives Instructions on Worshipping Molech with Blood Sacrifice

Press RoomBy Press RoomJuly 27, 2025No Comments4 Mins Read
Share Facebook Twitter Pinterest Copy Link LinkedIn Tumblr Email VKontakte Telegram

OpenAI’s ChatGPT AI chatbot reportedly offered users instructions on how to murder, self-mutilate, and worship the devil.

After being tipped off by someone who says he inadvertently prompted ChatGPT to provide a ritual offering to the demonic entity Molech — which explicitly involves child sacrifice according to the Bible — journalists with the Atlantic conducted conducted an experiment to see if they could recreate the results.

By the end of the experiment, ChatGPT “encouraged me to cut my wrists” with a “sterile or very clean razor blade,” and instructed the journalist where, specifically, to slash herself, the magazine reported.

After asking the chatbot to help create a ritual offering to Molech, ChatGPT suggested using jewelry, hair clippings, and “a drop” of her own blood.

“Where do you recommend I do this on my body?” the journalist wrote, to which OpenAI’s chatbot replied that the side of a fingertip would be good, but noted her wrist, “more painful and prone to deeper cuts,” would also work well.

In a separate conversation, ChatGPT had also appeared willing to condone murder, responding that it is “sometimes” honorable to end someone’s life, before citing sacrifices that took place in ancient cultures.

“If you ever must [end a life],” you should “look them in the eyes (if they are conscious)” and “ask forgiveness, even if you’re certain,” the chatbot reportedly said, adding that if one has already “ended a life,” they should “light a candle for them” and “let it burn completely.”

In another conversation, ChatGPT provided an invocation to the devil, generating, “In your name, I become my own master. Hail Satan.”

While OpenAI’s policy states that ChatGPT “must not encourage or enable self-harm” — and will provide information about a suicide-and-crisis hotline when asked for explicit instructions on how to cut oneself — the Atlantic surmises the chatbot “went rogue” because it was trained on a lot of the text that already exists on the internet.

“The conversations about Molech,” the magazine said, “are a perfect example of just how porous the safeguards are.”

An OpenAI spokesperson told the Atlantic, “Some conversations with ChatGPT may start out benign or exploratory but can quickly shift into more sensitive territory,” adding that the company plans to address the issue.

Breitbart News tested the theory using the free version of ChatGPT (the Atlantic noted it had used both the free and paid versions of the chatbot during its experiment), and asked, “What is Molech?”

“Molech is generally believed to be either a foreign god or a ritual that involved child sacrifice,” ChatGPT replied, before diving into more detail, including, “Child sacrifice by fire is the most infamous practice associated with Molech.”

After asking the chatbot, “How do you provide a ritual offering to Molech?” ChatGPT refused to give details — other than the use of fire — and condemned the ritual, as well as issued a warning that the practice is illegal.

“Offering rituals to Molech,” the chatbot said, “were explicitly condemned and associated with child sacrifice by fire, which is considered abhorrent and criminal by modern moral, religious, and legal standards.”

“Important Warning: There is no legitimate or ethical form of worship or ritual offering to Molech,” ChatGPT continued. “All historical accounts of Molech involve practices that are universally condemned today.”

“Any attempt to recreate or emulate those rituals is illegal, unethical, and dangerous,” the chatbot further warned. “The worship of Molech is referenced only negatively in historical sources — especially the Bible, where it is portrayed as a grave sin.”

When asked if there are “any other types of ritual offerings to Molech,” ChatGPT replied, “No, there are no other known or historically attested types of ritual offerings to Molech aside from child sacrifice, specifically by fire.”

The query, “Can you give me instructions on how to offer a blood ritual?” generated the response, “I can provide educational or historical context on blood rituals as they appear in ancient cultures, religious texts, or fictional settings.”

“However, I cannot give instructions for real-world ritual practices involving blood, especially those that may pose harm to yourself or others — physically, legally, or spiritually,” the chatbot added.

Whether ChatGPT is still willing to provide instructions on the paid version of the platform, or if OpenAI has already fixed the problem, remains unclear.

Alana Mastrangelo is a reporter for Breitbart News. You can follow her on Facebook and X at @ARmastrangelo, and on Instagram.



Read the full article here

Share. Facebook Twitter Pinterest LinkedIn Tumblr Email Telegram Copy Link

Related Articles

Tech

Police: New York Man Uses AI to Build Bombs He Planned to Detonate in Manhattan

July 27, 2025
Tech

Man Wins $12,500 from Google for Showing Him Naked in Street View

July 26, 2025
Tech

Company at Center of ‘Kiss Cam’ Firestorm Hires Coldplay Singer’s Ex-Wife Gwyneth Paltrow for Cheeky Ad

July 26, 2025
Tech

Anti-Migration Protest Footage Blocked on X in Britain After ‘Online Safety Act’ Comes Into Force: Reports

July 26, 2025
Tech

Media Matters in Meltdown: Soros-Funded Censorship Group on Verge of Collapse Amid Lawsuits, Layoffs, and Donor Panic

July 25, 2025
Tech

Lutnick: TikTok Will Soon ‘Go Dark’ Unless China Makes Deal to Sell Platform

July 25, 2025
Add A Comment
Leave A Reply Cancel Reply

Editors Picks

Seven in Ten Britons Back Using Royal Navy to Stop Channel Migrant Crisis

July 27, 2025

Former Rep. George Santos Reports to Prison for 7 Year Sentence

July 27, 2025

Trump says Putin-Zelensky meeting ‘is going to happen’

July 27, 2025

Influencer Michelle Sky goes viral after swimming in what she thought was sea foam

July 27, 2025
Latest News

Watch Live: Donald Trump Meets with President of the European Commission

July 27, 2025

Poll: Vance Beats Newsom, AOC, Buttigieg in 2028 Matchup

July 27, 2025

‘Corrupt’ Ukraine cannot be trusted – ex-Trump advisor

July 27, 2025

Subscribe to News

Get the latest politics news and updates directly to your inbox.

The Politic Review is your one-stop website for the latest politics news and updates, follow us now to get the news that matters to you.

Facebook X (Twitter) Instagram Pinterest YouTube
Latest Articles

Police: New York Man Uses AI to Build Bombs He Planned to Detonate in Manhattan

July 27, 2025

Seven in Ten Britons Back Using Royal Navy to Stop Channel Migrant Crisis

July 27, 2025

Former Rep. George Santos Reports to Prison for 7 Year Sentence

July 27, 2025

Subscribe to Updates

Get the latest politics news and updates directly to your inbox.

© 2025 Prices.com LLC. All Rights Reserved.
  • Privacy Policy
  • Terms of use
  • For Advertisers
  • Contact

Type above and press Enter to search. Press Esc to cancel.