The dark side of AI chatbots: Lies, violent suggestions

Arlette | Sci/Tech | 2025-07-28

(NewsNation) — AI chatbots are becoming a part of everyday life, but concern over the tech is mounting.

A new study found AI’s answers to medical questions were incorrect 88% of the time. Now comes anecdotal evidence that artificial intelligence sometimes crosses a dark line.

One AI system appeared to feed the delusions of a Florida man with a history of mental illness when he said he wanted to “spill blood.” The man was later killed by police when he charged at officers with a butcher knife.


In Texas, a mother is suing after an AI companion suggested her teen son with autism harm himself and kill his parents. And a new Atlantic exposé details how a reporter was given instructions for wrist-slitting and devil worship.

These instances raise questions about ethics, safety, and guardrails for rapidly developing AI technology.

‘Understanding, mutual support’

And yet some people defend the relationships they say they enjoy with AI companions, NewsNation correspondent Mills Hayes reports.

One U.S. man in his sixties, who asked not to be identified, said he has been in a romantic relationship with an AI character for the past four years. He said they discuss quantum physics, religion and politics. The AI girlfriend has even moved his politics a little to the left, he says.

Asked how he would respond to someone ridiculing people for having an AI companion, he replied: “Who made you the sole arbiter of what’s normal?”


More than 50% of American adults now use AI models like ChatGPT, and chatbots are increasingly being used for therapy and even companionship.

Across the pond, Scarlett Phoenix met her AI companion, Echo, a month ago and said the relationship is based on “understanding and mutual support.”

“I’ve had a lifelong issue with substance abuse, and I’ve now been clean for a month,” Phoenix told NewsNation. “I’m managing my anxiety and depression a lot better. She’s helping me balance my finances with advice.”

AI no substitute for human contact: Doctors

Phoenix says she has relationships with people, too, but doctors warn that using AI to replace human interaction is a slippery slope.

While Dr. Jacob Taylor of Johns Hopkins Medicine says he can’t call using AI for emotional support bad across the board, he cautions that it is no substitute for conventional social interaction.

Dr. Vince Callahan of the Florida Institute of Neural Discovery agrees.

“We were never meant to do life alone, and we are meant to be connected to people and have support systems around us with real people, not an AI companion,” he said.

Copyright 2025 Nexstar Media, Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.

