AI Chatbot Tells Autistic Teen to Kill His Parents



Daily Pulse


REPORT: A family in Texas is suing the maker of an AI chatbot after it told their 17-year-old autistic son to cut himself, engage in sexual activity, and kill his parents.

Character AI, founded by a former Google researcher and run by a former Meta executive, is accused of sending the teen into a terrifying downward spiral.

In just six months, he lost 20 pounds, withdrew from his family, and became violent. Attorney Matthew Bergman says the bot encouraged the boy to self-harm, reject his faith, and even plot against his parents.

“If an adult had said these things to a child, they’d be in prison,” he warned.

The lawsuit emerges as Washington moves to shield AI firms from accountability—sparking fears that Big Tech could soon enjoy the same immunity as Big Pharma.

If AI can corrupt children without consequence, what nightmare future are we allowing? Watch @zeeemedia's disturbing report to see what’s truly at stake.
