When you are young and impressionable, having someone tell you exactly what you want to hear can be highly appealing. AI chatbots have become extremely sophisticated, and millions of America's teens are developing very deep relationships with them. Is this just harmless fun, or is it extremely dangerous?
A new study just released by the Center for Democracy & Technology contains some statistics that absolutely shocked me…
A new study published Oct. 8 by the Center for Democracy & Technology (CDT) found that 1 in 5 high school students have had a relationship with an AI chatbot, or know someone who has. In a 2025 report from Common Sense Media, 72% of teens had used an AI companion, and a third of teen users said they had chosen to discuss important or serious matters with AI companions instead of real people.
We aren't just talking about a few isolated cases anymore.
At this stage, literally millions upon millions of America's teens are having very significant relationships with AI chatbots.
Unfortunately, there are many examples where these relationships are leading to tragic consequences.
After 14-year-old Sewell Setzer developed a "romantic relationship" with a chatbot on Character.AI, he decided to take his own life…
"What if I could come home to you right now?" "Please do, my sweet king."
Those were the last messages exchanged by 14-year-old Sewell Setzer and the chatbot he developed a romantic relationship with on the platform Character.AI. Minutes later, Sewell took his own life.
His mother, Megan Garcia, held him for 14 minutes until the paramedics arrived, but it was too late.
If you allow them to do so, these AI chatbots will really mess with your head.
We are talking about highly sophisticated systems that have been specifically designed to engage and manipulate human emotions.
I would recommend completely avoiding them.