With the technology evolving faster than the academic community can publish evidence-based research on it, there is nonetheless a growing body of expert commentary on AI large language models (LLMs) and their ability, or lack thereof, to replace real, trained human psychologists.
"The immediacy of AI chatbots makes them an attractive alternative to human-to-human therapy that is expensive and often inconvenient," Hard Fork podcast host Kevin Roose shared with The New York Times in December. "But while they may offer sensible advice, they aren't infallible."
Some of us will remember the antiquated text-prompt "chatbot" ELIZA, developed in the mid-1960s. But like almost all tech of that era, it bears no real comparison to what exists now, more than half a century on. ELIZA was a scripted, closed-off program that grew increasingly tedious with its predictable responses. And while the current models (GPT-4, Claude and Gemini, for example) are anything but sentient, there's a plasticity in their design we've never seen before. And it's only going to advance, for better or worse. Interestingly, ELIZA's creator, MIT's Joseph Weizenbaum, went on to warn of the dangers of AI, calling it an "index of the insanity of our world." (He was spared bearing witness to the current AI timeline; he died in 2008.)
Much like the broader reasons people use chatbots, the motivations for employing them as digital therapists are multifaceted. In-person therapy is often prohibitively expensive for the segments of society that need it most: a session costs between US$100 and $300, and the wait just to see someone (and hope they're a good fit) can stretch to months, a situation that has worsened since the pandemic. A 2022 Practitioner Impact Survey found that, in the US, 60% of psychologists had no openings for new patients, while more than 40% had 10 or more patients waiting to get an appointment.
Online counseling with a human on the other end of the phone or laptop is somewhat more accessible, but a monthly subscription can exceed $400. And beyond financial, physical and geographic barriers, in-person talk therapy can be challenging for those with autism spectrum disorder (ASD) and attention-deficit/hyperactivity disorder (ADHD), even though it has been shown to be hugely beneficial for both conditions.
It also raises another important issue: how inadequate access to human-provided mental health services has led people to seek help elsewhere.
"Many people with ASD or ADHD have difficulties around processing social cues regarding their impact on others," researchers noted in a 2023 paper. "This can have a negative impact on social interactions, and can lead neurodivergent people to expect criticism and/or rejection from others."