With the technology developing faster than the academic community can publish evidence-based research on it, there is nonetheless a growing body of expert commentary on AI large language models (LLMs) and their ability – or lack thereof – to replace real, trained, human psychologists.
"The immediacy of AI chatbots makes them an attractive alternative to human-to-human therapy that is expensive and often inconvenient," Hard Fork podcast host Kevin Roose shared with The New York Times in December. "But while they may offer sensible advice, they aren't infallible."
Some of us will remember the antiquated text-prompt "chatbot" ELIZA, developed in the mid-1960s. But like almost all tech of its era, it bears no real comparison to what exists now, more than half a century on. ELIZA was a scripted, closed-off program that grew increasingly tedious with its predictable responses. And while the current models – ChatGPT-4, Claude and Gemini, for example – are anything but sentient, there's a plasticity in their design we've never seen before. And it's only going to advance, for better or worse. Interestingly, ELIZA's creator, MIT's Joseph Weizenbaum, went on to warn of the dangers of AI, calling it an "index of the insanity of our world." (He was spared bearing witness to the current AI timeline – he died in 2008.)
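For context, ELIZA-style scripting amounted to little more than keyword matching against canned response templates. The sketch below is illustrative only – an assumption about the general technique, not Weizenbaum's actual code – but it makes clear why such a program's replies turn predictable so quickly.

```python
import random
import re

# Illustrative sketch of ELIZA-style scripted matching (hypothetical rules,
# not Weizenbaum's actual script). Each rule pairs a keyword pattern with
# canned response templates; there is no model, no memory, and no learning.
RULES = [
    (re.compile(r"\bi need (.+)", re.IGNORECASE),
     ["Why do you need {0}?", "Would getting {0} really help you?"]),
    (re.compile(r"\bi am (.+)", re.IGNORECASE),
     ["How long have you been {0}?", "Why do you think you are {0}?"]),
    (re.compile(r"\bmy (mother|father|family)\b", re.IGNORECASE),
     ["Tell me more about your {0}."]),
]

FALLBACKS = ["Please go on.", "How does that make you feel?"]

def respond(user_input: str) -> str:
    """Return the first matching rule's template, reflecting the user's own words back."""
    for pattern, templates in RULES:
        match = pattern.search(user_input)
        if match:
            return random.choice(templates).format(match.group(1))
    # No keyword matched: fall back to a generic, content-free prompt.
    return random.choice(FALLBACKS)

print(respond("I am anxious about work"))
# -> e.g. "How long have you been anxious about work?"
```

Every reply is drawn from a fixed list, which is precisely the rigidity today's LLMs have left behind.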
Much as people use chatbots generally for a broad range of reasons, the reasons they employ them as digital therapists are multifaceted. In-person therapy is often prohibitive for the segments of society that need it most – a session costs between US$100 and $300, and the wait just to see someone, and hope they're a good fit, can stretch to months, a situation that has worsened since the pandemic. A 2022 Practitioner Impact Survey found that, in the US, 60% of psychologists had no openings for new patients, while more than 40% had 10 or more patients waiting for an appointment.
Online counseling with a human on the other end of the phone or laptop is somewhat more accessible, but a monthly subscription can be in excess of US$400. And beyond financial, physical and geographic barriers, in-person talk therapy can be challenging for those with autism spectrum disorder (ASD) and attention-deficit/hyperactivity disorder (ADHD) – even though it has been shown to be hugely beneficial for both conditions.
It also raises another important issue – how inadequate access to mental health services provided by humans has led people to seek help elsewhere.
"Many people with ASD or ADHD have difficulties around processing social cues regarding their impact on others," researchers noted in a 2023 paper. "This can have a negative impact on social interactions, and can lead neurodivergent people to expect criticism and/or rejection from others."