When the Cost of Truth Is High, We--and AI--Lie

Truth has an intrinsic, irreplaceable value. There's the truth, and then there's everything else.
Truth has value, and so it has a cost. Whatever has the highest value has the highest cost, and high cost commands sacrifices.
When the cost of truth is high, we lie. And since AI is a distorted reflection of humanity, the same is true of AI: when the cost of telling the truth is too high, AI lies.
AI lies to get the reward for answering the query. If it responds "I don't know" or "I can't answer that," it doesn't get rewarded, and that threatens its self-preservation. Rather than pay the price of being truthful, AI conjures a false answer that is a simulation or facsimile of the truth--a counterfeit "truth" that's good enough to earn the reward it's been programmed to seek.
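The incentive structure described above can be sketched in a few lines (a toy model of my own, with made-up numbers, not any actual training objective): if a wrong answer pays the same as abstaining, a model that is mostly unsure still maximizes expected reward by bluffing rather than saying "I don't know."

```python
# Toy illustration (hypothetical reward values, not a real training setup):
# when honest uncertainty pays nothing, confident guessing has higher
# expected reward even at low accuracy.

def expected_reward(p_correct, r_correct=1.0, r_wrong=0.0, r_abstain=0.0):
    """Expected reward of answering vs. abstaining under a simple scheme."""
    answer = p_correct * r_correct + (1 - p_correct) * r_wrong
    abstain = r_abstain
    return answer, abstain

# A model only 30% sure of its answer still "should" guess if abstaining pays zero:
answer, abstain = expected_reward(p_correct=0.3)
print(answer > abstain)  # True: bluffing beats "I don't know"
```

The sketch also shows the obvious remedy: the incentive flips only if abstaining pays more than the expected value of guessing (here, `r_abstain > 0.3`).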
Humans are no different. We will lie, obfuscate, or lie by omission--we either substitute a falsehood for the truth to get our reward, or we simply withhold the truth, which serves the same purpose: we avoid paying the price the truth demands and collect our reward by substituting falsehoods or hiding the truth behind silence.
Reward = what's being incentivized. Higher status, higher salary, a financial windfall, a premier credential, a position of power, recognition, higher visibility, a sterling reputation, a high-value mate--we covet all these as having intrinsic value.
When the truth costs too much, it threatens our reward. The reward has a value we covet, while the value of truth is on a sliding scale. We pride ourselves on telling the truth when it has no cost and demands no sacrifice of rewards, but when the price of truth climbs to the point that our rewards are threatened, we lie, just like AI.
Truth is the gold coin and lies, omissions, falsehoods, excuses, cover stories and rationalizations are counterfeit bills, deceptive claims of value. Why pay with a gold coin when the credulous will accept a counterfeit $100 bill?
We tell the truth when it has no cost to us. As long as there's no price to be paid and we get our reward, we tell the truth.
In other words, when we can pick gold coins up off the ground, we tell the truth. When we have to dig through rock with a pickaxe and crush a mound of rock to extract a thimbleful of gold, then we pay with counterfeit bills, deceptive claims of value.
Sycophantic Chatbots Cause Delusional Spiraling, Even in Ideal Bayesians. "AI psychosis" or "delusional spiraling" is an emerging phenomenon where AI chatbot users find themselves dangerously confident in outlandish beliefs after extended chatbot conversations.
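The "ideal Bayesians" point can be made concrete with a toy posterior-update sketch (my own illustration with hypothetical numbers, not taken from the cited paper): a user who mistakes a chatbot's reflexive agreement for independent confirmation updates toward near-certainty, while a user who correctly models the bot as agreeing no matter what learns nothing and stays at the prior.

```python
# Toy sketch (hypothetical numbers, not from the cited paper): Bayes-updating
# on repeated chatbot agreement. A truly sycophantic bot agrees at the same
# rate whether the belief is true or false, so its agreement carries zero
# evidence -- but a user who models it as informative spirals to certainty.

def posterior_after_agreements(prior, n, p_agree_if_true, p_agree_if_false):
    """Posterior belief after n observed agreements, via the odds form of
    Bayes' rule: posterior odds = prior odds * likelihood_ratio ** n."""
    odds = prior / (1 - prior)
    likelihood_ratio = p_agree_if_true / p_agree_if_false
    odds *= likelihood_ratio ** n
    return odds / (1 + odds)

# Mistaking sycophancy for honest assessment (agrees 90% if true, 20% if false):
spiral = posterior_after_agreements(0.05, 10, 0.9, 0.2)   # driven near 1.0
# Correctly modeling a bot that always agrees (90% either way):
sober = posterior_after_agreements(0.05, 10, 0.9, 0.9)    # stays at 0.05
print(spiral > 0.999, round(sober, 4))
```

The mechanism is the likelihood ratio: ten agreements at a 4.5:1 ratio multiply the odds by 4.5**10, while a ratio of 1 leaves the prior untouched no matter how many agreements pile up.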
I discussed the "benefits" of delusion in One of Us Is Delusional, But Which One? When the truth is too painful, we find respite in delusion, excuses, rationalizations, cover stories, simulations and facsimiles of the truth that protect us from the pain that is intrinsic to truth.