Trump's Morgenthau Plan for Iran
Virginia Conservatives Win Appeal, Nullifying Fraudulent Election of GOP Chairman...
MAGA congressman says Americans would become 'unglued' over alien briefings:
Embattled Tulsi Gabbard in fight for survival as rivals push smear campaign to force Trump's hand...
DARPA O-Circuit program wants drones that can smell danger...
Practical Smell-O-Vision could soon be coming to a VR headset near you
ICYMI - RAI introduces its new prototype "Roadrunner," a 33 lb bipedal wheeled robot.
Pulsar Fusion Ignites Plasma in Nuclear Rocket Test
Details of the NASA Moonbase Plans Include a Fifteen Ton Lunar Rover
THIS is the Biggest Thing Since CGI
BACK TO THE MOON: Crewed Lunar Mission Artemis II Confirmed for Wednesday...
The Secret Spy Tech Inside Every Credit Card
Red light therapy boosts retinal health in early macular degeneration

My interest in lethal autonomous weapons dates back to my time with the National Security Commission on Artificial Intelligence, where full autonomy was debated but largely dismissed as ethically unacceptable.
But in practice, the step to full autonomy is smaller than it sounds. Once a human is no longer actively controlling a system and is only monitoring it with the option to intervene, the shift to removing that human entirely is incremental.
It's similar to how Iran describes its nuclear program. Uranium enrichment for civilian energy is presented as benign, but once enrichment reaches reactor-grade levels, the remaining technical steps to weapons-grade material are a matter of time and intent, not capability.
It is becoming increasingly difficult to argue that fully autonomous weapons will not arrive. They follow naturally from realities already on the battlefield. What is easier to grasp is the fear they generate. Watch first-person-view footage of a quadcopter chasing a soldier to his inevitable death and the abstraction disappears.
Bundled against Ukraine's subzero February chill, a man in a gray coat threw what looked like a gray model airplane into the pale blue sky. The buzzing of the drone's propeller slowly faded as it climbed above snowy fields and barren hedgerows. It looked like a toy.
Oleksandr Liannyi was not playing, however. He was working on a way to make drones far deadlier than they are today.
"It's mostly about accuracy of positioning, of how the navigation part will perform in different conditions," said Liannyi, cofounder of NORDA Dynamics, which builds autonomous navigation and targeting modules for military drones.
Liannyi, his colleagues, and other Ukrainian teams have achieved partial autonomy, allowing drones to navigate to and strike human-selected targets on their own. The next step is far more controversial: fully autonomous drones, which could navigate to an active front, hunt for targets, and strike without human input. Empowered to make life-or-death choices, such drones would fundamentally change the nature not only of this war, but of all wars.
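The distinction the article draws can be made concrete as a small taxonomy. This is a sketch; the level names are my own shorthand for the article's categories, not terminology used by NORDA Dynamics or any military standard.

```python
from enum import Enum, auto

class AutonomyLevel(Enum):
    """Illustrative ladder of drone autonomy, per the article's distinctions."""
    REMOTE_PILOTED = auto()  # a human flies the drone and chooses the target
    PARTIAL = auto()         # drone navigates to and strikes a human-selected target
    FULL = auto()            # drone hunts for targets and strikes without human input

# Ukrainian teams have fielded PARTIAL; FULL is the contested next step.
fielded = AutonomyLevel.PARTIAL
assert fielded.value < AutonomyLevel.FULL.value
```

The point of the ladder is that each rung removes one human decision, which is why the article argues the final step is incremental rather than a leap.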
"The technology is very close," Liannyi said later inside a battered white van at the tree line. He noted that a number of intermediate stages still need to be developed before such systems exist, and that NORDA Dynamics continues to keep a human in the loop for the strike decision.
Under international humanitarian law, humans cannot pass responsibility for killing to a machine.
But Liannyi argues that even if a human is legally required to approve a lethal strike, autonomous target acquisition will, at the very least, increase the number of drones a single pilot can manage. "The drone can notify you when it sees the target, and then you can pull up the picture and approve it, so you can control lots of drones simultaneously," he said.
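The workflow Liannyi describes — drones flag candidate targets, a single operator reviews each picture and approves or rejects — is essentially a supervised approval queue. The sketch below is hypothetical; every class and method name is mine, invented to illustrate the pattern, and bears no relation to NORDA Dynamics' actual software.

```python
from dataclasses import dataclass
from queue import Queue

@dataclass
class TargetCandidate:
    """What a drone sends its operator: an ID and the picture to review."""
    drone_id: str
    snapshot: str          # stand-in for the image the operator pulls up
    approved: bool = False

class OperatorConsole:
    """One human supervising many drones; nothing is struck without approval."""

    def __init__(self) -> None:
        self.pending: "Queue[TargetCandidate]" = Queue()
        self.strikes_authorized: list[TargetCandidate] = []

    def notify(self, candidate: TargetCandidate) -> None:
        # A drone calls this when its onboard targeting flags something.
        self.pending.put(candidate)

    def review_next(self, approve: bool) -> TargetCandidate:
        # The operator views the snapshot and decides; only approved
        # candidates are released back to a drone for a strike.
        candidate = self.pending.get()
        candidate.approved = approve
        if approve:
            self.strikes_authorized.append(candidate)
        return candidate

# Three drones flag targets at once; one operator works through the queue.
console = OperatorConsole()
for i in range(3):
    console.notify(TargetCandidate(drone_id=f"uav-{i}", snapshot="thumbnail"))

console.review_next(approve=True)
console.review_next(approve=False)
console.review_next(approve=True)
print(len(console.strikes_authorized))  # 2 strikes authorized, 1 rejected
```

The design point is that autonomy here multiplies the operator, not replaces them: the bottleneck moves from flying each drone to clearing a queue of yes/no decisions.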