If housing prices had simply followed income growth since 1970
According to INSEE data, France is now channeling 57.2% of its GDP through government spending
Power is moving eastwards and westerners are talking about how great it is...
Agricultural commodities are now breaking out decisively from nearly 20-year resistance.
Researcher wins 1 bitcoin bounty for 'largest quantum attack' on underlying tech
Interceptor-Drone Arms-Race Emerges
A startup called Inversion has introduced Arc, a space-based vehicle...
Mining companies are using cosmic rays to find critical minerals
They regrew a severed nerve - by shortening a bone.
New Robot Ants Work Like Real Insects To Build And Dismantle On Their Own
Russian scientists 'are developing the world's first drug to delay ageing' months after
Sam Altman's World ID Expands Biometric Identity Checks
China Tests Directed Energy Beam That Recharges Drones Mid-Flight
Jurassic Park might arrive sooner than expected, just with Dinobots.

Recently, veterinarians have developed a protocol for estimating the pain a sheep is in from its facial expressions, but humans apply it inconsistently, and manual ratings are time-consuming. Computer scientists at the University of Cambridge in the United Kingdom have stepped in to automate the task. They started by listing several "facial action units" (AUs) associated with different levels of pain, drawing on the Sheep Pain Facial Expression Scale. They manually labeled these AUs—nostril deformation, rotation of each ear, and narrowing of each eye—in 480 photos of sheep. Then they trained a machine-learning algorithm by feeding it 90% of the photos and their labels, and tested the algorithm on the remaining 10%. The program's average accuracy at identifying the AUs was 67%, about as accurate as the average human, the researchers will report today at the IEEE International Conference on Automatic Face and Gesture Recognition in Washington, D.C. Ears were the most telling cue.
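The pipeline described above (label five AUs per photo, train on 90% of 480 images, test on the held-out 10%, report mean per-AU accuracy) can be sketched roughly as follows. Everything here is a hypothetical stand-in: the feature vectors, the synthetic data, and the toy nearest-centroid classifier are placeholders, since the excerpt does not specify the actual features or learning algorithm the Cambridge team used.

```python
import random

# Hypothetical stand-in for the five facial action units (AUs) named
# in the excerpt: nostril deformation, each ear's rotation, each eye's
# narrowing. Each "photo" is one feature per AU plus a binary label per AU.
AUS = ["nostril", "left_ear", "right_ear", "left_eye", "right_eye"]

def split_90_10(data, seed=0):
    """Shuffle and split into 90% train / 10% test, as in the study."""
    rng = random.Random(seed)
    data = data[:]
    rng.shuffle(data)
    cut = int(0.9 * len(data))
    return data[:cut], data[cut:]

def train_centroids(train):
    """Toy classifier: per-AU mean feature value for each label (0/1)."""
    model = {}
    for i, au in enumerate(AUS):
        for label in (0, 1):
            vals = [feats[i] for feats, labels in train if labels[i] == label]
            model[(au, label)] = sum(vals) / len(vals)
    return model

def predict(model, feats):
    """Assign each AU the label whose centroid is nearer to its feature."""
    pred = []
    for i, au in enumerate(AUS):
        d0 = abs(feats[i] - model[(au, 0)])
        d1 = abs(feats[i] - model[(au, 1)])
        pred.append(0 if d0 <= d1 else 1)
    return pred

def mean_accuracy(model, test):
    """Average per-AU accuracy over the held-out 10%."""
    correct = total = 0
    for feats, labels in test:
        for p, y in zip(predict(model, feats), labels):
            correct += (p == y)
            total += 1
    return correct / total

# Synthetic demo data: 480 "photos" whose features correlate with labels.
rng = random.Random(42)
data = []
for _ in range(480):
    labels = [rng.randint(0, 1) for _ in AUS]
    feats = [lab + rng.gauss(0, 0.8) for lab in labels]
    data.append((feats, labels))

train, test = split_90_10(data)
model = train_centroids(train)
print(f"mean per-AU accuracy: {mean_accuracy(model, test):.2f}")
```

On this synthetic data the toy model lands well above chance, which is all the sketch is meant to show: the reported 67% figure is likewise an average over AUs on a small held-out set (48 photos), so per-AU accuracies can vary considerably around the mean.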