Recently, veterinarians have developed a protocol for estimating the pain a sheep is in from its facial expressions, but humans apply it inconsistently, and manual ratings are time-consuming. Computer scientists at the University of Cambridge in the United Kingdom have stepped in to automate the task. They started by listing several "facial action units" (AUs) associated with different levels of pain, drawing on the Sheep Pain Facial Expression Scale. They manually labeled these AUs—nostril deformation, rotation of each ear, and narrowing of each eye—in 480 photos of sheep. Then they trained a machine-learning algorithm by feeding it 90% of the photos and their labels, and tested the algorithm on the remaining 10%. The program's average accuracy at identifying the AUs was 67%, about as accurate as the average human, the researchers will report today at the IEEE International Conference on Automatic Face and Gesture Recognition in Washington, D.C. Ears were the most telling cue.
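The evaluation protocol is straightforward to outline in code. The sketch below mirrors only what the article states: 480 labelled photos, a 90/10 train/test split, one recognition task per facial action unit, and accuracy averaged across AUs. Everything else is an assumption made for illustration: the placeholder feature vectors (standing in for whatever face descriptors the Cambridge team extracted), the three-level AU labels, and the choice of a linear SVM as the classifier are not specified in the source.

```python
# Minimal sketch of the reported protocol: 90/10 split over labelled sheep
# photos, one classifier per facial action unit (AU), accuracy averaged
# across AUs. Features, labels, and classifier choice are assumptions.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

N_PHOTOS = 480           # photos manually labelled in the study
AUS = ["left_ear", "right_ear", "left_eye", "right_eye", "nostrils"]
N_FEATURES = 64          # placeholder dimensionality for per-photo descriptors

# Placeholder data: in the real pipeline these would be descriptors
# extracted from each sheep's face, paired with the manual AU labels.
X = rng.normal(size=(N_PHOTOS, N_FEATURES))
y = {au: rng.integers(0, 3, size=N_PHOTOS) for au in AUS}  # e.g. 0=absent, 1=mild, 2=marked

# 90% of photos for training, 10% held out for testing, as in the paper.
train_idx, test_idx = train_test_split(
    np.arange(N_PHOTOS), test_size=0.1, random_state=0
)

accuracies = []
for au in AUS:
    clf = SVC(kernel="linear")                  # assumed classifier choice
    clf.fit(X[train_idx], y[au][train_idx])
    pred = clf.predict(X[test_idx])
    acc = accuracy_score(y[au][test_idx], pred)
    accuracies.append(acc)
    print(f"{au:10s} accuracy: {acc:.2f}")

print(f"mean accuracy across AUs: {np.mean(accuracies):.2f}")  # study reports ~0.67
```

Averaging per-AU accuracy in this way is one plausible reading of the "average accuracy of 67%" figure; with real features, the ear-rotation units would be expected to score highest, consistent with the finding that ears were the most telling cue.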