Recently, veterinarians developed a protocol for estimating how much pain a sheep is in from its facial expressions, but humans apply it inconsistently, and manual ratings are time-consuming. Computer scientists at the University of Cambridge in the United Kingdom have stepped in to automate the task. They started by listing several "facial action units" (AUs) associated with different levels of pain, drawing on the Sheep Pain Facial Expression Scale. They manually labeled these AUs—nostril deformation, rotation of each ear, and narrowing of each eye—in 480 photos of sheep. Then they trained a machine-learning algorithm by feeding it 90% of the photos along with their labels, and tested it on the remaining 10%. The program's average accuracy at identifying the AUs was 67%, roughly matching the average human rater, the researchers will report today at the IEEE International Conference on Automatic Face and Gesture Recognition in Washington, D.C. Ears were the most telling cue.
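The evaluation pipeline described above—label photos with facial action units, hold out 10% for testing, train on the remaining 90%, and report per-AU accuracy—can be sketched in a few lines. This is a hypothetical illustration only: the random features, the toy nearest-centroid classifier, and the binary AU labels are all stand-ins, not the researchers' actual model or data.

```python
import numpy as np

rng = np.random.default_rng(0)

# 480 labeled sheep photos (figure from the article); feature size is a placeholder
n_photos, n_features = 480, 64
au_names = ["nostril_deformation", "left_ear_rotation", "right_ear_rotation",
            "left_eye_narrowing", "right_eye_narrowing"]

X = rng.normal(size=(n_photos, n_features))             # stand-in image features
y = rng.integers(0, 2, size=(n_photos, len(au_names)))  # stand-in binary AU labels

# 90/10 train/test split, as described in the article
idx = rng.permutation(n_photos)
split = int(0.9 * n_photos)
train_idx, test_idx = idx[:split], idx[split:]

def nearest_centroid_predict(X_tr, y_tr, X_te):
    """Toy per-AU classifier: predict the label of the nearer class centroid."""
    preds = np.empty((len(X_te), y_tr.shape[1]), dtype=int)
    for j in range(y_tr.shape[1]):
        c0 = X_tr[y_tr[:, j] == 0].mean(axis=0)  # centroid of "AU absent" photos
        c1 = X_tr[y_tr[:, j] == 1].mean(axis=0)  # centroid of "AU present" photos
        d0 = np.linalg.norm(X_te - c0, axis=1)
        d1 = np.linalg.norm(X_te - c1, axis=1)
        preds[:, j] = (d1 < d0).astype(int)
    return preds

preds = nearest_centroid_predict(X[train_idx], y[train_idx], X[test_idx])
# Accuracy per facial action unit on the held-out 10%; with random labels this
# hovers near chance, whereas the study reported 67% on real photos
per_au_accuracy = (preds == y[test_idx]).mean(axis=0)
print(dict(zip(au_names, per_au_accuracy.round(2))))
```

Swapping the random features for real image descriptors and the centroid rule for a trained classifier would give the same overall shape as the study's evaluation: one accuracy score per action unit, averaged into a single headline figure.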