Let's start with the basics. You might already know that a data center is essentially a large warehouse filled with thousands of servers that run 24/7.
AI companies like Anthropic, OpenAI, and Google use data centers in two main ways:
Training AI models – This is incredibly compute-intensive. Training a model like the ones powering OpenAI's ChatGPT or Anthropic's Claude required running calculations across thousands of specialized chips (GPUs) simultaneously for weeks or months.
Running AI services – When you converse with those models' chatbots, your messages go to a data center where servers process them and send back the model's response. Multiply that by millions of users having conversations simultaneously, and you need enormous computing power ready on demand.
AI companies need data centers because they provide the coordinated power of thousands of machines working in tandem on these functions, plus the infrastructure to keep them running reliably around the clock.
To that end, these facilities are always online with ultra-fast internet connections, and they have vast cooling systems to keep those servers running at peak performance levels. All this requires a lot of power, which puts a strain on the grid and squeezes local resources.
So what's this noise about data centers in space? The idea's been bandied about for a while now as a potentially far better alternative, one that can harness near-constant solar energy and radiative cooling hundreds of miles above the ground in low Earth orbit.
Powerful GPU-equipped servers would be contained in satellites, and they'd move through space together in constellations, beaming data back and forth as they travel around the Earth from pole to pole in a sun-synchronous orbit.
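To get a feel for how fast such a constellation would circle the planet, Kepler's third law gives the orbital period at a given altitude. The ~550 km altitude below is an illustrative assumption typical of low Earth orbit, not a figure from any of these companies' plans:

```python
import math

MU_EARTH = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6          # mean Earth radius, m

def orbital_period_minutes(altitude_m: float) -> float:
    """Circular-orbit period from Kepler's third law: T = 2*pi*sqrt(a^3/mu)."""
    a = R_EARTH + altitude_m  # semi-major axis of a circular orbit
    return 2 * math.pi * math.sqrt(a**3 / MU_EARTH) / 60

# A satellite at an assumed ~550 km altitude circles Earth roughly
# every 95 minutes -- about 15 orbits per day.
print(f"{orbital_period_minutes(550e3):.1f} min")
```

One appeal of the sun-synchronous family of near-polar orbits is that a dawn-dusk variant keeps a satellite's solar panels in sunlight for nearly the entire orbit, sidestepping the battery-heavy eclipse periods of other orbits.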
The thinking behind space data centers is that they'll let operators scale up compute resources far more easily than on Earth. Up there, operators aren't constrained by the availability of power, real estate, or the fresh water needed for cooling.
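Cooling is the catch, though: with no air or water in orbit, waste heat can only leave by radiation, and the Stefan-Boltzmann law gives a rough sense of the radiator area required. The 100 kW server load and 300 K panel temperature below are illustrative assumptions, and the sketch ignores sunlight absorbed by the radiator, so real panels would need to be larger:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2*K^4)

def radiator_area_m2(heat_watts: float, temp_k: float,
                     emissivity: float = 0.9) -> float:
    """Minimum one-sided radiator area to reject heat_watts at temp_k,
    radiating to deep space (absorbed solar and Earth flux ignored)."""
    return heat_watts / (emissivity * SIGMA * temp_k**4)

# Rejecting an assumed 100 kW server load at 300 K (~27 degrees C)
# already calls for a radiator on the order of a few hundred m^2.
print(f"{radiator_area_m2(100e3, 300):.0f} m^2")
```

That area scales linearly with heat load, which is why radiator mass and deployment are a central engineering question for any orbital data center.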
There are a number of firms getting in on the action, including big familiar names and plucky upstarts. You've got Google partnering with Earth monitoring company Planet on Project Suncatcher to launch a couple of prototype satellites by next year. Aetherflux, a startup that was initially all about beaming down solar power from space, now intends to make a data center node in orbit available for commercial use early next year. Nvidia-backed Starcloud, which is focused exclusively on space-based data centers, sent a GPU payload into space last November, and trained and ran a large language model on it.
The latest to join the fold is SpaceX, which is set to merge with Elon Musk's AI company xAI in a purported US$1.25-trillion deal with a view to ushering in the era of orbital data centers.