This new category is sparking a revolution in data center architecture, in which all applications will run in memory. Until now, in-memory computing has been restricted to a narrow range of workloads because of DRAM's limited capacity and volatility, and the lack of software support for high availability. Big Memory Computing combines DRAM, persistent memory, and Memory Machine software so that memory is abundant, persistent, and highly available.
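As a minimal sketch of the persistence idea, the example below memory-maps a file and treats it as a byte-addressable region whose contents survive process restarts. Real persistent memory is typically exposed through a DAX-mounted filesystem rather than an ordinary file, but the mmap access pattern is the same; the file path here is hypothetical and this is not MemVerge's implementation.

```python
import mmap
import os
import struct

PATH = "/tmp/pmem_demo.bin"  # hypothetical; real PMEM would be a DAX-mounted file
SIZE = 4096

# Create a fixed-size backing region once (persistent memory would already exist).
if not os.path.exists(PATH):
    with open(PATH, "wb") as f:
        f.write(b"\x00" * SIZE)

# "Store" phase: write a value directly into the byte-addressable region.
with open(PATH, "r+b") as f:
    buf = mmap.mmap(f.fileno(), SIZE)
    struct.pack_into("<Q", buf, 0, 42)  # 64-bit counter at offset 0
    buf.flush()                         # flush stores to the backing medium

# "Restart" phase: a later process maps the same region and finds the value intact.
with open(PATH, "r+b") as f:
    buf = mmap.mmap(f.fileno(), SIZE)
    (value,) = struct.unpack_from("<Q", buf, 0)
    print(value)  # 42
```

The point of the sketch is that persistence lives below the load/store interface: the application reads and writes ordinary memory, and durability comes from the medium backing the mapping.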
Transparent Memory Service
* Scale-out to Big Memory configurations
* 100x more capacity than current memory
* No application changes
Big Memory Machine Learning and AI
* Model and feature libraries today are often split between DRAM and SSD because of insufficient DRAM capacity, slowing performance
* MemVerge Memory Machine pools the DRAM and PMEM capacity of the cluster, allowing the model and feature libraries to reside entirely in memory
* Transactions per second (TPS) can be increased 4X, while inference latency can be improved 100X
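The tiering claim above can be sketched as a toy feature store: when the table fits entirely in the memory tier, every lookup is an in-memory hit; when capacity forces part of it onto SSD, misses pay a file round-trip. The class name, capacities, and spill format are illustrative assumptions, not MemVerge APIs.

```python
import json
import os
import tempfile

class FeatureStore:
    """Toy two-tier store: hot features in a dict (memory tier),
    cold features spilled to a JSON file (stand-in for SSD)."""

    def __init__(self, memory_capacity):
        self.memory_capacity = memory_capacity
        self.hot = {}                                    # memory tier
        self.cold = {}
        self.cold_path = os.path.join(tempfile.mkdtemp(), "cold.json")
        self.misses = 0                                  # lookups served from the SSD tier

    def put(self, key, features):
        if len(self.hot) < self.memory_capacity:
            self.hot[key] = features
        else:
            self.cold[key] = features
            with open(self.cold_path, "w") as f:         # spill to "SSD"
                json.dump(self.cold, f)

    def get(self, key):
        if key in self.hot:                              # memory hit: no I/O
            return self.hot[key]
        self.misses += 1
        with open(self.cold_path) as f:                  # SSD round-trip
            return json.load(f)[key]

# With enough memory, every lookup stays in the memory tier;
# with too little, some lookups fall through to storage.
big = FeatureStore(memory_capacity=100)
small = FeatureStore(memory_capacity=1)
for i in range(3):
    big.put(f"user{i}", [i, i * 2])
    small.put(f"user{i}", [i, i * 2])
for i in range(3):
    big.get(f"user{i}")
    small.get(f"user{i}")
print(big.misses, small.misses)  # 0 2
```

Pooling DRAM and PMEM is, in this analogy, equivalent to raising `memory_capacity` until the cold tier is never touched, which is the mechanism behind the latency improvement described above.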