Allyson Klein predicts inference spreading from cloud to edge, agentic oversight reshaping ops, privacy battles intensifying, scientific computing facing brain drain, and quantum finally breaking through.
RackRenew remanufactures OCP-compliant infrastructure into certified, warranty-backed assemblies—standardized, tested, and ready to deploy for faster capacity without bespoke engineering.
These data-infrastructure shifts will determine which enterprises scale AI in 2026, from real-time context and agentic guardrails to governance, efficiency, and a more resilient data foundation.
From circularity to U.S. assembly, Giga Computing lays out a rack-scale roadmap tuned for the next phase of AI—where inference drives scale and regional supply chains become a competitive edge.
In Part 2 of Matty Bakkeren’s 2026 predictions series, he explores how regulation, sovereignty, and public trust will push data centers to behave more like utilities than tech projects.
Arm’s OCP board seat and new FCSA spec push chiplet interoperability from idea to implementation—enabling mix-and-match silicon and smarter storage so teams can build AI without hyperscaler budgets.
From WEKA’s memory grid and exabyte storage to 800G fabrics, liquid-cooled AI factories, edge clusters, and emerging quantum accelerators, SC25 proved HPC is now about end-to-end AI infrastructure.
Xeon 6 marries P-cores, E-cores, and scalable memory to feed data-hungry HPC workloads, eliminating bandwidth bottlenecks so spectral sims and other memory-bound codes can finally scale.
On Day 1 of KubeCon + CloudNativeCon Atlanta, CNCF unveiled Kubernetes AI Conformance to make workloads portable—arriving as inference surges to ~1.33 quadrillion tokens/month across Google’s systems.
Multi-year agreement makes VAST AI OS CoreWeave’s primary data foundation, aligning roadmaps for instant dataset access and performance across training and real-time inference in global AI cloud regions.
At the OCP Global Summit, Avayla CEO Kelley Mullick reveals how rapid standardization and hybrid cooling strategies are reshaping infrastructure for the AI era.
Open models move fast—but production doesn’t forgive surprises. Lynn Comp maps how to pair open-source AI with a solid CPU foundation and orchestration to scale from pilot to platform.
TechArena host Allyson Klein chats with Open Compute Project Foundation leaders Michael Schill and Steve Helvie about the organization’s rising contributions and what it means for broad adoption of open hardware configurations from edge to cloud.
TechArena host Allyson Klein chats with OpenUK CEO Amanda Brock live from the OCP Regional Summit in Prague on her organization’s mission to drive open software, hardware and data contributions for UK developers.
TechArena host Allyson Klein chats with Oracle VP Shasank Chavan about in-memory databases, customer demands in a data-centric world, and how infrastructure must change to fuel customer needs.
TechArena host Allyson Klein chats with Alphawave Semi’s Letizia Giuliano about the future of semiconductor innovation across memory, optical, and interoperable chiplet solutions, and how her company is poised to deliver industry-leading innovation rooted in standards.
TechArena host Allyson Klein chats with MemVerge founder and CEO Charles Fan about his company’s disruptive vision for breaking through data center memory limitations and what the CXL standard will bring to infrastructure innovation.
TechArena host Allyson Klein chats with AMD senior fellow and CXL technical task force co-chair Mahesh Wagh about AMD’s entry of CXL platforms into the market with 4th Gen AMD EPYC processors and his organization’s strategy to deliver disruptive innovation utilizing CXL capability in the years ahead.
Rose-Hulman Institute of Technology shares how Azure Local, AVD, and GPU-powered infrastructure are transforming IT operations and enabling device-agnostic access to high-performance engineering software.
From SC25 in St. Louis, Nebius shares how its neocloud, Token Factory PaaS, and supercomputer-class infrastructure are reshaping AI workloads, enterprise adoption, and efficiency at hyperscale.
At #OCPSummit25, Allyson Klein and Jeniece Wnorowski sit down with Giga Computing’s Chen Lee to unpack GIGAPOD and GPM, DLC/immersion cooling, regional assembly, and the pivot to inference.
From #OCPSummit25, this Data Insights episode unpacks how RackRenew remanufactures OCP-compliant racks, servers, networking, power, and storage—turning hyperscaler discards into ready-to-deploy capacity.
Allyson Klein and co-host Jeniece Wnorowski sit down with Arm’s Eddie Ramirez to unpack Arm Total Design’s growth, the FCSA chiplet spec contribution to OCP, a new board seat, and how storage fits AI’s surge.
Midas Immersion Cooling CEO Scott Sickmiller joins a Data Insights episode at OCP 2025 to demystify single-phase immersion, natural vs. forced convection, and what it takes to do liquid cooling at AI scale.