Open collaboration just leveled up: OCP pushes shared specs from rack to data center—power, cooling, networking, and ops—so AI capacity can scale faster, with less friction and more choice.
Arm’s appointment to the Open Compute Project Foundation board of directors and its contribution of the Foundation Chiplet System Architecture (FCSA) spec underscore the company’s ascendancy in hyperscale and AI data centers.
CelLink’s ultrathin flex harnesses usher in a new era of compute infrastructure innovation, cutting cable volume by up to 90% and boosting density, reliability, and efficiency.
As AI workloads push storage power consumption higher, true storage efficiency demands systems-level thinking that spans hardware, software, and better metrics for choosing the right drives.
Dave Driggers, CEO of Cirrascale, breaks down what “compute efficiency” really means, from GPU utilization and TCO modeling to token-based pricing that drives predictable customer value.
From data center to edge, Arm is enabling full-stack AI efficiency, powering ecosystems with performance-per-watt optimization, tailored silicon, and software portability across environments.
From feeding data-hungry GPUs to enabling real-time on-set visual effects, flash storage has evolved from luxury to necessity in modern content creation pipelines.
Surveying 250 IT pros, we found 29% already run SSDs beyond performance tiers, 81% would migrate when TCO wins, and storage innovation is a top lever to free power and space across the data center.
Our flagship podcast earned a Stevie® in the International Business Awards® annual competition; judges called out the show’s high production quality, editorial clarity, and guest caliber.
FMS 2025 celebrated Jim Pappas’ lifetime of standards leadership while showcasing how AI is reshaping memory and storage with record-breaking innovation across the industry.
A unified security-first platform is eliminating decades of fragmented IT vulnerabilities while delivering military-grade protection at half the cost.
Hypertec’s immersion-born servers cut cooling power by 50% and shrink 10MW deployments from 100,000 sq ft to around 10,000—showing liquid cooling isn’t experimental anymore; it’s essential.
Midas Immersion Cooling CEO Scott Sickmiller joins a Data Insights episode at OCP 2025 to demystify single-phase immersion, natural vs. forced convection, and what it takes to do liquid cooling at AI scale.
From hyperscale direct-to-chip to micron-level realities: Darren Burgess (Castrol) explains dielectric fluids, additive packs, particle risks, and how OCP standards keep large deployments on track.
From OCP San Jose, PEAK:AIO’s Roger Cummings explains how workload-aware file systems, richer memory tiers, and capturing intelligence at the edge reduce cost and complexity.
Innovative power delivery unlocks a shift in data-center design. CelLink PowerPlane routes thousands of amps in a flat, flexible circuit—cutting cabling and accelerating AI factory builds.
Recorded live at OCP in San Jose, Allyson Klein talks with CESQ’s Lesya Dymyd about hybrid quantum-classical computing, the new Maison du Quantique, and how real-world use cases may emerge over the next 5–7 years.
CEO Carl Schlachte joins TechArena at OCP Summit to share how Ventiva’s solid-state cooling—proven in dense laptops—scales to servers, cutting noise, complexity, and power while speeding deployment.
From OCP Summit San Jose, Allyson Klein and co-host Jeniece Wnorowski interview Dr. Andrew Chien (UChicago & Argonne) on grid interconnects, rack-scale standards, and how openness speeds innovation.
From the OCP Global Summit in San Jose, Allyson Klein sits down with Chris Butler of Flex to unpack how the company is collapsing the gap between IT and power—literally and figuratively.
From OCP Summit 2025, Kelley Mullick joins Allyson Klein and co-host Jeniece Wnorowski for a Data Insights episode on rack-scale design, hybrid cooling (including immersion heat recapture), and open standards.
At OCP’s 2025 global summit, Momenthesis founder Matty Bakkeren joins Allyson Klein to explore why open standards and interoperability are vital to sustaining AI innovation at datacenter scale.
AMI CEO Sanjoy Maity joins In the Arena to unpack the company's shift to open source firmware, OCP contributions, OpenBMC hardening, and the rack-scale future—cooling, power, telemetry, and RAS built for AI.
Graid Technology takes on Intel VROC licensing for data center and workstation customers, extending its RAID portfolio to offer both CPU-integrated and GPU-accelerated solutions.