LLMs have given attackers new angles. Fortinet showed, step by step, how AI-driven probes escalate—and how FortiGate, FortiWeb, FortiAnalyzer, and FortiSOAR close the door without slowing the business.
From racks and liquid loops to data placement and the pace of standards, five takeaways from CoreWeave, Dell, NVIDIA, Solidigm, and VAST Data on building AI factories that keep accelerators busy and dollars well spent.
At Cloud Field Day 24, Oxide outlines a vertically integrated rack—custom hypervisor, integrated power/network, and open integrations—aimed at bringing hyperscale efficiency and faster deploys to enterprise DCs.
Presenting at Cloud Field Day 24, Pure pitched fleet-level automation across mixed environments as the antidote to storage silos, promising one control plane for legacy systems and modern workloads.
A 2025 field guide for architects: why Arm’s software gravity and hyperscaler adoption make it the low-friction path today, where RISC-V is gaining ground, and the curveballs that could reshape both.
The cloud security architect who came from networking explains her framework for separating tech hype from genuine innovation, and why stepping away is key to solving hard problems.
Dave Driggers, CEO of Cirrascale, breaks down what “compute efficiency” really means, from GPU utilization and TCO modeling to token-based pricing that drives predictable customer value.
From data center to edge, Arm is enabling full-stack AI efficiency, powering ecosystems with performance-per-watt optimization, tailored silicon, and software portability across environments.
Real-Time Energy Routing (RER) treats electricity like data—modular, dynamic, and software-defined—offering a scalable path to resilient, sustainable data center power.
Intel shares insights on Arm vs. x86 efficiency, energy goals for 2030, AI-driven power demands, and how enterprises can navigate compute efficiency in the AI era.
From manure-to-energy renewable natural gas (RNG) to an aluminum-air system that generates electricity on demand, innovators tackled real AI bottlenecks: power-chain integration, rapid fiber turn-ups, AI-driven permitting, and plug-and-play capacity that speeds time-to-value.
AMD improved energy efficiency 38x—roughly a 97% drop in energy for the same compute—and now targets 20x rack-scale gains by 2030, reimagining AI training, inference, and data-center design.
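For readers checking the arithmetic behind those headline numbers, here is a minimal sketch (not from the article) of how an N-fold efficiency gain translates into a percentage energy reduction for the same amount of compute:

```python
# Quick arithmetic check: an N-fold efficiency gain means the same work
# uses 1/N of the original energy.
def energy_reduction_pct(gain: float) -> float:
    """Percent energy saved for the same compute, given an N-x efficiency gain."""
    return (1 - 1 / gain) * 100

print(f"38x gain   -> {energy_reduction_pct(38):.1f}% less energy")  # ~97.4%
print(f"20x target -> {energy_reduction_pct(20):.1f}% less energy")  # ~95.0%
```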
Ventiva CEO Carl Schlachte joins Allyson Klein to share how the company’s Ionic Cooling Engine is transforming laptops, servers, and beyond with silent, modular airflow.
Discover how JetCool’s proprietary liquid cooling is solving AI’s toughest heat challenges—keeping data centers efficient as workloads and power densities skyrocket.
Allyson Klein hosts Manu Fontaine (Hushmesh) and Jason Rogers (Invary) to unpack TEEs, attestation, and how confidential computing is moving from pilots to real deployments across data center and edge.
Allyson Klein and Jeniece Wnorowski welcome Mohan Potheri of Hypertec to explore how immersion cooling slashes energy use, shrinks data-center footprints, and powers sustainable, high-density AI, HPC, and edge solutions on this Data Insights episode.
SayTEC redefines IT with a zero trust, hyper-converged platform delivering sovereign cloud, seamless scalability, and military-grade security for critical industries.
In this podcast, Pragmatic Semi CEO David Moore shares how flexible, sustainable ICs are unlocking new edge AI and IoT applications—powered by a low-carbon, high-volume fab model.
In this In the Arena episode, Winbond's Jun Kawaguchi discusses the company's strategies for tackling next-gen cybersecurity threats and ensuring robust protection for the future.
In this podcast, PCI-SIG President Al Yanes explores PCI-SIG's journey to PCIe 7.0, advancements in copper and optical specs, and the standard's pivotal role in HPC and AI.
In this podcast, NCSA Director Bill Gropp explores the latest advanced computing trends, from AI innovations to groundbreaking climate-modeling research on supercomputers, and how this technology is transforming science and society.
Discover how AI is transforming data centers, with innovations in high-speed networking, emulation, and hyperscale infrastructure driving efficiency and performance in the era of AI workloads.
OCP’s Rob Coyle shares insights on AI, cooling innovations, and open hardware’s role in transforming data centers as the industry accelerates toward scalable, sustainable infrastructure.
In this episode, OCP CEO George Tchaparian shares how OCP is driving AI infrastructure innovation, fostering collaboration, and tackling scalability, efficiency, and sustainability challenges in data centers and beyond.