OpenAI’s GPT-5 outperforms rivals in coding, context retention, and accuracy—setting a new bar for enterprise AI while signaling a subtle shift toward openness.
Market share shakeups, pricing shocks, and a tectonic shift in the open internet: Intel’s Lynn Comp unpacks the 2025 AI developments that no one could have predicted.
Global surge in submissions reveals the pivotal role of storage in scaling AI training, with new checkpoint tests tackling failure resilience in massive accelerator clusters.
MLCommons launches industry-standard benchmarks for LLM performance on PCs, cutting through marketing hype and giving developers and enterprises the transparent metrics they need.
From Midjourney to Firefly, Part 2 of our ‘AI Zoo’ series breaks down how today’s top image models work—and how TechArena uses them to create powerful, responsible visuals.
As Chinese EV giants like BYD rise, German automakers are forging an unlikely alliance, but history shows such partnerships often crumble within months.
The EU’s massive investment in AI gigafactories positions Europe as a global AI contender, amid rising competition from China’s DeepSeek and the U.S.’s Stargate projects.
DeepSeek claims its open-source AI model rivals top LLMs at a fraction of the cost—but is the hype justified? Here’s what you need to know about its tech, impact, and potential risks.
During day two of the Oregon AI Conference, attendees focused on the ethical implications of AI and how small-to-medium-sized businesses (SMBs) can integrate AI into their operations.
The first day of the inaugural Oregon AI Conference showcased how quickly AI can unite small-to-medium-sized businesses, spotlighted DeepSeek’s evolution, and championed responsible innovation.
Our own Allyson Klein moderates a powerhouse panel on AI ethics, with panelists representing Loyola University Chicago, Google, MLCommons, VAST Data and Momethesis.
In this video from Chiplet Summit, Shekhar Kapoor discusses how Synopsys’ transition to a multi-die approach to chiplet development has allowed the company to innovate beyond the limitations of traditional monolithic chips.
From storage to automotive, MLPerf is evolving with industry needs. Hear David Kanter explain how community-driven benchmarking is enabling reliable and scalable AI deployment.
Solidigm’s Ace Stryker joins Allyson Klein and Jeniece Wnorowski on Data Insights to explore how partnerships and innovation are reshaping storage for the AI era.
With sustainability at the core, Iceotope is pioneering liquid cooling solutions that reduce environmental impact while meeting the demands of AI workloads at scale.
In this episode of In the Arena, David Glick, SVP at Walmart, shares how one of the world’s largest enterprises is fostering rapid AI innovation and empowering engineers to transform retail.
Haseeb Budhani, Co-Founder of Rafay, shares how his team is helping enterprises scale AI infrastructure across the globe, and why he believes we’re still in the early innings of adoption.
Direct from AI Infra 2025, AI Expert & Author Daniel Wu shares how organizations build trustworthy systems—bridging academia and industry with governance and security for lasting impact.
From the OCP Global Summit, hear why 50% GPU utilization is a “civilization-level” problem, and why open standards are key to unlocking underutilized compute capacity.
In the Arena: Allyson Klein with Axelera CMO Alexis Crowell on inference-first AI silicon, a customer-driven SDK, and what recent tapeouts reveal about the roadmap.
In this episode of Data Insights, host Allyson Klein and co-host Jeniece Wnorowski sit down with Dr. Rohith Vangalla of Optum to discuss the future of AI in healthcare.
From OCP Summit, Metrum AI CEO Steen Graham unpacks multi-agent infrastructure, SSD-accelerated RAG, and the memory-to-storage shift—plus a 2026 roadmap to boost GPU utilization, uptime, and time-to-value.
Anusha Nerella joins hosts Allyson Klein and Jeniece Wnorowski to explore responsible AI in financial services, emphasizing compliance, collaboration, and ROI-driven adoption strategies.
At AI Infra Summit, CTO Sean Lie shares how Cerebras is delivering instant inference, scaling cloud and on-prem systems, and pushing reasoning models into the open-source community.