
AI Hardware & Edge AI Summit 2024 takes place from September 9-12, 2024, in San Jose, California. This summit is a key event for the AI and machine learning ecosystem, focusing on the deployment and scaling of machine learning systems at the edge. The event covers a range of topics including AI hardware, infrastructure, model training, and deployment strategies, making it essential for professionals involved in AI development and implementation.
As AI breaks the networking playbook and data centers hit the power wall, the optics industry enters a chaotic “2003 moment.” Mark Grodzinsky explores why the lessons of Wi-Fi will define the winners of the AI era.
AI demand is tightening HDD and NAND supply—and prices may follow. VAST is betting on flash reclamation and KV-cache persistence as storage starts acting more like memory.
Deterministic wireless is becoming the nervous system of AI. As robots and XR scale, “best effort” turns into business risk—and networks must deliver predictable, identity-driven, secure performance.
Discover how Ayar Labs' Optical I/O tech is solving AI data bottlenecks, boosting performance, and driving new metrics for profitability, interactivity, and scalability in next-gen AI infrastructure.
Check out our in-depth look at key takeaways from the AI Hardware & Edge AI Summit, highlighting how advancements in AI infrastructure, acceleration, and connectivity are transforming the future of computing.
Letizia Giuliano of Alphawave Semi discusses advancements in AI connectivity, chiplet designs, and the path toward open standards at the AI Hardware Summit with host Allyson Klein.
Sean Lie of Cerebras Systems shares insights on cutting-edge AI hardware, including their game-changing wafer-scale chips, Llama model performance, and innovations in inference and efficiency.
Lisa Spelman, CEO of Cornelis Networks, discusses the future of AI scale-out, Omni-Path architecture, and how their innovative solutions drive performance, scalability, and interoperability in data centers.
Join Sascha Buehrle of Uptime Industries as he reveals how Lemony AI delivers scalable, secure, on-premise solutions that accelerate the adoption of generative AI.
Mark Wade, CEO of Ayar Labs, explains how optical I/O technology is enhancing AI infrastructure, improving data movement, reducing bottlenecks, and driving efficiency in large-scale AI systems.
Allyson Klein reflects on her fascinating chat with guest Gayathri "G" Radhakrishnan, Partner at Hitachi Ventures, who discussed innovation in the AI space, the future of AI adoption, and how long it will take for enterprises to adopt AI technology at scale.
Neeraj Kumar, Chief Data Scientist at PNNL, discusses AI's role in scientific discovery, energy-efficient computing, and collaboration with Micron to advance memory systems for AI and high-performance computing.
We’re homing in on these trends for 2025: Will NVIDIA GPUs face real competition? What innovations will we see in AI fabrics? And how will our planet support the power-hungry future of AI?