
Supermicro Powers AI Innovation at CloudFest 2025
At CloudFest 2025, Supermicro showcased the innovations that are driving the future of AI, cloud infrastructure, and storage solutions. As AI technology continues to evolve, Supermicro’s ability to deliver cutting-edge hardware has become a game-changer, and the company’s booth at CloudFest served as a testament to that progress. Solidigm’s Hayley Corell spoke with Thomas Jorgensen, senior director in the Technology Enabling Group at Supermicro, to dive deeper into how Supermicro is powering AI advancements and meeting the growing demands of modern infrastructure.
Thomas highlighted the rapid growth in AI, noting that the demand for powerful AI infrastructure is being driven by large-scale model training and, increasingly, AI inferencing. But what's crucial for this advancement? A central element of that answer is storage. Supermicro understands that AI models require fast, reliable storage to keep GPUs from idling, ensuring that the entire infrastructure works in concert to deliver results as quickly as possible. As Thomas bluntly put it, “AI doesn't work without storage,” and Supermicro is delivering solutions to meet this growing demand.
Over the past few years, AI’s exponential growth has shifted the way companies approach infrastructure. As Supermicro and the rest of the industry ride the wave from training-centric infrastructure demand to a demand curve that also reflects inference, new kinds of infrastructure for a wider range of environments are required. As AI is increasingly integrated into edge environments, Supermicro is positioning itself at the forefront, enabling AI at the edge with small, fanless servers that run inferencing directly where data is generated. This localized approach reduces latency and speeds up data processing, ensuring AI workloads perform seamlessly.
A key part of Supermicro’s success lies in its commitment to delivering high-performance, low-latency infrastructure. Thomas discussed how AI clusters require not only powerful GPUs, but also fast network communication and efficient storage systems. The infrastructure design has evolved significantly to meet the demands of AI, particularly with the rise of high-density petascale storage solutions. Supermicro’s focus on providing multi-tiered storage setups ensures that data is delivered at optimal speeds for any given AI workload, enabling seamless performance across AI applications.
The collaboration between Solidigm and Supermicro has been crucial in driving these advancements, particularly in the realm of high-speed storage. Solidigm’s cutting-edge storage solutions, such as their high-capacity SSDs, perfectly complement Supermicro’s AI infrastructure. By combining Solidigm’s innovative storage technology with Supermicro’s powerful hardware, they deliver the performance and reliability required to handle the intense data demands of AI workloads.
This collaboration helps ensure that AI models can access and process data quickly, making it an essential part of AI-driven infrastructure.
Supermicro's petascale storage systems can integrate SSDs of up to 122 terabytes each. This massive capacity allows AI workloads to scale up and manage vast amounts of data with ease.
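To put that capacity in perspective, here is a minimal back-of-the-envelope sketch. The bay count and drive size are illustrative assumptions (a 1U chassis with 16 E3.S bays populated with 122TB-class drives), not figures quoted in the interview:

```python
# Back-of-the-envelope capacity math for a petascale all-flash node.
# Assumptions (illustrative, not a specific Supermicro SKU): a 1U chassis
# with 16 E3.S bays, each populated with a 122.88TB SSD.

BAYS_PER_NODE = 16          # assumed E3.S bay count for a 1U petascale node
DRIVE_CAPACITY_TB = 122.88  # 122TB-class QLC SSD capacity

node_capacity_tb = BAYS_PER_NODE * DRIVE_CAPACITY_TB
node_capacity_pb = node_capacity_tb / 1000

print(f"Raw capacity per node: {node_capacity_tb:,.1f} TB (~{node_capacity_pb:.2f} PB)")
# -> Raw capacity per node: 1,966.1 TB (~1.97 PB)
```

Under those assumptions, a single 1U node approaches two petabytes of raw flash, which is what makes dense, local data lakes for AI practical in a small footprint.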
So, what’s the TechArena take? For on-prem AI deployments, tapping large volumes of data locally for AI integration across business functions is becoming increasingly critical, especially as many businesses shift away from the cloud due to rising costs and data privacy concerns. Supermicro’s petascale storage delivers the speed and bandwidth needed to support the growing demands of AI models, ensuring that organizations can keep up with both the scale and complexity of modern AI workloads right from their own data centers. Solidigm’s leading 122TB drives are a perfect match for these large-scale deployments.
For those looking to learn more, Supermicro offers an abundance of resources on their website (supermicro.com) and social media channels (X, LinkedIn and YouTube).