Gigabyte: From Gaming Gear to Open Compute Trailblazer

Data Center
Allyson Klein
December 12, 2024

When you think of Gigabyte, gaming hardware probably comes to mind. But this Taiwan-based manufacturer develops much more than motherboards and graphics cards – the company offers a broad spectrum of computer hardware, along with liquid and immersion cooling, and has a long history of contributing to open standards and advancing server technologies.

I recently had the pleasure of chatting with Chen Lee, VP of Sales, HPC, Data Center and Enterprise for Giga Computing, and learned a fascinating tidbit about how the company became involved in the Open Compute Project Foundation (OCP).

“Around 2004, this very little-known company came to us and said, ‘We’ve got a search engine, and we want to build this motherboard and this thing called OpenRack,’” Chen explained.

The little-known company was Google, he said.

“So that's how we got into OCP,” Chen said. “(Gigabyte was) actually the first company to help Google develop open compute.”

Gigabyte’s collaboration with Google on OpenRack marked the company’s entry into the open infrastructure movement, making them one of the initial contributors to OCP standards.

Today, Gigabyte’s portfolio extends beyond Intel and AMD servers — they also produce Arm-based solutions using Ampere technology and specialize in advanced cooling systems like immersion and direct-to-chip liquid cooling. With this holistic approach, they continue to drive efficiency and performance in the data center space, reflecting the company’s adaptability and forward-thinking mindset.

Embracing AI: Gigabyte's Focus on GPU Servers

Artificial Intelligence (AI) has reshaped the demands on data centers, particularly in terms of computing power and infrastructure. Chen discussed how Gigabyte has been positioning itself in the AI hardware game, particularly through high-density GPU servers. He shared a pivotal moment for Gigabyte in 2010 when they introduced a 2U server that could support eight double-wide, dual-link GPUs, which at the time was the highest density on the market.

Today, Gigabyte’s expertise in GPU servers continues to be an asset, providing systems for AI model training and inferencing, using cutting-edge GPUs like Nvidia’s H100 and soon, Blackwell. As AI shifts towards edge deployments, Gigabyte is also preparing for the growing importance of edge inferencing, which Chen predicts will be a significant area of growth in the near future. Industries such as medical, finance, and retail are moving fast to adopt AI solutions at the edge, from convenience store smart shelving to real-time customer analytics. Gigabyte is ready to meet these needs with high-performance, scalable server technology that suits the unique challenges of edge computing.

Liquid Cooling and Efficiency in Data Centers

The demand for powerful servers to support AI training and inferencing has pushed energy consumption to unprecedented levels, making cooling a top priority. Chen highlighted how immersion and direct liquid cooling are allowing Gigabyte to manage energy efficiency better while meeting the needs of customers working on advanced AI projects. It’s a testament to the company’s adaptability and focus on sustainable solutions—aligning well with the OCP’s values of open innovation and energy efficiency.

AI Beyond the Data Center: The Future of Inference

Chen and I also discussed moving from centralized data center training to inferencing at the edge. Today, most inferencing still happens within large data centers, using high-power systems designed for training. But Chen believes that as AI technologies mature, edge inferencing will become critical—allowing smaller, more efficient hardware to perform tasks where the data is generated, such as in retail stores, hospitals, and banks.

Chen shared an interesting example involving a convenience store, where AI systems can detect customer behavior in real time and use edge servers tucked away in the back to deliver analytics directly to headquarters. The potential for rapid, on-site AI-driven insights will push industries to adopt smaller-scale AI inferencing solutions — a market that Gigabyte is well-positioned to serve.

This shift to the edge will transform how AI is implemented across industries, bringing smarter technology closer to users and changing how data centers interact with local environments. Chen also shared that, in his view, AI isn’t just a passing trend—it’s a new wave that’s here to stay.

So what’s the TechArena take? As AI evolves and the infrastructure to support it becomes more advanced, hats off to Gigabyte for doubling down on its strengths—high-performance GPU servers, innovative cooling technologies, and partnerships with leading hardware and storage vendors.

Thanks to Solidigm for sponsoring this delightful Data Insights discussion. In case you missed it, check out the full episode here. As AI and edge computing continue to advance, the innovations coming from companies like Gigabyte are paving the way for the data centers of tomorrow.
