Supermicro Enhances NVIDIA Blackwell Solutions with Liquid-Cooled NVIDIA HGX B300 Systems for Hyperscale AI Deployments

Supermicro Launches New Liquid-Cooled NVIDIA HGX B300 Systems



Super Micro Computer, Inc. (SMCI) has expanded its NVIDIA Blackwell architecture portfolio with two new systems: a 4U and a 2-OU (OCP) liquid-cooled NVIDIA HGX B300. Designed to meet the increasing demands of hyperscale data centers and AI factory deployments, these systems promise exceptional performance, density, and energy efficiency.

The Need for High-Density Computing


With demand for AI infrastructure growing rapidly, companies are searching for solutions that deliver intensive computing power without driving energy costs out of control. Supermicro's latest offerings are positioned to address this demand, maximizing GPU density while minimizing power consumption.

The 4U liquid-cooled NVIDIA HGX B300 systems can accommodate up to 64 GPUs in a standard 19-inch EIA rack, capturing nearly 98% of the heat produced via Supermicro's DLC-2 (Direct Liquid-Cooling) technology. The compact 2-OU (OCP) model, by contrast, is built to the 21-inch OCP Open Rack V3 specification and supports up to 144 GPUs in a single rack.
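To put those density figures in perspective, the short Python sketch below works through the rack arithmetic implied above. The eight-GPU-per-system count comes from later in this article; the number of systems per rack is inferred from the quoted rack totals, and the 100 kW rack power figure is purely an illustrative assumption, not a Supermicro specification.

```python
# Illustrative rack-density arithmetic based on the figures quoted in the article.
# Systems-per-rack counts are inferred from the stated rack totals (assumptions).

GPUS_PER_SYSTEM = 8  # NVIDIA Blackwell Ultra GPUs per HGX B300 system

# 4U systems in a standard 19-inch EIA rack (rack total quoted as 64 GPUs)
systems_per_eia_rack = 64 // GPUS_PER_SYSTEM      # -> 8 systems
print(f"19-inch EIA rack: {systems_per_eia_rack} x 4U systems = "
      f"{systems_per_eia_rack * GPUS_PER_SYSTEM} GPUs")

# 2-OU systems in a 21-inch OCP Open Rack V3 (rack total quoted as 144 GPUs)
systems_per_orv3_rack = 144 // GPUS_PER_SYSTEM    # -> 18 systems
print(f"ORV3 rack: {systems_per_orv3_rack} x 2-OU systems = "
      f"{systems_per_orv3_rack * GPUS_PER_SYSTEM} GPUs")

# Split a hypothetical 100 kW rack load between liquid and air cooling,
# using the ~98% heat-capture figure quoted for DLC-2.
rack_power_kw = 100  # illustrative only
liquid_kw = 0.98 * rack_power_kw
print(f"Liquid-captured heat: {liquid_kw:.0f} kW; "
      f"air-side remainder: {rack_power_kw - liquid_kw:.0f} kW")
```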

Features and Benefits


In addition to maximizing space efficiency, the new NVIDIA HGX B300 systems are designed with advanced liquid-cooling solutions that significantly reduce overall power consumption and cooling costs. CEO Charles Liang stated, "With AI infrastructure demand accelerating globally, our new liquid-cooled NVIDIA HGX B300 systems deliver the performance density and energy efficiency that hyperscalers and AI factories need today."

The 2-OU (OCP) system's design is characterized by a modular GPU/CPU tray architecture, featuring blind-mate manifold connections that simplify maintenance and enhance serviceability while maintaining a low rack footprint. Each system can handle up to eight NVIDIA Blackwell Ultra GPUs, maximizing compute power while minimizing the overall space needed within the data center.

For those running traditional rack setups, Supermicro also offers the 4U Front I/O HGX B300 system, which delivers the same compute capabilities as the 2-OU version in the conventional 19-inch rack format found in most existing data centers. This makes it easier for businesses to integrate the new systems without overhauling their current infrastructure.

Accelerating AI Workloads


Both new systems come equipped with 2.1TB of HBM3e GPU memory, allowing them to accommodate larger AI models and workloads. They also double compute-fabric network throughput to as high as 800Gb/s when paired with NVIDIA Quantum-X800 InfiniBand or NVIDIA Spectrum-4 Ethernet. Such advancements are pivotal for demanding AI tasks, including training complex models and running large-scale inference across diverse datasets.
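As a rough illustration of what that memory and bandwidth budget implies, the sketch below estimates how many FP8 parameters could reside in 2.1TB of HBM3e and how long a single 800Gb/s link would take to stream them. The one-byte-per-parameter format and the 20% overhead reserved for KV cache and activations are assumptions for the exercise, not vendor specifications.

```python
# Back-of-the-envelope sizing for a single HGX B300 system (all figures illustrative).

HBM_PER_SYSTEM_TB = 2.1   # total HBM3e per system, as quoted above
BYTES_PER_PARAM = 1       # assumes FP8 weights
OVERHEAD_FRACTION = 0.20  # assumed headroom for KV cache / activations

usable_bytes = HBM_PER_SYSTEM_TB * 1e12 * (1 - OVERHEAD_FRACTION)
max_params = usable_bytes / BYTES_PER_PARAM
print(f"~{max_params / 1e12:.1f} trillion FP8 parameters fit in GPU memory "
      f"under these assumptions")

# Time to stream that many weights over a single 800 Gb/s fabric link.
link_gbps = 800
transfer_seconds = (max_params * BYTES_PER_PARAM * 8) / (link_gbps * 1e9)
print(f"Streaming those weights over one 800 Gb/s link takes ~{transfer_seconds:.0f} s")
```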

As Supermicro continues to innovate and expand its offerings, the newly launched systems stand out as a vital component of its broader portfolio of NVIDIA solutions, which includes the NVIDIA GB300 NVL72 and NVIDIA HGX B200, among others. Each system undergoes rigorous testing to ensure peak performance across a wide array of AI applications, underscoring Supermicro's commitment to delivering cutting-edge technology to its customers.

Conclusion


Supermicro's introduction of the liquid-cooled NVIDIA HGX B300 systems marks a significant step forward in addressing the evolving needs of AI and hyperscale data center environments. With a focus on maximizing performance, minimizing energy consumption, and simplifying serviceability, these new solutions exemplify how technology can keep pace with the ever-growing demands of modern computing.

Discover more about these solutions and others at Supermicro's official website.
