Compal Electronics Unveils Next-Gen AI Server with AMD Instinct MI355X in San Jose and Europe

Compal Electronics Unveils SG720-2A/OG720-2A AI Server



In a significant step forward for AI computing, Compal Electronics has introduced its latest high-performance server platform, the SG720-2A/OG720-2A. The server was showcased at both the AMD Advancing AI 2025 event in the United States and the International Supercomputing Conference (ISC) 2025 in Europe, highlighting capabilities tailored to the demands of next-generation generative AI and large language model (LLM) training.

Key Features and Innovations


The SG720-2A/OG720-2A is built around the AMD Instinct™ MI355X GPU, which is designed to handle intensive AI workloads while maximizing energy efficiency. One of its standout features is a dual cooling architecture that supports both air and liquid cooling options, ensuring effective thermal management across a range of deployment scenarios.

Compal’s liquid cooling option, developed in collaboration with ZutaCore, employs ZutaCore's two-phase HyperCool liquid cooling system, which maintains system stability even under extreme computational loads. This flexibility matters as enterprises increasingly favor server solutions that not only perform well but also adapt to their specific infrastructure needs.

Furthermore, the server supports up to eight AMD Instinct MI350 Series GPUs, including the MI350X and MI355X, enabling high-density configurations that scale with rising computational demands. Built on AMD's CDNA 4 architecture and equipped with substantial memory capacity and bandwidth, the SG720-2A/OG720-2A is positioned to deliver the computational power required for AI and high-performance computing (HPC).

Enhanced Interconnect Functionality


The server platform features high-speed interconnects based on PCIe Gen5 and AMD Infinity Fabric™ technology, facilitating multi-GPU orchestration and reducing inter-GPU latency. This strengthens the server's ability to run AI inference workloads efficiently, which is increasingly critical for data centers working with large datasets and complex AI models, as the sketch below illustrates.
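As a hedged illustration, the snippet below sketches the kind of multi-GPU collective that benefits from fast GPU-to-GPU links: a simple all-reduce across the GPUs of one node using torch.distributed. The launcher command, process layout, and tensor sizes are illustrative assumptions, not details taken from Compal's or AMD's materials.

```python
# Minimal sketch (assumptions noted above): an all-reduce across the GPUs visible
# to one node, launched with e.g.  torchrun --nproc_per_node=8 allreduce_check.py
import torch
import torch.distributed as dist

def main():
    # On ROCm builds of PyTorch the "nccl" backend name is backed by RCCL.
    dist.init_process_group(backend="nccl")
    rank = dist.get_rank()
    torch.cuda.set_device(rank % torch.cuda.device_count())

    # Each rank contributes a tensor; all_reduce sums it across every GPU.
    t = torch.ones(1024, 1024, device="cuda") * (rank + 1)
    dist.all_reduce(t, op=dist.ReduceOp.SUM)

    if rank == 0:
        print("all-reduce complete, sample value:", t[0, 0].item())
    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```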

Additionally, the SG720-2A/OG720-2A is fully compatible with the AMD ROCm™ open software stack and mainstream open-source AI frameworks such as PyTorch and TensorFlow. This compatibility lets developers integrate existing AI models efficiently, shortening time-to-market for their applications.
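For example, on a ROCm build of PyTorch the familiar torch.cuda calls report the AMD accelerators, so existing model code typically runs without changes. The check below is a minimal sketch under that assumption; device counts and tensor sizes are illustrative.

```python
# Minimal sketch: confirm that a ROCm-enabled PyTorch build can see the AMD GPUs
# and execute work on them. ROCm builds expose devices through the torch.cuda API.
import torch

if torch.cuda.is_available():
    n = torch.cuda.device_count()
    print(f"Detected {n} accelerator(s)")
    for i in range(n):
        print(i, torch.cuda.get_device_name(i))

    # Run a small matmul on the first device to verify execution.
    x = torch.randn(4096, 4096, device="cuda")
    y = x @ x
    torch.cuda.synchronize()
    print("GPU matmul OK:", tuple(y.shape))
else:
    print("No ROCm/CUDA device visible to this PyTorch build")
```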

Commitment to Sustainable Solutions


Alan Chang, Vice President of the Infrastructure Solutions Business Group at Compal, emphasized the importance of sustainable deployment in the future of AI and HPC, stating, "Each server we build aims to address real-world technical and operational challenges, not just push hardware specs. We are committed to creating solutions that enhance efficiency and sustainability in data centers."

Compal’s strategic partnership with AMD has been central to developing this server platform, allowing both companies to co-create solutions that improve data center efficiency. The collaboration reflects a shared aim not only to meet the demands of contemporary computing but also to prepare for future advances in technology.

The SG720-2A/OG720-2A signals a promising direction for data center operations, showcasing Compal’s commitment to integrating advanced technologies while promoting sustainability and high performance. As AI continues to evolve and reshape industries, the role of efficient and robust server platforms like the SG720-2A/OG720-2A will be pivotal in supporting further advancements in the field.

For more information about this server and Compal's wider range of solutions, visit the official Compal Electronics website.

