Moreh and SGLang Introduce Groundbreaking Distributed Inference System at AI Infra Summit 2025
Moreh and SGLang's Strategic Collaboration at AI Infra Summit 2025
From September 9 to 11, 2025, the AI Infra Summit in Santa Clara, California, brought together thousands of industry professionals to discuss key advancements in artificial intelligence infrastructure. One of the highlights of this year’s event was the unveiling of a distributed inference system by Moreh, an innovative company specializing in AI infrastructure software. The new system is built on AMD technology and represents a significant advance in the efficiency of serving deep learning models.
During the conference, Gangwon Jo, CEO of Moreh, delivered a presentation highlighting the capabilities of the newly developed system. He presented benchmark results showing that Moreh's distributed inference solution can serve the latest deep learning models, such as DeepSeek, more efficiently than comparable NVIDIA-based systems. These results strengthen Moreh's position as a provider of affordable and efficient alternatives in AI computing.
The AI Infra Summit, which began as the AI Hardware Summit in 2018, has grown into the largest global conference dedicated to AI infrastructure. This year, over 3,500 attendees and more than 100 partners convened to explore the latest technologies in the field. Keynotes and sessions were tailored to hardware vendors, hyperscalers, and IT specialists focused on building and deploying AI solutions quickly.
Moreh showcased strategic partnerships during the summit, particularly its collaboration with SGLang, the team behind a widely used open-source LLM inference framework. Together, they hosted a joint booth and held networking sessions aimed at deepening their collaboration within the global AI ecosystem, particularly in North America. Both companies aim to accelerate the development of distributed inference systems based on AMD technology, in line with the growing market demand for scalable AI solutions.
In a separate session on September 10, Jo also introduced a next-generation AI semiconductor system, combining Moreh's software stack with hardware from Tenstorrent. The partnership aims to challenge NVIDIA's dominance of the market by offering new, cost-effective options for businesses looking to adopt AI capabilities.
Jo's commitment to raising Moreh’s profile in the global market is evident. He stated, "Moreh possesses exceptional technical expertise among AMD's global software partners, engaging in proof-of-concept projects with leading LLM companies. Our goal is to establish a global presence, offering diverse alternatives for AI computing."
With an expanding footprint in the AI sector, Moreh continues to strengthen its core AI infrastructure engine, supported by its subsidiary, Motif Technologies, which specializes in large language model (LLM) development. Collaborations with technology leaders such as AMD, Tenstorrent, and SGLang further cement Moreh's reputation and influence in the competitive landscape of AI services.
The innovations unveiled at the AI Infra Summit 2025 signal a promising future for Moreh and its partners. By pushing the boundaries of what is possible with distributed inference systems, they are well-positioned to capitalize on the rapidly growing market for deep learning inference technologies. As enterprises globally seek robust AI solutions, Moreh's advancements set a new standard in infrastructure efficiency and effectiveness.
In conclusion, the announcement of Moreh’s distributed inference system built on AMD technology showcases not only the company's technological prowess but also its strategic vision in cultivating partnerships that steer the AI industry toward more accessible and efficient solutions. The AI Infra Summit serves as an essential platform for such innovations, and Moreh's contributions exemplify the potential for future advances in AI infrastructure.