Moreh and SGLang Collaborate to Unveil Innovative Distributed Inference System at AI Infra Summit 2025
Moreh and SGLang's Groundbreaking Innovations in AI
At the recent AI Infra Summit 2025 in Santa Clara, California, Moreh, an AI infrastructure software company, unveiled its new distributed inference system built on AMD technology. The system is designed to improve the efficiency of deep learning inference across a range of industries. The summit, billed as the largest AI infrastructure conference in the world, drew roughly 3,500 participants, including industry leaders and AI specialists.
During the event, Moreh's CEO, Gangwon Jo, gave a presentation on September 10 highlighting the capabilities of the distributed inference system. Benchmark results presented at the session showed the system running modern deep learning models, such as DeepSeek, more efficiently than existing NVIDIA-based solutions. These advances promise to give businesses high-performance, cost-efficient AI options.
In addition to unveiling the new system, Moreh announced collaborations with Tenstorrent and SGLang, signaling a deliberate strategy to strengthen its position in the rapidly growing AI market. Moreh's joint presentation with SGLang centered on deepening ties with the North American AI ecosystem. The partnership aims to develop a state-of-the-art AMD-based distributed inference system intended to expand deep learning inference capabilities.
The AI Infra Summit serves as a natural venue for technology firms to collaborate and explore new directions in AI infrastructure. This year, the summit expanded from its original focus on semiconductor innovation to cover the full spectrum of AI infrastructure. With more than 100 partnerships and sessions tailored to hardware providers and enterprise IT leaders, the event offered a platform for pioneering discussions and collaborations.
Focusing on cost competitiveness, Moreh plans to combine its software expertise with Tenstorrent's hardware capabilities, offering businesses an alternative to NVIDIA's offerings. As Jo noted, Moreh is currently running proof-of-concept projects with several leading companies in the large language model (LLM) sector, demonstrating its technical strength among AMD's software partners.
Moreover, Moreh is developing its core AI engine while leveraging its LLM subsidiary, Motif Technologies, to build comprehensive capabilities on the model side as well. Its roadmap calls for expanding AI computing alternatives for customers, an effort supported by its collaborations with AMD, Tenstorrent, and SGLang.
As the AI landscape rapidly evolves, partnerships like those forged at the summit are essential for driving innovation. Moreh's focus on distributed inference systems and its strategic collaborations herald a new phase of AI infrastructure development, enabling enterprises to harness the full potential of machine learning.
With the global demand for AI solutions escalating, Moreh is poised to lead the charge by providing efficient, diverse, and powerful alternatives for computational needs in the industry.