FMS 2025: Exploring the Future of AI Inference at Executive Panel


The Future of Memory and Storage (FMS) conference will take place from August 5 to 7, 2025, at the Santa Clara Convention Center. The event is renowned for showcasing innovations in memory and storage technologies across a wide range of applications, especially artificial intelligence (AI). Among this year's highlights is the Executive AI Panel, scheduled for August 7 from 11:00 AM to 12:00 noon. The session promises attendees valuable insight into the complexities and advances of AI inference, a field of growing importance in today's fast-paced digital world.

Key Industry Leaders on the Panel


The Executive AI Panel will feature prominent figures from the technology sector:
  • John F. Kim, Director of Storage Marketing at NVIDIA
  • Rory Bolt, Senior Fellow at KIOXIA America
  • Sungsoo Ryu, CEO of SK hynix America
  • Vincent (Yu-Cheng) Hsu, VP of Storage at IBM
  • John Mao, VP of Global Business Development at VAST Data

These executives are set to discuss how the memory and storage industry is adapting to the growing demands of AI inference workloads. Unlike the previous emphasis on raw bandwidth for AI training, inference workloads necessitate a more nuanced approach, particularly involving distributed systems designed to maintain ultra-low latency and enhance intelligent memory performance.

The Importance of AI Inference in Today's Tech Landscape


AI inference is the process of applying a trained AI model to new data to produce predictions or decisions. It is increasingly the backbone of applications such as autonomous vehicles, voice recognition software, and predictive analytics across diverse industries. As organizations build next-generation AI infrastructure, demand for memory and storage solutions that can handle these workloads efficiently is at an all-time high.
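To make the train-once, infer-many-times distinction concrete, here is a minimal sketch in Python using scikit-learn. The dataset, model, and sample values are illustrative only and are not tied to any panelist's product or platform:

```python
# Minimal sketch of AI inference: a model is trained once (offline),
# then deployed to make predictions on new data, repeatedly.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

# Training phase: done once, typically on large accelerator clusters.
X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=1000).fit(X, y)

# Inference phase: repeated for every new request; this is the
# latency-sensitive step that stresses memory and storage systems.
new_sample = [[5.1, 3.5, 1.4, 0.2]]  # hypothetical unseen input
prediction = model.predict(new_sample)
print(prediction)
```

In production, the inference step runs at high request rates, which is why the panel's focus on ultra-low-latency memory and storage matters far more here than raw training bandwidth.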

Tom Coughlin, the General Chair of FMS, stated, "The efficient scaling of inference workloads is the next frontier of AI infrastructure. This panel brings together leaders who are shaping the technologies that will make it possible. This is a panel you don't want to miss."

The discussions will revolve around how AI-optimized storage and networking can unlock significant improvements in throughput for inference requests, enhancing scalability across extensive GPU deployments. As AI continues to permeate every aspect of technology, understanding the capabilities of memory solutions will be pivotal for businesses seeking to leverage these advancements effectively.

Engaging in Knowledge-Sharing


Attendees of the FMS conference are encouraged to join this critical discussion. Whether you're involved in developing AI infrastructure, fine-tuning system performance, or simply exploring this emerging area, the perspectives shared during this panel will be indispensable for navigating the future of AI in computing.

Registration for this essential conference is open at FMS Registration. For more details on the event and its complete program, including technical sessions and discussions about emerging technologies, visit Future of Memory and Storage.

About FMS


The Future of Memory and Storage (FMS) conference is the leading global event dedicated to exploring the cutting-edge developments in multi-billion-dollar high-speed memory and storage technologies. It serves as a platform for industry professionals, executive leaders, customers, and analysts to connect and explore the evolution of memory and storage solutions in critical application areas such as AI, high-performance computing, and embedded systems. FMS is pivotal in shaping future innovations at the intersection of memory technologies and artificial intelligence.
