Harnessing Computing Power for Space Filmmaking: Hugh Hou's Live Demo at CES 2026

At CES 2026 in Las Vegas, the spotlight was on cutting-edge technology and innovation, particularly in immersive video production. Renowned filmmaker and virtual-reality educator Hugh Hou led an electrifying demonstration in the GIGABYTE suite, showcasing the potential of spatial computing in film. The live presentation emphasized how immersive video can be produced in real-world environments, moving away from traditional practices that rely on theoretical or controlled settings.

During the session, attendees got a comprehensive look at the entire filmmaking process, from capture through post-production to final playback. Unlike typical displays that depend on pre-rendered content, the showcased workflow unfolded live at the event, mirroring the rigorous processes of commercial XR (extended reality) projects. This highlighted not only the stability and system performance expected in such demanding applications, but also the thermal reliability necessary for seamless execution.

The culmination of the demonstration was stunning: participants watched a two-minute space-film trailer on a variety of high-tech devices, including Meta Quest headsets, Apple Vision Pro, and the newly launched Galaxy XR, as well as a 3D tablet that offered an additional 180-degree viewing option.

Integrating AI in Real Creative Workflows


One standout aspect of the demonstration was the incorporation of artificial intelligence (AI), not as a flashy centerpiece but as a functional tool embedded in everyday editing tasks. AI-assisted enhancement, tracking, and previewing sped up the workflow without disrupting the creative process. High-quality immersive footage was captured on cinema-grade cameras and processed through familiar software platforms, including Adobe Premiere Pro and DaVinci Resolve. AI-driven enhancement, noise reduction, and detail refinement were applied, all vital for meeting the visual standards of immersive VR, where artifacts and blurriness become glaringly apparent in a 360-degree viewing environment.
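To give a concrete sense of what one such noise-reduction pass does, here is a minimal sketch in NumPy. This is not the demo's actual pipeline (the event used AI tools inside Premiere Pro and DaVinci Resolve); it is a hypothetical stand-in that averages each pixel over a small neighborhood, the simplest spatial denoiser, to show why filtering lowers the visible noise that immersive VR playback would otherwise magnify.

```python
import numpy as np

def box_denoise(frame: np.ndarray, k: int = 3) -> np.ndarray:
    """Denoise a H x W x 3 uint8 frame by averaging over a k x k neighborhood.

    A toy spatial filter standing in for the AI-driven enhancement tools
    mentioned above; parameters here are illustrative, not from the demo.
    """
    pad = k // 2
    # Pad edges so the output keeps the input's dimensions.
    padded = np.pad(frame.astype(float), ((pad, pad), (pad, pad), (0, 0)), mode="edge")
    out = np.zeros(frame.shape, dtype=float)
    # Sum the k*k shifted copies, then divide to get the neighborhood mean.
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + frame.shape[0], dx:dx + frame.shape[1]]
    return (out / (k * k)).astype(np.uint8)

# Synthetic example: a flat gray frame corrupted with uniform noise.
rng = np.random.default_rng(0)
clean = np.full((64, 64, 3), 128, dtype=np.uint8)
noisy = (clean.astype(np.int16) + rng.integers(-40, 40, clean.shape)).astype(np.uint8)
denoised = box_denoise(noisy)

# The filtered frame deviates less from the flat reference than the noisy one.
print(np.abs(denoised.astype(int) - 128).mean() < np.abs(noisy.astype(int) - 128).mean())
```

Averaging zero-mean noise over a neighborhood shrinks its amplitude roughly by the square root of the window size, which is why even this naive filter visibly cleans the frame; production AI denoisers achieve the same goal while preserving fine detail.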

Importance of Platform Design in Spatial Computing


Supporting this demanding workflow was a custom-built GIGABYTE AI PC, expressly designed for sustained spatial-video production workloads. The system paired an AMD Ryzen™ 7 9800X3D processor with a Radeon™ AI PRO R9700 AI TOP GPU, providing the memory bandwidth and sustained AI performance required for real-time playback and rendering of 8K spatial video. The X870E AORUS MASTER X3D ICE motherboard supplied stable power and signal integrity, ensuring the workflow proceeded predictably throughout the live demonstration.

As the session wrapped up, attendees were treated once more to the space-film trailer, confirming that a demanding spatial-filmmaking workflow can run live, with precision, at CES. GIGABYTE's display showed how thoughtful system-level design turns complex immersive productions into reliable processes that creators can depend on, rather than merely experiment with.

This event was not just a glimpse into the future of filmmaking but also a testament to the active engagement between technology and creativity, paving the way for exciting advancements in the cinematic landscape. Moving forward, it is clear that integrating such innovations into filmmaking will usher in new narratives and experiences for audiences worldwide.

Topics: Entertainment & Media
