Exploring the Computation Power Behind Spatial Film Production at CES 2026

At CES 2026 in Las Vegas, renowned VR filmmaker and educator Hugh Hou conducted a live, hands-on demonstration of the intricacies of spatial computing. The session took place inside the GIGABYTE Suite, letting attendees see firsthand how immersive video is produced under real-world conditions rather than through theoretical approaches or controlled lab settings.

The live demonstration gave attendees a comprehensive view of the entire spatial film production workflow, from initial capture through post-production to final playback. Instead of relying on pre-rendered content, the workflow was executed in real time on the exhibition floor. This not only mirrored the routines of commercial XR projects but also put the system's stability, sustained performance, and thermal reliability to the test.

The highlight of the event was the screening of a two-minute trailer for a spatial film, viewed across multiple platforms including Meta Quest, Apple Vision Pro, and the newly unveiled Galaxy XR headsets. Alongside these devices, a 3D tablet display offered an additional 180° viewing option, broadening the immersive experience for attendees.

Notably, artificial intelligence (AI) wasn't merely highlighted as a feature but was integrated as an everyday editing tool. Throughout the demonstration, AI-assisted enhancement, tracking, and previews significantly accelerated iteration while maintaining a consistent creative flow. The footage, captured on cinema-grade immersive cameras, was processed in industry-standard software including Adobe Premiere Pro and DaVinci Resolve. AI-powered upscaling, noise reduction, and detail refinement were applied to meet the visual demands of immersive VR, where any artifact or blur in a 360-degree viewing environment stands out immediately.
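The article doesn't disclose the exact enhancement settings used on stage, but as a rough illustration of what such an AI pass involves, here is a minimal Python sketch using OpenCV's dnn_superres module as a stand-in for the commercial tools named above. The model file and frame paths are hypothetical, and this is an illustrative sketch, not the pipeline shown at CES.

```python
# Minimal sketch of an AI-assisted enhancement pass on one immersive frame.
# Illustrative stand-in only; the demonstration itself used Adobe Premiere Pro
# and DaVinci Resolve. Requires opencv-contrib-python; the FSRCNN weights
# file and the frame paths below are hypothetical.
import cv2

sr = cv2.dnn_superres.DnnSuperResImpl_create()
sr.readModel("FSRCNN_x2.pb")   # pretrained super-resolution weights (assumed path)
sr.setModel("fsrcnn", 2)       # select the FSRCNN algorithm at 2x upscaling

frame = cv2.imread("vr180_frame.png")     # one extracted frame (assumed path)
upscaled = sr.upsample(frame)             # AI-powered upscaling pass

# Noise-reduction pass; in a 360-degree viewing environment, residual noise
# is far more visible than on a flat screen, so a gentle denoise follows.
clean = cv2.fastNlMeansDenoisingColored(
    upscaled, None, h=5, hColor=5,
    templateWindowSize=7, searchWindowSize=21)

cv2.imwrite("vr180_frame_enhanced.png", clean)
```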

The entire workflow ran on a GIGABYTE AI PC built for sustained spatial video work. The system paired an AMD Ryzen™ 7 9800X3D processor with a Radeon™ AI PRO R9700 AI TOP GPU, delivering the memory bandwidth and sustained AI performance required for real-time playback and rendering of spatial 8K video. Equally important was the X870E AORUS MASTER X3D ICE motherboard, whose stable power delivery and signal integrity kept the workflow predictable throughout the live demonstration.
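As a back-of-envelope illustration of why memory bandwidth becomes the binding constraint at this resolution, the short calculation below estimates the raw pixel throughput of uncompressed spatial 8K playback. The frame geometry, bit depth, and frame rate are assumptions made for the sketch, not figures from the demonstration.

```python
# Rough throughput estimate for uncompressed spatial 8K playback.
# Assumed (not from the article): 7680x7680 stacked stereo frame,
# 10-bit 4:2:0 chroma subsampling (15 bits per pixel effective), 60 fps.
width, height, fps = 7680, 7680, 60
bits_per_pixel = 15                                   # 10 bits x 1.5 sampling factor
bytes_per_frame = width * height * bits_per_pixel // 8
throughput_gbps = bytes_per_frame * fps / 1e9

print(f"{bytes_per_frame / 1e6:.0f} MB per frame")    # ~111 MB
print(f"{throughput_gbps:.1f} GB/s sustained")        # ~6.6 GB/s, before any AI passes
```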

To close the event, attendees viewed the finished spatial film trailer on Meta Quest, Apple Vision Pro, and Galaxy XR headsets. By running a demanding spatial film production workflow live at CES, GIGABYTE showed how system-level platform design can turn complex immersive production from a field of experimentation into a reliable process for creators. The initiative signals significant advances in spatial computing and its application to filmmaking.
