Foundry has announced new developments for Nuke Stage, its hardware-agnostic application for end-to-end virtual production and in-camera visual effects (ICVFX). The latest updates follow customer feedback gathered during the tool’s first uses in production, including work on Amazon MGM Studios’ upcoming feature film, The Thomas Crown Affair.

Designed for real-time playback of photoreal environments onto LED walls, Nuke Stage now includes support for NotchLC playback and Gaussian Splat rendering, alongside enhanced metadata tracking intended to preserve on-set creative decisions throughout production and post.

Aligned with the recent Nuke 17.0 release, Nuke Stage now incorporates expanded 3D workflows that allow artists to work natively with USD across both applications. Artists can create assets in Nuke and import USD scenes directly into Nuke Stage, where scenes can be edited and overridden in real time during production.

“Our goal with Nuke Stage has always been to break down the barriers between on-set production and post-production to make it easier to refine assets across stages of production, rather than start over,” said Christy Anzelmo, Chief Product Officer. “The tremendous engagement from studios since the initial launch has been guiding the ongoing development of Nuke Stage and the wider Nuke family, creating a more iterative and interconnected pipeline.”

Christopher Simcock, Founding Director of Sensel Studio, which specializes in live events, immersive experiences, and virtual production, said, “Nuke Stage is really exciting because it’s a product designed from the ground up for virtual production. It takes the best of what we see in plates, which is fidelity and high-quality playback, as well as a degree of flexibility that we see with game engines, and puts them together into one package. Using Nuke Stage, we can set up scenes in half the time.”

Foundry said a core principle of Nuke Stage is the ability to operate on standard hardware without requiring proprietary infrastructure or specialized expertise. The software’s pipeline-centric approach is designed to support workflows from pre-production through on-set iteration and into post-production. The application includes shared color management, support for OpenUSD, OpenEXR, and OpenColorIO, and a node graph-based compositing environment that mirrors the familiar workflows and interface of Nuke.

Key Nuke Stage features include: 

  • NotchLC support: Productions can use the efficient GPU-powered video codec to play back high-resolution, high-fidelity environments on LED walls. Studios can generate, prepare, and validate NotchLC media directly within existing Nuke workflows on both Windows and Linux. 
  • Live metadata capture: Scene data, camera tracking, lens metadata, timecode, scene settings, and color decisions are automatically logged and preserved in the Vault, ensuring all the critical information is available in post.
  • Gaussian Splat support: Native integration with 3D Gaussian Splats enables productions to more easily integrate captured content and deploy it as a photorealistic virtual environment on LED walls. 
  • Standard hardware to scale with your needs: A single machine can drive multiple LED wall sections while remaining fully genlocked, reducing the total number of nodes needed on set and creating cost efficiencies.
  • Improved operator functionality: An enhanced Sequencer and a new Media Gallery simplify show programming, while new Feed Mapping functionality gives greater control over the content on screen.
  • Expanded Python capability: External devices can hook into Nuke Stage, allowing on-set control to be delegated to the relevant production teams.

Source: Foundry


Dan Sarto is Publisher and Editor-in-Chief of Animation World Network.