On-set virtual production
from Wikipedia

On-set virtual production (OSVP)[a] is an entertainment technology for television and film production in which LED panels are used as a backdrop for a set, on which video or computer-generated imagery can be displayed in real time. OSVP became widespread after its use in the first season of The Mandalorian (2019), which relied on Unreal Engine, developed by Epic Games.

History

Australian film director Clayton Jacobson first had the idea of improving on the green screens then in use while filming a TV ad for detergent in 2003; watching his son play video games and seeing the 3D technology used in them gave him the idea. Eventually, in 2016, Jacobson and his son built an early prototype of a virtual production stage in their shed, using a set of LED screens. However, he could not get anyone to take an interest in developing the technology further, so he gave up on it. Other filmmakers had also caught on to the idea, though, and in 2018 the Australian cinematographer Greig Fraser used the technology to film the Star Wars spin-off series The Mandalorian (released 2019). Instead of using a green screen during filming, the team combined post-production with the production stage of the series, installing huge LED walls linked to powerful computers running Unreal Engine gaming software (used for Fortnite, among others). They called this soundstage "the volume", a term already used to refer to a stage where visual effects techniques take place.[1]

Since its inventive use in The Mandalorian, which used ILM's StageCraft, the technology has become increasingly popular. Miles Perkins, industry manager for film and television at Epic Games, maker of Unreal Engine, estimated that there were around 300 such stages by October 2022, up from only three in 2019.[2] Most of these were built during or after the COVID-19 pandemic, when lockdowns meant that production studios had to find ways to produce films without traveling to other locations.[1]

In March 2023, the world's largest virtual production stage was opened at the Docklands Studios Melbourne, in the city of Melbourne, Australia.[1]

Terminology

On-set virtual production (OSVP) is also known as virtual production (VP), immersive virtual production (IVP), In-Camera Visual Effects (ICVFX),[3] or The Volume.[citation needed]

Technology

With careful adjustment and calibration, an OSVP set can be made to closely approximate the appearance of a real set or outdoor location.[4] OSVP can be viewed as an application of extended reality. It contrasts with virtual studio technology, in which a green screen backdrop surrounds the set and the virtual surroundings are composited into the green screen plate downstream from the camera. In OSVP, the virtual world surrounding the set is visible to the camera, actors, and crew; objects on set are illuminated by light from the LED screen, creating realistic interactive lighting effects; and the virtual background and foreground are captured directly in camera, complete with natural subtle cues such as lens distortion, depth of field, bokeh and lens flare. This makes for a far more natural experience that more closely approximates location shooting, making the film-making process faster and more intuitive than can be achieved on a virtual set.[citation needed]

To render parallax depth cues correctly from the viewpoint of a moving camera, the system must match-move the background imagery using data from low-latency, real-time camera tracking technology. Systems such as Stype and Mo-Sys track the camera's position within the volume and map that position into the corresponding virtual scene (for example, in Unreal Engine), so that any camera movement is reflected in the rendered background as well.[citation needed]
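
The geometry behind this is commonly described in graphics literature as an off-axis (generalized perspective) projection: the tracked camera position and the physical corners of the LED wall define a view frustum whose near-plane window is the wall itself. The sketch below is a hedged illustration of that idea, not any vendor's actual implementation; the corner positions, distances, and function names are hypothetical.

```python
import numpy as np

def off_axis_projection(pa, pb, pc, pe, near, far):
    """Generalized perspective projection for a flat LED wall.

    pa, pb, pc : lower-left, lower-right, upper-left wall corners (world space)
    pe         : tracked camera (eye) position from the tracking system
    near, far  : clip-plane distances
    Returns a 4x4 OpenGL-style projection*view matrix so that wall content
    shows correct parallax from the camera's viewpoint.
    """
    pa, pb, pc, pe = (np.asarray(p, dtype=float) for p in (pa, pb, pc, pe))

    # Orthonormal basis of the screen (right, up, normal).
    vr = pb - pa; vr /= np.linalg.norm(vr)
    vu = pc - pa; vu /= np.linalg.norm(vu)
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)

    # Vectors from the eye to the screen corners.
    va, vb, vc = pa - pe, pb - pe, pc - pe

    # Distance from the eye to the screen plane.
    d = -np.dot(va, vn)

    # Frustum extents on the near plane.
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    # Standard off-axis perspective (glFrustum-style) matrix.
    P = np.array([
        [2 * near / (r - l), 0, (r + l) / (r - l), 0],
        [0, 2 * near / (t - b), (t + b) / (t - b), 0],
        [0, 0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0, 0, -1, 0],
    ])

    # Rotate the world into screen-aligned axes, then move the eye to the origin.
    M = np.eye(4)
    M[0, :3], M[1, :3], M[2, :3] = vr, vu, vn
    T = np.eye(4)
    T[:3, 3] = -pe
    return P @ M @ T

# Hypothetical example: a 6 m x 3 m wall, camera 4 m back and 0.5 m right of centre.
proj = off_axis_projection(pa=[-3, 0, 0], pb=[3, 0, 0], pc=[-3, 3, 0],
                           pe=[0.5, 1.5, 4.0], near=0.1, far=100.0)
```

Re-evaluating this matrix every frame from the latest tracked camera pose is what keeps the background's perspective locked to the lens as the camera moves.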

Industry organizations including SMPTE, the Academy of Motion Picture Arts and Sciences, and the American Society of Cinematographers have started initiatives to support the development of OSVP.[4][5][6][7]

Examples

Stages that use OSVP include the various StageCraft stages; Pixomondo's Toronto-based LED stage, which is on a long-term lease to CBS; ZeroSpace's volume in New York City;[8] and Lux Machina's various stages.[2] In Japan, LED walls and virtual production were used by Toei Company for its Super Sentai shows Avataro Sentai Donbrothers[9] and Ohsama Sentai King-Ohger, the latter produced in collaboration with Sony PCL Inc.[10]

from Grokipedia
On-set virtual production (OSVP) is a filmmaking methodology that merges real-time computer graphics with live-action capture on physical sets, enabling the creation and manipulation of digital environments and characters during principal photography. This approach combines traditional production elements, such as actors and practical sets, with virtual and augmented reality technologies, real-time rendering, and camera tracking to compose and record scenes in-camera, minimizing reliance on post-production compositing.

OSVP emerged from advancements in game engine technology and visual effects, with roots tracing back to early experiments in real-time rendering during the 1990s and 2000s in films like Star Wars: Episode I – The Phantom Menace and Avatar. It gained widespread recognition through Industrial Light & Magic's (ILM) StageCraft system, first prominently used in the Disney+ series The Mandalorian (2019), where LED volumes—curved arrays of high-resolution LED panels—displayed dynamic virtual backgrounds that interacted with the camera and provided realistic lighting for actors. Subsequent adoption accelerated during the COVID-19 pandemic, as remote collaboration tools integrated with on-set workflows, leading to over 250 dedicated in-camera visual effects (ICVFX) stages worldwide by 2022.

Key technologies underpinning OSVP include real-time rendering engines like Unreal Engine and Unity, which generate photorealistic graphics at interactive frame rates; motion and camera tracking systems (e.g., OptiTrack or Ncam) for precise alignment of virtual and physical elements; and performance capture for integrating digital characters with live performers. These components facilitate live compositing, where virtual assets are projected onto LED walls to simulate parallax and environmental reflections, allowing cinematographers to achieve "final pixel" quality directly on set. Professional organizations, including the Society of Motion Picture and Television Engineers (SMPTE), the American Society of Cinematographers (ASC), and the Visual Effects Society (VES), have established standards and glossaries to address interoperability challenges, such as metadata consistency for camera-lens integration.

The technique offers significant benefits, including reduced production timelines and costs—such as minimizing physical set builds and location shoots—while enhancing actor immersion and creative iteration through immediate visual feedback. For instance, productions like The Lion King (2019) leveraged virtual cinematography, achieving high percentages of in-camera completion for complex sequences and saving millions in post-production expenses. However, OSVP demands specialized skills, blending film crews with gaming and software experts, and requires substantial upfront investment in hardware like multi-primary LED panels to mitigate issues such as moiré patterns and color shifts. As of November 2025, OSVP continues to evolve with cloud-based rendering, AI-assisted asset creation, and affordable LED technologies, democratizing access for independent filmmakers via open-source tools.

History and Development

Origins and Early Concepts

The origins of on-set virtual production trace back to foundational techniques that sought to integrate digital environments with live-action footage, evolving from green-screen compositing to preliminary real-time experimentation. A pivotal early influence was the 2004 film Sky Captain and the World of Tomorrow, directed by Kerry Conran, which pioneered extensive use of green screen (or bluescreen) methodologies to create entirely digital sets. The production filmed all live-action footage against uniform bluescreen backdrops on soundstages, including in Van Nuys, comprising 2,031 computer-generated shots where actors were composited onto photographic or 3D-modeled backgrounds in post-production. This approach marked a shift toward "digital backlots," allowing filmmakers to construct fantastical worlds without physical locations, though integration remained entirely offline and labor-intensive.

In the 2010s, conceptual roots deepened through real-time rendering experiments by visual effects studios, particularly Industrial Light & Magic (ILM), which explored interactive digital environments to bridge pre-production planning and on-set decision-making. For Star Wars: The Force Awakens (2015) and Rogue One: A Star Wars Story (2016), ILM's Advanced Development Group developed proprietary real-time rendering pipelines inspired by video game engines, enabling directors to visualize complex shots in near real time during filming. These efforts included streaming rendered sequences to tablets or VR headsets on set, allowing adjustments to camera movements and lighting interactively, though still reliant on blue-screen stages rather than immersive displays. ILM also conducted prototype tests for LED walls as extensions of these systems, aiming to project dynamic backgrounds for enhanced environmental interaction, but early iterations faced limitations in scalability and synchronization.

Academic and technical demonstrations in the mid-2010s further advanced these ideas, with projects focusing on LED-lit sets to achieve parallax and lighting effects for more convincing depth in virtual environments. For instance, The Jungle Book (2016) utilized blue-screen stages augmented by real-time pre-visualization tools, where motion-captured block-outs of digital animals were rendered live to guide performances and camera work, demonstrating early on-set interactivity. University-led initiatives, such as those exploring game-engine integrations for film, experimented with LED panels to simulate responsive lighting and perspective shifts, laying groundwork for in-camera background simulation without full green-screen reliance.

Pre-commercial challenges, notably in early camera tracking systems, highlighted barriers to seamless real-time integration. These systems, essential for syncing virtual elements to live camera movements, suffered from latency delays—often exceeding 10 milliseconds—which disrupted accuracy and immersion during fast pans or tilts. Such issues underscored the need for lower-latency hardware and software, paving the way for later advancements such as those seen in StageCraft.

Key Milestones and Adoption

The landmark debut of on-set virtual production in major commercial filmmaking took place during the production of the first season of The Mandalorian in 2019, where Industrial Light & Magic (ILM) deployed its proprietary StageCraft system. This innovative setup featured a massive 270-degree semicircular LED video wall, approximately 20 feet high and 75 feet in diameter, composed of over 1,300 individual 2.84 mm pixel pitch LED panels that displayed real-time, interactive backgrounds rendered with Unreal Engine, allowing actors to perform within immersive environments captured in-camera.

In 2020, ILM deepened its technological advancements through a key collaboration with Epic Games, integrating Unreal Engine directly into virtual production pipelines to facilitate seamless real-time rendering and workflows on LED stages, as demonstrated in further episodes of The Mandalorian. This partnership marked a pivotal shift toward standardized, engine-driven tools that streamlined the transition from previsualization to final footage, reducing reliance on traditional greenscreen compositing.

The COVID-19 pandemic, from 2020 onward, catalyzed a significant surge in industry adoption of on-set virtual production, as studios sought to minimize on-location shoots, limit crew exposure, and comply with health protocols by confining operations to controlled indoor stages. This period saw virtual production evolve from a niche technique into a practical solution for maintaining schedules amid global shutdowns, with a July 2022 survey finding that 40% of industry executives were using virtual production tools and over 50% were likely to adopt them within the next 18-24 months. A concrete milestone in this expansion was the opening of a permanent ILM virtual production volume, a 270-degree LED facility spanning 75 feet wide by 23 feet high, to support international productions.

By 2022, the maturing field prompted the formation of standards bodies to ensure interoperability across hardware, software, and workflows, exemplified by the Society of Motion Picture and Television Engineers (SMPTE) Rapid Industry Solutions initiative for on-set virtual production, which developed open specifications for camera-lens metadata exchange and equipment compatibility to foster broader ecosystem integration. These efforts addressed fragmentation in tools like LED walls and rendering engines, enabling more consistent adoption across global studios.

Evolution Through 2025

In 2023, on-set virtual production saw significant integration of AI-driven tools, particularly through NVIDIA's Omniverse platform update, which introduced generative AI capabilities for automated 3D asset creation and environment building directly on production sets. This advancement allowed filmmakers to generate synthetic environments in real time using OpenUSD workflows, streamlining pre-visualization and reducing manual modeling time in collaborative pipelines. Building on the influence of earlier productions such as The Mandalorian, these AI features enhanced scalability for complex scene assembly without extensive manual adjustments.

By 2024, virtual production expanded accessibility to smaller studios via affordable modular hardware kits, exemplified by ROE Visual's launch of the Obsidian LED panel in April, a cinematic-grade solution optimized for in-camera visual effects (ICVFX) with improved energy efficiency and easier integration into compact setups. This facilitated quicker deployments and lower operational barriers, enabling mid-sized facilities to adopt LED volumes without the prohibitive expenses of large-scale installations. Concurrently, global adoption accelerated with the establishment of key regional hubs, such as the MUSHANG VFX Lab, which supported commercial projects through high-resolution LED volumes and real-time rendering systems.

In 2025, technological refinements focused on higher resolution and energy efficiency, including breakthroughs in 8K real-time rendering that delivered ultra-high-definition visuals for on-set environments, as demonstrated in the world's largest monolithic LED virtual production studio built by Absen and Versatile, featuring a panoramic 270-degree volume with over 600 million pixels for seamless live capture. European productions increasingly incorporated energy-efficient LEDs, such as Alfalite's SkyPix panels with an average power consumption of 35 W, which minimized environmental impact while maintaining performance in facilities like Portugal's EMAV LED studio. These developments, coupled with rising adoption in non-Hollywood markets—including Australian and Croatian virtual stages—underscored virtual production's shift toward broader, eco-conscious scalability worldwide. In November 2025, Pixomondo unveiled a new virtual production soundstage at Jax Film Studios in Jacksonville, Florida, further expanding regional capabilities.

Terminology and Fundamentals

Core Definitions

On-set virtual production (OSVP) is a technique that integrates real-time computer-generated imagery (CGI) with live action by using large-scale LED panels as dynamic backdrops during principal photography. These panels display real-time rendered virtual environments that respond to the camera's position and movement, creating parallax effects where elements in the background shift realistically relative to the foreground. This setup allows interactive lighting from the virtual scene to illuminate actors and physical sets in real time, capturing the composite image directly in the camera without relying on compositing in post-production. Occlusions between physical sets and virtual backgrounds are handled naturally by the camera, with advanced setups using depth data for precise placement of mid-ground virtual elements.

At its core, OSVP embodies the principle of in-camera visual effects (ICVFX), where digital elements are rendered and integrated live on set, enabling directors and cinematographers to visualize the final shot immediately during filming. This approach minimizes the need for extensive post-production adjustments, such as matching lighting or perspective between live-action footage and added CGI, thereby streamlining workflows and reducing costs associated with traditional green-screen processes. By capturing the scene holistically in one pass, ICVFX in OSVP ensures greater accuracy in environmental interactions, such as reflections and shadows, that would otherwise require manual alignment later.

OSVP differs fundamentally from virtual reality (VR) production, which emphasizes immersive, interactive experiences for users in head-mounted displays, often prioritizing 360-degree environments and user navigation over linear narrative capture. In contrast, OSVP is tailored for traditional screen-based media like film and television, focusing on controlled, camera-driven compositions that deliver photorealistic results for edited content. This distinction highlights OSVP's role in enhancing cinematic production rather than creating standalone interactive simulations.

The basic process of OSVP begins with camera tracking systems that monitor the physical camera's position, orientation, and lens data in real time, feeding this information into a game engine to synchronize the rendering of virtual backgrounds on the LED panels. As the camera moves, the engine adjusts the displayed imagery to maintain consistent perspective and lighting, ensuring the virtual elements behave as if part of the physical space. This closed-loop system allows for immediate feedback and iteration on set, with the final footage emerging as a seamless blend of real and digital components.

OSVP also differs fundamentally from traditional green screen compositing, where a static, uniformly lit green backdrop is captured during filming and replaced with digital environments through chroma keying in post-production. In contrast, OSVP employs dynamic LED panels that display real-time, interactive virtual backgrounds synchronized with camera movements, allowing for immediate in-camera results and realistic lighting interactions on set without extensive post-processing. This approach reduces chroma keying artifacts, such as color spill, by integrating the virtual elements directly into the live shoot.

OSVP operates as a specialized subset of extended reality (XR), which encompasses virtual reality (VR), augmented reality (AR), and mixed reality (MR) technologies that blend physical and digital elements. Unlike audience-facing XR applications, such as interactive AR experiences or VR simulations delivered via headsets, OSVP focuses exclusively on production tools for filmmakers, using real-time rendering to create immersive sets for live-action capture rather than end-user immersion.

Key terminology in OSVP includes the "volume," referring to an enclosed LED stage typically configured as a large, curved array of high-resolution panels forming a 270-degree or similar arc to surround actors and cameras with photorealistic virtual scenery. Another critical concept is cluster rendering, which distributes computational tasks across multiple machines to render content for LED panels outside the camera's frustum, ensuring accurate lighting and reflections without latency.

OSVP extends previsualization (previs), a pre-production process that uses rough CGI models to plan shots and visualize sequences in a virtual environment. While previs remains focused on planning and iteration before filming, OSVP brings these refined assets to the live set via real-time engines, enabling on-the-fly adjustments and in-camera finalization that previs alone cannot achieve.
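
To illustrate the inner/outer-frustum split that cluster rendering relies on, the following sketch uses a deliberately simplified, hypothetical model (flat wall, camera aimed perpendicular to it, horizontal extent only); it is not any specific vendor's pipeline. Panels inside the camera's footprint are marked for full perspective-correct rendering, while the remaining panels, used mainly for lighting and reflections, are distributed across render nodes.

```python
import math

def split_wall_panels(wall_width_m, wall_height_m, panel_size_m,
                      cam_distance_m, cam_aim_u, horizontal_fov_deg,
                      num_outer_nodes):
    """Classify LED panels as inner- or outer-frustum and assign outer panels
    to render nodes. Simplification: flat wall, perpendicular camera, and the
    frustum test is applied to the horizontal extent only.

    cam_aim_u: horizontal position (metres from the wall's left edge) where the
               camera's optical axis hits the wall.
    Returns (inner_panels, node_assignments).
    """
    cols = int(wall_width_m / panel_size_m)
    rows = int(wall_height_m / panel_size_m)

    # Half-width of the camera's footprint on the wall (similar triangles).
    half_fov = math.radians(horizontal_fov_deg) / 2.0
    footprint_half_width = cam_distance_m * math.tan(half_fov)

    inner, outer = [], []
    for c in range(cols):
        for r in range(rows):
            centre_u = (c + 0.5) * panel_size_m
            if abs(centre_u - cam_aim_u) <= footprint_half_width:
                inner.append((c, r))   # seen through the lens: full quality
            else:
                outer.append((c, r))   # off-camera: lighting/reflections only

    # Round-robin the outer panels across the available render nodes.
    node_assignments = {n: [] for n in range(num_outer_nodes)}
    for i, panel in enumerate(outer):
        node_assignments[i % num_outer_nodes].append(panel)
    return inner, node_assignments

# Hypothetical example: 20 m x 6 m wall of 0.5 m panels, camera 5 m back, 40 degree lens.
inner, nodes = split_wall_panels(20, 6, 0.5, 5.0, cam_aim_u=10.0,
                                 horizontal_fov_deg=40, num_outer_nodes=3)
print(len(inner), "inner-frustum panels;",
      {n: len(p) for n, p in nodes.items()}, "outer panels per node")
```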

Technological Components

Hardware Elements

LED panels form the core visual backbone of on-set virtual production (OSVP) setups, typically arranged in expansive, high-resolution arrays to create immersive environments. These panels must support high refresh rates, often exceeding 7,680 Hz, to eliminate motion artifacts when captured by high-speed cameras, ensuring seamless integration of virtual elements with live action. Pixel pitches are generally fine—commonly 1.9 mm to 2.6 mm—to achieve detail that holds up at close range, with large volumes comprising over 20 million pixels for expansive scenes spanning hundreds of square meters. Examples include ROE Visual's Black Pearl series, which deliver HDR-capable brightness up to 2,500 nits while maintaining color accuracy for realistic virtual backdrops.

Camera tracking systems are essential for synchronizing real-time virtual camera movements with physical ones, using markers and sensors to achieve sub-millimeter positional accuracy. Systems like OptiTrack employ networks of IR cameras to track markers on the camera rig, delivering ultra-low latency under 10 ms and precision down to 0.2 mm, which prevents alignment errors in LED volumes. Similarly, Mo-Sys StarTracker provides marker-based optical tracking with 6-axis accuracy, supporting volumes up to 100 m x 100 m without drift, ideal for dynamic shots in cinematic productions. These systems integrate with rendering software to warp and project virtual content in real time, matching the camera's perspective precisely.

Lighting integration in OSVP relies on automated adjustments to LED panel brightness and color, ensuring realistic reflections on practical set elements like actors' faces or props. Panels dynamically modulate output—often from 1,000 to 2,500 nits—to match on-set key lights, avoiding overexposure or mismatched shadows that could reveal the virtual boundary. This is achieved through sensor-driven feedback loops that synchronize virtual illumination with physical sources, as seen in setups using Unreal Engine or Unity platforms for consistent specular highlights. Such integration enhances photorealism by allowing the LED wall to act as a practical light source, reducing post-production compositing needs.

Support infrastructure underpins OSVP reliability, including curved panel arrays and robust cooling systems to sustain operations over extended shoots. Curved configurations, often at radii of 5-10 meters, wrap around the set for 270-degree immersion, using modular frames to assemble panels into seamless surfaces spanning 100-200 square meters. Cooling is critical to mitigate thermal distortion, with active systems like integrated fans or liquid cooling maintaining panel temperatures below 40°C during 12+ hour sessions, preventing failures or color shifts in high-brightness environments. Ventilation and power distribution networks further support this, drawing up to 500 W per panel while ensuring stable performance.
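
As a rough worked example, and using made-up but representative numbers rather than any vendor's specification, a volume's pixel count and peak power draw can be estimated from its dimensions, pixel pitch, and per-panel wattage:

```python
def volume_estimate(width_m, height_m, pixel_pitch_mm,
                    panel_size_m=0.5, panel_max_watts=500):
    """Back-of-the-envelope sizing for a flat LED wall section.

    All inputs are illustrative assumptions; real volumes are curved and
    vendors quote their own panel sizes and power figures.
    """
    pixels_per_m = 1000.0 / pixel_pitch_mm
    total_pixels = (width_m * pixels_per_m) * (height_m * pixels_per_m)
    panels = (width_m / panel_size_m) * (height_m / panel_size_m)
    peak_power_kw = panels * panel_max_watts / 1000.0
    return total_pixels, panels, peak_power_kw

# Hypothetical example: a 30 m x 6 m wall section at 2.3 mm pitch.
pixels, panels, kw = volume_estimate(30, 6, 2.3)
print(f"{pixels / 1e6:.0f} megapixels across {panels:.0f} panels, "
      f"peak draw ~{kw:.0f} kW")   # roughly 34 megapixels, 720 panels, 360 kW
```

With these assumed figures the result lands in the same range as the pixel counts and per-panel wattages quoted above.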

Software and Rendering Systems

On-set virtual production relies heavily on advanced game engines to generate photorealistic virtual environments in real time. Unreal Engine 5 stands out as the dominant platform, leveraging its Nanite virtualized geometry system to handle massive, high-detail 3D assets without traditional polygon budget limitations, enabling filmmakers to import film-quality scans and models directly into production pipelines. Nanite achieves this through a streaming micropolygon architecture that renders only visible geometry at pixel scale, supporting object counts in the millions for complex virtual sets. Complementing Nanite, Unreal Engine 5's Lumen system delivers dynamic global illumination and reflections, simulating light bounces across scenes in real time to create believable interactions between virtual elements and physical actors. This integration allows for immediate lighting adjustments during shoots, reducing post-production needs.

Rendering techniques in OSVP emphasize real-time ray tracing to produce accurate shadows, reflections, and lighting that match physical conditions. In Unreal Engine, hardware-accelerated ray tracing via GPUs enables these effects at interactive frame rates, ensuring virtual environments respond dynamically to camera movements and on-set changes. On high-end configurations, such as those with RTX 40-series cards, ray-traced scenes can sustain 60 frames per second or higher at resolutions suitable for LED volumes, maintaining synchronization with live action. These capabilities extend to path-traced previews in later UE5 versions, though optimized hybrid rasterization-ray tracing remains standard for production stability.

Asset management tools facilitate rapid iteration of virtual elements during filming. While Unreal Engine offers Blueprint-based scripting for procedural adjustments, Unity provides an alternative through Visual Scripting, a node-based system that allows non-programmers to create dynamic behaviors, such as real-time scene modifications for environmental interactions or prop animations. This visual approach streamlines prototyping and on-set tweaks in Unity-powered virtual production workflows, integrating with tools like the High Definition Render Pipeline for consistent asset fidelity.

Data handling in OSVP involves robust synchronization across distributed systems to manage expansive scenes. Unreal Engine's nDisplay tool orchestrates multi-node rendering clusters, ensuring frame-accurate alignment across multiple displays by distributing computational load and synchronizing via genlock signals. It processes large-scale assets—often exceeding 100 GB per scene—through efficient streaming and viewport culling, preventing bottlenecks in high-resolution outputs to LED panels. This setup supports seamless collaboration in virtual production stages, with outputs interfacing directly with hardware displays like LED walls for in-camera capture.
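
To make the timing constraints concrete, the sketch below adds up an illustrative motion-to-photon latency budget for a genlocked volume running at 24 fps. The individual figures are assumptions chosen to sit within the ranges quoted above (sub-10 ms tracking, a couple of frames of display processing), not measurements from any particular system.

```python
def latency_budget(fps, tracking_ms, render_ms, led_processing_ms):
    """Sum the main contributions to motion-to-photon latency on an LED stage
    and express the total in frames at the production frame rate."""
    frame_ms = 1000.0 / fps
    total_ms = tracking_ms + render_ms + led_processing_ms
    return total_ms, total_ms / frame_ms

# Assumed figures: 8 ms optical tracking, one full render frame at 24 fps,
# and ~16 ms of LED processor and scan-out delay.
total, frames = latency_budget(fps=24, tracking_ms=8,
                               render_ms=1000 / 24, led_processing_ms=16)
print(f"~{total:.0f} ms end-to-end, i.e. about {frames:.1f} frames of delay")
```

Keeping this total to a small, constant number of frames is what allows the tracked background to stay visually locked to the physical camera during pans and dolly moves.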

Workflow and Integration Processes

The workflow for on-set virtual production (OSVP) begins in the pre-production phase, where detailed asset creation and camera calibration are essential to ensure seamless integration between physical and virtual elements. 3D artists develop high-fidelity digital assets, including environments, props, and characters, often using game engines and modeling software to build photorealistic scenes that match the intended production scale. LIDAR scanning plays a critical role in this stage, capturing precise 3D data of real-world locations or sets to create accurate digital twins, which are then imported into virtual environments for matching physical actors and props. Camera calibration follows, involving the alignment of physical camera movements with virtual perspectives through tracking systems, allowing for previsualization of shots and minimizing discrepancies during filming.

During the on-set process, OSVP emphasizes real-time collaboration through live feedback loops, enabling directors and crew to view near-final composites on dedicated monitors. This in-camera preview (ICP) system overlays virtual elements onto the live feed captured by the physical camera, facilitated by synchronized tracking data that adjusts the LED wall displays to match camera angles and movements in real time. Directors can make immediate tweaks to lighting, compositions, or asset placements, fostering a dynamic environment where creative decisions are informed by the actual output, often powered by real-time rendering engines such as Unreal Engine. This iterative approach reduces guesswork and enhances actor performances by immersing them in the environment visible on the set.

Post-production in OSVP is significantly streamlined, with minimal VFX cleanup required, as 90-95% of effects are completed in-camera during principal photography. The focus shifts to refining edge cases, such as motion blur corrections or subtle integrations for complex interactions, leveraging the high-quality imagery already composited on set. This efficiency arises from the pre-planned assets and real-time synchronization, allowing editorial and VFX teams to prioritize polishing over extensive reconstruction.

Central to the integration processes is the collaborative team structure, typically centered in a unified control hub where VFX supervisors, LED technicians, and engine operators coordinate operations. VFX supervisors oversee the artistic and technical alignment of digital elements with live action, ensuring consistency across phases. LED technicians manage the wall's calibration and performance to maintain visual fidelity, while engine operators handle real-time rendering adjustments and asset loading. This integrated setup promotes seamless communication, bridging traditional roles with virtual specialists to optimize the end-to-end workflow.
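
A core part of the calibration step is expressing tracked camera poses in the virtual scene's coordinate frame. The minimal sketch below uses hypothetical names and values and homogeneous 4x4 transforms; it simply applies a previously solved tracker-to-scene calibration matrix to each incoming pose before it drives the virtual camera.

```python
import numpy as np

def make_transform(rotation_deg_z, translation_xyz):
    """Build a 4x4 homogeneous transform from a yaw angle and a translation."""
    a = np.radians(rotation_deg_z)
    T = np.eye(4)
    T[:2, :2] = [[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]]
    T[:3, 3] = translation_xyz
    return T

# Calibration solved during setup: how the tracking system's origin sits
# inside the virtual scene (assumed values, for illustration only).
tracker_to_scene = make_transform(rotation_deg_z=90,
                                  translation_xyz=[2.0, 0.0, 1.2])

def camera_pose_in_scene(pose_in_tracker_frame):
    """Re-express a tracked camera pose in scene coordinates for the engine."""
    return tracker_to_scene @ pose_in_tracker_frame

# Example: the tracker reports the camera 3 m along its own x axis, 1.5 m up.
tracked_pose = make_transform(rotation_deg_z=0, translation_xyz=[3.0, 0.0, 1.5])
scene_pose = camera_pose_in_scene(tracked_pose)
print(np.round(scene_pose[:3, 3], 2))  # camera position in scene coordinates
```

In practice the calibration matrix is solved from surveyed reference points rather than typed in by hand, but the per-frame application is the same idea.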

Applications in Media Production

Television Series

On-set virtual production (OSVP) has been particularly transformative for television series, where tight production schedules and ongoing narratives demand rapid iteration and cost control. By integrating real-time LED volumes and game-engine rendering directly on set, OSVP allows crews to create immersive environments without extensive location shoots or post-production fixes, streamlining workflows for weekly episode deliveries. This approach contrasts with traditional methods by shifting much of the work to pre-production and on-set visualization, enabling directors and actors to interact with dynamic backgrounds in real time.

A key benefit in episodic television is the significant reduction in shooting days and costs. For instance, in The Mandalorian (2019-2023), over 50% of Season 1 was filmed using an LED volume called "The Volume," which eliminated the need for many location shoots and allowed for efficient capture of complex scenes, facilitating the series' demanding weekly release cadence on Disney+. These efficiencies not only cut transportation and set-building expenses but also reduced overall production time, making high-end visual effects feasible within television budgets.

Adaptations of OSVP for TV emphasize modularity to accommodate frequent scene changes across episodes. Modular LED volumes, which can be reconfigured quickly by swapping panels or assets, enable seamless transitions between environments, as seen in Westworld Season 4 (2022), where real-time virtual builds powered by Unreal Engine created dynamic, post-apocalyptic settings for episodes 7 and 8 without rebuilding physical sets. This flexibility supports the iterative nature of series production, where storylines evolve rapidly and reshoots must be minimized.

However, OSVP in television presents unique challenges, particularly in maintaining visual consistency across multiple episodes using reusable virtual assets. Issues such as color shifts from LED panel overheating or angle-dependent artifacts can disrupt continuity if not addressed, requiring rigorous calibration and metadata management to ensure assets render uniformly from episode to episode. Productions often mitigate this through tools like 3D LUTs and standardized pipelines to preserve the integrity of shared digital environments.

By 2025, OSVP in streaming series had increasingly incorporated AI-enhanced backgrounds to further boost efficiency. Netflix's Eyeline division, merging VFX and virtual production expertise, integrated generative AI for real-time asset generation in pilots and series like El Eternauta, allowing smaller-budget productions to achieve advanced visuals that would otherwise be cost-prohibitive. This evolution underscores OSVP's role in democratizing high-quality episodic content for platforms with rapid turnaround needs.

Feature Films

On-set virtual production (OSVP) has significantly enhanced storytelling in feature films by enabling directors to integrate high-fidelity digital environments directly into live-action shoots, allowing for more immersive performances and seamless integration that amplifies narrative depth and spectacle. Unlike traditional workflows, OSVP facilitates real-time visualization of complex worlds, reducing the disconnect between actors and their surroundings while meeting intensive VFX demands for photorealistic grandeur in theatrical releases. This approach supports the one-time, high-stakes nature of cinema, where epic narratives require unparalleled visual cohesion to captivate audiences on large formats.

In Denis Villeneuve's Dune (2021), OSVP contributed through the innovative use of sandscreens that mimicked Arrakis's dunes for sequences like the ornithopter flight, minimizing green screen spill and artifacts to ensure sharper integration in large-format presentations. These techniques preserved environmental authenticity, with limited LED wall use in one scene providing dynamic backgrounds that immersed actors without the visual inconsistencies of conventional chroma keying. The result was a heightened sense of scale and realism, elevating the film's epic scope while streamlining VFX post-work.

Films like Matt Reeves's The Batman (2022) demonstrated OSVP's budget impacts by utilizing LED volumes to capture intricate Gotham night scenes entirely on virtual stages, avoiding costly location shoots and the logistical challenges associated with urban exteriors. Industrial Light & Magic's technology enabled real-time rendering of rainy, neon-lit environments, reportedly contributing to substantial cost reductions—virtual stages can save up to 99% in carbon emissions and millions in production expenses compared to traditional location filming for similar sequences. This efficiency allowed for more creative iterations during principal photography, focusing resources on narrative intensity rather than environmental logistics.

Cinematic techniques in OSVP, such as parallax depth, have been pivotal for 3D-like immersion, as seen in James Cameron's Avatar: The Way of Water (2022), where LED panels extended underwater environments and Pandora landscapes to create layered, volumetric shots that enhanced spatial storytelling. By projecting real-time depth cues onto curved LED setups, filmmakers achieved negative parallax effects that drew viewers into the scene, fostering a tangible sense of depth without relying solely on post-composited 3D. This method not only amplified the film's immersive spectacle but also integrated physical performances with digital extensions more fluidly, supporting the narrative's exploration of vast oceanic realms.

Feature films often employ full-volume builds with expansive LED arrays—sometimes spanning hundreds of square meters—for epic sequences, in contrast to the smaller, modular setups typical in television that accommodate tighter production timelines. These large-scale configurations, powered by high-resolution displays, enable sweeping camera movements and intricate set interactions essential for cinematic grandeur, underscoring OSVP's role in realizing ambitious visions unattainable through scaled-down alternatives.

Other Formats and Uses

On-set virtual production (OSVP) has expanded into advertising, enabling brands to create immersive commercials with simulated environments that mimic global locations without extensive travel or set construction. This approach supports quick-turnaround shoots, often completed in days rather than weeks, by leveraging LED volumes and real-time rendering to integrate actors seamlessly with virtual sets. For instance, production studios like Broadley.tv have utilized OSVP setups for brand advertisements across TV and streaming platforms, allowing for dynamic scene changes and cost efficiencies in hybrid workflows. According to industry analysis, TV spots produced via virtual tools can cost 30% to 40% less than traditional methods, primarily due to reduced travel and location expenses.

In live events such as concerts and broadcasts, OSVP facilitates the integration of virtual backdrops and augmented elements to enhance audience engagement without disrupting physical performances. Virtual production toolsets enable real-time projection of digital environments onto LED walls, placing performers in fantastical settings during live transmissions. This has been applied in 2025 events, including sports broadcasts and music productions, where multi-camera setups combine physical stages with virtual extensions for immersive visuals. Companies like Megapixel have demonstrated these capabilities at industry showcases, highlighting virtual production's role in scaling live event production while minimizing setup times.

Educational applications of OSVP include virtual sets in film schools and corporate training programs, which simulate production environments to teach real-time effects and integration at a fraction of traditional costs. Institutions like the Savannah College of Art and Design (SCAD) incorporate LED volumes into curricula for hands-on film and television courses, allowing students to experiment with virtual environments without building physical sets. Similarly, another university received $25 million in funding in 2025 to establish a dedicated virtual production center, emphasizing sustainable practices that cut physical construction expenses. Overall, these implementations can reduce set build costs by up to 40% compared to conventional methods, while also lowering logistical demands.

Experimental uses of OSVP, particularly AR-enhanced variants, have emerged in interactive theater at 2025 festivals, blending live performance with overlaid digital elements for audience-responsive narratives. One immersive festival combined physical theater with virtual avatars and shifting environments, using AR to enable real-time interactions between performers and virtual props. Another festival featured digital theater projects that integrated OSVP for multi-sensory experiences, testing interactivity in large-scale venues. These innovations extend core integration processes by incorporating audience input via mobile AR, fostering hybrid performances that evolve dynamically during shows.

Notable Examples and Case Studies

Pioneering Productions

One of the pioneering applications of on-set virtual production (OSVP) occurred in the first season of the Disney+ series The Mandalorian (2019), where Industrial Light & Magic (ILM) introduced its StageCraft technology as the first major implementation for a Star Wars production. This setup featured a massive 20-foot-high, 270-degree semicircular LED video wall comprising 1,326 individual screens, surrounding a 75-foot-diameter performance area to enable immersive creature interactions, such as those involving the character Grogu (Baby Yoda), directly in-camera without traditional greenscreen compositing. The technology integrated real-time rendering via Unreal Engine and NVIDIA GPUs, allowing directors to visualize and adjust dynamic environments on set, marking a proof of concept for scalable OSVP in high-budget streaming series.

Building on this momentum, the series Batwoman (2020) represented an early adoption of OSVP techniques in television, particularly for constructing environments through virtual art department services and playback integration. Scarab Digital provided on-set virtual production support, including real-time content delivery and original footage capture for complex sequences, which facilitated the creation of urban Gotham sets in a controlled studio environment, demonstrating OSVP's potential for episodic TV budgets.

These projects significantly influenced industry standards by reducing reliance on post-production visual effects; for instance, over 50% of The Mandalorian's first-season scenes were captured using StageCraft, considerably lowering compositing workloads according to ILM's production reports. Key lessons from these early implementations included addressing initial challenges with LED wall artifacts, such as moiré effects caused by pixel grid interference with camera sensors, which were mitigated through shallow depth of field (T2.5–T3.5 stops) and anamorphic lenses to defocus backgrounds, alongside real-time volumetric color corrections in the rendering pipeline.

Recent and Innovative Projects

In recent years, on-set virtual production (OSVP) has seen widespread implementation in high-profile television series, demonstrating novel integrations of real-time rendering and AI to enhance scene complexity and actor immersion. A prime example is the second season of House of the Dragon (2024), where OSVP facilitated dragon-riding sequences by employing LED volumes to simulate flight over virtual cliffs and dynamic environments, including weather effects, allowing performers to interact with pre-visualized CGI dragons in real time. AI tools further assisted in refining visual effects for these scenes, automating aspects of CGI enhancement and post-production workflows to streamline the creation of epic aerial battles. This approach not only reduced the need for extensive location shooting but also enabled precise control over environmental variables, marking a significant advancement in fantasy production techniques.

OSVP's growing accessibility has empowered indie filmmakers, particularly through open-source and freely available tools, enabling low-budget projects to achieve professional-grade results without massive infrastructure investments. In 2025, initiatives such as the Virtual Filmmaker Challenge showcased short films produced using LED walls and ICVFX pipelines, where creators leveraged real-time engines for environment building and previsualization, democratizing complex visual storytelling for emerging talents. Similarly, the short Simulacrum (2025) was made entirely with OSVP techniques and open-source assets to craft immersive, in-engine sequences, highlighting how indie productions can now simulate intricate sets and effects on modest setups. These examples illustrate a shift toward broader accessibility, where cost barriers are lowering via community-driven tools and scalable hardware.

Innovations in hybrid OSVP-augmented reality (AR) have extended to music videos, blending live-action filming with interactive digital overlays to engage audiences dynamically. For instance, Ellie Goulding's 2025 music video employed OSVP to enable rapid shifts in virtual time of day and environmental elements during filming, incorporating AR for layered effects that allowed interactivity, such as fan-customizable visuals in extended releases. This hybrid method, powered by real-time rendering, facilitated seamless integration of performer movements with AR graphics, fostering innovative fan engagement through shareable, augmented content.

A 2022 Altman Solon survey indicated that OSVP had become routine in the industry, with 40% of major studios' current projects incorporating virtual production tools as of that year, reflecting its maturation from experimental to standard practice. Over 50% of executives anticipated further adoption within the next 18-24 months, driven by efficiency gains in collaboration and deadline management.

Challenges and Future Directions

Current Limitations

On-set virtual production (OSVP) faces several technical constraints that impact its practical implementation. Large-scale LED volumes, essential for immersive environments, consume substantial power, with systems drawing between 100 and 500 kilowatts during operation, contributing to high energy costs of $50 to $250 per hour at commercial electricity rates. This power usage generates significant heat, necessitating dedicated cooling systems that increase overall energy demands by 20-30% and can lead to panel overheating if not managed, potentially limiting continuous shoot durations to avoid thermal shutdowns or performance degradation.

Logistically, OSVP requires extensive preparation and expertise, posing barriers to efficient deployment. Full LED volume setups and configurations often demand several days for professional installation and 2-4 weeks for testing and commissioning, particularly for integrating tracking systems and rendering hardware. Additionally, productions need specialized crews, including virtual production supervisors, real-time rendering experts, and technicians, who must collaborate closely with traditional roles like directors of photography, extending onboarding and operational complexity. These factors elevate upfront costs, with basic small-scale volumes starting at $500,000 including installation, making OSVP prohibitive for modest-budget projects despite rental options of around $50,000 to $75,000 per week.

Creatively, OSVP struggles with replicating nuanced environmental elements, restricting its versatility for certain visual storytelling needs. Subtle atmospheric effects, such as fog or rain, are challenging to render convincingly in real time on LED walls due to limitations in dynamic particle simulation and interaction, often necessitating hybrid approaches that combine physical effects on set with digital enhancements. This can disrupt the seamless in-camera capture that defines OSVP's advantages, requiring compromises in spontaneity or authenticity for complex scenes.

Accessibility remains a key hurdle, with OSVP predominantly utilized by high-end studios due to its resource-intensive nature. While market growth indicates broadening participation, adoption among independent productions lags, limited by skill shortages in operating LED volumes and the high capital required for retrofitting infrastructure, affecting only a small fraction of smaller-scale projects as of 2025.
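
For a sense of scale, a simple calculation relates a volume's panel draw, cooling overhead, and local electricity price to an hourly operating cost. The figures below are purely illustrative assumptions consistent with the ranges quoted above, not quoted rates.

```python
def hourly_energy_cost(panel_draw_kw, cooling_overhead=0.25, price_per_kwh=0.30):
    """Estimate the hourly electricity cost of running an LED volume.

    panel_draw_kw    : combined draw of the LED panels (kW)
    cooling_overhead : extra fraction consumed by cooling (20-30% typical range)
    price_per_kwh    : assumed commercial electricity rate (USD/kWh)
    """
    total_kw = panel_draw_kw * (1 + cooling_overhead)
    return total_kw * price_per_kwh

# A mid-sized volume drawing 300 kW with 25% cooling overhead at $0.30/kWh:
print(f"${hourly_energy_cost(300):.0f} per hour")   # prints roughly $112 per hour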

Future Directions

Advancements in artificial intelligence are poised to transform on-set virtual production by enabling predictive rendering and automated scene generation, allowing for dynamic, real-time adjustments to environments during shoots. Generative AI tools, such as those integrated into pre-visualization pipelines, facilitate the rapid creation of storyboards and visual foundations from scripts, potentially reducing preparation time by up to 40% through automated asset generation for photorealistic settings like urban landscapes or natural terrains. These AI-driven workflows, exemplified by game-engine upgrades for in-camera visual effects (ICVFX), empower filmmakers to iterate scenes on set without extensive delays, fostering earlier collaboration between creative and technical teams.

Sustainability initiatives in OSVP are gaining momentum through the adoption of eco-friendly LED technologies and cloud-based rendering, which minimize energy use and logistical waste. New regulations on LED energy efficiency, effective from September 2025, mandate higher standards for imports, promoting low-power displays that reduce operational carbon emissions in virtual production studios. Cloud rendering solutions further contribute by shifting compute-intensive tasks to efficient data centers; virtual production workflows, including rendering, can achieve emissions reductions of 52-76% compared to traditional location filming, as per a report cited in industry analyses. These practices align with broader environmental incentives, accelerating the shift toward greener production pipelines in 2026 and beyond.

Efforts to democratize OSVP are advancing accessibility via portable mobile kits and VR headset integrations, enabling remote and on-location shoots without large-scale infrastructure. Modular LED volumes and portable stages, such as those from Be Electric Studios, allow mid-budget productions to deploy flexible setups for targeted scenes, lowering entry barriers for independent filmmakers. VR headsets provide directors with immersive, real-time previews of virtual elements, enhancing creative decision-making during pre-production and filming and reducing the need for costly revisions. This evolution supports broader adoption across diverse production scales, from indie projects to global collaborations.

Integration with collaborative virtual platforms is emerging as a key frontier for OSVP, facilitating distributed, real-time coproduction across global teams. Tools like the XR-IT framework enable multi-site live-action and synchronized mocap workflows in shared virtual spaces, as piloted in EU-funded projects for immersive media. By fusing OSVP with such environments, production teams can collaborate seamlessly from remote locations, streamlining asset sharing and iterative feedback in a persistent digital workspace. Recent experiments in distributed virtual production serve as precursors to this scalable, borderless approach.
