
Virtual Production Workflow: Previs, On Set, and Post

Two people in motion capture suits with reflective markers perform movements in a studio. "VIRTUAL PRODUCTION WORKFLOW" text below.

Virtual production is not a visual trick. It is a different way of planning, shooting, and finishing a project. A Virtual Production Workflow connects previs, on set craft, and post in a single real time ecosystem, so creative decisions are made with final pixels in mind from day one.


Instead of throwing plates over a wall to visual effects at the end, the entire team works inside the same digital space across pre production, principal photography, and finishing. Environments, lighting, digital humans, and motion data are shared assets that evolve over the life of the show. This is where a studio like Mimic lives every day, across scanning, rigging, performance capture, animation, real time engines, and visual effects.


This article walks through previsualization, on set practice in the volume, and post production in a real time aware pipeline. It is written for producers, directors, supervisors, and technical artists who want a practical, production ready view of how this workflow actually runs.



What virtual production changes in the classic pipeline

Comparison chart of Traditional vs. Virtual Production processes. Traditional uses a linear flow, while Virtual uses a circular method.

Traditional production is mostly linear. Art and visualization are handled early, but the bulk of the heavy lifting in visual effects and finishing happens after the shoot. Many creative decisions are locked when editorial delivers a final cut, and surprises in post are common.


In a Virtual Production Workflow, the most expensive and consequential decisions move earlier. World building, lighting, and camera design are validated in engine before the crew walks onto the volume. Real time rendering, LED stages, and live compositing let the director see a close to final image on set, not months later in a review theatre.


This shift has a few important consequences.

  1. Visual effects stop being a separate department at the end and become integrated across all phases.

  2. Pre production takes a larger share of schedule and budget.

  3. On set days become more predictable, with fewer unknowns related to weather, locations, and resets.

  4. Post becomes about refinement, integration, and a smaller layer of secondary work rather than discovery.


For character driven projects, this pipeline connects naturally to Mimic style work such as photoreal digital doubles, facial performance, and high fidelity digital humans. If your film or experience is built around believable people, the earlier those assets exist, the more value they deliver to every department. A good starting point here is a dedicated page on advanced 3D character services.


Previs and visualization before the shoot

Flowchart titled "Previs and Visualization Flow" with three stages: Pitch, Previs, Technical. Each stage includes icons and descriptive text.

Virtual production stands or falls on preparation. The more that is explored and tested in previs, the smoother the days on stage. Unreal Engine and similar tools make it possible to treat previs as a working version of the final film, not a rough animatic that gets thrown away.


1. From pitch to tech planning


Early visualization often flows through a sequence of related steps. Different studios name them slightly differently, but the intent is consistent.

  1. Pitch level visualization: This is a light but cinematic pass used to sell the project, align stakeholders, and test the tone. It might use rough characters and environments, but timing and framing already matter.

  2. Previs: Here, the creative team explores staging, lens choices, blocking, and tempo in 3D scenes. The virtual camera language that will drive the shoot is developed here.

  3. Technical visualization: Once a sequence plays well creatively, the same scenes inform real world logistics. Camera positions, crane reach, volume size, tracking requirements, and lens data are derived from these digital rehearsals.


All of this happens with the same core assets that will later drive the volume: environments, lighting setups, camera rigs, and in many cases, performance capture data.


2. Character, mocap, and facial work in previs

On character heavy projects, Mimic style performance capture is already part of previs. Actors can be recorded on a stage early, giving directors real performances rather than stand in animation. Those takes are then dropped into the engine to test blocking and camera work.


This is where a dedicated motion capture team becomes more than a vendor. Accurate body capture, facial data, and retargeting pipelines let previs stand closer to the final film. Rigging and deformation are validated under the same lighting and lensing that will be used on set.


3. Environment building and location scanning

The virtual art department designs, builds, and optimizes digital locations during pre production. Existing sets can be captured with photogrammetry or 3D scanning, while imagined spaces are built from concept art and layout studies.


Key points at this stage

  1. Environments must be physically aware. Door heights, stair spacing, and travel distances affect blocking and camera choreography.

  2. Assets must be ready for real time. That means clean topology, intelligent level of detail strategies, and shader setups tuned for the chosen engine.

  3. Lighting concepts should be proven both in engine and in the real world lighting plan, so heads of department can trust what they see on the LED walls later.
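The level of detail point above can be sketched in code. The following is a minimal, illustrative sketch of screen-coverage based LOD selection; the thresholds, LOD names, and the thin-lens projection shortcut are assumptions for illustration, not any engine's actual API.

```python
from dataclasses import dataclass

@dataclass
class LodLevel:
    name: str
    min_coverage: float  # use this LOD when the asset fills at least this fraction of the frame

# Invented thresholds: a hero asset gets full detail, distant dressing falls back.
LODS = [
    LodLevel("lod0_hero", 0.30),
    LodLevel("lod1_mid", 0.05),
    LodLevel("lod2_far", 0.0),
]

def pick_lod(object_height_m: float, distance_m: float,
             sensor_height_mm: float = 24.0, focal_mm: float = 35.0) -> str:
    """Estimate how much of the frame an object covers and choose a LOD."""
    # Projected image height on the sensor, thin-lens approximation.
    projected_mm = focal_mm * (object_height_m * 1000.0) / (distance_m * 1000.0)
    coverage = min(projected_mm / sensor_height_mm, 1.0)
    for lod in LODS:
        if coverage >= lod.min_coverage:
            return lod.name
    return LODS[-1].name
```

For example, a 2 m character 5 m from a 35 mm lens fills over half the frame and gets the hero mesh, while the same character 100 m away drops to the far fallback. Real engines make this decision per frame on the GPU, but the budgeting logic the virtual art department plans against is the same.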


When pre production is handled this way, the creative intent is already living inside the game engine by the time the volume is booked.


Previs is therefore not a throwaway step. It is the first practical expression of the Virtual Production Workflow.


On set practice inside the volume

Infographic on "On Set Practice Inside the Volume" shows stages: Physical, Digital, and Control. Includes text on real-time integration.

Once the show moves onto the stage, the work of previs becomes a set of live, controllable tools. The more faithfully the previs phase was executed, the less stress the shoot day carries.


The on set environment has three main layers.

  1. The physical stage: This includes LED walls, floor, tracking markers, any practical set pieces, and the camera and lighting package.

  2. The digital stage: This is the game engine scene, with environments, atmospheric effects, simulated light sources, and virtual cameras.

  3. The control layer: Here live compositing, camera tracking, color pipelines, and show control tools tie both spaces together. Operations teams often refer to this area as the brain bar.


On set, a Virtual Production Workflow lives and dies on discipline. Camera moves, lens changes, and lighting cues are all tracked, logged, and mirrored in the engine. When this is done well, editorial receives material that feels like final visual effects plates on day one.


1. Role of real time integration

Real time engines only add value if they are deeply integrated with the rest of the production stack. That means color pipelines that match the grading suite, lens encoders that match tracking data, and cameras that share metadata cleanly with the online and conform stages.


A robust real time integration setup ensures that virtual cameras, lighting cues, and tracking data are not temporary. They become part of the editorial and finishing process.
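One concrete way to make that data permanent is to log every take as a structured record. The sketch below is a minimal example of that idea; the schema, field names, and values (show, take_id, lens_mm, and so on) are assumptions for illustration, not a standard on-set logging format.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class TakeRecord:
    show: str
    take_id: str
    timecode_in: str     # SMPTE timecode at the start of the take
    lens_mm: float       # focal length reported by the lens encoder
    engine_scene: str    # engine scene loaded on the wall during the take
    engine_version: str  # engine build, so post can reproduce the render
    lut: str             # show LUT applied on the wall and monitors

def serialize_take(record: TakeRecord) -> str:
    """Serialize one take's metadata so editorial and finishing
    receive the same camera and engine context as the stage."""
    return json.dumps(asdict(record), indent=2, sort_keys=True)

take = TakeRecord(
    show="demo_show",
    take_id="sc012_tk03",
    timecode_in="01:02:03:04",
    lens_mm=35.0,
    engine_scene="canyon_day_v012",
    engine_version="5.3.2",
    lut="show_lut_v004",
)
print(serialize_take(take))
```

Because the record carries timecode, lens data, and the exact engine scene and LUT, conform and finishing can match any take back to its live conditions without guesswork.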


2. Working with actors and directors

For performers, virtual production can be freeing. They see the world they are playing in instead of a green screen. For directors and cinematographers it becomes a form of live action animation, with the ability to nudge a mountain, adjust a skyline, or roll a sunset forward in time during the take.


Good practice on set includes

  1. Clear ownership of changes inside the engine, so the world does not drift unexpectedly.

  2. Shared terminology between physical and digital teams, so a request for a lighting change or camera move is understood by both.

  3. Daily reviews of takes that include both live action and engine output, so problems are caught while the stage is still booked.


In character heavy scenes, facial capture can be run in parallel, with clean lines from the volume to the animation and rigging teams.


Post production in a real time aware pipeline

Flowchart titled Real-Time Post-Production Pipeline with steps: Editorial, Environments, Additional VFX, VFX Finishing; includes icons and brief descriptions.

Once the shoot wraps, the work does not stop, but its nature changes. In post, the Virtual Production Workflow stops being a handover and becomes a refinement stage where departments polish a shared result.


A few patterns define well run virtual production post pipelines.

  1. Editorial cuts with both camera original and engine delivered material from the start, not waiting for later visual effects pulls.

  2. Environments remain live. If a director wants to adjust a skyline or deepen fog in the distance, the environment artists can update the engine scene and deliver a revised pass without breaking the color pipeline.

  3. Additional visual effects get layered on top. Explosions, particle simulations, cloth and hair passes, creature work, and secondary composites still live in traditional offline renderers where needed.


Here, a team handling visual effects finishing becomes a partner rather than a post house at the end of the chain. They understand how to take in camera results from the volume and blend them with offline simulation, complex compositing, and final beauty passes.


The more information is preserved from set (metadata, lens grids, timecode, tracking, engine versioning), the more efficient this stage becomes.


Asset and data management across the show

Infographic on asset management: 1. Version control, 2. Naming structure, 3. Data integration, 4. Core discipline. Emphasizes clarity and stability.

Underneath previs, on set work, and post sits an asset layer that must be treated with the same seriousness as any other department.


  1. Version control: Environments, characters, and props need clear versioning, with a single source of truth and defined handoffs between departments.

  2. Naming and structure: File structures should reflect the life of the show rather than personal habits. Scenes, sequences, shots, and environments should be navigable by anyone joining mid production.

  3. Data from other stages: Scans, motion capture, rig builds, texture sets, and render caches all feed into the real time environments. Integrity here directly affects stability in the volume.
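A naming convention only helps if it is enforced at publish time. Below is a minimal sketch of that idea; the convention itself (show_seq_shot_kind_v###) and the example names are invented for illustration, since every show defines its own pattern.

```python
import re
from typing import Optional

# Hypothetical convention: show_seq###_sh####_kind_v###,
# e.g. "mmc_seq010_sh0040_env_v012". Real shows define their own.
ASSET_NAME = re.compile(
    r"^(?P<show>[a-z]{3})"
    r"_(?P<seq>seq\d{3})"
    r"_(?P<shot>sh\d{4})"
    r"_(?P<kind>env|char|prop)"
    r"_v(?P<version>\d{3})$"
)

def validate_asset_name(name: str) -> Optional[dict]:
    """Return the parsed fields when a name follows the convention,
    or None so the publish step can reject it before it hits the volume."""
    match = ASSET_NAME.match(name)
    return match.groupdict() if match else None
```

A gate like this at the publish step is what makes file structures "navigable by anyone joining mid production": a name such as "final_FINAL_v2" is rejected before it can destabilize the stage, while a conforming name parses into searchable show, sequence, shot, and version fields.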


Studios that work regularly in virtual production often treat asset management as a core discipline in the same way as layout or compositing.


Team roles and collaboration

Flowchart of team roles in collaboration, featuring Director, Cinematographer, and others interconnected around a Shared Digital Space.

Virtual production does not replace existing roles so much as connect them more tightly. You still need strong direction, experienced cinematography, production design, and visual effects supervision. The difference is that these conversations now happen earlier and are mediated through a shared digital space.


Key roles include

  1. Virtual production supervisor: Owns the connection between the physical shoot, the real time scene, and downstream post.

  2. Virtual art department lead: Responsible for environments in engine, including performance, fidelity, and consistency with production design.

  3. Real time technical lead: Oversees engine performance, stability, and bridge tools between the brain bar and the rest of the pipeline.

  4. Performance capture lead: Coordinates body and facial capture, retargeting, and integration with animation and rigging.

  5. Data and workflow lead: Ensures that everything from show LUTs to timecode travels cleanly through the pipeline.


When these roles collaborate closely with director, cinematographer, and production designer, the result is a coherent virtual world that behaves like a physical location.


Comparison table

Below is a compact view of how a virtual production pipeline differs from a traditional one across each phase of production.

| Stage | Traditional production | Virtual production driven pipeline |
| --- | --- | --- |
| Pre production | Storyboards and animatics, limited 3D testing, environments largely conceptual | Extensive previs in engine, digital locations built and tested, camera and lighting validated in virtual space |
| On set | Physical locations and sets, green screen for complex work, visual effects largely invisible during the shoot | LED volume or mixed stage, live compositing, director and actors see environments during takes |
| Post production | Heavy visual effects and environment work, many creative decisions still open, longer iteration cycles | Refinement of in camera results, secondary visual effects on top, faster iterations informed by live decisions made earlier |
| Data flow | Plate based, many manual handoffs between departments | Shared assets and scene data across previs, shoot, and post, closer link between engine and offline tools |
| Risk profile | Schedule and cost risk concentrated in post, surprises common late in the process | More investment and planning up front, fewer unknowns once shooting begins |

Applications across film, games, and live media

Icons illustrate virtual production: films, games, ads, performances, and training. Black and white design with bold text.

Although virtual production emerged most visibly in high end episodic work, it now stretches across many sectors.

  1. Feature films and series: Large shows use volumes to control complex environments, from alien worlds to city streets that would be logistically painful on location. Digital doubles and advanced facial work sit comfortably inside this pipeline.

  2. Games and cinematics: Game studios already live inside real time engines. For them, virtual production is a natural way to capture in engine cinematics, using film crews and performance capture on stages that mirror their game worlds.

  3. Advertising and branded content: Agencies can cycle through multiple locations in a single day, or iterate on art directed environments at speed. Mimic style AI driven avatar work can front digital hosts for campaigns while staying within a single pipeline.

  4. Live performances and events: Concerts, launch shows, and mixed reality broadcasts use virtual stages to push live entertainment into augmented and extended formats. A strong base in XR focused experiences is valuable here.

  5. Training, medical, and industrial use: Simulated environments with realistic digital humans support rehearsal, education, and complex procedure training without leaving the studio.


In each case, the core pattern is the same. A shared engine and asset base serves previs, capture, and post, rather than building different pipelines for each project type.


Benefits for producers and creatives

Infographic titled "Benefits for Producers and Creatives" highlights predictability, creative alignment, performance space, reusable worlds, and faster marketing. Black icons illustrate each point.

For producers the biggest gain of a Virtual Production Workflow is predictability. When locations, lighting, and weather are simulated, schedule and budget become less vulnerable to chance.


Other benefits include

  1. Stronger creative alignment: Directors, cinematographers, production designers, and visual effects supervisors all look at the same 3D scenes from early in the process. Misunderstandings are caught when they are still inexpensive.

  2. Better performance space for actors: Instead of playing to a green wall, performers are surrounded by the environment their characters inhabit. This improves eyelines, emotional connection, and timing.

  3. Reusable worlds: Once a digital location is built to the right standard, it can be reused across episodes, sequels, or campaigns with incremental improvements rather than full rebuilds.

  4. Faster marketing material: Because key moments exist in engine, stills and teaser content can be generated long before final online and grading are completed.


For studios that already handle scanning, rigging, motion capture, and animation, this way of working turns those capabilities into a single continuous service instead of isolated offerings.


Challenges and risk points

Virtual Production: Challenges & Risks. Illustrated icons and text highlight issues like front-loaded effort, technical complexity, talent gaps, overbuilding, and misaligned expectations.

Virtual production is powerful, but it is not magic. It introduces its own set of challenges and failure modes.


  1. Front loaded effort: Creatives who are used to improvising late in the process may struggle with the amount of decision making required during pre production.

  2. Technical complexity: A volume is a dense stack of technology. Tracking, LED, cameras, color, engine, and control software all need to work together. Weakness in any part affects the whole show.

  3. Talent gaps: The demand for experienced real time technical artists, engine supervisors, and virtual art department leads still outpaces supply. Training and mentorship are essential.

  4. Over building: Teams sometimes spend too much time on environments and assets that only appear briefly on screen. Clear priorities and shot based planning are important.

  5. Misaligned expectations: Stakeholders may assume that virtual stages solve every problem. In reality, some work is still best done on location or with traditional bluescreen techniques, especially very large scale action or destruction.


Studios that are honest about these constraints and design their projects accordingly tend to deliver better, more consistent work.


Future outlook

Virtual production trends infographic with five steps: tighter engine connections, intelligent environments, digital humans, democratization, and media convergence.

Virtual production is still evolving, but a few trends are already visible.


  1. Tighter connection between engine and offline renderers: Expect smoother round trips between real time scenes and high end path traced renders, making it easier to combine volume work with complex simulation and lighting.

  2. More intelligent environments: Environments will respond more naturally to action, with procedural systems for crowds, traffic, foliage, and weather all driven by the same data that powers the shoot.

  3. Smarter digital humans: Studios with deep experience in facial rigs, neural animation, and conversational avatars will bring those capabilities into the virtual stage, enabling interactive characters and live controlled performances.

  4. Democratization: Smaller stages, improved tracking solutions, and lighter toolchains will bring this way of working to mid scale productions, independent features, and even high end social content.

  5. Convergence with live and immersive media: The same assets used for a film may also drive a live concert, a game level, and a mixed reality installation, making cross media storytelling more economical.


As these shifts continue, virtual production will feel less like a special technique and more like a standard way of making screen content.


Frequently asked questions


Q1. Is virtual production only for blockbusters?

No. The tools were proven on large shows, but the same principles help mid budget projects, advertising, corporate work, and even educational experiences. The key is to scale the stage, asset count, and team size to the project rather than copying big studio setups.

Q2. Do I still need post visual effects if I shoot on a volume?

Yes. In camera environments reduce certain tasks, but many projects still need additional compositing, simulations, secondary digital doubles, or fine detail work. The value of virtual production is that much of this work is already framed and lit correctly, which saves time later.

Q3. How early should I involve a virtual production team?

As early as possible. The same people who will run the stage should have input on script breakdown, shot design, and environment planning. Bringing them in once the schedule is locked and sets are designed reduces their ability to help.

Q4. Can virtual production work for fully animated projects?

Yes. Many animated shows now use real time engines, virtual cameras, and performance capture to direct scenes like live action shoots. The difference is that every element is digital, but the workflow across previs, principal capture, and finishing is similar.

Q5. How do digital humans fit into this workflow?

High quality digital humans and doubles are simply another class of asset inside the same engine. They benefit from the same previs, on stage lighting, and post pipelines as environments. For complex character led experiences, dedicated pages such as Mimic’s work on digital humans and 3D animation show how these disciplines connect to the broader pipeline.


Conclusion


Virtual production is not about replacing craft with technology. It is about moving decisions earlier, connecting departments more closely, and letting filmmakers see their work as it will truly appear while they are still able to change it.


When previs, on set craft, and post are designed as one Virtual Production Workflow, crews spend less time fighting surprises and more time shaping performance, composition, and emotion. For studios like Mimic that already live across scanning, rigging, mocap, animation, real time engines, and visual effects, this is simply the natural way to work.


The productions that benefit most are those that respect both sides of the equation: the physical stage, with its lights, lenses, and actors, and the virtual stage, with its worlds, characters, and data. Treating those as a single, shared space is where the real gains appear.


For inquiries, please contact: Press Department, Mimic Productions info@mimicproductions.com
