MO-SYS TECHNOLOGIES USED IN THE CONTRACTOR

MO-SYS STARTRACKER

The glue behind virtual production. It tells the system where the camera is in the virtual world, so as you move the camera, the virtual background – whether greenscreen or LED – moves with it. The StarTracker itself is a little box that sits on top of the camera, and it works by looking at stars – reflective circles – on the studio ceiling or floor (starmats can be laid down anywhere, quickly).

And it does it in real time, which helps you visualise the shot as you shoot it. Kind of essential, but a luxury at the same time.

And it corrects the background’s parallax. It even knows the lens distortion at each focal length and moves the scene accordingly, so a panning or tilting shot looks completely real.
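
For the technically minded, here is a toy sketch of the idea – not Mo-Sys’s actual code; the pose format and the distortion coefficient are invented for illustration. A tracked camera pose drives the virtual camera each frame, and a simple focal-length-dependent radial distortion term warps the background to match the real lens.

```python
# Toy sketch of camera tracking driving a virtual camera -- not Mo-Sys code.
# The pose format and the distortion coefficient are invented for illustration.
from dataclasses import dataclass

@dataclass
class CameraPose:
    x: float          # position in metres
    y: float
    z: float
    pan: float        # rotation in degrees
    tilt: float
    roll: float

def apply_radial_distortion(u: float, v: float, k1: float) -> tuple:
    """One-term radial (Brown-Conrady style) distortion on normalised
    image coordinates; k1 changes with the lens's focal length."""
    r2 = u * u + v * v
    scale = 1.0 + k1 * r2
    return u * scale, v * scale

def render_frame(pose: CameraPose, k1_for_lens: float) -> None:
    # A real system would hand `pose` to the engine's virtual camera here,
    # then distort the CG background so it lines up with the physical lens.
    u, v = apply_radial_distortion(0.5, 0.25, k1_for_lens)
    print(f"camera at ({pose.x:.2f}, {pose.y:.2f}, {pose.z:.2f}), "
          f"pan {pose.pan:.1f} deg; sample point lands at ({u:.3f}, {v:.3f})")

# One frame of tracking data, as it might arrive from the tracking box:
render_frame(CameraPose(1.2, 0.0, 1.6, pan=15.0, tilt=-3.0, roll=0.0), k1_for_lens=-0.08)
```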

MO-SYS VP PRO SOFTWARE

This software embeds the information from the StarTracker into the virtual world, powered by Unreal Engine 5. And it does so in a way that encourages direct interaction with Unreal without constraining either piece of software (Epic Games develops and improves Unreal so often that the Mo-Sys software is designed not to interfere with new versions).
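
That decoupling idea can be sketched in general terms – keep the tracking feed behind a narrow interface, so engine upgrades never touch it. The names below are invented for illustration; this is not the VP Pro API.

```python
# Invented names for illustration -- this is not the actual VP Pro API.
from typing import Protocol

class TrackingSource(Protocol):
    def latest_pose(self) -> dict:
        """Most recent camera pose as a plain dict."""
        ...

class FakeTrackerFeed:
    """Stands in for a live StarTracker feed in this sketch."""
    def latest_pose(self) -> dict:
        return {"x": 0.0, "y": 1.5, "z": 0.0, "pan": 10.0, "tilt": 0.0, "roll": 0.0}

def engine_tick(source: TrackingSource) -> None:
    # The engine side only ever sees the narrow interface, so upgrading
    # the engine doesn't ripple into the tracking code (or vice versa).
    pose = source.latest_pose()
    print("virtual camera updated:", pose)

engine_tick(FakeTrackerFeed())
```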

MO-SYS ENCODED TURNTABLE

Director Toby Fountaine wanted to make use of the boundless possibilities of shooting virtually and envisaged a 360-degree rotating shot for one of the central (and funniest) scenes in the pilot.

So the Mo-Sys engineers built a turntable, physically large enough to turn an entire restaurant table and its occupants – but also encoded, so it plugs into the virtual world: as the set and actors turn, the background turns with them.
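
What “encoded” buys you can be sketched in a few lines – the encoder reports the turntable’s angle, and the virtual scene is rotated by the same amount every frame. The resolution and values below are illustrative assumptions, not the actual Mo-Sys implementation.

```python
# Illustrative sketch of an encoded turntable -- resolution and values assumed.
TICKS_PER_REV = 4096  # assumed encoder resolution

def encoder_to_degrees(ticks: int) -> float:
    return (ticks % TICKS_PER_REV) * 360.0 / TICKS_PER_REV

def update_virtual_scene(encoder_ticks: int) -> float:
    angle = encoder_to_degrees(encoder_ticks)
    # In the engine you would rotate the virtual environment by `angle`,
    # keeping the background locked to the physical turntable.
    return angle

for ticks in (0, 1024, 2048, 3072):  # quarter-turn steps
    print(f"{ticks:4d} ticks -> background rotated {update_virtual_scene(ticks):5.1f} deg")
```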

Mo-Sys designed this to satisfy the director’s vision for this shot, and in doing so invented a system that allows them to provide customers with turntables of 4m+ diameter… without going over budget or falling behind the shooting timetable.

Game-changing. Table-turning.

MO-SYS MOTION CAPTURE (MOCAP)

Remember those mocap suits with coloured blobs on them? No longer needed. The Mo-Sys mocap system uses six cameras to capture the digital skeleton of a person dressed in normal clothes or costume. This skeleton can then be re-dressed in any costume in the virtual world and placed anywhere in the scene.

So all our extras are MetaHumans or mocapped people.
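
A digital skeleton is, in essence, a named set of joints with rotations per frame – and because it is just data, the same performance can drive any character. A toy sketch, with joint names and data layout invented for illustration:

```python
# Toy sketch of a captured skeleton driving another character --
# joint names and data layout are invented for illustration.

# One frame of captured joint rotations (degrees), keyed by joint name.
captured_frame = {
    "hips": (0.0, 12.0, 0.0),
    "spine": (5.0, 0.0, 0.0),
    "left_arm": (0.0, 0.0, 45.0),
}

def retarget(frame: dict, rest_pose: dict) -> dict:
    # Any character sharing the joint names can copy the captured pose,
    # whatever costume its mesh wears; missing joints keep their rest pose.
    return {joint: frame.get(joint, rest) for joint, rest in rest_pose.items()}

extra_rest_pose = {"hips": (0.0, 0.0, 0.0), "spine": (0.0, 0.0, 0.0),
                   "left_arm": (0.0, 0.0, 0.0), "right_arm": (0.0, 0.0, 0.0)}

print(retarget(captured_frame, extra_rest_pose))
```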

MO-SYS HYBRID WORKFLOW

We made the pilot so quickly partly thanks to careful planning by DOP Alex Grigoras, and partly thanks to the Mo-Sys hybrid workflow.

Mo-Sys arranged their studio so the team had immediate access to a greenscreen stage and an LED wall right next to each other. The team could choose to shoot against greenscreen or LED, depending on what suited each shot…

POST-PRODUCTION

EDITING

Screenography editor Tom Sands achieved picture lock: confirming which shots are used where, so everyone knows how long the pilot is and what it will look like, and the other departments can begin their post-production work. The shots with an LED background are all captured in camera, so they’re ready as soon as they’re shot.

GREENSCREEN RE-RENDERING

But the greenscreen shots need re-rendering to bring the background up to broadcast quality (on set, you shoot the background at lower quality for speed).

This re-rendering takes a lot of computing power (one local computer took 8 hours to render 20 seconds of footage). Thankfully Mo-Sys has NearTime, a cloud rendering and compositing solution hosted on AWS (Amazon Web Services) that delivers a much higher-quality (cinema) finish precisely because it isn’t limited to rendering in real time.

It provides render quality that isn’t capped by the frame rate of the camera, and it dramatically reduces the post-production time and cost of re-rendering VP backgrounds. It uses cloud computing power – other people’s computers – to deliver the re-rendered rushes, conformed from the edit’s XML, directly to the editor for use in the final cut.
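
The arithmetic shows why the cloud matters. A back-of-envelope sketch – only the 8-hours-for-20-seconds figure comes from above; the node count and shot lengths are invented:

```python
# Back-of-envelope sketch -- only the 8h-for-20s local figure comes from
# the text; node count and shot lengths below are invented.
LOCAL_RENDER_SECONDS_PER_FOOTAGE_SECOND = 8 * 3600 / 20  # = 1440

# Only the frame ranges actually used in the cut (as listed by the edit's
# XML/EDL) need re-rendering, and each range can go to a separate node.
shot_lengths_seconds = [4.0, 7.5, 2.0, 6.5]  # hypothetical shots in the cut
nodes = 50                                   # hypothetical cloud node count

total_footage = sum(shot_lengths_seconds)
local_hours = total_footage * LOCAL_RENDER_SECONDS_PER_FOOTAGE_SECOND / 3600
cloud_hours = local_hours / nodes  # idealised perfect parallelism

print(f"{total_footage:.0f}s of cut footage: ~{local_hours:.1f}h on one machine, "
      f"~{cloud_hours * 60:.0f} minutes across {nodes} nodes")
```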

So the editor gets the final version, quickly, directly.

COMPOSITING & COLOURING

And to merge the real foreground with the virtual background seamlessly – so it doesn’t look like a cardboard cut-out being carried across a stage – some of the greenscreen shots need compositing. The compositor also removes elements caught on camera that aren’t supposed to be there (like lighting rigs). World-renowned compositor and VFX supervisor Gianluca Dentici stepped in to oversee this process with leading post-production house iGene.
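
At its simplest, compositing is a per-pixel blend: estimate how “green” each foreground pixel is, turn that into an alpha matte, and mix the foreground over the re-rendered background. A minimal NumPy sketch of the idea (real keyers are far more sophisticated):

```python
# Minimal chroma-key composite with NumPy -- a toy keyer, not a production one.
import numpy as np

def green_screen_composite(fg: np.ndarray, bg: np.ndarray) -> np.ndarray:
    """fg, bg: float32 RGB images in [0, 1], same shape."""
    r, g, b = fg[..., 0], fg[..., 1], fg[..., 2]
    # Pixels where green dominates are treated as screen (alpha -> 0).
    greenness = np.clip(g - np.maximum(r, b), 0.0, 1.0)
    alpha = np.clip(1.0 - greenness * 4.0, 0.0, 1.0)[..., None]
    return fg * alpha + bg * (1.0 - alpha)

# Tiny check: a pure-green frame over a grey background vanishes entirely.
fg = np.zeros((2, 2, 3), np.float32)
fg[..., 1] = 1.0
bg = np.full((2, 2, 3), 0.5, np.float32)
print(green_screen_composite(fg, bg)[0, 0])  # -> [0.5 0.5 0.5]
```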

Finally, the whole pilot is colour graded. DOP Alex Grigoras, having been involved with the lighting continuously throughout the project, is in charge of the colour grade.