Tim and Bash -- Behind the Scenes


Introduction

Tim and Bash is a series following the adventures of a firefighter (Tim) and his pet dragon (Bash). As Bash is prone to lighting things on fire, Tim has his hands full!

Their original production process relied on the traditional approach of shooting ‘blind’ without an animated character reference, leaving the crew to imagine Bash’s position and reactions in each shot.

Jetset Discovery

Andy, the animator who creates Bash’s movements, was looking for an improved method to shoot their scenes. He discovered Jetset, and started to experiment with its capabilities.

A Workflow Transformation

Andy realized that they could completely revamp their production workflow, using Jetset both for previsualization and for the actual on-location production.

Here’s how they are approaching this:

Location Scan, Animate, and Pre-Shoot

First, Andy scans the planned shoot location with an iOS scanning app like Scaniverse or Polycam. With a LiDAR-equipped iPhone, this can provide a 1:1 scaled scan of the location with color textures.

Andy imports this scan into Maya, and then adds scene locators to enable shooting in Jetset from different points.
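For anyone who wants to mirror this setup step in Maya, here’s a minimal Python sketch. The scan file path, locator names, and positions are placeholders for illustration, and the exact locator naming Jetset expects may differ.

    # Minimal Maya Python sketch: import the location scan and add named locators
    # to shoot from. Path, names, and positions are placeholders, not Jetset requirements.
    import maya.cmds as cmds

    # Import the textured scan exported from Scaniverse or Polycam (OBJ assumed here).
    cmds.file("/path/to/location_scan.obj", i=True, type="OBJ", namespace="locationScan")

    # Create scene locators at useful shooting positions (names are hypothetical).
    for name, position in [
        ("shootPoint_livingRoom", (0.0, 1.5, 3.0)),
        ("shootPoint_hallway",    (4.0, 1.5, 0.0)),
        ("shootPoint_frontDoor",  (6.5, 1.5, -2.0)),
    ]:
        locator = cmds.spaceLocator(name=name)[0]
        cmds.xform(locator, worldSpace=True, translation=position)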

He then animates Bash running through the scene, leaping onto the sofas, and finally dashing out the door.

He exports the entire scene, including the textured scan of the location, Bash’s animation, and the scene locators, to a USD file with Maya’s standard USD exporter, then converts it from USD to USDZ with Autoshot’s USDZ conversion tool. Finally, he pushes the USDZ file into Jetset via Autoshot’s file transfer system.
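In their pipeline, Autoshot handles the USDZ conversion and the file transfer. Purely as a rough sketch of the export-and-package step, the same thing can be approximated with the maya-usd export command and Pixar’s USD Python utilities; the paths, frame range, and flags below are assumptions and may vary by Maya and USD version.

    # Rough sketch of exporting the scene to USD and packaging it as USDZ,
    # assuming the maya-usd plugin and Pixar's USD Python bindings are available.
    import maya.cmds as cmds
    from pxr import Sdf, UsdUtils

    usd_path = "/path/to/tim_and_bash_previs.usd"
    usdz_path = "/path/to/tim_and_bash_previs.usdz"

    # Export the whole scene (location scan, Bash's animation, and the scene
    # locators) over the animated frame range; the range here is a placeholder.
    cmds.mayaUSDExport(file=usd_path, frameRange=(1, 240))

    # Package the USD file and its textures into a single USDZ file; this is the
    # role Autoshot's USDZ conversion tool plays in the actual workflow.
    UsdUtils.CreateNewUsdzPackage(Sdf.AssetPath(usd_path), usdz_path)

For the production shoot described below, the same export could be run on just Bash’s animation (with the scan geometry excluded) to keep the on-set USDZ light.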

Inside Jetset, the team can try different camera angles and focal lengths, and test shoot the entire sequence, all without needing to be at the actual shooting location, because the whole scan is in the iPhone!

They can edit the shots together to find the right rhythm to tell the story.

Chasing Dragons in Production

Because they already have an animated Bash in a scene that is dimensionally matched to the production location, the team can simply load Bash’s animation (without the scan), set the Jetset origin in the live-action shooting location, and capture live-action footage of Bash leaping around the scene.

Because his animation was created against a scan of the production location, Bash now leaps over real sofas and bounds around real corners, and the camera operator can frame the shots through the Jetset viewfinder as if filming a real dragon moving through the scene!

The extremely low latency of Jetset’s real-time tracking and rendering is key here. For a rapidly moving CG character like Bash, viewfinder latency would cripple the camera operator’s ability to track him realistically through the scene.
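To put numbers on that, here’s a quick back-of-envelope calculation. The pan speed, latency, lens field of view, and viewfinder resolution are all assumed figures for illustration, not Jetset specifications.

    # Rough illustration of how viewfinder latency translates into on-screen lag
    # while panning to follow a fast-moving character. All numbers are assumed.
    pan_speed_deg_per_s = 90.0   # fast pan to follow a running, leaping character
    latency_s = 0.100            # hypothetical 100 ms viewfinder latency
    horizontal_fov_deg = 60.0    # assumed lens horizontal field of view
    frame_width_px = 1920        # assumed viewfinder width in pixels

    lag_deg = pan_speed_deg_per_s * latency_s               # 9 degrees of angular lag
    lag_px = lag_deg / horizontal_fov_deg * frame_width_px  # roughly 288 pixels on screen

    print(f"Bash appears {lag_deg:.1f} deg ({lag_px:.0f} px) behind where he should be")

Even a tenth of a second of lag would put the character hundreds of pixels away from where the operator is aiming, which is why near-instant feedback in the viewfinder matters so much.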

Stay tuned for the next video in this series, showing this process happening on location!