Transcript
**Eliot:** All right, morning all. Hey, Heiko.
**Heiko:** Hi, greetings from Germany.
**Eliot:** Greetings. All right, and Josh, good to see you. Good to see you. All right. Fantastic. Uh, let's see. Uh, who wants to jump in first?
**Ben:** Yeah, Eliot.
**Eliot:** Hey, Ben. How you doing? All right, fantastic. Um, let's see. I tell you what, Heiko, let's start off with yours real quick and kind of see what we're looking at.
Let me pull up my email. So I've got a reference.
**Heiko:** Sure. Um, well, I have nothing specific. I just wanted to talk about the current process with Unreal Engine, and maybe there's something I don't know which we could use for our project, which we are shooting in a few weeks. Maybe I'll just show you my current workflow.
**Eliot:** Yeah, yeah, let's take a look, because we've made a number of changes that I think are going to be really good for Unreal, that I haven't had time to put into the documentation. We've written the code, it works, but I need to document a little better how it works with our timeline processing system.
**Heiko:** Right. That's maybe something worth looking into, because right now I just go shot by shot. But what I can say, what I really love, is how fast you implemented the option to select the take by the cine file name instead of the iPhone file name. I already tested that in Auto Shot and I love it.
**Eliot:** Excellent. Excellent.
**Heiko:** That's really excellent. A real time saver, because that's what you already work with in Resolve, so it's great to just pick it that way. Okay. So I guess I'll just share my screen here. Can you see?
**Eliot:** Yeah, yeah.
**Heiko:** Okay, cool. So this is basically the timeline we have from our test shooting day
**Eliot:** mm-hmm.
**Heiko:** where we tested the latest version, and these are already comp shots I produced. Here's one. This is an idea we have: we want to try to have an actual, real thing included in the CGI wall of the spaceship later on. So this is an actual thing we did on set.
**Eliot:** That's great. That's great.
**Heiko:** Built into a green screen wall.
And that was basically the test shot. It's not 100 percent perfect tracking right now, but it's very close, so I'm confident we can get it working. Well, mostly my workflow is basically that, as you can see, I have the time code displayed and also my file name, so I can just go to the start frame.
And when I go into Auto Shot (this is now my PC, this is remote),
**Ben:** Mm-hmm.
**Heiko:** I can just select basically the file I want to work in.
**Eliot:** Mm-hmm.
**Heiko:** Actually, no, that's the right one. And enter the correct time codes.
**Eliot:** Mm-hmm.
**Heiko:** And of course I need to be on the right reel. And yeah, basically this is all. My takes are already synced, and that's why I can choose this file name.
**Eliot:** Mm-hmm.
**Heiko:** One of my questions is: I have this Refine Offset option, and it seems to do something, so I'm actually using it every time I import a new shot. So the question is, why do we have this as an option, and why does it not run automatically all the time?
**Eliot:** Ah, okay. So what Refine Offset does is it runs optical flow analysis on the Jet Set video and the cine video.
And if they're moving around a fair amount, it can deduce the motion vectors and provide a closer time alignment. Because when you match by time code, the time code can be off by a frame, depending on the system. It looks like your time code matches are very, very good.
Um, but sometimes the time code is all over the place, and in those cases it works well to calculate the motion vectors of the shot and then use those to refine the offset. You have to get close with the time code, within a couple of frames, and then it can refine it. Now, if you're getting good results without it,
you know, that's fine. It's not refining position, it's refining the timing of the tracking data. And it's really there to handle a time code system where the frames are just one or two frames off, which is not uncommon with time code, depending on the system.
It's very, very common to see that. So that's what it's doing.
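As an aside, the alignment Eliot describes can be sketched in a few lines. This is a hypothetical illustration, not Auto Shot's actual implementation: `refine_offset` and the per-frame motion curves it takes are invented for the example, and it simply cross-correlates the two curves over a small shift window.

```python
import numpy as np

def refine_offset(motion_a, motion_b, max_shift=3):
    """Find the integer frame shift of curve b relative to curve a that
    best aligns the two per-frame motion-magnitude curves, searching
    within +/- max_shift frames (the 'couple of frames' window)."""
    a = np.asarray(motion_a, dtype=float)
    b = np.asarray(motion_b, dtype=float)
    best_shift, best_score = 0, -np.inf
    for shift in range(-max_shift, max_shift + 1):
        # Take the overlapping region of the two curves at this shift.
        if shift >= 0:
            x, y = a[shift:], b[:len(a) - shift]
        else:
            x, y = a[:shift], b[-shift:]
        n = min(len(x), len(y))
        x = x[:n] - x[:n].mean()
        y = y[:n] - y[:n].mean()
        denom = np.linalg.norm(x) * np.linalg.norm(y)
        if denom < 1e-9:
            continue  # no motion to correlate (Eliot's static-shot caveat)
        score = float(x @ y) / denom  # normalized cross-correlation
        if score > best_score:
            best_score, best_shift = score, shift
    return best_shift
```

Note the static-shot failure mode Eliot describes below: if the curves have almost no variance, `denom` collapses and no shift can be preferred.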
**Heiko:** Okay. So you mean basically I should not use it, and only use it if I see a problem?
**Eliot:** Yeah, I would say use it if
**Heiko:** you save on time.
**Eliot:** Yeah. The problem is, if you have a shot that's mostly stationary, or the motion is not moving around very much,
then when it does the optical flow analysis, there's no motion for the algorithm to get hold of.
**Heiko:** Understood. Yes.
**Eliot:** And it could be like all over the place. So,
**Heiko:** So basically, if I have a tripod shot where I just pan and tilt, it would probably get confused because there's no parallax.
**Eliot:** Uh, parallax is okay. It's just shot motion. This is a purely 2D process where it is correlating the 2D motion of the pixels of the cine camera footage with the Jet Set footage and just making sure they're time-synchronized better. So if the camera's moving, it'll work.
It should work. There are just a few cases where the camera barely moves, where the optical flow doesn't work that well.
**Heiko:** Okay. So just use it if you think something is off by a frame or so.
**Eliot:** Yeah.
**Heiko:** Otherwise I don't need it. I just made it a habit to click it before I start importing the shot.
So, up to now, all the shots I tested from that shooting day worked perfectly fine, and tracking was basically flawless, in the sense of what you can expect from just using the phone with no post-tracking. So I'm very happy with it. The process really works well.
Okay. So there's basically not much more to tell. It's a simple workflow: I import the LIDAR scan. Another thing I really like (I wrote it in the email) is that you can now realign your shot in Jet Set based on the 3D LIDAR scan you have. I just stumbled upon that feature on our shooting day, because I did the scan of our green screen
and one element in the shot, and later on, after we had a break, I had to realign my scene. Then I saw I can just realign by using a scan, and it was just a simple point and click, and pop, my scene was realigned perfectly and I could start recording again. So that's a really nice feature I wasn't aware of.
**Eliot:** Yes, yes. That's, that's
**Heiko:** one of those things you find by accident. Maybe one more thought I had regarding the keying process: we can key out the green, of course, but this information does not transport further than the app, right? So
**Eliot:** Right.
**Heiko:** The workflow right now is you generate the footage into EXRs for, let's say, Unreal, but then you have this whole process of
duplicating the material in Unreal Engine and dragging a new keyer onto it. And I was thinking whether it would make sense to record the keying information in Jet Set from the actual shoot, where you already key the footage, maybe do a simple upscale of that matte, and use it as an alpha channel for the EXR when you import it into Unreal or Blender or whatever, depending on what you are after.
I myself only use it for reference and not for final render, so a simple matte would be perfectly fine. Of course, it would not be enough if you wanted to use it for final pixel in Unreal Engine, to my knowledge. But it was just a thought I had, because it seems unnecessary to do the process twice:
key it in Jet Set, then completely disregard that information and do another key in Unreal Engine later on. I don't know how feasible that is, but it occurred to me to suggest it.
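Heiko's idea, upscaling the phone's low-resolution matte and carrying it along as the EXR's alpha, could be sketched like this. The function name and the nearest-neighbor choice are illustrative only; a real pipeline would use a better resampling filter and an actual EXR writer.

```python
import numpy as np

def matte_to_alpha(matte, rgb):
    """Upscale a low-res matte (H, W) with values in 0..1 to the cine
    plate's resolution and stack it as the alpha channel of an
    (H2, W2, 3) float RGB image, giving an (H2, W2, 4) RGBA array
    ready to be written out as an EXR. Nearest-neighbor sampling
    keeps the sketch dependency-free."""
    h2, w2 = rgb.shape[:2]
    h, w = matte.shape
    ys = np.arange(h2) * h // h2   # source row for each target row
    xs = np.arange(w2) * w // w2   # source column for each target column
    alpha = matte[ys[:, None], xs[None, :]]
    return np.dstack([rgb, alpha.astype(rgb.dtype)])
```

For reference-quality comps, as Heiko says, the soft edges from a blunt upscale like this are acceptable; for final pixel they would not be.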
**Eliot:** No, it's a great suggestion. In Unreal, are you using JPEGs or EXRs to process inside Unreal?
**Heiko:** Uh, I use EXRs, uh, as you can see.
**Eliot:** Oh, I see. Okay. I got it. I got it.
**Heiko:** Um,
**Eliot:** okay.
**Heiko:** So you would have alpha channel information available if you want it. But like I said, I just use it for a rough reference; mostly I don't use the footage at all, because if I have a 3D scan, I can just align my scene.
Mm-hmm. If I have something that is, uh, in my scene, like if you look here, that's our bridge set from our project.
**Eliot:** Oh, that's great.
**Heiko:** And if you have a green screen and, for example, you have the captain's chair: we had the scan of the captain's chair in its position, and that is basically our anchor for the shot. Then I can just align my whole take using the 3D scan and align it with that model,
then just hide that model in the render later on, and my shot is already aligned. So I don't really need to do much more than that.
**Eliot:** Right, right. Bringing the key from Jet Set is a good idea. And in fact, what we would probably want to do, because in Jet Set the key is quite low resolution:
when we bring in the cine footage, we're downsampling pretty heavily, down to, I think it might be 540p, so that we don't overload the phone. And of course, what you'd really like to transfer is the green screen color. We'd have to work out the path from, in the iPhone, an sRGB signal, and in Unreal
it's linear. So we have to work out the color space stuff, but that's workable. So it's a really good idea. And I think we're going to be doing more work on integrating the pieces as a solution, to get all the pieces together.
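The sRGB-versus-linear gap Eliot mentions is the standard piecewise transfer function; a minimal per-channel sketch looks like this (gamut conversion and tone mapping are separate problems, not covered here):

```python
def srgb_to_linear(v):
    """Convert one sRGB-encoded channel value in [0, 1] to linear light
    using the standard piecewise sRGB transfer function."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

def linear_to_srgb(v):
    """Inverse of srgb_to_linear."""
    if v <= 0.0031308:
        return v * 12.92
    return 1.055 * v ** (1.0 / 2.4) - 0.055
```

Applied per channel, this is the conversion a phone-side key color would need before it could be matched against Unreal's linear working space.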
Um, especially as we ramp up, Spark lets us keep track of things much more easily, in terms of getting the flow of data through from point A to point B. But I guess one of the things that's interesting is I hadn't thought about just using the time codes as the shot workflow.
Um, and what we have done (and you can tell me whether or not this makes sense for you, I don't know if it does) is we did implement a timeline processing workflow where you export an OpenTimelineIO file from Resolve and go through it. It added some complexity, because what we ended up doing is it generates a separate,
more structured directory tree that looks like a traditional visual effects tree, where each shot is in a directory and the subdirectories have EXRs and proxies and so on. So it's a much more traditional visual effects output. It does add a layer of complexity, and looking at the process you have, what I do like is the simplicity of it, where you type in the time code in, time code out,
**Heiko:** and it works perfectly
**Eliot:** and it works.
And I'm inclined to say it's working, and I would keep doing that. Um, as we go through this, what's going to happen in Spark is that we're going to rebuild all these pieces that we have in Jet Set, Jet Set Cine, and Auto Shot in a much more streamlined fashion.
And I think what we're going to be able to get to in Spark, on the post-processing side, is to take the workflow much closer to what you're doing, where there's not a separate output directory structure, which is what we had to do for this initial attempt at timelines. We just start with a timeline, and then everything is simple:
just shot-based folders. In Auto Shot we couldn't do that, because we started off without a timeline and then added timelines later, when the editors started having OpenTimelineIO. So it's unnecessarily complex to do timeline processing in Auto Shot. I'm just talking myself through it, and I like your process.
I'm inclined to say: if it's working, I would just do this project with the process you've got, because you've got it dialed. And then a little bit later, for your next show, when we actually have all the Spark post stuff dialed in, it should be both as simple as what you have and start off with a timeline from day one, which is really how it should work.
Mm-hmm. So anyway, that's me kind of logically thinking my way through
**Heiko:** it,
**Eliot:** but it, it's
**Heiko:** interesting.
**Eliot:** Yeah, we added a lot of complexity to do timeline processing, and I just look at the simplicity of what you're doing. I like simplicity, and I think we need to make timeline processing as simple as what you're doing.
**Heiko:** Our project is also maybe a bit special because, like I told you before, our whole post-production team is me and a buddy of mine, so two people doing the whole post-production.
So we don't need huge structures and perfect file naming and all this, because I'm the one doing all the Jet Set and Unreal Engine stuff, and my colleague is doing all the previs and Max CGI renders of the spaceships. So there's not even much overlap between our two fields. I just need to know where I find my files.
**Eliot:** Right, right.
**Heiko:** So that
**Eliot:** Yeah.
**Heiko:** Really makes it, uh, much more streamlined and
**Eliot:** All right. So I'm going to recommend you keep doing it the way you're doing it. And what we will be able to do, again, probably for your next episode, is have our timeline processing as simple and as straightforward as what you have right now with your manual time code entry process.
It's really clean. It's gonna work great.
**Heiko:** Yeah.
**Eliot:** It's gonna scale.
**Heiko:** The only thing I don't have at the moment, though I could have it: the way I work is, as you can see here, most of the time I basically pick my shot, cut it to its length, then duplicate it and create a VFX Connect clip.
So I'm going to the Fusion standalone version, not the integrated one in DaVinci Resolve, and just give it the same name as the Sony source file.
**Ben:** Mm-hmm.
**Heiko:** And render it out and have my composition. So I don't have frame handles in the classic sense. What I could do would be to drag it onto a new layer and just make it a bit bigger.
**Eliot:** I'm getting a call from an alarm company. Lemme, let me get this real quick. Sorry about this. Sure.
Sorry about that. All good.
**Heiko:** No problem.
**Eliot:** It's just, you know, I got a remote alert from the alarm system, like, what's that? It's fine.
**Heiko:** Yeah.
**Eliot:** Distracting stuff. Okay. Um, okay.
**Heiko:** Awesome. Um, yeah, like I said, I just go to the Fusion standalone version and start building my shots. For example, if I open this one,
this is really just an example, not a fully realized shot, but as you can see, it's super simple. It's just my source footage, my render from Unreal, and a merge node. A somewhat more developed shot would be
**Eliot:** Oh, great. So then this way, each shot is an individual VFX Connect clip, and it opens up in Fusion
**Heiko:** The main reason for it is that
these shots take a lot of time to render, of course.
**Eliot:** Mm-hmm.
**Heiko:** You don't get real-time playback. So this is a shot that is a bit more realized, how it should look in the end. It's a bit slow because I'm actually using Corridor key (we wrote about it in email, and Corridor key is a bit slow on my Mac), so it's usually faster than that.
But the advantage of going through VFX Connect is that all shots are final renders, so I can just watch my movie in real time and
Oh,
**Eliot:** that's great.
**Heiko:** And you can always go back into Fusion, do your changes, and just re-render the shot. It's then refreshed in Resolve, and you can keep working and watch your movie in real time and not wait for any renders.
**Eliot:** Can you go back to that VFX clip? Do you get to choose where the VFX Connect clip lives?
**Heiko:** Uh, yes, you can. Uh,
When you say New VFX Connect Clip, you can specify locations. So we have a dedicated Fusion folder, we have VFX Connect, and then it creates subfolders, much like Jet Set or Auto Shot, with its own naming conventions. But you get to name the clip, so I can always find my VFX clips, because they have the same numbers as the original clip.
**Eliot:** Mm-hmm.
**Heiko:** And you can of course say what color space and container format you want, so you can also do DPX or EXR and all the flavors. For our sake, it's okay to just go Apple ProRes, because we already have MP4s coming from our Sony in 10-bit, so 4:2:2 Apple ProRes is perfectly fine for that.
And I also leave the color space and everything as-is, so it's still S-Log3. And when you go to... now my Zoom bars are covering my icons.
**Eliot:** No worries. No worries.
**Heiko:** Okay. So I always start with a color space transform, where I go to linear from my camera color space, and then do everything from there.
**Eliot:** And on the output, they have a standardized set of output file names, so VFX Connect knows how to do the return trip?
**Heiko:** Right. The one I created is at the beginning, as you can see. It always starts with just the source clip and the Saver node, and it has basically the name you gave it plus an underscore V01.
**Eliot:** Mm-hmm.
**Heiko:** And as you can see, the folder name is basically what Resolve created, and it's already linked, so you don't have to do anything.
**Eliot:** Wow.
**Heiko:** Okay. The only thing you have to do is, if you want another output format compared to the one you put into it, like you go in ProRes 4:2:2 but you want to come out linear EXR, then of course you have to change these settings. But it's still working.
So the link is still active, and you just get a rendered EXR file, which will be linked to this clip here on the timeline later on. So that works too, depending on what you need for your post-production, collaboration, et cetera.
**Eliot:** It's very elegant. I like how they have it set up, and I just wish there was a more generalized standard for how we could all do this.
I mean, Resolve has it really well set up to do that. Um,
**Heiko:** yeah.
**Eliot:** That's great.
**Heiko:** Absolutely. And the main reason for us is really just the fact that I can completely render my VFX shot and have smooth playback and not have to think about it. Of course, you can always do something like Render in Place and pre-render your clip there.
**Eliot:** Mm-hmm.
**Heiko:** But for me it's much more preferable to work in the standalone version, because I can also just open Fusion without Resolve, open the last shot I was working on from the recents, do my stuff, and not think about Resolve, and I still know the next time I open Resolve the shot is
refreshed and ready to work on. Fusion standalone is so much slimmer and easier and faster than working through Resolve. Also, I had so many weird things over the last few years when working with Fusion within Resolve that I have some trust issues.
**Eliot:** I know exactly what you're talking about.
**Heiko:** I always have the feeling that my composition is not really physically somewhere and can at any point just disappear on me.
And
**Eliot:** yes,
**Heiko:** It's too much work to just rely on that. This way I know I always have the comp file somewhere on my hard drive, which is also in my backup on my NAS, and the backup from my NAS on another drive.
**Eliot:** Yep.
**Heiko:** So plenty more safeguards, and that's good for sleep.
**Eliot:** Yeah. No, this looks fantastic.
I think this is the way to do this.
**Heiko:** And one last thing, maybe, if I can just find a shot. I have all the...
let me just see. Auto Shot, Auto Shot, Auto Shot.
Just whatever shot. Okay. One thing we are doing, which is maybe a bit special: we do not use the Jet Set cam.
**Ben:** Mm-hmm.
**Heiko:** Because we have this thing I came up with during our test shoot before we did our last episode, and that is simulating an anamorphic lens without really being anamorphic, meaning we use a stretch on the camera.
So we get the vertically stretched bokeh balls, but it's still just a normal cine camera. And I realized all I need is some factor to multiply my actual lens focal length by. In our case, we shoot with a 35 millimeter on a full-frame Sony, and if I use the 2.0 stretch factor for the anamorphic look, I need to use a 42 millimeter focal length for a full-frame sensor in Unreal.
And I get the exact same frame, but I get the look of an anamorphic lens. Because of course we shoot in 16:9 and we just crop the picture at the end, but we still have the possibility to reframe shots vertically, which is really helpful because we often have some green screen elements still in the frame.
And instead of rotoscoping them out, we can just push the shot a bit behind the crop bars. And for that reason, I just created a simple blueprint camera, which is nothing else than an empty blueprint actor with a cinema camera loaded into it, already set up: it's full frame, it has the Sony sensor settings,
it's already set to the squeeze factor of 2.0, and it has the focal length of 42.
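For anyone wanting to sanity-check framing numbers like Heiko's, the usual pinhole field-of-view formula is enough. The helper below is illustrative; the 35 mm to 42 mm factor for his 2.0-squeeze setup is the value he matched empirically for his rig, not something this formula derives on its own:

```python
import math

def horizontal_fov_deg(focal_mm, sensor_width_mm=36.0):
    """Horizontal field of view of a spherical lens on a given sensor
    (defaults to a full-frame sensor, 36 mm wide)."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_mm)))
```

A 35 mm lens on full frame gives roughly a 54-degree horizontal field of view; the 42 mm setting is correspondingly tighter before Unreal's anamorphic squeeze is accounted for.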
**Eliot:** You created basically an anamorphic lens in Unreal that matches the field of view of the iPhone.
**Heiko:** Exactly.
**Eliot:** I got it. I got it. It took me a second.
**Heiko:** Not the field of view of the iPhone, but of the Sony.
**Eliot:** Of the Sony. Of the Sony, yes.
**Heiko:** Yep. And the initial reason was: I needed this when we were using the free version of Jet Set, before we got in contact, and I needed to have an offset. And because the offset is always the same (we had our own rig built), I just measured the distance and put the distance offset into the blueprint camera.
So back then it was just putting these offsets here into the location transform, and then I can just attach this blueprint camera to the original camera
**Eliot:** I love it and
**Heiko:** that I get from Jet Set, and so I always had my offset. Now I don't need it, because everything is already correct from Jet Set thanks to the Cine workflow.
So I just have this camera for the anamorphic look. And if you compare both, so if I switch,
**Eliot:** that's great.
**Heiko:** This, this is jet set camera
**Eliot:** Uhhuh.
**Heiko:** So this is what I got from calibrating, and this is the camera I created, basically just attached to the Jet Set camera. And as you can see, they are basically the same, just with the added benefit that
I can now have my anamorphic bokeh balls if I need them. And the other thing I did: the camera has an automatic tracking mode for focus, and it's attached to this focus target, which is just an empty actor in my scene. Let me just unhide everything, as you can see.
Some, some more stuff going on.
**Eliot:** Yeah, yeah. It's kind of the CG stand-in. You can lock the focus to
**Heiko:** it. So this actor I can just put in my scene where I want my focus. I can animate it, I can attach it to stuff. So it's much easier for me to do the artificial focus pulling, because of course Jet Set does do some focus stuff, but it's not always reliable or doing the thing I need it to do.
**Ben:** Mm-hmm.
**Heiko:** So this is much easier for me: just have this actor animated, or put it on the object I want my focus to be on. And that's basically our workflow. When I'm done, I just go to the export dialogue and use EXR, basically what you are doing when you import the stuff and what you have set up through Jet Set.
And I just use my own folder, not the Jet Set folder, and put it into a folder with the name of the Sony shot, so I can easily track it later on and find it again in my Fusion composition. That's basically the workflow we are using right now.
**Eliot:** Alright.
**Heiko:** One thing I encountered a few times during our test shoot day when using the data: sometimes I get most of the import, but I get some weird error messages, and it does not open the new composition, the new sequencer, and sometimes a few things are missing.
**Eliot:** Oh, okay.
**Heiko:** I always get the camera, and I can always somehow get it to work later on. But sometimes it works flawlessly: after my import, the sequencer opens, the shot is there, everything is like it should be. And sometimes the sequencer does not open and some stuff is missing. And I could not really recreate the effect, or tell you why it happens or when it happens.
So it's just something I encountered a few times.
**Eliot:** The next time you run into that, can you zip up the take and send it to us? I just want to look to see if there's something weird in the original take data, or if something weird is happening on the import into Unreal. The next time you see it, just zip it up and send it, and we'll open it up.
**Heiko:** Sure. Yeah.
**Eliot:** I love this process. You're really figuring out how to construct a full episodic workflow with basically yourself and one other person doing the shots, which is just fantastic. Yeah. And very clever, locking the focus to a 3D actor.
This is great. At some point, I think what will make sense is for us to get some of the new workflow pieces dialed in, so we can actually start with timeline things in Unreal. Oh, do you... Unreal has some sort of new timeline system called CATI? I haven't looked into it much.
Are you using it, or is it something I should be looking at?
**Heiko:** No, uh, doesn't ring a bell.
**Eliot:** Okay. I then I'm not gonna worry about it as much. Yeah.
**Heiko:** Uh, okay. So Resolve just crashed.
**Eliot:** Yep.
**Heiko:** I just wanted to show you our actual timeline from when we worked on the last movie, which is already online.
So Scon, which I showed you before.
**Eliot:** Yeah, yeah, yeah.
**Heiko:** And it's basically exactly what I just showed you. We have our master timeline, and each original clip has its duplicate, which is a VFX Connect clip. And we did our sound editing and all that later on, with a final render of the movie on the video track.
And as you can see, it's really just a super simple timeline. It's basically always just our VFX Connect clip, and if you hide it, you have your original beneath it.
**Ben:** Mm-hmm.
**Heiko:** So that's basically it. And as you can see, 16 by 9, and you can see we already had to drop this a bit lower.
During editing, we just use this
**Eliot:** mm-hmm.
**Heiko:** and do the final render in a crop mode, so we get the real anamorphic scope. But that's about it. Yeah. Maybe one thing
about our Jet Set rig right now, which I'm actually pretty proud of how it turned out. Let's see.
**Eliot:** That's great. Combining the practical and digital.
**Heiko:** Yeah. We really hope this will work out great. So it's the interaction with an actual Unreal background, but people can reach into it and pull out the cable, and there's actually a tablet inside with an Aircast animation playing, so it's remote controlled.
So when someone puts a cable into it, a lamp can switch from red to blue, and stuff like that. So it's not all digital.
**Eliot:** If you run into any issues with the track slipping on some of those critical shots, we are pioneering some automated tracking refinement pieces, which I think would actually work well for these shots.
So if you run into that, it would really be interesting.
**Heiko:** Yeah.
**Eliot:** Yeah. Just again, let us know. It's not really part of the system yet, it's still kind of R&D stuff, so you have to send us the shot and then Greg processes it. But I've been doing it for some of the Pathways shots, and it works really well.
Okay. This is great. Yeah.
**Heiko:** So that's the actual rig we used for the test.
**Eliot:** Oh, that's great.
**Heiko:** It's a bit dark, but I hope you can see it well.
**Eliot:** Yeah. Oh yeah, I can absolutely see this. So you've got it all mounted on a stabilizer.
**Heiko:** Exactly. So it's on a big gimbal, on this ring. And on the support ring, of course, the iPhone in front with the fan.
This thing here is our receiver for the audio, and it's connected to a splitter cable, so one channel is the audio from our boom pole, and the other one is from the time code. The time code generator is basically just put into the clamp
**Eliot:** Mm-hmm.
**Heiko:** Uh, from the seam o. So just,
**Eliot:** yep.
**Heiko:** And this is the CMO right here. And this is just a plate for the V-mount. So the whole rig, apart from this receiver, is powered by the V-mount battery: the Sony, the CMO, the iPhone, and the fan. I had some trouble getting the fan to work with the CMO; I think I wrote with you about it.
But the solution in the end was to just plug the fan into one of the USB ports on this plate here, and now everything is powered by one V-mount battery, and one battery lasts around two and a half to three hours. So that's nice. This thing also lasts almost one whole shooting day, so it's pretty reliable as a full cine rig
now for Jet Set. The only problem is that the whole thing now weighs between six and eight kilograms.
**Eliot:** Mm.
**Heiko:** So that's why we need this.
**Eliot:** The back support. Yeah.
**Heiko:** Yeah. With back support,
**Eliot:** The Easyrig.
**Heiko:** Um, and the only thing, and that's what I was writing with Greg about a few days ago, is, uh, because our Sony, and this is just a Sony problem, cannot, um, output a normalized Rec. 709 picture while shooting in S-Log3.
So we get a very dim, very low contrast, low saturation preview on our iPhone. And as you can see, I actually use Jet Set as my control monitor, so I actually see the scene live within Jet Set, within the Unreal Engine. Um, but of course it looks a bit weird, and it's hard to tell if the Sony autofocus actually works, because you cannot really
see if the focus is correct, because of the low contrast picture. So my intention was to ask if maybe in the future there would be a possibility to load in, uh, a simple S-Log3 to Rec. 709 conversion.
**Eliot:** Oh, yeah, yeah. We have to do this. You know, just taking one look at this.
We absolutely have to do this. You know, we're going through and really redesigning a bunch of the systems to work better for productions like yours, especially as the team grows. There needs to be a remote operations system, better LUT control, et cetera.
So we'll get there, right? Um, I absolutely see the need to have an onboard LUT system. And I think what we would want to do is have that be part of the project metadata. So you're like, okay, we're shooting a Sony S-Log3 project, and it shows up correctly on the camera, and then it shows up correctly in post-processing.
And so, you know, you just know which camera is which log transform. We have to get there. So, if you're shooting in log, the preview's not perfect right now, but we're a hundred percent gonna get there. It just takes a little while to put together all the pieces to make it coherent.
Wow.
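For reference, the S-Log3 to Rec. 709 normalization being discussed can be sketched from Sony's published S-Log3 transfer function and the BT.709 OETF. This is a minimal per-value sketch of the math, not Jet Set's actual implementation:

```python
def slog3_to_linear(v):
    """Sony S-Log3 code value (0..1) to scene-linear, per Sony's published formula."""
    if v >= 171.2102946929 / 1023.0:
        return (10.0 ** ((v * 1023.0 - 420.0) / 261.5)) * (0.18 + 0.01) - 0.01
    return (v * 1023.0 - 95.0) * 0.01125 / (171.2102946929 - 95.0)

def rec709_oetf(lin):
    """BT.709 opto-electronic transfer function, input clamped to 0..1."""
    lin = max(0.0, min(1.0, lin))
    if lin < 0.018:
        return 4.5 * lin
    return 1.099 * lin ** 0.45 - 0.099

def slog3_to_rec709(v):
    """Normalize an S-Log3 code value to a Rec. 709 preview value."""
    return rec709_oetf(slog3_to_linear(v))
```

Applied per channel, this restores 18% gray (S-Log3 code value 420/1023) to a normal-contrast Rec. 709 preview value of about 0.41, which is the "normalized picture" Heiko is asking for.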
**Heiko:** Well, it's working. So as you can see, I'm actually having my actors here on the bridge in the live preview, and I was actually amazed how well it works. So all the troubles we had reported to you back then, when we tried the free version with our old iPhone, really were just the old iPhone, because of the missing performance.
And now, with the iPhone 16, it's really working great. Great. I was amazed how fluently it's working; I don't think we had a single crash on that day. And it wasn't even feeling very laggy because of the translation from the camera through the SeeMo to the iPhone.
It felt pretty natural, so I was actually able to just monitor using the iPhone. As you can see, the rig doesn't have an additional big monitor or something. I just have the small monitor from the Sony on the side. I don't know if you can see it.
**Eliot:** oh, yeah, I see it. I see it. So after we get
**Heiko:** it's, it's there.
So I have some control, but as you can see, I'm always looking at the iPhone and not at the Sony monitor. So it does work really well. The only thing missing for me right now, for our project, would be having a normalized Sony picture, so the preview is a bit more natural looking. But it's working fine.
So I can work with it. I'm just, uh, complaining on a high level. Yeah.
**Eliot:** Oh yeah, yeah, yeah. It's alright. Okay, this is fantastic. And again, a little bit later, I think we want to save this for when we have, like, the Spark, uh, timeline processing. We need to do a course with you, because you're basically putting together the prototype of how a small team shoots an episodic show, right?
Yeah. With this super tiny footprint. And yet you're getting amazing results. And as we get the process to go faster and faster, at a certain point you're gonna get some really interesting phone calls, because your cost footprint is some tiny fraction of what people are currently using to do sci-fi projects.
And yet you're totally doing it. So it's just really fun to see this. Um, good. Okay, I should probably handle some of Ben's stuff, but this is just fantastic. I'll check on the LUT stuff. I bet we're gonna have to do that as part of adding the live compositing pieces back into the Spark iOS app.
Mm-hmm. And so we can kind of re-engineer all the different pieces of it to be more production ready. Um, so we're gonna get the LUT stuff. I'm actually gonna put that on our list. We have to figure out how to do a GPU version of it.
In AutoShot, we use a CPU version of the LUTs, and it's a big floating point conversion, and we have to do it right. Um, and I think we have to figure out a preview version of it that we can run on the phone GPU. But we'll look at that and figure out how to start, 'cause it's such an important problem to solve.
You need to have a LUT-corrected preview on set. So, I totally get it. But man, I hope you're catching this, 'cause it's just awesome. It's so much fun to see.
**Ben:** I wish I could. I'm actually multitasking, and I looked in between, and it looks so interesting. Just the fact that you're such a small team doing episodic.
Yeah.
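The CPU-versus-GPU preview tradeoff Eliot describes is commonly handled by baking the expensive conversion into a lookup table once and then doing a cheap per-pixel lookup; on a GPU the table becomes a texture fetch. A CPU sketch of the idea, using a stand-in gamma curve rather than Lightcraft's actual LUTs:

```python
import numpy as np

def bake_lut(transfer_fn, size=1024):
    """Evaluate the (expensive) transfer function once into a table."""
    xs = np.linspace(0.0, 1.0, size)
    return np.array([transfer_fn(x) for x in xs])

def apply_lut(image, lut):
    """Per-pixel lookup: scale 0..1 values to table indices and sample."""
    idx = np.clip(np.round(image * (len(lut) - 1)).astype(int), 0, len(lut) - 1)
    return lut[idx]

# Stand-in gamma curve playing the role of the log-to-709 transform.
lut = bake_lut(lambda x: x ** 0.45, size=1024)
frame = np.array([[0.0, 0.25], [0.5, 1.0]])
preview = apply_lut(frame, lut)
```

A 1024-entry table is accurate enough for a monitoring preview while keeping the per-pixel cost to one index computation and one fetch, which is the kind of thing a phone GPU handles easily.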
**Eliot:** Yeah. All right. Uh, Heiko, I better jump to Ben. Ben, just to double-check, should I stop recording and such? I don't remember exactly. If you're dealing with anything super proprietary, it's okay.
**Ben:** We're just speaking very basic Jet Set things.
**Eliot:** Okay.
**Ben:** So it can be recorded. And feel free to stay. Nice to meet you. Um,
**Heiko:** Nice to meet you too.
**Ben:** So, the team did a bit of a Jet Set test on Friday. I wasn't on set, so that's a little bit different. We discussed it today, and they reported to me, and they wanted me to ask you some questions, some of which I think I have the answer to,
and some I think I need to just ask you directly. Um, they had an interesting thing they wanted to bring up: when they use Jet Set and they do a lot of in and out, let's say they turn off the phone or accidentally close the app, it really takes time for them to get back into the flow. Mm-hmm. So they want to know if there was anything
you were thinking about, or some sort of solution, so that if something accidentally gets closed, getting back on track doesn't mean, again, you know, placing the origin and re-scanning everything. Right. Um, so they wanted me to ask about that.
**Eliot:** Actually, okay. There is a feature, and Heiko mentioned using it.
This is one of these pieces where our development ends up running a little bit ahead of our documentation, but there is a snap-to-scan feature in the origin that we implemented, um, in the last month or couple of months; I'd actually have to go back and look at it.
But what it does is, if you, you know, drop your origin and do a scan, um,
**Ben:** mm-hmm.
**Eliot:** Of your environment, and if your environment has enough objects in it to have some 3D environment stuff to kind of grip to... if it's just a blank room, it's not gonna work. But if you have, say, the bookshelf or a couple of the prop objects you had in Pathways, then what it does is, um,
and of course the phone drifts, it's like all over here. In the origin tab page, you can hit, um, I think Snap to Scan is the new button. And what it'll do is run a very, very quick scan, like a couple-second scan, to detect the current geometry of where you are, compare it to your original scan, and then automatically relock the origin back to where it should be.
And that's one of these tools where, you know, again, we're working on making the iPhone work better for production.
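Snap-to-scan is essentially a relocalization problem: align the quick scan's geometry to the original scan and apply the recovered rigid transform to the origin. A generic sketch of the underlying math (Kabsch alignment on matched 3D points), not Jet Set's actual algorithm:

```python
import numpy as np

def rigid_align(P, Q):
    """Find rotation R and translation t minimizing ||R @ P + t - Q||.

    P and Q are 3xN arrays of corresponding points (quick scan vs. original scan).
    """
    cp, cq = P.mean(axis=1, keepdims=True), Q.mean(axis=1, keepdims=True)
    H = (P - cp) @ (Q - cq).T            # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])           # guard against reflections
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t
```

With noiseless, non-degenerate correspondences this recovers the drift transform exactly; applying its inverse to the current origin "relocks" it to where the original scan says it should be.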
**Heiko:** Um, one thought about it. Uh, I do think you also have the feature of this printout, where you can align your origin to this print. And, uh, I tested it one or two times, but I think this align-to-scan actually works better, or is faster.
But if you have the possibility, you can do the printout from the, uh, Lightcraft resources page,
**Ben:** Yeah,
**Heiko:** put it on the floor and just point Jet Set at it, and have that as your starting point again. So everything should line up again. But like I mentioned, with the rescan, well, the 3D scan, the LiDAR scan, it actually works pretty well.
So that's what we used on our last shoot, and it's really just two seconds; it recognizes the old scan and just snaps to it, and you are done. So we were pretty fast in getting back into it. The only thing you always have to do after you quit Jet Set is do the first scan, so it has some 3D information and some alignment.
That's what you always have to do on startup. But the other thing, yes, works pretty well from my own experience.
**Eliot:** Ben, one thing that we noticed, 'cause when we were cooking through some of the Pathways stuff, I realized that, um, I think you guys did a new scan on almost every take.
And that would've taken a lot of time. And again, I wish I could have been there to be like, ah, no, you don't need to do that. Um, the only time you need to do a scan, like the actual geometry scan, is when you have changed your origin,
or if you have changed the 3D scene, the physical scene, 'cause of course everything moves around; then you could kind of rescan it. But other than that, what you could do is drop a floor origin somewhere in a known place.
Just keep it there, tape it down, whatever, do a 3D scan of it, and then when you come back, okay, great, you can relocate, right? Sometimes you can just re-detect it and have at it. Um, but you don't need to do a 3D scan. There's the mapping that the iPhone sort of needs to do, where you look around really quickly to make sure you're seeing stuff. But the actual "I'm in the scan button,
I'm actually making mesh" part, you only have to do that once for many, many shots.
**Ben:** So even that's the question. Let's say they accidentally closed the phone, or they closed out of the app. That's what they were concerned about. Do they need to re-scan it?
**Eliot:** No, I don't think they need to re-3D-scan. Actually, what they will need to do is sort of re-localize. When you open it again, it starts a new ARKit session, and it has to figure out where it's at.
So there's the ground... just point the origin
**Ben:** somewhere.
**Eliot:** Yeah. Point the origin somewhere, and lock it, you know, snap it; point down to where you see the floor target and reset the origin. But you don't need to go walking around trying to map all the 3D environment.
That's just, like, a once-every-20-shots thing, you know. Only when you are changing the physical set. Like, if you're removing the bookcase, then yeah, go re-scan it. Or if you're changing to... I'll use Pathways terminology, 'cause we're both familiar with it.
Yeah. If you're moving to the rock wall, yes, re-scan once you're on the rock wall. It's the same rock wall, the whole thing, for all the shots. So that's just one 3D scan for however many shots it is.
**Heiko:** And then one thought from me. Um, we just did that: one scan at the beginning. We scanned our green screen,
mm-hmm, like you, uh, saw in our video. Just the screen, some floor elements, and, uh, in our case, this chair, but the chair wasn't actually needed. And that actually works fine to realign the scan. But what is important, what I missed, was: there is a possibility over time that your origin, or your scene, starts to drift away from the 3D scan
**Ben:** mm-hmm.
**Heiko:** due to some dropout or something. In one take, we had a slight jump, so, uh, our 3D scan wasn't aligned with our actual origin point. So later on, when I was actually trying to use the 3D scan for this panel you saw in my video, to perfectly align it to the point in Unreal Engine, nothing lined up.
And I was scratching my head about why. And then I realized, okay, in that take we had just jumped somewhere. Uh, so it's good practice to have the, um, LiDAR scan overlaid on your image while shooting, so you immediately see if something is not lining up correctly. But if you do that, and just scan your green screen and some elements around it at the start, you can use it all day long.
Just make sure it always realigns before you start a new take. But you don't need to scan again unless you need a new element in your scan for reference in Unreal or Blender. In our case, it was the panel, so we have the exact plane where it should be in Unreal Engine. Otherwise, we could have used one scan for the whole day.
**Ben:** No, that's a good point. I think just leaving it on, and then you just see whether it aligns or not. I think the one problem with that is if you use the phone take you're recording as the final product and not solving it in Blender. But we are now doing it in Blender, right? We're using that setup. Um, Eliot, another question that I do have, and these are all secondhand discussions, because I'm physically not there, so I don't know what they did.
**Eliot:** Yeah.
**Ben:** Um, they had an issue with the origin, which I found to be interesting, because when you print out the big version on the floor, there's usually never an issue. But for some reason, for them, sometimes the floor came in higher. Um, I know Greg helped us take a look at it because of the scale; we had a scale issue.
Do you know what might be the reason that the floor sometimes comes in higher? Is there a general glitch? I don't know.
**Eliot:** Um, the "sometimes" part... that would be a scale issue of the target. Um,
**Ben:** okay.
**Eliot:** Usually it does a decent job of finding the floor. But I tell you what, one of the many things that we're cooking through, again with Spark: we have a centralized kind of switching system where you can communicate.
Yeah. And so I want to get to the point, and we were just talking about this last week, where we can run a remote feed, to basically be able to do remote VFX supervision on a shoot. Because, again, I got nailed by this on Pathways, because I was traveling. You hit the one Friday where every Jet Set operator in LA was somewhere else.
And, you know, you can be geographically distant but patch in, and as soon as you can see the UI, you know exactly what's going on. Um, so what this just tends to tell me is that what we really wanna do is be able to patch the feed through the browser system.
I was originally wondering whether we needed to patch it locally onto another app. But honestly, just this discussion right here tells me we need to be able to patch it to where you are overseas, right? And so you're linked in, and you can just look over people's shoulder and go, oh yeah, no, click there.
Right. Um, or eventually we need to be able to have you remote-operate it. Because that was the other problem we ran into on the stage: without running up and touching the app on the camera... if we solve remote operations for the stage, we should be able to solve it for remote operations overseas.
**Ben:** Yeah.
**Eliot:** Um, so anyway, this is literally stuff I'm engineering now.
**Ben:** Yeah. And it goes back, because what I was telling them is: if you see that there's a really big issue, like a misalignment, something's not working, take a screenshot of the interface, because that's the one way for me to know what's happening down there.
And now I think they will start doing it, but honestly, a way to have monitoring slash control would be a blessing. Um, I do have one last question, and then, if they have more, I don't wanna dump everything; I think they need to do more tests, and then we'll reconnect with you. But, um, they did... what was it?
Uh, I can't believe it. I wrote everything down, and I forgot where it was.
**Eliot:** No worries.
**Ben:** Oh, okay. Yes. So, they shot it at the studio in Venice. And it's not a studio, it's an office, but I call it the studio because it's a very large space: white walls, white floor. So I call it a studio, and it's probably not the best place to shoot with Jet Set, right?
Because we wanna have things thrown around. So I just said to them, what if you just put crosses slash dots all over the studio? Now, they did ask one thing I think was interesting, and it made me think maybe they should do it: what if, instead of putting them nicely, properly, you know, perfectly aligned, they really put dots all over the place in a very random fashion, so that Jet Set might be better at detecting them?
**Eliot:** Uh, that's actually a great question. Um, you know, it's interesting, 'cause for post, I can tell you that, depending on the feature detection stuff, the marker crosses worked fine, and a few other things all work fine. Um, but honestly, for the ARKit stuff, it's really designed to detect natural features.
Okay. And I would pick... are they shooting against green screen?
**Ben:** No, we're, shoot, they're shooting in a, in a very large white space.
**Eliot:** Okay. Okay. Um, yeah, 'cause I remember helping them do lens calibration stuff, and we found the only thing with features in the whole place, which was, like, some artwork on walls.
Yeah. And with that, it can get a good calibration. Um, oh man, my kingdom for some features. Um, I'd say we can get enough features to hang onto even if you just hang up, like, a kind of mediocre green screen, right, where it's got a bunch of wrinkles and things like this.
Then you can lock features onto that. Um, I know that one way is you cover things with dots and stuff like that. Um, I think probably what you'd want to do is, in the origin mode, you can see the anchor points in real time, right? Mm-hmm. It'll show up as all these little pieces.
And, uh, honestly, lemme think for a second. Um, 'cause I know that office, and it's just pure featureless white walls, this kind of thing. Yeah, that's
**Ben:** the issue.
**Eliot:** Yeah, that's the issue. Can we, you know, hang up something, anything? Um, you know, if they don't need to do green screen compositing, you could just hang up, like, posters on the walls, and there's plenty of features to lock onto.
Like, a bunch of movie posters will totally work, right. Um, you know, I'd say,
**Ben:** And if we do dots, you think, uh, you know, cross marks might not be as effective as we think?
**Eliot:** You know, uh, the cross marks... I would check. The part that I actually don't know is the base level of ARKit's feature detection mechanism.
Um, I don't actually know if it's Harris corners, or if it's just... I think it's probably corners, in which case, you know, the crosses are gonna be fine. But honestly, the crosses only have a few feature points, where if you just, you know, put up images, then man, it can pick up stuff all day.
Because, again, what we just did is we did calibration in front of some, like, photographs on the wall, or, like, movie poster stuff. So I don't know, if it's a movie studio, if they don't mind hanging up a bunch of movie posters,
**Ben:** I mean, I'll tell them
**Eliot:** it'll track like a dream.
**Ben:** I'll tell them; we'll see if it works out. Um, let's start with that. I'll just relay all this information back to them. I think they're gonna do another test on Friday, and there might be a chance, I don't know, it's on their end, right, that they might want to reconnect me with you or with Greg, but we'll discuss that over Slack.
So thank you, Eliot.
**Eliot:** Yeah, that, that sounds great. That sounds
**Ben:** great.
**Eliot:** Yeah.
**Ben:** Fantastic. And, uh, one last thing, just to loop you in. So, Greg was really nice in helping us. But, uh, I think he said that the whole scale situation that we had issues solving in Blender might soon be solvable inside Spark.
Uh, so yeah,
**Eliot:** that's where... yeah. We're moving most of our, like, model creation pipeline into Spark, because we can build the tools, right? We can build the flat scaling tools and measurement tools. And it also becomes... you know, the issue that I had with Jet Set is that it's kind of isolated.
It's a little bit on an island by itself. And so, in order to fix stuff, somebody has to export it, we look at it, you know, send it back, and there are lots of things that can go wrong. And the thing we're really doing with Spark is having this centralized 3D database system.
Oh my lord, you can just fix stuff so much faster than you could ever do it with Jet Set. And I think that'll let us do all the things we were talking about, everything from the remote monitoring to just having the whole thing in the, um, system, so we can actually find stuff and fix it.
Yeah. It's just gonna be transformative. Just transformative.
**Ben:** Thank you, Eliot. Nice to meet you, Heiko. How do you pronounce your name?
**Eliot:** Yeah.
**Ben:** Sounds interesting.
**Eliot:** And you guys are geographically not that far off. Uh, 'cause Ben, you're in Paris, right? And
**Ben:** Yeah, I'm in Paris. Are you in Germany?
**Eliot:** Yeah.
**Heiko:** Yes, Germany.
**Ben:** Oh, cool. Great. Well, I'll check you out on LinkedIn. I'll give you a follow. So, uh, yeah,
**Heiko:** I'm not on LinkedIn, but you can, uh, always find me on YouTube or on Instagram.
**Ben:** Oh, cool. Awesome. Yeah. Yeah. I'll, I'll, uh, I'll follow you on Instagram.
**Eliot:** Awesome.
**Ben:** Okay. Thank you.
**Eliot:** No problem, no problem. And there's a third fellow, and I think I missed him, so I'm gonna have to catch him on the next one. Oh,
**Ben:** that was Josh.
Josh was with me from Hamad.
**Eliot:** Oh, okay. Okay. Fantastic. Don't worry. Okay, then we're good. All right. Well, thanks. Good to see you guys. Uh, again, congratulations on...
**Heiko:** If you have two more minutes, I have one thing.
**Eliot:** Oh yeah.
**Heiko:** You might be interested in... uh, we talked about, um, the workflow of getting the Jet Set tracking data into Fusion.
And I actually have a pretty fun, uh, example where it actually made sense. So, if you just wanna take another look, for our next epi... uh, can you see it? Yeah,
**Eliot:** yeah, yeah.
**Heiko:** Alright. For our next episode, we have, um, a storyline where characters from two timelines meet each other, so we need to have a clone effect.
Ah, and in our test shoot, we already did this. So this is the same actor in two different costumes, basically, wait,
**Eliot:** wait, wait, wait.
**Heiko:** Talking to each other.
**Eliot:** Wait, okay, lemme just stop and think through this for a second.
**Heiko:** It's actually not very hard.
**Eliot:** So you have actor one. You shoot actor one.
He is on a 3D plate in the scene, projected with a camera projection. So now he's in the scene. Now you shoot actor two; you drop this into the same scene, and you have to be close to the original shot. Um, you can't go 90 degrees; you have to be close to the original shot.
Exactly. But the tracking plate re-normalizes the data. That's really cool.
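The camera-projection setup being described can be sketched with a basic pinhole model: the plate is projected from the original take's camera onto geometry at the actor's depth, so a second tracked camera shot from a nearby angle sees the actor roughly where he actually stood. The intrinsics and distances below are made-up illustration values:

```python
import numpy as np

def project(K, R, t, X):
    """Project a 3D world point X into pixel coordinates for camera (K, R, t)."""
    x = K @ (R @ X + t)
    return x[:2] / x[2]

# Hypothetical intrinsics for a 1920x1080 frame.
K = np.array([[1000.0, 0.0, 960.0],
              [0.0, 1000.0, 540.0],
              [0.0, 0.0, 1.0]])

# Camera A (the take used for the plate) sits at the world origin.
R_a, t_a = np.eye(3), np.zeros(3)
actor = np.array([0.0, 0.0, 5.0])     # actor 5 m in front of camera A
pix_a = project(K, R_a, t_a, actor)   # where the actor lands on the plate

# Camera B (the second take) is offset slightly; because the plate plane
# sits at the actor's true depth, its reprojection shifts by the correct
# parallax instead of "swimming".
t_b = np.array([0.1, 0.0, 0.0])       # 10 cm lateral offset (world-to-camera)
pix_b = project(K, np.eye(3), t_b, actor)
```

This is why the angle has to stay close to the original shot: a flat plate at the right depth reproduces small parallax correctly, but at 90 degrees the missing sides of the actor become visible and the illusion breaks.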
**Heiko:** And, um, our first approach was exactly what you just described. So I put in the plate of, uh, this sitting actor. Let me just turn the output blanking off. So this is the original plate of him.
**Eliot:** Mm-hmm.
**Heiko:** I rendered that out, uh, as an EXR sequence with alpha, already keyed, and put it into Unreal.
But Unreal's rendering, and mip mapping or something, did a number on it, so it had a bit jaggy, ugly, uh, edges, and it looked a bit low quality. And then it occurred to me that I can actually just put both shots into Fusion and do it there, because I can just get the, uh, tracking data into Fusion using AutoShot and the, uh, script.
**Eliot:** Mm-hmm.
**Heiko:** And so that is what I have built here. So this is the standard setup for the actual shot, but we have... this is the tracking data from Jet Set, imported through the script, and as you can see, I just have this 3D scene. And what I did was...
uh, although in this case it did not work, because, um, this is our LiDAR scan, and normally, if your scene is aligned and you have an element in your LiDAR scan which is also in the scene, you have a perfect, uh, reference for where you have to put your, uh, 2D plate.
In this case, we just shot into an empty corner of our green screen, because we just wanted to test it out in theory. Uh, so we did not have a 3D scan of the actual console he was supposed to sit on. In Unreal, of course, you have all these elements, and you can just put it in its right place in the 3D room here.
It was a bit of trial and error just to see where it has to be, so it is not swimming, but it was like five or ten minutes of trial and error and scaling. And then it's actually just an image plane in the, uh, 3D space here, and this is the actual shot from Resolve, with exactly the same workflow. And that just pipes into this Merge node, which, uh, just combines it with the, um, background,
the empty background render from Unreal Engine. So let me just hide that again. And that's my background. And that is just combined with the actual take, which has the camera motion from the foreground actor. Yeah. So that's basically... that was finally something actually feasible and practical, where you can use this tracking data in Fusion, uh, at the same time as you would use it in Unreal Engine.
Another use case I had was: uh, especially for Corridor Key, I needed a very clean, fully green, uh, green screen shot, because otherwise it can produce, uh, strange artifacts on the edges. So what I did was import the tracking data too, and just put plain green planes over the edges of the green screen, or over a person I didn't want.
So it was already tracked, and I just had to use an, um, object mask, or Magic Mask in Fusion, to rotoscope off the foreground, and I had my clean plate. And, uh, that's another use case where I think it's really awesome to just import the tracking data into Fusion too. And it's, uh,
**Eliot:** And to pull the tracking data into Fusion, did you use the script that we made, or how did you pull the data?
**Heiko:** We used the script. Yep.
**Eliot:** Okay. Fantastic.
**Heiko:** The only thing I have, but that's just a me problem: uh, I'm using Unreal Engine on PC, but I'm doing everything else, compositing, on a Mac. Uh, and so, completely different machines, Windows and Mac, and they use completely different, um,
**Eliot:** file paths.
**Heiko:** File paths. So I always have to, uh, just open the, um, script first in a text editor and do a find-and-replace on the first volume. The rest is okay, but I need to just remap that, save it, and then drag it into Fusion. And then I get my tracking data, and, like I said, everything worked perfectly, so it's all perfectly aligned.
It's all perfectly timed, so nothing is out of whack. And, uh, so, great job.
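The manual find-and-replace described here could be scripted. A small sketch that normalizes backslashes and swaps a Windows volume prefix for a macOS mount point; the example paths and the `remap_paths` helper are hypothetical, not what AutoShot actually writes:

```python
def remap_paths(script_text,
                win_root="D:/UnrealProjects",
                mac_root="/Volumes/UnrealProjects"):
    """Rewrite Windows-style paths in an exported script for use on macOS."""
    # Normalize escaped and single backslashes to forward slashes first.
    text = script_text.replace("\\\\", "/").replace("\\", "/")
    # Then swap the volume prefix.
    return text.replace(win_root, mac_root)

fixed = remap_paths('Loader.Clip = "D:\\\\UnrealProjects\\\\shot01\\\\plate.exr"')
```

Run over the whole exported script file before dragging it into Fusion, this replaces the text-editor step with one command.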
**Heiko:** It's actually working really, really well.
**Eliot:** This is... I hope you're going to post some of this stuff. I think the Corridor Key stuff has a Discord; I need to sign up for that. Um,
**Heiko:** Yeah, I'm quite active there, because, uh, I did two tutorials now on Corridor Key on my own channel, um, on how to use it in Fusion.
In the second one, I'm not using Jet Set and tracking, but I'm actually showing how to put in plain, uh, plain background things to just mask out unwanted elements from your green screen and stuff. Uh, but that's where the idea actually came from with Jet Set and Fusion: to just incorporate the tracking into my compositing shot.
And yes, I was actually, uh, doing some, uh, commercial work for you, advertising Jet Set to Corridor. But then, uh, I think two days later, they had a podcast, and Nico from Corridor was actually talking about how they already used Jet Set, or are at least, um, experimenting with it, because he mentioned explicitly that they saw someone at NAB this year with a 3D-printed Jet Set-to-iPhone rig, and, uh, they were really impressed.
They used some sort of light and a retroreflective background, so it wasn't a perfect green and a perfect key. And, uh, they are actually using Jet Set, like I said, for testing at least, or maybe actually using it for their Son of a Dungeon, I think it's called. It's their Dungeons & Dragons green screen
**Eliot:** Project.
Yeah, yeah, yeah. Do you have the link to the podcast? I'd actually love to see that. I talked just a bit to Sam at, at NAB, um... oh,
**Heiko:** okay. Awesome.
**Eliot:** Yeah, so that's...
**Heiko:** I would've been surprised if they did not know what Jet Set was, because, um, this is right up their alley.
**Eliot:** Oh, it's fantastic.
This is, it's fantastic to see the link-up. And what's really interesting is that a number of people are converging on this combination of Jet Set with the Corridor Key stuff to handle the large shot volumes, and then rapidly putting this stuff together and compositing.
So really, I think what's happening is we're mapping out the way to handle independently shot episodic work, right? Because the volume of shots is really, really high. So this is really exciting to see. Um,
**Heiko:** So, I don't have the timecode where they talk about it, it's way into the one-hour podcast, but this should be the one where they actually mention Jet Set and say they are experimenting with it.
And they talk about Corridor Key because, of course, Corridor Key made a big splash at NAB. So yeah.
**Eliot:** Excellent. Excellent. All right, I'll, I'll, I'll go, I'll go look that one up. Um, that's, oh, I, this is so exciting. I just, I'm super impressed with this.
**Heiko:** I'm glad I, I remembered, uh, this clone shot because, uh, since we talked about how to use something like the tracking data directly in, in Fusion, and I think this is really a great use case to show how to use it.
Uh,
**Eliot:** Yeah. This is just fantastic. You know, these are almost early days, and some of this stuff is a little bit pioneering. But as we engineer in the pieces to make this more systematic, I think you're going to be able to accelerate the rate at which you can do your episodes.
You know, I would just stay with the process you're doing right now, you know?
**Heiko:** Okay.
**Eliot:** Another. In another, you know, half a year, we may have a new process that I think will be even better suited for, for what you're doing. Um, I, I think what you're doing is great. Awesome.
**Heiko:** Now I have the feeling it's streamlined enough.
It's working for me, or for us. I was maybe hoping there's some more automation possible through this timeline, OpenTimeline stuff, because you already have your information for in and out points. But then again, the way I'm working right now, it's hard to really automate this process, because at the end of the day you still have to do manual work in Unreal Engine, like replacing the camera in my example, or doing the focus stuff.
So yeah, I think it's okay to just enter the timecodes into AutoShot. From what you're talking about now with Spark, I'm not sure I'm grasping the concept fully of where this is all going. But is it conceptually still on your roadmap to have Jet Set as a standalone app, and the way we are working now, or will this all be centralized into Spark as one platform? How do I have to think for the future, let's say?
**Eliot:** Think for the future. So for the next chunk of time, we're building out and shipping Spark Story, which is kind of the front-end previsualization and shot-planning portion of Spark. Yeah. I think for many people it's going to be much easier to construct a 3D model in that environment,
um, you know, Gaussian splats, et cetera. If you're already a 3D expert, then the process you're doing is great: you export USD from Unreal, run it through AutoShot, et cetera. But I think Spark is going to be much easier for many people to come into the system.
We can export from Spark directly to Jet Set, right? So we can already take the scene we build in Spark to Jet Set. Over time, what we're going to end up doing is porting a bunch of the Jet Set Cine functionality into Spark. The first part we're shipping is Spark Story, the previsualization part.
Mm-hmm. It's going to be a good, hard six months of work before we can get to Spark Shoot, which is what we're currently calling the Jet Set Cine version of Spark. That is going to handle Cine footage and do a lot of the stuff that Jet Set does, but with much more remote capability, right?
Mm-hmm. So, like exactly what Ben was running into, you need to be able to remotely supervise your projects. Because you'll have things that are shooting here, and the person who knows all the details of it is over there, and right now there's no way to patch in and just kind of supervise a shoot.
And we absolutely need it. Yeah. Right. Okay. Um, and also, the part where you run into it is when the project grows. You're not seeing it yet, because it's like two of you, and you're in very close to the same room, or a similar building, or a close geographic location.
But boy, when the project grows, it blows up. And so we just realized we needed to re-engineer the whole process, assuming from the get-go that, yeah, we'll still have the single-person workflow where you don't have to move stuff into and out of the cloud if you don't want to.
But when the project grows up, then you start needing to be able to send, you know, 50 shots to this person, 50 shots to that person, 30 shots to that person, 40 shots to another, and keep track of everything that's going on in a clear, kind of centralized way. So we're building that, right?
Because, again, I look at your work and I say, okay, this is the first instance of someone tackling this. You guys are way out ahead: you're tackling episodic-length things before we've built an episodic software structure. But there will be more, right? Because I think what you're doing is pioneering the way that episodic work is going to be made. The team will get a little bit bigger, right?
Because what happens is you need to go fast, and then instead of one person compositing, you need five, right? At which point, okay, now we can do something every two weeks or every three weeks, you know, a faster shooting pace. Uh, my doorbell is going, but I better grab that.
Mm-hmm. See,
**Heiko:** no problem.
**Eliot:** Sorry, this a lot, a lot of, a lot of interruptions today. Just kinda see, lemme see what's, what's going on there. Um, oh, it's just delivery. Okay. All good. Um,
**Heiko:** okay.
**Eliot:** But yeah, so the, what we're, what we're really building is, is when, when people like artists like yourself. Start doing something really cool.
A lot of times what happens is you get some really unusual phone calls and the phone calls are like, okay, we're doing this big project, we like how you're doing this, uh, and we want you to help us do this. And the flip side of course is frequently the projects are in a geographically different place than you are, right?
Yeah. Like this, this is, I mean, you saw Alex Heineman's dragon, uh, where he is roaring outta the cave. Mm-hmm. Alex's phone melted down, man. Yeah. There were people, so, you know, and, um, and, and so, but the, a lot of the projects have historically required that you pick up and you move to New Zealand for six, six months or in London or Los Angeles or Vancouver, like, you know, the usual craziness.
And I think, of course, yeah, a lot of people are gonna do what you're doing, which is you make it happen Right where you're at. Right. And, and then that, but then we can patch in talent from all over the world to help you do it when the project grows up. Um, I think that's the future. So we're, we're just building the software system that I think is going to be, uh, let you do that kind of project, uh, faster, you know, and yeah.
And even more, you know, I think you're going to be able to do something where you handle half the shots yourself and then farm off a bunch of the other shots to other people. And if you set it up really, really well, it's not that hard to go through. You know, if we have a framework and an infrastructure that sets up all the file naming, yep,
and all that kind of stuff, the actual shot part's pretty fast. It's the "export this, write this, file this" that takes like two-thirds of the time. Um, so
**Heiko:** yeah.
**Eliot:** Uh, anyway, that's, that's a long story short. That's what we're, we're building. The first part of it is just the planning start phase.
I don't think you're gonna use that as much. Um,
**Heiko:** Exactly. Just because we are such a small team, and most of the time I'm also heavily involved in writing the script. So if I write the script myself, or I'm involved in writing the script, I already have the movie in here. And we basically have our digital sets already, our bridge sets, and I know what I will shoot.
So we don't use storyboarding at all, actually. We don't even have a shot list; we actually have just the script, and some luck not to forget anything, and I just go and shoot it. Right now it's working fine for us. So yeah, that's why I was not so involved in checking out everything about Spark up to now, because right now this storyboarding part and this
planning part is not really necessary for our particular project. But I absolutely get the appeal and the idea of where this will be taken in the future. The storyboarding part is so essential for other projects, and the more people are involved, the more important it becomes, so everyone knows what the plan actually is.
So I can totally see it. Yeah.
**Eliot:** And, and as we're doing that, we can improve a bunch of the onset production sorts of things. You know, we can sit down and re-engineer the LUTs Right. To,
**Heiko:** yeah.
**Eliot:** To where, where, you know, so there'll, there'll just be a whole set of things that we have to do to. To make the onset experience much better, much better remote control.
Like, you know, all the, all the, all the pieces that we are, we're, we're putting in. And the, you know, the good news is our communications should get a lot better as well. Where, you know, we, before, you know, the, the team grew a lot in the last, in the last half a year, uh, because we realized the problem is, is bigger.
And so we need to
Mm
uh, you know, we need to actually scale up the company. Um, and so we brought on Elijah as our new president. He's great, just really, really good. Um, and so we're going to be getting a lot better with our website and our messaging and our emails.
Mm-hmm. Uh, 'cause you know, when it was just me doing everything, I'd forget to send out the emails, right? Of, oh hey, we built this new feature. Usually we solve the technical part of it; we're okay at that. But the communications, I'm just not great at. I wear out. And so we're putting in the infrastructure, but it's,
**Heiko:** it's a lot.
Uh, but
**Eliot:** yeah.
**Heiko:** Uh, exactly. I noticed. I was texting with Greg, and you read it, and he was announcing that there would be a new version of AutoShot in the coming days where this take selector for the Cine takes would be implemented, which was what I was hoping for. But then I never heard back from anyone that it was actually already online.
So I just skimmed through the download page and noticed, wait, it's a newer version than I have. So I downloaded it, tested it, and oh, there it is.
**Eliot:** It works. Yeah. Oh, this is, this is absolutely,
**Heiko:** I actually, I actually signed up for the Notify Me pop up on your website and never got notified about a new auto shot version.
So
**Eliot:** Yes. This is all part of things that are going to get a lot better, because, again, what we're figuring out is how to integrate all the different pieces of the company together, right? Yeah. So when we have new releases, how do you automatically update the documentation, and then how do you do a product-release email blast?
It's all things we've got to do.
**Heiko:** Um, the AutoShot app does not have an update function; it does not automatically check if there's an update. Otherwise you could just make a popup on start: hey, there's a new version, want to download it right now, or update right now?
Yeah. That would save you a lot of emails.
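The startup check Heiko suggests is a small amount of code. A sketch, assuming a hypothetical endpoint that returns the latest version as JSON; the URL, the response shape, and the dotted-version scheme are all assumptions for illustration, not AutoShot's real API:

```python
import json
import urllib.request

def parse_version(v):
    """Turn a dotted version string like '1.2.10' into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))

def is_newer(latest, current):
    """True if `latest` is strictly newer than `current`."""
    return parse_version(latest) > parse_version(current)

def check_for_update(current_version,
                     url="https://example.com/autoshot/latest.json"):
    """Fetch {'version': '1.3.0'} from a hypothetical endpoint and compare.

    Returns the newer version string, or None if up to date or offline.
    """
    try:
        with urllib.request.urlopen(url, timeout=3) as resp:
            latest = json.load(resp)["version"]
    except OSError:
        return None  # offline or unreachable: fail quietly, don't block startup
    return latest if is_newer(latest, current_version) else None
```

On launch, a non-blocking call like this could drive a "new version available" popup instead of relying on email announcements, which is exactly the behavior being asked for here.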
**Eliot:** I mean, we're doing all these things basically because we can engineer them into Spark from the beginning. Because it's harder to patch in something afterwards. AutoShot was never designed to talk to a network, to the web, or anything like that.
It doesn't know how.
**Heiko:** Yeah. Yeah.
**Eliot:** Um, it doesn't have authentication. It doesn't have any of these things that you sort of end up needing. Mm-hmm. Um, and it basically came down to, okay, we actually have to redesign the system to start off with a network from day one, so it knows what the network is, how to use it, and how to think through it.
Especially as we start merging in the post-production pieces, where we patch in AI models, because those things change every four days, you know. So we're going to have to have something that reaches up, checks the current version, and pulls down the current model weights and stuff. There are so many processes, like some of the stuff we're doing with automatic post-production tracking that we're still working out, that just require much more interaction with the web than the old ways of doing things.
Um, so we're, you know, we're getting there. And I, I, again, I thank you for your patience. I apologize for No,
**Heiko:** no problem.
**Eliot:** For communications.
**Heiko:** We are super happy with Jet Set and what you are doing. It's absolutely amazing. Our project wouldn't be like it is right now if it wasn't for Jet Set, so it's really a godsend, what you are producing.
And like I said, it's all complaining on a very high level, considering where we started with the free version. And you were absolutely right about how much more the Cine workflow would do for our project. I'm really happy now that we had our test shooting day and it worked flawlessly, basically.
So now I'm way more confident to go into our 15-plus shooting days we will have this year. Now I can really rely on this system, and I will get some amazing footage out of it. So yeah, it's really complaining on the highest level.
**Eliot:** Good. I hope
**Heiko:** that, that it's, it's the last, last few things, uh, because the more something works, the more ideas you get.
What can be improved, of course. It's a normal process. One thing just came to my mind that was a bit weird, when we had this shooting day with the German television, where they visited us. They were super interested in the app and wanted a live showcase, and I said, of course, I'll show you our bridge.
It's amazing. But I hadn't used Jet Set for a while, and then I realized the USD files were all missing. They weren't on the phone; they were just in iCloud. And Jet Set did not want to download the files. It always said it needed a wifi signal, so I couldn't get the USD file back into Jet Set from iCloud.
Then we tried to set up another iPhone as a hotspot, so I could connect my iPhone to that iPhone via wifi to trick Jet Set into thinking it was on wifi, but it didn't work. It just said it needed a wifi signal to download models, so I couldn't show them. And now the worst part: we are a Star Trek project and had to load the demo level from Star Wars.
That's the one we actually put into the report. I think for most people it'll go over their heads, but internally I was screaming.
**Eliot:** Oh, this is the very first thing that Colin, who's our new iOS engineer, did when he was building the iOS Spark app. Hmm. The first thing he did is he linked the app directly into the Spark 3D database.
So all that stuff is gone. It just links to the same 3D database. You're building the shots in the web version and it's automatically mirrored back and forth. You don't have to set up iCloud, none of that. It's all,
**Heiko:** Yeah.
**Eliot:** It just works.
**Heiko:** Well, I did, as always, find a simple solution. I just created a permanent folder on my iPhone, outside of any app, called Jet Set Model, and put all my USD files in that one folder. And the fun thing is, it's actually faster to work like that, because now when I want to shoot another scene with another background, I can just load from this one folder, which is also in my recents.
So it's just one tap: pick that model, and I'm there. Otherwise, when I want to load a model, I have to go through several folder structures inside of Jet Set to go from one project to another. This way it's permanently on my phone, whether I have internet or not,
and switching to another model is basically two clicks. The simple solution is always the best for me.
**Eliot:** Oh, well, hey, I gotta, I gotta jump to a call, but this is, this is great. Sure. I thank you so much. Uh,
**Heiko:** no thank you for taking the time today. Uh, very glad to, to. Uh, to get my ideas, uh, out and yeah. Thank you so much. And I'll let you know how our shoot is, uh, working out.
**Eliot:** Oh, coming up in,
**Heiko:** a few weeks, a
**Eliot:** month. Absolutely. I'll put this recording up, because I think this is just fantastic for people to see. Sure. It is absolutely fantastic, if that's all right with you. All right.
**Heiko:** It's alright.
**Eliot:** Great. Great to see it and congratulations. I I I can't wait to, can't wait to see the next one.
**Heiko:** Thank you. We are, we are trying our best.
**Eliot:** Alright, take care.
**Heiko:** Have a good one. Bye.