# Office Hours 2025-01-17
**Eliot:** [00:00:00] All right, morning all.
Uh, let's see, uh, I see Kevin's iPhone and Peter from KeenTools and Navas.
All right, I'm probably gonna start, because if we get it right I want to record this so I can go back later. Um, all right, and so just as background for other people watching: KeenTools builds a really interesting plugin for Blender. They build several things, but one of them is called GeoTracker, and it uses a piece of geometry that matches some part of the live action footage.
And you pin that geometry to the live action footage, and you can get a very accurate track right inside of Blender. You don't need to go out to SynthEyes. Uh, so this is potentially really interesting for us. So let me, uh, I'm going to pull up my, uh, [00:01:00] my Blender scene, uh, recent, there we go.
Let me pull this thing up and it's going to take a second for the shaders and I'm going to share my screen so you can see it. Um, all right. Let me find this. There's Zoom. Share. All right. There's, there's Blender. Okay. Uh, so here's the, uh, do you see this? Uh, Peter, do you, are you, are you seeing my screen?
**Petr:** Not yet. Yeah. Yeah. Okay. Oh, now I see it. Okay.
**Eliot:** All right. So, uh, let's see. There's GeoTracker. Um, okay. And so, just to give you a sense of what's going on: our system automatically constructs a set of things. The main piece... I'm gonna turn off the scene collection, 'cause that's not part of the key pieces, that's just the background. But these are the [00:02:00] key pieces that AutoShot generates based on a Jetset Cine take. We have an image sequence of the actor in the scene; we have a tracked camera, and that's based on the cine camera.
This is not the iPhone camera, so it has the correct offset and the correct optics, the field of view, for the cine camera, to match this footage. The footage is set up as an image plane, referenced to the camera. And then the key thing is that
we get a rough scan of the scene, which we generate with our integrated scanning system. And this scan has turned out to be the key. When we did SynthEyes, I don't know if you saw the SynthEyes video, but what we basically do is... so the other piece [00:03:00] that we can generate is an AI-generated garbage matte, basically a matte of the person.
And in SynthEyes, we bring that in and use it as a roto mask, and it's all done automatically in the script, so the user just hits go, basically. And SynthEyes finds all the good tracking points in the scene. We pick a few of them that are visible when we look through the camera, and we project them onto the mesh; we use a feature in SynthEyes called project onto mesh.
And then we have survey data, and then we can track and solve it. And the position of the camera before and after you solve it... because this is already correct within a couple centimeters, right? We don't want to lose our live tracked camera position. But after the solve it's moved maybe two centimeters, and then it's a sub pixel track.
Uh, and it works great. I mean, it's great. Um, and that basically makes this viable for heavy production. Uh, but a lot of people don't want to deal [00:04:00] with SynthEyes, you know, 'cause it's a whole thing. And I would so much prefer for the vast majority of Blender teams to be able to stay inside Blender.
'Cause it's just much, much easier to stay in one app package than to jump between, you know, three. Right. So anyway, enough of that, but this is the basic set of data that we've got: we've got the image sequence, we have a piece of scan geo, and everything comes in lined up like this automatically.
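For readers following along, here's a minimal way to poke at the generated scene from Blender's Python console. The object names below are hypothetical stand-ins, not AutoShot's actual naming:

```python
import bpy

# List what the shot setup provides: a tracked camera,
# an image-plane plate, and the scan mesh.
for obj in bpy.context.scene.objects:
    print(obj.name, obj.type)

# Hypothetical names for the tracked cine camera and the scan.
cam = bpy.data.objects["CineCamera"]
scan = bpy.data.objects["SceneScan"]

# The tracked camera carries baked per-frame animation,
# and its lens matches the cine camera, not the iPhone.
if cam.animation_data and cam.animation_data.action:
    print("camera keyed over frames:", cam.animation_data.action.frame_range)
print("camera FOV (degrees):", round(cam.data.angle * 180 / 3.14159, 2))
```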
Um, so let's look,
**Petr:** Looks nice. I have a few questions about this shot. Yeah. So this footage was shot on the film camera, or on an iPhone?

**Eliot:** This is from a film camera.

**Petr:** Okay. Yeah, I got it. But the model was built from the iPhone. So there's kind of a match between the iPhone and the film camera and the shots. Okay, I get it. Uh, [00:05:00] about this footage: what about distortion, lens distortion?
**Eliot:** It is not undistorted. This is the original footage. In this particular case, it was shot with a 24 millimeter lens, and I've calculated the distortion on it. It has very little distortion in the footage.

**Petr:** How did you calculate it?

**Eliot:** Both in SynthEyes and in our own tools. We have a lens calibration system in Jetset, and there just wasn't that much distortion in the footage.
**Petr:** Okay, so basically, GeoTracker cannot handle distortion calculations, so we just assume the footage is undistorted. And you have the correct camera parameters and the correct geometry, and if you have all these things, you can track, uh, just like one click.
**Eliot:** Okay. Well, yeah, you want to try it? Um, I'm, [00:06:00] I'm...

**Petr:** Yeah, yeah, why not? We can try it. I mean, if there are any questions or problems, we can figure them out from here. As I understand, it's kind of a session where we can just make mistakes, yeah?

**Eliot:** Yeah, yeah, this is exactly what...

**Petr:** It's not, like, an official presentation of a collaboration, but hopefully I can help you.
**Eliot:** Yeah, yeah. This is office hours, where we figure stuff out. So I can record it, and if we figure something out, great.
Maybe I put it up. Usually, if I solve something, I'll record it and transcribe it and put it on the site. Sometimes it's just us figuring stuff out, in which case I'll do a real recording later on, after we figure it out, and do a tutorial.
Uh, 'cause this, again, it could be fantastic and useful for users, but I always want the video tutorials to be very systematic and step by step and organized.

**Petr:** Yeah, that will be amazing: to have, like, a tutorial that explains it. If you can do it, that will be amazing. But well, let's try to do this.
[00:07:00]

**Eliot:** If we get this working, then my next step is to call Ian, Ian Hubert. I showed him what we were doing earlier, but to get the sub pixel tracks that he's using, we would have to go out to SynthEyes, and I don't think he really wants to deal with the external stuff. So, um, but again, keeping it all in Blender is so huge. Blender is our weapon of choice.
Internally, we support lots of different systems: Unreal, Maya, you know, Houdini, all these sorts of things. But my go-to is Blender all the time. So, um, okay. So let's see. I downloaded GeoTracker and installed it. I've got a trial license; I can buy it, that's fine, if I need to. Um, but I understand I click new GeoTracker, and then for the clip, um, let me make sure I'm in my correct
directory. Yeah, I'm in the correct AutoShot directory, and I'm going to go up from the blend file and go to the cine cam [00:08:00] EXRs. Uh, do I need to select all of them, or do I...

**Petr:** I think, yes. I think it's better to select all of them.
**Eliot:** All right. So I just load the clip.
Okay. Now, uh, okay. So now... this is a little bit interesting.
**Petr:** Okay. So what we did here: we just put this clip into the camera. But you have to select the camera and the geometry also. So what's actually happened here now?
**Eliot:** So, okay. By default, we generate all of our image sequences starting at 1001. Uh, in our Blender scene, we originally set our... here, let me see if I can go back. Let me click my camera and go back. So the original camera, um, we put starting at frame one in Blender.
[00:09:00] Um, let me look, I think
**Petr:** It's right, uh huh. Okay, okay, so that's not a problem. You can shift this. Well, but for that you have to use the native Blender tracking environment. So basically you can hit the plus sign on the top bar of Blender, and select the layout there: the VFX, uh, motion tracking one. Yeah, motion tracking.
So here you can open the video clip; you see the open button in the center.

**Eliot:** Oh, got it. Open, and then...

**Petr:** Yeah, yeah, the same clip; you can select it here. All right. And then somewhere you choose which frame is the first frame. So the starting frame now is one, I think. Here on the left.

**Eliot:** Yeah, so that's 1001. Oh, wait... starting frame.
**Petr:** Well, it depends what you really want to get from that. Okay. [00:10:00] Okay. So now we can go back to the layout, I mean the Layout tab on top, yeah, and select the GeoTracker.

**Eliot:** Okay, GeoTracker.

**Petr:** To find out what's going on. I mean, first of all, the input. No, no, select the clip. Yeah, you can select it just from the drop-down.
**Eliot:** Okay. Oh, okay. Okay. So let me play back what we're doing, just to make sure I understand. So what we did is we loaded in our... so, is GeoTracker using some of the Blender tracking pieces to track?
**Petr:** Well, not actually. So how it works: when you load the clip here as the input, we look at the count of frames, and your first frame is 1001, or something like that. And that's why we moved this whole scene further. So, for example, [00:11:00] now we just loaded it in another space, to redefine the first frame. That's how Blender works. So that's the part of Blender.

**Eliot:** We loaded that into the tracking scene. We set the correct offset.

**Petr:** Is it okay for you to work in your, like, natural time range? That would be perfect to work in, not from the first frame. It's better for simulation and so on. So that came from the VFX industry, as I understand.
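As an aside, for anyone scripting this step, the same first-frame offset can be set on a clip from Blender's Python console; a minimal sketch using standard Blender API properties (the file path here is hypothetical):

```python
import bpy

# Load the image sequence as a movie clip (hypothetical path).
clip = bpy.data.movieclips.load("//plates/cine_frame_1001.exr")

# Map the clip's first frame to scene frame 1001, matching the
# EXR sequence numbering, so the footage and the tracked camera
# stay aligned in time.
clip.frame_start = 1001

print(clip.name, "starts at", clip.frame_start,
      "and runs", clip.frame_duration, "frames")
```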
**Petr:** Then you have to select the geometry and the camera. You can use the drop-down, or the picker, yeah, you can use a picker: pick our geometry, pick our camera. And there's our camera. Okay. Okay. So now you basically can look through it: just click start pin mode. This is the button, yeah, the huge one. Yeah, and you should see the picture, yeah, and the model. So the model is the green model, [00:12:00] and the picture is kind of the background, from the camera.
**Eliot:** Okay, let me be very careful on this. I'm going to turn off... I have an image plate here that already has the image on it.
Okay, so I turn... oh, hey, all right. Hey, so that is showing correctly. Okay, this is fantastic. I hadn't gotten this far with it on my own. Yeah.
**Petr:** So, yeah, everything is inside now, so we can track it. So, how our tracking works: we have a lot of tutorials, you can watch them on our website or YouTube. But basically, we have keyframes, kind of manual keyframes, where you can pin the geometry and place it correctly in your scene. Well, here in your scene you already have good tracking, so it's easy to place it correctly, but you can try to just mess it up a little bit. For example, go to the first frame, or whatever [00:13:00] frame you want, and try to click and drag your mouse over the green mesh.
So when you click and drag over the green mesh...
**Eliot:** If I click and drag... okay. Yeah, it moves. All right, and so, of course... oops, if I click... yeah.

**Petr:** One moment. I don't know why I cannot see how it moves. Ah, it actually moves, but it's really slow, or what?
**Eliot:** Oh, I can move it. Do you see it moving?

**Petr:** No. I don't know why, but maybe it's kind of a lag in the screen capturing. Okay. Yeah.

**Eliot:** That could be. So I can click and drag...

**Petr:** And you do it smoothly.

**Eliot:** Yeah, yeah. I'm slowly moving the mesh. All right. Are you seeing that?

**Petr:** No, no, I cannot see it for some reason. Here's [00:14:00] the problem.
**Eliot:** Um, you know what, let me briefly... I'm going to stop and I'm going to reshare the window, just to make sure it's doing the right thing. Stop, and then...

**Petr:** Maybe it's because you just captured the window, and something goes crazy with the shaders.

**Eliot:** So now... do you see my window?

**Petr:** Not yet.

**Eliot:** Okay, give it a second. Alright. Now do you see it?
**Petr:** Yeah, I can see it.
**Eliot:** So now if I click and drag, do you see me dragging the geometry?

**Petr:** No, no. I see your mouse, but I cannot see the model. The model stays... and now it jumped. Okay, now it jumped. Okay.
**Eliot:** That's weird. Well, I mean...

**Petr:** Okay, I can describe the idea. The idea is that you can click and drag, and we call it pinning, so you can align the model perfectly in your scene. It [00:15:00] depends on the actual features that you can see, to align it perfectly. But in your case, for example, you can just hit the button to create a keyframe on the first frame. Um, yeah, the next one. No, no, that one is for jumping between keyframes; but yeah, here you can create a keyframe. So when you click it, on the timeline you can see a dotted green line appear on the first frame. You can just move a little bit right or left and then see... do you see the green line?

**Eliot:** All right. And so if I move forward, say to frame 45...

**Petr:** I don't know, because you have these, like, freezes of the interface. So I see your mouse, but the screen, uh...
**Eliot:** It's just strange. Why isn't it sharing correctly on Zoom? Let's see if there are any settings. Um, [00:16:00]

**Petr:** All right. I think this is a problem with capturing the window. Oh, now I see it, but it's kind of jumping.
You can try again, playback, right?
**Eliot:** So would you suggest I go through the... how many keyframes do I need? Do I need keyframes through the whole shot, or just a couple?
**Petr:** Well, wow, that's an interesting thing. You need a keyframe when you know that the model is placed correctly. Then you hit track, and we track all the motion and try to align the model in each frame. And when you think the model is a little bit off, you can stop the tracking, pin it a little bit to get a better result in that particular keyframe, and then you can refine the tracking between these two keyframes, [00:17:00] to get a smooth transition from one to another.
**Petr:** So let me tell you how it works. In your case, you can go to the first keyframe. Yeah, you can jump to it with Alt and the arrow keys on your keyboard, like in Nuke; we made those hotkeys. And then you can click the track button, which is in the center, under the back-to-3D button. So these are the classical track buttons. Yeah.
**Eliot:** Okay. Yeah.

**Petr:** This is the back and forth: track to end, track forward.

**Eliot:** Okay. From the first frame. Yeah. Oh, wait: "current frame is outside the precalc file."

**Petr:** Okay, so that's the other thing: the analysis file. You click reanalyze in this case, or analyze if it's the first time. You select the range, from the first to the last frame, and then hit go and wait. [00:18:00] Well, do not change the layout or open other windows or something like that, because any interruption in Blender can cause a problem. We try to figure out what's going on, always, but sometimes Blender can be tricky, handling all the changes. But actually, I still have the problem with the freezing interface; for some reason I cannot see the tracking, the smooth animation, what's going on.
**Eliot:** So it's tracking, um, it's going forward right now. Okay. Okay.
**Petr:** Nice.
**Eliot:** Uh, and it's done seven frames. So yeah...

**Petr:** We are analyzing, nice. We're analyzing the features in the shot. Basically, we use that optical flow to match the position of the object corresponding to the original keyframe. Okay. So that's [00:19:00] basically the concept.
You make keyframes, you track, you make more keyframes and you refine, and you do it as many times as needed to get the best result, the result that you really want for this shot.
**Eliot:** Okay.

**Petr:** And you don't actually need to use masks, but we have the mask option here, the mask tab. There we have surface masks, which you can draw over the mesh, so you can kind of ignore some polygons on the mesh. Or you can connect a sequence, which, as I understand, you have from the iPhone. So you can connect that AI matte sequence, select which channels you want to use for masking, and ignore them. But when you use masks, the tracking speed drops significantly, so I think the better option is to try without masking first, and [00:20:00] use masks only if it's necessary. Especially bitmap masks, like sequences.
**Eliot:** Okay, let me take a look... all right, so I'll have to calculate the masks for this one later; that'll take some time. But I understand. Okay. Yeah.

**Petr:** I think we can just try without masks.

**Eliot:** Just understanding that this workflow is exactly what I needed: set the keyframe, track. 'Cause when I started looking at it, I just got confused trying to figure it out, because the tutorials are designed for when the footage is not aligned, and I had aligned footage. So this is perfect. It looks like a very neat tool.
So I think this will be really ideal, because there are a lot of things where you don't really need distortion. I mean, if you're shooting on like a 32 millimeter lens, you don't really have [00:21:00] a lot of barrel distortion. Okay, so it analyzed it, um, so now I'm going to go to...
**Petr:** You can go and track it from the first frame.
So you go to the first frame, uh, like your manual keyframe, and then you hit track forward, and you look at the image, whether you're satisfied with the result. So you just wait; if something goes wrong, you can stop it with escape or pause, and then realign something. Also, because you have really good positions in all the frames, you can actually create keyframes, for example first and last, and then hit refine, and in that case we will try to make the tracking perfect from the first to the last frame. But actually, as you see, it works even from the first frame.
**Eliot:** Oh, this works great This [00:22:00] works
**Petr:** well, but now now we should like dive deep, uh into The understanding how it works. Well now you can see in the tracking tab. We select the geometry So basically you animate now geometry Aligning to your moving camera. Well, actually it's not, uh, not really what you want to get.
As I understand. You need to get the fixed geometry and moving camera. Right. Yeah. So what you can do else, you can go to the first keyframe now. Uh, just jump to it. Yeah. And then you can clear the tracking, uh, on the right. So that's under the creating keyframes, there is a clear track to the right. So you can clear it from the first keyframe.
**Eliot:** Let me make sure I've got this right. So this is clear forward... okay. Yeah, clear keyframes to the right. Okay. So now I'm going...

**Petr:** Yes. Now set it to camera. Absolutely. And track again. So you get absolutely the same result in [00:23:00] projection, but in the scene, you are moving the camera now. And if your camera has no constraints, it should work perfectly. Only constraints can really create a big problem. And actually you can, for example, create a new GeoTracker, just at the top of the GeoTracker panel, create a new one, and then create, for example, a cylinder, and track that geometry using the tracked camera. We will account for the motion of the camera and calculate the new motion of the object. So you can basically track other features in the scene, and then reproject the textures. We have a texture option, we have alignment, a lot of options, like scaling and so on. So I hope that's really what you need.
**Eliot:** This looks... so, okay, so now that I've tracked it, this is it. This [00:24:00] is the tracked shot. And it looks really...

**Petr:** Kind of a one-button track shot.
**Eliot:** Oh, this is so great. This solves such a big hole we had, where, you know, we want to track something very accurately, so there's no slipping. Um, but again, lots of people don't want to deal with SynthEyes; sometimes it's amazing, but it's complex, it's huge.
And then you have to go to another app and pull it back into Blender, and there are like 20 steps. And this is like, no steps. Oh, this is great. This is really, really great. This is exactly what I was hoping for from this. Um, wow. If I'd known it was going to be this clean, I would have done this first.
Uh, but that's okay. Um, this is just great, and it's really capturing all the little nuances of the camera motion. Um, [00:25:00] all right, let me flip to the camera. So, okay. So now, once I have a track, then I just go back...
**Petr:** Back to 3D, yeah. You can go to the 3D view. Well, actually, you can split your screen and look at the 3D scene simultaneously with the tracking, so you can watch the tracking scene at the same time and correct it if something goes wrong.

**Eliot:** It's messed up my viewports, but let me set my timeline here. Ah, timeline. There you are. Okay. So then there's my scene. Oh, this is big. So then I go back and everything is tracked.
I'm going to hit save, and, um, okay. So then I can pull up a new scene. Oops, I made a vertical split. Okay, there we go. And I'm going to go back to GeoTracker.[00:26:00]
And if I go back into pin mode, it's not going to break stuff?

**Petr:** No, no, that's fine. You can enter pin mode in any window.

**Eliot:** Ah, I just don't want to accidentally move the mesh. How do I escape out of this? Escape? Okay. Actually, undo that. Like I said, everything's tracked correctly. Um, okay. This is great. What I'm probably going to do is go record a quick video on this and basically do all the steps we did here. Yeah, that would be nice.
**Petr:** Um, actually, take a look at the tutorials that we [00:27:00] have about the tracking, because we have, for example, the smoothing thing. In the smoothing tab you can adjust some parameters that affect the tracking engine, so every newly tracked frame will take those smoothing parameters into account during the solve. All the smoothing parameters are in the coordinates of the camera: the Z axis is not the actual global Z axis; it's the axis from the camera, perpendicular to the camera plane, and X and Y are in the camera plane. With smoothing you can get rid of jitter and similar artifacts during the tracking.
**Eliot:** Yeah. Let me make sure I understand that. So smoothing... [00:28:00] give me a second. Are you basically stabilizing the footage?
**Petr:** Well, no, no, no. Just, every frame, when you track it, we try to align the model according to the previous keyframe, using the optical flow. But when you have smoothing above zero, on every frame we try to take that smoothing into account when we calculate the motion, and the strength of that depends on the smoothing parameter.
So that means if you have a really jittery camera on the set, you have to have, like, zero smoothing, because you need all that [00:29:00] jitter in the tracking. But if you have a smooth camera, and for some reason, for example noise problems on the blue channel or something like that, you get jitter during the tracking, for example on the Z axis, which is one of the most problematic axes during tracking, you can play with smoothing and refine or retrack the shot with different smoothing parameters.
Try to get smoother tracking curves out of the solve.
**Eliot:** I see, I see. So that's...

**Petr:** Not a post filter. This is actually a parameter for the solve.

**Eliot:** I see. So it's adjusting the solve to fit better, in this case.

**Petr:** Yeah, yeah. You can tell the solver how jittery the shot is, or how jittery the rotation is.
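One way to picture what Petr is describing: smoothing enters the per-frame solve as a penalty term, not as a filter applied afterwards. A toy sketch of the idea (purely illustrative, not KeenTools' actual solver math):

```python
import numpy as np

def camera_space_delta(pos, prev_pos, cam_rotation):
    """Frame-to-frame translation expressed in camera axes:
    X/Y in the image plane, Z along the view direction."""
    return cam_rotation.T @ (np.asarray(pos) - np.asarray(prev_pos))

def solve_frame(flow_error, prev_pos, candidates, cam_rotation, smoothing=0.0):
    """Pick the candidate position minimizing optical-flow error plus
    a smoothing penalty. smoothing=0 keeps every bit of real jitter;
    higher values bias the solve toward smoother camera curves."""
    def cost(pos):
        d = camera_space_delta(pos, prev_pos, cam_rotation)
        return flow_error(pos) + smoothing * float(d @ d)
    return min(candidates, key=cost)
```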
**Eliot:** Oh, this is great. This is just great.

**Petr:** I mean, we also have masks, as I said, a [00:30:00] lot of mask options there; you can mask using groups and so on. You can, for example, use the scene tab to align, to play with the scale, to convert the animation from geometry to camera, or from camera to geometry.
Even if you've done the tracking, you can convert it. The unbreak rotation, that's about... I forget what it's called in math... the problem where your rotation curves get cut because of 180 degree wraps. But we handle it automatically in the background, so actually you don't need to use that button.
You can bake, for example, geometry or camera motion. That means if you have, like, nested animation, you can bake it into actual coordinates.
**Eliot:** So, okay. And so [00:31:00] I'm thinking through... and we have texture...

**Petr:** Of course. You can bake the texture, even from each frame; just calculate the texture. Well, you have to have UVs here, but you can, for example, automatically create UVs with Smart UV. That's a native Blender thing; I think it should work pretty okay. You can click here, create UV, then add one frame on the right side, just for example this one, and then create the texture: hit the button, and you will get the texture baked onto the model.

**Eliot:** Okay... I'm not surprised.

**Petr:** Well, it's not a problem, not a problem. Just click okay, and you'll get some kind of okay result here.
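For reference, the Smart UV step Petr mentions is native Blender and can be run by hand if the automatic create-UV button has trouble with a scan; a minimal sketch, with a hypothetical object name:

```python
import bpy

# The scan mesh (hypothetical name); make it the active object.
obj = bpy.data.objects["SceneScan"]
bpy.context.view_layer.objects.active = obj
obj.select_set(True)

# Unwrap with Smart UV Project so the scan has non-overlapping
# UV islands before the texture is projected and baked onto it.
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.uv.smart_project(island_margin=0.02)
bpy.ops.object.mode_set(mode='OBJECT')
```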
**Eliot:** Done. The overlapping UVs. Okay. So let's, let's take a look at my 3d scene. Switch to model view, uh, switch to material preview, um, see wireframe. Not my, [00:32:00] my wireframe is not, uh, it's not, I
**Petr:** think you are seeing wireframe because of the settings, uh, of.
The visibility of the object. I don't actually remember how to fix it, but actually, uh, yeah, here on the elements, I
**Eliot:** mean, yeah, we may have set that to be a display as wire. Okay. Display is textured. So then let's see if it, let's see if it, uh, made a, made a texture. Maybe not. I wouldn't be surprised if it had problems with that geometry, just because that's, that's from the, the, uh, the set scan geometry and it's.
Yeah, it's just not...

**Petr:** No, it should work. It should work, actually; we tested it. You can click repack, and check the create texture button again: repack, and then the create texture button. Okay, and then create texture.
**Eliot:** Okay, project textures. All right: done, but overlapping [00:33:00] UVs.

**Petr:** Oh, okay. Okay. That's interesting. I would like to check that. You can see in the shading nodes that we tried to calculate it. Well, actually, maybe something happened here. Maybe a normals problem, I think.
**Eliot:** It could well be. This is just a scan import, so it's externally created geometry.

**Petr:** Actually, I'm really interested in that material. If it's possible to check it on our side, that would be really interesting.
**Eliot:** Yeah, so let's take a look at what's in there.

**Petr:** So we create the texture and apply it through a Principled BSDF. For some reason that texture is not visible here. Oh, maybe that's a problem with the EXRs, I think. I should test it; if you send me the scene, or even the source, I can check it out.

**Eliot:** I can zip that up and [00:34:00] send it to you. It'll be kind of big, but I'll send it to you so you can see.

**Petr:** No problem. I can download any size of file.
**Eliot:** All right. This is just fantastic. So that motion tracking trick to offset the clip, that's key. Um, and I'll bring this back to... where's our visibility... normal wire, so we can see stuff. Um, okay. No, this is... I mean, just the fact that we can get a really locked-in track while staying inside Blender. 'Cause I even looked at whether we could modify Blender's normal motion tracking system to take in survey data.
And I mean, maybe somebody can figure it out, but I couldn't; it's just not there. But this is ideal. It's fast and it's simple and it doesn't cost much. And it's okay: the people who have heavy distortion in their [00:35:00] lenses, that's why we have SynthEyes, right?
It has a distortion solver. This is sort of the first thing you do, to get a shot to lock in without having to go to that. 'Cause as soon as you go to distortion calculations, it's a lot, right? Then you have to transfer the distortion nodes to your compositor, and we do it, we have tutorials.
It's just, you know, you're rendering oversize, you're rendering with overscan, and there are like 15 things you have to start doing. It's too much for a lot of people; they just want the background to not wiggle, you know?
**Petr:** Yeah. Well, that depends on the quality of the project, and on the time, because sometimes you don't have time for that whole trip around undistorting and redistorting the footage.

**Eliot:** Yeah, so basically...

**Petr:** We have different licenses, for example for studios and for freelancers. So depending on the size of the project, [00:36:00] depending on whether it's a company or just a freelancer, you can use all that stuff on your projects, and we are really happy to help everyone with understanding what's going on in our products. So, like, write in to support, and...
**Eliot:** That's great. That's great. I mean, we should figure out something, you know; we're working on getting our social media stuff better, because we're starting to have a lot of people make stuff. Um, and we just saw a trailer for this astounding Star Trek fan film made over in Germany, with these beautiful sweeping tracking shots, and they shot it in a gymnasium, a high school gymnasium.
Um, and they did a really fantastic job; the costumes are great and everything, it looks really good. And when I talked to the VFX guy, [00:37:00] he was really paranoid, because he knew that the post tracking was going to be awful. So they were shooting with Jetset, and they would check the takes, and if there was anything that had any little wobble in a take, they'd just reshoot the take.
And I'm like, okay, I get it: if you're a fan film, you can do that, but for real production you can't do that, right? And this solves it. As long as the track is good anywhere in the take, then great: we can set a keyframe, and away we go.
So if the good part was in the middle of the take, we'd set a keyframe there and then just track forward and track backward. Is that the basic gist of it? 'Cause on this one we set our keyframe on the very first frame, tracked forward, and everything just worked out of the box. If we wanted to set a reference point in the middle [00:38:00] of the shot, would we just set the keyframe in the middle and then track forward and then track backwards?
**Petr:** Well, the better idea is to click refine, because when you track forward and backward, you want to take into account all the other keyframes that you made, to get the better result.
So every time you make a keyframe, you hit refine, and refining automatically refines between keyframes. If you press it on a keyframe, it refines between the previous one and the next one. So every time you add a new keyframe, you hit refine, or refine all, which means: refine all the gaps between all the keyframes.
For example, some productions have a pipeline where they put the keyframes in first and then hit refine all. Basically, refining is tracking from the first keyframe to the second, then from the second back to the first, and then [00:39:00] matching these two tracking passes with some mathematical magic, to make it smooth and accurate at the same time. But again, if it works from one keyframe, why should you make more?
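A toy picture of that "matching the two passes" step: track forward from keyframe A, track backward from keyframe B, then blend so each end stays pinned to its keyframe. Purely illustrative; the real solver matches the passes far more carefully than a linear cross-fade:

```python
import numpy as np

def refine_between(forward_pass, backward_pass):
    """Blend a forward track (from keyframe A) and a backward track
    (from keyframe B) over the same frame range. The linear weights
    keep frame 0 pinned to A and the last frame pinned to B."""
    fwd = np.asarray(forward_pass, dtype=float)   # (n_frames, pose_dims)
    bwd = np.asarray(backward_pass, dtype=float)
    w = np.linspace(0.0, 1.0, len(fwd))[:, None]  # 0 at A, 1 at B
    return (1.0 - w) * fwd + w * bwd
```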
**Eliot:** Yeah, yeah, this is great. This already just solves it for a huge chunk of things. Now, of course, this footage doesn't have very much motion blur in it, because it's a controlled camera move. Um, how does the optical flow handle it if it's more jittery, if there's more motion blur at a certain point?
**Petr:** Well, that actually depends on the case. Of course, I cannot say generally, but generally, of course, motion blur is an artifact, so that's kind of not a good thing. But first, try it. I mean, sometimes even with motion blur we can [00:40:00] fix it; we can handle that problem. And also we have really smart algorithms for getting rid of moving objects inside the shot. So if something moves, first just try to track; maybe we will ignore it. If not, okay, you can use masking with the polygons and so on.
So it's iterative work, because we have a really easy UX for our tracking. You can try it quickly first; then, if it's not a perfect result for you, you can do it again and again, and finally get the result that you really want.
**Eliot:** And I'm just ecstatic that it worked like this, because, again, this is the perfect use case. For example, our real time tracking actually handles super fast, crazy moves really well, right? 'Cause it's optical-inertial, and the inertial systems handle really fast [00:41:00] transitions. The things that are actually harder are when the camera is really slow, because with any error you can see the background slide.
Um, and so this is just perfect for this.
**Petr:** Um, okay. Yeah. So for example, if you have a really shaky camera and a really shaky shot, what you can do is come into the GeoTracker and first create, like, five, seven, maybe more keyframes, because you already have an aligned mesh, and then hit refine.
Yeah. So first you use your data as the position, and then use the tracking to get a smoother pass between all of them.

**Eliot:** Great. That's great. So you just go through, and then you're not fighting it. Okay.
**Petr:** Something like that. Also, we have, like, professional things there.
For example, we have lock view; [00:42:00] that's the button under the tracking, and also the L key on the keyboard. What it does: it locks the model in the center when you play or track, whatever, just so you can see how good or bad a track you're getting right now.
And if you have a pin, you can select one or many pins. You can put a pin on the mesh, and then we will stabilize around that selected pin. So in that case, you can even check the quality of the tracking right during the tracking.
**Eliot:** Let's try that.

**Petr:** Yeah, just click somewhere, select a pin, and then enable the lock view.

**Eliot:** Okay, I have not created a pin yet. Do I just...?

**Petr:** Yeah, just click. If you click and drag, you move the model; if you just click, you create a pin. [00:43:00]
**Eliot:** Okay, so let me just pick a spot here that we can see through the shot. Let me go back a little bit to see... okay, so maybe this spot over here. All right. So I clicked... I accidentally moved the model; I have to click quickly. There we go. So there's a pin. So then, if I lock view...
**Petr:** Um, all right, that's L on the keyboard, like in the native Blender tracking. So we... oh, there we go. Well, we cannot see it here very well, with the freezing, but yeah, it's stabilized. It's just in the viewport, so it doesn't influence the tracking, and it doesn't influence the video plate. It's just for you to check that you have really good tracking.
And you can jump, for example, between keyframes to make the keyframes consistent, because it's very [00:44:00] important for tracking to have consistent keyframes. Each polygon should be in the same place over the object during the shot.
**Petr:** And that's kind of the self-checking thing. And you can always, like, press L to enable, L to disable, just during your work.
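The lock view is the same self-check compositors do by stabilizing footage around a tracked point: if the track is good, the scenery around the pin appears frozen, and any residual drift is tracking error made visible. A toy sketch of that idea (illustrative only):

```python
import numpy as np

def stabilize_around_point(frames, tracked_points):
    """Shift each frame so a tracked 2D point stays where it was on
    frame 0. frames: list of (H, W, C) arrays; tracked_points: (N, 2)
    pixel coordinates of the pin per frame. np.roll wraps at the
    borders, which is fine for an eyeball check."""
    ref = np.asarray(tracked_points[0], dtype=float)
    out = []
    for frame, pt in zip(frames, tracked_points):
        dx, dy = ref - np.asarray(pt, dtype=float)
        out.append(np.roll(frame, (int(round(dy)), int(round(dx))), axis=(0, 1)))
    return out
```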
**Eliot:** Oh, yep. When I was doing the barn shots with Alex, he was showing me how you stabilize around a point like that in Nuke, to look at the tracking and verify it. So I was just curious where that was, and that's perfect.
All right. Peter, thank you so much. I see another user coming in, Kevin, that I'll probably want to help with some stuff. But this is big. This is great. It uses all the pieces that we can generate on set very quickly, [00:45:00] and then we can get to this extremely high level of precision in post. And people can stay in Blender; they can learn this in, you know, 10 minutes.
And then if they have some crazy high-distortion thing, then great, we have the SynthEyes pipeline. But this is already just fantastic. Absolutely fantastic. Um, oh, thank you. I'm really glad I recorded this.
**Petr:** I'm really glad to help you with that, and I really appreciate it. I can come here and explain everything, or listen. I'm really happy to share our work, because we really try to do good things, and they work, and I'm happy that we can help you.
**Eliot:** Well, and this is going to figure prominently in what we do, because one of the things we're running into is that, [00:46:00] you know, we've solved a bunch of these real production problems, because we came out of production.
We built the systems for Once Upon a Time and Pan Am and Alice in Wonderland; we built a lot of the systems for big projects and big shows, and we realized we could do it all on the phone. And so we made the jump, you know, years ago, and really built this new app.
Um, but it's surprising that a lot of people think you need all this gear to do it. And they're actually a little scared to see everything just running on a phone. That's it, man; we've got the whole thing running on there. And so you need to be able to show this kind of stuff.
Like, here we go: this is sub pixel lock. This is production tracking, and we can do it at volume and scale. And that's what makes a show possible: being able to do not two shots, but 300 shots.
**Petr:** Yeah, I know what I'm talking about there; I have a compositing background. So I know how much work that is, more than a lot. [00:47:00]
**Eliot:** Especially in episodic, the shot counts become crazy, because what happens is people realize they can do this. And we would have this every time: the writing room realizes that all of a sudden they can write stuff and it will just happen.
And what started out as a normal shot count just goes nuts. I mean, hundreds; Once Upon a Time was hundreds of shots per episode, and that's 10 years ago. Um, so just seeing that Star Trek fan film, where they're doing, I don't know, it's got to be dozens upon dozens of shots, it's many, many minutes long, and they're tracking, you know, steadicam stuff flying all over the place, stabilized, and it looks like a network show.
Um, but the camera's moving all the time, and if you're trying to post-track that, you need a team of 15 people just to survive. And now they can do it with, you know, two. And that was before this, right? Before, if they had a tracking jump, they [00:48:00] had to go back and reshoot it.
Now, if they have scans, this is it; we can just solve it. Um, okay. Yeah, we can solve it with scans. Peter, if it's okay, I'll probably put this recording up, because this is just solid gold in our office hours. Then I'll go back and do a real tutorial over it. But just seeing this walkthrough, I think, will be extremely useful for people.
And you explaining some of the details.

**Petr:** Sure, sure, you can. But it would be really nice to make the cool tutorial, so I hope you will do that and explain all the steps for your users. And also, if, while making the tutorial, you find that you need some features to make it even easier to start tracking, it would be nice to talk about that; we can [00:49:00] call and talk a little bit. And it would also be really nice to have this project, just to look at the reprojection things, like why the textures are not applied here on that model.
And also, we know that on big projects, studios sometimes use our product for tracking cameras using LIDAR scans and retopologized models, but it's really hard to get materials for that kind of tutorial in entertainment. It would be really nice to have some projects from you, if it's possible, of course. Like, for example, some scene and the model, to make our tutorial about how to track a camera using a model. And I think, like, you can reference us, we can reference you: that this [00:50:00] model was created with your app, and people can use them together.
**Eliot:** I've got the perfect one. We're working with an influencer, a YouTuber named Alden Peters, who has done a wonderful series. And actually, in one of his upcoming videos we are going to do tracking refinement. We were originally planning on doing it with the SynthEyes pipeline, and that's great, but he's already in Blender; he's shooting everything, doing everything in Blender.
This tells me we want to do it with this, have this in his video. That way he can do a tutorial, a quick walkthrough on it, and I can get you the footage. It's nicely shot footage: it's on our green screen stage, shot in Blackmagic B-Raw, with correct scans.
And everything's mapped on. So I'll check with him, but I'm sure we can get you that.
**Petr:** That's amazing. That's amazing. I think you can write to us at support, and we will connect about that [00:51:00] directly. Okay.
**Eliot:** 'Cause this is gold. Then it's having all these pieces together. The thing with visual effects is that so much of it is setup: you're trying to get all these pieces together, trying to synthesize information that's not there. When the information is already there, the algorithm is going to work really fast.
Like this: we tracked the shot in, you know, four minutes, right? Boom, sub pixel, right? Ground contact, everything lined up. A piece of cake. If all the pieces are already there, then the algorithms just, you know, do it. Otherwise, most of the work is the setup, and that's the work that takes forever.
**Petr:** That's really amazing. That's really amazing that, like, modern technologies... you can apply it to any project, any size of project, even, like, your in-house project, and get amazing results. Like here. For example, you can use several cameras, share the one geometry between them, and track them in one scene.
So you can make, like, a multi-camera setup with one geometry, and, for example, render [00:52:00] all the moving objects and all the backgrounds from the correct views. So that's really huge.
**Eliot:** Oh, that's interesting. So hang on, let me just go back and think through that. So that would be in the same shot, or...

**Petr:** Not the same shot, but in the same scene. For example, you have a scene with a dialogue or something like that, and you have several cameras. And you can match all those cameras in the one 3D scene, to render the proper backgrounds, to render the proper objects, to get the correct lighting, everything correct, like everything made in one shot.
And that's even easier than making separate shots and every time, in post production, trying to match the clips. So here you can, like, [00:53:00] scientifically perfectly place everything in one 3D scene.
**Eliot:** That's actually exactly how we're designed: we have a coherent 3D space that the cameras all show up in. And now we can get them to that level of precision.

**Petr:** Yeah, you can track it again, like re-track it with GeoTracker in absolutely the same way. So create another GeoTracker, and another, and another, for each camera, and reanimate those cameras. And then they will be perfectly aligned in one scene.
**Eliot:** So this... okay. I need to go help Kevin, but I'm also going to share Alden's most recent video, and I'll link to it. Um, let's see... okay, YouTube, Alden. One of the most recent videos. I'm going to share this on here. Copy. I'm going to just put this in the chat, 'cause this becomes important.
[00:54:00] Um, all right, where's my chat? There we go. Going to everyone. Okay. So here's the second video. What he did is he has the Jetset Cine camera on one, and then, since he had a CG character in the scene, he used another Jetset camera on just an iPhone, right?
Just Jetset on an iPhone, because it was like a seven foot tall robot. So they have that camera on a stick going through the scenes, so the actor has correct eye lines. And then what we do is pull the data from both of them, because they're all synchronized with timecode, and we have the object track drop right into Blender.
And it's correctly synchronized in time and space with the original camera. Now, it's not a subpixel track, right? Not by the time you have all these transforms and stuff like that. But what it does tell me is that I think we're going to be able to use GeoTracker both for camera tracking and for object tracking, with the same kind of assist, right? You already know the position and orientation of the object to within a centimeter, and then it's a lot [00:55:00] more straightforward to move it into place.
Um, so I think we'll be able to do both camera and object tracking. Yeah, I think there are several tutorials to be done here, because people are starting to look at using these multiple Jetset devices to just track objects in a scene for the real time track. And then, again, if it has to be super precise, you've got to post-track it to get the precision onto it.
So I can see a great... all right, so this is awesome. I could go on with it, but I want to help Kevin out real quick. So, Peter, thank you. Thanks for joining. This is absolutely fantastic. Thank you.

**Petr:** Let's keep in touch. Yeah.

**Eliot:** All right. Talk soon. All right, Kevin. How are you doing?
**Kevin:** Hello. Good. I don't really need help. I'm just kind of trying to learn more about your system and get my brain into this space. I'm like a kind of super-generalist director VFX guy; I don't know what to call myself. Um, and I've been kind of keeping an eye on the Jetset [00:56:00] thing for a few years. But, for example, I just did a bunch of green screen virtual production shoots where we had a stYpe tracker and a hardware Ultimatte box and a computer running Unreal in real time and all that.
And I've kind of been, for the stuff that I'm doing, of the opinion that what you guys are doing is super cool, but the iPhone is not going to be as good a tracker as a dedicated tracker, and it's not going to be as good a keyer as a dedicated keyer, and so on. So I've kind of held back. And then a couple weeks ago, at that thing in Glendale that I'm blanking on the name of...
**Eliot:** The Production Summit.
**Kevin:** Production Summit, thank you. I talked to your colleague, or whoever else was at the booth. Not you; you were at the booth a bit too, but I missed you.

**Eliot:** Uh huh.

**Kevin:** Um, and he was showing me some other stuff, like the workflow stuff, the refinement stuff you guys are doing, and PFTrack... or not PFTrack, but, uh...

**Eliot:** SynthEyes. Yeah.

**Kevin:** SynthEyes, thank you. Um, and then the iPad slate with the QR codes for syncing things. And it's all these, like, workflowy things that keep me [00:57:00] up at night. It's like you're speaking my love language. And so I'm really trying to take a closer look and see if it would work for the sorts of things that I'm working on.
Um, and maybe it would, maybe it wouldn't, in the near term. But exactly the kind of stuff you're talking about right now: I'm working on a show which is a reboot of a show from like 10 years ago. And, you know, we had an episode with 300-some VFX shots in it, and it was murdering us. It's a pretty low budget show; this is not Game of Thrones or Stranger Things.
**Eliot:** Right. Um, that's exactly perfect for us.
**Kevin:** And so much of the time that our directors and the team are on set, we're just looking at green. And I feel like the technology is there [00:58:00] for us not to have to do that, for us to be seeing good previews on set, and to have a really streamlined workflow coming out of there.
And right now what I'm experiencing is a lot of complication and messiness. So that's it. I just kind of wanted to start popping in at your office hours and start to wrap my head around it. I haven't used your app beyond downloading the free thing and playing around with it on my iPhone; I haven't tried to really implement it in a proper shot. But I feel like that's, you know, next steps.
**Eliot:** There it is. You just nailed it. What you're describing is the way virtual production has been done for at least 20 years, right? Probably more. You have some form of tracking; it used to be an encoded crane, whatever. And then you run it to an engine; it used to be Brainstorm, now it's Unreal. And you take all the feeds and wire them together. You hook it to an Ultimatte; it used to be analog, now it's the digital Ultimatte. Each one of those things is a thing, and half the time that [00:59:00] takes a dedicated op on set, and each one of those guys walks around with a thousand-dollar-a-day price tag on their head to keep it all alive.
And then you're taking it all apart and putting it back together. And then there's two hours of fixing when you boot up in the morning at call time, 'cause something in the wiring got messed up, right?
**Kevin:** And ultimately, these are the things I'm finding: a lot of our on-set tracks aren't good enough, and they need refinement. And the fact that you guys are working that into your workflow, from what I saw today... sorry, I was on late; I was taking the kids in, and we're a little messy here. We're up in Pasadena, so the kids don't have school right now, and things are a bit crazy, so I was taking the kids to childcare and running late today.
The fact that you're working that into the workflow, like, let's just assume that the track data from on set is going to be close but not quite, instead of it being this either-or category of "we can use it" [01:00:00] or "we have to just send it off to be tracked," you know. Um, so those kinds of things...
**Eliot:** Off to Vietnam, 300 bucks a shot.
**Kevin:** Yeah, right.
**Eliot:** You can ship them overseas, but you can't turn that fast. How do you track 300 shots in three weeks? Which is what your turn times are going to look like toward the end of your season. Right.
**Kevin:** Right. And then for a show, a show that I'm working on coming up here... Um, and it's a little weird, like, where I'm coming from. I said I'm like a super generalist.
What I mean by that is, I'm not great at anything, but boy, I'm like a B-minus at a shocking amount of things. Um, I really come from a cinematography background, a directing thing. I'm, like, a VFX-obsessed director; my career idol is Gareth Edwards, right?
**Eliot:** Man, you've come to the right place.
**Kevin:** Yeah. Um, can I screen share for a second?
**Eliot:** Oh yeah, go for it.
**Kevin:** I think I can share this, because it already aired. Just an example of, uh... well, that's what you just linked us to, but, oh, and [01:01:00] that's something else I'm working on. This is just an example of one of these virtual production shoots I did recently, where we're on an enormous green screen stage.
And, uh... you got me? I gotcha. And this is just a really nice preview for the director to see, because otherwise we're just looking at a whole lot of nothing. Yep. It's really hard to line up a shot or think about what you want when you're just looking at a whole lot of nothing. Um, but the track on this was not tight enough, for whatever reason.
Uh, we had some jitters, and so then everything gets retracked, everything gets redone from scratch. So, um, these sorts of things, you know, are what I'm really interested in. And this upcoming show... I just had a call this morning where I'm not sure if I understood it right.
This is just more of that same scene. And the other thing is that it was really great for editorial to have these to cut to.
**Eliot:** Oh yeah. Editorial loves it. Just loves it.
**Kevin:** [01:02:00] Uh, because otherwise it's just a bunch of people standing on green, and that's just miserable, you know? Yeah. Um. So.
**Eliot:** Not a happy camper.

**Kevin:** Yeah. Yeah. And so, um, I was on a call this morning about an upcoming show, and I'm actually... I might be misquoting, I'm not quite sure I understood the mandate right, but they basically said to the writers, we can have X amount of visual effects shots and no virtual sets.
Um, and to me, I almost feel like... I would love to have this stuff sorted out enough that... sorry, I lost the Zoom there. Am I still there?
I would love to have this stuff sorted out enough that it almost flips: the virtual set shots are the easy ones. We can have four times as many of those as we can have these bespoke shots where [01:03:00] we have to, you know, I don't know, make a magical portal appear in his eyeballs, and then he turns into a dragon and flies away.
That's very hard to do. That's always going to be very hard to do, you know?
**Eliot:** Those are wonders. What you're describing is the origin of Lightcraft: me looking at this and saying, wait a second, once you build a good 3D model and set up your comp trees, then if you had good tracking data, you could go through these things at this crazy rate of speed. Because you've got the tracking data, and you're copying and pasting your comp trees and modifying them on a per-comp basis, but you've got 20 shots pointed in the same direction with the same lighting, right?
You know, and it's a green screen, et cetera. So, um, this is what we came out of. And people ask, oh, what's the difference between you guys and other iOS apps? And I'm like, well, among other things, we track cine cameras; we do full-on lens calibration. But more importantly, the whole system is engineered for episodic, because that's where we got our start, right?
So every single... our [01:04:00] metadata tracking, all this kind of stuff, is already organized by camera and by time, you know, shot, take, et cetera. But each take also has a unique 10-digit hex ID, and that's so you can actually parse 5,000 takes and find exactly the one you're looking for.
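As a rough sketch of that unique-take-ID idea (the 10-hex-digit width is from Eliot's description; the helper names and take-record layout here are illustrative assumptions, not Lightcraft's actual scheme):

```python
import secrets

def new_take_id() -> str:
    """Generate a random 10-digit hex take ID (40 bits), e.g. 'a3f91c07b2'.

    40 bits gives ~1.1 trillion possible IDs, so collisions across a few
    thousand takes on a season are vanishingly unlikely.
    """
    return secrets.token_hex(5)  # 5 random bytes -> 10 hex characters

# Usage: index 5,000 takes by ID, then pull one back out directly,
# regardless of how the shot/take folders happen to be organized.
takes = {new_take_id(): {"shot": "042", "take": i} for i in range(5000)}
some_id = next(iter(takes))
print(some_id, takes[some_id])
```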
**Kevin:** I saw that. Yeah.
So you're using the LiDAR on the iPhone to do that rough model that you guys were using today. What's the range on that? Like, that one shot I just showed you, we started up on a 10-meter crane. Is that, like, out of the range?
So, like, we wouldn't have LiDAR there, and... you know what I mean? These are the kinds of questions where I need to start putting it through a pipeline and asking: where is this going to break? Where can I...?
**Eliot:** Right, right, right. Um, the LiDAR points are good up to about a 5-meter range.
Um, but the trick is, I've seen shots come from a big crane up high and work, right? If there's enough stuff in the background, [01:05:00] because the tracking is a natural-feature tracking system, and it works like every other natural-feature tracking system: it looks for high-frequency detail,
you know, tracking points that act as anchors.
**Kevin:** And the tracking is a point-based tracker ultimately, but with the help of the IMU?
**Eliot:** Yeah, right. So it fuses those together, and they all work more or less the same way. Um, and I've seen big shots come in and come down where, if there's enough parallax and enough stuff for the tracker to hook onto, it worked.
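The fusion they're gesturing at can be sketched as a toy complementary filter: the IMU gives smooth, high-rate motion but drifts, while feature tracking gives noisy, intermittent absolute fixes. This is an illustrative one-dimensional sketch of the general visual-inertial idea, not ARKit's actual algorithm:

```python
def fuse(pos_imu: float, pos_visual: float | None, alpha: float = 0.98) -> float:
    """Blend dead-reckoned IMU position with a visual fix when one exists."""
    if pos_visual is None:            # no trackable features this frame
        return pos_imu                # coast on the IMU alone (drifts over time)
    return alpha * pos_imu + (1.0 - alpha) * pos_visual

pos, velocity, dt = 0.0, 0.5, 1.0 / 60.0     # integrated IMU velocity, 60 Hz
for frame in range(1, 121):
    # Visual anchors arrive only every few frames, like sparse feature locks.
    visual_fix = 0.5 * frame * dt if frame % 4 == 0 else None
    pos = fuse(pos + velocity * dt, visual_fix)
```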
And that shot didn't even need post tracking. I looked at it, and it was a monster crane shot. Yeah, yeah. But what I think is more key here, just so you understand the theory behind the system: my background is engineering, right? I designed robots before I did this.
**Kevin:** I heard you mention the Roomba.
I saw a talk you gave, uh, I think at the previous year's production summit.
**Eliot:** Oh, great, great.
**Kevin:** So, yeah.
**Eliot:** So I took control systems, I took all this kind of stuff. And, um, [01:06:00] what I really noticed when I got into visual effects and did hand-tracked shots is, yeah, they blow up all over the place.
I went, oh, I get it: it's a nonlinear least-squares solver and we don't have enough data, right? And so they blow up; that's just what they do. Um, and we realized you can either try to solve it open loop, which, in engineering terms, means everything has to be perfect and precise to get the solve in the moment,
and after going through it, I went, you're almost never going to do that. There are too many parameters that can go off. Among other things, the tiniest bit of misalignment between your tracking head and the camera lens and your calibration's off; you go 40 feet out and you're off by two centimeters.
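A toy version of that argument in code: a camera solve is a nonlinear least-squares problem, and feeding in the on-set tracked pose as a soft prior means the solver only refines by centimeters instead of searching blindly. The survey points, noise levels, and weights below are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import least_squares

survey = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 3.0]])  # surveyed points (m)
true_cam = np.array([10.0, 8.0])
ranges = np.linalg.norm(survey - true_cam, axis=1)        # ideal measurements
ranges += np.random.normal(0.0, 0.05, size=3)             # plus sensor noise

def residuals(cam, prior=None, w=10.0):
    r = np.linalg.norm(survey - cam, axis=1) - ranges
    if prior is not None:                            # soft constraint keeping
        r = np.concatenate([r, w * (cam - prior)])   # the solve near the track
    return r

prior = true_cam + np.array([0.02, -0.01])   # live on-set track, ~2 cm off
blind = least_squares(residuals, x0=np.zeros(2))
anchored = least_squares(residuals, x0=prior, args=(prior,))
print(blind.x, anchored.x)   # the anchored solve stays within cm of truth
```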
Right. And that's mechanical engineering, that's not optics, and I'm a mechanical engineer. So as soon as somebody goes bonk on the rig, you know... So, when we saw... We used to build our own trackers, et cetera. And then about four years ago I was experimenting with the iPhone, and I realized this thing's really [01:07:00] good.
**Kevin:** You used to build your own trackers?
**Eliot:** Oh yeah. Yeah. We hand-machined a machine vision system, targets in the ceiling, and we shipped a bunch of them. That's how we did it once upon a time. Uh, and then, I mean, we used an InterSense tracker for a while; then we built our own, the same kind of thing with targets overhead.
**Kevin:** I've been, uh, using, well, I have two of them sitting at my desk. So, uh,
**Eliot:** There we go.
**Kevin:** Bliss, which is the Xvisio...

**Eliot:** Xvisio, all right. That sounds good.

**Kevin:** Xvisio, with the RETracker Bliss, which is their, like, software to plug into Unreal and all that, so,
**Eliot:** Yep, yep. And I'm sure it's good.
But the fundamental thing I realized is that we should get close on set instead of trying to be perfect on set. Like, the iPhone gets you close. There's not enough there to get perfect; for that you kind of have to be in broadcast, and then you end up with a Stype, right? You just have this very big chunk of hardware.
**Kevin:** Which is what we had for this shoot, and the Stype wasn't messing up, but something on the recording... So actually, that clip I just played you, which was... you [01:08:00] know, we were recording to, like, an Atomos Ninja or something. Not a Ninja, those are old; whatever the new Atomoses are. Um, we were recording for editorial, right? The preview comp,
**Eliot:** Um,
**Kevin:** and that was super locked in. And then we went back into the data, and there was jitter in it. And then we had to match-move everything. So the Stype can still work and the recordings... I don't know, right? And these are the things that keep me up at night: these workflow problems that, um, if you were doing a one-off shot, who cares?
**Eliot:** And at 300 shots you care a lot, because it'll tear you apart, you know. And this was... okay, when we built those big systems, we built the front end. So we just output, like, a tracking data file, and you load it into Maya or something like that, or an FBX file, and we're like, okay, great, you guys can handle it back in post.
No, no. There were a few places, you know, Stargate, the big ones, Imageworks; they've got a team of engineers that can do it. But almost no small team, by which I mean, like, the half-dozen people that start a [01:09:00] visual effects company, could do the programming to handle the throughput, because the volume of shots can go nuts.
And so what we did this time around is we went, oh, okay, we're going to build both the front end and the back end. And that's what AutoShot is: it's basically the back end, an automated ingest system. It takes the tracking data, you punch in your frame-in and frame-out, it pulls the EXRs, does the correct color space conversions. And in probably another month or so we'll have RED Raw,
ARRI Raw, Canon Raw; you know, we already have BRAW. So we're implementing all the things so we can do straight pulls from the source and do the correct things.
**Kevin:** That's good. The shows I'm working on are shooting ARRI Raw and RED Raw. So,
**Eliot:** Yep, got to do it. And you've got to go from the camera originals. So we just do that and, you know, solve the color space and gamut stuff, because
you've just got to be on it, otherwise you blow it. Mm-hmm. And then, you know, then you can do it. And so then it wraps up the package and drops it into Blender and Unreal and Maya and, you know, all these different [01:10:00] tools, using their own scripting languages. We use USD to get into the phone;
we don't depend on any external thing like FBX or anything like that to go into the actual app. And we write script code: if we're going to Cinema 4D, we write Cinema 4D Python code. So, you know, we're not dealing with weird import problems, because otherwise you will deal with weird import problems.
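For a flavor of what USD interchange looks like, here is a minimal sketch of writing a tracked camera into a .usda file with the OpenUSD Python API. The prim paths, attribute values, and file name are illustrative, not AutoShot's actual output layout:

```python
from pxr import Usd, UsdGeom, Gf

stage = Usd.Stage.CreateNew("take_a3f91c07b2.usda")
cam = UsdGeom.Camera.Define(stage, "/Take/CineCamera")
cam.CreateFocalLengthAttr(35.0)              # mm, from the lens calibration

# Key the tracked camera position per frame (two placeholder samples here).
translate = UsdGeom.Xformable(cam.GetPrim()).AddTranslateOp()
for frame, pos in enumerate([(0.0, 1.7, 0.0), (0.01, 1.7, 0.02)], start=1):
    translate.Set(Gf.Vec3d(*pos), frame)

stage.SetStartTimeCode(1)
stage.SetEndTimeCode(2)
stage.Save()
```

Because USD is the neutral layer, each DCC then reads the same data through its own native scripting, which is the point Eliot is making about avoiding FBX import quirks.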
**Kevin:** Right. And those sorts of issues, like the camera raw, those are the sorts of things I'm looking at in terms of: okay, is this something that I'm going to be able to incorporate into the work I'm doing? Which is, you know, network episodic TV, essentially, here and there, but kind of on a more modestly budgeted side of that.
And there are certain integration pieces where it's like, um, you know... camera department: strapping an iPhone to their camera, I'm going to get some looks, right? I'm just going to get some looks.
**Eliot:** I know, right?
**Kevin:** And it's ridiculous, because an [01:11:00] iPhone is objectively a more sophisticated piece of technology than a Stype.
But you're going to get some looks for an iPhone.
**Eliot:** I'll just tell you, we have some very large-scale shows going off right now with very large-scale groups. And it was funny, because one of their, you know, heads of production was like, you're never going on the camera. So they have us off to the side, pointing at an iPad; the director looks through it, sees where everything is. You're never going on the camera.
I'm like, I gotcha, I gotcha. You know, but people need to see. The core thing that I realized, and again, this is math-and-engineering kind of stuff, feedback systems, is that if we got close, and we got scan data, those are the two things. Then we have a geometric plan of the scene, and I can feed the solver enough information that we can lock in sub-pixel tracks at a very high rate of speed.
And that's exactly what's manifested, right? So we did the first one with SynthEyes, and there's a tutorial on that. And just earlier today, what you saw Peter and I work through is being able to do that completely inside [01:12:00] Blender with GeoTracker. I'm like, there it is. You know, I didn't know that was out there; otherwise I would have done that, because the SynthEyes thing was a lot of work. That was a good month and a half of hard, hard stuff. It works like gangbusters.
And if you are You know, spooling, if you're a big team, great. You know, you have dedicated trackers and they're used to it. A lot of small teams that they look at some thighs and they're like, Nope. So this, this handles the, you know, people stay
**Kevin:** You know, the stuff I'm working on right now is kind of in between.
It's more on the smaller side because of the tight budgets. But, um, you know, at the same time, um,
**Eliot:** I think it's ideal, honestly. You know, the iPhone is so... and the thing that's probably going to help on this is it's so lightweight that you can park it on the camera, and you can rent... So the shots we did with Alex in the barn, I don't know if you ever saw that [01:13:00] video,
**Kevin:** The ones that you were just using a clip from on the plane?
**Eliot:** Yeah, yeah, that one. That's on a stock... he was on a Sony, uh, FX3, recording 12-bit raw. Um, and it's stabilized on a stock DJI RS 2 gimbal. Like, you don't need one of these monster things; it doesn't weigh anything, right? And so, you know, the iPhone actually works pretty well in production.
You need to keep it powered and cooled, so we use the Accsoon SeeMo, both for video and because we put a battery on it, an NP-F battery, and it keeps the phone charged and runs the cooler. The key thing with the phones... I mean, honestly, again, I did robotics: hardware engineering is a function of how many units you can ship; that's how much
engineering you can put into the reliability and stuff like that. Um, and I guarantee you they ship more iPhones than, you know, anything else you can name.
**Kevin:** Yeah.
**Eliot:** Yeah. Their limitations are battery and cooling. [01:14:00] So you put a cooler on there and you put a big battery on there, and yeah, it'll run all day. If you're in the boiling sun you may have some problems; put a shield on it, kind of the usual production stuff.
Um, you know, they work, they were great. And among other things, we've been adding remote control systems, because the camera's over here, and you need to be able to fix stuff remotely without going up to the camera team going, you know, move out of the way, I've got to touch the camera. Avoiding that is always a win.
So Jetset has a browser system built into it, right? Each Jetset device serves a web page: you type its IP address into a browser and now you have a local preview; you can link in and see the video feed from Jetset. And it's got a remote control system, a remote UI that reasonably mirrors what's on the actual device.
So you can remotely change scene locators, you can remotely adjust tracking, you can do all these things that you have to do, because what happens is the camera's 30 feet up on a [01:15:00] crane and you can't touch it. You need to be able to remote-control everything. And that's come out of our projects over in Bangalore, uh, not Bangalore...
Um, actually, yeah, Bangalore. Um, and, uh... as well as the clip that I just sent you, the fast take matching with timecode; I just uploaded that yesterday. So if you use the Tentacle Sync, you don't need to use the digital slate. Um, the digital slate is still great to use as a backup, but the flashing optical markers are too unreliable in some areas, because you can get a reflection on them from a bright light.
Oh got it It wipes you out and you can do it manually, but uh,
**Kevin:** Got it. Okay. No, no, that's good to know. Um, yeah.
And you can use USDs, but you're kind of preferring splats, because USDs just get really heavy on the phone. Is that right?
Did I get that right from something I saw?
**Eliot:** The USDs are [01:16:00] accurate, and they actually work great for reference geometry. The problem is that the phone doesn't have that much texture memory.
So it's about getting things to fit on the phone. You can actually load a decent amount of geometry, more geo than you would think you could load onto the phone, especially on a newer phone. If you're trying to do this with an iPhone 12, you're going to be limping a bit; but by the time you have an iPhone 15 or 16, the GPUs are good.
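Back-of-envelope math shows why texture memory, more than polygon count, is the squeeze; the texture counts and sizes below are assumptions for illustration:

```python
def texture_mb(width: int, height: int, bytes_per_pixel: int = 4,
               mipmaps: bool = True) -> float:
    """Approximate GPU memory for one uncompressed texture, in MB."""
    base = width * height * bytes_per_pixel
    return (base * 4 / 3 if mipmaps else base) / 2**20  # mip chain adds ~33%

print(texture_mb(4096, 4096))        # one RGBA8 4K texture: ~85 MB
print(20 * texture_mb(4096, 4096))   # a 20-texture set: ~1.7 GB, phone trouble
```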
**Kevin:** What iPhone do I have? I don't know.
**Eliot:** Find out what iPhone you have. Well, if you get into this and you start doing it seriously, I'm going to recommend at least a 15 Pro Max. Among other things, the cooling and the GPU are much better, but it also has a USB-C port.
Um... so this is a 15 Pro Max? All right, oh, you're okay. Because then this is your new best friend: a USB-C to gigabit ethernet connector. Because AutoShot [01:17:00] pulls takes off the phone over Wi-Fi, um, but you run, like, a production day and you have, like,
I dunno, 120 takes, 150 takes. And over Wi-Fi you might get 50 megabits per second off it; it's not fast. This is 800. So you just plug it into the phone, hit sync, and it rips all the takes, all the files, off the phone in some number of minutes. And it's fast.
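The arithmetic behind that is worth seeing; assuming a made-up figure of 1.5 GB per take, the two link speeds Eliot quotes work out roughly like this:

```python
takes, gb_per_take = 120, 1.5            # per-take size assumed for illustration
total_mbit = takes * gb_per_take * 8 * 1000

for link, mbps in [("Wi-Fi", 50), ("Gigabit ethernet", 800)]:
    hours = total_mbit / mbps / 3600
    print(f"{link}: {hours:.1f} h")      # ~8.0 h over Wi-Fi vs ~0.5 h wired
```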
So, you know, again, with a lot of these things you just have to be thinking about volume and speed and turnaround, and engineering that into the core of the system to handle it. And that's kind of what we're doing.
**Kevin:** What about, um... as far as I know, you don't have anything for dealing with, uh, focus and focus pulls?
**Eliot:** Not yet. That has come up. Um, now, by default, the iPhone has LiDAR, right? And so the phone has an automatic focus that is basically detecting the location of people in the [01:18:00] scene, and it sets the focus distance to, sort of, where the people are. And that is automatically translated into our post-production pipeline.
So we have an automatic focus distance: you're doing, you know, the tracking shot, there's a person there, it's going to be focused on them, and it's going to be accurate within a few centimeters. It's not doing racks.
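On the DCC side, carrying that automatic focus distance through could look something like this Blender sketch, which keys a camera's depth-of-field distance per frame. The per-frame values are a made-up stand-in for the phone's LiDAR-derived track, not Jetset's actual output:

```python
import bpy

cam = bpy.context.scene.camera
cam.data.dof.use_dof = True                     # enable depth of field

focus_by_frame = {1: 2.10, 2: 2.12, 3: 2.15}    # metres, illustrative values
for frame, dist in focus_by_frame.items():
    cam.data.dof.focus_distance = dist
    cam.data.dof.keyframe_insert("focus_distance", frame=frame)
```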
**Kevin:** But there's no way in the system currently of recording whatever the AC is doing with his little wheel?
**Eliot:** Not yet; it's very much on the roadmap. Uh, the thing that's on the roadmap before that is compositing in the phone. Right now, what you're seeing when you're shooting with Jetset, even though we've calibrated to the cine camera, is the iPhone camera.
You're seeing the iPhone video with a reticle over it showing you where the equivalent of the cine view is. But that breaks in all sorts of ways, right? When you have a 70 or 85 millimeter lens, you have this goofy little reticle that's no longer useful for aiming, [01:19:00] and if you're very close, the offsets will be wacky.
You've just described where we're going with comp, and I mean, we're going to do it; we're cranking on it. You just have to have it for production, because then you have a production comp. So, the place where we use USD files... um, I think they're the default, right?
Um, but sometimes you really want to be able to show what the lighting is going to look like in your 3D scene, as rendered. And for that case we have Gaussian splats. You can generate those; the Blender one already works. Um, and so you can generate a Gaussian splat, and then what's in the phone is rendering in real time and it looks rendered. I mean, it's a little fuzzier; it's not as crisp as an actual Cycles render, but it's like 80 percent there.
And that's close enough; you're there. Uh, and we're building the same pipeline for Unreal to generate Gaussian splats. Because what everybody wants to see, if you're doing your green screen, is the shot, right? Like [01:20:00] 80 percent of the shot, with the cine footage and the Unreal or Blender scene lighting and stuff.
**Kevin:** Yeah. And so I think I'll be lurking on these calls some more and experimenting with it. I think in the immediate future I'm going to keep doing what I'm doing, which is hooking up an Ultimatte box and, uh, doing what I call vanilla green-screen Unreal, um, which is, you know, Ultimatte box,
external tracker, and those things. But I would love to be working... like, my fantasy would be to take this one company that I'm doing a lot of stuff for now and get them set up in such a way that they never have to be shooting looking at a green screen, because so much of their life is spent shooting looking at a green screen. And not just for the warm-and-fuzzies it will give the director, but so that it's integrated into a post pipeline, and it's actually making life for the post VFX team easier, and also making decisions clearer, and, you know, all those things.