Transcript
# Office Hours 2024-12-10
[00:00:00] All right. We're just going to go through and kick off a solve on this. There were a couple of puzzling things that I want to look at with you and scratch my head on, because the camera placement seemed a little unusual.
I think it was under the floor. Anyway, how does that work? Yeah, so we've got a rig literally on the floorboards to help stabilize the camera while we do a camera movement from right to left, just swinging the camera from right to left while our actor was also taking a step, which, again, made me ask you that question about that check picture.
You showed me, because that does look... Oh, okay. This is different, I believe, from the first one I sent you. I accidentally had my solver set to something called GLOMAP, which is a faster solver we've been experimenting with, but it's not our default solver. The default solver is COLMAP, [00:01:00] which is slower, but it's more reliable.
So that was my mistake. It means that we actually have a, okay, that's good, a lens calibration. So that's great. All right, I don't need to save it. And it looks like we converged at a 0.49 pixel error. That's fine, it looks like a solid solve. All right. So then what I'm going to do is,
to start off, okay, that's right, this one did not have an external set LiDAR. Let me first pull this into Unreal, or I'm sorry, into Blender. It'll be a little bit easier for me to look at things. And the other thing we're going to need is an external scan OBJ. So I'm going to open up Blender.
Can I ask you, oh, go ahead. Forgive me. Can I ask you a question just before you jump into Blender, because this is another thing I was unclear on. I am aware that a LiDAR scan is produced before you start shooting, but I'm [00:02:00] unclear whether, if the set itself starts changing, you need to produce another LiDAR scan of the set.
I.e., if props get introduced to the same set, or taken out, or lights that were incorporated in the initial scan get moved, would the shot require, later on during the day, another scan, just to make sure things align better to the new set? It is never a bad idea to do a scan. And one of the deliberate design decisions we made,
for example, is that our scanning doesn't pull in texture, it only does geometry, because as soon as you introduce texture it's a factor of five slower, just glacial in comparison to how fast we can get the geometry. And what we found is that the scan is so critical when you're doing this kind of stuff, and yet it's really easy to overlook.
So we said, that's our problem, and we're actually going to add a feature to the UI. If you're shooting in Jet Set, [00:03:00] until you have a scan loaded the record button is going to turn yellow with a crosshatched scan symbol on it to remind you to scan, and it will do that every single time you move the origin. If you reset or redo the origin, it's going to change your record button, because the scan is so pivotal and yet so easy to miss.
And that's kind of why we're having to try to patch together an external scan here. You ran into this on your shoot, but we are taking steps to re-engineer the software so that people just always have it. It's so fast to get that scan, and it's so valuable, that we just want you to always do it.
Yeah. Just like, whoop, there, boom. And you don't need much. You don't need to stop the whole project and do a whole room scan. You just turn on the scan button and wave the camera around a little bit. It's going to be fine. That's going to be enough to get you going. So, [00:04:00] okay.
What I'm going to do first, since we have to do it with an external OBJ file, is go make that file from your scan. So let's see if I've got the, oh, actually that's right, it's an FBX file. So I'm going to clear out my Blender. You guys can see my screen, right?
Yeah, definitely. Okay. So I'm just going to import our FBX file and this was over in our, let's see. It wasn't here. Is that in there? Oh,
That date looks correct, 11/26, so it would be within that folder.
And there's a, yeah, the LiDAR model, maybe. Oh, there it is. There it is. [00:05:00]
Okay. And this is probably where I have to be a little careful on the scale, so let's see if the scale comes in correctly. Okay, that looks about correct. All right, I'm going to check it by adding a two-meter cube. Okay, that's right. So the set was probably about three meters wide.
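(As a reference, that scale sanity check amounts to something like the following Blender Python sketch; the FBX path is a placeholder, not the actual project file.)

```python
import bpy

# Import the set scan (placeholder path, not the actual project file).
bpy.ops.import_scene.fbx(filepath="/path/to/set_scan.fbx")

# Blender's default cube is 2 m on a side, so one at the origin makes a
# quick visual scale reference next to the imported scan.
bpy.ops.mesh.primitive_cube_add(size=2.0, location=(0.0, 0.0, 0.0))

# Print each mesh's bounding-box dimensions to confirm the scan came in
# at one Blender unit = one meter (the set here should be a few meters wide).
for obj in bpy.data.objects:
    if obj.type == 'MESH':
        print(obj.name, "dimensions (m):", tuple(round(d, 2) for d in obj.dimensions))
```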
Okay. So let's take a look at the orientation. There's our scan. And what we're going to want to do is have it line up with the origin. Over here, you mentioned that that was where you set your origin. Indeed, but before you go any further, can I also mention that initially, when I sent you an email via Dropbox, I also sent an OBJ file of the scene. Let me start again.
That is the LiDAR scan from before we shot that day, but as the day [00:06:00] progressed, the set changed, i.e. it was the same set, but new items and props were introduced and lights were moved around, and then an OBJ was created, which I told you via email I believe is the right one, because the geometry looks like the set of my shot.
Does that ring a bell? Okay, let me pull this up. This is why it's good to debug some stuff in real time. So let's take a look at our Dropbox over here. Let's see. So we have, I think it's this one. Oh, okay. Yeah, that's the PNG. Okay, that's right.
That was the preview, okay, that was the SynthEyes one. And then this is the zip. Is this the correct folder? It's inside the zip, sadly. C PT Auto. Well, that's fine, but I think I've already got that zip. Let me double-check [00:07:00] our file name: AutoShot Export 2024 1126. Let's take a quick look at that.
User files. Okay, so let's take a look at that real quick.
Okay, so that's the same one, right? AutoShot Export 2024 1126. So inside we've got our project, we've got our LiDAR model, and that's where I found an FBX file and the matching FBM folder, I think. Forgive me for interrupting. If memory serves, it's definitely inside the project folder.
It may be within the calibration folder. Actually, no, it wouldn't be in the calibration folder. It would be in the folder where AutoShot puts the scans. I think it's in Sequences, my project, [00:08:00] Sequences, my project, and my project again. Oh, forgive me, no, it's not the source data. Okay, so let's go back up and it'll probably be in... there we go.
There we go. Ah, okay, that's the one. Yeah, that's the one I've been using, and the scan that SynthEyes for some reason just doesn't align to. Let's import that guy over to Blender: File, Import, OBJ. The scan comes in
at one-to-one scale. [00:09:00] Okay. So let's see. And this was generated as, okay, so this is a Jet Set scan, right? Okay. That definitely looks like the setup, because there were two stepladders, and I know you can't really make them out, but yeah, definitely in the middle of that scene there are two stepladders where the actor moves from right to left.
So right now this is assuming that the origin was right here. Let me understand how this works. So our X axis is over here. Okay. Let's see what happens when we add that to our AutoShot import. So we're going to change that. I'm going to go up to, [00:10:00]
grab that guy. Actually, this was the original in and out point you mentioned, so I will tell it to regenerate the frames, and let's try this out.
Okay. So let's see if this all makes sense. Okay, this is where it [00:11:00] got weird. Indeed, yeah, I'm glad you said that. I thought it was something. This is where it got weird, because somehow, and this is the part where I was scratching my head, the camera is under the floor. And I remember thinking, huh, not sure how that happened. So let's take a look at the take to see if we have the tracking.
I'm going to look at the information for the take. It looks like the tracking was normal, so it was mapped. That's interesting. And what scene locator do we have on this? We're on the origin. So this is where I begin to scratch my head. If it's on the origin, and the origin was set on the floor, then how is the camera under the floor?
This is the part where I began to ask distinct questions as to what was going on. Interesting. I'm just trying to figure out how [00:12:00] that came to be. Interesting. I mean, let me just clarify: there was no camera under the floor. There was definitely a rig on the floor. Right.
Right. But yeah, this is where I got stuck, and I thought I'd throw this out to you earlier. Yes. Honestly, maybe the tracking got knocked off kilter, and what's strange is it says the tracking was okay. Okay. This is one of the multitude of reasons why we're going to move to compositing in the camera.
So you see everything while you do it, instead of running into something like this, where you're trying to figure out something that didn't line up. Yeah. Oh, actually, let's take a look at the shot, because maybe that will tell us some information. I don't know if you had a model loaded in the shot, but we can look. [00:13:00]
Open up the, I don't think I provided a model, to be fair. I think on the day, the workstation that Unreal Engine and AutoShot were running on had all the models, unless you're referring to something else. Okay. So here's the shot. This is what was coming in on the iPhone.
So then
some of that tracking looks reasonably okay.
Actually, it looks very, very solid. It's just a little bit of slipping towards the end, or, I don't know if that's camera shake, or,
And the other interesting thing is, [00:14:00] that was using a USDZ file called City Street.
Do you have that USDZ file anywhere? I personally don't. It's on the workstation that we used on the day. I'm sure I could get that for you, not this minute, but I can make a note to try to get it to you as soon as possible. Yeah, yeah. So you can see why I'm scratching my head.
The track says it's okay, but it says it's underneath the floorboards, and it clearly wasn't. So I'm going to try a couple of things just to see if we can get an idea. What I'm first going to do is an RZ 180, rotate 180 degrees around Z, and look through our camera to see if we get lucky and this starts to align.
Um, so here's,[00:15:00]
you know, that actually does look sort of like the crease in the... sorry, it's hard to tell because it's a limited chunk of stuff.
So I think just in that shot you can see the top of one of the stepladders on the left-hand side, bottom left corner, right there. You see the top just beside the, what is it, the green screen that we placed on the roof. Oh, okay, okay. And I'm hoping that could probably be aligned with some of the geometry in the scan.
Again, it's very hard for you guys to envisage the geometry of the stepladder because it's incomplete. But maybe I can, and I think my next question was going to be, I think I saw in some SynthEyes tutorials that you can match up the LiDAR scan to your [00:16:00] footage manually. Now, it doesn't seem like the most accurate of processes, but am I right in thinking this is possible?
Yeah, yeah, and this is what I'm kind of thinking through. What I'm going to suggest is, let's actually just try processing this in SynthEyes, and if we get a locked track, then you can re-transform the track in your 3D application of choice. Because this is going to be one of these things where we're going to have to figure out a workaround, since something was askew; again, we're under the floorboards.
I don't fully understand that, but I think we can get something to work. So let's go over, we're going to go to the others tab, and I'm going to hit Save and Run. All right, we'll bring all this information in. So then I'm going to click on the SynthEyes script that it generated.
There it is. And let's open up the SynthEyes folder. [00:17:00]
And also, am I right in thinking it's best to use that OBJ file we were just looking at, that OBJ scan, rather than the initial LiDAR FBX file? You know, I'm for anything that is going to be close to a coordinate system match; at a certain point it's a question of what you can find.
The part that, again, puzzles me deeply is that the tracking data is under the floor, so something's not right at kind of a core level. Actually, you know what, let me think about this for a second. Let's look at Blender. What happens if we make some guesses, frankly, because at this point we've got to guess a little bit.
What happens if I grab hold of the scan? When you bring things in, you can usually offset them just by adjusting things. Whoops. [00:18:00] Oh, I see. So right now I have the scan parented to this object. Let me think about this for a second. Okay, there's the scan.
I'm going to actually unparent it for a second, clear parent and keep transformation. And what I just want to do is get an idea in Blender of where things should be, and that way in SynthEyes we can modify things to try to fit. So what I'm going to do first is try to put the camera above the ground, right?
So I'm just moving it. I'm making a guess, frankly, that it's pretty close to the ground, but not exactly on the ground. And then I'm going to look through this and just see where we are. Let me also help you out a little bit. If you can imagine that scan being a square room, the camera would have been on the front left-hand [00:19:00] side.
So right now, where this crosshair is, to the right of that crosshair, that corner of the LiDAR scan, excuse me, would be one of the edges of the room, but the camera itself, if we can imagine that being a square, was on the left-hand side of that floor.
Again, if you imagine, yeah, exactly, the rotation of the external scan. So I can rotate this guy. So you think it would be somewhere around here? Not necessarily, but let me see if I can word this. So the LiDAR scan itself, can we say it's like a triangle? Imagine it being a square.
So [00:20:00] exactly. If you just drop your mouse cursor all the way down to that bottom corner and move a little bit to the right, to complete that square, that's probably where the camera was. Okay. From the corner, yeah, looking upwards. Put it there. Yeah. Let's move it to that neighborhood and take a look at where it was.
And the camera itself should, so this footage should be in front of the LiDAR scan's ceiling, or roof, if I can call it that. So I think what we'll do, ultimately, is first get a track and then kind of [00:21:00] put it back into the real world, you know?
Okay. Because the nice thing is, I think what we can do is first take a look at how high that ceiling is, then make some geometry in SynthEyes and project the points onto the geometry. We're going to make some estimations here.
They're going to be kind of fun. So let's go see how high this roof is. All right, you are about 2.56 meters high. I'm going to make a note of that, because what we're going to do is just fake it. 2.56, thanks. I'm going to put some geometry in SynthEyes that is just going to be at 2.56 meters,
and that's going to be the height of the ceiling, because these are the knowns we're looking for. From the scan, we know that the height of that ceiling area is somewhere in the neighborhood of 2.56 meters. We're going to [00:22:00] make a lot of approximations here, but we're going to try to get ourselves into the ballpark and see how well this solves.
So even though we don't have scan data for that ceiling, all we need is geometry that's at the correct location to be able to project our points onto it. Scan data is useful, but it's not completely necessary if we know the height. Even this partial scan is invaluable, because we know that the height of the room is about
2.56 meters. So we're just going to use that information in SynthEyes. Let's go to SynthEyes, and we're going to run the script we just generated. There we go.
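(To illustrate the idea rather than the exact SynthEyes clicks, a proxy ceiling plane at the measured height looks roughly like this in Blender Python; the 2.56 m comes from the partial scan above, and the plane size is an arbitrary guess.)

```python
import bpy

CEILING_HEIGHT_M = 2.56  # measured off the partial scan in the session above

# A proxy plane standing in for the missing ceiling geometry. All we need is
# a surface at roughly the right location to project tracker points onto;
# it does not have to come from the scan itself.
bpy.ops.mesh.primitive_plane_add(
    size=4.0,  # rough guess at the set width, in meters
    location=(0.0, 0.0, CEILING_HEIGHT_M),
)
bpy.context.active_object.name = "proxy_ceiling"
```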
All right. So there's our camera. You guys can see SynthEyes, right? All right. We're going to be doing a little bit of winging it. So let's take a look. We're going to put it into quad view. All right, so here we just relocated our shot in [00:23:00] this, and I am going to switch to our camera and graphs view.
Let's see if I can do a quad. I'm going to do a quad view with graphs, or dual with graphs. How's that? Okay, there we go. And I want the, can I pick the top view? This is where I run into SynthEyes questions. How do I tell this guy to be 3D?
How do I tell this guy to be a top view? Oh, let me fight SynthEyes for a second. Okay, hang on for just one second. No worries.
You're trying to change the orientation of that viewing box, right? [00:24:00] Yeah. I think you're on the right track by right-clicking on the word top; it's got to be in some of those options, because I think I managed to do it somehow. 3D? No. Okay, I may have imagined it. No worries. Camera panel. Let's see.
Where's the key? Would you have to be in the 3D view? What is it, Tab? Let me try that over there. Oh, no. Okay, no, it's not that. I'm going to see if I can figure it out myself as well. Yeah, so what we're going to do is grab hold of the camera and move it upwards. We're going to get rid of our solve velocity, go to our solve path, look at our Z, and then we can hit, I think it's frame, or is it Z?
Oh, [00:25:00] I've got it. You got it? Yeah, so just above the word top, there's a little gray rectangle. Ah, there it is, top. There we go. Okay, ha, there's a win. Now then, I want to find, where is the fit command? Because I can see my X, Y, and Z paths, and I want to select my Z.
So I've got that, I've got that selected,
There we go. You got it? Yeah, I can grab the zoom and move it down. Okay. Life goes on. Wait, how did you do that? I was just experimenting; I grabbed, of all things, this, and then scrolled it up and down. So let's actually see if I can see all these things. So I think what that is doing,
yeah, that's just scaling my zoom and view. Okay. All right, I'll take it; I'm not going to fight SynthEyes on this one. So then I'm going to go back here and [00:26:00] find our front view, there we go. All right, once more with feeling. We're going to go into 3D and I'm going to create, well, actually, first let's move the camera up.
I'm going to grab all my keyframes. There we go, I think I've got all of them. Okay. And then I can grab hold of this and move it up, so the camera's just going to move vertically. I can make a guess about where our camera is, because it's the frustum that we care about. There we go.
Okay, maybe I don't need to be up that high. The frustum is this pyramid coming off the camera, so I'm actually going to drop this back down a little bit, so we're kind of grazing the ground here. I think that's about as low as the camera is realistically going to be. So now I've basically just moved the camera here, and [00:27:00] the rest of the shot is unchanged. What I'm going to do now is, we want to have geometry overhead, and right now I don't think we have any. Let's see.
Can I just point out, if we look at the geometry from the front view, we can see that the ceiling slightly dipped. Okay, there's some geometry there. Yeah, that is the actual green reflector; that's where the green reflector was on the day we shot. Okay, well, in that case, let's put geometry right there.
So instead of making a box, let's make a plane. Let's go to the top view so it makes the plane in the right spot. So, make plane, and I'm going to drag out a plane over here so we can [00:28:00] see it. I'm going to go back to our front view, there we go, and I'm going to move that plane upwards.
So I'm still holding onto the plane, I'm going to click translation, and let's move our Z axis upwards. There we go. Do you guys see this on the screen? Yep. And we are well into the world of winging it, so we'll have to see how all this goes. Once we've got that, I'm actually going to remove the scan.
Um, I think I can hide it. You can hide it. Yeah. Where's that hide command? It should be right next to it in the outliner. Wait.
I'll switch to,
oh, that's right, where is that [00:29:00] guy? Layout. Hierarchy. Okay, so there's this guy. Okay, so then let's take a look at what it looks like. I'm going to hit play and cache some of these frames
as it goes by, and we'll see if we're actually, it seems like we're kind of lining up.
That could be worse. What I'm just trying to see is, do we have 3D geometry at roughly the right distance from roughly the right camera? And when we do these constraint points, they're fortunately not pinned, they're only a relatively soft constraint, for what we're going to do to generate the [00:30:00] 3D points with our kind of projected-mesh workflow.
If they were fixed, rigid points, that would mean you have to be really accurate, and you would usually only get that if you have something like a full LiDAR scan of the set, which is not what we're dealing with. Okay, so let's take a look at this. All right, I think we're in the realm of reasonably close.
Right. And let's take a look at our solver. Right now it thinks that we're on something like a 30 millimeter focal length. Okay, so let's see if we can track it. Oh, yeah, and let me check our roto mask to make sure we're not... Can I ask a question, Elliot?
At this point, would it be good to change the focal length of the camera to what it was on set, or shall we just let [00:31:00] SynthEyes continue with the estimated camera? Ah, well, that script that I ran brought over the solved focal length from the lens calibration.
Okay. More accurately, what it brought over was the horizontal field of view, and that will actually be reasonably accurate. I think that's why we're seeing things line up to some reasonable degree. And then the focal length we look at is driven by the sensor width.
So what we'll do is solve it first to make sure we have a locked solve, and then we can adjust the sensor width, and I think you'll see things line up. So, okay, our roto masks are fine, because I don't think we have a roto mask in this.
We haven't brought in the roto mask. I did provide one for you when I gave you the zip file. [00:32:00] You know what, we can actually make this a little bit easier on ourselves. Now that we know the mesh approach works, I'm going to rerun this back in AutoShot; the nice thing is it's not that hard to rerun.
I'm going to pick an AI roto model, we'll use PP-Matting or something like that, so that it'll still work on the Mac. I'm going to just Save and Run again, and it's going to run AI mattes; this way it'll automatically drop them into SynthEyes, and we'll see if this behaves reasonably well.
I hope it generates a better matte than what I got on my Mac. It was not usable. That's why I sent you the Blackmagic Ultimatte alpha matte with a holdout roto, just to be a little bit more precise. Okay. That's actually a [00:33:00] very interesting question, because right now we don't have a way,
because we scripted all the pieces to get that image sequence in. I suppose what we could do, all right, I think I can figure out how to use this. What we'll do is have this generate the initial AI mattes, so we have a file sequence that's in a directory.
And then what we could do is extract yours to match, but let's see how it does. It doesn't need to be perfect; I'm just looking for a very rough roto matte. All right, there we go. Let's load up SynthEyes again, and we're going to be a lot faster this time.
All right. Script, run script, and there we go, there's our script. Let's take a look at our camera, and then we'll go to our quad view, [00:34:00] or actually camera and perspective and graphs. Where were we at?
Perspective and graphs. Okay. So we're going to do what we did last time, and change this to front. And here's our graphs. I'm just redoing what we already did to get everything in the right spot. Get rid of our velocity. There's our Z. Zoom in with our magic Z button.
There it is. Okay. And then let's drop into 3D, and let's save this guy.
All right. Yeah, go ahead and place that. We're once again going to make a plane. I'm going to switch to my top view and drag out our plane. [00:35:00] Switch to front and move our plane up.
Okay. And we can actually hide that. H to hide? No. Control-H? Well,
maybe I won't hit that. Where's our hierarchy?
Hide that. Back to our,
okay, back to our camera. So now we're back, and what I did is just the stuff we did before. Oh, I still need to move our camera. Perspective and graphs, switch this to our top view, our front view. There's our camera below ground, and let's move it above ground. [00:36:00]
Not too far.
Okay, so that's about 40. I'm going to put Z at about 29 centimeters above the ground. I think that's probably about as close to the ground as you're going to get. I mean, JP just said, right, that the top of the camera lens would be about
20 to 30 centimeters off the ground. Yeah, that sounds about right. Okay, that's cool. We're doing some guessing, so that's all right. So then once we've got the camera in the right place, we put geometry in the right place. There's our geometry, there's our camera, and we should be back where we were.
Okay, and then let's add a roto mask. There we go, there's our roto mask. That's all right [00:37:00] so far. Okay, we've got a little bit of a blocky roto mask over there. Oh, I see. Yeah, oh, it's mangling it. Oh, stop, make it stop. Wow, that's a terrible mask. Okay, you're right, that's just awful. I thought we were going to be okay there for a couple of frames.
That's pretty much a train wreck. Goodness. Okay, you know, in that case, let's do something a little bit different. But it's actually all right there.
What we could do is,
let me think through this for a second. Is it not easier to bring in another alpha matte or another roto? It's sort of surprisingly difficult to get all the frames synced up and lined up. [00:38:00] So I'm thinking through this a little bit. There's a couple of things in SynthEyes that I haven't used.
There's a green screen command that will only extract markers from the green screen area. But actually, let's take a look at roto masking. We may just do this kind of the older way. Let's add a, we have to have a default roto mask somewhere.
Okay, where are we at here? It really messes that up, doesn't it? And that's going to block us from capturing our coordinate points.
All right. And I assume you tried MODNet. Oh God, yeah. [00:39:00] That was a better one, but it's still pretty bad. Okay, you know what? Rather than fight all the mattes, I'm just going to redo this.
Here we go.
Let's do our perspective.
Okay. And we'll make our
plane. [00:40:00]
Oh, there we go.
Okay, there we go.
This guy,
Don't forget to move the camera up as well. Camera and graphs, and we want the perspective and graphs. All right, we're getting faster here. All right.
Yep, that's fine. Solve path, Z, zoom in on Z. There we are. And [00:41:00]
align. Okay.
Okay, let's save that. All right. And there's actually a way of roto masking that I should know but don't, because I never learned the manual method of doing this. This may actually be simpler. You know what, let's just see where its point tracks land, and then we can delete the bad trackers.
I will probably go and relearn how to do it the right way at some point, but let's see if we can actually get something basic to solve first. So I'm going to run our multi-peel [00:42:00] script; it's a SynthEyes script, it's over here, and it's going to sweep through and look for blips.
In the meantime, I don't know if Bernard has written a question; I don't know if you're able to answer it straight away. Oh, is that in Zoom? Yeah. All right, let me go take a look while that's crunching. Where is my Zoom? All right, there's chat.
Okay: system requirements for the AutoShot application. The requirement is an M-series Mac, so it needs to be running Apple Silicon to be able to do it; the Intel Macs won't work. Okay, so let's see what we've got here. Okay, so there are some tracking spots. [00:43:00]
Some of them are on him. We do not want those. Yeah, of course. Let me grab, get rid of a couple of those. Let's see if we have enough to track from.
Okay. And let's drop our exposure down. All right, so let's go see what we have for tracking spots. There's a few. Okay, there's some. All right.
Okay, that may be enough to do it. All right, so now I'm going to save again, and we're going to grab some good-looking ones, ones that look solid and are on the corners. Grab a few like that, grab that, and I'm just going to grab the ones that are probably [00:44:00] good ones. All right, I'm going to get a few across the frame.
Alright, I'm gonna get a few across the across the frame And, okay, so then I'm going to do, uh, track, I'm going to do drop onto mesh, so now that should be 3D data. Alright, and let's see if that does it. I'm going to go to the solver, uh, let's hit go. Alright, so that's solved. Let's take a look. Alright, so the camera is in the correct spot, still above the ground.
We have an error of about that many pixels, so hit Shift-C to get rid of our bad frames. All right. Whoa, hang on, my phone decided to leap off the desk. All right, there we go. So let's fix that. Switch to constrain or refine, and go. All right, that's [00:45:00] getting to be a smaller error. Now let's see if that broke us.
No, that looks pretty good. So let's take a look at this as we track through it. Okay, so the camera's moving.
You can see here, we're still above the origin. And
all right, now let's get rid of a couple more of the bad ones, the higher-error trackers, get rid of that. Fix and go. All right, so we're down to a 0.36-pixel error.
So that's pretty close to the realm of working. Let's get rid of a couple more of these kind of bad ones here, and go. Right now [00:46:00] I haven't gone through the individual errors on the different trackers, but now the shot's pretty much green.
All right.
Okay, I think that's good, I guess that's decent. So let's take a look at where our 3D solve finally ended up, and we'll take a look at our camera. All right, so our camera thinks it is 35 centimeters above the ground. And, oh yeah, let's go fix our focal length. So, our focal length, let's go look at, where's the shot info?
Let me see, I can't remember this. When you say focal length, are you looking to, is it the lens distortion that you're wanting to change? No, actually, I want to set in the correct sensor size. So what I'm going to do is, forgive me, I think it's [00:47:00] just above the word where it says classic.
I think that's where you type. This guy, there we go, that's what I'm looking for. Okay, so let's punch in our correct sensor size. What kind of sensor was this? It was a Blackmagic URSA Mini; give me a second, let me see if I can get the details on the sensor size.
Okay, I have 27.3 millimeters by 14.25 millimeters. [00:48:00] Actually, it wants, is the aspect ratio 16 to 9, or is it, bear with me, I think I've got the wrong one. Actually, no, hold on, sorry, I'm just confused, bear with me. No worries. Okay, so I'm on the Blackmagic website. The effective sensor size is 25.34
millimeters by 14.25 millimeters. Yeah, 25.34. Okay, that sounds good. So with that sensor size it's computing a focal length of about 30 millimeters. Does that sound about right? I believe on the day we used 24 millimeters for this shot. Interesting. All right.
Oh, actually, on the Blackmagic, sometimes you get different sensor sizes for different resolutions depending on which setting you're in, so let's see what our resolution is. So this was a 4K shot. [00:49:00]
3840. Okay, yeah, this is 3840 by 2160, so this is a UHD shot. So let's see. And I'm sorry, which, the sensor size, bear with me. So the sensor size is 25.34 millimeters by 14.25 millimeters. And is that for that mode? Because you have different settings on the Blackmagic, and each one of them will result in a slightly different sensor size.
So let's take a look online, since, actually, we found an online sensor database and we're going to incorporate it into AutoShot directly for exactly this reason, because it's, oh, good Lord. [00:50:00] All right, so let's take a look: camera sensor database.
see, um,
Okay, and then this guy, there it is. Okay, so this is super useful. This is the camera sensor database, Blackmagic, and which one was it? It's the URSA 4K. This guy, or the 4.6K? No, actually, now that you've said it, I think it's the URSA Mini 4K, sorry. Okay, no worries. URSA Mini 4K.
Okay. All right, so there's the URSA Mini 4K, and it thinks it's 22. All right, let's try that and see how that works. We're going to do 22, and then we end up with [00:51:00] a 25 millimeter focal length. And I'm sorry, which focal length did you use? 24. Okay. So, I'd like that to be closer, but I will also take it; we're getting close enough, and we've got about a half-pixel error in the shot.
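(To make the sensor-width arithmetic concrete: the calibration pins down the horizontal field of view, and the focal length SynthEyes reports then scales with whatever sensor width you enter. The HFOV below is back-solved for illustration, not read from the actual calibration.)

```python
import math

def focal_length_mm(sensor_width_mm, hfov_deg):
    """Focal length implied by a horizontal field of view and a sensor width."""
    return (sensor_width_mm / 2.0) / math.tan(math.radians(hfov_deg) / 2.0)

hfov = 47.5  # degrees; assumed value chosen so the numbers match the session

for width in (25.34, 22.0):  # the two URSA Mini sensor widths discussed above
    print(f"sensor width {width:5.2f} mm -> focal length ~{focal_length_mm(width, hfov):.1f} mm")

# sensor width 25.34 mm -> ~28.8 mm (the "about 30" reading)
# sensor width 22.00 mm -> ~25.0 mm, versus the 24 mm lens actually used on set
```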
So I think we can actually, and we can add distortion; distortion is a whole other thing to add to the workflow. You can do it, but I'd say we're already at a pretty low pixel error, and for a shot like this, I don't know if you're really going to see it that much.
But you know, you can jump into that, as it were; I think we're actually doing all right. And then you need to get, are you in Unreal? Do you want to get this back into Unreal? Yeah, I will have to. So you have just run into the first [00:52:00] part of the process that I do not have yet; I am working through that.
As we speak. I started into a tutorial to get a SynthEyes track into Unreal, and the initial SynthEyes export to FBX did not work; I could not get that to work. But I'm going to show you what I think we started to build, and I need to test it out to make sure it's going to behave.
So let me do this. I have an Unreal project, and I'll show you how I think it should work, and then we'll see how close we actually get, because I may need to finish the tutorial for Unreal next week. So, give me a second, I need to make sure I've got Image Plate.
All right. And we need Composure, there we go. I [00:53:00] need Python, nope, not that one, Python, there we go. And I need, what's that called? Media? No,
Movie Render Queue. Yeah, Movie Render Queue. So I've got all those. Okay, great. So then, all right, in SynthEyes, so you can see how the process is going to work: I banged my head on the different output formats to get from SynthEyes to Unreal, largely without success, so we ended up writing something custom.
So there's a default export from SynthEyes; we'll do an export, an ASCII export. Now, where do they hide that? Plain text, there it is, plain text. And it [00:54:00] was general X, Y, Z, pan, tilt, roll. Okay, I believe this is it. It could also be, was it camera object path?
Actually, let me look at my notes really quick to make sure I get this right.
And this is a hundred percent something I need to nail down and document. It's on the list; I just got hammered by a couple of other things.
There we go. So let's see, where was I? Oh yeah. [00:55:00]
Camera object path. I think it's camera object path. Let's try that. Camera [00:56:00] object path. Let's see if this is correct. Yeah, that's it.
All right, so let's take, okay, our shot,
and we want to see, that's fine: X, Y, Z, focal length, and channel. Okay, let's look at that. What I just did is write it out so you can see how I believe it's going to work. I'm going to open, let's pull up that file. And so here's the file that I just wrote out, the camera object path.
There we go. So this has each frame, and it has the X, Y, Z position, so there's our 35.42 [00:57:00] centimeters off the ground, and it has the pan, tilt, and roll, and it has the focal length, right?
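(As a hedged sketch of reading that plain-text export back in, assuming one whitespace-separated row per frame in the order just described: frame, X, Y, Z, pan, tilt, roll, focal length. The real exporter's column layout should be verified against an actual file.)

```python
from dataclasses import dataclass

@dataclass
class CameraSample:
    frame: int
    x: float
    y: float
    z: float
    pan: float
    tilt: float
    roll: float
    focal_mm: float

def read_camera_path(path):
    """Parse a plain-text camera path export, one whitespace-separated row per frame.

    The column order (frame, x, y, z, pan, tilt, roll, focal) is assumed from the
    session above; verify it against a real export before relying on it.
    """
    samples = []
    with open(path) as f:
        for line in f:
            parts = line.split()
            if len(parts) < 8:
                continue  # skip blank lines and short rows
            try:
                values = [float(p) for p in parts[:8]]
            except ValueError:
                continue  # skip any header or comment rows
            samples.append(CameraSample(int(values[0]), *values[1:8]))
    return samples

# samples = read_camera_path("camera_object_path.txt")  # placeholder filename
```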
And then what we did in AutoShot, and again this isn't documented yet because I needed to go through and make sure it all behaves, is there's an external tracking option. Browse and pick out that same directory. There we go. Let's try that again. Okay.
Oh, okay. Go back up.
Okay, there it is. That's our SynthEyes directory. Okay, so then let's look at that text file.
No, that's right. So we put our external tracking file here. Now let's just try it. Hey, I give this a [00:58:00] 35 percent chance of working; let's see how this works. And I don't want the external scan OBJ. All right, let's try this thing out. I'm going to click Save and Run. All right, I'm going to paste into Unreal.
I've got the Unreal console over here. All right, let's see what it generates.
There we go. Let's go to our sequencer and see if we can find our
camera. Okay. I don't actually see where the camera is... there's our camera. [00:59:00] There's our camera, hanging out. And of course this is a Gaussian splat scene that I'm debugging, which is kind of astounding stuff. All right, hey, there we go. So there's our camera, and it looks like it came through.
All right, that's extremely promising. So then let's run our keyer stuff on it. All right: content browser, AutoShot, someday we'll come up with an easier way to do this. And that was the project.
This is all straight out of the Unreal tutorial stuff. You guys have probably seen this about 400 times before. Alright, let's set our key. There we
go.[01:00:00]
And let's apply it to the image plate.
All right, there's our key. Now let's go take a look. There it is. Let's play it back. All right.
Let's see. Okay, nice. So it definitely worked this time, right? Whatever issue you were having with exporting the newly tracked camera from SynthEyes into Unreal, it's now working. I think it's starting to behave; I want to test it. This is the kind of stuff where I think it's mostly working, but I need to see it working at an exact pixel level.
I just want to show you guys the basic workflow, and this is how we will get it working. With this stuff you have to be so crazy exact that I'm not [01:01:00] a hundred percent sure yet; I want to go through and verify. But this is going to be the process by which we get it to work. It's already working to some level, and then we're going to make sure that we're pixel-accurate before I do all the tutorials for it.
So I'm just showing you how the workflow is going to work. Understood. But the key thing here is that the SynthEyes track is good, right? This is 0.6 pixels of error or something like that. That's a reasonable solve.
It looks like it's doing the things it needs to do. And then from there, getting it to Unreal, we'll solve that; it's just going to take a little while to mop up the details and make sure everything's lining up precisely and exactly. Okay. But yeah, actually, I'm glad you brought this up.
There's a bunch of good things that came out of this. One is that there was enough of a scan that we could figure out where that green [01:02:00] thing overhead was. The original tracking had something weird about it; it thought it was below the floor, but we can make some estimations, bring it above the floor, and then basically track the shot with the information we've got and get a reasonable track that's physically realistic.
And then we can drop it into the scene and put that all together, and then you have a tracked shot. So we went from "what happened?" to "all right." And I think that's one of the key things here: there's enough data that you get on set that even if something is not quite perfect, we can go through it and fix the shot in a fairly rapid period of time.
Yeah, I love the drop-things-onto-mesh track. I am extremely happy about that, because as you saw, you pick up enough pieces of information that you can then lock the shot in pretty solidly. Cool. [01:03:00] I mean, that looks great. I appreciate it, Elliot. I'm going to have to re-watch this again and again, I think.
Yeah. What I'll do is transcribe it, and I'll delete the half hour we were running around trying to find the zoom key; that's not the world's most exciting thing. But I think the basic bones of it are actually great for people to see. We had enough of a scan.
It didn't really cover exactly the part that we wanted, but there's enough information there that we can just drop in a mesh in SynthEyes. It's a plane: drop it in, line it up, and project the points. We just need geometry; it doesn't need to be Jet Set scan geometry.
That's the most useful, because everything's automatically lined up, but we can make it work without it. Fantastic. Thank you so much. Like I said, I'm going to watch this again. Would you be able to send me a copy of this recording, or are you going to hold back until you edit it before you send it out?
I'll put it into [01:04:00] Descript right after I get off the call, and I'll put it up on the site. Descript is nice because that's how all of our office hours are handled. I'll show you; I don't know if you've gone through any of these others, but for our office hours, I don't record everything, because some of them aren't as useful, but for some of these I'll just go to the office hour view and then the Descript transcript.
Oh, there we go. So if you click this, it has an automatic transcription, so you can go through and search the wording very quickly to see how it works. And there's also a full transcript below, so you can search rapidly through a given office hours and see, did SynthEyes show up, or was it more of [01:05:00] a C4D one; which program were they using here?
Was it Maya? So if you search in the transcript, you can find out very quickly on that page where people were talking about stuff. Fantastic, thank you. So I'll put that in Descript; this is a really good thing to see. Right, I think I've taken way too much of your time.
I'm going to jump out, but thanks again, Elliot, and I'll be in touch should any more issues pop up. Not at all, not at all. This is great. My big thing on this one is realizing that we needed to add something to our UI to prompt people to scan promiscuously; the scan information takes up no space.
It's like kilobytes for the scan, and it's just so incredibly useful for post. So [01:06:00] we'll modify our UI and prompt people for that, liberally. Cool, I look forward to that update. Take care. Fantastic. I actually can't wait to see this recording, because that was a lot of info; anything that I run into, I'm probably going to have to refer to this video.
And so, I'm so glad this came up, because this is exactly the kind of thing that's reality. Okay, we have incomplete data: there's some shot tracking data, there's a bit of mesh, but things aren't quite lining up. We just kind of logic it out, move the camera up, put the mesh where it probably made sense to put the mesh, and stuff works.
Right. And I know you have to go, and I do too, but Friday I'm going to come back on, and I'm hoping I can go through why my phone [01:07:00] is not linking with AutoShot. And I think this has something to do with Rob's phone, so I think it's better that we didn't get on today, you and I, because we wouldn't have been able to get anywhere anyway: the prompt to connect the phone, to trust the computer, isn't even coming up on his phone.
And I don't know. Right, and we need that, so I just stayed in and listened. I saw some of this in email and I have to take a look. By default, we are storing the takes on iCloud, and there may be a way to access them on iCloud. I just need to think about that a little bit; it's been so long since we've pulled takes off of iCloud that I actually don't remember how we do it.
Right, yeah, I'm like, I wonder if I could do that, but now I have his phone, and I'm like, oh man, I don't know how long I need his phone. Oh, so you physically have it. Okay. I physically have it, but every time I go to plug it in, the prompt [01:08:00] doesn't come up. I've literally been troubleshooting.
You don't need to do any of that, as long as his phone is on and has Jet Set running on it. Actually, you want to do it now? Yeah. We need to solve this in three minutes. Yeah, we can do it. Let's see. Okay, go ahead and share your screen, and,
something else coming down on me. Yes. And I promise I will not... Bye. Bye. I'm here. This is fascinating, honestly. Well, it's the real thing, right? It's actually solving the real stuff that actually shows up. That's just reality. I need you to know how many apps and mods and things I've logged into where I just wished I could talk to the creator who made it.
And this is so convenient and awesome, and I feel like this builds on that. So that's [01:09:00] why I'm just so grateful that you're here, because if I have a question I can just ask it, and I think a lot of places are missing that. Yes, everything is tech, but it's just nice to be able to talk to somebody.
Yeah, there's a category of problems like this that are intractable in a forum or over email. You can't get there from here. You just fire it up and walk through it, because every once in a while you chase your tail in circles for 20 minutes looking for the zoom key. But that's why I'm going to transcribe it and delete that part.
That's not useful information. The rest of this is solid gold; we just worked our way through it. You just need a tab that says troubleshooting, and then, if you're having problems with this or that, I would just go through it. You probably already ran into my problem, but if it's a new problem, you can just add it to the
archive of videos. I just love it. Yeah, it's all of our office hours. That one was a really great one, because it's really solid problem, solid [01:10:00] solution stuff. Sometimes the problems aren't particularly worth recording for later.
Those aren't going to be that informative, but something like this is really informative, and those are the ones that I put up on the page. There we go. So there's his phone, oh my God. And so then, let's pick out your project folder, right over there. Are you on a Mac or are you on a PC right now?
This looks like PC. On PC, okay, great. So, is this the project folder where you want to put stuff? Yeah. Okay, that's good, that's a perfect spot. Yeah, that's just a test. All right, well, in that case, which day did you want to sync?
Oh, which shooting day? I would have to say it would have to be this one. Okay, and you can always sync the others if you want to as well, but we can just go ahead and pick the shooting day, pick which one you want, and then we're going to click sync. [01:11:00] I think it's safe to say this one.
Alrighty. Just go ahead and click the sync button on the right, and it's going to start pulling takes; it's going to take a little while to crunch through and pull all the take data. But yeah, you're cranking, you're pulling takes, and then you pick out the take and process it just the same way we've been doing it.
And that's it. It doesn't need trusted access or anything like that; if Jet Set is running on the phone and you're on the same network, generally AutoShot will detect it. Cool. And I could just pull this into SynthEyes like you guys did?
That was going to be my next question; you answered it. There are two tutorials on this. I literally broke the SynthEyes tutorials down super tight. I'll show you where they're at. It's in the, [01:12:00] under tracking and compositing.
There we go. Let me put the link in here. There's two SynthEyes tutorials that go over the techniques we used. One is the basic intro to how you track in SynthEyes: basically loading the Jet Set take into SynthEyes, with the scan and the AI mattes and all this kind of stuff.
And the second one is called fixing a misaligned shot, and it's exactly the same technique that you saw earlier. In that case I had scanned geometry, so I didn't need to make my own mesh, but it's the same technique: adjust until things line up.
All right, good. And then we can project points and solve. Yeah. Okay. It's so amazing how simple it is. You kind of forget, like, wait, I don't have to do this stuff? I'm letting go of everything I was taught. And also, I found out the professor, his name is Josh Hall, and I've [01:13:00] contacted him on Discord.
And I'm going to talk to him further about this. I'm waiting for his response about Jet Set, because he was very passionate about it, and I'm like, maybe you should talk to Elliot. And we can show him this. This is actually a great session, because this is something kind of gnarly coming in off set.
This is realistic: something was strange, but we fixed it, and by the time we went through it to fix it, it took minutes, you know? Yeah. Yeah, I'm going to talk to him. I know you are a busy dude, I'm going to let you be; I think there's this other guy who's been waiting so patiently.
I'm sure I'll run into something, and I'll be on Friday regardless. All right, fantastic. Well, it sounds like your takes are syncing, so you're off to the races. I'm going to have to get off in a bit, but there's also Bernard Mathis. All right. Thank you. [01:14:00] Thank you. All right, thanks guys. And, all right.
Bernard, I don't think I've heard much from you. I think you're on mute.
All right. So that's probably going to be that for office hours. Bernard, let me see if you were trying to contact me... okay, I see: system requirements for AutoShot, Apple Silicon support. Yep, you need Apple Silicon for that. Okay. Guys, good office hours.
I'll talk to you guys soon.