Transcript
# Office Hours 2025-02-07
**Eliot:** [00:00:00] Good morning, good morning. All right, Jason, let's — uh, I think you're on first. Let's take a look. Are you working on calibration?
**Jason:** No — um, well, yeah. What I'm working on is: I've hooked everything up through the SeeMo 4K and I can see everything when I'm just hooked into it.
But when I go into Jetset, I'm not able to use my camera. It's only allowing me to use my phone.
**Eliot:** Oh, okay. Um, so let's see. So you're running Jetset Cine? Yeah. Okay. Right now, the key part we use the Cine camera for is the calibration process. Tell you what — let me send you a QR code.
And then what we can do is screen share your iPhone — your Jetset system — and we can go through that kind of step by step. So let me go find that, just a second. In a short period of time we're going to be able to do this all directly in Zoom.
It's going to be fun.
**Navaz:** All right. Hey, excuse me, Jason, I actually have a suggestion. I think the problem you're having is because the SeeMo is connected to the phone, and the reason you're not seeing your camera is that it's sending the video feed through the phone.
Does that make sense?
**Jason:** Kind of.
**Navaz:** 'Cause that's the problem that we had. And I don't mean to take over, Eliot, sorry, but we had the same issue: we weren't able to see the camera. So all we do is just unplug the SeeMo, because the SeeMo is only being used for the calibration.
And then once you've done the calibration, we just unplug it. Because if you have the SeeMo [00:02:00] app connected, you'll be able to see the camera on the phone — so it's kind of like using it as an external monitor.
**Jason:** Yeah, like right now — and I'll take the blur off of this thing.
**Navaz:** Yeah, but sorry, Eliot — go ahead. It's just an issue that we had.
**Jason:** Yeah, so when it's going through now, you know, I can see myself, right?
**Kevin:** Yep. Yes.
**Jason:** And I can see this is, you know, using the camera. Yep. Right? This is going through the, uh, the SeeMo. But then when I switch over to Jetset —
**Eliot:** Mm-hmm.
**Jason:** — and I just pick anything, I see everything, but it's only using the phone, the iPhone. As far as I can tell —
**Eliot:** I'll tell you [00:03:00] what's going on. You are seeing the future, in a funny way. Right now, when we hook up the SeeMo to Jetset, we're actually only using it for the calibration phase, and we don't yet — not yet — composite the Cine video live in the phone as you're tracking. We are actually working on it.
Literally as I say this. So the process right now is we pipe the Cine video through, and I can screen share and walk you through the calibration phase. Then once you finish the calibration phase, it paints a little yellow reticle — a little rectangle on your screen — that shows you roughly what the Cine lens field of view would be, overlaid on your iPhone.
Kind of like a rangefinder, and that's kind of how we had to start doing this to get it to work. Now, of course, the first thing everybody says is, "I want to actually see the live Cine video comped in the iPhone," [00:04:00] and that's coming —
**Jason:** Coming soon. So what you're saying is the camera is pretty much obsolete right now. Is that correct?
**Eliot:** No, the —
**Jason:** Only just for calibration?
**Eliot:** Yeah. Right now we use the Cine camera to calibrate and to calculate the offsets between the iPhone and your Cine lens optics. We need to know exactly what those Cine lens optics are doing, and we calculate the offset.
But after we do the calibration, you can actually unplug the live feed from the Cine camera, and you'll see an aiming reticle on Jetset — kind of a rangefinder view. And yep, it's confusing, and yep, this is why we're doing —
**Kevin:** yep,
**Eliot:** this is why we're building compositing in the phone for this exact hundred percent reason.
So you have just wandered into the future. But I can get you through the present in terms of [00:05:00] calibration and shooting. That's what you're seeing right now, and that's why it's a head-scratcher — 'cause it is. So —
**Jason:** So I guess my biggest thing is: in the future, when everything is worked through, I'd be able to use my Sony as opposed to using the camera on the iPhone. Is that what you're saying?
**Eliot:** Yep. And you can already use the Sony — that's the whole point of Jetset Cine. When we're compositing that Cine footage into a 3D world, we need to know exactly what the optics of your Cine camera were.
So you're running — what is it, an FX3? FX3, FX3. Great. And it was like a 40 millimeter or a 50 millimeter lens on it, something like that?
**Jason:** This one is just, uh, what I threw on it — it's like a 35 to 150, but you know.
**Eliot:** Okay. Yeah. So whatever lens setting you have in there, we need to [00:06:00] know the exact optics of that.
And that's what the Jetset Cine lens calibration does: it computes the offsets, the optics, and the distortion, and after that we know everything about that lens. Then when you roll a take in Cine, we store that data so you can bring it into the 3D world and have everything line up correctly.
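The core idea here is a fixed rigid offset between the tracked iPhone and the cine lens, plus solved lens intrinsics, stored with each take. A minimal sketch of that composition in Python with NumPy — the dictionary keys and values are hypothetical placeholders, not the actual Jetset data format:

```python
import numpy as np

# Hypothetical calibration result: a rigid offset (4x4 matrix) from the
# iPhone tracking frame to the cine lens, plus solved lens parameters.
calib = {
    "phone_to_cine": np.eye(4),   # would come from the Cine calibration step
    "focal_mm": 35.0,             # solved cine focal length
    "distortion_k1": -0.05,       # example radial distortion term
}

def cine_pose(phone_pose_world: np.ndarray, phone_to_cine: np.ndarray) -> np.ndarray:
    """Compose a tracked iPhone pose with the calibrated offset (both 4x4)."""
    return phone_pose_world @ phone_to_cine

# For each tracked frame, the cine camera pose is the phone pose composed with
# the fixed offset; intrinsics/distortion are applied when rendering the CG.
phone_pose = np.eye(4)
print(cine_pose(phone_pose, calib["phone_to_cine"]))
```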
Oh, there we go. All right, let's see — Jason, there you go. And you can record on your machine. So the current setup is designed for an on-set preview: you can actually see, with a yellow aiming reticle on the phone, what the Cine camera is going to see.
It's not exact — it's a rangefinder. And then in post we bring all those pieces together. What's your post pipeline — are you using Blender or Unreal? Well, I just started, so I think I'm going to use Blender. Okay, that's great. Yeah, that'll work great.
And I can show [00:07:00] you — I went through a full calibration in one of the previous office hours, and we can go through it again. Let me put a link in here, because this was a really good piece-by-piece, step-by-step walkthrough of the whole calibration process.
With another user — there we go, I put it in the chat — and I'll screen share it so you guys can see it. Let me screen share this, because we went through it absolutely piece by piece. Let's share it.
All right, let me go find this. There we go — share. All right. So the way the office hours recordings work is you can just click this button and it shows you — they're all transcribed — so we can just look down [00:08:00] through the course of it.
So we went through the setup — my little kid came and jumped on me partway through — and then we go through the different parts of calibration, including a screen share to walk through his calibration phase. We have a live link into his phone, and what you see here is what the Jetset Cine calibration interface looks like.
Again, this is running over kind of a remote link, but we're going through the room and capturing a few different images of the furniture from different points of view. And here's where you see both the iPhone [00:09:00] video, and down here is the Cine video coming in live.
Then we go through that calibration process: capture a few frames, calibrate the optics, and later on we go all the way through solving this, and then we go into Blender. Let me go find the Blender part.
We're going through all the details here — let me see if I can get to the Blender part. Oh, there we go. So what we end up doing is we process the shot and bring it all the way into Blender, and then we can see how the 3D scan of the room lines up with the footage.
And then you can start stacking stuff together. So that's a great minute-by-minute breakdown of the process. [00:10:00] That'd be a great place to start off, or we can actually get you up and running — I do want to try going through at least the initial part of the calibration.
**Jason:** And you sent me a link for it, right? Or is it —
**Eliot:** Oh, yeah, I put it in the office hours chat. Let me see, in the Zoom.
**Jason:** Yep. I see it right here. I can probably do. How long does it usually take if you run through with me?
**Eliot:** It takes a little while the first time. Because we're walking through it and talking through it, it's probably going to take a half hour the first time.
You might want to take a look at that office hours recording first, because it's going to be exactly — word for word, note for note — what we'd do.
**Jason:** And then I can probably run through it myself, because I have another call. That was just my only concern — I was not seeing what I thought I was [00:11:00] supposed to be seeing.
**Eliot:** Honestly, you and the entire rest of the world that has worked with it — it's like, "Hey, wait a second, where's the live feed for the camera?" And I'm like, yep, guess we'd better do that. So we're doing it. There's no choice about that, because among other things, we thought that little aiming reticle was going to be super clever.
But when you get past about a 50 or 60 millimeter lens, the reticle gets super tiny in the middle of the screen, because the lens has a lot of magnification. So then you're trying to aim a shot with an aiming reticle that's an inch and a half wide. Not great. So we're cranking on it.
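For a rough sense of why the reticle shrinks, the overlay width scales with the ratio of the two fields of view. A small Python sketch — the focal lengths are illustrative full-frame equivalents, not values from the call:

```python
import math

def reticle_fraction(phone_focal_mm: float, cine_focal_mm: float) -> float:
    """Approximate width of the cine field-of-view overlay as a fraction of
    the phone image width, with both focal lengths on the same sensor basis."""
    sensor_width = 36.0  # full-frame-equivalent basis for both lenses
    phone_half = math.atan(sensor_width / (2 * phone_focal_mm))
    cine_half = math.atan(sensor_width / (2 * cine_focal_mm))
    return math.tan(cine_half) / math.tan(phone_half)

print(round(reticle_fraction(26, 60), 2))   # ~0.43 of the screen width
print(round(reticle_fraction(26, 150), 2))  # ~0.17 -- a thin sliver
```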
So the good news is that you can get, get up and running, you know, while we're, we're finishing this part of it, the calibration process is unchanged. It's just after the calibration process, uh, then you're going to be able to actually see the, the live track comp, but this first part of the calibration process, a hundred percent unchanged.
So we might as well get that up and running.
**Jason:** Yeah, I'll do that a little bit later in the day. And that's in the [00:12:00] notes right here — I see it. Let me make sure I'm on the right one. It's the —
**Eliot:** The 12-20 office hours. Yeah, that's it. That's a good one — I'm glad we recorded that one, 'cause it goes from zero to a shot tracked.
**Jason:** All right. Well, that was my thing. I guess I'm done. I'm going to read through this a little bit later and go through it, and I'm quite sure I can get it calibrated.
**Eliot:** Yeah. And even if, uh, if you run into something, uh, then, then you kind of see what the process looks like.
So then on the next office hours we can just figure out where you got stuck in the process, and we'll cook through it. Okay. All right, man. Well, I appreciate it. Hey, no worries. Thanks, Jason. Let's see — Umuru, I see you're linking in. I want to make sure, because I know it's super late where you are.
So can you tell me what's up?
**Umuru:** Hi. [00:13:00] Hey, uh, I prepared, uh, one scene for exporting, um, from Unreal to USD.
**Kevin:** Mm-hmm .
**Umuru:** But before that, I want to show you, uh, my problem, uh, concerning the, the scene lock.
**Kevin:** Mm-hmm .
**Umuru:** So I, I shared my, my, uh, my, uh, ancy screen.
**Eliot:** Okay. Uh, I don't see it yet.
Um, could you? Yeah,
**Umuru:** I, I, I prepared to, to, to share it. Okay.
You see it right now?
**Eliot:** Yep. Okay.
**Umuru:** You see, take this scene, for example, look at the foot.[00:14:00]
**Eliot:** Okay. All right. Oh yeah. That's, that's a little off, isn't it? Yeah, he's moving, isn't he? All right. Yeah. That looks like, uh, something's off in the, in the Z axis. Okay. So, um, let's, let's go take a look at, um, I think where's a good place to start. Can you screen share your unreal scene?
**Umuru:** Uh, okay. I open it.
Can I share multiple screens on Zoom?
**Eliot:** You know, I think so. We can try it, and then we can learn how to toggle back and forth. I'm on a voyage of learning with Zoom myself.
**Umuru:** to put a screen on.
**Eliot:** I don't see it.
**Eliot:** You can always also just stop sharing one and start sharing another. That's okay too.
**Umuru:** Oh, okay.[00:15:00]
Unreal is opening.
**Eliot:** Well, while Unreal opens, I can talk a bit too. Let's see — Kevin, are you debugging something? Should I, while we're waiting?
**Kevin:** Mostly, you know, I'm trying to learn through others' mistakes. I'm trying to see if I can get this ready to go in a production environment.
And everything that can go wrong will go wrong, and lots of people [00:16:00] will be staring at me. So it's tough, because I'm kind of trying to do this between all the other regular work that needs to get done. But I feel like if I'm going to say, "Yes, we should try using this Jetset thing,"
then I need to know it very well. And at the moment I'm super green. So —
**Eliot:** Do you have a camera at home? I forget if you've got like a little mirrorless camera or something like that. Do I? Yeah. 'Cause the key thing to do is just rig up and, uh —
**Kevin:** Yeah, yeah. So I was messing with just the app — I haven't gotten the stuff in yet to plug in my camera, but I have a proper camera, so that's not the problem. It's just, you know, hearing little things. I guess I had small questions, but they're not, uh —
**Eliot:** All right, let me jump back in with Umuru.
All right, so let's take a quick look. Let's jump back to the Unreal scene real quick. [00:17:00] All right. And you can unlock your camera cuts. And let's take a look in the 3D view — if you highlight your scene loc... what is that? SceneLoc Back Center. Okay. Where is that in the 3D scene?
Let's take a look at where that is in your 3D scene. Okay — let me come out to this. It's here. All right. And it looks like it's reasonably well aligned with the — let's take a look. Is that aligned with the bottom of the... because we're running into problems. Oh, that's interesting.
Hang on.
**Umuru:** Wait, maybe it's not this, uh, this scene. Let me verify clearly. It was the [00:18:00] second one.
**Eliot:** It seems something a little funny.
**Umuru:** What?
**Eliot:** Uh, well, when I look at your scan, it looks like your origin is a few inches above your scan floor. Maybe I'm just seeing that funny.
**Umuru:** Let me take, uh, one other scene. This is — I think I have changed it. Okay.
Uh, I cut those.
**Eliot:** Thank you for scanning, by the way. Ha ha.
**Umuru:** Ha ha. Oh, for what?
**Eliot:** Oh, having the 3D scan. It makes all the difference if we have to fix anything — such a big difference, because then we know where it thought the floor was.[00:19:00]
**Umuru:** Yes, this is another scene. I hid the scan.
Okay. If I look through the camera —
it's not... I changed the distance transform.
Take this, for example: the feet are not perfectly [00:20:00] locked to the ground.
**Eliot:** All right. Let's take a look at, uh, let's take a look. Can we look at the scene from the, from the, the sideways view? Yes.
**Umuru:** All right. These are the scene locs.
**Eliot:** All right, so let's take a look. Is the scene loc aligned with the ground floor? Yes. Okay, that's on the ground floor.
Okay. So that should be aligned — SceneLoc Back Center. Okay. Then let's pull out just a little bit. And is there a scan on this one as well? Yes. This is the scan.[00:21:00]
**Umuru:** I think it's locked on the ground. Reasonable.
**Eliot:** Okay.
Interesting. Okay. On this one I might want to take a closer look at the take. Can you zip up the take and send me a Dropbox link? Then I can pull it down — I want to look at it. Some of these things are a little hard to figure out remotely, but let me look at the take and see if we have any foot slippage.
If we do, then that tells me I need to get our refinement-tracking video done sooner. I have to send it now? Oh, no, you can send it after the call. It's just going to take me a little while to download it, open it, and test it on my end. Which take is this?
**Umuru:** Uh, this is, uh, [00:22:00] it's, uh, the number, uh, uh, 14. Okay.
**Eliot:** 14. Okay. Yeah. If you could zip up that take and send it to me, then I can look at it. Um, and which, uh, are you compositing in After Effects? If you can remind me what you're compositing in. Fusion. Fusion. Okay. Okay. Great. Great. Okay. Um, yeah, I'll, let me take a look at it. I don't quite know what's going on yet, but, uh, if I look at it, if I unzip the take and look at it, then I can probably figure it out a little bit better.
Okay.
**Umuru:** I'll send
**Eliot:** it.
**Umuru:** Uh, can I open now the scene that I want to extract the USD? Yeah, yeah. That sounds good. Okay. Where is it?[00:23:00]
Take, for example, this scene. I hide this scan. I take — maybe this one. You see my screen? I can see your screen. Okay.
This one, maybe I take.[00:24:00]
**Eliot:** Now, are we, are you going to put those in a, in an export layer?
**Umuru:** Yes. I put it on export layer. Quickly.
Okay. I take this.
Okay. For example, this, uh, this selection. I add the floor.
Now everything is selected. I go to export
**Eliot:** selected. Let's, um, let's make it easier ourselves and put it in a layer first. In a USD export layer. That way we can recreate the selection very easily.
**Umuru:** It's, it's on, uh, layer. Okay. I put it here. Okay, great. [00:25:00] Uh, export. Selected USD. Mm-hmm . Uh, put it on, uh, I project.
Uh.
I set USD, I call it — I check the material, Y-up —
**Eliot:** And check — actually, in this case, remember you don't need the Unreal materials. What you really want is the USD Preview Surface.
**Umuru:** Uh, it's a mistake. Sorry. Oh, no worries.
**Eliot:** So go ahead and, uh, [00:26:00]
**Umuru:** previews this, this one.
**Eliot:** Yep.
**Umuru:** That's the one. Okay.
I send it. Okay.
**Eliot:** Yep. That sounds good. Okay.
I'm going to do an updated tutorial because in Unreal 5. 4 and later, uh, the internal Unreal exporter has got a lot better. So we, we may not need the, the Omniverse exporter for many things. But this will work, this will work, this will work now. Plus on the Unreal one you can control a level of detail, which is sometimes helpful.
**Umuru:** I forgot the scene loc — it's not a problem, right?
**Eliot:** Oh, we very much want to export the scene locator along with everything else. So make sure those scene locators are included in the layer. Okay, [00:27:00] I'll let it finish after I select the scene loc. Um, let's see — you can go ahead and cancel, and then let's make sure the scene loc is selected along with everything else.
It's good to put it all in a layer. Then we can look at the layer and make sure we've got everything before we export.
**Umuru:** Okay. I select the scene loc, edit, uh —
create, uh, uh, select, uh, Yeah, I will
put it for a while. Okay, let's go.[00:28:00]
I'll share the AutoShot screen now.
**Eliot:** Alrighty, let's make sure it's completely done exporting from Unreal. It's complete. Oh, done? Okay. All right, so let's go to the Models tab.[00:29:00]
**Umuru:** it. You can hit refresh. Oh, is that the correct directory?
**Eliot:** Yes. It's there. Okay. All right.
That's kind of weird. Can you hit refresh on the — maybe I open it from here. Let's just take a look at those USD files. It's kind of weird. It is here. Yeah, I see it. Um, can we open up the directory and just look at these USD files, to make sure there's nothing really strange going on?
If the USD exporter isn't exporting a USD file, then we've got other problems. But okay, let's see — your Windows Explorer screen isn't sharing. [00:30:00] Maybe share that, so I can see the files in Windows. Oh, okay. That looks like a bug on our part.
All right, hang on. Let me screen cap that — snip — and what version are you on? Downloads. Okay. Actually, can you move the Windows Explorer window to the side so I can screen cap the AutoShot screen? Do you see the AutoShot now or not? There we go, that's it. What I'm doing is — I think that's a bug,
'cause AutoShot isn't showing the, uh — there we go.
**Umuru:** All right. I sent this to Greg.[00:31:00]
**Eliot:** Okay. All right. So let's go back — let's try making a USDZ.
**Umuru:** Okay. You see my AutoShot screen? Yep, I can see it. Okay. I select it, I load the texture size. Okay — Make USD.
**Eliot:** There we go. It's crunching all the textures. There we go — now we've got a USDZ. So now you can highlight the Chamberly one, and then push that to Jetset.
There we go. Let's, uh, let's try opening that in JetSet. Um, and then, I'm gonna [00:32:00] give you a, uh, you can screen share. I'll screen share my screen. So if you, um, if you want to, you can, um, uh, you can exit Jet Set and then open up your normal iOS camera app and then just point that at the, uh, uh, at that, um, QR code.
Okay. And that'll screen share your Jet Set screen to over to me so I can see it.
**Umuru:** Yeah, yeah. I don't know. I can get a see the play.
**Eliot:** You send me the QR code? Uh — yeah, look, I have it screen shared. Do you guys see it? I see it. Yeah.
**Kevin:** You have to click on the tab at the top of Zoom — there'll be a tab for Eliot's screen.
**Eliot:** We are one release away from me, me being able to send a link, [00:33:00] uh, that to the, to the phone that opens it up directly in zoom, almost a lot of work. Getting it through the Zoom approvals, getting it through the app store approvals. Anyway, can't wait. Ha ha ha ha ha ha!
**Umuru:** It's open in Jetset, yeah.
**Eliot:** And it'll ask you if you can record, there we go.
Alright, so let me share my screen so you can see what I'm seeing. Uh, there we go. Screen share. And go back to here. Out in the park. Okay, so this lets, uh, lets me see kind of what you're, what you're seeing. So let's go ahead and let's go to your main menu. And, uh, and let's click, uh, model.
And Open. And there's Chamberly. All right — there we go. And let's reset your tracking origin, 'cause right now your [00:34:00] tracking is... I don't know where the scene locator went. There we go. Well, all right, you've got a scene, but where's your scene locator? That's weird. Okay. I'm going to want to look at that file.
Um, let's go back to your, uh, the Windows directory that had, uh, your USD file.
You see it? Yeah. Um, okay — Chamberly. Can you just send me the chamberly.usd file? Okay. Through the chat? Yeah, it's tiny — that's a small file. I think you can send it through chat. [00:35:00] We'll find out.
So what I'm going to do is open it up in USD View.
**Umuru:** And then I have to use the view on my computer. How do I call the —
**Eliot:** The launcher. Oh, the Omniverse Launcher — O-M-N.
You're gonna — what, O-M-N? Yeah. Let's back up a couple. Oh, there we go.
Sure. Let's see. It's spelled L-A-U-N — or you can go through Windows; we can look [00:36:00] for it there. Under Windows. Okay.
**Umuru:** There's the guy. Yeah. Omni Omniverse. Yeah. Omniverse.
**Eliot:** Yeah. Nvidia N. There it is, kind of a strange name.[00:37:00]
Then we go to the launcher and we can go to your library and look at USD view and launch that.
Yeah, there it is. This is great. This is Pixar standard viewer for USD files. So this is my first port of call for, uh, for looking at what's actually inside a USD file. All right. Then you're just going to go file and open and go to navigate to that USD file. And let's see if the, uh, the scene [00:38:00] locators made it in,
or if some, if their exporter added something weird to the name string.
**Umuru:** So, uh, USD, USD or USD is fine.
**Eliot:** Yeah. All right, give it a second. All right, there we go. Now we're going to expand the root, 'cause I want to look at what's stored inside there. Okay — first problem: I don't see a scene locator. Maybe I did not select it.
Maybe. Let's open up your layer — back in Unreal. Open up your export layer, and you can click on that. There we go. And then See Contents. There we go. [00:39:00] Okay. So SceneLoc — okay, so that's exported. Yeah, that's kind of weird.
Let's just try something simple: let's just get the scene locator and the floor, and export those two. Highlight those two guys. There we go. Then go ahead and do File, Export Selected. And you can call this, you know, Chamberlain or something like that, and you're going to want USD.
And then we'll just open that up directly in USD View [00:40:00] and make sure that our scene locators are coming through,
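If you want to do that same check without USD View handy, a minimal sketch with the `pxr` Python module works too — the filename and the "SceneLoc" name match are just placeholders from this session, not a fixed convention:

```python
from pxr import Usd

# Open the exported file (.usd, .usda, or .usdz all work here)
stage = Usd.Stage.Open("chamberly.usd")  # placeholder path from this session

# Walk every prim and flag anything that looks like a scene locator
for prim in stage.Traverse():
    if "SceneLoc" in prim.GetName():
        print("found locator:", prim.GetPath(), prim.GetTypeName())
```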
I'm really glad you're asking this because this gets a chance for us to show how we debug USD, uh, on, on an office hours. So this is, it's perfect. Otherwise it'd be a really, yeah. Sorry.
All right. So let's open our root again. Whoa — let's open up those two... well, that's weird. Why isn't it exporting the scene locator? Well, there's your problem.
**Umuru:** Maybe I have made a mistake. I don't know. It's locked, [00:41:00] I think. No.
**Kevin:** Here's, here's my crazy brainstorm. Ready for my crazy brainstorm?
Yeah,
**Eliot:** go for it.
**Kevin:** The character set thing, because he has his keyboard set in French, and it's like some thing where scene lock isn't showing up because like one of those characters is, that's my crazy ass brainstorm. I'm going to shut up now.
**Umuru:** So I copied the name from — from here. Okay.
**Kevin:** So it wouldn't be —
**Umuru:** Directly, directly into Unreal. Okay. I took it here, I copied it.
**Eliot:** Yup. Okay. Now is where we get serious. Okay, let's go back to Unreal. We're going to export the scene locator and just one floor, and we're going to export to USDA, which is a text format.
We're going to find this thing. See, look — and then... I did it to all of the [00:42:00] layer. I don't think you need to do that. How about we just export the scene locator by itself? Go ahead and do File, Export Selected. I want to understand — there's something funny about... Oh, wait, wait a second.
It says down here it's an instance. Okay, let's first — um, I'll redo it, I'll bring in another scene loc. Okay, that sounds good. And let's just name it and all this kind of stuff.[00:43:00]
Okay. Yeah. And let's just export that by itself. We're going to export it to USDA, so do Export Selected and go down to — this time we're going to use USDA, Universal Scene Description ASCII. This is the text version of USD. You don't normally do this, because the text files are much less efficient,
but it means we can open it up and look at exactly what it wrote out, because the text file and the binary file map back to each other. So there you go. And we'll use the materials — yeah, get rid of the Unreal materials, that's all fine. Let's open that up. All right. Now let's take a look at that directory.
The directory in USD View? Yeah — and we can actually go further than that. We can just open it up as a text file. So if you go to that Windows directory, [00:44:00] right-click and open up that file we just exported in Notepad. That's a USD ASCII file. I have Visual Studio Code.
Oh, great, that'll work. But for this you don't even need VS Code — Notepad will open it up fine. There we go. Yeah, right-click and open it up in a text editor.
All right, let's take a look. Wait — oh, that's a JSON file.
There's [00:45:00] the USD, and you want the USDA. That was the USD file, so you want the USDA — the A stands for ASCII, which is the text file form of it. That's the guy. Okay, there we go. This is why we love USD: you can always dig in and fix it. Okay.
So SceneLoc Back Center is there. Okay. All right. Now let's open that in USD View. We're just going to work our way up from the ground floor — I want to understand where things are acting funny.
All right. So open our root and let's see — there's root. Okay, there's SceneLoc Back Center. That's correct. Okay. Now we're going to export the scene loc and the floor as a [00:46:00] USDA file. I'm going to find this — maybe there's something really weird with that actor.
Let's click on that SceneLoc Back Center just for a second — the first one... no, the new one, the one that is exporting correctly. Let's just click on that, 'cause the first one shows up as an instance. Okay, maybe that's not it. All right, I'm going to clear my drawings. There we go.
Okay, so let's go ahead and export the scene loc and, uh, two pieces of floor,
and we'll just export that to USDA once again.
**Umuru:** Scene look maybe two.
**Eliot:** Yeah scene look two
**Kevin:** So if you have the Omniverse plugin installed — okay, that answered my question right there — it's overriding that exporter, including for USDA. So all the exports that he's been doing have been [00:47:00] going through the Omniverse exporter. Is that correct?
**Eliot:** So far, all of them have gone through Omniverse.
**Kevin:** Got it.
**Eliot:** I think I'm going to, you know, update it because I don't think they're, they haven't updated their connector for a couple revs now. So, um, they may be shifting gears. So, which case. Um, I know the, the new USDA exporter in Unreal is actually a much better. So I'll do a tutorial on that so we can show you
**Kevin:** also, uh, now omniverse launcher gives you a warning that it's like a end of life product when you launch it.
**Eliot:** Yeah. I think they're switching gears.
**Kevin:** Yeah.
**Umuru:** Uh, it's, uh, done exporting.
**Eliot:** Oh, uh, can we look at that? Uh, the windows directory, did it write out a file?
Yes. Okay. So let's open that guy up in a text editor and take a look at it. Then we can scroll down. Okay, so we can see — oh, that's kind of weird. [00:48:00] Oh, I think that's still the first one. Okay, this is the first one. Then let's look at the second one.
This one is not, uh, open — I think that's the USD file. So what we want is the scene loc two. Oh, yes.
**Umuru:** No, not for the open —
**Eliot:** File. There we go. Let's look at that guy. There we go. So there's our floor one, and if we scroll down we should see our scene loc. There's our scene loc. Okay. So now let's look at that in USD View and we should see it.
We're just checking our things one at a time to figure out what's going on.[00:49:00]
Yeah. All right, let's open up our root.
Okay, there we go. So that's actually working. All right, we know it works in USD ASCII. Now let's try exporting it in USD format, and we're just going to go step by step and find exactly where the chain breaks. It's selected already?
Yeah, it's already selected. You'll just do File, Export Selected, and USD this time, 'cause it defaults to the binary version. So we can just do, um —
scene loc maybe three? Yeah, scene loc three. There we go. And the same stuff. All right.[00:50:00]
Thank you for having this problem, by the way. 'cause it's such a good way to show how the, how, you know, the, the USD export and debugging process works. And if you, if you run into weird stuff, the nice thing with USD is you can always figure it out. . Um, okay, so then yeah, let's go up to a USD view.
**Umuru:** OpenTech gets it,
**Eliot:** tree. There it is. All right, so open our route. You know, the drill. All right, that's showing up. So now let's go back to AutoShot so that if it's working in, in USD view, you know, it should work in AutoShot. So let's go in models and pick our, we can refresh and pick our scene. That's four, [00:51:00] three, make USDZ.
All right. And we can push it to Jet Set. All right, let's go to Jet Set.
It's three. All right, I'm going to share my screen so you all can see this — share, and park. There we go. Okay. So let's go ahead and do Main Menu, and we'll clear the current model just to clear out everything — Model, then Clear Model. There we go. And we can do Main Menu, Model, Open, and let's open the new one, scene03.
All right. And let's check our scene locators. [00:52:00] Did our scene locator load? Yep, there it is — Back Center. All right. Yay. Okay, maybe there was something weird with that first scene locator; I'm not quite sure what that was. That's kind of a weird one. Anyway, that looks right here.
So let's go ahead and get the rest of your set loaded. We'll go through the same process and select all the rest of the pieces in the layer — and make sure to put the new scene loc in your export layer. The good one.
**Umuru:** This one there. I clear this, uh, just new one.
**Eliot:** Yep. Add the new one.
Right-click and [00:53:00] Add Selected to — yeah, there we go. That's it.
**Umuru:** So yeah, See Contents is the —
**Eliot:** There we go. Yeah, now you can select all those and export. And now that we've debugged it, you can go back to normal USD exports — the binary exports are much more efficient.
**Umuru:** Maybe the
**Eliot:** fourth one or the, okay.
Yeah. Fourth. That's fine. Just keep incrementing until we get it, get a working one.[00:54:00]
**Umuru:** It's done.
**Eliot:** Yep. Come back to AutoShot — the fourth one — Make USD...
Done? And push it to [00:55:00] Jetset. There we go. And back to Jetset.
There we go — it's here. All right, is that your set? You can look up and around, and hopefully all the pieces... Yeah, it looks like it's all there.
There we go.
**Umuru:** Everything is okay in Jetset. Excellent. Excellent. So now my problem is the floor.
**Eliot:** Yeah — did [00:56:00] you... let's zip up the take with the floor issue. I want to take a look at it, and it's going to be a little hard for me to debug that one remotely, 'cause I want to look at the underlying data.
But if you can send me the — yeah, go ahead and pick the take, and then do the take zip. Well, let's make sure the take has the correct Cine footage loaded in. I sent this one, the 12th. Oh, all right — did you already send it?
Okay, great. Let me look for that.
**Umuru:** I selected 12th.
**Eliot:** There we go. So we've got — there's the Cine footage match. Okay, great. And just Take Zip. There you go.
**Umuru:** Export location on the desktop. Okay. It's[00:57:00]
okay.
**Eliot:** Yeah, just let it run — it's zipping up all the files, and in the console window it'll say when it's done. So if you go back to AutoShot —
there we go, it's still cooking. Down here it'll have a completed message when it's done, 'cause it's zipping up the Cine footage along with the take tracking data. There it is — done. So now we can go to that location.
**Umuru:** It's on my desktop.
**Eliot:** So there it is. Yeah, if you can just send me that via Dropbox or whatever. That was
one of the first features we put in [00:58:00] AutoShot, and it's just been an incredible help. It's so good for sending takes around, because all the data that you need is wrapped into one. And actually, before we do that, can you briefly go back to AutoShot? I want to make sure that take had a scan in it.
Okay. Yeah — it has the set LiDAR. Great. All right, we're good.
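If you ever want to sanity-check a take zip before sending it, here is a quick sketch with Python's standard zipfile module — the file names and extensions are hypothetical guesses, since the real layout is whatever AutoShot wrote:

```python
import zipfile

# Placeholder path; use whatever AutoShot wrote to your desktop
take_zip = "take_0014.zip"

with zipfile.ZipFile(take_zip) as z:
    names = z.namelist()
    # Rough sanity checks: cine footage, tracking metadata, and a scan present
    print("cine footage:", any(n.lower().endswith((".mov", ".mp4")) for n in names))
    print("metadata:", any(n.lower().endswith(".json") for n in names))
    print("scan:", any("lidar" in n.lower() or n.lower().endswith(".usdz") for n in names))
    print("total files:", len(names))
```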
**Umuru:** Uh, to support at —
**Eliot:** Yeah, support at lightcraft dot pro. That's good.
**Umuru:** That's good. Okay, I'll send it with a file transfer.
**Eliot:** All right. Well, let's cook and Kevin, any, any other questions come up from, uh, watching the USD work?
**Kevin:** Um, no — my quick question, I guess, is just: [00:59:00] I'm a little confused. And again, I'm not using the Pro thing yet, just because I don't have the other pieces. But resolutions — can you talk me through the resolution behavior? Like, I shoot a sequence, and looking at the sequences folder
here, I have like the half-res depth, it looks like. Does that sound right?
**Eliot:** Yeah. When we store depth, the depth is just right out of the iPhone. The sensor is like 188 pixels wide by 120 tall, something like that — 144p or something. It's 256 by —
**Kevin:** 144, maybe.
**Eliot:** That's it. It's tiny.
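For reference, if you need that tiny depth map at the camera frame size in a comp or script, a minimal sketch with OpenCV — the file name and target size are placeholders:

```python
import cv2

# Placeholder file name for one LiDAR depth frame from the take
depth = cv2.imread("depth_000001.png", cv2.IMREAD_UNCHANGED)

# Upscale to the camera frame size. Nearest-neighbor avoids inventing
# in-between depth values along edges; use INTER_LINEAR for a smoother result.
full = cv2.resize(depth, (1920, 1080), interpolation=cv2.INTER_NEAREST)
cv2.imwrite("depth_000001_full.png", full)
```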
**Kevin:** Yeah. Uh, and then, okay. And then. I guess the camera, because like the, um,
**Eliot:** do you want to share your screen? Uh,
**Kevin:** sure. Sure.
**Eliot:** That way we — more
**Kevin:** Screen shares, more screen shares. Here you [01:00:00] go. There's my boy. Really, I'm just looking at this Columbus stuff. I was putting a test shot together here where he's waving between
worlds — like how the pirate ship shows up.
**Eliot:** He's not just a pirate ship —
**Kevin:** He's also an oil painting in a pirate ship, which is a separate task.
**Eliot:** Excellent. Excellent.
**Kevin:** Um, so, so I really, I'm just looking, I'm just trying to wrap my head around this because how I built this out, it kind of, it kind of, I had to like resize things manually and then some things were like one frame off.
**Eliot:** So tell me about how you did this. So you, you, um, okay, you shot the take and then you brought it in directly into after effects or did you go into blender or,
**Kevin:** Uh, I shot the take. I sent it over to Unreal and rendered a background out of Unreal. And then I just brought everything into After Effects.
But for example, these things here are just [01:01:00] the things out of my sequences folder, right on the project.
**Eliot:** Yeah, the sequences — and then, like, you know, if you plop something like that —
**Kevin:** Yeah, so if I plop this down, here's my boy at the playground. And then if I plop — oh, what is this guy —
this is the live preview, this is your pirate ship, from the app. And those guys basically look like they align, but it's a crop-in, right? And then different resolutions — but a cropped resolution, not a scaled resolution, right?
Like, he's still in the same spot there. Almost.
No, he's shifting. See, this is why I'm a little confused — I guess this is my question. And then there's the depth, which, [01:02:00] if I fit that to the comp — does that line up? Which one does that line up with? It looks like it lines up with that one. So it makes sense that the depth is... right?
It's a different resolution, but it's the same crop.
**Eliot:** Same framing. Yeah.
**Kevin:** Yeah, same framing. And then this one looks similar, but not quite the same. And then I have these others — these are the sequential images that it made. Sorry, what are these? I don't quite remember.
**Eliot:** A PNG
sequence of them.
**Kevin:** Oh, the cam-o. Yeah — what's the cam-o PNG? I don't really understand. It has like color baked in, and a bit of a crop or something.
**Eliot:** The camera original — it should just be extracting a PNG sequence basically right off the camera original. And the one thing you might want to look at is how After Effects is interpreting the color footage, [01:03:00] 'cause the PNG is just sRGB, right? There's no —
**Kevin:** Okay. And those, and those, see those had a little temporal offset in between them.
**Eliot:** Okay.
**Kevin:** Which was confusing to me. And then this is the frame — so this is the in-phone comp? No — what is this? Oh, yeah. No — oh, this is the InSPyReNet one, um —
**Eliot:** Okay, yeah, yeah. There's the, there's the.
**Kevin:** That's the higher, uh, higher quality, um, offline, uh, Roto, right? Yeah, yeah. And then, uh, so if I fit that to the comp, boom. Does that look like it lines up? Yes, it does. Okay. Great.
**Eliot:** Can you set that as, like, opacity? So I can see a little bit better there.
**Kevin:** Um, and I have —
**Eliot:** My knowledge of After Effects falters right after the ability to put one layer on top of the [01:04:00] other.
**Kevin:** Oh, yeah, yeah. I'm, uh, yeah, After Effects, unfortunately, I mean, I'm embarrassed to say it because I'm a visual effects professional, but After Effects is in my muscle memory really well. So if I.
**Eliot:** Oh, yeah, it's fine.
**Kevin:** If I need to just dump a bunch of stuff like this together and sort it out, uh, this is the quickest way for me to do so.
Yep.
**Eliot:** Yeah.
**Kevin:** So I'm having a little temporal shift here. Okay, so let's — and I lined it up, right? The way I would do that in After Effects is I would put a marker on everything, so that in the middle of my timeline I can see what frame I'm on, and then I would start sliding around anything
that's not lined up. So these two guys are lined up. Although — I feel like it's good to have a ground truth, right? And my, uh, cam should be the truth, right?
**Eliot:** Yeah, it should be the cam original.
**Kevin:** That's just the camera original MP4, right? Without [01:05:00] the PNG processing or whatever.
So if I use that as the truth, then I could turn on my PNG on top of that, turn down the opacity, and — I know that these two are a frame
**Eliot:** off. So —
**Kevin:** Then I can scooch them around and find where they belong. So it looks like it was two frames — two frames to the side. So —
confusing?
**Eliot:** No — actually, can you zip up the take? 'Cause this is the kind of thing I always want to track down. Uh, sure. How do I — oh, actually, you know what? Okay, can you — let's go to AutoShot first. Sure. I have an idea of what may be going on — actually a decent idea of what's going on.
I bet it got hot and we dropped a couple frames.
**Kevin:** And again, I'm a, I'm a, a super nube here. I'm just trying to, trying to wrap my head around what, uh,
**Eliot:** No, this is great. It's great to go through this, 'cause it's the kind of thing that I just [01:06:00] don't hit in the normal day-to-day testing.
So is that the take we're working with?
**Kevin:** Uh, yeah, it should be. I haven't done anything else in here since then. So yeah, that looks right.
**Eliot:** All right. So let's take a look — we're going to click the info button on the take. Let's see if the phone was getting hot and we popped a couple frames.
Okay, so we duplicated a frame and dropped a frame. Okay, I see. What I'm wondering is — and this is worth, um —
**Kevin:** uh, Is if the sync dropped where the frame dropped?
**Eliot:** Yeah, can we look at the very beginning part of it? Like, does it —
**Kevin:** Does it start in sync and then lose it? Right, right. Okay, great.
Um, okay, well, this is offset now, so let me just put it back to where it was. So that's aligned — by file, not by my visual alignment — and then come back to the top. I was trying to challenge your [01:07:00] system, so I rode down the slide at the playground while filming.
**Eliot:** Alright. All right.
**Kevin:** So I see — it's my jib shot. My crane shot.
**Eliot:** Do you have a scan?
**Kevin:** I do not. I don't really know how to do that. So again, I'm a noob. I'm sorry.
**Eliot:** It's all right. Scanning matters so much that we're changing the UI in the next release: if you don't have a scan, especially for Jetset Cine, the record button will turn yellow and show a little grid overlay, like "scan, scan."
With scanning, you can fix — you can jack up a shot at a staggering level, and if you have just one frame that's correct against the scan, we can lock it. Piece of cake. It's the ultimate get-out-of-jail-free card.
**Umuru:** One question, please. Do we have to scan every time, uh, we load, load the scene?
**Eliot:** Every time you change your origin. Once you set your origin — you know, [01:08:00] you do Origin, Reset, and tap the floor plane — the scan depends on where you set that origin, because it uses that as the zero-zero-zero point.
So if you move your origin around on the stage, then rescan. But you can shoot a bunch of shots with the same origin — you can even change scene locators. If you move your origin, then rescan.
**Umuru:** Okay. Okay.
**Kevin:** Okay. All right. Let me have you see —
this is my Jetset notes over here. Nobody look at these, it's embarrassing. Ah, there you go.
**Eliot:** No, no, no — this is good. I mean, you're a VFX supervisor, and these are pipeline questions. It's always: how does it all link together? What's what?
**Kevin:** Uh, yeah. Okay. So, sorry.
I'm looking at layers, because I'm in After Effects and everything is confusing. Here we go. So right from the start — [01:09:00] it looks like it was the very first frame, because it looks like we started, and then immediately on the second frame one of them moved and the other didn't.
So if I had to guess, I would say this one is not moving — sorry, the one I just turned off is not moving. So in the MP4, that's frame one or frame zero, depending on how we're counting, and that's the next frame. And then the other one — that's the first frame. Sorry, I'm at 50 percent opacity. Yeah, first and second frames are the same.
So I think right as it recorded, it had a little trouble there.
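A quick way to confirm a duplicated frame in an extracted PNG sequence, sketched with the standard library plus pathlib — the directory name is a placeholder, not the actual AutoShot layout:

```python
import hashlib
from pathlib import Path

# Placeholder directory containing the extracted cam-original PNG sequence
frames = sorted(Path("extracted_pngs").glob("*.png"))

prev_hash = None
for f in frames:
    h = hashlib.md5(f.read_bytes()).hexdigest()  # catches byte-identical duplicates
    if h == prev_hash:
        print("duplicate frame:", f.name)
    prev_hash = h
```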
**Eliot:** Okay. And is the one we're looking at, the one that has duplicated frames — is that the comp? The —
**Kevin:** PNG. We're looking at the PNG right now.
**Eliot:** Wait a second — that was an extracted sequence. That we can fix. Okay.
That's weird. Can you send me that take? I'll show you, under AutoShot. Oh, right.
**Kevin:** Because that's extracted. That's not being done live.
**Eliot:** That should be straight one-for-one. I don't quite understand that, but I want [01:10:00] to. And our ground-truth thing is actually always Blender.
But under AutoShot — you've got the take selected, take four?
**Kevin:** Yeah, that should be, uh, that should be, let me double check. Yeah, that's it.
**Eliot:** Okay. So then let's just go to File, and we'll do a take zip —
**Kevin:** Uh, Export Takes — this?
**Eliot:** Right. And what that does is it packages up the camera original footage.
And if you're running Jetset Cine — and you have a Cine file that's matched to that take — it'll also zip up the Cine file and the calibration, along with all the metadata for that take, into a single zip. Right — which is what you just
**Kevin:** had him do a few minutes ago, right?
**Eliot:** Yeah. This is our lingua franca.
**Kevin:** I caught that. That's why — again, mostly why I'm always snooping on your office hours. I could probably fiddle through this more and have more sophisticated questions down the pipe, but [01:11:00] it was informative seeing how you were troubleshooting the USD, you know, and —
**Eliot:** Yeah, I'm so glad, because there are almost always deep reasons why we're doing this. It comes down to: can you fix it? Can you figure out what's going on under pressure, and is it systematic and repeatable — instead of, you know, "unhandled exception, 0xC-something," right? Then you're just screwed. We're very much trying to have everything be understandable, and if something doesn't work — because things do break — have it be manageable.
**Kevin:** Hard to tell, but it looks like — yeah, this stays in sync, right? So for the two PNG files, the first two frames are identical, and the third frame is identical as well, which is weird for the mask, but it seems like they're [01:12:00] in sync, right? So if I look at this masked here —
why are you not showing me that masked right now? Because After Effects is confusing.
**Eliot:** I have to learn After Effects better, because I just don't know it at all.
**Kevin:** What's that?
**Eliot:** One of these days I'm going to have to learn After Effects better.
**Kevin:** Oh, there you go — 'cause I had it on an alpha matte instead of a luma matte, and the alpha is full-on in that PNG. Um, so these stay — these track. I mean, that looks phenomenal.
**Eliot:** Yay. Isn't that wild how good it is? And we're actually going to put in one that's better than InSPyReNet — the state of the art in this stuff is moving unbelievably fast.
**Kevin:** It's banana town, right? Yeah, I need to play catch-up on that a little bit as well.
**Eliot:** And you don't have to — like, you can Magic Mask this stuff, but then you're sitting there on every shot. And it's not as good.
**Kevin:** It's not as good. Magic Mask and stuff is not trying to save hair [01:13:00] strands like that, so —
Um, okay, well, that was really it. I'm kind of trying to understand, like, okay, why, why are things at different resolutions and different crops and maybe not, um, in sync? And, and quite possibly because I don't have a fan on my phone yet is, is one answer. But then, yeah, here, here you see the, uh, how the crop on the Right.
Those are slightly different crops. If I put, um, Okay.
**Eliot:** Okay. Hang on. Let me, let me just be very systematic in what I'm looking at. So there's ZLCAM, which is our, um,
**Kevin:** So, sorry, what we're seeing right now. Yeah. Let me just turn those guys off. What we're seeing right now is the matted PNG. So the PNG with the alpha from the other PNG laid on top of the MP4.
Cam original.
**Eliot:** Yep.
**Kevin:** Um, and they have all been scaled to fit the frame size — everything's just been scaled to be the same, you know what I mean? [01:14:00] I could put all the scales back to 100%, but then nothing will line up — these are 30 percent bigger. If I do put those back to 100, it's not remotely close.
Right. So when you set the focal length in the app, is that actually cropping that?
**Eliot:** oh yeah. It's just, it's cropping in the image.
**Kevin:** It's cropping in the image that it's writing as well.
**Eliot:** Yeah. The camera original — we record that untouched. Okay,
actually — so there's two files. The cam-o, the ZCO cam file, is the cam original, and then this guy here, for comp, is the cropped version.
**Kevin:** This guy here — and that'll be cropped to whatever you set your focal length to.
**Eliot:** Yeah — the comp, the live comp, is cropped.
The cam original is not comped; [01:15:00] it is not cropped.
**Kevin:** Yeah, yeah, okay. So, yeah, and those don't quite align, and I guess that's just because that crop is arbitrary based on whatever I'm setting the zoom focal length preview on the app to?
**Eliot:** Uh, that, and just give me a second to kind of think through this. Um, because we're in the middle of the timeline so that we may be seeing the temporal thing.
**Kevin:** Oh, okay, yeah, I could go back to the beginning but then there isn't really much to look at because I was kind of looking up at the sky.
**Eliot:** Okay, alright, so we go, go in the middle.
**Kevin:** Somewhere early there, we're looking down, um, at least, and yeah, there's a temporal shift there. Oh, no, that doesn't look like a temporal shift, that looks like a scale shift.
His foot's, uh, his foot's in the same position, yeah, uh, right there. The first frame where his foot touches his other foot, yeah, so we're on the same frame. On these, um, [01:16:00] but the scale is just a little bit off, right? So when I was doing my other project, I just kind of laid everything on top of here, and, you know, you can, uh, difference these, or whatever, and then.
**Eliot:** Yeah, yeah, this is, this is interesting.
**Kevin:** And then scale them to, you know... I can pull this right there and try to get him. Looks like 91%, but 91 percent is a weird scaling factor. I don't know why something would be 91%, so that's why I'm asking.
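(A rough way to sanity-check a mystery scale factor like that 91% is to compare the crops implied by two focal lengths on the same simulated sensor. The sketch below uses placeholder focal lengths and sensor width chosen purely for illustration; it is not Jet Set's actual crop math.)

```python
# Rough sanity check: if the live comp is a centered crop of the cam original,
# the expected scale difference is the ratio of the two horizontal FOVs.
# Focal lengths and sensor width below are illustrative placeholders.
import math

def horizontal_fov_deg(focal_mm: float, sensor_width_mm: float) -> float:
    """Horizontal field of view for a simple pinhole camera model."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_mm)))

sensor_width = 36.0     # assumed simulated full-frame sensor width
native_focal = 32.0     # placeholder: focal length behind the uncropped cam original
preview_focal = 35.0    # placeholder: focal length set for the comp/preview crop

fov_native = horizontal_fov_deg(native_focal, sensor_width)
fov_preview = horizontal_fov_deg(preview_focal, sensor_width)

# The scale factor is tan(fov_preview/2) / tan(fov_native/2), which reduces to
# native_focal / preview_focal when both crops share the same sensor model.
scale = math.tan(math.radians(fov_preview) / 2) / math.tan(math.radians(fov_native) / 2)
print(f"expected comp scale vs. cam original: {scale:.1%}")  # ~91% for these placeholders
```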
**Eliot:** Yeah, honestly, it's interesting. I don't think I've had people look that closely at the live comp in Jet Set, you know, in the base Jet Set.
It's always the Cine stuff where we're doing the live editing.
**Kevin:** That's fair. That's my goal, to get there soon. I think he's still shrinking a little bit, so maybe I overshot the mark there. 92. Okay. Cool.
**Eliot:** One question is what, um, what was your sensor, your background sensor simulation setting?
Was it kind of a default, or did you have it set to, like, a— [01:17:00] Uh, because some of them have different aspect ratios, and this is one of those things we're halfway through, in terms of—
**Kevin:** I think everything was default there, you know. I was just kind of trying to film something and see.
**Eliot:** Yeah, I'll take a look at it.
There's a couple of things; like, clearly we dropped a frame. And just for reference—
**Kevin:** That's helpful. Now I know where to look for a dropped frame. Yeah. Be on the lookout for it. So this is, this is helpful.
**Eliot:** Um, when we're doing Cine, what we do, instead of recording data based on exact frames, is record timestamps, right?
And so, you know, with the frames — because it's a phone, the internal crystal sync on the frame rate is only so good. It's not like—
**Kevin:** I also see that the frame rate is 30.011 frames per second, which is, uh, interesting.
**Eliot:** So when we're recording the timestamps, uh, the [01:18:00] tracking data is on timestamps, right?
And so we know that at a pretty high degree of precision. And then when we do the Cine processing with Cine footage, what we end up doing is we reinterpolate: we synchronize the initial frame on a time basis, and then we're forward-propagating in time.
So it's actually fairly insensitive to that; it doesn't usually get jacked up that much by the occasional dropped frame on the Jet Set side, because the timestamps don't care about frames. They just care about absolute time, and that's deliberate.
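(A minimal sketch of that timestamp-based idea: tracking samples stamped in absolute time get resampled onto the cine camera's frame grid by syncing the first frame and stepping forward at the cine frame rate. The sample layout and function below are assumptions for illustration, not Jet Set's internal format.)

```python
# Minimal sketch: resample timestamped tracking samples onto cine frame times.
# The (time_s, value) pairs are scalar stand-ins for real pose data.
from bisect import bisect_left

def resample_to_frames(samples, first_frame_time_s, cine_fps, num_frames):
    """Linearly interpolate (time_s, value) samples at each cine frame time.

    A dropped tracking frame just leaves a slightly larger gap between
    timestamps; interpolating across the gap still lands on absolute time.
    """
    times = [t for t, _ in samples]
    out = []
    for frame in range(num_frames):
        t = first_frame_time_s + frame / cine_fps   # forward-propagate in time
        i = bisect_left(times, t)
        if i == 0:
            out.append(samples[0][1])
        elif i >= len(samples):
            out.append(samples[-1][1])
        else:
            (t0, v0), (t1, v1) = samples[i - 1], samples[i]
            a = (t - t0) / (t1 - t0)
            out.append(v0 + a * (v1 - v0))
    return out

# Example: a tracker logging at ~30.011 fps with one dropped sample; cine runs at 24 fps.
tracker = [(i / 30.011, float(i)) for i in range(300) if i != 57]
print(resample_to_frames(tracker, first_frame_time_s=0.5, cine_fps=24.0, num_frames=5))
```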
**Kevin:** Yeah, and I don't think it dropped any subsequent frames, because once — I think you can see right here — I had my markers.
This was the project that I was working on. That's a cool frame where you can see the oil painting swooping across him. So right here, once I synced off of this — this was my reference frame — once I synced off of that, it held. For what it's worth.
**Eliot:** Oh, good. [01:19:00] Yeah, that makes sense.
Um, okay. Yeah, we'll look at it. We're always tweaking the frame recording systems; we do push them on the phone kind of hard. So I'll have to look at it and see what it is. And the other thing we've looked at, and I'll have to ask Greg about this, is that with
MP4, sometimes if you have the predictive vectors, then we may be able to re-extract and have it reinterpolate one of the missing frames. I'm not sure how well that will work, but we should look at that. Obviously it doesn't come up with cine footage, because the cameras are handling that and they're good at it.
But on the iPhone, when we push the iPhone this hard and we drop a frame, then we want to be able to recover the footage to some reasonable degree, with some degree of interpolation, and fix it. I mean, people aren't usually comping hair and stuff off an iPhone, but it's [01:20:00] good to have.
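(For locating where a dropped frame landed in an iPhone MP4, one illustrative approach — not an existing Jet Set tool — is to pull the per-frame presentation timestamps with ffprobe and flag any gap noticeably wider than the nominal frame interval.)

```python
# Sketch: flag suspicious gaps in a video's presentation timestamps.
# Requires ffprobe on PATH; the 1.5x threshold is an arbitrary choice.
import subprocess

def find_timestamp_gaps(path: str, nominal_fps: float = 30.0, factor: float = 1.5):
    cmd = [
        "ffprobe", "-v", "error", "-select_streams", "v:0",
        "-show_entries", "frame=pts_time", "-of", "csv=p=0", path,
    ]
    out = subprocess.run(cmd, capture_output=True, text=True, check=True).stdout
    pts = [float(tok.strip(",")) for tok in out.split() if tok.strip(",") not in ("", "N/A")]
    limit = factor / nominal_fps
    # Return (frame index, gap in seconds) for every jump larger than expected.
    return [(i, t1 - t0) for i, (t0, t1) in enumerate(zip(pts, pts[1:])) if t1 - t0 > limit]

# Usage (hypothetical file name):
# print(find_timestamp_gaps("take001_zlcam.mp4", nominal_fps=30.011))
```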
**Kevin:** Nor would I be. This is not my ultimate workflow; this is me messing with Jet Set.
**Eliot:** Test it around. It's good to know that. And I want to look at it, especially the PNGs; our PNG extraction should be a one-for-one match to the MP4.
That's weird. That's interesting. I'm just going to look at it and see if it's a bug. The nice thing — and this is why we set things up this way — is that it's very reproducible. I can take your take, extract it, and we'll see one-to-one what's going on.
**Kevin:** Uh huh. Uh huh.
**Eliot:** All right, cool.
Yeah, yeah. So, um, send me the link and—
**Kevin:** Oh, yeah, yeah. There it is: JetSet, my project. Okay, I'll put that in the chat right this second, before I forget it.
Uh, where did I put you? I have too many Dropboxes, that's the problem. [01:21:00]
**Eliot:** Oh, did swapping the cable on your NAS fix the, uh, throughput?
**Kevin:** I haven't done that yet. So I'm, I'm, I'm, uh, you know, too many, too many things to do. Um, so I, uh, I'm working off of my SSD, my local SSD more than I would like to be.
**Eliot:** I have a theory, and let me run this theory by you.
One of the things we're figuring out is — you know, the way we're doing this, I think, is going to be the future, right? So we have teams in different places, shooting in different places, and stuff like that. And you have both the tracking data and the proxies you're bouncing around, and those are actually not that big; proxy data and that kind of stuff transfers pretty easily.
The camera original files are monsters, though, and getting those from point A to point B is a thing. You don't want to just put it all in Dropbox;
otherwise you're going to smoke your Dropbox limits in a short period of time. So one of the things we're thinking about [01:22:00] adding into the system is a file cloning kind of system. Because we have to run into the ingest side of things anyway, since we're talking to the—
**Kevin:** Sorry, you broke up there. You have to run into the ingest side of things—
**Eliot:** We have to run into the ingest side of things anyway, because we're already talking to the camera originals. We have to do extractions from them. And what I kind of wonder is whether the correct way to do it doesn't end up being this: on set, you have to clone your mags, right?
So you're cloning them onto something big, right? Maybe you have a big set of, you know, twenty 10-terabyte drives or something that you clone the mags onto, and then as soon as possible you want to copy those files into different safe places. And I'm starting to wonder if the way to do it ends up being this:
you clone two sets on set, then you take one of those, plug it in, and you do a peer-to-peer bit [01:23:00] transfer, a bit copy, to someone else — you have a couple of different places that are geographically separate, but everybody's running on fiber, so you just do a peer-to-peer bit copy
of those cam original files to a couple of other giant USB-C drives. That way they're in geographically disparate locations, and everybody who has to do a frame pull has the stuff relatively local, but you're not trying to run it through Dropbox. You're just doing peer-to-peer bit copies that you can reference and verify.
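(For the "reference and verify" part of a peer-to-peer bit copy, a common pattern — sketched here as an assumption, not a Jet Set feature — is to write a checksum manifest next to the cloned mag and re-check it on the far end after the transfer.)

```python
# Sketch: build and verify a SHA-256 manifest for a cloned camera mag.
# Paths and manifest format are illustrative, not any vendor's standard.
import hashlib
import json
from pathlib import Path

def hash_file(path: Path, chunk: int = 8 * 1024 * 1024) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        while data := f.read(chunk):
            h.update(data)
    return h.hexdigest()

def write_manifest(mag_root: str, manifest_path: str) -> None:
    root = Path(mag_root)
    manifest = {
        str(p.relative_to(root)): hash_file(p)
        for p in sorted(root.rglob("*")) if p.is_file()
    }
    Path(manifest_path).write_text(json.dumps(manifest, indent=2))

def verify_manifest(mag_root: str, manifest_path: str) -> list[str]:
    """Return relative paths that are missing or whose hashes don't match."""
    root = Path(mag_root)
    manifest = json.loads(Path(manifest_path).read_text())
    return [rel for rel, digest in manifest.items()
            if not (root / rel).is_file() or hash_file(root / rel) != digest]

# Usage (hypothetical paths):
# write_manifest("/Volumes/MAG_A001", "A001_manifest.json")          # on set
# bad = verify_manifest("/mnt/archive/A001", "A001_manifest.json")   # at the far end
# print("mismatches:", bad)
```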
**Kevin:** Yeah, I mean, Dropbox is not production-useful because of the sizes. On this show I'm working on, they're shooting ARRIRAW, which I think is not a good format, because it's so huge. Like, compression is a good thing, guys. Let's get on board with compression, right? We can compress modestly and make all of our lives better.
But I think, I don't know, I think ARRI has really [01:24:00] good marketing, and, you know, "ARRIRAW is the best image quality there is, so let's use that." So it becomes very cumbersome, right? For this other shop where I kind of run the post end, we do almost everything Sony, and I have mirrored drives across an East Coast and a West Coast office.
And we're using Syncthing.
**Eliot:** Yeah, yeah. Syncthing looks great. What we're looking at is, again, taking some open-source sort of system but building it into part of our infrastructure, so it's just doing it under the hood. But yeah, Syncthing is exactly that kind of thing. I don't know that exact algorithm, but something along the same lines.
Okay, so that's actually a great data point. So you're not running a giant Synology — I mean, maybe you're running a couple of NAS systems, and then you're just—
**Kevin:** No, I am. I'm running a [01:25:00] giant Synology here, and then a giant Synology on the East Coast.
**Eliot:** Okay.
**Kevin:** And the Synologies are running Syncthing.
**Eliot:** Okay.
**Kevin:** So, and then if we have somebody who's just on a desktop — sometimes we'll have a work-from-home artist or something — usually we'll just give them a giant two-disk RAID 0 or something.
And then they can get on part of the share or something, depending on what we're doing.
**Eliot:** Well, that makes sense.
**Kevin:** And then they'll be running the Syncthing client on their Windows desktop or whatever, with direct-attached storage, so the desktop handles the overhead for that instead of the storage.
**Eliot:** Okay.
**Kevin:** Yeah. So for my primary setup, I have the Synologies running Syncthing; that's the setup I made. The trick with the other things, I think, is just that it's whatever [01:26:00] each shop is doing, right? Who's doing editorial, who's the DIT, and who's controlling that pipeline. And I think for episodic television it's a little bit more stable than a movie, where a movie kind of spools everything up, and then everybody is just getting through this one thing, and now that's our whole infrastructure.
But only a little bit more so, right? Like, this one show I'm working on was doing editorial abroad, and now they're bringing editorial to Los Angeles. It's so fluid, I guess, is my point. It's hard to—
**Eliot:** You know, I'm just kind of thinking through
the practicalities. Like, do you have someone in a central area? Because one of the key questions is: [01:27:00] you might have a 50-to-1 shooting ratio or something like that, depending on what you're doing, so does it actually make more sense for editorial
to be able to run the frame pulls and then only send out image sequences? With that many shots to pull, does it make sense to have editorial do the pulls and send out a packaged image sequence for each shot — these shot packages — to all the different artists, in which case the artists are just dealing with frame sequences?
They never touch the camera originals. Right?
**Kevin:** I think so. Without asking my post super on this show what he thinks, I think the answer is yes, because no VFX vendor wants all your camera originals, for rights reasons. It might not even be kosher to do that, in terms of where this footage is going and who's getting what.
And then, you know, certain things — not on this show, but maybe another show — there would be things with maybe [01:28:00] intimacy or something, where all that footage is locked down and nobody gets to see camera originals, because some no-nos might slip out or something.
So you kind of need to be able to manage that pipeline a little bit. And, yeah—
**Eliot:** This is great, because what it tells me is that the part we want to work on — the link point — is editorial. Because we don't yet process timelines.
We can now do named shots and autoshots, so we can pull multiple shots from a single take and that kind of stuff, but we don't yet process timelines. It's coming up fast. And
what I wonder is whether we then have to target that process toward editors, or editorial, so they're in control of it. They're basically hitting a button, and the data packages come out — shot one data package, shot two — piled into a zip, [01:29:00]
or however you do it, and pushed on, sent on, right?
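(A rough sketch of what a per-shot data package step could look like on the editorial side: pull one shot's frame range from a proxy or mezzanine file as an image sequence and zip it with a small metadata sidecar. The file names, frame ranges, and use of ffmpeg here are assumptions for illustration; this is not an existing Jet Set command.)

```python
# Sketch: build a per-shot package (image sequence + metadata) from a mezzanine clip.
# Assumes ffmpeg on PATH and a source format ffmpeg can decode (not raw camera files).
import json
import subprocess
import zipfile
from pathlib import Path

def build_shot_package(src: str, shot: str, start_s: float, dur_s: float, out_dir: str):
    pkg = Path(out_dir) / shot
    frames = pkg / "frames"
    frames.mkdir(parents=True, exist_ok=True)

    # Pull the shot's frame range as a numbered PNG sequence.
    subprocess.run([
        "ffmpeg", "-y", "-ss", str(start_s), "-t", str(dur_s), "-i", src,
        str(frames / f"{shot}.%04d.png"),
    ], check=True)

    # Minimal metadata sidecar so artists know where the frames came from.
    (pkg / "shot.json").write_text(json.dumps(
        {"shot": shot, "source": src, "start_s": start_s, "duration_s": dur_s}, indent=2))

    # Zip the whole package for hand-off.
    zip_path = Path(out_dir) / f"{shot}.zip"
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as z:
        for p in pkg.rglob("*"):
            if p.is_file():
                z.write(p, p.relative_to(pkg.parent))
    return zip_path

# Usage (hypothetical names):
# build_shot_package("ep101_take07_proxy.mov", "SH0010", start_s=12.0, dur_s=4.5, out_dir="packages")
```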
**Kevin:** I think so. And there are a couple of things here. One is, if you're shooting a two-camera show in ARRIRAW, there's just so much data that it's physically not practical to have any redundant copies of this data that aren't
necessary for backup purposes. It's just many terabytes, right? So there's a physical-media kind of problem there. There's also, like I was saying, maybe intimacy, maybe something else — we just don't want all the vendors to have all the footage for something.
And then the one thing I'm thinking about, in terms of Jet Set and using it, is that preview for editorial and the limitations of it. And honestly, from what we're talking about, I feel like that's the biggest hit against Jet Set versus just spooling up a [01:30:00] computer running Unreal with a tracking system, with an Ultimatte, with whatever.
Because of what a positive response we got from editorial. I think I told you this last time: the network was watching the rough cuts and thought that some of the VFX were final, because on the closeups — if it's a closeup and it's an even green screen, the Ultimatte is going to do a good job, and that Unreal background is kind of out of focus — it's just going to look good.
Right? And so the showrunner and the higher-ups were very happy with being able to show the network this thing instead of what the network is used to seeing, which is either a bunch of actors on a green screen or some kind of garbage previs or postvis
that's just a little janky. And so that was a big sales point. And I think what you're talking about — obviously, what you were talking about today with that other [01:31:00] fella — is getting the cine camera through, so it's comping the cine camera, hopefully at the cine camera frame rate.
**Eliot:** Oh, yeah. Oh, yeah. Matched, yeah.
**Kevin:** Okay. So obviously that's a huge step. But then we're still struggling on a phone to render things. And one of the things that I like to do, for example — and this is definitely not me saying, hey, these are all the features Jet Set needs to have, or even that this is what you should be shooting for —
I'm just talking through some of the things. And I think one thing I've learned, and one of the reasons I'm so excited about Jet Set, is we can't let the perfect be the enemy of "let's get it working most of the time," right? I really want something that works most of the time.
And so, like, one of the things I really liked doing was, in Unreal, I would just be recording the light position as we would go around, and the DP would maybe be [01:32:00] moving something. I would be matching my virtual lights to his real lights, or vice versa, and then that would be in the take for that shot.
So then when I load that take up later, the sun swings around and is already dialed into position. And I know that a VFX artist can swing the sun around later, but it's one of those really nice things that helps us dial in the look on set, and—
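(One lightweight way to approximate that per-take idea outside of Unreal is a JSON sidecar saved next to each take, recording the key light's transform and intensity at record time so it can be re-applied when the take is loaded later. The field names and helper functions below are hypothetical, not a Jet Set or Unreal API.)

```python
# Sketch: record and re-apply per-take light settings via a JSON sidecar.
# The LightState fields and folder layout are illustrative, not a real engine API.
import json
from dataclasses import dataclass, asdict
from pathlib import Path

@dataclass
class LightState:
    name: str
    position: tuple[float, float, float]
    rotation: tuple[float, float, float]   # pitch, yaw, roll in degrees
    intensity: float

def save_take_lights(take_id: str, lights: list[LightState], folder: str = "takes") -> Path:
    path = Path(folder) / f"{take_id}_lights.json"
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps([asdict(l) for l in lights], indent=2))
    return path

def load_take_lights(take_id: str, folder: str = "takes") -> list[LightState]:
    data = json.loads((Path(folder) / f"{take_id}_lights.json").read_text())
    return [LightState(**d) for d in data]

# Usage: store the sun position dialed in on set, reload it when the take is opened later.
sun = LightState("SunLight", (0.0, 0.0, 500.0), (-35.0, 120.0, 0.0), 7.5)
save_take_lights("take_012", [sun])
print(load_take_lights("take_012")[0])
```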
**Eliot:** Yeah, that's real. That's real.
Like, it keeps per-take modifications, and the Gaussian splats don't handle that. Not yet, right? You know — oh, this is cool, but I will test this. Let me see if I can find it.
**Kevin:** But I would easily give up per-take modification for a system that actually worked. Yeah. That's my point,
my broader point, you know.
**Eliot:** You know, I'm very curious. I think some of this is horses-for-courses sort of stuff. And at a [01:33:00] certain point, if you have a sufficient number of people on set, the additional cost of the two or three extra people it's going to take to keep the Ultimatte and all that kind of stuff looking right is just not a thing, right?