Transcript

# Office Hours 2025-01-13


**Navaz:** [00:00:00] So, yeah. 


**Navaz:** Good morning, Eliot. 


**Navaz:** Eliot, what's going on, man? Good morning. 


**Eliot:** Hey, you know, it's good to see you. 


**Brett:** Alright. Uh, Navaz and I were just talking a bit, but he and Roman came out to the space that, uh, we're working at. Oh, fantastic. It was like two weeks ago... actually, it was before the holidays.


Uh, and checked out kind of what we're doing and what our challenges are. Alright. And the main challenge that we have right now is really this live preview through Unreal, and getting something that the owner of the space is happy with. He's got a lot of connections.


He's lived in L.A. his whole life. He's worked in television his whole life. He knows a lot of people, and he has a lot of potential contacts to do really interesting stuff with this. But he doesn't want to show anybody this work until we're where he thinks we should be. Right. [00:01:00] Um, and, uh... in your backgrounds, is a lot of stuff moving, or is it more static?


It does move a little bit. I mean, he wants to be able to move the camera. We actually have the camera set up on a jib now, which is nice, and we played with that a little bit last week, uh, before the fire. So I was actually out there Tuesday... the studio is in West L.A., and I was driving down the 405.


I saw the Palisades fire as it started. It was kind of insane. So I haven't been out there since then, because I live near the Eaton fire, so I was concerned that I might have to be evacuated. So I hadn't been back there. 


**Eliot:** Right. First things first. Yeah. 


**Brett:** Yeah. But I'm planning on going back down there tomorrow.


And like I said, Navaz and his partner Roman came out and helped us out. And they were talking about... the big thing is we can do all the work in post, and I'm pretty satisfied with how things are working on that front. I mean, I'm very satisfied; it works very well. I'm getting [00:02:00] good results. Uh, I figured out a few little tweaks or bugs in your script, if you're curious about it.


For the Unreal side... not for the live preview so much as actually for when you're importing a take. Um, the main one is, uh... and this took me a while, and I'm so upset at myself, with my background in finishing and editing and all this kind of stuff, that I didn't realize it: it's offsetting the camera footage by a frame.


Uh, so it looks a little loose and out of sync when you're watching it in Unreal, and when you render it out, it's actually a little bit loose. But what I found is when I go back into Fusion and move the video a frame, it locks right up. So, uh, it's okay.


**Eliot:** Okay, let me make sure I just understand that. These are the things I always want to chase down.


Because we do a bunch of things to get our timing precise, from the phone to Unreal. And the thing that helps us get this stuff is an example. [00:03:00] If you have a take where you're getting that problem, can you zip it up and send it to me?


**Brett:** Yeah, yeah. Now, I haven't done that before, so do I do that through, uh... 


**Eliot:** AutoShot? AutoShot. It's under the File menu: Files, Zip Up Takes. You just pick out whichever take is the problem child, you know, and have that. Yeah,


**Brett:** Yeah, it's pretty consistent. It's happening with everything now, because I was chasing it for a couple days... it always looked loose, and I thought I had seen it look better. But what I realized is that I had seen it look better when I did the offset inside of Fusion, and didn't realize I was even doing that, because I hadn't compared it to a render out of Unreal


with the video in it. Uh, so what it's doing is, uh, you know, when you do the Save and Run and it transcodes the footage to EXR, it puts that EXR sequence in, and that EXR sequence is correct. It has matching timecode when I look at it against my camera raw. Everything is a [00:04:00] match, and I've even been able to match the transcode settings, the, you know, the color and everything.


So, because my plan is to hopefully be able to send a keyed version of the video back into Unreal. Because the nice thing about doing that, instead of finishing that shot completely in Fusion, is you get all the lighting inside of Unreal actually cast onto your video, which is kind of a nice feature.


Um, so what ends up happening is: you run the script and paste it into the Unreal command console, and it makes the sequence, and everything looks good. I found that when you render that out, that video is actually a frame early. 


**Eliot:** Okay, let me double check that and make sure I understand.


So, when you're rendering it out in Unreal, are you rendering just the background, or are you rendering...?


**Brett:** No, I'm rendering the entire shot, just as a test to see. Cause you can preview it in Unreal. I go in and actually turn off the Hidden in Game setting [00:05:00] so I can actually watch it and play it in real time.


Uh, and I have a pretty beefy system, and it's able to play it in real time, but it's still got a little bit of pixelization and stuff. So I like to render it out and see, what am I actually getting if I just take this and render it out? I don't even necessarily pull a key on it. Usually I'll just go and turn the hidden feature off so that it's there.


Sometimes I'll put a temp key on it using the feature you guys have added in the script, which allows you to make a, uh, a material for keying, right? 


**Eliot:** Yeah. Yeah. 


**Brett:** So yeah, and that works great. Uh, and I'll put it in, I'll render it, and I'll be very careful, like I'm going to make sure I'm not touching anything and slipping it on myself


by accident. And I render it, and the sync is a little loose, like between the movement on the camera original and the movement in the Unreal CG environment. What I discovered is the camera footage is a frame different: if I push it a frame later, it syncs up. So it's actually [00:06:00] usable and totally cool, and I can fix it inside of Unreal too. So I go into the sequence in Unreal and I move it over a frame.


**Eliot:** Cause when I've done experiments with that, what I ran into is that I had a hard time getting that to sync; I was getting very kind of erratic behavior in the real-time, um, real-time key.


And so in the tutorials, I actually turned it off and composited it; I just rendered the frames and comped it in, I think in Fusion, and then it locked. But I want to double-check. Um, go ahead. 


**Brett:** Yeah. Yeah. So when I do that, because I've done the same thing: uh, if I put it a frame later, I'm able to get it to sync up.


Like if I render it out, with the video, out of Unreal. So basically, I use your script, I paste it in, and it creates the sequence. And then if I do nothing other than just turn that video back on, so that it's actually visible when I do my render, I've noticed that [00:07:00] it's a little loose. But it's only one frame.


And for a while, I thought, oh, the tracking's bad. What's going on? And like I said, I was a little frustrated; I chased it for two days until I realized it was in Fusion, um, inside of the edit side of Resolve. And I said, well, what happens if I just push it one frame later? And then it locked right up and it looked great.


And so that's what it was. And then I went into Unreal and said, okay, is this happening inside of Unreal? Am I missing something? And I did the same thing: I took the video track in the sequence and moved it one frame later, rendered it out, and then everything looked great. 
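
For reference, the one-frame nudge Brett describes can be scripted instead of dragged by hand in Sequencer. A minimal sketch against Unreal's sequencer Python API; the asset path is hypothetical, and the loose class-name filter stands in for however the take's footage track is actually set up:

```python
# Minimal sketch (not AutoShot's own code): shift the camera-footage
# section one frame later in a generated Level Sequence, the same fix
# described above. The asset path below is hypothetical.
import unreal

seq = unreal.load_asset("/Game/Takes/Take_001/Take_001_Sequence")

for binding in seq.get_bindings():
    for track in binding.get_tracks():
        for section in track.get_sections():
            # Match media / image-plate sections by class name so we
            # don't assume which plugin provides the footage section.
            name = type(section).__name__
            if "Media" in name or "ImagePlate" in name:
                start = section.get_start_frame()
                end = section.get_end_frame()
                section.set_range(start + 1, end + 1)  # one frame later
                unreal.log(f"Shifted {section.get_name()} by +1 frame")
```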


**Eliot:** Oh, I want this take. Yeah, this, this could be very, very useful to understand.


Um, because I ran into some weird stuff with rendering the image plate. The first round was that the initial, uh, anti-aliasing sampling system was causing weird, you know, timing artifacts, so we changed the anti-aliasing systems. Um, so I want to try it, because I've [00:08:00] beat my head against this on the Unreal side. And if it's just a frame...


**Brett:** It seems to be just a frame. Because like I said, my goal would be to take the video footage, what's in the image plate, go back to the raw, key it inside of Fusion, and then spit out a keyed version, uh, with an alpha, but match the color of what you guys are doing. Which I've been able to do.


It's just changing the raw settings and getting the right setup. I think it's, uh, sRGB with, uh, linear gamma; it looks right. If that's what you guys are doing, then that's what I'm doing to get it. But then my goal would be to take that back into Unreal and replace the footage in the image plate, from your transcode to now my version that has the alpha.


Right. And the reason I like that workflow, like I said, is because the lighting really looks a lot better, because all the lighting in Unreal is now being cast on my image plate. 
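
For reference, the transfer function Brett is matching is presumably the standard piecewise sRGB curve; whether the transcode uses exactly this is an assumption, but the math itself is stock:

```python
# Stock piecewise sRGB transfer functions. Whether the EXR transcode
# uses exactly this curve is an assumption; the formulas are standard.
import numpy as np

def srgb_to_linear(v):
    v = np.asarray(v, dtype=np.float64)
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def linear_to_srgb(v):
    v = np.asarray(v, dtype=np.float64)
    return np.where(v <= 0.0031308, v * 12.92, 1.055 * v ** (1 / 2.4) - 0.055)

# Round trip is lossless to floating-point precision.
x = np.linspace(0.0, 1.0, 11)
assert np.allclose(linear_to_srgb(srgb_to_linear(x)), x)
```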


**Eliot:** [00:09:00] Right. 


**Brett:** Um, makes sense. What I've got to work out... because one of the other things that happens, and I talked to you a little bit about this, I've gotten around this: if you move the camera forward and backward, um, and you've got any kind of objects near your actor, uh, sometimes the actor will end up intersecting with that object in the Unreal environment.


And also, since that actor is not necessarily really moving in the shot, but it's the camera that's moving, I'd prefer it if the actor is kind of in a locked position inside Unreal. But I think I figured out how to do that too. So that's more of an Unreal workflow thing. That's really not


your guys' issue. Although... and I like what it does, where it puts the image plate in the scene; uh, that's what gives you the good lighting. But yeah, depending on your camera movement, it can cause some weird things with the intersection. But that can be fixed in the render, because I'll still finally take it back into Resolve to do the color and [00:10:00] the final finish. I really just want to be able to do some of the compositing and pick up that Unreal lighting


on my actor. That's really what I'm looking for. Um, but I figured out a lot of those things. The one thing... and I've even figured out this offset; I've gotten it all working the way I want. Uh, it's really just a matter of kind of giving you guys a heads-up, and yeah, I'll spit out a couple of takes.


Should I post those? 


**Eliot:** Uh, yeah. Yeah. Just post 'em, put 'em on the forum or whatever, if that's okay.


**Brett:** These are all tests. There's no... 


**Eliot:** No need to name anything. Yeah. And just send a link and stuff. Cause we go through it pretty systematically when we're testing, to make sure that we're frame-on.


Cause I don't want things to be a frame off. Like, we spent too much time to get it to be frame-on, and if there's something causing a frame offset, I want to find out exactly what it is.


**Brett:** Yeah, it seems to be doing that. And it seems to be doing it consistently, like I said, cause I did multiple takes trying to figure out this problem for myself, so I could come on with you guys today and go, okay, here's what's happening.


And this is what's [00:11:00] happening. But I did figure out what it was. It's not that your tracking's bad; the tracking is very good. All of that's working very well. It's just that the script gives you an offset, and it may be because of the 30-to-24, because I'm shooting 24 frames on the video footage, on the raw, on the Blackmagic.
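
To make the suspected 30-to-24 cause concrete: each 30 fps frame is 0.8 frames at 24 fps, so a converter that truncates lands one frame earlier than one that rounds. This is only an illustration of the hypothesis, not the script's actual code:

```python
# Illustration only: mapping frame counts stamped at 30 fps onto a
# 24 fps timeline. round() and floor() disagree by exactly one frame.
import math

for f30 in range(5):
    f24 = f30 * 24 / 30
    print(f30, "->", round(f24), "(round) vs", math.floor(f24), "(floor)")
# f30=1 -> 0.8: round() says frame 1, floor() says frame 0
# f30=2 -> 1.6: round() says frame 2, floor() says frame 1
```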


**Eliot:** Well, if it reproduces, we can find it. I mean, yeah, that's the good thing. Are you on 5.4 or 5.3? 


**Brett:** We're running 5.3.2. Um, and mainly that's because... we are also doing the real-time preview, and I don't know if those Lonet plugins work on, uh, 5.4 also. Um... is there something inside of, uh, Omniverse?


Oh, uh, they don't have... the plugin that you apply in Omniverse, they only go up to 5.3.2. They don't have a 5.4. Yeah. 


**Eliot:** Yeah. You can write out a USD from 5. 4 with a native USD exporter. Um, and, and, uh, the native USD exporter will work fine for, for simpler things. If you get something really complex, it can, uh, but we have to check it [00:12:00] cause I know they, I know they redid the, the USD Um, so the, the native one may, Oh, to make it better.


Yeah, yeah. So we use the Omniverse one just because initially the native exporter wasn't really up to snuff; it just missed a ton of stuff. Uh, but it is getting better. Um, so at some point we can do a tutorial on that. But that's, yeah... 


**Brett:** I'm really focused on making it work in 5.3.2, and not trying to move beyond that until we've got it and everything is working very well


on that side, except for that one issue with the one-frame offset. And I'll send you something on that. And I figured out... I mean, I can still do everything I need to do, because now I know what it is: it's just moving a frame, and I'm good to go. And I can fix it in Unreal, I can fix it in Fusion.


I can do everything I need to do. The other thing we were dealing with, and this was what Navaz was helping me with a little bit, is the real-time preview inside of Unreal. Um, and like I said, it has a lot to do with the owner of this space and what his expectations are, what he wants [00:13:00] before he presents this to all these people that he wants to involve in what we're trying to do over there.


Um, and the first thing was the setup: getting it up and running every time was kind of an arduous task. Uh, we've gotten past that by doing a couple of things. The big thing was he didn't have a full rig at his studio, so I was basically taking my rig out there. But I don't want to take it all put together.


So I would disassemble it, pack it up, take it out there, unpack it all, put it all back together. And invariably, and he knows this because he's worked in post for years, I was like, you know, when you take things apart and put them back together, there's always something that's not going to work quite right.


And you're going to have to do troubleshooting. Um, but we've kind of minimized that, cause now he does have a rig on site. Um, and you know, Navaz actually talked to me about automating some of the things that I have to do every time. Cause, like, when we do the Unreal live preview script, if we come back another [00:14:00] day and want to set it up again, and we reset the origin or anything, there are a few things we have to go in and clear out, some of the things that are created by your script.


We make sure that we get them all out of there, because otherwise things get a little wonky when you're a couple days out from the first time you set it all up. In my opinion, you want to redo it every time. You want to at least redo the origin, rescan the set, do those kinds of things, just to make sure that everything's as good and up to date as possible.
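
That reset is also the kind of thing Navaz's automation idea could cover. A hedged sketch in Unreal's editor Python, assuming, hypothetically, that the preview script's actors share a recognizable label prefix:

```python
# Hedged sketch: delete actors a previous live-preview setup left in the
# level before re-running the setup script. The "JetSet" label prefix is
# hypothetical; substitute whatever the script actually names its actors.
import unreal

actors = unreal.get_editor_subsystem(unreal.EditorActorSubsystem)
for actor in actors.get_all_level_actors():
    if actor.get_actor_label().startswith("JetSet"):
        unreal.log(f"Removing stale preview actor {actor.get_actor_label()}")
        actors.destroy_actor(actor)
```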


But that's usually like a 10-, 15-minute process, which is not bad. That's totally acceptable for us, and I think we've gotten there. Um, the one thing he gets frustrated with is... we've got these sets that, uh, have multiple layers. We've got actors behind desks and things like that. And, uh, that takes a little bit of prep to get the preview to represent that


inside of, uh, Unreal, because you have to make layers and you have to position things and, you know, go [00:15:00] into the, uh, the compositing tree, and... 


**Eliot:** Yeah, so there's things you have to do.


**Brett:** There's things you have to do to make that work that don't happen automatically, because your script isn't necessarily designed for a set that's got


objects going at different levels in terms of where your actor's standing. Um, but again, I've kind of worked that out. The other big thing he wants, that I don't know that we can do, and I was talking to them about this, is the quality of the preview coming out. And this has a lot to do with Unreal and the way it does these live previews.


The big thing is, the anti-aliasing in the live preview is pretty much non-existent. I mean, you can play with the settings and stuff, but it doesn't really improve the quality of the image, in my experience. Now, there is a way to get a very clean image out of there using the image capture. Um, the problem with that is it doesn't really seem to work inside of Composure.


It'll work with your camera, so you can get your CG environment [00:16:00] and it looks great. The problem is your camera footage is now no longer in line. Now, I talked to another guy who is doing this, or working and trying to work it out and get it working better for him. But he's using a much better camera than ours, one that can be genlocked.


And this is what I was trying to explain to the owner: I don't think, with the cameras... because we're using essentially three Pocket Cinema 6Ks, is what he's got over there. And those are great cameras, but they don't genlock; you can't genlock them externally. Uh, and the other guy that we talked to is basically genlocking his camera and using an Ultimatte to do all the keying.


So he's completely bypassing Unreal when it comes to the camera footage. Um, and he says it's working great. But I told him, I said, well, that means we've got to buy an Ultimatte and a new camera. Uh, and he goes, well, I don't want to do that. I said, well, then I think we're going to have to just accept that this is what it's going to look like.


And in my opinion, [00:17:00] it looks good enough, when everybody understands this isn't the final; this is just us setting it up to give you an idea of what it's going to look like, maybe play with the lighting a little bit, so that we can get the lighting right for what we have for the set. But Navaz actually was investigating doing a real-time preview inside of, uh, Blender.


So, I don't know how far they got on that, but he and I texted back and forth about that possibility. So maybe he can talk about that. Or it looks like there's somebody else here who's got some issues, so I don't want to monopolize all the time. But that's kind of where we are. 


**Eliot:** I'll jump in with Robert, because I know Robert has some stuff impending, but I'll tell you this.


So here's my rough, like, three-minute take on this: you are not alone. It's something that comes up with everybody, which is that they want to have a live preview that shows up on a monitor, so the people who are paying for all this stuff are looking at it like, oh, okay,


I get this. It's, you know, the client feed, basically. And the problem [00:18:00] is that trying to do that hooked into Unreal... and it's not Unreal specifically, it's the nature of having... and this is how virtual production has been doing this for at least 20 years, right? You have your tracker, and you have your engine, and you have your Ultimatte.


And getting those three things to all work together is a real bear. The way we, you know, did it in the old systems, yeah, we hard-genlocked everything, right? Um, and it's still a bear. And I came to a conclusion, because we're deciding internally, like, how you tackle these things.


And one of the things that we discovered very, very recently, and I made a video of this, and we already have a pipeline in Blender for doing this, is that we can build a Gaussian splat of a scene. You basically render, you know, a couple hundred images at different points, and it crunches down to a splat, and that splat will render in the phone.


And it looks like, you know, 80 percent of the original, like done, like it rendered. [00:19:00] The works in 3D, everything done. Um, and I went, okay, there we go. And so the last piece that we need to do, which is queued up next on our dev list, is taking a composite in the phone: taking that live feed, which we already have from the Accsoon, because we already have it for the calibration, and compositing it.


And that way, in the, you know, Jet Set viewfinder, you see the cine footage composited in real time. We know how to write keyers; keyers are not a problem. Um, and composited, so you see that on top of the, you know, Gaussian splat you spit out. And especially, Unreal can render a thousand frames in, like, you know,


40 seconds, right? 


**Brett:** But that's all Blender-specific, like you'd have to create the scene inside of Blender to do this. 


**Eliot:** We shipped the first version in Blender, and we're already building the version in Unreal. Because all you need to do is render, you know, a bunch of frames of your scene, along with some associated metadata files.


And then we run that through, first, um, [00:20:00] Reality Capture, which takes about, you know, five minutes to crunch stuff, and then PostShot, which takes about 30, you know, to crunch something. And then you get a Gaussian splat. And yeah, then it works. It just works.
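
A rough sketch of the first half of that pipeline in Blender Python: render stills from sampled camera positions and save each pose alongside, which is the raw material Reality Capture and PostShot crunch into a splat. The orbit path and file layout here are assumptions, not the shipped pipeline:

```python
# Rough sketch (assumed layout, not the shipped pipeline): render stills
# plus per-image camera poses as training data for a Gaussian splat.
import bpy
import json
import math

scene = bpy.context.scene
cam = scene.camera
poses = []

N = 200  # "a couple hundred images"
for i in range(N):
    a = 2 * math.pi * i / N
    cam.location = (6 * math.cos(a), 6 * math.sin(a), 2.0)
    # Roughly aim inward at the scene center; a real pass would also
    # vary height and radius for better coverage.
    cam.rotation_euler = (math.radians(75), 0.0, a + math.pi / 2)
    scene.render.filepath = f"//splat/frame_{i:04d}.png"
    bpy.ops.render.render(write_still=True)
    poses.append({
        "file": scene.render.filepath,
        "matrix_world": [list(row) for row in cam.matrix_world],
    })

with open(bpy.path.abspath("//splat/poses.json"), "w") as f:
    json.dump(poses, f, indent=2)
```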


**Brett:** Yeah, I mean, that sounds fantastic. And I think I've talked him out of this.


I've kind of pushed him in this direction of, like, this is what it's going to be for right now, you know. Uh, and I kind of think he's accepted that. So... but anything we can do to improve that preview is great. 


**Eliot:** It's key. 


**Brett:** It's critical 


**Eliot:** Because at a certain point, you know, especially when people aren't as technical and they're not used to looking at it, they're seeing a green screen and actors.


And if you have really bad-looking CGI behind them, they're just... you know, you spend most of your time kind of trying to tamp down panic and see, you know, is it going to look like the end of the loss? Right. And you need to have that preview there. Um, but if we can get it all on the phone, that means we can fix it, because then it's all under our own code stack and we just solve it. When it's our own [00:21:00] code stack, we can solve it; when it's not in our own code stack, you come into these very intractable problems where, you know, I don't know how to fix it.


And so, um, that's kind of where we're going, what we're aiming for. And I'd expect, you know, we'll do it fairly quickly. Um, there's some number of weeks of work on this; exactly how many weeks, we're going to have to get into it to see. But it's literally going to be the next thing up on our decks, to solve this sort of stuff.


Um, but that's where we're going. We're going for compositing in the phone, where it's all under one roof. We already have the feeds going in there, we know what we need to do, we already have Gaussian splats. Um, I've got the first versions of the Unreal Gaussian splat generator working.


Um, they're a little bit goofy, so I'm going to have to go through and make them smoother. Um, the Blender one's pretty good, so I did a tutorial on that one. And you can do it for any 3D software: we've got one for [00:22:00] Maya, we've got one for Unreal, there's ones for Houdini. Generating Gaussian splats is just, like, this game-changer for the scene.


And you can still overlay: if you have stuff moving in the scene, you can bring that in as a USDZ file. If you have cars or something moving, yeah, just bring those in as USDZ. But the overall scene lighting, you know, where you look around, and the feel of it, is done with the Gaussian splats. So it looks like it's a render,


literally like what you rendered. Um, and then everything's in the phone, and then it's easy, and I can solve it, you know. And the next thing we're shipping, uh, is a remote Zoom integration, right into Jet Set. So you see what I do on, on this call; sometimes I'll patch in on someone's thing.


That's with our own homebrew system that we built, like, three years ago, but we're doing a direct integration into Zoom. Sure. Um, so you'll be able to do this: you'll be able to send a link to, you know, the person you're working with, and they'll open it up, and it'll open up the file of the scene, patch in a Zoom link, and so you can talk to them remotely and walk them through [00:23:00] operating Jet Set


on a Zoom call, because it's such a game-changer. So, uh, that's just a little peek of the future. But you're not alone; this is fundamental, and we have to solve it. But I'm just going to tell you that the way I think we're going to try to solve it is inside the phone, instead of trying to hog-tie a bunch of external systems together.


I mean, there are specific users who can handle it, and you can clearly handle it, but man, it's hard to do that in production, especially when you're getting up and taking apart stuff, and it all goes in the truck, drives over here, and you're like, oh shoot, what happened to, you know, some random connection. You're dealing with so many weird things. You can get power levels on a stage where the voltage output is on some different level than what you'd expect, and it jacks up stuff, right?


**Brett:** Oh, we've even discovered that with the phone cooler. He bought this phone cooler that he wants to run on AC power, and I've said, you know, actually the power out of the Accsoon gives you [00:24:00] better power, and it gets colder. Because we're running it for an hour or so, and I'm like, the phone's heating up.


I'm getting warming warnings, and I feel that cooler, and I'm like, this thing is barely cold. It's not even cold, really. And the phone is definitely hot, you know? 


**Eliot:** And watch out for running long AC power lines on a stage, because you can get... and it's an electrical engineering term, it's called, uh, ground loops, where your ground voltage here is different than it is over there, and you hook the wires together, and weird, crazy, double-E, unsolvable analog stuff starts happening. Whereas on the battery, on the camera, there's no external outside interference.


**Brett:** That's the funny thing, because everything I've been doing, I'm running off batteries, uh, when I'm doing my testing at home, just cause I prefer to have the camera not tethered to a bunch of cables. Um, but he's really hard on this. The owner, again, is like, I want everything on AC power. I don't want us to be having to change batteries.


I don't want to be doing that, because it's all here. And I get it. But now, we've got this big... and today it's working. You've got it [00:25:00] on a jib. It's really starting to come together. It's all a work in progress, just like what you guys are doing. Uh, and I appreciate you guys doing this. And the product is very impressive.


I know I complain about things, but I'm very impressed with what you're doing, you know.


**Eliot:** This is exactly the stuff I want to hunt down. Cause it's that frame-off stuff that drives you nuts. It's one thing when you're just doing two shots, but very quickly, if the show lights up, you're doing 600 shots, and it'll happen in an eye blink.


And that's what we want to be prepped for when the anvil comes down. Cause we all know what it looks like. 


**Brett:** Uh, yeah, I'll zip up some stuff. I'm actually going to be shooting a bunch of stuff tomorrow too, where I'll actually be on the stage. But the most recent stuff I have, I shot against a green screen, but it's a real kind of


kludgy setup: basically my son's room, which is painted green. 


**Eliot:** No, that's good. That's good. I love this, actually, I really like that. I love people testing in their living room with a roll of green screen, because that means we can iterate and we can test right there and right [00:26:00] then, and we can solve everything.


If you have it cracked in the living room, going up to the stage is not a big deal.


**Brett:** Yeah, I'll send you the stuff I did, cause I've got a few, and they all were doing that one-frame thing. So, uh, I'll pull 'em and send 'em to you today. Uh, so do I just post 'em on the forum? Is that where I put 'em?


**Eliot:** Yeah, go ahead and post the link on the forum, like, you know, with some explanatory text, so I know which one is which and what the problem is and stuff like that. And that way I can pull it down, and we can recreate it and, uh, hunt it down. 


**Brett:** Should I just put it on, like, a Dropbox? Or do you guys have a place to put 'em?


**Eliot:** Yeah, Dropbox is fine.


**Brett:** Okay, that's fine. I'll put it on Dropbox and give you a link, uh, in the forum. And I'll shut up now.


**Eliot:** No worries. Kudos on the hat with the Pink Floyd. 


**Brett:** I got this... uh, I don't know if anybody lives in the L.A. area. They had that big, uh, like a big, uh, what do you call it? It was like a museum.


It was a traveling thing. It started in London, and it had, like, all this stuff. I got this there. Uh, it was kind of a cool thing, because I was like, oh, that's, like, what they called themselves way back when, with Syd Barrett. [00:27:00]


**Eliot:** exactly. 


**Brett:** A true fan knows that. Yes, of course. 


**Eliot:** All right. I'll stop talking. All right.


I'll leave you with your Saucerful of Secrets. Okay. Hey, Robert. All right. So we, uh... actually, wait, let me double-check. Uh, Navaz, do we need to fix anything that you're cranking on? I know you were in here first. 


**Navaz:** Oh, no, everything's good. Um, I will say this: through the tests with Brett, we found that, uh, Lonet 2 on Blender only works,


um, with Mac. Really? Right, it's interesting. So, like, I mean, I was trying on my PC for hours and stuff, and then all of a sudden, um, you know, Roman put it on his computer, and he runs a Mac, and then all of a sudden it worked. Like, it worked flawlessly. 


**Eliot:** That's weird. I usually test on a... all right, I'm gonna have to try that.


I usually test on a PC. There's some weird settings on Windows that can bite you. Um, so let me... I'm gonna [00:28:00] take a little run at it on Windows. I'll do a quick test after the call and see what it's doing.


**Navaz:** Yeah, right out of the box. I mean, it works perfectly with, um, with, uh, you know, with the Macs. Um... well, actually, the other thing I needed to ask you, um, we can... I can do it after, um, but it's more about the object, uh, locators.


**Eliot:** Oh, got it. Got it. Okay. Yeah. So, I'll tell you what: Robert, I know, has a shoot coming in fast, so let me get Robert, uh, hooked up. All right, Robert. Let's see, where are we at with that? 


**Rob:** Hey Eliot. So yeah. Um, the process was strange, in that one of my lenses, uh, was able to calibrate at every distance.


The other one didn't at all. And, um, the first one, at two of the zoom distances, was not able to push it back to the phone. Um, I did it all kind of round-robin in the app, where I just kept on making a new [00:29:00] profile, new profile, new profile. So I don't know if that's not the right way to do it, but it seemed like it captured everything fine.


And I just don't know, on the AutoShot part, why the error. 


**Eliot:** Yeah, do you have your phone there? Uh, let's try a screen share with your AutoShot, and let's just crank through it; let's figure out what's going on. Because if I can see what's going on in the console, I can probably debug it. Uh, and let me pull up the zip file, because I know you sent me that. So let me pull that up. Um,


That sent me that, so let me pull that up. Um,


**Rob:** okay, so, screen sharing now, so,


Share... okay. So you should see my AutoShot. And now let's go... Jet Set...


**Eliot:** The archive, real quick, so I can check this calibration. Okay, so let's see. [00:30:00] All right, so we've got ghosts, uh, ghosts everywhere. There we go. All right. So, uh, is this one of the calibrations that gave you problems? Let's see, this is...


**Rob:** Yeah, this is the one that wouldn't push back. So it calibrated, um, and then couldn't reach the phone.


**Eliot:** Okay, let's, uh, let's punch in your correct sensor width. Let's just see what it, uh... calibration, push it over, run there. And is this the one you sent me, the archive.zip? Um, 


**Rob:** I put a few in there actually. 


**Eliot:** Okay. Let me, let me pull up the archives.



**Rob:** I think, I think I put, like, at least two. 


**Eliot:** It's crunching. Let me see what's going on.


Cine... there's the iPhone. Let's go take a look at what we've got. [00:31:00] All right. So, coming over... yeah, it looks like a decent amount of detail.


Phone... um, you've got an object in your field of view. Looks like a mic, a microphone or a cable or something like that. That could be jacking you up. 


**Rob:** Correct. And that was part of, I guess, part of my concern with this: where I'm mounting it, the lens is always gonna be... seems like it's always gonna be in the way.


A little bit.


**Eliot:** We can't... your cine... oh wait. Oh shoot. There it is. That's what's going on. Uh, okay. So in your cine files, um, I think... yeah, that is what's breaking. [00:32:00] This is from, what, is it a RED Komodo?


**Rob:** Yeah. 


**Eliot:** Can you... is it possible to get a clean feed from that? Cause right now, your video feed... this is what's breaking.


Your video feed, A, is inset: in your overall HDMI feed, the video is inset, um, a chunk of it, and it's got the name of the reel, A001C03, and also the timecode burned into the image. And so you're running into something that we're gonna be, you know, adding a fix to, um, at some point soon, but I don't think we have it yet, um, where we can handle image insets.


Right now we basically depend on a clean feed. And so what's happening is that the algorithm is seeing the A001, C003, and the timecode as image matches. So it's probably going to throw this thing off; it's probably going to have a hard time converging on it. And we need to have this in the UIs: remove all the overlays from the frame.


We're about to [00:33:00] do a big rework of, um, the cine camera, uh, calibration stuff, as preparation for doing live compositing in the phone. Um, so this is on the list. But, okay, so let's take a look. So it calibrated... I'm worried. I mean, it calibrated, but then it failed pushing.


So I bet... I just ran it. Termination: convergence. Oh boy, I bet it broke the calibration. Although it said convergence. Um...


That's it. I bet that's the problem. Okay, so... do you have the Komodo there? 


**Rob:** Yeah, got it. I got the whole... 


**Eliot:** All right, let's take a look. Um, let's try this thing out on the Komodo. Is there a way we can set that view to not have a data burn-in, to just be a clean feed?


**Rob:** You're saying basically in like the letterbox area and the black [00:34:00] bands, it's actually typing the time code numbers and stuff? 


**Eliot:** Yep. Let me show you this. If you look at the archive you sent me... and I'm going to share my screen so you can see it, because this bites people, and we've got to fix it.


Um, and we're going to be able to write some code. All right... uh, allow... all right, they're just sending one JPEG. Okay. So can you see this?


**Rob:** don't see your screen yet. Uh, I 


**Eliot:** think there's multiple shares. You might have to click on a 


**Rob:** different one. Okay. So I exited my share and I don't see yours yet. No. Um, 


**Eliot:** I think.


Oh, I 


**Rob:** see. It's a tab. It's a tab. 


**Eliot:** Okay. Yeah. Go to that share. So this, this is what's, this is what's biting you. This, so there's three, three things. The, the burn in. 


**Rob:** Yep. 


**Eliot:** And the other thing that's biting, biting you is that it's, and this is on the reds and we got it. We got, we have to write some code to handle this, which we're.


in the process of doing; it's just going to take a little bit of time to get there. So the REDs... [00:35:00] um, like, the Sonys and almost every other camera just do a straight, you know... the HDMI feed is just a straight one-to-one of what's coming in on the camera.


Great; it's kind of exactly what we want. Um, and some of the REDs do an inset, and then they have the burn-ins on the outside of it. And right now that breaks our calibration process. I didn't know they did this when we wrote the initial calibration, and then we ran into this. So we're going to add some code that automatically insets.


Like, you know, that lets the user say, this is the part I care about. And the worst offenders are some of the old DSLRs, where it's actually, like, written over everything... you can see my, my pen, right? 


**Navaz:** Yeah. 


**Eliot:** Where they write it on top of the image. I'm like, I'm like, you can't fix that optically. That's, that's, uh, you know, that's, that's a no go, but at least you can, at least the red is, is insetting it.


So let's look at the RED menu. Um, you can even put it up on the, uh, on the screen, and we can try to find it, cause there should be, uh... and it's a Komodo. Let me see if I've got a [00:36:00] RED Komodo manual around here. Yeah, this will bite you. 


**Rob:** And, um, quick, quick question. I'm just thinking like, um.


And I'm not, I'm sure we'll solve this today, but is it possible to calibrate after a shoot also? So we just capture everything and then later we calibrate it, then it knows how to map things. 


**Eliot:** Um, you need to have an, a calibration operating in the shoot. We are going to add something to where you can, um, change calibrations afterwards like that.


We are going to add something for that, but you'd need to have a calibration running during the shoot, so that it knows that it's a cine shoot. Um, so it has, you know... so it knows, okay, I'm on a 24, I'm on a 32, or something like that. We need that because that tells the pipeline to switch into cine mode and to, you know, look for extractions from the, uh, from the cine [00:37:00] footage and all this kind of stuff.


**Rob:** Um, but, but just to talk that through since, since multiple did actually calibrate correctly. So like other shots with this camera, I've already got them to, to sync back to the phone. So worst case scenario, I would be able to just select the wrong setting and then later change it is what you're saying.


**Eliot:** Yeah, I think, I think that should work. I don't know if we've added the piece and auto shot to swap in a different calibration yet. I don't think we've actually added that. It's just come up a couple of times. And I know we were, we were talking about it, um, as a, as a stop gap in some cases, but I, I really want to, let's, let's see if we can't crack this nut.


Um, because then we're correct from the get-go. And then the right solution, which we will eventually do, is having something that detects these repeating pixels, flags you during calibration, and says, okay, we need to either do an inset or, uh, you know, fix the camera settings. So we have that planned for the next chunk of work, to get ready to do compositing in the phone.


Cause [00:38:00] then we have a closed loop. Once we're compositing in the phone, you can see the tracked background and the calibration, and you'll be able to just point the camera at the ground and see if things are sticking, you know, lining up. And then you just know, right?


Then it's a closed-loop system. So that's where we're getting to. Um, okay. So let me find my... let's see if I have a Komodo manual. Uh, RED... there's


Komodo operations guide.[00:39:00] 


**Rob:** Guides. Would it be a guide? 


**Eliot:** Uh, let's see. Yeah. I'm already in the operations guide 


**Rob:** under the, 


**Eliot:** the SDI. 


**Rob:** There's a menu setting called a guide. 


**Eliot:** Uh, let's see. So, yeah. Okay. I'll show you what I'm looking at. Uh, let me share my screen so you can kind of see where I'm at. Share Komodo. Okay. So this is kind of what I'm looking for is under, uh, is the Komodo send you're, you're monitoring this on SDI.


**Rob:** Uh, it's SDI and then we convert it, but, but yeah. 


**Eliot:** Okay. That sounds good. All right. So SDI, okay, so let's see. Guides. Let's 


**Rob:** take a look at the monitor. That's, that's currently enabled for [00:40:00] me, so I'm wondering if I just disabled guides there. 


**Eliot:** Yeah. Let's go ahead and disable guides. Do you have a way to, to see the output of your, your SDI output?


I mean, I guess you probably have your rig, so you can hook it up. Yeah, you can actually look at it through the Axio C app, and you'll be able to see what the output footage looks like. 


**Rob:** Got it. Um, I was not running that after calibration, so I'll have to find that.


**Eliot:** Yeah, that's what we want. We want.


**Rob:** Okay, so it's a different menu. 


**Eliot:** Uh, no, let's see. So I've got, it's going to be somewhere on the, um, uh, let's see, the SDI port settings. So we can, we have guides, tools, and magnify. So I'm just reading through, uh, the guides, tools, and the magnification [00:41:00] to see. Uh, if there is a way to, a way to see how this is going to behave.


So they've got a picture of different modes. There's simple mode, the clip name, um, basic... okay, so they basically have increasing levels of stuff around the edges. They may only let you do...


We should be able to turn off guides and tools. And the overlay... if we can get rid of the guides, tools, and overlays, that should get us part of the way there. Then we'll just have a slightly inaccurate camera calibration instead of a broken one. 


**Rob:** And like worst case scenario too, I can obviously like zoom in and post export that clip and then use that like just basically manually take that out myself.


Thanks a lot. 
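
That manual stopgap can be scripted. A hedged OpenCV sketch that crops the inset picture area out of every frame, so nothing downstream ever sees the burn-ins; the crop rectangle is a placeholder to measure off the real feed:

```python
# Hedged stopgap sketch: crop the inset picture area out of a feed whose
# letterbox carries reel-name/timecode burn-ins. The crop rectangle is a
# placeholder; measure it off the actual SDI capture.
import cv2

X, Y, W, H = 128, 72, 1664, 936  # placeholder inset region (pixels)

cap = cv2.VideoCapture("burned_in_feed.mov")
fps = cap.get(cv2.CAP_PROP_FPS)
out = cv2.VideoWriter("clean_feed.mp4",
                      cv2.VideoWriter_fourcc(*"mp4v"), fps, (W, H))
while True:
    ok, frame = cap.read()
    if not ok:
        break
    out.write(frame[Y:Y + H, X:X + W])  # keep only the picture area
cap.release()
out.release()
```

Note that cropping changes the effective image window, so the field-of-view question Eliot thinks through next still applies.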


**Eliot:** Um, [00:42:00] give me a second to think through this. Um, what we'd end up doing in post is first we need to get a functioning calibration because we, we need those. Oh boy, this is, this is interesting. Let me just, sorry, let me think through this. This is going to work.


I think what we'd end up doing in post is... fundamentally, it's going to miss because we have this inset on the borders. It's going to miscalculate the field of view a little bit. Um, and I hope that's all it does; I think it'll be limited to that.


I hope it doesn't miscalculate the offset. You know, we're going to have to see; there's a bunch of variables that are changing here, so I have to kind of think through it. So if we're fortunate and [00:43:00] it's only the field of view, then we can go through the whole pipeline. And then, uh, are you going into Unreal or Blender? Where's the end of it?


**Rob:** Blender?


**Eliot:** So in Blender, what will happen is that we're, we're probably going to have to do an adjustment of the field of view of the camera, um, very slightly to, and we'll see if that can, we can get that to lock in. Um, there are, there are of course ways like, you know, uh, I know you're working with Kai and, and I went through some of the synthesized stuff with her.


So the nice thing is that, as long as we have a scan of the scene... and, you know, make sure you get your scan of the scene after you set up your origin. Then, one way or another, we can post-track some things and fix it.


Um, but I also want to have it so that the real-time track coming through is okay. So there's a couple layers of things we're gonna be able to go through and try.


**Rob:** scan of the scene. I don't think I've done that yet. That's [00:44:00] so, uh, that's like a 3d photogrammetry scan. You're saying from within the jet set app.


**Eliot:** I'm within jet set. It's actually a happily simpler and faster than, than photogrammetry. So, uh, on that one, it's just under the, uh, uh, the set tab. And we're about to, um, we're about to update our UI quite substantially, but I think you're still only on the production one. Um, and under that, there's a, if you click on the set tab, on the left, there's a set of scan settings and, and one of them says, you know, start, stop, you know, et cetera.


So, uh, are you running Jet Set right now? 


**Rob:** Yeah, I'm looking at it right here. 


**Eliot:** Okay. So if you click, uh, if you already set your origin somewhere on your floor, you know, just, you know, pick your origin to make sure you know where your origin is. Yeah. And once you have your origin, then you can click start and it'll start scanning and you'll see a wireframe overlay over your, uh, over your environment.


And it'll scan pretty fast, right? It's not capturing texture here; it's just capturing the geometry of the environment. And what [00:45:00] you want to do is, after you set your origin, um, and before you shoot... and you only need to do this when you move your origin, right? If you move your origin, you need to rescan.


Otherwise you don't need to rescan. So you can just sweep it around and do a, you know, 10-, 15-second scan, just to cover most of the area of the shot, uh, the actual shot you're going to be in. Then you can hit stop. 


**Rob:** Yep. 


**Eliot:** And it'll show a wireframe overlay. And if you don't want to see that, uh, that you can tap, I think it says hide or gray.


There's a toggle that sets which way it displays, whether wireframe,


**Rob:** solid or, uh, 


**Eliot:** yeah. And you just keep tapping that until it's hidden. Um, but that is a key piece of information and it's so important that we are changing our, we're about to ship a, um, And a big update to the Jetset UI, uh, where all the buttons on the top get, or most of the buttons on the top get collapsed into a single dropdown menu on the top, because [00:46:00] we had to add more buttons, we were adding buttons and it was getting ridiculous.


Um, but we are going to add a warning on the record button that has, like, a little picture of a grid and an exclamation point, because it's so critical for post-production to get that scan. Once you have a scan and the origin, you can fix almost anything in post, um, with our SynthEyes plugin. Almost anything.


If you have your origin and that scan, we're good. 


**Rob:** You know, and, um, actress does not need to be in the shot. You simply looking for inanimate objects that are in the frame. 


**Eliot:** That's exactly it. Because those are the things we're going to, we can put tracking points on later on. We can actually detect tracking points in the, in a shot.


Then we know... because we tell SynthEyes, okay, detect all the tracking points in the shot, and we give it an AI roto, so, like, ignore the actress, right? We don't want the actress; we just want tracking points in the shot. And then the presence of that scan lets us do something magic in SynthEyes, which is, we basically fire [00:47:00] rays from the camera point of view, because we have that.


We have the camera, and we have an initial calibration that's decent. This is why I want you to have a decent calibration going in: it may not be perfect, but decent. And we fire rays from the camera, through the image plane where those trackers are detected, onto that 3D mesh. That gives us survey data.


Survey data is magic. It's stone-cold magic in post tracking. It's what you do if you're on a big shoot and it's a hundred million dollars and you've got a dedicated survey team of three people with a theodolite, like, you know, the thing on the tripod, and they're going beep, beep, beep,


continuously just pinging stuff on the set. Um, and then they send that to the tracking team. But we can do it without that. Um, and you saw how fast it was: that 10 seconds of scan is the difference between post being, like, no problem, we got this, and, uh-oh, that track is crappy.


Um, that's why we're putting that alert into the record button. It's such a big deal [00:48:00] to have that scan locked into the 3D geo of the scene. 
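
The ray-casting idea is simple enough to sketch outside SynthEyes. Assuming a scan mesh, a camera pose from tracking, and 2D tracker detections (all the values here are hypothetical), the survey points fall out of a ray-mesh intersection:

```python
# Illustrative sketch of the survey-data trick (the principle, not
# SynthEyes internals): fire a ray from the camera through each 2D
# tracker detection and keep its hit point on the scan mesh as a
# surveyed 3D position. File name and numbers are hypothetical.
import numpy as np
import trimesh

mesh = trimesh.load("set_scan.obj", force="mesh")  # scan of the set
cam_pos = np.array([0.0, 1.6, 3.0])   # camera position from tracking
fx = fy = 1500.0                      # decent initial calibration
cx, cy = 960.0, 540.0

trackers_px = np.array([[800.0, 600.0],   # 2D detections in pixels
                        [1200.0, 580.0]])

# Back-project pixels to ray directions for a camera looking down -Z.
# Sign conventions vary; a real solve also applies the camera rotation.
d = np.stack([(trackers_px[:, 0] - cx) / fx,
              -(trackers_px[:, 1] - cy) / fy,
              -np.ones(len(trackers_px))], axis=1)
d /= np.linalg.norm(d, axis=1, keepdims=True)

origins = np.tile(cam_pos, (len(d), 1))
hits, ray_idx, _ = mesh.ray.intersects_location(origins, d)
for i, p in zip(ray_idx, hits):
    print(f"tracker {i}: surveyed 3D point {p}")  # survey data for the solve
```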


**Rob:** And so like this, I don't need to bring this or should I still do this part too? 


**Eliot:** That's a separate piece that's setting the origin.


And so you always want... you can set the origin either with those, um, where it's an optical origin detection, or... you know how, when you turn on Jet Set, it detects the horizontal planes in the scene? You can just tap on one and set your origin. Um, and... are you bringing... do you have 3D objects you need to align precisely into the scene, or are you just kind of shooting it and, uh, putting in stuff later?


Okay, so in that case, you just pick your origin. I would say pick your origin at some place that's kind of convenient: either a mark on the floor or, you know, someplace you can find optically. Um, you know, a crack in the pavement. Are you doing an exterior or an interior?


**Rob:** Exterior. 


**Eliot:** Exterior? Okay. Yeah, you know, pick, pick something, you know, on your origin that you, the edge of the pavement, something that you know, Hey, I, that's where I put my origin. Once you pick your origin [00:49:00] scan, you know, just a quick scan of the, the, the areas that's going to be in the shot. Um, yep. You don't need to go crazy, right?


And in fact, Jet Set is not the right tool for doing a giant 360-degree scan. I would say just kind of sweep through the stuff that the shot's going to cover. That's it. Stop. And then you can shoot. 


**Rob:** And is that, um, saved automatically so that when I change locations, I can hit clear and then it's saved to the one that it had previously done?


**Eliot:** No, you really want to do that almost on a, as you're moving around, right? It's, It's too twitchy, honestly, to try to pre scan things. Um, and you, I guess you could do that with a different app, but generally when you're shooting, you sort of want to say, okay, I moved, I dropped the origin and you know, it takes seconds, right?


To scan.


**Rob:** understand. I'm just talking about the saving process. So like, let's say I'm set up for a shot in location A, I scan the location, I then do five takes. [00:50:00] Now I'm going to move, um, 50 feet away and we're going to have the actress walk into the shot. I'm going to set the origin again over there, right?


And then I would rescan. Yeah. And so, when I do that, um, is it saving the first scan, and then I can hit the clear button on the UI? Oh, yes. And again... so it's just going to create a log of all the scans, and later I'll know, okay, it was at 10 a.m., so therefore this is the scan from the 10 a.m. part of the shoot.


**Eliot:** Even better, AutoShot knows which scan was loaded at the time you shot and automatically will drop that into the Blender scene. 


**Rob:** Amazing. 


**Eliot:** Yeah. Yeah. Yeah. Yeah. So as soon as you hit like scan and then stop, boom, it's saved. 


**Rob:** It's saved. Okay. Um, so in the, as we were chatting here, I got the CMO app running and I can confirm that even with that guides toggle off, that I'm still seeing that black box and the numbers at the bottom.


So it must be another setting, or not... 


**Eliot:** tools. Can you turn off tools? 


**Rob:** Uh, what, in what sub menu is that in? 


**Eliot:** Let's take [00:51:00] a look. That looks like it's in the, um, uh,


Overlay mode. Okay, okay. Here... oh, I think you're gonna have to set it in, of all things... all right, uh, you're gonna want to go into the SDI mode. There's something called Overlay Opacity. 


**Rob:** SDI mode. Shh. 


**Eliot:** Uh, uh, let me, do you see my, uh, my screen sharing? 


**Rob:** Oh, sorry. Yeah. Backed up. Yeah. 


**Eliot:** Okay. So there's the Komodo operations guide under SDI.


Okay. 


**Rob:** Monitoring. Got it. One sec. Menu.


SDI. Okay. Yeah. 


**Eliot:** All right. And then see down here where it says overlay opacity. 


**Rob:** Yes. Okay. I have the overlay turned on also. I could [00:52:00] probably just turn off the overlay. 


**Eliot:** Let's see if you can turn it off. 


**Rob:** Okay. I'm checking Simo now.


Uh, yes. So now there is no data there. There's just the black, like, the black letterbox. 


**Eliot:** All right, and you have the lens that was the problem lens mounted? 


**Rob:** Uh, this is the one that didn't calibrate at all. So one of them was not pushing back, but this one had zero success calibration. 


**Eliot:** And what's the focal length on this guy?


**Rob:** Uh, 28 to 70. 


**Eliot:** Okay, so I'll tell you what, let me, um, let me, uh, you wanna, you wanna just calibrate it right now? Let's, uh, and I'll, let me jump in with a QR code so I can watch, watch it, and then we can kind of take a look and see if we can't crack it. 


**Rob:** I might have to adjust a few rigging things, but yeah, give me, give me like two minutes just to get this.


**Eliot:** Yeah, no worries, no worries. This is, this is great. I always like finding things that That, that break in a reliable and they're repeatedly break. Does that mean [00:53:00] that's the best kind of breakage? Cause then, then, then we can, we can track it down. It's the ones that like it broke once, then now it's been working for the past six weeks, those are the crazy ones.


**Rob:** Big detective through email versus in real time is definitely much better. Oh 


**Eliot:** yeah, yeah, yeah. This is, this is the. 


**Rob:** I'm out of hot shoe mounts here, so I'm trying to think, how am I going to connect this? Um, I


think I'm just going to gaff tape it out. There's a 


**Eliot:** I gotcha. No time to be brave. I'm good with gaff tape.


**Navaz:** Hey Elliot, since we got a little, uh, break while he's doing something 


**Eliot:** You 


**Navaz:** should check out, um, 3d maker pros, new LIDAR scanner. It's basically [00:54:00] the same type of scanner that, uh, you, you, uh, showcased at the summit, but this one's only, uh, I think it's like 1, 600. 


**Eliot:** All right. I mean, look at that. And that's, that's always interesting.


Yeah. 


**Navaz:** It's called the Eagle. They're actually sending me a demo to, uh, try out, because one of the things that I'm working with them on is trying to do those animated, uh, Gaussian splats.


**Eliot:** Oh, okay. Yeah. 


**Navaz:** So like you got me hooked on that. So now I'm trying to make it so that instead of looking at a static, you know, model, or, it's not static, a Gaussian splat.

Let's say like outside with the trees and stuff. I mean, that's what makes the Gaussian splat look real, if, uh, like the trees are moving and, you know, certain items are moving in the Gaussian splat. So that's something I'm trying to work with them on, but they're going to be sending me a demo of it.


**Eliot:** Wild. This is, this is a wild thing.


**Navaz:** I just wanted to tell you that real quick. 


**Eliot:** All right. So let me take a [00:55:00] look at this thing. Eagle in action. So there it is, running a 3D scan.

With all the cameras shooting.[00:56:00]


Oh, they even have it with a relatively inexpensive version of Geomagic.


**Navaz:** Yeah, they actually gave me, uh, the full, uh, the full Geomagic. I mean, I've been working with them for like almost two years now; I usually demo their, uh, their scanners. I mean, the ones in the past were okay, but this one, I think it's going to be groundbreaking, more so because of what you showed, um, you know, at the summit, and to be able to do that at a cheap price point.

I mean, that's my biggest thing. If it's under $2,000, compared to $10,000 or $20,000, you know, I mean, at least it's a piece of equipment that you can have that, you know, hopefully can do exactly that. I mean, I haven't got one yet, you know, in hand, but they're sending me one of the [00:57:00] demos.

So, I mean, one of the things we're looking at trying to do is, you know, an animated Gaussian splat with it. But apparently it goes out, I think it was like up to 160 meters.


**Eliot:** Wild. So yeah, it looks, so they've got, they're running LiDAR, and they've got multiple cameras pointing in different directions, so they've got all the different pieces of the, uh, oh, this is, this is wild.


That's wild. 


**Navaz:** But for $1,600 though, I mean, that's pretty good.

**Eliot:** Oh yeah. That means you actually, like, you know, there's lots of people that have it, and you go rent it for whatever per day, or you buy it, or that kind of stuff.


**Navaz:** Yeah. Now it's, it's at a price point to where it's not, you know, it's not breaking the bank, to where you're like, do I really need it? You know what I mean? But yeah, yeah.

**Eliot:** A $22,000 decision is a big decision.


**Navaz:** Yeah. But the good thing is they're sending me one for free, so that way at least we have one, you know what I mean, that we can try.


**Eliot:** Wow. I can't wait to see how it behaves. I'll [00:58:00] be super curious to see it. It looks amazing.


**Eliot:** All right, excuse me. All right. And all this stuff that's happening. I mean, I guess it was probably nine years ago that I tried doing a big photogrammetry shoot in Venice and just got my butt kicked. Um, so this is really cool to see all this stuff kind of manifesting and working. Oh, and I bought, uh, an Oculus, so I haven't even had a chance to open up the box yet.

So I'm looking forward to trying that out; I still want to try to intersect that with Blender. All right, Robert, how are we doing?


**Rob:** Good. Um, can you see on my phone now? 


**Eliot:** Okay. Let me, let me, uh, link in over there. Let's see what we got. There's the island. Oh yeah. Okay. There it is. Let me share a screen so you can all see it.[00:59:00]


**Rob:** And I forget, but aren't I supposed to be seeing what's on the camera, not on the phone, because it's connected to the Komodo? Or am I supposed to see what's optically on the phone right now?


**Eliot:** Right now you'll see what's optically on the phone. The only point right now where you see what's in the camera is when we're actually actively doing the calibration.


You'll see both the Cine feed and the Jet Set feed. So let me share a screen, so this is on the, uh, uh, there we go. Share. And there's the island. Alright. Okay, there we go. So now you can see what this is.


**Rob:** Awesome. And so I remember you needed detail, so I just spread a whole bunch of crap all across my desk. So let me know.

**Eliot:** Even better, even better. So yeah, so let's, uh, start the calibration process. It sounds like you already have an origin.

**Rob:** Sorry, I'm just gonna put in a battery for the gimbal so I can turn it easy.


**Navaz:** Hey, Elliot, with what you're doing right now, are you able to remote [01:00:00] control, uh, the, the iPhone?


**Eliot:** No, we don't, we don't have, um, this, it's an excellent question. Um, so, okay, two answers. One, what I'm doing right now, which is the remote QR code stuff: no, we don't have that. We are adding a whole set of remote controls into the browser-based remote. It already had remote controls, and we're actually upping the ante considerably.

Um, to have a menu in the, you know, the local Jet Set browser that you're familiar with. When we update the Jet Set UI to have a dropdown menu, we are also updating the remote control system to have basically the same dropdown menu.

Um, so you can go into the origin, or you can go into, you know, object, et cetera. And I'm not seeing my field of view update. Let me come back in there again and see if I can get a better live feed. Um, oh, there we go. Now it's updating live. Okay. But that's an excellent [01:01:00] suggestion, to be able to actually remote-control Jet Set.

And we will have to look into that. We're building a direct integration with Zoom, um, right now, so that users are going to be able to do this instead of just me. So that'll be super useful. All right. Start. All right.

And where's our cine frame? All right.


**Rob:** So I guess it's a little hard to see what's in focus now on the main camera.


**Eliot:** And so, right. And the main camera, is it set up as, like, a 40 millimeter, or, uh?

**Rob:** Got a 50 right now.

**Eliot:** 50. Okay. There we go. So it's a little zoomed in. Okay. And it would be great, actually, if there was even more stuff on your desk, because a large fraction of the screen has no detail.

Yeah, um, and also the other thing is, you're in front of a [01:02:00] monitor, which is a big reflective object, and that can jack stuff up. Um, so do you have, like, I don't know, uh, carpet is always good, or, um, wallpaper, um, something with just a bunch of textural detail and no reflections? Reflections are always a bear. This wall, hey, here we go. That looks, there we go. And those are, those are just boxes or something?

**Rob:** Yeah. They're like, uh, milk crates.

**Eliot:** Okay. And, uh, that right there, we go, that's more like it. That should work fine.

**Rob:** Okay. So should I start calibration?

**Eliot:** Yep.


Go ahead and start calibration. Capture a test frame. Oh, okay. I mean, yeah, you're already running, so now you can just click test frame, [01:03:00] you know, stop in position. Oh, where'd my, where'd my feed go?


There we go. There we go. So a hundred and some images. Okay. And let's just keep moving laterally by about a foot. Capture images. Okay. Move laterally.


Go.


And can you get any more lateral movement on the other side of the room? 


**Rob:** Yeah, I can on this way. This side more.

**Eliot:** So we can get some more, more parallax in the image.[01:04:00]
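[Editor's note: a short aside on why Eliot keeps asking for more lateral movement. In the standard stereo-triangulation approximation (generic geometry, not a statement about Jet Set's internal solver), a point at depth $Z$, seen by a camera with focal length $f$ in pixels, shifts by a disparity $d$ when the camera moves laterally by a baseline $b$:]

```latex
% Generic stereo approximation, not Jet Set's internal math:
% disparity d for a lateral baseline b, and depth uncertainty
% \delta Z for a pixel-measurement noise of \delta d.
d \approx \frac{f\,b}{Z},
\qquad
\delta Z \approx \frac{Z^{2}}{f\,b}\,\delta d
```

[So doubling the lateral move roughly doubles the disparity and halves the depth error for the same pixel noise, which is why more parallax makes the calibration better conditioned.]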


**Brett:** Can I ask a quick question about this process, Elliot? Um, is the goal to try to keep whatever object you're putting in the frame at a relative distance that matches where your focus is going to be when you actually shoot? Or, how does the distance play in? I guess that's my question for the calibration.


**Eliot:** Um, yeah, you want to have it at roughly the same distance, for no other reason than that the cine camera's optics will change very slightly as you focus to different distances. It's pretty subtle, you know, it's breathing slightly. And so you want to pick a distance that's a reasonable approximation of the distance you're going to be shooting at, just to keep all the variables the same.

Like, if you calibrate at three feet away, then the lens is going to breathe substantially between when you're three feet away and when you're ten or fifteen feet away. Um, you know, depending on the lenses, especially when they get really close, [01:05:00] close focus. Um, okay, so I think that's enough capture, so you can click save.
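[Editor's note: a rough sense of why breathing matters for calibration. The numbers below are illustrative assumptions (a Super 35-ish sensor width and a 2% effective focal length shift from focus breathing), not measurements of Rob's 28-70:]

```python
import math

# Illustrative only: a small effective-focal-length change from focus
# breathing produces a visible field-of-view change, so calibrating at
# your shooting distance keeps that variable constant.
def hfov_deg(sensor_width_mm: float, focal_mm: float) -> float:
    """Horizontal field of view of an ideal (distortion-free) lens."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_mm)))

SENSOR_MM = 27.0  # assumed Super 35-ish sensor width
print(round(hfov_deg(SENSOR_MM, 50.0), 2))  # ~30.2 deg, focused at distance
print(round(hfov_deg(SENSOR_MM, 51.0), 2))  # ~29.7 deg after 2% breathing
```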


Let's get rid of those, the parentheses. I'm a little worried about the parentheses. We're gonna do some more when we, when we redo the calibration stuff.


**Rob:** Okay. So now I'll, I'll try and bring this up in AutoShot and just see if we can get it to work.

**Eliot:** Yeah, you can go ahead and exit. Oh yeah. And then go ahead and share your AutoShot screen, and let's take a look at what we got.


**Rob:** Sharing now. Shot. Share. 


**Eliot:** All right. And then go ahead and just click refresh, uh, up on the client link. There we go. All right. There's the update. [01:06:00] All right. Um, and punch in your sensor width. Oh, there we are.


**Rob:** Oh yeah. Uh, can I stop that?

**Eliot:** You have to just let it, let it go. It's going to.

**Rob:** It actually already stopped, and this was the issue I was having the other day, where it was just saying false. So if I hit it again now with the correct sensor width,

Yeah. And it just stops there.


**Eliot:** Okay. I'm, I wonder,


I bet, have you used parentheses in your file name? Because right now you have Calibration Komodo V2.


**Rob:** Dude, that's interesting. The ones that have been erroring are the ones that have a parenthesis in the name.

**Eliot:** That's it. That's it. Oh, damn it. Okay. Wow. We're going to have to, yeah. Parentheses. There's, there's a set of special characters in computer stuff.


I already have a parser in there to catch spaces. I forgot about parentheses. I guess I never add them myself, because I'm a computer person, and I'm [01:07:00] like, oh, that's going to kill a file name. We need to, we need to catch that in our parser, so you don't do that. So we're going to.


We're gonna have to redo the calibration, and we're gonna, next, oh actually, uh, okay, yeah, we're gonna have to redo the calibration.

**Rob:** We can't just rename it? Oh, you can't just rename it, like, without the parentheses?


**Eliot:** Yeah, you'd think you'd be able to do that, but it, we, um, uh, give me a second. I don't think we have that built in.


Um, give me a second. 


**Rob:** Yeah, take your time. 


**Eliot:** No, ah, dammit. This is, this is, uh, okay. I'm going to make a note to Greg right now that we have to catch parentheses in the file name saver and ban them, or replace them or something like that. Uh, let's redo it. Um, cause I actually don't think we have a method of renaming, cause you don't normally want to change the name of the calibration after you create it.

Um, [01:08:00] and so what we need to do is catch that up front. So if it's okay with you, let's just do another calibration, and this time we'll save it without, uh, the parentheses. That's it. There's, there's the, that's the problem. I'm gonna make a note right now, we need to catch that, um,
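[Editor's note: the up-front catch Eliot is describing is essentially filename sanitization. A minimal sketch of the idea in Python; the whitelist and function name here are hypothetical, not AutoShot's actual parser.]

```python
import re

# Whitelist letters, digits, dot, underscore, and dash; replace anything
# else (spaces, parentheses, slashes, ...) before the name reaches the
# rest of the processing pipeline.
SAFE = re.compile(r"[^A-Za-z0-9._-]")

def sanitize_name(name: str) -> str:
    """Replace path- and shell-hostile characters with underscores."""
    cleaned = SAFE.sub("_", name)
    # Collapse the underscore runs left behind by things like " (1)".
    return re.sub(r"_+", "_", cleaned).strip("_")

print(sanitize_name("Calibration Komodo V2 (1)"))  # Calibration_Komodo_V2_1
```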


**Rob:** It's hard to find a spot to place the origin, because the, [01:09:00] um, camera lens is blocking the ground. I'm not sure how to solve that.


**Eliot:** Oh, lemme, uh, lemme look at Zoom real quick, or, oh, let me look at the, uh, there we go. There's Highline. Um, oh. Oh, okay. Gimme a second.

**Rob:** I managed to put it on the very corner around the lens, but it's not easy.

**Eliot:** You, you can also, um, just point down, right? Like, uh, when you're setting your origin, can you put the camera down? Course. Just tap there. Course. Okay. Gotcha.


**Rob:** Alright, so let's, 


**Eliot:** and I'm gonna take a screen cap.


Hang on, hang on. Can you hold there for just a second? Uh, snap. Grabbed it, that's a screen cap where the barrel's in the frame. We're going to need to handle that. I think we can handle it. Um, I don't think that's going to jack it up, but I want to make sure, cause that is going to be a common situation, [01:10:00] and I want to make sure that we don't barf on it when we hit it.


Uh, there's a new one. Okay.

**Rob:** Well, the way that I can make that better is if I was to flip this camera, because the lens on my phone is at the bottom. So if this is long enough, we'll see if this helps.


**Eliot:** Okay, that might, that might not be a bad idea. Because normally we want to have the lens close to the camera, but of course then, then you're seeing the barrel, barrel of the lens.


Uh, let me, let me just send this thing. That's much better on that front. Okay, okay, so that, that sounds good. Um. 


**Rob:** Okay, alright, um, I'm going to begin. Oh, it's funny, the whole UI is upside down and not flipping now. Does this auto-rotate, uh, the Jet Set UI?


**Eliot:** It should auto rotate. Let me, uh, flip over and see your, your screen.


Uh, [01:11:00] okay.


Let me refresh this.


**Navaz:** Um, uh, Elliot, can you share what you're seeing?


**Eliot:** Oh yeah. Yeah. Sorry. 


**Navaz:** Thank you. Cause I'm kind of curious too what it is he's seeing.


**Eliot:** All right. Share. There's Highline. Okay. So let's see. Um,


Oh yeah, this, give me a second. Cause the UI.


So why doesn't the, I would have thought, uh, can you move the camera in a little bit? I'm just trying to make sense of this here. Um,


I thought, I thought the [01:12:00] whole image would flip.


**Rob:** Or just restart the app for now and see if that, 


**Eliot:** Um, let's do an origin. Um, oh, okay. Okay. Actually, wait a second. Yeah, let's, let's restart the app. I'm, I'm trying to understand kind of what I'm seeing here, um, cause we should automatically rotate the entire UI when you flip. There we go.


**Rob:** I don't know why. I just swiped up from the bottom and it just flipped around. So,


**Eliot:** Okay. Okay. There we go. That's all right. The world, the world makes sense again. Okay. But we need to, um, right now your tracking's red, so let's go ahead and get you working tracking. So before we do the calibration, nope, not yet, not yet, let's go back and click on origin.

Cause you see that red, um, compass? It means it doesn't know where it is.

**Rob:** Oh, I see the compass. Yep.


**Eliot:** Yeah, yeah. Those are our dashboard indicators. Oh, there we go. Now it's [01:13:00] starting to figure it out. Um, I think it just kind of lost track when we flipped it over. So let's go ahead and, uh, just click your origin button again, and then just reset the tracking.


**Rob:** So if I hit OK here, it seems to close. Should I hit cancel?

**Eliot:** Um, go ahead and click, um, oh, I see. So what's going on right now is, because we flipped around, it got confused about where it's at. Go ahead and just click reset, up in the upper left hand corner.

**Rob:** Oh, I didn't see that. Got it.

**Eliot:** Yeah. And then start a new map.


And this way we just go from scratch. There we go. Tap on the grid at the origin. Now we're good. So now you've got green tracking and the world makes sense. Okay, great. All right. So now we've got tracking, and now we can do our calibration again.


There we go.[01:14:00] 


There's our highly dense visual environment, and that's more like it.


Okay. Click test frame. I'm going to refresh my feed. 


**Rob:** Looks okay. 


**Eliot:** There we go. All right, that, yep, that looks good and keep the frame and, and let's, uh, let's work through our calibration.


There you go. That's good.


**Navaz:** [01:15:00] Elliot, I got a question. Uh, how many, uh, test frames do you need? I mean, let's say you end up doing like 30 or 40, is there a point when the thing actually breaks?


**Eliot:** Good question. Um, we've only ever done like, I don't know, a dozen, something like that, and that's usually more than sufficient to get the calibration locked in.

Um, I think going more doesn't help that much. We're already getting down to usually half a pixel of convergence, and you don't usually get much better than that. Um, so, uh, I'd say, um, but it's, it's a good question.
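[Editor's note: the "half a pixel of convergence" Eliot mentions corresponds to the RMS reprojection error a calibration solver reports. Jet Set's solver is its own thing; this sketch just shows the equivalent number from a generic OpenCV calibration, where you would stop adding frames once the value plateaus near 0.5 px.]

```python
import cv2

# Generic calibration for illustration only: obj_pts are 3D reference
# points and img_pts their detected 2D positions, one array per captured
# test frame; image_size is (width, height) in pixels.
def rms_reprojection_error(obj_pts, img_pts, image_size) -> float:
    """Run a standard calibration and return RMS reprojection error (px)."""
    rms, camera_matrix, dist_coeffs, rvecs, tvecs = cv2.calibrateCamera(
        obj_pts, img_pts, image_size, None, None
    )
    return rms

# Usage idea: call this after each new frame is added and stop capturing
# once the returned value stops improving (e.g. hovers around 0.5 px).
```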


**Rob:** I don't know if something weird happened. Got gray on the last shot here.

**Eliot:** That's kind of weird. What's going on? Um, maybe click clear. Or actually, oh, let's click ignore frame. Let's see. I don't quite see what's going on. Maybe it kind of froze up.


**Rob:** So I think I got, uh, maybe 10 here now. Okay, [01:16:00] so I'll just save it.

**Eliot:** Yeah, let's, let's get one more or two more. Okay. All right. That's, that's fine.


This'll do it. Yeah. Let's get rid of those parentheses. Just use dashes or underscores; dashes and underscores are all fine. Also no slashes. I think we parse most of those out, but, um, parentheses I didn't see coming. So, uh, I just already sent a note that we have to catch, catch those.


All right, and now we can exit and go back to your screen share, Zoom screen share.

**Rob:** AutoShot share.


**Eliot:** And let's refresh, uh, there, so we can update our,

**Rob:** Yeah.

**Eliot:** That's good. And punch in our sensor width. There you go. Now it's,

**Rob:** Amazing. [01:17:00]


**Eliot:** Sorry about that. We're going to, we'll put that in our, uh, I just didn't catch that one. So we'll put that in our code update, to flag that or prevent the user from entering those. I've got to see if we can do it in real time and prevent people from entering that.

Cause otherwise it bites you, and then you're going, what, why did it break? And it's, in parentheses, what? Yeah, there's a bunch of things that will bite you in file names when you're doing a lot of processing. So slashes will bite you, because they're special characters

the computer uses for things. Yeah. Apparently parentheses are one of them. Um,


**Rob:** Yeah, no worries. Periods, would periods be okay?


**Eliot:** That's actually a really interesting question. I just automatically, uh, there you go. Calibration push to Jet Set. So, uh, we can look at Jet Set and then, and, uh, there we go. Let me [01:18:00] flip back to that, uh, screenshot.


Yeah, I got it. Share. There we go. All right. Yeah, there you go. There's your, there's your calibration. 


**Rob:** Okay, and um, did we figure out why the first one wasn't pushing back? I forget. Uh, so there were two lenses. One of them was working every time, except it wouldn't sync, and the other lens wasn't working at all.

So we solved the lens that wasn't working at all. It was the parentheses issue.


**Eliot:** And one of them, I think you were running into, uh, was where you had the numbers in there, in the field of view, those data overlays. The problem is the calibration algorithm just looks for feature points.

And it says, oh, there's feature points, but those feature points aren't moving with the scene. And so it breaks, it breaks the algorithm. And again, we're going to put a detection in, because now we've seen this crop up: we're going to put in a detection where, if a certain percentage of the pixels from one shot to the next are identical, which [01:19:00] is the case with an overlay,

We're going to put up a flag that says, hey, you've got an overlay on. Uh, it just caught us off guard early on.
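[Editor's note: the check Eliot sketches out is straightforward, since a burned-in overlay keeps pixels byte-identical between captured frames while real scene pixels change with camera motion. A hypothetical sketch; the 5% threshold is a made-up number, not the actual one.]

```python
import numpy as np

def looks_like_overlay(frame_a: np.ndarray, frame_b: np.ndarray,
                       threshold: float = 0.05) -> bool:
    """Flag if too many pixels are identical across consecutive frames."""
    identical_fraction = np.mean(np.all(frame_a == frame_b, axis=-1))
    return bool(identical_fraction > threshold)

# Two synthetic 1080p frames: top and bottom bars unchanged (an "overlay"),
# everything else different.
a = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
b = a.copy()
b[100:-100] = np.random.randint(0, 256, b[100:-100].shape, dtype=np.uint8)
print(looks_like_overlay(a, b))  # True: ~18% of pixels never changed
```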


**Rob:** Yeah, yeah, totally. It's, this is the beauty of iterative software. So I'm really, really grateful that we, we just solved everything though. I'm feeling much more confident going in now. 


**Eliot:** Oh, fantastic.


Well, I mean, you want to just calibrate all the things that were problem children, and just ping me and let me know, uh, if there are any others. And again, what I would expect is going to happen, because there's still a problem, which is that the data on the RED is inset, right?


There's the full field of view, and it's slightly inset. And so it means our calibrations are going to be a little bit off when we pull them into Blender. Uh, and I think we can resolve that; we're going to have to do a little bit of adjustment by eye, um, to calculate that, but I think we can fix that, um, between me and you and Kai, uh, in post. And, uh, just make sure, after you drop your origin, when you're doing a, you know, production scan, you know, five, ten seconds, [01:20:00] just where the shot's going to be covered, boop, stop. And then that's that; that way you're, you're just covered.
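[Editor's note: on the inset-overlay problem. If the feed used for calibration shows only a centered fraction of the full sensor image, the recovered field of view is too narrow by that same fraction. Below is a hypothetical by-eye correction, generic lens geometry rather than anything from the Jet Set/AutoShot pipeline; the 10% inset figure is an assumption.]

```python
import math

def corrected_hfov(inset_hfov_deg: float, inset_fraction: float) -> float:
    """Scale an HFOV measured on a centered inset feed up to full frame.

    inset_fraction is the portion of the full image width the inset
    shows, e.g. 0.9 if the overlay frame is inset by 10% (assumed).
    """
    half = math.radians(inset_hfov_deg) / 2.0
    return math.degrees(2.0 * math.atan(math.tan(half) / inset_fraction))

print(round(corrected_hfov(30.0, 0.9), 2))  # ~33.15 deg for a 10% inset
```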


Um, we, you know, between me and you and Kai, uh, in post and, uh, and just make sure, make sure after you drop, uh, after you drop your origin, when you're doing pretty, you know, production scan, you know, five, 10 seconds, [01:20:00] just where the, the shot's going to be covered, boop, stop. And then, and then that's that, that way you're, you're just covered.


If you, if you have, you know, you know, decent tracking, you know, and actually yellow is okay. It just means it's probably finding new, new, new points, you know, green tracking, et cetera. On your dashboard and you have a scan, there's, there's just not a lot that can go, go wrong, uh, in post. If you don't have a scan, the world is harder, a lot, a lot harder.


Like a lot harder, you can, you can handle it. Like you can put in geometry, man. If you have the jet set tracking, you can put in geometry manually and synthize where you eyeball in where the, where you put in where the floor is. Cause you know where the floor is and then you have to eyeball in where you, where the wall is and you put geometry there and project points there.


But instead of a five minute process, it turns into a two hour process 


**Rob:** per shot. 100 percent 100. I'm measured twice, cut once type of person. So this is, I'm going to get it right when we're capturing. 


**Eliot:** Oh, that's exciting. That's great. So [01:21:00] 


**Rob:** thank you. I'll let you know if anything else comes up and um, yeah, otherwise I'll see you on the other side.


**Eliot:** Fantastic. Fantastic. All right. Good luck. Talk to you soon. 


**Rob:** Thanks Elliot. Talk to you soon. Bye everybody. 


**Eliot:** Bye bye. 


**Eliot:** All right. And Brett, you came back in.

**Brett:** Yeah, so I did. Unfortunately, this thing is now not a frame off, but I actually have my render from this weekend, and it was a frame off. I don't know why it's not doing it now.

It's one of those, like when you said the thing about it breaks once and then. So the thing I brought up earlier, I could zip up the take for you, but now it doesn't seem to be doing it. Right now it's on, it's on, yeah. But I have the render that I did this weekend, and it is definitely off. So maybe I slipped it accidentally. I thought I was very careful about it, but it's always possible that I grabbed something in the timeline.


**Eliot:** Well, what I'd say is, if you find one in the wild, where you're going through Jet Set processing and it's a frame off and it repeats, that's gold for us.

**Brett:** And we're shooting tomorrow. So [01:22:00] hopefully if anything comes up, I'll definitely send you one. But the one that I was going to send you is now working. So,


**Eliot:** Of course, of course. No, I totally get you. Yeah. Uh, okay. Okay.


**Brett:** I had that happen once before, and I just rebooted my computer and it seemed like it worked. But then, I was working on this all day Saturday and a little bit of time yesterday, and it was consistently doing it. But this morning it seems to be working fine.

So, uh, which is good. I'll be shooting tomorrow, a bunch of testing tomorrow. We have a short film we're gonna try to shoot on Sunday, so we're gonna kind of be prepping for that this week and shooting tests. So if anything comes up, I'll definitely post it and let you know. But for now, it seems to be working.


**Eliot:** Okay. All right. Well, that sounds good. Let me know if anything else crops up. I'll probably sign off here and get ready for another call. But, uh, this is, this is. Yeah.

**Brett:** And the other thing is, Navaz, if you, uh, want to, uh, post anything about how you guys are able to do this through Blender, I'd love to know more [01:23:00] about that.


Uh, we can talk about that offline or something, but, uh, 


**Navaz:** Yeah, I was actually thinking about putting a post on, uh, on the forum, just cause I could see it being beneficial to someone.

**Brett:** We'd love that. I'm, I'm much more versed in Unreal, but I have started training on learning some stuff in Blender, and I've been talking to the owner of the place about Blender. He's hard to change his mind sometimes. Now he's like, but I brought all these people in on Unreal, and I'm like, I know, but it's not bad for us to have versatility and be able to use multiple things, you know? So.


**Navaz:** But at the same time, we still can do it with Unreal and just speed up you guys's, um, you know, workflow.


**Brett:** Yeah, anything you can do to help the automation on some of those repetitive steps would be great.


And, uh, I think we've taken it as far as we can in terms of the quality in Unreal. Yeah. Uh, the big thing is the anti-aliasing, uh, [01:24:00] on the preview; once you render it, it's fine. I've worked with the, you know, the console commands, and I've gotten a very nice looking image coming out of Unreal on the renders.

Uh, it's really just the preview that he's not happy with. Yeah. And I don't know what else we can do. Like I said, the research I did indicated that we need a camera that can be genlocked, and an Ultimatte, and those aren't exactly cheap investments, either one of them, you know?


**Eliot:** Yeah. Yeah. I mean, you heard my spiel on this earlier. Maybe give us a few weeks

to crack this nut, because, I'm telling you, you're walking into, you know, what's the military version, the interlocking fields of fire, where you're walking in and, yeah, there's no win. There's just different levels of losing.


**Brett:** Well, I just keep trying to talk him out of the preview in general, but it's his thing. Or we keep talking about the idea that it's not going to look perfect, that it is a preview; you tell everybody that [01:25:00] what they're looking at is not final. And this weekend is really an attempt to show them a pretty close to finished product, if not a finished product, and say, here's where we can get with this, right?


**Eliot:** Yeah.


**Brett:** So, so when you look at that preview, understand that it's not going to be that; it's going to be more like this. So, if this all goes well, I think I can kind of convince him to stand down on that and, uh, really start moving forward. But he's been really adamant about this looking a certain way, and I understand, you know, I understand his point of view, but it's also like, well, without spending more money, this is kind of where we are right now, and this doesn't have to happen.


So, yeah. It doesn't limit what we're able to do in post and finishing. And he and I both come out of post. He was a colorist; I've been a VFX guy, a Flame guy, for years. And so, I mean, he understands that, you know, and I think it's because we're in post and we're used to things looking perfect.

Production is a bit of a different game, and he's not as familiar with [01:26:00] that. He's a little nervous, I don't know what it is. It's all good, but I'm trying to convince him that, you know, it's good enough for what we're trying to do. So, yeah,


**Eliot:** I would, I would just, yeah, buy some time before you drop the change on the Ultimattes. Cause it's not just the money on those, it's the fussing in the day to day operations. It, it eats time.

**Brett:** Oh, I agree. I spent hours doing this preview thing, and I'm just like, you know, if I could just unplug this HDMI cable and start shooting, I could get something for you. But we spent all day messing with this, you know. And, uh,


And I'm sitting here wasting all this time on your preview, which I think is less valuable. Look, I love the guy, and he's great. He's got a lot of resources, a lot of contacts, but he's really hung up on this preview thing.


**Eliot:** I get it, it's about the client feel, or ultimately what it is, and the client feel is really important, and everybody decides. But

okay. All right. More soon. Uh, this is awesome, guys. And, uh, I'll talk to you soon. All right. [01:27:00]

**Navaz:** Thanks, Elliot. I'll be setting up a meeting, uh, probably towards the end of the week with you and Bill.

**Eliot:** Excellent.

**Navaz:** That's everything else.


**Eliot:** Fantastic. Fantastic. 


**Navaz:** All right guys. Be safe, man. 


**Brett:** See you soon. Yeah.

Everybody be safe.


**Navaz:** All right. Bye.