Transcript

# Office Hours 2024-12-13


**Eliot:** [00:00:00] All right, morning all. 


**Brett:** Good morning. 


**Eliot:** Morning. 


**Paul:** I'm the only one here. 


**Eliot:** All right, who wants to jump in first? 


**Paul:** Um, I would love to. I think I spoke with you, Eliot. Um, I sent you a report, um, about how we weren't able to sync up the calibrations on Jet Set.


**Eliot:** Oh, interesting. All right. Let me take a quick look.


**Paul:** Yeah, it was weird because I sent like a really complete report. We're, we're like trying to use this in our next recording and can't get it to work.


**Eliot:** All right, well, let's get it, let's get it figured out. Um, all right, let me find your...


**Paul:** I get the calibration and it, yeah, I'm [00:01:00] there, it's all there, but I put in the sensor width, and every time I push it, I don't know, I'm probably doing something really stupid, but it goes through a long menu and then it just doesn't push to it.


**Eliot:** All right. Yeah, let's figure this out. So let me look at your... let me get this real quick. Sorry, I was out yesterday, so I'm catching up a little bit.


**Paul:** No worries.


**Eliot:** Absolutely. Uh, okay. So there's that. And I see it was in an email. Uh, there it is. There's the log. Okay. So let me look at the log real quick.


So we've got, um, okay. Does feature extraction, retriangulation. All right. So it looks like it's got. 


**Paul:** Yeah, and it goes really dark when, um, when I'm doing like the, um, triangulation, or just trying to kind of match them up. Uh, the one [00:02:00] that I'm getting from the Accsoon is, is really dark for some reason. The, the footage, when I'm trying to match the triangulation points.


**Eliot:** Interesting. All right. I'll tell you what. Um, so I'm looking at the console you sent me, and the thing that's interesting is it says no convergence. So I wonder if we have a problem with the solve. So let's, um... all right, can you screen share? Let's start here.


Yeah. You got a shot?


**Paul:** Yeah, so let me just get my shot up.


**Eliot:** Yeah, there we go.


Let's see. So there you are. So let's go over to calibration. 


**Paul:** So obviously this has now changed. Um, the day before, I was able to, I [00:03:00] think, um... but I had the sensor width incorrect.


**Eliot:** Okay, let's just punch in the correct one and, uh, see what we get. Do you have the correct sensor width?


**Paul:** Uh, yeah, 35.9.


**Eliot:** There we go.


Can you hit the little i button right next to the, uh, this guy? Let's take a look at our calibration.


**Paul:** Yeah, uh, sorry, there's a little... if I press the i, it says you need to calibrate once to retrieve the calibration.


**Eliot:** Oh, okay, I gotcha. Okay, so let's click calibrate, and let's, uh, take a look at what we get.


**Paul:** So this is now different, because it's in a different position. 


**Eliot:** Is it the same calibration? 


**Paul:** Uh, I think, yeah, it's getting the calibration from yesterday that I did, because I redid it, I tried again, and um. 


**Eliot:** Alright, reasonable. So it's crunching, it's thinking.[00:04:00] 


It'll be interesting to see what the calibration images look like.


So the lens calibration system, what it's doing is it looks at the images, detects feature points, and then does, uh, you know, basically a little photogrammetry solve between the various points to back out the, uh, the offset. And the... okay, let's take it here. Yeah.


Okay. Fail. Okay. So, aha. So let's, um, can you click GUI check[00:05:00] 


and pop up your, Oh, cool. Not returned. That's. 


**Paul:** Weird. Yeah. And it goes on for a long time, right? And the first day I did this, it wasn't like that. And it showed up a few different angles.


**Eliot:** All right. Let's go to, let's go to manual mode. Uh, can you pop up a, uh, a, um, file explorer, uh, whatever we call it on the Mac, um, finder, there we go.


Yeah. All right. And we're going to just kind of go to, um, let's take a look. All right. Let me think through this for a second. And you have, you have takes with this, right? 


**Paul:** No, I didn't have any takes 'cause I, I was watching a tutorial online, and I got stuck. So I had to pull it over for calibration.


**Eliot:** Okay. All right. Yeah. Let's see.


Um, if you go into your takes [00:06:00] tab, do we have a, uh, a directory set? Just so I understand where it's, where things are at. 


**Paul:** Sorry, if I go where? To the Takes tab? Yeah.


**Eliot:** Takes, yeah, okay. All right. Let me check to see, um...


**Eliot:** Can you click, uh, Open on this? I just want to see if, when it was doing its calibration... I'm going to have to check real quick to see where it pulls down the temporary files.


Cause it does the calibration, pulls down a zip file, opens it up, and then, then it crunches through those. And I need to just check where that lands. 


**Paul:** It's on an external drive.


**Eliot:** All right. That should be all right. Um, and just pull up the Finder.


**Paul:** Yeah. Sorry. You can't see it, but yeah, I need to share. Sorry.


What I'll do is I'll just share the full screen if that's okay. That sounds good. Yeah. Let me just stop sharing and share the full screen.[00:07:00] 


**Brett:** Eliot, I have a real quick question while he's doing that. He's using the Mac version. I have not yet tried that, um, because our Unreal rig is all PC-based, obviously. But, um, does that work pretty well? Is it as functional as the, um, PC version?


**Eliot:** Uh, yeah, they're, they're basically running on the same, same code base.


So they're, they're very, they're very close to equivalent. There's a couple things that are, that are a little bit different. Um, we can't run InspiringNet, uh, on the Mac because of the implementation. It just doesn't, doesn't work that way. Uh, but that's, uh, that's, that's a kind of an edge case. So almost everything else should be very close to word for word between the two.


So every once in a while we run into a quirk where the Mac implementation of COLMAP solves slightly differently, but that's a pretty far-off edge case. You only hit that with a really marginal lens calibration; that's when you run into problems.


**Brett:** All right. I had a couple other questions. I'm going to let [00:08:00] Pedro finish this, but I have some other questions when he's done.


**Paul:** There are these guys. I'm just clearing out my screen. 


**Brett:** I have a quick question about the Unreal. And by the way, Paul, who just joined, he just shows up as Zoom user. He's my partner at the green screen space I've told you guys about. The comedy thing.


**Eliot:** Oh, great. Good to remember.


**Brett:** So he actually owns that space. So I told him he should jump on this morning.


Um, but the question I had is about the Unreal live preview render script. So we, we're doing a set that's got a couple of layers in Unreal. It's got a desk that we have a character sitting behind. And, uh, so when I export the Unreal live preview scripts, um, I only get one layer in my comp, obviously, and I have to keep going back in and adding the other layer. And, uh, you know, I saved out a blueprint for the compositing [00:09:00] with the extra layer, but I have to keep going back.


Is there a way... so when I export that, it creates that Composure composite and it leaves it there. Uh, but I found that if I don't re-export it, sometimes, if I'm working later, if I've moved the camera, it tends to kind of start to drift. So I kind of have to start it over and re-export it. And then there's a process that happens every time.


I guess my big question, well... like the drift. Like, I'm seeing parallax strangeness on my set when I start moving my camera around. So I re-export and then it fixes it. And it may be because we've changed the origin or something's happened like that. Um, but I'm trying to figure out if there's a way to keep some elements of that Composure composite so we don't have to rebuild it every time.


Um, or is there a way within yours to deal with layers? I don't know, you could probably not. You understand what I'm saying? 'Cause, for instance, when we're using it [00:10:00] on Jet Set, it works great, because we're using the depth occlusion and the character appears behind the desk on the phone. But we, we really want to do a live preview.


We're not, and it's mainly because. We want to be able to show people the way the green screen space is set up. It's got a monitor that faces the stage. And since we're working with this comedians and stuff, we want to be able to turn that thing around and say, here's what it's going to look like, but then we're not going to use it for anything more than that.


So, but we really would like it to be as quick and easy as possible. Every time we have to show them something 


**Eliot:** and what's in the foreground just that what's your 


**Brett:** foreground objects? It's a desk so you have a character sitting behind the desk. Yeah. Oh, it's kind of showing the space So and we put somebody at that table, but we have the desk it kind of goes over that And we want somebody sitting behind the desk.


So, what we end up doing is we, uh, we use the depth [00:11:00] occlusion on the, uh, on the phone for the shots. But then when we want to show the preview in Unreal, it doesn't really work that way. So you have to make a separate layer that has the desk. And then put that layer into your Composure Composite and actually you have to adjust the material editor and do another over operator and add another layer.


So, so I'm trying to figure out if there are ways to speed that up every time we have to kind of turn that monitor around and show it to somebody. 


**Eliot:** Is the, is the final, so this is mostly for the, the kind of the onset preview and then for 


**Brett:** onset preview for lighting, for showing to clients, give them a sense of what the shot's going to look like.


And we have it all running on a nice big HD monitor so they can see it full screen, and that's working. It's just the process, the process of getting there. We're trying to speed that up as quickly as we can.


**Eliot:** All right. I can tell you a bit of, okay, philosophy on this, [00:12:00] which is that the whole thing of going from tracker to engine to compositing is an unholy bear, and it's been that way for 20 years, the entire history of virtual production. It started out, you know, it used to be encoded cranes, and then Brainstorm, and then... and now it's slightly different. And it's such a bear that it's almost impossible to keep that stuff just functioning on a day-to-day basis. Where we are heading rapidly is... um, one of the things that is a breakthrough is we can now generate a Gaussian splat from the Unreal model.


And it looks, you know, three quarters like a real render. I mean, it's made out of Unreal renders, a Gaussian splat generated from that. Um, we're in the middle of building that tool.


**Brett:** And that will allow the iOS version to look a lot better, is what you're saying.


**Eliot:** Yeah. Then it looks like Unreal, and the rest of it's gone.


**Brett:** And I guess it'd be interesting if we could do [00:13:00] something where I could broadcast that signal to like an iPad or something and hand it to a client and go, here's what your shot looks like. But if I set up an iPad with Jet Set, obviously it's based on location.


So if I move that around, everything kind of gets wonky because the AR is changing location. So I don't know if you can know of any technology where I could do something like that once you guys get there. Because that would solve our problem. 


**Eliot:** There's, there's, yeah, this is solvable. There's ways of doing screen mirroring and stuff like that.


But the core, I guess the core takeaway is that trying to engineer deeper into Unreal is intractable, very intractable. Like, it's extraordinarily difficult to make progress with that. Whereas, you know, we're doing the Gaussian splats, and then next we have to do a couple of chunks of work on the Jet Set UI and, uh, make some remote linking stuff work.


And then we're going to be working on compositing in the phone: taking the live feed from the cine camera, you know, whatever focal length it is, matching the rendering in the phone with the mapped 3D splat, and comping it right there. And then it's [00:14:00] done. It's just done. You're at three quarters of what it looks like with Unreal.


And there are no external connections. Bupkis.


**Brett:** Excellent.


**Brett:** Yeah, I'd love to get away from it too. It's, it's very buggy. We spend all this time. And so Paul's just kind of panning around showing our studio. He's got a bunch of cameras and stuff there, but none of them are rigged out. So I had a camera that I've rigged out, a Blackmagic 4K, and we've been taking that in. But every time, we take it over to his studio, set it up, plug everything in, get it all working, and then troubleshoot, because every time there's something. So now we're setting up one of his cameras to do it, so it'll always be set up, to save us some time.

But that's our biggest thing: we're just trying to figure out how to, yeah, speed up the ability to show a client something quickly on a large screen instead of a phone that's mounted on the rig. You know, especially because a lot of these comedians are already on the set. They're going to be in the thing, and the guy that runs the whole thing really wants to see what it looks like.


**Eliot:** [00:15:00] Oh, yeah, we totally get that. They have to see the cine footage and the comped stuff. Uh, but, uh, you know... and I want to get back to Pedro, but I literally gave a talk yesterday at the Production Summit, and the talk was called Bye Bye Brain Bar.


And it's, you know, because it's death, it's just, it's, and it's unfixable. And I'm just telling you, like this problem that we've been fighting this problem for 20 years. And as soon as you have multiple machines, all hooked, wired together, You're in it. And, and I think that it's just, you, you can't, you can't fix it.


And so what we're, what we're doing is, you know... the Gaussian splats are a game changer, because we can check that stuff out.


**Brett:** It looks great.


**Eliot:** The rendered iOS look, it's not exactly like Unreal, but it'll match back. You can re-render in Unreal later. Use Cryptomattes. Yeah. I mean, you run the Unreal renders blazing fast, right?


You can knock out the day's worth of shoots in, you know, 15 minutes, run Cryptomattes, composite it, and do it. Like, production-level, super fast. So that's, that's where we're going. I want to get back to it, because we gotta, we gotta solve this thing. But [00:16:00] that's absolutely where we're going.


**Brett:** That's our biggest thing. We just want to be able to show it to somebody quickly.


**Eliot:** Yeah. I'm on multiple calls a week with this one. So just hang in there: the Gaussian splats and, you know, a little bit of foreground USDZ, right? You might have to have a CG table in the foreground, because the Gaussian splats don't do occlusion all that well.


They're, they're kind of fuzzy. Um, so yeah, the, 


**Brett:** The depth occlusion in Jet Set is working pretty well, and we're getting a good preview. Once we get it working, it's matching the phone relatively well. Uh, it's just the process of getting there. It's killing us every time.


**Eliot:** how good the Gaussians, was the Gaussians blast we're going to transform?


I would probably never have written the live data out to Unreal, because it's just such a butt-kicker and I knew it. But we didn't have another choice at the time; now we do. Anyway, let's get back to Pedro.


**Brett:** I don't want to keep talking about it, but that's what we're really struggling with. And that's... anyway, there you go.


**Eliot:** We're going to solve it. [00:17:00] Just in a different way: in the phone.


**Brett:** Yeah. As long as I can get it out to an iPad or monitor or something that I can go, here's what it's going to look like. And everybody goes, Oh, that looks amazing. Yeah. And then we move on. And then I turn it because everything we've done to shoot looks great.


The stuff I'm actually doing in production is looking very... anyway. Okay, uh...


**Paul:** Thank you very much.


**Eliot:** Thank you.


**Paul:** That was interesting nonetheless.


**Eliot:** Well, it's a peek at the future of where we're going, um, for exactly this reason. And it's not just you, it's everybody who has to wire into it. So, all right, let's take a look and let's see. So we've got... uh, let's take a look at footage. Let's double-click on footage. And okay... I didn't even know, I guess this is the default thing. Okay, so then let me go see [00:18:00] where...


CTO won that one.


Okay. Elaborate Preferences is auto shot. Okay, so let's, uh, go on. Uh, so on in Finder. Let's open up. Library[00:19:00] 


Finder. Shot. Uh, no, no. In, uh, in the, the Mac finder. Uh, okay. Sorry. 


**Eliot:** We'd like to find, uh... yeah, where it said it was.


**Paul:** Oh, yes. Sorry. It's located in here, in this directory.


All right. There is, let's take a look at your finder. So, um, okay. Oh, okay. So in the, in the Mac finder, what you can do is, uh, we need to go to your, your home directory. So you let's, let's go to, let me grab my little annotator. Let's go up to go. And then you want to go to, um, Home. Uh, actually, can you click on go again?


Go, and then where is our... okay. Can you click on Home? [00:20:00] Okay. I think you've got... yeah, that's the one that shows the Downloads. That's interesting. I wonder if Library is a hidden directory. Um, okay. Uh, we can figure that out.


**Brett:** It is a hidden directory.


**Eliot:** Oh, okay. Um, are you comfortable with a console? Uh, Pedro, are you okay with a console, with a Mac terminal? We can do it that way.


**Paul:** Uh, I'll say, I'll say yes. 


**Eliot:** Um, okay. So let's, uh, actually you want to hit, um, uh, command T and it'll pop up a terminal. 


**Paul:** Okay. Sorry, because I'm not the most, uh, command. What the hell is it? 


**Eliot:** Yeah. Term. It's actually called terminal on the Mac, if I remember correctly. Terminal. There it is.


All right. So.


Let's see. So then if you type in, uh, LS, we should get a directory listing. All right. Oh, all right. There's [00:21:00] library. Okay. So you can type in CD, uh, space, uh, and type in L and hit tab, and that'll go to library. 


**Paul:** Uh, CD 


**Eliot:** L. Yeah. And then, yep. Library. And then type in P and hit tab, uh, PR, uh, PR E and then hit tab.


There we go. There we go. And then, uh, type a slash, and then type an A, yeah, for AutoShot. Hit Tab. Uh, maybe capital. Nothing? Maybe, uh, capital A, then hit Tab. Oh, nothing. All right. Let's, uh, delete the A and, uh, hit return and type in ls. Oh, it's not even making it there. Um, that's interesting. Huh.


Uh, let's see, this Library [00:22:00] Preferences AutoShot. Um,


gimme a second to figure this out. Something is strange. Um, uh. Can you go back to the calibration tab on this guy on autoshot? Something's weird. Yeah, I mean, iCloud, Jetset. Yeah, there's a Jetset folder, iCloud. Um,


that's strange. Um, give me a second. So why isn't it pulling down the file? But it looked like it was pulling down the file. Can you go ahead and click, uh, calibrate again? You can just leave it as, as at the default sensor width for now. I just, I want to,[00:23:00] 


And I'm going to look at your, uh, your console that you sent me, 'cause something's, something's funny. Those files should just be there, and they're not. So let's see. Applications, AutoShot.


Users... Library, Preferences, AutoShot.


And can you open up that terminal again? Um, because it's running. And let's see, what's on your screen here? Library. Okay. Let's type in cd Library. cd, space, L, and hit Tab. That's fine. [00:24:00] Type in ls.


All right. And, um, can you open up that window just so we can see kind of, uh, I'm looking for, okay. All right. And CD preferences.


Okay. Hit LS. Alright, and then,


Uh, can you type in, um, ls dash... um, wow, what's that directory specification? Hang on. Slash? Or what's the dash? Yeah, it's, um, directories only, it is.


Sorry, sorry. ls. Yeah, [00:25:00] it's like ls dash... oh yeah, ls dash d. That should give you directories. Uh, ls, and then space, and then dash, uh, lowercase d. Sorry, sorry for the crash course, jumping into, you know... so sorry. Uh, ls is just the list-directory command. So type in ls, and then we're going to specify it: hit a space and then a dash.


Uh, and then lowercase d. Nope, not a slash, a dash.


**Paul:** Sorry, what is a dash?


**Eliot:** Oh, a dash is a short horizontal line.


**Paul:** Gotcha.


**Eliot:** Okay. Yeah, gotcha. Sorry, I should have just said minus. Yeah, a minus. Okay, there you go. And then type in lowercase d. And let's try, let's try that. Let's see if that gives us the directory list.


Oh, come on, Unix, what are you doing? [00:26:00] Um, all right, so I guess... let's try that again. Let's do cd, space, and type in capital, uh, capital A, uh, for AutoShot. Okay. Adobe... oh, there's AutoShot. There we go. Ah, okay, uh, and I see. So just type in auto, a-u-t-o, and then hit Tab. It was that capital letter S. I don't know why we have that letter S. There we go. All right, and hit return. Now we can type in ls, and that gives us the directory listing. Okay, uh, calib temp. All right, let's go cd, space, uh, calibtmp.
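For reference, the directory-only listing being reached for in this exchange can be sketched like this. This is a minimal shell example; the folder names are just demo stand-ins, not anything from the actual session:

```shell
# List only the directories in a folder. `ls -d` by itself just prints
# "."; the trick is pairing it with a glob so the shell expands each
# directory name into an argument for ls.
mkdir -p demo/Library demo/Preferences   # scratch folders for the demo
touch demo/notes.txt                     # a plain file, to show it's excluded
(cd demo && ls -d */)                    # prints only the directories
```

The subshell parentheses keep the `cd` from changing the caller's working directory, and the trailing `/` in the glob is what restricts the match to directories.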


I don't know why we have that letter S. There we go. All right, and hit return. Now we can type in ls and that gives us the directory listing. Okay, uh, calib temp. All right, let's go cd space, uh, calib tmp. Uh, 


**Paul:** sorry, cd space calib. 


**Eliot:** Yeah, cd's just change directory and, um, now we're just going to the directory.


This is just the manual way of doing it, because sometimes the Finder doesn't let us see all this stuff. All right, so type in ls. Let's see what's in calib temp. [00:27:00] Okay. Now, there you go. There's a Jet Set calibration. Now the world makes sense. So can you cd into Jet Set Calib, uh, you know, 12-22-24?


And yeah. And then type in ls. Okay. So we were seeing this the other day somewhere. Yeah, this is, this is the COLMAP, um, directory. So can you type in cd... let me think about this for just a second. Um, before we dive too deep, I wonder if it would be easier to... um, okay, let's, uh, do, uh, cd and then just type in two dots.


Let's go up one directory. Uh, two periods. Uh, and you're going to need a space after cd, so Unix knows that it's a separate command. There you go. All right, type in ls.


Okay, so let's just copy that zip. That zip file has all the files. So what we're going to do is, uh, [00:28:00] we have to do this text wise. So in the command line, just type in cp for copy. Oh, okay. Space, and then type in a capital J and just hit tab, there you go. And then, uh, all right. And then, um, put, put it, and then, uh, you need to copy it.


We're going to give it a destination. Oh, okay. Yeah. So then we're going to, uh, type in a tilde. Um, sorry. Type in a... oh, it's a tilde. It's in the upper left-hand corner of the keyboard. Looks like a little wavy line that goes up and down. Gotcha.


Second, uh, do we have a different... yeah. And then a slash. Yeah. Oh, no, just a slash, just the diagonal slash. There we go. And, uh, and then type a period. All right. Uh, all right. That should do it. Oh, wait, wait. You need to have the cp command. Oh, okay. Yeah. Before the Jet Set calibration.


**Paul:** Should I put it here?


**Eliot:** Yeah, just type in cp, and it's got to be lowercase. Sorry, [00:29:00] this is all gnarly.


**Paul:** No, no, I'm sorry. I'm just such an amateur at coding.


**Eliot:** All right, there we go. So now we have just pushed that into our home directory. So now we can go up to the Finder. Uh, let's go find our Finder panel and... okay, there we go.


There it is. Okay. So now we can, uh, push that to your desktop or someplace where it's easy. This is fine. We can just right-click and unzip it. Um, you know, whatever makes sense. All right. So let's open that folder up. Um, and let's take a look at the calibration images, 'cause that's, that's where our problem is.
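Pieced together, the manual Terminal route walked through above looks roughly like this. The folder and zip names are stand-ins reconstructed from the conversation, not verified against the real app's layout:

```shell
# Mock stand-in for the hidden ~/Library/Preferences/AutoShot layout.
# Directory and file names below are illustrative only.
mkdir -p mockhome/Library/Preferences/AutoShot/calibtmp
touch mockhome/Library/Preferences/AutoShot/calibtmp/JetsetCalib.zip

# The session's route: cd into the hidden folder, list it, then copy
# the calibration zip up to the "home" directory where a file browser
# can reach it. (In the session the destination was written with a
# tilde, which the shell expands to the home directory.)
cd mockhome/Library/Preferences/AutoShot/calibtmp
ls                                  # shows JetsetCalib.zip
cp JetsetCalib.zip ../../../../     # four levels up = the mock home
cd - > /dev/null                    # hop back to the starting directory
```

Note that `cp` leaves the original in place, so the calibration folder keeps its copy; the duplicate landed in the home directory is the one that can then be unzipped from the Finder.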


All right. Right. Yeah. So let's see, you can probably on the app on the Mac, you just kind of double click on the, on the zip file. I should, yeah, I just got this. Oh, okay. Let's go, let's look into it. Let's see. Oh, okay. There we go. Okay. That's it. So let's take a look at, um, the iPhone pictures. Let's, uh, open up the iPhone tab and can you just run the preview and have [00:30:00] a, okay, that looks fine.


Let's, uh, walk through a few of the images. Let's go through, can you, can we space through, um, how about, uh, open up the batch and preview and then we can, we can, uh, then we just page through them. All right. So let's just, uh, right arrow key and let's just walk through those a little bit and then it all looks fine.


Can you, um... it's not letting me... there we go. Oh, there we go. That's the one. All right. So let's just, uh... right. Okay. Oh, it seems reasonable. Um, now the one thing I'm worried about is it looks like you're standing in one place and rotating, rotating the camera, like a... yeah. Oh yeah. That's it. There it is. That's it. Shit. Excuse me. All right. That's it. We've got to write this into our UI.


That's it. Shit. Excuse me. All right. We got, that's, you have, we got to write this into our UI. And we're gonna do it. Um, and, and the reason is you have just, you have just landed on the, uh, the, the one bug, the, the numerical, and it's, this is inherent in [00:31:00] solving. So the way the calibrate calibration or any photogrammetry system works is you, you pick an object and you move the camera around it.


And so that's how you generate parallax, and that's how we get a solve. And so if you are in one place and you rotate, there's no parallax, and you can't get a solve out of it. We're going to put this into the UI; we're actually going to put a little marker, with a little overhead view of the camera, on a central object.


And as you go, it's going to show you going around it. Because this is it: there, there's your problem.


**Paul:** Okay. Okay. Well, thank you. Um, the one thing is, I get really dark footage in the preview on, on the phone, and that's, that's one of the problems. Like, I can see well, but when we're doing the solve, it's like super dark.


So that's why I was like, hey, let me figure out if I can just get all these points to match. And the [00:32:00] phone is quite far. Well, I can show you, even, the phone is relatively far from here, so it makes it a bit of a problem, although we've got some rods that are coming. Um, I, I can see that these have to be as close as possible, right, as well?


**Eliot:** I mean, so the solve will work at almost arbitrary distance between the two of them. What you're going to run into is the aiming reticle. Yeah, you know, the aiming reticle won't be accurate as soon as the actor gets anywhere close to the, um, close to the camera, 'cause it's basically a rangefinder, you know, at that point. Yet another reason why, ultimately, we're going to be punching the real-time cine feed into the iPhone and doing our comp there, because then it just all solves.


And then it doesn't matter where you put the phone. We just, we're just going to solve the offset. So the offset is, is robust to different locations. Um, but the human experience is not. 


**Paul:** Well, yeah, yeah. Sorry. I mean, such a basic thing. That's probably why it worked the first day. Cause I was [00:33:00] moving around and then I was like, why is this not working on the second day?


But yeah, we're, I'm really interested in working with this. We do architectural visualization, which is like, we do movies for all the architectural world and stuff like that. Film type of ads. It's very niche, very specialized. Um, Yeah. We want to kind of have this so that we can put people very quickly. It can kind of solve all that issue much quicker.


And I'm, I'm testing this out. I hope I'm one of the first in architectural visualization; I haven't seen many. Um, but I hope, yeah, I hope that we can integrate it into the pipeline.


**Eliot:** That's great. That's, that's great. And what you're running into is... I mean, this is exactly the kind of thing we have to solve in the user interface, because it's obscure, it's numerical, and it's inherent to the algorithm.


**Paul:** That's why it's crunching for so long, right? It's trying to understand what is the difference between the points.


**Eliot:** Yeah, and [00:34:00] I'm like, oh, I see it. When you see "no convergence," it means it ran this big old numerical thing and it wasn't, wasn't converging.


So there it is. And so, yeah, I'm sorry about that; I needed to make that clearer in our tutorial. Um, and it's hard, 'cause it was crunching through that to get it right. But we, we need to make that better in the UI, and more, more clear. We're looking at trying to do it like a real-time map, where you show the camera and show people where to go and stuff like that.


**Paul:** Uh, yeah, 


**Eliot:** Just something like the top-down view, where it shows, you know, here's the object and the camera should be moving around it. That'll help.


**Paul:** And, and the Gaussian splats, they're just in Unreal? Like, there's no other software right now yet? Or I'd assume you can render out and insert it, right? Make a Gaussian splat.


**Eliot:** Yeah, yeah. So you can make a Gaussian splat in any software. The trick is maintaining the original coordinate space. Uh, so we have to do a bunch of very careful things to make sure of that, because otherwise the Gaussian splat [00:35:00] solver will just, like, solve its own thing. It's going to be in some random orientation instead of exactly matching the original.


So we have it running in Blender, and we're about to release the, uh, the Blender add-on with that, because you have to write stuff out of Blender, the original app. Then we take it into RealityCapture, and it does the initial point solve. But you have to constrain RealityCapture to not, like, change the cameras.


And then once you have that solved, then you can take it into Postshot. So we built a script that automatically runs it, 'cause RealityCapture, it's really powerful, but that UI, oh God. Yeah. And it's full of dark, you know, corners, where you have to have four checkboxes this way and three that way. Now you don't have to do that.


So we scripted it and it works. You don't even have to open it up. You can download it with an Unreal account, so now it's free. You download it on the system, you don't even need to open it up; the script just, you know, remote-drives it. Does it all. You have to be on Windows, because RealityCapture is on Windows, but you know.


Yeah, we 


**Paul:** have Windows machines. I, I'm, I'm [00:36:00] not the most tech, I'm not the tech guy in the company, but like I saw this and I'm like, I'm going to try this out. This, this seems promising for the future. 


**Eliot:** I think the, the Gaussian splat stuff is going to be wild because then you can basically make your architectural renderings show up in your client's phone and they don't need to be in the same room, right?


They could be, you know, at a demo or something like that: here's your iPad, you walk around it, and it looks exactly like it's rendered in V-Ray, because it was rendered in V-Ray. 


**Paul:** Well, that's the stuff we're using, right? Chaos V-Ray, uh, and Corona.


That's exactly what we're using. That's awesome. Thank you so much, I really appreciate that. I'll probably pass the floor to someone else; they've been very patient waiting, so thank you so much. I might have to jump soon, but I'll stay here in the background nonetheless, because this is interesting.


All right. No, no problem. No problem. Thank you. Cheers. Thank you, Elliot. Thank you everyone else for your time as well. [00:37:00] 


**Eliot:** Hey, glad we could. Thank you. Thanks, Pedro. 


**Paul:** Thanks, guys. Thank you. Bye. Bye. 


**Brett:** Hey, Elliot. I'd love to show you this shot. I rendered out that it shows the set that we're dealing with.


**Eliot:** Yeah, that's it. 


**Brett:** Um, and uh, this is one that had a full post process, and it definitely looks better than the live preview thing we're doing. Uh, there's a skeleton on the set, but this is the set we're trying to use. I just had a skeleton that I shot to get some green screen. Yeah, there you go.


But we did a handheld shot and we got really good results. The green screen didn't go down to the floor, so there's a little bit of wonkiness, and the desk itself is transparent; that's why you can see through it. Oh, that's 


**Eliot:** fantastic. 


**Brett:** That's yeah. So, but you could see the tracking was great for this. It's really just that live preview.


Um, and it's just because of the people that Paul was bringing in, you know. He's concerned because they're high profile and he wants to really impress them. So he wants to be able to show them, and they [00:38:00] have seen some of what we're doing, and they all want to see: oh, we can see it on set.


Right, we can see it before we shoot. So that's what we're trying to get to: a way to quickly show them, oh yeah, it's going to look like that, but we'll do a full post process like we did on this shot. And we showed them this and they were really impressed, 'cause this is basically the set we want to use, with some modifications. But they're doing like a talk show thing.


So they want to be able to do something behind a desk. Um, 


**Eliot:** are those monitors real? Are they, are those, no, this is 


**Brett:** everything is, uh, is Unreal except for the skeleton. Yeah. That's great. And is the skeleton animated, or is he, uh... No, no, it's just a plastic skeleton I had from Halloween. So he's real.


Okay. Yeah. So he's real. So I put him up against a green screen and then shot this on the set that we're trying to use for this project that we're doing.


**Eliot:** That's fantastic. So just, uh, so because you have translucency in that, is this [00:39:00] achieved with a post composite or are you compositing in Unreal?


So 


**Brett:** the way I did this shot is I rendered out a multi-layer EXR from, um, from Unreal that had the desk on one layer and the rest of the set on the other layer. Uh, and then I just, you know, did my green screen and slid the skeleton between the two layers in Fusion. Perfect.
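Brett's layer sandwich (set in the back, keyed skeleton in the middle, translucent desk in front) is just two premultiplied "over" operations. A minimal single-pixel sketch with made-up RGBA values, not his actual Fusion comp:

```python
def over(fg, bg):
    """Premultiplied-alpha 'over': fg composited on top of bg.
    Each pixel is (r, g, b, a) with color already multiplied by alpha."""
    fa = fg[3]
    return tuple(f + b * (1.0 - fa) for f, b in zip(fg, bg))

# Hypothetical premultiplied pixels from the three sources:
set_bg   = (0.20, 0.30, 0.40, 1.00)  # rest-of-set EXR layer (opaque)
skeleton = (0.50, 0.50, 0.50, 1.00)  # keyed green-screen plate (opaque here)
desk     = (0.05, 0.05, 0.05, 0.25)  # desk EXR layer, 25% opaque (translucent)

# Slide the skeleton between the two Unreal layers:
comp = over(desk, over(skeleton, set_bg))
print(comp)  # the translucent desk reads as see-through over the skeleton
```

Because "over" is associative but not commutative, the layer order is exactly what puts the skeleton behind the glass desk.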


And I was able to do this very quickly, and I used some pretty high-quality render settings in Unreal, because I've been learning Unreal in the process too; I didn't really know it before this. Um, and the main reason we're using Unreal instead of Blender is the live preview thing.


Uh, it's really just, go, hey, look, here's what it is. That's the whole goal: just be able to show something quickly, even do a camera move and go, it's going to look like this. Then we'll unplug, turn the monitor around, and shoot it without them looking at it anymore. 


**Eliot:** Yep. 


**Brett:** Um, but, but it's just that one little bit of time before we start, [00:40:00] whoever's in there wants to go, what am I going to look like behind this desk?


How's that going to look? 


**Eliot:** You know, it's absolutely key. I totally get it. And saying, hey, it'll be fine later, doesn't cut it. You have to show them right there before you're rolling, at least where we're going. Yeah. And even the preview, 


**Brett:** You know, and I haven't figured out in Unreal how to get a live preview that doesn't have severe anti-aliasing issues and other things.


Uh, even on a beefy PC, I think the live preview quality is going to be limited. Um, you can't get something like this, because this took probably half an hour to render. I turned up all the quality settings and did this multi-layer EXR with alpha and all this, but you know, it shows that post is the way to do it.


And that's what, cause I'm a post guy. I think I told you that, but you know, that's a 


**Eliot:** A hundred percent. You can just render everything so much faster, and if you automate stuff, then you can render fast in post, right? It's not [00:41:00] real time. Yeah. It's fine, you know? Yeah, and 


**Brett:** we have some pretty fast PCs, so we're able to, I mean, I was able to execute this shot in less than an hour, I'm sure of that, and that included bringing it in and rendering it out.


Um, well, that is another question I had on AutoShot. When I do the Unreal live preview script, is there a way to make it bypass the render of the plate? Uh, because I don't use that much. I'd love to be able to just export the Unreal live preview without rendering the plate, just so I can get the camera movement.


And then I'll render it out. I can even do a quick low-res render out of Unreal, put it in Fusion, and check it there. Um, 


**Eliot:** I think you 


**Brett:** can, 


**Eliot:** you can just turn that off. It looked like an option. 


**Brett:** Um, so yeah, I'll look at it again. I wasn't able to get it to turn off, so I may have missed something, because I would love to be able to bypass that conversion to EXR [00:42:00] of the camera footage.


**Eliot:** Give me a second. So that's, um, so when I go to, 


**Brett:** so when I go to... actually, it's not the live preview, I'm sorry. It's when I export the shot to Unreal to get the camera movement, so that I'm going to render it. So it's not a live preview, I apologize. It's the other one. It's still using the same thing.


It makes a script and you copy it, or you paste it into the console in Unreal. Um, but what ends up happening is, when you start that process, if your footage hasn't been transcoded, it runs a transcode of the camera footage to EXR to drop that image plate into Unreal.


What I'm wondering is, is there a way to not do that? Because that can take a little bit of time, especially if you've got a lot of shots, and you've got longer shots. Oh, I 


**Eliot:** see. AutoShot automatically does the frame pulls. Um, 


**Brett:** yes. So what I'd love to do is just send it the animation of the camera [00:43:00] without the plate, because then I can just turn around, jump into Unreal, render that out, and take it all into Resolve and Fusion,


and do everything that way. Um, there is a one-frame offset I found: the camera footage is usually, I believe, one frame later, uh, than the render I get out of Unreal. I'm not sure why it's consistently that way, so I just do the offset and then it seems to sync up. And then the other thing, and I asked you about this before: when you're shooting Cine, is there a way


to get, uh, low-res temps, basically the previews that you develop. Right now they're full frame out of the phone, so they don't really match the framing of the cine lens. Um, so using them for an offline workflow, so to speak. Um, it would work great for this project, 'cause we are shooting this whole thing in 30.


So everything would match; the frame rate would match, which is nice. But even when we're doing 24, we can convert them, and it would be great to be able to use those to cut with. [00:44:00]
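The one-frame offset Brett mentions is simple to bake into a conform step once you trust it's consistent. A sketch; the direction and size of the shift come from his report (camera footage is one frame later than the render) and are assumptions to verify per project:

```python
PLATE_OFFSET = -1  # assumption: the plate lags the Unreal render by one frame

def conform_frame(plate_frame: int, offset: int = PLATE_OFFSET) -> int:
    """Map a camera-plate frame index to the matching Unreal render frame."""
    return plate_frame + offset

# Under that assumption, plate frame 101 pairs with render frame 100.
print(conform_frame(101))
```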


**Eliot:** that is also part of what we're going to be doing with compositing in the phone. Because then we're compositing at 24 frames a second, or 23.976,


you know, whatever. And then we just record the MP4 of that, and it's matched to timecode. And then, absolutely, you just drop it in the editorial timeline and it matches one-to-one, timecode-matched to your camera originals. That's what we have to do. Yeah, because then I could, 


**Brett:** because right now I have to kind of process everything.


Because I need to see what it looks like, what the framing is. Even looking at the low-res, I'm like, yeah, but some of the top of that's cut off, we may have gotten too low. Uh, so the only way for me to really see the shot is to run it through the whole process. Now, I can render low-res out of Unreal, so it comes out in a couple minutes, but every time I process a shot in AutoShot, if it hasn't already run the transcode, it will do that.


Now fast machine, that's fine. But if I've got 20, 30 takes and I want to review all of them, that [00:45:00] becomes, you know, something that takes some time. 


**Eliot:** This, um, we'll get that in when we switch to compositing in the phone, 'cause then everything just lines up, right?


It's a 24-frame-per-second, you know, live composite preview. It's the cine footage, matched, you know, exact framing, because then what you see on the monitor is going to be the cine footage, and that's what you get, you know, in the lab. And so that's why I want to... I wanted Paul to jump 


**Brett:** on earlier, because he's not the tech guy on this, but he keeps complaining about the time it takes to do all this.


So I wanted him to get in and hear from you that what we're trying to do is pretty complicated and doesn't always work right away. And to hear that you guys are doing stuff to try to get around some of those things, you know. 


**Eliot:** We looked at a bunch of different ways of kind of pseudo-solving the problem, but the real pro way to solve it is just to comp in the phone, and then everything works, right?


Then the footage [00:46:00] that you edit with is timecode-matched, frame-matched, everything matched at that point. That's just what we're going for, you know. Our existing solution is a little bit of a halfway solution, you know, that we started out with to figure out which way was up. But it just became so clear that we have to do this.


So we're going to do it. It's just going to take us a few weeks to hammer through it and get it right. It's a big old, um... it's a lot of engineering. Uh, but 


**Brett:** absolutely. I appreciate it. And I like the fact that you guys have these office hours and listen to us. I told Paul, I said, you've got to jump on.


These guys are really responsive. Uh, you know, he's the money guy, the guy behind all this, and it's like, it takes so long, it takes so long, you 


**Vaz:** know, 


**Brett:** so I'm like, well, we're learning something, right?


And we didn't have a rig there; I was having to haul my rig out every time and set it back up. Now that should make things a little better,


Now that you realize we're going to have to do that, you know, so. 


**Eliot:** Yeah, I appreciate your patience while we hammer through it. [00:47:00] It's, uh, you know, I think that will solve it. And then you have, uh What's that? Cameras for 


**Navaz:** see if that fixed it. There you go. Um, Hey, good to see you.


Not since yesterday. I mean, right now, um, we actually had a meeting this morning, and I'm going to send you an email later, you know, kind of explaining what we feel we can do. Um, we think it's going to be beneficial for everyone, you know, like something that we can do that can showcase everything.


Um, well, basically in a nutshell, what we'd like to do is two short films every two weeks, 


**Vaz:** you 


**Navaz:** know? Yeah. And basically showcasing the evolution of Lightcraft, because honestly, it's like every week, I mean, well, for us it's like every day, we're finding out something new, you know? Like yesterday with the Gaussian splats and stuff like that, and to be able to incorporate that


on a biweekly basis into our, uh, you know, into the short films, you know? [00:48:00] So, I mean, right now I'm putting it together, um, just an email breaking down what we would like to do and stuff like that, so that way we can at least see what's feasible and what's not feasible, you know. And, uh, 'cause one of the things that we were talking about this morning, which, I mean, I don't even know if it's possible yet, but we're going to look into it, was animated Gaussian splats, which I don't know if anybody's done yet. What we were thinking is that if we were able to have an array of cameras and film somebody, let's say, walking around, um, then each frame is turned into a Gaussian splat, to where you can actually view it from any different angle.


**Eliot:** Mm-hmm . 


**Navaz:** You know, and really what that's for is a different project, which I'll talk to you about, which is going to be pretty huge. But to be able to do that, because then let's say you were in a virtual environment and you [00:49:00] end up going in front of the actor: you'd be able to see them from a different point of view, with different lighting, and, you know, say if they're wearing glasses, you can see through their glasses no matter where you are in that environment. I mean, I don't know if anybody's done that yet.


And I know it's a huge undertaking, but at the same time, I think that's the way Gaussian splatting is going. Um, I mean, to do it frame by frame would be, you know, it just sounds daunting to think about, but putting it all together would be really interesting. Because, I mean, we were looking at what Disney did, um, which was showing, I think, on Wednesday.


Um, I don't think you were there to see it, but they were focusing more on, um, VR movies, and why they would work and why they wouldn't. I mean, honestly, we were looking at it as, okay, we can make it work, but with live actors. I [00:50:00] think the problem people have is they feel like most of the VR content that's come out is mostly a 360 camera.


You put it into VR and that's the experience, you know? I mean, we're looking at it more like, there's got to be a way better way to get that 3D parallax effect of moving around an object without, um, without it being on a 2D plane. Does that make sense? 


**Eliot:** Yeah. Yeah. So you guys are seeing what the research world is doing.


So there are examples of moving Gaussian splats, and I'll tell you, it's out on the hairy edge of R&D. Like, way out there, you know. There's some crazy stuff out there, and it's usually captured by like a big array of cameras, and they have to do a lot of wild stuff.


**Navaz:** Yeah. But we were thinking more simple: just an [00:51:00] actor. You know, instead of an environment, we do it on an actor, and Gaussian splat the actor, you know, from different angles, so that way we can move around the actor in a virtual environment. Does that make sense? 


**Eliot:** Yeah, the current way, I mean, all we're set up for right now is the stationary, you know, basically stationary Gaussian splats. We bake out the Gaussian splats of that and, you know, generate and do the Postshot pass and that kind of stuff.


And that's working, you know; we're hammering through the details to get it right. Um, I mean, if you want to dive into actor capture, I can find the paper if you're curious about it. This is, uh, uh, yeah, I guess 


**Navaz:** there's some, yeah, we're planning on it, actually. I mean, well, you know, now that you've met us, Eliot, you know how fast we actually move on things; we'll probably end up having a test by the end of the week.


You know what I mean? And, uh, you know, to see how it would work. [00:52:00] 'Cause honestly, I think that's the one thing VR is missing: being able to show people, um, you know, actors from different angles, instead of being on a 2D plane. You can't walk around a 2D plane, you know. To be able to, um, see it from all different angles with different lighting and everything,


I think that's what's going to be huge. Or, I mean, this has nothing to do with Lightcraft, obviously... or it kind of does, because we're actually going to be doing, um, a mock-up in Lightcraft, you know, to see how it actually works. 


**Eliot:** You guys want to see this? This is the wild, hairy edge of the R&D stuff.


So this is, they're computing stuff from, uh, you know, that's exactly what we're looking at. They're breaking a lot of PCs doing it. And I, like, 20 percent understand what they're doing; there's a bunch where I go, okay, this is amazing, I don't get it. I'm going to wait a little while until somebody comes out with a commercial implementation [00:53:00] or something we can compile with, but it's incredible, right?


It's, it's some crazy stuff. That's exactly what 


**Navaz:** we're looking at. 


**Eliot:** Yeah, they're doing some stuff where, you know, they're starting to animate it. I saw one where they changed what was on their t-shirt as they're running around, and I see it, I do not understand it, but I see it. And I'll, I'll, uh, 


**Navaz:** wait.


But yeah, overall, um, other than that, we're basically looking at trying to figure out a plan and allocate time, obviously. I mean, right now we're trying to look at our schedules and see what we can actually do. Um, I'm looking at trying to do 14 days of commitment, you know, to allocate at least 14 days, you know, per project.


Or not per project, for both projects. So it's like seven days per project, which honestly, I mean, have you seen what we've done so far? We're able to do stuff in two hours, you know? So if we're able to do it [00:54:00] in two hours, I'm pretty sure we can get it done in a week, especially now that Conrad's on board with us.


And with Conrad willing to assist with it, he brings a whole plethora of knowledge, you know, to speed up the process, you know? And I mean, I really think that's what people need to see. Because what we got from yesterday, from the production summit, was everyone thinking it's just a previz thing, you know, or at least that was their first impression. And our explanation was, no, it's not a previz thing.


This could be final product, you know, if you actually just do these certain steps and look at it as such. And I mean, it's basically like a volume, a portable volume. An affordable, portable volume at that, you know. And, uh, I mean, that's what we're looking at. And so that's why it's really good for us to at least work with Conrad, since he's been around for so long, you know, and he understands [00:55:00] more, um, excuse me, more of the details. Because honestly, I'll tell you this:


We have not run into a problem that we can't overcome with Lightcraft, right? I mean, the way that you guys have it now, within the last six months, it's been a lot more user friendly, you know. I mean, there are issues; obviously it crashes every now and then, you know. But to reset it up,


all you've got to do is restart Jetset, you know, and, um, you're back at it, you know. Compared to, like, once something goes down on a real set, you have to go figure it out. I mean, we're talking about like a 30-minute wait between being able to shoot again, compared to, with Jetset, we're able to do it in minutes: oh, we're ready to shoot, you know, just give me a second. But, um, yeah, so right now I'm just compiling an email, you know, kind of breaking down what our idea is, and


You know, just gimme a second, you know? But, um, but yeah, so like right now I'm just compiling an email, you know, that we're just kind of breaking down what our idea is, you know, and. You know, seeing if it's beneficial to you guys and beneficial to us, you know, cause honestly, um, I mean, like we told you, it's like, we're not [00:56:00] really YouTubers, but at the same time, YouTube is a powerful tool, you know, and it's a powerful tool also for, for, um, to show other filmmakers, cause that's where other filmmakers are going to learn, you know, to, to, you know, how to do things better, they're going to YouTube to sit there and see, well, okay, what's out there.


You know, and a lot of people, when they see Jetset, at least the people that we know who have seen Jetset, on the surface it looks complicated, because of the Marvel way of thinking. That's what I like calling it. The Marvel way of thinking is that when they look at VFX, they think, okay, minimum 50,000, a hundred thousand, maybe up to a million, you know, no matter what it is, without even diving into it and asking, okay, well, does it really cost that much? And really, it doesn't, you know. I mean, when we broke it down and started seeing what we were able to [00:57:00] do,


it really puts in perspective, okay, well, where's all the money going when these big productions are doing these, you know, fantabulous shots, you know what I mean? And we're able to do it for next to nothing. I mean, obviously before Lightcraft, there was no other real option.


You know, but back then, like, hand-tracking 


**Eliot:** every shot is just going to take you 


**Navaz:** forever. It's just ridiculous, you know. I mean, it would take weeks, months. But now, with the way that AutoShot is putting stuff together... 'cause one of the biggest things for me was the file management, you know, to be able to know where your files are and have it all already put together.


I mean, it's very well put together, and that makes it so much easier to understand, instead of having to go through, uh, let's say a hundred shots and try to figure out which one you're looking for. We can just go, okay, it's already this one; we name it whatever we want to name it, you know, and we know where it's at.


And I mean, that's where I think that by showing people... one of the [00:58:00] things that Conrad was talking about was doing a livestream of us making, you know, uh, a behind-the-scenes. So, like, okay, eight o'clock we start, and our goal is to be done by five o'clock, you know what I mean?


And see where we're at at five o'clock, so that, um, you know, the viewers can actually see. I mean, they can ask us questions, they can make comments and suggestions and stuff, but at the same time, show them how the process actually is using Jetset, you know, on a minimal budget, with basically, you know, what everybody could have.


I mean, everybody has a camera. If you're doing filmmaking, you have a camera, you have a phone, you know. So can you actually do this on a minimal budget? Yeah. You know, and most of the models we can get for free, or we can do the kit-bashing, um, with Cargo and stuff like that.


Um, but we could show, okay, well, this is the whole evolution of it. And then on top of that, showing how we can use the VR with it, you know, for set [00:59:00] design, and then showing them how to move stuff around. I mean, just basically going through the whole process. Because what we've noticed is that it's one thing for us to tell people, hey, this is what we're trying to do.


This is how we're trying to do it. But then when you show them, they're going to be like, whoa, okay, that is pretty impressive, you know. And then, and then... go ahead. 


**Brett:** I wanted to ask you a question, actually. What's your pipeline? Uh, because we're trying to do somewhat similar types of things.


Okay. Are you doing any kind of live preview on set, or are you just using everything on the phone? Are you running out to a PC, trying to do anything that way? Um, well, 


**Navaz:** our live preview is, um, well, we haven't done the live preview, like the Live Link, if that's what you're asking. We're not doing the Live Link.


Yeah. Just because what we're doing is more, I mean, we stream from the phone, the phone view, to the screens. Okay. So you can see, you know, what it is, you [01:00:00] know, um, or what the camera sees. And we found that to be more beneficial. And you're 


**Brett:** sending that out to external monitors? 


**Navaz:** Yeah, to external monitors.


Yeah. What, 


**Brett:** what are you using? You're using like an Axon wireless or something to do that? 


**Navaz:** No, actually we're, we're just screen mirroring. Oh, 


**Brett:** okay. Yeah. Like an Apple TV or something like that. 


**Navaz:** Yeah. Well through Apple TV, uh, smart TV. Yeah. 


**Brett:** Okay. 


**Navaz:** You know, and, 


**Brett:** uh, that's a great idea. Yeah. We're trying to figure out ways to quickly show it; that's it.


We have an Apple TV. That's actually a really good way to do that. I appreciate it. I'm just very interested in how you're doing it, because it sounds like you're locking it in. I'm curious about your whole workflow, what 


**Navaz:** your pipeline is. So basically, I mean, well, the way our setup is, is that, I mean, we have a Mac book, we have, um, you know, the phone.


I mean, because, you know, excuse me. So with Jetset, with AutoShot, you get an IP address. Now, this is what's interesting. So you get an IP address, [01:01:00] which is a web page. So from that web browser, you can put it on... it doesn't even have to be an Apple TV. You can just put it on a regular TV or any smart TV; you just go to that web address, as long as it's on the same network, and you can get the live, um, video feed from the camera, or from the, from the iPhone.


So you can see, yeah, from the iPhone. And so, you know, we just put that on, you know, an 80-inch flat-screen TV, and everybody can see it from there. That's 


**Brett:** great. I feel so stupid for not even thinking of doing that. But that's a great way to show the previews. 
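The monitor trick Navaz describes needs nothing but the phone and the TV's browser on the same network. A small sketch of working out which LAN address to type into the TV; the port here is a placeholder assumption, so use whatever URL AutoShot actually displays:

```python
import socket

def lan_ip() -> str:
    """Best-effort guess at this machine's LAN address.
    A UDP 'connect' sends no packets; it just selects a route."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        s.connect(("8.8.8.8", 80))
        return s.getsockname()[0]
    except OSError:
        return "127.0.0.1"  # offline fallback
    finally:
        s.close()

# Hypothetical port; AutoShot shows you the real address to use.
url = f"http://{lan_ip()}:8080/"
print("Open on any browser on the same network:", url)
```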


**Navaz:** Yeah, but that's part of the, um, the situation I'm explaining to Eliot: to be able to show people, hey, this is how our setup is and why it's so effective. Because that way, you know, um, we can move the screen around to where the actors can see what's going on,


so they know how to interact with certain objects or, you know, whatever it is. Um, I mean, that's one way to do it. The other way, I mean, what's interesting about that [01:02:00] way is that you only need to have AutoShot open on one of the computers, right? But then what we do is we stream from the MacBook to an Apple TV.


So we can actually have multiple streams showing us, you know, everything that's going on from different angles. Um, but being that we're only two people filming right now, I mean, we're filming, you know, quick shots. Not quick shots, but let's say cinematic shots, 'cause we're going from filming to a final product within like two to three hours for each individual shot.


And it's, and what's really our Achilles heel is just processing. You know, it's rendering out. That's the only thing that takes the most time. What's 


**Brett:** the 3D software you guys are using? Blender? Or are you using, 


**Navaz:** yeah, we're using Blender. And, um, listen, I mean, 'cause I think, if I remember correctly, you're doing the podcasting, you know, so 


**Brett:** yeah. So the location that we're working on, my partner [01:03:00] was on earlier, he's been doing a bunch of podcasts with the comedians, and he's shooting on the green-screen stage, just throwing something behind them. But they've been talking about


doing some sketch comedy. They want to do stuff that they've written, and we kind of pitched this idea, because one of the things was a talk show, and that was the set I showed a little earlier. But they want to be able to kind of switch sets in and out for short sketches. Uh, so that's the plan: we want to do short sketches.


But the thing we've been really hung up on is, because of the people involved, the producer, the guy that owns that space, who has all the connections, wants to be sure that we can preview something that's reasonably close to what it's going to look like. We'll still do a post process, but we've been using the Unreal live preview script, and that's been having a lot of headaches.


There's a lot of setup; there's just a lot of variables that can cause it to unravel quickly and make you [01:04:00] spend hours troubleshooting the thing. So I'm going to pitch him this idea, 'cause we do have an Apple TV on site: just doing it all through Apple TV and mirroring the phone.


'Cause that's fantastic. And if he goes for that, then I can stop doing the Unreal live preview and say we're not going to do that anymore. Uh, and I would love to be able to say that. And then we can go to Blender and do it however we want. 


**Vaz:** Yeah. 


**Brett:** 'Cause to me, Unreal still feels like a gaming thing.


Like I said, I've come to it recently. I'm more of a traditional visual effects guy. So that interface, even just the way that Unreal works, is very different from anything else I've ever used. So, um, it's been a struggle there, but I've gotten pretty competent using Unreal. But this preview script has been giving us a lot of trouble.


**Eliot:** And one of the things that you might find useful, I already got Vaz this, is that, like Megascans [01:05:00], you know, Unreal can bring in Megascans pretty well. But when you take a Megascans download and try to get it into Blender, you've got like several hours of hand fixing and stuff like that.

We actually built a Blender add-on: you unzip the Megascans folder, point it at the top, and it goes and makes all the materials into Blender assets and all the 3D things into correctly textured 3D Blender assets, so then you can just drag them and drop them into your scene.


And it works. It's so new that I haven't done the tutorial for it yet; that's for next week or so.


**Brett:** Yeah, 'cause I would love to be able to use Blender more, but we've been kind of tied to this Unreal thing because I'm trying to do a live preview, and there was really no way to do that.



**Navaz:** That was the main reason why I wanted to do Unreal: that live preview. But now that I've been using Blender, I mean, Blender is so much easier. It's very easy, and the [01:06:00] learning process is so easy compared to Unreal.


I mean, at least for me personally, I'm older and I hate learning new things, you know, but at the same time...

**Brett:** Oh, I've got as much gray as you do, so I...

**Eliot:** Nobody here is under 40.

**Brett:** I'm sure of that. Yeah.


**Eliot:** Yeah. I'll tell you what: Blender kicked my... I'm sorry, Unreal just kicked my rear end. I looked at that and I get it, I've even programmed in that thing, and oh, it's just hard. In Unreal, every once in a while you'll get yourself up a creek and you can't get out of it. Whereas with Blender I've never run into that, you know? It's like, oh, this went wrong, look it up, oh, that's what I need to do, fix it and move on.

**Brett:** And so, yeah, we did a thing, we were there yesterday. We just changed the frame rate, because Paul wants to do things in 30 and I had been doing everything in 24.


So first I changed to 29.97, 'cause that's what he asked for, and [01:07:00] it looked like we were getting drop frame timecode inside of Unreal. I'm like, well, we don't want drop frame, we want non-drop frame. So I must've spent two or three hours trying to solve this fucking timecode issue.


There are like seven or eight fields inside of Unreal where the timecode has to be reset. And you have to reset it on the Tentacle. I mean, it's just all of these places, and we were chasing our tails trying to figure out why things weren't working. And it was just because I had changed it everywhere except for on the Tentacle.

I forgot about the Tentacle, and it was throwing my timecode and my camera signal way out of sync. I couldn't get them locked up, I couldn't do it, and I didn't know why. Because in Unreal, just finding all the places where I had to change that was a task, you know. Well, how many places do we enter 24 as a timecode? Apparently quite a few.
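As background to the drop frame versus non-drop confusion above: 29.97 drop-frame timecode skips timecode *labels*, not actual frames, so the displayed clock stays in step with wall time. A small sketch of the standard counting rule (two label numbers skipped at the start of every minute except each tenth minute); this is our own illustration, not code from Unreal or Tentacle:

```python
def drop_frame_tc(frame: int) -> str:
    """Convert a frame count to a 29.97 drop-frame timecode string."""
    fps = 30                 # nominal rate used for 29.97 drop-frame labels
    per_10min = 17982        # 1800 + 9 * 1798 labels in every 10 minutes
    per_min = 1798           # labels in a minute that drops two numbers

    d, m = divmod(frame, per_10min)
    if m < 1800:             # first minute of each 10-minute block: no drop
        frame += 18 * d
    else:                    # add back 2 skipped labels per dropping minute
        frame += 18 * d + 2 * (1 + (m - 1800) // per_min)

    ff = frame % fps
    ss = frame // fps % 60
    mm = frame // (fps * 60) % 60
    hh = frame // (fps * 3600) % 24
    return f"{hh:02d}:{mm:02d}:{ss:02d};{ff:02d}"  # ';' marks drop-frame

print(drop_frame_tc(1800))   # first label after the first drop: 00:01:00;02
```

Non-drop at 29.97 simply counts 0-29 with no skips, which is why the two modes drift apart and why every device in the chain (Unreal, the Tentacle, the camera) has to agree on one of them.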


**Navaz:** Brett, what you'll notice, well, this is what's going to blow you away. [01:08:00] Let's say you weren't using that whole Unreal live preview setup: just by taking that IP address and putting it in a TV browser, you're going to be like, wow. All that time and effort you guys put into it, and you're able to see the same thing in a live preview on a screen.


**Brett:** So that's running through a browser, you said? Where's that IP address from?


**Navaz:** When you go into AutoShot. Oh, sorry, Eliot, if you wanted to explain. But if you go into AutoShot, you know how when you do the takes, you go to that screen where it flashes? Now you go all the way to the right.

What would it be under? I forgot what it would be under.


**Eliot:** Yeah, I'll punch it up, actually, because this is worth seeing.

**Navaz:** So this will give you basically the phone preview in a browser.

**Brett:** The phone preview?

**Navaz:** Yeah. With the background; whatever the phone is seeing is what you'll see on this browser.


So you don't technically [01:09:00] need the Apple TV. As long as you have a smart TV that can browse the internet and it's all on the same network, you just put in the IP address. I think it's the IP address with port 8080, and then you'll be able to see it right in there.


And then you can control it. Something we haven't tried yet, but probably will, is whether we can actually initiate the takes from there, because it does have a record button on it. You just hit the record button, and it should be able to do it. Let's see, Eliot. Yeah, there you go.


**Eliot:** I'll show you real quick. It's just talking to the IP address on Jet Set. So if you're in AutoShot, you just click Open Web Page, and it'll pop open the web page that has it. 


**Navaz:** But you see that IP address that's right next to Connect? The 192.168? That's what you have to copy into the browser.

**Brett:** Yeah, that right there. Where is that? Which field is that?

**Eliot:** Oh, I see. Okay, a quick [01:10:00] correction on that: that's close to it. If it's on the same network, you can see this automatically detecting the iPhone.


And then it opens the webpage. This field is actually a manual override, in case it doesn't see it here; if it detects it here, you can just click Open Webpage. The other way to do it is inside Jetset: you go into the main menu, go to the settings menu, and go to the bottom.

It'll say URL, and it'll tell you what it is; in this case it's 192.168.86.36, port 8080. Then you pull this up, and this is the digital slate. From the digital slate you can also go to settings, and you can also go to video. There we go. And if I hit play...
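The preview address being read off here follows the usual `http://<phone-ip>:<port>` pattern. A tiny sketch of assembling it (the helper name is ours; port 8080 is the one mentioned in this session, and the exact URL is displayed at the bottom of Jetset's settings menu, so confirm it there):

```python
# Build the browser URL for Jet Set's live-preview page from the
# phone's LAN address. Paste the result into any browser (smart TV,
# laptop) that is on the same network as the phone.
from ipaddress import IPv4Address

def preview_url(phone_ip: str, port: int = 8080) -> str:
    IPv4Address(phone_ip)  # raises ValueError on a malformed address
    return f"http://{phone_ip}:{port}"

print(preview_url("192.168.86.36"))  # -> http://192.168.86.36:8080
```

The validation step is just a guard against typing the dotted address wrong; the page itself is served by the phone, which is why everything has to sit on one network.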


Is it going to... where did my video go? Yeah, that's actually interesting. Play should give you the live preview, and I'm actually curious why it's not. So, all right, I'm going to go [01:11:00] look at that, but normally you pop this open and it gives you a live preview.


And this is the remote control system that we had to build, which Kumar, among others, is giving a workout. Because a lot of times when you're running on set, the phone's already on the camera, the camera's up on a boom, they're flying around, and you can't run up and tap on it, right?

So you want a remote control console; that's what this is for. We all use it as the digital slate as well, because Jet Set has a little mini web server built into it, so we can talk to it remotely via the web. But I loved that Vaz came up with the idea of punching that into a smart TV; I never even thought of that.


Yeah, it's great.

**Brett:** I mean, I can just get a quick preview. It doesn't have to be through Unreal. It just has to be something where I go, this is what it's going to look like, and everybody goes, oh, it looks great. Then I turn it off and I do my thing. They don't have to worry.

**Navaz:** What's crazy is that all it is is just putting it on [01:12:00] and then clicking video, and there you have it. So the Unreal live preview and all that tracking setup, you don't really need it.

**Brett:** I've loved it and I hate it. I hate it. It's such a nightmare.

**Navaz:** Yeah, it really is. I mean, eventually it's going to get better, but for right now, you're getting a live preview from the phone with the background; whatever you see on that phone is what you're going to see on that video preview.


**Brett:** I'm definitely going to try that out. 


**Brett:** So, your name is Vaz, right? Is that right?

**Navaz:** Oh, Navaz, Navaz. But what's easy for people to say is Vaz, V-A-Z.

**Brett:** Yeah, okay, Vaz. All right. Anyway, I like your idea. You're getting it by copying it from that field in AutoShot, you said, right?


Yeah. You just copy that IP, put it in a browser, and you get the preview right away on any TV that's on the network. 


**Navaz:** Yeah. 


**Brett:** All right, yeah, I'm going to try this, because if this looks good enough, it solves all the problems. It solves our [01:13:00] biggest problem, how about that? It's a big problem, because Paul's getting very frustrated; that's why I wanted him to hear it from Eliot. Like, this is hard.

**Navaz:** Well, the one issue that we did find is that you can only have one screen streaming from that video feed. So let's say we push play on our monitor, and then we push play on the TV: the TV becomes the only feed that we get. But then all we do is screen mirror from the TV to another TV if we want every single TV to have it. But it's still on the phone at that point, right?


**Brett:** Okay.


**Navaz:** Yeah, it's still on the phone. It's just basically taking the live feed from the phone and putting it onto whatever device, TV, or monitor you want to use. That's the solution we found, rather than using Unreal with the live preview. I mean, obviously there are benefits to having the live preview, [01:14:00] especially for the future stuff that we're talking about doing with lighting and VR.

But yeah, after talking yesterday, I think we can do the same thing in Blender. And so if we can do the lighting setup that we were talking about, the adaptive lighting, in Blender, we're just going to keep running with Blender, because it just seems like Blender is more versatile for our needs.


What we don't want is what you described: going through everything and it's just one little thing that's not right. We want everything we have on one table, where we can see it all and know, okay, this could be the issue or that could be the issue. Not where something totally unrelated could be the issue, you know? 


**Brett:** Yeah, it's been very frustrating, because every time I go out there, it's something. I've got it streamlined at home where I can get the whole thing running in about 10 minutes, but there I have to take the camera and the rig apart and then put it all back together.

He and I, [01:15:00] the guy that runs this place, we both come from post; he was a colorist for years. I'm like, you understand this: anytime you install something new, or take something apart and put it back together, something happens. It always happens. We've both done this for 30 years.

We know this is the way it is, but he gets very frustrated. Because he's the money guy and he's a colorist, no offense to colorists, he's not as technical as I am, so he doesn't understand what I'm trying to tell him. This is why I need to set up a rig here. So we're getting there.


**Navaz:** Then you understand how simple it is. It's like, oh, I can see it already. It's almost filthy, you know.


**Eliot:** You know, the one thing you're going to want to watch is that it does use Wi-Fi to transmit the video signal, so the quality of the video signal depends on avoiding...

**Brett:** He's got an extremely fast network; he's got like one-gig feeds. [01:16:00] Even his uploads are really fast. He's got a blazing network over there. 


**Eliot:** Okay. So yeah, every once in a while you run into something. Whenever we have people who are testing and are going to go on a shoot, we actually recommend bringing along a travel router, because it patches in underneath the existing Wi-Fi. Sometimes, man, you go onto a stage and the Wi-Fi is just bonkers; no one knows what they did to it, and it doesn't work.

**Brett:** He's got a real quick network, so that's not been an issue for us. It's actually faster than what I have at home. When I'm moving 3D models back and forth, it's very fast; everything is very fast.

**Navaz:** One of the things that we were able to do was create a wireless hotspot with our laptop, and we used that instead of a travel router. That actually worked out very well for us. At first we tried making a wireless hotspot from the iPhone, and I don't know, it just seemed a little bit laggy. But then we [01:17:00] did it off the laptop itself, created the mobile hotspot, connected the phone to it, and then connected the monitor to it. Flawless. 


**Navaz:** And that's more for when you're remotely on a location where there isn't Wi-Fi, or say you don't have a wireless router. Our thing is more trying to be minimalist when we go out to shoot, especially if you're trying to demo it to somebody; it's a lot easier than having to take all that time to set everything up.


But if you have your laptop already set up with the mobile hotspot, everything can connect to it right then and there. You're up and running within, I don't know, two or three minutes, and you can actually show them the live feed, compared to having to boot up Unreal and do that whole process. Like I said, there are benefits to having it that way, but for us to be able to show somebody, it's so much easier to say, hey, this is what you're seeing. [01:18:00]


**Brett:** Yeah, you save the headache of, oh, I'm sorry, something's wrong, give me five minutes. And that turns into an hour; that's what's been happening to us. I'll hook it all up and I'm like, oh, it was all working perfectly at home, and now I've got to figure out why it's not working here. Then it becomes a troubleshooting session. And I have to move the project, because he's got a PC out there, so I put the Unreal project on a little drive and move it out there. We talked about doing it via the cloud and other ways, but there's just always some little thing. He's got a different IP address, so I've got to reset the Live Link. There's always going to be something that doesn't work perfectly.


And even if everything's preset in Unreal, there are still about five or six things you have to redo every time; you've got to re-export that Live Link script. That's what I was talking about, what we start to see over time: if we try to reuse the composite that it output the first time, it starts to drift, probably because we move things around or something's changed that we forgot [01:19:00] about.


And then we're like, oh, we have to redo it again. Because we're using these layered Unreal projects, we have to rebuild the composite every time. It just becomes this process where you're like, okay, the desk is not in front of the actor now, let me go fix that. There are always five or ten steps I have to do every time, no matter what, even if everything's working perfectly.


So I'd love to just go, oh, boom, there it is, that's what it's going to look like. Okay, everybody happy? Let's go. 


**Vaz:** Yes. 


**Brett:** That's what we want. That's what I want, because I'm so tired of him hearing it from me and not understanding that what we're doing here is very complicated. We're tying all these different pieces of gear together, trying to make them all work and talk to each other.


And the other thing I've found is that HDMI is a complete nightmare when you start doing long runs, which I do. So we started using SDI; basically we run a converter, because the Unreal card is SDI. But [01:20:00] we were basically running these long HDMI cables and then converting at the very end.


But because you're running 20, 30 feet of HDMI, it's dropping out all the time. We're losing the signal, or the connection to the camera's a little loose, so if you move it the wrong way it drops the signal. And Unreal is not so clean that you can just plug it back in and everything starts working again; it freaks out.


It's like, oh, reboot, reboot, reboot. So this is a glimpse into what I've been dealing with. And I've got this producer on my shoulder saying, I can't bring somebody in here if it's like this, you know. So I just need a way to go: what about this? Can we do this? Is this good enough?


To solve this problem and erase it from our list; it's been at the top of the list for like a month now: well, we can't do this, we can't do this. So I appreciate you telling me what you guys are doing. You guys are in L.A. too, right?

**Navaz:** Yeah, we're actually located in Simi Valley.

**Brett:** So we're not that far. I'd love to [01:21:00] come out and see what you guys are doing at some point, if you wouldn't mind. Like the next time you're shooting.

**Navaz:** We shoot every night. I mean, that's actually what we're talking with Eliot about: coming down to their studio and filming down there, so that we can make better tutorials.


Or not so much better, because that would make it sound like the tutorials are bad now, but more detailed tutorials on the things that we've found as users of Lightcraft. Because when you explain it in technical terms, you have to be technical to understand it; but when you actually show it, it's like, okay, this is how we've been able to do it, and this is how we've been able to get away with it.


I mean, our setup, our rig, it's all 3D printed. I'll show you, this is our rig.

**Brett:** Oh, wow. Awesome.

**Navaz:** And it's really lightweight, because I printed it [01:22:00] in TPU. 


**Brett:** Oh, wow. That's a great idea. We're buying all these little things to rig this thing out. I don't have a 3D printer, but that's a great idea.



**Navaz:** I mean, I have like 3D printers on top of 3D printers. 


**Brett:** Wow. Excellent. If you guys ever want to come out, we've got a nice green screen spot that he's got all rigged out with lighting, and he's got a 6K and a 6K Pro; he's got like five or six cameras.

And he really loves this collaborative thing, getting people involved. So I'd love to come out and see what you guys are doing, and if you ever want to come and see the set that we've got, we're in West L.A. I almost said the city.


**Navaz:** Well, how about this: the next time you go down there, maybe we'll find some time to come down, so that if you're going to display it to somebody, you're not going to be alone trying to explain it all.[01:23:00] 

I will say, we're not Unreal specialists, so if you're using Unreal, that's something that we're not familiar with.

**Brett:** Well, I'm more looking for this live preview thing, and I'm just picking your brains about what you guys are doing, because it sounds like you're doing something that's not entirely dissimilar to what we're going to do.


I mean, you're doing short films and we're doing sketch comedy, but the idea is the same. We want quick turnaround. We want to be able to do this fast, have it look good, and have everybody go, wow. And because of the people that Paul knows, if we can get them involved, he's got some name comedians that could definitely make something like this draw an audience, you know.

**Eliot:** That would be fun, because I was seeing some of the old Python skits, and they're going all over all these interesting locations, in places that have lots of unusual little spots. And I remember thinking this will be great when we can get this into the sketch comedy [01:24:00] world.


**Navaz:** I just sent my phone number and my email address, you know, in case.

**Brett:** Oh yeah, I'll put mine in. And I go out there at least once or twice a week, because I live in Burbank, so it's a bit of a trek for me. He's in West L.A., so it's probably even farther for you guys.


**Navaz:** But you know what? At least it helps you get a step ahead. Because here's the thing that's interesting: I feel like I see things differently than most people with this Jet Set thing, because the difficulties that you're having, I don't see the same difficulties.


I see it as something we could solve. I mean, the way that Eliot has developed the software, it's so easy that I think a lot of people overthink it, you know? 'Cause for us to be able to do stuff in less than two hours is just, I don't know, pretty impressive. In fact, I can actually show you the video right now, but I don't know how much time we have left. [01:25:00] How much time do we have?

**Eliot:** Yeah, show that quick video.

**Navaz:** We're gonna see it now. Yeah, can you guys see this?


**Vaz:** Yeah. 


**Navaz:** Yeah. All right.


**Brett:** Oh, it's great. 


**Eliot:** Nice rack focus, folks.

And that right there, you already captured.


**Navaz:** Alright, let me get out of there. Let's see how I can stop screen mirroring.


**Brett:** There you go. Cool. All right. Well, that first shot looked really great. It looked [01:26:00] like everything was virtual there, right? The set? Or did you have any practical at all? 


**Navaz:** Um, yeah, everything was virtual. 


**Brett:** Yeah, 


**Navaz:** Can you guys hear me?

**Brett:** Yeah, yeah, we're good.

**Navaz:** Yeah, that's better.

**Brett:** Anyway, I'm very impressed.


I'd love to have you guys out. I'd love to see your setup and have you check out ours. Like I said, he's got a nice little green screen spot. So if you ever need that, he's pretty flexible. He's just trying to get people involved and develop projects to come in and out of there; that's really what he's trying to do besides these podcasts.


But the podcasts give him these connections that really could help us get at least some name comedians involved. Anyway, I'm very impressed with what you guys are doing. It's very cool. Thanks for sharing. 


**Eliot:** Yeah guys, this is awesome.

**Navaz:** Wait, you know what? It's all because of Jet Set. I mean, I guess the interesting thing [01:27:00] is that no matter what we're filming, VFX or anything like that, we look more at the practical VFX side, like what you're talking about with doing the scenes.


Most people, when they think of VFX, they think of spaceships and fantasy stuff. We're looking at it as set replacement, location replacement, where now, instead of having to spend $20,000 on permits and the location and getting it the way you want it, we can actually do it virtually and make it look really good, you know?


I mean, what's interesting about the shots that we showed you is that we did those in two hours; we got it to that point in two hours. If we'd actually had, let's say, a few more hours, it would have been, not production ready, but ready to go on YouTube.


There's no way that people would be able to tell that they weren't real. And so even with what you guys are doing, I think [01:28:00] that's what's kind of good about people like us: we can come in and at least give you guys advice on how we would do it, compared to the way that you guys are doing it. Not that yours is wrong, just the easiest way to do it, being that we've been using Jet Set religiously. Like I said, even if we're not filming VFX or anything virtual, we still have Lightcraft running, because at the end of the day, if a producer goes back and says, hey, I want to add an explosion here, or a portal here, or I just want a cup here, we at least already have the tracking data to actually add that.


**Brett:** That's great. You know, it's a brilliant idea, actually. 'Cause I spent my entire career taking things out of scenes or putting things into scenes with nothing like that, going, okay, well, that's gonna take a few hours. Like the Starbucks cup in Game of Thrones; they probably could have taken that out like that, you know. But tracking, it's...

**Eliot:** [01:29:00] But I'll say that there is...

**Brett:** I'll reach out for sure.


**Eliot:** Yeah, yeah, link up. And I'll also say that getting Gaussian splats of a live action set is going to be one of the things where you can actually just render in a clean plate; you just make your own clean plates where you need them. We were talking this over with one of the effects supervisors yesterday, and that's going to be a big deal.


So this is awesome.

**Brett:** Oh, thanks, Eliot.