Transcript

# Office Hours 2024-11-15


**Eliot:** [00:00:00] Okay. 


**Mark:** Morning. Oh, hi, Eliot. Hey, how you doing? Okay. Good morning. You're the one with the picture. This, uh, Juniko is on board, too. Oh, fantastic. Hi, Juniko.


**Juniko:** Hi, Eliot. I'm working with Mark on the book, so I've been using Jet Set, too, and we've been trying to decipher things so we could add them to the book.


**Mark:** Oh, fantastic. Yeah, I've been putting it through the dunderhead experiment, me. Oh, no worries. No worries. Uh, and the last project, hopefully this will be brief, but the last thing I tried, I thought I was really getting into it, but my impatience. I'm only able to do a test in between the breaks in my class, but I had a green screen, and I used the Charterhouse, and I kept moving the origin point until I got it to a point where I was going forward, and I had a person sitting on a [00:01:00] chair, and it was lining up to the piano, right?


But I didn't make a project. We did, Juniko managed to find all the material. Um, now, we never record just the background by itself? I guess it's always just the composite and the raw green screen. Is that right?


**Eliot:** That's what's recorded in real time. You'll have to pardon my voice, because I'm coming out of a cold.


So it records the composite and records the camera original in real time. And then in the Review tab, you can hit a button, it looks like a little film strip button, on the live comp, and it'll re-render that as a clean background pass. So, yeah. We just added that workflow, originally designed for re-rendering Gaussian splats, but it'll work for any CG background.


So I'll show you the link to that, because you're going to want that. Let me show you the tutorial.


**Mark:** Yeah, because, [00:02:00] uh, when Juniko tried to, you know, reverse engineer it for Unreal, she had a lot of problems. What, the thing was upside down or something, Juni? I don't know.


**Juniko:** Yeah, yeah. I, um, you know, Mark found the Charterhouse, um, you know, information. It's just the OBJ and the textures, you know, and materials. But when I tried to bring it in, I can only download it as a USDZ, and so when I tried to bring that into Unreal, the way I did it was I unzipped it and found the USDC file.


And when I bring the USDC file into Unreal, uh, it comes in with the wrong up axis. Oh yeah. It's going to be mangled. Yeah, it's really mangled


**Eliot:** The USD files, I mean, different people use USD in different ways, and we use USD as basically a one-way trip from your original core 3D app, whether it's Blender or Unreal, et cetera, [00:03:00] into Jet Set.


And then the only thing that comes back from Jet Set is the tracking data, because we kind of assume you're going to start from a normal 3D app, you know, Unreal or Blender or something like that, and that's where your main scene is. And then we just push the USD, really as a blocking proxy, into Jet Set.


Uh, because, frankly, Jet Set can't store, and the iPhone can't store, the texture resolution that's commonly in place in a real 3D model. An Unreal scene might have, you know, several gigabytes of texture and geometry in a model, and the phone just can't handle that. I mean, there's just.


There's no way. 


**Juniko:** Right. 


**Eliot:** Right. So what we aim for with the USD models is kind of a, you know, cut-down blocking version of that to get into the iPhone. Um, but for your case, if the USD image, like the Charterhouse, actually looks decent, it's because it's baked, you know, [00:04:00] it's baked geometry.


So I would try the re-rendering. So I put a link in there; it's the in-phone re-rendering workflow. And what that will do is, from the take you selected, it'll regenerate a clean background plate, rendered, tracked, everything. And you can actually just AirDrop that right off of Jet Set into your Mac or whatever.


And if you want to sync it with AutoShot, you can do that as well. But then you'll have a tracked, clean background that can be composited with your live-action foreground right in your editor. Everything should line up cleanly.
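A minimal sketch of that comp step, assuming hypothetical file names; ffmpeg's `chromakey` and `overlay` filters are real, but the key color and tolerances below are placeholders to tune per shot:

```python
import subprocess

# Hypothetical inputs: the re-rendered clean background AirDropped from
# Jet Set, and the camera-original green screen foreground. Per the
# discussion above, the two are assumed to match in length and timing.
background = "clean_background.mov"
foreground = "camera_original.mov"

subprocess.run([
    "ffmpeg", "-i", background, "-i", foreground,
    # Key the green out of the foreground, then overlay it on the
    # tracked background. chromakey args: color, similarity, blend.
    "-filter_complex",
    "[1:v]chromakey=0x00FF00:0.15:0.05[fg];[0:v][fg]overlay[out]",
    "-map", "[out]", "-c:v", "prores_ks", "comp.mov",
], check=True)
```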


**Juniko:** Okay, okay. We can definitely give that a try then, because that was the only thing that Mark wanted.


He wanted to re-render, you know, the CG background, and maybe enhance it or make it look more cinematic or something, you know. But I honestly didn't know how I was going to bring all those parts together. They come in as one big blobby mess, and then if you separate them out according to each OBJ, they are not [00:05:00] assembled, and there's no way to assemble them in the correct orientation to each other.


So it was, it was just not working. 


**Eliot:** Yeah, you're trying to do an extraordinarily difficult task, um, which is why we actually just don't do it; that's the idea of the Jet Set workflow. Frankly, normally the way a project tends to work is you have your 3D model file in your original core application, whether it's Blender or Unreal or Cinema 4D, or Houdini is one of the ones we're dealing with, or Maya, et cetera (not usually Nuke), and that's where your 3D model lives.


That's the source. That's the core. That's where all the stuff is. That's where you're going to render. And then what we push to Jet Set is just a very small subset, kind of a blocking subset of that, just enough to work with, to see where you're at. Uh, because frankly, trying to do a perfect material transfer is actually one of these crazy, impossible tasks in 3D.


There [00:06:00] are teams, large teams, that are focused on doing that. You know, NVIDIA's Omniverse is not a small team, and they work on that problem.


**Juniko:** Right, right. I understand. Yeah. I even tried going through Omniverse, and they said that you could do a conversion there, but nope. Not anymore. I don't know, maybe it used to, but not now.


I couldn't find a conversion going backwards, even in Omniverse. So it was like, okay, never mind.


**Eliot:** Oh, right. All right. Well, that should be great. And I've got a link from BadBeetle, uh, to show a test of how far they got with an hour of shooting. So let me pull this up.


This is really exciting. Um, let me pull up a full screen of this, actually. But before I jump in, uh, Mark, any other questions?


**Mark:** I haven't had a chance to review your link. I finally found the link. So once it re-renders, that's something that we can download off the phone, [00:07:00] just like that.


You just pull 


**Eliot:** it over, and then it's matched to your camera original in length, et cetera. The camera original file is in Jet Set. And so you drop those two on top of each other in the editorial timeline, and you should be able to pull a key on your camera-original footage, and it'll drop in on top of that tracked footage.


Oh, fantastic. Yeah. I think that's going to be a go-to for a lot of groups, um, because AutoShot and the whole AutoShot workflow is really designed for a heavy VFX workflow. You're going to split it down; you're going to be working with 300 frames, 200 frames at a time, EXR extractions.


Like, it's how you have to do it if you're going to do sort of larger-scale VFX. But there's a whole category of things where you really just want to track the background, and the keyers that are inside editors these days are pretty good. Editors can handle it; you can put in your log footage and color grade it and stuff like that.


Right. In Premiere, and in Resolve, and it'll look [00:08:00] great. And then you're not dealing with EXR files, and you can key, you know, three-minute takes, no problem. Whereas three minutes of EXR files is a lot of EXR files.
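Rough arithmetic behind that remark, with assumed per-frame and per-minute sizes (actual numbers vary with resolution and compression):

```python
# Three minutes at 24 fps as an EXR sequence versus one movie file.
frames = 3 * 60 * 24                  # 4320 frames
exr_gb = frames * 25 / 1024           # assuming ~25 MB per 4K EXR frame
prores_gb = 3 * 5.3                   # assuming ~5.3 GB/min, ProRes 422 HQ UHD/24
print(f"EXR sequence: ~{exr_gb:.0f} GB vs ProRes: ~{prores_gb:.0f} GB")
# EXR sequence: ~105 GB vs ProRes: ~16 GB
```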


**Mark:** Okay. Well, I'm looking forward to this. That'll be great. And I'm anxious to see, uh, BadBeetle's work here.


**Eliot:** Yeah, yeah. Let me, let me pull it, pull it up really quick. See 


**Mark:** With mine, I, uh, was combining a green screen and a luma key to make a ghost at the same time. And I didn't have enough, so I had a lot of black. So it's kind of a special background.


**Eliot:** All right. So let's, let me play this real quick to see, see, uh, what they've been up to.


And this was, uh, this is like about an hour of shooting. Well, let's, let's, uh,


**BadBeetle:** Well, actually, Eliot, what we were trying to do is see what we can do in less than an hour. Oh, wow. So from shooting to, uh, the [00:09:00] actual, uh, you know, uh, to get to this point. I mean, obviously it takes a little bit more work, but, uh,


**Eliot:** Wait a sec, so I gotta, I gotta know. So is this, uh, is that a shot with a cine camera?


Is that Jetset? The footage looks great. 


**BadBeetle:** Yeah, that's all, uh, the first link I sent you was the raw footage. Cause last night, like late night, somebody asked us if we could do a test shot. And I was like, yeah, sure, you know, we could do it in a few hours. And then we were like, I wonder if we can do this test shot in less than an hour.


I mean, what took the most time, from shooting to actually, uh, you know, getting to where we're at now, was probably, I would say, like 45 minutes, and it was mostly processing. You know, like processing it through Jet Set and, uh, you know, keying out the green screen.


I mean, we kind of cheated on that. We used AI to, to do the green screen. I mean, it needs a lot more cleanup, 


**Eliot:** you know, to, 


**BadBeetle:** to make it look better. So, 


**Eliot:** [00:10:00] All right. So tell me which parts of this are, I actually don't know which parts. So the guy is real, I got that. But is the rest, is he, uh, on a complete green screen background?


**BadBeetle:** Yeah, it's all green, yeah. Did you see the first, uh, the first link?


**Eliot:** All right, let me, let me go back to the first link. 


**BadBeetle:** And, uh, yeah, the first link will show you exactly what it was. And then the second link should show you, uh, you know, the results.


**Eliot:** All right, so let me go to the first, first link. And, oh, jeez, there we go.


Share, let me share this. Oh, this is so exciting. All right, there it is. Okay, gotta blow this up to full screen. Wow! Whoa!


Oh, this is great. And this is, so, what's this shot on, just so I understand what I'm looking at? Is this in the iPhone, or is this on an external cine camera? Oh, it's a, yeah, it's on an external, uh,


**BadBeetle:** camera. We [00:11:00] just used, uh, I think it was a Canon, just a random Canon that we had.


And, uh, because we didn't want to go through the whole setup of everything, because we were trying to see how fast we could actually do it, just to show them. I mean, obviously there's a little bit of green spill on the, on the actor, you know, but


**Eliot:** wow. Let me share this, uh, let me full screen this 


**BadBeetle:** thing.


**Eliot:** Man, this is great. This is great. It's, I mean, it's tracked is you kind of feel in it. And like the lighting that is, is like, you know, kind of, this is, Oh man, I love this because you guys are doing kind of this high, um, uh, not high key. It's actually, I remember the film noir 


**Mark:** type of thing, right? 


**Eliot:** This is the kind of thing I've wanted to see in green screen work, where you made a decision: I'm going to light him from here, I'm going to really show the contours of him and what he's wearing.


Oh, this is cool. I can't believe [00:12:00] this was an hour.


**BadBeetle:** That's great. Yeah. It was less than an hour. I mean, the only thing that took the most time was, um, the Jet Set part. Right. So, like, you know, uploading it to Jet Set, then, you know, finding the, um, you know, the Cine matching, and then, um, let me see what else.


I think the only real thing that took long, uh, the longest time, was to get it to output to Blender. That took maybe about, like, 10, 20 minutes, just to get it all, you know? Right, yeah. Get all the pieces in place. Yeah. And then, um, I mean, we haven't composited it yet, or, you know, done any of the extra stuff to it.


I mean, we did add, add some, you know, some audio to it, you know, from the chains and, and the water dripping and stuff. And 


**Eliot:** So, okay, so, uh, which keying did you use, an AI key system on this? Was that in AutoShot, or was that, uh,


**BadBeetle:** no, actually I've been having problems with, um, using the, the.


Um, [00:13:00] the AutoShot, um, you know, uh, keying. What we did is we used, uh, Runway ML, and they have like a, you know, AI, uh, keying thing, and basically you just click on what you want to keep. And I mean, it basically did it within like, five minutes. I mean, it did the whole footage. I mean, there's a longer version; I mean, we took like maybe ten takes.


And this is like the first one we just focused on. And I was like, you know what, I mean, I'm going to send it to Eliot. I mean, I think we started at like, uh, one o'clock and were done by two, you know? And then,


**Eliot:** wow. I mean, this is, this is super exciting because it's like, it looks like Batman, right?


Like they have that has that dramatic. Like the, you know, the parts of, of him are just falling into darkness, have the bright illumination of what he's doing. I mean, this is, sorry, I'm just, I'm waxing ecstatic as I, excuse me, you guys have to pardon me because my voice is shot, but, uh, this is, this is super [00:14:00] exciting because it's, you're, you're creating the look of the thing you want to make, you know, the, the real look of the, of the project you're, you're aiming for.


With kind of high-contrast lighting and all these things. And I love, um, man, I just love, I'm going to go back to the original, the camera-original footage. Cause, you know, this is, there we go. All right, share this. This is why I'm so excited about this kind of virtual production: it's accessible, right?


You know, you can look at the green screen sort of thing and go, okay, we can make that setup and get that kind of result out of it. Cause you lit it; you made strong creative decisions in lighting and followed through with them. Ah, it's exciting.


I can't wait to see what you guys are doing next.


**BadBeetle:** Well, see, what we did this for, for this client, is more, uh, they want to redo, um, like [00:15:00] a TV series that they, uh, they filmed, and they were like, well, we spent all this money on sets and locations. And we were like, well, look, dude, we can do this, you know, in a shorter period of time; we can previs everything and let you see everything, you know?


And, uh, they were like, oh yeah, let me see what you guys can do in less than 24 hours. And I had this ready. I mean, I think he contacted us at midnight, and we sent it to him at two o'clock, you know what I mean? Wow! Ha, ha,


**Eliot:** ha, ha! That's a mic drop! Ha, ha, ha, ha! I bet they did not expect that to happen.


**BadBeetle:** Yeah, I'm pretty sure they're probably going to assume that we already had this footage. You know what I mean? Like, there's no way, you know? Right, right. You know, and I don't really explain too much of the process of what we're doing yet, just because I want to be able to get to that point of being able to sit down and say, hey, look, this is exactly how we did it and why we did it, you know, and just kind of go through the whole workflow. [00:16:00] What we learned last night was that this can be done.


So, I mean, you figure, if we did this in an hour, and the shooting probably took like maybe five minutes at the most, you know, so you figure, if you have, let's say, a hundred shots lined up, you know, I mean, you could probably have all hundred shots done in, like, less than seven days.


You know, 


**Eliot:** Alden, when he was doing the, the series that he's, you know, he's been working on, um, I think episode two of the Tidbits and Bites stuff had about 24, 25 shots in it. And they finished that in a morning. Like, it was, you know, boop, cause they pre-shot.


Uh, I don't know if you've seen the video of it. I'll link to it. But the key is what he's doing: he's pre-shooting everything with just, you know, things that are inside the, um, let me pull this up real quick. Uh, these are the user stories, and then, there we go.


All right, I'm gonna put a link on this. There we go. So, all right, let me put this here. There we go. So this was the post, the link to the user page of what we're doing. And as soon as you guys have something more that you want to put up, I totally want to put this in our user stories. Because that thing of, the producer calls you at midnight and you push him a comp at 2 AM, I mean, that is so impressive, too.


I know. 


**BadBeetle:** I'm not even gonna lie. Like, last night I was like, man, I wish I had your phone number; I would have called you. I was like, dude, we did this so fast, you know? I mean, it was just impressive. I'm pretty sure, I've got a meeting with him this morning after this, and, uh, you know, he's probably gonna be like, dude, can you do this, this, and this. And I mean, we're at the point now of understanding how Jet Set works to where we can do literally [00:18:00] anything, as long as we have the 3D model, you know, designed and everything; we could basically, you know, shoot it.


Probably have it ready for them to view within, you know... cause I think I can speed up the workflow if I use different workstations, so I can get to this point in less than 30 minutes, I think. Because then I'd take the footage from the camera and key it out already, you know. I mean, cause that was the whole thing.


I just went step by step by step instead of working in parallel, you know. Like, while I'm waiting for Jet Set to finish processing, I could already have the key ready.


**Eliot:** Right, right. This is, this is great. This is the pipelining aspect of it. That is kind of the thing that makes it: the rate of speed.


You can go fast, especially if you have a bunch of shots you're putting together. Um, I just think, you know, people are going to start doing this, where you almost pre-shoot the project. This is a combination of what, um, Alden was doing, kind of pre-shooting the project, with kind of CG stand-ins, uh, inside Jet Set, so you actually know all the shots that you want.


You have your list of shots racked up and organized. And then in production, you can just, you know, bomb through that shot list at this crazy high rate of speed, cause you know exactly what you need. You already have the project loaded in your head, and you know how it's going to cut and flow.


You don't have to, like, find that in post. You want to find that before you shoot; you want to understand the flow of the shoot. And then it's down to just brute-force shot processing, which you can do really fast. You know, and I'm just super excited that you're kind of seeing how it clicks together.


Um, and how rapidly you can go on that. So this is, this is exciting. It is really exciting. Hey, thank you for sharing that with us. This is really fun. Oh,


**BadBeetle:** no problem, man. Um, but I think the only thing, I came into an issue the day before yesterday.[00:20:00]


Yeah. And that issue was the file name, the project file. For some reason I had to change it, and I kept on getting an error if I was using the same project file, like, for another project,


**Eliot:** Just a sec, uh, Mark, I'm going to mute you for a second. Okay. So, so you were changing the project file, um, inside Jet Set?


**BadBeetle:** Yeah. Like, let's say if I didn't change it... let's say the shoot's name is called, I don't know, let's say Interrogation, right? And I shoot some of it today, and then tomorrow I shoot some more. When I actually tried to do it the second day and put it into the same project file, I was getting an error.


That's strange. 


**Eliot:** Um, tell you what, do you have your iPhone here? Like, we can jump in and just, uh, just test it.


**BadBeetle:** Oh, I don't have it right now at the moment, but, [00:21:00] um, I was stuck on that for maybe like 30 minutes, until I was like, okay, something's not right. I thought all the footage was bad, because we did have one issue with, um, filming on, uh, the day before or something.


We did something. And, uh, I think the cooler on the phone wasn't working right, or it wasn't, you know, cooling the phone down, and we were filming a lot. But I mean, that was for a different project. But, um, that was something I came across. I'll try to replicate it and record it, so that way you see it.


**Eliot:** Yeah. And you can jump on, if you want to bring the phone in on an office hours. What we'll do is, um, well, we have a remote assist system, so we can pop up a QR code and it'll push your UI, uh, onto the call, um, so that we can actually look at it together, and we can walk through the settings on the project file. Uh, cause I just want to make sure that it's behaving in an intuitive [00:22:00] manner, that we didn't do something in the UI that is getting people's wires crossed.


**BadBeetle:** Yeah. Well, I think it could have been on our end, just because, you know how, when you do file management, you have to put it all in different files. And being that I was just trying to, you know, do a test, I was like, whoa, wait a minute. What happened?


**Eliot:** You know, like I want to see it just to make sure.


So what we're always trying to do is re-engineer the system to make it work, and maybe we've got a bug in there; maybe we did something weird and dumb in the UI. Again, this is something where, if I watch people do it, then usually I can go, oh, okay, I'll make my notes and we rework the UI or something like that to make it, you know, clear. Or maybe we've got a bug.


I just don't know. But if I, if I watch you doing it, I'm sure we can figure it out. 


**BadBeetle:** Yeah, it wasn't a big issue. I mean, once I figured it out, I was like, okay. But I just kept on getting the error no matter what I did and what I changed. And then I was like, okay, so what... I mean, I just kind [00:23:00] of backtracked everything and was like, okay, what didn't I do?


And I was like, okay, so I didn't make a new file, you know, or, uh, you know, name a new file for our project file, you know. And then once I did that, all of a sudden it just went smooth. But then I was also thinking, maybe what it was is that, being that you can do Cine, uh, I was able to do, uh, the Cine calibration, and then when I tried to do it again, uh, I think that could have been the issue, you know. But I mean, being that I just made a new project file and it just worked, I just kind of ran with it, because I was like, oh, okay.


**Eliot:** Okay, okay. Yeah, let's, uh, if you can bring that in on the next office hours, let's just look at it together. I just want to understand, you know, because when we're designing the file systems, we end up making a bunch of assumptions about how things work, and, you know, you just get too close to it.


Whereas when you have people coming into it, you know, with fresh eyes, they go, why'd you do that? And then we go, oh, why did we do that? And then we can make it better. Cause we're [00:24:00] actually going to be reworking that a little bit soon. So a couple of things that we're going to be adding very, very soon are object locators.


Uh, and this is going to be really helpful for, um, the previs. Like, if you're going to shoot your project before production, kind of in previs, we're calling it sort of pre-shooting and pre-editing, where you have, you know, a CG stand-in for your characters, you know, in the real 3D scene, et cetera. But then you can shoot and block things with your, your digital actor representations in there.


And before, if you want to move the actors around, you have to go back into Blender and re-edit, re-export and stuff, and that's kind of a pain in the neck. So what we're adding is called object locators. In the same way you have sceneloc_ wherever, and you can bounce around to the different scene locators in Jet Set, we're going to add something called objectloc_, you know, character A, character B.


And a new object panel where you click on that, and you can move your characters around. Right? So it's like, oh, I want you over here, I want you over here. And you just grab them and move them [00:25:00] around. And then when you shoot, the data of where they are is stored in the file. Um, so you can actually recreate that in post if you want to. Or, if you're just using it for previs, then you just move the characters where you want for that shot.


And then you can look at how it's going to cut together in the edit pretty easily. Uh, so that's coming in pretty fast.
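A minimal sketch of what those locators look like on the Blender side, following the naming convention as spoken here (the objectloc_ prefix is an upcoming feature, and the exact spelling of both prefixes is an assumption):

```python
import bpy

# Create a named Empty to act as a locator. Jet Set picks locators out of
# the exported scene by name prefix, per the convention described above.
def add_locator(name, location):
    empty = bpy.data.objects.new(name, None)  # None object data = an Empty
    empty.location = location
    bpy.context.scene.collection.objects.link(empty)
    return empty

add_locator("sceneloc_piano", (0.0, 0.0, 0.0))        # camera jump point
add_locator("objectloc_characterA", (1.5, 0.0, 0.0))  # movable stand-in (upcoming)
```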


**BadBeetle:** Yeah, that would be awesome. One thing I will say that was kind of a shock: as soon as I pushed record on the iPhone, it automatically started doing the, um, uh, the, what do you call it?


The barcode. Yeah. Like, okay. So I guess you guys just implemented that, I'm assuming, because it didn't do that before for me.


**Eliot:** Um, let's see. So the barcodes we've had for most of the time that we've been doing Jet Set Cine. That was kind of our first original way that we did our identification and synchronization.


And one of the pieces that we're just now adding in is... the barcodes work, but [00:26:00] sometimes they're optically fragile. Like, if you're in a situation where you've got lots of reflections from lights, it's really easy to get a glare off the iPad or whatever you're using for the digital barcode.


And it screws it up, right? Because you have this giant glare in the middle of the black-and-white barcode, and the machine vision can't pick it up. So what we're just now adding is timecode-based take matching. That's something I'm going to be testing internally, you know, later today and early next week.


Whereas if you're using a Tentacle Sync, right, which hopefully you are on any Jet Set Cine shoot because it's magic, the timecode that goes into the Jet Set tracking data is now matched to the timecode in the, um, the cine camera device. And we can match takes based on timecode instead of depending on the digital slate.


And that's, I think, going to be much quicker, and sometimes more reliable under kind of production stresses, than depending on a clear view of the optical machine vision mark. Those flashing barcodes are, you know, just machine vision markers that we use to detect take [00:27:00] ID and timing.


Uh, and they break sometimes. Like, sometimes they flash and then they don't flash correctly. So there are a bunch of problems with those things, and we want to both fix the problems over time and also have another way. Yeah, I think timecode is just going to be a better way to match takes, um, on that.
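A minimal sketch of the timecode-matching idea, with hypothetical take records; this is not AutoShot's actual implementation, just the core of pairing each clip to the take whose start timecode is nearest:

```python
# Non-drop-frame SMPTE timecode to an absolute frame count.
def tc_to_frames(tc: str, fps: int = 24) -> int:
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

# Hypothetical start timecodes; on a real shoot both sides would be
# jammed from the same Tentacle Sync source.
jetset_takes = {"take_003": "14:02:10:12", "take_004": "14:05:41:02"}
cine_clips = {"A001C007": "14:02:10:14", "A001C008": "14:05:40:22"}

for clip, clip_tc in cine_clips.items():
    best = min(jetset_takes, key=lambda t: abs(
        tc_to_frames(jetset_takes[t]) - tc_to_frames(clip_tc)))
    print(clip, "->", best)  # A001C007 -> take_003, A001C008 -> take_004
```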


So those are things that are coming up. Um, all right, so let's see. And I see Brett has a question: tracking focus on a manual lens, using a follow focus system. Uh, not yet. We were literally just in a discussion, uh, with LOLED Virtual the other day. Um, they make a nice little external encoder.


Um, and so that's something we're, we're very much looking at, um, to figure out how we're going to approach that. Cause it's clearly a need. You know, the default way with Jet Set Cine is that we use the LiDAR system on the iPhone, which automatically computes [00:28:00] the focus distance based upon recognition of objects in the scene.


So if there's just one person, it's going to stick the focus right about where you'd expect it to put the focus, which is on the person, you know. Now the problem, of course, is when you have two people. And that autofocus distance is what's transmitted into post-production. Um, so that's, is


**Brett:** that from the phone, though, correct? Because I've seen it. Yeah, I'm getting some focus information, but it's coming from the phone, and we're using manual cine lenses. And I'm just curious, we have a follow focus system, but we haven't fully implemented it yet, and I'd like to be able to see if there's some way of, you know, transmitting that data


Because I've seen it. Yeah, I'm getting some, some focus information, but it's coming from the phone and we're using manual cine lenses. And I'm just curious, we have a follow focus system, but we haven't fully implemented it yet, but I'd like to be able to see if there's some way of, you know, transmitting that data.


into the, uh, 3D scene.


**Eliot:** Oh, yeah. We're, uh, we're heading in that direction. It's going to be a real engineering thing to do it right, so it's going to take us a little while to get there. Cause, oh boy, we know this problem pretty well.


Uh, we, I had multiple patents [00:29:00] on lens calibration systems. We built multiple sets of encoders on our previous systems, and we want to do something that maintains the ease of use. With this stuff, it's really easy to end up with something that is technically functional, but is frankly a mess to operate on set under pressure.


It's really easy, especially with lens calibration, to get into a mess. Cause with the follow focus systems, you say, okay, I'm going to just rack the lens from lock to lock. Well, half the time, um, the focus puller will have set the stops to be soft stops, so they can do the same amount of hand motion, but it only goes back and forth between, you know, 0.5 meters and 1 meter when they're doing a subtle focus move. So you have to be very persnickety with how you capture your focus data to correlate it to the lens. So, uh, that was a long-winded answer of: we're getting there. It's just going to take us a little bit of work to do that.
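A minimal sketch of the underlying mapping, assuming a hypothetical calibration table from encoder counts to focus distance; the soft-stop remapping described above is exactly what invalidates a table like this mid-shoot:

```python
import numpy as np

# Hypothetical lock-to-lock calibration pass: encoder counts sampled at
# marked focus distances (meters). 100.0 stands in for the infinity mark.
encoder_counts = np.array([0, 400, 1200, 2600, 4095])
focus_meters = np.array([0.45, 0.6, 1.0, 3.0, 100.0])

def encoder_to_focus(count: float) -> float:
    # Piecewise-linear lookup; a real system would fit the lens's
    # nonlinear focus curve and re-calibrate whenever the stops change.
    return float(np.interp(count, encoder_counts, focus_meters))

print(encoder_to_focus(800))  # 0.8 (meters)
```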


**Brett:** Yeah, I [00:30:00] mean, that's really our biggest issue right now. Uh, I did have one other question, if everybody doesn't mind. The other question was actually about, if you're in Unreal, and I know you have a solution for this, and the image plane intersects with your geometry. Uh, I saw your video about adjusting the X position and things like that.


But there's a video, uh, I think his name is Joshua Kerr, he's maybe your UK representative. 


**Eliot:** I know the one you're talking about. Yeah. I found the magic switch. Yeah, but it's grayed 


**Brett:** out when I use it. I try to do it in Europe. In the project that's, or the sequence that's generated by your script, I can't seem to activate that.


**Eliot:** Okay. We actually, I defaulted that setting in an upcoming build of AutoShot. Hang on, let me close the door real quick, just a second.[00:31:00]


There we go. Alright, yeah. So, um, I think the way I found that fix is there's a way to set the material to be, um, not subject to depth. And so we put in that fix as a default. We haven't released that AutoShot build yet, but we put in that fix along with some other fixes to the, uh, anti-aliasing. So now, if we got it right, you should be able to finally hit play


in the level sequencer and actually see the comp, you know, the green screen comp, track correctly live with the, uh, with the tracking.


**Brett:** I was working on a shot and trying your method of adjusting the X. But because I was [00:32:00] moving the camera, you know, back and forth in Z space essentially, and I had this


character behind a desk, he would frequently move through the desk, and then in front of it when I moved the camera back. Um, and even when I adjusted the X, if I moved the camera too close, to try to kind of see what's going on behind the desk, you can see it, but then it intersects with the floor plane and disappears.


And so you actually have to create sort of a dynamic adjustment, and that ends up looking weird if you're dynamically adjusting that X position. Because I was also getting some reflections from the image plate that were nice, that I wanted to keep, but if I played with that X axis, it would move it too far away from the reflective surface. Oh,


Oh, 


**Eliot:** got it. Yeah. So, so 


**Brett:** I'd love to be able to, uh, just keep it from intersecting with geometry, like what Joshua was doing. I saw that and I was like, oh, that's the answer. And then I went in and I couldn't turn it off.


**Eliot:** So I think we switched that by default to, uh, off in the next, in the next upcoming build.[00:33:00] 


Um, so we have a build coming up that has the timecode matching and a couple of the Unreal fixes rolled into it, and I need to, um, you know, make sure we get all that stuff QC'd, and then do a bit of a video recording of it. Because now that we have that working, you can actually render with the comp in there, and I also want to show how to hook it up so you can directly encode right inside Unreal.


It's just a little clunky. You have to have, uh, an external build of FFmpeg, and also set the Unreal command-line encoder settings to point to that build of FFmpeg, which we actually handle: we have that in AutoShot, and we have a button in the tools that gives you the path directly to FFmpeg.


But I need to do the tutorial that walks through the steps and make sure everything is behaving the way it should. So that's coming up soon. Um, and then it should be kind of default baked into the [00:34:00] export settings of how we do it. Um, and you can turn it off if you want it to be occluded, but usually people don't want it to be occluded.


It causes exactly the problems that you're talking about.
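A minimal sketch of that kind of fix in Unreal's editor Python, with a hypothetical asset path; the property name is an assumption based on the "Disable Depth Test" translucency switch, not AutoShot's actual code:

```python
import unreal

# Hypothetical path to the image-plane material the comp setup uses.
material = unreal.load_asset("/Game/Jetset/M_ImagePlane")

# Translucent materials can opt out of depth testing, so the image plane
# draws over scene geometry instead of intersecting it. Property name is
# assumed; it appears as "Disable Depth Test" under Translucency.
material.set_editor_property("disable_depth_test", True)
unreal.EditorAssetLibrary.save_loaded_asset(material)
```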


**Brett:** Yeah. And I really like, uh, trying to finish the shot in Unreal when you have objects that are passing in front of your actors and stuff. Because if you try to take it into, uh, you know, Fusion or something like that, then you have to separate the layers out and everything.


So it's nice that you can kind of just position that image plate where you want. But the problem I'm running into is if I move my camera back and forth, he comes with me and he goes in front of the objects that I want him behind. 


**Eliot:** Yeah. Yeah. That's... now, theoretically there's a way to do that with Cryptomatte, and I know the Andiax people did that with Blender and Fusion, and I know Unreal can generate Cryptomatte.


I've never done it yet; that's another one of those things that's on the list. Because if you set it up right, it's true: there are like a couple of clicks you set, and it embeds the Cryptomatte information into the EXR file; you pull it into Fusion, and you say, okay, I'm in front of these things and behind these things. And after it's set up, it's great. But making sure that the setup is correct, and that the channels come in correctly...


And after it's set up, it's great. But, but making sure that the setup is correct and that the channels come in correctly. You know that there's, I need to do the tutorial for it, um, because there's a bunch of little, uh, as with everything else in visual effects, there's like 30 little checkboxes that need to get checked correctly.


And then it's just magic. But getting those, like, you know, 25 checkboxes all lined up, so all the little bits fly through correctly, is always the thing. Um, but yeah, yeah, I gotcha.


**Brett:** Or, uh, I'll ask one more question and then I'll let somebody else go, cause I feel like...


The SynthEyes refinement workflow: I watched the tutorial. Uh, I haven't gotten all the way through it yet. Is there a section on using it with, um, Unreal, or does it only really work in Blender?


**Eliot:** Aha. Excellent. Excellent question. So, um, [00:36:00] Unreal kicked my butt on getting that data from SynthEyes over to Unreal.


What you should be able to do is an FBX export out of SynthEyes, and then load that into Unreal as an FBX camera animation, and I could not get that to work for the life of me. Uh, it was doing strange things where it would come in, uh, not on frame. Uh, so, you know, like at 24 frames per second, the FBX would come in at 30, even though I went through and made double sure that it was exported at 24 frames per second, cracked it open in a, in a binary editor, and, like, yes, the FBX is 24 frames a second.


Unreal would import it at a different frame rate, and I kind of lost it at that point. And so we actually, uh, wrote a dedicated piece of code inside AutoShot to take in an external tracking file that SynthEyes can generate, and push that directly into the Unreal keyframes. So that's another thing on my list to test through, because I just went: that's it, we're writing code; we're not, you know, fluffing around with [00:37:00] this stupid export.


FBX makes me crazy because, I mean, you're just dealing with a 30-year-old binary file format that's been patched and prodded and hacked for decades by at least three or four different companies, you know, from Kaydara to, you know, all the different places that have bought it and passed it along.


And it's a mess. I mean, if you've ever programmed against it, it's beyond a mess. So, uh, what we're doing is just pulling it into text land, where SynthEyes can export a standardized... it's just an ASCII text file: position X, position Y, position Z, rotation X, Y, Z, you know, field of view. That's it.


And so we can pull that in and make sure that it comes in exactly, frame to frame, in the image sequence, in the level sequence. So, uh, the code is written. I need to go through and do a test take of that, uh, and kind of, you know, show the workflow on it. Uh, but actually, I mean, if you have a test shot that you want me to do it with, I'm all ears.


Uh, cause otherwise we have to go generate our own stuff.
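A minimal sketch of reading a per-frame ASCII track of the shape described above; the column order and file name are assumptions for illustration:

```python
# One line per frame: posX posY posZ rotX rotY rotZ fov, whitespace-separated.
def read_track(path: str):
    frames = []
    with open(path) as f:
        for line in f:
            px, py, pz, rx, ry, rz, fov = map(float, line.split())
            frames.append({"pos": (px, py, pz), "rot": (rx, ry, rz), "fov": fov})
    return frames

track = read_track("shot_010_refined.txt")  # hypothetical file name
print(len(track), "frames; frame 0 FOV:", track[0]["fov"])
```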


**Brett:** Uh, yeah, actually, I've got something I did the other day. Actually, I'll shoot some more stuff this weekend I can send you, too. Uh, we just picked up a gimbal, so I'm running tests on that all weekend. Uh, so I can shoot some stuff. I've gotten pretty good results


just shooting some tests. We haven't shot a full scene yet, but we're getting close to where we want to try that. But before we bring any talent in, we want to be pretty sure that we can do this, uh, without too many hiccups, because some of the people we're trying to bring in are, you know, big names.


They get fussy. We're just trying to make this all work as perfectly as possible before we bring anybody in. 


**Eliot:** Debug everything, you know, quietly in your garage before you, like, let... So, the key thing is, make sure, after you drop your origin, for the post tracking thing, and we're going to put an alert into our UI for this, [00:39:00] once you drop your origin and stuff, do a quick scan


of your surroundings. Cause that scan, locked to your origin, locked into your surroundings, is the key that lets us do that accelerated post-production process. It's like, and that's what you're using to position the


**Brett:** image plate. Right. Is that right?


**Eliot:** Right. What we do is we actually pull it in. Uh, let me share this, share the video real quick, so you all see what this is. Because this is...


This, to my mind, was the point where we got this workflow working, where we went from, okay, it's kind of fun, to, nope, now we can do pro work. Now we can handle, you know, the hit when somebody has a project coming in with, like, you know, 50 or 100 shots, and they have to get tracked and all this kind of stuff.


So, uh, all right, where am I at? Um, share. There we go. Share. There we go. Okay, so, um, we go through, this is just the tutorial stuff, like installing InSPyReNet. It's optional; all this stuff will work with MODNet and PP-Matting, which are the two AI roto models that are built into AutoShot. [00:40:00] Those work on Mac.


InSPyReNet, we just wanted to show it because the quality is incredible. Um, but it also only runs on Windows, so it's that trade-off. So, this is after you, um... AutoShot, uh, will generate an automatic Sizzle script. Then, once you load that Sizzle script into SynthEyes, um, what it does is it brings in the complete tracked shot from, uh, you know, the Jet Set shot.


And so this is city footage, using a Jet Set Cine lens calibration, uh, and it has the scan data from Jet Set, and it brings in the camera tracking data, and it's that combination of things. The fact that we can, um, by default, position things to where we're close, you know, like we're probably within a centimeter of the actual correct position, um, and the scan data lines up with the camera calibration, which lines up with the [00:41:00] scan... uh, basically, all the pieces line up: the live-action footage, the scan, and the camera tracking.


That's what makes the magic happen. That's what lets us do the other pieces in this process, you know, automatically detecting all the features. Uh, and make sure you watch both videos, uh, because in the first one I did a manual process of detecting features. And the second video, which is fixing, you know, a misaligned Cine solve...


Uh, I used the new, uh, script that Matt Merkovich contributed, which is great. Cause it just buzzes through, uh, um, it's like a small, medium, large, extra-large feature detection, and you get a metric ton of features, uh, which is what you need, because you're going to throw away a bunch of them that are bad.


Um, and that process is how we... so here's where we've detected all of our features, and we're going to go in and just, uh, lasso-select a bunch of these things and drop them onto the mesh to create survey data. And [00:42:00] this is really the key of it: there are all of our points, but those points are 2D points. We're just going to grab them, lasso-select, and drop them onto the mesh.


And those have now turned into survey data points. And the key thing with survey data is that you have 2D points, uh, you know, tracked 2D points, that correlate to 3D positions. This is stone-cold magic. Outside of this, unless you've worked on a big-time production, you'd have a survey team, where a bunch of people come in with an about $15,000 device called a theodolite, or, uh, a total station.


That's right. And the laser fires and detects the X, Y, Z position of multiple points, usually where you have your tracking X's on the green screen. And it takes two people to do it, because this thing weighs like 20 or 30 pounds and you have to move it around, and then you take all that data and pull it into your tracking software. Like, right there, you know, the team [00:43:00] itself is, you know, 2,000 bucks a day to get that team on there.


Like right there, you know, the team [00:43:00] itself is, is, you know, 2000 bucks a day to get that team on there. But that means that you can reference your 3D tracking data exactly to what was on set. So this is, uh, I get exuberant about talking about this, but anyway. This means that when we solve and synthesize, it restrains the solver to match what was, what we actually shot on set, the real time jet set tracking data, um, so that you retain your position, orientation, and field of view, et cetera, information.


Um. And when it solves, it's only going to move the camera like a centimeter, just to get to that sub-pixel, you know, perfection. Uh, but everything else is correct: your coordinate systems are correct, your orientation. And, you know, people who are heavy 3D trackers look at this and go, oh my God, because normally, even if you get it solved, you're going to spend the next two hours trying to adjust it.


A monocular solve, by definition, doesn't have orientation or position information or scale information, so you have to make it up. And unless somebody happens to have something in shot that's of known [00:44:00] scale, you're sitting there eyeballing it in, per shot, and the hours go by. You know, does that look right?


I don't know. It looks like it's 30 percent over. Try it smaller. And that eats your time; it eats your budget. This way, bang! One to one. No ifs, ands, or buts. That's the scan; that's the footage. You know if you're lined up. It's very, very apparent. So, um, anyway. That's part of the mini rant. But that, to my mind, is the key.


And, you know, you only need that level of precision if you have ground contact. If you have that visible ground-contact plane, then your required level of precision jumps through the roof. If you're doing, you know, quote-unquote cowboy shots, where it's, like, you know, hips and up and the camera's just following them, you don't need it.


The normal Jet Set data is going to be fine. Um, but I just wanted to talk on that just a little bit, because it's a big deal.
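A minimal sketch of what a survey constraint checks, using a plain pinhole camera model; this isn't SynthEyes code, just the reprojection idea: a 2D track pinned to a known 3D point leaves the solver very little room to drift:

```python
import numpy as np

def project(point_w, R, t, f_px, cx, cy):
    # World point -> camera frame -> pixel coordinates (pinhole model).
    p = R @ point_w + t
    return np.array([f_px * p[0] / p[2] + cx, f_px * p[1] / p[2] + cy])

R = np.eye(3)                              # assumed camera rotation
t = np.array([0.0, 0.0, 2.0])              # assumed camera translation (m)
survey_point = np.array([0.1, -0.2, 3.0])  # 3D point dropped onto the mesh
track_2d = np.array([996.4, 467.5])        # observed 2D track (pixels)

residual = project(survey_point, R, t, f_px=1800.0, cx=960.0, cy=540.0) - track_2d
print("reprojection error (px):", np.linalg.norm(residual))  # sub-pixel here
```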


**Brett:** We've been very impressed. The biggest thing we're missing right now is that focus element, um, just [00:45:00] because we're using manual cinema lenses, and as we move the camera around, we really need to be able to adjust focus, and we'd love to have the environment reacting to it.


That's, that's really what we're looking for now. Oh 


**Eliot:** Yeah. Yeah. I a hundred percent hear you.


**Brett:** Yeah. 


**Eliot:** And we are, we are heading, heading in that direction. 


**Brett:** Oh, cool. Thank you. I'll shut up now, so someone else can talk.


**Eliot:** All right. Uh, let's see. Nickhil. Oh, uh, Mark. And I know your name, it says Walter, but you're muted. Let me, uh, unmute you. Hang on.


Let me, uh, meet you. Hang on. Uh, All right, Mark, you're muted right now, um, 


**Mark:** Asking you to unmute. Okay, how's that? There you go. Okay, great. Well, uh, Juniko made a good point. So, uh, we have to get the Jet Set Pro, uh, but she's concerned: the stuff I already shot on the free Jet Set, is Jet Set Pro going to see that?[00:46:00]


Does that data travel over when you, uh, change to, um, another version?


**Eliot:** Let's see. Oh, I mean, um, the same takes will be there if you store them in your local iCloud. Um, or, actually, I'm sorry, you're going to be on the same phone, so the same take tracking data that you shot is already there.


Um, that will always work. Um, if you're moving to Jet Set Cine with an external cine camera calibration, then, you know, your Jet Set Cine takes are, um, stored along with the rest of your takes. And there's just some internal JSON data that tells us that it's a Jet Set Cine take, but you can work back and forth between Jet Set Cine and Jet Set takes; AutoShot will automatically process them either way.


Okay. 


**Mark:** Juniko, did you understand?


**Juniko:** Yeah. Yeah. The only thing is, I just want to know about upgrading, cause you probably get this a [00:47:00] lot, because everybody's going to jump into the free version, right, and then find out, oh gee, I need stuff that I can only get in Pro, and I didn't buy it.


So if I upgrade, like, instantly from, you know, the free to the Pro, will all my data travel with it? And so, yeah.


**Eliot:** By default, um, and this is actually an important little piece, so I'll tell you where we currently are. By default, Jet Set currently stores, um, all the tracking data, all the calibration data, et cetera, in a place... it's stored on the local, um, storage on the phone, right?


It's, it's stored on the local, um, storage on the phone, right? Uh, but it's in a place that Apple designates as iCloud storage. So it's, and you can see the path in your, uh, Uh, project, uh, the project window where you click on your project name or making a new project, and it'll show you the path. And the path is, you know, iCloud slash jet set slash, you know, project name, et cetera.


Right. That's by default. And we store it there by default because, uh, if we store it in the app storage, which is the storage linked to the application, then if you, uh, you know, delete the application, boom, the takes go. Um, that said, there's a bunch of productions, if you're in IP-sensitive stuff, that can't ever have things on iCloud, um, because that's just very IP-sensitive.


Um, so because that's just very IP sensitive. So, um, we are changing that so that you can pick. Um, when you are doing your project files and you create a new project, you can say, am I storing this on my iCloud storage? Um, and either way we, we use, um, AutoShot to directly link. We don't use iCloud to bounce data back and forth between Jet Set and AutoShot.


We always just use, use a direct link for that basically, uh, because it's far faster. Um, but just the details of exactly where, which bucket of storage on the phone it's stored in is we're going to add a choice of either storing it in app storage Um, uh, which is, is not transferred to iCloud or the quote unquote iCloud storage.


Again, still on the phone. It's just in a bucket of [00:49:00] SSD space that Apple assigns to iCloud, and it does get mirrored to iCloud. Um, and eventually we'll be able to store to an external SSD, um, if you pick that. So that's going to be upcoming in a bunch of user interface changes that we're doing to Jet Set right now.


Oh, okay. What was 


**Mark:** the, what was the SeeMo recording? Because you said that if you don't want something in the iCloud, you're recording something on that SeeMo on a chip, aren't you? As I recall.


**Eliot:** That will be one of the options. The SeeMo 4K, you can plug an SD card into it, um, and that will show up as a potential external SSD recorder that you'd be able to record Jet Set takes on.


And we've done some experiments on it, and it looks promising. It was not as fast; like, it takes substantially more time to get the data out to that, versus either the local iCloud or local, you know, app-space storage. Those are nearly instantaneous. So we're still experimenting, and we want to be careful with that, [00:50:00] to make sure that it behaves exactly how people want it to behave.


But it's basically those three choices: app storage, you know, iCloud storage, or external. Anything external is under the same bucket, whether it's an SSD or an SD card; it shows up as a mounted external drive, and you can write to it.


**Mark:** I see. I recently purchased the, um, uh, iPhone-to-Ethernet, uh, adapter, uh, because I think, oh, there, I did something right, Juniko. See,


**Eliot:** Oh, yeah. Yeah. Those are a big, big win. Uh, if you have a Lightning, um, a Lightning cable, they're okay. With Lightning, you're restricted to 100BASE-T speed, um, which is faster than Wi-Fi, but, you know, not hugely faster. If you have an iPhone 15 Pro or later, like a 16 Pro [00:51:00] or Pro Max, that's a USB-C


connector, and then you can get a gigabit Ethernet adapter on that, and that's a fast way to push files off of it. Um, and, you know, Nickhil, we may run into this with you, cause I know what you're running into is kind of intermittent network transmissions.


And this may be a way forward on that, um, to help debug it. Um, so yeah, anyway. Oh,


**Juniko:** well, um, Eliot, are you going to have tutorials by any chance on that Ethernet connector, um, or instructions or anything on that?


**Eliot:** We probably will. Um, the nice thing is that, generally, you know, if you plug this into your Ethernet, the Jet Set device is going to show up on your network just the same way as it does with Wi-Fi.


Um, the problems we run into are that, uh, different networking systems will do weird things every once in a while. Like, for example, the Wi-Fi and the Ethernet can sometimes be on different subnets, in which case the devices can't see each other. Right. [00:52:00] Which is very, very frustrating. So there's this large category of, um, network connectivity issues that we end up getting hit by, uh, that isn't really under our control, but we have to come up with solid ways for people to handle it.


You know, in production, one of them is getting a travel router. Um, and so I'll put a link on this, because, yeah, I remember saying that.


**Juniko:** Oh, yeah, that would be great, because, yeah, I tried it and it didn't recognize it. So that's why I thought, okay, I think I need more information.


**Eliot:** Yeah, I'll send this link. This is the, um, travel router. And as soon as you're going anywhere other than your own place, uh, this travel router is turning into a very, very useful little piece of gear. They're inexpensive. And a normal router wants to talk directly to the modem, whether it's, you know, the fiber modem or whatever.


And [00:53:00] this is different. This is designed to go in a place that already has a Wi-Fi network; it's sometimes called a hotel router. You plug your own devices into just this router, and it handles the DHCP, which is, um, making IP addresses for all the devices on it, but now it's under your control.


And then all your devices are talking to the same little network, the same mini router, which then hooks into the existing Wi-Fi signal for its internet access. That way you're in control of how it does its IP address generation, and you can set up your system at home.


And if you're in your office or whatever, and you take it out on site or on location or on a stage, you just go, okay, I'm switching from this Wi-Fi to this Wi-Fi, but your systems, you know, Jet Set, AutoShot, et cetera, are all still talking to the same router. They don't even know anything changed.


They're like, yep, I'm happy, I'm good. Okay? The world can be going into chaos and it doesn't affect your own [00:54:00] IP addresses.


**Mark:** Because we ran into a terrible problem doing it at school. Either there were firewalls or some damn thing.


**Eliot:** Excuse me. I'm going to grab some water.


I'll be right back, guys. Pardon me. My throat is a little dicey these days.


**Mark:** But that sounds helpful, Juniko. There's always some other gizmo that'll solve the problem.


**Juniko:** Yeah, yeah. Nikhil, did you have an issue with that too?


**Nickhil:** Yes, actually. Um, that's exactly why I'm here today. I'm trying to set up a specific router, because we're going to be shooting in the desert with no Wi-Fi, no internet.


So I want to do live compositing. I'm going to bring a laptop with me, um, and hook it up to Jet Set, and right now I'm running into issues with the Live Link not [00:55:00] being able to pick up the connection from Jet Set in AutoShot. It shows as a connection, but there's something with Lonet I need to work out with Eliot.


**Juniko:** Oh, okay. Yeah, I've experienced that too, where AutoShot did not recognize the iPhone in a different location. But yeah, okay, that'll be interesting. Thanks.


**Eliot:** All right, so yes, Mark, I would say grab one of those. Basically you're trying to remove variables. Networking systems can have enormous amounts of complexity if you're dealing with a big institution, and you want to cut that complexity way down to where you're dealing with something very basic and standard, so that A can talk to B and B can talk to A, and everything else is not something you're worried about.


Okay? Because I can guarantee I can sit here and I won't be able to debug your institutional network; [00:56:00] the chance equals zero. But buy a little piece of hardware, stick it into the loop so that you're only debugging that, and now we can get there.


**Mark:** So are you still working on turning lead into gold? Have you succeeded yet?


**Eliot:** Which part of that would be... yeah, exactly: remote debugging of large institutional networks. That's where I kind of go, you know, sometimes you just throw a little piece of hardware at the problem to solve an otherwise utterly inscrutable thing, right?


And the Tentacle is a good example of that. We found that little piece of hardware, it costs 200 bucks, and it locks timecode between a Jet Set and a cine camera. That solves a huge number of things, and now we just kind of standardized on it. Just, you know, get the Tentacle.


It works. And then your timecode problems go into the world of the manageable.


**Mark:** [00:57:00] Fabulous. Well, Juni, I think we're kind of set. Did you have any other questions?


**Juniko:** Well, I want to just hang out a little bit longer and listen to Nikhil's situation. Oh, good idea.


**Eliot:** Let's jump in, because I think we're going to be running right into these network things that I was talking about.


Yes. We're going to be in some network joy here. 


**Nickhil:** I love it.


So yeah, I was kind of explaining it earlier, but I'll run it back again. I'm trying to set up a production where I'm using Jet Set to track our entire shoot, two days in the desert. We're going to have no internet connection. We're going to have, like, a tent, a generator connected to my laptop, everything hooked up.


But I wanted to do live compositing so we can get all the lighting right on set. Um, right now I'm having issues with the Lonet protocol finding the status of my phone [00:58:00] in AutoShot. I'm able to open the web page, and it shows timecode, and the video feed all works fine. Um, I go into settings, go to the external tracking protocol, Lonet 2, to set the tracking FPS at 24.


And I set it to This PC, take the external tracking destination and put it into the Lonet 2 Live Link, and click...


**Eliot:** Did you click Save after setting that? No? Okay. Let's set it to This PC...


**Nickhil:** ...and then Save.


**Eliot:** Yeah. Set it to This PC and then click Save, um, because that pushes the state, and it saves the state back.


**Brett:** That's gotten me a couple of times. It may just be that.


**Eliot:** Yeah. Yeah. That's, uh...


**Nickhil:** Yeah, so now it's showing up. Thankfully, there it is. That's what it was. Now I'm going to go into AutoShot.


**Eliot:** You need to make that Save button, like, blink.


**Nickhil:** Yeah. [00:59:00] Yeah. Like, I literally would never have realized until you just said that. Well, that fixed most of my problems now.


Yay. All right. 


**Brett:** Saving at the right time is a key element in all of this. I've got one: if you're doing the Unreal preview script, which I'm sure you're doing, and you haven't saved your map, and you open the sequence and then you try to Save All, it won't save the sequence.


**Eliot:** I see. Can you say that again?


**Brett:** Okay, so here's the bug. It's not really a bug; it's Unreal, it's not you guys. If you use the live render preview link out of AutoShot to go into Unreal and do a live camera: if you haven't saved your map, your level, and you open the sequence, now both of them are unsaved.


If you try to hit Save All, it will not save the sequence. You have to save the map first. So it's just a simple thing. All you do is...


**Eliot:** I've seen that, and I [01:00:00] couldn't figure it out. I was like, what? So that's what it is.


**Brett:** That's the sequence. You have to save the map before you open the sequence.


And then you can save the sequence, you can adjust it, you can do whatever. But if your map is unsaved and you open the sequence and you hit Save All, the save will fail on your sequence.


**Eliot:** What we're going to do in that case is pop up a flag when you execute the, um, live render script that says, you know, have you saved your level yet?


Please save before doing that. Because that got me, and I scratched my head. And then it would work and...


**Brett:** It would still work, but you couldn't save it. And once that happens, I found that you really can't save it. You have to go through the process again.


**Nickhil:** Right. 


**Brett:** Um, okay: delete your sequence, send it back again. This is solid gold. But that's it. It happened to me a bunch, and I was like, why does this keep happening? And it wouldn't happen every time, and I [01:01:00] wasn't figuring out what was happening. And then I figured out: oh, it's the map. You have to save that level first, and then you can open the sequence and you'll be okay.


And then you can do whatever you want and you're fine. But as long as that map has not been saved, everything will fail when you try to save it.


**Eliot:** Thank you, thank you for that, because now we can embed that into the system, where it just either does it or checks or something like that.


And that way, no one else gets bit by it, and it's just part of the workflow.
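
A minimal sketch of the guard Eliot describes, using Unreal's stock editor Python API; the dialog text is illustrative, and exactly where AutoShot's preview script would hook this in is an assumption:

```python
import unreal

# Before opening/creating the Level Sequence, warn if the map has unsaved
# changes, since Save All will then silently fail to save the sequence.
if unreal.EditorLoadingAndSavingUtils.get_dirty_map_packages():
    unreal.EditorDialog.show_message(
        "Unsaved Level",
        "Save your level before opening the sequence, or Save All will "
        "fail to save the sequence.",
        unreal.AppMsgType.OK,
    )
    # Alternatively, just save everything up front instead of warning:
    # unreal.EditorLoadingAndSavingUtils.save_dirty_packages(
    #     save_map_packages=True, save_content_packages=True)
```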


**Brett:** Yeah. And I did, I think I told you this last time I was on: I got the genlock working with the Blackmagic. Okay, great. It's the same process as you used on the AJA. It really is exactly the same; you just choose the Blackmagic components instead.


The Blueprints and things like that.


**Eliot:** We're still gonna record the tutorial, just to have a one-to-one mapping of every little thing, 'cause there's probably something slightly different in the Blackmagic. Um, you know how...


**Brett:** Well, there's a weird gotcha. Again, it's kind of Blackmagic's thing and not yours.


When you're configuring the [01:02:00] card: for instance, if you're not using some sort of dual-link setup with your SDI, normally you would set it up so that one of the SDI ports is the input and the other is the output, right? But it's a little weird, because of the way you set it: you choose, I don't remember what it's called, you choose output one, but that's actually SDI 1 and SDI 2.


So if you're trying, for instance, to send your output from Unreal back to a monitor, theoretically you'd go, okay, my input is SDI 1, my output is SDI 2 on the physical card. But you actually choose configuration one for both of those. It's a little confusing, but that's what it is.


It's all based on the way you configure the card. So when you're using the Blackmagic software to configure the card, configuration one is one and two, if you set one as your input and two as your output. If you're in a dual-link situation, you [01:03:00] might do one and two as your input and three and four as your output.


And then it's different. So it's a little weird inside of Unreal when you're trying to choose. It's less about the input, because the input's pretty intuitive and makes sense. But if you want to send it back out to a monitor so a client can look at it, for instance, if you're trying to do a live preview and go, here's what it's going to look like...


That's the gotcha: if I'm trying to send it back out, you don't choose two, you choose one again.
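
To restate the mapping Brett is describing (this is just the logic as he explains it, assuming a four-port DeckLink-style card; it is not official Blackmagic documentation):

```python
# Logical "connectors" in the Blackmagic configuration software vs. physical
# SDI ports. Unreal's media source/output pickers list the logical connectors.
SINGLE_LINK = {
    "Connector 1": {"input": "SDI 1", "output": "SDI 2"},
}
DUAL_LINK = {
    "Connector 1": {"input": ["SDI 1", "SDI 2"], "output": ["SDI 3", "SDI 4"]},
}
# So in single-link mode, sending Unreal's comp back out to a client monitor
# means picking connector "1" again for the output, not "2".
```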


**Eliot:** Well, for your live output, were you using the... they've gone through a couple of generations of live output in Unreal, and the most recent one was, uh, you know, I'd have to look at how they were setting that up.


It was some kind of unusual thing, and we didn't quite get it running. Um, what is it called? Some name for it. Let me look at this for a second. [01:04:00] What did they end up doing? I don't remember what it was called, but it was separate from the normal Composure stuff, and you created a, um... what was it? Um, streaming from Composure out.


**Brett:** What I'm doing is: you have to set up a Media Output, and then you have to make a Blackmagic Media Output, and then you basically select that as your media. And then you probably want to do some sort of OCIO color management, because it comes out linear, and it's very dark on your monitor. So if you set up an OCIO, it'll convert to, you know, sRGB with, like, a 2.4 gamma or something like that. It works, but you have to set up an OCIO, you have to set up a Blackmagic Media Output, and then on your Composure output you select [01:05:00] both the OCIO and the media output. But you have to actually add it.


It's not there by default. 
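
The dark-image symptom Brett mentions is just a missing transfer curve. Here is a simplified sketch of the gamma-2.4 encode he describes; a real OCIO config is a proper color transform, not a bare power function:

```python
def linear_to_display(linear_value: float, gamma: float = 2.4) -> float:
    """Encode a scene-linear value for a display expecting gamma 2.4."""
    clamped = max(0.0, min(1.0, linear_value))
    return clamped ** (1.0 / gamma)

# 18% grey looks dark if sent raw, but lands near mid-grey once encoded:
print(linear_to_display(0.18))  # ~0.49
```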


**Eliot:** So it looks like this: you set up a Media Capture, um, and then put the Media Capture...


**Brett:** The Media Capture, that's a different workflow. I actually know some people who are trying to get that working, but mine's actually at the Composure level. So you select the comp, and then in the Details you would actually go and add a media output. But it does suffer some quality issues.


Like, the anti-aliasing isn't really there on your monitor. So I have a friend who's actually trying to get that Media Capture method to work, because that is another method of doing it, and you seem to get higher quality. But I haven't figured out how to make that work with Composure.


So, using your Composure, um, comp that is created by the script... oh, there is one other issue. The CG layer comes in offline on that. Now, it was working for a while, but now it comes in offline. It's a quick fix, like a drop-in, but the CG layer in [01:06:00] your Composure comp says, like, media not available.


I'm not in front of my system right now, but it's an easy fix. Uh, so what was the fix? It's the camera settings; it's actually in the Details panel as well. You change it from standalone to inherited. You change it to inherited. Here, if you wait a second, I can open up my system; I'm not at the same computer. I can tell you that there's an issue. Yeah. And it was working initially, but when there was an upgrade to AutoShot, all of a sudden, when I'd use the Unreal preview script, the CG layer in my Composure comp would come in offline.


But if I change the camera settings... Interesting. Okay. ...and it was working before, but then it stopped, and it hasn't been fixed in the last two versions. But I just keep changing it, and it's fine; it's a quick fix, [01:07:00] but, you know, I just want to let you guys know that it was working and now it's...


**Eliot:** Even bring it up or, like, send me a screen cap of the setting. I'm opening it; I'll patch the, um... the layer?


**Brett:** Yeah, it's the CG layer. The camera layer is working fine. And, um, I mean, you have to go through the configuration if you're using a card instead of, like, the Cam Link, but that all works. But the CG layer comes in with this weird offline state. Hold on, I'm going to open my project, and I'll say exactly what the setting I used to fix it is.


Um, because I imagine in your script, if you guys can trigger that to set it to that, you'll probably be okay. I was surprised, though, because it was working, and then all of a sudden, at one point when I upgraded AutoShot...


**Eliot:** Yeah, well. Thank you for finding that. And if you see something like that, please just, you know, bounce me something on the forums, whatever it is: a screencap [01:08:00] of, like, this checkbox should be this, it should be here.


And then we can fix it. It's just, some of these things are harder to catch when it's far out in, you know, Unreal country. It's less visible; it doesn't show up.


**Brett:** We are trying to get into Blender a little more, especially for creating environments and then porting. The reason we're using Unreal so much is that the producer on this project I'm working on really wants clients to be able to see a live preview.


We won't record it that way; we're not going to do Take Recorder, none of that. But he wants to go, yeah, this is what your shot's going to look like, and then we can just unplug and shoot.


**Eliot:** me toss a workflow idea by, uh, actually you can go, go find that. But, um, when 


**Brett:** you get back, I 


**Eliot:** have a workflow wildness idea that I want to pass by.


And hey, Bobby, good to see you. I think you're in there somewhere. All right, let's see. Fantastic. Bobby is going to be launching us into our Houdini round-trip workflows. Oh, nice. [01:09:00] Yeah, we're going to be going full bore on some of this stuff, so this'll be wild. I think Nikhil will be very interested to see what you're working on there as well.


**Brett:** All right, so I just did it. If you click on the CG layer, it says missing camera. So basically, you use the script, it creates the Composure comp, and you've got the comp and then the two layers: the Jet Set Cine Camera and the Jet Set Cine CG. The CG actually says "missing camera" when you click on it.


What I've found to fix that: in the Details panel, you go under Composure, Input, and you change your Camera Source from Override to Inherited, and that fixes it. Here, I'm wondering if I can show this to you... oh, I'm all blurred. Let me see if I can turn my blur off.
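
If you wanted to script Brett's fix rather than clicking it per comp, something like the following could work in Unreal's editor Python. The property name and enum are read off the Details panel (Composure > Input > Camera Source) and are assumptions, not verified API identifiers:

```python
import unreal

# Flip every Composure element's Camera Source from Override to Inherited.
for actor in unreal.EditorLevelLibrary.get_all_level_actors():
    if isinstance(actor, unreal.CompositingElement):
        # "camera_source" and SceneCameraLinkType are assumed names taken
        # from the Details panel labels, not confirmed Python identifiers.
        actor.set_editor_property("camera_source",
                                  unreal.SceneCameraLinkType.INHERITED)
        unreal.log(f"{actor.get_actor_label()}: camera source set to Inherited")
```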


Uh, I'm [01:10:00] not sure I can here. I'll take a quick screenshot and show you where it is. I've got a picture.


That'd be great. Let's see if that worked.


Yeah, I don't know if you'll be able to see this. I could send this over and try to put a still up, and see if I can get out of the blur. Let me turn the blur off. It's pretty hard to do that. Um, there it is: Background,


None. Okay, there we go. You see my disgusting, messy production office here.


**Eliot:** Oh, that's all good.


**Brett:** That's why I put the blur on. I don't know if you can see this. Let me see if I can send this over to my...


**Eliot:** Yeah, just send... or you can just send a screencap to, like, support at Lightcraft.


**Brett:** Yeah, I'm gonna pull it up and see if I can share it with [01:11:00] everybody.


Okay, that should be over on my computer now. Close my door. Didn't seem to go. Let's see if I can send this via email. Or AirDrop, I guess I could have done that.


Okay, now I've emailed it to myself. Let's open that up and show it to everybody.


And... yeah, it's taking a second to come in. But yeah, it is under Details; it's under Composure, Input, Camera Source. That's where it is. And then, if I can get this up... I don't know if we can see this; I'm still trying to get the [01:12:00] email over.


So I don't know if you can see this right; that's all backwards, but it's Composure, Input, and then Camera Source. And you change that to Inherited, and that's what fixes it. I don't know if that works. Can you hear me? I can't hear you guys now. We must have changed... yeah, I mean, Eliot is off desk.


Yeah, there we go. That was my fault. Okay. So yeah, for some reason I can't get the email over here; my system's not working. But I'll send it to you. I'll email it to you, or I'll post it on the... where should I put it?


**Eliot:** Uh, you can just bounce it on the forums or something like that. Put it on the forums.


**Brett:** Yeah, I'll put it up there, and I'll explain what the issue was and show you how I fixed it. Um, but that's that bug. [01:13:00] And then there's the, and it's not really a bug, the image plane intersecting with the geometry. Is there a way to get around that? Because even using that X adjustment, like I said, as I get to a certain point on my camera shot, as I get too close, everything starts intersecting again, even if I take it as far out as it goes. I have to adjust it dynamically.


**Eliot:** In the next build of AutoShot, I think we defaulted that to the same way Josh had it set up, where it's just not affected by depth. There's a switch that you put on the image plane that says, is it affected by depth sorting or not, and we just switched it to not, because it's rarely what you want; it causes all those problems.


**Brett:** Yeah, and I'd be curious to see if that helps. My question becomes: if I want my actor behind an object and I turn that on, are they now going to appear in front of that object all the time?


**Eliot:** Yeah, they'll probably appear in front of it. [01:14:00] 


**Brett:** Yeah, so I'll probably end up having to, because we have a specific scene that we want to do. It's on, like, a talk-show-host stage, like the old Carson style.


So he's sitting behind a desk, you know, um, and we got it working using the depth occlusion and everything, and it puts the image plane right there. But as soon as you move the camera back, he comes in front of the desk, because the image plane is tied to the camera. So as long as you don't move the camera on that Z plane, you're probably going to be okay.


**Eliot:** It should, um... in AutoShot, by default, it should use the LiDAR.


**Brett:** And it does. So actually, on the phone, the part of the actor remains behind the desk the whole time. But when I send the shot to Unreal using your script, it initially positions him behind the desk. But as I move the camera back, because the image plane is tied to the camera position, he actually moves in front of that [01:15:00] desk.


**Eliot:** Interesting. So there's a setting, and I'm curious... maybe we're just hitting the limits of what the LiDAR can do. Um, but in AutoShot, this switch, let me grab my little annotator, uh, this switch right here: "Add camera image plane."


If you set depth manually, then yes, it'll lock the depth to the camera. But this should be unchecked by default, in which case, as you move, if the camera's, like, one meter away from the person and the person stays still and the camera moves back, that image plane should more or less stay fairly put. Uh, the problem, of course, is if there are other objects that jump into the scene; then it's going to average out what it's seeing in the depth, and you may run into problems at that point.
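
A toy sketch of the averaging failure Eliot describes; the resolution and distances are illustrative, not Jet Set's actual algorithm:

```python
import numpy as np

def image_plane_distance(depth_map: np.ndarray) -> float:
    """Place the image plane at the average LiDAR depth in view."""
    return float(depth_map[np.isfinite(depth_map)].mean())

# Subject 2 m away, filling the frame: the plane sits at 2 m and tracks it.
frame = np.full((192, 256), 2.0)
print(image_plane_distance(frame))  # 2.0

# Camera pulls back and the floor (nearer pixels) enters the bottom of frame:
frame[120:, :] = 1.0
print(image_plane_distance(frame))  # ~1.6: the plane jumps toward the floor
```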


**Brett:** Yeah, I mean, what I see is, if I go outside of the camera cut, not really looking at the sequence but actually just kind of looking at the environment and seeing what's happening with the [01:16:00] camera and the image plane:


They're tied together. So if here's your image plane and here's your camera, if the camera moves this way, the image plane goes with it. So whatever position you put your camera in, the image plane is always right in front of it. That's what makes it...


**Juniko:** Correct.


**Brett:** And that's the problem: it has to do that in order for the whole system to work, because now you're seeing the environment change perspective, you're getting the parallax and all that, but it's not happening with your image plane, because you wouldn't want that.


You'd want it to stay locked in its position in front of the camera. The problem is, as soon as I move that camera backwards: if I've got a desk here, my actor's here, and my camera's here, right? The camera starts moving backwards, and the image plane keeps going until it hits that desk, and then it just goes right through it and comes over here.


So now the desk is back here and my actor's out here. And I don't know if there's any way around that, outside of fixing it with compositing, which certainly is possible. But I'm trying to [01:17:00] see if there's a way to keep that actor behind that desk the whole time without having to create a bunch of layers or use depth maps and figure out what's in front of him and what's behind him, you know.


**Eliot:** Can you zip up that take and send it? Because I'd actually love to see that one. Intuitively, if it's detecting the actor distance... and I did some tests with this where, you know, I moved the camera back and forth and the actor kind of stays more or less put, right? Because it's basically setting that


image plane based upon the LiDAR distance from the camera. So, you know, if the camera is further away, it's just a longer LiDAR distance, and the image plane should more or less stay put. It starts to break when you start to get a lot of ground in there, and then all of a sudden the LiDAR detector is seeing a bunch of ground, which is closer. But, you know, this is the problem with automated solutions.


But, uh, I'd actually be very curious to look at the take to see.


**Brett:** Yeah, I'll do that. I've got it here for sure. Um, because yeah, it's been a bit of a frustrating thing. I did the X-axis, the X-position thing, and I got it [01:18:00] to where he's behind the desk the whole time. But as soon as I get in tight, where you can kind of see him sitting behind the desk, his legs intersect with the floor.


And the legs disappear. Yeah. So there just doesn't seem to be a simple, universal solution that will fix the problem.


**Eliot:** Oh, I hear you on this one. Yeah. We can use the LiDAR in the real-time aspect of it, and the LiDAR does a roughly per-pixel map, but it's just not high enough resolution for post-production.


You know, you'll see the LiDAR is only like 170 or 180 pixels across in resolution. So, yeah, it's okay for preview; it just is nowhere near the precision of matting that you need for post. And so, um, Cryptomatte, man, it's one way of doing it. It's just a heavy approach, and I'd love to find


a better approach. Um, I have not yet found that better approach.


**Brett:** Yeah, that's a big... I mean, [01:19:00] that can be solved with compositing techniques, obviously. I'm just trying to figure out the easiest way to do things as quickly as possible, because with this thing that we're trying to develop, they want quick turnarounds. They want very quick turnarounds.


And that's kind of what I've been selling them on: this idea that, oh yeah, if we do it this way, the quick turnaround is completely possible. You'll still have to spend a day or so posting a five-to-ten-minute episode, but theoretically we could get it done: shoot one day, be done by the end of the next day.


That's kind of where our goal is. And, like, the stuff that... I forgot his name, I'm sorry... what he was doing is very impressive. What he was able to do in an hour, the stuff he showed earlier, was amazing, and very encouraging in terms of what we're trying to do.


**Eliot:** And honestly, for like a five-minute segment, if you're going to post it a day later: if we set up the Cryptomatte so that it's just automatically generated, then especially [01:20:00] if you're comping in Resolve, or something like Fusion that's GPU-accelerated, then it's fast.


Like, you can pick the desk and you pick a couple of these other things, and the Cryptomatte stuff works. So it's more or less the setup pain-in-the-butt aspect of it. But that's actually something that we want to do. I know it's the right way to do it, and I've seen it done.


I just haven't done the tutorial, like: okay, in Unreal, click here, click here, click here. And maybe I just need to be doing that. That could just be the next thing.


**Brett:** That'd be great, because there's another, uh, somebody that's been using your system, the Creative Twins. I've been looking at some of the stuff they've been doing, and they've got a really interesting workflow where they're actually finishing the shot in Unreal and taking it back.


So they're not doing the full composite in Fusion. They're pulling the key in Fusion and then basically sending EXRs with an alpha matte back into Unreal, and they're getting some really cool results, because then they can put light behind them and do all kinds of really interesting things.


[01:21:00] Which, I was like, oh, that's very cool. Because, you know, I'm just an old-school 2D compositor, so my thought was, oh, we'll just take the render out and we'll finish it in Fusion. But then I saw some of the stuff they were doing, and I was like, wow, it would actually be really great to be able to do things like that.


So maybe finishing in Unreal or Blender: actually creating the shot and taking it back into Resolve, maybe with an alpha matte or something, to be able to separate your foreground and background. But the shot itself is not...


**Eliot:** The, um, Andy Axe... this is a great one to see, because if you were in Blender, I'd say you have a solved problem. Andy Axe, the Moonland project they did: he went through this and actually covered it in substantial detail.


Let me go find it, because this is worth seeing. And I think in Unreal it's actually fairly straightforward to set it up. Um, let me find... here's our... this is worth seeing, because this is the, um... there [01:22:00] we go. Let me pull up the Moonland one. All right, this is on our user stories.


So this is Moonland. All right, let me put this up. I'm going to share a screen so you guys can see it. Um, there we go. Share. All right, let me full-screen this. So, this is a pilot. This is a project we did with a YouTuber called Andy Axe over in Norway, and their production team, and they were shooting a pilot for a kids' TV show. And, man, this was the first group out of the gate to use the, uh, SynthEyes tracking pipeline.


I mean, man, it was still smoking when, uh, Morton picked it up and, you know, just ran with it. So they shot the whole project. Uh, over here there's a piece called "Post Production" of the Jet Set take, where... the real-time tracking, again, is pretty good, but when you have heavy ground contact on this, he goes through the process... let me turn off the sound so I can hear myself think... so he goes through the process, pulls it into SynthEyes, does all the tracking optimization, and then he shows taking it into Blender; he shows the setup of the Cryptomattes out of Blender.


And then, and this is really useful, he dives right into doing the, um, Cryptomatte extractions in Fusion, and getting one of the packages to get Cryptomatte data out of there. Um, and then you can actually just pick which rock you're going to stick that behind, and it works in the comp.


So this is pretty fast, you know, and Fusion is very, very quick at it. I think After Effects can do this as well; Nuke can, of course, do it. So I suspect what you may run into is that the first time you set this up, it's going to be a giant pain in the butt. But then what we can do is add this to the scripting system, so that everything AutoShot generates is already set up with Cryptomatte, right?
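
On the Blender side, the pass setup Eliot is talking about scripting is small. Here is a minimal sketch for Cycles; these are stock Blender Python properties, and the depth value of 6 is Blender's default:

```python
import bpy

scene = bpy.context.scene
view_layer = bpy.context.view_layer

# Enable Cryptomatte passes so Fusion (or Nuke/After Effects) can pull mattes.
view_layer.use_pass_cryptomatte_object = True
view_layer.use_pass_cryptomatte_material = True
view_layer.pass_cryptomatte_depth = 6  # levels of overlapping IDs to keep

# Multilayer EXR keeps the Cryptomatte layers alongside the beauty pass.
scene.render.image_settings.file_format = 'OPEN_EXR_MULTILAYER'
scene.render.image_settings.color_depth = '32'
```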


And in Unreal, it's not going to take that much extra effort. If you tell us which switches you need flipped for Cryptomatte, we'll just add them to the code, because that's a piece of cake. Uh, and then I think you'll be able to comp pretty fast, because then you're getting perfectly anti-aliased edges, and you'll get them in Fusion.


Because I think you might be fighting... the problem with the image plane method is that your image plane is at one depth, but the actual objects are at many different depths, and it's kind of intractable. It's hard to fight that one.


**Brett:** And I'd love to be able to... like I said, my experience is in finishing things in a 2D environment like Fusion, um, but I was fascinated by the lighting possibilities of taking everything back into Unreal.


That's the biggest thing I found interesting, watching what those guys were doing. I was like, oh, that might be the way to go with it. Anyway, I'm just trying to plot out a workflow that's efficient and gives [01:25:00] reasonably good results. I mean, like I told you, it's a sketch comedy thing.


We're not making feature films; it doesn't need to be more than YouTube. Uh, but you know, we're working with some comedians that are used to things at a certain level, and I want to be sure that nobody goes, well, this looks like complete shit. We want to be able to show them and go, this is what it's going to look like.


And it's going to be amazing. Let's go, let's do it. You know?


**Eliot:** And then, you know, make sure that you're set up so that when they're like, this is great, and then you're dealing with a hundred shots every couple of weeks, then you're okay. Yeah.


Yeah. This is totally worth dialing down. Um, all right, so I'm looking at Walter's comment. Uh, and... oh, actually, I accidentally... um, let me send the Moonland thing to everybody, 'cause right now I'd sent it to just...


**Nickhil:** Um, one more thing, if you guys have a second. I connected my phone through [01:26:00] Ethernet with the Lightning port.


And for some reason, AutoShot's not able to find the client. I put the IP address into the network settings on my computer and on my phone, so they're talking and on the same subnet mask. But for some reason, AutoShot is still not able to find it. At least on the web page, it doesn't show timecode or video.


**Eliot:** Let's see. So, okay, there are a couple of things to test. Um, one is your Windows firewall setting. So this is the documentation; let me put this up, because, man, networks make me crazy. Okay, so let's set up this thing: AutoShot network configuration. I'll post this link on here for everyone.


Um, so go ahead and click that, because you need to set your Windows network configuration to Private in the network [01:27:00] properties, if you're using the standard Windows Defender setup. This is one of these little weird, random things that can bite you. So take a look at that; see if your Windows network profile is set to Private. And then it'll be discoverable.
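
Beyond the network profile, the quick triage Eliot gets to below (an ipconfig, then seeing if it pings) can be scripted; the phone's IP and web port here are placeholders for whatever Jet Set actually shows:

```python
import socket
import subprocess

PHONE_IP = "192.168.1.50"  # placeholder: use the IP shown in Jet Set
WEB_PORT = 80              # placeholder: the port the Jet Set web page serves

# 1. Basic reachability (use "-c" instead of "-n" on macOS/Linux).
ping = subprocess.run(["ping", "-n", "2", PHONE_IP], capture_output=True)
print("ping ok" if ping.returncode == 0
      else "no reply: firewall, wrong subnet, or wrong IP")

# 2. Is anything answering on the web port?
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
    s.settimeout(2.0)
    ok = s.connect_ex((PHONE_IP, WEB_PORT)) == 0
    print("web port reachable" if ok else "web port blocked or closed")
```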


**Brett:** Did you put up a link for that video you were just showing, with the guys that did the Cryptomatte?


**Eliot:** Yeah, I just put that out to everyone. That's the Andy Axe Moonland one. Take a look at the, uh...


I just posted it. 


**Brett:** There we go. I see it.


**Eliot:** Okay, great. And, uh, Nikhil, do you see the network configuration one?


**Nickhil:** Yeah, I'm looking at it, the link you sent just now. Um, I'm just trying to figure out where to find that on my Windows machine. I think it should just be this. Let's see.


No. [01:28:00]


**Eliot:** It's already set to Private?


**Brett:** Yeah, 


**Eliot:** It is. Okay. So let me think. Are you on, like, a home router? Is that what you're on?


**Nickhil:** I am, but I just turned off internet access, um, on my PC, so it's only the local network connecting my PC and my phone.


**Eliot:** Let's see. So, all right, we can go gnarly.


Uh, if you want to, um, you can screen share and we'll do an ipconfig. Yeah, we'll just do some command-line searching to see if it pings.


**Nickhil:** Maybe you can also double-check if I am on the private network, uh, because I'm not exactly sure. So I'll just open up Zoom on my PC right now.


**Eliot:** Ah, that sounds good. Oh, shoot. And guys, actually, I just realized it's 10:30 and I have a [01:29:00] call coming in, uh, actually right now. So I'm going to have to stop on this one. What we can do is jump in at 9 a.m. on Monday, and, uh, let's see, Nikhil, you can be first up on doing this.


And we'll go from there. Yeah, sorry, I just realized...


**Nickhil:** All good. I'll probably do some tests in between, and I'll let you know what I find.


**Eliot:** Okay, well, glad the Save thing helped. At least we're one step forward.


**Nickhil:** Yes, that is one step closer.


**Eliot:** Okay, folks, this is awesome. This was great.


I'll put it up on the web, uh, as soon as it finishes uploading. All right, thanks, all. Thank you.


**Juniko:** Thanks, Eliot.


**Eliot:** Thank you. Bye.