Transcript
# Office Hours 2024-11-18
**Mark:** [00:00:00] All right, morning. Good morning. Thank you for getting back to me this weekend, but did you see my latest email that I solved the issue?
**Eliot:** Oh yeah, I haven't, I haven't even seen that yet. Let me go take, take a quick look at, at this. Uh, see what we're,
**Mark:** it wound up being an out of sync
**Eliot:** situation. We got a little echo going on.
Oh, okay. Oh, there we go. Now it's fixed. All right, let's see. Let me go, uh, look at the, uh, I'm going to check the emails.
Okay. Oh, there we go. So now we can share. "Solved it. Resized the green screen element using scale only." Okay. I still want to make sure... so let me just, like, pull this guy down so I can get at it later on. Uh, green screen footage, final wall. Okay. [00:01:00]
Okay. So the only thing I'd request for anything like this, um, almost as a reflexive action, the first thing I'm going to ask for is that take zip, uh, because if I have that, I can replicate everything. Okay. Like, absolutely everything. So let me show you. Let's actually go through it.
Have you, have you zipped a take before?
**Mark:** Zip the take? No, no, no. Okay. So, I didn't even, uh, properly log into this project. I just went and shot it.
**Eliot:** Right. It's not such a...

**Mark:** I'm a disorganized fellow.
**Eliot:** No worries. No worries. So, uh, do you have a screen share of AutoShot?

Cause what I can do is show you how to zip a take. Cause that is our primary diagnostic tool for every problem we've got, um, to figure out what's going on. Um, that's the non-real-time version; the real-time version is we just patch in and see what's up. [00:02:00] Um, so do you have the takes loaded in, uh, in AutoShot somewhere?
**Mark:** Oh, um, yeah, somewhere. But I think, now, I know that last meeting, uh, you said first person up, and I think BadBeetle has joined us because he had some, some issues.
**Eliot:** Okay.
**Mark:** I think instead of, uh, taking time away from him, um, let me, let me show you the, um, the composite I did, uh, just this morning.

Uh, I'll share that, and then we'll get into, uh, what you're talking about. Let me do this.
Okay. So let me.
Oh, okay. [00:03:00] Here we go.

I was so happy that I was able to resolve this, uh, through, um, you know, just back and forth, cause that, that worked well. But I think maybe, uh, you'll make a tutorial about zipping and everything, uh, a little later on.
**Eliot:** Yeah, yeah. Oh, thank you. [00:04:00] I'll put the link up there for take zipping.

It's literally, like, one click, uh, in the file menu. So if there's a take that's giving you a problem, um, then... I'll just share it with you so you can see this. Uh, share, go to AutoShot real quick. There we go. So I've got, uh, AutoShot loaded. Uh, and this is just a take I was dealing with, um, you know, the other day. So then I can look at this and go, okay, pick whichever take I'm dealing with. And then I can do just File, and then, uh, Take Zip. Let's see, where does... yeah, Export Take Zip. Right. And then I'll just write it out to, you know, some directory.
And what that does is, and I don't actually know if you can see that now, um, it just writes it out to a zip file, and then you just send that zip file to us. Um, and that [00:05:00] has all the information from a given Jet Set take. Uh, and you can even include the Blender source file if you want to, but usually we don't need that.

But it's literally just File, Export Take Zip, with whatever problem take you've got selected. And it'll make a zip file, and that file will contain the source video for both the normal Jet Set take, like the iPhone take, as well as the Cine take if you have that selected in your take, plus, uh, any calibration stuff we need.

Uh, it has everything. And then we just open that up on our systems and we can replicate whatever problem it is. Super easy. That's, like, the first tool we built in AutoShot, and Greg and I just live on that.
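The take-zip idea, bundling everything needed to replicate a problem into one archive, can be sketched in a few lines. This is purely illustrative: the folder layout and file names below are hypothetical stand-ins, not AutoShot's actual take structure, and in AutoShot itself this is just the one-click File menu item Eliot describes.

```python
# Illustrative sketch only: mimics the idea of bundling a take folder
# (source video, calibration, tracking data) into a single zip for support.
# File names are hypothetical, not AutoShot's actual layout.
import tempfile
import zipfile
from pathlib import Path

def zip_take(take_dir: Path, out_zip: Path) -> Path:
    """Bundle every file under take_dir into out_zip, preserving relative paths."""
    with zipfile.ZipFile(out_zip, "w", zipfile.ZIP_DEFLATED) as zf:
        for f in sorted(take_dir.rglob("*")):
            if f.is_file():
                zf.write(f, f.relative_to(take_dir))
    return out_zip

# Demo with a throwaway take folder (hypothetical contents)
tmp = Path(tempfile.mkdtemp())
take = tmp / "take_0012"
take.mkdir()
(take / "iphone_take.mov").write_bytes(b"...")       # stand-in for source video
(take / "camera_calibration.json").write_text("{}")  # stand-in calibration data
archive = zip_take(take, tmp / "take_0012.zip")
```

The point of the single archive is the same as in the conversation: one file carries everything needed to replicate the problem on another machine.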
**Mark:** Well, I have to mention that in the book then. You guys are definitely servicing your, your customers.
I can, I can see that.
**Eliot:** Well, it's, um, the whole system is built on structured data methods, where we were very careful with how we capture data, how we store data, how we transfer it through. And if you [00:06:00] do that, then it becomes very straightforward to fix things. You know, like, we have a standard USD file format coming in.

So that means that whatever app we're coming in from, whether it be Maya or Houdini or Blender or Unreal, et cetera, um, it all goes into USD. We can open up USD cause it's an open-source file format. We can look at it and we can very rapidly diagnose what went wrong and put in a fix, uh, you know, in how we handled the USD from that particular flavor and everything. Because we're dealing with very standardized methods to go from point A to point B, we can always fix it.
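The debugging benefit Eliot describes comes from USD's ASCII flavor (.usda) being plain text, so a broken file can be opened and read directly. As a rough illustration, a minimal prim can even be written with ordinary file I/O; real pipelines use the pxr USD library, and the prim layout here is a toy example, not AutoShot's actual schema.

```python
# Toy sketch: USD's ASCII flavor (.usda) is human-readable text, which is
# what makes "open it up and look at what went wrong" possible. Real tools
# use the pxr USD library; this hand-rolled prim is illustrative only.
import os
import tempfile

def write_usda(path, prim_name, translate):
    """Write a single translated Xform prim as USD ASCII text."""
    tx, ty, tz = translate
    text = (
        "#usda 1.0\n"
        f'def Xform "{prim_name}"\n'
        "{\n"
        f"    double3 xformOp:translate = ({tx}, {ty}, {tz})\n"
        '    uniform token[] xformOpOrder = ["xformOp:translate"]\n'
        "}\n"
    )
    with open(path, "w") as f:
        f.write(text)
    return text

# Write a throwaway file (hypothetical prim name and transform)
out_path = os.path.join(tempfile.mkdtemp(), "scene.usda")
doc = write_usda(out_path, "castle", (0.0, 1.5, -3.0))
```

Because the result is plain text, diagnosing a bad transform is a matter of reading the file rather than reverse-engineering a binary blob.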
**Mark:** I mean, that's great. That's good to know. Well, uh, I'm going to hand this over to, uh, Mr. BadBeetle and Rick.

**Eliot:** Uh, let's see, Rick or BadBeetle, who wants to jump in next? Yeah, someone was here first.

**BadBeetle:** Hey, good morning guys. I'm gonna let Rick go first. Um, I'll, I'll hop on after him.
**Eliot:** [00:07:00] Okay. All right, Rick.

**Rick:** Oh, okay. Are you sure? You, you were here first, so I'm absolutely happy to wait.

**BadBeetle:** Oh no, go ahead, Rick. I actually have to, I'm gonna send another, uh, video file we did to Eliot, so I'm just waiting for it to upload.
**Rick:** Oh, fantastic. Understood. Okay. Well, hi. Uh, hello again, Eliot. Good morning.
So, I have a question: I am trying to replicate a workflow that you executed beautifully in a YouTube video entitled "AutoShot Unreal Round Trip," uh, where you work with a, uh, a little tiger in front of a fantasy castle. And there's a moment in the video in which you are aligning the image plane, and the tiger's feet are going through the ground, and you fix that quite easily by deleting transform data

and [00:08:00] setting an X-position, uh, keyframe at the beginning of your animation clip, uh, where the image plane is aligned properly. Uh, I've tried to replicate that exact workflow, I'm working in Unity, which is different from Unreal, of course, not only exactly step by step, but also by taking account of the differences between Unity and Unreal.

Um, and the outcome that I received is that, uh, there is no, uh, adjacent fix or similar outcome.
**Eliot:** You know, I'm super curious. I think I might've opened Unity once, and it was 10 years ago, so, I mean, let's just look at it.

Do you have a screen share or something we can take a quick look at? Cause on [00:09:00] the Unreal front, I found out, um, Josh Kerr posted a better way of doing this with the image plane, and we're actually implementing it in the next release of AutoShot. You don't have to do the strange sort of move-the-image-plane-back-and-forth thing.

You just tell the image plane material to more or less ignore depth sorting. Um, so there's a little checkbox. Just like everything in Unreal, there's a checkbox buried, like, four layers deep and under a cupboard. Um, and we're like, oh, click the checkbox, and then things render the way you'd expect them to render.

Uh, so we'll implement that. But in the meantime, Unity is a new thing for me. So I'm just curious; I don't actually even know how it thinks. So I'd love to see how it thinks and how it works, and maybe we should just figure out a direction.

**Rick:** Sure. So, uh, yes.
I'm going to give this a shot and see if this will work. I did get a message from, oh, I'm sorry, from Zoom, saying if I share, I may have to restart my device. Do you see it?

**Eliot:** Yeah, I see it.

**Rick:** Oh, beautiful. Then I don't have to restart Zoom. [00:10:00] I was not expecting that. Um, okay. So this is Unity. Uh, this is the Unity Timeline here at the bottom.

This is the Scene view, the working view, on the left, and on the right, above my Inspector, is the, uh, the Game view. So I'll drag this out of the way. Um, as I scrub, as we're all familiar with, um, we have the forward and backward motion of the LiDAR data doing its thing. And in the Game view, of course, it does look like our tripod is tracked into position.

Uh, it's where it should be. Uh, but of course, sometimes it will hop in front of this foreground cube, and other times, uh, it jumps really far back. So if I open up the animation data, then what I am given [00:11:00] are keyframes for the scale of the image plane here, and then data for the camera, which is here.

So I have keyframe data for the camera position and camera rotation, and for the image plane, only scale.
**Eliot:** Yep.
**Rick:** That makes sense. Okay. All right. Um, well, before I do anything, would you make any kind of recommendation based off of what you're just seeing? Or should I just make sure the scale works in Unity the same way it works in, uh, in Blender?

**Eliot:** So in Blender, the way we designed the image plane, its origin is actually the exact same spot as the, um, the nodal point, or basically the camera location. So as you scale the image plane, it grows and shrinks with the frustum to always exactly match the camera view. And it looks like [00:12:00] it's brought that over, uh, cause you imported this from Blender, right?

**Rick:** That is correct.

**Eliot:** Okay. So it looks like it's brought that over. So what I think... oh, good.
**Rick:** The way that I bring in, uh, Jet Set information, take data, and the entire rig from Blender is simply: AutoShot will generate the Blender file, I simply save and close it, and then I drag the Blender file into Unity, because Unity will read Blender files exactly as is and create a prefab from that Blender scene.

The only work that I have to do when I first drag a Jet Set take from my project window into Unity is, on the root node, apply a 180 [00:13:00] rotation on the Y axis, because it is backwards, and to do the same thing on the image plane, because the image plane is behind the camera when it first comes in; it's back here.

So I apply a 180, and then its position is correct. Those are the only changes I make from Blender.
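The scaling behavior Eliot describes, an image plane whose origin sits at the camera's nodal point, can be sketched numerically. Under a uniform scale about that origin, every corner of the plane slides along its own view ray, so the plane keeps filling the frustum exactly. The field of view, aspect ratio, and scale values below are illustrative, not taken from the actual rig.

```python
# Sketch of the image-plane trick: the plane's origin is the camera's nodal
# point, so a uniform scale slides it along the view rays and it always
# exactly fills the frustum. Numbers here are made up for illustration.
import math

def plane_corners(h_fov_deg, aspect, scale):
    """Corners of an image plane at unit depth, uniformly scaled about the camera."""
    half_w = math.tan(math.radians(h_fov_deg) / 2.0)
    half_h = half_w / aspect
    # Camera at the origin looking down -z; scaling about the origin
    # (the nodal point) moves each corner along its own view ray.
    return [(scale * sx * half_w, scale * sy * half_h, scale * -1.0)
            for sx in (-1, 1) for sy in (-1, 1)]

near = plane_corners(60.0, 16 / 9, scale=1.0)   # plane at unit depth
far = plane_corners(60.0, 16 / 9, scale=3.14)   # pushed back, like Rick's 3.14 key
# Each far corner is the near corner times 3.14, i.e. the same ray direction,
# so the plane still exactly matches the camera view at any scale.
```

This is why scaling, rather than translating, is the safe way to push the plate back: the image stays pixel-aligned to the camera for free.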
**Eliot:** Okay. That's, that's quite promising. Um, all right. So when you're in your scale, um, let me, let me make a guess here: what we actually want to be able to do is keyframe the scale so that the tripod is not, like, going through your, your cube there.

Is that, is that the correct goal?
**Rick:** Well, I, I can do that. And I have keyframed the scale simply by deleting all of the [00:14:00] keyframes for the image plane scale, with the exception of the very first one, and then basically creating manual keyframes throughout the clip. I can do that, uh, for a series of shots; obviously that's not the worst workflow, but it is a manual workflow.

I was fascinated by your ability in Unreal to place one keyframe, one correctional keyframe, at the beginning, uh, of working with that tiger, and then it holds through your entire shot.
**Eliot:** Yeah, well, usually... you recall this? Yeah, yeah, I remember that. The way keyframes are usually set up is that, uh, you only need to set one if you want a constant value.

And so if you're at frame one, uh, and you, you know, set your... yeah, there you go. So they've got your scale. So now let's [00:15:00] adjust your scale. Um,

and yeah, there we go. So there's, there's your tripod. Um, and that keyframe should just hold through the range. What happens when you go to frame two? Does it disappear?
**Rick:** Well, uh, yeah, so I was just making sure that I was lined up with the foreground. It will, uh, it will stay at this value of 3.14, which is the new information; it will remain throughout the entire clip. However, what will happen, because of the parenting situation, is that it moves, and it will, um, you know, it will disappear and move, just with a different relationship to the camera. It will always be correct at that one point, but it does not hold true for the rest of the clip as it did in the Unreal demonstration.
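The parenting problem Rick describes can be made concrete with a toy calculation: with the plane parented to the camera, pinning the subject to a fixed world position means re-deriving the plane scale every frame from the camera-to-subject distance, which is exactly the manual keyframing being discussed. This sketch simplifies to motion along the optical axis, with made-up positions.

```python
# Toy model of the depth-keyframing problem: the image plane rides with the
# camera at (scale x unit depth), but the tripod is fixed in the world, so
# the needed scale changes every frame the camera moves. Positions are
# invented for illustration; real motion is a full 3D path.
def scale_for_frame(camera_z, subject_z, base_depth=1.0):
    """Plane scale that places a unit-depth image plane at the subject's depth."""
    return abs(subject_z - camera_z) / base_depth

subject_z = -5.0                       # tripod fixed in the world
camera_path = [0.0, -0.5, -1.0, -1.5]  # camera dollying toward the tripod
keys = [scale_for_frame(z, subject_z) for z in camera_path]
# keys == [5.0, 4.5, 4.0, 3.5]: one key per frame, because the needed scale
# changes whenever the camera-to-subject distance changes.
```

A single constant keyframe (as in the Unreal tiger demo) only works when the camera-to-subject distance never needs to thread between occluders; with a foreground cube, the scale has to track the distance per frame, as the values above show.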
**Eliot:** Right. And in the [00:16:00] Unreal one, what I did is I basically moved it pretty close. So what's going on here is, in the Unreal clip, I did not have a foreground occlusion. You have a cube in the front. So I just pulled the plane back, and actually we can try this, you can pull that back until it's just behind the cube. But the problem is you've got a foreground occlusion, and so now you have to dance back and forth, whereas I didn't; I just, you know, moved the tiger up, and then I didn't have a foreground occlusion.

Um, this is what's going on: you've got two, uh, two constraints. Uh, and when you're doing that, then, yeah, there may be another way other than keyframing it, but I don't know that way. I mean, what you could do, give me a second, is... different systems have different ways of doing things.

And there may be some clever way to set up a constraint to, um, tell that [00:17:00] distance to always be there. I think that's going to be tricky. I think what you're heading rapidly toward is: can Unity do Cryptomatte? Uh, let's, let's take a quick look at that.
**Rick:** Are we, uh, are we exploring the possibility of having to render out passes, then?
**Eliot:** Yeah. The, um, there's a Unity Cryptomatte; it's kind of a research project. Um, let's see, cause what you're running into is a classic problem in compositing. It is an ancient problem. And the way that everybody in compositing handles this is, you know, Cryptomatte and object IDs.

And they're magic; they're just flat-out magic. It means that there's a compositing step. Um, but before I jump into it, let me just think if there's anything smart I can come up with. Um, I mean, there are constraint systems where you can kind of keep your object in a given spot. It gets tricky because it's parented to the camera, and yet you want it to be at a, you know, at a fixed distance.

So it changes. Um, there may be a way to do that, but it's, it's gonna be tricky. Um,
**Rick:** Would the, would the following work? I have not tried this yet, and I'm imagining it would break things: if I were to unparent or duplicate the, um, basically take the image plane, break it off, do not have it as a child of the camera or scene locator or shim, have it be an independent object in the world, and

position it manually, and then copy and paste, uh, the scale data.
**Eliot:** I think you're going to be in a world of hurt, because the camera [00:19:00] motion is on a per-frame basis. So keeping it parented to the camera is why it stays aligned. As soon as it's not parented to the camera, you're hand-doing every frame, and it actually becomes nearly impossible.

I mean, this is how people used to do match move, like back in the early nineties. And one person would spend weeks on a shot, um, to try to get that right. So, um, the other possible way to do this is you ditch the cube.

There's the simple approach where you take the cube out for that shot. I mean, what's the best way to put it? We're out of elegance here. We're in the, uh... you know, like some old movie I saw where the hero's trying to get away from the bad guys, and they go, "Look, a UFO!"

And the bad guy looks, and the hero, like, kicks him or [00:20:00] whatever. And, uh, this is a "look, a UFO" situation. Okay.
**Rick:** Well, that's fair. I, um, I have also done the manual keyframing, uh, of the depth of the image plane. And, you know, Ian Hubert is correct in his Blender tutorial.

Uh, it's a forgiving process. One does not have to be exact, nor does one have to do every single keyframe. You can usually, uh, skip about quite a bit, which is promising. Um, but, uh, yeah, I was looking to see if it was possible to get away from that. And I of course did throw a foreground occluding object into the scene because, uh, the beauty for me, or at least from my limited-experience perspective, the beauty of a virtual production, as opposed to rear [00:21:00] projection or just pure green screen,

is the idea of an immersive world, especially with Jet Set, where you can have a three-dimensional, uh, like a stage in the round, and then place your scene locators everywhere and shoot from behind an object, shoot from here, shoot... To me, that's the magic, the beauty of it. And I, uh, I really wanted to maintain that. And, you know, maybe the process is that for those instances, uh, this has to be done.
Um, another beautiful aspect of why I like the image on a plane, and having it positioned correctly as opposed to just really close to the camera, is that when I'm activating things like, uh, localized fog, or if I have a localized spotlight on this area... the image plane in Unity, and I don't know if Unreal does this, [00:22:00] but in Unity,

uh, my tripod will cast shadows. My tripod will accept light. If I apply a normal map, I can have a rim light on the tripod. Um, if I activate fog, it will look like the tripod is sitting in fog, and as the camera moves away from the tripod, as long as the tripod is keyframed to remain in position, that fog will become thicker around the legs of the tripod.

It's very immersive, and that's what I'm after. And the AutoShot Jet Set workflow has taken so much technical workflow off the table that it's a delight. But maybe I'm being greedy in this last request, trying to figure out: can I get this one little bit of extra? Um, that is the impetus for my question.

**Eliot:** I've got another idea, [00:23:00] because ultimately what you're dealing with is an object that is fixed in space. Right. Um, it does not actually move; the tripod is just standing there. So another approach you could use, and this comes down to very system-specific stuff, is sometimes you can do a camera projection, uh, cause you already have the camera texture, you know, which is the image sequence, and you know exactly from which camera it's projecting.
So if there is a Unity camera projection... actually, this might work. Let's see if it lets us.
Let me see if I can do this.
Project. [00:24:00]
Let's take a look.
No, it doesn't come up immediately. Um, oh man, I don't see it yet. Um, "project with camera." Okay, let's take a look.
There are people doing custom shaders in Unity to do projections. Um, it's, it's tricky stuff. [00:25:00]

**Rick:** Um, there may be an unofficial pathway to doing camera projection in Unity. And I'm happy to research that on my own. Um, but you think that that might be a valid approach?

**Eliot:** Yeah. Uh, Mark, I'm just going to mute you real quick there for a second. Um, because then what you could do is make a cylinder, um, that is where the tripod would be in the scene. Um, or whatever; it almost doesn't matter. And then what's happening is you set up the projection stuff to project the texture from the camera's point of view. You're just projecting the, you know, the transparent, um, picture of this, but then it's going to project [00:26:00] onto that 3D object.

And since the 3D object isn't moving, as long as you have this in roughly the correct place, it'll show up as the texture in the correct location, um, wherever you are in the scene. Cause you are literally, you know...
**Rick:** I actually might already have an idea of how to accomplish that in Unity.

**Eliot:** Okay, cause now we're doing things correctly, you know? You have a stationary 3D object, you have the knowledge of where the camera is, and you can project upon that. I mean, Nuke does this, and a bunch of the 3D systems all have systems for this, you know, Blender, Maya, whatever.
It's a technique from, like, not quite caveman days, but shortly after that. You know, shortly after the first two pixels were comped with A over B, somebody came up with camera projection. And this is how you did stuff before you could actually render anything.

You just camera-project photographs onto [00:27:00] simple geometry. And, interestingly, this is actually how they did the pod race in Star Wars, one of the ones back from, when was it, 2000? Yeah. It's camera projection. So, uh, there's almost always a way to do it in Unity. You may end up with a custom shader, um, where it gets a little bit gnarly, cause then you're backing out the camera projection matrix and reapplying it. But, you know...
**Rick:** I'm moderately familiar. Uh, I've built shaders; I've messed around with Shader Graph quite a bit, uh, simply out of necessity, really. So, uh, I may not get it right on the first try, but, uh, I won't be jumping in cold. So I will, I will investigate that pathway. I really appreciate that, Eliot.

**Eliot:** No worries, no worries.

**Rick:** I will check that out.
**Eliot:** We've been planning to do that by default, um, at some point. It's just that every single program has their own, their own special way of doing it, and so it ends up being tricky to get it correct in all the different, different systems. But it's [00:28:00] such a useful thing to be able to, you know, camera-project stuff.

So, um, okay. Yeah, let's, let's try that. Let's see how that works.

**Rick:** Um, I will, I will do that. And then I will certainly report back, uh, at some future date. Thank you.
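The camera-projection math Eliot alludes to, backing out the camera's projection to map a world point onto a UV in the recorded plate, might look like the following. In Unity this would live in a custom shader; this is just the underlying pinhole math, with made-up numbers, assuming a camera at the origin looking down -z with no rotation (a full version would first rotate the point into camera space).

```python
# Sketch of the camera-projection idea: any world point on the stationary
# proxy geometry maps back to a UV in the recorded plate via the recording
# camera's pose and field of view. Numbers are illustrative only.
import math

def project_to_uv(point, cam_pos, h_fov_deg, aspect):
    """Pinhole projection of a world point into plate UVs.

    Assumes the camera looks down -z with no rotation; a full version
    would first transform the point into camera space.
    """
    x = point[0] - cam_pos[0]
    y = point[1] - cam_pos[1]
    z = point[2] - cam_pos[2]
    half_w = math.tan(math.radians(h_fov_deg) / 2.0)
    half_h = half_w / aspect
    # Perspective divide, then remap [-1, 1] to [0, 1] texture coordinates.
    u = (x / (-z * half_w)) * 0.5 + 0.5
    v = (y / (-z * half_h)) * 0.5 + 0.5
    return u, v

# A point straight ahead of the camera lands at the center of the plate,
# which is why stationary proxy geometry stays textured correctly.
uv = project_to_uv((0.0, 0.0, -4.0), (0.0, 0.0, 0.0), 60.0, 16 / 9)
```

Because the geometry doesn't move and the camera path is known, every frame re-projects the plate onto the same world-space surface, with no per-frame keyframing required.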
**Eliot:** Yeah, I'd like to see that. I mean, what you're talking about with being able to use the, um, the image plate to cast shadows...

Unreal, man, I went back and forth on this quite a bit. Their default way of doing things is with a blend, where you have, like, a falloff in the alpha transparency. In order to get something to cast shadows, you have to set it to a clip, where it thresholds the transparency at basically zero or one.

So you end up with this really blocky cutout on the key. So it doesn't do that correctly, and that's with the normal Lumen. I don't know what the path tracer can do; I haven't tried that yet. But, um, it is such a nice thing to have the 2D in the 3D scene.

I mean, it's [00:29:00] such a nice thing to have, because then you see where everything is, and you're not dealing with the whole, I mean, Cryptomatte and all that kind of stuff. It's a thing, and at a certain point you have to go there, but boy, it's a real pain in the butt. And I see why you want to keep it in the engine.
**Rick:** It's, yeah. May I, may I demonstrate what Unity does? I have a very dirty composite, just a dirty build that I put together demonstrating the shadow and the fog. Uh, let's see if you have an interest in seeing it.

**Eliot:** Oh, I, I love, I love seeing this.
**Rick:** Let me switch out this, uh, shared screen real quickly.
**Eliot:** That's also great to know, that going to Blender and then to Unity is actually a pretty clean path. What it's doing right there is exactly what you would do with a direct import. It's doing all the right stuff.
**Rick:** Yeah, it is, uh, it is painless. It just simply works, which is a delight. So, [00:30:00] okay. So I've just pulled up this file.
**Eliot:** And have you been able to write a USD file out from Unity to go into Jet Set? I haven't even, I've never tried that.
**Rick:** That's painless as well, because Unity has a, um, a USD exporter package, which you can simply, uh, install, uh, using, um, using Package Manager.

And so I, I export from Unity as a USDA into a folder in my project. Uh, I just, um, locate it in AutoShot and bring it in, and, you know, AutoShot makes it into a USDZ. It goes straight to Jet Set. I shoot, and then I bring it right back in through AutoShot into Blender.

I don't even touch it in Blender. As soon as AutoShot has generated it, I just say close, and then drag that created Blender file into Unity. And it's there, the prefab, which is delightful. It's a very clean [00:31:00] workflow. All right. So here is the, uh, let's see, are we seeing the correct thing?

Are we seeing the, a video player?
**Eliot:** No, right now I'm seeing another screen. It just has, like, a, um, a Finder window on it.

**Rick:** Uh, yeah, let me... how's that look?

**Eliot:** There we go.

**Rick:** All right, so this is just me walking in my living room, uh, and then you'll notice the shadow on the cubes, and then the fog in front of my legs as I go further back into the...
And that's what I'm looking to do without the manual keyframing. But what I am doing in this, because the camera is stationary... I take it back, I did manually keyframe going back and forth, but [00:32:00] all I have to do is manually keyframe the, uh, the forward and backward action. Um, and, you know, it's a certain process for moving video. And my workflow for this, I did not spend time making it pretty; I did everything, from using a scene in Unity, to adding some cubes, adding a few lights and effects, going into Jet Set, coming back into Unity, all of it, uh, in about 10 minutes.

That's how quick it was, which is delightful. Um, Final Cut. So I'm an Apple guy. Uh, I'm very accustomed to DaVinci Resolve, Avid, Premiere, Final Cut. Uh, Final Cut just came out with Final Cut 11, which has its own internal Magnetic Mask. So, you're familiar with Magic Mask in, um, [00:33:00] Resolve?

**Eliot:** Yeah.

**Rick:** That allows me to shoot without a green screen, one click on a subject, moving or not, and create a rotoscoped plate, which is what this is. Very rough. I literally did not finesse it at all. I just clicked on me and said, whatever it gives me, it gives me, and I just went with it. Uh, and that way, uh, I exported a PNG sequence, which of course I used in Unity, but that means I'm not keying in Unity.

I've got just my subject, uh, moving as the camera recorded. And, uh, and then I've got this thing. But I also have a custom, uh, green screen shader in Unity, which allows me to do live keying inside of Unity for the exact same effect. But what I like about this is that with a green screen stage, I am limited to a performance that is the size of the green screen stage.
Whereas in this, I [00:34:00] can, I can walk, I can walk a mile into the background if I so desire. Uh, walk into the horizon and, and, and do a rotoscope AI key or a, you know, AI rotoscope. So. This is what I like. I like the idea of an image plane. I've never been a fan of Unreal Composure or, or, uh, Unity's Graphic Compositor, because it's a separate plate that sits on top of everything, and it does not look as integrated.
If you're going to integrate it, you really need to, to, you know, light it in real life to look a certain way. And with this, you can light it, Close and then you'll know it's like here I am absolutely. Almost in silhouette. And then I walk into the light. Wow. And of course I've got shadow, ah, I've got light on myself and I've got fog around my feet.
Yep. Like that's the magic that I'm after and that, that I've been chasing for the last couple of years. And, uh, just that's really [00:35:00] slid into a, uh, into a, uh, into the workflow nicely. Uh, and that's why I was kind of hoping I might also rely on your expertise to maybe figure out that last little. Magic of could I avoid manually keyframing, but I will check out that
**Eliot:** that
**Rick:** camera projection and see what it gets me
**Eliot:** Because what it could get you is: instead of having to keyframe the image plate... I mean, sometimes you have somebody moving around a lot, in which case you're going to be keyframing something, whether it's the image plane or the, uh, the projection. But if you have something that's more stationary, then you might be able to project onto it, and that's it. Then it's doing all the right things, because the geometry is in the right spot and the projections are in the right spot. Um,
**Rick:** Yeah, I think you're right. For a shot like this, uh, keyframing is inevitable, because, uh, you know, I've got myself between two cubes back here as well. [00:36:00] Um, so keyframing, you're correct, will be inevitable. But for those stationary shots, like, if I'm shooting two people talking at a café table and they're not moving, I would love to project that, so that on those shots the keyframing is not necessary, because that can reduce the manual keyframing workload

by, like, 50 percent or more, depending on the production.
**Eliot:** Right, right. And you always just want to focus on the things you're going to have to do for the next, like 50 shots, because it, it gets, gets old. Um, oh, this is great. I love, I love the, the ground fog interaction and the, and the lighting interaction, and this is, I mean, this is one of these, these things that's, it's a real, it's a real deal.
And the, um. Um, we did some experiments with trying to bring the depth, um, data from LiDAR directly into, into the, the frame, and it, it, it was too chattery, et cetera, to get that. There are things coming out, um, that are, that are look promising, some of the AI depth [00:37:00] generation stuff that gets, gets promising.
It's not dimensionally accurate yet, and so one of the things that I, you know, I talked with, uh, Hoon Kim over at SwitchLight a little bit about is maybe combining our LiDAR data with their AI depth data, so that it's both accurate and clean. Um, and you know, at some point we may get there, cause there's just so many benefits to having the live action in the 3D scene, there's no two ways about it.
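A common way to combine a metric-but-noisy depth source with a clean-but-relative one is to fit a per-frame scale and shift that aligns the AI depth to the sparse LiDAR samples. This is a generic least-squares alignment sketch, not Jet Set's or SwitchLight's actual pipeline; the sample values are invented:

```python
# Fit depth_lidar ≈ scale * depth_ai + shift by least squares, so the
# clean AI depth map can be rescaled into real-world (metric) units.
def fit_scale_shift(depth_ai, depth_lidar):
    n = len(depth_ai)
    mean_a = sum(depth_ai) / n
    mean_l = sum(depth_lidar) / n
    cov = sum((a - mean_a) * (l - mean_l) for a, l in zip(depth_ai, depth_lidar))
    var = sum((a - mean_a) ** 2 for a in depth_ai)
    scale = cov / var
    shift = mean_l - scale * mean_a
    return scale, shift

# Sparse LiDAR samples that happen to be 2x the AI depth plus 1 meter:
ai = [1.0, 2.0, 3.0, 4.0]
lidar = [3.0, 5.0, 7.0, 9.0]
scale, shift = fit_scale_shift(ai, lidar)
print(scale, shift)  # → 2.0 1.0
```

Applying `scale * d + shift` to every pixel of the AI depth then gives a map that is clean like the AI output but dimensionally anchored to the LiDAR.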
Um, and the comps aren't that complicated, you know. Again, there's a level where, yes, you have to go to a separate compositing app, you know, when you start trying to go at a certain scale of things, but, um, wow, this is cool. All right. Well, this is great. Let me switch over and see if we got BadBeetle.
All right. So let's take a look, cause he's sent me a link. [00:38:00] And, uh, BadBeetle, are you there?
**BadBeetle:** Yeah, I'm here. I mean, all
**Eliot:** right. Is it okay if I show some of this?
**BadBeetle:** Yeah, go ahead.
**Eliot:** All right. Let me, this is,
**BadBeetle:** this is only test footage that we're, uh, I mean, we were, we were trying to figure out different, um, problems that we may foresee during filming.
So, I mean, obviously almost everything, I'm the actor, just so you know.
**Eliot:** Yeah, no worries. No worries. Go ahead, you can share it. Alright, so let's, uh... It feels better already.
Oh, I like the, like, light and shadow going on. Oh, there's a rack. Oh, jeez. This is,
oh, wow. Oh, that's what was there.[00:39:00]
Oh, there's another, okay, there's another person over there.
You've got it on red so that the glints on the weapon are accurate.
Wow,
you guys are doing a good job of matching light. Like man, it makes, it makes or breaks the comp and you're, and you guys are not afraid of going to like a, a real film noir kind of, kind of look that is
what, okay, so it's in a window. [00:40:00] Okay. How is this made?
Do we get to see behind the scenes of that?
Oh, there we go.
That is wild. Cause in the shot it was, you know, a moving window and a lens flare, and,
Oh, this is great. That's great. This is what, so I'm going to come back on a couple of these things. So can you, uh, can you walk us through a couple of these things? So we've got this, this element where you rack focus. So how is [00:41:00] this all put together? So that, that was a separate element. Is that a. Uh, is that over in, in a separate element in unreal or in, in, uh, after effects or how did you put the elements together?
**BadBeetle:** It's a separate element in Blender.
**Eliot:** Oh, okay.
**BadBeetle:** Well, so it was kind of like, uh, I mean, when, uh, I forgot what the name of the guy was last, uh, last meeting when he was talking about rack focus. So that was something that we, we might be using in our, you know, in our project. So we were like trying to figure out like, well, how can we do it?
You know, and especially in blender. So one of the things we were thinking at first was. Try to have, um, everybody in focus and then in blender, you know, do the rack focusing, you know what I mean? But being that I'm the only actor we had, you know, so we were like, okay, so maybe what we'll do is we'll just film two different shots.
You know what I mean? And then do the rack focus on the second shot.
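The two-plate rack-focus trick described here (shoot the setup twice, then animate which plate is sharp) is, underneath, just an eased blend of focus distance over the shot. A minimal sketch; the focus distances, timing, and the smoothstep ease are illustrative choices, not what their Blender setup necessarily used:

```python
# Smoothstep-eased rack focus: animate focus distance from the near
# subject to the far subject over a normalized shot time t in [0, 1].
def rack_focus(t, near=1.2, far=4.0):
    t = max(0.0, min(1.0, t))
    ease = t * t * (3.0 - 2.0 * t)  # smoothstep: slow in, slow out
    return near + (far - near) * ease

print(rack_focus(0.0))  # → 1.2  (holding on the near subject)
print(rack_focus(1.0))  # → 4.0  (focus has landed on the far subject)
```

In Blender you would drive the camera's depth-of-field focus distance with a curve shaped like this; with the two-plate approach, the same curve decides which plate's sharp version is on top at each frame.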
**Eliot:** Now, I gotta say, this is a case of literally wearing multiple hats in the production.
**BadBeetle:** Oh, yeah. Well, [00:42:00] this is done with only two people.
**Eliot:** Oh,
**BadBeetle:** great. You know, I mean, we basically, uh, you know, we're doing it at night. We're trying to do just one hour, just, you know, or just spend one hour on each clip.
Yeah. And, uh, so with that shot, we actually were watching the Batman movie, you know, the last one that just came out, and there's a scene in it that we were like, okay, we know they spent about a million dollars on this shot. Can we replicate it, you know, in less than an hour? So we actually did this shot with what we had, and we're also using a basic, uh, I think it's a Canon 80D, you know, camera.
So it's not even a high end camera. You know, to be able to show that, Hey, this is the quality that we can get from, you know, just, you know, cheaper equipment, you know,
**Eliot:** you got to tell me, so you've got a muzzle flash and it's affecting both you and it's affecting the background. So, is this, do you have an image plane in, in, in blender with you on it?
And you're, [00:43:00] you're lighting up the flashes in blender. How are you doing this?
**BadBeetle:** We use the Aputure MC for that.
**Eliot:** Oh, all right. Okay. No, but wait a second. Wait a second. But the background's virtual. So did you keyframe the lighting in the background to match that?
**BadBeetle:** Yeah. Yeah.
**Eliot:** Wow. And you did that in an hour?
**BadBeetle:** Hey, pretty much. I mean, me and my, me and my partner, um, see, I handled, um, the, the auto shot stuff and while I was doing the auto shot stuff, he keyed out and then he, he's pretty, I mean, you figure we've been doing this for a long time. So it's kind of like, I mean, he kind of already knows what to do.
I mean, that's more of his part. My, my part, my focus has always been on the. You know, the auto shot and you know, in the, you know, the jet set stuff, you know, I mean, he does more of the, you know, speed compositing, I guess you can call it, but
**Eliot:** I love, man, I love the, the sort of the night, you know, that that is high contrast.
So, okay. Let's talk about, [00:44:00] okay, the key. The key looks great on this, especially because, in this original shot, this is not an easy key. So did you use a combination of AI keying and, uh, maybe a normal keyer to extract some of it, or how are you doing it?
**BadBeetle:** Yeah, we used, um, I think we used RunwayML again, you know. And the only reason we're using it is because, I mean, I finally did get InSPyReNet to work, but it just takes so long compared to the AI doing it, you know. And also we can key out certain objects.
So let's say if there's something that we want to keep, you know what I mean? Let's say like in, in the, in the, um, you know, with the gunshot, we want to keep the gun, so we just kind of like, you know, um, mask it out and then the AI figures it out for us. But I mean, obviously these are not finished shots. I mean, these are just what we did real quick and to see what we can do.
**Eliot:** The fact that you're doing this in an hour or two is why, I [00:45:00] mean, this is so exciting to see. The shots are good, right? You put a bunch of these together and people get to see it. And you're already doing stuff: it's a handheld shot.
You're matching, you know, muzzle flash and the lighting and stuff like this, and you're able to go through it quickly and see. People respond to visuals in a way that they do not respond to a 120-page script that they've got to go read somewhere on the weekend when they're trying to juggle, you know, kids and whatnot, right?
Like, this is what people look at, where otherwise they end up passing. Um, yeah, this is cool. This is really cool. Okay, so it's great to know that Runway is doing that well on the keying front, because then you can go dark, right? This is what makes, I mean, the Batman, you know, someday I want to see it.
I've seen the trailers and clips, and I just don't see anything bad. But [00:46:00] this is what I wanted to see green screen and blue screen work with: this really high-contrast look. Oh,
**BadBeetle:** here's something interesting. So every, uh, all of our, or our green screen is based, I mean, everything in the background is a green screen that we just lit a different color.
**Eliot:** Oh, really?
**BadBeetle:** Yeah. So it's like, and we actually figured that out. Um, I think it was, uh, I think it was last night. We were like, cause we were trying to figure out instead of having green spill. We wanted to have a different color spill on the back of the jacket, you know, or the back of the character. So then what we did is we started adjusting the lights to different colors.
We can make any color, you know, and it makes it easier to key out.
**Eliot:** Yeah. Cause then the color edge contamination is the color you want.
**BadBeetle:** Yeah.
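The keying idea underneath relighting the screen to any color is that the matte comes from distance to the screen color, whatever that color is. A toy chroma-distance sketch (thresholds and colors are invented for illustration; RunwayML and production keyers do something far more sophisticated):

```python
# Chroma-distance key: alpha ramps from 0 (pure screen color) to 1
# (far from it), so the same code works for green, blue, or whatever
# color the screen was relit to.
def chroma_alpha(pixel, screen, inner=0.15, outer=0.4):
    dist = sum((p - s) ** 2 for p, s in zip(pixel, screen)) ** 0.5
    if dist <= inner:
        return 0.0          # clearly the screen: fully transparent
    if dist >= outer:
        return 1.0          # clearly the subject: fully opaque
    return (dist - inner) / (outer - inner)  # soft edge in between

screen = (0.1, 0.8, 0.2)                      # whatever the screen was lit to
print(chroma_alpha(screen, screen))           # → 0.0
print(chroma_alpha((0.9, 0.1, 0.1), screen))  # → 1.0
```

Since the key color is a parameter rather than hard-coded green, any spill left on the subject's edges is already the color you chose to light with, which is the point Eliot is making.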
**Eliot:** You have, you have. You have now replicated the sand screen process that they used on Dune, which is, I believe, 180 million. Right? See, that's the crazy
**BadBeetle:** thing. When you start looking at, like, wow, [00:47:00] you know, we're able to do this with nothing, you know?
I mean, obviously, I mean, let's say, like, for this, uh, the stagecoach scene. We were, uh, I mean, we, we don't have a lot of props at our house, but, um, we had a hat and I was like, okay, well, where's the hat? And it was like, I think it was already midnight and most of our stuff's in storage and stuff. So I was like, oh, let's just do it without a hat, you know?
And, uh, then, um, it was suggested that we, uh, as funny as it is, we're going to put the iPhone on my head, right. And then track the, you know, or track the, you know, the head movement to put a hat into the scene. You know what I mean? But then we're like, ah, you know, we'll do that. We'll do something like that for, for the next shot.
You know,
**Eliot:** That gets super tricky. Yeah. Super tricky fast. I mean, what's great about this is that you're using real light and real people and real, you know, shadow and all this kind of stuff, because all these beautiful glints you get on the other side of your face and your nose and the side of the leather jacket and stuff like that, that's, you know, that's [00:48:00] reality.
And it's, it's so hard. CG works great for a lot of things, and I think it's the go-to for backgrounds where it's out of focus, and that'd be the hard part to get. But you can't beat a leather jacket at being a leather jacket. Yeah, it's real. They do that really, really well. Oh, this is cool.
This is, uh, well, I mean, have you showed any of these shots to, I guess, the groups that you're looking at talking to, to get some approval or whatnot, or,
**BadBeetle:** Well, right now we've actually finished the script. Um, the problem that we have with DC is that they want everything specific to what they want, you know what I mean?
So, I mean, they gave us basically a list of certain criteria and stuff, and we're trying to adjust the script to that, you know. But even so, we're starting to look at it a little bit differently. I mean, now that we're [00:49:00] getting more advanced with Jet Set, I mean, we could do, I mean, it's funny.
We've been looking at it and we're like, I mean, we're looking at doing the Batman fan film, cause we're Batman fans, obviously. And we also feel that having an already established base will showcase exactly, you know, what Jet Set can do. But at the same time, we can do anything, you know, if we had the right script. I mean, all these shots that we've been doing, it's just kind of like, we're sitting there at midnight, you know, at the house, and we're like, okay, what should we do?
You know? And we just go, we just go look for a model. You know what I mean? We're like, okay, so let's do this. You know, and so then we just start doing it, you know, and but if we had a script and we already knew this is the shots that we want, you know, it's like we can make. I mean, like, you know, and not a full fledged movie in one night, but I'm pretty sure we can do a pretty decent short.
And so like a lot of the shots that we're doing, we're just focusing more on the cinematics of the, of the actor, because we [00:50:00] already know that jet set can do, you know, the walking scenes or, you know, the, you know, the wide shots. We know that it can handle all that, you know, we're looking at the, or we're looking for, um, pushing the limits of.
How can we, you know, let's say with the rack focus thing, being that when he brought that up the other day, I was like, okay, so how can we do the rack focus?
**Eliot:** And
**BadBeetle:** I was like, okay, we can do this, you know? And my partner was just like, oh yeah, this is, this is going to be easy,
**Eliot:** you know,
**BadBeetle:** and, and then, I mean, it would be easy, easier if we had two people, obviously, you know, but being that we only had one person, it's like I had to, you know, stop and then, you know, do the other shot and stuff.
And, but at least we understand the concept of. How we need to do it when it comes time for us to do it, you know,
**Eliot:** Let me show you something that's coming up. This is one of these R&D breakthroughs that I think is going to be a big deal. I've been looking for this for a while, and it looks like it's finally coming out here.
Uh, let me find this, [00:51:00] uh, let's see, where is this? Cause it is direct AI-based generation of 3D Gaussian splats. And it's really, uh, kind of, um, yeah, VistaDream. There we go. We'll pull this up. Um, and the thing is, you know, Jet Set is very good for the 3D part, and the 3D part is always kind of the pain in the neck to get built.
Let me share my screen. There we go. Okay. There it is. Okay. So this is, um, you give it a source image, right? Give it an input image. Um, and then you tell it to actually generate multiple viewpoints from that image and then build that into a 3D Gaussian splat. And so you end up with [00:52:00] something that, it's a little bit hard to see.
It's probably kind of small on the page, but there's a decent amount of 3D information in that. Like, there's enough there that you could shoot shots with it. And we can read Gaussian splats directly into Jet Set. You probably have to take them into Blender to get them scaled correctly, but we have a tutorial for that. So there's that, and we're also building a tool to bring in, yeah, like, all the Megascans became free.
But if you try to get them into Blender, then you discover that they only have it in this kind of weird raw-format download where you have to sit there and put your own materials together, or your own 3D things together, which doesn't make sense. So we're building a script for that.
So we'll have that in another week or so. Okay,
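The pain point here is that a raw Megascans download is just a folder of texture files. The script Eliot mentions wasn't shown; the core of any such importer, though, is pairing the map filenames with shader inputs. A hypothetical sketch of that pairing step (the filenames, keywords, and slot names are illustrative; in Blender you would then feed the result into a Principled BSDF via `bpy`):

```python
# Map raw Megascans-style texture filenames to shader slots by the
# keyword in the filename (albedo, normal, roughness, ...).
SLOT_KEYWORDS = {
    "albedo": "Base Color",
    "normal": "Normal",
    "roughness": "Roughness",
    "displacement": "Displacement",
}

def assign_texture_slots(filenames):
    """Return {shader_slot: filename} for the maps we recognize."""
    slots = {}
    for name in filenames:
        stem = name.lower()
        for keyword, slot in SLOT_KEYWORDS.items():
            if keyword in stem:
                slots[slot] = name
                break  # one slot per file
    return slots

files = ["rock_albedo.jpg", "rock_normal.exr", "rock_roughness.jpg"]
print(assign_texture_slots(files))
```

The real importer would additionally create image nodes and wire them into the material, but the filename-to-slot mapping is the part that the "weird raw format" forces you to automate.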
**BadBeetle:** that's good. Because we actually had that problem. We were looking for models.
**Eliot:** Ah,
**BadBeetle:** Yeah. Oh, we were on Sketchfab.
**Eliot:** Yeah. Yeah. It's, it's, uh, that with the, with the script, it'll work great. Like we, you know, then, then, then all the materials and stuff were correctly applied in Blender.
And it's not that big of a script. It's gonna be a few lines. It's just, they didn't make it easy. And yeah, in 2025 they're gonna have a direct link to Fab for, uh, for Blender stuff, but they'll probably start charging for it anyway. Um, yeah. But in the meantime, we can do a script, and then we can actually get models.
But I, I think. I, what I think is, and we'll, we'll see. And I'm curious if you guys, as we go through, we're trying to figure out what's the future, what should we be building toward and. I think shooting with, with real actors and digital sets is the way to go. Instead of a pure AI approach, I've tried it.
I've messed around with, you know, all the different things and prompting makes, makes me crazy after a while. Cause you just can't get it to do what you would just naturally do with a camera and an actor. Like you just put the camera here and the actor there and you're like, okay, next shot, I want you to get up and go over here and then like, turn around and like, you know, shoot the, you know, shoot the cameras out at whatever.
Um, and, uh, and you know, your actor can usually do that. Whereas AI, it does everything, but what you want it to do. And it's really good for, I think, making assets, right? Like this, like a, a [00:54:00] chunk of a 3d scene that you're then going to go back, pull that into 3d, dress it up with a bunch of additional 3d, like procedural data and models and stuff to kind of bring it out.
So you're not building everything from scratch. Um, because otherwise in 3D you're going to end up spending a lot of time, like, oh, do I have the right material for this corner of the panel? Right. Yeah. And, like, days go by. Uh, and I think AI will handle a bunch of the sort of, I don't know, rough brush strokes of it, but then you get into 3D and you can actually really, you know, again, as your project gets bigger, you go farther into 3D.
Um, but I think we're still going to be compositing for a long time. I think we're still going to be putting things together in 3d for a long time. Oh yeah,
**BadBeetle:** definitely.
**Eliot:** So that's, that's just my guess. That's uh, but I, this, this piece of it where we can tell the AI system to generate a chunk, a little corner of it that we can build upon.
Is appealing because then, then you'll go, okay, now we've got a little corner and we find a little piece of reality to shoot in and, and, you know, exactly the same way you're doing, you've got lights, you've got to [00:55:00] get a person, you got, you got, you know, all the things that make it look like a movie, uh, they are under your control.
And those are not, those are not usually that hard to get, you know, the giant stagecoach background, that's a little harder to get.
**BadBeetle:** Well, to find it, and then, you know, have to source it, and then, you know, pay for it, and
**Eliot:** then it's
**BadBeetle:** so much easier doing it this way, you know, and I mean, when you look at one of the biggest costs that a lot of these productions have, or at least for the independent filmmakers, it's mostly locations and permits,
**Eliot:** you
**BadBeetle:** know, cause most people will work for free, you know, or, or minimal, you know, amounts and stuff.
And then when you look at it, let's say, like, even the Blair Witch Project, uh, I think they spent like $20,000, you know, just with permits and stuff.
**Eliot:** Yeah, you know,
**BadBeetle:** and you know, that's what we're looking at. Like, this is revolutionary. I mean, I know I keep saying Jet Set's revolutionary, but it really is. I mean, being that we've been in the industry and seen it.
[00:56:00] Like, to do what we have done, let's say, even with what I shared today, it would have taken, I mean, on set we would have had a minimum of 10 people, you know. And when I say minimum, I mean, normally it'd be like, you know, 20 or 30 people, and most of them are sitting around, you know, everybody's just there to make sure of stuff, and, you know, and post tracking shots.
Yeah. And
**Eliot:** then it doesn't work. The guy's in the, in the, in the room for three days trying to get the shot to track. Yeah. You know,
**BadBeetle:** where's Larry? We haven't seen him for three days. Yeah. You know, tracking and, you know, and see being a, being able to do that. And what's cool about it too, is that we've been, I mean, we can replicate shots.
I mean, I mean, let's say, I think we did. Five takes each of each individual shot, and we were able to do it back to back to back to back to back, you know, we weren't we weren't sitting there like, okay, let's stop. Let's rest. You know, it was like more like, okay, let's do that again. Let's do it again. Let's do it again.
You know, and it was just made it [00:57:00] so much so much faster and streamlined. It's almost spoiling us to be honest with you. Right? So we're looking at it and we're like, this is so easy. We're like, Okay. Okay, every night, you know, after shooting, you know, you know, on sets and stuff like that, we're coming home and we're gonna do our own stuff, you know, right, you know, just because we can, you know, and, you know, because we want to do, I mean, well, let me ask you this, Elliot, what other type of tests would you like to see?
**Eliot:** You know, with all this stuff, um, we always want to see what people are trying to do, right? Yeah. Like, what's best for people's projects. Then we hammer on the tech to kind of get it to go in that direction. Um, and you know, this is already cool to see. I mean, it would be like, what do you want to do with the script?
Where do you want to take it? That tells us what we need to engineer, to build, to match up with that. I mean, uh, [00:58:00] vehicle shots are one of these things I think will be really cool when people realize you can do them, right? The camera operator has to act the camera, right?
Cause there's, when you're shooting a vehicle shot, um, you know, right now we have, we lock the, the scene locator just to like the person's seat or something like that. And if you, if you don't act the camera, then, you know, it looks a little bit artificially constrained, sort of like the old school where you just bolted the camera on the, on the hood and the background, you know, and then nothing is moving except the background.
It looks a little bit weird. What looks more modern is where you're running a stabilized camera head next to the car, so the camera head is oscillating slightly next to the car. Um, but, you know, the car is staying relatively level while the camera head is moving, and that's what, you know, the Edge cars and stuff like that do.
And the ones with a big hydraulic crane on them, a stabilized crane, it looks like, you know, the camera's doing this. And so I think you could actually do pretty sweet driving shots. It's just that the camera operator needs to have in his head, I'm an Edge camera, [00:59:00] you know, or, I'm a stabilized camera.
So I'm, I'm. And they'll be able to see it, see the comps. So they get a, get a feel of it. So very, very helpful if people are used to shooting on real stuff, so they know what it, know what it looks like when they're, when they got it.
**BadBeetle:** Yeah. So by the next, uh, office hours, we're probably going to have a pretty decent, uh, car shot for you.
Challenge accepted. Yeah. I mean, is there anything, uh, is there anything else that you would like to see? Like. Because I mean, we most of the time when we're on set, I mean, let's say car shots are big, like you said, but it's also being able to track. I mean, that's the biggest thing is like, because it encompasses everything.
And it's more of trying to think about, okay, well, let's say with the focus rack thing: the focus rack thing, or at least our idea of it, was something where somebody brought us an issue that's outside of tracking. You know what I mean? Or at least something that Jet Set is not normally, [01:00:00] you know, or currently, set up for. So, like, I mean, that's kind of what we're looking at right now: trying to, at least on our end, perfect certain, you know, certain situations.
So, let's say, like, right now, when you were mentioning about the, you know, About the car or about the vehicle. I mean, we're going to do a vehicle now. I mean, because we were kind of like, because I mean, like I said, at midnight, we're just sitting there thinking like, well, what can we do? You know what I mean?
And, you know, because we're trying to foresee a lot of things, but yet we're trying to foresee issues that we might have in the future. You know what I mean? But the fact that we know we can do this, this and this, you know, that's just so simple. I mean, one of the things that we, you know, Have been focusing on, and it's more so because we come from a different, um, you know, or we come from a little bit of experience.
I'm not gonna say we're very experienced, but, um, it's kind of like the magic of VFX. And so a lot of the times what we see is that people focus too much on the VFX part. Mm-hmm. You know, to where, when the audience is looking at it, they're like, okay, this is not real, so then everything else isn't real.
Right. You know, and so what we are looking at is the stuff that we're doing is more of creating that illusion that, okay, well obviously like when you watch it, uh, that shot the other day, you were like, okay, well what is real? You know what I mean? Like is we know the guy is real, but then is the background real?
It's, you know, it's gotta be something, you know, it's kinda like you're looking at it and knowing it's a magic trick. So we want to make it to where the audience doesn't even know it's a magic trick. They're just like, okay, they're really there, you know? Right.
**Eliot:** And
**BadBeetle:** so like, let's say like with the shot of, you know, the stage coach is like, we focus, I mean, the, the audience's attention is on, you know, the person that they don't really realize that this person is not really there, you know?
So that's like what a lot of our, our filmmaking, um, um, direction is, you know, is trying to make, I guess you [01:02:00] could say practical VFX. You know, to where, I mean, I, I think that's the best way to explain it instead of just saying, okay, we're on a spaceship or flying through space. Obviously people know you can't, you know, you're not gonna be flying through space, but if we were to do a spaceship, we would make it to where the spaceship is, you know, like people will look at it and think we actually built it.
**Eliot:** Right.
**BadBeetle:** You know, and that's the, that's the thing that I think that a lot of, um, uh, people that we've seen, and I'm not knocking anybody else's work, but I mean, a lot of the time when, as soon as you see it, you're like, okay, that's already fake. You know, because they focus on the fake before introducing the real.
Does that make sense to you?
**Eliot:** Oh yeah. Yeah. This is one of the areas that we're, uh, in the process of showing some of the pieces of. Um, I mean, the thing is, you know, most of our examples so far have been on a green screen kind of stage or something like that. And in fact, you know, it's a natural-feature tracker, so it actually tracks better outside, right, where there's a lot more features. And so one of [01:03:00] the workflows that we think is going to become very common... And the other key thing is that you can basically run on batteries, right? You don't need a generator truck to run this stuff. Like, you need a couple of batteries to run the iPhone and your camera.
Um, and if you need a laptop or something, one of the modern Mac laptops lasts for like 16 hours, something crazy, on battery power, so you don't need cables, right? And so one of the things that I think we're going to see people doing is picking out a little bit of an exterior, you know, like walking your origin, doing a quick scan of that.
So, you see, you've gotta have an idea of where your geo is. Um, and then shooting just the shot where you have the actor touching the parts, where they're running down the street. You know, again, it's easier to just borrow a little itty-bitty piece of street and then CG in around them. Um, but have the parts where they're in contact be real, you know, cause then, again, you can do this with like two people.
You know, instead of having the whole crew and this and that and the other, and the setups and the carts, and then people start asking you about this, that, and the other. You know, like two people, man, or three people, and there's no carts and stuff like that. If you don't care, right, it doesn't become a thing.
And then you can do stuff. I was testing night tracking shots and it was still tracking; it was pretty dark. Um, I think, you know, when it gets really dark, then yeah, it's not going to track, cause it needs some visual features. But tracking under a street light or something like that, that still works; there's enough light there to make the tracking work. Um, so that'll be an interesting, you know, and we need to demonstrate that and build out a workflow and show it, but it's, uh, it's again another one of these transformative things where, you know, like,
Configuring a whole street, that's hard, right, to get that to be show ready. You just need this little piece where the person's walking along, and you're gonna, you know, you can background replace a bunch of the things and not have to, you know, hide the non period cars and all this kind of stuff. You're just gonna, you're gonna map that out later.
That's a lot easier. Um, and then you can still have this nice, nicely tracked, [01:05:00] tracked world that you're operating in. So that's, that's one of the things that we're, we're interested in. Uh, we haven't, haven't built it out yet. I have to demonstrate all the pieces of it, but, um.
**BadBeetle:** All right, challenge, another challenge accepted.
**Eliot:** So, but very quickly, what it comes down to, and what'll probably happen, and I'll make a prediction: as you're going through this, as you're trying these things, it's going to change your script. You're going to realize, oh, wait, there's all these things we didn't think we could do.
And now we, now we think we can do it, you know, to where you start looking at shots from the Batman and you go, okay, here we go. Well, you
**BadBeetle:** know, what's funny is that we have a guy, um, I think I mentioned it before, um, he wants to refilm, he made a series and he wants to refilm the whole thing. It hasn't been released yet.
And as soon as we started explaining to him, you know, Jet Set and everything, he's like, oh yeah. Cause, see, most of the time, or at least for this writer, he would write knowing, okay, I can get this location, I can get that [01:06:00] location. So his imagination is limited to his locations,
**Eliot:** right?
**BadBeetle:** So we told him, basically, hey, look, write it as if you can be anywhere in the world. You know what I mean? Anywhere, in any situation. Let us know, and then we'll make it happen. So he was like, okay, well, I want to have a post-apocalyptic, uh, SoFi Stadium. You know, like, I'm going to be walking out of it.
He's like, how hard would that be? I was like, oh, done. You know, like, give us something hard to do. That's actually something very simple, because all we need to do is... I mean, we can just go to SoFi Stadium, 3D scan it, you know, during the, um, the time, or the light, that he would like it to be.
And then we just film him on the green screen outdoors in that same light, you know, and put it all together, and he's like, wow.
**Eliot:** I'll show you something that came up, because this is... I may have showed this to you before, I can never remember, [01:07:00] I've shown it to a bunch of people. But this is some very recent work in 3D scanning.
So this is a Lixel scanner, a walking lidar scanner. It's got both, um, a spinning lidar, like a laser detection system, on it, and it also has multiple cameras going around it. And the thing that I realized from this, I didn't understand this at first, is that some of these shots are just like the point cloud.
But it can actually also do direct Gaussian splats from that scanner. Um, and we can load Gaussian splats right into Jet Set. And you can start to load them into 3D programs as well. So it becomes a way you can actually capture fairly large chunks of, um, exterior environments that are normally pretty hard to capture.
Um, and then start to work with it. So there's all these things that are happening really, really fast, to, uh, make this workable.
**BadBeetle:** Oh, that's exciting. I can't wait to see it.
**Eliot:** [01:08:00] And as soon as we get a version of the script that can auto-import the, um, Megascans stuff in Blender, I'll integrate that and get that up. Cause that's awesome. I was just running into that for another project, and I went, ah, there's a way to solve this. We did one version where we just exported everything from Unreal and used Omniverse, et cetera. But when I tried that on the more recent stuff, there were just too many translations going across, and the textures and the UVs were getting broken.
Um, so I went, okay, we'll just go from the FBX and reconstruct from the raw texture files. And then it was behaving much more reliably when I tested it that way manually. So
**BadBeetle:** Actually, we did have that same problem. Um, I didn't really want to bring it up, but I think it's more on our end than it is on Lightcraft or Jetset. Um, when we imported the model... like, [01:09:00] okay, so in one of the scenes that we did, which I can't show cause it's, you know, under NDA, um, we actually had a car, you know, on a street, with the background, with the lights and everything. And then when we imported it into Jet Set, the only thing that had textures was the car. Everything else was, you know, just without textures.
**Eliot:** Where was the background? Where did that originate from? Where did the background data come from? I think it came from Sketchfab. Sketchfab. Okay. Yeah. So here's what's going on: in Blender, uh, Blender has a bunch of different ways of handling materials.
It has a full material system, like a very complete, complex thing, you know, you can do anything with that stuff. Um, but when we export to USD and then we go into Jet Set, it is wildly cut down, because the phone can only understand [01:10:00] a very basic, simplified, um, PBR system.
Like, it can understand albedo and normal and roughness textures, and that's about it. Um, and it only understands them if they're set up in Blender in a, honestly, a fairly simple way, and anything beyond that it just doesn't handle very well.
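The texture-channel limit Eliot describes can be sketched as a pre-export check. This is a minimal illustration, assuming a material is described as a dict of shader-input names to texture paths; the `check_material` helper and the allowed-channel list are assumptions, not Lightcraft's actual exporter logic:

```python
# Hypothetical pre-flight check for the simplified phone PBR described
# above: only albedo (Base Color), roughness, and normal textures wired
# simply into a Principled BSDF survive the export. The channel names
# and this helper are illustrative, not Lightcraft's actual exporter.
ALLOWED_CHANNELS = {"Base Color", "Roughness", "Normal"}

def check_material(name, texture_channels):
    """texture_channels maps a shader input name to a texture path.
    Returns warnings for channels the phone renderer will drop."""
    return [
        f"{name}: '{channel}' texture will be dropped on export"
        for channel in texture_channels
        if channel not in ALLOWED_CHANNELS
    ]

# Example: a Sketchfab-style material with extra maps the phone ignores.
car_paint = {
    "Base Color": "car_albedo.png",
    "Roughness": "car_rough.png",
    "Metallic": "car_metal.png",     # unsupported: gets flagged
    "Subsurface": "car_sss.png",     # unsupported: gets flagged
}
for warning in check_material("CarPaint", car_paint):
    print(warning)
```

Running something like this over a scene before export would explain the "car has textures, background doesn't" symptom above: materials that happen to use only the simple channels come through, the rest lose their maps.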
So one of the things we're doing, uh, because of that, is we're implementing a Gaussian splat workflow. Uh, so in Blender, you'll be able to pick out an area that you want to generate a Gaussian splat from, hit a couple buttons, and it's a fairly automated process, you know. It's going to take about a half hour for everything to cook through.
Um, but then it works with RealityCapture, which you can download from the Unreal site, and it works with PostShot, which is currently free. So both of these are currently free tools. If you just use them [01:11:00] by themselves, RealityCapture is gnarly, so we wrote an automated script.
You have to get everything just right with coordinate systems and stuff, so we scripted all of that out to just set the metadata. So you don't have to go through the two weeks it took us, pickaxe and shovel, going through that, oh geez. So then it's automated, and it'll generate a Gaussian splat that can then load into the phone.
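The step order of that automated script (render views from Blender, solve in RealityCapture, train the splat in PostShot) can be sketched as a dry-run command planner. Every executable name and flag below is a placeholder, since neither tool's real command line is shown here; the function only plans the commands, it doesn't execute anything:

```python
# Sketch of the pipeline step order described above. All executable
# names and flags are PLACEHOLDERS, not the real CLIs of RealityCapture
# or PostShot; this is a dry run that just builds the command strings.
def plan_splat_pipeline(project_dir):
    steps = [
        ("blender", ["--background", "--python", "render_views.py",
                     "--", project_dir]),
        ("RealityCapture", ["-load", f"{project_dir}/views",
                            "-align", "-exportRegistration"]),
        ("postshot", ["train", f"{project_dir}/registration",
                      "--out", f"{project_dir}/scene.splat"]),
    ]
    return [f"{exe} {' '.join(args)}" for exe, args in steps]

for cmd in plan_splat_pipeline("/takes/stadium_scan"):
    print(cmd)
```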
And it looks like your original Blender model, lit, rendered correctly, everything. Now, there are limitations. Like, it won't handle a super long city street, but it'll handle a room really, really well. Um, if you have a really large environment, then, yeah, we don't have a great answer for that yet.
Um, it's like, you're gonna have to use the geometry as reference in Jet Set and then push the data back into Blender, so you can actually see. But what you're writing to is very real. It's just that geometry can transfer pretty cleanly from one app to another, yeah, in pretty much a standard form. Materials? Not standard, not standard at all. [01:12:00]
**BadBeetle:** See, for us, it wasn't really, uh, it wasn't like detrimental, but it was more like, when all of a sudden the car showed up, we're like, whoa, look at the car. And then we couldn't see the background. We're like, wait a minute, why is the background not showing?
You know? Yeah. But, um, here's a quick question before I let you go. Let me see what time it is; I think we're almost about time. But, um, as far as the scene locator... um, the marker, right. You know, when you print out the marker, the little floor marker. Yeah, sorry if I don't get the terms right, cause, I mean, I've been up and then went to sleep for like a few hours.
Um, so with that... okay, so you know how you guys have released two different versions of it, like a small version and the big version? So, let's say if we increased the size of it and printed it out, would that increase the size of the model that comes out?
**Eliot:** No, no, the, [01:13:00] um, the origin locators have to be a specific size. It's either the small one or the big one, and the small one needs to be printed on normal A4 paper, you know, letter-size paper, and the big one needs to be printed on 18 by 24. They're sized specifically, um, so they get recognized correctly.
Okay, cause otherwise the iPhone just doesn't. It's looking for a particular image at a particular size. If it's not that size, it goes, I don't know what this is. Um, if it is that size, it says, hey, that's the image we're looking for.
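That recognition rule (a known image at a known physical size, anything else ignored) can be sketched like this. The widths and tolerance are illustrative guesses, not Jetset's actual marker dimensions:

```python
# Sketch of the size rule: the tracker looks for a known image at a
# known physical width, and anything else is ignored. The widths and
# tolerance here are illustrative guesses, not Jetset's actual values.
KNOWN_MARKERS = {
    "small": 0.216,  # letter-size sheet width in meters (8.5 in)
    "large": 0.457,  # 18-inch side of an 18x24 print, in meters
}
TOLERANCE_M = 0.01   # assumed slack in the detected physical width

def identify_marker(detected_width_m):
    """Return the marker name if the detected width matches a known
    size, else None ("I don't know what this is")."""
    for name, width in KNOWN_MARKERS.items():
        if abs(detected_width_m - width) <= TOLERANCE_M:
            return name
    return None

print(identify_marker(0.455))  # within tolerance of the large marker
print(identify_marker(0.60))   # a scaled-up print: not recognized
```

This is why scaling the printout up or down breaks tracking: the detected physical width no longer matches any registered size.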
**BadBeetle:** Okay. Cause what we were going to do is we were going to 3D print, uh, kind of like the vertical marker and the floor marker, and I keep saying sensor, but you know what I mean, the marker, um, into one little object that we can actually move around. You know, just cause the paper is kind of... I mean, we put it in a protective thing, but at least it looks a little bit more professional to be able to put a cube on the floor and say, this is our, you [01:14:00] know, our location. Um, we were wondering if that would actually work, being that we could 3D print it at the same scale. I mean, well, what do you think?
**Eliot:** I mean, probably the easy way to do this... the small markers, the problem with them is that you have to get the camera really close for it to recognize them. So that's why we went to the larger markers.
**BadBeetle:** And we had a real problem with that.
**Eliot:** Yeah. So the larger markers are designed to be printed, and we just looked at Canva, canva.com, and you can get a standard 18 by 24 inch print for like, you know, 30 bucks or whatever. Upload it, hit a button, and it shows up, and I went, okay, that's probably the right solution, um, to do that. I mean, and if you want to 3D print a little 18 by 24 thing, that's fine too, and just mount it to it.
That'll work fine. The vertical marker is designed to be mounted on a stand or something like that, separate from the floor marker, because that's for cases where you can't tilt the camera down, where you're on a, you know, [01:15:00] big jib or dolly or whatever, and the camera's only got 10 degrees of tilt.
**BadBeetle:** Yeah. Yep, that's what we were thinking of, just 3D printing something where we can actually adjust the height of the vertical marker, you know, and then the floor marker just stays on the floor. So it's kind of like a tripod-type deal that we can just kind of move around and take to, you know, locations.
It makes it a little bit easier, and if we need to adjust it, it's easier than having the paper on the floor, you know. And I don't know, that's just something we're going to do on our own. We'll probably show it to you, you know, if it works, you know,
**Eliot:** Yeah, yeah. I'd say whatever you do, you'd probably want to have it so that the camera only sees the vertical or the horizontal at one time.
Yeah, because the design is, it only sees one at a time. If it sees two at one time, I don't actually know what it would do. I've never thought about that before. Don't cross the streams.
That would be interesting. Hmm.
**BadBeetle:** [01:16:00] Yes, it just blows up, I'm like, whoops. But, um, other than that, I mean, everything has been going good. The only thing that we've had issues with is... oh, this is what I wanted to ask you too. So in the compositing setting, we put it at, uh, 540, I believe, you know, and is that just affecting the 3D model?
**Eliot:** What that affects is, yeah, the size at which we render the 3D model, and also the resolution at which we composite the pixels. Um, so if you set it at 1080 HD, we're going to both render the 3D model and composite it at 1080p. If you set it at 540p, then we render and composite at 540p.
And it comes into play where there's an interplay between how heavy your scene is, um, the resolution you're shooting at, and how beefy your phone GPU [01:17:00] is. So usually for the compositing, we just recommend dialing it down to 540 or 720 or something.
Just so you're keeping the phone's power for the part you really want it for, which is recording stable frames.
**BadBeetle:** Okay. Yeah, cause that's what we actually did, and all of a sudden it seemed like the phone wasn't even overheating, and I was like, whoa. But then I started wondering, okay, well, what are we affecting?
What are we losing by doing that? You know? I mean, really nothing, now that you explain it to us. So it actually works out better for us.
**Eliot:** Yeah, the camera recording is your source image. Are you on Jetset Cine? Okay. Yeah. So the only thing that camera original gets used for is for matching, you know, the framing and stuff like that.
So you can leave that at, I don't know, 720, 1080, whatever. Um, and it's, you know, in a few more years, when the phones have a ton of horsepower... they're already getting a lot better, but yeah, right now we're running a little bit on the [01:18:00] edge, so you have to be a little bit careful. For example, shooting on the phone at
2160p, most phones can't take it. They get too hot and they start to drop frames. Uh, so we just recommend usually leaving it down at 1080 or whatever. Um,
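The arithmetic behind that recommendation is simple: at a fixed 16:9 aspect, the pixels the phone has to render and composite scale with the square of the vertical resolution, so 540p is a quarter of 1080p's work and a sixteenth of 2160p's. A quick sketch:

```python
# Why dropping the comp resolution helps so much: at a fixed 16:9
# aspect, the pixel count (render plus composite work per frame)
# scales with the square of the vertical resolution.
def pixels(height):
    width = height * 16 // 9   # 16:9 frame width for this height
    return width * height

for h in (2160, 1080, 720, 540):
    ratio = pixels(h) / pixels(540)
    print(f"{h}p: {pixels(h):>8} px, {ratio:.1f}x the work of 540p")
```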
**BadBeetle:** Oh, so even on the iPhone, uh, when it says camera settings, you dial that down too?
**Eliot:** Yeah, we usually... um, look on your recording setting.
You have a choice of what resolution to record at and what resolution to run your comp at. And usually we just run the recording at like 1080p, 720p, whatever. Cause for Jetset Cine, you're not going to use it, you know. That's just reference; you're not using that information in your final image.
You're using the cine camera's information for your final image. And that way you sort of save phone power for the things you care about.
**BadBeetle:** You know what's funny, Eliot, is, like, sometimes when we're changing settings, I'm like, I don't want to change this, cause this might affect something else. You know what I mean?
It might affect, uh, you know, the calibration or something. It's like, I don't want to take the chance, you know. And now that you're saying that, [01:19:00] it makes so much sense. Cause once we put the, the, what do you call it... oh, not the calibration. What do they call it?
**Eliot:** Oh, the origin markers.
**BadBeetle:** Not the origin markers. Um, the compositing. When we put the compositing down to 540, you know, it just seemed like the phone was running smoothly. And see, here's the interesting thing: we're going minimalist, right? So all we have is the iPhone, we have the, uh, the SeeMo, and the camera.
**Eliot:** that's,
**BadBeetle:** I mean, and on top of that, we actually have a 3d printed, uh, camera rig, you know, actually real quick.
Cause though, let me see if you can see it. Oh, let me get it. I don't know if you can see it. It's kind of blurring the background.
**Eliot:** Oh, there we go. I can see the outline. That's awesome.
**BadBeetle:** But yeah. Oh, sorry, I don't like the camera, it makes me look bad. [01:20:00] So, but yeah, we're going minimalist. And then what we found is that, and I mean, obviously you probably already know this, but, um, with the SeeMo, you don't even need the SeeMo plugged in after you do the calibration, you know, so
**Eliot:** It's useful to keep power into the camera and into the phone and into the cooler.
Um, but yeah, we mostly just run it off a battery source after the calibration. Eventually we want to actually take the live cine footage and composite it in the phone, so you actually see the comp in your viewfinder while shooting, but we're a little ways away from that.
**BadBeetle:** Well, I mean, the reason why we disconnect the SeeMo after we do the calibration is so we can see, um, you know, on the camera itself what we're actually filming. So it makes it a lot easier, because even though it does show you what you should be seeing on the phone,
we at least like to see it on the camera. It still makes a lot of
**Eliot:** sense.
**BadBeetle:** Yeah. You know, so, I mean, we actually did it by accident. Like, we were filming, [01:21:00] and then all of a sudden the cord just came out, and we're like, oh, okay, you know, it's still working, and we were still able to process it.
So we're like, okay. Cause, like I said, we're trying to do the minimalist thing, to where on a handheld it's going to be very light, you know, without all the extra stuff. We even tried, um, I think it was on Friday or something, we tried using, uh, the iPhone's hotspot, you know, to connect to the, um, what do they call it...
the digital slate. There you go. Right. So we used the iPhone hotspot to connect to the digital slate. The only issue we had was there was a delay. Like, I would hit the button and be like, okay, nothing's happening. And then finally it did happen, and it was more like a shock, you know, like, oh my gosh, it worked.
**Eliot:** Yeah, yeah. Well, for that, there's one of these things that we put in our production recommendations called a travel router, um, and these are great. It's like a hundred bucks, 110 bucks, something like that. And what it does is it makes a little [01:22:00] local wifi and, uh, you know, cabled network that sits underneath an existing network, but you have control over it.
So you can actually take it out on location. Like, you set up in your home or your office or something, you get everything working together, take it out on location, and just run a separate phone or something like that to it, to give it a hotspot. So that phone talks to the internet.
But then the router handles the traffic between all your local systems. Because otherwise, you know, you're trying to have your same phone do the router traffic, and it's only one phone, at a certain point.
**BadBeetle:** But I mean, here's the interesting thing: it did work, you know. Cause I was thinking, like, let's say we were in the middle of the desert trying to film, you know, and I'm not talking about a big production,
I'm talking about just a small production, you know, without having the router, without having everything. Cause the more equipment you bring, the more stuff you have to lug to where you want to be. But I mean, obviously a travel router's not nothing, but the fact that we were able to do it through the phone's, you know, um, [01:23:00] internet was really impressive, and still be able to film, still be able to produce what we were producing, you know?
And I mean, that just shows that, okay, well, in a minimalist situation, you can do a lot, you know,
**Eliot:** That's... I think there's a real thing to being able to operate without, you know, carts, basically. Like, as soon as you have a cart, it's a whole different shoot. If you don't have a cart and you can walk around with your stuff in a backpack,
you're pretty free-flowing, you can just kind of move. And, uh, we're actually just now implementing something where we can do an automatic timecode match if you have a Tentacle Sync on it, so we won't actually need the digital slate. Uh, it's still nice,
we really recommend it, because that way you know which take is which, it gives a visual match to takes. But we're going to have something where you don't need it, um, and it'll match with timecode.
**BadBeetle:** The only reason I like the digital slate is, um, when we're looking at footage. I mean, let's say we take 100 shots, [01:24:00]
**Eliot:** right?
**BadBeetle:** And out of the 100 shots, um, you know, they look through the 100 shots and say, I only want 10. So instead of us having to process all 100 shots, with the timecode it makes it so much easier, to where, when we look at the cine footage, we can actually see, okay, it's take so-and-so, and then we just find it.
It makes it so much easier than having to process every single one, you know, and then go through. And that's something that I think would be helpful too, um, batch processing.
**Eliot:** Oh, yeah. Well, okay. In the next build of AutoShot, we have the start of that. We can name shots.
Right now the shots are named by the take, which is, you know... what happens when you have three shots out of one take, right? So yeah, there's a problem. So we implemented it, and I need to do the tutorials and finish the testing of that. And that's halfway to where we need to be, which is timeline processing.
You've got like 50 shots racked up in your timeline in Resolve, hit a button, [01:25:00] it opens the timeline, exports, loads them into AutoShot, brrrr, you know, and it processes fifty shots in one go. That's what it has to be. Uh,
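The timecode-matching half of that batch flow can be sketched like this: pair each selected cine clip with the take whose timecode range overlaps it, then queue every pair in one run. The clip and take records here are made-up stand-ins, not AutoShot's actual data model:

```python
# Sketch of batch matching: pair each selected cine clip with the
# Jetset take whose timecode range overlaps it, then queue all pairs
# in one go. Records are illustrative stand-ins, not AutoShot's data.
def overlaps(a_start, a_end, b_start, b_end):
    return a_start < b_end and b_start < a_end

def match_clips_to_takes(clips, takes):
    jobs = []
    for clip in clips:
        for take in takes:
            if overlaps(clip["start"], clip["end"],
                        take["start"], take["end"]):
                jobs.append((clip["name"], take["name"]))
    return jobs

# Timecodes expressed as seconds-from-midnight for simplicity.
takes = [{"name": "take_001", "start": 100.0, "end": 160.0},
         {"name": "take_002", "start": 200.0, "end": 260.0}]
clips = [{"name": "A001_C003", "start": 110.0, "end": 150.0},
         {"name": "A001_C004", "start": 205.0, "end": 250.0}]
print(match_clips_to_takes(clips, takes))
```

With shared timecode (e.g. from a Tentacle Sync), the editor's 10 selects out of 100 takes resolve to exactly 10 processing jobs, with no manual hunting.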
**BadBeetle:** What's funny is, like, last night, I think it was like two or three o'clock in the morning,
and I'm like, man, how many more do I want to do? Just sit here and watch it go, you know? I'm like, oh my gosh, if only we could batch it. Because I was actually thinking about writing, um, you know, an automation that would actually do it for me. But then I was like, ah, you know, pretty soon...
eventually, you know, it's something that would be beneficial to the, you know, software.
**Eliot:** Everybody needs this. Because as soon as you start a project, what happens, and you're seeing it, is the shot counts explode. Anywhere near a real project, the shot counts explode.
And we want to build the systems so that just one person or two people can handle it when the shot count goes through the roof. It's guaranteed, you know, because people start understanding what's possible, and then, [01:26:00] to put it delicately, they go nuts.
**BadBeetle:** We are, you know. Like, we'll do like 50 shots, and now from the 50 shots we're like, okay, let's just take five out of it, you know. But the thing is, those 50 shots are like, oh, hey, we can do this.
Hey, we can do this, hey, we can do this, you know? And then when it comes down to, like, okay, well, what do we really need to do? You know? But like I said, it's just so mind-blowing, because the imagination of what you can do now is so vast, you know. I mean, because even right now, when we work on the volume, the volume is only 2D. This is actually giving us the parallax effect in front and behind, which is huge.
You know, and to be able to do that, and know where it actually sits, um, it just makes it so much easier. I mean, even with what, uh, Rick was talking about with the smoke and stuff. Normally, [01:27:00] you know, you just do a 2D smoke going over it or whatever.
Now you can actually do it in a 3D environment, have it go over the blocks and stuff like that, have it interact when the character's walking through, you know. Um, I mean, it's just so amazing. I mean, dude, I've been on this whole Jet Set kick for a while now.
You know, it's like, I tell everybody Jet Set, Jet Set, Jet Set. And one of the problems we have is, when you start dealing with the bigger productions and stuff, people are so set in the way they've been doing it. They're like, okay, well, we want to spend the $2,000 a day, you know, for all this crew, with these $10,000 machines and stuff. We want to do it that way because we know it works, you know?
And it's like, when you really break it all down, okay, well, I can do the same thing with Lightcraft, you know, and have it done before. Like, even with the client, the other shot that I showed you, I mean, he was blown away. He was like, oh my gosh, how did you guys do this in such a short period of time? This is so amazing, you know?
And [01:28:00] the thing is, he actually thought that we were really in a basement. He was like, okay, so where did you guys actually film it? You were in a basement, like, how'd you get... where'd the change come from? You know? And he's like, oh yeah, it was a Halloween store.
I was like, no, we filmed this. You just gave us an idea of what you wanted, and we put it together. I mean, obviously it's not the finished product, but it's enough to show you where we can be in less than an hour, you know? So you figure, if you give us four hours of just sitting there compositing, you know, cleaning up the green screen and the green spill or whatever, we can get it to what you need.
Yeah. You know, I mean, but the thing is, doing that same shot the traditional way would take us almost, let's say, three, four days to get to that same point, to where we could then start doing the compositing and all that, you know, but...
**Eliot:** Alright guys, I've got a 10:30. This is awesome. Um, super awesome stuff.
I cannot wait to see all the next things.
**BadBeetle:** [01:29:00] Yeah.
**Eliot:** Thank you, Elliot.
**BadBeetle:** Hey,
**Eliot:** have a good one. Thank you. Have a good one, Rick. Take
**BadBeetle:** care.
**Eliot:** Alright, bye. You too. Bye. Bye.