Transcript

# Office Hours 2024-10-18


[00:00:00] That's what... uh, oh, Elliot's here. All right. Sorry about that. I had to call it in, in a good way. I bet. Hello. I'm okay. All right. Hey, Ellie. We were just talking about, um, just building, uh, 3D, uh, environments, you know, kinda like what I was talking to you about, uh, the other day, Elliot, about how, uh, we built the Batmobile, or we had the Batmobile in the scene, and basically we drove up and the Batmobile was in front of the actor, mostly because of where the scene locator was located.


Or we built it in, I guess you could say, a 3D world that was based on the real world, so we knew exactly where it was going to be at, you know. But Ellie can probably explain it better than I can. I'm still... Yeah, absolutely. I was just asking him about, uh, we're doing something where we've got almost like a talk show environment, and we've got somebody [00:01:00] that wants to... uh, it's one of these comedians I'm working with who wants to sit behind the desk, look like Jay Leno or something like that. And I've got it working in the Unreal environment on the preview, but I was just curious what I needed to do to make the preview in Jetset show the desk in front of him.


Ah, yeah, there's, um... you can do a, uh... we have a switch in the settings, and that's, uh, enable Z depth on green screen. Okay. So then if you toggle that, what it does is it uses the onboard lidar on the system to do more or less a per-pixel depth, uh, version. So if the desk is located in 3D space, it'll put the person accurately in 3D space.


They walk behind the desk, they're behind the desk. So what was that called? I'm going to just write this down real quick. Hold on. Sure. It's, uh, it's in the settings and it's just, uh, enable Z depth on green screen. All right. Settings, enable Z depth. Yeah. Cause we're just starting to work with this set.
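As an aside on what that per-pixel depth switch is doing conceptually: the compositor compares a depth value for the live plate against the CG depth at each pixel and keeps whichever is closer to camera. A rough NumPy sketch of the idea, with made-up array names, illustrative only and not Jetset's actual implementation:

```python
import numpy as np

def depth_composite(live_rgb, live_depth, cg_rgb, cg_depth):
    """Per-pixel depth occlusion: keep whichever layer is nearer the camera.

    live_rgb / cg_rgb:     (H, W, 3) float color arrays
    live_depth / cg_depth: (H, W) distances from camera in meters
    (Illustrative only -- not Jetset's internal code.)
    """
    # True where the live-action pixel (e.g. the person) is in front of the CG
    live_in_front = live_depth < cg_depth
    return np.where(live_in_front[..., None], live_rgb, cg_rgb)

# Example: a person 2 m from camera walking behind a CG desk edge at 1.5 m;
# those pixels resolve to the desk, so the person reads as "behind" it.
```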


The client that [00:02:00] wants to do these shows in the set... and, uh, I got it working in the Unreal world. Um, I'm using your preview script. Oh, I actually got, uh, you know, we had talked about this last time I was on, I did get the DeckLink card to work, uh, genlock and all. Oh, great, great. Yeah. And it's more or less the same process you followed with the AJA card.


It's really not that different. You just, instead of it being an AJA plugin, you enable the Blackmagic plugins, but everything about it is the same, and it does work. So, fantastic. Uh, yeah, if anybody was curious about that, you could definitely use the DeckLink card for that live preview.


Um, we, we bought one and we're going to do our own like tutorial to walk, walk through it, through it. To, cause I, I, I tried doing this, uh, in one, in office hours with a, another customer, I think Joe and I ran into some quirks and I didn't quite understand what was going on. So I said, okay, we'll just do it.


Get one and fire it up. So, but yeah, I found another tutorial on YouTube [00:03:00] where... he was using a different tracking system, I think it's the Vive or something, but everything else he was doing was just about getting the genlock going. Uh, so I kind of followed what he did and I looked at it.


I was like, well, this is more or less what Ellie was doing with the AJA card. Can you put that link up? I'm just... so, uh, yeah, let me find this. It's a couple of years old, this tutorial, but he's doing it, um, with the, uh, with the DeckLink card. Here, let me just go to my history.


Uh, should be able to find it that way. Of course I watch way too much stuff. So it's going to be way down the list here. Oh, I think this is it right here. Yeah. Um, no, no, not this guy. This guy's not as good. Sorry. I watched a few of these videos. I got to find the right one. Um,


Is it before that one? [00:04:00] I guess I got to... here, I'll keep looking forward here. Oh, I'll type in genlock. It'll come up. So my history doesn't look right. It's in there.


Oh, okay. I think it's this one. Yes, this is the one. Okay, fantastic. Uh, so let me just put this in the chat.


Okay, there you go. All right. But yeah, it's about two years old, but, like I said, he's not using Jetset, but this one... yep, this is a good one. Yeah. So follow that. There's actually a couple of plugins he enables that I think are for the Vive that [00:05:00] you don't need, but other than that, everything corresponds.


And then when I finally got it working, I went back and watched yours on the AJA, and I was like, this is more or less the same process. So, yeah, I referenced his closely and modified it a little bit, sort of, so that you could see the causality of, when you build a blueprint like this, where it goes into, you know... cause it took me a bunch of times through before I could associate it.


What, you know, when you build a custom blueprint, where this goes into the Timed Data Monitor, how all the different pieces, uh, go back together. All right. Thank you. That's, uh, yeah, I was determined, because my system had that card and I said, I just need to get this. Yeah. Well, and I have another system, and we're...


We haven't purchased a card for it yet, but I prefer them to be kind of in parity, so it's easy to move projects back and forth with no real issues. Uh, so I was hoping to get that one working, so we could just buy another DeckLink for the other system. [00:06:00] Uh, and now we did, because we got it working. So, all right.


And so then, and then Jet set, the setting to turn on and I've, I've talked back and forth with Greg on whether we just should enable this by default.


And it's, uh, green screen depth.


We have your, uh, audio in. Oh, okay. Sorry. So that's the Z depth for green screen, right? That was the... okay. There, I see. Yeah. Green screen, green screen depth occlusion. Enable that in the settings, and, you know, Greg and I have talked back and forth as to whether we should just have that enabled by default.


I tend to think we should have it enabled by default. It can make scenes look a little bit weird when you, if you don't expect it. Cause you, you know, all of a sudden the, your floor will start to have, um, you'll see CG starting to poke through the floor. Um, but you [00:07:00] know, anyway, we'll, we'll, uh, we'll, I'll rethink that.


Yeah, I'll try that, because like I said... and the other big question I had is, right now, uh, if you're doing a cine shot and it's recording the, um, the temp comp, uh, inside the phone, uh, it doesn't appear to replicate the, uh, like the reticle for the lens. Is there a way to turn that on? Okay. Let me think.


So, and here's my reasoning why, because I'm trying to develop an offline workflow, meaning. Instead of rendering out the comps and doing all that to use something like those previews as a way to choose shots, uh, without having to make any kind of temp shot through a render or anything like that. But I wanted to have the correct framing, uh, because right now it's just the full frame of the phone.


So it doesn't actually show the reticle that comes up on the interface [00:08:00] that tells me, oh, this is what's coming, this is what my lens is actually seeing. So even if it just had the square on there, cause then I could just do a custom resize or override to get the framing correct, so that when I'm editing,


I can see a temp shot. So if that's not possible, I was going to try to use the Unreal preview and do like a take record and then just render them all out as temps, and then we'll do the finals in post, but that should be straightforward. I'll go check. I'll go check on that. I bet that's, uh... yeah, that's my only question, is if there's a way, two lines of code, when I'm doing the Cine workflow, if my, if the temps could


at least have the rectangle that defines what the view of the actual cine lens is. And I assume it would be even better if it was, you know... oh yeah... if it was already resized so that I had something that was a good representation of what my shot will look like in the final, that I could just bring those in and cover, even if they're [00:09:00] not the right frame, right.


With the Tentacle Sync, I think everything has correct timecode, or more or less correct timecode. Uh, so if that's the case, then replacing them with the final comps shouldn't be terribly difficult, because the final comps I would build inside of Resolve, which is where I'd be editing anyway, so I'd have all that timecode for my original, you know, camera masters in there.


Anyway, I don't mean to dictate all that, but I'm trying to develop an offline workflow, so I need to figure out the quickest way to come from camera to cut, you know. Yeah, yeah, that makes perfect sense. I'll go check on that. I bet that's going to be, like, a five-line fix, to just render the correct, uh, correct one.


So I'll check on that one. All right. Thank you. Can I just ask a follow-up question to this? Uh, when you've been... I mean, uh, my understanding is that what you are seeing on the phone, even with the frame of, uh, the intended or, you know, [00:10:00] appropriate, uh, cine cam view, it is still the feed from the phone.


It's not the cine cam feed. That's correct. That's correct. We don't yet, uh, take it in... we've taken the live cine feed for calibration, but we're not using it after that, uh, yet. So, Brett, you're good with, uh, the cropped clip? Yes, I just would like something... I mean, I'm just used to the professional television workflow where everybody's working with low-res offline files, uh, to cut.


And then even when you're doing visual effects, they're sending low-res temps over to cut in. But this generates a low-res temp for you when you're shooting right now, it just doesn't have the same framing as the cine, um, camera. So I would hate to be using those, and then someone gets the finals and realizes, oh, that framing is... I thought I was going to be able to see that building over there.


And [00:11:00] actually it's cut out of the shot, you know, things like that. Uh, that's the kind of thing that editors would want: something that's an accurate representation of what the actual framing of the shot is. Okay. So the framing of the CG element is what you're concerned about, not the cam feed. Not the cam feed so much.


I just want something that's a relatively close approximation of framing for the final shot. Uh, and you do get these temp files right now, they're just the full frame, so they're not accounting for the cine calibration. Um, and even if they could just print them with that reticle turned on, just so you know. Oh, okay, if I crop it, and as long as most of the shots are using the same lens, you should be able to just do an override.


Uh, on your sizing. I know you can do that in Resolve for sure. And then everything will have that size. And if you have different sizing for different lenses, you would just make a couple of presets and turn them on based on what [00:12:00] lens you were using.
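For reference, that kind of sizing override can also be scripted outside of Resolve. Here's a small, hedged Python helper that crops a full-frame phone temp down to an assumed cine framing with ffmpeg and scales it back up for editorial; the crop numbers and file names are placeholders, not values read from Jetset's calibration:

```python
import subprocess

def crop_temp_to_cine(src, dst, crop_w, crop_h, out_w=1920, out_h=1080):
    """Crop a full-frame phone temp to an approximate cine framing.

    crop_w / crop_h are the pixel size of the cine reticle region
    (placeholder values -- read the real numbers off your calibration).
    The crop is centered, then scaled back to a standard editorial size.
    """
    vf = f"crop={crop_w}:{crop_h}:(iw-{crop_w})/2:(ih-{crop_h})/2,scale={out_w}:{out_h}"
    subprocess.run(
        ["ffmpeg", "-y", "-i", src, "-vf", vf, "-c:a", "copy", dst],
        check=True,
    )

# Example: assume the cine lens sees roughly the middle 1440x810 of the phone frame.
# crop_temp_to_cine("take_012_temp.mov", "take_012_offline.mov", 1440, 810)
```

The same centered-crop preset per lens could then be applied to every temp before it goes into the edit.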


What kind of projects are you working on? Uh, well, what we're trying to do right now, there's sketch comedy. And there's also, we're working on a kind of a mock talk show. I work with a lot of comedians that are doing podcasts, but they want to, kind of... The one we're working on with the desk, he wants to make it look like he's on an actual set, uh, you know, an old-school Johnny Carson style talk show set, and then be able to move the camera around a little bit, have one person sitting behind the desk and other persons, you know. They're just trying to, you know, take it out of the whole podcast world and see if we can take it to another world.


We're also doing some feature projects. I partnered up with a guy that's got a green screen studio here in LA and we're brainstorming. He's got all these comedians already coming in for their podcast, so that's kind of the low hanging fruit. And he's got some Big name comedians coming in and a lot of them are very interested in what we're doing.


So I just got to get it to a certain point before I really try something with one of them. And that's [00:13:00] fantastic. Actually, like, you know, keep, keep telling us if you run into anything that gets in the way of you doing that, please let me know. Because I looked at that and said, man, this is, you know, we, I, we, I just released another video.


We have a process where you can actually re render Takes in the phone. So, um, we did that with Gaussian splats, which, uh, you can sort of get it, that sort of shocking level of quality out of, out of a phone render. And then you're, you can operate at this very high speed, but the, the, what you really want to do is enable this for, as you said, the sketch comedy stuff that can never afford a normal VFX.


Traditional workflow, but they could go, I mean, then you could start doing all the crazy stuff that, you know, the location things, the Pythons did in the sixties and all these, these, these really fun things that are hard to do now, um, actually start to do it. Exactly. So that's, that's kind of what we're trying to do and what I'd love to do right now, what, just this, the, what I'm looking for now is a quick way.


After we've shot to come in and cut things up and choose [00:14:00] shots and without having to actually build comps. Um, so that's, that's really what I'm looking for with this is just a quick way to take those previews in and go, yeah, we like that one. We're like that. Let's cut these together and then just build the comps for the shots we've selected.


Um, and the other possibility I was looking at is because I did get this whole live render thing working is doing take recordings and using those as my offline files, but then actually building. The finals, and those that should theoretically be an accurate representation, the framing should be correct 100 percent at that point.


Yeah. Yeah. Um, but again, you would have to render everything out, so I'm trying to avoid having to render. Um, because, that's it. Yep. Alright, that makes, that makes, that makes good sense. That makes, that makes great sense. And I, yeah, I think what I'm looking for is something a lot of people would want is some sort of quick offline workflow.


People that are used to working offline, online, and having low quality. [00:15:00] Let me, I'll point you toward this. This is the, I mean, the one I just did on re rendering. Um, and I, I did this, uh, for the, um, this is worth, worth looking at, cause I just put this up, put this up yesterday. Um, and right now that one is just, it's for a, a Jet Set or Jet Set Pro workflow where I'm just re rendering what was, uh, the 30 frame per second, uh, information.


But I'm going to follow that up, because with the same process we can re-render in the phone, uh, you know, 24p, 25p, you know, from a Jetset Cine take. But the process of being able to... you can drop, you know, this is a 1,500-frame take that I dropped into Resolve, and you can key it out in, like, you know, boom. The, uh, real-time green screen systems in Resolve are very good.


Um, and so you can actually render out a take, um, very, very, very quickly. That would be. It's going to look pretty remarkably good. And at some point, if you wanted to level it up, then you actually already have [00:16:00] all the tools and stuff like that, that they're in. Uh, so just I'll put that in as, as a, as a point.


And then, uh, I'm going to follow it up with, uh, one that would handle, like, you know, cine camera work, where you're shooting 24p or 25p with, like, a RED or whatever you're going to use. What's your typical on-set camera? Well, we've got a few. The rig I built out is a Blackmagic Pocket Cinema 4K, but I've also got a 6K and a 6K Pro that we're going to rig out. We're also going to try to run a couple of these cameras, both with trackers, uh, mounted on them.


Um, have you guys tried anything that was like a multi cam? Oh, it should work. You know, it should work fine. You just. Put all the, you know, jam sync, all the tentacles to the sound. So, so right now we don't have all the gear. We do have two iPhones. We've got a 6k pro, a 6k and a 4k. So we have the cameras, we have two phones.


Uh, we just don't have two [00:17:00] Tentacles. We're just buying stuff slowly and getting everything set up. Right now we're just using the 4K for all the testing, because that's what I build everything out on, but our plan is to rig out one of these 6Ks as well and try to shoot a couple of cameras, so that you could do like a talk show style. Like the podcast... he actually usually runs three cameras, sometimes even four. Um, but I'd love to get to where we can run two, maybe three cameras, all with the Jetset trackers on them. Right, right, then you can intercut and do all that. Yeah, oh yeah. So that's kind of where we're headed. This is great.


This is, uh, again, if you run into something that's keeping you from doing this, you know, wave a red flag, let me know, because this is a hundred percent where, you know, this is the way forward. Right. And I'll, I'll show you something that I, I, I like to reference now, cause this just came over. Uh, this is a video that just came out.


Um, uh, you know, it's a commercial for, uh, kind of [00:18:00] hollow. Uh, there we go. I think we need... I'll share with optimize, so this comes over well on Zoom. So, kind of a neat commercial, um, hit play here. There we go. Um, kind of a fun, you know, send-up of The Matrix, and, you know, walking through this, and, you know, we go through some different work of it, and it's tracked, you know, going through the environment. Um, you can get some fun things where he walks through, switches to different worlds.


You know, changes to, uh, you know, it kind of, uh, goes through and changes as worlds, um, switches to completely different environments, you know, all these, these different backgrounds, but, and this is fun. And the thing that I think is the game changer on this is that there are two people on set. You know, this is, uh, this is, uh, you know, a quick behind the scenes thing that they sent me.


Um, but it's worth this. I look at this and go, this is, [00:19:00] this is where, where we're going. Right. So you've got, you've got him, you're running on running jet set and, uh, and jet set is tucked over on the other side of the camera. We'll flip over to this view. There you go. He's running jet set, but that's it.


Right. It's got, you know, it's him, you know, the same guy is operating is, is doing, doing the, the, you know, production work. Um, but that, there, there we go. Then, then you can, now you have, instead of, 50 people on set, you have Three, right. And, and then you can actually do, and you can track and you can do all the things that you're, you're trying to do.


And that makes it work for sketch comedy. Cause you just have to be operating at the speed that comedians can operate and they're, they're fast. They're so fast and you want to be able to keep up with them. So, uh, I, yeah, I, I, I've, I've, you've almost never heard people talk about. Yeah, what I would call like, uh, uh, improv VFX.[00:20:00] 


But there was a, this British, uh, production company, and this is probably about four or five years ago that did a BBC sketch comedy show using virtual production. And that was what kind of inspired me as, cause I watched what they were doing. I said, well, that's. That's the way to do it. You don't have to build sets.


You don't have to do any of this, and you can go anywhere, because, yeah, every three to five minutes you're changing the locations again. No, uh, I know exactly the one you're talking about. It's Small and Clever Productions. That's a little Moss up in Wales. He's literally, I think, the second person that I sent the link to of that re-rendering workflow after I put it up, because I looked at that and said, you know, I'm amazed he pulled that off, you know, back in 2020.


Yeah, he was using some VR tracker, I think, Vive and all that, and I just went, you know, wait, wait, do you see what we can do now? Because it opens the gate, right? Of what's [00:21:00] possible to make as an environment for the comedians to go off in. Yeah. Oh, this is great. So, yeah, that's what I'm doing.


Uh, the only other question I had right now was, uh, I'm trying to use the, uh, the origin locators a little bit. Mm-Hmm. , I know you made now larger ones. Uh, I haven't tried using the vertical one. What's the purpose of that one? I'm, I'm curious how that works. Yeah. So the floor one just pointed at the floor.


It puts the floor origin on the floor. Yeah. The vertical one is for when you're running on a set and the camera operator can't tilt down. You got everything rigged up, you're on an Arri 35, you know. You ask the guy to tilt down to the floor, he's going to look at you like you're crazy. Yeah, the most you can get him to do is shift over a little bit, right?


Because, you know, so this is what the large chart is designed for. You walk out with a, you know, a quick, lightweight stand, stick it in front of the camera, on top of exactly where you want your origin to be. He points at it. We have remote controls in Jetset, so you can be operating Jetset from a browser, because, you know, a lot of these things, you can't go up to the [00:22:00] camera.


Um, so it's, it's, uh, Jet Set is built so it has an internal web server that is a remote control system. And it's, it's the digital slate, it's the same thing. You just, you just go to a different tab on the digital slate. So we can actually remotely, um, you know, switch to origin mode and detect those markers.


And, you know, in the remote system, you see the highlight when you click set to origin, and it'll fire a ray. It does a machine vision detection of that marker and it highlights it, and it detects the, um, the distance of that marker. And when you hit go, it fires a ray that detects where the marker is in 3D space, fires a ray down to the ground, and resets your origin directly below.


Wherever that downward-facing arrow is going to point, that's where it sets your origin. Yeah, now that makes sense, because that would be very useful, because we're constantly having to... because I like using the printout as opposed to just randomly selecting it. Right, right. It orients things properly and it makes it nice, uh, but it's small.
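The geometry being described is small enough to sketch: take the detected marker position in 3D, then drop it straight down onto the floor plane to get the new origin. A hedged Python illustration with made-up numbers; Jetset's actual implementation isn't public:

```python
import numpy as np

def origin_from_marker(cam_pos, ray_dir, marker_distance, floor_y=0.0):
    """Reset the scene origin from a detected vertical marker.

    cam_pos:          camera position in world space, meters
    ray_dir:          unit ray from the camera through the detected marker
    marker_distance:  distance to the marker along that ray
    floor_y:          height of the floor plane (assumed flat here)

    Illustrative only -- not Jetset's code.
    """
    marker_pos = np.asarray(cam_pos) + marker_distance * np.asarray(ray_dir)
    # Project straight down onto the floor: keep X and Z, snap Y to the floor.
    return np.array([marker_pos[0], floor_y, marker_pos[2]])

# Example: camera 1.6 m up, marker detected 3 m away, slightly below eye line.
cam = [0.0, 1.6, 0.0]
ray = np.array([0.0, -0.1, 1.0])
ray /= np.linalg.norm(ray)
print(origin_from_marker(cam, ray, 3.0))   # origin lands on the floor under the marker
```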


You have to move it, because, with my printer, I can only print out the eight and a [00:23:00] half by 11 version. I think we're going to try printing out your larger one. But I was curious about the vertical one, how that worked, and that makes perfect sense. I wasn't fully understanding it. They're designed to be printed on the Canva standard 18 by 24.


So just order it from Canva, 18 by 24, mounted on their eighth-inch-thick board, and it comes in perfectly flat. And tape it to a, you know, you don't need a C-stand, just a lightweight aluminum lighting stand or something. Drop it in front of that. So that way, when you need to reset your origin, right now, boop, dink, and, you know, pull this in.


Yeah, that would be considerably easier. So now, now that, because I wasn't understanding, I, why would I want to set my origin on the wall? But now I, now I get it, that it actually doesn't. We're doing a ton of, of UI fixes upcoming in, in Jet Set. And after we do a pass through that, I'm going to redo. Our intro Jet Set tutorials to be a one shot in the same way I redid the Unreal and Blender ones to be a kind of a one shot walkthrough.


I'm going to redo the Jet Set ones to be a one shot walkthrough of the entire production process from setting your, you know, your, your, um, [00:24:00] uh, setting where your project files are located to setting the origin, doing the remote detection. And so people can see that it's designed to work either individually, or you're just sitting there and tapping on it.


Or in a production metaphor, which is, you know, somebody is remote controlling it and you have, you know, one guy runs out, sets origin, gets out and four seconds have elapsed. Um, and that, and then you're, then you're shooting again. And so all that. Remote stuff is, is already unlocked in the, uh, in the web interface you said, cause I haven't tried that, but that's great to know.


There's already a digital slate in even the base free version of Jetset. And then Jetset Pro and Cine have the part where you can hit a button and you get a video preview. Uh, you can pipe in the video preview, uh, on the, it's the panel in the digital slate that says video. Click on that, hit play, and you'll see a real-time video preview of the, uh, the data coming through.


And that's really what that's designed around. And certainly, uh, Kumar has been giving that a workout over in Bangalore [00:25:00]. He's, uh, we're using Jetset on some very large scale shoots, uh, where they're pushing the remote control aspects of it rapidly. And so we are scrambling to keep up with all the remote, uh, remote needs.


Cool. I'm dominating, so I'm going to be quiet for a second. That's... I'm sorry. This is, this is great. Bill, Bill just jumped on. Uh, and so, uh, Bill, Brett is, uh, working toward being able to, uh, put together stuff for sketch comedy work, uh, for the comedians, and building kind of digital backgrounds, et cetera.


So, uh, I thought, I think this will be very, a very fruitful area. Because you can, you know, when you can set the comedians into lots of different environments, they can come up with all sorts of different, uh, things to, uh, you know, different ways to, to play it around. Yeah. And they love it. I mean, we've got right now, we've got a monitor that faces the set when we're doing the live preview.


This is another reason why the live preview, even if you're not going to record those takes is useful. Because you're trying to show these [00:26:00] comedians, well this is what you're, this is the environment you're in, this is what it's going to look like. And, you know, put a couple of green blocks in places to go, yeah that's a chair, or that's going to be this.


Um, but to, for them to be able to quickly see, here's where you are, this is what it looks like. Um, and maybe do a quick run through, but always be able to quickly look over and go, okay, I see where I am, right. So they're comfortable. Then we turn that off and they go for it, you know? Um, so, uh, yeah, it's very interesting.


And the people we've been working with are really interested. We haven't tried anything big with them yet. We're actually going to do a short film. We're going to shoot with some no-name actors first, trying some of this stuff, to prove to them, hey, here's what we can do, before we ask these people to come in.


And try this, but we're close to shooting a full test, and then once that works, we'll start doing it. What's your, uh, what's your pipeline on the short film? On the short film, it's going to be more or less the same pipeline. We'll probably [00:27:00] use the Blackmagic 4K, uh, with Jetset, uh, editing in Resolve, doing everything. All the post will be in Resolve.


All the posts will be in resolve. Uh, maybe, probably not, maybe not the audio because I'll probably end up exporting that, sending it to a Pro Tools guy, but I've done some audio work myself in, inside of Fairlight. I mean, Resolve really is a, an all in one post solution now, I think. Well, and when you get to doing, if you start doing lots of comps and you're doing, uh, you know, if you're doing rendered EXR frames, we built, uh, a, um, A, uh, uh, an exporter into auto shot that generates a resolve comp.


Um, I haven't documented that much, uh, just because I wanted to build some of the other pieces of the workflow. But, uh, if you're going to render in Unreal, I don't know if that's your primary, if your 3D system is Unreal? For now we're using Unreal. We're starting to dip our toes into the Blender world, but the live preview and stuff, because of the nature of what we're doing.


[00:28:00] That's very helpful to be able to go, but I would love to be able to build things in Blender, then port them over to Unreal and use them there, uh, just because you get better modeling tools and things like that, obviously. So something that is already in there and you can check this out and when you want to try this, um, is our Blender add on.


Already takes live data in from Jet set. So you can actually drive the Blender camera in real time. Again we haven't had a chance to do a tutorial for it. Cause I, the Unreal, there's, there's a mechanism for doing keying inside, inside Unreal so you can do a complete tool chain on there. And Blender, we can get the tracking data into Blender and, you know, drive the camera, et cetera.
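For anyone wiring tracked camera data into Blender by hand rather than through the add-on, the heart of it is just keyframing the camera object per frame. A minimal bpy sketch, assuming the tracking data has already been exported to a simple JSON file; the file format and field names here are hypothetical, not the Jetset add-on's actual interface:

```python
import json
import bpy

def apply_camera_track(json_path, camera_name="Camera", fps=24):
    """Keyframe a Blender camera from exported tracking data.

    Expects a JSON list of frames like:
      [{"frame": 1, "location": [x, y, z], "rotation_euler": [rx, ry, rz]}, ...]
    (A hypothetical export format, for illustration only.)
    """
    cam = bpy.data.objects[camera_name]
    bpy.context.scene.render.fps = fps

    with open(json_path) as f:
        frames = json.load(f)

    for sample in frames:
        frame = sample["frame"]
        cam.location = sample["location"]
        cam.rotation_euler = sample["rotation_euler"]
        cam.keyframe_insert(data_path="location", frame=frame)
        cam.keyframe_insert(data_path="rotation_euler", frame=frame)

# Run from Blender's scripting tab, e.g.:
# apply_camera_track("/path/to/take_012_camera.json")
```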


Uh, but there isn't a great solution for keying yet. Blender actually has a phenomenal internal GPU-based compositing system, but there's no way to get live video into that. So, yeah, just something to keep in mind as you do it. Uh, but, um, we're starting to use that more and more, because I was a Flame guy for [00:29:00] years.


And then last year, uh, I started using Fusion, and I did a movie, actually two movies and a TV series, where I did all the visual effects using Fusion. Uh, so I'm a big... I love Blackmagic stuff. I think they're doing great stuff. I mean, if everybody's not using it, they should be.


And especially the pipeline where you're running Resolve and using Fusion to do comps. And what I found is that if you try to use the MediaIn nodes, there's lots of things that can get fragile. So, uh, we built an automatic generation thing for the Lua scripts for the Fusion comps inside, uh, inside Autoshot.


So it generates a Lua file. After you extract your EXRs, et cetera, and after you do your renders, you just drag and drop that into Fusion for that particular shot, and it'll construct the comp in front of you, with all the things wired together: the Delta Keyer with, you know, uh, the plates, et cetera, and the merge, that kind of thing.


[00:30:00] And including AI mattes, if you use those, it'll route the AI mattes into the Delta Keyer as the core matte. Um, so yeah, cause I look at it and go, that's it. That's how you knock out 200 shots on a low-budget production where you just have basically one person operating: you have to be able to automate all those pieces of it.


So we're, we're rapidly building toward that. Uh, and once again, if you start into that and, um, and you run into any problems, let me know. I'm, uh, yeah, we're, we're definitely targeting low budget. Obviously the comedy sketches. I've got some. uh, horror directors that I've been talking to about what we're trying to do, uh, you know, I mean, I came from the big budget world, but I'm, I really wanted to dive into the low budget world and see what could be done because, you know, Hollywood's a mess right now.


There's not a lot of work compared to what there was before the strikes. So this seemed like a perfect time to kind of see what's out there to help these low-budget productions and do some of the stuff [00:31:00] for cheaper, because that's always been a dream of mine, to make my own stuff, but it's never quite been there.


And I feel like you guys are right there. Doing stuff that's going to make it possible for people that don't have a fortune to spend. And the, the, the thing, the, one of the things that's astounding is the capacity of the cameras these days, the quality of what you can pull off of that, you know, the, the black magic ones and these others, that's 12 bit raw data.


And you look at, you know, when you're doing comps, you look at the edge and the falloff. It's beautiful. It looks like a Cineon scan, better than a Cineon scan. Um, and so, you know, you can operate on these low budgets but actually work with very, very high-quality imagery, and that's a win.


You know, the keyers in Fusion are fantastic. Oh, I agree. I agree. Um, oh, this is... all right. Well, please keep us up to date. I will, for sure. And yeah, a comment, or a question: I mean, I'm not so familiar with the whole, uh, toolset, so, you know, hands-on myself, [00:32:00] but I think there's some hardware from Blackmagic for, uh, keying.


Yes. Yeah. So it's something we can think of, you know, uh, pulling, uh, you know, the source into Unreal. Yeah, I'm not so keen on, you know, on-camera mixing, but still, into Unreal, if I can get a clean alpha, uh, channel, uh, in the feed from the cine cam. And then, you know, I mean, you remember, uh, the horrible, uh, render from the Unreal.


So I don't know if that could be addressed to an extent. Yeah, it's called the Ultimatte, from Blackmagic. And we were actually exploring that workflow as well, using the Ultimatte for the keyer. We were actually playing with an ATEM system, which has a keyer built in, not nearly as good as the Ultimatte, but just to see. The big problem we ran into is the syncing, you know, is there a way to genlock those signals if you're actually sending them out to an external keyer? Uh, yeah, I got them to lock [00:33:00] inside of Unreal, but if we were sending them out, and we got everything to track and work, keeping things synchronized through an external keyer was pretty much impossible.


So we gave up on that. But if there's a way to, uh, genlock an Ultimatte, you know, with two signals, and that might even be a question for the Blackmagic people, um, because yes, then you could use a much better keying system than what's available inside of Composure. Okay, but, I mean, I believe there is, I just, I mean, you can adjust the sync in Unreal with the CG and cine feed.


So what he's running into, and this is worth looking into, is that right now we send the, you know, tracking data into Unreal, and we can, you know, composite inside Unreal. There's also a way, once you have it rendered inside Unreal, to take that [00:34:00] rendered imagery


and send it out on the HD-SDI feed of the Blackmagic. Um, and then you would take that SDI feed and route it into, um, you know, the ATEM or the Ultimatte. And it can be done. It's just, I tell you, man, I've put that together, and everything is on a hair trigger, you know. You get it working, and you go get a cup of coffee, and you come back and things have gone out of sync. And there are ways to brute-force it, you know, okay, you go get a broadcast, uh, you know, clock and you reclock everything and all that.


And you start to throw a lot of hardware at the system. And my conclusion was, if you're doing broadcast, you have to do it. In that case, you run hard-clocked systems for everything. And this is worth talking about, cause this is the theory of why Lightcraft is the way we are, um.


And cause you, you have to hard clock everything to do broadcast. Cause it's going to go straight out [00:35:00] the air, you know, into people's living rooms and it has to be perfect in real time that at that point, you're just, it's, it's broadcast hardware. And in all the, all these things, what we realized for most of entertainment production, um, not broadcast, but when you're doing production, you're doing a shows or things, you don't need things perfect in real time.


You want them to be pretty good at real time. So you can see what you're doing. Um, But then you want to have an extremely accelerated way to rework that in post production. So then you're editing and you can go, go very, very, very fast. You don't need it in, you know, in a half a second, which is broadcast.


You need it tomorrow, right? And that change means that we can fundamentally change how we approach things: instead of trying to get a perfect key in real time, we get a pretty good one in real time, track everything, record everything, and then we can run these automated post-production systems that can turn around... you know, as you get into it, we're going to be able to turn around dozens and hundreds of shots in the course of a, you know, a day.


Um, and then, [00:36:00] then you can fix anything in there. You're like, okay, the key is a little bit wrong, or somebody dropped a mic boom in or something like that. You just drop a, you know, thing in Resolve, fix that one, move on to the next shot. And the system is doing most of the work for you, and you're just going in and fixing the things that cropped up in the edit that need to get fixed.


And that scales really well, cause you can fix stuff, um, you know, before it goes out. So that is, uh, I mean, that's a very lengthy way of saying yes. You know, the Ultimatte is one way of doing it. Um, and you know, it's a great system. I would just say, beware of chasing the dragon


of trying to get that to synchronize perfectly and hold up sync in a production metaphor, because it's going to burn days of time to try to do it, and then it's fragile. Yeah, that's it for me. I agree. I, uh, we were just exploring these options to give the best-looking preview possible, but that's [00:37:00] good to know, because I said, I don't want to purchase anything like that


unless we are sure that we can do this. Uh, and I'm worried about getting decent... We did some adjustments to our lighting on our green screens and things that actually helped us to get a much better-looking, uh, shot inside the Unreal preview. So I'm pretty happy with that, because we are going to do a post process.


These are not going to be the final shots, but I do have a couple of these clients that are like, well, could we just finish it? You know, so I can kind of show them, well, this is the best it's going to be if we're doing it that way. But if we do it, if you spend another day. Here's what we can get, you know, right overnight is the magic because then you can fix everything.


Yeah, exactly. So I prefer to do that. I'm a post guy. So that's always going to be my preference because I know it's always going to look better. Uh, I'm just looking at the best way to give somebody something to look at on set. That's really what we're looking for is when I bring some big name comedian go, here's what your set's going to look like.


I [00:38:00] don't want them to go. Well, why is the green still, you know? All right. Why do I have a big, is it, is it going to look like that? Am I going to see that? You know, right, right. You're no, no, that's all right.


All right. Well, fantastic. Uh, what's next, uh, Kumar, do you have other questions, or nothing in particular for tonight? I mean, for the day for you. Yeah, I'm just here to learn, uh, Elliot. Oh, I'm sorry. Hey, how you doing? Yeah, I'm just here, just learning, listening, you know? Oh, sounds great. Sounds great.


I was, I'm still smiling at the, uh, at the quality of that mask, uh, that they had on that. That's, uh... Oh, we actually, we actually printed it at full scale now. Do you want me to show you? Oh, yeah. Yeah. This is the thing. Cause, you know, the digital backgrounds, we got that, but [00:39:00] I still think high-quality costumes are a real thing.


This is what we started with. We actually 3D scanned the actor. Oh, let me just get this into the frame. And then what we did is we scaled it down, so that way we can actually, um, you know, do the modifications and stuff like that. So then, this is the second version of it, you know, him with the mask. Then we 3D printed a smaller mask.


I don't know if you guys can see. Yeah. Then we didn't, we didn't like how this was because we had to actually, oh man, we had to take it, you know, we had to cut the back of it. So then we actually made it into a two part. Let me see if it comes in and I have to stand in front of it. I think we had to make it into a two part.


And then now we, we scaled it back up to, to the, the actors, uh, For the actor's head. Now we actually have the full scale version. Oh wow. We made it out of TPU. So it's really, you know, rubber and everything. We're going to be, we're [00:40:00] going to end up, uh, let me see if it's sure. For some reason, I guess it's.


The keying is not right, but, um, that looks great. I mean, that's what holds up on camera, you know, and that's exactly where you want to go: spend some effort on the costume and the stuff that's like a foot away from camera. Yeah, I mean, somebody can do that digitally, but it's crazy money to do it.


And then you have this really great looking foreground, uh, you know, actor work. And see what we're going to be doing. I mean, right now we're finishing up 3d printing, uh, the rest of the costume, uh, for Batman. And we're, uh, I think on Monday, we're going to be scanning, uh, the guy who's going to be playing, uh, Mr.


Freeze, and we're going to build out his whole costume. Now, I mean, like, I mean, like I told you the other day, what's so amazing about this is that we were able to do this in less than a week, you know, compared to like the conventional way we've seen is like, you have to have the guy, you know, all up in the, you know, the plaster and, you know, it's a whole process.[00:41:00] 


You know what I mean? We were able to do it in less than a week, and most of the time was in the 3D printing, you know what I mean? So let's say, like, for the full-scale mask, you know, for like this, you know, it actually took, damn, there it is, so, I mean, this actually took, um, I think it was like, uh, 18 hours to print, you know. And, uh, I mean, other than that, we probably could have had it done faster if we could figure out a faster method of doing it, you know. But, um, I mean, like I said, conventional ways, it just took so long.


I mean, it would take like three months sometimes to actually just get one mask. You know, and then be able to test fit it and do all the, you know, the modifications this way, you know, with us scaling it down, we're able to do it within out, you know, we're able to get, you know, from from this, you know, to this in, like, I mean, we run multiple printers, obviously, but I mean, we were able to do it in, like, less than a day to get to this version, you know, and then modify it, you know, [00:42:00] how we want to.


You know, from there, you know, but what, what strikes me at this is that you're going to be able to adapt that process of like, almost, it's almost like digital costume construction kind of thing, but to multiple, you know, you could do, you could do armor, right. Just looking at that and like with a little bit of repaint and it's, since it's an elastomer, you know, it'll, it'll move, it'll move a bit better with the actor.


So, I mean, if you, You actually put metal. I don't know if you ever put on a suit of metal armor, but it's like, it's like, you know, just the shirt. It's like 60 pounds. You just go. This is, you know, how did anyone ever do anything with this? Yeah. Imagine if I wear it for like eight hours a day, you know, for filming and stuff.


It's, it's ridiculous. But one of the things that we actually, um, I think we figured it out last week was, um, and it's actually kind of interesting because, you know, With 3D scanning, we were able to, or what I was doing before was more, more so 3D scanning and, you know, for props and stuff like that. [00:43:00] And in that process, um, there was a car that I had to do.


I think it was a Honda civic and what they wanted to do was do a custom car wrap on it. And throughout that method, I actually figured out that, you know, from a 3d model, I can actually make patterns that can actually be cut out. Like, I mean, the same thing that was cutting out for vinyl, I can actually do it for cloth.


So now, as we build out the costume, we can actually flatten out the 3D model areas and then actually get a pattern that we can cut out, you know, for the material. So, I mean, the whole thing doesn't need to be, you know, uh, 3D printed. We can actually design the costume, give, you know, the art department the pattern, and, you know, have most of the work, you know, already done. I mean, it's really interesting to kind of see this, and I said this before, but you're basically building a mini Weta Workshop.
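That flattening step can be prototyped in Blender: unwrap the surface of a costume piece and export the UV layout as a flat pattern image to scale and cut from. A rough bpy sketch with placeholder names; for a real sewable pattern you would mark seams by hand rather than rely on an automatic projection:

```python
import bpy

def export_flat_pattern(object_name, out_path, size=4096):
    """Unwrap a costume piece and export its flattened UV layout as a pattern.

    object_name and out_path are placeholders; Smart UV Project is just a
    quick stand-in for hand-placed seams on a real garment pattern.
    """
    obj = bpy.data.objects[object_name]
    bpy.context.view_layer.objects.active = obj
    obj.select_set(True)

    bpy.ops.object.mode_set(mode="EDIT")
    bpy.ops.mesh.select_all(action="SELECT")
    bpy.ops.uv.smart_project(angle_limit=1.15)   # ~66 degrees, in radians
    bpy.ops.uv.export_layout(filepath=out_path, size=(size, size), opacity=1.0)
    bpy.ops.object.mode_set(mode="OBJECT")

# Example (hypothetical names):
# export_flat_pattern("CowlFrontPanel", "/tmp/cowl_front_pattern.png")
```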


Um, yeah. You know, to, to be able to do this and, and except being able to operate on, on a, on a indie scale, um, [00:44:00] yeah, , it is gonna be fun to see this. Oh yeah. And I mean, it's really, it's amazing. 'cause I mean, we're, like I said, we're doing it with a five to 10 man team. We really believe wholeheartedly that we can get this done, you know, in, you know, with just a minimal, um, minimal crew, you know, and that's our biggest thing is, is the challenge of that.


But at the same time, all of us have backgrounds, you know, in movie production and VFX and stuff. So, you know, we're all working together to try to get, you know, all this done. Like I think, uh, one of our guys is working on, uh, what we were talking about the other day about the, um, the car chase. You know, of figuring out the concept of how we want to do it, you know, and without having to build a full scale batmobile, you know, which, you know, I mean, it's a daunting process in itself, you know, especially since we're planning on filming in like less than, you know, I think a month and a half, you know, um, but really what we're going to do is we're going to build half of the cockpit.


And then just put the 3d model on top of the cockpit and then actually [00:45:00] have, you know, the environment, you know, um, you know, moving, you know, as the car is moving and use the, I think, I think we were using a, um, what did they call that? The, the motion simulator, you know, that people use for video games, we're going to use that to simulate the motion of the car and stuff.


And I think it, I mean, in theory, it seems like it'll work, you know, I mean, we've, we put a lot of effort and planning into this, you know, and I mean, we really want to make it to where. It looks cinema quality, you know, and you know, I mean, just to be able to show that, you know, with a minimal budget, with a, a minimal crew, you know, it's gonna be astounding.


I think when you get into that one, one thing you might want to do is, um... 'cause the, you know, the axis motion rigs, I've seen them for doing race cars. You might run into difficulty with loads and stuff like that. But, uh, yeah, you're going to want to be able to understand where that thing is in 3D space.


Um, you could also hook on a, [00:46:00] another Jetset device. Um, yeah, that's what we tried. Remember when I was telling you about the gun? Uh, so that was kind of more of a test to see what we can use it for. And that's why we were thinking of putting an iPhone on the motion simulator, you know what I mean?


And basically see if that actually will work a lot better. I mean, the gun was more, um, it was a lot harder to track because there's so many micro movements. You know, and with the, with the actual motion simulator, it won't, I mean, the motion will be a lot more subtle, you know, compared to like, you know, like, I mean, an actor can move his hand like this and we have to track all that, you know, with the gun, you know, but.


And the simulator, most of the scene is hidden. You're going to be putting a giant CG scene around them. And so the smaller variations that will be, will be hidden. And plus, I think we can actually record the simulation movements. You know, so if we do need to refine it, we at least have that data [00:47:00] to refer back to, you know, which is good.


I think that should work. And then you have, you know, both Jetset devices running on the same time clock. Um, and so the data will, you know, you sort of line up the clips in time and do an extraction for the correct frames. And then you drop it into Blender, and that should line up time-wise.


Yeah, pretty well, because we were running into exactly the same question on, uh, another project, of figuring out where animated characters were going to be. And, uh, yeah, both Jetset systems are automatically operating on the same, uh, timestamps, um, if they're fed, um, you know, the same timecode, and then things line up in time. Because, I'm sure these actuation systems will have, you know, some way of doing data, but the likelihood of it being output in the correct coordinate space hovers somewhere in the neighborhood of good luck, whereas our data is actually designed to be space and time coherent. It'll all go into Blender and it'll line up and it'll do the things [00:48:00] you want it to do.
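The lining-up step described here is just timecode arithmetic: convert each clip's start timecode to a frame count and slip one clip by the difference. A small Python sketch, assuming simple non-drop-frame timecode at a shared frame rate:

```python
def tc_to_frames(tc, fps=24):
    """Convert 'HH:MM:SS:FF' non-drop-frame timecode to an absolute frame count."""
    hh, mm, ss, ff = (int(x) for x in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def align_offset(clip_a_start_tc, clip_b_start_tc, fps=24):
    """Frames to slip clip B so it lines up with clip A on a shared timeline.

    Positive result: clip B starts later than clip A by that many frames.
    Assumes both devices were jam-synced to the same timecode source.
    """
    return tc_to_frames(clip_b_start_tc, fps) - tc_to_frames(clip_a_start_tc, fps)

# Example: camera phone starts at 01:02:03:12, simulator phone at 01:02:05:00.
print(align_offset("01:02:03:12", "01:02:05:00", fps=24))   # -> 36 frames
```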


Uh, you just, just make sure you just reference both of them to the same coordinate system and then like, you know, tape it onto the, uh, the motion platform and it should track. Yeah, that's what that's kind of what we're thinking. I mean, obviously in theory, until we actually get there and do it, you know, that's why we've been doing these micro tests, you know, of just saying, okay, well, can we at least handle it, you know, before we go to that extreme of saying, yes, we can, you know, and, you know, dealing with it on set, you know, I mean.


Yeah, overall, I mean, I think that we have a good understanding of how this whole process is going to work, you know. And I mean, a lot of the benefits that we have is that, um, I think the movie is called Everything Everywhere All at Once, I forgot exactly, but it just won an Academy Award or something.


Oh, it's out here and see me. And one of my friends actually, you know, knows the director. And he said that one of the key things is, is that being a director and knowing what VFX, you know, how it works, you know, having the whole understanding of it made his film so much more [00:49:00] successful because I mean, one of the things I hate is hearing, Oh, we'll just fix it in post, you know what I mean?


And that's like the hardest thing for us is that, you know, when they say that, and then, you know, we have to spend months working on something that we did. You know, on that, on the day of shooting, you know, and I mean, I think that's what is going to be different, at least because we can preload a lot of our, our sets are, you know, get everything already, you know, ready to where all we have to do is just, you know, minimal post work.


I mean, that's kind of the reason why we actually started looking at the lighting system that we were talking to you about, you know, to try to get the, the. Um, real time lighting, you know, for a minimal budget of like 2500, you know, I mean, we can actually get the whole lighting from different angles. And, you know, I mean, from the, from the feed from unreal, you know, which.


I think that's going to be groundbreaking, you know, just to be able to do that instead of doing it digitally, which I mean, digitally, it looks good, but I still feel that, you know, like digitally, you know, light wrapping and all that stuff. I mean, it still looks good, but I think the [00:50:00] problem is, is still, you know, it takes a lot more work than actually having it on set.


Or to get the lighting right, you know, like, you know, and then just, you know, give it to, you know, post production, you know, but sounds, that sounds great. And I, I'm, I'm, I'm seeing there's, there's a bunch of very good digital models of, of various Batmobiles. I don't know if you guys are doing the, which we actually have the one six scale of, um, the, uh, 99 Batmobile, which is like the most accurate replica that we've seen so far.


And then plus we have access to the, you know, to scanning the one at Warner brothers. So that's right. A big one. Yeah, the big one, you know, that's if we need it. I mean, but I mean, they're pretty much, I mean, right now, like I said, we're still in the stages of trying to get, um, get it authorized because we were trying to get, you know, Josh Brolin in it.


And I mean, even if we don't get it authorized, I mean, we're still going to move forward with it, you know, because I mean, it's a fan film, you know, and, and, uh, but we just feel like authorized is a better way to go, you know, as far as, you know, business [00:51:00] wise, you know, just to try to say, you know, look, we did get authorization.


They do approve of this and we feel that the, um, I mean, the only problem we have with working with DC is obviously they have a lot of restrictions. You know, you can't change this. You can't change that. And, you know, it's still, it's, it's okay. I mean, it's not our IP. So, you know, we kind of work with them on it.


So that makes sense. Okay, well, great. Um, well, Ellie, I do have a question. Oh, go ahead. Come on. Go ahead. Oh, you remember how you said that you worked on the iRobot, um, the Roomba, um, you know, robot? Now, what was kind of crazy is, after I got off the call with you the other day, I was actually thinking, because I've been looking at what that guy was doing with the 360 cameras and Gaussian splatting.


And one of the problems that I've always had is, you know, being able to get all of the angles, right? So what I was thinking, I mean, cause I have my Roomba running around, you know, the house and stuff, I was thinking, like, if I put a camera on there with a [00:52:00] vertical, you know, extender, you know, I could probably have this thing go around my whole house and scan it from every single angle.


I mean, isn't that pretty much, I mean, it seems pretty simple, you know, but I mean, is that something that's feasible? I, I, uh, I think that would be a lot of fun to see, uh, if you end up doing it, send a picture of it, because I've seen Roombas rigged in a variety of different ways over the course of this.


So I, you know, horses for courses, uh, Olli, and it's a worthwhile question, because Olli, the fellow over in, I think, Finland, um, He had something where he had like, uh, three GoPros on a stick where he was, he was kind of walking around to, because he is trying to basically capture as much information as he can in the shortest possible period of time, um, in a, in a, in a, in a coherence, in a coherent space.


And, and I, I don't yet know what the right answer is on that. Yeah. Um, you know, he's using, um, he was using GoPros and I think he might've been using one [00:53:00] of these. It's, uh, Insta 360, you know, the, and yeah, he was using a 360. And it can record it. It's 8K. Uh, it's, but of course it's a 360. So, um, I, I don't know.


I, I, um, in terms of if you're trying to get really a high quality imagery, It's still hard to beat a DSLR with a, you know, a moderately wide lens, you know, put a 24 or something on it. And, and then, and walk around the part that's always hard is knowing what you've already captured and not missing stuff.


Exactly. Yeah, like the overlap. That's why I was thinking, if we were able to have it pre-programmed, my little Roomba would go to certain locations and already know where it hasn't scanned or cleaned at the same time. Oh, this is kind of cool.


So Kumar just sent a nifty link to a real-time scanner. Let's see what it is. This [00:54:00] is the first one I've come by so far. I think the cheapest version should be around 10 grand. I mean, for a Gaussian splat, more than the FOV, what you need is more and more perspectives.


Okay. Yeah, very large numbers of angles, multiple cameras. That's wild, look at that. That's interesting. So they've got a LIDAR scanner and they've got four lenses, one in every direction, so you can kind of walk through a hemisphere. Interesting, interesting.


And they have a great splat output, good ones. But yeah, if you're not dealing with LIDAR, you typically need to put up dozens of GoPros, inside-out or outside-in, whether you're doing a space or a product, [00:55:00] and then go around it. You need more perspectives, more than FOV.


FOV is good, but more perspectives is what you really need to get it better. That's quite remarkable. And it looks like, I like how they've got a harness to hold the thing up, but they're getting enough data capture there to start to have some real-time feedback on that.


Yeah, that's quite interesting. You clearly know what you're missing. Well, that's remarkable. And I'll be curious to see what bit depth they can generate. They have a rotating LIDAR, they have their own lenses on there, so they might be able to do it.


Wow, they're really going for it, aren't they? Oh, Ellie, I also have another question. What is the purpose of the, what do you call it, for the life of me I can't remember what it's called, the [00:56:00] HDRI? We do have a folder in Lightcraft that's for the HDRI. But is that coming from ARKit?


HDRI, yeah. That was one of those things that I had high hopes for, but it was less effective; that's why it's not really documented. ARKit, when you're walking around, synthesizes a rough HDRI image of your surroundings, and it uses that to generate illumination to light the object that we're rendering, your USDZ model.


And at first I was excited about that: that's exactly what we want, we want that automatically captured. After we did some experiments with it, though, the problem is that it's not well controllable when it does its captures, and the output isn't really at the level you'd want it to be.


I thought you'd be able to hit a button, walk around, and okay, I just captured an HDRI, [00:57:00] but it's deciding internally when it wants to add a frame, when it wants to add more information to the environment. It's just a black box: you hit a button, and you can see it iterating, capturing additional pieces and filling in parts of the HDRI.


But it's not controllable, and on set that's a problem. I think the right way to do it is going to be some variation of what's evolving with these guys: you can go up and, bang, in four seconds you have an HDRI, and it's perfect, it's high res, and you control it.


And this one, unfortunately, they removed the ability to do brackets, so hopefully they add that back in. That world is still evolving, but the cameras are still inexpensive. This one I bought is an Insta360 X4, their newest one. It has an HDR capture, but it [00:58:00] is not bracketed.


And this came up when people were starting to put these on top of cameras to assist in post tracking, and for good reason, which is that the 360 camera means you can solve a track in SynthEyes extremely quickly. It works just remarkably well.


The difficulty comes when trying to translate that track over to fit your cine camera. That's something we're working on internally, to find a good workflow that can be made into a process. But it solves a real problem: the core problem with trying to track a monocular cine camera is you only have a certain amount of data, your field of view, and especially as you go into a portrait lens and then a telephoto, you start running out of data.


You just do, and it becomes very difficult to track. Whereas the 360 is a 360 and [00:59:00] sees everything, and the tracking is magic. So we're looking at having something like that as an option, but that's still very much in the experimental phase of working with the 360s.
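A minimal sketch of the "translate the 360 track to the cine camera" idea, assuming the simplest case where the 360 camera is hard-mounted to the cine body: the cine pose is the solved 360 pose composed with one fixed mount offset. The names and offset values below are illustrative assumptions, not the actual Jet Set/AutoShot workflow, which also has to handle sync, lens, and shutter differences.

```python
# Illustrative only: compose a solved 360-camera path with a fixed rig offset
# to get a cine-camera path. Matrix names and numbers are made up.
import numpy as np

def cine_path_from_360(world_from_360_per_frame, rig_360_from_cine):
    """world_from_360_per_frame: list of 4x4 matrices (world <- 360 camera).
    rig_360_from_cine: one 4x4 matrix (360 camera <- cine camera), measured
    once from the physical mount. Returns world <- cine matrices per frame."""
    return [w @ rig_360_from_cine for w in world_from_360_per_frame]

rig = np.eye(4)
rig[:3, 3] = [0.0, -0.12, 0.0]  # e.g. cine sensor ~12 cm below the 360 camera (made up)
# cine_path = cine_path_from_360(solved_360_path, rig)
```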


I need to find a reliable procedural process where I can get the data through SynthEyes so that the pieces build on top of each other and we get a reliable output that can be automated. The initial tests are cool, but they're all hacks. So, just a request: take a look at the Kandao, the new QooCam Ultra.


A much better camera. Okay, let me just look that up.


Oh, with a K? Yeah, with a K. Okay. So [01:00:00] this has auto bracketing for sure, I know. Of course the Ricoh Theta has it too, but it's nowhere near as good for video and stuff. But yeah, this one is the real deal in terms of quality. Wow. So I had this on my list to get one, so don't force me to get the Insta.


Interesting. How much are they charging for it? It's 500 or 600, sorry. Yeah, that's still reasonable. It'll be interesting: do they have an SDK? That'll be an interesting question. What I'll do is I'll look into this. So they can do 8K 360.


And it has bracketing, you said? Yeah, yeah. Maybe they did not explicitly put it on [01:01:00] the spec sheet, but I know it does. Okay, fantastic. That's very interesting to look at; I'll check that out. It always ends up being a combination of what the device can do and whether we can talk to it.


So, for example, the Tentacle device has a very good SDK that we communicate with over Bluetooth. That's why we work with them: we can program it. I know, sure. Okay, one quick question I had about the tracking data. Yes. So we haven't really touched that part of it yet.


Is this something you could export as an FBX, or a file which could be visualized in game engines and any of those tools? Sure. Great, thanks. Yeah, so you'll want to go through AutoShot. What we did is, Jetset records the tracking data in a standardized JSON form, just a kind of standard format for computer stuff.


AutoShot then takes [01:02:00] that data, and you can synchronize it to the cine camera frames with a couple of different techniques. Then it writes out a shot package, and the shot package has a set of EXR images along with the tracking data associated with them. And then, FBX files and most of those other intermediate files tend to break very frequently.


So what we actually do is write a direct exporter in the native language for whatever tool we're talking to. Mm-hmm. So if we're talking to Maya, we write the exporter in MEL or Python. We actually have a direct Blender plugin. In Unreal, we write a Python script that directly drives Unreal's internal system.


So we generate a level sequence. We actually don't use intermediate files for almost anything, because I've tried that and it's not reliable under high pressure. Yeah, especially FBX. FBX is crazy, it's a mess. Yeah, it's a question that just came up with the VFX team in the [01:03:00] last week. We were discussing it, and I did not really think of this initially, because we had kind of seen the whole workflow and how it goes.


And of course there is a connection to Unreal or Blender, and you also say Maya is an option. Yeah, you can see a list; it's actually already there, it's just that I haven't done videos on it. So in AutoShot, you can set the export setting to either Blender, Unreal, or others.


And if you look through the others, there are like eight, including Fusion, Blackmagic Fusion in Resolve. We have an exporter for Nuke, we have an exporter for SynthEyes, we have an exporter for Cinema 4D. So a large number of these things we already have exporters for; I just haven't gone through and done a video for each one of them, but they basically all work. After Effects, [01:04:00] that one is now working correctly. So sometimes our development goes ahead of our tutorials.
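As a rough illustration of the "direct exporter" approach described above, here is a hedged sketch of what the Blender side could look like: reading a hypothetical JSON take file and keyframing a camera with Blender's Python API. The JSON field names are assumptions for illustration, not the actual Jet Set/AutoShot schema, and the real exporter does considerably more.

```python
# Hypothetical sketch: apply per-frame tracking data from a JSON file to a
# Blender camera. Field names ("frames", "position", "rotation_euler") are
# made up for illustration.
import json
import bpy

def apply_tracking(json_path, start_frame=1):
    with open(json_path) as f:
        take = json.load(f)

    cam_data = bpy.data.cameras.new("TrackedCam")
    cam_obj = bpy.data.objects.new("TrackedCam", cam_data)
    bpy.context.scene.collection.objects.link(cam_obj)

    for i, sample in enumerate(take["frames"]):           # one sample per frame
        frame = start_frame + i
        cam_obj.location = sample["position"]              # [x, y, z] in meters
        cam_obj.rotation_euler = sample["rotation_euler"]  # [rx, ry, rz] in radians
        cam_obj.keyframe_insert("location", frame=frame)
        cam_obj.keyframe_insert("rotation_euler", frame=frame)
```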


So yeah, my ignorance, I haven't really explored it, because that's not a part I've had to look at yet. Well, it'll be worth looking at, because this is really the key for visual effects. You can do everything by hand for one shot, whatever, that's fine.


What Jet Set and AutoShot are really designed for is when the shot count starts to rocket upwards. So we've built an automated pipeline processing chain, so that when you have a hundred shots coming through, it's not all that much different than when you have two.


Things are set up in a very standardized way of processing into all these different tools. When you render out of Blender or render out of Unreal, we've already set it up so that it renders into a standard set of directories with a standard set of file names.


Then you hit the compositing [01:05:00] script once again, it knows where the files were written, and it creates the node group for Nuke or Fusion, et cetera. So if you stay in the pipeline, everything stays automated through rendering, through compositing, and into finals. That's the real core of the system design: to automate the data flow of these pieces.
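A minimal sketch of the "standard directories plus standard file names" idea, so a compositing script can rebuild paths for every shot without hand wiring. The folder layout below is an assumption for illustration, not AutoShot's actual scheme.

```python
# Illustrative path convention only; the real directory layout may differ.
from pathlib import Path

def render_dir(project_root, shot, renderer, pass_name):
    """e.g. <root>/shots/SH010/render/blender/beauty/"""
    return Path(project_root) / "shots" / shot / "render" / renderer / pass_name

def frame_path(project_root, shot, renderer, pass_name, frame):
    return render_dir(project_root, shot, renderer, pass_name) / f"{shot}_{pass_name}.{frame:04d}.exr"

# A compositing script can then derive the same path for any shot and frame:
print(frame_path("/proj/demo", "SH010", "blender", "beauty", 1001))
# /proj/demo/shots/SH010/render/blender/beauty/SH010_beauty.1001.exr
```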


Totally, of course, that clearly makes sense. Otherwise you have so many things to deal with in isolation; otherwise it's a mess, it becomes very difficult and very overwhelming to deal with. Yeah, this question came from the VFX team because they're also not clear.


They hadn't seen the whole thing end to end, so that is the question I had to check with you. Yeah, for the VFX people, the first thing they're going to want to know about is the post tracking data. So I did [01:06:00] one SynthEyes tutorial.


I'm about to do another one for what happens when you shoot Jet Set Cine but the phone gets bumped, so the scan is misaligned. It turns out to be very simple to fix, so I'm going to show the fix for adjusting that.


Yeah, it's easy, so I'll do that. Oh, go ahead. Sorry, quickly, along the same lines: for example, scenarios where we need to take the phone off and put it back, and of course chances are it is not in the same position now, and you don't have room for recalibration and stuff like that.


So are we still in good hands to carry on with the track and rely on the data? You want to be a little bit careful, but what this tutorial I'm going to upload shows you is how to correct it, because what usually happens if the phone gets bumped is that it hasn't moved that much positionally.


Yeah, maybe [01:07:00] a centimeter or something like that; that's fine. Usually the problem is that the angular orientation has changed enormously. And it turns out that in SynthEyes, you just load up the graphs of the incoming data, which has position and orientation, and you just select all the keys. I'll show it with the tilt, right?


The phone got knocked, so it was pointing downwards. I'll just show it to you. This is actually easier to show than to talk about, because this is going to be what's in the tutorial.


Mm-hmm. Let me pull up the script. Run script. I think this is it. Let me look at it. Okay, let me go to the camera. Yeah, this is it. Okay, so let me share a screen so you guys can see, because the fix is actually crazy easy. So, [01:08:00] share SynthEyes. Okay, can you guys see the SynthEyes screen?


Okay. So this shot came in. It's still starting up... okay, now it is. So what you can see here is that it has a scan, but if we go over here, you can see that the scan is misaligned, right? Come back over here: the edge of the floor should be here.


And you can see that it's clearly in the wrong spot. Let me just save the SynthEyes file quickly. There we go. Yep, replace, that's fine. So basically, the thing that happened is the camera got knocked, and positional errors aren't that big of a deal.


It's the angular errors that get you. So fortunately, what we do is just go to [01:09:00] camera and graphs, so we can see the camera and the graphs. I'm going to zoom in so you guys can see what I'm doing. I'm just going to go down here, find the camera, and I'm going to hide the solved velocity.


We're going to go to the solved path, because that's where the initial tracking data from Jet Set goes. I'm going to turn that off, and I'm just going to look for the tilt angle, right? These are all the keyframes. You guys can see this, right? The yellow line here.


Okay, so what I'm going to do is grab all the tilt keyframes with my cursor and push them up. Whoops, wrong direction. There we go. I'm going to eyeball that in a little bit. There we go. So now you can see, as we move through the scene, things actually line up. There's the ladder; all the different pieces of it are now basically correct, because [01:10:00] all the other pieces are correct, right?


I'll move back to here as we move through the scene. There we go. So you can see the mesh now aligns pretty well with the pieces, right? Sorry, SynthEyes is still caching information. Now that we've got that, I suppose we can adjust the pan a little bit, because it looks like the pan is a little bit off there as well.


So I will adjust the pan angle. Once again, just grab all the data, let me grab all the keyframes. There we go, and we can adjust all the keyframes at once. There we go, align that up, and I think that is the correct alignment. It'll depend on what your scan looks like, but let me look at that a little bit closer.


Actually, the ladder's over there, so let me adjust it so the ladder lines up. There we go. The neat thing about the scan is you can see when things line up, because you have a pretty [01:11:00] good scan reference. I've got to figure out a way to move by smaller increments.


But anyway, you guys get the idea; I'll figure that out for the tutorial. There we go, so that way things line up the way you think they should. Okay, and that's it. Then we can just do the same process. In this case, I'll go back to the camera view and move over here.


I think we have AI mattes. Let me turn on the roto mask. Yep, there's our AI mattes. Then we can just run our script and track it. Actually, here, let's do it. Script, we'll just run the multi-peel script on this guy. There we go. I'm going to put this in the tutorial, because there's a neat script you can run that goes through SynthEyes and tells it to detect features that are small, medium, and large [01:12:00] size.


And it automatically goes through it, because normally you have to do this manually and it's kind of tedious: scan for your tracking points at one size, scan for your tracking points at the next size. This just sweeps through and does it automatically. And this was a script made by Matt.


That's great. You just hit a button and it crunches through and finds the tracking points. It'll take a second to do this while we're talking, but this is nice, because unless they did something crazy on set, you can still recover it in post tracking and fix stuff.


Because you can see the scan is nicely aligned now with these pads here. Someone clearly moved this pad between the scan and the shot, so we won't use that for tracking reference, but the background's clearly aligned and doing what it should be doing.


And the AI mattes are keeping us from detecting points on most of the people in the scene. You can see it's [01:13:00] picking up lots of tracking points. Then I'll do a selection of the tracking points that I know are probably going to be pretty good. And yeah, there we go.


Okay, so it just tracked it. Now we have plenty of 2D points, but it's not yet a 3D solve. Let me zoom in a little bit so you can see it. There we go. You guys see all the green diamonds? All the green diamonds are valid tracking points, but it's not yet a 3D solve.


So what we're going to do next is pick a frame that has a bunch of them in a spot I know is going to be pretty solid, like the ground. There we go, there's a couple of markers on the ground. Where's a good spot? What if we go over here, earlier in the shot?


That looks pretty good. So I'm going to grab, I don't want to select the mesh, I'm going to grab around [01:14:00] this, and I'm going to add this tracker as well. I'm going to do Track, Drop Onto Mesh. There we go. I think that'll give me a solve.


Let's go try it. It's always fun to do stuff in real time. All right, it's going to go crunch the solve. There we go, it's crunching the trackers, six iterations. It'll take a second to solve it, and you can see over here it's iterating down, finding the error. Okay, so we're at a two and a half pixel error.


I'm going to get rid of the worst of it, fix those frames, and change to a fine solve. There we go. And that mostly worked. It did something weird, though: I didn't pick enough markers, so it threw the alignment off a little [01:15:00] bit, and I'll have to go back and redo it.


So I'll go back and do it right in the tutorial, but you guys get the idea. Basically, you pick out the points, you can realign the mesh, and solve it, and it usually works pretty well.
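A toy illustration of the fix walked through above, assuming the bump is mostly a constant angular error: adding one offset to every tilt (or pan) keyframe realigns the whole solved path. This is plain Python over a generic list of keyframes, not the SynthEyes scripting API; the names and values are made up.

```python
# Illustrative only: apply one constant angular offset to every key of a channel.
def offset_angle_keys(keys, offset_degrees):
    """keys: list of (frame, angle_degrees) tuples for one channel, e.g. tilt."""
    return [(frame, angle + offset_degrees) for frame, angle in keys]

tilt_keys = [(1, -31.2), (2, -31.4), (3, -31.1)]   # made-up values
fixed = offset_angle_keys(tilt_keys, +4.5)          # nudge until the scan lines up
```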


Right, right. So scanning is very essential for this, for each shot? Yeah, you really want to, always after you drop your origin, quickly scan. And I realized we're going to need to have that in the remote as well, so I already put in the request to put it in the remote. Because I see where you're coming from: they're moving the camera all over the place, and everything has to be remote, where you sit and reset the tracking.


Somebody runs out with the, whoops, my AI camera just ran away from me. There I am, come back, camera. Somebody runs out with a vertical marker, punches the origin, [01:16:00] scans, and all this stuff is happening independently. You're not slowing down the camera team at all, the director is talking to his actors, and you're just getting all the data behind the scenes. Then they hit roll and you go, and you've got all the data, all the tracking data.


Everything lines up, and yeah, the VFX team will be so happy. Yeah. And luckily, I just checked, I'm not sure how it compares with the older versions, but at least the 16 Pro has, I think, about a five meter range with the LIDAR. Oh, that's big. As far as I know it hasn't been updated since, but yeah.


But yeah, it does about a five meter range, so it's good. I still have a constraint, because I don't want to take it off the mount, do the scan, and then put it back, because again, that disturbs it. So it has to stay on the camera, but it's a hefty camera kit.


You cannot really [01:17:00] afford to swing it around. So yeah, with the given range, I think we should get decent data for this. Yeah, I think that can work. And the other thing you can do is scan with other devices; you just make sure they see the same coordinate marker or something, so you can go back and reference it and export a mesh with the same origin as the first one.
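A hedged sketch of that shared-marker idea: if both devices captured the pose of the same marker, a mesh from the second device can be re-expressed in the first device's origin. The 4x4 matrices and names here are illustrative assumptions, not a specific app's output format.

```python
# Illustrative only: bring device-B geometry into device-A's coordinate frame
# using a marker both devices observed.
import numpy as np

def a_from_b(T_a_from_marker, T_b_from_marker):
    """Return the 4x4 transform mapping device-B coordinates into device-A coordinates."""
    return T_a_from_marker @ np.linalg.inv(T_b_from_marker)

def retarget_vertices(vertices_b, T_a_from_b):
    """vertices_b: (N, 3) array of mesh vertices in device-B space."""
    homogeneous = np.hstack([vertices_b, np.ones((len(vertices_b), 1))])
    return (homogeneous @ T_a_from_b.T)[:, :3]
```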


So you can go back and reference and export a mesh with the same origin as, as the one. This we can simply do, you know, we can, yeah, with the poly cam and stuff, I can get great deal of mesh with my other phone, uh, on the set and, you know, keep this unrest in that case. Yeah, this is, this is great. I, I, I look forward to, uh, to, uh, um, I can tell you an anecdote from, uh, uh, people are not used to getting good data from set, you know, that they, they, they don't, they don't believe it.


And for mostly good reason. But back on Alice in Wonderland, we [01:18:00] tracked all the Steadicams, right? They're chasing Johnny Depp over trees, et cetera. So we sent the data off to the tracking teams, and at first the reaction was, yeah, whatever.


Then they opened up the data files and realized that everything actually tracked pretty much dead on, and they turned around and said, okay, give us every data file you have, we need that right now and yesterday. And that's been the reaction. VFX teams have to see it, because they're so used to getting burned on stuff, but when they see it and the data comes through,


then they have a realization. But yes, remote scanning, I'm on it. I see where you're going: it all has to be remote. You need to say remote Jet Set now. Yes, it's clearly becoming a necessary thing. And there are [01:19:00] more and more things spooling up that have the same needs; everybody has the same needs. The camera team is over there, you can't talk to them, but you need to vacuum all the data off the set and get it into a form that post is happy to see. We actually had a fun thing where a group, Harald Zwart, who has directed several movies, including the more recent Karate Kid movie,


so, a very successful director, is working toward doing an animated feature, and, we just found this out last week, has used Jet Set to build a new process for capturing data for an animation feature. They're calling it reference capture. He's sent us some notes on it and a video, and he's happy to talk about it.


They ran Jet Set. [01:20:00] As is traditional in animation, they captured the voices in the voice studio. And then they did something very interesting: they hired a team of mimes, actually, theatrical mime players, and had them lip sync while he's shooting.


They built the 3D sets of the animated environment, loaded them into Jet Set, and had live action actors going through and lip syncing the voices while he's blocking them, and he can block them in a traditional manner. They originally went to do this as a test for a few shots, and it went so fast, they said,


we're just going to shoot the whole production. So they shot the entire production with this, edited it, scored it, and showed it to an audience. It's real-time content with Jet Set, and some of the edges are chattering, et cetera, but you see the virtual backgrounds, you see the actors, you hear the emotion, you hear the actual voices that they would be using.


And [01:21:00] he explained to the audience, this is for an animated feature, these are stand-ins, et cetera. But the audience could watch it, and it completely clicked. And what's fascinating is their visual effects guy, this is months back, said, well, I'm not going to bother using AutoShot.


I see the JSON files are perfectly formatted. So they just wrote their own processing pipeline with the JSON files to pull it into all the tools they're using. I think we're going to go back and write some more code into AutoShot so that they can connect to it directly, but you can do it yourself if you want to.


But it was pretty remarkable to see, when people realized the data quality and what they could do with it. I think they're going to reinvent how you do animation production, because you can do it so quickly and fluidly. And they're using the actual framing of [01:22:00] the characters.


That's what they're going to take into animation. So anyway, just an exciting little development there. Sure. Okay folks, that's probably good for today. Looking forward to it; let me know what comes up and what we can fix. Oh, thank you. Thanks guys. All right, thank you guys.


Next time. Thanks.