Transcript
Office Hours 2024-09-13
===
Eliot: [00:00:00] All right. Morning, all. Morning. Hello, Brett. Hello, Joe. All right, let's see, what should we jump into first? I guess first of all, Joe, are yours After Effects questions, or are we on the DeckLink stuff?
Joe: Uh, well, I guess in terms of, I mean, I don't know if there's been any progress on the DeckLink?
Eliot: Not yet. We've got to install it in our machines at the office, then go to town on it for a couple of days.
Brett: Okay, cool. I have a similar question about the DeckLink, and I actually found a really interesting video on YouTube this morning about getting timecode from a DeckLink input into UE5.
Eliot: Ooh, let's look at that. I'm all ears.
Brett: So here, let me see if I can share this [00:01:00] link. I don't know if I can do this, but I started watching it this morning. I haven't gotten all the way through it, but he does go through the process. He's using a DeckLink 4K. Let's see, how do I share this?
I haven't shared a screen on this in a while.
Joe: On the bottom, you should just see a share button.
Brett: Start share, here we go. Uh, desktop one, Safari, there we go, go there. Share, open system settings.
Joe: We've been able to get timecode in, it's just not,
Eliot: Yeah, Joe, you've got a good echo going. I'm hearing a lot of feedback there.
Joe: Is that, that's, [00:02:00]
Brett: Let's try this. I don't know if this will work. Oh, it says I have to quit in order to share. Well, let's try it again.
Joe: Recording in progress.
Brett: Okay, so, there it is. Okay. Share.
Eliot: Okay, take a look at this one. Do you have the link for it?
Brett: Yeah, it's up there. Here I go. I don't know if I can find it. Let's see.
Joe: Can you paste that in the chat, please?
Brett: So, I don't know, because I was actually the one that put the question on the forum, and then I saw your response that you guys had tried it and were having trouble, and then I started doing some research this morning. [00:03:00] But if Joe's already gotten the timecode in, that may not be the issue.
Eliot: Yeah, so here's where we are so far. Timecode coming in, we got that, so that's no problem. And we have the embedded timecode coming in from Jetset. Normally, what we just did with the AJAs is put both of the feeds with timecode into Timed Data Monitor, hit sync, and it synced.
I'm like, ooh, that was easy. So then we went to do that with the Blackmagic, and that was not straightforward. So I'm trying to go through it, and with all these things there's going to be some little goofy switch hiding down in some menu that's like, oh, that was the switch.
It's going to take time. I'm working mostly out of my house in Irvine, but our lab, our insert stage up in LA, has a DeckLink, all the [00:04:00] cameras, the heavy gear, and stuff like that.
The guy who's up there is going to be back on the 17th, and then we can just go to town on it. It's going to take hours, I'll tell you that right now.
Joe: Oh yeah, I believe that wholeheartedly. Well, I appreciate you guys doing it.
Eliot: Oh yeah, no worries. Joe, I'm super curious, but there's a massive echo and I don't quite, how do you,
Joe: I'm trying to see if this works here. Hold on. Oh, you still got your, I don't know why I'm getting that echo.
Brett: You've got two sign-ins, it looks like. Oh, there it is.
Joe: Yeah, one of them, because I've been doing the two sign-ins, because I have some stuff to share on my computer. Let me try using a different audio source.
Eliot: The mic is active on both of them. I think that's why. Joe's iPhone, yeah.
Joe: I guess the speaker, yeah, it's the speaker on [00:05:00] my phone for some reason. I don't know why, it never did that before, but
Joe: Sorry, I apologize.
Eliot: Zoom audio science. Yeah. So, I don't have anything new on the DeckLink front, because we're just going to have to go to town on it. It took many round trips with the AJA stuff before we got it clean, you know, where it's like, okay, push this and push this, and I got it in my head which piece was hooking up where inside Unreal.
And I expect it's going to be exactly that way. There are some people out there who clearly got all the pieces working, so I suspect it'll be a matter of, we'll probably have to up the frame buffers on the DeckLink to buy some more time, and then we're going to have to tell it correctly where to go.
Yeah, there's some Timed Data Monitor stuff that we're all getting a little bit of a crash course in.
Brett: Yeah, and I'm curious about this, because [00:06:00] I have a system that's got an 8K DeckLink, but I'm about to get another system that we don't have a card in yet. We may end up going with the AJA card because of these difficulties.
Basically, the other system will be on a stage, and that's where we really need the live preview. The system I have in my house will be more for post work, so it's not as essential there. But the new system will be coming to me soon, and they're expecting me to set it up and give them recommendations.
So I'm trying to determine that right now. We're going to play around with the Elgato over there just to get it up and running, and I told him I'd let him know once I got some better answers about which card is going to be the best solution. And right now, it sounds like the AJA is the one, or one of those.
Eliot: With the AJA, we hit a button and it worked.
Brett: Yeah, that certainly makes it easier.
Eliot: So can you tell us a little bit more about your production [00:07:00] process, your pipeline, kind of what you're doing?
Brett: Yeah. So, I come out of post. I live here in LA too, and I've been in television post for a long time, doing visual effects and finishing.
But because of the strikes and everything, I found myself unemployed earlier this year and decided to get into this. I partnered with a guy that's got a green screen stage in West LA. He's just shooting podcasts and stuff there right now, but he really wants to get into virtual production, and I didn't have a stage.
So it kind of worked out. I said, well, I'm learning how to do this, and if you let me use your stage to perfect what I'm doing, I'll set you up for free. And then maybe he sets other people up, or he starts making content. So that's basically what we're doing: trying to set up a green screen stage for virtual production.
Eliot: And the interesting thing is, if you're coming out of post, are you [00:08:00] planning on using the live Unreal connection more as a lighting preview on set?
Brett: Yeah, that's exactly it. A lot of the podcasts he's doing are with comedians, some famous comedians, but he wants to do a sketch show, and he doesn't want to present it to these upper echelon people until we've really got something.
Even though we'll do it in post, I like the idea of a preview. I've done a lot of tests just using the cine camera and shooting without any preview, and I've had great success with it. It's a great product. I'm very impressed, I really am, in terms of the cost and what you're able to do.
And honestly, the way you can work untethered with just a phone on your camera is really exciting to me. So I'm pitching him on this whole idea of using your product to do that over there. He's got some money, he's invested, and he's spending some more. He's buying this computer because [00:09:00] I told him I didn't really want to take my computer out to the stage; I prefer having it at home.
I was working with a laptop that wasn't so great, an older laptop, but he ended up purchasing this, or he's getting it put together this week. We don't have a card yet, so it's perfect timing
for me to figure out what the best solution is before we buy something.
Eliot: That makes sense. That makes a lot of sense. One of the things, oh, go ahead.
Brett: Go ahead, go ahead. That's it, that's really it.
Eliot: So one of the things that I'm curious about, and we're just now introducing this, so I'll put the tutorial video up so you guys can see what it can do: the trick of having to haul around a cart and basically get an engine to run on set is a thing, as we are all discovering.
And one of the things I'm quite curious about is a new capability we just put into Jetset, which is rendering Gaussian [00:10:00] splats. Just to get an idea of what that can look like, I'll put this over here, and there we go. This is being rendered in real time on the phone.
You'll notice the lighting is changing, the reflections are changing, all this kind of stuff. And this is just from a coffee shop, I think over in Beijing somewhere. One of our colleagues over there took a few hundred photographs, scanned it, did the solve, et cetera, and then you can train what's called a Gaussian splat model. The quality we can render on the phone is nuts.
It's just nuts. And it's on the phone, so on-set presence becomes, you know. Oh, we're still getting a, are you guys able to hear me? We've still got two [00:11:00] mics. You know what, I'm going to mute one of you guys. Should I mute your phone? There's a Joe Zohar and a Joe's phone.
We'll, probably,
Joe: yeah. Are you able to hear me at all now?
Brett: Oh, that sounds good.
Eliot: Now?
Brett: The echo's not there again now.
Eliot: Okay, I have no idea why that was; I just muted one of them. And so, you know, this is one of these things where I'm always trying to figure out what is the lightest possible thing we can be doing on set.
Because as we're finding, the more connections, the more failure modes there are, right? And there's a category of thing where you just need it, you need to have the engine on set, because you're going to be moving a ton of stuff around on the fly, and the director's changing things left and right, and okay, great, got it, you've got to do it. But I'm not sure that's going to be necessary for lots of things, especially if you're doing something like improv comedy or a comedy sort of thing. You don't want to be sitting there modifying the 3D scene. You just want to let the [00:12:00] actors go.
Yeah, absolutely.
Brett: And I think our plan, if we get this working the way we want, is to use it for people to previz, get a quick look, and for lighting. But we're not going to record the signal. It's really just to have people look at it and go, oh yeah, that looks good.
Because I prefer working untethered with the camera. I've got decent battery life and a couple of spare batteries, so I can really work without any cables attached to me all day long, which is what I'd prefer when I'm shooting. This guy I've been working with has everything tethered. He's got an ATEM and he's doing live switching. I don't know why exactly, but he's invested all his money in that for the podcast stuff. For this, I've been trying to sell him on the idea of not having the camera [00:13:00] attached, but he's really insisted on being able to put it up on a monitor for people to see before we start shooting, or maybe while we're shooting. But I don't see it as necessary
for anything beyond maybe the lighting.
Joe: You could use the Accsoon if you want to be untethered and get a wireless feed.
Brett: I did tell him that. But the Accsoon will give you the camera signal without the key though, right?
Eliot: Yeah. Now there's,
Brett: Right, yeah, a couple of things. Okay, go ahead. That makes sense. But that's my thing: we want to be able to see a preview of the comp, the final composite.
Joe: No, that makes sense. I guess that would have to come out of the machine, out of whatever capture card you have.
Brett: Yeah, that's kind of what we're looking to do. It looks like you were able to achieve that, and I have been able to get it going with the Elgato, [00:14:00] and I was trying to figure out, if I had an Elgato and a DeckLink,
whether I could get it back out to a monitor from the DeckLink. But if I'm going to have the DeckLink, I'd prefer to have the video coming in on the DeckLink as well.
Joe: Were you able to get it in sync on the Elgato?
Brett: Not quite, but I got it pretty close. I was just kind of sliding it around a little bit.
I got it real close.
Joe: Yeah.
Brett: And I can certainly go full screen on a computer monitor if I don't need to send it out to a television. But this guy who owns the studio, his ideal is to get something that's in sync that he can show to clients as we're setting up. I'm not going to record that, though.
I'm going to record it the way it's really designed to work, which is to record the video, like in Blackmagic RAW, and then bring it in and do post processing to get the comp done. Because I think you're always going to get higher quality results if you do it that way, and he agrees with me.
He comes out of post as well, and he [00:15:00] understands. He just wants to be able to show somebody on set: hey, this is more or less what it's going to look like. Do you like this lighting, or should we adjust it? What do you want to do?
Joe: I got it for the same reason, so I'm there with you.
That's exactly one of the reasons, because clients need to see it. You know, they just need to.
Brett: Yeah, so that's what I'm looking for. I don't want to record that signal at all; I just want people to be able to see it. And if it can be in sync, all the better, you know?
Eliot: Right, right.
That way the clients aren't saying, why is it doing the weird thing in the monitor? And I totally get it. It's just that the number of pieces I'll have to click to get that to behave is not a small number of pieces. One of the things we're figuring out is that the Gaussian splats are one piece of it.
Now the problem, of course, is that the lighting is baked into them, right? So even if you derive the Gaussian splat from a [00:16:00] 3D model, you have to take it and let the model crunch for some chunk of time, and then you get this nice compressed thing out of it that can render. But you can't be like, oh, I'm going to move a hard light over here, and do all the interactive things.
And sometimes people are really going to want to do that. It's just that the complexity kind of blossoms.
Joe: With that, sorry, because I wanted to talk about some of the Gaussian splat aspect, but I didn't know if Brett was still going.
Brett: Oh, that's it. I just wanted to give you guys the picture.
So yeah, we're in a similar place on the DeckLink thing.
Joe: So I had a couple of things I wanted to share. Essentially, I'm really trying to implement a particular kind of functionality that I think is very possible to achieve with this.
So let me share something, because I used Polycam [00:17:00] to capture my bedroom and then brought it back into Unreal, and everything was working. So here, let me share this. Can you see that project?
So this is the comp plate. I just put in a person, like a Mixamo model, just sleeping on the bed. Now, here it looks pretty good. It looks like the track is pretty solid. It's hard to play at full res. And this is cine footage.
I took the Polycam of it. Then, let me see if I can't just show you. Sorry, I'm a little new to [00:18:00] this. I mean, this is cool, Eliot. Between you and Bill, you guys have been pushing me into Blender, and I'm really understanding why.
Sorry, let me just move this. So here is the scan behind it, and here's the scan collection. Is it not going to show the scan? Yeah, the scan's back there, the wireframe from the app. So, okay, I was able to do that, and I liked what I was seeing.
I was like, oh shit, okay, that was working. This is matching pretty well with my live action. And I could see this being really helpful. For example, I have the shoot I've been telling you about, Eliot, with Intel, where we're doing a live performance, virtual production music video. And what I would love to do is scan the whole set.
We're going to be on there with the monitor screens, and I could bring that back into Blender so that in post I can [00:19:00] add other elements, especially in the foreground. Let's say I want to roto; we've got some trees, I don't know, I'm just saying, because I don't know what the set's going to be. Let's say it's a jungle, or an Asian garden.
So you have one of those bridges, right? And let's say the artist is walking on that, and I want to run some graphics, almost like post production projection mapping. Like if I wanted to put things on objects and have them animate on there, instead of doing that in camera, which I would love to do, but we couldn't.
I want to use this tracking data to bring my scene in, and then have a virtual version of that scene that I can paint whatever I want on. Does that make sense?
Eliot: Yeah, I got it. Got it.
Joe: So you know what I mean? Because then I would have the recreation, because essentially then just the person's moving, but everything else isn't.
So this application of being like, cool, I could scan my whole set and then be able to bring it in, seems possible. And then I would just [00:20:00] work around the moving talent or anything that was moving, which is fine. So I did this test to see if that's working. I rendered this out of Blender, and I made sure it was 24 frames, but it's like, man, the track is just not there.
And I'm like, okay, that's not what it's supposed to be doing. I'm wondering if I'm doing something wrong in the render or the importing process, because everything is 24 frames. I even have it here, making sure it's right there on my render, with frames that should match up. So I'm wondering if you've encountered this at all.
Eliot: The first thing we're going to want to do is a viewport render [00:21:00] in Blender, to see if the tracking looks good in Blender as we go through it.
And I think you were running into crashes. We fixed that this morning, so we can install the updated one off the website, and then it should work.
Joe: Oh, so then for Autoshot?
Eliot: Yeah, the Autoshot Blender add-on.
Joe: The Autoshot Blender add-on, okay. Yeah, because I was getting this, this is what was happening.
Eliot: Yep. That was a Blender 4.2 curveball, and that's fixed. So let's get that installed, because I want to see Blender first, and then we can figure out how to make the jump.
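[For reference, the viewport render mentioned here can also be kicked off by hand from Blender's scripting tab. A minimal sketch, not the add-on's actual code; the output path is a placeholder.]

```python
# Renders the animation through the active viewport (a fast OpenGL draw,
# not a full Cycles/Eevee render) -- enough to eyeball whether the tracked
# camera sticks to the scan before committing to a long final render.
import bpy

scene = bpy.context.scene
scene.render.filepath = "//vp_render.mp4"           # placeholder output path
scene.render.image_settings.file_format = "FFMPEG"  # write a movie file
scene.render.ffmpeg.format = "MPEG4"
bpy.ops.render.opengl(animation=True)               # viewport render, all frames
```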
Joe: Okay, cool. And then what was the other thing? Okay, Blender; I'm just downloading that now.
Eliot: There's a deeper question in tracking and After Effects and stuff.
After we get this, let's check on that one. I've been thinking it through, and I know what's going on; I'm just thinking of the best way [00:22:00] to approach it.
Joe: Yes, because that was going to be my next thing: sending the information from Blender to After Effects.
Because right now I did the render out of Blender, but what if I want to send that camera tracking data too? So it's like, okay, do I want to set the bed on fire? I've got a cool little fire element here, and I just want to stick it right on the bed. Theoretically, that should work. So those are the two main things I was hoping we'd try to figure out today. So, this goes in my
Eliot: Program Files? Actually, you don't need to; just download it. The Autoshot Blender add-on, just put it in your Downloads, that's fine. And we're going to install it inside [00:23:00] Blender.
Joe: Oh, I see, right, it's a plugin thing.
Eliot: Right. Go ahead and download it to a natural download place, and we'll go into Blender.
Joe: Should I uninstall the other one, or will it just automatically overwrite?
Eliot: It'll automatically install over it. So just go to Edit, Preferences. Yeah, you know the drill.
Joe: Autoshot Blender. Oh yeah, 0.111.
Eliot: That's the one.
Joe: Beautiful. Okay, I think it did it. Yeah, 0.111. Okay, great. So now I probably need to restart.
Eliot: Yeah, it's good to exit and come back; that way you make sure the plugin is loaded.
Joe: Let me just do that.
Joe: Dude, Eliot, man, you guys are gangsters, man. This is so cool. I can't tell you how much we appreciate this kind of engagement. It's huge. It is fantastic.
Eliot: It's the real way, you know, [00:24:00] that you find out what's going on.
Joe: Yeah. Yes, sir.
Brett: I don't think I realized you had these office hours until yesterday, when I was poking around on the site.
I said, oh, that's where I need to go. I wish I had more questions. So far, the only big one I've had is the issue with the DeckLink card, and obviously you're working on that. But this is fascinating to me, so I'll just hang around and watch for sure.
Joe: Ah, it works, I think. Yeah,
it didn't give me that error.
Eliot: Yeah.
Joe: Okay, cool. So it's doing its render. It's rendering now, I guess, right? I think so.
Eliot: It should be. Is it kind of going through its thing somewhere?
Joe: It's going through the shot. Okay.
Eliot: Okay, that sounds great. So let's check that out. But jumping in, I parsed kind of what you're talking about and then saw the footage; there are a couple of approaches.
Oh, hey, there we go. That's it, this is good. And this is kind of fun, because it's [00:25:00] actually a surprisingly harsh test take for direct CGI live action integration.
Joe: no, exactly.
Eliot: The way most post production camera trackers work is, you give it a piece of footage, it goes through that footage, and it looks for natural features in the scene.
And it says, okay, this feature is over here on this frame, over here on this frame, over here on this frame. It automatically tracks tens, dozens, hundreds of those through that portion of footage, and then it backs out the camera motion by analyzing all those individual 2D tracks.
If you get enough of them, you can back out the 3D camera motion. Usually, right? Usually. Sometimes it's a little bit fragile, because you
Joe: depending how much parallax and stuff you have to get, be able to get away with it, right?
Eliot: Yeah, it can blow up pretty easily. So. [00:26:00] Yeah, that's great. All right.
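[A rough sketch, for reference, of the solve Eliot describes above: how a post tracker backs a camera out of 2D feature tracks. OpenCV is used here as a stand-in for a post tracker's internals, not anything from the call; the tracked point arrays pts_a/pts_b and the intrinsics matrix K are assumed inputs.]

```python
# Two frames of the same tracked 2D features (Nx2 float arrays) plus the
# camera intrinsics K are enough to recover relative camera motion.
import numpy as np
import cv2

def two_view_solve(pts_a, pts_b, K):
    # Essential matrix from the correspondences; RANSAC rejects bad tracks.
    E, _ = cv2.findEssentialMat(pts_a, pts_b, K, method=cv2.RANSAC)
    # Decompose into rotation and a unit-length translation -- the monocular
    # ambiguity Eliot mentions: the pixels carry no absolute scale or position.
    _, R, t, _ = cv2.recoverPose(E, pts_a, pts_b, K)
    # Turn the 2D tracks into 3D points using the two camera poses.
    P0 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P1 = K @ np.hstack([R, t])
    pts_h = cv2.triangulatePoints(P0, P1, pts_a.T, pts_b.T)
    return R, t, (pts_h[:3] / pts_h[3]).T  # homogeneous -> Euclidean 3D points
```

[With too little parallax, the essential-matrix step degenerates, which is exactly the "blow up" failure mode described here.]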
So the thing that's interesting is what happens when we're taking the data from the phone. When you solve it in SynthEyes, or you solve in After Effects or Nuke or whatever, they all kind of work the same way: if you get enough parallax, you have those 2D points, and once you have a solve, you turn those 2D points into 3D points.
Then you can stick stuff on those now-3D points, and it's going to stick really well. The interesting thing coming in from the phone is that we don't have the 2D points. We just have the 3D camera, and we have the geo, which is, you know, the bed.
And that's why you knew where to put her on the bed and have that all line up. In After Effects, I'm still figuring my way through this, because if you were in Blender, what we would do is exactly what you did: you drop [00:27:00]
in the scan, put everything on top, and then
Joe: the camera movement goes over it perfectly. Yeah.
Eliot: Yeah. And After Effects traditionally did not have a full 3D scene that you could just drop stuff into.
Joe: Yeah, you need, like, null objects to attach it to.
Eliot: Yeah. But the new version of After Effects, as of like three months ago or whatever it is, does actually have a 3D scene.
This is clearly going to come up a bunch of times, and there's a hard way to do it, which is we run it through our SynthEyes retracking pipeline, get a bunch of 2D markers, and export that to After Effects. For a category of thing, we're going to need to do that.
But this shot's tracking great, right? It's basically already mostly there with what we've got.
Joe: Yeah, I see a little bit of slipping. If you look at her knee on the bed, you can see it's kind of slipping a little bit, but it's solid.
You know what I mean? It's pretty good. And it's [00:28:00] actually really good for a free motion track, before you've even shot the footage and analyzed it. So that's pretty good. That's why I'm wondering, okay, is SynthEyes the only way to then get a perfect, rock solid track?
Eliot: Okay, so we've got a couple of things going on. One is that we have our SynthEyes pipeline. I don't know if you've seen the tutorial on it, but it's actually pretty cool. Because normally, if you try to track stuff from scratch, it's brutal. Even if it solves, and you need a fair amount of parallax for it to solve,
with the math of doing monocular, single camera tracking, you inherently lose position and orientation. There is no such thing inside those pixels, right? So you're sitting there adjusting and squinching it around, trying to get it to line up, and it's a giant pain in the neck.
What we're doing is taking all the Jetset on-set data, because you've got the camera track, you've got the [00:29:00] scan. And it's worth it, I'm going to play this just so you see how fast this stuff can go. It's kind of fun, because one of the other users is a stunt guy, and he taught himself SynthEyes by watching the tutorial video.
And then he was just bombing through shots, which is hilarious. But I'll show you, let
me go find that real quick. Because it's worth it to understand: there's a category where we do need to do that, when you need sub-pixel, when you have a visible seam between live action and CGI. The iPhone, basically almost any real-time track, is not a sub-pixel track. It's just not, right?
Not ours, not, you know, pick your poison; they're all good. But sub-pixel requires a crazy level of precision that right now I think you only get from a post refinement. So we actually built a pipeline to do that super crazy [00:30:00] quick, and I'm going to pull up the video where I show how that works.
Joe: Are you talking about the SynthEyes tracking video?
Eliot: Yeah, the SynthEyes tracking one.
Joe: Oh yeah, no, I watched it. I had watched it, but then I ran into this issue of not being able to render out of Blender. So I thought it would stop me dead in my tracks on the tracking refinement.
Eliot: Oh, no, no. The tracking refinement is parallel to the stuff we're going to pull into Blender; we can pull the data into SynthEyes, no problem. So we're doing that. And at the same point, there's a category of shot where, if it's complex enough, yes, SynthEyes is crazy powerful and it can handle anything.
Honestly, it's sort of how deep you want to go into it, and my answer is, I want to go not that deep, but do all the automated things that can go fast. And then if the shot [00:31:00] complexity goes way high, then okay, send it to somebody who's into this. But for the most part, I wanted to get to the point where users could follow the tutorial, because it's that 3D scan data coupled with the on-set track
that's the magic. Because we can use a trick in SynthEyes where we let SynthEyes look for the points in the scene that track well, and then we project through the camera, through those points, onto the mesh, and that gives us survey data. And survey data is gold.
Normally, when you work on a big feature, there's a bunch of people walking around with a theodolite, you know, a surveying tool, and a tripod, going dink, dink, measuring the 3D positions of a bunch of points on the set. Then you pull that in, and you have exact data. The data we're generating with this pipeline
isn't that exact, right? But we tell the SynthEyes solver to use it as a smart [00:32:00] reference, and you hit solve, and it locks it in there. The solved camera moves maybe a couple of centimeters from the original camera; it maintains the focal length, the scale, everything. And you can do it in like two minutes, right?
It's fast because everything's automated, and we're using the AI mattes to make sure SynthEyes isn't putting tracking marks on the people.
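[A sketch of the survey-data trick just described: shoot a ray from the solved camera through a 2D point SynthEyes chose to track, and intersect it with the Jetset scan mesh; the hit becomes a known 3D survey point. This is an illustration using the trimesh library, not Autoshot's actual implementation; the mesh path, intrinsics, and camera pose are all placeholders.]

```python
import numpy as np
import trimesh

mesh = trimesh.load("set_scan.obj")  # hypothetical scan exported from Jetset

def survey_point(u, v, K, cam_pos, cam_rot):
    # Unproject pixel (u, v) into a camera-space ray, then rotate to world space.
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])
    ray = cam_rot @ (ray / np.linalg.norm(ray))
    hits, _, _ = mesh.ray.intersects_location(
        ray_origins=[cam_pos], ray_directions=[ray])
    if len(hits) == 0:
        return None  # this track never touches the scan
    # The nearest intersection to the camera is the surveyed 3D point.
    return hits[np.argmin(np.linalg.norm(hits - cam_pos, axis=1))]
```

[Every 2D feature that hits the mesh comes back with known 3D coordinates, which is what stands in for the surveyor with the tripod.]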
Joe: So does that only work if you have a green screen shot? Because I think I was having trouble when I was trying to process this scene.
In Autoshot, it froze when I had the AI model, the InSPyReNet one. It would just lock up.
Eliot: Oh, let's look at that next, because that's really useful. In general, you don't want the tracker to track people, because it'll throw off your solve.
So it's extremely useful to just be able to have the [00:33:00] matting tool take care of that.
Joe: I thought that was genius when I saw it in there. I was like, oh my God, that's incredible.
Whoa, that's insane to have that built in. But then I was like, wait a minute.
Maybe if it didn't find a person, it would lock up. I don't know, I just know it wouldn't let me do anything. I would click it and I'd just get the pinwheel.
Eliot: Oh, okay. All right, let's do a couple of things.
Let's let Blender cook through its thing.
Joe: Yeah, it should be almost done here. It should be done in a second.
Eliot: But it looks like it's holding pretty well, you know, it's looking good.
Joe: Yeah, I think so too. This is very promising. That's why I was like, okay, this would be great if I could get this camera tracking data piped into After Effects.
Then I'd be getting closer to the functionality I'm really looking for. Eventually I'm going to go to Nuke, but I [00:34:00] just know After Effects now. I mean, dude, this Jetset process has had me learning all types of software. You guys are awesome.
Eliot: It drops you in at a level where normally you'd spend the first three weeks on a piece of software learning how to do ingest and the bookkeeping pieces of it.
And, you know, that just turns you right off.
Joe: But here, FYI, I'd never touched Blender before this. Never touched it. So, you guys, it's awesome, I love it. Oh, okay, I think it's done. Yeah, I think the render's done.
Eliot: Yeah, let's play it, let's see how it looks when we're looking at it in real time.
That way we'll get a good sense of what we should expect.
Joe: Can I just play it right from here?
Eliot: You can exit this window, the Blender render window, because it's done rendering. Okay, cool. And there you go.
If you go into Autoshot and you click Open next to the [00:35:00] take. Way down, under your take selection. Yeah, there's an Open button right next to the take. There we go, that's the guy. That's a shortcut that opens Windows Explorer
right at that take's top directory. Just click into the preview folder, and there should be a video in there. Just pull that over a little bit. Yeah, the VP render. There we go. There's the VP render. Yeah, that's cool.
Joe: Yeah, that's pretty good, that's not bad at all.
I mean, that's so much better than what I was getting. You can see the drifting a little bit, but it's pretty good.
Eliot: Yeah, I'd say we're in the range of solid. And a couple of other pieces: we're going to do a new release of Autoshot in the
next day or two, where we fixed some points of our color space conversion. We had a bug where [00:36:00] we were clipping values, and now we've fixed it. Good Lord, that took some doing. Anyway, I'm not going to rant about color gamut conversion.
Joe: I was trying to figure some of that out too, but that actually makes sense.
Because for this stuff, I shoot on Z CAM, and it shoots log, but I was like, okay, what color gamut and space does that use? And it's ProRes, ProRes HQ. So I don't know if that,
Eliot: The Z CAM, what log format does it use, Z-Log?
It's like their own log format. Okay, I'll look that up. What we're going to have to do, if it's in Resolve, is go make a 3D LUT and add it to our list of LUTs. Because with LUTs, you've got to be precise. Like, if you shoot in Sony S-Log3 / S-Gamut3.Cine, you have to use that exact LUT to get the transform correct.
Joe: That was
exactly why. So, [00:37:00] yeah. And Z CAMs, definitely, I know they've got their LUTs, their Rec. 709 LUTs, available for download and all that stuff.
Eliot: All right, I'll go look that one up and get it right. Yeah, I'll cross that bridge.
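[Why the exact LUT matters, in miniature: a .cube file is just a 3D lattice of output RGB values indexed by input RGB, so pushing Z-Log footage through an S-Log3 LUT remaps it through the wrong curve. A hedged sketch; the file name is hypothetical, and real pipelines use trilinear or tetrahedral interpolation rather than this nearest-neighbor lookup.]

```python
import numpy as np

def load_cube(path):
    size, rows = 0, []
    for line in open(path):
        line = line.strip()
        if line.startswith("LUT_3D_SIZE"):
            size = int(line.split()[1])
        elif line and (line[0].isdigit() or line[0] in ".-"):
            rows.append([float(v) for v in line.split()])
    # .cube data order: the red index varies fastest, then green, then blue.
    return np.array(rows).reshape(size, size, size, 3), size

def apply_lut(img_rgb, lut, size):
    idx = np.clip(np.round(img_rgb * (size - 1)).astype(int), 0, size - 1)
    return lut[idx[..., 2], idx[..., 1], idx[..., 0]]  # lattice indexed [b, g, r]

lut, n = load_cube("zlog_to_rec709.cube")  # hypothetical Z CAM LUT file
frame = np.random.rand(1080, 1920, 3)      # stand-in for a decoded ProRes frame
graded = apply_lut(frame, lut, n)
```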
I've got time before I get there. All right. So, about the way we get tracking data: this is actually a really great example, because the tracking data is good, but it's not a sub-pixel lock, right? So depending on the usage you want in After Effects, I think we can get to this level of tracking in After Effects.
When you need a sub-pixel lock, we're going to need to do a refinement pass. That's kind of the line of demarcation.
Joe: Which I'm into. Yeah, you just answered my question. I was like, what is the line we can get up to without the refinement pass? I just wanted to make sure I was hitting that.
I was like, okay, am I getting as good as I can with Jetset solo? It seems like we're there now, and that's working. So I'm definitely interested in the [00:38:00] tracking refinement aspect of it. Because the reason I did this is I'm trying to get a solid test that has a rock solid track, a shot I could just send to them and be like, look, I got fire on the bed, it's raining in the room. I literally just want to throw every fucking effect on this test,
to just be like, look at what you can do, how you can track almost everything, anywhere.
Eliot: Okay, in that case I know your next step: your next step is this refinement stuff. And there are a couple of key things. There's a bunch of things this helps with.
One of them is that, and I need to test this, and if you get a chance to test it I want to hear: normally, with SynthEyes, you need enough parallax in the shot for it to solve. The reason is that it's correlating the 2D pixel motion, and it needs enough of it to back out the math.[00:39:00]
And if you don't have that, then you have to switch to a tripod solve, and that's a bunch of approximations, et cetera. But if I've got this right, and we need to test this: because we're doing a survey-data-based track, you have the mesh of the bed, I saw the scan, right? So you do the track, and you project those points onto it.
When we have survey data, we don't need parallax, if I've got the math correct on this. And this is a big deal, because normally if you do a tripod shot, you have to set up SynthEyes one way, and if you do a moving shot, you have to do it another way. There are a lot of if-thens,
and a lot of fussing in the UI, because you have to handle these incoming mathematical conditions that can be basically bad to solve, and you still have to solve it. With that scan, though, and this is why I say, anytime you're shooting a Jetset Cine shot, set the origin and do the scan.
Joe: It's funny,
I was skipping that,
and I didn't realize how valuable that was. I was like, oh, I don't need it.
Eliot: It's gold. It's [00:40:00] gold. Right, because once we can project those points, I'm going to go a bit deep here: the underlying core of what SynthEyes or any other solver does is called a nonlinear least-squares solver.
There are a bunch of them out there. It has a bunch of variables, and it's trying to make a camera solve out of them. And what we're doing by giving it survey data, by getting the scan and telling SynthEyes to find good tracking points and projecting them to get 3D points, is that,
of the, like, 20 variables that solver has to work out, now we're giving it 15 of them. We know the position, we know the initial conditions, we know the focal length, we know the 3D points. All of a sudden you're down to four or five variables, and the solver can do that, no problem.
So I want to do some tests with this, but if I've got the math right, it just means you can hit go, and it's going to work 90 percent of the time. Everyone's going to have [00:41:00] some crazy shot; I'm dealing with one right now. One of the other users is a stunt person, and the camera bombs down and whips around the guy.
The whole background turns into this giant blur, like, oh, you know? Okay.
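[The constrained solve in miniature: with survey points and focal length known, the nonlinear least-squares problem collapses to the six camera pose variables, minimizing 2D reprojection error. A hedged sketch with scipy standing in for SynthEyes' internal solver, on synthetic placeholder data.]

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

K = np.array([[1500.0, 0, 960], [0, 1500, 540], [0, 0, 1]])  # known focal length

def project(pose, pts_3d):
    # pose = [rotation vector (3), translation (3)]: world -> camera -> pixels
    cam = Rotation.from_rotvec(pose[:3]).apply(pts_3d) + pose[3:]
    uv = (K @ cam.T).T
    return uv[:, :2] / uv[:, 2:3]

def residuals(pose, pts_3d, pts_2d):
    return (project(pose, pts_3d) - pts_2d).ravel()  # 2D reprojection error

rng = np.random.default_rng(1)
pts_3d = rng.uniform(-1, 1, (30, 3)) + [0, 0, 5]   # known 3D survey points
true_pose = np.array([0.02, -0.01, 0.0, 0.1, 0.05, 0.0])
pts_2d = project(true_pose, pts_3d)                # the tracked pixel positions
jetset_pose = true_pose + 0.02                     # on-set pose, slightly off
fit = least_squares(residuals, jetset_pose, args=(pts_3d, pts_2d))
print(np.abs(fit.x - true_pose).max())             # ~0: the pose locks in
```

[Without the survey points, the 3D positions would be unknowns too, and the same solver would be juggling all twenty-odd variables at once.]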
Joe: That's the beauty of what this is. People trying to track those crazy shots, it's impossible if you're trying to do it from the footage, because you're never going to be able to lock onto any of those points.
Right? So to me, that is where I see how this could be really revolutionary: hey, do you want to shoot your shots and have your tracking data before you even bring the footage into any effects program? See, Brett, even right there, because I,
Brett: I've spent years of my career fixing shots like that.
Joe: You know what I mean?
Brett: I'm an old school Flame guy, so I've done it.
Joe: The real deal right there. You're at the highest level there is when you're on a Flame setup. You [00:42:00] know, so that's why, to me, just playing with this Jetset stuff, that was where I wanted to dive in, because I saw such immediate application for so many people, like infinite, you know. And then, Eliot, I don't know if you saw, I sent you that video where somebody was using an action cam.
Eliot: Oh yeah, I watched that. That's great.
Joe: That was pretty cool, right? I thought that was pretty badass. So I wasn't sure: is there any way that could work in conjunction with this? Can it provide any additional information that would be helpful?
Eliot: I'm taking it in. I watched the video, and I saw what they were doing.
They basically solved the shot with a 360 camera, and then they had a manual offset between that and the main camera and stuff like that.
Joe: And it sounds very similar to what you guys were trying to automate with the Cine offset, right?
Eliot: Yeah.
Where they would get bit is that [00:43:00] they're doing a bunch of things that are manual, and they're assuming the alignment is one to one, exactly square, which is a reasonable assumption, because they have it bolted down. There are ways in SynthEyes to use multiple feeds to get it in.
I'm not sure yet. I want to see, because I think this scan-based stuff we're doing is actually going to be a better fit for a lot of cases, especially when you have a cropped-in camera and stuff like that. I'm not a hundred percent sure.
I liked what they were doing, though.
Joe: I guess I saw the two, because that was a reason why I shared it with you. I saw kind of a fork in the road of use cases, where I can see everything we're doing here in Jetset being [00:44:00] great for studio.
Right? Because you need the laptop there, you have to sync it to the ArUco marker, you have to do that. And when you're out in the field, that might not really be an option. If I'm getting a complicated shot out on location, I may not be able to bring that stuff with me to sync it like that. I know you can do manual syncs,
you showed how to do that, but I saw the 360 thing as, ah, we're out in the field. So I don't know, maybe the manual offset is the solution if you have to do it out in the field. It just seemed like having 360 information of your set could help with that kind of accurate, hardcore sub-pixel tracking.
I'm just like, okay, is there any way we could feed it extra data to help?
Eliot: Well, it's definitely cool. And as I sit around thinking through it, one of the things that comes up, that people want us [00:45:00] to do more and more of, is data management, right? You have all these things going off on set, and you want them all to fire at the same time and have the recordings match up, and all these sorts of things.
And there are not presently very many good ways of getting data like that from set into post. I saw what he was doing; he kind of snapped his fingers and used his manual sync, et cetera, with the take, and that makes sense. I liked it. I'm going to have to stew on it a little bit more. I think you're exactly right about the slight pain in the butt of the digital slate.
It's great for studio stuff. At some point, we're going to add syncing based on timecode into Autoshot, because we already have the Tentacle Sync on there, so our cine footage has the same timecode as our tracking data.
So we're going to be able to sync that in post, and that means you don't need the digital slate, right? It's [00:46:00] still super nice to have a burned-in thing. It's like, no, I'm on this take, I'm on this ID, right? There it is, irrefutable.
Joe: Which, again, for studio stuff, I think is brilliant.
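[The timecode sync described here reduces to frame arithmetic once both devices are jammed from the same Tentacle Sync. A sketch assuming non-drop-frame timecode; the values are placeholders, not Autoshot's implementation.]

```python
def tc_to_frames(tc: str, fps: int) -> int:
    # "HH:MM:SS:FF" -> absolute frame count (non-drop-frame assumed)
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

cine_start  = tc_to_frames("14:23:10:08", 24)  # from the cine clip's metadata
track_start = tc_to_frames("14:23:09:12", 24)  # from the Jetset take
offset = cine_start - track_start
print(offset)  # 20: trim 20 tracking samples to align the take, no slate needed
```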
Eliot: Yeah.
Brett: I would agree. Sorry, I didn't mean to interrupt. I agree, the idea of not having to have a computer nearby would be ideal. And I had a question that just came up in my mind about lens calibration. Is there a way to do that without having Autoshot on set?
Eliot: So right now we need it, there's just no two ways about it. The reason we have it in Autoshot is that we tried to do that number crunching on the phone, and we'd smoke the phone. It's like, bon voyage! Au revoir!
Brett: That makes sense.
Brett: My question is, if I'm out in the field, running in a place where I don't necessarily have some sort of network to jump on, what's the [00:47:00] solution?
Do we have to run a hotspot or something like that? Because if I'm in the field, I may not have anything outside of maybe a 5G signal.
Eliot: A hotspot is your friend. And actually, we found this nifty little travel router that's pretty sweet. I put it on a production recommendations page, because this just comes up; we're going to make it a standard recommendation.
I'll share a screen real quick; actually, you guys can pull it up. It's a little travel router, it's tiny, it runs on not much power, and it's designed to run under another wifi system.
So even if you're going to a stage that has wifi, I think you're going to want to move to something like this, because you can set up your own wifi network at home. It's designed to go in a hotel room, right? You go, and then you [00:48:00] have your own wifi network that's shielded from the main wifi system.
It runs its own DHCP, DNS, et cetera, and that way you are in control.
Brett: It's a hardware VPN.
Eliot: Yeah. You bring along your laptop, you plug it into that, and it gets the internet from the other wifi signal, but inside, your devices are just talking amongst themselves.
That way you set it up in your house, everything works, and you walk out on set and you're still on the same IP addresses. You're not dealing with weirdness, because when you walk into somebody else's network, weirdness is going to befall you, right?
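[A quick sanity check for the kind of network weirdness discussed next: confirm the laptop and the phone actually share a subnet. The UDP "connect" to 8.8.8.8 is a standard trick to learn which interface routes outward; no packets are actually sent. The phone IP is a placeholder you would read off Jetset's settings screen.]

```python
import socket

s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
s.connect(("8.8.8.8", 80))     # selects the outbound interface; sends nothing
laptop_ip = s.getsockname()[0]
s.close()

phone_ip = "192.168.8.105"     # placeholder: read this from the phone
same = laptop_ip.rsplit(".", 1)[0] == phone_ip.rsplit(".", 1)[0]
print(laptop_ip, "shares a /24 subnet with the phone:", same)
```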
Brett: Oh yeah, it's already happened.
I had no problem with this stuff at home, but when I got on that green screen stage and their network, we had trouble getting Jetset and Autoshot to talk to each other. I eventually got it working, restarting things over and over: restarting the laptop, restarting the network, restarting the router, whatever. It finally turned [00:49:00] out it was because I have a VPN running on my phone, which wasn't a problem at my house, but it was definitely a problem on the stage.
I had to disable the VPN. So that's why I say it sounds like a hardware VPN, which is kind of an ideal situation.
Joe: And they're cheap.
Eliot: Yeah. What we're trying to do over and over is remove variables, right? Remove all the unknowns in computers, networks, and stuff like that.
And you just want it to be your own network that you own, so everybody else can go do their thing and they can't touch yours, right?
Joe: Yeah, no, this is awesome. Especially for 90 bucks.
Eliot: We're experimenting with this. I first tried a big fancy one, and I realized that it expected its own network feed.
It wanted to go directly to the fiber modem, et cetera. I'm like, okay, no, no, no, we don't want to have everybody else shut down their network. You just want to [00:50:00] piggyback off the existing network. And the cool thing is this can run off a phone hotspot. You take your iPhone, tell it to hotspot, run a USB cable, and boom.
Then it uses your phone for the internet stuff. But most of the Autoshot traffic doesn't go to the internet, right? You're just bouncing stuff back and forth on the local area network, and then you're fine.
Joe: So you would still need to bring a laptop, or some kind of machine, on set to plug into this router too.
Your phone and the machine would both need to be on this router.
Eliot: I think what people are going to do is use something like a Mac laptop, because they run forever on battery and they're silent, and you only need it for the calibration part, just for calibration.
Brett: Yeah, I have a Mac laptop.
I haven't used it with Autoshot at all, actually. I have an older PC laptop that I've used, but it's really rather slow. I didn't even realize
Joe: I didn't realize you guys had Autoshot for Mac either, so [00:51:00] that's good to know.
Brett: Yeah, it only works on M-series hardware on the Macs, though.
I've got an M1; it's first generation, but it should work.
Joe: I've got the M1 too. Yeah.
Brett: So, you think it'll work? Because I'd prefer to use that over this PC laptop, since it's definitely faster. But I've stayed away from it, because Macs and Unreal and all this kind of stuff don't seem to play very well together.
Eliot: Yeah.
Brett: Unreal is a thing.
Eliot: Unreal is a thing. Honestly, I don't know if I've ever tested Unreal on Mac, because everybody who's using it is already on a Windows box with a big NVIDIA card.
Brett: I've worked with it on Mac.
There are a lot of limitations; I would not recommend it. But Blender works great on Mac and PC. I've used Blender a lot for character stuff, because I've worked with an animator who does that. And we've actually been working with [00:52:00] another technology, Wonder Dynamics, I don't know if you're familiar with that.
It's basically motion capture, but without the suits and markers. It's fascinating what they're doing. You can shoot video and replace an actor with a CG character. There are certainly limitations to it, but it's pretty cool what they're doing as well.
Eliot: Yeah, that was neat stuff. I saw the videos coming out, and it was impressive that they're doing both the capture and putting together a bunch of tools.
They're doing the AI-based background work and things like that. I thought it was great.
Brett: Well, one of the things they give you is, you shoot your video, you upload it, and it processes it. Then you can download the finished shot, which isn't really finished, but it's a good estimate of what it's going to look like.
And it will also give you the Blender project back with all the animation built into it, so you can actually finish the shot in post [00:53:00] using that stuff. So it's great. But back to this: SynthEyes is kind of my next step. I've gotten pretty good results just using the Jetset track, but you think that's really almost
necessary to get a really nice, high quality shot in the end?
Eliot: It all depends. Is there visible ground contact in the shot? Like, the camera is two feet away from the contact point? Yep, you're going to need to track and refine.
If you're doing the, you know, walk-and-talk with the camera, where it shows waist up and they're going along, the Jetset data is fine.
Joe: I've seen that.
Brett: Me too, but as soon as I start to shoot somebody standing somewhere, I've noticed it for sure. So I agree with you.
Yes: if I'm shooting waist up, and not trying to do handheld or something like that, 'cause handheld gets tricky, I've [00:54:00] noticed. Um,
Eliot: It's interesting, because, um, we've actually had, like, the Truesdale Brothers shot this boxing short. Let me see if I can find the, uh, Truesdale...
There we go. Let me pull this thing up. Uh, they just did this as a spec commercial.
Joe: Was it that Venom one? Yeah. And it's really fast. Fast handhelds,
Eliot: you know. Um, let me pull this over and turn on the video clip.
Joe: Yeah, that was a cool clip. I was like, damn, that's what I'm talking about.
Eliot: And I want to, you know... we've got to do some interviews with some of these folks to get them up.
'Cause, so that everybody can see what everybody else is doing. But all the shots in the ring and stuff, that's just handheld Jetset, and the camera's moving so fast that post tracking... good luck, right? What's nice with this stuff is that the sweet spot of the Jetset stuff is the sort of fast-moving [00:55:00] camera work that would be hard to post-track, and in the places where you need an assist, like the ground contact stuff where it's real visible, it makes a good, like, you know, reference.
It's a good jumping off point, uh, to do that. So that's, um, and that's. I mean, just mathematically, that's just how it is. If you have ground contact, you need a sub pixel resolution. We can't do sub pixel in real time. And I don't think any, honestly, if you're, if you're curious, I worked out the math. Let me just, I'll, I'll just put this on the, cause I, I, you know, I wanted to actually just, I knew it intuitively and then I wanted to actually just kind of, um, show people, but why, why you do this.
Um, Let me look at this production. I think I put this under Autoshot. Uh, difference in tracking quality. Let's see if I put that there. Okay. Yeah. So this was, uh, cause I, what I wanted to do the math because it's, it's [00:56:00] just, this is just math, right? So, um, and what I did is I wanted to show people, you know, roughly the, the quality they, you know, that if it's done right, this is roughly a good Jetset track, which is, you know, it's solid.
It's, it's working, but it's not a set pixel lock. And if you, if you zoomed in, you'd see the, the small micro jitter sorts of things. And, but even then, right. With, with things in a good case. It all depends on how close your sensor is aligned to your main camera. Right? And so, and I worked out that the, you know, the trig of, of like, if you're aligned by a 10th, misaligned by a 10th of a degree, and you're two meters away from whatever your, your point is, then that's already a third of a centimeter, right?
You're misaligned by two 10th of the degree and you're, you know, and it goes up very, very quickly until you're, you have virtual centimeters of error with. And it's, it's inherent because the, the tracking sensor is mounted to a camera. And if it's like it this far off, you got three centimeters of air, right?
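[Editor's note: the trig Eliot describes, worked through. For an angular misalignment $\theta$ between the tracking sensor and the taking lens, looking at a point a distance $d$ away, the world-space offset is]

\[
e = d \tan\theta \approx d\,\theta \quad (\theta\ \text{small})
\]
\[
e(2\,\mathrm{m},\ 0.1^\circ) = 2 \times \tan(0.1^\circ) \approx 0.0035\,\mathrm{m} \approx \tfrac{1}{3}\,\mathrm{cm},
\qquad
e(2\,\mathrm{m},\ 1^\circ) \approx 3.5\,\mathrm{cm}
\]

[The error grows roughly linearly with both the angle and the distance to the contact point, which is why a rig that looks close in real time can still need a sub-pixel refine in post.]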
You know, that's not like [00:57:00] our technology or someone else's technology; that is math. And, you know, okay, so I was a mechanical engineer, so you cannot fool me about tolerancing. I'm looking at a camera rig as it goes together, and the AC is hoofing it to put it together.
You are not within a tenth of a degree of accuracy. I mean, you know, you put it on to calibrate, and then they go by and it gets banged against the door. They move stuff over for the gimbal operator to balance things. Stuff happens, right? And it is a mistake to expect laboratory precision on a stage.
So what we are trying to do is design for the realistic use case of: okay, we got real close, and now we need to lock it up. And let's do that in the most efficient manner possible. Uh, anyway. Yeah, I think that is the best tool for that right now. It's the tool that we built the initial pipeline on, and it [00:58:00] has... uh, actually, I tried to do this in Nuke,
and, um, inside their tracker, and I couldn't make it work. I'll show you the magic trick in SynthEyes that makes this work. 'Cause they're all... I don't know, the trackers are all good and they can all do it.
Joe: Should I get SynthEyes, like, right now? 'Cause, I mean, that was one of the things: I don't know if we had to perhaps send this shot
to SynthEyes, to try to refine it. I don't know if it would take longer than we have right now, or if it's worth me doing.
Eliot: Oh, actually, yeah, we can totally do it if you want, if you just wanna get, you know, like a... Yeah, I'd love to see it.
Brett: I haven't watched that video yet.
Joe: Right. I figured you'd like to see it, Brett, as well, right?
So I was like, okay,
Eliot: Yeah, let's do it. Let's do it. Because you've already got the data. You already have the mattes, all the pieces.
Joe: That's what I figured. I figured most of the time-consuming stuff was done, so I was like, wait, let's track it. I'm just gonna do a month subscription here real quick.
I'll just bite the bullet, do the month subscription, and if this works, then I'll subscribe for more. But I figured, you know, 50 bucks is a fair enough risk there. So let me... I'm just filling this out.
Eliot: This is a hundred percent why [00:59:00] we do all this, 'cause each individual step is a low bar, and it gets you there, and it means that you're still in control, right?
Yeah. I've told people: we're gonna walk through this and we'll get it to work, and then you're still in control and you're not having to go hire a facility just to get your couple of shots done, right? You hire a facility, you have to...
Joe: ...be out, like, 400. I'll tell you, I've been telling a bunch of my friends about it.
And one of the things that I think, um, especially from, like, even a larger-scale Hollywood, more commercially viable production front: I think a lot of it is reshoots. Dammit, I need a pickup shot. Shit, I need to get this thing. Do I have enough money to get every location, all of this? It's like, but when you were on set, did you get a whole bunch of good reference pictures?
Did you do everything? So if you need to quickly make a, you know, oh, I just need a closeup of this person saying this line, or I need an insert of somebody grabbing [01:00:00] something... that kind of stuff, I feel like this is money in the bank. You know what I mean?
Eliot: Yeah.
Um, it's like,
Joe: So I've already seen a lot of ears, like, perk up at that thought: oh, okay, I didn't even realize that was a way to use this. So yeah... what's up, sorry.
Eliot: Oh, I think the Gaussian splats are going to be a revelation.
Brett: I think so too. Yeah.
Eliot: You can capture those. And, you know, 'cause people are already...
Joe: They can do it.
Eliot: I think it's all different levels, like, what's the level and the app. We're experimenting with Polycam and it looks really cool. Um, the best results we've seen so far is when someone has LiDAR and, uh, you know, takes a bunch of photographs and runs it through... what's the program... Capturing Reality. That's where that coffee shop scan came from.
And then it's like, bam, it's a location at that point, right? If you have [01:01:00] captured a location, you can go back and reshoot in that location if you need to, right. And, you know, people are going to do this for reshoots a couple of times, then they're going to go, wait a second: we could just capture the location and shoot in it.
You know, I'll sit here and predict it right now. It's the first time I've seen something that can capture... like, with photogrammetry, you end up with these kind of dead-looking models, because you have to crunch down the textures, and it looks like shit. You're not capturing the light, you know, whereas the Gaussian splats captured the light; now we're getting the reflections, and now we're talking.
So, you know, if you have to capture, like, go take a thousand photographs? Great. Okay,
Joe: So let me see if I... because the one thing about Polycam that I liked was the speed. Like, so fast. You know what I mean? That was amazing. I was like, holy shit, I took the scan of my bed in like 30 seconds. I was blown away, and it looked good.
I was like, damn, that's pretty [01:02:00] high-res, especially off the iPhone 15 Pro. Like, it's nice. So I can definitely see, like, people who... if they want to shoot it for real, it's like, okay, take the DSLR and get those super-high-res, you know, clean captures. But for people, especially stuff like this where I'm like, no, I'm going to reshoot this anyway,
I just want to get a nice.
Yeah, essentially to use it the way you did it for your Unreal-Blender walkthrough, right? You know, 'cause that was a Polycam scan that you used for that too, right?
Eliot: Oh, the Blender one. Yeah. It was a scan of a set that they built on the stage. They scanned it with Polycam and pulled it out, you know, dropped in the scene locators, the whole nine yards.
This, again, is where it's going to go. We're working rapidly toward... so right now most people are getting this because they want on-set previsualization, or like on-set visualization and then post tracking, exactly what you guys are doing. I think what's going to start happening, and we're putting in some of the pieces to do this [01:03:00] well, is, um, once you have a location scan, a Polycam scan, drop it in, you're going to plan your whole shoot out. We're going to be putting in a couple of features to let you put, like, a digital actor in there that you can move around easily.
I mean, it's just, you know, a Blender figure, but being able to actually move them around a little bit so you can direct where they're going to be. Um, because when you drop it in Blender, you're like, oh, I want him to be over here, and you wouldn't be able to do that without going back and forth to Blender, whereas...
Yeah. Now you just do that in the app. So we're in the middle of doing that. Um, but when you can do that... and you can already embed audio. So if you have actors, you know, just run lines, and you record that and drop it into the file: Jetset, or like, AutoShot, will already bake the audio into your USDZ.
So you can hear the actors talking, right? You can do the takes where they're talking and they have the conversation. And you can block it, you can test it and watch it, right? [01:04:00] And I've done a few tests with this where you're, like, editing in Resolve:
oh, I need a shot. Go ahead, get the shot, dump it over. And the instant people get an idea of how fast you can do this, where you're blocking shots and you're watching your work manifest in front of you and you're like, oh, I need a shot of the walk over to the living room, get it... they're going to plan their whole show this way, because you can go so fast and you can watch it in context.
You know, in an edit, like, watch what you're creating and get a sense for where the camera needs to be, that kind of stuff. Some people are doing this already in their head. Love it. Spielberg, you do you. But some of us would like an assist on understanding how this is all going to work, working our way through a scene.
It's a 3D world. It has limitations. You know, the camera has to be here; the crane is this big; this kind of stuff. And to be able to work through that in your living room, in your bunny slippers, before you're on [01:05:00] set and the crane guys are like, you know... yeah.
Brett: Yeah, it's amazing
Eliot: yeah, I I just I'm very excited about that part part of it, um, all right, so All right. Let's say you want to share screen and we'll let's, let's walk through it.
Joe: Yes, indeed. I sure do. Okay. Ba bam. Okay.
Eliot: Uh, you guys able to see that? Yeah, there we go. Okay. So first of all, let's, in AutoShot,
uh, let's go over to that guy, and let's just make sure we've run the, um... did you run the others? There you go. And, uh, yep. So don't... so turn
Joe: InSPyReNet. Do you wanna see what happens when InSPyReNet's on? 'Cause then it blocks... Oh, that's
Eliot: right. It causes... I'd actually like to see that.
Um, let's, uh, okay: let's Save and Run, Add Camera Image Plane. Okay. Yep. So, the take directory. And then [01:06:00] it,
Joe: and that's... and that's it? No, no. Oh, let's, no, no, let's hang on.
Eliot: 'Cause what's going on in the background... we may have to wait here for a little bit, you know. And, okay, it's 500 frames. It might take a while.
Okay, what's happening is in the background... oh, there we go.
Joe: It looks like it's doing something
Eliot: Now it's loading up two gigs of... it's an AI model. It has to load the model weights into the GPU. The model is not small, right? And so, you know, let's just make sure it's... yep, there we go.
It's cooking frames.
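[Editor's note: a minimal sketch of why the first frame stalls. The model class and file name here are hypothetical stand-ins, not Lightcraft's actual code: gigabytes of weights get loaded onto the GPU once, and only then is per-frame inference fast.]

```python
import torch

# Hypothetical weights file standing in for the roto model AutoShot ships.
model = torch.jit.load("roto_model.pt")  # reads ~2 GB of weights from disk
model = model.eval().to("cuda")          # one-time upload into GPU memory

@torch.no_grad()
def roto_mask(frame):
    # Once the weights are resident on the GPU, each frame is quick.
    return model(frame.to("cuda")).cpu()
```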
Joe: So maybe it was because I didn't have SynthEyes? 'Cause it was just... like, literally yesterday, I did the exact same thing I just did right now, and I did it like seven times, and every time I wouldn't get that bar. I waited for ten minutes and I didn't get that status bar.
Eliot: It's probably because I'm watching it.
Joe: Yeah, that's what I thought. I was like, of course, now that I bring it to the mechanic and there's nothing wrong with the car.
Eliot: Right, right, exactly. Because I'm watching it and it's working. Yeah, [01:07:00]
Joe: No, I mean, in this one, there's no green and nothing to roto.
So, you know, I'm just glad to see that it's working for one of you.
Eliot: Oh, so there's no actor in it. I mean, this is the bedroom shot. Okay, in that case, let's go ahead and cancel, 'cause InSPyReNet will do kind of weird things if there's no actor.
And so on the AI roto model, let's go to None. Um, there we go. 'Cause what it'll... and then let's, okay, yeah, let's go ahead and click Save and Run. And then it should actually be really fast. There we go.
Joe: Yeah, when I did this, this was super fast.
Eliot: Done. Yeah. And just so you know what's going on: InSPyReNet is an image segmentation model, but it doesn't necessarily care about humans versus other things. It cares about what's in the foreground, and it segments that. Usually, if you have actors in the foreground, it segments them really well, right? But if you just have, like, a lamp in the foreground, it's going to segment the [01:08:00] lamp. It's kind of funny to see it do that.
Um, so if you don't have anything in the foreground, what can happen is it can generate a roto mask that covers the whole scene. And then you're not going to get any tracked points. I was tearing my hair out on this, like, why isn't it tracking points? I clicked on the roto mask, and I'm like, oh yeah, it's because there's not a person in the scene.
It just masked the whole thing.
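[Editor's note: a sketch of the failure mode Eliot just described; the function is an editor's illustration, not AutoShot's internals. Candidate track points that land inside the foreground matte get discarded, so a matte that covers the whole frame leaves the solver nothing to work with.]

```python
import numpy as np

def filter_track_points(points, matte):
    """Keep candidate track points that land on unmasked (background) pixels.

    points: (N, 2) array of (x, y) pixel coordinates
    matte:  (H, W) uint8 roto mask, nonzero = foreground (excluded)
    """
    xs = points[:, 0].astype(int)
    ys = points[:, 1].astype(int)
    keep = matte[ys, xs] == 0
    return points[keep]   # an all-foreground matte returns zero points
```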
Joe: I gotta say, though, that's genius. I'm going to test that next. I'm going to just shoot something with a person, because the fact that I can get an AI roto model so that it doesn't solve for that movement is, like... um,
Eliot: I showed this to someone who was like...
Joe: It's a little extra. You guys didn't even need to do it, but you did.
Eliot: Yeah, yeah, that's it.
Eliot: What's going to happen with all this stuff, and we can see it rumbling, is exactly what happened with the systems we built for Once Upon a Time and Pan Am, you know, these big episodic shows.
When people get a taste of how fast you can go, the shot count will go through the roof. Absolutely 100 percent [01:09:00] through the roof, and this whole thing is designed to handle that. So, you know, right now we're processing individual shots, but the reason each shot has a hex ID is so that later we can add in our timeline capability, which I'm already working through how to process, because you're going to be processing 100 shots at a go.
And you need to be able to do that with one person, you know.
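[Editor's note: a tiny sketch of that bookkeeping idea; the naming scheme is hypothetical, not AutoShot's actual format. A short unique hex tag per take lets a hundred shots be batch-processed and cross-referenced without collisions.]

```python
import secrets

def new_take_id() -> str:
    # 6 hex characters, e.g. 'a3f9c2'; effectively unique per production.
    return secrets.token_hex(3)

# Batch processing keyed by hex ID rather than by fragile shot names.
takes = {new_take_id(): f"shot_{i:03d}" for i in range(100)}
for take_id, shot in takes.items():
    print(take_id, shot)  # each processed result stays keyed to its take
```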
Joe: No, I'm glad to see you guys factored all of that in for, like, right when you actually gear up for a real production and you're like, holy shit. 'Cause you could easily get lost in the weeds without those kinds of identifying markers.
Eliot: The wheels would fall off the cart. Yeah, exactly. 'Cause you're like, okay...
Joe: Yeah, that's really the thing. And you guys are right to focus on that, because the idea of something being cool is one thing, but the idea of it being practical, something you can repeat and that is simple, that is a whole other thing, you know what I mean?
And I already know Brett understands that pain of, like...
Brett: Oh yeah, I worked on Once Upon a Time.
Joe: I can't pitch this. I can't pitch this to a client. You know what I mean? I just can't. [01:10:00] There's too many variables. I don't know if I could send the information out to the artists that need it. You know what I mean?
And then you can't see it.
Eliot: You know, yeah, I watched the tail end of one of those seasons, and it was less than 14 days from shooting on set to rendered, in your living room: 300 shots, post comped. Insane. It was going so fast that the network would sign off on the show with the real-time preview stuff. That was our previs system back then: they would sign off on the edit with the previs edits in there, because it was going so fast. The final visual effects shots were dropping into the mix as it went out the door.
It was nuts. And once you see it... oh yeah.
Brett: show frequently finished, uh, day before day of Air . I, I, I worked on the finishing on that show. It's, uh, it was very tight schedule. and the post workflow, which show is
Joe: it?
Brett: Uh, once Upon a Time, .
Joe: Oh, once upon a time. Oh God. Yeah. Got, okay.
Brett: I also worked on Pan Am, uh, but only the pilot.
Um, but yeah, I also worked on The [01:11:00] Flash on CW, which had ridiculous turnarounds like that, uh, you know, with a lot of visual effects, and just insanity: dropping visual effects shots in hours before it had to be at the network, right? And in that environment,
Joe: exactly, you need that efficiency. Because that's been my focus: how do I get a workflow that is solid, that is simple, that I can show others? That's the thing. I'm not going to be doing every position, so when I'm on set, I need to be able to clearly delegate. That's why your guys' emphasis, and your team workshopping with us, people who are out there actually doing it, is what's gonna make this land in actual cinema production, and not just be, like, oh, this is a cool thing you can do if you have a passion project that you don't mind fucking around with and looking like an asshole for your first 20 takes, you know what I mean?
And, like, it not working, things like that. So, okay, great. Yeah. I just [01:12:00] wanted to give kudos to you guys. Okay, so let's...
Eliot: Let's get this going. So first of all, in AutoShot, let's click, uh, right next to the SynthEyes script export; there's a directory button.
So go ahead and click on that, and that's going to automatically open up the Windows file browser. There we go. And so, go ahead and go up and copy the path, because what you want is the path; we're going to have to open this up in SynthEyes. Actually, you can just click to the right of that, uh, and you'll get the whole...
Yeah, copy the whole address as text. There we go. Then we go into SynthEyes, and we go under Script, that very top row, and we're going to go Run Script. And paste in the, uh, the path. There we go. Just hit Enter. There we go. That's the script we want.
So it's going to pull that in and import it. It's going to take a second. And, all right, there we go. Import's done; click OK. Now, uh, go switch your [01:13:00] view. Uh, actually, let me find my little annotator. They've hidden the annotator for me in Zoom and I don't understand where they put it. But up on your right, under Layout, in the very top area, you can click Camera.
That's it. There we go. And let's scroll back out a little bit. Okay. So that is... all right, I think that is your footage. Um, and it looks like it's lined up. So we can zoom in again, scroll wheel in a little bit. And, um, so that looks good. I think that's your bed.
Uh, it looks like it's lined up. Oh, yay. All right. Fantastic. So what we can do immediately is... and oh, this is a cool thing, let me give you a link. Uh, oh yeah, good. And, okay. Yeah. Just go up a directory. This [01:14:00] is a quirk of SynthEyes. Um, actually, go back to the one you were in.
Yeah, go down, go down to the SynthEyes folder. There you go. And then just click on that and click Save. We already generated the SNI file, but for some reason SynthEyes wants us to go back and re-save it; now it won't bother you. It usually does end up doing that every time.
At some point we're going to fix that, because it's a little bit irritating. So, okay. The first thing we're going to want to do: normally what you'd do is just run the auto solver, but let's not do that yet. Okay, this is key. Normally in SynthEyes you'd hit the big green Auto button. We don't actually want to do that. Not yet, right?
Because we want to... yeah, we're going to break the process up just a little bit. And I'm going to give you the documentation. I'm going to be doing this in the next tutorial I'm doing with Unreal and Fusion, uh, and SynthEyes.
So we're going to want, uh...
Brett: Oh, that's perfect. I'm doing a lot of my work in Fusion now, because Flame's way [01:15:00] too expensive, and so was Nuke, honestly.
Eliot: Well, Fusion's great, man. It's fast.
Brett: I did, uh... the last show I did ended in like May. I was the visual effects supervisor. It was a cheap, low-budget sitcom, but I did all of my visual effects in Fusion and fell in love with it.
I really did.
Joe: Would you recommend that over Nuke? Well, I'm not a visual effects guy, I'm a director, but I like knowing about visual effects. You know what I mean? I like being able to go into programs and learn and play, so.
Brett: Yeah, I think, I think it's very good. It actually has a lot of the same capabilities that I was using in Flame, but the thing that I really like about it is it's fully integrated with both edit and color.
Joe: Yeah.
Brett: Um, and if you're working in a facility that's Resolve-based for color and editing, it's almost a no-brainer to start doing it that way. If you've got a visual effects team, people that know how to use it, uh, yeah, I think it's [01:16:00] fantastic. You know, it's not the standard, but I know a lot of people are moving that way because of the cost.
It's incredibly cost effective.
Eliot: There's a category of things where we're seeing it happen over and over, which is: it's a lower-budget show, so they need to pipeline hard, and there's a lot of green screen shots. And so we're doing this on, um, uh, a show called Moonland; it's a kids' show in Norway, and that was the very first use of this pipeline.
Like, the pipeline didn't exist and they were already shooting the show. I barely got it running 40 hours before they started tracking shots. So it was close.
Brett: That's fantastic. Cause that's the pipeline I want to use. And
Eliot: And this is the key: I spent a year, you know, thrashing with Fusion and Resolve and programming it a lot. If you are trying to use the MediaIn nodes, they're very fragile in Fusion. The way Fusion is normally designed to work, the external one, is it uses EXR Loaders to load in the EXRs, and [01:17:00] EXR Savers, and then you can pipeline stuff and it's great.
And if you use EXR Loaders and Savers inside Fusion, inside Resolve, it works great. If you try to use the MediaIn node, which is what they added into the Resolve version of Fusion to take footage direct from the timeline: super fragile. It'll break and you won't understand why your comps are breaking.
At least that's what I ran into. So we actually built our automated Fusion loader, and, well, we can go through this. I haven't had time to do the tutorial on it, but it already works. You just drag and drop the Lua file, and it goes and synthesizes a basic comp, right? It hooks up the camera and the camera output and stuff.
But I would love to actually work with you on it. It's not been run through production yet. Like, I got it running to a base level, but I want to dial in all the pieces to make it solid.
Brett: And are you transcoding then, or are you working with camera originals?
Eliot: So AutoShot, the way we do it, is we go from the camera original and then we pull our EXR files, and we do the correct color... as of last week, the correct color gamut transformations... to get into an [01:18:00] ACEScg-formatted EXR.
And that's it: we basically standardize. Whatever gamut it's coming from, we go to ACEScg, and everything reads it and it works.
Joe: That's good. To have some kind of standard like that is good.
Brett: And I've worked with, uh, I mean, the show that I was working on didn't shoot raw. They actually shot V-Log, uh, like what you're dealing with, Joe.
Uh, and I did it directly in the timeline, 'cause I was the finishing editor as well. It was a simple show, and I'd do shots right in the timeline in Fusion, just because it saved me a tremendous amount of time. I never had to drop a shot in. I'd finish the shot, I'd tell the colorist, hey, take a look at that,
make sure you're happy with it. He'd adjust it, and most of the time he wouldn't even need to adjust it, and the shot was finished. And I could render out a final color-corrected shot for the client, 'cause everything was done remotely. I never even met these clients. It was the craziest show I've ever worked on.
I worked on that show for two seasons and I never met anybody in person, because it started during the [01:19:00] pandemic, and they loved working remotely so much they never went back. They're like, oh, it was the most efficient show I've ever worked on. But we did everything from the timeline directly into Fusion.
But I'm curious, you know, about the EXR workflow. I've done some stuff; I mean, certainly when I was working in Flame, I did a lot of things with EXR visual effects, either rendering them or taking them in from other vendors to adjust them. And before we got into Resolve, I was doing all the editing and the finishing in Flame as well, but as Resolve really developed their product, it just became ridiculous to do it any other way.
And they're very involved. I actually know a lot of the people that work over there. Uh, yeah, they're excited about making that the tool in Hollywood. They want everybody to be using it for everything: for editing, visual effects, and color, of course.
Joe: Oh, while I'm looking at this grid, sorry... I kept wanting to ask you this question and I forgot, but while I have it: when doing the scan, does the color of the mesh mean anything? [01:20:00] Like if it's red to yellow to green while it's scanning. Is that just, like, depth? I wasn't sure if it meant, like, green means it's a good scan, or red means it's a bad one.
Like, I wasn't sure. So is there no difference in color?
Eliot: No, no, it's just Apple. They do segmentation on the scan where they're trying to identify, like, this polygon is part of a wall, this polygon is part of a table, this polygon is part of the ceiling. So they do basic segmentation on it.
It's a little wonky, and, um, we don't really make use of that in any other part of it. So I'm going to show you a couple things. We're also probably going to need to, um... because right now the footage has not been de-logged, so what we're probably going to run into is that we're going to have some difficulty with feature extraction.
We can fix a couple of those.
Joe: Oh, because it's logged?
Eliot: Yeah. So let's do a couple things. Let's first, uh, hit P for your pre-processor, [01:21:00] and under that, right now it's going from ACEScg to Rec 709. Okay. That's right, 'cause we need to be processing... what we've got right now is EXR files that do not have the correct log conversion in them.
Um, so, all right, we'll get that. But let's see if there's a better way to do this. Um, I see: we've got this big spike in the middle of the histogram and very little contrast on either side. So, you know what, let's go through and see what it gives us. Let's just cancel out of this and I'll show you the process, and we may have to go back through and rework the color a little bit, and then we do it.
Joe: This is all good to know. I'm happy to run through this process and learn all these things. So,
Eliot: So we're going to need to download something. On the lightcraft.pro downloads page... let's go to that site... we have a nifty script that's going to save a lot of time. We already used one script: the AutoShot-generated script that loaded in all the data.
So let's scroll down and [01:22:00] get our, uh, SynthEyes multi-peel, near the bottom. There we go. And let's download that guy. Okay, great. And in SynthEyes... yeah, we don't need to open it yet. So in SynthEyes, we're going to go under Script,
and we're going to go to the user script folder, and then we're going to go find that download of the script, the multi-peel script. There you go. Let's copy that guy. Oh yeah, it looks like you already downloaded it. All right. Okay. And so then let's copy and paste it into the user script folder. Go ahead and back over to, um, SynthEyes; we'll go back to Script and, uh, User Script Folder. One up. There we go.
And it's going to open it up, and this is SynthEyes' user script folder. There we go. And let's remove the weird "(1)" on that; I always worry about parentheses in file names. There we go. Okay. And then, back in SynthEyes, [01:23:00]
we're going to go to File, and we're going to go to... oh, Find New Scripts. There we go. All right. So that just looked through the user folder and found the new scripts. Now we can go to Script, and we should see a new script. Let's go down, and we should see Multi Peel, Matt Burke style. There we go. All right, so go ahead and fire that thing.
And it's going to start working through... you can move the SynthEyes window over somewhere else; we're not going to be using it for this. So, okay. What's going on here: the part of SynthEyes that we're using is called the auto tracker. It's not the auto solver. The auto solver is the big green button; don't hit that. The auto tracker is going to go through the footage, and it's going to look for features that are identifiable in a series of at least 15 frames.
A pattern, yeah, right. And so the trick, of course, is normally you have to set the pattern [01:24:00] to, like, whatever size, and it looks for that size of pattern.
And then if you wanted to look for different pattern sizes, you'd have to go in and type those things and re-run... you know, there's a process to do it, which is laborious and a pain in the neck. So Matt fortunately wrote this Synthia script. SynthEyes actually has multiple scripting languages.
Sizzle is its default scripting language; that's what the AutoShot importer was, the first script that we ran. That loaded the Jetset take, dropped in the scan, dropped in the footage, did all the setup stuff, and configured the solver: all the things that we're doing. This is actually a different kind of script. It's called Synthia.
Synthia is advertised as having a voice interface. Makes no sense; whatever, don't do that. What it is, is a natural-language interface. So what you'll see when we open up the script: it's just telling it to run the auto tracker at three different [01:25:00] feature sizes, go through and look for the good ones, and peel the trackers: find the ones that are good ones.
Um, and it's just going to sweep through it. It does this kind of...
Joe: Removes anything that's, like, horseshit, essentially. Any, like, dead tracks, anything that will get in the way. Oh, that's smart. That's cool.
Eliot: Yeah. And it's sweeping up through three different stages of kind of small, medium, and large features.
And we're going to see whether we see features at the end of this. If we don't see features, then, you know, we need more contrast; we're going to have to fix our color stuff first. If we see trackers, which is little green diamonds showing up all over the place, then we have it.
So let's go back and forth. I'm not seeing green diamonds. I'm thinking we may have to get our color fixed before we do it.
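[Editor's note: the shape of that multi-peel sweep, sketched with OpenCV as a stand-in, since the real script runs inside SynthEyes via Synthia. Same idea: detect features at small, medium, and large pattern sizes, track each one, keep only tracks that survive at least 15 frames, and peel the rest.]

```python
import cv2
import numpy as np

MIN_FRAMES = 15  # a feature must be identifiable across at least 15 frames

def multi_peel(frames, block_sizes=(7, 15, 31)):
    """Sweep small/medium/large feature sizes; keep long, clean tracks."""
    kept = []
    grays = [cv2.cvtColor(f, cv2.COLOR_BGR2GRAY) for f in frames]
    for size in block_sizes:                     # small, medium, large passes
        pts = cv2.goodFeaturesToTrack(grays[0], maxCorners=300,
                                      qualityLevel=0.01, minDistance=10,
                                      blockSize=size)
        if pts is None:
            continue
        for p in pts.reshape(-1, 2):
            path = [p]
            cur = p.reshape(1, 1, 2).astype(np.float32)
            for prev_gray, gray in zip(grays, grays[1:]):
                cur, status, _ = cv2.calcOpticalFlowPyrLK(
                    prev_gray, gray, cur, None)
                if status is None or not status.all():
                    break                        # track died: peel it off
                path.append(cur.reshape(2).copy())
            if len(path) >= MIN_FRAMES:
                kept.append(np.array(path))      # survivor: keep this tracker
    return kept
```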
Brett: So this footage is actually a transcode. That's the EXR transcode that he's working with?
Eliot: It's the EXR transcode.
Brett: And that's created by AutoShot, correct?
Eliot: Yeah. Yeah. Okay. And in this, I mean... [01:26:00]
Brett: Go ahead. I'm sorry.
Eliot: And in this particular case, we did not have the correct log transform in there. So what we're looking at is basically log footage.
Brett: Yeah, it definitely looks like log, for sure.
Joe: Oh yeah, it's logged. Yeah, definitely. And I used the V... I've heard that Z CAM is closest to V-Gamut, but I don't know. I don't think that's really accurate.
Eliot: I'll look up the correct one and get it exactly right.
Brett: Oh, that's why you had the V-Log on. Yeah. I've used V-Gamut on actual Panasonic footage; it works relatively well.
Uh, Resolve may have a transform for that already. They will, yeah. They're very good at that. They're on top of these cameras like nobody is.
Joe: Yeah, I'm sure they definitely have.
Eliot: And so you can spin around...
Joe: Long enough now.
Eliot: What we'd normally run into is... I mean, there's enough stuff going on here that we would be able to detect features.
On this one, we need to get the right log format in so that it comes in at the color that it expects, because what it's using [01:27:00] is contrast, you know.
Joe: Yeah, that makes sense to me. Like, that totally makes sense.
Eliot: But then the next step of the process is you basically pick out a bunch of the trackers that you want to project onto the mesh, and you go under Track... we would just go Project On Mesh.
Uh, or Drop On Mesh, if you go up to the Track menu, up at the right next to View. Yeah. And, you know, you just pick them out and go Drop Onto Mesh, and then we'd solve. Bam: that would give us our initial 3D solve. We do some refinement things to get rid of... there's always some really ugly trackers.
You want to get rid of those. And honestly, if you can find a decent number of track points on your footage that overlay the mesh, you're, like, three minutes away from a solve. Um, and then
Joe: All right, well, before we... sounds like... let's roll back three minutes. So what do I have to do to get...
Eliot: Okay, so we're going to do a hack. Let's go ahead and do [01:28:00] File, Close in SynthEyes. There we go. And, uh, let's go back to AutoShot, and we're going to put in a different log format. It's not the correct log format, but they're all within a short range of each other; I would say it should get you pretty close.
Yeah. So we can do... oh, okay. We already had Panasonic V-Log. Did we already have that set?
Joe: I did. Yeah.
Brett: I was gonna say, I think he had that on. Yeah.
Eliot: Yeah. Well, that's weird. Um, okay. In that case, I want to look at this take. So what we're going to do next is zip up the take and send it to me, so I can look at it and understand what's going on.
So let's go... File...
Brett: Is it a ProRes file? You said it's just a QuickTime?
Joe: Yeah, I mean, I have it right here, actually. Um, like this. Is it with... oh yeah. That's great.
Brett: Okay.
Joe: Yeah, this is it with the LUT on it.
Eliot: Um, can you show it to me without the LUT? Just so I can look at it, and then with, like, a normal 709... 'cause that's the log.
Joe: So this is it, log.
Eliot: Okay. And then can you just put a normal 709 on? [01:29:00] Okay. Yeah, that's fine. There's enough points in there to get stuff off of. Yeah, that's fine. It's a rumpled bed; I've seen a lot worse than that. Um, so, okay. Yeah, I want to look at the take, and I want to see if I can get it.
And that's with a... what, like, this is the Z...
Joe: Z-Log. They have their own, like, you know, Rec 709 LUTs specifically for Z-Log. All right, so that's the Z-Log one, and I just throw it in there, and it doesn't...
Eliot: All right. So what I'm going to do is go look up Z-Log 2 and what gamut it's running. Um, when you were shooting the Z-Log, does it have its own Z-Gamut?
Or is it a, uh... what?
Joe: Oh yeah, hold up. I can go turn the camera on and tell you.
Eliot: Take a quick look. Then what I'm going to go do is get the LUT for it. And then I may have to pick this up on Monday, um, to get the rest of the details on it, but this is the correct path. [01:30:00]
Brett: But ultimately it would automatically work coming out of AutoShot.
Eliot: Yeah.
Brett: I mean, that would be the goal.
Eliot: What it's doing is, when it does the EXR pulls and you set it to a given... you know, the matching log and gamut... it's doing the color space transforms to extract that out correctly, right? Like, there's only one correct way to de-log footage, right?
Sure. You just use the inverse of whatever curve the camera put in. And then we do a gamut transform over to ACEScg, 'cause the color space is usually, you know, slightly different, but ACEScg is a nice standardized version of it. And that way we're not losing data, you know, for the most part.
Like, there's some weird stuff you can get into: if we start dealing with the arcane forms of gamut clipping, then I'll look at it more deeply. But for a standard use case, this is the way to do it.
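[Editor's note: a minimal sketch of that de-log-then-gamut-transform step using the open-source colour-science package; an illustration of the idea, not AutoShot's actual code. Panasonic V-Log/V-Gamut stands in here, since the Z-Log2 transform is exactly the open question in the call.]

```python
import numpy as np
import colour

# A few log-encoded RGB samples, as they might come out of the camera file.
vlog_rgb = np.array([[0.42, 0.38, 0.35],
                     [0.61, 0.55, 0.50]])

# 1. De-log: apply the inverse of the camera's V-Log curve -> scene linear.
linear_rgb = colour.models.log_decoding_VLog(vlog_rgb)

# 2. Gamut transform: V-Gamut primaries -> ACEScg primaries.
acescg_rgb = colour.RGB_to_RGB(linear_rgb,
                               colour.RGB_COLOURSPACES["V-Gamut"],
                               colour.RGB_COLOURSPACES["ACEScg"])
print(acescg_rgb)
```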
Joe: So this is the rig. All right, let me, uh, pin... let's see here. So I'm going to go to Color.
Let's [01:31:00] see here. What is it doing for color gamut? So, Z-Log... I don't know if it shows a particular gamut or color space on here.
Eliot: I'll look up the specs on it. They probably have a standardized gamut that they're working in.
Joe: Uh, yeah, sorry. It's not... I don't see it in here.
Eliot: That's fine. That's fine.
Every once in a while you have something where it's got, like, 14 different gamuts, and I just want to make sure I know which one it is. Um, and, uh, no, this will be great. All right. So, okay, guys, I actually need to... I've got some other stuff flying at me. So what I'm going to go do is look up and get the Z-Log.
Actually, oh, before we go: let's zip up your take and send it to me. That way I have the test footage to verify it with, and I can, uh, you know, test the whole SynthEyes pipeline on it, et cetera. Um, let's go back to your... [01:32:00] uh, let me unpin this shared context.
There we go. So, there we go: Export Take Zip. And this is great, Brett. I don't know if you've used this, but this is our primary debugging tool, because once you have the take identified and all this kind of stuff, you hit Take Zip: it writes out a zip file, and you can see where it's located.
It, um, downloads. Yeah, there we go.
Joe: Oh, there we go.
Eliot: And so that has all the information: it has the tracking data, it has the scan data, it has the cine file. And it loads it up to a format where he sends me a link, I open it and point AutoShot to it, and boom: it's going to literally repopulate AutoShot with every setting he had on there, and I hit Go. That way there's no, like, you know, running in circles trying to recreate settings.
It's a hundred percent recreated.
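[Editor's note: a sketch of what an export like that amounts to; the file layout shown is hypothetical, for illustration only. The point is that tracking data, scan, cine file, and the exact UI settings travel together, so the take can be reopened elsewhere with nothing to reconstruct.]

```python
import json
import zipfile
from pathlib import Path

def export_take_zip(take_dir: Path, settings: dict, out_path: Path):
    """Bundle a take folder plus the exact UI settings into one archive."""
    with zipfile.ZipFile(out_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for f in take_dir.rglob("*"):          # tracking, scan, cine, etc.
            if f.is_file():
                zf.write(f, f.relative_to(take_dir))
        zf.writestr("settings.json", json.dumps(settings, indent=2))
```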
Brett: And where do you upload that?
Joe: Where do we send that?
Eliot: Uh, just send me a Dropbox link, you know, something like that. Okay. Yeah.
Joe: I'm uploading it to Dropbox right now. I gotcha. Okay.
Eliot: Simple stuff. Yeah.
Joe: Ah, so close. [01:33:00] Okay. Um, yeah, 'cause I was just, like, foaming at the mouth here, almost getting it.
Eliot: I'm right there with you. But I don't want to wing it on log transforms. I care deeply about getting that right. And we actually just patched a bug where we were clipping highlights, so I think that's one of the reasons why we weren't getting features on that: we accidentally blew away some of the data.
And so now we fixed it. Now I just want to close the loop, correct the transform, verify everything's coming through, and then we should be in the land of clover and honey. Uh, so,
Joe: One thing before we go, because this was something that I just wanted to show you. One of the other questions I had was: okay, I have that stuff in Blender and it seems great, and I want to get that camera information to After Effects.
So I tried doing that here. And it puts all the keyframes on the camera layer and [01:34:00] not on any null object for me to attach anything I want to, and that's been troublesome. Um,
Eliot: So there's two phases. The stuff we're getting from Jetset, we don't actually have those feature-tracked points. We will have them from the SynthEyes solve. So that's one way of doing it.
And I also may look at... 'cause in Blender, what you did is you had the 3D mesh, you just dropped something on top of the mesh, and it worked pretty well.
It's not, you know, sub-pixel tracked, but it was good for lots of things. I think what we want to do is use the new After Effects 3D scene, and I think we want to bring in our mesh, and bring in a camera, because that way you can drop things onto the mesh and they're going to stick.
I think that's how we want to do it.
Joe: is what I want right there. Yes.
Eliot: It's not a sub-pixel stick, but it's easily going to be good enough for a lot of use cases.
Joe: I'm seeing all of that. Yeah. All of that [01:35:00] stuff. Like you said, anything where they're not connected to a physical thing in space, like waist up or knee up, you know, stuff like that, I'm really seeing it.
That could be great. But for a lot of the stuff I'm going to do, we need that sub-pixel. It needs to be able to sell the illusion, to take it up to, like, higher-end commercial stuff, which is where I really want to take this. I see so much application here for music videos. And yeah, whether or not I have it in time for this one, 'cause it looks like we're shooting this music video on the 23rd or 24th. So maybe, hopefully, we'll be able to figure something out. But I would love to be able to shoot a bunch of things and then bring in stuff and be like, hey, you see this prop? We put some other art on it and stuck things on there.
Or, like, tracking text. Hey, the guy's spinning lyrics, it's a live performance thing, and I want to see the text populate in [01:36:00] all kinds of ways as the camera's moving around. Pretty simple stuff that people use tracking for all the time. And they can do that without having to get that tracking data while shooting.
Right. Because the other thing with this... the main reason, because some people are like, well, why don't you just track that in post like normal, is we're probably going to be having a lot of lights, strobes, and fog, and haze, and lasers, and all this shit. And I'm like, right, that's just gonna fuck up any solve we get if we're trying to do it in post.
Eliot: Yeah, you want to get as close as you can. Oh, and I'm going to toss in one more tidbit. So there's a bunch of different ways of approaching this. One is, you know, the initial pipeline we did, what I call the hardcore pipeline, which is the SynthEyes one.
And like, it's a full-on 3D re-solve. And that's, you know, feature-level; [01:37:00] whatever you want to do, you can do that. There are other ways of cheating it, right? And so one of the things that I want to try out: you can see the real-time track is pretty good.
It's not quite a sub-pixel track, but it's pretty good. The field of view is lined up; all these things are lined up. So what I think you're going to be able to do is render it out in Blender, just with the normal Jetset track shot, and pick a point on the 3D that's going to be hooked onto the 2D.
And then stabilize the rendered 3D footage around that point, stick it to the tracked 2D point, and then re-render. A one-point track.
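[Editor's note: a sketch of that one-point trick as an editor's illustration with NumPy. Project one 3D point through the render camera to get its 2D path per frame, then offset each rendered frame so that path lands exactly on the tracked 2D point from the plate.]

```python
import numpy as np

def one_point_lock(frames, render_2d, plate_2d):
    """Shift each rendered CG frame so one chosen point matches the plate.

    frames:    list of (H, W, C) rendered images
    render_2d: (N, 2) per-frame 2D path of the chosen 3D point in the render
    plate_2d:  (N, 2) per-frame 2D track of the same point in the live plate
    """
    locked = []
    for img, r, p in zip(frames, render_2d, plate_2d):
        dx, dy = np.round(p - r).astype(int)   # residual between the tracks
        locked.append(np.roll(img, shift=(dy, dx), axis=(0, 1)))
    return locked  # CG now sticks at that point; edges need padding/cropping
```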
Joe: Yeah, interesting. 'Cause I mean, pretty much what I... it's very simple.
Brett: That's an old post trick. Yeah, that's a very old one.
Joe: You know what I mean? 'Cause for me, I'm like...
Brett: But it works. It works great.
Joe: I'm like, okay. You know, 'cause for me, in terms of right here, like, I'm in Blender. Ideally, if I'm like, hey, I want to stick a point there, I want to [01:38:00] stick a point there. Like, if I make my scene lock, which I don't mind doing, you know, 'cause After Effects has a thing where it shows you all of the track points.
You know, when you do a 3D camera, and you can just pick one and be like, make a null object there. Okay. Whether or not this workflow allows for all of those points to be accessible at any moment... I mean, that'd be cool, but not necessary. But what is necessary is: hey, in my 3D scene, where I did the reference in Jetset by setting my scene lock point, I want to be able to have that be a null object in After Effects that I know will perfectly lock on there.
So, from what I'm gathering, it seems like we don't even necessarily need to do that. It seems like SynthEyes' process will give me that once it's run through.
Eliot: It will. It'll give you the tracked point cloud, and they have an export. I have not gone through their After Effects exporter, but their exporters have generally been very, very good.
So the neat thing is I think we can get you what you want. There's [01:39:00] different levels: the SynthEyes process should already get you what you want, and I just need to solve the Z CAM log stuff and get that to you. And then we should be off to the races.
Um, I will close the loop. I want to verify it, but it should just work, and that will work great. That's the hardcore way of doing it. There's also going to be an easier way of doing it where we just use the Jetset data, and getting that is going to take a little bit more work.
'Cause we're going to have to program the After Effects 3D space, and I need to understand that: whether we just pull in the camera's USD, I need to understand it. It's going to take some time to do. Um, but I think that's the right way to do it.
'Cause then, in After Effects, you just put stuff in. You've got your mesh, you know where everything is in it, and you just want to stick things on and have it work. And then I also want to try this one-point stabilization tracking, because what's going to happen with some of these music videos is the camera's going to be mostly there and then flying all over the place.
It's going to break most of the post-tracking [01:40:00] algorithms, but you're going to look at it and go: this real-time track is almost there. We just need to stick those two pieces together. You can hand-track one point through a shot, right? No matter what happened, you'd be like, beep, beep, beep, beep.
And then if it's just one track, you just do it and lock them together.
Joe: So essentially you would just overlay those tracks. The image will move around it, but if you just lock that track... okay, that's a pretty smart way to stick it on. Yeah, that's interesting.
I'm down to try all of them, you know what I mean? We can workshop whatever you want; just let me know. Because that's the thing, too: there's a million different use cases. Okay, I want to put a CG object in a real environment. Okay, I want to take a real person and put them in a CG one. So, okay, which one is best for which? This one, we're going to see contact points; this one, it's just... so. [01:41:00] And honestly, especially with the stuff that I'm working on and the places... I mean, one of the main companies I work with, Up Rocks, most of their stuff is... they're big on music culture.
And so much of their stuff is, like, talking-interview-to-camera things with big artists. So for me, I already see this as a no-brainer way for them to kick up their production, because they have an LED room already. They call it their podcast room. It's not, like, super... the pixel pitch is pretty high, so it's not like an XR OLED type of thing where you can shoot at it and it looks real.
There's still a lot of potential where I'm like, you know, if you could track in that... you already have the screen. So this would be a no-brainer implementation for them to have camera tracking. And [01:42:00] I think they would be using it on every show. Honestly, I don't see any reason why not, especially if these guys get Gaussian splats into this and
we find a quick way, because most of their stuff, you're not breaking that 180, right? You're pretty much there. You're not doing any crazy camera stuff; maybe cameras on a jib, just getting that subtle movement, seeing that two-shot, things like that.
So yeah, just a lot of excitement. I'm here to really try to figure out whatever way to make this work. 'Cause I honestly just see limitless
Eliot: potential in application. The, the gian splat stuff is gonna be fun. And I'll tell you what we're the first, the, well, I, I just did author the tutorial for it.
It's not in, in the, in the, um, it's not in the app store versions yet. 'cause we're still getting that through the, the, the, uh, the approval process. But probably next, next week or two. Um, but so we, we have both the real time implementation that we render live in the phone. So you can see the comp, et cetera.
And then we're going to have something where we can re-render [01:43:00] inside the phone. So you have your tracked shot inside the phone, and you hit re-render. To get the Gaussian splat rendered in real time, we have to turn the quality dial down; it still looks great, but we're compromising.
So we want to re-render that take. Because we have the matched timecode and the tracking data inside the phone, we can crank the dial up, so it's not 30 frames a second, it's like 4 frames a second or whatever, and re-render a clean background plate that's matched to that original take and has timecode. Then you can push both of those out through Autoshot or whatever.
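A hypothetical sketch of that offline re-render loop: replay the tracked camera through the Gaussian splat at a higher quality setting and write out a timecode-matched background plate. None of the modules or functions below are real Jetset or Autoshot APIs; they just name the steps.

```python
# Hypothetical helpers only -- not real Jetset/Autoshot modules.
from splat_renderer import load_splat, render_frame
from take_io import load_camera_poses, write_plate

splat = load_splat("scene.splat")
poses = load_camera_poses("take_0042.json")  # per-frame poses plus timecode

frames = []
for pose in poses:
    # Offline we can afford a high splat budget, even if rendering drops from
    # 30 fps live to a few frames per second of wall-clock time; the finished
    # plate still plays back at the take's frame rate.
    frames.append(render_frame(splat, pose, quality="high"))

# Stamp the plate with the take's start timecode so it lines up in editorial
# against the camera-original footage.
write_plate(frames, start_timecode=poses.start_timecode, fps=poses.fps)
```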
Then drop them into editorial. You have a clean, tracked background plate, and you can put your camera-original footage over it in Premiere or Resolve or whatever. That VFX workflow is perfect for what I'd call commercial, high-end episodic VFX: things where you want to get it into Blender, where you want it tracked, with the usual tools people are used to using.
But for what I'd call the interview [01:44:00] stuff, I think a lot of people are just going to want to comp it in the timeline, because you're going to have like 15 minutes of footage.
Joe: Yeah, I agree with you.
Eliot: Yeah.
Joe: And especially with them, that stuff's going to be long. It's not short shots; it's going to be an hour long, you know. So these are all the ways where I'm trying... I don't want to pitch them something where it's like, this just added 20 hours of render time to your workflow.
We don't have time for this. So yeah, this is cool. This is awesome. I mean, I'm into starting with the hardcore pipeline stuff, getting that subpixel accuracy; I get to start there and then work our way back to: hey, this will save you time.
If you don't need contact points or anything like that, great. But for the most part, I need to live in: hey, we can shoot this, and we'll get you a rock-solid track that affords us limitless potential in post, for text effects, for roto work, [01:45:00] for pulling mattes, for sticking things on there, and especially a lot of just painting, because there are some artists who just want to paint on things and have it stick there and animate.
So that's the world I live in: very creative people who want to do all types of shit to the image. So, good, we'll have fun. But okay, is there anything else I should be testing until I hear from you?
Eliot: Yeah, send me the take, and I'm going to be going through it and getting the Z-Log stuff correct.
I want to match it and test it, make sure the log handling works, get it into SynthEyes, and make sure I can track your shot, so I can just close the loop.
I don't need to be doing that on an office hour. So just get me the take, and then...
Joe: Then we'll jump into the live-action composites, you know what I mean? Taking a person in front of a green screen and doing it, because that's [01:46:00] where it's tough with the composition: what am I framing for?
I'm trying to look at my talent, but then I'm also looking at the comp. We sort of talked about that. It's tough; the rangefinder thing is a pain in the butt.
Yeah. With this, when I didn't have a person and I was just shooting to get track data and bring it in, like in this example, to put a CG element there, I could just look at my camera and get the shot that I want.
What's on the Jetset comp doesn't really matter, because it's tracking, and it'll do the offset later. But when I have somebody whose head might be getting cut off, things like that... so we'll get there. But first things first.
Brett: I had one question about that, actually. It looks like the previews that are created in Jetset don't replicate the framing of the cine camera. Is there a way to do that?
Joe: It's an approximation. Yeah, that was one of the first things Eliot and I talked about: [01:47:00] how can I see what I'm getting? Or how do I get what I see?
Brett: I could see it on the phone with the reticle, but my question is about the preview file it makes, if I wanted to use those for offline editing, for instance.
Eliot: Okay. So, there's the live composite that's generated inside Jetset. Right now that's a 30 frames per second take, and it's cropped in to match the calibration field of view, but you're still dealing with the fundamental thing: it's a rangefinder. We're very much looking at other ways of doing that, most obviously getting the live feed into the phone and comping that; there's some work there. As soon as you go through Autoshot, what it's processing is the cine footage, right?
You can turn off the cine footage processing, but [01:48:00] in general, if you're shooting cine footage, you want to see the cine framing. Oh, sure.
Brett: Yeah, that's what I mean. My thought is, before you go through and process everything, it'd be great to be able to cut something together and go: okay, let's use that one, and that one, and that one; these others we don't need.
Joe: Almost like proxy comps, right?
Brett: Yeah, exactly. Something that's going to give me a good idea of what the framing is going to look like.
Joe: Before you spend time comping, before you spend time processing all the individual shots, right? Yeah, that's a good call. That would be awesome.
Brett: Just a real offline workflow from the files the phone generates. Even if they're not the right frame rate, that's fine; Resolve is very good at converting that stuff. It would just show me: okay, these are the shots I want to process, and even which frames I want to process, because I've cut it all together. And if I've got matching timecode, that's more or less it.
Joe: Yeah. [01:49:00] Exactly. Because right now, if I want to know my frame, I have to bring my raw cine clip into Premiere, make the mark, and go: okay, this is where I want the shot to start. This should be simpler. It's okay for now, but when you've got a ton of shots, you're like, man, I should be able to do this all at once and say, that's what I want.
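For what it's worth, the arithmetic behind that kind of timecode matching is straightforward once the phone proxy and the cine clip carry the same timecode. A minimal sketch with illustrative names and example values, not tied to any particular tool:

```python
def tc_to_seconds(tc: str, fps: float) -> float:
    """Convert an HH:MM:SS:FF timecode string to seconds at the given rate."""
    hh, mm, ss, ff = (int(x) for x in tc.split(":"))
    return hh * 3600 + mm * 60 + ss + ff / fps

def seconds_to_frame(seconds: float, fps: float) -> int:
    return round(seconds * fps)

# An edit point marked on a 30 fps phone proxy, mapped into a 24 fps cine clip
# that starts at a known timecode (example values only).
proxy_mark = "01:02:10:15"
cine_start = "01:02:05:00"
offset = tc_to_seconds(proxy_mark, 30.0) - tc_to_seconds(cine_start, 24.0)
print(f"Start processing the cine clip at frame {seconds_to_frame(offset, 24.0)}")
```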
Eliot: The right way to do this is to do a live comp in the phone with the cine footage, so we generate a 24 frames per second live comp that's timecode matched. That's just the right way to do it; no two ways about it.
Joe: Sorry, this was one thing I wanted to mention, and then I'll let you go. We were talking about using the Accsoon to send the video feed from the camera to the phone for calibration, and the idea of: well, is there a way to just pull that for the comp?
But one thing I noticed that I wanted to share: remember when I used the ethernet cable and I [01:50:00] wasn't getting power? The other thing is that if you use the ethernet cable, you wouldn't be able to plug the Accsoon in.
Eliot: Yeah. Yeah.
Joe: The Accsoon, I tried putting it in the USB-C adapter on the Belkin thing, and it's like: nope, it needs to be plugged directly into the phone. So I just wanted to point that out, because I didn't think about it when we were talking about it. I was like, oh shit, if you want gigabit ethernet as well as a live feed from the camera, you can't get both, at least the current way.
Eliot: We have requested that Accsoon put an ethernet port into it, so we'll see if at some point they can do that.
Joe: That would be gangster.
Eliot: Yeah. There are all these things that are iterating toward it, you know, and there are just so many different ways to do it. I hope they [01:51:00] do that. But I hear you loud and clear on the live feed into the phone; no two ways about it. We're on it, it's just non-trivial.
I'm trying to make it so that we can get people through their projects while we're figuring that part out. I know the Truesdale brothers actually multicammed the 24 frames per second footage with the 30: they just timecode matched them and did a multicam in Resolve.
They edited that way and did something cool with it, and I don't fully understand it. At some point we're going to do an interview with them, like: all right, show me how you did that, so we can show people how that works.
Joe: The spec stuff is cool. But when you start getting into teams, they're like: wait, we don't do workarounds like this, we need it, right? You know what I mean? No, it's all coming along. Like I said, I'm down for [01:52:00] any help or testing until we get to our utopia of features, where everything just works. Let's figure it out.
Eliot: Yeah, that's definitely loud and clear.
Joe: But, you know, how would you figure it out in the meantime? Like, how do we get it sorted in the meantime with the current iteration of Jetset Cine?
Eliot: Yeah, actually, you know what I may do? I can connect you with the Truesdale brothers, because they actually worked through this for that spec shoot: they did a multicam of it and figured it out. And again, I half understood it, but they know Resolve far better than I do.
Brett: Yeah, I've used multicam in Resolve with varying frame rates. It works perfectly; it's especially easy there.
Joe: Not in Premiere.
Brett: I'm not much of a Premiere guy, so I don't know as much [01:53:00] about that, but I can tell you Resolve's multicam handles multiple frame rates. Even if you don't have matching timecode, if you've got an audio scratch track on everything, it'll sync it up that way.
There's a really great feature where you basically just use the audio waveform to sync up the cameras, and it doesn't care what frame rate they are; it'll convert everything to whatever your timeline frame rate is. So if some of your footage is 24, some of it's 30, some of it's even 23, it'll sync it up and time match it.
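The waveform sync Brett describes boils down to cross-correlating the scratch tracks. This toy numpy/scipy sketch shows the general idea, not Resolve's actual implementation, and assumes the tracks have already been decoded to a common sample rate:

```python
import numpy as np
from scipy.signal import correlate

def offset_seconds(ref: np.ndarray, other: np.ndarray, sample_rate: int) -> float:
    """Estimate how far 'other' lags 'ref' via the cross-correlation peak."""
    corr = correlate(other, ref, mode="full")
    lag = int(np.argmax(corr)) - (len(ref) - 1)
    return lag / sample_rate

# Shift each camera's clip by its estimated offset against a reference track.
# Frame rate never enters the math, which is why mixed 24/30 fps footage can
# still line up once each stream is conformed to the timeline rate.
```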
Eliot: What would be great is to do a tutorial on that process.
Brett: Oh, sure.
Eliot: Because if you guys are generating shots and footage to try this, I'm just thinking what a good order of operations would be. What I can do is check with these guys and see if they can write up a description of how they did it, and I'll post that. Then maybe we try implementing that multicam [01:54:00] process, because I know that's exactly what they used it for: what we're talking about, as a kind of stopgap, where you've got both the correct cine footage and the comp put together, so you can toggle back and forth and see what's going on.
Brett: You can even do it using a stereoscopic workflow, because that will carry two streams for you as well. I've done that on some remastering work; we weren't really doing a stereoscopic workflow, but we used that feature in Resolve to tie two streams together. I don't know if that works if the frame rates aren't the same.
It definitely works in multicam, and in multicam you can put in as many streams of video as you want, but switching back and forth is a little trickier; the actual stereoscopic mode is a little easier for switching between the streams, but it only carries two. But yeah, the editing side of Resolve has gotten very robust over the past few years, because they've really been pushing it. They've been trying to [01:55:00] get rid of Avid.
Joe: Yeah, I'm always about getting rid of Avid.
Brett: They've been trying to kill Avid for a while, and I think they're very close. There are just too many people on the offline side who still think Avid's the only thing worth using.
Eliot: I'll say, Bill, my co-founder, founded Avid, so... yeah.
Brett: I remember that. Yeah. I used it for years. I actually finally let my Avid license lapse and didn't renew it this year, because I worked on a feature last year where the guy cut the whole offline in Resolve. That was the first time I'd ever seen that, and I was doing all the finishing and the visual effects in Resolve.
I thought he was using Premiere, so I said: oh yeah, give me your Premiere project and I'll make an XML from that. He goes: oh no, no, I have a Resolve project. I was like: whoa, just send me your project, that's all I need.
Joe: And you're giving me a good pitch, because I generally just use Premiere for a lot of this stuff. People use After Effects, so it's just easier for me, [01:56:00] especially for more social content, things like that. But yeah, it's just another program to start playing with now, you know what I mean?
Brett: The free version's very robust, so you can play around with it without spending a dime. You'll probably eventually want to spend the money, but it's only like $300, and right now that's a lifetime license with free upgrades, and you get two installs out of it.
Joe: I mean, that's half a year of an Adobe subscription.
Brett: Yeah, it's certainly cheaper, and I think it's more powerful. If you really want flexibility, buy a dongle; then you can move it to any system you want. You get two installs with the license, but the dongles are a little harder to find. The trick is to go buy, uh, actually my wife sells these too, if you ever want one. Guys, I've got an 11 o'clock, so I've gotta jump. Sorry. I won't be there Monday, but I'll be back next week sometime with some more questions.
Eliot: For sure. This [01:57:00] has been great. Send me the Dropbox.
Joe: It was great meeting you, Brett. And I'll hit you up about that potential piece. I'll continue to be in touch. Yeah.
Brett: Yeah. If you get into, if you get it resolved, get a dongle because then you can move it to any machine. You want,
Eliot: if you guys want to send email addresses on the chat, you'll go for it.
Oh, yeah. Let's do that real quick.
Brett: Yeah, let's see.
Joe: I appreciate that. That's great.
Eliot: This is good, to hook people up.
Brett: Absolutely. Are you in the L.A. area too? I didn't even ask where you are.
Joe: I am, I'm in Hollywood, yeah. Sunset and La Brea.
Brett: Oh, okay, so we're very close. I'm in Burbank, but the studio I'm working at is in West Hollywood, over toward the Culver City area, so I'm down there.
Joe: Perfect. Yeah, and I know you were talking about spaces, because my buddy has a big warehouse; he does big lighting rentals for photo and video, and he's got a huge warehouse space in Frogtown.
And I was looking at it like, wait, you've got a lot of... Oh, Eliot, we'll see you. I've got to roll, guys.
Eliot: I'll [01:58:00] talk to you soon. You got the emails? Yes, I got it, I got it. All right. Good to see you guys.
Brett: Yeah. All right. Thanks, Eliot.