Transcript

Office Hours 2024-09-16

===


Eliot: [00:00:00] All right.


Hey, Eliot. Hey! All right.


Hey, good to see you. Good to see you too. Alright, so, interesting, interesting stuff. So I spent some time tracking down transforms on the Z Cam. Okay. And so, what I'm probably going to end up doing is a hacked version, because it turns out the Z Cam E2 footage responds fairly similarly to RED.


Joe: Um, and 


Eliot: which is interesting, but I'm also going to need to contact Z Cam directly, because they have transforms that go a lot of different ways, but they don't actually... uh, they have a LUT download page where they have a ton of different LUTs. Um, they have almost the one that we want.


Which is, uh, the key one. And which format are you [00:01:00] working in? Are you in ZLog2? Yeah, ZLog2. Okay, ZLog2. Okay, so that makes sense. Um, and, uh, there we go. Let me pull up my notes.


And we need the transform to go from ZLog2 to ACES AP1. And they have AP0. That's going to be a completely different set of color space primaries. So what I'm going to do is contact Z Cam. Do you know anyone there? Before I just kind of go in blind.


Joe: I don't.


Um, I mean, you know, what's usually great is their Facebook page, the Z Cam E2 page. Oh. You'll be able to link directly with the owners. That page is amazing; one of their top contributors or somebody from the company will get right back to you.


Eliot: All right.


Let me see if I can find it and make sure before I, all right. Z cam. 


Joe: Yeah. The right 


Eliot: one. Z Cam... [00:02:00] Continue. Okay. So there's a camera marketplace, the Z Cam users group. Yeah, yeah,


Joe: yeah. The user group. 


Eliot: All right, users group. All right.


Okay. So let me see if I can join that. 


Joe: Yeah. Let me make sure I'm going to look on. 


Eliot: Yeah. Can you send me a link to make sure I'm going to the right one? Cause there are like six different things that all look very, very similar. Yeah. It's


Joe: the Z cam E two and P two is where, lemme 


Eliot: find Z cam. 


Joe: Lemme see if I can,


I just, uh, emailed it to you. Oh, okay. Lemme look for that. Oh, oops. The autocorrect changed the [00:03:00] name of the subject.


Eliot: Of course, it's "scam." That's hilarious. All right, there's the Z Cam. All right, lemme pull this over here.


Joe: Yeah, it's a great forum. You know, somebody will get back real quick.


Eliot: Alright, I'm going to join the group.


Alright, so that's, uh, pending approval, okay. Do I need to answer any kind of approval questions? Not yet. Uh, I want to get the correct ZLog2 to ACES AP1, um, LUT transform.[00:04:00]


All right. Let's see. Profession is "other." All right. Submit. There we go. Let's see if that does it. Okay. So I'm going to chase that. All right. But there are some things we can get going on, and a couple of questions I wanted to ask. Um, because, again, I don't have the exact correct transform.
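As an aside for readers following along: the transform Eliot is chasing has a standard two-step shape, decode the camera's log curve to scene-linear, then apply a 3x3 matrix into ACES AP1 primaries. A minimal numpy sketch of that shape is below; the curve constants and the matrix are placeholders, not real Z Cam data, which is exactly what he is asking Z Cam for.

```python
import numpy as np

# Placeholder constants: the real ZLog2 curve and camera-native -> AP1
# matrix must come from Z Cam; these only show where they plug in.
ZLOG2_A, ZLOG2_B = 0.25, 0.10   # hypothetical log-curve parameters
NATIVE_TO_AP1 = np.eye(3)       # hypothetical 3x3 primaries matrix

def zlog2_to_linear(code):
    # Generic log-decode shape: invert a log encoding to scene-linear.
    return (10.0 ** ((code - ZLOG2_B) / ZLOG2_A) - 1.0) / 100.0

def zlog2_to_ap1(rgb):
    lin = zlog2_to_linear(np.asarray(rgb, dtype=np.float64))
    return lin @ NATIVE_TO_AP1.T  # per-pixel matrix into AP1 primaries

print(zlog2_to_ap1([0.5, 0.5, 0.5]))
```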


I started looking at the footage, and I got to the point where I could put it in SynthEyes. I got bupkis for tracking points. My God. I bounced off the track from the take ZIP I sent you. Okay. So, two things. Can you pull up the file?


Because your file is tracking beautifully. Um, and I want to look, because I think the correct answer is going to be that we run a single-point tracker on that thing. Okay. And then we stabilize it out. Which would be a good plan to try, but I wanna see your [00:05:00] file again.


'cause on yours it looked better, the tracking looked better than, than what I was getting. Um, mine looked like you, 


Joe: you're saying in Blender you wanna see 


Eliot: the... yeah. In Blender. Yeah. When I ran the take, the scale of the bed was a little goofy, I thought. Huh. I remember seeing Joe's file and it looked great.


So. Yeah, 


Joe: here. Let me share.


Oh, wait. No problem.


Eliot: Okay. Sorry. No worries. Oh,


Joe: yeah.


Eliot: So yeah, here it is. Are you able to see that? I can see that. And, uh, do you have the footage [00:06:00] hooked up? And maybe turn on the scan too, so I can see all the things that are hooked together.


Joe: Um, 


Eliot: I think that's everything. So let's see. There's no, where'd the footage go? Oh yeah.


Go, go to, um, show preview or rendered. That's fine too.


Oh, there we go. It's processing shaders and then let's go to, uh, yeah. Material previews. Fine. 


Joe: Try just material previews. Is that faster? Yeah. 


Eliot: Material previews. Fine. Cause what I'm trying to figure out. Okay. So there's, okay. So there's the footage. Yeah, that looks pretty dialed. Um, let's look through the shot a little bit.


Um, and I don't see the far side of it. That all looks


Oh, you know what, uh, hit zero to [00:07:00] hop to the camera. Yeah. That's there. There you go. There's the guy. Oh, you have a Polycam scan in there, right? Of the room? Yeah, the Polycam scan, yep, correct. Can you turn off the Polycam scan just for a second?


While I'm trying to, I want to isolate down and see, uh, see. So that is this one. Probably, there we go. Um, so there's the scene collection, there's scan, and then there's the footage. And then, well it's hard to tell, um,


it's pretty close though. So, all right, which take are you on? You are on, uh, 131B288. So let's see, 131B288. Okay, that's fine. And then, um, can you do an AutoShot? [00:08:00] What is your in frame and out frame? 540, 1169. 540


1169, alright. Alright, I'm just gonna reprocess mine and just have it, try it out. 


Joe: Yeah, and then, you know, cause something else is weird, when I rendered, I rendered this, like, cause it was like, oh man, this looks good in Blender, let me render it out and then watch it in Premiere, like both sandwiched on top of each other.


And they, like, weren't in sync. And that's weird, cause everything's 24 frames. Let me see if I can't pull up just the Premiere file, so you can see what the rendered version looks like, because that also had me scratching my head. I was like, man, this should just sit right on top. Like, what's the deal?


Eliot: And yeah, 


Joe: so that was. 


Eliot: Yeah. I was kind of scratching my head, cause they're a little bit off on that one. And if we render the Blender fast viewport video, I remember [00:09:00] we looked at that. Let's look at the fast viewport render again. It looked like it was really locked.


Joe: Right. That was what I would have been in here.


You could still see this, right? 


Eliot: Yeah. I can see your, um, I can see your blender scene. Okay. You just pop up a, the fast viewport render. Um, 


Joe: and then that's it. Right. You're seeing that 


Eliot: I actually don't see anything. I only see, Oh, there it is. I see something. Uh, it's taking a second to show up right now. It's just a blue square on my screen.


Um, 


Joe: Oh, okay. Oh, I guess maybe I overwrote it by accident. I might have to re render. Sorry. That's fine It's only like one frame of it now for some reason. Um, okay, so let's 


Eliot: uh, and just uh, What is it? You can just click the fast viewport video over on the right. Yeah, perfect And that should do the trick.


So we'll cook through that. All right, while you're doing that mine is Processing. [00:10:00] Okay. Uh, optimized track. I want to use save and run generate frames. 


Joe: All right. I was over at, uh, Will's studio yesterday. They started the build on the screens. Insane, man. It's bigger than... it's going to be crazy.


They've got the ceiling and the floor and the full three walls. It's like a full diorama. It's going to be diesel. And they're going to teach us how to run it all. That's what I'm really excited about: being able to understand how to engineer those sessions.


And, um, I'm familiar with Unreal and just doing stuff, but that part, how to get it onto the volume display and play it, I want to learn that stuff. Um, yeah, so it's great. And that's why I'm going to go there once they... [00:11:00] I'm hoping most of the structure should be done within a week.


I want to take a scan there and do the whole thing, kind of do what I did with my bedroom, and test out some things. It's so weird on multiple fronts. I keep getting so close. I'm like, okay, I'm getting close with shooting live action and copying it in CG.


I'm getting close with shooting a real element and getting a CG placed in it. You know, it's like, I keep getting close, but I'm like, ah, I'm going to keep going. Keep going though. I'm like, yeah, yeah, yeah. We 


Eliot: want to close all the little loops that crop up in this.


Okay, so that's cooking. Let's see what mine's doing. Cause it is always a game of meticulous details. All right, let's see what's going on with mine.[00:12:00] 


Okay,


where is that? That's weird.


Joe: Also, um, while this is rendering, um, I wanted to follow up with Tau Group soon about the Formula One.


Eliot: Oh yeah. 


Joe: Like, pitch them something. You know, just while we have the time, because November will be here just like that. So, um, yeah, I was curious what your thoughts were, if it was still a good idea to try to do it the same way we


did the Star Wars one. Or, you know, because I know last time when we talked with Bill, he was suggesting perhaps the splats or a different method. Um, I was just curious, you know, your thoughts [00:13:00] on that.


Eliot: You know, we are, we're in the middle of testing out a bunch of things all at once on the splats.


The thing I'm a little curious about on the splats comes down to occlusion. Um, when you build a 3D thing... you know, because we have a person inside this car, right? They're driving, and the occlusions are very... everything's tight


Joe: and everything, right. Yeah. 


Eliot: It's more that, when a splat renders, it's not precisely defined 3D geometry.


And so what I'm seeing is our Z-depth occlusion can be a little goofy. Um, because the splats are this huge number of intersecting and overlapping Gaussian distributions in 3D space. And so sometimes the Z-depth occlusion doesn't work the way you'd want it to work.


And I think that might end up being the defining aspect of it. I didn't know that the last time we talked about it, until I started to see more things and went, oh, that's it. There is your problem. When [00:14:00] you have something that tight around a person, like an F1 car, I think we're going to be in 3D, because I saw your 3D model, and it looks shiny, and, oh, it looks


Joe: awesome.


Yeah, the model I'm happy with. So that's why I was like, okay. And I'm also open to... I mean, it feels like it's already possible to just do it with the phone, like if there was no cinema camera component to it, which I think could work for this.


Eliot: I think that's what you want to do. 100%. I think 


Joe: that's what I want to do. Just not even waste time with the Cine stuff. Because right, because didn't you guys, the FX3 was just kind of for show when you were doing it, right? Right. 


Eliot: And we're actually a decent ways off from doing a live composite of cine footage in the phone.


Whereas if you're just working with the iPhone, it already works. Right. And so then you're down to just doing your production work instead of R&D work. Because the problem with R&D work is it's very hard to predict the time scale.


So that was something Bill [00:15:00] didn't know about, and we're just finding it out. Um, and so I think the splat stuff is going to be for backgrounds. It looks so good for backgrounds and it works, but this occlusion stuff is real. Um, we may get better at cleaning it up, but we are early in our stages of learning about cleanup and occlusion on this.
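A note on why splat occlusion misbehaves: compositing talent over a render is, at bottom, a per-pixel depth test, as in the numpy sketch below. With a crisp mesh depth map the test is reliable; a splat render's depth is a blend of many overlapping Gaussians, so the hard comparison starts misfiring at edges. The arrays here are assumed inputs, not anyone's actual pipeline.

```python
import numpy as np

def depth_composite(fg_rgb, fg_depth, bg_rgb, bg_depth):
    # Foreground wins wherever it is nearer the camera than the background.
    nearer = (fg_depth < bg_depth)[..., None]  # broadcast mask over RGB
    return np.where(nearer, fg_rgb, bg_rgb)

# Toy 4x4 frame: talent at 2m everywhere, splat background at 5m.
h, w = 4, 4
fg, fgd = np.ones((h, w, 3)), np.full((h, w), 2.0)
bg, bgd = np.zeros((h, w, 3)), np.full((h, w), 5.0)
print(depth_composite(fg, fgd, bg, bgd)[0, 0])  # [1. 1. 1.] -> talent wins
```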


And so one of the things you can do if you have a 3D model, and this is something you might want to try out, is, you know, um,


it'll be interesting to see if we can actually use... you know, I don't know if this is possible, we'd have to look at it, um, I actually have to think about this before I even... if we can use a reflection map with video. Um, because then, you know, the car has the surface of it, and the environment would be going by. I don't know if that's possible or not.


I actually have to think about that a little bit before we dive [00:16:00] into anything like that. Um, but yeah, I think the CG car is probably what you're looking for. And, um, tell them that I found something new on occlusion; that was just new information that we're getting as we're doing this.
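For the video-reflection idea: Blender can already drive the world environment (and therefore the reflections on a glossy car shader) from a movie file, so a hedged sketch of that is below. It is untested against any particular Blender version; the file path and frame count are hypothetical, and the node names follow the stock shading nodes.

```python
import bpy

world = bpy.context.scene.world
world.use_nodes = True
nodes, links = world.node_tree.nodes, world.node_tree.links

env = nodes.new('ShaderNodeTexEnvironment')
env.image = bpy.data.images.load('/path/to/driveby_plate.mp4')  # hypothetical clip
env.image.source = 'MOVIE'
env.image_user.frame_duration = 600      # match the clip's frame count
env.image_user.use_auto_refresh = True   # advance the video with the timeline

# Feed the moving environment into the default Background shader,
# so anything reflective in the scene picks up the passing plate.
links.new(env.outputs['Color'], nodes['Background'].inputs['Color'])
```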


Um, but I think it's, I think it's a good plan. I think it's a good, a good thing to aim towards. It's, it's going to knock people out and we already have what I'm 


Joe: saying. Yeah. 


Eliot: It's just going to knock people out. 


Joe: So, one thing was the keyer, um, cause I noticed that, even in that, it's a little rough, you know, and that green screen looked pretty keyable.


So is it just because the keyer is rough and not really meant for final use?


Eliot: The keyer is designed to be a one-touch keyer. Right. And it honestly has not had a ton of reps. I mean, we did an initial implementation like six months [00:17:00] to a year ago, something like that.


And I haven't touched it. Um, so that would be an area where I think we could get some refinement, uh, in before November. Right. We can, we can do something, we can do something there. Um, and I 


Joe: think I can push the phone, you know, I'll put the phone in a nice case and be able to just, I think that would streamline it, but the one thing that, so it's like, I think like.


oh, it doesn't have cinema picture quality. I think people won't care about that, but they will care about, do I look good, and do I have a nice clean matte for the Instagram clip I want to share. That, for sure.


Eliot: Yeah, yeah, that's a reasonable aim to get to, um, with a little bit of tuning, because we already have a keyer. You know, I know there are a couple of bugs we run into with the displacement and stuff, with the depth occlusion combined with keying, but that's exactly what we're diving into next.


We're doing some UI fixes that include the keying work. You know, that's going to be right in the realm of what we're fixing. This looks like it's tight, [00:18:00] but it's just hard to tell from single-frame rendering. This looks super lined up. Um, all right.


So I got that loaded on my end, and I'm going to actually drop a, um, a mesh in. I just want to make sure I'm replicating what you're seeing on my end when I'm doing these tests, and I think I am. It's close.


All right. Every once in a while, I get these bad frames too on. Okay. Might just be a bad frame there. And then going up here,


Joe: it's like another knock. I should be almost done here.


Eliot: Yeah. Mine actually looks further off than yours[00:19:00] 


starts. 


Joe: Okay. Mine finished. Let's see mine.


Eliot: Yeah. Let me, let me watch yours.


Joe: Getting there. 


Eliot: Render mine. 


Joe: You, you saw what I just played, right? 


Eliot: Uh, let's see, let me come back to your screen. Okay, go ahead and play it. I was over on mine.


Joe: Oh, sure. Sorry.[00:20:00] 


Eliot: Okay, so I don't, I don't see anything on yours yet. 


Joe: Oh, you don't? Oh, sorry. Yeah. Maybe just my, please move this window away from the shared application. Maybe I just need to share this. I'll try.


You're seeing that right? 


Eliot: Yeah.


Joe: It's like there's a little slide there.


Eliot: Like the, the scan looks like it's decently aligned. Yeah, it's, it's aligned to the bed. 


Joe: Yeah, I pretty much see it. The slide that I see happens right, like, there. You can kind of see the bed, like, slide [00:21:00] around a little bit there.


Eliot: Yeah. Okay. I bet this one. Okay. I think what we should try is there's, I'm going to, I'm rendering it on my side too, to take a look at it.


And then I think this is going to be one of these where we're going to try to, um... because I think that origin point is in the frame for almost the whole shot. Let me see if it is. I'm just looking at it as it goes through. Because if it's in the frame for the whole shot, what we can do is


just stabilize the render around that point. It's almost in the frame, though. Almost in the frame, okay. Yeah, yeah. Is there anything that's in the frame? And
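The "single-point track, then stabilize" idea reduces to shifting every frame so one tracked 2D feature (say, the scene-origin marker) stays put. A small OpenCV sketch, assuming `track_xy` comes from whatever 2D tracker you ran:

```python
import cv2
import numpy as np

def stabilize(frames, track_xy, anchor_xy):
    """Translate each frame so the tracked point sits at anchor_xy."""
    ax, ay = anchor_xy
    out = []
    for img, (x, y) in zip(frames, track_xy):
        h, w = img.shape[:2]
        shift = np.float32([[1, 0, ax - x],   # pure translation matrix
                            [0, 1, ay - y]])
        out.append(cv2.warpAffine(img, shift, (w, h)))
    return out
```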


Joe: This, this hasn't gone through SynthEyes.


Eliot: Right, right. Okay, so we're going to do a couple things. One is I'm going to get you the updated version of AutoShot. Let's see if Greg put it on the, [00:22:00] uh... It has the color space fixes in it, so that's kind of a big deal to get that updated.


All right. Let me see if my, because I have 


Joe: 0.0142.


Eliot: Okay. Yeah. So let me get you the updated build of that. And this is. All right. Okay. So there it is.


Joe: Yeah. Cause, you know, I'm actually trying to... I haven't told Up Rocks yet, cause for this Intel shoot, the whole thing is that Intel is aware that the computers they gave us aren't good enough to run Unreal, but they were like, okay, well, what can you do?


And I was like, well, [00:23:00] definitely previsualization. And then I was like, wait a minute, I guess I could use this to run AutoShot. I didn't say that to them yet, but that's the stuff I want to test, which would probably be the most feature-rich aspect we use the laptop for, from their branding perspective.


So yeah, that's why I'm curious if this would just run on a laptop tablet, you know what I mean? Um, with Blender, so that I could kind of do some of this on that.


Eliot: That's worth trying.


Joe: I'm definitely going to, yeah.


Eliot: Yeah, we may run into some things where it wants Nvidia for the really powerful stuff.


We ended up using some Nvidia stuff, but, uh, we might be able to get through the basic Blender things with whatever the laptop has. Okay.


Joe: And I'm saying for not Unreal, just


Eliot: like, yeah, just 


Joe: AutoShot. Just Jet Set and AutoShot.


Eliot: Yeah, that'll be worth trying, trying that guy out.


All right. So, [00:24:00] uh, there are going to be two things you need to install. We have a new AutoShot and a new AutoShot Tools. So there's one link, and let me go find the other link. And that will at least get us to the fixed color space. There's AutoShot Tools, copy link.


All right. And there's chat. Okay. So yeah, let's get both of those installed. 


Joe: And 


Eliot: then, because then we're no longer blowing away... we shouldn't be clipping our footage anymore. Now, it's still giving me fits trying to track it. In that, instead of the correct Z Cam transform, I used... cause I tried downloading it.


They have a thing for Resolve, but the format it's set up in is not quite what we'd want; it's done in a couple of different layers of things, and we just want the straight-up transform. Um, so I'll contact them. But I went through it, and it [00:25:00] wasn't giving me that many feature points.


So, you know, again, this will be a bit of a process. It's good to understand this and figure out a good way through it, however we end up doing it.


Joe: They have FX3s at the studio. So I could do some more tests with their Sony FX3 in conjunction with this, like, while getting the conversion information set, you


Eliot: know, I want to figure out with what people have.


You know, you've got a Z Cam, so I want to figure out the Z Cam, right? Because every additional step where someone has to go and get something makes it harder for people to do stuff. And the Z Cam is a cool camera; we're going to see it over and over. So, all right, let's figure it out.


I know what the math I'm looking for is, and we can get the correct math in one day if they respond to me, um, on that. Yeah, I, I


Joe: just meant, uh, if it was holding us up until they respond, I was like, okay, [00:26:00] I could use that to try to take it to the next phase, if that's helpful.



Eliot: I think we're...


Joe: I think they'll get back to you soon. 


Eliot: Yeah. And then we're solving it correctly, and then we're building every individual step of it. Because this initial color space transform, it's not an artistic question. There's an actual transform to go from here to here, and we just need it.


And then we can be, you know, working in the correct world. All right, let's see what mine did. The steps


Joe: are the same, right? Because the reason I brought it up is I'll probably have to use their Sony cameras for this shoot. Um, I might not use my Z Cam for the Intel one.


So that was why I was just like, the steps would be the same though. Right. It's the same. It's just 


Eliot: Same thing, same stuff. And, uh, that's why it's worth getting all these reps dialed, um, so we understand how it's behaving.


Joe: Yes, sir. Okay, so, uh, new [00:27:00] AutoShot's installed.


The, uh, tools is almost done downloading. 


Eliot: All right, great. Let me go look at that render video.[00:28:00] 


Oh, it is floating.


Mine's, mine's floating a bit. 


Joe: It's not stuck, right? 


Eliot: No, it's not stuck. 


Joe: That's weird. 


Eliot: Well, is it sticking on yours? Okay, let me check something, make sure I've got the correct, um... All right, let me go through and verify some things, timing, all that kind of stuff. So,


Joe: installing AutoShot Tools now.


Eliot: Yeah, go for it.


I'm just going to be working through it. I want to start from scratch and verify and check every little thing. All right. So I've got this,


that is[00:29:00] 


131B222.


Joe: Cool. Installed.


And so one question too. [00:30:00] Because I was able to get that Formula One car loaded up, like, the model with the USD, into Jet Set Cine. Would the background be too much, though? Like the track, seeing it move, that would probably be too much for the phone to handle, right? So would I process the background differently?


Eliot: I think, yes, you'd probably process the background differently. There are a couple of things we could do. We can try to do a 3D background, that's one way. Um, or we can actually do a 2D background map projected on a 360 sphere, and that might be the first way to try it.


It's not as 3D accurate, right? But on the other hand, depending on your point of view, um, it could work. So, I think, um, let me think about this for just a second.
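The "2D background on a 360 sphere" approach is cheap because the phone only shades one big inward-facing ball. A hedged Blender sketch follows, with a hypothetical pano path; feeding the Object texture coordinate into an Environment Texture makes the lookup behave like a view direction, which is what an equirectangular image wants.

```python
import bpy

bpy.ops.mesh.primitive_uv_sphere_add(radius=500.0, location=(0, 0, 0))
sphere = bpy.context.active_object

mat = bpy.data.materials.new('BackdropSphere')
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links
nodes.remove(nodes['Principled BSDF'])  # backdrop should be self-lit

tex = nodes.new('ShaderNodeTexEnvironment')
tex.image = bpy.data.images.load('/path/to/track_pano.jpg')  # hypothetical pano
coord = nodes.new('ShaderNodeTexCoord')
emit = nodes.new('ShaderNodeEmission')

links.new(coord.outputs['Object'], tex.inputs['Vector'])  # direction lookup
links.new(tex.outputs['Color'], emit.inputs['Color'])
links.new(emit.outputs['Emission'],
          nodes['Material Output'].inputs['Surface'])
sphere.data.materials.append(mat)
```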


Well, you might, what might be worth doing,[00:31:00] 


and I'm just thinking out loud, because the cool thing is, it doesn't need to be a specific F1 track, right? It just needs to be a cool track. What I wonder... I


Joe: have one in Unreal now that's pretty cool. Oh,


Eliot: really? 


Joe: Yeah. I have a great one already in unreal where I had the car on.


I even had a spline like moving the car along the track. Okay. Yeah, I, I, I think I have to redo the spline cause splines seem to be more of a pain in the ass in Unreal than they should be. 


Eliot: Right, 


Joe: right, right. You know what I mean? I was like, I just wanted to go along this path. I felt like I should have just been able to draw the path and be like, assign, parent it to this and go from A to B.


But it's a little bit more, uh, cumbersome than that. Um, which is fine. I'll get that sorted. But I was like, okay. 


Eliot: So I've got an idea. Um, let's [00:32:00] see. One of the guys we work with is really good at baking scenes in Blender. Um, so I wonder, just thinking through this for a second,


I wonder if he'd be able to do a project where he does a little chunk of a city that repeats, in the same way we do the trench run kind of thing. It's one chunk of trench that repeats, you know, over and over, but it's set up so that you don't notice the repeat.


And I wonder if that could be an interesting way to do it, where a set of buildings is blowing by. You know, and, uh... Yeah, no, that was exactly what


Joe: I had. I just had it, you know, a short little section. I mean, the track thing is a huge model, but I just picked a little street where I was like, okay, that could be a good finish line.


And pretty much, when we hit roll or start, it'll start the car driving, they'll get their five-to-[00:33:00]ten-second clip. Yay, I crossed the finish line, right? Stop. And then it just resets for the next person, and they do it. I think that's all that's needed, you know what I mean?


So making it loopable would be cool, but I don't think that's


Eliot: you don't actually need it, right? Yeah, I don't think we need 


Joe: it for this. Um, you know, I just want to get a nice animation across the finish line, seeing some fireworks and things, where we could put logos of whatever companies want to sponsor it on the billboards. And then once we get all that approval, if that could be baked in where we still get, you know, the parallax, but it doesn't need to be all the business.
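The trench-run trick they keep referring to is plain instancing: linked duplicates of one street chunk laid end to end, so the renderer pays for one chunk of geometry no matter how many copies blow by. A minimal Blender sketch, assuming a hypothetical chunk object named `StreetChunk` and a 40 m chunk length:

```python
import bpy

src = bpy.data.objects['StreetChunk']  # hypothetical baked city chunk
CHUNK_LEN = 40.0                       # assumed length along travel axis, meters

for i in range(1, 12):
    inst = src.copy()                  # linked duplicate: mesh data stays shared
    inst.location.y += i * CHUNK_LEN   # lay the copies end to end
    bpy.context.scene.collection.objects.link(inst)
```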


Eliot: I like that approach. Okay, so I think we should try to do that. Um, and then we'll have to figure out some things. Okay, so you already have the Unreal scene?


Joe: Yeah, I have the scene. 


Eliot: Okay, okay, okay. So I've got 


Joe: I have the scene and the car in Unreal. They're both like, ready to go. 


Eliot: [00:34:00] Okay, and normally, if we were doing an Unreal project, I wouldn't be suggesting this, because you're going to be rendering in Unreal.


In this case, we're going to be rendering in the phone, and we'll have a lot more direct control if we pull the scene over to Blender, and then a couple of our guys can just bake it. And so for that, okay, the thing you're going to want, and I'll give this to you so you can start into it, and then we can go over whatever we run into that doesn't work, but it should mostly behave pretty well, is on our Blender plugin.


We have a tool to convert Unreal scenes. So we can actually convert your Unreal scene to Blender, and it'll come through correctly instanced and everything. And I think that is where we want to start. Um, and then we can start modifying things.


If we need to bake stuff, we have a [00:35:00] couple of people who know baking really well in Blender. Um, and then we have control. Like, we can say, okay, we need to simplify things down, and it's super straightforward to do it. I think that's the right approach.


And then we can start off with the finish line and see if we like that. And the event's in, like, November, right? Something like that.


Joe: November, like 20 something. 


Eliot: Okay. Okay. So we got, we got some time. Okay. This is the right way to do this. Cause this, then, then it's under your control.


You get it. And do you know Blender? I mean, a little bit, I remember you...


Joe: I'm learning through this. Like I said, this render that I did, what I've been doing now, is the first time, because you guys have been getting me into Blender. Oh, got it. Cool. Yeah. But I'm into learning.


I'm still learning, but I'm seeing how I need to know this stuff. I understand why Bill kept pushing me towards Blender now. Yeah, I'm getting it.


Eliot: Yeah. There's a category of things like this where it just gives you... So, okay, I'm trying to track down something.


Can you send me a Dropbox of that visual render? Cause when I do it, I see sliding. [00:36:00] Okay. The 


Joe: preview that I just made right now. 


Eliot: Yeah, the preview. Just send me a Dropbox, cause I'm seeing sliding of the mesh on this. So what I'm going to do is test the timing over here.


So I'm going to share my screen so you can see. Yeah, I want to verify each individual piece of this. So there we go. I'm just testing the timing of this. So this is the first frame we have there, the first frame that we're going to recognize. M for mark. All right, I'm gonna line up my pieces.


So we're six seconds and 19 frames. Okay,[00:37:00] 


that. Let's calculate that. So, what's our frame rate on this? It is 24. Six times 24, plus 19. All right, 6.79. Let's check that, six point...[00:38:00]
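The timecode arithmetic on screen, written out (the 6.79 is seconds, not frames):

```python
fps = 24
seconds, frames = 6, 19                    # "six seconds and 19 frames"
total_frames = seconds * fps + frames      # 6*24 + 19 = 163 frames
as_seconds = seconds + frames / fps        # 6.7916... -> the 6.79 on screen
print(total_frames, round(as_seconds, 2))  # 163 6.79
```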


Lemme generate my frames. I don't think that's it.


Let's see


and


Joe: just sent you the clip. 


Eliot: Okay, great email.


Joe: Yeah, because pretty much all I did with this scene was take the Jet Set mesh and the scan. I brought it all into Blender. And then I just imported this [00:39:00] model and placed it on top of the mesh. Right. And then the camera just went over it, and it was already tracked, so I didn't actually have to do anything to the character.


Just the camera data. I just want to 


Eliot: watch this. This looks actually pretty good. Why isn't it? Is this, I guess it's sliding a little. Hang on. What am I doing here? 


Joe: Yeah, I mean, it's sliding a little. That's why I was like, oh man, I feel like with a SynthEyes pass, maybe we could be, you know,


Eliot: right.


We'd be 


Joe: money in the bank maybe. 


Eliot: Okay, that looks a lot better. It looks, mine is sliding a lot. Um, I don't really understand that. All right, let me retry it with this.


There's blender. All right, I'm going to stick a, okay, I'm just going to render this.[00:40:00] 


That's going to be cooking there for a second. All right, so while we're doing this, okay, did you get AutoShot and AutoShot Tools installed?


Joe: All set up. All 


Eliot: right, so let's, all right, let's run that. Um, and 


Joe: Point one four seven. 


Eliot: Okay. One four seven. Okay. Great.


Zoom


And it's chugging through


Now this is really dialed. [00:41:00] Then the


motion is good. It's like the lens calibration. So


See, that's good.


You know what? That's what I'm going to look at. Let's look at that. [00:42:00]


131B228.


Solve this


I'm just going to rerun this and check stuff, because the time [00:43:00] synchronization is good, but there's something a little off with how the mesh is stuck on top of it. Whoa. Oh, wait a second. What? Hang on.


Joe: Okay.


Eliot: The initial thing you sent over had a field of view of 46.83. I just resolved it, and it gave me a 60.83-degree field of view. Whoa, wait a second. What's going on? That shouldn't have changed by 14 degrees. Um, well, this is interesting. Now we have an idea.


Uh, cause that's exactly what looks kind of weird. Okay, hang on. I'm going to stop my, uh, scan over here, and I'm going to rerun this. The plot thickens. So why would you get one


Joe: [00:44:00] In Blender? That's where you're getting that?


Eliot: No, this is in AutoShot. So all I did... let me look at this. I'll share my screen.


So you can see what I'm doing here. There you are. There's share screen. Okay. So over in AutoShot, all I did is I clicked resolve.


Joe: You cut out on me.


Oh, you got muted. I think.


Oh, yeah. I think you're muted for some reason. 


Eliot: Oh, there you go. All right. So, do you see my AutoShot screen? Yeah, I see it now. Okay. So I wrote down the focal length, or, not the focal length, the field of view, which is the key thing, before, and it was 46.83. And then I hit resolve, and it's 60.


83. I'm, like, interested, so I gotta understand why we have [00:45:00] two wildly different values. Um,


Joe: yeah, I'm 


Eliot: not worried about the cine focal length, because that's actually not what the solve is going to use. The field of view is the key value; that's the big thing.


Um, so now that I see 


Joe: mine says 60.59.


Eliot: Oh, okay. So that's a good, that's good. So then, okay. Okay. Okay. 


Joe: I just opened up AutoShot here, and yeah, my resolve, uh, says 60.59.


Eliot: Okay, that's good. So that explains why yours looks locked and mine looks jacked. Okay. But you did your normal push calibration, um, with the... yeah, calibrate.


Okay, in that case, we got it. I want to understand where that came from. So, okay. All right. This is good. This is very good. This is a distinct plus. Um, [00:46:00] okay. And I'm going to verify this behavior after the call, of where we got one field of view before and another


after. Okay. Yeah. This is exactly the kind of stuff we want to catch. So once we've done that, now I'm going to rerun, uh, Save and Run, and let's see what happens in Blender now that I have a hopefully correct field of view. And I bet it locks up about ten times better, because now we have a correct field of view.
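Why the field of view, not the reported focal length, is the value that matters: the same focal length means a different angle on a different sensor width, and the solve consumes the angle. A quick sketch of the conversion; the 36 mm sensor width is an assumption for illustration only, not the camera in question.

```python
import math

def hfov_deg(focal_mm, sensor_w_mm=36.0):
    # Horizontal FOV from focal length and sensor width.
    return math.degrees(2 * math.atan(sensor_w_mm / (2 * focal_mm)))

def focal_for_fov(fov_deg, sensor_w_mm=36.0):
    # Back the focal length out of a field of view.
    return sensor_w_mm / (2 * math.tan(math.radians(fov_deg) / 2))

# The two values from the call: a ~14-degree disagreement is enormous.
print(round(focal_for_fov(46.83), 1))  # ~41.6 mm on the assumed sensor
print(round(focal_for_fov(60.83), 1))  # ~30.7 mm: a very different lens
```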


Joe: Yeah, I mean, yours is like point, you know, three more than mine or different, but. 


Eliot: Oh, there we go. There we go. Okay. 


Joe: Okay. Yeah, that looks much more accurate. 


Eliot: Yeah, this is more like what I'm looking for. Okay. Now the [00:47:00] world is making sense, because if I go over here... okay, there's the edge of our bed.


Okay, and this is coming in. I don't really see much. Turn off my scan. Oh, I need to see my true preview. Oh, did I forget to add an image plane? Yeah, there's the image plane. Um, I see it up there. Back here. Okay. Well, it's making more sense now. I'm going to try rendering that, and I think we're at least matching now, but I'm going to go back and understand why we weren't matching in the first place.


So that's important. Um, but at least now we're, now we're, now we're matching. Okay. So that's going to render for a second. 


Joe: No problem. 


Eliot: Okay. So the next thing we want to do, um, now that we have [00:48:00] that and it's correct, is we can try pulling this into SynthEyes. Uh, I ran into problems before, but okay, it's good that we're figuring this out.


Um, I ran into problems getting SynthEyes to pick up trackers, but let's try that again now. And that's already looking better. I'm not going to worry about rendering it out; I'll do that later on. So now we know we have a culprit for where the before-and-after issue was. So let's have SynthEyes start looking at this.


Cause I'm not quite sure of the answer yet, but let's go through this. I'm going to go Blender, I'm going to go Others, and I've got to Save and Run. And in this, what I have is a little bit of a hack: I have a RED, uh, RWG/Log3G10


Joe: to ACEScg, so that's the closest to this. I


Eliot: saw this on some YouTube thing, and it isn't correct, but I don't know what else to do quite yet.


Um, and, uh, he said [00:49:00] it actually came in very, very close. So I need a log format to get us over to something semi-normal. Uh, but I'll show you what I'm seeing. Um, okay. So let me Save and Run and pull this in. And actually, what'll be great, I'll show you what I saw when I was in Resolve. I'm going to flip over to Resolve, and I'm going to hide this. So here, what I told Resolve to do was interpret the footage, um, turn off this ACES transform, go here. Alright, so here I used a... oh, that's right, I was experimenting, and I couldn't get the transform chain to work.


Um, so let me go back to


get these pieces correctly wired. So many nodes, so many nodes. All right, so here, do I [00:50:00] have anything on this guy? I don't think so. Um, so I'm going to interpret the footage as C-Log, go back here, okay. And there is our... okay, that's the correct one. And I should be able to interpret that.


Do a LUT, and Z Cam. There we go. No, I don't want ACES AP0. That's right. Oh, this is where it gets tricky. Cause there is no...


Cause if I convert this to ACES AP0, then it's just not correct. Um, yeah, this is where it got difficult. So, okay. So we'll go, like, seven or nine. Okay. And what I wanted to see is, what is your color... [00:51:00] when you're looking at your footage, are you in Premiere?


Um, I don't want to 


Joe: premiere looking at the footage here now. Um, I can show you, 


Eliot: Yeah, let me get a screen cap of what yours looks like with, you know, the correct color timing. That way I have a reference point when I'm looking at this, to see if we're close.


Joe: Sure.


You can see that. 


Eliot: Yeah. Okay. So this is fairly close to... all right. So there's what you've got. And then mine looked a little


Joe: cooler, from what I could tell, or what I remember on that, but it looked similar enough to keep going.


Eliot: Right, right. To be able to kind of start to do feature detection.


Oh, can you move up so I can see a bit more of the window? There we go. Okay. Perfect. I didn't see the posters before. I'm going to do a screen cap. Yeah. [00:52:00] Uh, rectangular. There we go. That gives me a good reference. And let's save that. Save As.


Okay. There


we go. Okay. So that's what it looks like on your end. So let me jump back to SynthEyes, and we're going to see if I can, uh, screen two, share. Let's go take a look at this. We're in SynthEyes, and I'm going to open up a script, run a script. That's the correct one, 9:50 AM. Yep. All right. Let's see if we're [00:53:00] looking about right.


Okay. There's our frame. Okay. So we're not crazy far off.


Okay, good. Now, now the bed's more or less lining up.


Okay. Yeah. So yours is different. It's definitely... okay. And yours looks more correct. Okay.


Joe: Yeah, that one definitely looks a little 


Eliot: weird. All right. Yeah. Cause it's not doing the right one. Okay. And the trick


go back here. Okay. And what I was running into, and I don't have a great answer yet, is... so when I do, let's see, I do Features, um, [00:54:00] and I'll go to Advanced. The tricky part is, if I run my... I'll try to use the corner trackers. Let's see, I'm going to switch this to detecting corners. Okay, so there are corners.


And then if I move through the scene, some of those should be detecting through enough frames. Let's take a look down here. And what I have... you can see this on the screen share, right?


Joe: Yep. I'm watching. Yeah. 


Eliot: Okay. So what I have it doing right now is looking for... um, usually in a user-generated environment, there are a lot of corner features.


And so, since it's going to track either spots or corners, right now I have corner features enabled. Okay. Um, let it catch up for a second. And clearly, in the beginning, there are just no corners, or not too many. And then up here, we start to see... you know, it can detect the corners of the [00:55:00] window, and it starts to see some pieces there.


Um, and then we can kind of see through that. So there's potential there. And I'm just going through this to see what it's going to be able to track and not. I mean, this part of the shot should actually work. Um, I think that's enough. So let's go see. I'm going to click Auto Re-blip. Auto Re-blip is where it shows you the feature points it detects


On a given frame, right? So let's, uh, let me switch from corners to 


Joe: Right. This is like blips. That's how, if you want to do manual tracking, right?


Eliot: Well, this is actually the auto-tracker. So what we want to do is find these feature points and automatically track them.


So we're not sitting there doing a bunch of manual tracks. I'm working through this with a couple of groups, actually, where the same thing happens. There's part of the shot where the automatic feature detection is fine, and then there's like a dry spot in the shot where it's just like, no.


It's just like, no. Um, and [00:56:00] so what I what I want to do is is. We're figuring out a workflow. So the auto tracker does what it does. And then the human part patches, the piece of it that where the automatic tracker is, is dying. Um, and I'm still figuring it out. Um, but this is, this is why I want to go through this and understand it.


So, small blip size. I think one of the problems is that right now it's set really big. So let's do four and eight. Okay. And that's already better, right? So we're seeing... yeah, what happens if we do two and four? Uh, I guess, okay, four and eight, and I'm going to look at the script. Okay. So that's interesting.


So what happens if I move over to another part of the shot? Okay. I move over here. What happens if we do,


You have to change something to get it to... okay. So we have a few pieces there. [00:57:00] Um, all right. So what I'm going to do is show you the script I'm experimenting with, uh, the Matthew Merck script. So I'm going to go to Script, User Script Folder. And this is the multi-peel script that Matt


brewed up, and it runs the auto-tracker at multiple levels. It's going to sample, um, small, medium, and large. I'm going to open it up so you can kind of see it. Uh, and this is the SynthEyes script, the, uh, natural-language feature script. So we're going to start off with, okay, 12, 24, 8.


So I'm actually going to start off with 4 and 8 to get small ones, and then 8, 16,


and then, okay, so we'll have small ones and large ones. So let's see if that works. Save, and re-find the scripts. There we go. So then what we're going to [00:58:00] do is try running this script. And so, okay, here we go. This went over to the side. So what it's going to do is go through and set the small and large features, like we just said, automatically, then sweep through all the frames.


Actually, this might take a little while, because it's going through like 600 frames, especially with the small features. But I want to see what it picks up. Um, and this may be something I have to experiment with more on the side. There are two things we want to do.


One is I want to get the exact correct, um, LUT transform, so that when it's going into SynthEyes, we're getting correct footage. Because when the footage is correct, it's better for the, you know, right? Yeah, the


Joe: feature point identification, right? Yeah. 


Eliot: Yeah. And then, um, I want to get that correct, and then we can go through and, you know, I wanna understand what it's gonna take to get SynthEyes [00:59:00] to detect some of these points.


Um, and certainly some people, you know, expert trackers, they just do a manual, uh, what's it called, a supervised track, where you just go through and tell SynthEyes to track a point. 'Cause you don't need that many points to track a shot.


Um, I think you need like eight. And, um, so this is something I'm figuring out as well: what's the ratio of where we have the automatic system do it versus the manual? And, you know, the first shots we did were just those green trees. Gotcha. Just hit it, and everything worked. So now we're getting into things that are a little more tricky and interesting, but this is real, right? This is realistic, the bright panel coming in, you know. I mean, for


This is, this is just, this is realistic. The bright, bright panel coming in, you know, like a, it's going to be like. I mean, for 


Joe: cinematic stuff, right, most people want backlit, you know, moody, like that gives it that look. Um,


Eliot: And I think [01:00:00] what I'm going to end up having to do is also learn how to adjust it.


SynthEyes has an image preprocessor where you can crank up the contrast and do things like that. So it's going to take me a little while to learn all the pieces on this shot, to understand what the preprocessor setting should be. The nice thing is that in SynthEyes we only care about tracking; we don't care about what the final image looks like.


So we can do all sorts of weird things with contrast and stuff to get the track right, and that's fine, because we're not affecting the actual content. So there, it found points, but it didn't find features... or, I'm sorry, the way SynthEyes works is it finds points it detects, and it needs to see them in 15 consecutive frames to make a tracker. And it looks like it found some points, but didn't find them on 15 consecutive frames.
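What the auto-tracker is doing, conceptually: boost local contrast so flat or backlit frames still yield corners, detect candidate points, then follow each one and keep only tracks that survive a minimum lifetime. This sketch approximates the 15-consecutive-frame rule just described with OpenCV rather than SynthEyes itself; all the parameter values are illustrative.

```python
import cv2
import numpy as np

MIN_LIFE = 15                                  # SynthEyes-style promotion rule
clahe = cv2.createCLAHE(clipLimit=3.0, tileGridSize=(8, 8))  # contrast boost

def surviving_tracks(gray_frames):
    prev = clahe.apply(gray_frames[0])
    pts = cv2.goodFeaturesToTrack(prev, maxCorners=200,
                                  qualityLevel=0.01, minDistance=8)
    life = np.zeros(len(pts), dtype=int)
    for frame in gray_frames[1:]:
        cur = clahe.apply(frame)
        nxt, ok, _ = cv2.calcOpticalFlowPyrLK(prev, cur, pts, None)
        keep = ok.ravel().astype(bool)         # drop points the flow lost
        pts, life = nxt[keep], life[keep] + 1
        prev = cur
    return pts[life >= MIN_LIFE - 1]           # seen across >= 15 frames
```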


So this is, um... [01:01:00]


Joe: So, what am I doing... I'm gonna tag out and hit the bathroom real quick. Oh yeah, go,


Eliot: go for it. Go for it. Like, this is, you know, yeah, 


Joe: two 


Eliot: seconds. I'm going to be crunching through some stuff.[01:02:00]


It did not detect any feature points. Right. Let's see if dialing it down...


Joe: Sorry about that. 


Eliot: Hey, no worries. I dialed [01:03:00] down the required number of, uh, frames, and, uh, I'm going to see if this actually does any better.


Now, there's something we may end up trying on this that I haven't tried before, which is a planar track. Since the top of the bed may actually be a reasonable plane, we can try that. Yeah,


Joe: I was actually going to ask you whether you prefer, like, Mocha, which I know is a planar tracker, versus SynthEyes, you know, if you had a preference on that.


Eliot: I've simply never tried Mocha. Uh, the thing I'm trying to do is do things that maintain 3D spatial accuracy, and I don't think Mocha does; I don't know it that well. I know [01:04:00] the SynthEyes tracker, and if I can use that while still constraining things to the correct 3D coordinate space, then I'm all for it. You know, part of this is just going to be a little bit of a voyage of discovery, me finding the odd corners of SynthEyes that do that, you know.


Just the same way that finding the drop-on-a-mesh thing, you know, makes that part of it work. Um, I'm sure I'm gonna have to find a couple other far corners of SynthEyes to see how to put all the pieces together. Um, okay. And then, of course, what we may end up doing... okay. So look at this. And then we'll also look at, um,


So look at this. And then we'll also look at, um,


and we could change the shot so we can see the origin in it. I don't want to do that, but we'll see what we have to do to just get a


Joe: So I noticed, [01:05:00] I mean, this is all part of why... I just noticed this comment. Can you see the screen?


Eliot: Uh, let's see. I know I'm sharing so I'm going to stop my share so I can see your screen.


Joe: I just wanted... this was just, you know... Okay. I just wanted to share this, because this was from a comment on one of those videos. Um, and, oh, shoot, where is it? Yeah. Just so that you see a comment, like, perception-wise. Are you able to see my screen?


Oh, no, no. Go 


Eliot: ahead. Go ahead and share. 


Joe: So it's just, it's just a stupid comment. You know, nothing that, oh, that's fine. 


Eliot: Yeah. Let's, let's take a look. 


Joe: Um, but you know, like, like this is where I'm like, okay, we're, this is what I know the whole purpose of what we're doing is to change this perception. Um, And where is it?


Sorry,[01:06:00] 


But show me that one. Oh, yeah, there it is. So it's just like, you know,


Eliot: Okay, so let's see... which one are they doing? Oh, right.


Joe: This was that... so there were a couple there where, right, they're trying, and I'm like, okay, I don't find it particularly, you know... I just wanted to share, like, the talk that's going on in the rooms.


Um, so you're getting out there for people to see, but, right, some of them are initially... I don't know why. But yeah, I just wanted to show that, hey, people are,


Eliot: Oh, can you scroll up? I actually want to see what the original video is.


I don't, I don't know which one. Oh, that was 


Joe: the one I sent you, remember, with the 360 camera.


Eliot: Oh, that's right. That's right. The three. Yeah, that's right. 


Joe: You know what I mean? So it was just like, I just was scrolling down and I was like, Oh, okay. People connected the dots of like, Oh, this could be [01:07:00] a similar way.


I think what they don't get is that Jet Set could work under terrible lighting conditions as well. That's where I see, you know, something where this couldn't... it doesn't matter, the lighting conditions, with Jet Set. So it could be... at least I think so,


Eliot: right.


I think it's a good idea. I think that spherical camera is a good idea. And, as with all these things, it's always about figuring out how to wire it into the general flow. Um, and I mean, I like what they did. I get what they did. It depends a bunch on some manual measurements, which always kind of worry me.


But on the other hand, having spherical data is just good, for all the reasons... you know, it's just a great idea. Um, yeah. Yeah, I agree. So what would be interesting... right. Actually, let me [01:08:00] think through this for a second.


Joe: I mean, here, I can show you. So I made them get one of those spherical cameras.


Um, just, you know, I'm into fucking around. And so I just did an initial test shot. Uh, screen to share, cause the window was closed. Okay. Let me try it again. So


Eliot: I should think through... like, if we could get something where we wire in the 360.


Joe: Here's, like, something that I had. This was from the 360 shot.


Are you able to see this in SynthEyes? Oh yeah, yeah. So this was me just sort of doing a... damn it, it has to buffer every time I open it. That's why. Yeah, it takes its own sweet time. It does take its own sweet time. Um, but I was able to get a pretty... I was able to get, you know, yeah, at that point.


That's [01:09:00] a pretty good solve, actually. Um,


Eliot: yeah, absolutely. Absolutely. That's and that's, that's taking in. It's the 360 camera, 


Joe: 360 camera, yeah. I just did a test with the 360 camera, um, just to be like, okay... What I'm having trouble with: I was able to do that, and I was also able to, um, set the offset.


Um, here, let me just open this up 


Eliot: Because I actually like the idea of figuring out how to use 360 footage in all this, because it's exactly what you want. Everything's sharp and in focus, and you have a 360 range. So it is exactly what you want for that. And it allows


Joe: you to use any lens, you know, that's like, that was like the big one was like, right.


You're not tracking based off of a. creamy bokeh or like, you know, super shallow depth of field, which just makes it impossible to get any sort of accurate real information. So it's like the idea that it's [01:10:00] like, Hey, do you still want accurate tracks while using your favorite vintage and, you know, shoot wide open lenses, knock yourself out, you know?


Um, so this was... here. So here I took the tracking data from, uh, SynthEyes. Oh, sorry, you probably can't see that. Let me share that. So I took that data...


It keeps hiding, and I was able to bring it into Blender,


so you can see that. So here it is in Blender, with the 360-degree shot. So I went in, and this was the camera that it imported with, and then this is my cine offset [01:11:00] camera here, based off the rough calculations I did, where I measured how far off the 360 camera was from the sensor. You know, pretty much what you guys do internally with your cine offset.


But my problem... you know, in the video he was like, yeah, just overlay the footage, and there you go. And I was like, wait a minute, how the fuck are you doing that? Like, overlaying the footage to check the alignment. And then I tried... I was like, okay, are you using it as a background for the camera?


I was like, maybe that's what you're doing. And I did set that up...


Eliot: No, what you could... oh, let me think through this for a little bit. Yeah, it'd be very interesting, because if we could do our calibration... you put your 360 camera on there and we run our lens calibration. We'd almost have to do two cine [01:12:00] calibrations. One question: does that 360 camera have a live output?


Probably not. Um, it would be astounding if it did. 


Joe: I don't think so. Yeah, I've never... I haven't seen that.


Eliot: Yeah, probably not. But nonetheless, if you could... here's what we could do. I'm just thinking through this...


Joe: Because there's the camera motion, you know. That's from SynthEyes tracking the camera there.


So I have that. And I'm just like, right, but how the fuck do I get... because this is looking at a 360...


Eliot: and 


Joe: I wouldn't have thought it was possible, but I saw them do it, so I'm like, okay, it is possible. And I was like, right, but how did they overlay the cine shot with the 360 shot?


Eliot: So they would have had to compute the field of view separately.


Joe: And I was like, did he make this... I don't know, maybe they [01:13:00] made the top camera that comes in from SynthEyes the same resolution as the 360 footage, you know, the 7,680 by 3,840, the 8K.


Eliot: Well, I just like the idea of using it.


And I mean, I get what they did in their tutorial, and I'm less worried about exactly replicating that than figuring out how we would work it into our workflow. Because I think it's actually fantastic. You know, a 360 camera running at 24 frames a second? That's exactly what you'd like.


And we could have it flashing frames so we could synchronize our data pools. And the other trick is alignment: how do we align it?


Joe: Yeah, that was the reason why... because I agree. I'm like, oh man, I'm glad you see that benefit too, of [01:14:00] having sharp images 360 degrees around you.


Oh yeah. 


Eliot: Yeah, no, that's why I was trying 


Joe: ...to figure out how they did it, so that maybe we could reverse engineer it, you know what I mean? So it's like, I've gotten it this far, and I asked, how the fuck did you align the images? That's the part that I can't get.


Eliot: Right. Right.


It's like, 


Joe: That was a step, of course, that he skipped. He was just like, "and then we did this," and that's it. I'm like, how'd you do that?


Eliot: That's the part where it gets tricky. Well, what they did is they had both cameras pointing straight ahead. So they had a default.


They had the default orientation of the 360 camera set so its straight-ahead is the same straight-ahead as your cine camera, right? So they had that, and so they set their offset. And then the last thing you have to do is set your field of view, which you can back out from [01:15:00] your normal camera, you know, your focal length, et cetera.


I mean, you can do the math; it's just some tedious stuff they probably didn't want to show. But I don't quite know yet. What I want to do is think about this a little bit and see if I come up with a clever idea. Because I really like the camera. I mean, getting accurate 360 footage is just such a win for exactly all this stuff, right?
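
[Note: the field-of-view math Eliot is referring to is the standard pinhole relation, hfov = 2 * atan(sensor_width / (2 * focal_length)). A quick sketch; the sensor and lens values are example numbers only:]

    import math

    # Example values only; substitute the actual cine sensor and lens.
    sensor_width_mm = 36.0   # full-frame sensor width
    focal_length_mm = 50.0

    # Pinhole relation: hfov = 2 * atan(sensor_width / (2 * focal_length))
    hfov_deg = math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))
    print(f"horizontal FOV: {hfov_deg:.1f} degrees")  # ~39.6 for this combo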


The more data, the better. And then figure out... SynthEyes can take multiple different things in, actually. So, okay, this is key. Okay, I'm starting to get it. The key thing in SynthEyes that we haven't done before is... I don't know exactly how we would do this in it, but you want to have the different pieces coming in at the same frame rate.


And the reason we haven't done that with the iPhone yet is because it's not the same frame rate; it's 30 or whatever. But the cine camera and [01:16:00] this 360 are the same frame rate, and that's the thing that could make it work. SynthEyes can use something called survey tracking, where... I mean, we use one variation of it, but there are other variations where you bring in different pieces of footage to help the main solver. I think that's probably how we would do it.


Did you buy that 360 camera?


Joe: Yeah. For the job that we're doing, for this Intel project, I just made the studio buy one. I was like, hey, buy this, so I could do some tests.


Eliot: Right, right. So it's the same one as in the video right now?


Eliot: Okay. Okay, okay.


Joe: So here's the SynthEyes video with the 360.


Is it showing you this, or is it just showing... I'm going to see if I can't just have it share my screen.


Eliot: Man, that would be so nice to have that. I tell you what...


Joe: Here's the screen. So this is it in SynthEyes [01:17:00] here. And you can see that track is pretty fucking...


Eliot: Pretty, oh yeah. It's 360 data.


You know, it's exactly what you want; it's the framework you want. I mean, I think this is a win, and I think we can actually incorporate this into our flow without a crazy amount of effort. Yep. And it's picking up points all over the place, right? You know, it's great.


It's really, really good for that. It goes through, yep, finding spots and then feature points. And then, yeah, I see you drew the line around yourself to keep it from tracking you, which is great.


Joe: Correct, exactly. So I just roto'd myself out, and I was like, damn.


I was just refining, you know; to get a 0.4 solve, I was like, yeah.


Eliot: I think we can do something with this.


Joe: I'd be happy to help you test whatever you want.


Eliot: Oh, no, no, this is great. So, okay, a request for you, because this is... could you do a Jet Set Cine shot? Oh yeah. And you've gotta, you know, put the whatsit on top of the pyramid and do one of the...


Joe: That's my idea, you know. That's the main thing where I'm like, okay...


You know, I'm still having trouble figuring out how to get it exactly aligned, so that when I bring it in there... that's the point: hey, put this on the tripod; you don't want to be sitting on that chair next to it, whatever. You know, obviously, like, I put the human figure there; if I wanted him sitting, I'd have to roto the table in post,


Eliot: which 


Joe: That stuff I get. It's just being like, all right, this is the track marker; have it come in as some kind of null object or something, in whatever.


But yeah, so, sorry, you were saying...


Eliot: So, could you... okay. Could you do a Jet Set Cine test shot, and also have that 24 fps 360 camera mounted on the camera and running? So, like, I think you'd probably hit record on the [01:19:00] 360, record on the cine camera, and then run Jet Set, because that way it's going to flash those frames.


And I want to see if we can actually detect those frames in 


Joe: the 360 camera. Oh, that's genius. 


Eliot: And then we do a frame pull, and we pull that into SynthEyes. I think it's great, because it exactly solves the problem that you always want to solve, and it's cheap, and it's 24 frames a second.
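
[Note: detecting the Jet Set sync flash in the 360 clip could be sketched as a scan for spikes in mean frame luminance. A minimal OpenCV example; the filename and threshold are placeholders, and since the flash occupies only a small part of an equirectangular frame, a real version would likely scan a cropped region rather than the whole image:]

    import cv2

    # Scan a clip for sync-flash frames by looking for luminance spikes.
    cap = cv2.VideoCapture("360_take.mp4")  # placeholder path
    means = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        means.append(gray.mean())
    cap.release()

    # Flag frames much brighter than their neighbors (threshold is a guess).
    flashes = [i for i in range(1, len(means) - 1)
               if means[i] - (means[i - 1] + means[i + 1]) / 2 > 20]
    print("candidate flash frames:", flashes)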


I'm like, this is a no-brainer. And it's just going to take us a little while, like, me thrashing, and I think this will go faster. Yeah, I mean, there's always stuff to figure out, but this frame pull is not gonna be hard. And then we can...


I'll have to look a little bit more at how we pull it into SynthEyes. But once we have the frame pull and it all matches, getting the additional data into SynthEyes is [01:20:00] just not gonna be that hard. We just add in the sequence. So yeah, if you could do that, that's what to do from here.
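
[Note: the frame pull itself is a routine extraction step. A minimal sketch shelling out to ffmpeg, with placeholder paths; the detected flash frame would give the trim point for lining the sequences up:]

    import subprocess

    # Pull the 360 clip out to numbered frames at the shared 24 fps rate.
    subprocess.run([
        "ffmpeg", "-i", "360_take.mp4",
        "-vf", "fps=24",
        "frames/360_%05d.png",
    ], check=True)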


Joe: Yeah, no, I'll do that for you, no problem. Like, I'll happily do that for you.


Eliot: Yeah, this is great.


Joe: I think I should have done it for this one; I could have used it. But yeah, no problem. I have the setup. I haven't touched it since I got it built; I've just left it built.


So I could easily do that. And then, you know, let's say we got this, right? Let's say this was from Jet Set and we got it synced. What would we then do with it? You know what I mean? So let's just say, for the sake of argument, we shot this with Jet Set and it figured out the offset.


We've got everything tracked now. I set a couple of these markers here, and this person where I want people, and then I brought it into Blender. So now I have it in Blender here, [01:21:00] and I'm just trying to think: okay, how do I rewrap this?


Eliot: The nice thing is that I think we can... I'm going to have to look at this a little bit more, but there are workflows in SynthEyes to do exactly this kind of stuff, where you have multiple inputs.


So what I would be aiming for is to bring that additional tracking data into SynthEyes and use it to help boost the main solver, so that you still get the correct solve. And here's the approach I think we would do differently: right now, what they're doing is solving the track completely with the 360 camera.


And then they kind of hand-offset that. And I think we can actually do better than that. I think what we would do is match the features, you know, do a feature lock between the 360 and the main camera, and then that's what comes into Blender. [01:22:00]


Joe: I like that. And that's what I'm really glad about.


At first, I was a little hesitant to share this, because I was like, I don't want...


Eliot: No, no, no, no. This is data. This is data.


Joe: But that's why I love talking with you. I knew you would look at it as, like, wait a minute, this is just another tool to acquire data. As you said, just another data...


Eliot: Tools are a bear. And as you're seeing, getting these solves is the biggest bottleneck in visual effects, I think. You know, that's what's...


Joe: ...crazy about this. I got a 0.3 solve in five minutes.


Eliot: right? 


Joe: Five minutes. Like, I just went through refining things. I had to remove anything that was in reflections or the like, you know what I mean?


And every time, the error just kept going down, down, down. And I know you guys are the right team to figure out how we can automate it: now you sync, you put your shot in, and it does that offset. [01:23:00]


Eliot: I love this. I think it's a fantastic addition, because it exactly solves... you know, the iPhone...


...has a wider field of view, and the wider the field of view and the crisper the image, the better for tracking; 360 is wide and crisp. And the problem with the iPhone is it's not 24 frames a second, and that's always been the bear with doing all this stuff. So I think this is great, and then we can automate all the pieces of it. Yeah.
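
[Note: to make the frame-rate point concrete, 30 fps frames land on multiples of 1/30 s and 24 fps frames on multiples of 1/24 s, so the two streams only coincide every 1/6 s, i.e. every 5th iPhone frame against every 4th cine frame. A quick check:]

    from fractions import Fraction

    # Frame i of a 30 fps stream lands at i/30 s; frame j of 24 fps at j/24 s.
    times_30 = {Fraction(i, 30) for i in range(31)}
    times_24 = {Fraction(j, 24) for j in range(25)}

    # Shared instants in the first second: multiples of 1/6 s only.
    print(sorted(times_30 & times_24))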


So if you can do that shot... so, okay. So yeah...


Joe: I'll do that later today for you.


Eliot: And then go ahead and try out the Unreal and the Blender, because I have to get off in just a couple of minutes here. So go ahead and try that, the Unreal and the Blender. And then what I can start doing is... and I'm going to contact the Z Cam guys.


And you know, with all these pieces, you just pound on them, and then each one of them is solvable. Like, you get it, you figure it out, you solve it, and then it's part of the flow. And doing an additional extraction of these is, like, a no-brainer. I think we can do that pretty easily in Autoshot.


And then it shows up as a shot package, you know, and then [01:24:00] frame extraction, match the frames, and away we go. And I think that's... there's a win. I love it. I like this.


Joe: I'm so game. Like, I'm gonna knock this test out for you today.


Eliot: It's cheap, too. Like, how much was that?


Joe: 500 bucks.


Oh, dude, we're there, we're still there: 500 bucks, yeah, for a new one. You can get the, like... and I know there's another company, Qoocam, coming out with their version that's, like, sharper images and blah, blah, blah. But I think for the most part, anything at this level and up seems perfect for this.


And that 500 range is, like, great. Yeah, we can get this. All those Matterport LiDAR ones, those are like 6,000, 7,000, and you're like, okay, never mind.


Eliot: Yeah, yeah. And this is light enough that people are actually going to use it. Like, I always look at...


Joe: this. Yeah, exactly. And I don't notice it at all in the build. I was like, okay, this didn't affect my shooting style at all.


Super lightweight. 


Eliot: This is such a win. And we can pull it, and this is great. And what I'm actually going to do is [01:25:00] say: oh, Matt...


Joe: "Hello, I need you to make me another script." Yeah, yeah.


Eliot: I'll go through it, and I'll spend, you know, the first couple of days diving deep into SynthEyes.


I don't want to waste his time. Like, I will go hard through the SynthEyes 360 solves and get, you know, 80 percent of the way there, to where I'm going to get stuck, and I'm gonna be like, help. And then he'll say, oh, you do this.


Joe: When I do that, do you want me to, like, process it in any way for you?


Or just give you, like, the raw data of it? Like, go ahead and...


Eliot: No, just take the recording of it. I want the recording right as it comes out of the camera. And, you know, just roll on all three of those things. And I'd say, you know, stack the deck in your favor: have it reasonably close to the blinking, flashing, you know, frame.


Because we're going to have to detect a bunch of tiny flashing frames in a 360, so we want to be reasonably close to the, uh...


Joe: You mean, like, make sure the camera's close [01:26:00] to the...


Eliot: Yeah, make sure all three cameras can clearly see the flashing markers. Yeah, because that's going to be how we sync, because this thing, I don't think it takes timecode.


Um, 


Joe: Yeah, I don't think so. I don't think it 


Eliot: ...does. And we can try time-of-day, but let's just start off with the flashing markers, because then this is clean. This is super awesome; I think it's going to be super clean to do this.


Joe: Amazing. No problem. I've got my marching orders, and I'm on it for you.


Eliot: All right, dude, this is great.


I'm gonna go talk to Z Cam. 


Joe: All right, 


Eliot: See you soon. Bye.