Transcript
Morning.
Good morning. Hey, Elliot. So I'm just trying to share the link with a couple of colleagues.
Oh, fantastic.
Happy New Year.
Happy New Year. All right. Turn on my little meeting thing. All right. Lemme go back and look at what we're dealing with.
All right. Share it here. It'll be fun to show you a little of the stuff we're we're cooking on.
Yeah, that'd be great. That'd be awesome.
Alright, and let's see that.
So it's funny, we've got two shows now, LED shows, that are VFX. Mm-hmm. But they're all being done through Unreal.
Alright.
As obviously they'll go to comp, they'll go to Nuke ultimately, but a lot of the renders are coming from Unreal.
Let me make sure I understand that. They're, they're LED shows going to Nuke?
I, I don't fully understand that.
So they've been shot in camera, and then we're doing, um, additional shots where we're using the same environments from Unreal and rendering, and then compositing. Um,
oh, okay.
On, on the top. So basically replacing what was in camera with a, with a render.
Okay.
Got it. Um, and my background isn't in comp or VFX, it's, it's in animation and motion capture. So I'm trying to get my head around how we can rebuild this in Unreal, uh, as a way of basically validating what ultimately you'll see in your composite pipeline.
Oh, yeah. Okay. This is the ulens stuff.
Sorry, it just took, takes me a second to remember content. No, no, no.
Sorry, I should have said that up front.
Ulens. Okay. Okay. So that's the question. So we've got a, a Jetset calibration, and we're looking at how it would go into Unreal. Okay, so let me look in the forum real quick. So now that I've, I've got my.
Now I've got my context working. Okay. Okay.
That was more high level, just so you're aware, um, that's kind of one of the things we've been using Unreal for, and, and Jetset as well, because I, I came to you a while ago asking, like, can we use it to track the camera, that kind of stuff.
Well, so this is, I mean, not to get wildly off the subject, but we're, we're working on, um, a sort of a much expanded project. 'Cause basically what we've found is, with Jetset we figured out how to pack an entire virtual production system into a phone. Mm-hmm. And it works great when you have a small team, like just one or two people, and, you know, Autoshot works and you send the files, you put the files on a thumb drive, give it to the person across the room.
Yeah. And all this stuff blows up when you start to have a much larger team, and they're geographically all over the place and you're distributed. And I can't remember, have I shown you any of Spark before? Um, yeah, I'll show you a little, 'cause it's been a while since the last
time we connected.
There we go. So what was obvious to me is that we need to build a collaboration system that's wired to Jetset, that, that goes from the front end, the planning process, um, through the production process and into the post-production process in one seamless software architecture.
That's all real time, all browser based. Everything just kind of works, and, and it's designed to work with Jetset. So this is all in the browser, there's nothing downloaded. Um, mm-hmm. We're, we're loading, you know, Gaussian splats and USD files, same as Jetset, you know, exactly the same file structure. So we've got, um.
You know, again, browser based, runs on the Mac, PC, et cetera. So this part of it is our shot design area where we can drop in, uh, you know, like, this is a, um, an XGRIDS scan of a part of Rome, and this is part of a USD scene, and everything's correct scale. So you can figure out whether the car is gonna actually be able to drive through the tunnel with an antenna sticking out, that kind of stuff.
But we're, we also have a hook to a database system that's designed around screenplays, so that you can actually load in your screenplay. It parses it, you know, blocks it into individual actions and character actions and dialogue, et cetera. You drop in audio for, like, an audio, you know, table read.
It'll lock it to the script. So then you can say, okay, I want to build out my coverage shots for these five lines. It'll instantiate a camera in the scene with the correct, you know, sensor and lens package and that kind of stuff. And you can test shoot, you know, test shoot your scene.
You can hook up Jetset directly to this, right? So you can just direct the, the camera motion in your browser. Nice. And then you can basically build out your coverage. And yes, you could do this in an Unreal or a Blender, but what I'm aiming for is so that you can be sitting in one place, and your
non-3D-speaking director of photography can be in a completely different place, and you can be communicating with each other one-to-one. And this is all, it's all a real, real time database, right? So if I move this, it moves it in his, in his screen. Right. And vice versa.
Yeah, 3D Google Docs for filmmaking. Um, and we're building out the whole pipeline so that, you know, you have a Jetset shoot and it uploads the, you know, the takes and all this kind of stuff, and does the metadata matching, and all the stuff we do in Autoshot we're actually gonna, you know, put into the web too, 'cause that way you can handle a
distributed team. Um, same basic
concept. I think that's what we like about it. It's the way it kind of batches up your shots, and then you can deliver them to various artists, and all that stuff works really well. And, you know, the way we build it in Unreal, I think it's great. You've got the scan there for lineup.
Yeah, yeah. This is, I mean, 'cause this, this is, this is what happens. The shows go to 600 shots in a blink.
Yeah, yeah, yeah, yeah.
And so that's, that's what we're cooking on. We'll have the, the story planning version of this, we're gonna, um, probably have a beta, uh, at NAB in about three months.
Um, so coming up fast. Wow.
Okay.
Um,
yeah,
I mean, it works. And are you
coding this? Is this.
Oh yeah. I,
oh yeah.
So, no, we, I mean, we have a small team and they're, they're very good and they're very, very fast. And we know exactly what we're aiming for, 'cause we already built Jetset, and, you know, we know that's, that's basically USD and Gaussian splats.
The whole, whole thing is, is built on top of that. Yeah. So that way you can have six people collaborating and they're all in a different USD layer. And everybody else seems to be looking at USD for post-production, which is, okay, it's a giant pain in the butt. Mm-hmm. Because you have to convert all your assets a hundred percent to USD, and then you have to do a giant, like, Solaris pipeline.
Okay, great. But most people just have their stuff in Maya or Blender or Unreal. That's your 3D app, and that's where it needs to go. So we just use the USD on the front end, where you can have 16 people hammering on it and they don't blow each other up, because of the layering system. So anyway. Okay.
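The layering scheme he's describing, where each collaborator writes to their own USD layer, looks roughly like this as a root .usda file (a sketch; the layer names are hypothetical, not Spark's actual file layout):

```usda
#usda 1.0
(
    subLayers = [
        @./layers/artist_b_set_dressing.usda@,
        @./layers/artist_a_camera_blocking.usda@
    ]
)
```

Each person edits only their own sublayer; USD composes the stack in strength order, so concurrent edits to different layers never clobber each other.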
This is, I, I have a contrarian view of this, because I think it's a fantastic technology that everybody's flogging to try to get it to work in post-production. And I'm like... yeah.
Well, it's trying to be something to carry it through the pipe, right? Something from pre-production all the way through the pipeline.
yeah.
And there's, there's some real legit things in that, and we're gonna carry it most of the way through. And then kind of at the end, be able to kind of switch over all the USD decisions and just drop 'em into the Blender stack or the Unreal stack or, you know,
yeah,
pick your, whatever your 3D weapon is.
I mean, that would be my thing with Spark.
Obviously we've got a whole team of artists that work in Unreal.
Yep.
Where do they come on board within this, and how seamless is it going from Spark into an Unreal scene?
You still, um, we still end up wanting to, you start in Unreal frequently, and you might export a kind of a proxy USD scene from Unreal and put this in.
And this is the conversational thing, so you can be, again, you know, it's lightweight. This will work on a, on a MacBook Air, like I'm testing it at all times on a MacBook Air, 'cause this is the writer communication tool. And they don't have Nvidia, they don't do Windows, they don't do any of that. The director is, is in some random airport on his MacBook Air.
And this is how we're going to communicate, and the whole system is designed around that. To catch all those important camera blocking decisions and then ripple them back into the big Unreal stack, where you've got the, you know, quad 5090s on your home box. Great. That's, that's where that should be.
Okay. You don't need 5090s to make, you know, composition decisions. No, no, no.
Yeah. Lineups and
so, alright. Um, so back to ulens. Um, 'cause yeah, at some point I'd love to have you guys kick the tires on this, uh, 'cause I'm sure you'll just kick the tires hard.
That would be great. Yeah, very much so.
All right. Lemme look at the ulens and the solve.json. They should have the same, um, uh, they should have the same, um, let me put this in the right spot.
Well, it looked like the same sort, like the same numbers. I think one of them, let me pull it up as well. Sorry. The, the K values, whether it was, whether it was the Jetset one or the
All right.
Almost like K1, K2, K3, and, and then the other one
was just, yeah, that's, that's standard OpenCV stuff. Um, yeah. And we might have to, you know, hand hack it to, um, uh, you know, to get it all to work, but as long as they haven't done something crazy, you know, it should still be the same OpenCV K1, K2 model.
Every once in a while, somebody does something weird. Lens calibration is, man, especially if you're trying to get into the old school Foundry Nuke lens models. Literally, I tried to invert 'em once, and I spent three days trying to invert those equations to get one from the other. I'm like, nope, can't do it. Oh, right.
Back to ST Maps.
Yeah, I was trying to rebuild it in Unreal, just, like, typing it in, and that wasn't working, so.
All right. Let me look at the two, two pieces of it. So, all right, so our goal is, I, I'm just gonna pull this up, and let me pull up my text editor so you can see what I'm doing. Um, there we go.
I'm gonna open up both of these. Uh, open. Nope, not that. I want a different program. I want Notepad++.
There we go. Okay, let's see if I can't get this thing to format correctly. All right, I'm gonna share my screen so you can see what I'm doing. Uh, Notepad++. And so this is, this is the solve.json. Lemme see if there's some way of, of formatting this easily. Um, View. Do we have, you don't need word wrap. Do we have a JSON display?
Maybe? Um, nope. Lemme just look briefly, I, I won't spend time on this.
The Jetset one displays correctly, doesn't it? The, the,
yeah, it's probably, um, so this is the ulens one, and it shows, you know, nicely. And then this one of course is.
If you take the .txt, so I had to put .txt on top to upload it to the forum, but
oh, you know what, if you
take that off, it works.
That's what I need to do. Thank you. There we go. Get rid of that. That way it knows it's a JSON file.
Yeah.
What a concept. Alright. View. And it is still,
Hey, I, I used Sublime and that worked for me.
Oh, okay. I'm on, I'm on Windows, so I have to look at this a little bit different. Uh, Plugins, JSON Tools.
Pretty print. Okay, there we go. All right, so we are in our calibration. Lemme just look through what, what we have here. So this is the solve. So there's cx, there's cy, fx, fy, K1, K2. Okay, great. So this is, this is fine. And then this, this is the transformation matrix. Um, okay, great. All right, so let's then, let's then look at the ulens.
Um, and you, you're seeing this right as I'm switching? Yeah,
I can see it,
yeah. Okay. So what I think we're going to want to do is a little bit of copy and paste between these two. Um, so the camera parameter table, so this is fx and fy, 2058. So where's the, where's this, uh, ulens file from in the first place?
Just some random, I,
I'll be honest, I just copied it from, it was from, uh, vi.
Okay. That's fine. That's fine.
Just a blank basic one. Ignore the values in there.
What I tried to do is match the ones I thought lined up.
Oh, okay. This is fine. And I'll, I'll just kind of enumerate this for reference for, you know, yeah,
someone, someone else who's looking at this stuff. So fx and fy, this is the focal length in pixel values, which makes zero sense to anybody unless you're doing machine vision. But the equivalent ones over here, we want our fx. So I'm gonna grab this, um, and that way we will have an accurate pixel focal length.
So I've already done that, so
Oh, fantastic. All right. So lemme just wonder
that that was the one I recognized.
Yeah, fy. All right, here's our focal length.
Oops.
So then we're gonna need to, and by default we don't actually write out cx and cy, and it looks like they didn't either. Cx and cy are the center offset, where, if you're really trying to solve super accurately, the machine vision sensor is not perfectly centered under the optics.
Like the optical projection. Yeah. It's always a little bit off. Honestly, don't worry about it.
Yeah.
This, there's this whole category of rabbit hole stuff in here, because the truth is we're dealing with realtime tracking, and realtime tracking on its best day is centimeter-ish accurate, something like that.
Yeah. Um, and then of course you have misalignment of the projection, and that's at, at the mounting of the camera to the, the sensor. And if they're misaligned by, you know, a quarter of a degree, by the time you're 20 meters out, you're off by like five centimeters. Yeah. And that's just, that's just geometry.
So I, I ran the math early on. I'm like, we're not gonna get sub-pixel off of it. Nobody can get sub-pixel off of a mounted, a mounted tracker. So let's, let's,
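The geometry he's describing is just distance times the tangent of the mounting error. A quick Python sketch with the numbers from the call (his "five centimeters" was a rough in-head estimate; a quarter degree at 20 m works out to closer to 9 cm, the same order of magnitude):

```python
import math

def lateral_error_m(distance_m, misalignment_deg):
    # Lateral offset at a given distance caused by a small angular
    # mounting error between the tracker and the camera body.
    return distance_m * math.tan(math.radians(misalignment_deg))

print(round(lateral_error_m(20.0, 0.25) * 100, 1))  # → 8.7 (cm)
```

Either way, the point stands: a fraction of a degree of mount error swamps any sub-pixel gains from higher-order lens terms.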
but your lens calibration process, the, the way you calibrate it is presumably accurate, right? The
Yeah, it's
static frame at that point.
It's, uh, it's reasonably accurate.
Um, the way we calibrate lenses, you know, you've seen our process, you know, we capture a bunch of different things.
Yeah. I've done it live now.
Yeah. And it's reasonably accurate for that particular, um, focus distance and, and that setup. 'Cause of course, as you're racking focus, the focal length is changing, it's breathing, it's, it's changing.
So it's pretty accurate for that instant in time. But, you know, it's not animated, and I'd rather
do it in Jetset than do it in Unreal, I'll be honest, 'cause Unreal's a bit more of a faff.
Oh, yeah. It's a lot, putting the video in and,
oh, yeah, yeah. Uh, so, okay, so we've got this. And then honestly, what I'd say is.
So the K1 and K2, K3, those are the distortion terms. Um, we, we generally only calibrate with K1, which is, um, the, the r-squared distortion. It's kind of the lowest-order term of distortion. Honestly, K1 by itself is the most stable, and it'll handle almost anything. Okay. K1 and K2 are when you start getting into fisheyes and you start getting into weird stuff.
Right. Okay. And the problem is that they're, they're higher order. They're r to the fourth and r to the sixth parameters. Any time you're dealing with something that's a fourth-order parameter, it is twitchy. Right. So we tried that, and, you know, early on, of course, I'm like, no, let's try calibrating super accurately, and, yeah.
It blew up stuff left and right. So after that I'm like, alright, we only do K1. Okay, that's all good, because that is numerically stable and life goes on. So I'm gonna put in K1 here, and I'm gonna get rid of their, their, you know, K3 distortion value. I don't know
why they're putting it there.
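The K terms being discussed are the radial part of the standard OpenCV (Brown-Conrady) model. A minimal sketch, applied to normalized coordinates (pixel position minus center, divided by focal length):

```python
def apply_radial(x, y, k1, k2=0.0, k3=0.0):
    # Radial distortion scale: 1 + k1*r^2 + k2*r^4 + k3*r^6.
    # K1 alone is the numerically stable low-order term; K2 and K3
    # are the higher-order terms that get twitchy outside fisheye work.
    r2 = x * x + y * y
    scale = 1.0 + r2 * (k1 + r2 * (k2 + r2 * k3))
    return x * scale, y * scale

# Mild barrel distortion (k1 < 0) pulls an off-center point inward:
apply_radial(0.5, 0.4, k1=-0.1)  # ≈ (0.4795, 0.3836)
```

With K1 only, the scale term stays close to linear in r², which is why it behaves; the r⁴ and r⁶ terms amplify small calibration noise at the image corners.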
Oh, sorry. That might be me. Maybe I put it in the wrong place.
Oh, no worries. Yeah, I would still view that with deep, deep terror. Like, no, no, that would be me. K1... there's a K3? Like, Captain, she's gonna blow. All right, so, all right, so we've got K1, and then encoder tables.
There's no data there. Yeah, this should all... and let's make sure our image dimensions make sense. Um, okay. Ah, this we have to be careful of. Um, camera parameter table. So gimme a second to think through this. We're calibrating with 1920 by 1080, so I think this is correct. Um, 1920 by 1080. I think that's, I think that's, that's correct.
I have to be a little bit careful with this. This, this should work, and if it does something weird, then, then I'm gonna, I'll have to double-check your image, image dimensions. But I, I think everybody's
just the fx value, because I think I, I looked at someone else's lens calibration file, and the values seemed completely different.
Almost like a magnitude of, say, yeah, a hundred or something.
The, the part you have to watch out for is, the way these, um, these machine vision techniques work is they, they don't understand outside dimensions. They don't know focal length in millimeters at all. You have to back that out based upon the sensor size and a bunch of other factors.
Yeah. So what they're solving for is pixel distances, and, um, and again, this, this, you know, so for example, what's your sensor width? 18, 18 millimeters. And so, um, with 1920 by 1080, this pixel distance is gonna be on the order of, like, a 20 millimeter, right? If I'm just, I'm just doing this in my head, a calculated, mm-hmm,
focal length. Right, because you see the correlation, like 18 millimeters equals 1,920 pixels. So then 2058 pixels is a little bit higher, like, you know, 19, 20 millimeters or something. So this is probably a calibration from a wide. Right. Am I, am I roughly correct?
I think so.
Okay. Like a 19 millimeter?
Yeah.
Yeah. It's a zoom lens. And that's the wide, yeah.
Oh great. All right. So good, the world makes sense. Um, I'm gonna hit save and I'm gonna send you this file. Lemme look through the rest of this stuff to make sure there's nothing else. Nothing else. Oh yeah, there's the 20 millimeter, aha. Yay. That sounds good.
Uh, it's always good when intuition matches reality.
Yep.
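The mental arithmetic being checked here is just the pinhole relation between pixel-unit and millimeter focal lengths. A quick sketch with the numbers from the call (18 mm sensor width, 1920-pixel-wide image, fx of 2058 pixels):

```python
def focal_mm(fx_pixels, sensor_width_mm, image_width_px):
    # fx in pixels = f_mm * image_width_px / sensor_width_mm,
    # so invert that to recover the focal length in millimeters.
    return fx_pixels * sensor_width_mm / image_width_px

print(round(focal_mm(2058, 18.0, 1920), 2))  # → 19.29
```

Which lands right on the "19, 20 millimeter" estimate from the call, consistent with the wide end of the zoom.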
Uh, OpenCV, there's the sensor dimensions, that's great. You know, it's millimeters, dimensions, camera parameters. Okay. I think that's it. And we don't have any encoder data. Um, okay, that all makes sense. Boy, I mean, this is a perfectly nice lens file, and we're looking at it this way because Unreal, they don't have a Python script for their lens system.
Right. And I think we tried this two years ago and, and just bounced off it like,
right,
like kryptonite, you know? 'cause we, I even think we looked at binary editing the um, like yeah, I remember seeing that's where the, I
found the post on the forum. I was like, ah, okay.
But this is okay. And so if they have a text format of their, of their lens file, then we can just generate this.
This is, yeah. So I'm super glad you brought that up. Okay. So this is good. I'm gonna save this and send it to you. And let's, let's, uh, lemme look at this distortion. Yeah. Okay. Okay. Okay. I
should have been copying
it. Oh, oh, I made a mistake. I made a mistake. Focus encoder and zoom encoder. Yeah. So, uh, you were correct.
Oh, I was right. Okay. I put it in the wrong spot. So we need to have, haha, that would've been, uh, that would've been weird. Okay. 0.0 zoom encoder. And we're gonna put this as the K1. There we go. Okay. Focus, zoom, K1. Okay. Well, it, it's nice to read the headers before you change, change things.
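Once the ulens text format is confirmed stable, the hand edits above could be scripted. A sketch only: the JSON key names below ("CameraParameters", "Fx", "Distortion", and so on) are hypothetical stand-ins, since neither file's exact schema appears in this call:

```python
import json

def solve_to_ulens(solve_path, ulens_path, out_path):
    # Copy Jetset solve values into a ulens-style JSON file.
    # All key names here are illustrative guesses, not real schemas.
    with open(solve_path) as f:
        solve = json.load(f)
    with open(ulens_path) as f:
        ulens = json.load(f)
    cam = ulens["CameraParameters"]
    cam["Fx"], cam["Fy"] = solve["fx"], solve["fy"]
    dist = ulens["Distortion"]
    dist["K1"] = solve["k1"]       # Jetset calibrates K1 only
    dist["K2"] = dist["K3"] = 0.0  # leave higher orders zeroed
    with open(out_path, "w") as f:
        json.dump(ulens, f, indent=2)
```

The point of scripting it is exactly the mistake caught above: a generator never pastes K1 into the encoder table.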
All right. That sounds good. Um, yeah, let's, let's try it out. So I'm gonna send you this file. I'll stop the share and just shift this file over to you, um, the 20 millimeter ulens. And lemme go find your email, Tim. Okay.
Okay.
All right. Sending,
You should get that in a second in your email.
Okay.
And then, do you, I mean, do you have the system? Can you kind of load it and see if things, things make sense over there?
Yes, I've got it in.