# Office Hours 2024-11-01
**Eliot:** [00:00:00] Hey, Mark. Hi. All right. Good to see you.
**Mark:** Yeah, I was a little nervous. No one was showing up. I thought I had gone into limbo here.
**Eliot:** Oh, no worries. No worries. All right. So, what are you working on?
**Mark:** Uh, well, I'm gonna start attacking your system again. I've got the Jetset Cine and everything.
I think I finally figured out the cooler. Boy, if you don't put that spacer on there, that magnet just sticks like glue. It took me forever to get it off of the phone.
Interesting. But, well, that's what you get. It's Chinese; there's just no instructions for the shark at all, so...
**Eliot:** Right, right. No, the magnetic fixture is pretty strong. Yeah.
**Mark:** I was always worried that those spacers might hurt the, [00:01:00] uh, the heat transfer, but apparently not.
**Eliot:** I guess... by spacer, can you show me what you're talking about? Because usually I just put it on without much of a connection. But is there like a heat transfer sticker?
**Mark:** Well, I'll show you what it's got. You see, I had to consult the instructions. You see, they give you that metal plate.
Okay. And you're supposed to stick that to the phone. Oh, okay. And that's got its own little adhesive. And if you do that, then this thing really sticks like glue on it. Whereas if you don't have that, it tends to fall off the phone.
**Eliot:** Oh, okay.
**Mark:** So this is very important. But if you don't put this plastic spacer, which it has on top of it now, then it sticks to this thing so hard that when you try to pull it off, it pulls the metal [00:02:00] plate off the phone altogether.
Oh, goodness.
**Paul:** Interesting.
**Mark:** So it's, it's really weird. Hey, Paul.
**Paul:** Hey, Eliot. Nice to meet you properly.
**Eliot:** Excellent, good to meet you as well. Hi, Paul.
**Paul:** Good to meet you, Eliot. How you doing?
**Mark:** Hello. Yeah. Hey. But yeah, so, I know that these gentlemen are probably really anxious to get to you.
I just have two quick questions, and then I'm gonna let you go, if that's all right.
**Eliot:** Oh, sure. Go for it.
**Mark:** Yeah. I noticed, with my first experiment, that I was on the school stage, which is primarily black with a very bright green screen, right? And I think we were using the AI compositing system, which is working great and everything. But the phone saw all this black area and said, oh, well, I guess I've got to brighten this up, and it totally blew out the green screen, right? It's overexposed. So is there a way of locking the [00:03:00] exposure manually on these phones?
**Eliot:** Yeah. In the app, right where the focal length is, you see, you know, 24 millimeter, 32 millimeter, et cetera. To the right there's a little exposure button, an AE button; that's the auto exposure button. If you tap it, a little lock icon appears on it, and that locks the exposure to the current setting. So if you're just shooting Jetset on a green screen, then yes, you definitely want to lock off the exposure setting. Now, when we run Cine, we actually unlock that, because in Cine it's the cine camera that is controlling exposure, and the Jetset camera is just doing tracking. So we actually want it running auto exposure, so it handles various lighting levels as you move through the shot; it's just equalizing to whatever it sees. We turn auto exposure back on when you hit roll in Jetset Cine, so don't be surprised by that; it's to preserve tracking data. But if you're just shooting green screen on [00:04:00] Jetset, or Jetset Pro, where you're recording to the iPhone, yep, just set exposure for your green screen the way you want, click that lock button, and it'll lock.
**Mark:** Ah, okay. That's good. And then the only other question is the fantasy castle. I did take note, and I'm amazed, that as you're scanning, the phone is actually taking note of where the light fixtures are in the room and lighting with that. But the fantasy castle is very flatly lit, so is that lighting baked into that background?
**Eliot:** More accurately: when we import the fantasy castle from Unreal, it's imported using a kind of cut-down version of their PBR textures, physically based rendering textures. Almost everything in Unreal has a physically based rendering material set up, including albedo, normal, and roughness; I think those are the basic ones that everything has.
And on the phone, we simply don't have the [00:05:00] same lighting model that Unreal has. What you have is the phone capturing a very simplistic HDRI of the environment and using that to illuminate it, roughly. But frankly, it's a phone; it has two orders of magnitude less compute than Unreal does.
So that's why, inside the phone, we use the rendered model largely as a proxy, so you can see where the framing is and where things are. And if you really need to see the lighting match, that's when we would use the real-time tracking data out from Jetset into Unreal for the live rendered preview.
There's a tutorial on this; I'll show you. Here, I'll send this real quick. Getting Unreal to frame-lock is, you know, you can do it, it's just hard, and we have a tutorial on doing that. But getting the data in, so you have both the live action cine camera [00:06:00] view and the rendered CG view in Unreal, isn't too hard.
So I'm going to put the link in here. This is the tutorial to go through to bring live tracking data into Unreal and composite it with an inexpensive capture device; we used an Elgato HDMI capture dongle. That's a good start for bringing in live data.
And that's what you would use for your lighting reference, because inside the phone we have much more limited rendering; we have that for blocking geometry. Does that make sense?
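For the technically curious, the kind of key-light estimate a captured environment map makes possible can be sketched in a few lines. This is a hedged illustration, not Jetset's actual pipeline; the file name is hypothetical, and reading EXR via imageio requires an EXR-capable plugin.

```python
import numpy as np
import imageio.v3 as iio

# Load an equirectangular HDRI (hypothetical file name).
hdri = iio.imread("env_capture.exr").astype(np.float64)

# Per-pixel luminance from linear RGB.
lum = hdri[..., :3] @ np.array([0.2126, 0.7152, 0.0722])

# Brightest pixel -> spherical angles -> world-space key-light direction,
# which a CG key light could then be aimed along.
h, w = lum.shape
v, u = np.unravel_index(np.argmax(lum), lum.shape)
theta = np.pi * (v + 0.5) / h              # polar angle, 0 at the top row
phi = 2.0 * np.pi * (u + 0.5) / w - np.pi  # azimuth
key_dir = np.array([np.sin(theta) * np.sin(phi),
                    np.cos(theta),
                    np.sin(theta) * np.cos(phi)])
print("key light direction (unit vector):", key_dir)
```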
**Mark:** What I gathered from that cornucopia of science was: basically, this is for Jetset Cine, so you can quickly see your previz. But if you want to fine-tune your lighting, there's a way of seeing the actual lighting in your set and then matching, say, the key light position for the shadows and everything. So you have that facility.
**Eliot:** Yeah, you can send [00:07:00] that data to Unreal. And what I'm working on literally right now is a workflow you can use to generate, from Blender or Unreal, soon to be others, a Gaussian splat of your rendered 3D scene. We're at the point now where Unreal can render a thousand frames in seconds, and Blender can do that as well. You can construct a Gaussian splat that can then load into the phone, and that has the rendered, made look: it has reflections, highlights, the whole nine yards. So you would say,
**Mark:** Yeah, that was my next question. The Gaussian splat essentially bakes in the lighting. So that would...
**Eliot:** Yeah, you'd have
**Mark:** the best of all worlds there.
**Eliot:** It's a remarkable advance, and the science on this is happening very, very quickly; the research is moving. But I'll show you, and this is in the Blender tutorials on authoring with Gaussian splats. I'll share.
**Mark:** Did you just paste something for me? Cause I didn't see anything.
**Eliot:** Let me share the screen real quick. So this is the tail end of the tutorial.
Whoops, I just lost my [00:08:00] screen. There we go, it's back. Okay. So this is a rendered Gaussian splat running inside Jetset in real time. This one is captured from a live action environment; it's a coffee shop over in Beijing where they took a few hundred photographs and did some LiDAR. But the thing you'll notice is the reflections are not baked.
The reflections are actively moving correctly; they're the correct reflections. So you can shoot this. You can have mirrors, you can have all sorts of stuff, and the Gaussian splats will capture that. And it'll also work with a 3D scene. So what we're doing right now is building a partially automated workflow to render out those frames from Blender or Unreal. Of course, the key thing is we're trying to maintain the coordinate system between the Gaussian splat and the original,
so that if you shoot the Gaussian splat in the phone, record, and push the data back into Blender or Unreal, everything matches; the coordinate systems all line up correctly. That's what I'm debugging right now. [00:09:00] It's extraordinary, because you can have a fully rendered 3D scene in your phone.
And that's it. You can do your lighting match. You can add moving objects as an overlay with USD; the Gaussian splats don't move yet. But as a background lighting reference, it's incredible. I mean, it's just incredible what's possible in your phone.
**Mark:** You're basically now reproducing what we used to do for centuries on a real set with real actors. Not centuries, but at least a hundred years, right? The Gaussian splat idea is great. So Unreal now has a Gaussian splat renderer?
**Eliot:** They have a few different ways of doing it. There are multiple plugins, and we're going through and debugging that process. I haven't done the tutorial yet because I want to make sure our coordinate systems match.
When you write out your renders, you have to write out a matching .xmp metadata file to load into RealityCapture. That way, when RealityCapture [00:10:00] does its initial solve, it maintains the original camera poses instead of doing what these programs normally do, which is come up with a random coordinate system and scale, which does not help us.
We want to match precisely back to the original coordinate system and scale, whether it be from a real-world scan, as this one was, or from a CG environment. That needs to match. And forcing those solvers to match the coordinate systems is... well, I used to have hair, and I haven't had hair for a long time, and now I have even less.
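For illustration, here is a sketch of the kind of pose-locked .xmp sidecar writer this implies, one file per rendered frame. The xcr attribute names follow RealityCapture's XMP conventions as best as can be recalled here; treat the exact attribute set and the rotation convention as assumptions and verify against a file RealityCapture itself exports.

```python
# Hedged sketch: write a RealityCapture-style pose-locked .xmp sidecar
# next to each rendered frame so the solver keeps the known camera pose
# instead of inventing a new coordinate system and scale.
XMP_TEMPLATE = """<x:xmpmeta xmlns:x="adobe:ns:meta/">
 <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">
  <rdf:Description xmlns:xcr="http://www.capturingreality.com/ns/xcr/1.1#"
    xcr:Version="3" xcr:PosePrior="locked" xcr:Coordinates="absolute"
    xcr:CalibrationPrior="exact" xcr:FocalLength35mm="{focal35:.6f}">
   <xcr:Rotation>{rotation}</xcr:Rotation>
   <xcr:Position>{position}</xcr:Position>
  </rdf:Description>
 </rdf:RDF>
</x:xmpmeta>
"""

def write_sidecar(image_path, rotation_3x3, position_xyz, focal35):
    """rotation_3x3: nine floats, row-major (assumed world-to-camera)."""
    rot = " ".join(f"{v:.9f}" for row in rotation_3x3 for v in row)
    pos = " ".join(f"{v:.9f}" for v in position_xyz)
    with open(image_path.rsplit(".", 1)[0] + ".xmp", "w") as f:
        f.write(XMP_TEMPLATE.format(focal35=focal35, rotation=rot, position=pos))
```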
**Mark:** Thank you for sending that picture, by the way. That picture of you is great, because it has your original camera.
**Eliot:** Oh, yeah, yeah.
**Mark:** That's nice.
**Eliot:** So that's what we're doing. And the benefit and the possibilities of this are incredible, because we are coming very, very quickly to a world where, to run virtual production on a set or on location, you don't need to be hauling around a rack of equipment, a set of 4090s, [00:11:00] the tracking gear, all that kind of stuff.
It's a phone on a camera, and maybe a remote laptop so you can remote-control it, things like that. But that's about it. And it's very exciting to see this starting to come together.
**Mark:** Oh, great, great. So if you just have that link, I could copy it. You said you had a...
**Eliot:** Yes, let's see, this is the link for the Gaussian splat setup.
And then there is a matching workflow where we... I must not be seeing the chat or something. Yeah, pull up the chat, because I'm putting this in... let's see, you guys see it? I'm sending it to Office Hours, everyone. Got Notes, Whiteboards, Apps... Paul, are you seeing the links that I'm sending to the chat?
Okay. You're seeing those. Okay.
**Paul:** Um, yeah, along the bottom there. I don't know if [00:12:00] you've made the window minimized, Mark.
**Mark:** No, I've got it big. Let's see... I've got More... Show Chat Previews.
**Eliot:** There you go, Chat is probably what you want. That's where I'm pasting the links, into the Zoom chat.
I mean, I can also just email them to you offline. That's fine.
**Mark:** Yeah, I'll eventually find it by the time this Zoom meeting ends.
**Eliot:** Sure. All right, I can copy a couple of these over.
**Mark:** Okay, well, great. I think I'd like to hand it over to these two very patient gentlemen who are anxious to talk to you.
**Paul:** Hey, no, it's all good. It's all good, Mark.
**Eliot:** No problem. Carry on. These are great, because usually what you find is that almost everybody's trying to solve very similar sorts of things involving hooking cameras and live action together. All right, Mark, I just sent you [00:13:00] an email with those three links, so you should have them.
**Mark:** Right, right. And I finally just now found them, so that's great; I got them both ways. Thank you very much.
**Eliot:** Absolutely. And yeah, just keep me updated as you're hooking all the Cine stuff together.
I mean, you have the initial set of tutorials. Do you have those?
**Mark:** Yes, the tutorials are great. I don't think I'm going to have a chance to do a full Cine pipeline, but we're deriving that information from your tutorials directly. The class at the school is ending soon, and I've got to deliver by the end of the month.
So there's just no time to actually do it, but I think the general Jetset is fantastic. And I'm going to try to do another one, to take up your suggestion of doing a more dramatic move than the tilt-down that I did. With that tilt-down of the girl on the tower, I noticed [00:14:00] that if you try to monkey around with those origin points, it's very easy to throw off the tracking. You've got to lock it and not change your mind on the spot.
**Eliot:** I'd say take a look at this episode that just came out, and show it to your class. It's by a YouTuber named Alden Peters, who we've done a couple of projects with. This just came out yesterday, and I'm starting to show it to a lot of people. He's using Jetset not only to pre-shoot and pre-edit the entire production, but also as the camera tracking and capture system.
And he's able to get the vast majority of his shots tracked right out of Jetset Cine and just pull them in. In fact, partway through, he notes that he had one shot that didn't track; I think he was right up next to the green screen, so there were just no feature points to get.
So he had to manually track that one, and doing [00:15:00] that one shot took him the same amount of time as finishing the 23 other shots in the sequence. Oh, I'll bet. Yeah. It's maddening.
**Mark:** I hate manual tracking. I was never good at it at all.
**Eliot:** Oh, it's brutal. I mean, that's why I started Lightcraft: I did it a couple of times and said, this is madness. How does anyone get a project done with this method?
But what's really interesting is he's doing some very complex things: complex interactions between a live action character and a CG character, with lots of moving cameras. And you especially see that because the interactions between the characters are, I mean, colorful. It's a relationship between a person and a robot, so there's that. But the fluidity of how the characters are interacting: he's got the animated character loaded into Jetset when he's shooting, including the dialogue, right? So the actors can hear the animated character talking, and he puts up a C-stand so the actor knows where the character is, but they [00:16:00] hear him. So the timing: he can hear the character talking through Jetset, and the motion and the blocking of this is exactly what I wanted to see someone do, because it's very fluid.
It's exactly the kind of blocking you'd see on a good live action production, where you're not worried about where the actors are; you just know. And the camera knows, because now the camera does know where everybody is, and so you can have these dramatic motions that are in line with what the characters are doing.
**Eliot:** So I'd say he's basically breaking new ground, because he's doing a fully integrated live action, virtual background production with animated characters interacting in a complex manner. And he's doing it with, like, five people on set; that's it.
Including the actor, right? So it's worth it to understand just how far the envelope is being pushed, very, very quickly. I'll share the screen so you get a chance [00:17:00] to see what the operator is seeing on set: he can see the character in the background, he can swing back and forth between one character and the other, and he can look around and see...
**Mark:** Oh my God, that's great.
**Eliot:** Yeah, yeah. They're shooting on our small insert stage. And so, yeah, I was there. Oh, so you're playing around on that wonderful little stage of yours. Yeah, it was tiny, but he was able to work with it. And this is another key thing, and Paul, hopefully this is useful to you as well, and then we can dive right into the intricacies of calibrating on a full frame in a second. The other key piece he's doing is pre-editing. He built the CG background of the set; and again, Alden is a skilled Blender [00:18:00] artist, so he could just build this.
And I think many more people will be in Blender, and many more people will be able to do this with a Gaussian splat of an area, because those are much less technically complex to construct. But he basically shot the entire environment standing in his living room, I like the cat running around in the background, edited it multiple times, and went through the sequences, et cetera.
So when he actually went on the stage, he had the whole flow of things locked in his mind. He knew exactly what he needed for each shot and what he was going to be using. And, you know, John Ford shot like this. John Ford edited the movie in the camera, so that when he was done there was really no other way to put it together, right?
But now people can do this and train their minds to understand what the shot flow is going to look like in the edit, and they can still go back and tweak back and forth. So when they went into production, they shot the whole thing in half a day, start to finish. I mean, it's not a 30-minute show; it's like a four-minute [00:19:00] sequence. But that's how fast he could go, and he had the entire edit turned around, with the initial comps, the day after. So the speed at which you can operate is very, very high. It's exciting for us to show what's possible with that.
Okay, fabulous. Mark, I'd better jump in and dive into Paul's calibration stuff. So... absolutely, thank you. All right, Paul, where are we at? What's the current state?
**Paul:** So, yeah, we're having some real challenges with setting it up and just getting everything to feel like it's seeing the same thing.
We haven't gone through the full workflow of matching up the shots out of Jetset, whether it's the comp shot or just the clean footage, with the shot out of my cine camera, but I have material; I just did a test before we jumped online. So we'll bring it into [00:20:00] Premiere and line them up and see how they compare.
But from everything I've seen so far: for example, I'll look at the little yellow reticle in Jetset that's supposedly at 50 mil, or 35 or 20 or 75, anything I've done, and I'll compare what I'm seeing in the reticle with what I'm seeing on the Accsoon, what my cine camera is seeing, and they're not the same thing.
They're either different focal lengths, or we're seeing more information around it, or less. That's, I guess, the least worst case scenario: they look a little bit different. The worst case at the moment is what I was sending you in the email earlier: when I'm calibrating, even the red reticle, before Autoshot pushes it and does its calibration, seems off. It's at a slightly [00:21:00] funny angle. It's supposedly a 35 millimeter lens it's calibrating to, but my red reticle is larger than my 24 mil or 26 mil iPhone camera. So it's already off even before it turns yellow. And then when it turns yellow, I don't know if you saw, but it became, you know, tiny, as if it was like a 150 mil.
**Eliot:** Whoa, that's kind of strange. Tell you what we can do, actually: do you have your camera rigged up? I can just walk through a calibration with you. Let's just make sure we're getting the data right from the get-go; let's see if we're getting the upfront data correct.
So, we have a nifty thing we can do, which is a remote assist. Do you have Jetset running on your device?
**Paul:** I'll just get it. Yeah, I'll just mount it onto the camera, and then it should be good to go.
**Eliot:** Because then what we can do is... all [00:22:00] right, I like the green screen background. There we go, this is perfect. No, this is the default status quo around the world: making sure Jetset Cine is working, like, in your office with a green screen, debugging everything. Yeah. That's it. Before you go near a set.
**Paul:** Cause our thing at the minute is that it doesn't matter about the creative, the quality of the green screen, whether it's well lit: if we can't get Jetset Cine to work, you just can't get it to work. Do you know what I mean? So we just need to get out of the weeds, almost, and into actually being able to do a full pipeline.
And also, we're not doing it for live stuff. We're literally interested in takes, pretty much just the camera tracking data, just for Unreal. That's kind of our thing. But I'll go [00:23:00] back to the camera, I guess.
**Eliot:** All right. So let's see, let me look at what cameras we've got; let's take a quick look at how we're hooked up here. So there's the cine camera, there's the device. All right, so we've got an Accsoon, and let's see, where's the cable? So there's the Accsoon going into Jetset Cine. There we go, that looks promising. And then we have an HDMI going from... what kind of camera is that?
**Paul:** It's an FX3.
**Eliot:** FX3, okay, great. So you have a micro HDMI going up into the Accsoon?
**Paul:** Uh, yeah, HDMI going into HDMI.
**Eliot:** Okay, sounds great. All right, so let's do initial checks. When you turn on the FX3 with the SeeMo attached, it should pop up something on the phone that says the Accsoon SeeMo is trying to connect, and then it'll automatically push you into the, [00:24:00] what's it called, the Accsoon SEE application.
**Paul:** Yeah, I don't know if you can see this, but it's all hooked up.
**Eliot:** Hey, there we go. Okay. So then, are we seeing a visual feed from the FX3 that looks correct in the Accsoon SEE application? That way we can make sure we're getting a good visual feed.
**Paul:** So that's, I guess, one other thing we're experiencing: I often have to switch between the Accsoon app and Jetset, and when I do that, one or both crash. And that's what it's just done. At the moment the Accsoon app is displaying all black, so I'm gonna reboot it.
**Eliot:** Yeah, you're only going to want to run one or the other at a given time, because they'll both attempt to access the same hardware. So we'll just use the Accsoon SEE app first, to verify that we're getting a correct signal from the FX3, that it looks like what you'd expect it to look like. And then we can exit out of the Accsoon SEE app and move to Jetset.
**Paul:** I actually wonder if I might have [00:25:00] just lost battery on my FX3, just this very second. So just one second while I... No worries. Yeah, we're up. So yeah, we have footage.
**Eliot:** Okay, so it's seeing the green screen.
Okay, fantastic. And let's just double-check that the HDMI output is showing the same field of view as the FX3 itself; like, if you pull up the monitor view on the FX3's LCD, it's not doing something strange.
**Paul:** So I'll have to unplug the HDMI, I guess, to do that, right?
**Eliot:** Oh, I see. Okay, I think that should be working; we've worked with FX3s before, and we see footage there. So, okay, go ahead and exit the Accsoon SEE app. Then, before you start up Jetset, I'm going to put up a QR code I'm going to share, and you can point your iOS camera app at the QR code.[00:26:00]
It'll pop up a little yellow button, and you can tap that button. What that will do is open up... basically, we have a remote assist screen share system built into Jetset. That way I can see what you guys are seeing very clearly. So let me pull that up.
There we go. And I'm going to share my screen so you guys can see what I'm seeing. All right, there we go. Okay, great. So now I can see your screen. For calibration we can just use the default grid, so you can go ahead and click on that; that should be fine.
There's a top link that just says use the default grid, so that's fine. There we go. Initialize the AR session; just tap somewhere on the ground. That's fine. There we go. Click OK. Okay, so there's our grid. Next, let's go into a calibration session.
So we're going [00:27:00] to go to the main menu in the lower left-hand corner, and we'll go into Recording and Cine Calibration. There we go. And we're going to click New.
There we go. All right, so we've got Ready from Cine One; we're seeing data from the Accsoon, so we can click Start. There we go. And it should show a view. Great. So what this is showing is all the natural features we're detecting in the scene. The two things I'd say are: one, we probably want to brighten up the cine camera just a little bit, and two, we want to point it at an object that has a lot of features on it. The green screen doesn't really have any features on it, so what we usually do is just swing the camera around and point it toward... that's perfect, actually. You have just found the perfect set of objects.
As you can see, it just lit up with a series of natural features. What we mostly want to do is find stuff that's going to be at a similar distance to what you're going to be shooting at; [00:28:00] then the reticle will be a bit more accurate. It doesn't need to be perfect, but that's about what we're looking for.
And something where we can walk around in at least a 90-degree pattern, because we're basically doing a tiny bit of photogrammetry. So you're going to want to move the camera from side to side, translate it, by about a foot each step, over a span of like nine or ten steps.
This seems fine; I think that should work. We've got some close foreground objects and also plenty of background objects. Okay, so we can just click a test frame. There we go, we found about 87 matches. All right, so we can keep that frame.
Now translate the camera: just move to your right slightly, about a foot, keeping roughly the same subjects in place. There we go. And keep the frame. And translate, move over again. You just kind of work your way around the circle. There we go. That's plenty of matches.
And move again. [00:29:00] Test frame. There we go. Keep the frame. And again, test the frame. Keep the frame. There we go.
There we go. Keep the frame. It's testing against both the current cine shot, which is the one below it, and also the original cine shot, which is Cine One. So we can see we have matches both between our current shot and our original shot. That's good. And it's cranking along; do maybe 10 or 12 captures.
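As an aside, the "matches" counter corresponds to feature correspondences between frames. Jetset and Autoshot use their own photogrammetry pipeline, but the same idea can be sketched with OpenCV; the file names here are hypothetical.

```python
import cv2

# Two captured frames from the walk-around (hypothetical file names).
img_a = cv2.imread("cine_frame_01.png", cv2.IMREAD_GRAYSCALE)
img_b = cv2.imread("cine_frame_02.png", cv2.IMREAD_GRAYSCALE)

# Detect natural features in each frame and match them between frames.
orb = cv2.ORB_create(nfeatures=2000)
kp_a, des_a = orb.detectAndCompute(img_a, None)
kp_b, des_b = orb.detectAndCompute(img_b, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = matcher.match(des_a, des_b)
print(f"{len(matches)} matches")  # a bare green screen yields almost none
```

This is also why the camera gets swung away from the green screen toward cluttered objects first: featureless surfaces produce nothing to match.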
There we go.
**Paul:** All right.
**Eliot:** All right. So now we can go ahead and click Save.
And what millimeter focal length are you shooting at? About 35. 35, okay, that sounds good. So name it 35 millimeter, and maybe just put the date on it as well, so we have a [00:30:00] good idea of when it was. All right, let me refresh that so I see what you're doing.
And once it's saved... it doesn't let me see the save screen, so you can go ahead and click Exit, and just let me know when you've exited. It's going to take a little while for it to...
**Paul:** Yeah, so I've exited out. I'm just picking the origin point again.
**Eliot:** Okay.
**Paul:** And I guess already there's, like, a first symptom: I feel like it shouldn't be looking like that, because if the iPhone's native camera is 26 mil, I should be seeing the whole of that 35 mil red reticle, right?
**Eliot:** Well, let's see. When you added the QR code, it added a new screen share button, up above the scene locator button. Could [00:31:00] you tap that again? Right now I'm not getting a feed. There we go, let me refresh my side. There we go. Okay.
Okay, so I bet what we're seeing is... I see one edge of the red reticle. What you may have, and sometimes this is the cause, is that your cine camera is pretty tightly aligned with your iPhone camera. Let me look at your Zoom again.
All right, well, can you go to Autoshot? Let's see if it calibrates correctly and if we get a reasonable focal length, the one we'd expect. If you can, share your Autoshot screen. I'm going to turn off this screen share for now.
And let me look up the sensor width [00:32:00] for the FX3. Okay, so Autoshot is talking to the phone, and we're going to enter the sensor width. Let's double-check what Sony has as their specification.
**Paul:** I believe it's 35.6. But when I was looking at one of the tutorials, the one talking about the Blackmagic Pocket Cinema 6K, it seemed to put the sensor width different from what I'd found online; it was putting it at S35, the Super 35.
**Eliot:** Yeah, the Blackmagic that I have has an S35 sensor; there are others that are full frame. And in fact, we're going to be adding, very shortly, a dropdown menu of all the different [00:33:00] camera manufacturers, along with their models and sensor formats, so you can just click instead of having to look this stuff up. For exactly this reason: it's very easy to get it wrong. So we're adding that with rapidity. Let me look for... okay, 35.6. All right, that looks about right. So there you go, 35.6. And if you click Calibrate, let's see where we're at.
It should give us... it's going to crunch away. Okay, you guys are on a Mac. Sounds good.
Every once in a while we run into things where something calibrates okay on Windows and not on Mac, due to some subtle difference inside the COLMAP binaries. I hope it's not that, but we'll find out one way or another. We can figure it out.
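A note on why that sensor-width field matters so much: the calibration recovers an angular field of view, and the millimeter focal length it reports is just that angle scaled by the sensor width entered. A sketch with a hypothetical solved FOV:

```python
import math

# Focal length in mm is the solved horizontal FOV scaled by the sensor
# width you typed in, so a wrong width mis-scales the result proportionally.
def focal_mm(h_fov_deg, sensor_width_mm):
    return sensor_width_mm / (2.0 * math.tan(math.radians(h_fov_deg) / 2.0))

fov = 54.0                   # hypothetical solved horizontal FOV, degrees
print(focal_mm(fov, 35.6))   # full-frame FX3 width: ~34.9 mm
print(focal_mm(fov, 23.1))   # if a rough S35 width were entered: ~22.7 mm
```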
**Paul:** It'd almost be [00:34:00] good if it was that, because it might make sense of why we've had so many issues with stuff.
I can't remember if it was the email I just sent... no, it's the one from overnight, actually. I think I was trying to line up a 50 mil lens, and the red reticle was taking up this sort of size of the frame. But as soon as Autoshot pushed it to Jetset, it went about this big, and it went to a funky angle; it put a Dutch tilt of about 30 degrees on it, and it was a little tiny reticle within the frame. Interesting. Interesting.
**Eliot:** That's curious. Well, the good news is we can trace it down, because the way it's designed, we capture these images, and if you roll a take, it stores them. It's something we can open up and look at. Oh, and the output of that was clean, [00:35:00] right? You didn't have any information overlays or anything on the HDMI? I should have looked at that closely; I forgot to look at that.
**Paul:** Um, I'm not sure what you mean there. I did it in various ways; I did it with and without a comp. I tried different things, different types of modes.
**Eliot:** What I'm speaking of is: when you have the HDMI output from the FX3 going into, first, the Accsoon SEE app and later Jetset, you want to have turned off the image overlays that DSLRs can sometimes put onto that; they can have all sorts of stuff on the screen. You generally want a clean feed, because the information bars on the HDMI feed can actually break the solve. But it looks like it solved.
Okay, 47.4. Oh, that's interesting. And you have a 35 millimeter lens. That's weird. Okay, so what's going on here? [00:36:00] That's officially weird. It's an FX3. All right, the other thing is: are you in full frame mode when recording? Because I think the FX3 has both a full frame mode and a windowed S35 mode.
**Paul:** I'm also just going to point out, I don't know if you're going to see this, but I just opened the calibration webpage directly from Autoshot, and on that clapper board, with the time-of-day timecode, it's got the lens written in the top right corner. It says 47.4 millimeters.
**Eliot:** Right now what I see is your Autoshot screen share with the calibration, and I see the sensor width entered as 35.6, and it calculated a focal length of 47.42. So something's clearly off, in a fairly substantial way, and I want to understand exactly what's going [00:37:00] on.
And you're shooting in full frame mode, that is correct? Let's see.
Calculate equivalent focal lengths.
**Paul:** Let's take a look.
**Eliot:** I'm trying to think of how to verify behaviors on the FX3. So, two things. I'm going to want to get the take from this, [00:38:00] because I want to look at the lens calibration. So it would be very useful to roll a cine take, or maybe you've already done that, and send one to us so I can look at it and understand what's going on. Because something is funny.
We could have a numerical error; let me look at your bundle adjustment. But the initial cost and the final cost are within, like, four tenths of a pixel. So it looks like it's solving.
It looks like it's solving. So then what's left is the sensor width. All right, let's take a look at the... I'm sorry, not the F35, the Sony FX3. Okay. Oh, actually, I'm going to look at our handy-dandy [00:39:00] online sensor database, which we're going to be making full use of in a short period of time for exactly this. Let's take a look here.
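For reference, the "cost" in a bundle-adjustment log like that one is reprojection error: the pixel distance between where each reconstructed 3D point projects and where it was actually observed. A final RMS well under a pixel, like the four tenths mentioned above, generally indicates a healthy solve. A minimal sketch of the quantity being minimized:

```python
import numpy as np

# Bundle adjustment minimizes reprojection error: how far, in pixels, each
# triangulated 3D point lands from where it was actually observed.
def rms_reprojection_error(projected_px, observed_px):
    """projected_px, observed_px: (N, 2) arrays of image coordinates."""
    d = np.asarray(projected_px) - np.asarray(observed_px)
    return float(np.sqrt(np.mean(np.sum(d * d, axis=1))))
```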
**Kumar:** Hey, Paul, I'm sorry, can you quickly just show me the GUI view of the calibration? I'm just curious to see that.
**Paul:** Yeah. I don't know if it will share.
**Eliot:** Oh, let me see. You may need to share a different window, or maybe share the overall screen. That's a good suggestion, Kumar. Let's see.
All right, so you can zoom in.
**Paul:** That looks pretty good.
**Eliot:** And you can scroll in with the mouse scroll wheel, so we can actually come in and see. That looks like, for the most part, it solved. It looks like one of them is kind of out of whack. And does the point cloud look right? As you zoom back out, you'll see the point [00:40:00] cloud of the solid features in the room.
**Kumar:** Just get close to one of the points of view. I don't see two lenses, or they're so close to each other.
**Eliot:** Oh, wait, that's an excellent point. There's almost no offset.
**Kumar:** There must be two of them. There are, I mean, they're so close. You can see, on the one which is backed out, you can see two of them.
**Paul:** Yeah, I'm on a trackpad. I'm just having a little trouble scrolling, hold on.
**Eliot:** Because each one of them should have, like, two separate frustums, one of which is the iPhone, which will be higher than the [00:41:00] cine camera.
**Kumar:** Yeah, I mean, at no point should they align, for sure. Okay, that's weird. Maybe you could just put your phone a little further off the axis... okay, that's fine, it's the same camera. Maybe just push the iPhone away from the lens and then do one.
**Eliot:** Oh, you mean to see if it shows up? I mean, we should see it; that's a multi-centimeter offset he has in there.
And what's strange is that it's very unusual for the field of view to be basically identical between the iPhone and the cine camera. So there are a couple of things that are puzzling about this. I almost want to see the calibration that we just went through; I want to see the data set, to see if we're running into something unexpected numerically or whether it's one of these weird Mac things.
Um,
**Paul:** so let's see. Um, I will say compared to what I've seen on, on a lot of setups, the, I'm putting my camera, my, [00:42:00] my iPhone camera as close to the lens as possible. Whereas I've seen. Often the camera, the iPhone be put on top of up here and it's kind of the whole phone is centered to the lens, which means the iPhone's camera is quite far to the right.
So well,
**Eliot:** yeah, yeah, this this should I mean, that's actually a good thing. So what I what I'm curious about normally Um, we'd see, we, and we see it in one case, can you zoom in closer on, and maybe what we need to do is just send, send us the, uh, the, the data set so we can, we can run our analysis. So over here to understand what's, what's going on.
Um, it's very good to know this. Can, is it possible to zoom in closely on, on one of those, on a couple of those cameras, uh, the, the, the ring of cameras. I know you're on a laptop, so. At a certain point, what we'll just do is send the file and we'll look at
**Paul:** it. It's only letting me zoom in on the um On that sort of uh, okay.
It's [00:43:00] axis. It's not letting me zoom anywhere else in the in the image at the moment. Uh
**Kumar:** Maybe reorient yourself: put the better ones in the front and then take this around. Yeah, that way.
Yeah, I think there's something missing, totally.
**Eliot:** Yeah, something's strange. So it's actually very good to find these things, to see if we're running into a numerical issue.
**Kumar:** You can't tell very clearly. I mean, they're so tightly overlapped, which is very unlikely, because there is a physical offset.
**Paul:** I should say again, I'm on an iPhone 12 Pro, by the way. I don't know if it's because it's the oldest one with LiDAR that it's causing an issue, but I don't think so.
**Eliot:** That's... I mean, I test on an [00:44:00] iPhone 12 Pro Max, so it's very, very similar.
**Kumar:** I mean, you already [00:44:00] have the different positions covered, and the 3D data is in, so that should be fine.
**Eliot:** Okay. And the FX3 is very similar to the A7, I think. So yeah, that's 35.6 millimeters; that's all correct. Let's see if there's a...
**Kumar:** Oh, sorry, can you just zoom in further on the one which is in back? I think there are so many frames within that.
**Eliot:** Oh, that's interesting.
**Kumar:** Are they all in there? Looks like no. Yeah.
**Eliot:** Okay, this is starting to look like a numerical thing. Well, that would explain where those guys are. Yeah, that's probably what's going on. So I want the data set, because that way we can rerun it numerically over here. So, can you roll a quick cine take? I'm thinking about exactly what we need to track this down.[00:45:00]
Actually... oh, a cine take, right? Earlier...
**Paul:** I did one earlier with a different lens calibration. Yeah, I can do a fresh one with the lens calibration we just did, so you can see the whole workflow.
**Eliot:** Yeah, this would be very, very valuable, because the calibration looked correct. So I want to see what's breaking, and where, and then we can debug it and understand what's going on numerically.
**Paul:** And just so you know, it obviously did a successful push of that calibration to Jetset, but when I look at it here, I can't see any yellow reticle, which makes me think it's repositioned somewhere I can't see.
Yeah.
**Eliot:** So if it's stacking all those cameras in the same spot, then something in the solve is breaking, and the offset is probably quite far off. [00:46:00] You know, it's a numerical solve, and sometimes they break. But that looked fine: you were getting plenty of features, and they were correlating. I very much want to get the original data set, and then we'll run it through our own systems and understand what broke. Oh, go for it, Kumar.
**Kumar:** One quick check, just for this specific case: I think you had it handheld, under the room lights, for this one. Is that right?
**Paul:** Yes, just for that one. The ones I did earlier were during daylight; it was a very bright, sunny day here, and we did it with lots of light. But yeah, that was handheld.
**Kumar:** Okay. I would say just don't do that, because the light variation might disturb, or cheat, the alignment; light variation in the shadow zones affects it.
**Eliot:** Yeah, that's a good point. But given that I saw they were getting lots of matches between frames, [00:47:00] I'm very curious, because something is clearly awry and I want to understand what it is. I think what's going on is that the solve broke, so we're getting an offset with strange data in it, which is why the reticle is off somewhere.
Like, if the reticle is behind the camera, it's not going to show up. So if it's positioned in 3D space in a place that doesn't make sense, then it's going to be way off.
**Paul:** And this is just another example; it's what I sent you. So here's the red reticle on a 35 mil lens calibration; it's barely visible. And then when I push it from Autoshot and it turns yellow, this is what I get. So that's the...
**Eliot:** Yeah, okay, I can tell you what's going on: the solve broke numerically. Because it has to compute [00:48:00] the field of view, the entrance pupil, and also the offset.
And so what happened is the offset is, like, three meters ahead of you, so the implied field of view is way down there. Although... actually, hang on, that doesn't make sense, because the reticle is a pure 2D pixel map. So there are layers of things here that are not making sense.
**Paul:** And there's the fact that it goes to this funny angle. All of my yellow reticles are at some kind of funny angle, even though I'm quite meticulous about the positioning of my lens and any kind of Dutch tilt on my iPhone. I would say they're all dead straight.
**Eliot:** Do you have image stabilization enabled on your Sony?
**Paul:** Maybe.
**Eliot:** Yeah, that'll kill us. Okay. All right, that's very helpful. Let's turn that one guy off and do the calibration again, and I bet that will... at [00:49:00] least it won't hurt. It may not solve everything, but that's one step of the things we want to work on.
**Paul:** This may take me a minute to find.
**Eliot:** Because the two things I'd like to verify are that image stabilization is off, and also that... I think the FX3 can shoot in either full frame or a cropped S35 mode, and I just want to verify that we're not accidentally in the cropped S35 mode. I've never used one of these things; this is just paging back through some memory.
Cause I know it will work; that's what the barn piece was shot on, and it worked fantastically. That was great.
**Paul:** That barn one? Is that the one with that dude, with the little walkway?
**Eliot:** Oh, yeah. Yeah, yeah. That's an FX3, shooting raw. I mean, it's [00:50:00] beautiful. Just beautiful. Very compelling.
That was... we wanted to do that project with Alex because I'd seen his work. And when we cooked it up, I said, okay, we want to do something that people think you can't do on green screen, like atmospherics, something right out of the seventies.
And he did that in spades. But yeah, it's an FX3 shooting ProRes RAW, and Jetset Cine worked great. So I know we can do it; I just want to make sure we get the settings correct.
**Paul:** Yeah. Cause for us as well, you know, we've got a lot of stuff that we need to shoot, and we'd love to even just get to that stage, just do a full run. But at the minute we still feel like we're at stage one. Do you know what I mean?
**Eliot:** Oh, yeah, yeah. Let's get you guys through it so you're getting calibrations and shots. Let's get that part nailed.
**Paul:** I've found SteadyShot, and it's set to Standard. Yeah, there's Active, Standard, or off. Yeah, I turned it off. It says it's invalid with the lens, so I don't think it's being used as it is. Okay, [00:51:00] I've done it. Okay.
Yeah, it's just... what does that say?
**Eliot:** Actually, do you want to point your camera back at the menu? We can both take a quick look at what's going on. And let me pin you guys to Spotlight. Let me stop my share... actually, no, you're sharing. Okay, hang on, let me pin you. There we go. All right. Okay, fantastic.
Okay, so hopefully we can turn off SteadyShot. [00:52:00] Now, what does it say for SteadyShot Adjust?
**Paul:** This is slightly tricky to do, because I'm... you're sort of...
**Kumar:** One of you needs to come down the other way, I think.
**Paul:** Yeah, hold on. Let me see if I can operate this with the wheel while I'm looking.
**Eliot:** Oh, I see, you're adjusting on the menu but displaying on the LCD. Okay, now I've got you. Yeah, so image stabilization. All right, so SteadyShot. Let's click on SteadyShot and see if we can turn it off. Or how does that work?
**Paul:** Oh, I've gone the wrong way.
**Eliot:** Okay. Yeah, it's basically either on or off.
**Paul:** If I try that, then it comes up with [00:53:00] "invalid with this lens."
**Eliot:** Oh, I see. So with this lens, there may be a manual SteadyShot switch, a physical switch on the camera body or on the lens, and it's demanding that you turn it off there.
**Paul:** Yeah, I'm just going to try and...
No, I don't believe so.
**Eliot:** Let's take a look at the... let me look at the FX3.
**Paul:** Yeah, there's a little switch there; I don't know whether that's an IBIS switch. Yeah, that's part of the, um...[00:54:00]
There was an IBIS switch on the Metabones. I can't remember what IBIS stands for, but it says image stabilization. So that's on the Metabones. Maybe that's it. But that's not the lens. Oh, I see. So I'd need to... yeah, I don't know if that would be it or not.
**Eliot:** the web manual on this.
Each one of these cameras has their own little
Little, uh, set of things. All right, what are we looking for? Okay, this looks like SteadyShot is off. Um, okay. And then let us, um, Can we go back on the shooting format? Is there something on [00:55:00] the, uh, in the menu that would let us look at the different formats it's shooting in? Whether it's, uh, Meanwhile, I'm
**Paul:** Meanwhile... I can see it now. I can see my hand. I've got the origin point. Okay, so I'll just go ahead and do the calibration.
**Eliot:** Yeah, that sounds good. Just lock the phone in position before the calibration so it doesn't move.
**Paul:** Yeah, I'm just gonna get it a bit closer to the front of the lens.
**Eliot:** Sure. Okay.[00:56:00] [00:57:00]
All right, and then you can save it, give it another name, and screen share your Autoshot when you run the calibration so we can see it.[00:58:00]
**Paul:** Do you want me to screen share the Autoshot?
**Eliot:** Yeah, that'd be great. That way we can see what Autoshot is seeing, see what you're seeing.
**Paul:** In the right project...
**Eliot:** Yeah, let's refresh and get the updated calibration. There we go. And let's re-enter your sensor width, whatever it was before. Good, let's calibrate and see where we're at.[00:59:00]
**Paul:** And again, I guess I would say, as a starting point, the red reticle is only just visible, which, again, if we're saying the native Jetset lens is about 26 mil, you'd think that 35 would be wholly visible within that.
**Eliot:** On a phone, it depends on the phone model, but it's usually like a 19 or a 20, something in that neighborhood. So it's usually wider.
**Paul:** The [01:00:00] thing is, when I go into the preset lenses, the 18 mil in Jetset, and I flick between that and my iPhone's camera, the iPhone's camera is way wider than the 18 mil preset in Jetset.
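The arithmetic behind these equivalent-focal-length comparisons, for what it's worth: scale the physical focal length by the ratio of full-frame width (36 mm) to the sensor's width. The sensor numbers below are illustrative, not exact iPhone specs.

```python
# 35mm-equivalent focal length: scale by full-frame width / sensor width.
def equiv_35mm(focal_mm, sensor_width_mm):
    return focal_mm * 36.0 / sensor_width_mm

# Hypothetical phone camera: a ~6.9 mm lens on a ~9.8 mm-wide sensor
# lands around a 25 mm full-frame equivalent.
print(equiv_35mm(6.9, 9.8))
```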
**Eliot:** That's interesting. Okay, let's take a look at this.
**Paul:** Okay, so that's the lens it just pushed. So that's the yellow reticle.
**Eliot:** Let me pin your view and spotlight it for everybody. Okay, actually, can you hold that up again so I can see it? It was very small.
Okay, this is strange. Okay, so let's do a couple of things. Let's go back to your [01:01:00] screen share of... actually, I've already got it. Let me pin that screen share.
**Kumar:** Sorry.
**Eliot:** There we go.
**Kumar:** Can we just have a quick check again of the calibration?
**Eliot:** What was that again, Kumar?
**Kumar:** A check of the calibration.
**Eliot:** Oh, yes, yes, that's it. Let me remove the pin, and let's look at the screen share of COLMAP, or of Autoshot. There's something going on; something in the numerical solve is breaking.
**Paul:** Yeah. Yeah.
**Eliot:** It's messing up. So, back in Autoshot, there's a little information button next to the calibration.
If you can share the Autoshot screen... the good news is that it's breaking numerically, so we can find it. All right, there's a little "i" button next to the [01:02:00] calibration. Can you click on that? Let's take a look. It should give us... did it pop up a window?
**Paul:** Uh, find the window? Yeah. Hold on, I'll just share that one.
**Eliot:** Let's just take a look at some of the images, because what that does is show the captured images, so we can look at them and see... okay, there we go. So we can look at the captured cine images. Oh, there it is. There's your problem. That's it.
Okay. So what's going on is... oh, I should have done this in the first place. Sorry, that was my mistake; I should have checked this the very first thing. All right, good lesson for me. So let me grab the annotation tool. Can you show that again? So, the calibration assumes that the full width of the image is covering the sensor width.
And right now the [01:03:00] view of the cine camera is actually cut in, because there are all these annotations that are...
**Kumar:** All this stuff. So it's going to keep them as common features and keep aligning to those all the time.
**Eliot:** Yeah, so that's what's breaking the solve. There are two things going on. One is that, to put all that information on the screen, the Sony is windowing the actual sensor image down to a small subset of the HDMI frame. Whereas the Jetset calibration currently assumes that the visible image on the HDMI feed spans the full width of the actual sensor frame; that is, it assumes you have a clean feed over HDMI. So there's our problem, or at least one of the problems. [01:04:00] This is great; it will remind me to check this very first thing in the future. Now let's take a look at the iPhone images and make sure those make sense.
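To make the failure mode concrete, here is a minimal sketch of the pinhole conversion the calibration performs. The ~35.6mm full-frame sensor width is an assumed figure for the FX3, and the pixel focal length is made up; only the proportional reasoning matters:

```python
# How a solver's pixel focal length is converted to millimeters: it assumes
# the visible picture spans the full sensor width.

def focal_mm(focal_px: float, image_width_px: float, sensor_width_mm: float) -> float:
    """Pinhole relation: f_mm / sensor_width_mm == f_px / image_width_px."""
    return focal_px * sensor_width_mm / image_width_px

SENSOR_WIDTH_MM = 35.6   # assumed full-frame width for the FX3

# Clean feed: the picture spans the full 1920-px HDMI frame.
f_px_clean = 2000.0      # illustrative pixel focal length from the solver
print(focal_mm(f_px_clean, 1920, SENSOR_WIDTH_MM))    # ~37.1 mm, sane

# Overlay feed: the same picture is windowed down to ~1600 px inside the
# 1920-px frame, so the solver measures a proportionally smaller pixel focal
# length, but the conversion still assumes the picture spans the sensor.
f_px_windowed = f_px_clean * 1600 / 1920
print(focal_mm(f_px_windowed, 1920, SENSOR_WIDTH_MM))  # ~30.9 mm, skewed
```

And that scale error is on top of Kumar's point: the static overlay graphics themselves give the feature matcher bogus correspondences that are identical in every frame.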
So let's go to iPhone and click on that. Okay, and that looks like a normal-ish iPhone calibration. All right, so now we know what we're aiming for. Let's go back to the FX3; there should be something like an HDMI overlay switch.
All right, now we're cooking. And it was the data overlay. [01:05:00]
This'll be great. Hopefully this solves it, but I bet that's it. I bet that's the smoking gun.
**Kumar:** So by the nature of the whole camera alignment in the GUI, this is what it was boiling down to.
**Paul:** Yeah. Just looking at where that would be in the menu; it's not anywhere obvious just yet.
**Eliot:** Yeah, they've probably buried it three layers down. There are some hidden settings.
**Kumar:** Luckily, I hadn't experienced this behavior with the Blackmagic series, nor with ARRI. This is something new to be aware of.
**Eliot:** What I'll do is, when I post this office hours, I'll edit out all the parts where we're running in circles, [01:06:00] because it doesn't help anyone to sit through that. We'll jump straight to this, and then it's a very clear cause-and-effect sort of thing. But this is really good to see, because it's so easy to get caught by it.
And that should have been the very first thing I had you click. It would have saved us an hour of me wandering in the woods, but that's all right; that's why it's there. It's good to be reminded of the information button.
It'll be something like "overlay." I'm looking in the online manual, the FX3 manual.
**Paul:** Yeah, I'm in the display settings, but nothing's jumping out.[01:07:00]
**Kumar:** There are options like HDMI out; it should be defined there.
**Eliot:** Yeah, I'm looking for it too. They may have hidden it.
**Kumar:** The simple route, where a button on the camera hides the on-screen info, sometimes won't do the job. Sometimes it's a separate switch for the HDMI output.
**Eliot:** It's wildly camera-dependent. On some of the old DSLRs you can't turn it off, which is deadly. But on the FX3, I know we've done it, so it's a matter of menu surfing until we find where it's lurking.
**Paul:** Oddly, I've got that button on here, but it adds more stuff rather than taking stuff away.
**Paul:** There's something like a clean feed option.
**Eliot:** A clean feed option, yes. [01:08:00] And if we run out of time here, what we can do is set up another call; it may take you guys a while to find the clean feed option, and then we can go through this again and nail it. But there's got to be a clean feed option lurking somewhere in there.
It just may require some spelunking through the menus. Yeah.
**Paul:** We're finding a few different things on Google, but none of it quite applies.
**Kumar:** I'd suggest YouTube; it's the most friendly.
**Eliot:** Ah, yeah, that's a good idea.
**Kumar:** It's easy to get lost in the text help pages, but videos make sense.
**Eliot:** External Output... HDMI Output... Info Display... so[01:09:00]
**Paul:** Okay.
**Eliot:** Okay, so I think it's under Menu, Setup, External Output, HDMI Output Settings.
**Kumar:** Is there no search by words within the menu?
**Eliot:** No, they don't do search functions on these DSLRs.
**Paul:** Oh yeah, I think you're right. Miles had it as well, but I couldn't find it before. So it's in HDMI Output Settings, under HDMI Info Display, I guess.
**Kumar:** The only thing about this info display, I think, is something...[01:10:00]
**Paul:** So what does that mean in terms of... ah, okay, so I will still have the ability to... oh, that's good. Yeah, so I can still see the picture on the HDMI monitor and operate the camera from the back of it, with the settings I'll need, in terms of f-stop and so on.
**Eliot:** Yeah, there we go. That's a clean feed on the monitor.
**Paul:** It is, yeah. Clean feed. I don't know if you can see it because it's dark, but yeah, clean feed.
**Eliot:** All right. If that's looking like it should, then we can take another run at this.
**Paul:** Do you want us to stay on the line, or do you want us just to report back afterwards? It's up to you; I know we've taken up a lot of time.
**Eliot:** Oh, no, this is why I do office [01:11:00] hours, because there's a whole category of problems that is utterly inscrutable if you try to email stuff back and forth. Whereas if you just fire it up and everybody looks at it, you get these kinds of breakthroughs: oh, it's that button, click it. That's super valuable, because everybody runs into this stuff.
And again, I'll edit this down when I post our office hours, but it's such a valuable thing to find, because it's so easy to get caught by this. Trust me, you're not the first, and I forgot to check the info on the take.
That should have been my very first pit stop on this.
**Paul:** Okay. I'm going to do 35mm again; new calibration.
**Kumar:** Something along a similar line: I still [01:12:00] struggle with Tentacle Sync on the Sony cams. I've just kept it on the to-do list; it's something I tried with two of them, and we'll do it at a later point.
**Eliot:** Yeah. Timecode is a thing. It's a real thing to get right, and that's why Tentacle has like 40 different cables to go to all these different cameras, to get the signal into exactly the right port at exactly the right voltage. Bill asked me once, should we just make our own timecode box?
But no, no, no. It's this whole world of analog voltage problems, and Tentacle's solved it for 200 bucks. Perfect. There are just these things that you think are going to just work, and then you actually get into it and it's a company-sized problem. [01:13:00]
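The payoff of jammed timecode is that aligning a tracking take to a cine clip becomes plain frame arithmetic. A hedged sketch; the timecode values and fps are illustrative, and this assumes non-drop-frame timecode:

```python
# Why shared timecode matters: with both devices jammed to the same clock,
# lining up a Jetset tracking take against a cine clip reduces to comparing
# their start timecodes.

def tc_to_frames(tc: str, fps: int) -> int:
    """Convert 'HH:MM:SS:FF' (non-drop-frame) to an absolute frame count."""
    hh, mm, ss, ff = (int(x) for x in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

cine_start  = tc_to_frames("14:03:22:10", fps=24)  # e.g. from the FX3 clip metadata
track_start = tc_to_frames("14:03:20:00", fps=24)  # e.g. from the Jetset take

offset = cine_start - track_start
print(offset)  # 58: tracking samples to skip so the two streams line up
```

Without a shared clock, that offset has to be recovered by eye or with a slate on every single take.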
**Kumar:** On our very first attempt, we started with the Sony cam thinking, okay, we're going to get started, and then on day one: is the sync happening? Initially we had just a 3.5mm-to-3.5mm cable, and then I got a local 3.5mm-to-micro-USB one.
But no. Last week I got one shipped from the US, a Tentacle, so, you know, buy it from the Tentacle
**Eliot:** store.
**Kumar:** But yeah, I haven't had time to get it on the Sony and try it out.
**Eliot:** Yeah. With every production, you sort of have to get all the parts to see each other. The first time they see each other, nothing works; and then after a couple of hours, all the things behave and talk to each other. [01:14:00] All right, so you want to share a screen? There you go. There's Paul punching it in. All right, let's take a look at our info button. Actually, I'll refresh.
There we go.
Let's see, because we should see the Mark 3 calibration. There we go. And let's click the info button on that one, our magical info button that tells us the things we want to know. Oh, I see: you need to calibrate once. Okay, so let's go ahead and punch in the sensor width and we'll calibrate.
There we go.
**Paul:** Punching away. Guys, just a real quickie, and obviously this is going to be edited out: is the Tentacle something we're ideally looking to swerve needing? [01:15:00] We don't really want to spend too much more on hardware, you know what I mean? Is it something we necessarily need for what we're doing? We're not going to be doing live stuff either.
**Eliot:** I'm going to bet you're going to want it, because it's so profoundly useful to have all the tracking data carry the same timecode as the actual cine camera data. And one of the things you're going to run into, especially when you're on a production, is that we have a digital slate, which we just added this week; it flashes QR codes, and that can work too.
All right, there we go. Hey, convergence! Okay, so there's something else here that just happened. Right over here: before, we were seeing some pixel errors, but it said "no convergence." I was a little concerned by that, and I wasn't quite sure what was going on, but now this one is saying convergence.
So that's great. And we solved for the sensor: the solved focal length is 33.56, very close to the [01:16:00] nominal 35 millimeter. So, what does the reticle look like? GUI, please.
**Paul:** The reticle is way closer now. Hey, there we go.
**Eliot:** All right, now the world is making sense.
And the last thing, as Kumar said: let's click on the GUI check of COLMAP. Let's take a look at that. I bet we're going to get some nice solves coming out of there.
**Paul:** I'm just going to close the others; I've still got the old ones open.
The GUI is, I think, looking like you would hope.
**Eliot:** There we go. So now each one of them has a frustum, and you can see that the camera is above it, located correctly. They're very similar; let's see. And from the top we can probably see the relative fields of view of both of them.
There we go. [01:17:00] Now we're cooking.
**Paul:** That thing there, which I'm sort of having to scroll and orbit around and zoom on. It seems right in the middle of the pack. Is that right? Is that where it should be?
**Kumar:** I think that's just taking the center of the scene, kind of.
**Eliot:** Yeah, that's a good question.
What happens when you rotate around? Okay, so it's not the floor origin. I think COLMAP, when it does a solve, just picks an origin point; I'm not quite sure there's anything special about it. But the key is that now we can see each solve, each frustum, correctly calculated, and each one's in the correct spot.
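For reference, the math behind those frustum positions is recoverable from COLMAP's standard text export. This is a sketch against COLMAP's documented `images.txt` format (each image record stores a world-to-camera quaternion and translation); how AutoShot draws its GUI check internally isn't shown on the call, and the file path here is illustrative:

```python
# Camera centers from a COLMAP text export. Each image line in images.txt is:
# IMAGE_ID QW QX QY QZ TX TY TZ CAMERA_ID NAME, followed by a line of 2D points.
# COLMAP stores world-to-camera pose (R, t), so the center is C = -R^T t.
import numpy as np

def quat_to_R(qw, qx, qy, qz):
    """Rotation matrix from a unit quaternion (Hamilton, w-first)."""
    return np.array([
        [1 - 2*(qy*qy + qz*qz), 2*(qx*qy - qz*qw),     2*(qx*qz + qy*qw)],
        [2*(qx*qy + qz*qw),     1 - 2*(qx*qx + qz*qz), 2*(qy*qz - qx*qw)],
        [2*(qx*qz - qy*qw),     2*(qy*qz + qx*qw),     1 - 2*(qx*qx + qy*qy)],
    ])

def camera_centers(images_txt="sparse/0/images.txt"):
    with open(images_txt) as f:
        lines = [l for l in f if not l.startswith("#")]
    centers = []
    for line in lines[::2]:                    # every other line is the 2D points
        vals = line.split()
        qw, qx, qy, qz = map(float, vals[1:5])
        t = np.array(list(map(float, vals[5:8])))
        centers.append(-quat_to_R(qw, qx, qy, qz).T @ t)
    return np.array(centers)
```

The origin of that coordinate frame is whatever the solver settled on, which is why the mid-scene pivot point Paul noticed carries no special meaning until the solve is registered to a floor origin.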
And then let's go back to, sorry, not Jetset, to AutoShot, and click the information button; we should see a full-width camera frame for each one of those takes. [01:18:00] Yeah, the little "i" button there; it'll pop up a view.
We can look at the cine frames. Hey, there we go. So now the full width of the sensor is coming through on the HDMI. There we go. That's what we want to see. All right, cool. Okay, this is why... thank you. This is a fantastic office hours. I love it.
Oh, I love finding something with a clean problem and a clean solution. And it reminds me, because I've had my head down in Gaussian splat land, that I completely forgot to check the calibration info button. So this is a really good reminder to check that.
**Paul:** This is nice. Having that back here and always live, even when I'm looking at Jetset on the phone, means I'm not [01:19:00] having to switch between the two, because that was a real pig.
**Eliot:** Yeah. And over time, what we're going to look at doing is running the live cine feed into Jetset and compositing it there, so you can see exactly what you're looking at. It's just going to take us a little while to engineer all the sub-pieces and stack them up; you've got to get the translations correct and all those things inside the renderer, but we'll get there. Okay, guys. Fantastic.
**Paul:** Thank you so very much. Cheers, guys.
**Eliot:** Absolutely. Tell us how it goes when you actually shoot some clips: whether things line up, whether everything's correct, and all that. And what's the project?
**Paul:** We've got a few little things; basically it's showreel stuff at the moment, where we just want to show proof of concept. I do a lot of lifestyle work, and people always want to show a breadth of a country and lots of different homes. So we want to show how quickly we can cycle through various backgrounds: different kitchens, different living rooms, lounges, [01:20:00] all with the same cast, or a limited cast, on a green screen.
I work in sort of mid-tier budgets, not huge great big ones, so they can never, or don't like to, fork out for VP stages. The framing is that you can do what you thought you could do on a VP stage, but do it on green screen instead. And we've got some other stuff; I think you saw the plane.
We sent you our very first message around the plane wing.
**Eliot:** Oh yes. Yes.
**Paul:** So we're messing around with something a little bit more fantastical; the Mars environment was made in Unreal. We're seeing how much we can push the technology and shoot that kind of shot handheld, rather than doing it locked off or static.
For us it's the camera tracking, and the fact that you can just drop it in easily and it creates that sequence. [01:21:00] I appreciate a lot of other people will be using it for live stuff, but for us at the minute it's really about getting that quick pipeline of camera tracking and being able to drop it straight into Unreal.
So far it's exclusively client-branded stuff, like commercials, not even content pieces. So there's always a client or an agency over your shoulder wanting to make sure you're absolutely guaranteeing to deliver on what you promised. We just want to get it as robust as possible and be confident when we walk on set, and not be having these kinds of conversations then.
**Eliot:** Yeah, exactly. You want to solve this early and up front, and get everything dialed in. This is great. And as you look at doing lots of different locations, the Gaussian splatting location capture is becoming extraordinary, and we're ratcheting rapidly toward having that be a key portion of what we're doing. [01:22:00]
**Paul:** Yeah. We used it for something on another shoot, nothing to do with camera tracking; that was actually a splat Paul did out on a shoot in Bangkok while I was in England. That whole world is very interesting. But I still think there's loads of mileage in self-created, kitbashed stuff as well, you know?
**Eliot:** They both work great.
**Paul:** Yeah. But I know that will open up ready-made, nice and easy, pre-existing environments. I know you can edit them to an extent, and layer them up, maybe. And if you're this excited about Gaussian splatting, then please do let us know any preferred workflows or pieces of kit we should be looking at, so that we can maximize it.
**Eliot:** I'll toss some things out; this is all [01:23:00] stuff we're still working through figuring out how to do, but some interesting things have already shown up. One of the things I'm looking at closely is this scanner kit by XGRIDS. I can just share the screen on this. Did you guys see this?
There we go. What it is, is a scanner you walk around with, and it's scanning in real time. So instead of the traditional LiDAR scanner, where you have to park a tripod, scan, pick it up, move it over, scan, and all that, it's just scanning as you walk through a substantially large environment.
**Eliot:** Now, the initial scan is not good enough for final pixels; it's a point cloud. But what I'm looking at is what we're starting to call interactive location capture: somebody walks around for an hour with it and scans a pretty substantial chunk of the environment as a point cloud.
You drop a piece of that, with a scene locator, into [01:24:00] Jetset. You shoot knowing what you've got, knowing which aspects of the environment you're going to need, and then pull the shots back into the original 3D scene. So you get a heat map of exactly the coverage you need.
Then you go back with the camera, walk around just the part you need, and hose it down with a high-res camera, so you can get a really bang-on Gaussian splat. You're not trying to capture the whole city, because nobody can do that; you're saying, okay, we're going to focus on this nice little corner cafe, and mop it up with the high-res camera. Then the Gaussian splat is photoreal, and it works from all the angles, and you know the angles, because you shot them, and you've got a heat map of all your camera frustums located in that original scan.
And one person can do all of it: walk through with the scanner, get the lighting right, then go fast, over a span of 10 minutes, so the lighting's not changing, to get your Gaussian splat. That's going to [01:25:00] look really bang-on, and now you're doing location work with one person on the location site, one camera, and a scanner, all of which fits in a backpack. You fly, shoot, fly home, done, and everything else is under control in your stage environment. I look at that and go: this is just how we're going to do locations.
I mean, sure, if you've got the budget, fly them out, knock yourself out, whole crew, woohoo: it's the world's most expensive camping trip, great. But when you don't have that, what you're doing is sampling the world: exactly the piece you need for your final production, and not an inch more than what you need.
And it's totally doable. So that's what we're cooking on, and it's why I'm focused on making sure our registration is correct when we do solves, so that we can match back and forth to 3D. Because the same thing works with a large 3D set. Say you have a monster procedural Houdini cityscape: [01:26:00] you can't load that into the phone, but we can pick a corner, put a scene locator in it, do a Gaussian splat of it, put that in the phone, shoot, and push the data back to the big original model. It's the exact same concept as the large exterior: same method, same techniques, just a slightly different sourcing of the original geometry.
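The coverage heat map Eliot describes is, at its core, a visibility count: for every point in the rough walk-through scan, how many shot cameras actually saw it. A rough numpy sketch of that idea; nothing here is a Lightcraft API, the poses and intrinsics are placeholders for data that would come from the registered takes, and it ignores occlusion entirely:

```python
# Coverage heat map sketch: count, per scan point, how many shot cameras saw it.
import numpy as np

def visibility_counts(points, poses, K, width, height):
    """points: (N,3) world points. poses: list of (R, t) world-to-camera pairs.
    K: 3x3 pinhole intrinsics. Returns an (N,) count of cameras seeing each point."""
    counts = np.zeros(len(points), dtype=int)
    for R, t in poses:
        cam = points @ R.T + t                 # world -> camera coordinates
        z = cam[:, 2]
        in_front = z > 1e-6                    # points behind the camera are unseen
        uv = np.full((len(points), 2), -1.0)   # off-frame sentinel
        uv[in_front] = (cam[in_front, :2] / z[in_front, None]) @ K[:2, :2].T + K[:2, 2]
        on_frame = (uv[:, 0] >= 0) & (uv[:, 0] < width) & \
                   (uv[:, 1] >= 0) & (uv[:, 1] < height)
        counts += (in_front & on_frame).astype(int)
    return counts   # high-count regions are the ones worth the high-res re-scan
```

Color the scan by those counts and the hot spots tell you exactly which corner of the location deserves the ten-minute high-res pass.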
Yeah. Very cool. I'll leave you guys with that. Please tell me how this works out; this was awesome. I am super chuffed. I like clear problems with clear solutions. Hey, no worries, this is great; this is why I'm doing it. All right, guys. I'll talk to you soon. Okay. Have a good one.
Take care. Bye. Bye.