Transcript
# Office Hours 2024-12-16
**Ron:** [00:00:00] Hello.
**Eliot:** All right. Morning, all. Hello.
**Ron:** Morning.
**Eliot:** Good morning. All right. Uh, who, uh, who wants to go jump in first?
**Ron:** Well, I mean, we'll probably have to share questions, because we're from the same company. Yeah, we're from Interior Night. We had some contact on the forum with you last week; we were doing a test shoot and running through some issues, et cetera.
And we just wanted to learn a bit more about certain aspects, especially the whole frame rate question. You run, for instance, into the issue that a lot of cameras that say they're 30 are actually 29.97. We were wondering what your view on that is. Ideally, for ourselves, we'd like to use 30,
so we probably plan to [00:01:00] use that on the actual shoot later on. But have you ever used 29.97? Do you know if people are using it, and is it always a problem, et cetera?
**Eliot:** So let me just make sure I understand. This is for a Jetset Cine project: yes, I have a Cine camera, and it's operating at 30.0
frames per second or 29.97 frames per second. Which one are you at, 30.0? Is that correct? Or
**Ron:** So, as far as I know, the phone, the Jetset app on the phone, is always running at 30. But the camera we had on the shoot last week could only do 29.97, which of course they call 30, but it's not quite.
**Eliot:** Ah, so that's an excellent question. The way it works is that Jetset is running on the iPhone, and that is locked to 30.0 frames per second, more or less. And the way we work with Jetset Cine is that we have a couple of timing [00:02:00] mechanisms. The initial one most people have run into is that we flash a set of optical markers on a digital slate, so both cameras see it at the same time, and that lets us set a moment in time that matches between the Jetset data and the Cine camera.
And then we reinterpolate that based on the frame rate of the footage. When you load the Cine footage into AutoShot, it's going to look at the .mp4 file, the .mov file, or the RAW file, whichever one you're using, and each of them is encoded with a frame rate in its metadata.
It's going to use that frame rate and reinterpolate the 30 frames per second data into 23.976, 29.97, or whatever the footage's rate is. That's where the interpolation takes place, based on the actual [00:03:00] piece of footage, because that's the record of truth: whatever came out of the camera, that's the metadata we use to reinterpolate our tracking data.
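Editor's note: the reinterpolation step described above (30 fps tracking samples resampled onto the footage's own frame rate, read from its metadata) can be sketched roughly as linear interpolation between the two nearest source samples. This is an illustrative sketch, not AutoShot's actual code; the function and variable names are made up.

```python
def resample_tracking(values, src_fps, dst_fps, duration_s):
    """Resample one channel of per-frame tracking data (e.g. one camera
    pose component) captured at src_fps onto frame times at dst_fps,
    linearly interpolating between the two nearest source samples."""
    n_out = int(round(duration_s * dst_fps))
    out = []
    for j in range(n_out):
        x = (j / dst_fps) * src_fps          # fractional source index
        i = max(0, min(int(x), len(values) - 2))
        frac = x - i
        out.append(values[i] * (1.0 - frac) + values[i + 1] * frac)
    return out

# Example: 2 seconds of 30 fps samples resampled to 24 fps.
src = [i / 59 for i in range(60)]            # 60 frames at 30 fps
dst = resample_tracking(src, 30.0, 24.0, 2.0)
print(len(dst))                              # 48 frames at 24 fps
```

The same idea applies per channel of the pose; the target rate (23.976, 29.97, and so on) is whatever the footage container's metadata reports.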
**Ron:** Yeah, because of course the tracking data is just numbers, so you can interpolate it; that makes sense. It's good to know that it does that. Of course, the problem we then had is that we had network issues, so the slate wasn't working, right? We'll be trying to use one of those travel routers next time.
But we tried to fall back on a Tentacle device, and that wasn't working either, because we have to set the Tentacle to either 30 or 29.97. In one mode the camera wasn't syncing to it, and in the other mode the Jetset app wasn't syncing properly to it. It was really weird. It showed it,
but it was like half a minute off or so. It was really weird the way it was showing.
**Eliot:** So with the Tentacle at 29.97, the Jetset app would be off [00:04:00] after a few seconds. I'd actually really like to see that and understand it, because if that's a bug, I want to know about it so we can patch it.
Do you have the rig? We don't have the full rig, but we could probably repro it, maybe repro the 29.97 sync with the phone at 30. We just don't have the camera again, because we hired it. Sure.
**Ron:** Yeah, but we don't need the camera for that. At least I would expect it to happen as well if we just connect, or turn on, the Tentacle.
**Eliot:** Yes, yes, because at that point the Jetset device only sees the Tentacle sync; it doesn't see the camera.
In another few months we're going to see the camera, but not yet. Oh, wow. Fair enough. So, do you have a Jetset and a Tentacle Sync? Because then we can repro it, and I'll show you a tool that we use that is extremely useful for us. Right now it's an internal [00:05:00] tool, but at some point we'll release it.
Do you have your Jetset device available? Do we want to do this now? I think I can, if we want
**Ron:** to, probably grab it. So yeah, the phone is in the top drawer of my desk, and the Tentacle is in one of the cupboards, basically where all the shooting gear is.
**Eliot:** Yeah, that's, yeah,
**Ron:** I live in the Netherlands, but the studio is in London.
I was over there last week, but at the moment I'm back home, so I'm not in the office. I need Rosanna to get stuff for me.
**Eliot:** I understand, I understand. So actually, this is a really good question. So you live in... sorry, Amsterdam or the surrounding area?
**Ron:** Uh, much further south, close, well not too far from the Belgian border actually.
**Eliot:** I see, I see. And is the production you're working with mostly based in the UK? In London?
**Ron:** Yeah, yeah. So our studio [00:06:00] is based in London; it's a game studio. I lived there for the last 15 years or so and just moved back here recently; I'm originally from the Netherlands. But I go into the office like a week a month or so, just to work.
So
**Eliot:** we're building a tool that may end up being of use to you, and you'll see a quick version of it now. It's a remote assist tool that we have built into Jetset, and it's exactly for this kind of thing: when she comes back, I'm going to put up a QR code, she points the camera at it, and it makes a screen-share link. It basically creates a new button on the Jetset UI that looks like a screen-share button; hit it, and it bounces the UI over our server
so I can actually see it. This turns out to be profoundly useful for exactly this sort of thing. We've been living on this as an internal tool for three years, but we're going to release it as a [00:07:00] production feature,
probably early 2025. So instead of our sort of hacked-together, in-house Zoom-wannabe thing that we built just for ourselves, it'll work with actual Zoom. You'll be able to send someone a link, and it'll load up Jetset and hook it up to a particular Zoom session.
So what I see happening is that the person who is typically the technical expert for a project can be geographically separate from the actual production; that's extremely common. And that way you don't have to sit on set all the time. My goal with this is, you know, it's frequently very counterproductive for technical people to be stuck on set, waiting for something to go wrong.
You just burn days that would be better spent elsewhere, you know? So we actually did [00:08:00] some tests with Harald Zwart, the film director; he's originally from Norway but has worked in the U.S. for decades. His technical teams had looked at a lot of the different tracking systems, which were very complex, and then they found Jetset and were using it to test out a concept for an animated film. The technical people, from Gimpville in Norway, a very, very good visual effects company, came on set the first day or two, and then realized, you know, "We've got it, I don't need to be here," and left.
And I said, that's my goal. Because otherwise you're just sitting there trying to babysit workstations, and we're trying to get rid of that whole thing, so it's all just in the phone and there's just not much to go wrong.
**Eliot:** You know what I mean?
**Ron:** Yeah. I think I need to give more clues for finding this Tentacle.
**Eliot:** No worries. [00:09:00] Well, what we're going to do before we even jump into that: go ahead and just point your normal iOS camera. Let me share my screen and put up a QR code... share... there's the QR code. Okay, just point your normal iOS camera app, the standard camera app, at this, and it'll pop up a yellow button, and that yellow button will open up Jetset.
It'll ask you to okay a screen recording. It's not actually screen recording; it's just copying the pixels and pushing them over a server so I can see it. And Rosanna, I just texted you the password for the,
**Ron:** uh, Yeah, yeah,
**Eliot:** yeah, yeah, okay, I'm opening the app. So let me, oh, wait, let me, uh, I just realized our certificate bonked.
All right. Let me, I may, I may need to go get Greg to reboot our certificate. Ah, right. Hang on. Let me, uh, let me.
**Ron:** Yeah, no [00:10:00] worries.
hello.
Um, did you not find the other two borders as well?
**Eliot:** Yeah, I did. They're stacked up at the bottom of the left thing, but without the, without the Tentacles on top.
**Ron:** Yeah, I'm pretty sure I put them with that, but it would be really weird if just those are gone.
**Eliot:** All right. So I think mine's working now. Let's see. Then,
[00:11:00] where are we at? Okay, I'll share my screen capture again. There we go. All right. So once you've got the iPhone with Jetset, point it at that screen; it'll pop up a little yellow button. Hit that, and then that'll be screen sharing and all that.
I'll screen-share too, so you guys see what I'm seeing; it's going to be your UI. But then the nice thing is we can record, so if we find some bugs, I've got it preserved in amber and I can send it to Greg: here it is, here's our problem.
**Ron:** Okay. Go ahead. Sorry.
Where am I? I didn't understand. What am I doing?
**Eliot:** Point the Jetset iPhone at this, with Jetset turned off. Oh, okay. Yeah, go ahead and exit Jetset. Yep. And then just use the normal iOS camera app. Yeah. So I've done that. Oh, [00:12:00] okay. Yeah, I've done that, and I pressed Open Jetset, and I'm here.
And it asks you to record the screen, to record screen pixels. It's not actually recording, but it should ask. Let's try that again: go ahead and exit Jetset, and it should ask if it's okay to... exit Jetset and try that iOS camera again. No, it still opens. Oh, oh, there's that. Yeah. Yep.
Record Screen. Yep. It's not actually recording; it's just the Apple nag. There we go. Okay, now we've got a win. Let me share my screen so you guys see what I'm seeing. There's the Highland Park video share. All right, now I should see your UI. I don't see it yet; let me refresh here real quick.
There it is. Okay, there's a win. This is our kind of in-house [00:13:00] support system. So, okay, let's just go ahead and click Charterhouse; that's fine, it doesn't matter which scene we're using for this. All right, and then just point it down at the floor or something like that, so it's going to lock to the floor.
There you go. Okay, so let's take a look here. First, you've got all your stats up there; we can toggle that off. Go ahead and click on the lower right-hand corner, on your dashboard window. It's the three little green dials in the bottom right. Yeah.
All right. And let me refresh my screen; I can't wait till we get this hammered into Zoom. Okay, so then just tap the Toggle Timing button, because we don't need all this data display. Toggle Timing, lower left-hand corner. Yeah. And then just tap the screen again to have that go away.
Okay, so then let's go check what our Tentacle is doing. There's a little button to the left of those [00:14:00] dashboard buttons that looks like a loudspeaker; go ahead and click that. Oop, cancel that, let's hit the X. It's just to the... there it is. Okay, so right now we're on the internal source at a frame rate of 24 frames per second.
So let's first make sure that we can hook up to the Tentacle. So, the problem right now...
**Ron:** There we go. Rosanna hasn't been able to find the tentacle.
**Eliot:** Oh, okay. Well, yeah, that's going to be a bit of an issue; that's a first-order issue. It's going to be extraordinarily unlikely that we can debug the Tentacle link without it.
Yeah. Do you guys want to go find it? Is it easy or difficult to find? We can come back another day once you find it. Or, if it takes ten minutes to find, let's just go track it down. Let's try another five minutes, and if we can't, we'll move on.
Ron, can I call you on WhatsApp, on a video call? Yeah, yeah, that's fine. Okay. We'll be back in maybe a minute. All right, no worries.
Okay, good. There. Just, we found it.[00:15:00]
Got it.
So you have to be patient with me because I don't remember the last time I used this.
**Ron:** So you can take out the purple one or the yellow one, doesn't matter, one of the two. Yeah. And press the button and hold it until it starts blinking green.
It's about three seconds or so; just release it when it starts doing the green blinking.
Oh,
**Eliot:** yeah.
**Ron:** Okay. There we go. So, I don't know whether the time is correct now, but we probably set it to 30, so it's probably correct now.
**Eliot:** Hmm. Do you want me to go on the app?
**Ron:** Uh, yeah, if you go into the app, you can, uh, yeah, although you can't do it at the same time. That's a bit annoying, of course.
**Eliot:** Yeah, that's true.
So it's, it's set to 30.
**Ron:** [00:16:00] Yeah.
**Eliot:** It says "timecode cable unplugged," but no one cares about that. So if I set it to 29.97... yeah, it'd be hard for us to see what's going on. Do we have to count out loud? Obviously we could see it last time because we were on two devices.
**Ron:** Uh, no, there's,
**Eliot:** uh, maybe, I think...
**Ron:** It says 29.97 now, so that's correct, but I don't know what... yeah, you see,
**Eliot:** you can see that it's not counting the frames.
That's one thing you can see. See, it's only doing seconds.
Yeah. You're muted, Eliot. Uh, yeah, the Bluetooth side doesn't actually try to update on a per-frame basis. I'd have to check with Greg on the details, but [00:17:00] for both this and the digital slate, we show it updating every second.
So you know it's correct, but you can't get the 42-millisecond, per-frame timing across that way; we get the general timing correct. Okay, and we can verify this, actually. So I can tell you... sorry, I just connected a Tentacle to my phone,
my other phone. So I can tell you now that they're on different timecodes. An easy way to see this is in Jetset: let's go back to Jetset and drop the ghost slider down so we can see the live-action view, and then hold the iPhone in front of the frame view.
Yeah, there we go. Then we can see everything in the same view. All right, so let's see: we are on Tentacle Sync, and in yellow, [00:18:00] 17:27. It's about a minute off, right? Yeah. Okay. Now, hang on. In yellow. That's weird: 17:27:48. Okay. That's weird. Um,
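Editor's note: for scale, pure rate drift between 30.0 fps and 29.97 fps is far too small to explain the minute-level offset seen here, which points to a mode or parsing mismatch rather than drift. A quick back-of-the-envelope check (plain arithmetic, not anything from the app):

```python
# Rough arithmetic only: how far apart do a 30.00 fps clock and a
# 29.97 fps clock drift over one minute of wall-clock time?
rate_a = 30.0
rate_b = 30000 / 1001           # exact NTSC "29.97"

elapsed_s = 60.0
frames_a = elapsed_s * rate_a   # frames counted by the 30.00 clock
frames_b = elapsed_s * rate_b   # frames counted by the 29.97 clock

drift_frames = frames_a - frames_b
drift_seconds = drift_frames / rate_a

print(round(drift_frames, 2))   # ~1.8 frames after one minute
print(round(drift_seconds, 3))  # ~0.06 seconds
```

So after a full minute the two rates disagree by well under a tenth of a second; a minute-scale timecode offset has to come from somewhere else.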
**Ron:** So this is, yeah, the same as what we saw last week.
And as soon as we changed it to 30 FPS, this was fixed, by the way. But then of course the camera couldn't read it anymore.
**Eliot:** Interesting. Okay. Well, this is extraordinarily useful to have, and let me just check that I am recording, because I want to be able to show Greg this. Good, I'm recording.
Okay. So yeah, that's clearly not doing what we want it to be doing. It's in yellow, and it looks like it's not doing the right thing. So that's great: I can show this to Greg, and he can probably replicate it, because it's just with a Tentacle Sync. And let's just double-check what version of Jetset you're on. [00:19:00]
Let's go into the... oh, wait a second. Okay. Yeah. So, Settings and Support; let's take a look at Settings, and let's scroll down a little bit. Keep scrolling. There we are: 1.35. And I think that's the current build that's in the store. Let me double-check on that.
Let's see.
Let me just check real quick. Oh, yeah.
Yep. Yep. 1.35. So you're on the current version from the store. Yeah, it looks like a bug, and that's enough for us to track it down and find it. So that'll be [00:20:00] a good plan. Oh, but the more theoretical question: if I were shooting a film at 24, what you're saying is that it doesn't matter too much that Jetset is at 30, because AutoShot figures that out. So that's not something to worry about, in principle.
Yeah. So, did you have a digital slate for the production? There are a couple of useful things here. Let me look at my Zoom screen; let me stop my share. So, on the production, Ron, you said... okay, you didn't have a digital slate.
Um.
**Ron:** No, we just did a clap.
**Eliot:** Oh, that's okay.
**Ron:** That's what we did. So we can still find out the offset by syncing on the clap, basically.
**Eliot:** Yeah, that's possible.
**Ron:** It's a bit manual, but hey, for this test shoot it's doable.
**Eliot:** Yeah, that's fine. And how far off were we, [00:21:00] a minute or a second?
Can we go back to that screen? I just want to look at it. Yeah, it was about a minute. Not exactly, but... oh, shoot. Okay. If we'd been off by just a couple of seconds, we have a feature that could have caught it, but I think a couple of minutes is going to be hard for it.
We added something where we can synchronize using timecode. The problem, of course, is that timecode, depending on your device, can be a frame or two either way from reference truth. So we have another layer on top, called optical flow detection, where we run optical flow on the entire take, look for the overall movement vectors, and then lock the two together.
It's extraordinarily precise, but you need to give it an initial condition that's within a couple of seconds of reality before the minimization algorithms can lock, and I think this one is going to be too far off for it. I bet this is going to have to be a manual one.
And I apologize for that; it's going to be a pain in the butt. [00:22:00] It's fine, it's more about the future. Yeah, yeah. This is extremely useful to find out in a repeatable, replicable manner: if we can see it, then we can get it.
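Editor's note: the optical-flow lock described above (matching overall per-frame motion between the two recordings, given an initial guess within a couple of seconds) can be sketched as a windowed normalized cross-correlation. This is an illustrative sketch, not Lightcraft's implementation; all names are made up.

```python
def find_offset(motion_a, motion_b, fps, guess_s, window_s=2.0):
    """Return the offset (in seconds) that best aligns two per-frame
    motion curves, searching only within +/- window_s of an initial
    guess. motion_a / motion_b are lists of per-frame motion magnitude,
    e.g. the mean optical-flow vector length of each frame."""
    guess = round(guess_s * fps)
    window = round(window_s * fps)
    best_lag, best_score = guess, float("-inf")
    for lag in range(guess - window, guess + window + 1):
        lo_a, lo_b = max(0, lag), max(0, -lag)   # overlap after shifting
        n = min(len(motion_a) - lo_a, len(motion_b) - lo_b)
        if n < 2:
            continue
        sa = motion_a[lo_a:lo_a + n]
        sb = motion_b[lo_b:lo_b + n]
        ma, mb = sum(sa) / n, sum(sb) / n
        num = sum((x - ma) * (y - mb) for x, y in zip(sa, sb))
        den = (sum((x - ma) ** 2 for x in sa)
               * sum((y - mb) ** 2 for y in sb)) ** 0.5
        score = num / den if den else float("-inf")
        if score > best_score:
            best_score, best_lag = score, lag
    return best_lag / fps

# Synthetic check: curve b starts 12 frames later than curve a.
import random
random.seed(0)
a = [random.random() for _ in range(300)]
b = a[12:]
print(find_offset(a, b, 30.0, guess_s=0.3))  # 0.4 (12 frames at 30 fps)
```

The "initial condition within a couple of seconds" requirement maps directly to the search window here: outside that window the correlation is never evaluated, so a minute-scale timecode error leaves the true alignment out of reach.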
**Ron:** Yeah, no, I mean, we're in the games industry ourselves,
so we're very familiar with bugs and how important repros are.
**Eliot:** Yes, yes. As soon as you can see it, then... yeah. Okay. So generally, though... sorry, the other question is: there's a setting in the desktop version that's called Target Camera FPS.
Is that what it's called, Ron? Sorry, I forgot. External Camera FPS?
**Ron:** Let's see. Oh, that one. Yeah, that was 24, right? When we went in, we saw it. Yeah, exactly.
**Eliot:** We didn't even consider it until we were trying to troubleshoot this issue, and we [00:23:00] saw it was set to 24. And I started thinking, oh, is Jetset set to 24 as well?
Let me look at that. If that's on the digital slate, that's going to be misleading. Give me a second on that; let me look at it really quick. Let's see,
**Ron:** because
**Eliot:** we would be getting that from the Tentacle. If we're going to get it from anywhere, it has to be from the Tentacle.
**Ron:** That's interesting, because I don't think the Tentacle was ever on 24.
**Eliot:** No, it didn't. I don't think it took it from there. All right. Okay, so let me look at this. I'm going to share my screen real quick.
Share. There's that. Okay. So where is it? Because there are these settings... Settings. Uh, no, go into the actual... yeah, and then lower, lower, lower, yeah,
**Ron:** there you go. External Tracking FPS.
**Eliot:** Yeah, okay, this is for the live output, the real-time output that we would send to Unreal.
Right, makes sense. And so this will not affect the Jetset timing. No, I get that that would be a little confusing. What I want to think about is what I would rename it, because this is all the external tracking protocol pieces that we send out, and External Tracking FPS is part of that.
Give me a second... yeah, that's a completely different subsystem. I'm going to have to think a little bit about whether we should rename it so it's less confusing. [00:25:00] I'll tell you what we're going to be doing: we've got a couple of things we need to do.
One is, as I told Ron, we're going to build a real feature out of what I just did today: a system that lets me remotely see your display so we can diagnose stuff just like this. Right now it's a
hacked-together internal system that we built for exactly this reason, but it's profoundly useful when you have a distributed team. Ron could be wherever in the Netherlands, you guys can be shooting somewhere, and he can patch in and see exactly what's going on in the same way I just did, because as soon as you see it, you're like, oh, there it is.
So we have to do that. And the very next thing we're doing after that is starting to work on compositing in the phone, where we take the video signal from the actual Cine camera and composite inside Jetset. We need to do that because it solves an [00:26:00] inordinate number of problems.
Yeah, that's great. That was going to be my next question. I mean, you said half of the answer, but the other half was: if I had a monitor, is there a world in which, at some point, we'll be able to see not what the phone sees in terms of CG, but what the camera sees in terms of CG, with the camera plates composited? And will that be saved?
That's exactly what we're doing next, because we just have to. This is not a new thing: we built two full virtual production systems before this that had HD-SDI and compositing, et cetera, in Unreal or another engine, and it's just necessary.
We got sidetracked a little bit with the reticle and these other things, but it's too easy to get into a mess with no real-time feedback. So yep, that's what we're doing. Yeah, no, that's good to know. [00:27:00] It's a decent amount of engineering to get there, but that's just what we have to do, and then it's one-to-one. Then you can do things like generate a Gaussian splat from your Unreal or Blender scene or whatever, drop it into the phone,
and actually see a lit, composited view with no need for a bunch of workstations packed in a row behind you. I want to make that gone, because it's madness. Yeah. Every shoot, it's bad. That's true. No, good to know. But okay, you're not thinking of... sorry, this is the last time I'll ask this... you're not thinking of doing different frame rates for Jetset itself, like having it record at 24 for any reason?
I would so love to be able to do that. On the Jetset device itself, we run Apple's internal tracking technology, called ARKit, and as [00:28:00] soon as you're doing that, the phone locks to 30. I mean, you can try to run it at 60, but it turns out the phones can't handle it. And they get hot enough at 30 that we have to run coolers and all sorts of things.
We run the phones pretty hard, a lot harder than Apple originally intended, I think. So that's why, with the new phones, if you haven't upgraded and got a 15 Pro or 16 Pro: strong recommendation to do that, because it's a game changer. The cooling, especially on the new ones, is far, far better, and the GPUs are far better.
Everything is just a lot better. But when we're tracking on the phones, they're locked to 30. I've already contacted people at Apple about it. And to some degree, real-time tracking wants a higher frame rate, so if they could track at 48, that would also be fine.
Ideally you'd be able to track at 24, 25, 30, the usual suspects. Right. It's just [00:29:00] not a thing right now.
So it's just where we're at. All right. Cool.
**Ron:** I think I'm done... oh, one more question. Something I noticed at some point was that sometimes the live tracking seems to work really precisely, and then when we bring it into Unreal, it doesn't always quite match; it seems to be less accurate.
And I was wondering where I should start looking for the difference. Because if it can track fine in real time, then I'm sure the data that has the same tracking must exist. Would you expect it to be maybe an offset that's wrong, or?
**Eliot:** Yeah, on this one, the key thing for us is that we have a take zip.
So whenever you have a take that looks not right in Unreal or something like that, here, I'll send you the [00:30:00] doc page on it in chat, there we go. The way we always do stuff is: you load the take into AutoShot, and once you have it in AutoShot, if you're using Cine footage or something, you can hit File, Take Zip, and it'll zip up both the original Jetset take and the Cine footage take that's picked for it,
wrap it all up in a zip file, and then you can send us a Dropbox link to that. And then we can recreate one-to-one everything that you're seeing on that take. It has everything: lens calibration, Cine camera, Jetset footage, you know.
Then we open it up in our system, and we should be able to pin down one-to-one the same thing you're seeing. I guess one question is: do the takes have a lot of visible ground contact? Because there are precision limitations there.
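Editor's note: the "take zip" bundle described above can be pictured as one archive gathering the Jetset take, the matched Cine footage, and the lens calibration, so a take can be reproduced one-to-one elsewhere. The sketch below is purely illustrative; the folder layout and names are made up, and the real workflow is simply AutoShot's File / Take Zip.

```python
import zipfile
from pathlib import Path

def make_take_zip(folders, extra_files, out_path):
    """Bundle whole folders (e.g. a Jetset take and its Cine footage)
    plus loose files (e.g. a lens calibration) into a single zip."""
    with zipfile.ZipFile(out_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for folder in folders:
            folder = Path(folder)
            for f in sorted(folder.rglob("*")):
                if f.is_file():
                    # Keep each folder as a named subdirectory in the zip.
                    zf.write(f, f.relative_to(folder.parent).as_posix())
        for f in extra_files:
            zf.write(f, Path(f).name)
    return out_path
```

The point of bundling everything, rather than sending footage alone, is that the receiver gets the exact calibration and tracking data the sender used, so the mismatch reproduces identically.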
**Ron:** Yeah. No, that's always a tricky one. The take where I remember seeing it was actually one I did without an external camera, just with the phone. [00:31:00]
So in that case, I was expecting it to be really the same as what I saw live, because it's actually one-to-one, right? Yeah. So I probably need to dive more into it. If I do that and still find it not matching, I will definitely follow the link you just sent to get you guys a take zip. But it might just be my mistake, and I just need to dig deeper and find out, basically.
**Eliot:** Um, send the zip and we can open it up. Or, you know, jump in on office hours. I have a pretty fast internet connection, so if you jump in on office hours and send me a Dropbox link, I can pull it down during the course of the office hours pretty easily.
It'll take me, you know, three minutes to download, unless it's something crazy, like an eight-minute-long RED take. What camera are you shooting with? We've been shooting with different cameras.
Our set production one is a RED. Okay. Yeah, okay, that sounds good. [00:32:00] And we're actually working on implementing direct R3D-to-EXR conversions for our frame pulls, because earlier you'd have to convert the RED footage to a QuickTime ProRes or something like that for us to extract it.
But we are planning direct RAW conversions from RED, Canon, and ARRI; we already had Blackmagic RAW, so it's just adding on to those. That way you can just drop your R3D file into the Cine source. We still need the proxy to help detect some of the pieces of information in it,
and you probably already have a proxy workflow if you're shooting REDs, just for your editorial. Yeah. So you'll just load the proxies into the proxy area and the sources into the source area, hit Scan, and that'll be it.
All right, so I'll get this to Greg so we can replicate it, but it's clearly happening. And I should have had you switch it to [00:33:00] 30.0. Do you want to do it quickly? I can do it. That's not a bad idea; let's see if we can't do it.
No problem. Let me re-share my screen. All right. Hey, Navaz, good to see you. Let's see. Hey, good morning, Eliot. Oh, I can't see anything. Let me share it so
I can catch it. There we go. All right, so we've got that; let's get that guy in focus: 17:43. So yeah, we're clearly a minute off there. All right, so then what happens when we switch? Oh, sorry, I have to switch through this phone, but you will see. So we're [00:34:00]
synced now.
I don't know if you can see it, but it's pretty synced on mine. That looks synced. Let me refresh my screen.
There we go. 47, 48, 49. Yep, that's locked. Okay. Um, all right. So it must be something with 29.97. We'll find it. And, oh, last question, what's your, um, uh, what's your firmware on your, uh, does it show firmware on the Tentacle Syncs, on the, on the app? I don't know if that needs an update or anything like that.
Just to make sure we cover our bases.
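The mismatch being chased here is easy to quantify: NTSC "30" is really 30000/1001 frames per second, so a camera at 29.97 slips behind a true 30.0 clock by roughly 1.8 frames for every minute of recording. A quick back-of-the-envelope check (just the arithmetic, not anything from Jetset's actual sync code):

```python
from fractions import Fraction

def drift_frames(duration_s: float) -> float:
    """Frames of offset accumulated between a true-30.00 clock (e.g. the
    phone) and an NTSC 29.97 camera over duration_s seconds."""
    ntsc = Fraction(30000, 1001)             # "29.97" is exactly 30000/1001
    return duration_s * (30 - float(ntsc))   # ~0.02997 frames per second

# After one minute the two clocks disagree by almost two full frames:
print(round(drift_frames(60), 2))   # 1.8
```

That steady creep is why a one-time slate sync looks fine at first and then visibly walks off over a long take.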
**Rossana:** Uh, 1. 1. 1 [00:35:00] BT203. Okay. I can send a screenshot.
Don't know where.
**Eliot:** Uh, you can send it to support at lightcraft dot pro.
Support at lightcraft dot com? Uh, dot pro. Dot pro. Yeah.
Uh, we're interior night, by the way, so I'm gonna call it the evening.
Okay. Excellent. All right. We'll, we'll hunt that one down. Thanks for, thanks for bringing it up. That's uh, No worries. One of those we probably would have never, we would have never tried that on our own.
**Ron:** No, thanks for, for, uh, yeah, giving the attention, man. [00:36:00]
**Eliot:** I
**Ron:** really appreciate it.
**Eliot:** No, this is, this is great.
Till then. You guys are making it. It's a, it's a game. It's a,
**Ron:** yeah, I'll, I'll put a, um, uh, I'll book a meeting with you probably later this week, uh, outside his office hours, because we can tell you a bit more about the project. If we're not in an open forum like this, sorry, but we don't have nothing against you, but it's also recorded and you have to protect the, uh,
**Navaz:** Oh, no problem, man.
No problem. I understand.
**Eliot:** Okay. Well, great. Well, I mean, great to meet you all. And, uh, I'll, uh, we'll, uh, and so you have, okay. And so this is on the forum as well. So I have, I have a place to follow up, follow up well. Yeah. And I'll send you an email now and I'll CC Ron today. Okay. We'll be on that. That's perfect.
All right. Thanks a lot, man. Hey, no problem. Thank you very much. Very helpful. All right, all thanks again. All right. See you later then.
**Rossana:** Bye.
**Eliot:** Bye.[00:37:00]
All
**Navaz:** right. Noah,
**Eliot:** how you
**Navaz:** doing? What's up Elliot? I set up a meeting for us tomorrow at like 10 30 if that works for you. All right, let's see
**Eliot:** that. I tell you what, um, can you do one a little bit later? Just because I know Bill is sometimes busy up through about 11 on Tuesdays. Uh, and I want to have him, have him jump in so we can, he can talk to you as well.
Cause that'll, that'll be fun.
**Navaz:** Okay.
**Eliot:** Yeah. Um,
**Navaz:** what time do you think, like around 12?
**Eliot:** Yeah, I should do it. Um,
**Navaz:** okay.
**Eliot:** I mean, let me just modify it right here. Let me do it. Uh, I can modify this 12 PM and add bill.
**Navaz:** There we go. So we've been working on a method for the 3d or no 4d. They call it 4d Gaussian splats.
Oh really? Yeah, we think we got it somewhat figured out. [00:38:00] Um, we're, we're gonna be, uh, retrying it again I think on Thursday or Friday, you know, just 'cause we have so many meetings this week and things we pro or the stuff we're trying to finish, you know. I can imagine.
yeah. It's like, I mean, I have so much stuff, you know, going on, but, um, what's interesting is that it's very doable and I think the problem is sometimes. In situations like, like what we're trying to do with, you know, with the 4D Gaussian or animated Gaussian splatting is that the people that are, are trying to solve it, they look at it from, uh, from, uh, I guess you could say from a technical point of view, you know what I mean?
And so when we look at it from a practical point of view, can we get the same, um, results doing it a different way? You know, that's how we've been able to solve a lot of problems. I mean, kind of like with even, like, the, you know, the VR thing and, you know, the lighting thing, is like for us [00:39:00] to be able to, um, look at it from a different point of view, instead of, you know, hey, we can build this hardware. It kind of goes back to what I was telling you that day.
Is that instead of spending the next month trying to build some hardware, why don't we just buy it, you know, and do a deal with the company, you know? I mean, yeah, that's a lot more, more, uh, beneficial. So in that mindset, I looked
**Eliot:** up, um, I looked up Govee, uh, and one of the other ones that might be worth looking at. Cause the, and I don't, I only halfway understand this world, because they're making stuff for bias lighting.
There's also a company called MediaLight, um, that has extremely high CRI stuff, color rendering index things. Um, and I don't know yet whether, I haven't been able to discern, I don't understand how they take the video signal and convert it to RGB yet, whether [00:40:00] there's, like, a box that takes HDMI in and does the magic, but I don't understand that yet.
**Navaz:** Yeah, there is a box. There's a, there's a box that converts it. I mean, I think they use like a Raspberry Pi type, you know, situation. I mean, I know you can do it with the Raspberry Pi, you know, I mean, I've read, I've read all the tutorials on how to do that. But, um, what they do is they convert the signal.
You know, and then the signal, you know, um, outputs to, you know, to the lights. Um, what's kind of interesting is this, is that the reason we like Govee over, you know, all, all the other off the shelf, um, hardware is because Govee has so many, um, so many different options for lights.
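The converter-box pipeline being described is the classic Hyperion/Ambilight approach: capture the HDMI signal, average zones along the frame edges, and push those RGB values out to the LED strip. A toy sketch of the zone-averaging step (the strip height and LED count here are made-up parameters, not anything Govee documents):

```python
import numpy as np

def edge_led_colors(frame: np.ndarray, leds_per_side: int = 10) -> np.ndarray:
    """Average the top edge of a video frame into one RGB value per LED,
    the core trick behind Hyperion-style bias lighting. Toy sketch only."""
    h, w, _ = frame.shape
    strip = frame[: h // 8]                      # sample the top 1/8th of the image
    zones = np.array_split(strip, leds_per_side, axis=1)
    return np.array([z.reshape(-1, 3).mean(axis=0) for z in zones]).astype(np.uint8)

# A solid red frame should light every LED red:
frame = np.zeros((720, 1280, 3), dtype=np.uint8)
frame[..., 0] = 255
print(edge_led_colors(frame)[0])   # [255   0   0]
```

On a Raspberry Pi the same loop just reads frames from an HDMI capture dongle and writes the resulting colors out over USB or SPI to the strip.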
**Eliot:** Okay.
**Navaz:** You know, so, so being that they have so many options, it makes it to where it's, uh, I mean, let's say if we wanted a whole, like a control panel lighting up at a certain, you know, at a certain location, you know, like when we actually move the camera back, like Govee has like little lights that can light up and do that, you [00:41:00] know, compared to like most of the other companies, they're more, um, you know, like, Hey, we have a lighting setup.
You kind of understand what I mean? Um, I mean, yeah, they're more flexible as far as options out there. I mean, cause like, honestly, if we actually did, if I, well, me personally, if I actually sat down, I could probably, you know, design my own, um, hardware, you know, within like a week or two, you know, but then at the same time, it's going to take, you know, two weeks or, or a week of my time, you know, trying to figure out something that.
Can be easily solved with off the shelf hardware. And I think also when we're, what we're looking at, I mean, our overall goal is to make virtual production easy, you know, to where basically anyone can do it. And you could just show people and, you know, they can be like, okay, well, instead of me having, you know, a 50, 000 budget or whatever, with, let's say 5, 000, I can set up my own studio and do [00:42:00] everything that the volume is doing.
You know, cause I mean, that's kind of where we're at right now. We're at the point now to where we're looking at it as how simple is this? And that's the thing that blows me away on a daily basis is that when we look at a lot of the stuff that we can do with what's out there now, you know, we can do anything, you know, with AI, the way that it is, um, we were looking at this new, I forgot what it was called.
Cause we've been doing motion capture and, you know, trying to do all that. And then all of a sudden a YouTube video pops up on my feed that shows basically that, uh, that you can use a video and just put a model over it, but it's not Wonder Studio. It's actually, um, I forgot off the top of my head what the name of the company is, but what they're doing is pretty impressive.
But what people are using it for is, oh, they're using it for memes. They're using it for, you know, like, oh, I just want to change clothes. You know what I mean? So they're not really using it for filmmaking yet. You know, I could see, I could see [00:43:00] some people have used it, or one guy used it for, um, um, what was it called?
He made a cartoon. So basically he shot it live action and then basically put a cartoon, you know, body over himself, and was able to make a cartoon instantly. And the, the tracking, the motion tracking and stuff, was exact. I mean, it was pretty, pretty close, actually. Let me look to see what that thing was.
I think I sent it to, I think I sent it to Roman, so that way I can show you the name. But I mean, if not, the whole thing is, is that there's enough technology out there right now to make a movie with next to nothing, you know. Yeah.
**Eliot:** Oh, this is this is exactly um, okay. Let's see. Let me just look at this.
If you had, um, it's always useful. Do you have when you do you have a link on any of these things? Because it's,
**Navaz:** yeah, let me find the link because I actually had it. We actually had to go through [00:44:00] the whole process and. And we put it on a Mac and it was, it was actually, it worked really well. The only thing that I was kind of hesitant about was you have to use Pinocchio.
I don't know if you've ever used Pinocchio. Um, it's kind of like a alternative to, uh, um, anything. Oh, it's called, Oh, here it is. I'll send right now.
**Eliot:** Give me a
**Navaz:** second.
**Eliot:** Cause there's, there's a bunch of it. I mean, there's one called trace. It's kind of wild, but those are, it's more of an academic thing. You have to go compile the code and all this kind of stuff.
**Navaz:** Yeah, well, this one I sent you is called Vigil, and it's actually very impressive. Kind of nuts.
**Eliot:** Oh, geez.
**Navaz:** [00:45:00] Right? That's pretty crazy. And when you look at the other videos that the guy did, he even used SwitchLight to, um, uh, adjust the, the lighting on it, you know? And, I mean, stuff like this is, is like, I mean, I don't know what these guys are using it for, but if you actually incorporated it into a movie, you now have complete control over something, you know. That's crazy.
**Eliot:** That is, that's pretty crazy. Yeah. And hang on, let me, uh, share the screen. So this is kind of nuts. This is, let me show you this thing. Uh,
I mean, it's interesting to kind of see in the variations. Let's just see.[00:46:00]
**Navaz:** See what I'm impressed with is the fact that he was able to do it with the background, the way that it is. And to be honest with you, Ellie, it's very easy. Like, I mean, I thought it was going to be, you know, like a lot of going back and forth and stuff, but it was very, very, I mean, it was basically like you take a picture and it will basically take that picture and put it over that, that person, whoever's moving on your input.
Well, so when I
**Eliot:** look at this and the part that I look at it would actually be. Um, there's, there's a, there's an unsolved problem, which is, which is costuming, you know, digital costuming and Yeah. You know, 'cause the, the, the trick of course is that, is you usually, can you kind of break your, um, the big old page, the crazy, crazy, of course.
sheets, . That's, um, where is, I would love to find something [00:47:00] where, let's see if this is, yeah. Like,
**Navaz:** Oh, this is hilarious for what we were looking at it for was using live characters and just basically putting different costumes on them, you know, because that, that, that's something that I've seen in those videos.
I mean, the face stuff, I think they're still, you know, probably like, you know, a few months away from getting it right. But I mean, the clothes is huge and to be able to do it instantly, you know, so you can basically put a whole costume on somebody. You know, I mean, right then and there,
**Eliot:** you know, if they can handle moving camera, I, I worry that they, that's that, I mean, just look at all this kind of stuff, Vigil, uh,
see if they can deal with moving camera, create. I'm just kind of looking at all this kind of stuff.
**Navaz:** Yeah. Well, does it have to handle, I mean, well, you mean moving [00:48:00] camera stuff
**Eliot:** with moving cameras there? Oh yeah. I see what you're talking about. Interesting. Cause this, with all these things, You know, the trick is you want to be able to, okay, man, it's, it's good.
It's good. They're, they still have shimmers, but okay. You know, whatever you could, you could deal with that.
**Navaz:** Yeah. We already figured out how to get past that. We were just going to go through and remove the frames that have the jitters in it, and then, uh, I forgot my, my, my, uh, Roman was explaining it to me.
It was like something, there was a software that will fill in the frames. Right. Yeah. The can, you know, the missing frames and stuff. Yeah. He, you know, de jitter stuff
**Eliot:** because it's, it really is. And, and people are much more forgiving of costume issues than, than the face. Right. Whether you just, you want to have the face, the hair that just needs to be real.
Otherwise you are in a world there.
**Navaz:** Well, that's what I think. Like when I see software like this, I'm like, wow, this is really impressive. [00:49:00] But what are people using it for right now? You know what I mean? And then on top of that, when I look at the overall encompassing, um, like, let's say, tool set that we're putting together to make films.
I mean, it's going to come to a point to where you can do anything, you know, like as, as you're, you're, you're imagining it. I mean, I'm still kind of iffy about AI backgrounds and stuff yet. Um, but I was looking at what you were talking about with the Gaussian splatting, you know, as using that as a background.
And I think honestly, um, you taking a 3D model and turning it into a Gaussian splat is probably the most genius way to use it for filmmaking, for now at least. You know, because as we start looking at it, I mean, one of the things, because you know how they have the, you know, the floating artifacts and all that stuff? If we can remove those floating artifacts... and most of those floating artifacts, or at least from what my experience has been with Gaussian splats, is
that it's either we forget, we don't get certain [00:50:00] angles, or we don't get, you know, enough coverage, you know. And so if we're actually doing it from the 3D model, we're going to get all of the coverage, especially if we, we, we figure out the system for, um, you know, for what angles are, are necessary, you know. Right.
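Getting "all of the coverage" when rendering a 3D model into training views for a splat is mostly a matter of distributing synthetic cameras evenly around it; a Fibonacci sphere is one common recipe for that. This is only a sketch of the idea (the radius and count are arbitrary), not any particular splat trainer's camera path:

```python
import math

def fibonacci_sphere(n: int, radius: float = 3.0):
    """Spread n camera positions evenly over a sphere around the origin,
    so a synthetic render pass sees every angle of the model. The coverage
    gaps described above are exactly what produce floaters."""
    golden = math.pi * (3 - math.sqrt(5))       # golden-angle increment
    cams = []
    for i in range(n):
        y = 1 - 2 * (i + 0.5) / n               # uniform in height, -1..1
        r = math.sqrt(1 - y * y)                # ring radius at that height
        theta = golden * i
        cams.append((radius * r * math.cos(theta),
                     radius * y,
                     radius * r * math.sin(theta)))
    return cams
```

Each position would then be rendered looking back at the origin, giving the splat optimizer a view set with no missing angles.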
That was more from the theater. Yeah. Well, that was more from this weekend. Just trying to animate a Gaussian splat. You know, I mean, that's what I was kind of looking at was, okay, so if I was to make a 3D model of myself, or let's say have multiple camera cameras from different angles, and then, um, you know, be able to take that.
I mean, I was going based off of that paper that you that you shared with me. You know, on the method that they use, but then I was trying to see, well, how would I do it? You know, I mean, how would I actually do it? And it would have to take multiple cameras from multiple angles at the same time, kind of like the whole, um, um, you know, how they did the matrix and they just kind of did that whole, you know, uh, [00:51:00] wraparound thing.
Um, it would have to kind of be that same type of set situation, or at least that's how it started. Right. Eventually I actually started to get an understanding of how, It could be done easier than that without, you know, a higher end setup, you know, which I'm still going to try to perfect. And once I perfected, then I'll explain it all, you know, but I might be going just down the rabbit hole, but I do believe that there's something there because, I mean, what we were looking at was doing, um, a VR movie.
Um, and by doing a VR movie, the biggest problem we've had is, or what I've seen from looking at VR movies is that if you're looking at a character acting, and if you move around, it's a 2d plane. You know, so you're looking at a 2d plane and that's a problem. So you need to be able to, let's say if you move forward and the coordinates are relayed back to, uh, you know, to where the camera location or your view location is to where the lighting will automatically, you know, be there.
I mean, that's something that we were looking at. So one of the [00:52:00] things, just to let you know, so as far as, uh, animating a Gaussian splat, we kind of figured out a different way to do it. It would be more to have a camera rig around the actor's head, and we basically get all of those angles from his head. And then we just, uh, CG his, his body, you know, to where then you could actually walk around them.
Cause then, like you said, is like one of the things, in looking at, you know, let's say, uh, a character, is the face. If you don't get the face right... you know, the clothes, nobody looks at the clothes and says, hey, you know what, those clothes are fake, you know? I mean, it's kind of like even with the Avengers movies, most of the suits were, were CG, you know what I mean?
But when you looked at it and you watched it, you really didn't notice until you were told, like, hey, that was CG. Maybe.
**Eliot:** Yeah, you know, in some really bad way, but you know, you can see, you can see it being able to train it off [00:53:00] off like a real costume and but then being able to adjust back and forth. Yeah.
Yeah. You preserve this. This is what you really want.
**Navaz:** Yeah. And so you being that we're looking at more from a practical standpoint of we're not talking about putting a space age, you know, or, or sci fi costume on somebody. We're talking about taking somebody that's wearing regular clothes, putting a cowboy outfit on or, you know, You know, putting a, you know, a, just a suit on them, you know, something very, you know, subtle to where, you know, somebody watching it wouldn't expect us to do CG, you know, a CG, uh, outfit.
Yeah.
**Eliot:** Yeah. And if you get as close as possible to the costume and then, then kind of do, um, and then modify it as, as you need, because then it's, then it's a much closer to one to one sort of, sort of thing. Yeah. And that's what makes that
**Navaz:** thing so, so amazing is that the fact that it can do that. And get the movements and, you know, in the, in the flow with it too.
You know, I mean, that's what we were, we were like, wow, that's pretty good. I like,
**Eliot:** I like this. 'cause it's, it's, it's borrowing, it's, [00:54:00] you know, you're still doing, we're, we're keeping the pieces of traditional filmmaking that you want to keep, like, which is, you know, actor and, and camera and things like that.
Yeah. And then, and kind of, you know, using AI to, to fix the things that otherwise really intractable to get, get right. Which is, I mean, it's hard to do really great at costumes. It's really hard and it's one of the first things you see in a low budget production is, is, you know, the costuming is, is whereas the, the richness of, I don't know, you see the.
All the other really good ones, you know, you just, they're just amazing. So, um, this is, that's really interesting. That's, that's really, I'll be curious. You know, I'd say, I'd say take a look at the, at the, um, you know, the 4D Gaussian splat stuff. Sometimes on this R&D stuff, the best way to handle it is you're like, that's great!
And I'm going to wait for about another, another six months for it to cook. Yeah. It's going to be great. You know, you try to get on to the early part and it's, it's like trying to ride a crazy bull all over the place.
**Navaz:** Well, that's kind of like when I started looking at it, I'm looking at it more [00:55:00] from a perspective.
At first I was like, okay, I'm going to do, I'm going to try to replicate exactly what they did and see how I can make it work. But then after a while, I was like, you know what, when it really does come down to it is it comes down to, I mean, with filmmaking, it's more of the illusion that what we're trying to show the audience is real, you know, so do we necessarily need that, you know, not really the way that they're doing it, you know, there are other ways to do it.
Like, let's say, like, what we were talking about with Vigil: if we were going to do, you know, uh, a 4D animation where basically in VR you're walking around a character, all we need is their head. You know, that's really all we need. If we can get their head, then we can actually use Vigil to put the body on, or, or not Vigil, but we just make their body, you know, a CG character. That's achieving somewhat the same, um, same goal, you know, right?
Which I don't know if anybody's done that yet, but especially in a VR [00:56:00] environment, it's... But, you know, we're gonna try it,
**Eliot:** you know? This is great.
**Navaz:** This is
**Eliot:** great. I, I like, I like the, uh, I like the approach. Um, I like it a lot. And the, the Vigil thing's pretty nuts. It, I mean, I've seen variations as it, as it gets closer and, and better.
Uh, and it's, it's, it really is kind of getting wild. .
**Navaz:** Yeah.
**Eliot:** Um, alright, well, well fast and oh, and I, I bought my, uh, I bought my, uh, headset. My, uh, whatever the, uh. Uh, so I, I haven't even had time to pull it out of the box. So I'm going to have to, I'm going to look at that. I've been thinking,
**Navaz:** it's kind of good that you don't, you have to actually put it, put aside a lot of time because once you do it, you're going to be stuck in it.
Yeah. Cause I'll just tell you from experience is like we run it till the battery dies
**Eliot:** every single
**Navaz:** time, you know, and, and, and, I mean, like I said, I think, I mean, it's pretty [00:57:00] simple on setting it up. You just got to make sure you have Steam, uh, SteamVR on your, on your desktop, you have to make sure you buy Virtual Desktop, and then you just follow the steps that I kind of laid out.
And once you do that, I mean, it's pretty easy, you know, I mean, once you get it to that point of. Of being able to, to, uh, remove the background, you know, cause I mean, in virtual desktop, it gives you the, the, the option to change the opacity. So let's say if you want to be fully immersed into the VR environment, you know, you could just say opacity, you know, zero or whatever, you know, but then let's say, if you want to be able to still see the world around you somewhat, you can go at 50, you know, you, I mean, the good thing is that they actually give you that option of changing it.
And then being that the color of the background, um, the virtual background, let's say in blender, you can change the world, um, background to let's say green. So that way it'll key out all of the green. [00:58:00] Does that make sense? I hope I didn't confuse you, but if you have any problems, we can walk you through it, you know, but once you see it, it's going to be one of those things to where you're going to be like, okay, this is.
This is the future of filmmaking. And it's kind of interesting. Cause yes, last night we were actually talking with the director about it. And he was just like, wow, you know, he's like, just to be able to have that control in his hand, you know, without having to learn everything and going back and forth, because I mean, it's the real time aspect that really sells people.
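The green-world trick described above, setting Blender's world background to pure green so the passthrough view can key it out, comes down to an ordinary chroma-key test per pixel. A deliberately naive sketch (real keyers handle spill and soft edges far more carefully, and the margin value here is a guess):

```python
import numpy as np

def green_key_mask(rgb: np.ndarray, margin: int = 40) -> np.ndarray:
    """True where a pixel is 'mostly green', i.e. G exceeds both R and B by
    a margin. This is the naive core of keying out a green world background."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return (g - np.maximum(r, b)) > margin

img = np.zeros((2, 2, 3), dtype=np.uint8)
img[0] = (0, 255, 0)        # pure-green row: keyed out (True)
img[1] = (200, 180, 160)    # skin-ish row: kept (False)
print(green_key_mask(img))
```

The opacity slider being described is then just how much of the keyed region is replaced by passthrough versus left as rendered background.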
You know, because being able to change it in real time and see what the scale is. Cause I mean, that was like one of the things that me and Roman were doing. We were like, it looks about right. You know, we were guesstimating a lot of things, you know, to like the scales looks right. You know, in the finished product, it'll look okay.
But if we're talking about trying to replicate physical environments perfectly, you know, in the distance and everything, it's like, that's the one way to do it, you know, is to [00:59:00] actually see what it's going to look like beforehand. You know, are you
**Eliot:** guys on, on PCs or Macs? Lemme just remember what, what, what your, what your setup is.
**Navaz:** Well, I'm on a PC, and Roman's on a Mac.
**Eliot:** Okay. And do you generally run the SteamVR? Is that on a PC or is that run on a Mac? I don't, I don't remember how it works.
**Navaz:** Uh, on which, which one? The, the, uh, the vr,
**Eliot:** the, yeah, the, the Blender stuff. Are, are you, um, are you, are you mostly running that on or does it matter?
It works on both.
**Navaz:** We're kind of going Oh, with, with the vr, you have to run it on pc.
**Eliot:** Okay.
**Navaz:** And we tried it on Mac and the problem is, is that when you, we get to a certain point when you're, you're, uh, in Blender, their add on does not recognize, uh, you know, for Mac for some reason. I mean, you can get all the way to that point to where you're about to, to switch over and then it won't switch over.
And I haven't figured out, I didn't even want to spend time figuring it out, but I'm pretty sure I could figure out how to make it work. But, um, once you get to that VR point, like [01:00:00] you have to basically in Blender hit start VR. And for some reason in Mac, it won't work, you know, or it doesn't work natively.
And then in PC, it's like, it works flawlessly. So,
**Eliot:** that's why I was just like, you know. When I go into it, I'll start looking at it. I'm coming up with ideas for how to get the coordinate systems to lock. Like, that's, that's the thing that I, I just have to. I have to like, do a few cycles. Um, Okay, this is, this is really cool and I, I may need to jump here because I need to, um, the, the previous folks, uh, found a bug and I need to get, get all this stuff compiled and sent to Chris.
Oh, yeah, no problem. Where I get nailed by another call, but that was
**Navaz:** Oh, also, uh, Conrad mentioned that we can get on test flight. Because I did notice when we were there, he had, uh He had a version of, of auto shot, but I've never seen it. I'm like, what the hell? Like he was able to change around the size of his green screen and stuff.
And I was like, I want that dude. I was like, Oh, like, is that only for iPad? I was like, he's like, no, it's like, dude, this is from test flight. [01:01:00] Yeah, this, this, this
**Eliot:** is the, uh, so the, I mean, TestFlight, for you, we have some of the internal test builds and stuff like that before we, we unleash them upon the unsuspecting world.
But, uh, you know, the problem, I mean, we can fire you up on it. That, that's no problem. It's just, uh, some of them, some of them are a little raw. So I would just say, you know, take it with a giant, giant grain of salt. Like, yeah, we want the early builds, and some of the problem is that, well, some of them blow up and burst into flames.
Um, so, but you are able to roll back. Right. Yeah, yeah, we can, we can roll back versions pretty easily in TestFlight. So yeah, I can, uh, I mean, what I'll do is, are the emails that I have of you, are those also your Apple IDs? Because I need the Apple ID of the system to use it.
**Navaz:** Um, let me see. That's a good question.
Let me see what my Apple ID is. Ooh, it will be, yeah, it's the same. nwestdallingatel. com
**Eliot:** Okay, yeah, so let me get you that on TestFlight. All right. [01:02:00] So I'll send that, we'll send the link invite because yeah, that's, that's what he was, because we have, I mean, you saw it, we have a job. We've been rebuilding the UI of Jet Set for the past month and change.
Um, it's a fairly monster change, a good change here. We fixed a ton of stuff, uh, in that, uh, but it's also completely undocumented. Okay. When I, when I came
**Navaz:** to the show, I was like, I was like, dude, why does mine not do this? I was like, that's, that's awesome. And he was like, oh yeah, that's from test flight.
I was like, dude, when are they releasing? They should be releasing this like right now, you know, this is cool.
**Eliot:** It's it's you know We put it on test flight and we run it and then and find out where all the smoke comes out And because we don't yeah, we don't want to because you think big changes like this You don't want to mess up somebody you really don't want to mess up somebody in production.
That's what I yeah Very, very, very worried about, about doing that. Now I say that then of course, some of the production people need the stuff that's in test flights. So there we are, but yeah, let me do that. So I'll get on that. And, um, and then let me get this call [01:03:00] sent over to Greg. So he's got this to, to fix, but this is, this is wild.
You know, it's, it's the digital costuming is one of the last sort of sorts of pieces. So I I'm curious to see, I wish you luck on the multi, the like multi side stuff. That's, That's a big bear. I will, I've, I've looked at that a little bit and I'm like, you know what? Let's nail 2d filmmaking first. Yeah, it's hard enough, but if you figure out a
**Navaz:** way in and then the more power to you because it's it's well That's what kind of like like got got me hooked is like I was like, okay I'm gonna try to figure this out over the weekend And then I was like, damn, I had to actually stop myself and was like, okay, so it's Monday.
I gotta, I gotta work on other stuff, you know, cause I probably could spend about two weeks working on it, getting it down. Perfect. You know, but yeah.
**Eliot:** Yeah. Yeah. It's, uh, I honestly, the digital costuming thing, that's, that's it. Like figuring out how to weave those pieces together where you have a virtual backgrounds, like live action character in a digital costume.
I mean, right there. [01:04:00] That's, that's a production workflow that you, you can start to, you know, show your director and, and, you know, they're going to go, okay, we're making a movie. And then you're going to disappear for a long time.
All right. Good to see you. I'll talk to you soon. I'll
**Navaz:** see you tomorrow.
**Eliot:** All right.
**Navaz:** All right. Bye.