Transcript
Office Hours 2024-08-12
===
[00:00:00] Eliot: Uh, yeah, I'll try. That's right. If we need to screen share, we can screen share or whatnot. Um, so, okay. So what you're looking at is solving, uh, yeah, this was the one where, again, it solves on Windows, but you want to process this on Mac. So we want to go through doing the camera solve on Windows, then copying that file
[00:00:22] Eliot: over to the Mac version, then hopefully everything will work
[00:00:26] JP: cleanly. Exactly. Exactly. Now that's what I'm having trouble with at the moment with one of my colleagues who has a Windows computer. Um, but first of all, before, um, I even show you the screenshots and, um, the Dropbox, I'll share with you the Dropbox link again, um, to refresh your memory.
[00:00:42] JP: Oh, yes. Where would I find this cine offset JSON file, if I can call it that? Let's say it gets computed by Windows. In which folder would we be able to locate it? Let
[00:00:58] Eliot: me take a quick [00:01:00] look. Let me close some of my windows here. Let's see. We have Alright, so there's an Autoshot build. And let's actually look through this.
[00:01:13] Eliot: And I'll have to refresh my memory as well. Let me put this, let me put this on Zoom. Um, alright. There's Zoom. I'm sure I'm gonna learn a couple things too. Alright, there's screen one, screen two. Okay, so let's see. So, um, we have our take and then let's open our take. Let's go look for it. All right. So then, uh, I think the solve is actually upstream of that.
[00:01:47] Eliot: So project, um, all right. So I believe it puts it in calibration. So that's the, the rock calibration file. And then when it does its solve, we can actually check that. So I'm just going to [00:02:00] try and click re-solve. Oh, this one is going to do the feature extractions and kind of go through the whole process, and then we can actually look to see where it's putting the files.
[00:02:09] Eliot: I'm going to be more certain of that.
[00:02:15] Eliot: All right. Cook through that for a second. No worries. And you can see my screen, right? Yeah, I can see it perfectly. Okay, great. Great. This is an excellent question.
[00:02:33] JP: While this is computing, um, what is it you call the features? Again, just to refresh your memory, because it was, I think, two weeks ago when I, um, what's it, sent the Dropbox links. But have you ever seen the solve, or in your case the re-solve, button being grayed out and displaying a message, um, or [00:03:00] showing the word missing where the solve button should be? Oh,
[00:03:06] Eliot: that's interesting.
[00:03:07] JP: That's what we've experienced.
[00:03:09] Eliot: Um. Okay, so here we go. Compute calibration offsets. That's where I put it. So project, sequences, project calibration, and I click on that. And the way we have it, so we can find everything that we generate, these blue links, you click one and it'll open up your file explorer to go directly there.
[00:03:32] Eliot: So, uh, shoot new fix dot JSON, uh, solve dot JSON. Okay, so here it generated that. So if you're working in your own project, you can see where we are in the project hierarchy. Mm-hmm. Interesting. So it put it under sequences, project, and calibration. And then it generated this solve JSON, and you should be able to copy that from one of your [00:04:00] devices to the other device.
[00:04:01] Eliot: And that's the equivalent of a cine offset JSON file. Um, let's see. I think this is a little bit different; the cine one, we'd have to look at that carefully, because the files end up being specific. So I'll pull this open so you can kind of see what's in there. Um, okay, so here is a simple radial model, and x_fov; fx and fy are the focal lengths in pixels.
[00:04:28] Eliot: I'm gonna switch this to word wrap view. Where is my word wrap? There we go. Okay. So this is a lens calibration solve. So, see, this x_fov is the horizontal field of view; fx and fy, those are the focal lengths in pixels; cx and cy are the center offsets; k is our initial distortion value.
[00:04:54] Eliot: And for the basic Jetset solve, we just do a k1 distortion [00:05:00] value. These are all the things that were part of the solve. And m is the offset matrix. So this is it; this is exactly what you would want to copy over from your Windows machine to your Mac machine. Put it in the same directory and it should show up.
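For reference, a rough sketch of the kind of fields such a lens-calibration solve JSON carries, based only on what is described above; the exact key names and schema here are assumptions for illustration, not AutoShot's actual file format:

```python
import json

# Hypothetical structure of a solve JSON, mirroring the fields discussed above.
# Key names and values are illustrative assumptions, not AutoShot's exact schema.
example_solve = {
    "x_fov": 54.3,          # horizontal field of view, degrees
    "fx": 2870.5,           # focal length in pixels (x)
    "fy": 2870.5,           # focal length in pixels (y)
    "cx": 8.2,              # optical center offset (x)
    "cy": -4.7,             # optical center offset (y)
    "k1": -0.0312,          # single radial distortion term used by the basic Jetset solve
    "m": [[1, 0, 0, 0],     # offset matrix
          [0, 1, 0, 0],
          [0, 0, 1, 0],
          [0, 0, 0, 1]],
}

print(json.dumps(example_solve, indent=2))
```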
[00:05:18] Eliot: You want to try it? Um, as in, this isn't from my shot, is it? Uh, no, this is from... actually, I can load your shot. Let me just do that; that makes more sense. If you still have it, of course, that would be super helpful. Yeah. And let's just load that and see if we can't find it.
[00:05:38] Eliot: Um, let's see if I can find it first.
[00:05:41] JP: So if you want, I can share with you the Dropbox link that contains that folder, if you need to access it again. All right. So
[00:05:51] Eliot: this is, I think this is your JPTS?
[00:05:51] JP: Yeah, JPTS, that's me.
[00:05:52] Eliot: Alright, and it's calibration test. Okay, so let [00:06:00] me load that one up. So, let's go to our user files.
[00:06:08] Eliot: And, there it is. Where are you? There you are. Alright, so I'm going to set that folder. There we go. There's cine camera, cine proxy. Okay. And so let's then look at our calibration. Um, okay. So this one was on a Blackmagic URSA. Uh, and let's take a look at the take real quick. Does that look like the correct take?
[00:06:35] Eliot: Yep. Great. All right. So let's just re-solve that.[00:07:00]
[00:07:01] Eliot: Yeah, so what I'll do is I'll solve this, um, and then I'll send you the solved JSON file and we can try it on your machine. You can screen share and we'll get it in the correct directory on the Mac, and let's make sure it works. Kind of close the loop on it. Yeah, please.
[00:07:27] JP: So I'm just going to send you a link on the chat side of Zoom with, uh, what's it, the Dropbox links of the folder containing an issue that my colleague had on his Windows platform, where the solve button was greyed out and it had the word missing on it. Oh
[00:07:48] Eliot: yeah, yeah, uh, if he can just zip up the take and send the take, we can actually look at it here if he wants.
[00:07:57] JP: I mean, I believe, um, kind of in that [00:08:00] Dropbox link which I've just sent you, I believe I've got the take zipped up. Oh, good. Screenshots. And, um, what's it, the console log information, which was copied and pasted into a Word document, shall we say, in case there's any data there that's of interest.
[00:08:20] JP: Perfect. Perfect.
[00:08:21] Eliot: Yeah, you can see the solvers just like going, like going over and over trying to, trying to get this thing to, to, to solve.
[00:08:28] JP: Yeah, this is what you were telling me. It was taking forever and a day, um, just to kind of compute.
[00:08:33] Eliot: Yeah, and it's a numerical solver, so it iterates to try to find a, you know, a minimum, you know, minimum error solution.
[00:08:40] Eliot: Uh, but, uh, solvers, you know, if the mathematics are a little on the edge, then solvers can go or not go depending on what they're doing. Yep. Still going. It's like, there's a solution in here somewhere. It's just [00:09:00] iterating at this point. I think we have it set to time out at 50 iterations and just take it as good as we can get with that.
[00:09:07] Eliot: You know, otherwise you're there for hours trying to get it to solve.
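Conceptually, that 50-iteration cap is just an iteration limit on a numerical minimizer; a minimal sketch of the cap-and-keep-best idea (this is a generic toy loop, not AutoShot's actual solver):

```python
import random

# Generic iterative minimizer with an iteration cap, in the spirit of
# "time out at 50 iterations and keep the best solution found so far".
def iterative_solve(residual_fn, params, max_iterations=50, tolerance=1e-9, step=0.1):
    best, best_err = list(params), residual_fn(params)
    for _ in range(max_iterations):
        # A random perturbation stands in for a real refinement step.
        candidate = [p + random.uniform(-step, step) for p in best]
        err = residual_fn(candidate)
        if err < best_err:
            best, best_err = candidate, err
        if best_err < tolerance:
            break  # converged early; no need to burn the remaining iterations
    return best, best_err

# Toy usage: minimize (x - 2)^2 + (y + 1)^2.
params, error = iterative_solve(lambda p: (p[0] - 2) ** 2 + (p[1] + 1) ** 2, [0.0, 0.0])
print(params, error)
```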
[00:09:13] JP: Yeah. Okay. Wow. I thought it might take maybe 10 minutes at most, you know, based on your last message about how long it took to solve. I'd forgotten it was that long.
[00:09:27] Eliot: Otherwise I would've just gone ahead, because it may have solved it already.
[00:09:30] Eliot: Like it probably already, uh, you know, yeah, I can open up a file, see if we already have the, the JSON, but we're probably in the middle of solving it, so. Um, go up on one here, sequences, project,
[00:09:48] Eliot: and there's the calibration. Oh no, there we go. Project calibration. Oh yeah. So, okay. So there's actually our solve. Let's see if it's okay. [00:10:00] Oh, there we go. So that finished. So we'll just go to this and click on this... still working. Compute calibration offset. Is it thinking... we had an internal error?
[00:10:17] Eliot: Yikes. All right. So let's see what that one was. Um, I'm going to send that to Greg.
[00:10:30] Eliot: Okay, but, well, we, we got it, we got a calibration solve out of it. So let me send that to you. Um, and then I will get a couple of these other pieces done. All right. So there is the,
[00:10:52] Eliot: okay, let's go find where that solve was. Let's [00:11:00] see.
[00:11:05] Eliot: Okay. And there's AutoShot in the background. Okay. There is the project and then there is calibration and, oh, it's sequences. Uh,
[00:11:22] JP: Sequences were always
[00:11:24] Eliot: sequences. Okay. There it is. Okay, so there's our solve. So let me send that to you. Then
[00:11:32] JP: I'll just. Type your email in case you don't have it to hand.
[00:11:36] JP: All right, that sounds good. You can send that to me. So that's, that's on the Zoom chat. Okay, great. I got it. So let me send an email over there. And forgive me, the first letter should be a lowercase p, but yeah. Okay,[00:12:00]
[00:12:00] Eliot: there's the solve. There is your,
[00:12:07] Eliot: there we go,
[00:12:27] Eliot: there we go. Okay, so that's sent to you. Thank you. I'm going to, meanwhile, I'm going to copy this error over, so for reference, uh, because this is one that we're going to want to chase down. Give me a second to grab that.
[00:12:43] JP: No worries.
[00:12:47] JP: Yeah, I received your email.
[00:12:49] Eliot: Okay, great. So let me get this, set this up real quick. So solve for Greg,[00:13:00]
[00:13:49] JP: let me know when you're available; I want to ask you one more thing about, um, what's it, my friend's experience on the Windows platform. Uh, yeah, just a
[00:13:58] Eliot: moment. [00:14:00] Yeah, of course. I'll just send up a couple of pieces so I have it all in flow. So this one was Cal test. Let's see... K05, B94.
[00:14:13] Eliot: Okay. So that one. All right,
[00:14:20] Eliot: it's making a couple quick notes so I can reference this take back to the, to the actual matching error.
[00:14:35] Eliot: Okay.
[00:14:48] Eliot: Okay. All
[00:14:54] JP: right. Okay. So, uh, all right. What's next? So, um, again in the [00:15:00] chat, I've sent you... no, that's the wrong thing. Okay. Bear with me. Let me just send you the link to, um, there it is. The forums roadmap. Let's see. No, that's the wrong thing. I copied the wrong link. Oh, no, ignore that one, but I'm just going to share the Dropbox links
[00:15:15] JP: with you, but I'm also going to share my screen just so you can have a visual. Um, so yeah, cool. Uh, okay. So let's just get out of this. So this is the issue that, um, okay, I think I sent this; this is what my colleague was seeing on his end. When we hit the scan button, that worked perfectly, but we were unable to do a solve, and it was saying something was
[00:15:45] Eliot: missing here. Strange. It sounds like it's the actual calibration solve it's looking for.
[00:15:50] Eliot: Okay. Uh, let's see Is that uh, is that link in the drop box folder?
[00:15:54] JP: Yeah, so that's the link to the Dropbox folder, and here is just what was on the console side of [00:16:00] AutoShot, which we thought, okay, maybe this might be useful. Um, again, yeah, go look for that; there's some details there. And if I come out of there, this is the take folder on his end.
[00:16:16] JP: Cause, um, I had to explain to him how we're going to set up the folders, because, silly of us, when he left the set team, he didn't actually copy the whole project, because it's a huge file. He only copied what he thought was relevant. Oh, gotcha. Gotcha. Ah, okay. That might be key here.
[00:16:36] JP: It could be, yeah. So the key could be there, as in we probably mistyped or wrongly named folders, or we don't have the right folder. Yeah,
[00:16:45] Eliot: I bet it's missing the calibrations. That's what it's missing. Um, possibly. So on that one, on that take that I just saw, that's the one we've been looking at, that is, uh, B94.
[00:16:56] Eliot: So let me look in the forums to see if there is a, a different. [00:17:00] A different take that we're looking at. Uh, let's see.
[00:17:04] JP: I believe it's the same one. It is the same one. Um, I was following your suggestion about, you know, solving it on the Windows platform and then just getting the JSON file.
[00:17:19] JP: And that's where we struggled. Um, that's what the issues that he was finding on his end.
[00:17:24] Eliot: Now, you sent me a take zip and that had everything in it, so I was able to solve it. If you do File, uh, Zip Up Take, that will include all of the different pieces, including the calibration, to completely run it.
[00:17:35] Eliot: And that's why I was able to run that take on my, my end. So the link that you sent me should contain all the things, things that you would need to run it on windows. Right. Okay. So would that, okay. So, all right, fair enough. So it could be that. If it's the same take, if it's a different take, that's a different story.
[00:17:52] Eliot: But if it's the same take, then we just solved it. You just, you just watched my solver kind of grind through it.
[00:17:59] JP: So you're [00:18:00] saying that missing file might just be something else. Well, let's, let's
[00:18:05] Eliot: try to find it. Um, let's see. So is it, um, let me look at this. So first of all, let me skip the correct link. Uh, is that,
[00:18:20] Eliot: let's see,
[00:18:24] Eliot: so can you text me, or can you put in the chat, the link in the forums I should be looking at for this missing-solve problem, just so I'm looking at the right spot? Uh, bear with me. Sorry, can you repeat that again? Uh, could you just put in the chat, yeah, the Dropbox link of the take that has the missing solve?
[00:18:52] Eliot: So that way I can, I can go back to the, to the source and figure out what's going on. All right, let me change the [00:19:00] settings. And you can share your screen if it helps. Sometimes that helps.
[00:19:06] JP: Yeah, okay, cool. Give me a sec. All right. Let's share my screen. So we're here. All right.
[00:19:19] Eliot: So this is the
[00:19:20] JP: take.
[00:19:21] Eliot: Okay.
[00:19:21] Eliot: Cal test, scene 101, take five. And then B-E 94564, 94564, 3D0. Yeah. Okay. So that's the same take we've been working with. Uh, the
[00:19:34] JP: only. The only thing is I didn't zip up this take and send it to him, because he already had, um, for example, the BRAW file from the take on his computer. Sadly, when he came off set, he didn't actually copy over all of the project folders.
[00:19:55] JP: So when, um, I spoke to him online, we were [00:20:00] trying to recreate the project folder directory, which AutoShot would automatically do by itself, in order to, you know, run the scan and solve and find all these, uh, the proxy files and what have you. So even though you have this take, you have my version of it, but you don't have his version.
[00:20:23] JP: And maybe by download, downloading his version, you might be able to tell us where we went wrong because we were recreating the folder
[00:20:31] Eliot: structure. It might be simpler for you to just send him the zipped take, then. When it unzips, it automatically unzips into a reasonable folder structure; you saw me just open up your take and, you know, everything works.
[00:20:46] JP: Yeah. You know what? That's probably a good shout. That's something we can do rather than, yeah, have you kind of look into this. Cool. Um, so it was those two queries. So I'll test that with him. And if there's still an [00:21:00] issue, then I'll get back to you on
[00:21:01] Eliot: it. Yeah. And then, um, you would just copy that solve JSON file that I sent you to the corresponding project folder inside the sequences on the new one.
[00:21:14] Eliot: And then it should show up. It should show up as, you know, you'll get a re-solve button. You don't need to re-solve it; it's already solved. Um, and in fact, you wouldn't have the raw materials there to re-solve it. The raw materials are in the same file name, but with a zip, that has all the raw materials, the individual still frames for the solve.
[00:21:35] Eliot: So once you load it onto the new machine, you don't need to re-solve it. Uh, and then you should be able to just run your take. The Save and Run, right?
[00:21:43] JP: Yeah.
[00:21:43] Eliot: Okay. Yeah. All right. Well,
[00:21:45] JP: I'm gonna, I'm gonna try that out right now because I think I still have to download the new version of, um, AutoShot.
[00:21:52] JP: Um, but you're going to be here for the next half hour, right? Are these, uh, surgery slot sessions only for an hour, or for an hour and a [00:22:00] half?
[00:22:00] Eliot: So I'm here for another hour. Uh, so go ahead and try it and then jump back on if you run into, run into any problems and we should be able to close the loop pretty fast.
[00:22:07] Eliot: Fantastic,
[00:22:08] JP: I'll do that. Okay, so I'm just going to end this call now. Thank you so much, uh, Eliot, for your help, and I'll report back with either a positive message, or a positive outcome I should say, or a negative one. Either way, we can fix it. All right, good to
[00:22:20] Eliot: see you. All right, take care. Bye bye.
[00:22:28] Eliot: And it looks like, Ryan, you're still here?
[00:22:32] Ryan: I am.
[00:22:32] Eliot: All right, fantastic.
[00:22:36] Ryan: Yeah, I just, uh, I was going to announce myself, but you guys were deep in the conversation, so I was letting it go in the background, and if it didn't wrap up in time, no problem. So, um, I just got my hardware delivered today.
[00:22:49] Ryan: So I'll be setting up the C Blue, um, hopefully this afternoon I'll drive down to the studio and, Start working with the camera.
[00:22:56] Eliot: Fantastic. Yeah,
[00:22:58] Ryan: so one of the things, you know, [00:23:00] I've been thinking about it a lot over the weekend of like trying to augment the system the way it's built to the way I need it.
[00:23:07] Ryan: Um, and one of the things that I thought of was, and you kind of brought it up with the last guy that you were working with, is like there's a raw section and then there's a proxy section. And I'm curious if there's a way, like, can I trick the system by essentially making my own proxies from the raw footage for the Canon and then saying, that's my source media that AutoShot should use. But in the round trip, like you were doing in Fusion, when we run the Fusion script, it's going to make a loader; can I say, yes, the loader is now what AutoShot created, and just change that loader back to what I want my loader to be?
[00:23:47] Eliot: Absolutely.
[00:23:48] Ryan: So the Fusion end of it, I was like, I can totally make that work. Do you have a list of codecs that it'll see as a source? So, what AutoShot will see as its actual source [00:24:00] media when we try to correspond the takes to each other.
[00:24:03] Eliot: Yeah. Yeah, actually it's, um, most of the MP4s, cause those are all fairly standard, and the ProRes ones, we can read ProRes. Um, and we can read the MXF ones.
[00:24:18] Eliot: I know, I know those three we can, we can all read.
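As a rough illustration of pre-filtering source folders by those container types (an assumption for illustration only; AutoShot's real detection may inspect the files themselves):

```python
from pathlib import Path

# Hypothetical extension-based pre-filter for candidate cine sources.
READABLE_EXTENSIONS = {".mp4", ".mov", ".mxf"}  # MP4, ProRes (typically .mov), MXF

def find_candidate_sources(folder: str):
    return sorted(p for p in Path(folder).rglob("*")
                  if p.suffix.lower() in READABLE_EXTENSIONS)

for clip in find_candidate_sources("./cine_footage"):  # placeholder folder name
    print(clip)
```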
[00:24:20] Ryan: So, yeah, my first test today when I get everything up and running was going to be to see if I could use what we typically make as proxies, MP4 files, just because they're lightweight, they're easy to keep on disk and not have to carry with the project file.
[00:24:37] Eliot: Those should work fine.
[00:24:38] Ryan: Would resolution dependency come into play? Meaning, if I'm shooting at UHD, can I make my proxies at 1080p? And it's a two-to-one conversion, so I'm feeding the system a half-res file, but the pixels map the same, just blown [00:25:00] up two times. There's not an aspect change.
[00:25:03] Eliot: Yeah, that, I think that should be fine. Um, as we go through it, I think we'll have to experiment with it. I contacted Canon to get their raw SDK. It's going to take a little while; there's a decent chunk of work,
[00:25:14] Eliot: but I like this approach that you're doing, because it'll be fast.
[00:25:18] Eliot: And then, if you use our Fusion setup out of AutoShot to make the loaders and hook everything together, you can point the Fusion loader at a different resolution, a different sequence of EXRs, and it should be fine.
[00:25:34] Eliot: Okay.
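For what it's worth, a minimal sketch of making half-res MP4 proxies like that with ffmpeg via Python (assumes ffmpeg is on the PATH; folder names and encoder settings are illustrative, not a required setup):

```python
import subprocess
from pathlib import Path

# Sketch: 1080p H.264 proxies from UHD source clips, preserving aspect ratio.
def make_proxy(src: Path, proxy_dir: Path) -> Path:
    proxy_dir.mkdir(parents=True, exist_ok=True)
    out = proxy_dir / (src.stem + "_proxy.mp4")
    subprocess.run([
        "ffmpeg", "-y", "-i", str(src),
        "-vf", "scale=1920:-2",        # half of UHD width; -2 keeps the height even
        "-c:v", "libx264", "-crf", "18",
        "-c:a", "aac",
        str(out),
    ], check=True)
    return out

for clip in Path("./camera_original").glob("*.mov"):   # placeholder source folder
    make_proxy(clip, Path("./proxies"))
```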
[00:25:34] Ryan: Awesome. Cause like what, you know, typically this is again, I kind of go down these rabbit holes of like the way I've normally done things and trying to patch it into the way that you have everything organized. And again, 99 percent of the time I've come up with something. It's well, I can get into your files, your file structure, and I can just steal what I need to from it.
[00:25:54] Ryan: So, you know, in my head, the workflow would be: create my proxies the way I normally do, [00:26:00] send that proxy folder as the folder for AutoShot to match to our cine takes,
[00:26:08] Ryan: run AutoShot the way it's designed. And I probably would just use the PNG sequences, only because it's faster. Like, I don't need the EXRs from AutoShot for my workflow.
[00:26:17] Ryan: I'm just trying to get through the system to get the solve into either Omniverse or Cinema 4D. Do the round trip with the new Lua file back into fusion.
[00:26:28] Eliot: Mm-Hmm. .
[00:26:28] Ryan: And then in a separate process, what I typically do is I will make EXR sequences from a denoise process in Resolve. Mm-hmm. Yeah. So those frames have already been processed through a denoise, and it just makes keying so much faster than having to wait for the denoise on every frame while I'm compositing.
[00:26:45] Eliot: Right. Right. So I
[00:26:46] Ryan: would, you know, essentially trick the system into using the EXR files that I've manually denoised. Um, and that would be my full workflow, the round trip, [00:27:00] essentially. I think that should
[00:27:00] Eliot: be fine. I mean, that's the great thing about, that's why we kind of decompose things to image sequences, because then it always works, you know?
[00:27:10] Eliot: You know, you just, you go, okay, that's frame one and away we go. And so then you can kind of mix and match, mix and match what you need.
[00:27:17] Ryan: So, within that round trip, when I create the PNGs from AutoShot, can I essentially delete those, or will it basically just come up missing, like, can't find, relink this texture, inside of Cinema or Blender?
[00:27:35] Eliot: Right, right. You'll need to experiment a little bit with that. I mean, one of the things you can do is uncheck the image plate. So there's a checkbox in AutoShot that says, you know, add an image plane. Um, and that's in Cinema 4D; I have to stop and remember how we do it.
[00:27:54] Eliot: We may have had to do it as background images in Cinema 4D, because I don't think it supported animated [00:28:00] textures on the image plate. Uh, but I think you uncheck the image plate box, and then I think you're just going to get the tracking data imported into Cinema 4D. Um, it'll still extract the PNGs.
[00:28:11] Eliot: Then again, we'll have to try this, but we'll be able to figure out something, because what it ultimately generates is a Python script for Cinema 4D. And so, you know, if we have to go in and look at the Python script and just delete the lines that reference the PNGs, that's all right.
[00:28:30] Eliot: You know, that's not that tricky to do. It's pretty obvious where it is. So I think that should work, and I like that. Cause that way you can just go flying ahead and you're not waiting on us to try to get our RED raw or our Canon raw workflow up and running. We want to do it.
[00:28:47] Eliot: It's just, you know, it's raw. It's going to take, you know, SDK work.
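As a purely hypothetical illustration of "delete the lines that reference the PNGs" in a generated script (the file name and the .png marker are assumptions; inspect the actual generated Cinema 4D script before editing it):

```python
from pathlib import Path

# Hypothetical: strip image-plate footage references out of a generated script.
# "jetset_c4d_import.py" is a placeholder name, not the real generated file.
script = Path("jetset_c4d_import.py")
lines = script.read_text().splitlines(keepends=True)
kept = [ln for ln in lines if ".png" not in ln.lower()]
script.with_name(script.stem + "_edited.py").write_text("".join(kept))
print(f"Removed {len(lines) - len(kept)} line(s) referencing PNGs")
```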
[00:28:52] Ryan: It just, I'm more, more or less, I'm just spit balling ideas. And as I come with up with them, if I can give you a resources to a clip or what, you know, like [00:29:00] just kind of given as much information as I can, as I work through it. And, um, hopefully that's great on the process.
[00:29:06] Ryan: Um,
[00:29:07] Eliot: I mean, this is why this is the system is built the way it is image sequences, text files and scripts. And so he can open it up and it's real easy to tell what, what it's, what it's doing at any given point. So, uh, so you can adapt it.
[00:29:19] Ryan: Perfect. Yeah, I think that's, um, that's what I'll try and try to get the round trip.
[00:29:25] Ryan: And as I test this, I'll try to just kind of start an ongoing, like, blog-type posting on the forums: here's what I'm finding, here's what we worked through. That's great. Just for anybody else that's working in Cinema. That's great. What I've done in the past, when I was doing the quick round trips without the cine process, is I would literally just copy and paste the Python camera from the scene it would load into, into a master scene to render from. Like, literally my workflow was just copy the camera into my render scene, hit render from the new camera that I copied and pasted.
[00:29:57] Ryan: So
[00:29:57] Eliot: yeah,
[00:29:58] Ryan: it worked and that was [00:30:00] working great I mean, it was all lining up everything like nothing shifted. I was like, okay, this works like this workflow completely works this way Great.
[00:30:07] Eliot: Great. I mean, this is honestly, it's great to have that posting in the forums, because we're kind of establishing all the single-shot workflows, but then of course the real project has more than one shot.
[00:30:20] Eliot: So, uh, it's great to kind of see the adaptations. And, you know, when you just copy and paste from one to the other, it's still pretty easy to sit there and do it. It's not a thing.
[00:30:31] Ryan: The hard, the hard part is like, what we're doing right now is figuring out that process. Once we get it, it's going to be like, here are the nine steps you take and it all works, you know?
[00:30:39] Ryan: Right.
[00:30:40] Eliot: Right, right. Math, math works.
[00:30:42] Ryan: Yeah,
[00:30:46] Eliot: This is why we script all the inputs, so that it comes in natively, instead of trying to get Alembic or something to work as the transfer mechanism for the camera motion, because, man, stuff breaks so fast as soon as you have FBX or Alembic in it. We just [00:31:00] code it.
[00:31:02] Eliot: So it shows up as a Cinema 4D native camera or, you know, an Unreal native camera, and then there's just not much to go wrong. If it did go wrong with it... yeah, please, please post as you find out stuff.
[00:31:13] Ryan: And, you know, if it is possible to get the image plane working in Cinema, I think that's a, it's a really cool feature in Blender and I've, like, I'm going to toy around with, can I get my scene in Blender simply for that purpose?
[00:31:26] Ryan: Like, I think that's a great, It's amazing to be able to like literally watch your scene together, already married together and be like, I can make decisions right now without even a comping yet. So
[00:31:37] Eliot: no, the Cinema 4D should have the image sequence in there, but it's, I believe we had to put it in the background,
[00:31:44] Ryan: correct image sequences can come on an image plane.
[00:31:50] Ryan: There's a trick to getting them. Like you have to change the texture to animated inside one of the property panels.
[00:31:55] Eliot: Interesting. Interesting. So like
[00:31:57] Ryan: it's an extra step inside of like [00:32:00] loading an image sequence for, um, for an image plane. So you like, you have to set your frame rate based on the project.
[00:32:07] Ryan: You import, um, a directory that points to a single file. Um, in the property panel, you can select animated and then it'll calculate frames. So it'll basically look at the folder and say, how many frames are in this folder, and that's what my image sequence length will be.
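That "calculate frames" step is essentially counting the files in the sequence folder; a tiny sketch of the same idea outside Cinema 4D (folder name and extension are placeholders):

```python
from pathlib import Path

# Count frames in an image-sequence folder to get the sequence length,
# mirroring what "calculate" does when the texture is marked as animated.
frames = sorted(Path("./plate_pngs").glob("*.png"))   # placeholder folder
print(f"{len(frames)} frames found; sequence length = {len(frames)}")
```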
[00:32:23] Eliot: If you can go start from like our, you know, our, our generated scripts, like in cinema 4d and like do a screen recording of this, this series of steps that you would use to, and to get it to show up as a, as an image plate, then we can look at, at, at converting that to scripting.
[00:32:38] Eliot: Because I, I agree. I, I, I try to, we try to standardize how the things come in in different, different apps. And I know we took a couple of good hard runs at the image plate in Cinema 4D and bounced off it. Yeah. But if you have a sequence of things, then we can look at it to see if the Python system actually supports the same set of, of actions.
[00:32:57] Eliot: And, and then we can, we can just write it, code it in. [00:33:00]
[00:33:00] Ryan: Yep, right. I can certainly take a look at that as I'm working through everything on my end. Um, I mean, what we've done in the past and, you know, before this system is we would just line up a cam, like we would know where our camera was based on measurements, project the image through that and then basically just keep moving the image plane until it would line up.
[00:33:20] Ryan: Then you lock it to the frustum, where you can scale it. It matches wherever you put it; it could be further away from the camera than it actually was, but it scales correctly for a larger shot. And, um, you know, you can make a camera move inside of the existing camera move in post.
[00:33:40] Ryan: Like you mentioned in one of your tutorials with Ian Hubert's method, where as long as you don't move more than about 10 degrees left or right, you can get away with a lot of animating another camera inside of your animated camera, as long as you know where that plane is, so you can animate
[00:33:57] Eliot: Yes, yes
[00:33:58] Ryan: within that within that space.
[00:33:59] Ryan: So, [00:34:00] um, cool. Well, I think that was really my biggest one: what codecs could we use to trick the system into doing what we needed to. Um, and hopefully we can get back to the studio here soon and get everything set up. Um, I'm a little worried I don't have all the hardware I need currently, but I'll dig through our grip gear and see if I can mount it the way I need to.
[00:34:24] Eliot: Fantastic.
[00:34:26] Ryan: Yeah, I will give you an update as soon as I can, but, um, that was really the only question I had for right now until I, you know, get into it and start testing.
[00:34:34] Eliot: Exciting. Alright, well looking forward to hearing from ya.
[00:34:36] Ryan: Oh, one other thing too, when we, when we're setting up, like, so we go through the whole calibrations and we have a set with green screen behind it.
[00:34:46] Ryan: Is there anything, other than just putting markers around the scene, that helps Jetset stay locked in? Other than the, um, like, overheating. You know, are there any best practices when we go straight onto a green screen, like, [00:35:00] have 10 markers available so that it makes it easier for Jetset to see what it should be tracking to?
[00:35:06] Ryan: It's trying to just limit the amount of times that Jet Set's tracking fails, because I don't know, once we're in production, it's just gonna be like, we're gonna shoot this, and then after, we're gonna find out if stuff tracked or didn't track. Like, I think it's going to be hard every take to be like, Hey, let's review that take, make sure it tracked correctly.
[00:35:22] Ryan: Do we need to do it again? so I'd
[00:35:25] Eliot: I'd say a couple of good suggestions that I've seen, and I'm trying to remember who on the forum suggested it, is that, um, if you have one of the 15 Pro Maxes, then you can plug in an HDMI output adapter to the phone that also has a power input.
[00:35:43] Eliot: Cause you need to keep the phone powered and cool: keep a big battery on the phone and a cooler on the phone. That's like number one and number two; almost everything else comes in later after that. But they used an [00:36:00] HDMI converter that would take just the screen feed off of Jetset.
[00:36:06] Eliot: And they ran it to, uh, like a DJI RavenEye, the Ronin one. It's like 200 bucks. Um, and it's a real-time, uh, it's like a WiFi video transmitter. It's really good: low latency, cheap. And that way you can be watching the real-time composite of Jetset, you know, as you're shooting.
[00:36:23] Eliot: And if things are lining up, or if they're even close, then, you know, you're pretty good. If you start seeing the live comp jitter a lot or go way off or something like that, yeah, then, you know, stop and reset your origin or whatnot. But the key thing is, if your live composite is looking good, then your data is going to be looking good.
[00:36:44] Ryan: Okay. Yeah. No, that makes total sense.
[00:36:47] Eliot: I like that.
[00:36:47] Ryan: Can you control playback from the web interface, or is it only through the phone?
[00:36:53] Eliot: You know what? That's actually a great question. Right now it's only through the phone that we do the review and playback. [00:37:00] Uh, but that's an interesting idea, uh, on the remote. I'll look at that and see, cause this is starting to come up.
[00:37:12] Eliot: For a while, we were hoping that Apple had a remote control system in the new iOS 18. Um, but when we looked at it, it looked like it disabled the camera. I'm like, nope, that's not going to do it for us. So we're going to have to look at adding more remote control systems to our web interface.
[00:37:32] Ryan: Gotcha. Cool. Cause yeah, I could definitely see that being like, there's the DP role, and then there's the DIT guy, which would be me, being like, hey, can we just quick review that take? And I can do it right from my laptop through the web interface, like, just play back that last take right now. Um,
[00:37:48] Eliot: yeah.
[00:37:48] Eliot: And, and honestly, with the real-time feed through that RavenEye, like, you know, HDMI to RavenEye to your, again, iPad or whatever, you can be watching it; you can just be watching the UI, [00:38:00] and if it's all looking good, if you're not overheating and the tracking is looking reasonable and stuff.
[00:38:06] Eliot: Then, you know, great. You know, then, then you, you kind of know, you'll know when you're, when it's, when something's going wrong with, if you're, if you're real time composite is, is something's going, going wrong there. That's, that's what I'd say is just that have some way of monitoring what it's doing. You can do that completely through the, the app as well.
[00:38:23] Eliot: Um, you can open up the web browser, and one of the panels in the web browser is a video panel where you hit play and you can actually see what it's doing in real time. Um, but there's a pretty decent delay, and sending video over general WiFi is just less efficient and less
[00:38:42] Eliot: reliable than using a dedicated video transmitter, like the RavenEye or something like that. It just is; it's the difference between dedicated signals and non-dedicated signals. So, uh, in a pinch, you can use the web browser one, but I liked that idea of splitting out the [00:39:00] HDMI converter to a RavenEye and sending that.
[00:39:02] Eliot: Cause then you have a very, very low latency feed.
[00:39:06] Ryan: Yeah, sounds good. I think, um, you know, one thing I did notice when I was testing with just the iPhone: I would get, like, green, everything would be green. So temperature, tracking, and then I forget what the other one is. Um, but all three would be green.
[00:39:23] Ryan: And then as I would shoot a take tracking would go like in between green and orange, like it would kind of just flip back and forth, but the take always ended up being like, I, I'm not seeing anything getting really wonky, like stuff's not getting really weird. So it was almost like. All green is not necessarily saying if the tracking is going to work.
[00:39:45] Ryan: It's more of it could go wrong than it is going wrong. So that's kind of how I interpreted it anyway.
[00:39:52] Eliot: Yeah. The, um, yellow, or orange as it might look... you know, green is everything is perfectly hunky-dory. Yellow is there's something that's a little bit on the edge, but there's not a huge problem yet.
[00:40:08] Eliot: And red is like, okay, we've got problems. Yeah. So we're trying to do something that's intuitive. Um, and sometimes the messages we get, cause we're using Apple's ARKit, which is their internal, like, real-time tracking system, and sometimes the messages you get from ARKit are
[00:40:25] Eliot: a little bit, you know, a little hard to interpret. So we're trying to make those into something where, you know, I think green and yellow, you're okay. And especially watch out for the temperature if you hit red or something. And for the tracking, red means it doesn't know where it is.
[00:40:43] Ryan: Okay. Sounds good. So try to get it all green. Orange is probably okay, but let's get it back to green.
[00:40:53] Eliot: Yeah, if you can. There are always limitations in the reality of stage shoots, right? It's just what it is. Um, and so in terms of things on green screens to help out the system, again, I'd use a bunch of, uh, you know, a bunch of X's and stuff like that, because you're going to want them in case you're doing any post tracking anyway; those are going to be helpful. And in the origin screen, if you click on your origin screen, you can see which parts of the scene the Apple, the iPhone,
[00:41:26] Eliot: is finding for tracking features. It'll show a bunch of tiny little... they look like little red, green, and blue fireflies, or tiny little origin markers. As you pan around, that's what it's seeing; that's what it's using as its reference points to detect and kind of lock onto. So you'll be able to look in the origin screen and see how much it's finding, and if it can't see any of them, then yeah, you're going to have some problems.
[00:41:47] Eliot: Um, cause it has nothing to grab onto. So.
[00:41:52] Ryan: So that's good to know to at least be able to see what it's actually tracking, um, as a source. I have a feeling what, [00:42:00] you know, my, in my head right now, it's shoot this like I didn't have your system and I would have to track it. Like I'm treating it that way. Like if I had to track the shot.
[00:42:11] Ryan: I need Xs there, there, there. I need a front X up here with the C Scan. Like, set the scene up that way, and hopefully the system can grab onto those feature points. And if it doesn't, I still would technically have a way to solve the camera, if we didn't have another take that was good, like, you have to solve this take.
[00:42:29] Ryan: Alright, well, now I'm gonna go in and You know, do the whole like Mocha track and put a plane up there and use those as the points to help track it. And I'll,
[00:42:37] Eliot: well, have you looked at the, I can't remember, have you looked at the new SynthEyes workflow that we just got working? It's pretty... Oh, you want to look at
[00:42:46] Ryan: this?
[00:42:47] Ryan: I saw the like beta version that you did.
[00:42:51] Eliot: Okay. The final version is well worth looking at. I'm going to put this in the chat. This is really worth going through, [00:43:00] because what we did is we really sat down, and it took like weeks, but we really dialed in the workflow in SynthEyes.
[00:43:07] Eliot: It takes full advantage of both the tracking data we have in Jetset and the onset scans, and a couple of really neat tricks that SynthEyes has. And none of the other trackers have it; like, Nuke doesn't have it, and After Effects doesn't have it, but with SynthEyes, you can basically create survey data.
[00:43:27] Eliot: And it's almost completely undocumented. Like, there's just nothing out there on the internet about this, but it is a key, um, because if you have the scan data, and you'll see in the tutorial, you can actually get a sub-pixel track on a shot in, you know, minutes. Like, boom, just that fast.
[00:43:47] Eliot: And it will contain, it'll have the same position and orientation as the original camera. It won't move that much, like a centimeter or two. Um, it's great. It's really, really good. Awesome.
[00:43:58] Ryan: Yeah. I'll take a look at that. I [00:44:00] think, um, I mean, it's actually funny because I remember when they announced survey tracking, I remember going into it, like I could survey our green screen set, cause that never changes.
[00:44:12] Ryan: And then I, like, there was no information about it. I'm like, how do I do this? I have no clue what I'd be even doing if I tried to survey this. And there was one tutorial of like, it was like a side of a hill.
[00:44:22] Eliot: Yeah. It's some random little bar.
[00:44:25] Ryan: Sounds crazy. This doesn't help me at all. Like I don't get what I'm supposed to do with any of this.
[00:44:30] Eliot: No, it's there. What we're working at is going through, and I've been working with Matt Merkovich, who's a phenomenal SynthEyes instructor, and going through and testing and refining this. And so, you know, you'll see in the tutorial, we generate AI roto mattes so that the automatic tracker doesn't put any points on the person, um, and it just makes it into a very, very clean process. The goal is for, like, one person to be able to sit down and track, you know, a show's worth of shots in the course of a day.
[00:44:59] Eliot: Just [00:45:00] go through them. And I think we mostly got it. You know, I want to keep testing it and making sure we got it. Uh, and I'm working on a second one where I show how to use, um, there's something called Synthia, which people think is a voice system, and it sort of is, but it's actually a very simple, easy-to-use, like, natural-language scripting system.
[00:45:22] Eliot: So we use that to implement... I show a process which is basically a multi-pass working through SynthEyes' feature detection system from small to large: setting it to small, detecting all those features, peeling them, which is the SynthEyes term for it, then switching it to medium, finding the features, peeling those.
[00:45:40] Eliot: So find a lot. So you get a ton of features in a very short period of time that are very, very good. And then your solve is like, you know, just a breeze, you know, cause you just like get rid of all the ones that look bad. You still have a hundred left over and you get a sub pixel track and it's clean really fast.
[00:45:57] Eliot: So I have a script that Matt [00:46:00] built using Synthia that just runs the multi-peel process. And again, you just sit there and you just watch SynthEyes buzz through stuff and it just works. So I'm quite excited to see how that goes. I knew this was going to be the problem we needed to solve.
[00:46:15] Eliot: And we've been working toward this for a long time, but seeing it, how well it solved. Was extremely satisfying. Um,
[00:46:23] Ryan: so anyway, with the way that SynthEyes is built currently, in the implementation that you have, if I bring in multiple shots, does it know the coordinate locations?
[00:46:36] Ryan: Like you were just like, when I put in eight cameras, it's all in the same coordinate system.
[00:46:41] Eliot: You would probably solve them. Cause
[00:46:43] Ryan: I could use the same scan. It's all coming from the same scan.
[00:46:47] Eliot: Yeah. Yeah. You know, the. The new original script we set up, um, just sets up the whole shot, you know, from scratch.
[00:46:55] Eliot: Um, and, and as you go into synth eyes, I'm sure you could probably, [00:47:00] uh, the script we do maintains the coordinate frame of the incoming shots, right? So
[00:47:06] Ryan: it doesn't really matter. I mean, unless I needed to have multiple shots in a single synth eyes document, which I wouldn't, I would just be, here's every shot, all of the stuff I need.
[00:47:16] Ryan: And I just kind of worked through that same process for every shot.
[00:47:20] Eliot: Yeah, it's pretty fast. I mean, you just hit a button and it crunches it. The part that takes a little longer is it has to go generate the roto mattes, but you hit that and, you know, go do something else while it's crunching.
[00:47:30] Eliot: The actual SynthEyes solve is, is like, I don't know, less than five minutes, you know.
[00:47:35] Ryan: Can you import your own RotoMats?
[00:47:38] Eliot: Yeah. Yeah. You can, you can do
[00:47:40] Ryan: that. Just point to a folder, basically, and say, here's the roto that I want you to use, versus whatever the sky or the net one that you're using currently.
[00:47:52] Eliot: Yeah. All it does is generate a set of numbered PNG image sequences. So you can look at that and, like, open those up in whatever [00:48:00] program and just augment them with whatever you want, just paint them in, you know, and that's what it's going to look at when it pulls them into SynthEyes: an image sequence of grayscale PNGs.
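As an illustration of what a numbered grayscale matte sequence can look like on disk, here is a sketch using Pillow; the naming pattern, resolution, frame count, the painted rectangle, and even the white-vs-black convention are placeholder assumptions to check against the tutorial:

```python
from pathlib import Path
from PIL import Image, ImageDraw

# Sketch: write a numbered grayscale PNG matte sequence that could be painted
# over or swapped in for an auto-generated roto matte. All specifics here
# (folder, size, frame count, which value masks the subject) are assumptions.
WIDTH, HEIGHT, FRAMES = 1920, 1080, 24
out_dir = Path("roto")
out_dir.mkdir(exist_ok=True)

for frame in range(FRAMES):
    matte = Image.new("L", (WIDTH, HEIGHT), 0)           # background value
    draw = ImageDraw.Draw(matte)
    x = 800 + frame * 4                                   # crude moving "subject"
    draw.rectangle([x, 300, x + 320, 1000], fill=255)     # masked region
    matte.save(out_dir / f"matte.{frame:04d}.png")
```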
[00:48:11] Ryan: I love it. Well, cool. Eliot, this is awesome. I'm hopefully going to be able to jump on tomorrow and share some good news and show you where we're at with everything. And, um, we'll keep moving forward.
[00:48:24] Eliot: That's exciting. All right. Well, fantastic.
[00:48:27] Ryan: Thanks again. Talk to you soon.
[00:48:28] Eliot: All right.
[00:48:29] Ryan: Bye. All right. All right.
[00:48:37] JP: Let me, uh, take a look. So, um, out of habit, I kind of went through the whole setup, the scan, the solve, what have you, and it says, um, down here: no cine video file. You need to scan a folder containing a cine video file to match this take. So, back to my folder structure. Which, uh, [00:49:00] build of AutoShot are you on? Are you on the current?
[00:49:02] JP: The latest one. So I believe it's 0.0142.
[00:49:07] Eliot: That sounds about right. Yeah, there we go. All right, so let's see. So it's looking for... it says it doesn't have a cine video file. So let's go ahead and click scan up here, next to the cine source. Let's see if it's going to detect it.
[00:49:27] Eliot: And to refresh my memory, are these BRAW files?
[00:49:30] JP: Yes, BRAW.
[00:49:31] Eliot: Okay. All right. But it also has a proxy file
[00:49:35] JP: in case, for whatever reason, it can't read the BRAW.
[00:49:38] Eliot: Yeah, when you click scan on this, it is actually using the proxy files to detect, for exactly that reason. We can't directly scan the BRAW files.
[00:49:47] Eliot: Later, when we pull our EXR frames, we will pull them from the BRAW files. For the scanning process, we use the proxy. There we go. All right, so it found it. So you'll look at... let me grab my little annotator here. [00:50:00] Where is the annotator? Notes, captions. They changed Zoom on me and now I don't know where anything is.
[00:50:07] Eliot: Yeah, okay. I feel like a newbie. All right. Well, I'll just not worry about it for right now. Anyway, um, so you can see, when you click scan under the cine footage match, it detected the correct, uh, E10 C06 raw clip, and it detected the correct cine offset, which is 3.461 seconds.
[00:50:33] Eliot: So the last thing we need to do is, uh, solve the Cine calibration. So you just click solve over there and it's going to crunch on that. Now this is. This is the one. Oh, okay. This is the one where I think we had problems with it. Right. Where? Uh, exactly. Okay. Okay. Sorry. Yeah, the same take. Sorry, I shouldn't have done that.
[00:50:49] Eliot: I should have, um, we should have copied the JSON file in and
[00:50:54] JP: the JSON file's already been copied in. Um, it is just maybe, um, I believe it was, uh, [00:51:00] well, I placed it in the sequences folder. So I went through sequences, project, calibration, and then the folder where you can find my JSON file.
[00:51:11] JP: And I'm thinking that's where I've, um, you know, made an error. Okay. Because it didn't pick it up, but it didn't even prompt me to say, oh, we can't find the cine offset file. Right. Okay. Okay. So let's see. But while it's doing that, shall I show you my folder structure, and you can compare with what you have on your end?
[00:51:32] JP: Yeah, that's that's probably a good idea.
[00:51:36] Eliot: Let me look at mine, get a reference for this.
[00:51:51] Eliot: Okay, so let's see. So we are in... there's project, sequences, and project calibration. Uh, and I think, did you just [00:52:00] type in the calibration folder name? Yeah. Uh, okay. I think that might need a lowercase c on calibration. So let's go ahead and change that to a lowercase-c calibration. Um, okay, and then let's go back to AutoShot and let's see if we can cancel that.
[00:52:15] Eliot: Let's go before we get into it. Actually, cancel is, uh, right in the middle of the screen. Oh, I see. Yes. Yeah. Go ahead and cancel that. Let's see if canceling will work. Okay.
[00:52:30] Eliot: Maybe not.
[00:52:34] Eliot: I can close it, perhaps. And then, yeah, let's go ahead and close it, cause I think we already know that this numerical solve isn't going to behave correctly. So let's just start up AutoShot again. And it looks like... why are there three of them? I don't know. Let's exit out of all the others.
[00:52:51] Eliot: Uh, let's
[00:52:56] Eliot: make sure we're not trying to... okay, there we go. [00:53:00] Okay. So now let's go ahead and click open, uh, next to your take. Uh, let's see. No, that's, um, sorry. Let me... I gotta find an annotator here. Uh, there we go. So there's that. And let's click on, uh, project.
[00:53:18] JP: And inside the sequence folder. Yeah.
[00:53:20] Eliot: Inside the sequences folder, project calibration.
[00:53:23] Eliot: Okay. BM URSA 2. And then, uh, can you drag that folder just a little bit over to the side? I want to see why it's there. I thought that would recognize,
[00:53:32] JP: uh, which drag, which folder to the side. Uh,
[00:53:36] Eliot: can you just drag that Finder window down a little bit so that we can see both that and the original file name for the calibration? Maybe down just another inch or so.
[00:53:48] Eliot: BM 24 millimeter 4K. Okay. So, okay, I would have thought that would show up there. So let's try... [00:54:00] okay, let me look at... it might be looking for it up higher, in which case, okay, let me look at this real quick. Images, calibration... Let's try copying it also. Um,
[00:54:22] Eliot: ah, okay. Let's copy that solve JSON file, and we're going to go up a couple of levels. Up to, um, let's see. Under project, there should be a calib folder. Um,
[00:54:42] JP: yeah, so my calibration is in footage.
[00:54:47] Eliot: Okay. Okay. Um, all right. Let's go in and click on the calibration folder; let's see what's in there. Okay, so go ahead and [00:55:00] paste that BM URSA 24 millimeter JSON file next to the BM URSA 24 millimeter 4K; just go ahead and paste that JSON file into that folder directory.
[00:55:15] Eliot: There we go. All right. Let's see, and let's restart AutoShot. I think it's double-checking to see where it's referencing the data from. There we go. That's it. So now you can see, next to Cine calibration, it has a BM URSA 24 millimeter 4K, and instead of Solve, it's just saying Re-solve.
[00:55:36] Eliot: So now you've got all the pieces. Um, so let's pick a subset of frames. Like, I don't know, a clip-in frame at 600 and out at, you know, 650 or something, 720, let's see. Sure, that sounds good. Just Save and Run, right? Yeah, let's just Save and Run and see if that cooks through it.
[00:55:59] JP: Well, that's [00:56:00] something that hasn't done before.
[00:56:02] Eliot: Yep, there we go. I was just, yep, that's pulling frames.
[00:56:10] JP: So am I right in thinking that on the Windows version, when you gave me this JSON file, it was within the sequences, project, calibration folder? Yeah, okay. But it goes somewhere else?
[00:56:25] Eliot: and it's kicking in. Ah, so what's going on is that there's the takes that it initially pulls down, um, and normally, the way we're doing it... and I think this is from a project that was shot like a month and a half ago, right?
[00:56:45] Eliot: Something like that. Okay. Yeah. So we changed something in how we do calibrations, and now our calibration is interactive: right after you finish shooting the frames, you have AutoShot running [00:57:00] on a connected machine, you enter the sensor width in AutoShot and you hit calibrate, and it just crunches the solve right there.
[00:57:06] Eliot: So you know whether you've got a problem right there and then. And then it pushes that JSON file back to, you know, back to Jetset, and it's stored at a project level. So under project, and then footage, and then calib, this is where it would be stored. So it's stored with that source footage.
[00:57:26] Eliot: And then when we generate a sequence, what AutoShot does is it just reaches into that original calibration folder, the footage calibration folder, and pulls out the calibrations it needs, and then it uses those to generate the sequences. So the sequences are downstream from the footage, and that's what was going on.
[00:57:43] Eliot: So when we crunched the solve, yes, it was solving it in the sequences folder. Um, but when we actually wanted to give it the correct place upstream, we needed to put it back up in the footage folder.
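To make that concrete, a hedged sketch of copying a solved calibration JSON into the upstream, project-level footage calibration folder on the other machine; the folder names ("footage", "calib") and the file name are assumptions taken from the conversation, so verify them against AutoShot's blue folder links first:

```python
import shutil
from pathlib import Path

# Assumed layout based on the discussion above; confirm the real paths in AutoShot.
mac_project = Path("/Volumes/Projects/CalTest")            # placeholder project root
calib_dir = mac_project / "footage" / "calib"              # upstream calibration folder
calib_dir.mkdir(parents=True, exist_ok=True)

solve_json = Path("~/Downloads/BM_URSA_24mm_4K.json").expanduser()  # placeholder name
shutil.copy2(solve_json, calib_dir / solve_json.name)
print("Copied calibration to", calib_dir / solve_json.name)
```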
[00:57:55] JP: Oh, okay. Okay. Okay. Good to know.
[00:57:58] Eliot: Yeah. And what I'll [00:58:00] do is, is I'm recording this and I'll, and I'll put this, uh, I run this, all these things through Descript.
[00:58:04] Eliot: So we have a trans, uh, transcription from it. And so we're, we're going to put up, uh, an office hours section on the website that has all these things up here with the transcription and, uh, uh, with text with the full text of it. So that, that way, you know, you can go back and like, just search on the website for, you know, for, for some of the keywords in this.
[00:58:23] Eliot: Okay. Calibration or project or stuff. And boom, you can find out, find out exactly, uh, how this was done on the, uh, on the original, uh, office hours,
[00:58:32] JP: we should have
[00:58:33] Eliot: that. We should have that this week.
[00:58:35] JP: Great stuff. Cool. All right. So I'm just going to quickly go into my sequences folder, because I think that's where, what's it, where AutoShot runs its math, shall we say. Yeah, you
[00:58:50] Eliot: can just look under, uh, yeah, there's the solve.
[00:58:53] Eliot: And then there's also, if you look under calibration tests. Uh, 2024, you click on [00:59:00] that, and then it's gonna, it should have, there's your individual. Oh, there,
[00:59:02] JP: okay. Yep. Yeah, yeah, that's it, that's it. Cool. And then, for example, I'll just copy this link, uh, into, um, SynthEyes. Yeah. Have you gone through the, that, the, uh, the tutorial on that?
[00:59:15] JP: Um, I have, but I need to refresh my mind. But it's either copy the link from the console, or copy that file and put it into SynthEyes. No need to show me, because I know there is a video on it; I'm just a little bit rusty at the moment, I can't remember. But you've definitely spoken about it at length.
[00:59:38] JP: So, um, well, thank you so much, Eliot, for walking me through this and for, um, what's it, solving this issue. Um, hopefully it'll be smooth sailing from now on.
[00:59:47] Eliot: Absolutely. Absolutely that that sounds great And uh, let me know how it goes.
[00:59:51] JP: Yeah, we'll do okay. I'm gonna start Um, i'm gonna stop sharing and um, yeah, i'm gonna exit the call.
[00:59:57] JP: All right. Good to see you. Thank you. Take [01:00:00] care Bye bye