Transcript

# Office Hours 2025-02-10


**Eliot:** [00:00:00] Hello. Hello. All right, there we go. All right. Morning, John. Hey, Elliot. All right. What's, uh, what's going on? 


**John:** Oh, I'm coming to you, uh, from, uh, Lima, Peru today. Oh. 


**Eliot:** Fantastic. Fantastic. Yeah. 


**John:** Um, but I spent the weekend, you know, um, trying to figure out, uh, this, this whole workflow. Alrighty. And, uh, I have a couple of, uh, couple of questions.


Overall, like, I feel like I kind of wrapped my brain around, around it for the most part. Uh, very helpful, you know, the tutorials and also the, uh, the other office hour that you recorded with the young, with the guy, uh, you walked him through the whole process of the camera calibration. 


**Eliot:** Yeah. Yeah. That was great.


Cause it's just like soup to nuts. 


**John:** Yeah, so, so basically where I'm at is, [00:01:00] is the following. Um, I've, uh, you know, I've managed to, uh, managed to, well, the main glitch that I have, I'll tell you in a second, but basically export, uh, convert from Blender, I'm starting in Blender, convert it into, um, what is it called, the file format?


Oh, USD? Yeah. USD then using light craft, compress it. Right. Push that too. And I've, I've tried it with both the phone and an iPad. I have an 11 pro phone, so it doesn't have the, uh, it seems like the only thing it doesn't limits in the phone is the, um, uh, it doesn't have the lidar. 


**Eliot:** Right, 


**John:** right. The tablet has the lidar, but the phone's a lot easier to mount to the camera, obviously.


**Eliot:** Right, 


**John:** right. Um, I'm, I'm using the, uh, the CIMO, I don't know if you can see, the CIMO, Accent CIMO. CIMO Pro, yeah. That all worked. I managed to, you know, get, get that, uh, get the image into the CIMO. Uh, the camera calibration seemed to work. [00:02:00] And then I was able to record takes, bring the takes back out into Lightcraft and then push those back to Blender with the, with the animation.


So that all worked. Uh, what I Uh, the biggest problem that I've had, and I've had it on a couple different things, I think I must be doing something wrong in the export from Blender. I don't, I, I can't see any of my textures in the app. Um, I'm using emissive textures, they all come out white. All right, uh, yeah, but I know what's going on.


You want to screen share? What's it, yes, please. 


**Eliot:** Yeah, so, Andy, there's the underlying technical thing is that Um, the iPhone, uh, apple basically uses, uh, everything is USD and, um, the only, sorry, everything's what? Uh, US USDZ. U-S-D-U-S-D-Z? Yes. Okay. USDZ is just a compressed version of USD that's just uhhuh all the stuff wrapped up to a, into a package.


Uh, if you'll unzip, it's just USD and the only shading systems the it understands is what's called, [00:03:00] uh. Uh, USD preview. So it's a limited subset of, it's not like, it can't handle like full material networks. It's actually very simplistic. It can handle like the big, you know, big four or five, like the diffuse slash albedo.


It can handle normal maps. It can handle, you know, but it expects things to be plugged into a principled shader. Um, and when it's not, sometimes it goes, what? Well, I 


**John:** think that's what I, I think that's what I have though. And I've tried and just, just for some context, I've tried two different, I've tried two different blender files.


Now the first was, so here's what, here was my workflow. I wanted to just sort of test with the scene with some, uh, sort of landscape, um, things that I downloaded from, um, um, crap, what's it called? Um, um, the, the unreal engine one, um, You mean? Megascans. Oh, Megascans, gotcha. So, so, they were Megascan assets.


Did you use our 


**Eliot:** importer? What? Did you use our importer? Um, here, let me show you. We built [00:04:00] a, you're gonna want to know. I used, I used the built in Megascan importer. Oh, okay, alright, alright. And this is into Blender? Wow. I didn't know they had that working. Yeah, it works 


**John:** really well. They have their own importer.


Uh, uh, it's called Bridge is the application. They have, they have something called Bridge. I think they're getting rid of it actually. It's going away. 


**Eliot:** Yeah, I tried 


to get it to work and I failed. So just so you know, we, I, out of, I got out of frustrated, um, irritation. We just went and built a, uh, 


**John:** But let me, let me just describe the whole process and then, you know, I guess we maybe.


Okay. So, um, I then, because I, you know, uh, I then decimated because they were a fairly high polygon count, you know, so I decimated it down a lot, all those assets. So, so my final polygon counter was around 60, 000. So that seemed to be fairly reasonable. That's 


**Eliot:** easy. 


**John:** Yeah. Um, and, uh, yeah. And then, and then, and then, you know, created the, uh, the null objects, the, um, [00:05:00] uh, 


**Eliot:** locators.


Yeah. 


**John:** See locators. Thank you. Um, so, so then what happened was, yeah. And then, um, I also tried it both ways. I tried, I tried the emission say shader. If you want me to share my screen or do you want to show your 


**Eliot:** go ahead and share your screen. And then you look at them, the material shading networks are, 


**John:** Let's see here.


Uh, where the hell is screen share? Um, share, right there. Share. I'll just share the entire screen.


This is actually a different scene I was messing around with. So, let me, let me go and get the, the first Blender scene that I had a problem with. So, um, I guess,


okay, this, this should be it. Um, [00:06:00] okay. So, so this is a basic scene. We've got, I just put a couple scene locator and this little post thing is just to have a reference for scale. If I go into, let me just look at it rendered.


So I may have switched this back because I was trying a bunch of different things. No, I switched. Okay, so this is where I, this is where I did land as far as, uh, as my material setup. So here's this, uh, one rock thing that I downloaded. Um, Okay, now it looks like you've 


**Eliot:** got a, a node group in there. So if you, so plugging into your mission, You can click on, uh, here, um, this is a node group.


What's this? If you hit, select that and hit tab. That's a, that's a, that's a node network. If you hit tab. Yeah, yeah, you're going to open to that and it's going to be like, 


**John:** Ah, okay. That's 


**Eliot:** what's killing your export. I can [00:07:00] guarantee it. Ha ha 


**John:** ha ha ha. You know why? Because this is a material that came, came from Megascan.


**Eliot:** Um, okay. So I would be worried because they're going to do some wild stuff, um, and it's cool, but it's going to be hard to translate. Take, so I put a link, um, we built a Megascans importer. It's simple. And what it does is, uh, you download the, the FBX file from, from mega scans, the mid-level, high level, whatever you wanna use.


Yeah. And it gets you a folder. And, and it has a bunch of different sub folders. And each one of them has, uh, either just a set of text, raw text maps, uh, or a a 3D actually here, you know, lemme just share the, share the screen. It's easier to, uh, you just show it to you. Um, because the, the trick with some of these importers is they, I mean, there it is, right?


You're gonna, you're gonna open up that and it's gonna be a. And actually just, just to, uh, do you know how to go into node groups or just, you know, highlight it, hit, hit tab and it'll open it up and there'll be a material node network inside that. That's, that's worth [00:08:00] just kind of, let's take a look. I, you 


**John:** know, something I missed that because honestly, I'm still somewhat of a blender newbie.


And, uh, I know. I just didn't notice that. But yeah, so you basically just click on that little thing, and you're not. 


**Eliot:** Oh yeah, can you screen share briefly so we can we can see that because this is going to yeah I just sorry. I should have let's open that, and because that's the the tab is how you can like shift down into the the sub networks inside blender.


**John:** So here's 


**Eliot:** that go ahead and highlight that guy. And if you hit tab, it's going to dive into it. 


**John:** Ah, okay. 


**Eliot:** And Oh yeah, that's it. Some of these things have crazy stuff going on inside. Right? So some of these have what's called a textures, uh, texture bombing. So it has a multiple sets of textures that are overlapping and things like that.


It's awesome for render time. It's just like, that's not good. It doesn't translate to, you know, again, the, the, the, the basic. Um, uh, USD material preview surface is, is basic. You get, you get five, five [00:09:00] textures and that's it. 


**John:** I even seem to have problems with my own, my own little object there. Um. And this, again, I just have, well, this is like, it's the built in checkerboard texture.


**Eliot:** Yep, but that's a procedural texture. Um, so it doesn't, it won't, the exporter won't actually understand that. So the exporter is going to go through. Okay, so that's not going to work either. Yeah, because it's a procedural system, you know, proceed. So, yeah, okay. And, and so there's, there's light at the tunnel, which is that material X is, and, and.


I mean, you know, I even do this, like materials are nonstandard, right? It just over whatever network and material X is an attempt to have this be more standard. And we'll see how this comes out. But right now material networks are, I mean, you know, I was telling this transfer, but tech materials don't, unless they're super simple, you know, just plug in the plug in the maps and you can bake that map out.


Like if you bake the tech, the checkerboard map out into, you know. PNGs or JPEGs or whatever, and then have it as your yield [00:10:00] basic, uh, you know, UV map texture plugged into a mission. That's going to work. 


**John:** Okay. Okay. 


**Eliot:** Yep. So, 


**John:** uh, I guess. What would be the, um, 


**Eliot:** all right, here's what you want to do. Yeah. So I'm going to share my screen.


Uh, and I just put a link to our mega scans batch import because mega scans are awesome and we want them in blender. And so, uh, what this is, is it's a, it's a, uh, It's a script that's, that's linked to it in the YouTube. And I think it's on the downloads as well. But, um, all you basically do is, is you, you basically, this, uh, you, you point it at the standard fab.


com download, you know, you get a zip folder and you open it up and it's very unhelpful because it has a series of, you know, just a bunch of raw textures and files and then an FBX files. And it goes through and parses it. And if it's just textures, then it creates, do you know the blender asset system? And all of you messed with that.


No, I, 


**John:** I know that I haven't used it actually. 


**Eliot:** It's so cool. And so this is, and [00:11:00] it turns it, uh, it turns all those mega scans into actual blender assets, right? So, and so when the blender asset system, you can just drag and drop. So it breaks it down into the materials or models. So you can actually just drag and drop stuff into your scene.


Uh, and, and it's plugging in all the textures, you know, more or less correctly, right? So you can get them into your scene and I'm sure someone's going to have to tweak it a little bit, but. It gets you so far the way there and then, so it loads in everything and it generates the, you know, previews so you can see which ones are 3D, which ones are, you know, whatever, and, uh, and this, this should just work because what it's doing is it's just doing a very basic blender, you know, uh, it's just looking for all the, the textures that have standard names, you know, you know, diffuse and, and normal and stuff, and it plugs it into the correct blender things in the principle shader and hooks up the, uh, hooks up the UV coordinates and that's it.


It's very like vanilla. It's not, it's not trying to do wild stuff. Yeah. Okay. 


**John:** Well, that would explain a lot. Uh, the other, um, actually I'm going to share my screen again [00:12:00] because, because the other idea that I had, um, I'm working on a low budget feature down here. And if I can figure out some, I, I may, uh, want to use at least for a couple of little test scenes that this technology, if, if I can kind of get it sorted out in time.


Um, and. The, um, so I guess what I'm, what I was, what I was trying to do, what I, what I'm planning on doing is we have a scene that takes place inside the Peruvian Congress, you know, and we have access to go into the building, uh, but we may not, we're probably gonna have to shoot everything on green screen and then comp it together because we don't have, you know, we can't film in there, but we can go in and shoot plates effectively.


So one thought that I had, which is like very quick and dirty. would be to, to, like, shoot an HDRI 360 photo, depending on the geography of the room, etc. And then map that just into a, into a, you know, a cube or something like that, just to have a room. [00:13:00] It wouldn't have any real depth to it, but depending on the shot, I might get away with it.


If I don't see the parallax is kind of where I'm going with this. So I'm thinking about having just like a shot of my actor standing in the middle of the camera kind of going around him and then Mapping that motion if that makes sense. So yeah 


**Eliot:** Yeah, I see I see it. Um and depending on how complex or how kind of interesting and tricky you want to get Um, you might want to do, there's a couple of things you might want to, might want to look at.


And, uh, have you seen the X Grid scanners? Um, let me share a screen. There we go. Okay, so this is a nifty scanner because it's got a, it's got a revolving kind of, uh, LIDAR sensor on the top. And it's got, you know, multiple cameras pointing all different ways. So you can walk through it in range. And it maps pretty fast.


Now, it maps color and texture at the same time. And it's not Final Pixel. Just, just to get that out of the way. Um, However, it's fast and it generates Gaussian splats very, very, very quickly. They can [00:14:00] handle, handle a range of stuff. So I'm, what I'm kind of curious is if you may want to, um, and of course, like the, the color sampling on it, it's not, it's not what you want, right.


You know, it's, it, but it's. A very fast way to get things that actually, I mean, it looks pretty, it looks pretty good if you, if you were, if you were in a tight corner and you could blur out some of the background and you, you're going through it, you, you know, you could make, make some of that work, but I think it's also good as a foundation.


And so I gave a talk a little while ago and, and on, um, what I think is going to be a future way of doing exactly this, which is to go in, scan with Elixir, just get, get the 3d model and it's accurate within a centimeter, you got color, you know, and everything looks good. But it's just kind of a little soft, um, and it's not, it's not, it's, you know, it's not log footage, right?


So it's, it's, um, it's, it's just, it has some limitations. But then you figure out, use that to kind of, you know, load and adjust that, plan your shots a little bit. Uh, and then go into the actual location and, you know, with it correctly lit, hose that down. You know, the, like, figure out where your camera positions are going to be.


And then you can, [00:15:00] you know, you can pose that down with some, you know, bring, bring along a red or black magic or something that has a lot of pixels and a lot of bit depth that you can just walk around and, and, and then make a Gaussian splat. Right. Now you have. So hold 


**John:** on, stop right there. Yeah, so you're saying to like live video walking around filming it, not a series of shots on tripod, you know, photogram is like photogrammetry effectively right but it really 


**Eliot:** is.


Yeah. And, and it's. And this is a, this is an area of, of, of interest, um, because you can, of course, you can, of course, you know, just go around and shoot photogrammetry because when you put it into a solver, like, you know, reality, reality capture or something, it's the first stage is just photogrammetry, no, no two ways about it.


Um, but the nice thing is that the Lixil gives you, you know, uh, you know, a point cloud that's accurate and then of course, reality capture, you just, you know, stick them, stick them together. So now you're. Now you're correct in 3d space. Things all kind of line up the way you think they would. And then you can walk your images and then train [00:16:00] a splat in post shot.


And I think you can actually, you know, for it'll, it'll be a lot better on your camera motion than trying to map is still onto a cube. Um, I mean, you still might, you want to just go to the still on the cube. If you have something really simple and you're barely moving, but as soon as starts to move, then.


Yeah, I know more. Well, again, this 


**John:** is, this is very, you know, we have zero money. This is like a very, you know, 


**Eliot:** Oh, this is why I'm suggesting it is because the Lixil probably won't rent it for 100 bucks a day. You'd have to go check. But like, it's this big, right? And you rent, rent in LA and fly down. And it's going to take you an hour to walk through the whole thing.


And bam, it's going to be mapped. And then, you know, maybe if you can use it to plan shots, that'd be great. But then go in and photograph the daylights out of out of it with again, yeah. Yeah. Get bring a sweet camera and camera of your choice and, you know, photogrammetry. But the, instead of trying to bake a model 3d model, cause that sucks, right?


You, you [00:17:00] end up, you lose all your reflections. The textures are a munch, all this kind of stuff, make a Gaussian splat. Um, and this is a little. Right. Some of these things are fairly new, but it's the thing is it maintains reflections and it maintains lighting through your, I saw that 


**John:** on one of your, I saw that on one of your, uh, uh, Tutorials or office hours about the Gaussian splat thing.


And I honestly don't even, I can't, my brain doesn't even know how that works, like how it can do that. Think of it 


**Eliot:** as photogrammetry and, and accept. Instead of extracting, crunching it down to model and textures and, and, you know, which is hell, right. And then it never works. Right. You're trying to get reflections and, you know, off this thing.


It's not, it basically doesn't try to do that. It just, it's just really kind of resampling the light fields and generating, you know, it has the view dependent aspects of it and it works like, holy shit, like, and you're going to want to, you're going to probably want to run it slightly defocused in the background.


Right. Um, you know, if you need something hard and crisp, you need. Principal photography, [00:18:00] right? That's just, but yeah, there's a little bit of defocus and I, I looked at this and I've run it by a couple of like, you know, serious effects supervisor and well, it was kind of came to the same, came to the same conclusion, which is, this is totally going to work for as soon as the cameras you're doing a closeup shot or mid shot, it's a little soft in the background.


Yeah, and you're starting with high bit depth information, right? You're not, it's not a crunch date bit thing. You're no shoot with a real camera. So the way you feed into reality capture is, is, you know, the real deal. And then when you crunch it through the Gaussian splats, then you can maintain that. That information to re render it, you know, on real, you know, actually or actually, so, 


**John:** so what is that, um, software that would do that 


**Eliot:** post shot?


Um, so you start off with reality capture to, you know, stick all the images to the, to the, um, to the LIDAR. Right. Um, and then the other one is a Joss at post shot it's free. Um, right now, you know, while they, you know, I'm going to put this. Link in here and now just to you know, be clear I haven't actually gone through this whole thing and shot the shot the splat with a super high end camera [00:19:00] and pushed it through I'm sure there's gonna be a couple quibbles, but the the post shot guys are serious The reality capture people are serious, you know, and they're used to dealing with high bit depth bit depth imagery So I would you know, take a look at the I put the link of a post shot in here Just so people can see it Sure.


Okay. And there's, there's a bunch of other apps coming out. PostShot's just the one that I've, I've used and it's fast. Um, but then yeah, you move the camera around and it renders like really, you know, it's not doing this. Yeah, I kind of, I mean, photogrammetry is okay for this very contained thing where you're going to do a round table on some, on an asset, capture it, do all the stuff, delight it, extract the materials, this kind of thing, but it comes apart.


Hey, Kevin, it comes apart for, for, um, environments, I think, because there's so much view dependent lighting, whereas the splats maintain it and you don't need like 20 people to do this. Like it's you yourself. You know, uh, with, with, uh, you know, run around the Elixir, then put the Elixir in the bag, now take, take out your, your sweet [00:20:00] camera, run out with your sweet camera, and then the rest is, is, you know, the, and you can even, I mean, the Jet Set will actually read, um, Gaussian splats.


So, uh, if you want to see an example, let me find, uh, 


**John:** So what is that first, sorry, what, and what is that first thing called? I don't know if I can get a, get a handle on that. Oh, Xgrids. Uh, 


**Eliot:** it's, uh, let me give you a, send you a link. There you go. It's the newest, their Elixir scanner. They've got a variety of them.


That's just their, their high end one. It's like 20 K. Um, but I mean, rented, you know, it's like. It's like this big, you know, put it, put it in a suitcase and fly down to wherever you need to fly down. And they have lower cost ones that was just, I was kind of blown away by the range of environment that thing can capture in a short period of time.


Like I've done that, you know, you take out the Pharaoh, click, you stand, stand behind a post while the Pharaoh spins, move it, click, and then you get a billion points. Then somebody has to mesh it and align it and it ends up being such a pain in the butt that people throw away the data and they don't [00:21:00] use it.


Whereas the wild thing of this thing is that it's generating Gaussian splats natively, right? It doesn't mesh it, you know, cause the meshing is death, right? You got 4 billion points and you, you know, and nobody has their 10, 000 license of geomagic. So everybody's trying to do it and something else and the algorithms die.


Now this. You know, this is, this is pretty wild. So we actually crunched that and, uh, let me, um, I'll show you the 


**John:** talk. If I can't get ahold of that, what would be, what would be my alternative? Is there any like an iPhone app that would get me? 


**Eliot:** Yeah, I mean, you know, like Polycam, you know, for it, cause you're in a constrained interior environment, it's, it's going to be pushing the limits cause it's not small, you know, and Polycam is using the lighter scanner.


So it's, it's not going to be, the Lixil is going to be accurate on the, it's got a big spinning lighter on it. So it's going to be roughly a centimeter accurate, um, a decent range. And the iPhone is not, you're going to get drifted on, on the larger scans, but you know, uh, no, I might be able 


**John:** to get away 


**Eliot:** with it on what I'm doing.


[00:22:00] Yeah. But I, that, let me, uh. Find the it was uh, uh, give me a second. What was the name of the it was production? Uh summit 2024 los angeles 


**John:** No elliot, maybe I don't know if you have a uh Because I don't want to take up the whole meeting but with my with my textures thing I don't know if you have a blender scene that has textures that we know works that maybe I can Yeah, 


**Eliot:** let me try, um, what I should do is actually, is to actually go through and export that the mega scans one, like we did the, the import and I looked at, I'm like, oh, okay, great.


So everything's hooked into the principle shader, but I didn't, I didn't go through and I closed the, closed the loop. It was like, ah, that's not work. Um, but of course, assumption, but you know, in truth, you What it comes down to is the prints, the exporter, um, so, and I know this reasonably well because I talked to the people who wrote it in Blender, the USD exporter, uh, first looks for the presence of a principled shader, [00:23:00] right?


And it works. The, the principled shader, uh, and in, um, the BSD at the standard BS, uh, BSD principled shader and Blender, that's what it looks for. And then it, then it, then it comes out from that looks for the albedo or diffuse. And then, and it looks for the normal and it looks for. Um, roughness. Um, and maybe a couple others.


Um, I forget 


**John:** exactly. But should, but I saw in the tutorial that I should be using just emissions. Just emission shader. 


**Eliot:** Yeah, you can, you can, you can just plug that in. I think emission shader works, or you just plug it into the emission click of the BSDF. Uh, or the principled shader. And I think that, you know, that's, that's fine too.


It's got an emission shader. But could I 


**John:** just, do I have to do that? Or could I just have a light in the scene and have a regular, regular texture? 


**Eliot:** A light in the scene and a regular text. Well, 


**John:** I mean in other words there was something about like the the phone itself Doesn't have any lighting information in the app.


**Eliot:** So So if you if you have a texture that's like a scanned texture, that's basically assumes Emissive, then [00:24:00] you do need to plug it into some form of emissive, whether it's in the BSD, the principled shader, whether it's in the emissive port of that, or whether, I mean, blender has a native emission shader and that usually works, although whenever something goes wrong, I just plug it into the principled one and then it works.


Right. Okay. But either way, the, the interior lighting model of the phone is just, I mean, yeah, you got three orders of magnitude, less compute. So it's, it's very simple thing. And so the textures won't, the emissive texture won't look right unless it's plugged into an emissive. Emissive, you know, port and then then it's then it's ignoring the the phone lighting because the phone lighting is Almost not the 


**John:** phone does have.


I said that the phone didn't have any lighting at all. So, so for what you're saying is like, I don't need to, I don't need to use an emissive. In other words, in the BSDF, I could just have it into the regular albedo. 


**Eliot:** Yep. You can do that. And what the phone is doing is it's as you walk through and you've got your environment, it computes a very, very rough HDRI at first about, Hey, it's computing HDRI.


We're going to download it and use it. No, no, no, it's not worth it. Um, and it computes a very rough HDRI [00:25:00] from your environment and uses that as the lighting model. This most simple lighting model ever. Uh, and it's okay, you know, again, it's for the basic preview sort of thing. Um, it's just, it's, um, what's the best way to put it?


It's a system designed for when somebody puts an AR widget on their countertop to get the lighting to roughly match that. And it does 


that. 


Um, it just breaks down when you have, you know, a 500 megabyte city street, you know, yeah, I just want to be able to see, 


**John:** see stuff for reference. Obviously I don't care.


**Eliot:** Yeah. Okay. Plug it into the standard BSDF and, uh, and you'll see stuff. Um, and just, just make sure there's no material node networks because it doesn't, it doesn't speak material node network. It's like, where's the texture? If there's not a texture, it's like, you know, 


**John:** um, So, so one more question and then I'll, I want to let everybody else, you know, I don't want to hog the meeting.


These are, these are great. I mean, these are, these are the real questions, right? The, um, no, the [00:26:00] other question about the workflow is if I'm, if I'm shooting with red, um, because I had, I, I had trouble, um, with, with the sinking, I had to manually sink and I'm not at the moment. I don't have time code. I don't have a technical time code thing set up right now.


It's just just the using the slate off the laptop. You know, um, is there anything I, like, does the, does the red raw workflow work the same way? Oh, 


**Eliot:** excellent question. Really good question. We're actually writing 


**John:** that part. Okay, so should I should just use the ProRes files? I should just, uh, I should just make ProRes proxies and just use those?


For a little while, yes. Yeah. Okay. 


**Eliot:** We're still crunching through the direct. Oh, we're punching in the the direct raw conversion from Red, Alexa, and Canon. And it's gnarly because we have to do the correct color space transformations and some of the, those SDKs are like, I mean, Jesus. It's like, we're digging [00:27:00] bits and hitting them with, with tongs kind of, kind of stuff.


Whereas we need to do the color space conversions, convert it to AP, AP1 primaries, do all this stuff, do the right things, get into nice format, USD, you know, not USD, um, ACES CGXR. So nobody else ever has to look at this. Morass. Anyways. Yeah. Everyone's well, it's just funny. You, you bust open an SDK. I will not, I will not name names, but you're the nineties.


I mean, and it's, it's important because you're dealing with low. Low level bit depth stuff. And you start questioning, is it, is it big NDN or small NDN? And all these sorts of things that are, that are just intrinsic to when you're, when you're, you know, slamming, slamming bits, um, all right. So the, uh, the talk I'll give you, there's a link to this, the talk, um, and it's walking through, um, my theory on, on this kind of new way of doing location scouting.


And the cool thing is you can see, uh, the [00:28:00] Lixil, um, share a screen here. Yeah. Do that, do that, uh, share. And the talk's called, Bye Bye Brain Bar. Ha ha ha ha ha. Because we are working on, on, very much on removing the need for that. But, we go through, and let's see if the, if he's, if I'm gonna see, if you see any of the, Okay, so that's one of the Galveston Splats, um, Let's see if I have, okay, so it's, it's, this is behind me over here.


So this was, um, uh, you guys see the screen, right? Um, okay, so over here on the left, I've got a, uh, a Gaussian splat, uh, made from Elixir, you know, one, one of the big, big scanners by, uh, it was Eric Geisler's company. It's, uh, uh, Global Objects. And so, uh, Eric went and, and to a small New England town and scanned it.


Like, scanned the town. Which scanner is 


**Kevin:** this? Sorry, Sagan. 


**Eliot:** Oh, this is the L2. Um, [00:29:00] Kevin, have I talked about this with you? I, I, I always tout the Lixil thing, but, uh, you know, let me show you what this thing is. Uh, Xgrids. This thing's nuts. Um, so that is a revolving, uh, LiDAR scanner on the top of that thing with like four color cameras pointing all the different directions.


So it literally does a real time SLAM LiDAR map as you walk through. So you can hoof it through a fairly large environment and the resulting, uh, it gets both color and, and, you know, and point data. And the great thing, and it's, it's not final pixel. And this is one of the things I talk about in the talk, is you want to do this in two stages.


I think it's the world's perfect location scouting tool. Because one person with this thing running around, you know, if it's in a, you know, your backpack, they can carry on and that kind of stuff. And you just walk around and you can scan a big environment in a short period of time. You just walk it around.


And the other thing is all the other ones. Generate, um, a big mess of point cloud and like maybe color data. And then somebody has to go turn that into a form that might be useful [00:30:00] somehow. And it's brutal. Um, and where is this thing actually generates Gaussian splats directly. So we can just downsample them and we drop them into the phone, which is exactly what I did on the talk.


Um, so in the talk, so this is the, the New England, you know, thing over here. And I'm going through, um, the capture and, uh, like we dropped it into Jet Set, uh, the, the, you know, the subsample scan into Jet Set. I shot a take, pulled it back to Unreal and hit re render and, you know, it re renders, you know, frame match onto the original, onto the original Elixir data.


And then later on in the, in the, in this, uh, somebody asked me a question. I'm like, oh, let's, let's, let's, you know, fire it up. Yeah. Okay. So I just, uh. I just turned on, I turned on Jet Set and I'm running it on the stage. My right hand is holding, holding the phone. So it's tracking. And so that's the, uh, you know, that's, that's the actual, uh, you know, down sampled, you know, we, we cut the scan down, right.


Cause the original scan is like not small. So we cut down to a corner of it to fit into the iPhone, but yeah, it works. And then we [00:31:00] maintain the same coordinates, coordinate space between Jet Set and the iPhone and the original scan, uh, in this case in Unreal, the original, um. Lixil scan in Unreal and you can render it works great.


That round trip. How much is that? How 


**Kevin:** much does 


**Eliot:** that scanner cost? It's like 22k, you know, yeah, I read it You know ran it ran a few days and throw in the backpack and go wherever you need to go But I looked at that way or even get a local to do it, you know And because this isn't it's not a final pixel thing But it gets color and and spatial data at this crazy rate of speed and it's you know centimeter accurate.


That's fine You're not building a bridge with this thing And it's a zillion times faster than hauling the Pharaoh out, you know, waiting for it to spin, go, stop, even, okay, go and post, light up all the scans, and they're all like, eh, eh, you know, kind of stuff. Yeah, no. So my, and, and Kevin, my, my comment to, uh, to John, uh, cause John is, is dealing with something, a potential project where he's, he, [00:32:00] they all have very brief access to, um, To a large interior space, and so I suggested, you know, go down there with Elixir, you know, use the plan shots and then go in with a high res camera and, you know, Red, pick your, you know, pick your poison, shoot, you know, a lot of shots from all the directions you're going to cover, and build a really high resolution Gaussian splat for the background, and then you can shoot track shots, you know, just because then you know where the camera is going to be, right?


You want to know where the camera is going to be, otherwise you're going to spend a lot of time on places. You don't need to spend time on, um, but then you can do, you know, take all that, that data to post shot and make a splat and then, you know, render with it. Right. So then, then you have free form camera tracking, um, without having to have a giant crew in, in a sensitive area.


So anyway, so that was my, that was my talk I did on that. And, uh, I'm, I'm, I'm hoping somebody, you know, does it because it's by far the easiest way to do something like this. I mean, by far compared to, you know, trying to let, you know, do all the stuff you'd have to do to have [00:33:00] a crew in there. All right. Uh, John, anything else you don't, um, I can, I can, I can try exporting something, uh, or Kevin, did you have, did you have questions?


I want to go through this and, and Umaru, I see you're on there as well. So. Uh, I, I 


**Kevin:** really, I'm, I'm mostly just here to, uh, to learn smart things. So I have tiny questions, but don't even worry about me. They're not, they're not pressing. 


**Eliot:** All right. Uh, Amaro, um, do you want to, so did you check to see, you saw the clip I sent?


Um, so background, uh, Amaro had, uh, had, had a shot that the, um, when he processed it in, um, in auto shot, some of the tracking was, was, wasn't really accurately matched. Uh, and so when I loaded into resolve. Um, I'll show you what, and this is worth looking at to kind of see when I, when I check something, when I check tracking, this is one of the first things I check.


Um, And I think the image stabilization was on, on the camera and it just jacks up everything, um, to the point where we're going to have a, some giant blinking flag on the next release of our camera [00:34:00] calibration stuff that says turn off the image stabilizer, you know, at all costs, uh, cause otherwise, otherwise this is going to be a, it's going to be a rough day.


Um, but I'll, I'll play the, um, just for reference, I'll play, play what it kind of looks like when. When I think a stabilizer is on, I go find it and I'll share a screen because it's, it's worth seeing the, the shifts. So, all right, there we go.


So what I do is I, okay, I'm going to share screens. You guys see, see my debug process share. There we go. Okay. So what I, what I do is I just go through the, the usual steps of, um, take alignment. So I make a, and this isn't resolve, but I'm sure it'd work in anything else. So I loaded their, the camera. Um, the cam original first and set the timeline to, to match the correct tent camera original.


Uh, and then I find the [00:35:00] point in the camera original, uh, disabled video track. Um, there we go. And I wasn't resolved resolving. There we go. I'm gonna go back over here.


Yeah, let's see. There you go. Now it's moving. Okay. So I look for the point in the camera audio or the camera video where the first, you know, first frame lights up, make a marker. And then I, uh, put in the overlay, the jet set clip onto that and the jet set clip automatically it's 30 P, but when you put it into the resolve timeline, it, it, you know, it interprets it as the correct, correct value.


And I get both of them to align up exactly. You know, and that way, that way things are, things are lined up and once they're lined up in time, then, then, you know, you can, you can watch through the clip and as soon as you have, you have camera motion, you can usually see, um, like stuff's going on. So, I turned down my volume a little bit.


Uh, [00:36:00] So the trick, of course, is they're aligned perfectly in time, but the, um, as the camera moves, you'll see, you'll start to see it move quite a bit in a few places. And so, and it was sometimes it's, it's really obvious when the camera's moving fast, these were, these were more gradual shots. Uh, so it took me a while to kind of figure it out, but this was the place where I could, I can, I could see it.


So, so you can see the, the, the background is like kind of going up and down quite a bit with respect to the, the foreground. So I think, I think that's what's, what was going on. So Amaro, uh, so that, does that make sense? 


**Umaru:** Yeah, in my first test to get it. I figured, uh, that, uh, problem, I, I turned off destabilization, but I've forgotten.


Now I have many, many projects that I shot with destabilization, that, uh, a big problem for me now. 


**Eliot:** [00:37:00] Okay. So, um, uh, are they, how long are there, are the, so when you're editing this, is it one continuous take, or will you be editing different shots out of, out of your take? Um, 'cause we can, 


**Umaru:** so normally I export all the sequences.


How, um, like I showed you, I export all the, the, the shot 


**Eliot:** mm-hmm . 


**Umaru:** From the beginning to end. After that, I do my keying. 


**Eliot:** Mm-hmm . 


**Umaru:** I co, uh, pre-comm that on fusion. I run it in place. It become one shot, like a normal shot. Mm-hmm. I do my editing. After my editing. I, I decompress all that to do the, the, the, the good Kiko.


Mm-hmm. For, for, for rendering . 


**Eliot:** Okay. Um, and then when you're, when you're making your final project, are you going to be editing, editing, because this is fixable, but it's going to be hard [00:38:00] because we're, what we're going to, uh, it, it, it's moderately straightforward. It's just going to be a bit of a pain in the neck.


Uh, because fortunately if, if the, the, the takes are like what I saw, uh, you have, you, you scanned, you have the 3d scan of the area and this is, this is the saving grace. We have a 3d scan and then we can, we can start to fix stuff. Um, Have you used synth eyes at all? Uh, it's a post tracker. Okay, so you're gonna want to let's see So we built actually a pretty good Uh, post tracking refinement workflow, and it's, it was originally designed, um, to, uh, the, the iPhone tracking is pretty good, but if you have it really, really close to the feet, you're going to see them slip, even in good conditions, you're going to see it slip a few millimeters around and even the tracking's pretty good.


It's not a sub pixel track. So you have really exposed feet. We knew we needed to hit a sub pixel track for, for, you know, high end episodic feature kind of work. [00:39:00] And so I, We very much made sure we could do that. Um, and so I made a tutorial with SynthEyes and which is a, it's a very powerful tool. It can be very difficult to use if you're coming in from scratch.


But what I did is we built a workflow. Uh, I'm going to put the link here and I'll share the screen. Um,


and there's, so there's, there's a couple of different approaches we're, we're, we're doing with this. All right. Let me look at this here. So what I, what we did is we, we, uh, uh, auto shot can generate a output script, uh, that can drive synth eyes and it can automate most of the processes. And so I go through this tutorial of, um, setting it up and we use a special AI Rotomat generator, uh, to tell, to crop out the person so that the tracker doesn't pay attention to that.


And. What it does is, is it, is it uses the, the magic part of this is that, uh, we have synth eyes can detect points in [00:40:00] the scene that it'll use to track that that's fine, but by itself, it doesn't have any 3D information. And so what the script that we use does is it takes the jet set data, including the Sydney camera, you know, calibrated optics, because you've got that, you've got the calibration.


And, uh, as well as the scan drops into SynthEyes and sets it up with all the things that are otherwise it takes a lot of clicking and the script just out of automates everything so you can tell it to look for all the features in the scene, um, detect them. And then we use a feature in SynthEyes called drop onto mesh, uh, where we on these 2D points.


As you know, so I found a set of 2D points. What I'm going to do is, uh, tell it to go drop those points onto the 3D scan mesh. And then that tells us where we are in 3D space. And then the, the solver, uh, when it, when it locks the solve. The normal problem in 3D monocular tracking is that it doesn't know where it is in space.


So you end up with, maybe even if it solves, the camera is pointing in some random direction at some random scale. And [00:41:00] you spend the next, you know, hour and a half, like, going around the scene trying to adjust things and guessing at the scale. Which is a mess. And so this removes that, that problem. So I'm just going to grab these and go track, drop onto mesh.


And it, what it just did is it turned those, those kind of, you know, points that Synthi has already picked out the good points to track in the scene. Now we associated that with the mesh data. So now, now it's 3d points. So now we basically fed its, um, survey data, which Is, uh, that's, this is how a high-end feature film would work, right?


They would have a team with during tracking points and stuff like that, with a, with actual special architectural survey tool. Giant pain in the butt. I've owned them, owned a bunch of 'em, used it, giant pain in the butt. This is like, 95% is good and it's super easy and it's automated with, with if you have to shoot your scans with Jetset, which you did.


Um, and so then we can get your shots to lock. Um, using using this method, and we can do it pretty fast now. It's gonna be, um, you [00:42:00] know, quite a few different shots. You may end up having to break it down, but it's solvable, and it's once you get the process going, it's reasonably fast. You know, we're this shot, and we're getting about five minutes per per shot, and this was a few 100 frames, you know, so that's a few seconds at a time.


Um. And you can track longer shots in SynthEyes. You may, you may just, it's pretty fast. Uh, it'll all, it'll all kind of depend on, on, uh, uh, on how long of your cuts you want. And I did actually a specific tutorial on fixing a misaligned shot, uh, in SynthEyes. This is, this is gonna be the one, uh, let me put the tutorial link in here.


Alright, um, do, do, do, okay. Okay, there's the synthesized tracking and then this is the fixing a misaligned shot and this is One goes over the process of, we use the same methodology in the first one, uh, but in this case, and we, we import the shot, we go through it. And the, the problem of course, is that in this particular [00:43:00] case, something got, got whacked.


So they did their calibration and the phone, and in between the calibration and them actually shooting, something on the phone got, got knocked. Um, so the, the, you know, the scan doesn't line up with the, uh, uh, with the phone, uh, with the, the image. And. And fortunately in SynthEyes, we can go through and grab a few of the keyframes, move them up and down and, and get things to align and then use their, our same process to, to lock, to lock the shot.


Um, and this, this is probably what you end up wanting to look at because, uh, I mean, your, your shots, it's not too complicated, right? The shots it's getting, the camera's moving gradually. You've got to scan and the green screen has enough, you know, wiggles and marks in it that I think you'll be able to detect features in it without too much difficulty.


Um, And then you can solve the shots and it, and it's. Learning SynthEyes from scratch is too much, I'm going to tell you that. Like, this is not to make you a professional SynthEyes tracker. This is just to use the scripts, the automation scripts we, we, we built. Um, and, and the data, the Jet Set [00:44:00] data, and it will save your butt.


It's this, with that data and these scripts, you can fix crazy stuff that went wrong on set. If you have, if you have that scan data and you have all the calibration data, uh, from Jet Set. Then, you know, you can fix all sorts of things because you know, 80 percent of it already going in instead of 0%, which is the usual case with traditional 3d, uh, normal, unassisted 3d solving is so hard.


That is the reason I started like craft. I tried solving shots just from scratch. It was awful. Okay, so yeah, and, uh, and you can rent SynthEyes, it's, I mean, I forget, it's like 20 bucks a month or something like that, but you know, from the, from the Boris site, 


**Kevin:** 25 a month, I think, yeah, 


**Eliot:** it's not that expensive.


It's, it's pretty fast. And, you know, again, once you go through this and you learn it and we, we have a couple of scripts to automate different parts of the process, it's an eye opener because now, you know, you can drag anything [00:45:00] once, once you go, go through that, it's, it's, uh, it's made to handle. Things that go wrong.


Uh, so let me know how that goes. And if you run into problems with it, with it, um, you know, just screen, we can screen share and I can walk you through pieces of it on the shot. Um, but that that'll get you going on, on it. And I think, I think it'll work. Uh, work okay. Thank 


**Umaru:** you. 


**John:** No problem. No problem. Yeah, just a quick question because when you were looking at lining up those two shots, just I wanted to be super clear about something.


So the, the, um, the app is always recording through the phone. Those synth eyes takes are 30 frames per second. 


**Eliot:** So, okay. Uh, the original source data from, from Jetset, yes. 30 frames per second. And what AutoShot is doing is when you process that CineTake, AutoShot is looking at the, the, either the flashed frames and that's why we flash those frames or we use time code and plus, um, optical flow, because we didn't know, you know, the Jetset, you know, time, you know, tracking data is operating at one time zone or [00:46:00] time period.


And the CineTime is like. And what we're doing in autoshot is we're locking them. We find like now is now for those. And then we reinterpolate our jet set data, the 30 frames per second data. We interpolate reinterpolated to 23, nine, eight, 50, 


**John:** whatever, you know, I wasn't sure how that was working. So then, and then when you push out the, um.


The, the takes, uh, into, you know, in my case, Blender, 


**Umaru:** then 


**John:** the, the Blender scene will automatically be set to whatever the cinema, cinema camera shooting frame rate was. Is that correct? Give me a second. I think 


**Eliot:** what we actually do, okay, uh, in, I actually have to look at this a little, uh, a little carefully.


I'm not sure if we reset the Blender frame rate. I think we, I think what we do. Actually, so that's a really good question. I'm actually gonna have to have to look at that. The tracking data is re sampled and so when it comes into Blender, the frames match. Just, they just do. Yeah. We re sample the data outside of Blender and so we just load that in.


[00:47:00] Um. When we create our Blender scenes, uh, it can be created in one of three ways. Um, you can reference in, uh, So Blender has a couple different ways of hooking scenes together. One is a pen. Where you grab a whole scene and put it together. That's what I've been using so far. Okay, yeah, that's the place to start.


When you start having like 300 shots Then you're going to want to use the link system, which links the original scene data into the blender file, and then it's, it's creating a new blender file on all these, uh, for the track shot and you can use link and it'll reference it in. And, um, And then it's, you know, the background sort of thing.


Uh, or you can do basically an empty, what we call an empty comp starter, which is just, you know, it's just the track shot and, and actually, uh, interestingly enough, we put in, uh, Because we thought more people were going to jump into the Blender, um, the viewport compositor. We put in a Blender node group that's doing compositing in that.


So, but of course, almost no one's using that, but it's, it's kind of cool. It's, uh, there's some neat things in there that someday someone's going to jump into. Uh, we actually built the node systems to do [00:48:00] automated, um, uh, green screen, the leveling systems. I don't know if you've ever seen this in Nuke or Fusion.


It's like the, uh, the Ivy cake here in Nuke. Uh, where you, you give it a clean, a, an image, you cut out, you use the keyer, cut out, blow out the alpha of the, uh, of the, of the person, paint in the images and use that as a difference mat, um, and, uh, and diff it out. And then you can do some really cool things with keying.


So we built that in Blender, no one's ever used it. That's all right. 


**John:** People are keying in confusion. I get it. I guess what I'm saying is that if I have 23. 98, uh, on my shooting camera, and if I go into the blender, I could, I could look, open one of those blender projects and just see what it did. Cause I don't, I didn't even pay.


Yeah, look what it did. I think, 


**Eliot:** I honestly can't quite remember whether we just, whether we referenced the initial frame rate from the appended scene or the linked scene, or whether we're just setting the blender frame rate. Actually, that'd be a great thing to find out. Because I actually can't remember.


I'm going to look right 


**John:** now because I haven't, I didn't even mess. No, it stayed at, is this the right one? [00:49:00] That was at 30 frames. That stayed at 30 frames per second. But I don't know if I'm looking at the right, uh,


no, that wasn't the one with the cinema camera. I'll have to, I'll have to go in and find which, which one. 


**Eliot:** Yeah, and it's, so there's actually an interesting question to be posed, which is when we're synthesizing this new Blender file, um, should we, um, basically reference the, the, the frame rate that's set up in the original Blender scene file?


Or should we actually force the newly generated Blender file to use the frame rate of, of our cine camera? That's actually a really interesting question. 


**John:** I would say, oh, it, oh, that is, I think that's what it did do. It did, okay. Yeah, because now, now I'm looking at one of my exported, um, you know, uh, takes.


And, uh, and actually that's the other interesting question. Uh, do you, do you mind if I share screen for a second? Oh yeah, screen share, this 


**Eliot:** is great. Exactly. I love these gnarly details [00:50:00] because, you know, nobody pays attention to it until like 300 shots land on them, and then it becomes your life. It's getting all this stuff queued up a priori.


**John:** So, so basically, in here, in my, in, in, in the take, Um, I'm sort of seeing, you know, I'm still seeing the camera, the camera footage from the cinema camera. But I have two frustrums or I have two frame guides is one of, and I know that the inner one was the aspect ratio I shot with. So is the outer, I'm just not quite sure what those, what those are like, what, why I have two and what the difference is.


**Eliot:** Oh, um, let's see, just sort of open up, see, this is, that's the, that's the image plane. And yeah, All right. Yeah. So that's the camera. Um, and then perspective, camera, focal, uh, [00:51:00] just 


**John:** to make 


**Eliot:** sure, uh, let's go up to, uh, I don't think you have your viewport compositor on, but let's open up this menu. Uh, there we go.


And okay. So the compositor is disabled. Um, all right, let's take a look at our, how do you, how did you do that? Oh, yeah. How did you, oh, this is the annotation tool. It's, it's on the left. Uh, left side of the frame. So there's, and it took me the longest time to find this out. The lower left corner of zoom, there's this tiny little, little green pencil, that's the annotate tool.


They used to have a button that says annotate, and it took me, after they put it over here, it, I don't want to tell you how long it's going to be to find that. That's pretty cool. I didn't realize that existed. That's cool. Oh yeah. Okay. Sorry. What do I change? Uh, no, you're, you're good there. Um, so I'm going to, let me delete my annotations.


Let me, let's take a look at. Um, let's look at over. Okay. So there's a resolution 1920 by 1080. Um, what was the camera original [00:52:00] aspect ratio? Were you shooting two to one or something like that? 


**John:** Yeah, it's like, it's, it's, it's, I think it 


**Eliot:** is two to one. Yeah. Red two to one. That's right. Okay. Okay. That's sort of interesting.


I would have thought that we would have adapted. Why are we, um, and are you working with a 1920 by 1080 proxy? Is that what's going on? Because what we usually do when we're generating. Um, uh, when we're generating, uh, from camera, original footage is this, we would, we would set this to exactly match the camera, original pixels.


So if you're shooting magic, you know, 6k, whatever, then this is like 61, 44 by, you know, 3, 200, whatever, some goofy number. So my guess is what you are working with is from your two to one camera originals, you pulled 19 by 20 by 10, 80 proxies that have, um, that have a frame crop on here. 


**John:** Yeah, that's probably what it is, Elliot, now that I'm thinking about my workflow, because it was all just, you know, trying to get this, it was sort of a mad, just crazy, trying to get anything to work, uh, moment.


[00:53:00] Yeah, yeah, yeah, I gotcha, I gotcha. Probably lost in the blur of all that, because again, I, I ran up against the thing, like, oh, right, right, I can't use my R3D file, so I got, uh, you know, 


**Eliot:** yeah. This is why we're gonna, we're building direct decode for all the major, the R3D, Canon, Canon C RAW, ARRIRAW, is that way we, you know, It's there too many switches when you're doing all this stuff, you're transferring over and you're got like 600 takes, you know, like, you know, this is, this is why we want to want to just remove and there's no reason there's, there's no artistic, what's the best way to put it?


We're not making decisions for anybody when we're correctly pulling the frames, you know, from the camera originals and getting him a DXRs. There's no artistic decision and getting it to an EXR. It's just. Camera, EXR, and then the artists get to do the real art stuff and not worry about goofy color gamut stuff, which is hell.


**Umaru:** I have a little question about Blender and [00:54:00] Unreal. Why using Blender over Unreal? Because I think Unreal is easier to use with Jet Set than Blender. Blender is heavy for me, I think. 


**Eliot:** It all depends on what you're doing. Um, and Unreal is astounding, right? You know, in terms of what they're doing and the ability to, uh, especially put together a complex scene from kind of prebuilt assets, uh, you know, whether you're going from the Unreal store, uh, or if you're going from mega scans or something like that, it's amazing on that Blender is what you start running into Blender when you start having to.


I guess go off the trail a bit more when you start having to build a lot of your own, own pieces. Um, then that's the part where it gets really strong. So it's, it's, uh, uh, it's just, it's just different, different tools have different sweet spots. Uh, Unreal is more of a very fast, um, assembly system where you, you don't build, you don't, you don't scratch model something in Unreal, right?[00:55:00] 


Not going to happen. You, you assemble a bunch of assets into Unreal and it renders them really quick. Uh, and Blender is more of, I got to build something from scratch. Yeah. So we, you know, we use both, you know, just horses for courses. Um, Blender is easier to debug. I'll tell you that it's a lot easier to debug.


Uh, and it's very easy to write custom tooling for it. So whenever we're trying something where we're not sure what we're doing yet, we usually, we test it out in Blender first. Uh, it's cause yeah, that way we can always, I can always fix it. Whereas sometimes in unreal, you can get to things that I don't know how to fix.


Um, uh, especially. Uh, if you have, if something breaks in on Unreal, it can be very, very difficult to understand what broke so just, but if it works, you're, you're good. And they're, they're both amazing. 


**Umaru:** I have trouble to import one sequence on, uh, Unreal, uh, the city sample. Do you know it? 


**Eliot:** Yeah. Okay. [00:56:00] Yeah.


That's okay. That sample is using three or four brand new systems that are running at the absolute edge of what Unreal can do. Um, And they were, they pioneered a lot of stuff in that system and I, I've downloaded it and then when I ever tried to do stuff with it, things broke and I couldn't fix them. Um, so I, I went, okay, I can't, I can't solve this.


And if I have to do a simulated city, I'll probably do it in blender. Um, or actually these days you would, you would, we're, we're very close to being able to do an AI. Generated 3d system, which is going to save a lot of effort. So, but yeah, I don't, I don't, we've 


**Kevin:** been using city sample. I think the thing to keep in mind in city sample is that it's probably the most expensive and high end unreal asset ever made, right?


Like whole teams of people were dedicated to figuring out how to make that whole thing work and. And those people are more specialists in whatever little thing they're doing to do that optimization, [00:57:00] to do that whatever, than I am. And, uh, so, so you end up trying to do really simple things. Like, I want to just move this building over a little bit, or change the sidewalk texture, and you get completely lost in their, in their complexity, so.


**Eliot:** Yep. That was my experience. Um, you know, if it, if you can, if you can use the asset in unreal, like as is like, you're not going to change it. Great. And it's, as soon as you start, you know, going off that trail, um, then it's frequently a cliff, whereas, you know, uh,


let me see, cause, and Blender doesn't have as high end of things, but they're, they're, they're growing. So this is, uh, share this guy. So blenders. Blender has a procedural system in it that's, that's really, uh, you know, Somebody clearly was looking closely at Houdini when they, [00:58:00] they did it. Uh, but there, it now has, I mean, this is stuff, this is all Houdini stuff, right?


That the people have, have, they're all techniques to people 20 years ago and Houdini, um, and then they're finally making it into the other applications. But now, now you can actually lay out procedural systems and start to make it. And it doesn't quite yet look as good as, as the best of the Unreal systems.


You know, they're, they're still, they're still better, but the thing with this one is, is, I mean, you can, you can fix stuff. Yeah. Jump to your nose, you pop it open and you look at it and you can see how all the chains went together and, and how, how it works. All right. Um, let's see. Actually. So John, you want me to, I can, I can try pulling in, uh, John had a question on, uh, where he ran into, um, problems with exporting, uh, some Megascans pieces.


Uh, and what I can do is. Let me just do it. Let me go find my, there we go, stop share, uh, let me pull up [00:59:00] Blender, let me pull up my Blender.


All right. Blender, Blender 


4. 2. And what we'll do, what we'll do is we'll just export one of the Megascans pieces that we, that we did. And, uh, it'll be interesting. I think it should work, but it's always good to, good to make sure these things work. And if not, then I have something to fix. All right. So let me go find, um, I'll share my screen after I get it.


Go find it open. There's unreal. Okay. There's mega scans


stuff. Remember where I put it? Oh, there it is. Okay. Unreal fab. Okay. And I'll get the junkyard.[01:00:00] 


Okay. So I'm going to pull up a, uh, screen shares. You guys see this share and screen. Okay. So what this is, is, um, Just for reference, this is the, we built a, our own Megascans Blender importer. It's just a simple script that you can, that then you can download, uh, the Megascans, uh, zip packages from fab, you know, com.


So this one was, I think a junkyard. Turn on my material preview. And what it does is it uses Blender's internal asset system to generate a set of assets, uh, both 3d and 2d for, uh, for the system, right, on Blender. And Blender's thinking while it loads in all the textures. And the nice thing about this is that, Blender's asset system is really nice, right?


So, uh, models, is it going to load? Where'd my assets go? [01:01:00] That's kind of weird.


Normally there'd be a whole set of these things, but the previous thing seems to be kind of not behaving. Let's try this. New general. All right. So if I want to load up assets, 


**John:** sorry, Elliot. So is the, is the workflow, do you do, do you download the asset first into a local folder or does this, does the, does this a importer work directly with the, uh, the fab website?


Uh, you download it into a local full folder. So if you 


**Eliot:** go to, uh, see, let's see, go to fab. Um, let me pull this over fab. com. Let me just pull up a,


So let's see, so in fab, if you go, uh, Quixel [01:02:00] and then we can, you can pick out the different, different pieces of it. So this was, I think the junkyard, um, I think I was experimenting with a saloon interior, but you can click on it and then you get a choice of, you can either, you have either unreal format or FBX format, right?


And so FBX, what you end up doing is you, okay, we're going to download. Uh, there we go. Zip. So that's a lot of gigs. All right. So I already downloaded that. And, and actually I'll just show you how the, how the plugin works. Cause this is probably a better way to do that. See how it works. So file new and then I'll link to the tutorial on this.


Uh, all right. So we're going to go to the end. We go to our, uh, batch import, get rid of everything in the scene. Okay. And so let's save our blender file. And I'm going to go to Unreal, and Unreal Projects, Fab. Okay, so let's save this as, uh, let's try [01:03:00] the saloon. Uh, or actually, let's try the slate quarry. No, the junkyard's fine.


Junkyard, uh, midnight. Alright, we have to save the blender file. And then I'm going to, just so you see what you get when you download, uh, from that. Go find this.


And this is, you know, the details are in the tutorial. So this is, this is more kind of useful reference. Unreal fab. Okay.


So we go down to fab and so this is just where I left the stored stuff. So you get a, a, um, um, you get just a zip file, right? Junkyard, you know, junkyard, and it's different levels like raw mid, depending on how big you want your textures. So I was doing one of my experiments with junkyard mid. And so what happens with that is you download it and unzip it and you get this really not helpful set of directories.


So let's go over into, you know, [01:04:00] this random one and there's, this is it, right? It's just a set of raw maps, right? There's no material information, you know, it's just, and it tells you displacement, gloss, this kind of stuff. Um, but normally you'd have to sit there and hand wire every, every one of these things.


And, you know, no one's ever going to do that, right? And I just accidentally dropped one of the, there we go. Um, and some of them have the 3d ones have, uh, an FBX file in it. So 3d file, and then all the maps be someone who's going to have to sit there and do that, or you just write a script because it's at a pretty structured format that goes through and parses all that.


Right. So, okay, so let's go, uh, put it to the folder path. I'm going to go to here and I'm just going to go point at the very top level of the, of the folder. Um, where's my fab folder. There we go. So we'll use Junkyard Mig because that's what I was testing with and accept, uh, okay. And I'm going to tell it to, um, [01:05:00] let's bring it as objects instead of collection assets.


And I'm going to use displacement. I'm going to tell it to import. It's going to chug for a second. It's going to go through and import the various assets and give it a second. There we go. It's pretty fast, right? So, and I'm going to go to my. Asset browser, and here's our, I'm going to go to the current and what it's doing is it's, is it's building, you can see it building all the previews.


So current file. Okay. So this is just what it created in, in, from, from the assets, right? So here's the, these are 3d models. You can see the screen, right? These are, so, so if I switch my materials previews, you can actually see stuff. In this original file, it just puts all the 3d objects at the 0, 0, 0. Um, uh, you know, portion of the scene.


Um, and because you don't really want to build with these asset files, right? You, what we ended up doing in the tutorials, we save them to the blender standard asset [01:06:00] directory. Um, and then when you load in another blender scene, um, Then you can actually automatically reference those as, as, as the, the asset scenes.


So, uh, and I'll show you where the blender standard directory is. It's over. Communally located in documents and then Blender, Nope, not Blender add ons. Blender assets. And so that's where I put, you know, the junkyard scene, the South African states, you know, all the things that I already built, built assets for.


And then, you know, you can actually just drag and drop stuff into the scene, right? So if I, if I like this, you put this here, zoom in on it and it has the full 3d texture. And this is the mid level one. The high level, high res ones are very, very detailed. Um, you know, so, and if you wanted to just apply a material, uh, semi 3d cursor, like a mesh plane.


Okay. [01:07:00] All right. So I'm going to, if I want to drag it, apply a material, I can just drag and drop a material onto it. And there we go. And it comes in with the correct transparency and everything like that. So if I want it to be lit, uh, let's turn on cycles.


Where is my cycles? There's cycles. And let's do GPU compute. There we go. Do noise. All right. So it's going to think for a second. And I don't think it sees anything quite yet. So let's add a light. Oh, there we go. So of course right now it doesn't look like anything.


Right. There's like kind of a mess right now. So, uh, So a cursor, add a light. Area light. There we go. And X.[01:08:00] 


There, so we have it lit from the side. There we go.


But anyway, so then it actually brings in the, uh, the different material properties. And it should have brought in displacement. Let's see if it's My computer is like super lagging right now. I'm not quite sure why. Let's go in.


That was a bit laggy. Uh, let's try a different one. 


Let's try stones.


Did that load displacement? 


Let's check. 


Let's check our shading. See if it brought it in. [01:09:00] All right. So this is, this is the shader network that it automatically populates. So here's our, here's the original text here. And here is our, so it loads in the albedo and anamorphic. And that's actually interesting if it's going to give us problems when we try to export that.


So it's going to be one of the questions we can try it out. And, but anyway, so this is, this is what the, the script does and load plugs in the neural map and plugs in displacement. Um, so I think most of the things should go in there in the right spots. It's not really, it doesn't look like it's displacing.


**John:** Sorry, quick question about the, because I haven't used the asset library. Uh, the asset library, uh, is, you can access, access that from like any Blender file, because it's, it doesn't live in, it doesn't live in the Blender file per se. 


**Eliot:** Yeah, so the way the Blender asset system works is that, [01:10:00] uh, what the, what it does is it goes through, you build your initial Blender file and you mark it as an asset, right, and which is this little bookshelf icon over here.


Okay, those are all marked as assets. And then, uh, when you have another scene in Blender, you can either, you can make a new scene in Blender. It either looks for, um, uh, other, it looks at the different blend files in the current directory, um, and look to see if any of those have assets and you can, you know, access the assets from there.


Or it looks at, at them from a central location. You can pick the location in this case. It was the, I just put the default blender, you know, blender assets location. And so I was just been storing the asset system. Oh, it's saving 


**John:** a copy. It's saving a copy to that other folder 


**Eliot:** as 


**John:** you 


**Eliot:** go. Yeah. Well, you can, you know, you can, you can just put the blend file there and then it does to reference with blender.


A lot of things are manual, but on the other hand, they, you know, you can see what it's doing pretty easily. Yeah. Um, okay. Thanks. So then, you know. And ideally, let's see if it works. Um, and we can actually go and we're in, I [01:11:00] just made a new blender file and I'm going to open up our asset browser and then we'll, let's look for the asset library instead of all, let's go to our user library.


There we go. There's our user library. So now, now we're looking at, uh, different assets from that. Now, what I would have thought is they would show me the, the South African quarry one. I don't know if I see that. Um, I'd have to look at that to see if it's splitting it up into different asset. Uh, asset, is it another essentials?


No. All right. And it's under all. Okay. I'll have to look at it a little bit to see exactly what's going on. But, um, anyway, so that's, that's the basics of, of it. Is it, you know, imports it, sets up, sets up the material trees. And the question is, and let's, let's try this because I don't think I tried it. And I'm actually suddenly curious to see, uh, if this will export correctly.


Um, so let's go take a cube [01:12:00] and, or actually let's grab one of the 3D objects. Staircase. And it has its materials on it. Okay, there's a, there's a staircase materials. That's fine. So then what happens when we export that? Let's do it. File. Now we're gonna do export. Universal scene description. And let's bring it out to, uh, let's bring it out somewhere.


Where's my Unreal? There we go. Do test exports.


Alright, so let's bring this out. Staircase. Let's


export to USD. And let's take a look at what the shader is, just so we see what's going on with the stair. So we've got base color, this may give us some, some fits. Uh, what we [01:13:00] were debugging earlier with, uh, with John was the connection between the, um, uh, the, the, the exporter expects everything to be plugged into a base color.


And so I'm not sure if that multiply is going to give us troubles, whether than a straight texture, but Hey, you know, let's find out. Um, all right, so let's go, let's go back to our blender and let's export. Oh, actually, I think we're, do we already export? Let's do it again just to make sure. There we go.


Okay, staircase. Now go to autoshot.


Alright. Point that to our models.[01:14:00] 


Test export. Okay, there's our model. And let's set the just to the same one.


Alright. Okay, so let's export it and we can keep some of the texture size. Look better. Make USDZ, 


and if


it works, I'll just show it, show it on this, on the screen in a second. So push the file.


Yeah, there it is. All [01:15:00] right. You know, let me, uh, share my screen. Open up web page and video. Oh yeah. Let me sign in for my test flight stuff.


Planning a test flight here.


**John:** Oh, hey Elliot, while we're, while we're here, looking at the slate for a second, I just have a quick question. So, um, the video button there, that, that allows us to see through the laptop, that's the video from, uh, [01:16:00] from the phone? 


**Eliot:** Yep, that's, that's our remote video feed. Cool. Let's see if this is, let me refresh, and, there we go.


Yep. Play. And so if you were doing a 


**John:** take, 


**Eliot:** there 


**John:** it is, or you are doing a take, 


**Eliot:** you know, I'm just, this is just a real time feed, uh, from what I'm seeing in, in jet set. So, right. That's cool. There's our staircase. 


**John:** And so it would also be showing if you have somebody on a green screen, it's going to be showing you that sort of rough composite as well.


Oh, yeah. Yeah. So here, let me, uh, 


**Eliot:** let me just dial in a messy desk thing. But yeah, this is the basic, basic pieces of it. Right. Cool. Yep. This is, and this is honestly, this, there's an area that I'm, I'm working on, on this, which is, um, because we're, we're going to have a couple of things coming together pretty soon, which is where we're almost, [01:17:00] we almost have compositing in the phone, uh, starting to behave.


Uh, so we're actually going to be able to do a real time comp with the actual cine, uh, cine footage in the phone, uh, composited with the 3d background, uh, which is great. And now since we can actually do Gaussian 3D background, it's going to look, I'd say we're going to look probably 75 percent of what it's going to look like in Unreal.


We're going to be able to get with Gaussian splats. Um, you know, and then it's, you know, your onset is a phone, right? No, and a laptop, right? So you can, now the difficulty is that There isn't, the phone doesn't have a lot of hardware off connections. And so getting the, the remote feed, um, out to, uh, out to, you know, a, you know, the producer, right.


So, uh, is right now we have this, but it depends upon wifi and H two 64. Um, and it's, you know, I want to experiment more with it. I suspect what's going to happen is we're going to run simultaneous feeds of, um, You know, Axiom has a new device, which is a Cinemaster, uh, which is, uh, two things, a CMO, [01:18:00] uh, which is, you know, what we use to, to take video in, and it also has a wireless transmitter in it.


So I suspect what we'll end up doing is doing a wireless transmitter of the cam original, uh, you know, the green screen feed, and then we'll have our You know, our Wi Fi feed of the comp, so you can see both of them, and we're going to have some problems with the comp, I can just already tell you in terms of the reliability of Wi Fi on some of these sets, I think we're, I want to do as many experiments as possible with having a local router, that's like, everything's line of sight, and it's on your own network, so there's not much to go wrong, and then the, you know, the, The green screen feed is, is, you know, it's a hardware transmitter.


So those things are more reliable, more reliable on set. Um, I think that'll be, I think that'll be fine. That way you can look at one, say, okay, you know, this is what, this is what the, the camera original looks like. This is what the comp looks like. Okay, good. Um, but yeah, I'm thinking through it because it's, I would love to have a wireless transmitter feed at the comp, but it's just the iPhone.


This is one of the limitations is that it doesn't have that many hardware connections. It has one. Oh, and we're going to be using it with the, with the CMO to [01:19:00] get the, uh, the Cine feed in. But I think that'll be a reasonable trade off, but yeah, as you, you are all our VFX supervisors and you'll have to, as we go through it, tell me if I, if I've got that about right, um, sometimes people, you know.


Have people flip if, if, if, if the feed isn't, it's just perfect. And I think seeing the camera original be just fine. And knowing that the data is coming through is going to be fine enough if the comp is, is reasonable, but. Um, this is, these are the decisions you're, you're, that we're making to figure out how best to do this.


Um, and you can always, you know, this is something Kevin and I have discussed, which is what's the trade off point where you just bring along a giant workstation, you know, a stipe, a keyer, and you have two people sitting there and babysitting it to keep it, keep it alive versus what I'm kind of aiming for is something where it mostly just works, you know, and, you know, maybe you're observing it through the remote feed, but.


You're not having to do a lot. Things are mostly automated, um, and, uh, and it's a [01:20:00] very lightweight onset. So it's just so you guys see what we're, what we're thinking through. Um, okay. So, yeah, so it looks like the, the, um, Megascans importer exports. I'm reasonably fine. We can try others, but It's all, they're all gonna, they're all generated with the same, the same, um, texture map algorithm.


So they're all gonna work, they're all gonna look exactly like this because it's the same script. And there's a link to that script on the, uh, website? Yeah, that's a link, uh, there's a link to, um, there 


**John:** we go. 


**Eliot:** Let's see. That's, 


**John:** that's like basically, uh, it's a, it's a Blender add on effectively. Yeah, 


**Eliot:** it's a Blender add on we built.


Right. Because where I got, I mean Megascans is great and But I wanted a way to be able to use them in Blender. And I looked at Bridge and I just, I took two or three shots at it. I just couldn't figure it out. I'm like, screw it. It's a folder and a file and structured data. We're just going to write a tool and then it's bonehead [01:21:00] simple to use.


And when something breaks, we can fix it because it's Python. So here's the, uh, I'm going to share the link. 


**John:** That's my next, uh, my next, uh, project then is doing that. 


**Eliot:** Yeah. Yeah. I mean, then, then all of a sudden you get all the Megascan stuff and you're in Blender and it's great. So, um, anyway, so there, there's that, that's in there.


And I think, I think that I put the blunt, uh, 


**Kevin:** Yeah. And to Umari's question, who's gone is why I use Unreal Engine. Uh, I, I think the main reason I use Unreal Engine, honestly, is because of their marketplace and how well, what quality of assets they're getting once they acquired Quixel. 


**Eliot:** Yep. 


**Kevin:** What a ridiculous library of assets that was to.


To have access to, so the whole fab thing has made life actually a little bit worse for me, but I think better for blender users and other the broader community. 


**John:** So I just it's just I it's only I have only so many hours in the day to learn a new software so that for no reason I should [01:22:00] on my to do list is learning unreal.


But in the meantime, the only logic that I have whatsoever, I put that on 


**Kevin:** my to do list years ago, and now I'm working in unreal full time. And it's still on my to do list. It's still on my, so 


**Eliot:** here's actually an interesting thing. And, and, and I say this in a non trivial way, which is, and, and, you know, I'm not going to stop recording.