Gearbox Software on Borderlands 3 | Live from HQ | Inside Unreal


>>Victor: Hi, everyone, and
welcome to Inside Unreal, a weekly show where we
learn, explore, and celebrate everything Unreal. I’m your host, Victor Brodin. And with me today, I have Ryan
Smith, Technical Art Director of Gearbox Software, and Brian
Thomas, Managing Director of Design. And we’re here to talk a
little bit about Borderlands 3.>>Ryan: Yeah, yeah.>>Brian: Thanks for having us.>>Victor: Yeah, for sure. I’m super happy you guys came. We’ve been planning
it for a while. And now we’re actually here. And we’ve got the
editor loaded up. But before we dig
into the project I was just curious a little
bit about your history with Gearbox, and
sort of I guess my first question
should be, which was the first Borderlands
you guys worked on?>>Brian: Man, I’ve been with
Gearbox for 13 years now. So Borderlands 1 was the
first one that I worked on. So I’ve worked on all the
core Borderlands franchise series, so 1, 2, and 3. Did a little bit of stuff
Pre-Sequel, but not a lot. But yeah, I’ve been
around for a while.>>Ryan: I’ve been with Gearbox
for eight years. And my first title
was Borderlands 2. I came in on the tail
end of that project, like about three
weeks before ship. So I’ve been working on that. I did all our DLCs for that,
Pre-Sequel, and Borderlands 3. Did you play Borderlands 1? Yeah, a little bit. I remember playing it
with my step-brother. Oh God, this was so long ago. This was, like, 10 years ago. We were playing
split-screen on it. It was a super-cool game. And yeah, when I
got to Gearbox, I was like, this is going to be
so fun to work on these titles.>>Victor: Yeah. It’s definitely a
well-known franchise. And I know it has changed a
lot of people’s sort of ideas of what they want in
games, and what they like. You know, a billion guns– it might sound violent, but
it’s definitely a lot of fun.>>Ryan: Yeah, it’s super fun. A lot of toilet
humor, which is great.>>Victor: Yeah.>>Brian: For fun, we just had
the 10th anniversary of Borderlands 1 last year. And when it came out, it
was a hard sell for people to be like, yeah, it’s an
FPS, but it has RPG mechanics. And now that’s just,
like, de facto. Every FPS that comes out has RPG
mechanics in it at this point.>>Victor: Yeah, it is. And there’s definitely– I
mean, there’s even a little bit of pop culture when it comes
to, like, Claptrap and–>>Ryan: I remember
playing through 2 before I got
hired at Gearbox– or not playing through it,
because it wasn’t out yet– but when I got there and I was
actually playing the game, just laughing out loud so
much, and remembering it was one of the only
games that I’ve ever played that, as
I was playing it, I was just laughing really
hard at all the jokes and stuff like that. So that, to me, was always
a big selling point, is on top of all the
super-awesome gun combat, you just get hilarious
dialogue, and killing psychos. And what they yell out when
they die is always so funny. Like when you kill
one in 3, he says something like, “I wish
my child was born,” or “tell my child I
wish they were born.” It’s just ridiculous
stuff like that. [CHUCKLES]>>Victor: That’s good stuff. And I remember playing
a lot of Borderlands 1. It was clearly
sort of groundbreaking in terms of when it came out. How big was the team
back then versus now?>>Brian: Oh man, BL1, dev team,
probably around 75 people. Studio was a little
larger than that, just for operations, and QA,
and sort of ancillary roles. But yeah, I mean, it
was a pretty small team. But it was definitely
at that size where it’s like
you know everybody, and everybody’s just
wearing lots of hats. Like I was doing
cinematics, and I was also helping out with boss designs. And one of the other cinematic
designers was building a level, and doing a boss combat
arena at the same time. Everybody just kind of had
two, three different roles that you sort of
had to put on, just because the team was so small. So it’s crazy now, because now
we’re up to hundreds of people. And we have a studio in Quebec
City, and our studio in Frisco. So the scale of it is
just so freaking different than like, oh, man, I
had a problem with that. I’m going to walk down the hall
and just hit somebody, and be like, what’s up with this? [CHUCKLES]>>Victor: And you guys are both
down in Frisco, right?>>Brian: Yeah.>>Victor: And that’s your
headquarters?>>Ryan: Yes.>>Brian: Yeah. So our Gearbox Software and
Gearbox Publishing is there, and then Gearbox Studio
Quebec is in Quebec City.>>Victor: Cool. Well, I think that’s enough
of, you know, talking. And why don’t we just
dive into the editor. Because I know everyone’s
pretty excited to see this.>>Ryan: Yeah, absolutely. So I’m going to be showing
you guys our recruitment map, and some of the technical
stuff that we have developed internally to kind of help along
the production of Borderlands 3. So kind of the first
thing you’ll see is this is Borderlands
running in the Unreal Engine. We haven’t really changed the
interface too much at all. The tools have been
great all along. And yeah, it’s just– I know it’s probably
cool for a lot of devs to see Borderlands just
being used in stuff that they’re used to working on. But yeah, we used all these
tools to build the game. And the first thing
that I’m going to talk about that,
to me, was one of the most exciting
innovations that we built is our day/night cycle editor. We have an amazing tools team
and rendering team at Gearbox who helped us put all
of this stuff together. So the goal with the
day/night cycle– and any devs who are watching, if
you try to animate a day/night cycle without
tools, it’s kind of complicated. There’s a couple of
different ways to do it. You could do it in Sequencer. You could just kind of like
hard-script it in Blueprint. Or if you remember
the Unreal 3 days, we had to do
everything in Matinee. So we had to keep track of all
of the different Actors that contributed to the
day/night cycle, like Skylight, or Directional
Light, or any Meshes and Materials. We had to place all these
Actors in the scene. And we had to make
sure that those were consistent across
every single map. And that was just a
nightmare to work with. Towards the end
of the project, we had one guy, Carl,
who would go in and he would pretty much spend
so much time just copy-pasting from map to map. And we had like 30 to 40
maps or something like that. And then if somebody
broke something somewhere, he would have to
track down that map and figure out where it was. And if he made any changes,
they wouldn’t propagate. So some of the design
goals were we wanted a drag-and-drop solution. So we wanted everything to be
contained in a single Actor. We wanted to have a
custom editor that kept track of all the data. And we wanted to be able to
save those data sets out, and reference them across
maps so that, for instance, all of the day/night cycles
on Pandora were the same. So that’s kind of
the spiel there. So let’s kind of go in and see
exactly what this looks like. So we had our tools
department give us a specific window, Time Of Day
Preview Controls, over here. And this allowed
the developers to kind of come
in their scene, and just drag the slider
up and down to kind of set their time of day so that
they can sort of develop the map in whichever time
of day that you need it to. And it just works. And this is really cool, because
our lighting artist would kind of set it to nighttime. And they’d come in
the map, and they would light the
majority of the scene with mainly lights at nighttime. And then we would test
it in the daytime, just to make sure that
everything looked really good. Because in the daytime,
most of the stuff that we’re really
worried about is just what does the sun look
like, what does the bounce lighting look like. Because we don’t
really see too much influence of all
these other lights. But when we get to nighttime,
the auto-exposure kicks in. And we start to see all of
these light contributions, and how they affect
the scene at nighttime. And all of these lights– like,
we’re not doing anything crazy, like animating the
state of lights. We’re not turning
them on or off. We’re not really scaling
the intensity or anything like that. Because we’re making use
of high dynamic range. So at nighttime, because
we’re using close-to-physically-accurate
lighting values, the camera will expose up. And you actually start to see
those lights in the scene. Like if you look
at the ground here, we’ve got this purple lighting
from this light right here. And it’s just a standard light. It’s right here. It’s one of these lights. There’s a bunch of them. But if you think
about– like if you’ve got a floodlight on your garage,
it shines on your driveway, if that’s on in the
daytime, you don’t see that even though that’s a
really bright light. But as soon as it gets to dusk
and it goes into nighttime, that light, if it’s
still on, it becomes very bright because of the
way that our eyes adjust. And we took full advantage of
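The "camera exposes up" behavior Ryan describes can be sketched as simple average-luminance auto-exposure: the darker the scene, the larger the exposure multiplier, so a fixed-intensity floodlight that is invisible at noon pops at night. This is only an illustration of the idea; Unreal's actual eye adaptation is histogram-based and more involved, and all names and constants here are assumptions.

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>
#include <vector>

// Very simplified auto-exposure: map the scene's average luminance to a
// middle-grey target, clamped to an exposure range. A bright daytime
// scene yields a small multiplier; a dark nighttime scene pushes the
// multiplier up, which is why constant-intensity lights "appear" at dusk.
float ComputeExposure(const std::vector<float>& luminances,
                      float targetGrey = 0.18f,
                      float minExposure = 0.25f,
                      float maxExposure = 16.0f) {
    if (luminances.empty()) return 1.0f;
    float sum = 0.0f;
    for (float l : luminances) sum += l;
    float avg = sum / luminances.size();
    float exposure = targetGrey / std::max(avg, 1e-6f);
    return std::clamp(exposure, minExposure, maxExposure);
}
```

With the same floodlight luminance present in both frames, the daytime average drives exposure down toward the clamp floor, while the nighttime average drives it up, so the light's contribution to the final image grows without anyone animating its intensity.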
that as we were developing. [OBJECT CRASHES TO FLOOR] Oh boy.>>Victor: That was a first.>>Ryan: Dropped the water. Yeah, well, I’m a huge klutz,
so that was bound to happen. So let’s actually take a look
at what our day/night cycle tool actually looks like. You know what,
first, before that, let’s actually just
click on the sky. And I’ll kind of show you
guys, like, what we did here. So when I said there was
a drag-and-drop Actor we had one of our
tools team people just make a Blueprint that
has a custom time-of-day component on there that was
kind of like the tracker for all that data. But all of the
components here are stuff that’s very recognizable. Like a Skylight Actor just
has a Skylight component. Same with a Directional Light
Actor or an Atmospheric Fog component. And we just pretty much
dragged all of those components that we needed into
this singular Actor. And we have some
other stuff that we can tack on there, like cloud
rings, and Skybox stuff, and for instance,
like Elpis back here is part of that subclass
of that particular Actor. But these are the
main components that the time-of-day
editor actually looks for. So now that we’ve got
that out of the way, let’s go ahead and open
up this Asset editor. So if you look at this, it
kind of looks like Sequencer. Because you know, we wanted it
to look similar to Sequencer, with the timeline
and all that stuff. But we also wanted it
to be custom to what we wanted it to do. So on the left, over here– you know what, I’m
just going to make this a little bit smaller so
we can actually see the scene. When you have this
editor open, you can slide this back and forth. And you still see what’s
going on in the scene, which is really nice. But over here on
the left, we have layers that we can
transition to at gameplay. So if you’ve played
Borderlands, you know that there’s a part in
Pandora where there’s some boss battles that we wanted to happen
specifically at nighttime. So instead of just
locking the time of day cycle to a
specific time, we can just transition
to a specific layer. This is a part of the
game where the moon gets kind of phase-locked. And I don’t know about
spoilers or anything like that. But that’s all I’m
going to say about it. But you know, we just
kind of lock it in place. And we make sure the
keyframes are all shared. We have different
ones for the main menu than we do for the game proper. They’re just kind of
like subtle changes. And some planets will have
different layers, too. Like our Eden 6
time-of-day layer has all of the different
maps in their own layers, because we want to
get really granular with what those day/night
cycles felt like from map to map. But let’s look at some
of these keyframes. So if we click on a keyframe,
over here on the right, these are the keyframe
properties that we added. Up first, we’ve got
some general parameters. And these are
collection parameters that we can sort
of like opt into for each time-of-day Asset. So I click off here. We have a parameter
collection that’s being referenced right here. And we have these
properties, these arrays called allowed scalar parameters
and allowed vector parameters. And this is where we
are able to opt in. So if we were to opt in
any of these parameters that we’ve got– like I could add one
really quick here. Here is all of our
collection parameters. And I could add one here. And when I click
back on the keyframe, that is now accessible
for me to animate. So it’s very flexible for any
tech artists or any content artists that need to
add a specific shader parameter that they need to
edit throughout the course of the day/night cycle. The things that we used these
scalar and vector parameters on the most were things that
actually happen out in the sky, like the clouds. So none of these clouds
are being dynamically lit. They’re all emissive. They’re fake lit because
of runtime performance. We didn’t want to have a bunch
of translucent-lit Actors in the sky that
were overlapping. So we just figured that
an emissive approach with an unlit shader
was the cheapest way. And it gave the artists
the most flexibility, too. Because they can get really
creative with the types of colors that they animated
across the day and night transitions. You can see how pretty it gets
in some of these golden hours here. And our lighting team just
did such an amazing job with all of this stuff. And they were able to
pretty much get everything that they wanted to with
this particular system.>>Victor: And that’s all just
a texture, right? There’s no volumetric–>>Ryan: No,
so we do have volumetric fog in the game in some spaces. We didn’t ship with
it on Xbox and PS4. But we shipped with it
on the Pro and the PC. But we’re not relying on actual
volumetric fog, or any raymarched stuff in the sky. We had some concept artists
paint all of our Sky Dome and cloud textures– Lucas, and Tris, and Adam
Navarro painted a lot of these. And they just did– I mean, it’s really pretty. And it did exactly what
we wanted it to do. When you’re playing
a shooter, you’re not looking up at the
sky too much anyways. So no, we made sure that we
had like a really nice horizon cloud ring, and just
interesting things going on in the sky in general so
that everything felt good. So moving on with the keyframes
to kind of keep demoing this, once we get past the collection
parameters and stuff, this is when we get all of the
other recognizable properties that you would
want to play with, from the Directional Light, the
Skylight, the Atmospheric Fog, the Exponential Fog. All of those parameters
are here and keyframeable. We even have a crap-ton of
post-process effects, too. Because Carl gets really
into color grading and things like that. And like I said, they
wanted the most control that they could have. And we just made sure
that we had that. So you’re probably wondering
what these checkboxes are. So one thing that we
didn’t want to have happen is weird interpolations
happening, with minute changes or duplicate keyframes. So for instance, we have this
time-of-day variable right here. And this is just something
that the shader reads. It’s a 0-to-1 parameter,
where midnight is zero, and the midnight on
the tail end is 1. So we have a normalized range
for the entire day/night cycle. And instead of
having to make sure that the value
interpolates smoothly across all these
keyframes, we can sort of like opt out of animating
that particular value. So if we click on this keyframe
here, we could set it to 0. And then we could come
over here to this keyframe, and set it to 1,
and then disable that on all of these
values in between. So you can see that it’s enabled
on some, but not on others. This means that someone
actually didn’t do this right. But you know, it’s
a house of cards. It’s game development. It must not have mattered too
much for this particular map.>>Brian: Close enough for jazz.>>Ryan: Yeah. So we could just
disable that keyframe and it will do a smooth
linear interpolation across the entire
day/night cycle. So we could opt out of
keyframes so not all properties are getting animated. And these little
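The per-property checkbox behavior can be made concrete with a small sketch: when evaluating a track, skip keyframes where that property is opted out, so the value lerps linearly between the nearest enabled keys. The struct and function names below are illustrative, not Gearbox's actual code.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// One keyframe of a single scalar track; `enabled` mirrors the
// per-property checkbox in the time-of-day editor.
struct ScalarKey {
    float time;    // normalized day position, 0 = midnight .. 1 = next midnight
    float value;
    bool  enabled; // opted-out keys are ignored during interpolation
};

// Linearly interpolate the track at `t`, considering only enabled keys.
// With just two enabled keys (at 0 and 1), the whole cycle becomes one
// smooth linear ramp -- the opt-out behavior described above.
float EvaluateTrack(const std::vector<ScalarKey>& keys, float t) {
    const ScalarKey* prev = nullptr;
    const ScalarKey* next = nullptr;
    for (const ScalarKey& k : keys) {
        if (!k.enabled) continue;
        if (k.time <= t && (!prev || k.time > prev->time)) prev = &k;
        if (k.time >= t && (!next || k.time < next->time)) next = &k;
    }
    if (!prev && !next) return 0.0f;  // no enabled keys at all
    if (!prev) return next->value;    // before the first enabled key
    if (!next) return prev->value;    // after the last enabled key
    if (next->time == prev->time) return prev->value;
    float alpha = (t - prev->time) / (next->time - prev->time);
    return prev->value + alpha * (next->value - prev->value);
}
```

Disabling a middle keyframe this way avoids the weird interpolation wiggles that minute value changes or duplicate keys would otherwise introduce.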
purple guys underneath, these are states that we
actually tie into gameplay. So we could query, in
Blueprint, what time of day it is currently so that
Blueprint can do something either on construction script or
it can receive callback events. So imagine, like, a torch
flame effect or something like that, or even a spawner
being activated or deactivated, depending on the time of day. So we built all of
that into the system. And it just worked
out so amazingly well. And I guess now would be a good
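The gameplay hooks described here, a torch effect or a spawner reacting when the cycle crosses into a new state, could be sketched as named slices of the normalized day plus a change callback. The state names and function shapes are invented for illustration; they are not the shipped API.

```cpp
#include <cassert>
#include <functional>
#include <string>
#include <vector>

// A named gameplay state occupying a slice of the normalized 0..1 day,
// like the purple markers under the timeline.
struct DayState {
    std::string name;
    float begin; // inclusive
    float end;   // exclusive
};

// Returns the state containing normalized time `t`, or "" if none.
// This is the kind of query a construction script could make.
std::string QueryDayState(const std::vector<DayState>& states, float t) {
    for (const DayState& s : states)
        if (t >= s.begin && t < s.end) return s.name;
    return "";
}

// Fires `onChange` whenever advancing from tPrev to tNow crosses into a
// different state -- the callback a torch or spawner would bind to.
void TickDayStates(const std::vector<DayState>& states,
                   float tPrev, float tNow,
                   const std::function<void(const std::string&)>& onChange) {
    std::string before = QueryDayState(states, tPrev);
    std::string after  = QueryDayState(states, tNow);
    if (before != after) onChange(after);
}
```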
time to take any questions. If anybody has any
questions about this, I’d be more than
happy to answer–>>Victor: Yeah,
for everyone in chat, feel free to ask
us any questions you got. I’m fortunate to have these
guys for quite a while here– at least I’m hoping so.>>Ryan: Oh, yeah, we’ve
got plenty of time.>>Victor: Yeah. So they were a
little bit curious– let’s see, there
was one related to– they were wondering if
it’s all dynamic lighting, or if there’s
something that’s baked.>>Ryan: OK, so that’s actually
a really good question. We did a sort of a hybrid
approach to our lighting. So in Borderlands, we have a
stationary Directional Light so that we get, like,
nice, dynamic shadows. But the light isn’t dynamic. We’re not moving it
throughout the scene at all. It stays stationary in the sky. And we bake lighting so we can
get bounce lighting and all that stuff. Because we just didn’t really
like the feel of dynamic– like the way that
dynamic-lit stuff worked in our particular game. We wanted to get all that bounce
lighting and stuff like that. So we baked everything. But yeah, the directional
light itself is stationary. And I think the Skylight
is also stationary. If I recall correctly. I mean, we can just look. Where’s the details? There, that one. Oh, yeah, here we go. Yeah, so the Skylight
was stationary. But that’s actually a
really good question to segue into the next thing
we’re going to talk about, which is reflections. Since we were using
a high dynamic range, it was really challenging
for us to make sure that everything looked
good at nighttime because we were
baking the lighting. And we needed to
have a way for metals and other glossy surfaces to
reflect an accurate reflection environment. And the tools that
come stock with Unreal, they’re quite flexible. But the only thing that we
really had available to us is we either had to
recapture the Skylight and completely rely on the
Skylight’s reflection capture, or we had to tint and
adjust just the reflection environment in post. And that wasn’t
particularly good for us, because we have lots of
emissive things going on. We also wanted to
change the sky a lot. And if you’ve got a
really colorful sky, and you just add
a tint to it, it’s not really enough to get
us where we needed to be. So our rendering engineers
upgraded our Sphere Reflection Capture Actors to do
some really cool stuff. And I can kind of show you
what that looks like here. So here is a– this is just a metal sphere
with a metallic shader that has roughness set to 0. So we can kind of see a mirror
reflection of the environment. And if we change the
time-of-day cycle, you can see that all
of that is updating. And pretty much not every
frame, but we kind of have a latent kind
of capture going on, where it’s kind of like
accumulating itself every six frames or something like that. I can’t remember exactly how
our rendering team did it. But they made it so
that I think, like, one face of the cubemap would
get captured once a frame, just for performance reasons. It just made it cheaper. And the time-of-day
cycle changes so slow that you don’t perceive
any of that stuff. So it was just a clever
way to optimize it towards the end of shipping. I don’t know what that
big purple sphere is. I think I’ll just
move that away. So the way that we did this
is every one of our reflection probes kind of captured– they’re like mini
buffers in that. So it would capture
just the unlit scene, it would capture
the emissive scene– and those are our RGB maps. And we also had a
flag on our primitives that said, like, draw into our
real-time sky environment map. And that kind of dynamically
creates a mask in the cubemap. So anywhere where
you see sky here, actually just think
of it as a mask, where if you imagine
anything that’s in the scene that’s black,
and the sky is white, it would composite the
real-time sky in to match what they captured in the buffers. So I think I do need to clarify,
though, we weren’t capturing the scene every frame. We were only capturing what
was called our real-time sky environment map, which is where
the Actors are specifically flagged to be rendered into that pass. So that was almost always
just stuff in the Skybox. It was just the cloud
layer, the Sky Dome, and some of those emissive
clouds, things like that. So it was usually only
like five or six draws into that particular scene. And then we composited into
what we actually capture. We also capture the
Directional Light, its influence on the scene,
and the bounce lighting. So we can edit those
later in post– or not in post, but in the
actual time-of-day cycle. And that just gives us
sort of like this way to efficiently
update reflections that kind of feel
like they’re real-time. But it’s almost like
it’s pseudo-real-time. Because the scene is baked,
the sky is real-time, and we composite those together. And we needed to do that
because if we didn’t, when we went to nighttime, we
would get, like, really bright reflections because the Cubemap
Capture Actors were static. So they would be reflecting
what they capture in the daytime at nighttime,
and that didn’t really work well for us. So we had to kind of play
with some ways to fix that. And the solution that our
[inaudible] team came up with worked amazingly well for us. And it even did a really
good job in interiors, too. Like we can zoom all the way
over here into Shiv’s room. And this is all baked. And when we change
the time of day, the only thing that’s
really changing here is you can kind of see
that fog getting changed, because that was just a
property that we were changing. But nothing out here is
changing– or nothing in here is changing, which is good,
because we don’t want it to. But outside of here, you can
see, on the walls, all of that is changing, because that’s
all coming from the bounce lighting. So we get really good,
pristine lighting, that we don’t really
have to worry about it. We just light it
like a regular scene, and everything just works
with the time-of-day system.>>Victor: Is that using some
kind of Global Illumination to do that bounce lighting?>>Ryan: Just whatever baked from
the Directional Light.>>Victor: OK.>>Ryan: Yeah,
so it’s just Lightmass. But we capture the
contribution of Lightmass in those probes as
a specific channel. So the main direct
light gets captured, and then the bounce
light gets captured into a separate channel,
like the green channel. And then we can
tweak those values when we kind of recomposite
the scene together. So it’s kind of
like fake lighting. It’s almost like
each one of these is like a really lightweight
deferred renderer, just for the reflection pass, and
the GI pass, or the Skylight ambiance and stuff like that. So it does a really great job at
tying all the scenes together. It was really cool. I guess we can move on
from that, unless anyone wants to know anything else.>>Victor: They were curious if the
time-of-day editor was done all in C++.>>Ryan: It was. Yeah, we did not do that
in Blueprint at all. Our tools team just kind of
took what was in the Engine, and kind of just were
like, all right, cool. I could use all that
stuff that’s already here to make this look just
the way we want it to look and then there was some
homebrew stuff for the keyframes and stuff like that. But a lot of it was just
really clever data management and interpolation. And it performed really well,
it was really easy to use.>>Victor: It looks beautiful.>>Ryan: Shout-out to the tools
team and the renderers team. You guys are awesome. We wouldn’t be able to
do this without you.>>Victor: Give it up in chat
for the tools team. [CHUCKLES] They were wondering what the
total size of this map is.>>Ryan: Like, in megabytes?>>Victor: No. I believe in terms of units.>>Ryan: OK. You would probably
know that more than me.>>Brian: Oof, in straight UUs?>>Victor: Yeah, I think giving
them an idea might be– [crosstalk]>>Ryan: Oh, you mean– Like, actual top-down. OK. Yeah, let’s do a little thing. Where the hell is that? There it is at the top. Yeah, so here’s the
play space, where you see the terrain, pretty much. And then the rest
of this is Skybox. Yeah, the
ground and Skybox. So what is the–>>Victor: Middle mouse click?>>Ryan: Yeah,
middle mouse click. Yeah, so let’s just say 70,000
units-ish, 70,000 to 80,000. And this is one of
our smaller maps. This is actually one of the
smallest maps in the game.>>Brian: Yeah, besides the Circles of
Slaughter and Proving Grounds, this is the smallest
real estate in the game.>>Ryan: I don’t do a lot of measuring
in this view, as you probably realize. I was like– [CHUCKLES] Because I know you can do it. I just can’t
remember–>>Brian: When I get to my
part of the demo, we actually have a
really clever tool our Quebec studio
made that kind of uses a bunch of our
standard sizing metrics to really help block
out spaces quickly.>>Victor: There was
another question. They were curious if you are
utilizing streaming levels.>>Ryan: We actually don’t. We stream stuff
that we kind of just get for free, like textures
and all that stuff. But because we’re doing
just kind of isolated maps, we just pretty
much load the stuff that we need for those maps. And we just do it that way. One of the reasons we
do that is because we do local co-op and
local split screen. And we’re not on
dedicated servers. So it’s really important
for us to make sure that the players feel like
they can kind of go everywhere in a map and be separate
from the other players if they need to be. And we don’t want
to tether them back. So there’s some games
that actually do that. And it doesn’t really
feel that great. So this is the
solution we landed on, just kind of old-school
map development, where we load everything
in, and we let everybody do whatever they want to. Because if we had to implement
some streaming method, we probably wouldn’t have the
same freedoms that we did.>>Brian: Because we have drop-in,
drop-out multiplayer online as well, it just
helps that we don’t have to understand multiple
user states over our network, and understand what
streaming bits need to be in what, and
what combat areas are relevant to which user.>>Victor: And that can also get
pretty heavy for the host, right, that he would have to
manage all of that on his CPU. And so you’re taking
that out of the equation. But are you still using sort
of sublevels for managing what each one works on?>>Ryan: Yeah, yeah. I’ll pop open the
sublevels here. So we have a lot, right? We’ve got sublevels dedicated
specifically to missions. And we’ve got audio
effects, combats. I mean, just read the list. This made it so that
all of our departments can kind of like work on the
same map at the same time. Because they would just
check out the effects map. And the effects team
could go in there, and they can populate it,
and then check that in. And it just made it really,
really easy for multiple people to work on the same map at once.>>Brian: And even some of
our level art maps, like down there where you
see Recruitment, and Terrain, and Light, and Intro, those are
different zones of the area. And then the Lighting sublevel. And that way, one
level artist can be kind of propping out
of the beginning area, another can be propping
out the boss fight area, another one could
be doing lighting. And we can just parallelize
that work as much as possible. So like, I just un-hid
the Skybox level. So we would have someone go
in and lay out the Skybox. And at the same time,
a designer would be in here doing mission
combat or something like that. And we have light, too. So I’m not going to
disable lighting, because it’s going to kill
perf because of all the lighting. It’s just not going to– I’m not going to risk it. It might crash or something. Who knows? Law of demos, right?>>Victor: We did talk a little bit
about how to manage a project
when it comes to source control with several developers, just a couple of streams ago. We did some version
control fundamentals. And sublevels just definitely
makes that a lot easier. Otherwise you’d have like 5,
10 level designers all wanting to work on the same file.>>Brian: It also lets us do– we do a lot of
automated auditing and automated build testing. So when a build runs, we can
understand certain Actor types should be in certain levels. So it’s a lot easier if
we sort of split it up, and we understand, hey, if
there’s a non-lighting Actor inside of the lighting
sublevel, there’s probably something somebody
needs to track down. Just because a
Static Mesh probably dropped into it
accidentally or whatever. So it’s not perfect,
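That build-time audit can be sketched as a rule table mapping each sublevel to its allowed actor categories, with anything off-lane reported. The sublevel and category names are illustrative, not Gearbox's actual schema.

```cpp
#include <cassert>
#include <map>
#include <set>
#include <string>
#include <vector>

// One placed actor, reduced to the two facts the audit cares about.
struct PlacedActor {
    std::string sublevel;
    std::string category; // e.g. "Light", "StaticMesh", "Spawner"
};

// Returns every actor sitting in a sublevel whose allow-list does not
// include its category -- e.g. a Static Mesh dropped into the lighting
// sublevel by accident. Sublevels with no rule are left alone.
std::vector<PlacedActor> AuditSublevels(
    const std::map<std::string, std::set<std::string>>& allowed,
    const std::vector<PlacedActor>& actors) {
    std::vector<PlacedActor> violations;
    for (const PlacedActor& a : actors) {
        auto it = allowed.find(a.sublevel);
        if (it != allowed.end() && !it->second.count(a.category))
            violations.push_back(a); // wrong lane: flag for follow-up
    }
    return violations;
}
```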
but it at least helps sort of let
everybody know the lane that they should be working
under best circumstances.>>Victor: They
were wondering if– so shadows don’t change
direction with time of day?>>Ryan: Shadows do not change
direction with time of day. So this is always–>>Brian: By lore–>>Ryan: Yeah, oh, God. Oh, God. This has always been
a hilarious thing to talk about in the office. Because there are purists and
sci-fi fanatics who are like– they want everything to
work physically accurately. And I always get in arguments
with some of my buddies. They’re like, nobody notices
if the shadows change or not. Because when I first got on,
I was like, if the sun moves, the shadows need to move. And they’re like,
well, no one cares. And they’re like, pick a game
that has a day/night cycle, and try to remember if the
sun disk actually moves. There’s some of them
that you can pick. But there’s some of them
that you can’t really think of. And the fact that
someone asked– or in chat, they’re like,
do the shadows move– tells you that no one really
pays attention to that. They see the lighting
change and all that stuff.>>Brian: We’re a game where you throw
guns, and tiny little dinosaur legs spring out of them,
and they run around. So realism is not our
first card to play. By lore, the sun is on the
other side of the planet, and it bounces off the moon. And it’s in some sort
of gravitational lock.>>Ryan: Elpis is supposed to
technically be the sun. And it’s been that way
since Borderlands 1. So we never changed it. It’s like, well, let’s
just keep it that way.>>Victor: And this
is on Pandora, right?>>Ryan: Yeah, this is on
Pandora. So Elpis is actually
one of the things that we’re changing the colors
of in the time-of-day cycle. Interestingly enough, that
emissive on Elpis right there, if I zoom in, that’s always on. But like I said,
because of the way that we change exposure in
the daytime, you never see it. Because the lighting
values change differently. So I zoom, and you
can kind of see it. You just don’t notice it.>>Victor: They were curious if
the emissive is baked– from, I guess,
Actors in the scene, and models that have emissive.>>Ryan: Oh, are we using
emissive lighting on the primitives and stuff? Sometimes we do. Like if we have lava
or something like that, and we need a lot of
bottom lighting from that, we’ll throw in a
plane, and throw in emissive Material on it,
and turn on emissive lighting, and let that bake
into the scene. That always looks a lot better
than hand-placing 30 lights.>>Brian: Hundreds of point
lights.>>Ryan: Again, because we are baked,
it allows the lighting artist to go in and do what we call
light painting, where they can accentuate an area or light
the way just with lights, and add color to
the scene that way. We don’t have to worry about
the performance impact of that, because it just all gets baked.>>Brian: If you go back to
that oil sign, there’s multiple clusters of lights. There’s lights
directly up next to it, and then there’s another cluster
that’s closer to the ground, just to sort of emphasize where
we want that to be hitting.>>Victor: And it gives the
artist a lot of freedom, right?>>Brian: Yeah.>>Ryan: Did I talk about how it
shines on the ground, too? Yeah, I think I mentioned that. So this is just another example
of how, at nighttime, we just see the pop of
all of this stuff. It’s just– God, they did
such a good job at it. It was one of the first games
that we made where nighttime, beyond just being a blue
color and some color tinting, felt like a completely
different level because of the HDR tools. When these guys lit it at
night, and then transition it, it feels like you’re
playing in a different map. I mean, it’s the same map, but
it looks totally different. And that really adds to
the longevity of the game. Because our fans, they
play these games for years. And if it was just
static, and we didn’t have a
day/night cycle, it would just probably get
kind of boring and tedious.>>Brian: It is one of the
biggest enhancements. If you go look at BL1
and BL2 nighttimes, there’s just sort
of a blue wash, because that’s just basically
what the Engine could handle at that point.>>Ryan: Yeah, before PBR
lighting, everything was flood fill ambient. So we just had to tweak
a couple of properties. And with PBR, everything gets
a little bit more complicated. It looks 1,000 times better. But we needed to evolve the
tools so that our studio can actually handle that workload
of all the complexity that comes with that.>>Brian: But it’s a
good virtuous cycle. Because the Engine
having those tools allowed us to build the
time-of-day editor, which allowed us to have better
tooling to allow the lighting artist to then actually go
create the more hand-detailed stuff without it being super
time-consuming for them. So it’s just sort of a– now that we have that
tool, it’s like, OK, well, what can we really take
this out for a spin with? Because now we’re good
at it, and fast at it.>>Ryan: We did some really
clever stuff, too. In Promethea, we– so we
invent lore for all the planets that kind of like have
stationary shadows. It’s just a fun, kind of
goofy thing that we do.>>Victor: It works, right?>>Ryan: Yeah, it works. And it’s kind of clever. And it’s more creative
in some respects. So like in Promethea– Promethea actually has a sun. And it’s there and
it doesn’t move. And we don’t really
draw attention to it. But if you look up in the sky,
there is an asteroid belt. There’s a huge thick
asteroid belt on Promethea. And you could see it
orbiting the planet. And it’s kind of equatorial. So if you look up, you can
see asteroids clip the sun, or occlude the sun. And when that happens, we
tie the time-of-day cycle up to that animation so that
when there is an eclipse, we go to full night. So there’s actually six
or eight nighttime cycles in the Promethea day/night cycle
that just kind of like happen. And some are longer than
others, because you look up, and you actually see
those occluding the sun. And a lot of people might not
have ever noticed that detail, but it was something
that we paid a great amount of attention to.>>Brian: Yeah. And that planet is, then, also
meant to be hyper-urbanized, mostly built-out environment. It has way less terrain
areas and stuff like that. And so because of that, it has
way more of these neon lights and other ambient
light sources that are coming that are
not direct sunlight. And all of that just
nicely blends together. And you sort of
always get this– it almost always
feels near dusk kind of vibe, a little
Tokyo/future/noir vibe.>>Ryan: I can actually kind
of show you guys– because this system
is so simple, I can go in here and
just type in Promethea–>>Brian: If you could type.>>Ryan: I spelled it wrong. Promethea. I could click on that. And I could come in here and you see how complicated
this day/night cycle is? So we wanted to do like
a Blade Runner approach. But again, this is how
cool this system is. We could see what
another planet’s lighting looks like on Pandora. And it still works just fine. Like, nothing’s going to
break or anything like that. We could see what we did with
the fog and all that stuff. And if we scrub
through, you can kind of see, like, how you notice more
pink lighting for the horizon. Because we had a horizon
and zenith controls. We had all these things. And these are pretty much keyed
specifically to the painting texture that is controlled
with that 0-to-1 value that I was telling
you about earlier. And then it just loops. So we made it work to loop
with the time-of-day cycle. And then Promethea is always
changing color and things like that, depending on
how occluded the sun is.>>Victor: And that seems like
something that sort of you stumbled upon because
of the tools, right?>>Ryan: Exactly. Yeah, the tools enabled us to
do so much more than we thought we were actually going to do. Because we were sitting
there, whiteboarding it out. And we’re like, oh, it
would be really cool if we could transition layers,
and we can do all that, and just copy-paste keyframes
from one map to the next. We even saved data
of keyframes out. We never got that
involved with it. We ended up just sticking
to the actual Asset itself. But yeah, because of how
simple the tools were, we were just able to
quickly test ideas, too. And I remember being
in meetings with Randy, and he’s like, well,
what if we did this? And I’d be sitting there,
and without this tool, I’d be like, absolutely
not, we can’t do that. But because of it, I was
like, hm, yes, we can do that. And then a week later, we’d
have a prototype of it.>>Brian: What if the bad guys
phase-locked the moon? It’s like ugh, and then, you
know what, actually, yeah, we can do that.>>Ryan: So I’ll just pop
Pandora back in here. And we can set the
starting time of day, and we can transition
layers, and stuff like that. It all just worked. It was great. So if no one else has any
questions related to this, we can move on to some other–>>Victor: Yeah,
we can keep going. There are definitely
a lot of questions. But we’ll try and go
through some of them.>>Ryan: You can go ahead and ask
them as they come out, too. Some of them, I might
not be able to answer, because of stuff that we’re
not supposed to talk about.>>Victor: They were curious how
many reflection probes.>>Ryan: So we had
to limit those. Because of the upgrade
of the reflection probes, they became a little
bit more expensive than a standard reflection probe. So we had to take that
into account in our budget. So man, if
Strobel were here, he’d have the actual number. But I want to say it’s– first of all, it depends on
how many are on screen at once, and how many overlap.>>Brian: How big they
are volumetrically, and how much they actually
overlap with each other.>>Ryan: Yeah, so there was just a
GPU budget that we had for that. And it was usually like– we didn’t want people to have
more than like three or four of these overlapping. Four would be a
really expensive case. So we would usually
do something where we would have the Skylight,
we have a main fallback, and then we would put a big
one in a room, like a box. And then we would add other
ones that informed the– like if something was really
metallic, and needed it, we would add one there. But all of those probes, because
instead of just one capture, it was three captures, like
three different textures, or mini-buffers,
like I was saying, it kind of reduced the
amount of total probes that we could fit
in the memory budget– it decreased
that budget by a third. Because now one probe was
kind of like the cost of three in memory– ish. These numbers aren’t
super accurate, but they’re kind of ballparky.>>Brian: Yeah. And every map has its
own eccentricities. So like you can get away with
some more expensive stuff in a smaller map if
you’re not doing really GPU-intensive stuff in there.>>Ryan: It did make it more
expensive on the GPU because of all of
the work that needed to go into compiling– or not
compiling, but compositing that reflection
environment as well. So we thought that the cost was
worth it because of the result that we got and what we
needed to light with. So we shipped with it.>>Brian: Yeah. It’s also one of
those things where it’s like you know
it’s working when you don’t see a lot of bugs
or a lot of customer feedback about it. Like all those scopes always
look so good with reflections coming off of them. All the metallic
surfaces feel good. And if you look back at some
of the old Engine stuff, you can notice some,
like, eh, that’s not– it’s a little
over-bright at night, or a little dim during noon.>>Victor: Yeah, there’s more
general questions about the game that I think we can
dig into in the end once you’re done with
the presentation.>>Ryan: Yeah, sure, great. Let’s go ahead and move on then. So now I’m going to talk
about a couple of tools that we built just with
stock Blueprint, things that were incredibly
useful to the team during development and during
just population of the world. And this is going to be geared
more towards level artists or people who just like the
tech art part of all of this, the art part of the tech art. And all it is is just
how we kind of used what’s already in Unreal,
and just kind of enhanced it for our particular workflows. So one of our tech
artists, Brian McNett, did some really
awesome work with decals. We subclassed a Decal Actor. And we made decals that had
height maps and other things in the shader and we exposed all
of those properties into the Decal Actor
itself to give the most flexibility for the decal. You know, typically what you
might do as an artist is just kind of make a decal– like
think of graffiti or something like that. And then we bring
in the textures, and then we’d slap a Material
on it, and that would be it. And the Decal
Actor would just be a Decal Actor. It would have fading
and stuff like that. But what we wanted to do
to get the most mileage out of all these decals is
expose those Materials and all of their properties. Whoops. So if I drag this decal
off over here– actually, I don’t want that one. I want this road one. This one’s really cool– this one. So if we go into
the details here, we have all of the controls
for that particular Material, like opacity, we’ve got
height map blending opacity. So this is a really
great way to change how it actually fades in. And the goal with
these, by the way, is to make these as absolutely
artist-friendly as possible. We didn’t want our level artists
to have to go in the Blueprint, go into the Material,
make Material Instances. Because of the rate that we
actually deco these maps out, we needed it to be fast. And that’s exactly what
these were meant to do. They were meant to be flexible. So they have a lot of parameters
that might not always get used. But it just ended up being
really useful because of how flexible they were. By the way, these are
DBuffer decals as well. So if anyone’s
wondering about that. So yeah, we can control normal
strength, things like that. We could change the
color of all of these. And pretty much all of
the decal subclasses had all these properties. So as long as, in the Material,
one of our tech artists exposed all these
and set it up right, these just became so useful.>>Victor: And these parameters
that you’re changing here, they’re actually not
changing the Material Instance parameters, they’re changing
it individually for this decal?>>Ryan: So in this particular
case, each Decal Actor creates a dynamic
Material Instance that gets applied
to that Decal Actor. Every one of these decals has
its own Material Instance, which has some
memory implications, but it’s not as much
as you might think. We’re talking about
kilobytes of– I mean, even less than that for
one of these Material Instances. Because there’s not that
many properties being edited. But I will say– and we’re on 4.20. That’s what we shipped with. Afterwards, 4.22– I think
it was 4.22 or 4.23 introduced the concept of– they’re kind of like
buffer parameters. Each primitive component
has a constant buffer that you can add, and set
up in the Material itself. So instead of creating
dynamic Material Instances, we can expose
parameters that are specific to that primitive
in the scene that get interfaced in the Material– I’m probably losing people here. But at the end of the day,
eventually not all of these will create their own
Material Instances. They will use one
Material, and the Material will adapt based off
of the properties on the primitive
component itself, not based on the Materials. And that’s a technical thing
that you guys have released, that when it came out, I was
like, argh, I wish we had that. Because we could’ve optimized
our memory just a little bit more, just a tad bit more.>>Brian: But I mean, from
the big memory– when we’re talking
about some of the– if we didn’t have
this level of control, that is a completely
different texture sheet that we’re bringing in
to get a detail that behaves like that versus a
decal that behaves like this. So yeah, we’re going to eat a
little bit of dynamic Material Instance cost. But scalar wise against a
high-res texture, it’s trivial.>>Victor: And also it’s the budget
between the creation time, the amount of time it takes for someone
to actually use the tools, versus how much you’re
saving in performance. You have to balance
that as well.>>Brian: Yeah. And again, it’s all about– when we talk about
building tools, our end user is our
fellow developers. It’s the people at the studio. And we want everybody to be
maximally creative and fast. So anything that we
can do to speed them up is really worth it.>>Ryan: And it maintains–
this one in particular is a tiling Material, too. So it maintains its
texel density as we scale this up and down. So we don’t have to like
worry about rescaling if we need it to take up more space. All right, so that’s decals. Let’s move over to some
actual Blueprint tools. So one of the
things that kind of drove our level artists crazy– and there’s examples of
it all over this map– are things like this. These are actually
cable components. But we don’t sim
those in the game because they’re expensive
at runtime to simulate this. There’s actually a
funny story here. Strobel’s got–
Strobel is our optimization guy. We throw his name out
all over the place. Because when someone does
something really bad for perf, he’s like the bogeyman. He comes in your office, and
he’s like, what did you break? But he’s a very valuable
member of the team, though, because like one of the
few people– or not few people, but he’s the main guy
who really, really cares about optimization
and performance. So we had these
cable components. And he would come
in one day, and he’s like, why are there 128
cable components in a map? And the level artists are like– It’s pretty. Yeah, because I just
wanted to use them. And they’re fast to work with. So we either had to– we had to ditch doing that. We couldn’t sim them. But we changed them so
that we didn’t simulate, but they kind of did. And then when they got to the
area that they needed to get, we can lock them, and
then copy those around. But that took forever. And it was a bad workflow,
and people didn’t like it. And the designers wanted to have
a start point and an endpoint, and they wanted to
drag those up, and see it update instantly. And this map doesn’t
have a lot of them because this map was completed
before he made the tool, well into development. So these kinds of things
really drove artists crazy when they were placing them. This is actually
the perfect example. Because these flags
are Static Meshes. And they’re not very flexible. Because if someone wants to come
in here and move this post– I don’t know, I’ll
just move it this way– that’s broken. And then, well, I’m going
to have to scale this up. And then I’m going to
have to do all this. And it doesn’t matter
where the pivot point is. This is just a huge
pain in the butt. So we kind of made a new
type of Actor, called– it was a Hanging Spline
Actor, essentially. And I’m just going to go
ahead and drag that guy out here, and pull it up. And you’ll notice
this start point and endpoint, these labels are
kind of not on the widgets, because I think we’re doing
some screen scaling right here. But ignore that for now. So what these guys
did, we used what’s called a catenary formula
to kind of calculate the physical hang of
cables when you have a start point and endpoint. You can kind of just
calculate that iteratively. And we used Blueprint to do
that, all in the construction scripts. So with a few properties and
some start and endpoints, we can move these around. And you can kind of
see how that updates. And we had some
really cool variables to work with, like tension. We can make them tight. We can make them really loose. And this isn’t just
a parabolic arc– parabola? Whatever.>>Victor: You got it.>>Ryan: Yeah, OK. I’m close to being there. So when you drag
it up like that, you get that nice
sag right here. And yeah, there’s
easier ways to do this, but they don’t– sometimes
you hit uncanny valley, because you can’t get that exact
physical look without doing this all mathematically, or
simulating it like the cable components do, right? So yeah, you can
calculate all that stuff. And we used the Procedural Mesh
component to then create a– I call it lofting the spline. We create the verts that go
along the resulting spline. And we just make a tube that
kind of fits over that spline. And then we just get rid
of the spline component in the construction script. So when you’re
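The catenary math Ryan describes can be sketched outside the engine. This is a minimal standalone C++ version of the idea, not the shipped Blueprint: the function names and the parameters (`h` for horizontal span, `v` for vertical offset between the endpoints, `len` for cable length) are illustrative assumptions, and the iterative solve uses plain bisection.

```cpp
#include <cassert>
#include <cmath>
#include <utility>
#include <vector>

// Solve for the catenary parameter 'a' of a cable of length 'len'
// hung between two points separated by 'h' horizontally and 'v'
// vertically, by bisecting on: 2a*sinh(h/2a) = sqrt(len^2 - v^2).
double SolveCatenaryParam(double h, double v, double len) {
    const double rhs = std::sqrt(len * len - v * v);
    double lo = 1e-6, hi = 1e6;
    for (int i = 0; i < 200; ++i) {
        const double a = 0.5 * (lo + hi);
        // f(a) = 2a*sinh(h/2a) decreases toward 'h' as 'a' grows,
        // so an overshoot means 'a' is still too small.
        if (2.0 * a * std::sinh(h / (2.0 * a)) > rhs) lo = a; else hi = a;
    }
    return 0.5 * (lo + hi);
}

// Sample 'n' points along the hanging cable from (0,0) to (h,v).
// These are the points a construction script could then loft into a
// tube of vertices with a procedural mesh, as described above.
std::vector<std::pair<double, double>> SampleCatenary(double h, double v,
                                                      double len, int n) {
    const double a = SolveCatenaryParam(h, v, len);
    const double rhs = std::sqrt(len * len - v * v);
    const double b = 0.5 * h - a * std::asinh(v / rhs);  // x of lowest point
    const double c = -a * std::cosh(-b / a);             // vertical shift
    std::vector<std::pair<double, double>> pts;
    for (int i = 0; i < n; ++i) {
        const double x = h * i / (n - 1);
        pts.push_back({x, a * std::cosh((x - b) / a) + c});
    }
    return pts;
}
```

In a setup like this, shortening `len` toward the straight-line distance plays the role of the tension control: the sag flattens out, while a longer cable hangs loose.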
looking at the Actor, the only thing you
really see over here is the default scene root. But you actually should see
the Procedural Mesh component. I’m not entirely sure why
that’s not showing up. But it doesn’t matter. But you know, we can
do things like change the amount of edge loops. So if the resolution ever
gets a little cruddy here, we can just set that to like
50 or something like that. It smooths it out. You could change how many sides
are on there, and the radius, and all that stuff. So that’s really cool. And they absolutely love these,
because duplicating these and changing the tension, it
was just a really great way to hang wires, attach some
of the other Blueprints, things like that. And it just made their
workflow way easier. And that’s all just
done in stock Blueprint. That’s the power of Blueprint. We can do these really
cool things like that. But one day, one of
the level artists was like, well, what if I wanted
to hang something from these, like those flags
that you saw earlier. And I was like, yeah,
absolutely we can do that. Because we know
where the spline is. And we can scatter
anything we want on there with Hierarchical
Instanced Static Mesh Actors, and essentially get all kinds of
stuff scattered on these things for really cheap. We’re talking like a
draw call per unique Mesh that’s being added
to these components. So we have the ability to do
that here with these hanging Mesh Data Assets that we made. So these are just structs. We have a struct. And then we could
select the Mesh. We could override the Material. And we can change all these
parameters, like how many things get scattered on this. And I’m going to show you
a quick example of where these got used the most. And that was in
the Eden 6 system. And we just subclass those. So we made the structs, we
set the default parameters, and then we subclassed
that into a new Blueprint, and we call it hanging vine. And when we drag
hanging vine out, we get something
that looks like this. So you may notice
here that we have a Mesh that is also
kind of going along that spline catenary. Well, that’s because we
used Spline Mesh components. So we can measure how long these
things were on their x-axis, and then we could kind
of like divide those up along the spline, and
just add those dynamically in the construction
script as well, so that when an artist wants to
create kind of like that Eden 6 bayou feel, they
can just do that. And they can move these. And it’s all very fast. And all of these are getting
populated in the construction scripts.>>Brian: Yeah, and the moss
is all hanging down. So you’re not getting any of that
weird rotation at the edge bits.>>Ryan: And we could
change things– like we could jitter these, and just
settle these properties. We could change
how many we want. So if you went crazy and added
like 100, we could do that. And we even added
a scale curve that looked at the total normalized
distance of the spline. And we could change the
scaling of those Meshes depending on where it
actually is on that spline. So you could kind of see,
like, this is the scale curve. And that’s where this
peak is, right here, is right in the middle. So if I wanted to do something
crazy just to show that off, I can kind of do that. And it doesn’t update
until I actually move it. But see how it changed
that scale over there? Then you can go up even higher.>>Victor: By the way, we love
seeing anything that’s crazy.>>Ryan: OK, yeah. Let’s break it. Let’s lift it really high. Boom. So it actually
doesn’t really break. It just adds
another cool profile so that they all
don’t look the same. And this is done with
the curve tables– super useful. Any time you have a value
that you can look up, especially a spline, or UV
coordinates, or whatever, any kind of variable
that you put on a Mesh that you need to map
it to another value, these curve Assets
are just amazing. They’re amazing. We use them all over the
place, too, don’t we? We use them in design
quite a lot too.>>Brian: Yeah. Curve math– if you
really want to get into really good technical
design and technical art, learn curve math. There’s a great GDC talk– Squirrel runs the GDC
Math for Game Devs track. Almost every couple of years,
there’s some sort of talk. So if you get in
the GDC archive, there’s always lots of
really good talks about it. Because once you start
thinking that way, you just optimize your
workflows so efficiently because you’re not having to
constantly hand-key values.>>Ryan: So it’s just really
fun to play with, too. Once you get in here,
playing with the tension variables to get
these different vines. You can just really lay down
some cool silhouette detail. And I’m not a level artist. So this stuff
probably isn’t going to look as good as what
Dave or Brad can do. I’m just dropping
names left and right. These don’t mean
anything to anybody. But they mean something to me. All right, yeah, so we did
that for a lot of decoration as well. Ooh, I see a Houdini
question up there.>>Victor: Yeah, they were curious
if you were thinking about using Houdini to do similar things.>>Ryan: We did use the Houdini
Engine for some things. The problem– first of
all, I love Houdini. Houdini is great. We’re using it a
crap-ton right now. We were ramping up on
Houdini during production of Borderlands. It was still kind
of a fresh tool. And it’s a very
scary tool to learn. And all of our tech artists, and
even some of our level artists are starting to learn it now. So we didn’t get to use it as
much as I think we probably could have for Borderlands,
but we are definitely going to be using it in the future. One thing that the
Houdini Engine– it’s kind of slow
when it generates, because it’s not
using Procedural Mesh components right now. It’s running the script, and
it’s outputting a Static Mesh. So when you’re changing
values and stuff, you do not get the
same speed and feedback that you would get from
changing values in Blueprint. Blueprint works super fast. It’s like that. And I’ve talked to Luis at
SideFX, and he said that– well you know what, I
can’t remember if I’m not supposed to talk about that. So I’m just going to
not say anything else.>>Victor: We’ll leave that
for later, in case.>>Ryan: Yeah. But let’s just say that I
think Houdini will probably end up being a little bit
quicker in the future– I think, predicting. God, don’t sue me. [ALL CHUCKLE]>>Brian: How dare you say nice
things about them!>>Victor: They’re going to be here
on the stream in a few months.>>Ryan: I just want to shout
out to the Houdini guys. Those guys are so good. They’re so awesome with
their community outreach. And they listen to
their developers. And they make sure that
anything that we suggest, or any time we need
help with their tools, they are there
instantly to help us. And I have so much love
for that whole company and what they’re
doing, because it’s making our jobs as tech
artists so much easier with all their procedural tools.>>Brian: They care about
pipeline. And like I was saying,
we want our people to just be able to get
in the editor and jam as quickly as
possible, and achieve the things they want to do. And every time you talk to the
people from the Houdini team, they’re mentally at
that space already.>>Ryan: Yeah. They come to our
studio every year, just to be like, what
are you guys working on? What do you need? Can we do anything? And it’s always great. We go get pizza. It’s fun. All right, so the
next natural evolution of this when people started
using it and seeing it was the lighting team. They were like, well,
we do this thing where we hang lights from
these wires all the time. And that’s a huge
pain in the ass because I have to go
individually and place these Static Meshes, and then
I have to go place lights, and then I have to make
sure the colors all match. And if I set the
wrong value, and I’m editing the day/night cycle, and
the value is not bright enough, I have to go through
all of these, and shift-click them all,
and it’s a huge pain. So we extended the hanging
spline tools to lights. And you could see these
all over the place in this particular map. So this is one of those tools
that when we extended it, they went back and
ripped out all the temp stuff that was there. Because lighting kind of comes
at the end of the development cycle, because maps
seem to be locked down, they were able to use
a lot of these tools when we actually made them. Because God, the
development goes– it’s so fast that we have
stuff that we’re like, yeah, we’ll get to it, we’ll get
to it, we’ll get to it. And then we’re
like, hey, the game needs to ship in six months. And we’re like,
oh, crap, we really need to make those tools now. So we did that. And here’s an example of
them using those here. So this is a special
subclass of these. And these do everything
that you just saw, except they also add lights. And we have these here in a
special category that we just added to a subclass that allows
us to go in and kind of change the light color. Just set them to blue. So we’re going in there
and we’re making sure that the Material
instance is getting updated and all that stuff. And let’s go to nighttime, because
we can do that really easily. See that a little bit better. Yeah. So let’s change the
color of these lights. Oh, it’s baked, so it’s
not going to change.>>Brian: They’re
not going to do it.>>Ryan: If I move it,
they will, though.>>Brian: The Material Instance
will.>>Ryan: Well, the Material
Instance will, but this won’t. So let’s just break the
lighting really quick. There we go. It’s kind of changing. Let’s do something else,
like change the intensity. 5,000. There we go.>>Brian: Yeah, there we go.>>Ryan: Yeah. So now you can really see that. And we can make it even
brighter by adding more lights. And those, I think, are
controlled via this. So we can add, like,
10 more lights. And change the tension. And these are static
lights when they come out. So until lighting
is actually baked, these are super-expensive,
because they’re all dynamic until they get baked. So at first, the lighting
guys are placing these. And then I might get a
visit from Strobel and be like, what is going on? What did you do? And I’m like, relax, they
need to bake the lighting. And they would bake the
lighting, and it would be fine. So we made a lot of
different types of Light Actors for the team as well. Some change colors. Some can randomly select
between different colors of lights and all that stuff. And they’re all very
easy for them to manage.>>Brian: Yeah. When you go to Athenas,
they’ve got more Asian-inspired lighting, lanterns–>>Ryan: Asian weaves and things.>>Brian: Empty beer bottles with
light bulbs inside of them is the lighting inside
Sanctuary and a couple other different places. It’s funny, because
I ended up having to make a lot of
the– our main menu has alternate backgrounds that
turn on for holiday events. We needed to do a background
for a Christmas event, but we didn’t have
any Christmas art. But we had these things
sitting off the shelf. So it just populated with red
and green beer bottle light strands everywhere. Five minutes later, I have a
Christmas-themed background. Didn’t have to
bug lighting team. Didn’t have to bug art team. The content just allowed
people to riff on it quickly.>>Ryan: And I want to hang these
all the way across the map, because I am a lighting
artist for this stream.>>Victor: That’s beautiful.>>Ryan: Isn’t it nice? More. [CHUCKLES] All right, ship it. It’s good to go. Make this higher. All right, so– and I
think the last thing, I’m going to talk about
another Blueprint tool. And I just like to– I’m sticking to spline
tech, or things that we did with the spline component. Because I just think
that the spline component is like the best thing that’s
ever happened in my life– well, not in my life,
but in game development.>>Brian: There you go. Way to walk that back.>>Ryan: Yeah, I’m going
to walk that back. My wife’s probably like–>>Brian: She’s
watching right now.>>Ryan: Yeah, she’s like,
all right, so I guess I’m not as cool as splines. No, she is. She’s awesome.>>Victor: The struggles of game
development and relationships.>>Ryan: Yeah, so we wanted to extend
these to tarps, because we’ve got all of these bad boys– or these guys, like these dudes. We’ve got these guys
all over the game. And these are really
frustrating to work with, too, because they’re the same
exact problems as the flags. You know, an artist would
have to make a shape. And they would have to
conform their entire space to the shapes of
tarps that they had. But they also wanted the
tarps to hang realistically. So one of our level
artists came to me one day. And he was like, can
we do the same thing that we do with the
hanging splines for cloth. And I was like, no,
I don’t think we can. And then I sat there
and I was like, wait. I was like, all
right, well, what is a cloth when you
hang it from four points if not just four catenaries–
or cross-sections of catenaries. So I tested it out. And it turns out that
it actually works. It’s not physically accurate. But it’s enough to
kind of give these guys exactly what they need. I’m going to close
out the World Outliner. I’m not going to
close out of that because it will snap it back. So let’s just drag this down. All right, whatever. So when I made this, I
was in this mindset of, like, just keep it as simple
as absolutely possible. So I didn’t want to expose
any crazy Material things or anything like that. Or I just wanted
to expose color. Because I was looking at what
they were doing with cloth. And I was like, all
right, they seem to just be choosing a couple
of different Materials, and changing the color, and
the wind direction, and speed, and stuff like that. So I did this thing where
I enumerated styles, the types of cloth styles. So I have block-out. And I have different types of– I pretty much just
looked for all the cloth and tarp Materials that we had. And I made it so that we
can just hot swap those without us having to go
through the Material editor and looking for these. And it looks like some
of these aren’t working. But they should. They might be just compiling
Materials or something like that in the back end. That’s actually probably it. I wonder if I just
leave it here for– we’ll give it five seconds.>>Brian: Live game
development off our network.>>Ryan: Yeah, all right. Cool. We’re just going to go back to
tarp, because it looks pretty. Whatever. So when we move
these guys around, we can set how we want
this tarp to hang. And you’ll notice it
has the same kind of– I’ve got to change
these cameras. It has the same kind of
variables at the other catenary side. We can adjust tension
of specific sides. So this gave really great
shape controls to the artists as well.>>Brian: You made so many level
artists happy with this.>>Ryan: I probably made a lot
of content artists happy, too, because they
stopped getting tarp requests.>>Brian: Well, it’s also just– I mean, talking about bugs,
those tarps that are up in the Skybox, we had to keep
baking those multiple times, because little minor
things in the Geo would change just over time. Like the tower gets
a little taller. Or a crate that
was attached to it got moved for gameplay reasons. And now some
environment artist needs to go in, and take that thing,
and make minor adjustments to it over and over
and over again.>>Ryan: We can do
fun things here. These are all the
Material settings I set up for these guys. There’s not many. But change the color. And oh, you want it
to be really windy? Cool. Let’s set that to 5, or
10, or something like that. And now it just gets a
nice panning wind value. And we have Wind
Actors in the map that just change the direction. So it’ll automatically pick
up the wind strength that that thing is doing. So everything just
ties in nicely, and everything’s blowing in
the same direction, and all that stuff.>>Victor: Is that the default Wind
Actor? Or did you write a custom one?>>Ryan: We had a custom one. It wasn’t anything special. All the custom one was doing
was piping its direction and strength to a
collection parameter. It might have been
doing a little bit more. But I don’t think we did
anything in C++ with it. We just made it
do what we needed it to do to interface
with all of our art, and our foliage, and stuff. And we just used that. We just called it the
Borderlands Wind Actor. We put it on the maps. And it just worked. So and then, I’d love to
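[Editor's note: that piping pattern (one Actor writing wind direction and strength into a Material Parameter Collection that everything else reads) can be sketched like this. The dictionary standing in for the collection, and all names, are illustrative.]

```python
import math

# Hypothetical stand-in for UE4's Material Parameter Collection: one
# shared bucket of named parameters that every "material" reads from.
wind_params = {"WindDirection": (1.0, 0.0, 0.0), "WindStrength": 0.0}

class WindActor:
    """Sketch of the Borderlands Wind Actor idea: the Actor owns
    direction/strength and pipes them into the collection each update."""
    def __init__(self, yaw_degrees, strength):
        self.yaw = yaw_degrees
        self.strength = strength

    def tick(self):
        r = math.radians(self.yaw)
        wind_params["WindDirection"] = (math.cos(r), math.sin(r), 0.0)
        wind_params["WindStrength"] = self.strength

def tarp_pan_speed(base_speed=1.0):
    """A 'material' reading the shared parameters: the panning wind
    value on a tarp scales with whatever the Wind Actor last wrote."""
    return base_speed * wind_params["WindStrength"]
```

Because the tarps, flags, and foliage all read the same collection, everything blows in the same direction with no per-asset wiring.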
open up this Blueprint. But I just don’t think I
have the time to kind of go through everything. Yeah, I’m not even
going to bother. You get to see what the
construction script looks like. I’d say, guys, just find
the construct cloth node. And that’s going to work for
everything you need to do. [CHUCKLES] No, here’s the
construction script. It’s not that bad. What you don’t see up here,
there’s actually two splines. It’s just two splines. And that’s 0.0,
0.1, 0.2, and 0.3. Those are the ends
of the splines. And we adjust those. And it’s like two
wire components. And then I do
cross-section catenaries. And I just translate
all those points to a Procedural Mesh
component, which comes out looking like that. So it’s very simple. We can change the
amount of segments. You just have those kind
of flexible controls. And I think you can also
change UVs and stuff like that. So and here is all of
those tarp variants that somehow work in thumbnail,
but not in the viewport.>>Victor: Do you bake those
out as a Static Mesh once it’s been generated? Or do they stay as procedural?>>Ryan: They stay as procedural. Because we’re talking about– the Mesh memory
for that is not– there’s not going to be like
1,000 of these on the map. If there are, we’re going to
come talk to you at your desk. But there’s not. Our artists know
better than to do that. So we just eat the cost. If there’s a flag on
memory, yeah, we’ll go in, and we might bake those down. But there’s no point
if it’s not impacting memory in a really bad way. Maybe this works now. Ah, no. OK, well, I’m just
probably going to end my portion of
the demo with that–>>Brian: Cool.>>Ryan: –I think. Covered a lot of stuff.>>Brian: Yeah.>>Ryan: Or we could answer questions
before we pass it off to you.>>Brian: Do you want to answer
questions, and I’ll load up my stuff?>>Victor: Sure. Yeah, this is good. Let’s go through. We have a lot of questions. They were curious if you
ever used any Unreal Engine Marketplace content
for prototyping.>>Ryan: If we did,
we didn’t ship with it. That doesn’t mean
that we couldn’t have. I just don’t think
that we were– yeah, I just really
don’t think that we did. A lot of that content
that comes in, especially since it’s
not our content, it doesn’t match the
style of the game. So we would have
to redo it anyways. But there are actually
really cool Blueprints. Like, there are really cool
ivy generators and other– I mean, everyone has their
own homebrew of hanging cloth. There’s a really cool one
that drapes wires and does collision detection,
and will even wrap it. We would look at
the marketplace, and then someone would come
to me and be like, hey, we need that. And I’d be like, (WITH
SKEPTICAL TONE) do we? And if the answer was yes,
I’d be like, all right, I’ll just make our own
version that does everything that we need to so we don’t
have to spend $15 for that. [CHUCKLES] If you think about it that way–>>Victor: If you can make them,
it’s a little bit easier.>>Ryan: Yeah, for me, my
job is the most fun when I’m challenged to kind
of like recreate something that someone’s done. Using Marketplace content, while
it’s absolutely cost-effective if it’s right for your
game, we just didn’t do it. And I like making my
own versions of things. I like to try and– the “can I recreate that?” And that’s a lot of what
drives what I like to do.>>Victor: But also, you probably
have a large library of Assets from the
other two games that, in case you
need something–>>Ryan: Yeah. Believe it or not, though,
because last-gen wasn’t PBR, and our style is kind of
toony, and very stylistic, a lot of that
content didn’t have some of the maps that
were required for PBR. So they didn’t have
roughness maps. They didn’t have metallic maps. A lot of them didn’t
have normal maps. Because we baked
all the lighting. And when you bake
lighting, you don’t see too much of that
detail, especially when you don’t have a strong
reflection environment.>>Brian: Especially the BL1
Assets in particular. There’s such deep
black inking on it that you’d never see a normal
map in a million years.>>Ryan: What we did, though,
is we would take some of those and retexture them. So we would take the
content, and we would have someone go in and either– sometimes the source
had normal maps, and they would bring it
into Painter or Photoshop, and they would generate
all the content that they needed to make it look good. I think we did that
with a couple of things. We went through
several iterations with rocks, like especially
on Pandora, where we had– poor Tris had to make
like four different versions of rocks before we landed on
the ones that felt like Pandora. Because we were looking at– I’m probably getting
too in the weeds here. But we were looking at Western
United States rock formations. And we would sculpt those out,
and they would look awesome. And then we’d put
them in the game, and we were like, this
actually doesn’t really feel like Pandora for some reason. And for a while, we
just didn’t realize that it’s because we’re not
using the same kind of tones and the same kind of shapes. So what we ended up
having to do was just evolve a lot of our older
content to hit that feel.>>Victor: OK. That’s a lot of references
to go from as well. Two entire games, right?>>Ryan: Yeah. We had a lot of
stuff to look at.>>Victor: Cool. All right.>>Brian: All right, a little brief
riff on the glory of Blutilities. So like I said, we’re on 4.20. And Blutilities
were really starting to come online during
game Dev for us. And we have a group called
Editor Test Engineers, ETEs. And they do a lot
of our automated build testing, a lot of
our performance testing. They really focus on ways that
we can automate performance and QA processes. And they started messing
around with Blutilities. And they created this
one that our level design teams use a lot. And it’s called the level fixer. The game Dev process is
three, four years long. We’re building the
bicycle as we’re learning how to ride the bicycle. So cruft builds up. And it’s situations like
this, we would end up with– we would make some stacks
of crates and some stuff like that. And then eventually
we’d be like, you know what, we need it to be
a Physics Destructible Asset, not a Static Mesh. It just looks better. And you end up in circumstances
like this where it’s like, hey, in that little art vignette
that was over there, all of them are fine except
this one ended up being a Static Mesh instead
of a Physics Destructible. This one ended
up– one of these. You end up with inconsistencies. In the old way,
you’d either have to rely on the level
designer or the level artist to manually
comb through the level and find them all, or QA is
going to have to find them all. And the bigger and
bigger the game gets, it’s impossible to QA
it like that anymore. So they started to
create this tool. So I’ll open my Blutility shelf. And I will open
up the Blutility. So this is our map fixer. And basically it contains
a lot of common things that started to crop up. And we just sort of
grew it over time. Where if we have a
persistent issue, and we can solve it
with a Blutility, we would just sort of
layer it onto this. So it solves a lot of
our kind of basic stuff, where it’s like
you see black fog. We use lots of black
fog planes as spawners. But they need to have
translucency sorting set in the right
way, or else you end up with weird ghosting stuff
that happens behind them. So I could go in and I could
audit the translucency sorting. It will fill out an array. And then if I want to
fix them, I can fix them. But we separate
auditing and fixing. Because sometimes we’re
doing it intentionally, and we don’t want to bash that. So in this case, we have
these Physics Destructibles. So I’ll audit destructibles. And then I’m going
to close that. You’ll see, in the output log– whoa. So much stuff has happened
in the output log.>>Ryan: It’s mounting a lot of
plugins.>>Brian: Yeah. I get my map fixer notification. It’ll either say
that all identified bad destructibles have been
added to the audit list, or it’ll say, hey, it was clean. Didn’t find anything. And now when I’m in here, this
auditing list gets propagated. And it’s like, hey, I found a
cinder block pallet and a wood plank trash can that were the
wrong Static Mesh versions. And all this is really
doing underneath the hood is it’s basically
just using a Map Asset to look at things that
we know we have replaced. Because we can’t
just do a normal– it’s not a Static Mesh anymore. I can’t just say, hey, replace
all the Static Mesh Instances with a different thing. I’m replacing it with a
totally different Actor class. So they just sort
of build up this map of known destructibles,
known cloth tarps that we made Static Mesh versions of. And then we made Cloth
Actor versions of them. And I’ll just go through
that, identify all of them. And then if I want
to fix them, I can say, great, go ahead
and replace audited Actors. It’ll clear the list. And now when I come
through, all these things are going to be Physics
Destructibles now. So it’s just a really
wonderful way of, again, just– end-of-project work,
we’re just trying to make sure everything’s
as clean as possible. Throw some grenades in there.>>Victor: You got infinite
grenades?>>Brian: No, but I can just
keep balancing myself, and just throw more
grenades in there. So now all the bad offending
Actors have been replaced. And those bad wood
planks are good to go. And like I said, this was one
of those tools that started out with them just trying to solve
a couple of little things. And then the level design team
can sort of go to them and say, actually, we’re running into
these two or three issues, especially for really
pernicious things where it’s just this one
data value that’s stored on an Actor Instance over here. And you’d have to
know to go click every version of
that Actor Instance, and go click that individual
data value to make sure it’s set right. We set region balance
across the world. So it’s like the
entire world may be balanced to late-stage Pandora. But one loot chest over here may
be set to early-game Pandora. So it’s spawning loot
that’s level 2 or 3 instead of level 25, 28. So little itty-bitty details
like that get caught like this. So, big fan of Blutilities. If you’re not used to working
with them, nothing complex. Very well organized. I’m very proud of the
ETEs for doing this. But all this is really
doing is it’s going through, it’s going to discover
a bunch of things that we’re looking for. It’s just going to run
through that, make sure– we have a whitelist, so we
can say, hey, actually ignore these certain things,
because they’re special for whatever reason. Go through, scrub out everything
that’s on the whitelist. Go look up, in these class maps,
what the destructible version of that Static Mesh is. Go propagate them to
that internal audit list. And then that way
you can actually audit multiple things at the
same time, fill it all up, click it. It can clean up 10 different
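[Editor's note: the audit-then-fix flow can be sketched as follows. The actor representation, mesh names, and class map contents are invented for illustration, since the real tool is a Blutility operating on live Actors.]

```python
# A class map says which destructible Actor class replaces which Static
# Mesh, and a whitelist protects intentional exceptions.
DESTRUCTIBLE_FOR = {
    "SM_CinderBlockPallet": "BP_Dest_CinderBlockPallet",
    "SM_WoodPlankTrashCan": "BP_Dest_WoodPlankTrashCan",
}

class MapFixer:
    def __init__(self, whitelist=()):
        self.whitelist = set(whitelist)
        self.audit_list = []

    def audit(self, level_actors):
        """Pass 1: collect offenders without mutating the level."""
        self.audit_list = [
            a for a in level_actors
            if a["mesh"] in DESTRUCTIBLE_FOR and a["name"] not in self.whitelist
        ]
        return self.audit_list

    def fix(self, level_actors):
        """Pass 2: replace only what was audited, then clear the list."""
        for actor in self.audit_list:
            actor["class"] = DESTRUCTIBLE_FOR[actor["mesh"]]
        fixed, self.audit_list = len(self.audit_list), []
        return fixed
```

Keeping `audit` and `fix` as separate passes is the point Brian makes: a human reviews the list before anything gets bashed.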
types of bugs for you in a level, in seconds. And again, that’s just
good sanity check. Good, you know your
map’s clean now. It didn’t take you all day. Took you five seconds to open
the tool and run the thing. You know, those are wonderful
speed and efficiency gains.>>Victor: Do you keep a very
strict rule on naming conventions, then, considering that that
map would have to be otherwise automatically changed
based off if someone changed the name on an Asset?>>Brian: Somewhat folder-structure-wise,
and then it’s sort of– they try to hit the known
offenders most often. We do have naming conventions. I can’t say that we’re the
strictest people in the world in terms of naming conventions. But generally,
yeah, we’re trying not to go too far off base. And if it does, somebody
will go and rename it. Usually the redirectors will
catch that sort of thing if something does
get named wrong. As long as it’s in there,
if it did get renamed, the redirector will
catch it usually.>>Ryan: Yeah, we do a lot
of Asset auditing, too, just after
things get checked in. Is it every night
the auto-audit runs?>>Brian: Yeah. And that’ll sort of go
through and scrub things that we have known
rule sets for. And we have hundreds of rule
sets for– it’s like, OK, you named this thing wrong,
or it’s in the wrong folder, or this is referencing
something that’s not supposed to be referenced,
all kinds of things. This has a dead mission
objective reference that got deleted in the game. Go clean it up. So sometimes you’ll
come in, and you’ll have like 200 auto-audits. [CHUCKLES] Something you did
broke a lot of stuff.>>Victor: Is that a middleware
tool?>>Brian: No. It’s part of some of what
those ETEs were doing and some of what our release
engineers were doing. And basically it’s exposed
to us as collections, where we can basically– auto-audits kind of have
collections associated with them so we can
whitelist and blacklist certain Assets based on adding
them to those collections. But it’s just sort
of about anything that you can define an actual
rule set around so there’s no human error to it. It’s just, hey, if we said
Static Meshes cannot have more than x amount of verts, you
can make a rule around that, and an auto-audit will trip. And some of the auto-audits
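[Editor's note: a rule set of that shape, a predicate plus a severity per rule, can be sketched like so. The vert budget and folder path are made-up examples, not Gearbox's actual rules.]

```python
# Illustrative nightly auto-audit: each rule pairs a name and severity
# with a predicate over an asset record. "bug" findings would file a
# JIRA issue; "warning" findings just notify.
MAX_STATIC_MESH_VERTS = 50000  # illustrative budget

RULES = [
    ("too_many_verts", "bug",
     lambda a: a.get("type") == "StaticMesh"
               and a.get("verts", 0) > MAX_STATIC_MESH_VERTS),
    ("developer_folder_reference", "warning",
     lambda a: any(ref.startswith("/Game/Developers/")
                   for ref in a.get("refs", []))),
]

def run_audit(assets):
    """Return (rule_name, severity, asset_name) for every tripped rule."""
    findings = []
    for asset in assets:
        for name, severity, check in RULES:
            if check(asset):
                findings.append((name, severity, asset["name"]))
    return findings
```

The value of expressing checks this way is the one Brian states: anything you can define an actual rule set around has no human error to it.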
just give you a warning. Some of them will
[inaudible]– we use JIRA. It will [inaudible]
stick a JIRA bug in and say, hey, this
thing was nonconforming. You know, things like
our developer folder– if there’s a reference to an
Asset in the developer folder, it will spit you a
warning and say, hey, your map is referencing this. Fix your broken link. So this is what I was
talking about earlier. One of the things
our team at GSQ– Gearbox Studio Quebec–
ended up making is they started to do analysis
on the sorts of environments that we end up building
a lot, and how we can sort of rapid-prototype maps. So they built out this
prototype blocker. And basically they looked
at standard sorts of things that we do in the game. So level intro areas– when
you come into most maps, there’s sort of a lobby. There’s a safe space,
there’s some spawning, there’s some vending machines
and some ammo crates usually. It’s usually where we can
do some light narrative when you’re coming into the level. So they went through and they
actually kind of figured out good ranges for most
of these things. So level intros, transitions,
standard combat areas, combats with progression,
wildlife combat areas, vehicle areas. And you can just
come in here and say, you know what, I want to
make a transition area. And then that
transition area is going to go to a standard
combat with a progression. And then maybe I’m going
to have another transition area come off of that.>>Victor: And this is all based
on previous gameplay design and feedback, right?>>Brian: Yeah, literally they went
through after the game shipped, and they went through
every single map, and they sort of
took measurements for a bunch of different stuff. So this is something
they started deploying during DLC development
and we’ve sort of started to use it internally, too. And it’s just sort of a– it’s a ballpark. It’s not meant to
be super rigid. But you can definitely
quickly sort of lay out a top-down
for your level if you kind of know that we’re
going to go into a vehicle combat zone. Like, a combat zone’s big. So it’s really handy for just
sort of rapid prototyping.>>Victor: Yeah, it must save
a lot of time when the artists go in, too, right? Because they have to redo
less work eventually, because the space
has already sort of been set up to be
the right size.>>Brian: Yeah.>>Victor: That’s really nice. It seems like a pretty simple
tool to build as well, right? It’s all [inaudible].>>Brian: Yeah, and again, this is
just doing real simple math on construction. It’s just taking these
sorts of presets. And depending on what
you’re trying to do, it’s just going to take your
Mesh, which in this case, we’re just using a cube. We have some basic Material
parameters in here. So we can change
the colors of it. So if you need to get
contrast for whatever reason, you want to make your wall
red, or yellow, or whatever, you can do that. If you don’t want it to have
collision for whatever reason, you can do that sort of stuff. So it’s just going through
there and saying, OK, based upon these preset
grid sizes, take that cube, lay it out, do your
dynamic Material Instances, and set scalars based
upon preset types. So super simple, super clean. But it’s not always the most
expensive thing in the world that actually gets you
huge time savings in terms of rapid prototyping. Just really simple thoughtful
tools can go a really long way. All right. So those are my
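[Editor's note: the blocker's construction step can be sketched as presets plus simple math. Every number and name here is invented, since the talk doesn't give GSQ's real ranges.]

```python
# Hypothetical block-out presets: named area types with ballpark
# footprint sizes and a debug color for contrast.
PRESETS = {
    "level_intro":     {"size": (20, 20, 6),  "color": (0.2, 0.8, 0.2)},
    "transition":      {"size": (10, 30, 6),  "color": (0.8, 0.8, 0.2)},
    "standard_combat": {"size": (40, 40, 10), "color": (0.8, 0.2, 0.2)},
    "vehicle_combat":  {"size": (90, 90, 12), "color": (0.6, 0.2, 0.8)},
}

def construct_blocker(preset, origin=(0.0, 0.0, 0.0), collision=True):
    """'Construction script' step: turn a unit cube into a placed
    block-out volume for the given preset type."""
    p = PRESETS[preset]
    return {
        "origin": origin,
        "scale": p["size"],           # unit cube scaled to the preset footprint
        "color_scalars": p["color"],  # fed to a dynamic material instance
        "collision": collision,
    }
```
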
little [inaudible] and how we do a couple of
different things within it. And then with the rest
of the time that I had, I kind of wanted to walk through
how we think about the mission system. For a lot of the
missions that we end up giving that are side
missions in the game, we give them through
these wanted posters. And this is a very
quick recreation of what the initial prototype
for this sort of looked like. We had the idea. We knew it was going
to be this poster. We knew it was going to fold up. Before we went very far
with feature development at all, it was just, OK, how
can we quickly make this work? How can we quickly
make a little thing that answers the who,
what, where, how, and why? So it’s not doing much stuff
other than this is built out of cubes, and a sphere,
and a text component, and a Skeletal Mesh with a
really crappy simple texture on it, and a box
component to handle switching between
the exclamation mark and the hologram. The reason we
prototype quickly is it allows us to get it in front
of as many people as possible. So the art team, before
they spent a lick of time actually going and
building the Skeletal Mesh for that wanted poster,
they could be confident that this was going to work. We could give it to
the level designers. They could propagate
it in their levels. We could give it to
the mission designers, and get that kind of wish
list about usability features and stuff like that. But when you look at
what a prototype looks like versus a shipping
Asset, it’s not much. It’s simple logic. It’s just sort of
answering, again, the kind of key questions
about, OK, the player’s going to be nearby. What’s it going to look like? And this is just
some custom Blueprint that we end up making that
changes switches in bodies. When the player walks away,
what are we going to do? When the player uses
the poster, what are we going to need to do? Nothing sexy, nothing crazy. I will note, because I see
this a lot in student projects, I’m very much a believer of,
don’t build Blueprints off of Actor. So we make a
Prototype Actor class. And that allows us to
sort of consolidate things that are prototypes
into a clean class over here so we can kind of track down all
the sort of nascent Blueprint that’s been created
on the project.>>Victor: Just an easy way
to find them all?>>Brian: Yeah. When it’s like
one or two people, it’s easy to track
down your Blueprint. When you have 400, 500
people on your team, it’s insanity to
try to track down. And that way we sort of
know that all the prototypes live over here. And we can kind
of cauterize them, or we can create an audit. So just like, how
many Prototype Actors are your levels referencing? We kind of clean those
up, and make sure they got replaced with the
real things or got upgraded. So the good thing
about our process is we sort of take this
prototype through all the core stakeholders. And one of the
things where we end up
cases is simple things like touch and untouch,
which when you’re in a single-player local game,
it’s pretty easy, because you can’t break that very much. When you’re in a
split-screen co-op game or a network-replicated
environment, a simple thing
like touch/untouch creates a TON of edge cases. Because we deal
with cable pulls. So a player got
in here, and then they pulled their
Xbox cable, so they got disconnected from the
game, or a network cable disconnected, or they just
lag-spiked for whatever reason. It’s not always as simple. Sometimes we have player skills
that allow you to teleport. So you touched it, but
you teleported back out. It didn’t actually
register an untouch event, because technically your physics
didn’t drive through the touch.>>Victor: When you say touch
and untouch, do you mean on begin and on end overlap?>>Brian: Yeah, sorry. So on overlap. So it’s things like that,
where it’s like, OK, a player’s in a vehicle. The vehicle went through
it, but the player’s pawn was in the seat that
didn’t go through it. Does that need to trigger? For us, we have Iron Bear. So Moze gets wrapped
up in Iron Bear. Iron Bear touched it,
but Moze is inside there. Does that count? So those are the
sorts of things where we know the intent behind it. And we can just work
with the code team directly to say, hey,
we want to do this. Let’s just handle that in code. Because the code is going to be
able to handle all those edge cases a lot more. We also knew that this Mesh was
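[Editor's note: one common way to harden touch/untouch against those edge cases, shown here as a hypothetical standalone sketch rather than Gearbox's implementation, is to reconcile actual positions every update instead of trusting begin/end overlap events alone.]

```python
def inside(box, pos):
    (lo, hi) = box
    return all(lo[i] <= pos[i] <= hi[i] for i in range(3))

class ProximityVolume:
    """Re-derives touch state from positions each update, so a teleport
    or a cable pull still produces the missing untouch event."""
    def __init__(self, box):
        self.box = box
        self.touching = set()

    def update(self, players):
        """players: {name: position, or None if disconnected}.
        Returns the ('touch'/'untouch', name) events for this update."""
        events = []
        for name, pos in players.items():
            now = pos is not None and inside(self.box, pos)
            was = name in self.touching
            if now and not was:
                events.append(("touch", name))
                self.touching.add(name)
            elif was and not now:  # teleported out, lag spike, cable pull
                events.append(("untouch", name))
                self.touching.discard(name)
        return events
```

Design questions like "does Iron Bear touching it count as Moze touching it" still have to be answered by the team; this only guarantees the state machine never misses a transition.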
going to be a Skeletal Mesh, and the hologram was going
to be a Skeletal Mesh. And we have a lot of
Skeletal Meshes in our game. Most of the lootables
are Skeletal Mesh, a lot of the doors, a lot of
the spawn doors that people come out of are Skeletal Meshes. So we put a really high
premium on efficient management of Skeletal Mesh ticking. Because with so many
lootables populating the area, it can get real expensive. So we try to hedge towards
being very, very efficient, and only letting them
tick when they need to. So that was another sort
of thing, where it’s like, we can just push that
into the code class, and let code manage
that efficiently and in a standard way that
I’m not trying to guess at, because they can just– I’m not trying to emulate what
they’re doing other places. So our shipping version
of it looks very similar. It’s just a little
bit more complex. We ended up with a larger kind
of proximity box around it. We ended up creating a
smaller box around it to help with usability. So this interior box is
a little hard to see, but there’s a smaller box that’s
sort of around the edge of it. So basically it just
helps with the touch trace as people want to use it. And again, we moved from
just a text component to an actual Particle Actor. And the particle is
actually parameterized. So let me see which of these
particles is the projector one. So the projector one is
one singular particle that has the projector
and the text in it. And the text is just
parameterized out of it. So we don’t have to keep
making a different iteration if a mission designer
wants to do something. Through this process, we also
decided we didn’t actually want to subclass this. We actually just wanted
one class of wanted poster. And everything else is
just handled in data. I’m a huge fan of Data Assets. They make your life
so much easier. Again, in big
projects, Blueprint can really spin out of control. If you have a lot of
repetitive Blueprint, a lot of logic living in
hundreds and thousands of different places. So we didn’t want to
do that, because we knew we would end up with 25,
26 wanted poster subclasses, each with the potential for
a little logic hidden here and there. We’d have to go track it down. So instead, we just
provide everything that we need inside
of a Data Asset. And this is great because now,
for my mission designers, when they want to use a Wanted
poster for their mission, this is basically just
a couple questions that they have to understand. They need to
understand what’s going to show up in the hologram. So we ask them, OK,
what Skeletal Mesh are you going to
use, which forces them to think about
what Skeletal Meshes do you have loaded in your map. So you don’t want to bring
in a unique Mesh usually that’s only going to be
the one you used here. So if you’re going
to go find a skag, make sure your skags
are already on the map. If you’re going
to go fight Dave, make sure Dave’s Mesh
is already in the map. We let them sort of set
an ambient idle animation. So usually anything out
of this Skeletal Mesh’s suite that we kind of know
is in memory already. It’s usually like a
breathing idle, or a walking. Like this guy’s just walking. And we use these
gestalt parts, which if I look at one of
our Skeletal Meshes, we try to keep all of the
various parts of a skag Mesh in his Mesh.>>Ryan: I think
it’s in the window. You’re looking for the
gestalt editor, right?>>Brian: Yeah.>>Ryan: Gestalt Part Selector– there
it is, under Recording Settings.>>Brian: So all the different
parts that can be on that Mesh are actually just
physically inside that Mesh. We’re not adding extra
Mesh components to it. And then basically we
just provide a parts list that says, hey, for
this skag that I’m going to use over here, just
to use this list of parts. So I just want the body
and these certain sets of horns that are in it. Usually this is great, because
the Wanted posters are usually like, you’re going to
kill Dave, or you’re going to go kill the skag. So usually the creature
team has already set up a gestalt list
for that creature, and you can just recycle it
because it’s already in memory. So it’s super simple. And that way, again,
I’m not relying– the mission designer
doesn’t have to write any additional logic. Nobody has to be trapped by it. It’s literally if
you can name what you want the hologram to be, and
what mission you want to give, and if there is an
alternate mission point. Some of our missions
allow you to enter a mission at the beginning
or midway through. So if you have an
alternate entry point, you can provide that. But you only really need to
answer a couple questions upfront just to make it work. And then again, I
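[Editor's note: those "couple of questions" can be sketched as a schema. The field and asset names here are invented for illustration, not the shipping Data Asset layout.]

```python
from dataclasses import dataclass, field
from typing import List, Optional

# One wanted-poster class, all variation in data: the designer only
# answers what the hologram shows and which mission it gives.
@dataclass
class WantedPosterData:
    poster_title: str                   # e.g. "WANTED" or "BOUNTY"
    hologram_mesh: str                  # a Skeletal Mesh already loaded in the map
    idle_animation: str                 # ambient idle from that mesh's suite
    gestalt_parts: List[str] = field(default_factory=list)  # e.g. body + horns
    mission: str = ""
    alternate_mission_point: Optional[str] = None  # optional mid-mission entry

skag_poster = WantedPosterData(
    poster_title="WANTED",
    hologram_mesh="SK_Skag",
    idle_animation="AS_Skag_Idle_Breathe",
    gestalt_parts=["Body", "Horns_A"],
    mission="Mission_KillTheSkag",
)
```

Because all variation lives in data, a bug either reproduces everywhere (it's in the one Blueprint) or in one place (it's in that Data Asset), which is the triage property Brian describes.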
can come in here and say, OK, I actually
want this to say bounty. And then I’ll just reconstruct
that, and bounty will show up. Or I want it to be save. I can reconstruct that,
and just hit save. So again, just focus
on flexibility. We want people to sort of be
able to jump in and say, hey, I want to use a mission– wanted poster. Great. Five minutes later, they’ve
set up their Data Asset, they’ve plugged it
in, they’re done. They don’t have to worry about
it, if I get a bug in ever, is it happening everywhere? No. OK, it’s in the Data Asset. Is it happening everywhere? Cool, it’s in the Blueprint. It’s very easy to kind of
triage bugs that come in. Yeah. So that’s the sort of– I like using this example,
because it’s a good– I think sometimes developers
have a hard time understanding where a prototype needs
to end, and where you just need to bring other people
in and talk to code, and what a fully-featured
shipping thing looks like. And you can tell, big
qualitative difference of what we’re getting here. But they’re both–
on a simple level, this is really not
answering anything new that this prototype
didn’t answer. It just sort of does
more edge-case checking than anything else. So let me open it up real quick. As I get into the
mission system, one of the things
that sort of did change is when you
use a wanted poster, all it’s actually trying
to do is it gives you the mission delivery placard. It gives you the
UI that says, hey, do you want this mission or not? And then, technically, when
you accept the mission, we do a filtered event
on our mission system. And it says, OK, if
you’ve accepted it, now we’re going to move on, and
either accept the poster, which basically causes it to roll up,
or if you didn’t choose to do it, it’ll stay in idle forever. But there’s a layer
of abstraction between us offering
you the mission and you actually
accepting the mission. And that’s sort of one
of the big differences between the prototype
and this one. Because our mission system is
the nervous system of the game.>>Ryan: This is really cool.>>Brian: Yeah. This is– our tools
team built this just out of custom Blueprint. But this is basically the
mission graph for the game. So the main mission line’s
there, through the middle. And all we’re really
defining is just linkages. Main missions always
unlock the next mission. So we provide what the
next mission in the chain is going to be. But then they will also
unlock other missions to be available to be
given to the players. So those wanted posters
become active, or the mission NPCs will now have
an exclamation mark above their head. So this allows us, as logically
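[Editor's note: that linkage rule, where a main mission always names its successor plus a list of side missions it makes available, can be sketched as a small graph. Mission names are invented.]

```python
# Each main mission names the next main mission and any side missions
# it unlocks. Completing a mission flips everything it unlocks from
# "hidden" to "available" (e.g. a wanted poster becomes active).
MISSION_GRAPH = {
    "Main_01": {"next": "Main_02", "unlocks": ["Side_SkagHunt"]},
    "Main_02": {"next": "Main_03", "unlocks": ["Side_BountyDave", "Side_Effigy"]},
}

def complete_mission(mission, states):
    """Mark a mission completed and make everything it unlocks available."""
    states[mission] = "completed"
    node = MISSION_GRAPH.get(mission, {})
    for unlocked in [node.get("next")] + node.get("unlocks", []):
        if unlocked and states.get(unlocked, "hidden") == "hidden":
            states[unlocked] = "available"
    return states
```

Everything else in the game reads state out of this graph rather than mutating it directly, which is what makes mission flow deterministic.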
and cleanly as possible, to actually understand
dependency in the mission system. And this is really where I
say it is the nervous system. Everything in the game
that takes mission state is taking it from this
graph or the subgraphs that are inside of a mission. And everything is sort
of dependent on the state of different
objectives, and then when a mission is completed,
or failed, or whatever. So the poster is
actually reacting to the mission going
into an accepted state versus the
not-accepted state, instead of the use prompt
causing it to move forward. So there’s always a little
bit of logical thought we have to put
towards how do things listen to the mission system. But the nice thing is,
it’s very deterministic. You can kind of understand
state flow from point A to point B through it. And then I built a little
sample mission for this, just to sort of
walk through some of how we can think about it. But we want our
mission designers to be able to prototype quickly. We’ve now got them embedded in
rooms so the level designers and level artists,
everybody can think about building a mission
in a space together. So we want people to
be able to act rapidly. So like I built this mission
yesterday, before we flew out. Great. Accept my epic test mission. Me accepting it causes
this wanted poster interactive object
to change its state. And a lot of things that
you see in the world, they’re all simple
state machines. And if I say interactive
objects or IOs, it’s because they’re very
simple state machines. The wanted– the loot
chests are state machines. They’re closed, they’re opening,
they’re opened, they’re locked. Doors are closed,
or opening, or locked. Switches are on or off. Lots of state machines. So in this case, I provide– I had one objective
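The tiny state machines Brian describes here (doors, chests, switches as "interactive objects") can be sketched roughly as an enumerated state plus a table of legal transitions. This is an illustrative sketch, not Gearbox's actual classes or API:

```python
from enum import Enum, auto

class DoorState(Enum):
    CLOSED = auto()
    OPENING = auto()
    OPENED = auto()
    LOCKED = auto()

class DoorIO:
    """A minimal 'interactive object': an enumerated state and legal transitions."""
    TRANSITIONS = {
        DoorState.CLOSED:  {DoorState.OPENING, DoorState.LOCKED},
        DoorState.OPENING: {DoorState.OPENED},
        DoorState.OPENED:  {DoorState.CLOSED},
        DoorState.LOCKED:  {DoorState.CLOSED},
    }

    def __init__(self):
        self.state = DoorState.CLOSED

    def set_state(self, new_state):
        # Reject anything the state machine doesn't allow.
        if new_state not in self.TRANSITIONS[self.state]:
            raise ValueError(f"illegal transition: {self.state} -> {new_state}")
        self.state = new_state
```

Keeping the state enumerated and small is also what makes the replication strategy discussed later in the stream cheap: only the state change needs to be sent, not the resulting animation.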
there for like, hey, go over to the bandit camp. In this case, my little switch
now has an objective for it, for me to use the switch. When I use the switch, it
pushes the mission forward, which allows the door to
say, OK, we’ve moved forward. And I’ll walk through
that logic in a second. And then this is just sort of
using GSQ’s block-out tool to define basic combat
ranges and stuff like that. So this would allow me to take a
mission prototype very quickly, and explain to a level
designer, or level artist, or another mission designer
what my intent was. If I want to walk through the
scary bandit canyon, and then I want to get to the
effigy, and then I need to turn on all the
gas lines for the effigy. So I can turn it on. We can burn Mr.
Effigy Man over there. And then once we've
done the effigy, it’ll summon the fire boss. And I should give myself
a gun to fight back. And we’re going to
kill the fire boss. But I’m just going
to kill enemies. I kill the fire boss. Fire boss had a
McGuffin inside of him. I pick up the McGuffin,
and I completed my mission.>>Victor: Nice. Good job.>>Brian: So this is a very– I call this boilerplate. When I have new designers
come in and work on missions, I have them work on
missions like this. It’s a looter shooter. So this is focused on some
very just core competencies, about you need to
shoot some things, you need to loot some things,
you need to use some things. But like I said,
everything is meant to be responding to that nervous
system of the mission system. So we try to understand what
are the most common things we’re doing on missions, and build
code paths and Blueprint paths for them so that
people can just work quickly. And they don’t have to
write a lot of custom script on their own. So if I actually look at the
Level Blueprint for this level, I think the only thing– yeah,
I put a debug event in here to put me in demigod so I
don’t die on the livestream. But other than that, there’s
no Blueprint in here. I didn’t have to actually do
any scripting in the level. Because I’m just
using core classes, and again, common
boilerplate stuff to just make my life easy. So we have a traditional
Waypoint Actor, where all we really have
to define is, in this case, a mission conditional. So we can set a mission
conditional on this icon so that it’s active. And basically I can define, hey,
in my mission, on the objective to go into the COV camp, I want it to be active
when that objective is active. Because clearly that’s
what’s driving me over there. And then when the
player touches it, I want to send an event
back to my mission. Oh, I clicked a thing. It’s going to propagate that.>>Ryan: Oh, it’s going to go
through all the mission lists?>>Brian: Yeah.>>Victor: I got a
question for you.>>Brian: Yeah.>>Victor: They were curious
how all of that is serialized for saved games.>>Brian: That is a great
question for Daniel. [CHUCKLES]>>Ryan: Does it get sent to
Spark?>>Brian: No. I mean, it’s all in
your local save profile. I mean, that’s why we’re not
the most authenticated game in the world.>>Victor: I also think if you
stop the mission midway, you actually have to do
this from the beginning.>>Brian: It depends. It’s on a per-objective basis. Some objectives are failable. So if you come back– and I can sort of show you. When we have a
mission objective, when the objective is
active, we filter off of a switch based on
the different states that it can be in. So if you were active
on load, and we wanted you to start over,
we would actually do this. And we would do a thwart. We would thwart the
objective to push you back, because maybe there is a
narrative thing that we wanted to hit you with again. But sometimes we can also set
an objective itself to be– some of them are all or nothing. So it’s like, hey, you
need to collect 10 cans. And if you only
got seven of them, and you left, and
you came back, we didn’t give you
partial progress. We made you start
from the beginning. But we can actually
come in here and we can set things to be completely
failable or not if need be. So some of the raid
bosses are all or nothing. You have to finish the
entire raid boss fight. You can’t come back
midway through. But largely, it’s
these sorts of things, where in waypoints– like
go to waypoint, very common. It’s bread and butter. So all you really
have to understand is what objective you
want it to be active on. And we basically wrote
custom conditionals. And these are all
code conditionals, just because if you’re going
to be using conditionals a lot, code-based conditionals
are just faster to resolve. You can prototype
them in Blueprint, but get them to a coder
as quickly as possible. And then usually they’ll
slingshot mission events. And we like to do it
this way, because it’s kind of easy to trace
what’s happening. So for my first
objective, I’ve only got one objective in the set. And it’s go to COV camp. That event is being slingshotted
from the Waypoint Actor to come update the objective. And then we just
automatically move on to the next objective set. And this is a very
simple mission. So almost all the objectives
are one objective. Or in this case, where I'm
turning on the three valves, I have one objective
that’s player-facing. And then I have multiple
invisible objectives that are tracking each individual valve. So if you did come back
and you had finished two of the three valves, it
would remember those states individually.>>Victor: OK.>>Brian: But yeah, a lot of– as I go through these things,
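The valve setup being described — one player-facing objective backed by invisible per-item objectives, so partial progress survives a save — can be sketched like this. Class and item names are invented for illustration, not Gearbox's mission system:

```python
class CountedObjective:
    """One visible objective backed by invisible per-item objectives,
    so partial progress (e.g. 2 of 3 valves) is remembered individually."""
    def __init__(self, items):
        self.done = {item: False for item in items}

    def complete_item(self, item):
        self.done[item] = True

    @property
    def progress(self):
        # What the player-facing objective would display, e.g. "2 / 3".
        return sum(self.done.values()), len(self.done)

    @property
    def complete(self):
        return all(self.done.values())
```

Serializing the `done` map per item is what lets a returning player keep the two valves they already turned.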
it’s the same sort of thing. For the switch, I just
need to understand, hey, I’m sending this mission event. And this is when I’m
properly unlocked. We do this to sort of
bulletproof ourselves so that the player can’t
run ahead and use the switch before it’s meant to be. So we don’t unlock the switch
until the mission condition is active for them. And then when you use it, it sends the mission event. We move forward. The door is just
listening to it. So in its case, it is
a mission-driven door. And it’s going to open based on
this mission and the objective for open gate, which
was completed by that switch’s event. So when its status is set to
complete, it will be complete. When I come back in the level– [SNAPS FINGERS] –the door will be open,
because it’s already gotten that state once the
mission system initializes. And that’s sort of– what I like to push
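That load-time behavior — the door deriving "open" from mission state rather than being scripted open — can be modeled as a listener that is also replayed against the current state when it registers. A rough sketch, assuming hypothetical names rather than Gearbox's real mission system:

```python
class Mission:
    """Holds objective status and notifies listeners; a new listener is
    immediately replayed the current state, which models a level
    initializing after the mission has already advanced."""
    def __init__(self, objectives):
        self.status = {name: "inactive" for name in objectives}
        self._listeners = []

    def listen(self, callback):
        self._listeners.append(callback)
        callback(self)  # replay current state on registration

    def set_status(self, objective, status):
        self.status[objective] = status
        for callback in self._listeners:
            callback(self)

class MissionDrivenDoor:
    """Derives open/closed from an objective instead of being scripted open."""
    def __init__(self, mission, objective):
        self.open = False
        self.objective = objective
        mission.listen(self._on_update)

    def _on_update(self, mission):
        self.open = (mission.status[self.objective] == "complete")
```

Because the door's state is derived, not pushed, a door that only exists after a reload still resolves to the right state with no extra script.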
with a lot of people is you want to minimize the
amount of custom Blueprint logic you want to be writing,
especially in level Blueprints, just because it’s very handy,
and you can do a lot of stuff with it, but especially
in the bigger projects where we have a lot of
people and a lot of hands in and out of it,
you just have to be more careful about managing
all that information. Because somebody
can do something really well-meaning, but
it’s kind of hidden from you. And you need to go remember
to open up that submap, and look at its mission
Blueprint, and audit it. And when it’s, hey,
it’s six months to ship, we got to
finish the game, you don’t always have time
to jump into everybody’s stuff constantly, and sort of
audit it and clean it.>>Victor: And it’s also a lot easier
to just go through Data Assets, and change values. So you don’t have
to check out a map–>>Brian: Yeah. And that’s the sort of
like, hey, if it’s in this, and there’s a problem
with that door, I don’t need to jump into
the map to go fix the door. I can just go fix the
door individually. Or if– like we end up
proxying doors a lot. So let me drill into
some interactive objects. We will sometimes build a door– where’s my interactive — Love doors, as everybody on
my team will confirm. So we can give you a
quick prototype door that you can put in your map. And that way, art team can be
filling this in with the art meshes as they’re building it. And you don’t actually have
to go up to your map ever. One day, it’s just
going to look beautiful because the real Meshes came in. Or they’ll put the
proxy Meshes in, and you can sort of say, hey,
that’s not wide enough, it’s not tall enough, or whatever. It also just sort of
lets us enforce standards that we understand, like
Iron Bear can always fit through main doorway
paths, and players can fit through main doorway
paths, and stuff like that. But yeah, every single thing
that we did in that mission, from using a switch,
hitting waypoint triggers, using these valves,
even these fire effigies. This is a combination of
both of these approaches. This is– it’s another sort
of interactive object class. We just give it a condition
for when it’s extinguished. So in this case,
I’m extinguishing it when this thing is complete. And then I invert the condition. So it’s basically the
opposite of completion. So it’s not on until you’ve
completed the objective. And then it will turn on. But these are great. Because if you look around
a lot of Borderlands maps, you end up with– we use fire in a barrel a lot. We use flame jets
coming out of things. We love using fire for
decorative purposes. And even some things like
there’s some like cold areas where we’ve used cold
vents and stuff like that. So there are usually a
couple different departments involved there. There’s an effects
component to it. There’s usually a
lighting component to it. There’s a damage
component to it. And that damage needs to
be balanced for the area that we’re in. And it’s really
easy for somebody to sort of be
arting out a level, and they just put
some fire down. But now the player expects
to be burned by it. So it needs all these
sort of extra things. And rather than
saying, no, you guys can’t do that at all, we just
wanted to be flexible about it. And once we could sort of
say, these are the rules, this is what I can do,
we just use data again. So in this case, I just
used this fire spray one. But if I wanted to
use a car fire or– we’ll get to that in a second. Oh, God. Sorry. I’m on somebody else’s
mouse, so I’m always– OK, I’m going to
stop doing that. But basically this
allows us to sort of go through a standard
barrel fire, large areas, for when you go to a big
asteroid and fight on it. Giant orbital thruster
that’s functionally the same sort of thing,
where it’s going to burn you, and it needs to do all
these same sort of things. It’s just that in a
macro-sized version of it.>>Victor: And that’s just one
Blueprint. And then you’re actually
just switching data table.>>Brian: Yeah. So all of this– and again,
all this is on construction. Not much is actually
happening live in it. But it’s super handy,
because then we can keep all these things– audio’s pretty
happy, because they can define how the audio
for this needs to work. And if it needs to come over–
like a tiki torch making fire, it’s fine. Audio needs to come
from that little spot. But that giant orbital thruster
has different audio needs. So they can come in here,
and they can actually set different relative
location overrides. So they can actually
extend out where audio is coming from in different ways. Our effects team, we
can keep them happy, because we can actually
enforce max scaling in here. So like this one, you can
only take it between 1 and it looks like a 3.0 scale. So we can actually clamp it. And certain effects– like
particles don’t scale well, is sort of the traditional bane
of a lot of effects artists. So this allows us to sort of
put some sanity clamps in. So it’s like OK,
you can take this, and we’ll let you extrude it out
a little bit, but not too much.>>Victor: And the VFX artist doesn’t
have to go and specifically have a list of, you cannot
scale these more than x.>>Brian: Yeah. And when you say, well, you told
me I could scale that one 3x. Why can’t I scale this one? It’s like, because
they’re built differently. And the art’s just sort
of eccentric to what it needs to be. So it allows us
to put clamps in. And then if they find, you
know what, that 3.0 clamp was a little too aggressive,
and we actually need to dial that back, they can
just check out this Data Asset, and set that to 2. And the next time
somebody opens their map, and if they have
that Data Asset, it’s going to
reconstruct the thing, and it’s going to clamp
it back down to 2. It really has been
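The per-asset clamp being described — scale limits living in the Data Asset and re-applied on construction — boils down to something like the following sketch. The field names and the 1.0–3.0 range are taken from the example in the stream; the class itself is hypothetical:

```python
from dataclasses import dataclass

@dataclass
class HazardDataAsset:
    # Hypothetical per-asset limits owned by the effects team.
    min_scale: float = 1.0
    max_scale: float = 3.0

def construct_hazard_scale(asset, requested_scale):
    """On-construction sanity clamp: artists can scale freely in the map,
    but the placed value is pulled back inside the asset's allowed range."""
    return min(max(requested_scale, asset.min_scale), asset.max_scale)
```

Tightening `max_scale` to 2.0 in the Data Asset then re-clamps every placed instance the next time its map reconstructs, with no hand-fixing.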
super helpful. And it does these other
little things in here. Like the damage box that
the player’s actually in, it allows us to
actually manipulate that on a per-particle basis. So we can dial that in. So it feels like it’s
damaging you in a good way, but it’s not cheap. Like we just didn’t put a giant
box around the entire thing just to make you feel
bad about yourself. We try to hedge towards
being player-friendly. So it’s like you can
get close to that thing. And I’m not going to burn you
if you’re standing over here. And then again,
this is those sorts of things where it
comes up in development. We identify a problem. How can we take that
problem and come up with a Dev tool that speeds
everybody up, and keeps as many departments happy and
all sort of on the same page? And now if you’re a lighting
artist or an environment artist, and you’re propagating
all those fire-in-a-barrels, you don’t need to individually
hand-place everything. Now, again, you just take
the generic effects hazard, plug in the fire-in-a-barrel Data
Asset, and you’re good to go. And a lot of these, we
basically just expose the ability for designers to
make their own Data Assets. So I can go Blueprint
another one, and just say, hey, I want– I don’t know what problem
I’m trying to solve. But I’m going to need
to define a Static Mesh, and some transform
values, or whatever. Basically we exposed data
Assets to be this GBX Data Asset Blueprintable. And create that guy. And now when I’m in here, all
I’m really doing is saying, yeah, sure, this is my ‘is special’ bool. And maybe I wanted to
define a Static Mesh. Fun of propagating
tables at the end. Static Mesh. So once we have
that sort of set up, if I’m in a
different Blueprint– just use this one
as a side test– I can define that– “Test” was a stupid name. Yeah, not going to do that live. Basically you define your
Blueprint instance over there. Actually, I bet the other wanted
poster already has it exposed. But we can– in
here, all you really end up doing is, on
construction, you’re just grabbing everything
that you want out of it. So in this case, we
actually have a code version of this wanted poster. But we’re going to cast it to a
Blueprinted version of it that has a couple extra
variables in it. It’s very important when you
are doing this, always do validity checks on
Data Assets, just to make sure that the user is
actually plugging something in. Because otherwise
things can spiral off into horrific directions. So make sure your data
is valid before you start scraping from it. But then really
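That validate-then-scrape pattern on construction might look like the following; the poster fields here are invented for illustration, and a plain dict stands in for the Data Asset:

```python
def scrape_poster_data(data_asset):
    """Construction-time scrape of a user-supplied Data Asset, with validity
    checks up front so an empty slot fails loudly instead of spiraling."""
    if data_asset is None:
        raise ValueError("WantedPoster: no Data Asset assigned")
    mesh = data_asset.get("static_mesh")
    if mesh is None:
        raise ValueError("WantedPoster: Data Asset has no static_mesh")
    # Only after validation do we pull the rest of the parameters.
    return {
        "mesh": mesh,
        "is_special": data_asset.get("is_special", False),
    }
```

Failing at construction time with a named error is what keeps a forgotten slot from becoming a mystery bug deep in a level.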
all you’re doing is you’re just grabbing all these
different parameters out of it. You’re just grabbing the
stuff that’s presented to you. And then from there, your
Blueprint or your system can do whatever you want
with that information that the user is providing. But you know, it’s a
very minimal– if people want to make Blueprint Data
Assets, it’s very minimal code work. Most of the
framework’s there– it’s literally uncommenting
a couple of lines. If you go look at the Data
Asset class, most of it’s very much there. I’ve seen really good tutorial
articles and tutorial YouTube videos where people have
been doing the work. Super handy. And it’s sort of, when I talked
with a lot of college students and indie Dev
projects recently, it’s the– my big advocacy
is, OK, Blueprint is great, but keep it to these
sort of core classes, and avoid just trying to always
constantly sub-instance over, and over, and over,
and over again, and really see how much you
can take data out for a spin. It helps with reusability. It helps with performance. It helps with just, like I
said, on bigger projects, the more and more people
you have onboarding, if you can avoid this sort
of hidden esoteric knowledge, like, oh, when you make
your new Blueprint, you need to remember to do these
15 things in your child class for it to work properly,
versus, hey, cool, if you want to make
a new wanted poster, you just need to tell me these
six pieces of information, and you’re good to go,
and you can feed yourself. Especially with us
having another studio in a different time zone
in a different country, those sorts of things,
where we can sort of just make speed of teaching people
easier, because you can’t just walk down the hallway
to talk to somebody.>>Victor: It’s like the tools
communicate themselves, right?>>Brian: Yeah. And again, there’s just– you end up with way fewer
bugs at the end of the day, because there’s no
hidden Blueprint spiraled through the whole project. You can just sort of
constrain the chaos as best as possible,
and just sort of– like I said, when
a bug comes in, it’s really easy
to sanity check. Is it this one instance,
or is it the whole system?>>Victor: That’s awesome. Was that the last one
of your–>>Brian: That’s what I had. And then I was going to just
say, hey, go to questions.>>Victor: Yeah. There are a lot of them. We have about 15 minutes left. We’re already running
over the two-hour mark. But I want to make
sure that we get to answer some of the
questions that are coming in. There are a lot
of good questions. And just feel
free– some of them are a little bit more general
about the game, perhaps not things that you guys
specifically worked on. One of the main
questions that came up was they were wondering
if any of the custom tools and techniques will be
shared outside of Gearbox, or if they’re going
to stay proprietary.>>Brian: They’ll stay
proprietary. We like teaching
the methodology, but we tend to not share
our tooling directly or our codebase directly. For those of you
going to GDC this year, there are a lot of
talks on the docket from tech art, animation,
design, just general art effects, all kinds of things. So if you are at
GDC, there should be a lot of good
deep-dive information on the methodologies.>>Victor: Awesome. What was the ratio of Blueprints
versus C++ used in the game?>>Brian: I am not a coder. Basically I’ll
put it like this– if the player is doing it or
if it’s really frame-dependent, it is almost always C++,
just because it resolves way quicker. The things that are more
latent and the things that we need to build a
lot of content out of tend to be more Blueprint. So things like that
door is Blueprint, that switch is Blueprint. The Waypoint Actor
is actually code. So it’s sort of a mix
and match, depending on what our needs are. But everything the players
do, everything the weapons do, it needs to be code,
because we want it to be going as quickly as possible.>>Ryan: And then, like you were
saying, the Data Assets would be pretty
much the only thing that a designer needed to edit. So beyond adding a new component
or some custom component– which
still didn’t really happen that much– the Data Assets were the only
things that a lot of these guys would edit.>>Brian: Yeah. The missions are
all in Blueprints. A lot of individual enemy
behaviors are in Blueprints. But the sort of core of how
enemies can work is code.>>Victor: They were asking
about– they had quite a few questions
about AI and the enemies. And they were curious
if you were mainly using behavior trees, or if
you rolled your own system for that.>>Brian: It’s kind of a hybrid. We largely ended up
rolling our own tech there, just because we ended
up needing to make such a volume of enemies. Because we don’t
just make a psycho, we make, like, 35
different types of psychos. It’s like, Psycho Bill
has two extra behaviors that a regular
psycho doesn’t do. And a badass needs to
behave differently. So because we need things
to sort of scale as quickly as possible, we just
sort of ended up needing to do our own solution.>>Victor: Has that been
something that’s been ongoing since the previous ones?>>Brian: Kind of everything
between– after the old generation of
games, 1, 2, and Pre-Sequel. Full reset button when
we moved to Unreal 4. So a lot of core systems, like
our scale system and attribute system, they all
kind of came over. They got refactored,
but they came over. So kind of everything
was up for debate when we were migrating
over to the new Engine.>>Victor: When it comes to
Nav Mesh and how you’re doing the
pathfinding, are you using the stock Nav Mesh?>>Brian: We use Havok. So basically– let’s see if I
can find my tiny little Actor in here. Probably not. Somewhere centered around here. That’s a spawner. Basically we end up with
these Nav Mesh sections that we can drag around. And these allow us to– again, more areas
that we’re using data. Different bosses have
different needs to them. So certain– little itty-bitty
skags that run around don’t need a ton of Nav
Mesh, versus a giant spider ant that needs a lot bigger
of a radius to walk around. So we sort of define large
areas, what specific– certain bosses need certain– man, I apologize for whatever
I’ve done to the mouse. Certain bosses have
additional stuff. But generally we can say,
default large vehicle, and save all those things into
these sort of Nav presets. And these define basically how
it paints, how large the cell sizes are, and stuff like that. But Havok is our solution
for a lot of this.>>Victor: They were wondering– so
you’ve been working a long time with Unreal, clearly. They were wondering
what some of the main changes were that you noted
over the last decade.>>Brian: Oh, man. Sequencer is amazing. I was formerly a
cinematics director. So I lived in Matinee for
a good decade of my life. And Sequencer is everything
I wanted and more. The ability to really drill down
and build Cinematic Actors– in this case, I was able to
build that Actor, and then kind of drill in,
and actually get into each of its
individual components, and work with them individually. It has been such a time saver. I’m not sure if I have the
marketing directory in here. But basically–
actually, I bet I have this in the next directory. Yeah. I’m not going to load. Player character–
that will be expensive. All right, I’ll take that
hitch for a little second as I load them out. But we were able to leverage
that in Blueprint to– basically we made all
these kind of like paper dolls, as I’d call them,
of common enemies and stuff like that. So it’s like now I have
the Maliwan Soldier. And the Maliwan Soldier has
his baton, and his shield, and individual
particles that needed to be attached to him for
a cutscene that he’s in. And just to be able to
take that, and save him– stop grabbing
components, Brian– save him off so that we
could use him in a cutscene, and drill into all those
things, and animate them, and do Material work on them
individually, and just keep drilling in. Couldn’t do any of
that in Matinee. When all this tooling started
to come online, we were– I think just the little
incremental changes between 4.12 and 4.20,
where we ended up, every single time was like a new
happy day, when it’s like, hey, we did the merge this weekend,
and we moved from 4.18 to 4.19. I’m like all right. What can I dig in and play with? We were able to do– our marketing team
just killed it, both internally and
at 2K, and some of the external
contractors we worked with, because we were able to do
so much time-saving work by just making all this as
user-friendly as possible. And I didn’t have to teach
1,000 cinematic people how to build this Maliwan Soldier. Now he’s just pre-built.>>Victor: And it’s all default
Sequencer?>>Brian: Yeah. We did a little bit– just because where we
were before we stopped taking merges, we
did a little bit of custom stuff for replication,
just to make the replication work a little more seamlessly.>>Victor: There have been
quite a few questions sort of around the
networking portion. I think one of the general ones
was in the vein of, how often– when you, as a designer, or one
of your designers on the team, set out to build anything that
is interactive in the game, it should work for
several players. When you start that
work, do you set it out with networking in mind? Or do you let the designers
create it, and then–>>Brian: It sort of depends on
what you’re working with. If you’re working with
an enemy, the server has authority there in how
projectiles are being spawned. Or usually server
has total authority over what’s happening. When we have other things, like
doors and scripted things that are working, like I said,
these things are state machines fundamentally. We call them
interactive objects. And they become pre-built with
a number of states in them. So let me see if I
can come down here. So like there’s a default set
of enabled, enabling, default, locked, and whether
it’s interactive or not, that are sort of
pre-baked in them. They’re just enumerated states. The niceness from that
is that we end up– the state change is replicated. So we have a lot of
things that happen where a state is going to change,
and we know that that event is replicated to everybody,
but everything that happens after the
event is all client-driven. So the fact that that
door was told to open is sent to all of us. But the actual it moving is
all on each of our clients. And that way we’re not just
constantly pushing things down the pipe to people. So there’s little
efficiency savings like that that
really go a long way. Like I said, the little
sort of custom stuff we did around Sequencer was– so there’s a heartbeat
replication going on, so that it’s not
replicating every frame to every person, but
it’s sort of saying, like, hey, I’m at 1 second, I’m
at 2 seconds, I’m at 3 seconds. And that way, if you had a lag
spike, you would just sort of sync up eventually. So try to be efficient about
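That heartbeat idea — advance the cinematic clock locally every frame, and only snap to the server's coarse timestamp when drift gets too large — can be sketched as a single function. The tolerance value here is an illustrative assumption, not Gearbox's actual number:

```python
def advance_cinematic_clock(local_time, frame_dt, server_heartbeat, tolerance=0.25):
    """Advance playback locally; resync to the server's coarse heartbeat
    (e.g. sent once a second) only if drift exceeds the tolerance,
    which is what happens after a lag spike."""
    local_time += frame_dt
    if abs(local_time - server_heartbeat) > tolerance:
        local_time = server_heartbeat  # snap back in sync
    return local_time
```

The design trade is exactly what Brian describes: the server sends almost nothing per frame, and clients self-correct eventually rather than being driven frame by frame.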
what we send down the pipe.>>Victor: All right, I’m going to try
to pick some of the good questions here out of the
ones we have left. I thought this one
was pretty cool. Did you adopt some
location and art style from real-world
culture or mythology?>>Ryan: We have a huge team
of concept artists who pulled influence
from all over the place. We have libraries of classic art
books, other games art books, just geographical picture
books, a huge reference library. And these guys are so creative
when they pull these art books out and get inspired by them. So there’s nothing
off the top of my head that is like, oh,
this is absolutely where we drew our influence. But you can see through some
of the planets and stuff, like Promethea, for instance,
where a lot of our artists will draw inspiration
from there. There’s a little bit
of Akira in there. There’s Blade Runner in
there, with the orange fog. We would look at movies, and
we’d be like, that is amazing. What would our spin on that be? So it’s not just
geographical stuff that we draw influence from. Yeah, we’ll look at that for
reference for biomes and things like that. But we’ll also look at some
of our favorite movies, and kind of create big
hodgepodges of biomes, as we call them. We call all the planets biomes. So we have the city biome,
an Eden biome, and the Necro biome, and all that stuff.>>Brian: And so much of our aesthetic is
literally about jamming things into other things. So even when it comes to
like biome concepting, it’s usually never like,
that one place was rad, or that one country
has cool architecture. It’s like, yeah, we like the
light sconces off that thing, but we like the tile work on
the roof of this other thing.>>Ryan: It was also tough, too. Because we know how
to make Pandora. We’ve been there so many
times, for three games. And so when we were
like, all right, we’re going to go to
different planets, it became a question of how do
we get the feel of Borderlands, which was these
big, chunky reads, and kind of like attaching
trash together to make it look like it’s usable. It’s like backyard sci-fi.>>Brian: Put some rebar on that. It’s Pandora.>>Ryan: How do we get that feel
in these other planets. And that was a huge
challenge for the art team. But I think they nailed it. They did such a good job. I’m always like in awe
when I play the game and see how our level
artists mash things together, and all the source that
our content art team makes. It’s just always so cool. Looking at individual
Assets, and seeing how they like ink
everything by hand, and all the work that goes
into the color choices and form reads, it’s awe-inspiring,
jaw-dropping. Those guys are so pro at this.>>Victor: I just realized that I
didn’t mention that we had keys for the stream.>>Brian: Surprise.>>Victor: Surprise. If you’ve been sticking
around for this long–>>Brian: Your odds just went up.>>Victor: I was so excited
about the technology that I forgot about
the actual game. Let’s figure out what
to do with them after, because we literally have
less than five minutes left of the stream. So that would be
an interesting– we will figure out what we’re
going to do with these keys. They’re not going to stay here. We already have the game. So we’re going to figure
out what to do with them. But Gearbox was kind
enough to give us 10 keys for Twitch and YouTube. So we’re going to figure
out what to do with them. Actually, I have an idea. Why don’t you go ahead
and fill out the survey that I believe Amanda
linked in chat earlier. Maybe we can link it again. And if you put your
email in there, rather than just a T-shirt–
we usually give out one T-shirt for everyone who
fills out the survey– we can split out the
20 Borderlands codes for the game on the game store. Does that sound good?>>Brian: Yep. Cool. Sounds good to me.>>Victor: Yeah, let’s do that. So if you didn’t catch
that, fill out the survey. Let us know what you
thought of the stream today, and what you’d like
to see in the future. Let these guys know
how awesome they were doing on stream,
because I think they are doing absolutely fantastic.>>Ryan: Thanks.>>Victor: And then we will go ahead
and pick some lucky winners off of those emails, and
hand out the keys. Yay, solutions. Let’s see, I think I
had one last question, and then we’re going
to wrap this up. Let’s see. There’s a long list
of questions here. What mistakes were the most
costly in Blueprint design work over the course
of the project? Does any particular issue, such
as repeated errors, et cetera, stand out?>>Brian: The biggest
problem was scope. In a great way, everybody
wanted to jump into the Engine and start making stuff. And Blueprint lets you
jump into the Engine and start making stuff. Wrangling the scope down, seeing
how much we could actually convert back down to
standard processes. Like I said, you’re
building the bike as you’re learning
how to ride the bike. And some things
don’t always work. So I think that’s one of our
number one targets, is reducing the need for it in
a lot of places, and just reducing the
pure volume of it. Because a lot of it can
end up living in data, or it can live in
data-only Blueprints, or it can live in a lot of
other different formats. And now we know what we know. And it lets us avoid the error
the second time, hopefully.>>Victor: That
sounds pretty good. Chat was mentioning
earlier how to get a job as an FX artist
at Gearbox: 95% fire portfolio. [CHUCKLES]>>Ryan: Fire and explosions–
yeah.>>Victor: And sassy dialogue
lines.>>Ryan: Yeah.>>Victor: All the fun stuff. That’s great. Thank you so much for coming
out to HQ, spending all the time preparing this content for us. I liked the little
special– the little sign you did for your level.>>Brian: Best thing you can ever
do is bribe an artist into making you a
fully-inked lettering set. Because marketing will
pay for it eventually. You get a lot of usage out
of letters in the Borderlands style.>>Ryan: Thank you so much for
having us on your stream. It was really fun. Shout-out everyone
back at Gearbox. You guys are awesome. I love working with everyone.>>Victor: Some of them were
in chat, actually.>>Ryan: Oh yeah?>>Victor: Yeah, there were some
developers chatting.>>Brian: Thank you for stalking us and answering questions. We’ll see you tomorrow.>>Victor: They’ll be back later
tonight. For all of you joining us again who usually watch the stream, thank you for watching. As always, I like to
leave a couple of notes about what happens
in our community. If you haven’t checked out our Meetup page, go ahead and do so at UnrealEngine.com/user-groups. There might be a
Meetup near you, in case you’d like to meet
with like-minded people who are working with the tools. It doesn’t matter if you’re
new to the Engine or not. We all love to share. As we can tell here, even
Gearbox is coming out. They’re sharing some of
their knowledge with us. Go ahead and check that out. If there are no Meetup
groups in your area and you’re curious about what
it would take to start one, go ahead and send an email to
[email protected], and we’ll let you know
what that includes. As well, make sure you
check out our forums. That’s where we look at newly released projects, works in progress, and any issues you might have. Make sure you use the feedback form there as well if there’s anything you’d like to see happen with the Engine. As always, make sure
you let us know about your projects
so that we can spotlight them at the beginning of the stream. And if you stream
on Twitch, make sure you use the Unreal Engine
category so that we can tune in whenever you’re doing so. I try to follow as
much as possible. And a big special thanks
to you guys, as well as the rest of Gearbox,
who let you actually come out here and show off
all of these amazing tools. Next week on the stream– I haven’t announced
this yet, but I’m going to announce it right now– we’re actually going to be doing
a Blender to Unreal stream, with a little bit of a special
surprise for you there. We got something
cooking internally that we’re planning to release. So next week we’ll be doing a
little bit of Blender with Kay and James here in the studio. And I think, with that, it’s– this is literally
the longest stream I’ve ever done since
I started here.>>Brian: Hey!>>Victor: So I think that gives us all a good couple of knuckles. And with that said, I hope you
all have a fantastic week, and I’ll see you all next week. Bye, everyone. [MUSIC PLAYING]

14 thoughts on “Gearbox Software on Borderlands 3 | Live from HQ | Inside Unreal

  1. Wow, compare this to the 2 minutes of fluff you get with Unity… Unreal is leagues ahead on its videos.

  2. This is absolutely one of the best streams. It's also great to see Gearbox being this open with their development tools and processes. Not enough light gets shed on these topics. Huge props for putting this together (Y)

  3. I hope the UE4 engine team is listening. A lot of really cool stuff here that I would love to see implemented in the base engine.

  4. As someone learning Unreal, these are amazing. I spent about 4 hours taking notes and scrubbing back and forth

  5. In another video that was released closer to the release of BL3, it showed the weapon generation, and they had another tab where you could view the skeleton and the mesh, and they were able to toggle which meshes were visible for the weapon (BodyA, BodyB, etc). I was wondering how this was achieved? Did you guys modify the FBX importer in Unreal? If so, how did you do it?

  6. Houdini is super powerful, but man, holy hell, the realtime just becomes wait time… I've had to wait through long intervals and it really kills your workflow at crucial times.