Inside Marvel’s Spider-Man 2: the Digital Foundry tech interview

Marvel’s Spider-Man 2 for PlayStation 5 lands pretty much exactly where we hoped it would. Insomniac Games has delivered another game that pushes back the console’s technological boundaries, with a denser, richer representation of New York City, expanded to include Brooklyn and Queens, while the core rendering features dramatic improvements to ray-traced reflections, along with the introduction of RT reflections on bodies of water. Meanwhile, the studio firmly embraces the potential of the console’s SSD with streaming that allows for faster, more dramatic traversal through the city, along with some spectacular streaming-based set-pieces.

In our tech review of the game, we had plenty of theories about how some of Insomniac’s tech had been enhanced over Marvel’s Spider-Man Remastered and Miles Morales, but we were hungry for much more information. Sony and Insomniac agreed to an extended interview with Insomniac’s Director of Core Technology, Mike Fitzgerald – and we got everything we wanted, and much more.

In this interview, John Linneman talks extensively to Mike about the studio’s commitment to ray tracing, how raster-only modes were removed from the game, and how many of its showpiece achievements were realised. We also talk about streaming and compression, along with some of the less noticeable – but still crucial – technological components in the game.


If you don’t fancy strapping yourself in for a 6000-word tech interview, here’s the video source for the text below. Watch on YouTube


Digital Foundry: I wanted to get your first impressions and first thoughts on the approach that you wanted to take in building this new game. Where did you want to take the technology for this next title?

Mike Fitzgerald: Well, you know, the fun thing to do is to take a photo of New York City and put it next to the game and try to find the exact perspective, right? And then say, well, where are we? Where are we falling short of this? Now, you know, ‘photo-real’ is not exactly the goal. I think it’s more ‘Marvel-real’ or ‘game-real’ or something, some sort of somewhat exaggerated, dramatic interpretation of it. But off the top of my head, some of the things we pointed out were that our buildings got a bit flat in the distance, the lighting would drop out, they didn’t have a lot of macro breakup and variation, so that was one thing we were excited about. Ray-traced reflections on our PS5 launch titles were awesome, but there were still a lot of tricks and fakes around building interiors, which was a fun area we were excited to tackle.

And then as soon as we knew we were doing Brooklyn and Queens, we knew we’d be going across the river and we’d be having a whole set of unique challenges with the water rendering. But really, you get to the end of one game and the whole team gets to pick up and look a bit farther to the metaphorical horizon about what they’re doing and get ambitious and think about crazy things they want to try. And so everyone has their own little pet thing to do. Doing Ratchet and Clank: Rift Apart gave us a lot of fun ideas about how to do loading and streaming differently in Spider-Man 2.


Digital Foundry: And that’s something we do want to get to, but I actually want to start with one of the things that really caught my eye when looking at the game, and it’s the building interiors. So for those who watched our video, you may have noticed this: every single building, every one of these skyscrapers, now features what looks like a fully modelled interior, with dynamic characters even sitting around inside these rooms. And we had our hypothesis, but I’d like to hear directly from you. What are you guys doing to pull it off? How did you solve this problem?


Mike Fitzgerald:
Yeah, so great job [on the video]. I think noticing that checkerboarding was a great tell. So those are ray-traced interiors for rooms, and what we’re doing is almost path-tracing that space, in a sense, with simplistic geometry and lighting. But taking a step back, I think we had the cube map technique in the previous games, which is clever and fun – and it’s fun to crawl around corners and notice all the ‘oddnesses’ of it. We actually had another technique we were trying on top of that, which was to add a sort of parallax depth to the interior, and you could get these sort of 3D shapes, especially along the back wall. Like if you wanted to put a desk back there, or bookshelves on the sides, that sort of convex geometry along the walls looked quite good. But a couple of really talented folks on our render team proposed this idea of: hey, every window we hit, we’re tracing rays out into the world. Why not render an interior by using that same system and tossing rays elsewhere?

So we have, I think, 32 fake rooms deep underground in the city, buried somewhere below the ground plane. And they all have sort of a basic layout and then different variations on furniture and characters that might be in there. And then as you trace into a room, we use an ID for that window. Okay, so floor five, window three of the building at this corner will use this room, and this sort of random set of interiors for that room. And then we can filter the BVH down to that and hit exactly some set of objects in there, even animated objects like ceiling fans, or characters who might be watching TV. We have some pre-calculated lighting in there. And we can also cast rays back through the real window to get a sense for the key light, where the sun is, where the shadows might be entering the room and all that. And it worked [laughs]. It comes together really nicely. You get that sense of movement back there, that adds a lot. And that’s how we did the interiors for this game.
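Insomniac hasn’t published its implementation, but the deterministic window-to-room mapping Fitzgerald describes can be sketched in miniature. Everything below – the hash, the names, the numbers – is our own illustrative assumption, not Insomniac’s code:

```cpp
#include <cstdint>
#include <cstdio>

// Hypothetical sketch: pick one of N prototype rooms for a given window.
// The real engine redirects window rays into rooms stored below the ground
// plane; here we only show the deterministic window -> room selection idea.
constexpr uint32_t kNumPrototypeRooms = 32; // per the interview

// Simple integer hash (multiply-xor mix); any stable hash would do.
uint32_t HashWindowId(uint32_t buildingId, uint32_t floor, uint32_t window) {
    uint32_t h = buildingId * 2654435761u ^ (floor * 40503u) ^ (window * 9973u);
    h ^= h >> 16; h *= 0x7feb352du;
    h ^= h >> 15; h *= 0x846ca68bu;
    h ^= h >> 16;
    return h;
}

// Every ray that hits this window maps to the same room variant, so the
// interior stays consistent from any viewing angle.
uint32_t SelectRoom(uint32_t buildingId, uint32_t floor, uint32_t window) {
    return HashWindowId(buildingId, floor, window) % kNumPrototypeRooms;
}

int main() {
    // Floor five, window three of some building always resolves to one room.
    printf("room = %u\n", SelectRoom(/*buildingId=*/1234, /*floor=*/5, /*window=*/3));
}
```

The key property is stability: because the selection depends only on the window’s identity, every ray that hits the same window sees the same interior, which is what sells the effect as you swing past.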


Digital Foundry: And also, there are perfect shadows, of course, a benefit of tracing into these scenes. I think shadow maps would be expensive and probably not look very good.


Mike Fitzgerald:
Yeah, not for each window. Certainly.


Digital Foundry: I also liked how, even on the sides of buildings, you could sometimes see into an office, and there’d be a back door at the rear of that office that seemed to go around into the other side of the building. It’s a very clever trick. And that is seriously a very difficult problem to solve. I think this is the first time I’ve seen a game tackle it in such an elegant fashion, where you really do get the feeling of depth on all the buildings. But another aspect of this depth that really caught my interest is the way you handled secondary reflections.

This is something I pointed out in my video on Miles Morales back at the launch of PS5: when you have two windows side-by-side, they’re gonna reflect each other, and then the reflection within the reflection should also appear – and that was absent in those original games, but I noticed it is actually present here. I’m wondering what the source is for the secondary reflections? How did you actually solve this problem?


Mike Fitzgerald:
Mostly, we just use a plain probe of that area – what the reflections were before we had ray-traced reflections. We can fall back to that technique anywhere, just like we do on very rough surfaces, where we’re using some sort of runtime or more ‘baked’ calculation of what’s around there. Because you don’t really need to see movement in the way you do with ray-traced reflections, it’s a great fallback for a secondary bounce in a reflection as well. It works for buildings, and it helps with cars a lot too, which I think looked a bit flat in reflections before – when they have a nice clear coat on them, you want to give a bit of that.
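As a rough illustration of that fallback – trace the first bounce for real, resolve the bounce-within-a-bounce from a baked probe – here’s a minimal sketch; the types and functions are stand-ins we’ve invented, not engine code:

```cpp
#include <cstdio>

struct Vec3 { float x, y, z; };

// Stand-in for a full BVH trace: a real renderer walks the scene here.
Vec3 TraceSceneRay(Vec3, Vec3) { return {0.8f, 0.8f, 0.9f}; }

// Stand-in for a baked environment probe lookup near the hit position.
Vec3 SampleProbe(Vec3, Vec3) { return {0.6f, 0.6f, 0.7f}; }

// Trace real rays for the first reflection bounce, but resolve any
// secondary bounce (a reflection seen inside a reflection) from the
// pre-baked probe instead of tracing again.
Vec3 ShadeReflection(Vec3 hitPos, Vec3 reflDir, int bounce) {
    if (bounce == 0) return TraceSceneRay(hitPos, reflDir); // dynamic, exact
    return SampleProbe(hitPos, reflDir); // static, but "close enough"
}

int main() {
    Vec3 c = ShadeReflection({0, 0, 0}, {0, 1, 0}, /*bounce=*/1);
    printf("secondary bounce colour: %.2f %.2f %.2f\n", c.x, c.y, c.z);
}
```

The design trade-off is exactly the one Fitzgerald names: a static probe can’t show movement, but two mirrors deep you rarely notice, and it costs a texture fetch rather than another ray.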


Digital Foundry: I noticed the nice clear coat on those cars.


Mike Fitzgerald:
We put a lot of work into that – that was always a pet peeve.


Digital Foundry: The materials in general definitely feel like a step up, especially on the more minute objects, the smaller objects. But I don’t want to get away from ray tracing just yet, obviously. Because another big thing, obviously, is the water reflections, which are now ray-traced. This discussion has been interesting, because I think a lot of people, especially with the way games used to work, maybe didn’t consider how rough, choppy water actually behaves when light refracts and reflects off its surface. But I am curious to see how you tackled this challenge, both in terms of how you leverage the existing ray tracing features and in terms of how you handle something like this rough water, which is obviously not like a smooth, glossy surface – the reflection is more diffuse. So what kind of extra cost are we looking at? How do you pull it off?

Mike Fitzgerald: There’s always a lot of balance with this. Pretty early on, as soon as we had gameplay happening over the water – whether it was a mission like we showed in our first gameplay demo, or some early wingsuit work (it wasn’t even a wingsuit then), just trying to get over to Brooklyn and Queens – you very quickly see it fall apart. And actually there are a lot of screenshots of the first game on PS4 where the building reflections fall apart really badly; you get this full silhouette of the character in front of them. And we just knew we could do better. But when you naively drop ray tracing on that water, the performance spikes far, far too high. It was really expensive. We tried to get around it. We tried doing a kind of planar reflection technique, rendering from underneath up through the water into the scene. But hey, rendering a whole other scene is really expensive as well, and trying to translate that into the roughness of the water didn’t quite work. And so we got to a point where we said: ray tracing is going to be the right way to do this, so how can we mitigate that performance? Our graphics programmers are awesome, and the one in particular who focuses on this stuff did a great job finding the right compromises – ones you don’t notice when you’re looking at water, as opposed to windows.

So we actually render the water reflections at quarter resolution horizontally, and because they’re so stretched out, you don’t notice that at all. And there is always this fun march when you’re focused on optimising one thing: you get the progress updates during the day, and it’s like, ‘hey, here’s a side-by-side. This one is literally half as many rays, half the performance cost, it looks exactly the same, right?’ Everyone’s like, yeah, pretty much. And so it’s finding those little bits where you can save time and effort. A big one for water, which makes it so expensive, is that it’s so choppy that your rays shoot in all different directions, and that’s really challenging for ray tracing hardware. So how can you bin them? How can you group them? How can you maybe not shoot them in exactly the right direction, but one that’s close enough and yet more coherent with what’s nearby, where you can pull some extra performance out of the hardware? And really, it’s just pulling that thread over and over repeatedly and eventually getting to a point where we could do it in performance mode as well, which was huge for the quality of performance mode.
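Fitzgerald doesn’t detail the binning scheme, but a common way to trade a sliver of accuracy for coherence is to snap reflection directions onto a coarse lattice, so that neighbouring water pixels fire near-identical rays and traverse similar BVH paths. A minimal sketch, with the lattice resolution entirely our own assumption:

```cpp
#include <cmath>
#include <cstdio>

// Illustrative only: quantise a reflection direction onto a coarse grid so
// nearby water pixels shoot near-identical rays. Choppy water scatters ray
// directions wildly, which is hostile to RT hardware; snapping them to a
// small set of representative directions ("close enough, but coherent")
// recovers performance at a quality cost you rarely notice on rough water.
struct Vec3 { float x, y, z; };

Vec3 Normalize(Vec3 v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return {v.x / len, v.y / len, v.z / len};
}

Vec3 QuantizeDirection(Vec3 d, float cellsPerUnit) {
    // Snap each component to a lattice, then renormalise.
    Vec3 q = {std::round(d.x * cellsPerUnit) / cellsPerUnit,
              std::round(d.y * cellsPerUnit) / cellsPerUnit,
              std::round(d.z * cellsPerUnit) / cellsPerUnit};
    return Normalize(q);
}

int main() {
    // Two slightly different wave normals produce the same binned ray.
    Vec3 a = QuantizeDirection(Normalize({0.11f, 0.98f, 0.05f}), 8.0f);
    Vec3 b = QuantizeDirection(Normalize({0.13f, 0.97f, 0.06f}), 8.0f);
    printf("a=(%.3f %.3f %.3f) b=(%.3f %.3f %.3f)\n", a.x, a.y, a.z, b.x, b.y, b.z);
}
```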


Digital Foundry: That’s interesting. So yeah, I was wondering about hitting 60fps or higher with ray-traced water. Do you remember the point where you said, ‘you know what, we’re just gonna go all in on ray tracing here, and drop the fallbacks’? Did that happen early in development? Was that always the goal? Or did you just happen to get enough performance that you were able to say, ‘yeah, we can actually do this’?


Mike Fitzgerald:
Well, it has always pained me whenever I see a screenshot of the first couple of Spider-Man games, or if I see Clank, with ray tracing turned off. I always know immediately when I see that screenshot, and it always bums me out. And so it was always an aspiration for this one: wouldn’t it be great if we didn’t have to have that real big compromise in there, if we got back to a pure performance and resolution trade-off? I don’t think we committed to it until earlier this year, but basically, we saw the way performance was trending and we said, ‘you know, it’ll be work, but let’s go for it’. I feel like we got there with the last games, but we always got there, like, a week after launch. Or right at the deadline, and we were never quite confident enough. And so this time, it just took a bit of ‘no, we’re gonna do it, and we can get to where it needs to be.’

Marvel’s Spider-Man 2 – the Digital Foundry video review. Watch on YouTube


Digital Foundry: I like the word trending, because that’s something I think is really important to consider when optimising any of these games: you have to think very far ahead into the future, right? You can’t just say at the last minute, ‘yeah, we’re going to do a 60fps mode’. It’s something that has to be planned early on and worked towards. And obviously, that must have been a lot of hard work with all that ray tracing going on. But another thing about the ray tracing that was interesting is that it seems like the particles – not all of them, but many of them – are now rendered in the ray-traced reflections. And this was definitely not the case in the last games, or even in the earlier Spider-Man 2 trailers. What’s the story there?


Mike Fitzgerald:
Trailers are so funny. You make them and there’s this sort of perception of, ‘well, it’s all smoke and mirrors and the real shipping game is not even going to look that good’. But when our media team – who’s awesome – was capturing those trailers, we were always thinking, ‘well, don’t put that scene in, because we know we have this thing coming that’s going to make it look better’. And when the trailers come out, I know our core tech team is always thinking, ‘oh, but it looks better now – why couldn’t how it looks now be in the trailer?’ So in particular, I think it was the story trailer: you’re getting pulled up the side of a building behind Lizard, and there’s this whole truck bouncing around, and there’s all this fire and smoke – and none of that fire and smoke is reflected in the building. And I think the day the trailer came out is the day that we got that feature up and running and looked at that scene in the game. And we’re like, ‘ah, but it looks better now.’


Digital Foundry: I do actually want to ask you a little bit about the lighting for this game – the way you handled global illumination, the different lighting methods, and whether there are any improvements there. I think there’s real interest in understanding the way you do your pre-calculated lighting pass.


Mike Fitzgerald:
We have a GI bake across the city and in all the custom interiors for missions and things like that. The city is enormous – it’s twice as big in this game – and we always want to be increasing the quality of that bake, whether it’s resolution or whether we capture specular as well; all these different factors of the light that we’re baking in and exposing to the real-time part of the game. And it’s funny, this stuff takes ages to bake across the city, and the city is also changing during development, so you can’t just tell a team, ‘well, if you’re working on this street corner, make sure you rebake all the lighting there’. They don’t want to sit there for an hour to pre-bake it all just because they moved some stuff around. So in this game, we actually worked quite a bit on the non-glamorous part that doesn’t show up in the final product, in a sense, but lets us iterate on all that more easily and cleanly. We used to have a farm of machines baking the city over a week, and now one machine can bake the sunset lighting setup in four days or something like that. So that’s a big, iterative thing for us as we go.

Looking at the city, I mentioned some big-picture things about it versus real New York, and one is that all our dynamic lighting had to stop at some point in the distance. Now we have a version of the light bake for the entire level – the entire world – that can be loaded at one time. Another thing is that looking from the Manhattan side across the river to Brooklyn, you’re looking at something very far away with nothing in between. Usually there are buildings close by, and that’s what you’re seeing; but this was something far, far away that was also all you cared about and wanted to look at. So we needed to make sure our lighting over there was good as well. It’s a nice side-by-side of seeing how much farther real dynamic lighting at street level, and real GI bakes in between the buildings, hold up at a distance.


Digital Foundry: Yeah, and there’s a good number of times of day in this game, and each one of those has its own separate bake. Just offhand, do you have any idea how much larger the lighting data is compared to, say, Spider-Man on PS4?


Mike Fitzgerald:
I don’t recall exactly what it was on PS4, but I know we were so constrained by how big things were to stream off the disk that it couldn’t have been very large. I want to say that for this one, streaming data for the open world is maybe a third of the disk, something like that – it’s a pretty substantial component of the data, a big chunk. And the team goes through a lot of work to compress that data, finding efficient ways of storing it, putting it in formats that can be nicely compressed by the Kraken compressor on the PS5. There are all sorts of tricks to get stuff to fit on a Blu-ray, which is a challenge of its own, and then also to stay small enough to stream in as you play the game.


Digital Foundry: Thinking of the lighting there, I’m just wondering if you’ve examined or explored the cost of something like doing ray-traced global illumination? Do you think it’s actually feasible on these platforms in a game like this? Or is it still just a little bit too much, especially if you factor in the reflection pass?


Mike Fitzgerald:
Every game has its own trade-offs in environment and style of play. One thing we looked at was – I mean, we’re in New York City, the reflections are just hugely important for the look of that space, and we never want to sacrifice any of that for something else. Another component is that we move so fast through that space that we can’t afford much time for rendering techniques to resolve. That’s moving through it, and that’s also camera cuts and things like that. And something I noticed as a compromise of ray-traced GI is that these scenes need time to settle. I mean, that’s a broad generalisation – some people are doing some awesome stuff with it. But for us, we wanted to focus on reflections and make sure any quick traversal through the city was really looking stable and correct, I guess, as much as we could as you were moving through it. But all that stuff is so exciting, and you can do some gorgeous stuff with those techniques, so we won’t be ignoring them!


Digital Foundry: There we go! Of course, you mentioned New York, and another part of New York is the density of stuff. Something I really noticed and picked up on here is that there’s so much more traffic, pedestrians and so on. Plus there’s just more general detail from any perch; it feels like there’s more detail closer to the player. Could you talk a little bit about all those different factors and how that’s changed for this game?

Mike Fitzgerald: It’s funny to see people talk about it, because there’s this notion of ‘oh, there’s LODs and you just watch everything transition in and out of them’. But there are like 10 or 15 interlocking systems that all factor into how detailed something might look. You have textures streaming in and out at different distances to the camera and blending into each other. You have models with LODs that you see at different distances. You have our imposter system, which is for distant buildings – how do you represent those in a low-poly way that you can keep in memory all the time? You can see the whole world now, Brooklyn and Queens included, so how do we represent all those buildings efficiently? When do we transition to something better looking?

So in this game, we added a new high-res version of those imposters: there’s a new middle-distance version of a building that has more geometric detail than the far ones, but isn’t yet the independent models for different pieces that we’d need to render up close – though you do get real reflective windows on those, so that’s part of that transition. We also have a lot of objects that need to exist at a distance, things like big air conditioning units, or antennas on top of buildings, or water towers, which affect silhouettes. So those exist as independent models that we show at different distances, far away and close. The windows have their own techniques to blend at different distances, and the materials have their own levels of detail, as height effects and parallax occlusion come in and out on them as you get close and far. And the goal is to make it all invisible, so your brain fills in the details for the far stuff and makes you feel like it’s the same as if you were standing next to it.
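As a toy illustration of that tiering – full models up close, the new mid-distance imposter, and the always-resident far imposter – here’s a sketch; the distance thresholds are invented for the example and aren’t Insomniac’s numbers:

```cpp
#include <cstdio>

// Hypothetical sketch of the tiered building representation described
// above: unique models up close, the new high-res imposter in the middle
// distance, and the always-in-memory low-poly imposter far away.
enum class BuildingLod { FullModels, HighResImposter, FarImposter };

BuildingLod SelectBuildingLod(float metresFromCamera) {
    if (metresFromCamera < 250.0f)  return BuildingLod::FullModels;
    if (metresFromCamera < 1200.0f) return BuildingLod::HighResImposter;
    return BuildingLod::FarImposter; // low-poly, resident at all times
}

int main() {
    const float distances[] = {100.0f, 600.0f, 3000.0f};
    for (float d : distances)
        printf("%.0fm -> tier %d\n", d, static_cast<int>(SelectBuildingLod(d)));
}
```

In the shipping game the transitions are blended and layered across many systems (textures, windows, material detail) rather than a single hard switch like this, which is what makes them invisible in motion.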

Here’s our video content showing the differences between the various performance modes in Marvel’s Spider-Man 2. Watch on YouTube


Digital Foundry: That’s definitely something I really noticed, especially sitting up on some of the perches, just the general variety across the scene. In the original, you look out across the buildings and there’s a point where they just kind of turn to blocks. It’s within the constraints of a PS4, but here it feels like there’s much more subtlety on every surface from a distance, which does make a huge difference in terms of quality.


Mike Fitzgerald:
To speak to the wear and tear on buildings – sort of the blocky, flat sides – one thing we did was introduce this big processing step for buildings that looks at their geometry and says, ‘well, if it had been raining here for 50 years, here’s where the grime would be around the windowsills, or this is the wear pattern on the side of this building’. That can be a dynamic input to a material and shader when you’re close to it. But since we bake those imposters, we can bake it right into that unique imposter. So you can get ‘yes, that’s a brick building’, but we can feed in this sort of macro breakup to that brick material that says, ‘oh, there was a mural here and it faded away 10 years ago’. And that has added a lot of variety looking out over Manhattan, making it feel less cartoony and more real.
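To make the idea concrete, a geometry-driven weathering input might combine a few cheap per-texel signals into a grime mask. The inputs and weights below are entirely our own assumptions, not Insomniac’s bake:

```cpp
#include <algorithm>
#include <cstdio>

// Illustrative sketch: estimate where decades of rain would deposit grime
// from simple geometric signals, producing a mask that feeds the material
// (or gets baked into the building's unique imposter).
struct SurfacePoint {
    float occlusion;   // 0 = open, 1 = deep crevice (dirt collects here)
    float upFacing;    // dot(normal, up): vertical faces streak the most
    float belowLedge;  // 1 if directly under a windowsill or ledge
};

float GrimeAmount(const SurfacePoint& p) {
    float g = 0.5f * p.occlusion           // crevices trap dirt
            + 0.3f * p.belowLedge          // rain streaks below ledges
            + 0.2f * (1.0f - p.upFacing);  // walls streak more than roofs
    return std::clamp(g, 0.0f, 1.0f);      // mask fed to the shader/bake
}

int main() {
    SurfacePoint windowsill{0.7f, 0.1f, 1.0f};
    printf("grime = %.2f\n", GrimeAmount(windowsill));
}
```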


Digital Foundry: I mentioned traversal and I’m curious if you can speak on that a little bit, because that’s another new thing, the speed of traversal. What did that look like in terms of implementation? The camera can now just fly across the screen at what feels like three or four times the speed of what was possible in the original game.


Mike Fitzgerald:
Yeah, certainly, that was something we felt the potential for with Miles and Spider-Man Remastered, but we didn’t want those games to feel different on PS5 versus PS4 – we had balanced that gameplay and had all the systems built out. So we knew we wanted to push it a bit farther, and the wings were a great opportunity to do that. As you upgrade yourself in the game, I think you can go about twice as fast as in the first couple of games. And then there are the super slingshots on the building tops that fire you off about three times as fast, and some cinematic moments – which I think you caught in your video – that really have fun with moving the camera quickly through that space. It’s funny, the hardware in the console is extremely good. It’s a very fast SSD, and we’ve talked about this a million times. But part of what you need to do then is work on how fast your engine can interpret that data as it’s loading in, because that’s now your bounding factor. And we also do a lot of ‘how can we get away with less data?’ How much less can you load? There’s a lot of creativity around that.

One of our animation programmers came up with a way to stream animations, in a sense. Spider-Man is a very complex character. He’s got, you know, a million different moves and blends and state machines; he can fly and swing and punch guys, and the amount of animation data you have for the character is enormous. But we’re not playing it all simultaneously, so you don’t need all that data at the same time. But the gameplay needs to be reactive: you might need it next frame, or two frames from now, and you don’t know ahead of time – you can’t get ready for that. So he came up with a system where many animations have every third or fourth frame resident in memory all the time. And then when you go to play one, we hit the SSD, filling in all the extra frames of animation. At worst, you’re looking at one frame that blends imperfectly into the next. But really, it’s completely seamless, and you just cut your animation streaming costs by 75 or 80 percent. So that type of creativity across different groups – which you don’t see, but which comes through – that’s the sort of thing that enables fast traversal and streaming through the world.
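As a toy model of that sparse-resident scheme – every fourth frame kept in memory, the rest streamed in on demand, with interpolation covering the gap – here’s a sketch under our own assumptions (N=4, scalar ‘poses’ standing in for real animation data):

```cpp
#include <cstdio>
#include <vector>

// Keep every Nth frame resident; interpolate between resident keys the
// instant an animation is requested, and let the SSD fill in the missing
// frames a moment later.
struct AnimTrack {
    std::vector<float> frames;    // full data once streamed in
    std::vector<bool>  resident;  // which frames are currently loaded
};

AnimTrack MakeSparseResident(const std::vector<float>& full, int keepEvery = 4) {
    AnimTrack t{full, std::vector<bool>(full.size(), false)};
    for (size_t i = 0; i < full.size(); i += keepEvery) t.resident[i] = true;
    return t;
}

// Sample a frame: use the real data if it has streamed in; otherwise lerp
// between the nearest resident keyframes (at worst one slightly-off blend
// before the SSD catches up).
float Sample(const AnimTrack& t, size_t f) {
    if (t.resident[f]) return t.frames[f];
    size_t lo = f, hi = f;
    while (!t.resident[lo]) --lo;
    while (!t.resident[hi]) ++hi;
    float u = float(f - lo) / float(hi - lo);
    return t.frames[lo] * (1.0f - u) + t.frames[hi] * u;
}

int main() {
    AnimTrack t = MakeSparseResident({0, 1, 4, 9, 16, 25, 36, 49, 64});
    printf("frame 2 before streaming: %.2f (true value 4)\n", Sample(t, 2));
    t.resident[2] = true;  // pretend the SSD just delivered this frame
    printf("frame 2 after streaming:  %.2f\n", Sample(t, 2));
}
```

The payoff Fitzgerald describes falls straight out of the numbers: with one frame in four resident, roughly 75 percent of the animation data never needs to be in memory until the moment it’s played.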


Digital Foundry: This is actually an interesting discussion point that I’ve seen brought up often with traversal and fast loading. Conceptually, being able to dive directly down into the map, or load quickly into another area, or use portals – this stuff has existed in other games prior to Ratchet, prior to the Spider-Man games. But I think based on what you’ve said here, and just knowing about this game, the difference isn’t so much what you’re doing, it’s the amount of data that’s being moved to do it.


Mike Fitzgerald:
You know, I can’t speculate too much on what they’re doing. But here’s some of what’s gotten us to where we are. One is that even on PS4, we had that imposter system – this notion of a low-res version of the city that’s always around. That was a great boon when it came time to do ray-traced reflections, because it was the foundation of that system. Similarly, it’s a great boon when jumping into a fast travel system of some sort, where there’s something ready there, right, and it looks a lot like that pause map. We were able to work with PlayStation 5 hardware very early – we’ve had a lot of time with this hardware – and it has pushed us to make a lot of these systems more flexible and dynamic: to make sure our texture streaming is prioritising things correctly and efficiently hitting that hardware, to make sure we can unload stuff as dramatically and quickly as we can when we no longer need it. And really it has been an iterative walk, through those early games, through our launch titles, through Ratchet and Clank, to this game, where we are continually building on that ability to quickly get rid of what we don’t need, only keeping what we need next, quickly bringing in the most important stuff. Even knowing, as you look through a portal, ‘well, I need to stream in this texture on the other side of the portal, even though it’s dramatically far from where my camera happens to be at the moment’, and making sure all those systems fit together really nicely.

Looking for some more behind-the-scenes insights from Insomniac? Here’s our tech interview with Mike Fitzgerald, focusing on Ratchet and Clank: Rift Apart. Watch on YouTube


Digital Foundry: I’d be remiss if I didn’t ask you about sound, because this is something I really picked up on, especially as a user of a surround sound system. I’m wondering if you can talk about how some of these things work. How does room propagation work? How do sound reflections work?


Mike Fitzgerald:
These are techniques that have been around for a while. One of the problems is how to consistently apply them to a complicated world, how to give the right tools for authoring them, and how to make something that can run efficiently at runtime to truly give a sense for that. So there’s sound propagating through rooms, there’s sound reverberating in the correct way when you’re between buildings. One of my favourites is that there are rainy moments in the game, and that system projects out around you and understands all the surfaces that the rain might be hitting, then plays the correct sounds and mixes them correctly. So if you’re on a rooftop and there is an aluminium AC vent 10 feet away from you, then as you walk towards it, you will hear the sound of rain hitting aluminium, as opposed to rain hitting a rubber rooftop, and you will hear it spatialised correctly in that space. I know the audio team and our audio programmers take a lot of pride in that and in how they apply these techniques through the spaces that we have. I wish I could show you the debug visualisation for the rain. It’s really cool – it’s all these hexagons around you that visualise the space and show you what the rain is hitting.
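Insomniac hasn’t detailed the system, but the shape of it – probe the surfaces around the listener, pick a rain sound per material, weight by distance – can be sketched like this, with the materials, falloff and probe layout all our own assumptions:

```cpp
#include <cstdio>

// Hypothetical sketch of material-aware rain audio: sample points around
// the listener (the game visualises these as hexagons), find what surface
// the rain strikes at each one, and weight a per-material rain loop by
// distance so nearer surfaces dominate the mix.
enum class Material { Aluminium, Rubber, Concrete };

const char* RainLoopFor(Material m) {
    switch (m) {
        case Material::Aluminium: return "rain_on_aluminium";
        case Material::Rubber:    return "rain_on_rubber_roof";
        default:                  return "rain_on_concrete";
    }
}

// Simple inverse-distance gain; a real mixer would use a tuned falloff.
float GainAtDistance(float metres) { return 1.0f / (1.0f + metres); }

int main() {
    struct Probe { Material m; float distance; };
    Probe nearby[] = {{Material::Aluminium, 3.0f}, {Material::Rubber, 0.5f}};
    for (const Probe& p : nearby)
        printf("%s at gain %.2f\n", RainLoopFor(p.m), GainAtDistance(p.distance));
}
```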

Digital Foundry: Thinking of spatial audio then, one of the big features in the recent PlayStation firmware was the addition of Dolby Atmos support, where it can basically take the Tempest audio data and create a real 3D effect in a setup that has Atmos support. I’m curious what kind of visibility you as a developer have on that feature and if you can actually do any sort of custom mix for Atmos, or if it’s just taking the Tempest audio data that’s created for headphones and applying it over that.


Mike Fitzgerald:
So by default, we get a great application of the Tempest 3D spatialisation to an Atmos setup. But with the new updates, we can actually have the game target Atmos more specifically. Our cinematics audio team did a ton of custom Atmos mixes for this game – as soon as they knew it was coming, they did it immediately. And we’re super excited to be, I think, the first PlayStation game that has custom Atmos mixes. We [did this] right as the feature came out, so we timed it just about right. So yes, if you do have an Atmos setup, you get some awesome mixes through cinematics and the other stuff, which is really cool.


Digital Foundry: So, is there anything else you want to wrap up with – anything you’re especially proud of in the game?


Mike Fitzgerald:
I want to give a shout out to some of the mundane problems that you don’t see in the action as you play the game, but that are so important. Like I mentioned before, we have to fit this game on a Blu-ray disc – I mean, you can go for multiple Blu-rays, but the goal is to fit it on one. And this game, I want to say, is only maybe 20 percent larger than Spider-Man Remastered as far as the disk size is concerned, but the amount of extra fidelity and quality and cinematics we packed in there is huge. And this factors into streaming and how we utilise that bandwidth as well. So, one example: we have a ton of animation data in this game – our faces are more detailed, the bodies are more detailed. One of the first things we showed was a cinematic with Kraven where he’s hunting another hunter and he’s kind of bored, because it’s not challenging enough for him. And I think we made that cinematic, and the animation team animated it the way they wanted – and it was like a gigabyte in size. This is a 30-second cinematic shot; it can’t take up more than a full percent of the disk.

And so we had to find ways to shrink that data. Our technical animation team came up with some awesome reverse skinning and joint setups, and a couple of folks in my department put a lot of work into understanding the Kraken compressor and how we can work with textures and things to just cram everything down small, fit it onto a disc, make sure people have a good-sized download that’s not too onerous to fit on their console, and deliver something that, if you put it side by side, feels head and shoulders above the first game – and yet it’s only a little bit larger. So that’s the behind-the-scenes thing that’s fun, actually. You mentioned pedestrians and open-world stuff. Vehicles have suspension and behave differently now, and you see a lot more behaviour around pedestrians and cars. The pedestrians walking around the world can go in and out of buildings and shops, which they didn’t do before. They have varying body sizes and their outfits are much more randomised and swapped around, so you get a lot more diversity of space and character. They have conversations with each other.

There’s this whole modular conversation system where they can have a topic and discuss it, and then one disagrees and the other knows how to disagree or change the topic. And you know, it’s really fun to just sit and listen to people around the city have these sort of reasonable conversations with each other. As you walk around, there are all sorts of little touches. I’m excited for people to spend some time with the game, find random stuff, and hopefully have some wow moments and ‘how did they do that?’ moments. And that’s part of the fun of the new game, I guess!


Digital Foundry: You get that right from the start with the Sandman fight – okay, this is huge.


Mike Fitzgerald:
Yeah, I know, we showed it in the launch trailer, and people said, ‘oh, spoiler!’, and I’m just like… there’s so much stuff to experience. This is not ruining anything for anybody. I hope you’re excited. Even just that one fight has so many surprises and twists and turns. And there’s some really awesome work that went into the bosses and cinematic moments across this game.


Digital Foundry: I also appreciate you mentioning the Kraven cutscene at the beginning. That’s not even in the city – it’s a completely different, bespoke jungle map made just for that sequence. It’s so cool that you’re able to seamlessly transition to these different spaces.


Mike Fitzgerald:
It was a fun challenge for our technical art team to say, ‘well, yeah, it’s only for this one cinematic, but we really want to have a realistic jungle and light propagation through a canopy’. And the leaves have to have the right subsurface, and they have to slowly wave around, and all that. So a lot of effort goes into individual shots and spaces, but it’s one of the great parts about working at the studio – people really care about their craft and want every little bit of it to look good and come across well to others.


Digital Foundry: I’m really happy to hear all that, and I’m glad that you’re still pushing your own in-house technology. I know that’s not easy to support, but it’s something that people notice and appreciate, and it does help create these unique experiences. Well done on the game is what I will say!


Mike Fitzgerald:
Thank you very much. Thank you. I’m just again excited for players to spend some time with it and have a good time.




