Modern Warfare® Initial Intel: Call of Duty®: Modern Warfare’s game engine is put through its paces

Modern Warfare, Technically Speaking: Principal Rendering Engineer and Studio Head of Infinity Ward Poland, Michal Drobot, discusses the tech behind the new Call of Duty.

By now you’re probably very familiar with the launch trailer, as well as the smattering of screenshots released showing the new game. Just to reaffirm what you’re seeing: these visuals are ‘in-game’ (not pre-rendered, not an ‘in-engine’ cut-scene, or something ‘aspirational’ the game was aiming for). This, Michal Drobot enthusiastically informs us, “is the real deal.”

But it wasn’t easy.

An entire Infinity Ward tech studio, based in Poland, was built “to bring this vision to fruition. Requests coming from design, coming from art… we really needed some improvements, so we had to create a new engine, purpose-built for Modern Warfare.” The process took over five years. Michal Drobot now proudly leads this specialized studio, with one overriding goal in mind: To “put all our focus into what this engine is capable of right now, and hopefully it shows.”

Flexible Reflections

For any game launching today, you’d expect HDR and 4K visuals, which the game engine handles with aplomb. But there are additional engine capabilities, constructed to help support the game design itself, as Drobot describes: “we’re making sure your experience is as realistic as possible. To make everything look cool isn’t easy.” First and foremost, this comes down to compression techniques. Your console has a couple of gigs of RAM, while the Infinity Ward art team’s field trips “taking photogrammetry images in the Arizona desert” total more than two TB of data. “It’s not easy to transfer that into the game. We spend a lot of time figuring out how to compress that data, to push that through, and to make sure that it actually represents what the data is in the game.”
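
To put those numbers in perspective, here is a back-of-the-envelope sketch; it is not Infinity Ward’s actual pipeline, just standard GPU block-compression math showing why fixed-ratio compression alone can’t close the gap between a 2 TB capture set and a console’s memory budget:

```cpp
#include <cstdio>

// Back-of-the-envelope sketch, not Infinity Ward's pipeline: GPU block
// compression such as BC7 stores each 4x4 texel block in 16 bytes versus
// 64 bytes for raw RGBA8 -- a fixed 4:1 ratio (BC1 reaches 8:1).
int main() {
    const double capture_tb = 2.0;   // photogrammetry capture, per the article
    const double bc7_ratio  = 4.0;   // raw RGBA8 -> BC7
    const double bc1_ratio  = 8.0;   // raw RGB   -> BC1

    std::printf("2 TB at 4:1 (BC7): %.0f GB\n", capture_tb * 1024.0 / bc7_ratio);
    std::printf("2 TB at 8:1 (BC1): %.0f GB\n", capture_tb * 1024.0 / bc1_ratio);
    // Still hundreds of gigabytes: fixed-ratio formats alone can't reach a
    // console's few-GB budget, hence the re-authoring work Drobot describes.
    return 0;
}
```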

(What is photogrammetry? All is explained in this previous blog post.)

With the photogrammetry data processed, Drobot’s team began to turn environmental elements and scenery into realistic assets: “We had all of the details coming in from photogrammetry, from real objects. It’s not as simple as taking a picture of a shrub, or mud in a swamp, and transferring them into the game. You have to understand how the mud reacts to different types of effects and different types of lighting.” This is when Drobot and his team brought out the lasers: “Yeah, we took laser scans. Some elements are semi-transparent, and transparent items are relatively easy to make look convincing, but making semi-transparent items that serve a gameplay effect or purpose look good is pretty damn hard.”

Turning photographs into sets continued for months, and Drobot’s team discovered that for an object that requires reflective mapping, raw data only goes so far: “The art department has hundreds of cameras that capture everything. We throw our lasers at the object and figure out from those lasers the actual reflection of the surfaces; how they reflect. It’s not only about capturing the pixels that you’re seeing, but how they reflect and react to light. We put a lot of effort into this.”

Naturally, this attention to detail, and then reflective detail, extended to all the weaponry.

“During this project I’ve had to look at and touch a lot of different guns. There is a huge community of people who like to modify their weapons with really intricate details and materials that they use. There are powder coatings, ceramic coatings, and to make it always operate in the field, you need to clean it, oil it up…” All the different textures on every weapon had to look perfect: “We had to simulate a rainbow-like effect on the oil of the gun after you’ve cleaned it up. Translucent effects on the gun, ceramic effects, powder coating effects, elements of the magazines. We researched all of that.”

“We made everything in the game look as good as [it] possibly can. The reflections, shadows, the ambient occlusion; all the bells and whistles.”

Let’s Get Technical for a Second: Caching in on Characters

As you’d expect, the same care and attention was given to all of the characters in the game: “We have photo-realistic and life-like characters. With Price being back – and he represents Modern Warfare to us – we really wanted to get the best out of our performance capture, with the best actor possible for him, and for all the soldiers and the military consultants who acted as various characters. When you get raw footage from a motion capture rig, you get tons of data, and we had to compress all of it so we could [play] it back on the console, and still make the character believable, and be active while being controlled by you as the gamer.”

For this to occur, Infinity Ward utilized the same plan as CG studios do when making pre-rendered cut-scenes; the only difference being these are real-time graphics: “We performance capture tons of data, including detail like the wrinkles, the changes in tautness of the skin around the mouth when characters are talking or are excited. This is performance capture 2.0, where you’re getting this additional level of detail.” Drobot then gets technical for a second:

“We have tech called LMB caching, we expanded the texture data to get the feeling of the skin itself, and then we needed super-high fidelity meshes to make sure all the wrinkles and texture changes really are shown. We double-up the sub-division surfaces for the content game engine, and there’s a piece of tech from the movies, where you take a mesh and it adapts to the data you’re passing over it. We use this tech on the fly to make sure it looks as detailed as possible and bring down the detail slider when the character is further away to ensure the performance stays at 60 FPS. This is a CGI piece of tech that has found its way to the game.”
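
To make “bring down the detail slider when the character is further away” concrete, here is a minimal sketch of distance-driven subdivision LOD. The function name, the 64-pixel threshold, and the one-level-per-doubling rule are all illustrative assumptions, not engine code:

```cpp
#include <algorithm>
#include <cmath>

// Minimal sketch of the general idea Drobot describes: choose how many
// subdivision levels a mesh gets from its projected screen size, so
// close-up faces keep their wrinkle detail while distant ones cost
// almost nothing, keeping the frame inside a 60 FPS budget.
int subdivisionLevel(float boundingRadius, float distance,
                     float fovY, int screenHeight, int maxLevel) {
    // Approximate on-screen height of the object in pixels.
    float projected = (boundingRadius / (distance * std::tan(fovY * 0.5f)))
                      * screenHeight;
    // One extra subdivision level per doubling of screen coverage,
    // starting once the object spans roughly 64 pixels.
    int level = static_cast<int>(std::log2(std::max(projected / 64.0f, 1.0f)));
    return std::clamp(level, 0, maxLevel);
}
```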

Spectral Rendering: Vision Beyond the Human Spectrum

Lighting plays a pivotal role in Call of Duty: Modern Warfare, and the engine needed to be robust enough to handle the action, no matter the environmental effects in play. That includes visuals outside the scope of human vision: “We have tech that I’m really proud of called spectral rendering. If you take an image from the Townhouse scene of the Campaign, and view it in normal lighting conditions, you can see much of the image is in complete darkness. We as human beings only see RGB wavelengths, we don’t see infra-red, we don’t see ultra-violet. But for the game engine, we wanted to see all these other spectrums of light.”

Does this mean the game can render in infra-red, even when you’re not going to see it? Yes: “For this game, we wanted to be as real as possible, for example as Tier One Operators, you have the Night Vision Goggles (NVGs). You turn them on, and you see night vision. But what is night vision? It is just seeing infra-red. So, we made sure that all the lighting in the game has the infra-red spectrum.” Or to put it another way, the game is constantly running in infra-red, just in case you put your NVGs on. Drobot is quick to point out additional effects too: “We pushed it further than that. We have Forward-Looking Infra-Red (FLIR), which has something called deep infra-red, and other ways that lighting reacts. We have lights that have thermal emission; like those lights that don’t shine that bright, but they’re really hot to the touch. We also simulate that.”

This isn’t just a post-processing filter though; there are a couple of main differences here. Drobot points to an area on the Operators’ gear: “If you’ve watched the trailer, you can see the bright shining squares. All the Spec Ops teams in the world have these infra-red patches that they apply to their accessories and their suits. They use them to be able to distinguish between friend and foe when they’re in NVG mode.” During combat, you can enter a room, switch to night vision, and immediately ascertain who is who. “Seeing the patches under normal vision, you might be able to make out a very vague outline of the patch, but in NVG, it’s as clear as day.”
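
As a thought experiment, a light that “has the infra-red spectrum” can be modeled as carrying one extra intensity channel alongside RGB, with NVG mode reading only that channel. Everything below – the RGBI layout, the thermal-emission term, and the IR-reflectance value for the patches – is an assumption made for illustration, not the shipped engine’s data layout:

```cpp
// Hedged sketch of the "spectral rendering" idea: every light carries an
// infra-red intensity alongside RGB, and the active vision mode decides
// which channels reach the screen.
struct Vec3 { float r, g, b; };

struct Light {
    Vec3  rgb;              // visible-spectrum intensity
    float ir;               // infra-red intensity (what NVGs amplify)
    float thermalEmission;  // "hot to the touch" sources emit IR too
};

struct Surface {
    Vec3  albedo;
    float irReflectance;    // IR patches on operator gear would sit near 1.0
};

enum class VisionMode { Normal, NightVision };

Vec3 shade(const Surface& s, const Light& l, float attenuation,
           VisionMode mode) {
    if (mode == VisionMode::NightVision) {
        // NVG output is monochrome: amplified IR, ignoring visible color.
        float ir = (l.ir + l.thermalEmission) * s.irReflectance * attenuation;
        return { ir, ir, ir };   // tinted green later in post-processing
    }
    return { s.albedo.r * l.rgb.r * attenuation,
             s.albedo.g * l.rgb.g * attenuation,
             s.albedo.b * l.rgb.b * attenuation };
}
```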

The game engine takes this light beyond the scope of human vision and applies it to other elements, too. Study the NVG scenes of the trailer and you can see shadows being cast by the infra-red light on your NVG headset. “Infra-red lights like these can be attached to weapons and headsets, and [show] you more information in the infra-red view. We wanted to make the game as realistic as possible.” This means you’re able to play through missions however you like, including the use of lights.

“In Townhouse, you can shoot out the lights, and switch on NVG. But if you look at the same rooms with natural lights, the room still looks real. That’s because we have reflection filters on the lights under any conditions that you as a gamer would create, [like] thermal and NVG. They can be used for real. It’s not a gimmick. It’s part of real gameplay.” This pursuit of realism led the tech teams down a strange path: actually making the output video of legacy imaging devices look worse than that of their modern equivalents.

Authentic Distortion: Downgrading Legacy Visuals

“NVG devices date back to the Second World War,” Drobot explains. “Snipers were actually using NVG rifles. It’s interesting that the tech evolved throughout all those years, so we wanted to present you with multiple generations of this hardware.” What do the Tier One Operators get? “The super-amazing, most modern version of NVG.” Conversely, there are missions where you might be scrambling for legacy equipment, where the imaging is far less perfect: “You might find something during a mission that’s from the 1960s or 70s, and our tech allows us to simulate that as accurately as we can possibly do it.”

This extends to rifle scopes, too: “You can see the image distortion you get from the lenses is what you would get in real life. With post-processing applied, it’s what you would actually see in the devices, so artifacts and distortions are also shown.”
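
Lens distortion of this kind is conventionally modeled in a post-process as radial (barrel or pincushion) distortion. This sketch uses the classic polynomial form; the per-scope coefficients k1 and k2 are illustrative tuning values, not values from the game:

```cpp
struct UV { float u, v; };

// Classic radial lens distortion: each screen UV is pushed along its
// radius by a polynomial in r^2. Positive coefficients bulge the image
// (barrel), negative ones pinch it (pincushion).
UV distort(UV uv, float k1, float k2) {
    float x = uv.u * 2.0f - 1.0f;       // re-center coordinates on [-1, 1]
    float y = uv.v * 2.0f - 1.0f;
    float r2 = x * x + y * y;
    float scale = 1.0f + k1 * r2 + k2 * r2 * r2;
    return { (x * scale) * 0.5f + 0.5f, (y * scale) * 0.5f + 0.5f };
}
```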

Thermal imaging is also key to revealing aspects of the game’s story: Drobot brings up a work-in-progress map, showing an image from a drone flying over an enemy base in the desert, under twilight conditions. “You can see something is happening down there, but it’s difficult to distinguish what is actually going on. There’s some fire, but that’s it. But if you switch to thermal, everything starts to make sense.” Not only that, but heat signatures allow you to view game elements that would otherwise be overlooked, as Drobot explains: “You can clearly see some kind of storytelling that went on here; which vehicles have just turned off their engines, the fresh tracks in the sand, as well as your friends and enemies.”

Depending on the tech you’re using, the image you’re seeing matches the period the technology was developed in: “Thermal feels worse in terms of the quality of the imagery, just because the resolutions are lower, because the rate of data passing through the camera is different. If you’ve watched Predator drone footage from around 1996 to 2012, the footage was a bit blurry and had artifacts, as [the military] needs to simplify the data as they pass it super-fast over their satellite network, so they compress it with JPEG compression.”

In these cases, Drobot’s team dedicated much of their time to actually downgrading visuals to make them as accurate as possible: “We spent extra time creating a low-quality compressed look with artifacts and compression on the blue and red, with artifact squares forming. We call this the ‘Shittify’ filter. It’s interesting because it’s actually harder to make something look bad on purpose, compared to the other way round. We spent far less time making depth-of-field filters look good, compared to figuring out how to reverse-engineer data compression to make it look as shitty as the real thing.”
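
A plausible sketch of such a deliberate degradation pass, assuming the “blue and red” artifacts refer to chroma channels: average each chroma channel over small blocks and quantize coarsely, which mimics the blocky chroma of heavily JPEG-compressed satellite feeds. The block size and quantization step are invented tuning values:

```cpp
#include <cstdint>
#include <vector>

// Average a chroma plane over NxN blocks, then snap each block to a
// coarse level -- a cheap stand-in for heavy JPEG quantization that
// produces the visible "artifact squares" on the blue/red channels.
void degradeChroma(std::vector<uint8_t>& chroma, int width, int height,
                   int block = 8, int step = 48) {
    for (int by = 0; by < height; by += block) {
        for (int bx = 0; bx < width; bx += block) {
            int sum = 0, count = 0;
            for (int y = by; y < by + block && y < height; ++y)
                for (int x = bx; x < bx + block && x < width; ++x) {
                    sum += chroma[y * width + x];
                    ++count;
                }
            // Quantize the block average, creating flat square patches.
            uint8_t q = static_cast<uint8_t>(((sum / count) / step) * step);
            for (int y = by; y < by + block && y < height; ++y)
                for (int x = bx; x < bx + block && x < width; ++x)
                    chroma[y * width + x] = q;
        }
    }
}
```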

The result is that Infinity Ward’s designers can tune the engine’s post-processing parameters to match how an image would have looked on equipment from any given decade.

Pushing Boundaries. And Polygons.

One of the other improvements the new game engine brings to Modern Warfare is volumetric lighting. But where other games might use “god rays” as a top-end gimmick feature, this game is different. Drobot continues: “Every single cubic inch of air around you… it is filled with something. Dust. Dirt. Aerosols. Gas. And it can change your vision, and how lighting reacts, but you can also use it in the game’s storytelling. That something happened at a location before you arrived. We really pushed hard with volumetric lighting, so every single light in the game is volumetric; a real light with volumetrics built into it.” There are no ‘pretend’ lights here.

“We have specifications where our artists can change how dense the air is in any part of the environment. Maybe a cellar is dense with dust particles. And you can change this by your actions, and by events; so maybe there’s an airstrike and the air around you becomes much thicker, and we experimented with that, to make sure volumetric lighting is part of the gameplay itself; it affects how we play, and the visuals.”
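
A minimal raymarching sketch of per-light volumetrics along those lines: step along the view ray, sample a locally authored air density (the “dense cellar” knob Drobot mentions), and accumulate in-scattered light attenuated by the media already traversed. The step count, scattering coefficient, and the stub density and light functions are all illustrative stand-ins for the engine’s authored volumes:

```cpp
#include <cmath>

// Stub fields for illustration; in-engine these would come from authored
// density volumes (which gameplay events can thicken) and shadowed lights.
float airDensityAt(float t)     { return 0.2f + 0.1f * std::sin(t); }
float lightIntensityAt(float t) { return 1.0f / (1.0f + t * t); }

// March the view ray, summing light scattered toward the camera while the
// media absorbs what lies behind each sample.
float inscatter(float rayLength, int steps = 32, float scattering = 0.05f) {
    float dt = rayLength / steps;
    float transmittance = 1.0f;   // how much of the ray survives so far
    float accumulated = 0.0f;
    for (int i = 0; i < steps; ++i) {
        float t = (i + 0.5f) * dt;
        float density = airDensityAt(t);
        accumulated += lightIntensityAt(t) * density * scattering
                       * transmittance * dt;
        transmittance *= std::exp(-density * scattering * dt);
    }
    return accumulated;
}
```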

The last facet of the game engine that Drobot is ready to reveal is the set of improvements to the geometry pipeline. Basically, the new engine has a new way to render objects: “It allows us [to] push the polygon counts much farther in our games than we used to.” If you watch the crowd scene from the trailer, each of the individuals rioting in front of an embassy is a mass of triangles.

“Previously, we couldn’t push as many triangles or as much geometry. The new system allows us to push roughly five times more geometry per frame than we used to on consoles. Previous Call of Duty games were pushing around three million polygons per frame, [up to around] five million. Recently, we raised the artificial limit we had imposed on our game engine; to 16-17 million polygons per frame. We’ve had it up to silly levels; 24 million triangles per frame, and this is all to ensure we get all the data we have grabbed and match it in the game.”

All this entails a lot of hard work by the talented teams at Infinity Ward, and all these tech features were developed with one overriding goal in mind: “To bring to life game sequences, the gameplay, and the experiences that you can see, and remember.”

Pre-orders at participating retailers are available now, or at CallofDuty.com.

For more information and the latest intel on Call of Duty®: Modern Warfare®, check out: www.callofduty.com, www.youtube.com/callofduty and follow @InfinityWard and @CallofDuty on Twitter and Instagram and Facebook.

For more information on Activision games, follow @Activision on Twitter, Facebook, and Instagram.

The Software License and Service Agreement will be updated. To review the changes, please follow this link [https://www.activision.com/ja/legal/ap-eula].