Top GDC Videos of the Decade From the AI Summit

Tommy Thompson

If you’re looking to get started working in AI for games, it can all be a little intimidating. Artificial intelligence (and machine learning) is a very big field! There’s a lot to unpack here. One great way to find out more is to learn from the experts. The Game Developers Conference runs in San Francisco each year, and there are a variety of talks on all things AI and machine learning for games. In fact, you might even find some talks presented by the folks here at modl.ai on the work they’re up to as well!

All of the talks are recorded and later made available courtesy of the GDC ‘Vault’. While access to the complete contents of the Vault requires a paid subscription, there’s still plenty of interesting content released for free for anyone to watch, both on the Vault site and via the GDC YouTube channel. So let’s walk through some useful (free) talks to get you started: they’ll familiarise you with the types of problems you can expect to find in AI for games, the needs of designers you’ll have to respect, and the open challenges for which we’ve yet to find sensible and scalable solutions.

Disclaimer: The author has been one of the advisors to the AI Summit for GDC since 2020, meaning they have an influence on the talks that appear at the event and support speakers in preparing their final presentations.

The Post-Mortem on Kojima Productions’ ‘Death Stranding’

Death Stranding is effectively a big-budget walking simulator, where players deliver packages to waystations across a desolate yet hauntingly beautiful post-apocalyptic America (that could easily be mistaken for Iceland, given the artistic inspiration). Given that players are walking across these rough and rugged landscapes, it’s expected that AI-controlled non-player characters (NPCs) will do the same. The Mules and Demens you come across are intent on hunting you down and stealing your precious cargo, but if they couldn’t figure out how to reach you, they wouldn’t be doing a particularly good job, now would they?

The Post-Mortem on Kojima Productions’ ‘Death Stranding’ [GDC 2021]

Typically, for characters to move around the world, the game needs a navigation mesh: a data structure that describes the walkable area within which an NPC can move. If you need a character to reach a particular location, it can run a pathfinding algorithm over the navigation mesh to figure out how to get there. But navigation meshes rely on nice, flat, undisturbed areas of ground. So how does the rocky, realistic terrain of a game like Death Stranding get in the way of the navigation mesh working as intended?
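To make that concrete, here’s a minimal sketch of the idea, entirely invented for illustration rather than taken from the talk: a navigation mesh boiled down to a graph of walkable polygons, with A* search finding a route between them.

```python
import heapq

# A navigation mesh reduced to its essentials: each walkable polygon becomes
# a node, and edges connect polygons that share a border. The layout and
# coordinates (polygon centres) are hypothetical.
NAVMESH = {
    "A": {"pos": (0, 0), "neighbours": ["B"]},
    "B": {"pos": (4, 0), "neighbours": ["A", "C"]},
    "C": {"pos": (4, 3), "neighbours": ["B", "D"]},
    "D": {"pos": (8, 3), "neighbours": ["C"]},
}

def heuristic(a, b):
    # Straight-line distance between polygon centres.
    (ax, ay), (bx, by) = NAVMESH[a]["pos"], NAVMESH[b]["pos"]
    return ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5

def astar(start, goal):
    # Classic A*: expand polygons in order of (cost so far + heuristic).
    frontier = [(0.0, start)]
    came_from = {start: None}
    cost_so_far = {start: 0.0}
    while frontier:
        _, current = heapq.heappop(frontier)
        if current == goal:
            break
        for nxt in NAVMESH[current]["neighbours"]:
            new_cost = cost_so_far[current] + heuristic(current, nxt)
            if nxt not in cost_so_far or new_cost < cost_so_far[nxt]:
                cost_so_far[nxt] = new_cost
                heapq.heappush(frontier, (new_cost + heuristic(nxt, goal), nxt))
                came_from[nxt] = current
    # Walk backwards from the goal to reconstruct the route.
    path, node = [], goal
    while node is not None:
        path.append(node)
        node = came_from[node]
    return list(reversed(path))

print(astar("A", "D"))  # ['A', 'B', 'C', 'D']
```

Real engines add plenty on top of this (string-pulling for smooth paths, dynamic rebuilds, off-mesh links), which is exactly where Death Stranding’s terrain makes life difficult.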

Eric Johnson of Kojima Productions gave a fantastic post-mortem on Death Stranding, highlighting the myriad challenges faced during development: navigation meshes not building as intended, generating them dynamically to reflect real-time changes to the world, maintaining smooth performance, and the problems that emerge when characters need to path hundreds of metres at once across harsh terrain.

Bringing BioShock Infinite’s Elizabeth to Life: An AI Development Post-Mortem

Getting characters to make the right decisions is critical, but it’s only part of the package. We need the animations, the dialogue, and other audio to trigger when we need them, and we need all the AI decisions to make sense in a game whose design is always shifting. All of this is made more challenging when the character isn’t just a companion AI that follows the player for hours of gameplay but is also critical to the story.

Bringing BioShock Infinite’s Elizabeth to Life: An AI Development Post-Mortem [GDC 2014]

BioShock Infinite by Irrational Games revolves around the character of Elizabeth, a young woman whom the player character rescues from her prison in the steampunk city of Columbia. As you progress through the game, Elizabeth not only enables a variety of useful gameplay features but also presents a new perspective on the story and the themes and concepts expressed in the BioShock universe.

This post-mortem by John Abercrombie digs deep into the challenges faced by the ‘Liz Squad’: a team of programmers and artists whose job it was to bring this critical facet of the game to life. It explores the design rules they enforced to ensure Elizabeth acts as envisaged and how theatre and sports helped make it a reality.

AI Behaviour Editing & Debugging in ‘Tom Clancy’s The Division’

While it’s important that programmers craft all of the systems required to bring AI characters to life in a given game, it’s equally important that they work to facilitate designers. This is a difficult balancing act: you want to provide designers with all the tools they need to create enemies that are fun, interesting, and complex, but the tools themselves should be practical, easy to learn, and efficient.

AI Behaviour Editing & Debugging in ‘Tom Clancy’s The Division’ [GDC 2016]

Disclaimer: The embedded video may not be viewable because it is hosted on the GDC Vault. Click here for the direct link.

At GDC 2016, Jonas Gillberg highlighted that this can still be a painful process to get right, even at the biggest of studios. This talk details the experiences of the development team at Massive Entertainment on Tom Clancy’s The Division in trying to create tools that enabled designers to build new enemy archetypes, rapidly iterate on ideas, build on existing features, and debug quickly and effectively when it all goes wrong!
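If you’ve never seen the kind of structure such tools edit, here’s a tiny illustrative sketch, not Massive Entertainment’s actual system: a behaviour tree whose nodes log their results, which is the seed of the per-node debugging view the talk describes. The node names and enemy logic are invented.

```python
# A minimal behaviour tree with a debug trace. Everything here is
# hypothetical; it only illustrates the kind of structure designers edit.
SUCCESS, FAILURE = "success", "failure"

class Node:
    def __init__(self, name):
        self.name = name
    def tick(self, blackboard, trace):
        result = self.run(blackboard, trace)
        trace.append((self.name, result))  # the debugging hook
        return result

class Selector(Node):
    # Tries children in order; succeeds on the first child that succeeds.
    def __init__(self, name, children):
        super().__init__(name)
        self.children = children
    def run(self, blackboard, trace):
        for child in self.children:
            if child.tick(blackboard, trace) == SUCCESS:
                return SUCCESS
        return FAILURE

class Sequence(Selector):
    # Runs children in order; fails on the first child that fails.
    def run(self, blackboard, trace):
        for child in self.children:
            if child.tick(blackboard, trace) == FAILURE:
                return FAILURE
        return SUCCESS

class Condition(Node):
    def __init__(self, name, predicate):
        super().__init__(name)
        self.predicate = predicate
    def run(self, blackboard, trace):
        return SUCCESS if self.predicate(blackboard) else FAILURE

class Action(Node):
    def run(self, blackboard, trace):
        return SUCCESS  # a real action would shoot, take cover, patrol...

# A hypothetical enemy archetype: attack if the player is visible, else patrol.
enemy = Selector("root", [
    Sequence("engage", [
        Condition("can_see_player", lambda bb: bb["player_visible"]),
        Action("attack"),
    ]),
    Action("patrol"),
])

trace = []
enemy.tick({"player_visible": False}, trace)
for name, result in trace:
    print(f"{name}: {result}")  # per-node results drive the debug display
```

Because every node reports its result through the same hook, a tool can paint the tree green and red as the game runs, which is the kind of at-a-glance debugging the talk argues for.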

Bringing Hell to Life: Full Body Animation in ‘DOOM’

It’s not just the decision-making of non-player characters we need to think about; it’s also how they look as they move around. Animation is a critical part of the behaviour of an AI character. Your NPCs need to be able to communicate their thought process: are they stunned? Injured? Angry? Going in for the attack? All of this requires your animations to be built to support the game, and the systems driving them to know when to blend between them, interrupt them, or even adjust them so they can be used in a number of unique gameplay scenarios.
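For a flavour of the underlying machinery, here’s a minimal sketch, again invented for illustration rather than taken from the talk, of an animation state machine that cross-fades between clips over a short blend window:

```python
# A minimal animation state machine that cross-fades between clips.
# Clip names and blend times are hypothetical, for illustration only.
class AnimationStateMachine:
    def __init__(self, initial_clip):
        self.current = initial_clip   # clip we are blending away from
        self.target = initial_clip    # clip we are blending towards
        self.blend_time = 0.0         # how long the cross-fade lasts
        self.blend_elapsed = 0.0

    def play(self, clip, blend_time=0.2):
        # Interrupt whatever is playing and start a cross-fade to `clip`.
        if clip == self.target:
            return
        self.current = self.target
        self.target = clip
        self.blend_time = blend_time
        self.blend_elapsed = 0.0

    def update(self, dt):
        # Returns the (clip, weight) pairs the renderer should mix this frame.
        if self.blend_time <= 0.0 or self.blend_elapsed >= self.blend_time:
            return [(self.target, 1.0)]
        self.blend_elapsed += dt
        t = min(self.blend_elapsed / self.blend_time, 1.0)
        return [(self.current, 1.0 - t), (self.target, t)]

# An NPC reacting to gameplay: its attack is interrupted by a stagger.
anim = AnimationStateMachine("idle")
anim.play("attack")
anim.play("stagger", blend_time=0.1)  # fast blend so the hit reads instantly
for _ in range(5):
    print(anim.update(dt=1.0 / 30.0))  # blend weights over five 30fps frames
```

A production system layers many of these (full body, upper body, additive effects) and picks blend times per transition, which is the kind of decision-making the DOOM talk describes handing over to an AI control system.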

Bringing Hell to Life: Full Body Animation in ‘DOOM’ [GDC 2017]

This talk from Jake Campbell highlights the level of detail required to ensure the animations of enemies in the 2016 reboot of DOOM worked as envisaged, to the point that the team built an AI control system for the animation layers, letting animators focus on building fewer, higher-quality individual animations while the gameplay systems figured out how to apply them in context.

‘Marvel’s Spider-Man’ AI Post-Mortem

Sometimes even the most seasoned of developers don’t foresee all of the problems coming their way. What better example of this than the AI post-mortem on Marvel’s Spider-Man by Adam Noonchester. Developers Insomniac are no slouch when it comes to AI for action games, having previously worked on the likes of Ratchet & Clank, the Resistance series, and Sunset Overdrive, but Spider-Man presents a myriad of AI challenges: creating characters that work well in stealth sequences, in open fist fights, oh, and when you’re swinging through the streets of Manhattan too!

‘Marvel’s Spider-Man’ AI Post-Mortem [GDC 2019]

It’s always great to see how the biggest studios solve many of the problems that come their way. And this is a fine example of how it all comes together in the end.

Published by Tommy Thompson

With over 15 years of experience in artificial intelligence research in games, Tommy sought to provide a more accessible format for his area of expertise to those without the same scholarly background. Releasing the first AI and Games YouTube episode in 2014, Tommy has continued to build upon this small platform to form a company around it. While the YouTube channel has amassed over 5 million views and 100,000 subscribers, the fundamentals of what AI and Games has sought to do have never changed: educate developers and students on how best to utilise AI in their games.
