How Artificial Intelligence Has Evolved Star Wars Battlefront II
If you’re a long-time player (or just generally observant), you’ve probably noticed how the implementation of non-human players – driven by AI (artificial intelligence) – has played a major role in the evolution of Star Wars™ Battlefront™ II throughout the year.
While we tend to talk a lot about the more, let’s say, tangible aspects of our game – new planets, heroes, reinforcements, that kind of thing – we wanted to take a moment to highlight the work of our coders and designers in this specific domain.
Hopefully, it’ll give you some noteworthy insight into how the AI operates under the hood, and why it ultimately allows our game designers to create awesome gameplay experiences and quintessential Star Wars battle moments.
AI controlled Anakin Skywalker and Obi-Wan Kenobi teaming up, creating this quintessential Star Wars moment on the streets of Theed.
The New AI in Star Wars Battlefront II – A Brief Summary
A big milestone was reached in early 2019 as Capital Supremacy launched, introducing a brand-new AI technology. For the first time in Star Wars Battlefront II’s online multiplayer, players battled alongside and against bots (another word for AI controlled units).
The AI provided scale to the combat by increasing the sheer number of troopers on the battlefront. But it also enabled players across the entire skill spectrum to have a good time, as the bots are – compared to seasoned veteran players – easier targets.
Since then, as part of the Cooperation Update, the AI has been put front and center in no less than two new game modes: online Co-Op and offline Instant Action. The development team doubled down on utilizing the AI to make players feel heroic on the battlefront, allowing them to create their own, larger-than-life Star Wars moments – and to just have fun without the pressure of human competition.
Blaster fire zipping through the air in a high intensity Clone Wars battle on Geonosis in Co-Op.
It All Started as a Testing Tool
The project was started by Jonas Gillberg, Sr AI Engineer at EA, in August 2017 and involved (as mentioned above) the creation of a brand-new technology. For clarity, the initiative has nothing in common with the self-learning agents as demonstrated by SEED or the AI present in Star Wars Battlefront II’s Arcade mode.
This tech was basically built to make bots play games, so that humans don’t have to. But why?
The initial scope of the project was to build an AI to scale automated testing for large multiplayer sessions. In order to test things properly, an AI behaving as closely as possible to human players was needed.
The technology is set up to provide quality assurance analysts – who analyze and test games on a regular basis – and content creators (e.g. game and level designers) with high-level control of AI objectives through visual scripting. This way, when utilizing the AI for testing purposes, direct interaction with the code isn’t necessary.
Examples of high-level objectives that can be provided through visual scripting are move, defend, attack, interact, action, seek and destroy, and follow. Ultimately, the AI brain produces the right “button keys” – shoot, strafe, yaw, pitch, jump, and so on – to press at every single frame. Think of it as a robot playing the game with a controller in its hands, pressing physical buttons – but on a conceptual level. When plugged into the game engine, this makes the tool extremely versatile and applicable in almost any game.
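If it helps to picture it, here’s a minimal sketch – entirely our own illustration, not the actual tool’s code – of how a high-level “move” objective could be turned into per-frame controller inputs. The struct names and the simple steering math are assumptions made for the example.

```cpp
#include <cmath>
#include <cstdio>

// Hypothetical per-frame controller output: the "buttons" the bot presses.
struct ControllerInput {
    float strafe = 0.0f;   // -1 = left, +1 = right
    float forward = 0.0f;  // -1 = back, +1 = forward
    float yaw = 0.0f;      // turn applied this frame
    bool  shoot = false;
    bool  jump = false;
};

struct Vec2 { float x, y; };

// A simplified "move to position" objective, as a designer might assign via
// visual scripting. Each frame, the AI brain translates it into inputs.
ControllerInput tickMoveObjective(const Vec2& botPos, float botYaw, const Vec2& goal)
{
    ControllerInput input;
    const float dx = goal.x - botPos.x;
    const float dy = goal.y - botPos.y;

    // Turn towards the goal...
    const float desiredYaw = std::atan2(dy, dx);
    input.yaw = desiredYaw - botYaw;

    // ...and walk forward until we're close enough.
    const float distance = std::sqrt(dx * dx + dy * dy);
    input.forward = distance > 1.0f ? 1.0f : 0.0f;
    return input;
}

int main()
{
    const ControllerInput input = tickMoveObjective({0.0f, 0.0f}, 0.0f, {10.0f, 5.0f});
    std::printf("forward=%.1f yaw=%.2f\n", input.forward, input.yaw);
    return 0;
}
```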
The AI clone troopers holding their ground on Felucia, evading, crouching behind cover, and patrolling the area.
As a testing tool, it has a massive scope. It’s used to test game clients and servers, fill up playtest servers, run pre-playtest checks, verify co-op stability, test features (like driving vehicles), check the functionality of an upcoming patch, and more. It’s also actively used by game designers to apply “brute force” and polish levels. For example, when the AI is let loose on a multiplayer map, it’ll find spots where real human players could get stuck.
While created to be a testing tool mimicking human behavior, a lot of its potential was still untapped. Then, by a quirk of fate, Manuel Llanes, Design Director on Star Wars Battlefront II, grabbed some coffee for himself and Luca Fazari, AI Engineer at EA, and asked what Luca was working on. Luca had just been onboarded onto the new testing tool and demonstrated the bots in action. When Manuel saw that it was server-side, performant, and really promising, things clicked. Prototyping started and was presented to the wider development team, stirring up a lot of excitement.
Soon after, Luca took on the assignment to polish up the combat system and navigation, and to implement behaviors to make it a shippable feature – with a specific focus on Star Wars Battlefront II.
Under the Hood of Star Wars Battlefront II’s New AI Tech
As soon as you start a game of Capital Supremacy, Co-Op, or Instant Action, the AI bots immediately evaluate what to do next. This continues throughout the game, with the AI prioritizing and selecting actions based on its current state.
AI controlled Count Dooku unleashing his unique lightning ability, before the clones get the better of him.
This is where logic creates magic (if we dare brag a little).
The AI’s decision-making can be broken down into sub-categories working in tandem:
- Targeting and vision. A line-of-sight check is made by the AI controlled unit in a front-facing cone view. The closest target in this view is engaged; targets that are in cover and not directly visible are not engaged. Each bot tracks its current target’s positions and timestamps. Looking at the last positions of the found target, the AI estimates where the target will be a couple of seconds in the future: it calculates the velocity vector and multiplies it by that look-ahead time to predict the future position. That’s where the bot will aim. The accuracy of the AI is, of course, configurable, and depends on how much aim noise is added to its weapon.
- Weapons and engaging distances. Combat decisions are based on the bot’s equipped weapon and how far it can reach. There’s an optimal “stop and shoot” distance for each weapon, set slightly before the point where the projectiles’ damage starts to drop off. This is where the AI stops moving towards the target, since 1) getting closer wouldn’t deal more damage, and 2) it would also make the AI an easier target. Each weapon is also given a float value between 0 and 1, deciding where on the target’s body the bot will aim – with 0 being at the target’s feet, and 1 at the target’s head. Furthermore, the limit for how far away targets will be tracked is the end of the damage drop-off multiplied by a tweakable number, deciding when a target is close enough to go after. (A rough sketch of this targeting and weapon logic follows this list.)
- Combat maneuvers and awareness. The bot seeing through a front-facing cone view opens up the possibility of sneaking up behind it! However, as soon as the bot gets shot at, it’ll turn towards the player who inflicted the damage and set him or her as its priority target. If the bot and the player (or other target) are facing each other, the bot will enter a state where it tries to evade and dodge its target’s projectiles while returning fire. It will strafe left and right (varying randomly each time it changes direction) and make the occasional movement back and forth – all within certain configurable time constraints. At each change of direction, the bot is also given an opportunity to jump or make a dodge roll. This controlled unpredictability, so to speak, helps convey a sense of human behavior.
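For the curious, here’s a minimal sketch of the targeting and weapon logic described above, assuming a simple linear extrapolation for the aim prediction. The type names, tuning values, and look-ahead time are illustrative assumptions, not the shipped code.

```cpp
#include <cmath>
#include <cstdio>
#include <random>

struct Vec3 { float x, y, z; };

// Two recent snapshots of the target, as tracked by the bot.
struct TargetSample { Vec3 position; float timestamp; };

// Linear extrapolation: velocity from the last known positions, multiplied by
// a look-ahead of a couple of seconds. That predicted position is the aim point.
Vec3 predictTargetPosition(const TargetSample& older, const TargetSample& newer,
                           float lookAheadSeconds)
{
    const float dt = newer.timestamp - older.timestamp;
    const Vec3 velocity = {(newer.position.x - older.position.x) / dt,
                           (newer.position.y - older.position.y) / dt,
                           (newer.position.z - older.position.z) / dt};
    return {newer.position.x + velocity.x * lookAheadSeconds,
            newer.position.y + velocity.y * lookAheadSeconds,
            newer.position.z + velocity.z * lookAheadSeconds};
}

// Per-weapon tuning, mirroring the article: a "stop and shoot" distance just
// inside the damage drop-off, an aim-height float (0 = feet, 1 = head), and a
// tweakable multiplier on the drop-off end that caps how far targets are chased.
struct WeaponTuning {
    float damageDropOffStart = 30.0f;
    float damageDropOffEnd   = 60.0f;
    float aimHeight          = 0.7f;   // roughly chest height
    float chaseRangeFactor   = 1.5f;   // tweakable multiplier
    float stopAndShootDistance() const { return damageDropOffStart * 0.9f; }
    float maxChaseDistance()     const { return damageDropOffEnd * chaseRangeFactor; }
};

// Configurable accuracy: add noise to the aim point, scaled per weapon/difficulty.
Vec3 addAimNoise(Vec3 aimPoint, float noiseAmount, std::mt19937& rng)
{
    std::uniform_real_distribution<float> jitter(-noiseAmount, noiseAmount);
    aimPoint.x += jitter(rng);
    aimPoint.y += jitter(rng);
    aimPoint.z += jitter(rng);
    return aimPoint;
}

int main()
{
    std::mt19937 rng{42};
    const TargetSample older{{0.0f, 0.0f, 0.0f}, 1.0f};
    const TargetSample newer{{2.0f, 0.0f, 0.0f}, 1.5f};  // moving along +x
    const Vec3 predicted = predictTargetPosition(older, newer, 2.0f);
    const Vec3 aim = addAimNoise(predicted, 0.25f, rng);

    const WeaponTuning blaster;
    std::printf("aim at (%.2f, %.2f, %.2f)\n", aim.x, aim.y, aim.z);
    std::printf("aim height %.1f (0=feet, 1=head), stop at %.1f m, chase up to %.1f m\n",
                blaster.aimHeight, blaster.stopAndShootDistance(), blaster.maxChaseDistance());
    return 0;
}
```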
That’s the top line, but there’s even more logic set up for the AI to follow, such as:
- Defending an area
- Crouching and patrolling when defending an area
- Navigating through the map in a realistic manner by using a constrained-random set of intermediate positions between the starting point and the destination (see the sketch after this list)
- Procedures to get out of an area where it’s stuck (where one step is for the AI to mash all buttons frantically in order to unstuck itself – visualize that!)
- Following objectives, circling up in formation around the player, and reaching a position close to the followed player to patrol the vicinity and protect it from hostiles
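As an illustration of that constrained-random navigation idea, here’s a small sketch that scatters intermediate waypoints between a start and a destination. The jitter radius and interpolation are our own assumptions; in the actual game, any such points would still have to respect the navmesh.

```cpp
#include <cstdio>
#include <random>
#include <vector>

struct Vec2 { float x, y; };

// Build a path of intermediate positions between start and goal, each offset
// by a bounded random amount so the route looks less robotic.
std::vector<Vec2> buildWaypoints(Vec2 start, Vec2 goal, int steps, float maxOffset,
                                 std::mt19937& rng)
{
    std::uniform_real_distribution<float> offset(-maxOffset, maxOffset);
    std::vector<Vec2> waypoints;
    for (int i = 1; i <= steps; ++i) {
        const float t = static_cast<float>(i) / static_cast<float>(steps + 1);
        waypoints.push_back({start.x + (goal.x - start.x) * t + offset(rng),
                             start.y + (goal.y - start.y) * t + offset(rng)});
    }
    waypoints.push_back(goal);  // always end exactly at the destination
    return waypoints;
}

int main()
{
    std::mt19937 rng{7};
    for (const Vec2& p : buildWaypoints({0.0f, 0.0f}, {100.0f, 40.0f}, 4, 6.0f, rng))
        std::printf("(%.1f, %.1f)\n", p.x, p.y);
    return 0;
}
```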
The Specifics of the Hero Combat System
We’d argue that there’s a palpable sense of excitement when you’re in the midst of an intense firefight in Instant Action or Co-Op and suddenly an AI hero appears. The larger battle becomes peripheral, and you’re now dueling the merciless General Grievous or the Chosen One himself. These are instances where players can create their own epic Star Wars moments.
There are a few things we do in terms of AI hero combat to enhance that immersion. The vision for hero bots is to recreate experiences taken from the Star Wars movies, where blocking and studying the enemy is a big component of the fight.
When the AI hero has targeted an enemy at a certain distance, there’s a range of probabilities for the bot to use one of its unique abilities (character-specific attacks, buffs, or Force powers). When the target gets even closer, the real attack and guard logic kicks in.
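To make that distance-based decision concrete, here’s a hypothetical sketch – not the shipped hero logic – where the bot rolls against per-ability probabilities at range and hands over to the attack and guard logic up close. The ability names and probabilities are placeholders.

```cpp
#include <cstdio>
#include <random>
#include <string>
#include <vector>

// Illustrative per-ability chance, rolled when the target is within range but
// not yet close enough for the attack-and-guard logic to take over.
struct HeroAbility { std::string name; float probability; };

std::string pickHeroAction(float distanceToTarget, float abilityRange, float meleeRange,
                           const std::vector<HeroAbility>& abilities, std::mt19937& rng)
{
    if (distanceToTarget <= meleeRange)
        return "attack-and-guard logic";          // close-up duel behavior
    if (distanceToTarget <= abilityRange) {
        std::uniform_real_distribution<float> roll(0.0f, 1.0f);
        for (const HeroAbility& ability : abilities)
            if (roll(rng) < ability.probability)
                return ability.name;              // unleash a unique ability
    }
    return "close the distance";                  // keep moving towards the target
}

int main()
{
    std::mt19937 rng{3};
    const std::vector<HeroAbility> dookuAbilities = {{"Lightning Stun", 0.3f},
                                                     {"Duelist", 0.2f}};
    std::printf("%s\n", pickHeroAction(12.0f, 20.0f, 4.0f, dookuAbilities, rng).c_str());
    return 0;
}
```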
Player controlled General Grievous dueling an AI controlled Anakin Skywalker, switching between defense and attack.
If the AI is a lightsaber hero facing another lightsaber hero, it’ll have a window of time when it guards, and another window of time when it attacks. To make the hero bot harder to predict, those time slots vary randomly between configurable min and max values – picked differently at every move decision. The same logic applies to the hero’s dash evade maneuver.
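A minimal sketch of that randomized timing could look like the following, with made-up min and max values: every time the hero bot makes a new move decision, it draws a fresh guard window and attack window between its configurable bounds.

```cpp
#include <cstdio>
#include <random>

// Configurable bounds for how long a lightsaber hero bot stays in each state.
// The values here are placeholders, not the shipped tuning.
struct DuelTuning {
    float guardMin = 0.5f, guardMax = 2.0f;    // seconds spent blocking
    float attackMin = 0.8f, attackMax = 2.5f;  // seconds spent attacking
};

struct DuelWindows { float guardSeconds; float attackSeconds; };

// Picked anew at every move decision, so the rhythm stays hard to predict.
DuelWindows rollDuelWindows(const DuelTuning& tuning, std::mt19937& rng)
{
    std::uniform_real_distribution<float> guard(tuning.guardMin, tuning.guardMax);
    std::uniform_real_distribution<float> attack(tuning.attackMin, tuning.attackMax);
    return {guard(rng), attack(rng)};
}

int main()
{
    std::mt19937 rng{11};
    const DuelTuning tuning;
    for (int decision = 0; decision < 3; ++decision) {
        const DuelWindows w = rollDuelWindows(tuning, rng);
        std::printf("guard %.2fs, then attack %.2fs\n", w.guardSeconds, w.attackSeconds);
    }
    return 0;
}
```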
In October, the AI heroes were updated to be even more aggressive. A new behavior, triggered when the AI’s target wields a blaster instead of a lightsaber, was also created. In this case, the AI hero will deflect the blaster fire back at the shooting enemy for up to a certain, variable amount of time. When that time runs out, the hero bot will start charging the opponent as fast as possible.
We’ve found that last bit to be as terrifying as it is exciting!
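And here’s a rough sketch of that deflect-then-charge behavior as a tiny state machine; the timer bounds and tick rate are invented for the example.

```cpp
#include <cstdio>
#include <random>

enum class HeroState { DeflectingBlasterFire, ChargingOpponent };

// Hypothetical behavior against a blaster-wielding target: deflect for a
// variable amount of time, then close the distance as fast as possible.
struct BlasterResponse {
    float deflectTimeLeft = 0.0f;
    HeroState state = HeroState::DeflectingBlasterFire;

    void startDeflecting(float minSeconds, float maxSeconds, std::mt19937& rng)
    {
        std::uniform_real_distribution<float> duration(minSeconds, maxSeconds);
        deflectTimeLeft = duration(rng);
        state = HeroState::DeflectingBlasterFire;
    }

    void tick(float deltaSeconds)
    {
        if (state == HeroState::DeflectingBlasterFire) {
            deflectTimeLeft -= deltaSeconds;
            if (deflectTimeLeft <= 0.0f)
                state = HeroState::ChargingOpponent;  // time's up: rush them
        }
    }
};

int main()
{
    std::mt19937 rng{2024};
    BlasterResponse hero;
    hero.startDeflecting(1.0f, 3.0f, rng);
    for (int frame = 0; frame < 200 && hero.state != HeroState::ChargingOpponent; ++frame)
        hero.tick(1.0f / 30.0f);  // 30 ticks per second
    std::printf("hero is now %s\n",
                hero.state == HeroState::ChargingOpponent ? "charging" : "deflecting");
    return 0;
}
```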
Player facing off with AI controlled Yoda, absorbing the blaster fire to use it against the player.
How Designers Utilize the New AI to Create Quintessential Star Wars Moments
There are two layers in Star Wars Battlefront II’s new AI tech. One is the tactical layer, and the other is the strategic layer. What is described above can be filed under the tactical layer; the moment-to-moment decision making written in code.
What’s up to the game designers to figure out, on the other hand, is the strategic layer. This directs how the bots, prepped by the programmers with all the tactical know-how, will go about their business in a specific game mode. In Star Wars Battlefront II, Viktor Lundberg, Sr Technical Designer, drives the implementation of the actual Co-Op and Instant Action game modes, and Martin Kopparhed, Sr Game Designer, scripts the bots to play the modes in strategic ways.
The Separatist AI attacking a Command Post on Naboo in Instant Action.
Looking at Co-Op as an example, this means setting up the rules for which Command Post the AI should attack and when, how many bots are sent out to defend a Command Post under attack by human players, when and how defending bots should retreat once a sector is lost, and more.
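To give a flavor of what that strategic scripting boils down to, here’s a hypothetical, heavily simplified rule set. The Command Post names, defender counts, and retreat threshold are our own examples, not the actual mode data.

```cpp
#include <cstdio>
#include <string>
#include <vector>

// A simplified view of the strategic layer for a Co-Op-style mode: which
// Command Post to attack, how many bots to peel off for defense, and when a
// sector counts as lost. All numbers here are illustrative placeholders.
struct CommandPost {
    std::string name;
    bool ownedByAI = false;
    int attackingHumans = 0;
};

struct StrategicRules {
    int defendersPerContestedPost = 8;
    int lostPostsBeforeRetreat = 2;
};

void assignBotObjectives(const std::vector<CommandPost>& posts, const StrategicRules& rules)
{
    int lostPosts = 0;
    for (const CommandPost& post : posts) {
        if (!post.ownedByAI) {
            ++lostPosts;
            std::printf("Attack %s\n", post.name.c_str());
        } else if (post.attackingHumans > 0) {
            std::printf("Send %d bots to defend %s\n",
                        rules.defendersPerContestedPost, post.name.c_str());
        }
    }
    if (lostPosts >= rules.lostPostsBeforeRetreat)
        std::printf("Sector lost: defending bots retreat to the next sector\n");
}

int main()
{
    const std::vector<CommandPost> posts = {{"Palace Courtyard", false, 0},
                                            {"Hangar Bay", true, 3},
                                            {"Plaza", false, 0}};
    assignBotObjectives(posts, {});
    return 0;
}
```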
Another task is to generate a so-called “navmesh,” an invisible layer applied to the location itself to help the AI navigate the environment. The navmesh will do its best to prevent the AI from the occasional dive off a cliff or from trying to walk through closed doors.
The designer’s work also includes creating the wave spawning of enemies, to generate peaks and valleys in intensity. This is why you’ll have really intense moments at times, where blaster fire zips through the air like it’s full-on Clone Wars, and other moments where you’ll have time to catch your breath and regroup.
There’s a similar system in place for AI heroes, deciding when and how often they’re allowed to spawn, initiating periods of increased challenge – a little bit like a mini-boss!
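Put together, the pacing could be sketched like this: enemy waves alternate between heavy and light to create those peaks and valleys, and a cooldown gates how often an AI hero is allowed to spawn. None of the numbers are the shipped tuning.

```cpp
#include <cstdio>

// Illustrative pacing data: alternating wave sizes create peaks and valleys in
// intensity, and a cooldown decides how often an AI hero may spawn as a
// mini-boss moment.
struct PacingConfig {
    int heavyWaveSize = 24;
    int lightWaveSize = 8;
    float secondsBetweenWaves = 20.0f;
    float heroSpawnCooldown = 120.0f;
};

int main()
{
    const PacingConfig pacing;
    float timeSinceHeroSpawn = pacing.heroSpawnCooldown;  // allow one early hero

    for (int wave = 0; wave < 6; ++wave) {
        const bool heavy = (wave % 2 == 0);  // peak, valley, peak, valley...
        const int enemies = heavy ? pacing.heavyWaveSize : pacing.lightWaveSize;
        std::printf("Wave %d: spawn %d enemies (%s)\n",
                    wave + 1, enemies, heavy ? "peak" : "breather");

        timeSinceHeroSpawn += pacing.secondsBetweenWaves;
        if (heavy && timeSinceHeroSpawn >= pacing.heroSpawnCooldown) {
            std::printf("  ...and an AI hero joins the fight!\n");
            timeSinceHeroSpawn = 0.0f;
        }
    }
    return 0;
}
```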
Take the tactical and strategic layers of the AI, pair them with authentic visuals and audio, a plethora of unique characters and locations, plus all the other facets that make up our game, and you have all the ingredients you need to cook up your own larger-than-life Star Wars moments.
As with the game as a whole, our AI tech is in constant evolution. Stay up-to-date by checking this space, our EA Star Wars social channels, and forums.
If your appetite for knowledge on how Star Wars Battlefront II is made still isn’t sated, we have a few more articles ready for you to digest:
Animating Anakin Skywalker
The Concept Art of Star Wars Battlefront II
Living World and the Battle Beyond
The Making of Geonosis
The Making of a Hero
The Creation of Ewok Hunt
Visual Effects in Star Wars Battlefront II
–Daniel Steinholtz (Follow Daniel on Twitter @dsteinholtz)
Make sure to join the discussions on our official forums or on social – follow EA Star Wars on Twitter, Facebook, and Instagram.
Also, sign up today to receive the latest Star Wars Battlefront II news, updates, behind-the-scenes content, exclusive offers, and more (including other EA news, products, events, and promotions) by email.