
Content Moderation and Enforcement

Learn more about the features and tools we use to moderate the content that players share in our games.

At Electronic Arts, we believe in the power of positive play and that our players should feel welcome, safe and included in our games. In our Positive Play Charter, you can learn more about what we expect from our players.

You’re in control of the information and content you share

You allow EA and our players to access anything you upload or create, known as your user-generated content (UGC), within our games and services. You are responsible for your UGC; it must be your own content or content you're allowed to use.

First, consider carefully who you share your information with. Avoid sharing personal or private information online, including when creating your EA ID. You can use the Privacy Settings on your EA Account to control who can see or search for your account profile. In our games, you also have the ability to block or ignore anyone you’d prefer not to interact with. 

There are additional controls available to you if you have a child or teen EA Account and want to limit chat access, playtime, spending, or other interactions.

  • Text UGC is anything you type that can be seen by other players, including your EA ID, any game-specific names (like club names), content descriptions, and text chat. 
  • Image UGC is any kind of image you create with in-game tools or upload into the game.
  • Voice or Audio UGC is what you say to other players in voice communications they can hear.

Filtering Content

Text

We have real-time text moderation in our games, meaning that we use filtering tools to check if the text you enter upholds our Positive Play Charter before any other player sees it.  

If you enter an inappropriate name or item description, we’ll ask you to choose another one. If you’re inappropriate in text chat, that text will be replaced with asterisks so that other players aren’t disrupted by it. 
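As a rough illustration, masking of this kind can be implemented by matching each word against a blocklist and replacing matches with asterisks of equal length. The blocklist and matching logic below are hypothetical placeholders for illustration only, not EA's actual filter, which is far larger, multilingual, and continuously updated.

```python
import re

# Illustrative blocklist; a real moderation system uses a much larger,
# multilingual, continuously updated rule set.
BLOCKLIST = {"jerkface", "meanie"}

def mask_chat(message: str) -> str:
    """Replace each blocklisted word with asterisks of the same length."""
    def mask(match: re.Match) -> str:
        word = match.group(0)
        return "*" * len(word) if word.lower() in BLOCKLIST else word
    return re.sub(r"[A-Za-z]+", mask, message)

print(mask_chat("You jerkface!"))  # You ********!
```

Masking in place, rather than dropping the message entirely, keeps the conversation flowing for other players while removing the disruptive content.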

As the technology evolves, more recent games use a system with varying levels of permissiveness based on the age rating of the game. However, there are some types of speech we won't allow regardless of the rating. This includes hateful conduct, bullying that goes beyond smack talk or banter, and sexual content. 

Our rules around harmful words are constantly evolving and cover multiple languages. We review the accuracy of our systems regularly to check they filter out the worst content without interfering with players’ freedom of expression and enjoyment of chat interactions in our games.
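One way to picture the rating-based system described above is a policy table that relaxes some content categories at higher age ratings while always blocking others. The ratings, categories, and table below are illustrative assumptions for the sketch, not EA's actual policy.

```python
from enum import Enum

class Category(Enum):
    HATEFUL_CONDUCT = "hateful conduct"
    BULLYING = "bullying"
    SEXUAL_CONTENT = "sexual content"
    PROFANITY = "profanity"

# Categories blocked at every rating, per the policy described above.
ALWAYS_BLOCKED = {Category.HATEFUL_CONDUCT, Category.BULLYING,
                  Category.SEXUAL_CONTENT}

# Hypothetical per-rating policy table (ESRB-style labels for illustration).
RATING_BLOCKED = {
    "E": ALWAYS_BLOCKED | {Category.PROFANITY},
    "T": ALWAYS_BLOCKED | {Category.PROFANITY},
    "M": ALWAYS_BLOCKED,
}

def is_allowed(category: Category, rating: str) -> bool:
    """Return True if content in this category passes the filter
    for a game with the given age rating."""
    return category not in RATING_BLOCKED[rating]
```

Under this sketch, mild profanity might pass in a mature-rated game but not a family-rated one, while hateful conduct fails everywhere.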

Image

We know that our players like the ability to customize their EA Account avatars. To ensure that the avatar images are appropriate, all uploads are filtered by automated means, supported by human review. 

Voice

We know it is important for players to be able to report inappropriate voice chat in-game. We are actively working on this and will share more information soon.

Player Reports

Player reporting remains a critical tool in how we promote safe gaming spaces, allowing players to tell us about disruptive behavior or content so that we can investigate it. Every EA game launches with an option to report in-game.

Reports of Text UGC

  • We use automation to prioritize text reports, which means that some may be auto-closed if our tools assess them as safe. 
  • We spot-check all reported content on a regular basis to make sure we didn’t miss anything. 
  • If we did miss something, we adjust our filters so that similar content will be filtered or correctly prioritized for review in the future.
  • If we do remove a name or description we send an explanation to the player who uploaded it.
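The automated prioritization described above might look something like the following sketch, in which a hypothetical toxicity score routes each report and a fraction of auto-closed reports is sampled for human spot-checks. The thresholds and the spot-check rate are assumptions for illustration.

```python
import random

def triage_report(toxicity_score: float, spot_check_rate: float = 0.05) -> str:
    """Route a text-UGC report based on an automated toxicity score.

    Thresholds and spot_check_rate are illustrative assumptions,
    not EA's actual values.
    """
    if toxicity_score < 0.10:
        # Assessed as safe: auto-close, but sample some for human spot-checks.
        if random.random() < spot_check_rate:
            return "spot-check"
        return "auto-closed"
    if toxicity_score >= 0.80:
        return "high-priority human review"
    return "standard human review"
```

Sampling even the "safe" bucket is what lets the spot-check process catch filter misses and feed corrections back into prioritization.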

Reports of Image UGC

  • We rely on player reports to let us know about images that are not suitable for the game or experience where they’ve been shared. 
  • All reported images are reviewed by humans, and anything they find to be unsuitable is removed. 
  • Whenever we remove an image we send an explanation to the player who uploaded it.

Actions we may take

When players sign up for an EA Account or use an EA service they agree to our rules of conduct. We have a well-trained, dedicated team to review and respond to abuse reports swiftly and consistently. This team is supported by automation tools and by our Legal and Security teams.

If players violate our rules of conduct, we may place restrictions on their account and temporarily or permanently suspend access to some or all EA services. The intent of warnings and suspensions is to highlight disruptive behavior and give the player a chance to change it. We will issue permanent account bans where a reported player's actions are illegal or egregious, or where they have a record of continual disruption.

We have internal processes for engaging law enforcement or relevant reporting bodies and for quickly responding to law enforcement agencies that are seeking support from us. Reported content of the sort that requires those processes is very rare; the vast majority of player reports are about cheating, vulgar, or offensive content. 

No actions taken against an EA Account, including bans, suspensions, and other actions like coin wipes, are automated.

 

Does all of that actually work?

We’ve learned that the sustained use and continual improvement of automated filtering tools can keep the worst content out of our games, while reporting allows you to let us know if another player is breaking the rules. This, along with feedback to the community, including through warnings or suspension, has been shown to facilitate safe and positive player experiences.

For example: 

  • An internal study showed that 85% of players changed their behavior when we provided information about the rules. 
    • After providing feedback to our Apex Legends player community about behavior that violated our Positive Play Charter, the community improved itself without bans or other punitive measures. 
  • This work, which started with NHL 21, combined with NHL moving to our newest filter at the launch of NHL 23, has led to NHL 24 having the lowest level of disruptive content to date.
    • Increased use of technology in NHL to scan and remove content for language that violated our Positive Play Charter led to a reduction in both repeat offenders and the severity of disruptive content. 

Learning from you

If we get something wrong in text, image, or voice moderation, we want to know about it. 
For text or name filtering, post on the forum for your game on Answers HQ and the Community team will escalate it for you. 

If your account has been mistakenly actioned, you can tell us, and submit an appeal.