
Rainbow Six Siege and the gaming culture war: Why 'toxicity' is a lazy bogeyman

Rainbow Six Siege (Image credit: Ubisoft)

Be it in real life, media, or video games, healthy discourse has taken a serious nosedive in recent years. In the past year alone, Microsoft conducted research confirming that people are growing less civil online. As a result, many places have taken a "no conflict" approach, seemingly on the theory that radio silence is preferable to entertaining potentially provocative speech.

Look at how many videos on YouTube disable comments. Look at all the sites and forums that manually delete posts and cherry-pick what stays and what goes. Look at the absolute state of Rainbow Six Siege, a game that has struggled with toxicity in multitudinous forms yet has only ever decided to crack down hard on something as immaterial and irrelevant as bad words. Fear of diversity of thought, be it constructive or destructive, runs rampant right now, even though that speech is just a symptom rather than a root cause.

With that in mind, I'd like to make an argument for why we need a new approach to toxicity, specifically in gaming, that focuses more on the big picture and less on making sure everyone is adequately armed with earmuffs and sight blinders.

The deal with toxicity

Rainbow Six Siege (Image credit: Ubisoft)

We can't discuss toxicity without first defining it or, rather, highlighting how vague its actual definition is. Merriam-Webster says toxicity is "an extremely harsh, malicious, or harmful quality." Here's the issue: That means speech-based toxicity is different for absolutely everyone.

Why are mean words considered the absolute pinnacle of toxic behavior?

You know when you lose a match of Rainbow Six Siege and someone posts a series of smiling emoji to taunt you? For a great many people, that's just as toxic as, if not more toxic than, that same person calling you a good-for-nothing sack of expletives who should uninstall the game. It's far more passive-aggressive and petty, to say the least. And even if the symbols are different, the intention is the same: to build unhealthy rage in an opponent. Yet the person who posts the baiting smiley faces will not be banned, whereas the person who calls someone else a $^%&head will, even though the degree of offense communicated by each type of message is immeasurable and subjective.

Furthermore, if we stick with the Rainbow Six Siege train of thought, remember: You can mute people and turn off chat. You can take matters into your own hands and build a walled garden around your eyes and ears so no one can hurt you. The tools are there for people to speak freely while also not offending your sensibilities.

So why, then, has Ubisoft, along with many similar companies such as Riot Games, decided that mean words are the absolute pinnacle of toxic behavior and placed a premium on combating that peripheral bogeyman instead of on the underlying issues causing the mudslinging in the first place?

Skewed priorities, hollow accomplishments

Rainbow Six Siege Maestro (Image credit: Ubisoft)

Rainbow Six Siege is a game that breeds toxicity from its very foundation. Permadeath means if you die for any reason in a round, you lose five minutes of your actual life. Always-enabled friendly fire means permadeath at the hands of a teammate is a constant threat, especially in a game with a community such as Rainbow's, which is made up of an even four-way split between tryhards, griefers, braindead goobers, and young children who probably shouldn't be playing a game with such adult themes.

Herein lies the real toxicity. Among these four groups, virtually every match of Siege will feature one or more of the following events:

  1. A teammate will teamkill you either by accident or on purpose.
  2. A squad of people will gang up on a solo queuing player via teamkilling, team injuring, or destroying gadgets every round.
  3. A player will bait you into teamkilling them, either by damaging you or your gadgets, to make you a target for being a "TKer." (This is how players get around R6's poorly implemented "reverse friendly fire" penalty.)
  4. A player will go AFK (away from keyboard), essentially robbing you of a teammate and throwing the match.
  5. A player will deliberately or unintentionally ignore or avoid the objectives, thereby throwing the match.
  6. A player will place reinforcements and traps in counterproductive places, thereby throwing the match.
  7. A player will use cheats without consequence, despite BattlEye's best (but not good enough) efforts.

Notice how none of these incidents involves someone talking smack in a little text box in the corner of a screen, yet all of them have far more serious, material ramifications for your enjoyment of Rainbow? It's easy enough to tune out some nonsensical, gibberish text (either by not paying attention to it or by literally blocking it). But when teammates are deliberately denying you access to your game by making you wait entire rounds just to be teamkilled, the toxicity is on a whole 'nother level, especially because if you quit a match to escape the malicious parties (of whom there are millions in ranked matches), you will be penalized and temporarily banned for abandoning your squad.

These are concretely toxic actions in the sense that they are objective illustrations of maliciously motivated, harmful behavior. And yet, none of them is effectively combated by Ubisoft, which seems fixated on going after a far easier target: the folks who type mean things, either because they've suffered the above annoyances or just because they enjoy saying nasty stuff.

Rainbow Six Siege Operators (Image credit: Ubisoft)

Ubisoft has taken drastic measures such as disabling "all" chat by default, so you only ever read messages from half the players in a given match. The company has given people a multitude of muting and hiding options to ensure no player has to indulge another player's voice or text messages. And Ubisoft has taken it further still, to the point where expressing basic frustration (not slurs, targeted harassment, or anything of a "more serious" nature) via mic or chat can result in a temporary or permanent ban.

Ubisoft's efforts to curtail rude speech are the equivalent of putting a Band-Aid on the Titanic.

In concept, this strictness is good, as it's intended to enforce friendly competition amongst players. But again, that circles us back to the issue of what counts as toxic. Players can still taunt you verbally or textually in ways far worse than a simple "f&^* you." That goes unpunished, while bad words (spoken inside a game about brutally murdering people acting out terrorist attack scenarios, no less) are met with near-zero tolerance.

It'd be one thing if Ubisoft applied this hardline approach to all of the game's toxicity issues, but here's the problem: Ubisoft doesn't. All seven of the behaviors listed above run rampant. Ubisoft's efforts to curtail rude speech are the equivalent of putting a Band-Aid on the Titanic; they punish people for the simple act of venting or getting competitive while the real issues go unaddressed.

It often seems that the report buttons in Siege, the ones meant to help you flag griefers and troublemakers to Ubisoft, accomplish nothing and carry zero consequence. They're just there to make you feel good, much like BattlEye is supposed to, even as cheaters run rampant. The only tool in Ubisoft's arsenal of toxicity-battling measures with any visible effect is the chat auto-moderator, which is quite possibly the most effective operator in the game in terms of killing off players. No Jackal or Caveira will rack up as many eliminations as the chat auto-moderator, even though half of those eliminated will likely be your own teammates.

Not an isolated problem

Overwatch promotional artwork (Image credit: Blizzard Entertainment)

The issues that plague Rainbow Six Siege are not exclusive to it. Recall the famous video titled "I'm Done with League of Legends," in which YouTuber videogamedunkey formally quit LoL for good; he cited similar reasons for abandoning his game of choice. He broke down how the inner workings of the experience were inextricably linked to toxicity and how no amount of blacklisting players who said mean things would change the fact that the game's fundamental mechanics routinely led to anger.

Toxicity is alive and well, even if developers have deluded themselves into thinking otherwise.

And yet, toxicity ceaselessly plagues Rainbow, LoL, Overwatch, Call of Duty, and just about every other multiplayer experience in existence. It's almost as if the very nature of competition invites conflict, which produces speech and behavior that is not, strictly speaking, "nice." And by attempting to pacify people to such an extreme degree, developers and publishers are essentially sealing a boiling kettle so it can never let off steam, which also prevents it from ever cooling down.

So, when the kettle boils and has no way to let off steam via sh*tposting, what happens? We get a player base like Rainbow Six Siege's, where players spend copious amounts of time and energy figuring out how to anger others via in-game mechanics instead of speech, resulting in a game where teamkilling and sabotaging one's allies are commonplace. By attempting to cure a disease while seriously addressing only a single symptom, Ubisoft and similar developers make the overall illness worse while continuing to pretend that they've done a good job.

Look at Overwatch, wherein crackdowns on abusive text are being cited as cures for the problem. And yet, anyone who plays knows this isn't true. Toxicity is alive and well, even if developers have deluded themselves into thinking that fewer mean words translate to a net decrease in overall bad behavior.

Bringing it full circle

Smoke (Image credit: Ubisoft)

To make my point, I have to take the stance of absolute freedom of speech and, at least when it comes to things as trivial as games and entertainment media, freedom from consequence. I do not think anyone should be prohibited or deterred from sharing their thoughts or from using any words they need to communicate said thoughts, so long as they're not actively threatening anyone's livelihood or physical safety. That's why you won't ever see me delete readers' comments here on Windows Central, regardless of what they say (though I cannot speak for my colleagues in this respect, nor should I).

With that said, I'm not here to encourage little Timmy from Idaho to hop on his Xbox Series X and shout racial slurs and abusive language at people; that sort of behavior isn't productive or healthy, and if a game gets someone in the mood to act that way, the ideal solution is for them to find a game that makes them happier. However, I also don't think such behavior is the real foundational issue we all need to be focusing on. That aspect of toxicity can be muted or ignored. The root cause of said behavior is what we need to handle.

Chat abuse in games is a symptom of real problems, be they bad parenting, bad influences, anger issues, or the natural consequence of video game experiences deliberately designed to get under people's skin as much as possible. I can't do much about people's parenting styles or individuals' anger management problems, but here's the deal: If a game is designed for toxicity like Rainbow, League of Legends, and others are, let the troubled people who choose to play such games have their therapeutic venting space so that the negativity is not brought offline.

Alternatively, demand that developers strip out the inherently toxic elements so that such negativity is not actively bred in their games, though such an endeavor may be fundamentally impossible given the nature of the beast. But limiting communication is the equivalent of putting a layer of bubble wrap over a jagged, rusty piece of metal. It's not fixing the issue; it simply shows how much deeper the problem really is. And if you choose to slam into that metal thinking the bubble wrap is enough to protect you from the real danger, don't be surprised when you get cut.

Robert Carnevale is the News Editor for Windows Central. He's a big fan of Kinect (it lives on in his heart), Sonic the Hedgehog, and the legendary intersection of those two titans, Sonic Free Riders. He is the author of Cold War 2395. Have a useful tip? Send it to robert.carnevale@futurenet.com.

22 Comments
  • This is why I play D2 PvE. I can still play with other people but they can't grief me.
  • PvE is a solution, sure. I guess
  • You are all OVER this Robert. Nicely done.
  • To clean the community of toxic elements, Xbox should use a permanent console and PC ID banhammer. An account ban is not enough; toxic players make multiple accounts and continue ruining gaming for folks.
  • This was surprisingly level-headed. Of course people shouldn't be rude/mean/nasty but selectively censoring people always has and always will breed resentment and more bad behavior.
  • Censorship itself is toxic. Who's watching the watcher? We need more speech not less. Misinformation and disinformation is of direct Bolshevik derivation. Wanting to declare ideas and thought as verboten is a clue of intolerance and paranoid delusion, not reality.
  • This is correct and I think Robert nailed it pretty well.
  • Agreed. And the more these developers play whack-a-mole with rude/uninformed chatters, the less they focus on the actual issues. Let people voice their silly opinions/thoughts, let people vocally oppose them (instead of suppressing them), and force game devs to stop pretending that they're addressing the real problem.
  • You say a lot of words but none seem to offer concrete paths to resolving the issue. What will you do with people who spam "1488" all over at players of color? Or who keep sexually harassing female players? Let them own the space? Because that's the end result and is a big part of why gaming in general is a cesspool that is hostile to large demographics.
  • I think Robert's point is that behavior like that can be self-policed; generally, games provide options for players to block or report specific people and remove them from your mic/chat. Of course, it doesn't stop the initial rudeness, but it means that you are aware of the ********* and can avoid them, whereas if the game itself merely removes the troublesome text, then a person doesn't know that the person on their team is a racist ***** who is intentionally causing grief for that reason.
  • Exactly, Ogaris. You know what bothers you the most when it comes to spoken comments, so utilize the tools provided to you to block out anyone you dislike in a random web-based game encounter. But the broad, sweeping silencing tactics are lazy at best (since they don't tackle the real problems) and harmful to general human discourse at worst, since they encourage people to get more inventive with stuff like trolling.
  • You can't fix people, that's for sure. And that's the real problem. But silencing strategies at least prevent people from getting insulted or harassed in the first place. If you get insulted online, the damage is already done. If someone goes online to play a game to have fun and gets insulted, discriminated against, or griefed, it's a horrible experience. Blocking, reporting, or getting someone banned won't make it go away; maybe it's ruined your day, or maybe it's ruined that game, or even video games forever for you. In my experience as a gamer, I've longed to play online games because of the potential they have for interaction. But in practice, the experience ends up being a bad one more often than not when playing with random players or in open games.
  • The thing is, on one hand I don't want to go into a game and, in addition to playing it, have to report people myself, doing the work needed to maintain the game as a healthy place. It's like going into a restaurant and having to clean the toilets.
    What you say also implies that people are getting insulted and harassed in the first place. Any other measure is secondary to that happening. So you log in to a video game to have fun and you get insulted? Nobody wants that. I'd much rather have preventive measures against that happening, as many as possible. And yeah, I think censorship applied to prevent violence or aggression in these scenarios is a good thing.
  • >>If a game is designed for toxicity like Rainbow, League of Legends, and others are, let the troubled people who choose to play such games have their therapeutic venting space so that the negativity is not brought offline. So you're proposing just to let the games turn into a cesspool and let those who choose to play them swim with the turds. The only thing that creates is a game that can't be played (unless you're insane). It's bad for that to exist, and to some extent (and hopefully) it's bad business. I agree with how your article blames the gameplay experiences themselves as being designed to be toxic. That is an important part of this. But the underlying issue is that people will be crap. People given the slightest room to do it will be absolutely horrifying to fellow human beings. And this happens online consistently. This is a complicated issue because there doesn't seem to be a solution to it. The only takeaway I get from this is to stay away from R6 and LoL. Also, it reminds me of a recent conversation I had with a newcomer to games where I recommended Sea of Thieves to him, and he said he had a bad experience with aggressive people, shrieking kids, and insults. I said what I believe, that every multiplayer game should be played with all forms of chat turned off. And that is the state of multiplayer in 2021.
  • Seriously speaking, if all you took away from this was to stay away from R6 and LoL, that's perfectly fine. They're very good games to stay far away from since I do think they're the kinds of experiences that draw not-so-nice people, like moths to a flame.
  • I played well after turning off voice chat in CSGO and Valorant. It's a bit hard to tolerate when you get kicked from a random unranked match in CSGO.
  • I think voice chat and text chat, together with any form of emote communication, should be turned off by default, so you only use them if you intend to. As you say, even so, people are crap. My first SoT game where I tried playing with a random crew had me locked in the jail with people mocking me and laughing at the "rando". That's awful, and it completely shut down the option of playing without a crew of friends in that game.
  • I get why these governance issues are complicated and why we need to have lots of discussion about how to go about it in a better way. But throwing around the word 'censorship' is lazy and does not help us understand these issues. Private companies are free to set the terms of how they expect their users to interact with each other, and users are free to leave if they don't like it. Private companies are also free to not do business with whomever they want. It has nothing to do with censorship. You looked up the word 'toxicity' but you should have looked up 'censorship'.
  • Andrew, take a peek at Merriam-Webster. "Censor (transitive verb): to examine in order to suppress (see suppress sense 2) or delete anything considered objectionable. Also: to suppress or delete as objectionable" Nowhere is the word limited to defining the actions of government bodies; it applies to private institutions as well. It applies to anyone or anything that censors someone. Just because you're not comfortable acknowledging the reality that this is censorship (albeit in a petty, game-related form) doesn't mean reality itself will change to suit your shifted goalposts. This is censorship, and companies are free to censor customers they don't like. I'm simply arguing why it's a bad practice.
  • Agreed, I don't think that angle solves anything.
  • Wow, just saw this article, months late, but nicely said Robert. I'm saddened that there are people (probably a large number of them) who feel that preemptively blocking speech because of the harm cruel words can have on people is better than just letting it occur and then expelling the "criminals" (I don't mean that literally). This is the speech version of the Minority Report. Proactively punishing because of what might occur is fundamentally wrong. Amazingly, this seems to be becoming the norm. Yes, these may be private companies, and so yes, they have the LEGAL right to do what they're doing (in fact, I'd defend their RIGHT to do what they're doing), but it's still appropriate to argue against their actions and choices, just as it would be to condemn them for encouraging racist or hateful speech. Remember when it was only small children who needed to be reminded, "Sticks and stones can break your bones, but words and names can NEVER hurt you"? I do think it's a good thing that we as a society have become more concerned with not being cruel or insulting to people (we should all strive to be kind and considerate), but at the same time, people act as if there's some glory in being offended. There is not. If you're offended by someone else's words, that's probably on you, especially if their offense was unintentional. To expect everyone to speak only the way you want them to is pretty much the definition of narrow-minded and intolerant. To demand it is much worse.
  • I'm months late as well, but: Nailed it.