The world of gaming has long had a reputation for being a breeding ground for trolls and online harassment. Whether it be towards women, the LGBTQ+ community, or any other marginalized group, the problem persists. Here are a few ways game developers, streamers, and gamers can make online gaming more inclusive.
Impact of Online Harassment
Online harassment and bullying can cause several adverse effects. The full list could run to dozens, but these four are among the most damaging.
Mental and Emotional Harm
Mental and emotional abuse are well-known, and online gaming is a prime space for them to flourish, thanks to the anonymity players have. They can cause tremendous, and sometimes irreversible, long-term damage, especially for players who deal with trolls constantly.
Lower Diversity
Less diversity in a space is never a good thing. Most online harassment is targeted at marginalized groups, which creates a space filled with only the most toxic players and players who turn a blind eye.
There is also a lower chance of new players joining a game when it is clear the group they are part of is constantly harassed. This has been a common problem with female gamers for years, with some even needing to use voice changers to play on a team.
Reputation Harm
For game developers and companies, constant online abuse within their games will severely damage their reputation. Not fixing the problem comes across as accepting it, and in this day and age, that isn’t tolerated. Nor should it be.
Financial Loss
Building on the previous point, when a developer is viewed as an entity that accepts this type of abuse, sales will inevitably slow and drop. In a way, developers have a financial incentive, as well as an obligation, to maintain a safe space so that as many people as possible buy and play their game.

Solutions for Creating Safer Gaming Spaces
Several solutions can be implemented to create safer spaces in online games, but, when applied correctly, these are among the most effective.
Finding Harmful Communities
Developers and moderators should become more effective at identifying harmful communities. Whether it be in-game clans or teams, social media followers, or anything in between, finding these communities is the first step to removing them from the game. While this can be tough and does require a lot of time and research, it will help developers and moderators stop these communities at the source.
Impartial Reporting System
One issue many people have had over the years is ban systems that seem far more subjective than objective. This has been a common problem on Facebook for ages; you find a post containing clear hate speech, report it, and find out a few hours later that a moderator has decided it isn’t hate speech.
Taking this subjectivity out of reporting draws a clear line in the sand about what will and won't be tolerated. This is particularly important because there are many ways to harass someone or express hate speech without "technically" breaking the T&Cs of a game.
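One way to picture an impartial system is as a set of written policy rules that are checked mechanically, so every report is judged the same way regardless of which moderator handles it. The sketch below is a minimal, hypothetical example; the category names and placeholder patterns are illustrative, not any real game's policy, and a production system would pair automated triage with human review.

```python
import re

# Each written policy category maps to explicit detection patterns, so two
# moderators (or an automated pass) reach the same verdict for the same
# message. All patterns here are hypothetical placeholders.
POLICY_RULES = {
    "slur": [r"\b(slur1|slur2)\b"],
    "threat": [r"\b(kill yourself|kys)\b"],
    "targeted_harassment": [r"\bgo back to\b"],
}

def evaluate_report(message: str) -> list[str]:
    """Return every policy category the reported message violates."""
    text = message.lower()
    violations = []
    for category, patterns in POLICY_RULES.items():
        if any(re.search(pattern, text) for pattern in patterns):
            violations.append(category)
    return violations
```

Because the rules are written down and applied uniformly, a decision like "this isn't hate speech" can be audited against the policy rather than left to one moderator's judgment.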
Improved Banning Methods
If you have ever played Warzone or another battle royale, you will already know that even the supposedly best banning technology doesn't work well. There are far too many ways for banned players to boot up a new account and be back in the game.
On top of this, if you aren't banned for cheating, it is even easier to fly under the radar and continue harassing others. This is why developers need to reanalyze their policies and technology to make it easier to permanently ban players who repeatedly break non-cheating-related T&Cs.
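One common approach to the new-account problem is tying bans to the device as well as the account. The sketch below is a simplified, hypothetical illustration; the identifier fields are placeholders, and real systems combine many signals (hardware IDs, payment fingerprints, network data) that are much harder to spoof than a username.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DeviceFingerprint:
    # Hypothetical identifier fields for illustration only.
    hardware_id: str
    ip_subnet: str

class BanRegistry:
    def __init__(self) -> None:
        self._banned_accounts: set[str] = set()
        self._banned_devices: set[str] = set()

    def ban(self, account_id: str, device: DeviceFingerprint) -> None:
        # Record both the account and the device, so a fresh account
        # created on the same machine is still blocked.
        self._banned_accounts.add(account_id)
        self._banned_devices.add(device.hardware_id)

    def is_blocked(self, account_id: str, device: DeviceFingerprint) -> bool:
        return (account_id in self._banned_accounts
                or device.hardware_id in self._banned_devices)
```

With a registry like this, making a new account after a ban no longer works: the banned device is recognized even though the account name is new.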
Identify Problematic Streamers
Streamers can be, and often are, part of the problem as well. Many cultivate communities that share their distaste for a particular group or policy, and because it is so easy to find other streamers and groups with the same views, the problem takes barely any time to grow. Platforms and developers should identify these streamers and hold them accountable before their communities spread further.

Zero-Tolerance Policies
Zero-tolerance policies would also go a long way in stopping online abuse. Whether in a streamer’s chat or in-game, outright and immediate bans for language and actions that break T&Cs should become more common.
Many argue that this infringes on “free speech” or that trash-talking is part of the game. While the latter may be true to an extent, the harassment and abuse many people face goes far beyond trash-talking, and this distinction must be acknowledged and called out.
Incentivization
While this step may be challenging to apply, incentivizing good behavior, positive interactions, and diverse gaming communities can go a long way toward creating a healthier community. Rewards could include in-game items, access to exclusive events, or simply some in-game currency, reserved for players who have not received any warnings or bans.
Transforming Online Gaming Into a Safer Community
Creating safe spaces in online games is a long-term process. While there are several steps developers and communities can take, the changes will only be seen if there is consistency and a “group effort” of sorts. Gaming is for everyone, and the player bases of every title should reflect that.