
(Update: It looks like a total of 833 games have been banned by Valve today)
Bloodbath Kavkaz? Nah, Bloodbath Steam.
Valve is currently in the midst of what appears to be a massive ban wave of shady Steam developers, with hundreds of games caught in the crossfire and no sign of slowing down. The ban wave began just over an hour ago as of this publishing and has been knocking out games left and right.
Chief among the ban list is Dagestan Technology, a Russian publisher of titles such as Bloodbath Kavkaz.
We will update if more information becomes available.
Source: Sentinels of the Store

Blizzard wants you to know that they are completely committed to dealing with toxic behavior in Overwatch. Effective immediately, offending players will begin receiving harsher punishments for cheating, griefing, trolling, spam, and match disruption (intentionally going AFK). Exactly how strict Blizzard’s punishments will be remains to be seen.
We know that making Overwatch a truly welcoming environment is an ongoing process, and this is only the first step. Over the next several months, we have plans to make additional improvements based on your feedback, including scaling competitive season bans, a notification system that will alert you when a player you’ve reported is actioned, and functionality that will allow us to more aggressively penalize players who attempt to abuse the in-game reporting tool.
Overwatch has had ongoing problems with toxicity, an issue hardly unfamiliar to heavily competitive titles.
(Source: Overwatch)

Stomping down toxic behavior is all the rage these days, from Riot Games permanently banning certain players for life, to Blizzard pledging to tackle racism after the latest Dreamhack event, to Jagex taking on streamer harassment and KKK cosplay (a phrase that shouldn’t exist), and now Trion Worlds with ArcheAge. The game has become a lot less friendly, and Trion’s customer service isn’t happy about it.
In a news post published yesterday, Trion Worlds committed to taking a more hands-on approach to toxic behavior.
We are going to be much more conscious about what we allow to be said in public chat channels. We know that some will do their best to test boundaries and try to skirt our intent and then appeal the action with a technicality. Ultimately if our determination is that your chat is contributing nothing except grief to a player or community we will take actions to prevent that.
So what does this mean? Well, you can call a boss a bitch, but you can’t call another player a bitch. Personal attacks, using alternate accounts to harass a player who has put you on ignore, spamming the chat channel, physical threats, and more will result in action being taken against offenders. Trion also reminds players that this only extends to in-game chat, and that the company can’t extend its reach to outside platforms (Facebook, Twitter, etc.).
(Source: ArcheAge)

Today must be a day ending in ‘day,’ because Dreamhack has come and gone and the internet has once again shown itself to be a cesspool of racism and harassment. In the wake of people piling on to the Hearthstone stream to throw racist comments at finalist Terrance Miller, both Blizzard and Twitch have committed to reducing problematic behavior on the platform.
Is there ultimately any difference between someone who posts racist remarks with the goal of trolling/harassment and someone who posts them because they are genuinely racist? Probably not; both are equally disruptive and in need of being stamped out. Because MMO Fallout’s modus operandi is to help solve problems rather than just point them out, I’ve decided to compile a list of ways Twitch can curb harmful behavior.
5. Prevent New Accounts From Using Chat
This one is simple and ties in with one or two other suggestions on this list. Many MMOs already do this to curb gold farming: accounts are not allowed to use chat or access certain trade/communication features until after they’ve hit a certain level. It doesn’t stop the problem completely, but it does lower people’s ability to mass-produce burner accounts.
How would this system work with Twitch? You could theoretically introduce a minimum waiting period anywhere from a day to a week or more before an account can access chat. Said waiting period could be removed with the inclusion of two-factor authentication.
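To make the idea concrete, here is a minimal sketch of what such a gate could look like, assuming hypothetical account fields and thresholds rather than anything Twitch actually exposes:

```python
# Hypothetical chat-eligibility check: field names and thresholds are illustrative.
from datetime import datetime, timedelta

CHAT_WAITING_PERIOD = timedelta(days=1)  # could be anywhere from a day to a week or more

def can_use_chat(account_created: datetime, has_two_factor: bool,
                 now: datetime | None = None) -> bool:
    """New accounts wait out a probation period; two-factor authentication waives it."""
    now = now or datetime.utcnow()
    if has_two_factor:
        return True
    return now - account_created >= CHAT_WAITING_PERIOD
```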
4. New Chat Mode: Authenticated
Right now there are only a few chat modes available to Twitch streamers, from subscriber-only to off completely. Since Twitch already has two-factor authentication, it wouldn’t be that difficult to implement a chat mode allowing both subscribers and non-subscribers who have authenticated to participate.
Two-factor authentication also means that you have an outside identity tied to the account, be it a phone number or the hardware ID of the mobile device. This would give Twitch the ability to ban all accounts associated with that phone number/device and prevent it from being used to sign up for a new account for a period of time.
Valve already does this with Counter-Strike: Global Offensive, where a ban blacklists that person’s phone number for three months and bans all accounts associated with it.
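As a rough illustration of how the phone/device angle could tie bans together, here is a sketch built on made-up data structures; the real systems at Valve or Twitch obviously look nothing like a pair of dictionaries:

```python
# Illustrative only: ban every account sharing an identity (phone number or
# device ID) and blacklist that identity for a cooldown period.
from datetime import datetime, timedelta

BLACKLIST_DURATION = timedelta(days=90)  # roughly the three months Valve uses

accounts_by_identity: dict[str, set[str]] = {}  # identity -> account names
blacklist_until: dict[str, datetime] = {}       # identity -> blacklist expiry
banned_accounts: set[str] = set()

def ban_identity(identity: str) -> None:
    """Ban all accounts tied to an identity and block new signups on it."""
    banned_accounts.update(accounts_by_identity.get(identity, set()))
    blacklist_until[identity] = datetime.utcnow() + BLACKLIST_DURATION

def can_register(identity: str) -> bool:
    """A new account can only be created once the blacklist has expired."""
    expiry = blacklist_until.get(identity)
    return expiry is None or datetime.utcnow() >= expiry
```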
3. Turn Off Chat For Big Events
This is a cop-out and not a suggestion that actually fixes the problem, but right now it is one of the easiest answers. Look at it this way: with tens of thousands of people watching these events, is having them all in one central chat room really logical? Imagine packing an entire stadium’s worth of people into one room and letting them drown each other out, then having a team of ten people try to keep the conversation in line. Impossible, right?
As much as I’m sure event organizers don’t want to use them, there are already systems in place on Twitch to alleviate these problems. Slow chat, subscriber-only mode, turning chat off entirely: all of these are useful tools. The moderators of Dreamhack even admitted that they made mistakes, with moderators overriding each other’s decisions.
2. Shadow Bans
Simple, efficient, and taking a page from Reddit’s book. If you aren’t familiar with a shadow ban, it is a special type of punishment where posters can see their own messages but no one else can. The problem on Reddit is that it becomes apparent rather quickly that you’ve been shadow banned, as your posts suddenly stop receiving upvotes and replies.
The approach works best when the user can’t gauge reactions or isn’t paying attention to them, which is why it is a good fit for Twitch. When someone is shouting into the void (or in this case, into a wall of text moving at 100 mph), odds are they aren’t looking for a response. Banning outright tells the user to create a new account; a shadow ban lets them go on for hours without realizing that no one is listening.
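The mechanism itself is trivial; a minimal sketch (entirely hypothetical, not how Twitch chat is actually implemented) would look something like this:

```python
# Shadow-ban delivery: the author always sees their own message, everyone
# else sees it only if the author isn't flagged.
shadow_banned: set[str] = set()

def visible_to(author: str, viewer: str) -> bool:
    return author == viewer or author not in shadow_banned

def deliver(author: str, text: str, viewers: list[str]) -> dict[str, str]:
    """Return the message as each viewer would actually receive it."""
    return {viewer: text for viewer in viewers if visible_to(author, viewer)}
```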
1. Unify Bans
I like to think of this method as the nuclear option: it is probably the most effective on this list, while simultaneously being capable of causing untold destruction and widespread fallout. It requires collaboration among a group of people whose opinions and judgment can be trusted.
In short, a recipe for disaster.
How far you want to go with this depends on how much you really want to stomp down bad behavior. For instance, should Dreamhack share bans across all of its streams? Should Dreamhack partner with other associations to share bans? Would regular streamers have access to the ban list? Who decides who is added to the list?
It’s an open question, one that requires a lot of thought and planning, but an approach that could work.
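Purely as a sketch of the plumbing (the governance questions above are the hard part), a shared ban registry consulted by individual channels might look like this; every name here is hypothetical:

```python
# Hypothetical shared ban registry consulted by multiple channels/events.
shared_ban_list: set[str] = set()

class Channel:
    def __init__(self, name: str, honors_shared_list: bool = True):
        self.name = name
        self.honors_shared_list = honors_shared_list
        self.local_bans: set[str] = set()

    def is_banned(self, user: str) -> bool:
        return user in self.local_bans or (
            self.honors_shared_list and user in shared_ban_list)

    def ban(self, user: str, escalate: bool = False) -> None:
        self.local_bans.add(user)
        if escalate:  # only trusted moderators push bans to the shared list
            shared_ban_list.add(user)
```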
Can toxic behavior be controlled on Twitch? Let us know in the comments below.

Old School RuneScape’s Community Manager took to the game’s subreddit today to address a controversy that sprang up over a number of accounts being banned overnight for alleged harassment of a popular streamer. According to player allegations, people were banned for the simple act of standing near the streamer while performing emotes.
The story Jagex is telling is very different from the one being passed around among members of the community. According to Jagex, the players were banned for a string of racist and abusive messages, and only one person was permanently banned, which was for dressing up as a KKK member.
Recently, several players were banned for repeatedly harassing a number of people within the community. We want to make one thing clear in this post: the actions of these players were completely unacceptable and we will not tolerate racism or harassment in Old School RuneScape.
The post goes on to state that Jagex took action on players who “spent their time spreading racism, hatred, and abuse throughout the game.” Later in the same Reddit thread, Kemp explains that the bans were issued under “bug abuse” because of the ban length that category carries.
Jagex has declined to publish the transcript evidence of the players banned for abuse, both because it does not want to publicize the activity and out of fear that doing so would create a precedent obligating the company to release transcripts every time someone demands them.
MMO Fallout will update if we get any more details.
(Source: Reddit)
Rust is the open-world survival game by Facepunch Studios, makers of the popular sandbox game Garry’s Mod. As a game that relies heavily on player-versus-player combat with heavy ramifications for death, it should come as no surprise that cheating has become prevalent on its servers. To combat this, Rust employs its own anti-cheat system that bans players on a continual basis. We know this because the Rust Hack Report Twitter account is constantly being updated.
In the time it will take to publish this report, more than a dozen new bans will be reported through the account. Many of the accounts banned appear to be throwaway accounts (new accounts with only one game played) and others are repeat offenders with additional game bans and VAC bans from other titles. Some have tallied more than 100 hours on Rust in the past two weeks, some haven’t even bothered setting up a community page. A few are set to private, while one or two are normal accounts that seem to have large libraries and varied gameplay.
If you’d like to follow the Rust ban list, check out the Rust Hack Report Twitter.