YouTuber Jared “ProJared” Knabenbauer has posted a video denying allegations from three months ago that he exchanged nude photos with two underage fans. Knabenbauer acknowledged that he did exchange nude photos with other fans and characterized those exchanges as “consensual,” although he conceded there was a “power imbalance” between himself and his fans.
Discord, the only way young people communicate via voice after we collectively agreed to stop answering phone calls several years ago, is trying to be more open about how it handles harassment, threats, doxxing, and other forms of abuse on its platform. In order to do this, it plans to release regular “transparency reports,” the first of which is available to peruse now.
The idea, says Discord, is to keep people clued into the decision-making underlying the 250-million-user chat service so that people can understand why things do and, in some cases, don’t get done.
“We believe that publishing a Transparency Report is an important part of our accountability to you—it’s kind of like a city’s crime statistics,” said the company. “We made it our goal not only to provide the numbers for you to see, but also to walk you through some obstacles that we face every day so that you can understand why we make the decisions we do and what the results of those decisions are.”
The numbers are fascinating, albeit not wholly unexpected if you display symptoms of being Too Online. First, a graph of reports received by Discord from the start of the year until April:
“Other” is the most reported category, but harassment comes in behind it at 17.3 percent. Significant pieces of the multi-colored pie also go to hacks/cheats, threatening behavior, NSFW content, exploitative content, doxxing, spamming, raiding, malware, and self-harm. Discord does not mince words in describing what these categories refer to. Exploitative content is defined as “A user discovers their intimate photos are being shared without their consent; two minors’ flirting leads to trading intimate images with each other” while one example of NSFW content is “A user joins a server and proceeds to DM bloody and graphic violent images (gore) to other server members.”
After Discord receives a report, the company’s trust and safety team “acts as detectives, looking through the available evidence and gathering as much information as possible.” Initially, they focus on reported messages, but investigations can expand into entire servers “dedicated to bad behavior” or historical patterns of rule-breaking. “We spend a lot of time here because we believe the context in which something is posted is important and can change the meaning entirely (like whether something’s said in jest, or is just plain harassment),” wrote Discord.
If there’s a violation, the team then takes action, which can mean anything from removing the content in question to removing a whole server from Discord. However, the percentage of reports that caused Discord to spring into action earlier this year was relatively small. Just 12.49 percent of harassment reports got actioned, for example. Other categories saw Discord intervene more often, but most percentages were still relatively small: 33.17 percent for threatening behavior, 6.86 percent for self-harm, 44.34 percent for cheats/hacks, 41.72 percent for doxxing, 60.15 percent for malware, 14.74 percent for exploitative content, 58.09 percent for NSFW content, 28.72 percent for raiding, and the big outlier, 95.09 percent for spam.
Discord explained that action percentages are lower than you might expect because many reports simply don’t pass muster. Some are false or malicious, with people taking words out of context or banding together to report innocent users. Others demand too much for too little. “We may receive a harassment report about a user who said to another user, ‘I hate your dog,’ and the reporter wants the other user banned,” said the company. Other reports that Discord doesn’t action might include information, but no concrete evidence. Lastly, users sometimes apply the wrong category to reports; Discord says it still actions those, though they may not count toward action percentages.
From January to April, the biggest contributors to bans were spam and exploitative content. Spam accounted for 89 percent of account bans—a total of 119,244 accounts. On the exploitative content side of things, Discord banned 10,642 accounts, and it says it’s doing its best to squelch that issue—perhaps due to its well-documented troubles with child porn in the past.
“We’ve been spending significant resources on proactively handling Exploitative Content, which encompasses non-consensual pornography (revenge porn/’deep fakes’) and any content that falls under child safety (which includes child sexual abuse material, lolicon, minors accessing inappropriate material, and more),” the company wrote. “We think that it is important to take a very strong stance against this content, and while only some four thousand reports of this behavior were made to us, we took action on tens of thousands of users and thousands of servers that were involved in some form of this activity.”
Discord also removed thousands of servers during the first few months of the year, mostly focusing on hacks/cheats. However, servers focused on hate speech, harassment, and “dangerous ideologies”—again, an area where Discord has struggled in the past—are also a big focus. On a related note, the company also discussed its response to videos and memes born of the March 14 Christchurch shooting.
“At first, our primary goal was removing the graphic video as quickly as possible, wherever users may have shared it,” the company said. “Although we received fewer reports about the video in the following hours, we saw an increase in reports of ‘affiliated content.’ We took aggressive action to remove users glorifying the attack and impersonating the shooter; we took action on servers dedicated to dissecting the shooter’s manifesto, servers in support of the shooter’s agenda, and even memes that were made and distributed around the shooting… Over the course of the first ten days after this horrific event, we received a few hundred reports about content related to the shooting and issued 1,397 account bans alongside 159 server removals for related violations.”
Discord closed out the report by saying it believes that this sort of transparency should be “a norm” among tech companies, so that people can “better determine how platforms keep their users safe.” The Anti-Defamation League, an organization dedicated to fighting bigotry and defending civil rights, agrees.
“Discord’s first transparency report is a meaningful step toward real tech platform transparency, which other platforms can learn from,” the organization wrote in a statement about Discord’s report. “We look forward to collaborating with them to further expand their transparency efforts, so that the public, government and civil society can better understand the nature and workings of the online platforms that are and will continue to shape our society.”
The developers of cutesy Animal Crossing–Pokemon mashup Ooblets just had a weekend from hell. After trying to preempt a tidal wave of rage over their newly announced Epic Games Store exclusivity, they got hit with a swirling tsunami of foaming-at-the-mouth anger, up to and including death threats and anti-Semitic hoaxes. This is the worst overreaction to an Epic deal that’s yet been publicized. It’s also part of a larger trend that the video game industry has let run rampant for far too long.
Today, Ooblets designer Ben Wasser published a lengthy Medium post about the harassment that he and his sole teammate at development studio Glumberland, programmer/artist Rebecca Cordingley, have been subjected to. In it, he discussed in detail what he’s only alluded to before, showing numerous screenshots of threatening, often racist and sexist abuse and pointing to coordinated efforts to storm the Ooblets Discord and propagate fabricated messages that made it look like Wasser said anti-Semitic things about gamers. In part, he blamed the tone of his tongue-in-cheek announcement post for this, saying that while it’s the tone the Ooblets team has been using to communicate with fans since day one, it was a “stupid miscalculation on my part.”
It is, in no uncertain terms, insane to expect that anyone might have to deal with a reaction like this because of some slight snark in a post about what is to them very good news. Actually, let’s just sit with that last point for a second: If you’re a fan of Ooblets, the Epic Store announcement is fantastic news; no, you don’t get to play it on Steam, and yes, the Epic Store is a weird, janky ghost town of a thing that’s improving at an alarmingly slow rate, but thanks to Epic’s funding, Ooblets and the studio making it are now guaranteed to survive. Thrive, even, thanks to additional staff and resources. You’ve got to download another (free) client to play it, but you get the best possible version of the game you were looking forward to, and its creators get to keep eating, which is something that I’ve heard keeps people alive.
And yet, in reaction to this, people went ballistic, just like they have so many times before. This is our default now. Every tiny pinprick slight is a powder keg. Developers may as well have lit matches taped to their fingers, because any perceived “wrong” move is enough to set off an explosive consumer revolt. And make no mistake, the people going after Ooblets were not fans, as evidenced by the fact that, according to Wasser, they didn’t even know how the game’s Patreon worked. Instead, they were self-described “consumers” and “potential customers” who felt like the game’s mere existence granted them some impossibly huge stake in its future. Wasser talked about this in his post:
“We’ve been told nonstop throughout this about how we must treat ‘consumers’ or ‘potential customers’ a certain way,” he said. “I understand the relationship people think they might be owed when they exchange money for goods or services, but the people using the terms consumers and potential customers here are doing so specifically because we’ve never actually sold them anything and don’t owe them anything at all… Whenever I’ve mentioned that we, as random people happening to be making a game, don’t owe these other random people anything, they become absolutely enraged. Some of the most apparently incendiary screenshots of things I’ve said are all along these lines.”
We need to face facts: This kind of mentality is a major force in video game culture. This is what a large number of people believe, and they use it as a justification to carry out sustained abuse and harassment. “When presented with the reality of the damage inflicted, we’ve seen countless people effectively say ‘you were asking for it,’” said Wasser. “According to that logic, anything anyone says that could rub someone the wrong way is cause for the internet to try to ruin their life. Either that, or our role as two people who had the nerve to make a video game made us valid targets in their minds.”
Things reached this deranged fever pitch, in part, because companies kowtowed to an increasingly caustic and abusive consumer culture, frequently chalking explosive overreactions up to “passion” and other ostensibly virtuous qualities. This culture, to be fair, is not always out of line (see: loot boxes, exploitative pricing from big publishers, and big companies generally behaving in questionable ways), but it frequently takes aim at individuals who have no actual power and contains people who are not opposed to using reprehensible mob tactics to achieve their goals—or just straight up deploying consumer-related concerns as an excuse to heap abuse on people and groups they hate. While the concerns, targets, and participants are not always the same, it’s hard to ignore that many of these mob tactics were pioneered and refined on places like 4chan and 8chan, and by movements like Gamergate—other pernicious elements that the gaming industry has widely failed to condemn (and has even engaged with, in some cases).
In the world of PC gaming, Valve is the biggest example of a company that utterly failed to keep its audience in check. Valve spent years lingering in the shadows, resolutely remaining hands-off until everything caught on fire and even the metaphorical “This is fine” dog could no longer ignore the writing on the wall. Or the company got sued. In this environment, PC gamers developed an oppositional relationship with game makers. Groups sprang up to police what they perceived as sketchy games—but, inevitably, they ended up going after perfectly legitimate developers, too. Users flooded forums when they were upset about changes to games or political stances or whatever else, with Valve leaving moderation to often-understaffed development teams instead of putting its foot down against abuse. Review bombs became a viable tactic to tank games’ sales, and for a time, any game that ran afoul of the larger PC gaming consumer culture saw its score reduced to oblivion, with users dropping bombs over everything from pricing decisions to women and trans people in games.
Smaller developers, utterly lacking in systemic or institutional support, were forced to respond to these attacks, granting them credibility. The tactics worked, so people kept using them, their cause justified by the overarching idea that many developers are “lazy” and disingenuous—when, in reality, game development is mind-bogglingly difficult and takes time. Recently, Valve has begun to take aim at some of these issues, but the damage is already done.
Whether out of ignorance or malice, Valve went on to fire the starting gun for this same audience to start giving Epic Store developers trouble. When publisher Deep Silver announced that Metro Exodus would be an Epic Store exclusive, Valve published a note on the game’s Steam store page calling the move “unfair.” Inevitably, Steam review bombs of previous games in the series followed, as did harassment of individual developers and even the author of the books on which the Metro video game series is based. Soon, this became a pattern when any relatively high-profile game headed toward Epic’s (at least temporarily) greener pastures.
That brings us to Ooblets. The game’s developers are facing astounding abuse over what is—in the grand scheme of life, or even just media platforms—a minor change of scenery. But they’re not backing down.
“I recognize that none of this post equates to an apology in any way that a lot of the mob is trying to obtain, and that’s by design,” Wasser wrote in his Medium post. “While some of what I’ve said was definitely bad for PR, I stand behind it. A portion of the gaming community is indeed horrendously toxic, entitled, immature, irrationally-angry, and prone to joining hate mobs over any inconsequential issue they can cook up. That was proven again through this entire experience. It was never my intention to alienate or antagonize anyone in our community who does not fit that description, and I hope that you can see my tone and pointed comments were not directed at you.”
And while Epic is, at the end of the day, an industry titan deserving of some of the scrutiny that gets hurled its way, it’s at least taking a stand instead of washing its hands of the situation like Valve and other big companies have for so long.
“The announcement of Ooblets highlighted a disturbing trend which is growing and undermining healthy public discourse, and that’s the coordinated and deliberate creation and promotion of false information, including fake screenshots, videos, and technical analysis, accompanied by harassment of partners, promotion of hateful themes, and intimidation of those with opposing views,” Epic said in a statement yesterday, concluding that it plans to “steadfastly support our partners throughout these challenges.”
So far, it seems like the company has been true to its word. “A lot of companies would’ve left us to deal with all of this on our own, but Epic has been by our side as our world has gone sideways,” said Wasser. “The fact that they care so much about a team and game as small as us proves to us that we made the right call in working with them, and we couldn’t be more thankful.”
That’s a step in the right direction, and hopefully one that other companies will follow. But the gaming industry has allowed this problem to grow and grow and grow over the course of many years, and it’s hard to see a future in which blowups like this don’t remain a regular occurrence. In his post, Wasser faced this sad reality.
“I hope that laying all this out helps in some way to lessen what pain is brought against whoever the next targets are, because we sadly know there will be many,” he said. “You should have opinions, disagree with things, make arguments, but don’t try to ruin people’s lives or jump on the bandwagon when it’s happening. What happened to us is the result of people forgetting their humanity for the sake of participating in video game drama. Please have a little perspective before letting your mild annoyance lead to deeply hurting a fellow human being.”
For the last 41 days, a Twitch user named Mosheddy has been persistently harassing a streamer named Tanya DePass. She quickly banned him from her channel, but now he lurks, follows her to other people’s Twitch chats, sends private messages to everyone he can, and has friends drop into chat with messages “from Mosheddy.” Twitch hasn’t banned Mosheddy, and DePass doesn’t know what to do.
Tanya DePass is a streamer, consultant, and director of a non-profit called I Need Diverse Games. DePass feels like she’s being stalked, she told Kotaku over the phone last week, and she can’t do anything about it because bans administered by Twitch streamers don’t prevent people from watching their channels or seeing who’s in chat. Whenever she streams, she can count on people in her community getting creepy DMs, either from a person who goes by the handle “Mosheddy” or his friends, some of whom have created accounts that specifically mention Mosheddy to taunt her. Other channels that DePass has hosted, raided using Twitch’s “raid” feature, or even just decided to watch on her own have also ended up being trolled by Mosheddy and his sympathizers.
This is all happening despite a suite of Twitch tools and a terms of service that should, in theory, enable streamers to curate their communities and experiences—and, most importantly, protect themselves from users they feel threatened or upset by. DePass’ troll has uncovered a series of easily exploitable loopholes. Sure, now that DePass has banned him, he can’t talk in her channel’s chat, but he can still follow it, see who’s in her community, DM them, and tag along any time DePass decides to hit up another channel.
DePass says it’s made her uncomfortable streaming and forced her to lock down a community she once envisioned as an oasis from social media’s endless, landmine-ridden desert.
“No matter what I did, he would follow along,” she said. “Now my Discord is locked down, everyone gets put into a welcome room, it’s subscriber-only Twitch chat, subscriber-only [recordings of previous Twitch streams]. I have to go back and make sure people don’t make weird clips… That’s not the community that I wanna build. That’s the antithesis of what I wanted out of Twitch.”
“It got to the point where streaming made me anxious,” she said. “It still makes me anxious.”
For the past month and change, DePass has been documenting Mosheddy’s near-daily intrusions into her community, posting screenshots of Mosheddy lurking in her channel and sending whispers (private messages) incessantly to her viewers, and his friends posting chat messages like “Merry Christmas from Mosheddy.” She’s reported Mosheddy and accounts associated with him to Twitch, and suggested others do the same. In the absence of results, she’s criticized Twitch for not doing more to prevent this sort of situation from arising. And while one of Mosheddy’s friends did get banned from Twitch entirely—seemingly for ban evasion via usage of alternate accounts—Mosheddy himself remains untouched. (Twitch generally declines to comment on specific users’ bans and did so in the case of this story.)
DePass is far from the only streamer who’s had to stave off Mosheddy’s incessant intrusions. If Tanya DePass mentions another channel, hosts someone else’s stream, or even watches another channel, Mosheddy follows—into people’s chats, DMs, and follower lists. Blocking and reporting doesn’t help.
“He whispered me on Twitch at some point, accusing me of ‘pulling the race card,’ and made comments about people watching and recording us,” DePass’ friend, streamer Brann “Prideceratops” Stalhjerte, told Kotaku in a DM, speaking about an incident that took place “a week or two” after DePass first banned Mosheddy. “On another date, users came into my own stream and began saying things like ‘Merry Christmas from Mosheddy,’ which had become sort of a popular action for them to take part in,” he said. “When they were banned, they reached out and harassed some of my mod team, who simply blocked and reported.”
Streamer AcidQueen described a similar incident, complete with the Christmas wishes. First, Mosheddy entered chat right after DePass started watching, greeting everyone with a troll-y message about how he was PewDiePie. When he got banned for that, he proceeded to DM one of AcidQueen’s moderation bots with a snide comment about DePass. “He launched one last barb at me (again, via my bot), claiming that ‘Overlord Tanya has ordered you to stop talking to me,’ when really I was just tired of his sad and pathetic self.”
Twitch partner Brandon “iamBrandon” Stennis was aware of DePass’ situation and decided, in conjunction with his moderators, to ban Mosheddy and related accounts before they could ever chat on his channel. However, on a recent occasion when DePass hosted Stennis’ channel, Mosheddy still showed up by proxy. “Tanya hosted my channel, and a few minutes after, a brand new name in my channel asked specifically, ‘Why is Mosheddy banned here?’” he said in an email. Mosheddy himself also DMed people similar messages throughout the stream.
DePass and Mosheddy both said that the incident that set all of this off was DePass banning Mosheddy from her Twitch stream on December 17. Mosheddy and a friend were in the Twitch chat asking DePass why she didn’t use the LGBTQIA tag on her stream. She replied that the tag can serve as something of a signal flare for hateful viewers. Mosheddy and the friend then pushed the issue, and ended up getting banned. DePass said it was for “being rude about the discussion around LGBTQIA tag and harassment” and arguing with her moderators.
Soon after, DePass said, Mosheddy and the friend followed her into another person’s stream and called her “racist,” “toxic,” and “a lunatic” for banning them. At this point, DePass started tweeting publicly about what was happening.
Reached for comment via DM by Kotaku, Mosheddy admitted that he has been DMing people in DePass’ orbit and joining their channels for more than a month. In his view, it’s all perfectly justifiable, although his justifications don’t hold much water.
“I did stop multiple times, but was invigorated to continue when reading her crusades on Twitter,” Mosheddy said. Even putting aside the fact that “stopping multiple times” is synonymous with “continuing,” it’s not on DePass to stop talking about trolling in the hopes of getting the troll to go away.
Mosheddy said he also kept up the messages to DePass and her friends because of the reactions of “random” strangers. “Reading comments from random people who would never know me or even want to hear my side of the story, saying they wanted to break my fingers,” he said. He also pointed out to Kotaku that he had not dropped any racial slurs or sexist comments to DePass, who is black, as if this was some gesture of graceful composure on his part: “And yet I was still able to keep my composure and not say anything racist or sexist towards her, which is apparently her justification for banning me,” he said.
Mosheddy cited many more questionable motivations throughout our conversation, including that on one hand he was on a very serious crusade to get justice for a friend who DePass had somehow gotten banned from a channel they were a regular in, but on the other, it was all for laughs.
“Every person I told the story to found it as funny as I did,” he said. “It’s not difficult to send a message with my name in it, get banned, and then follow afterwards.”
Does Twitch find it funny? Probably not, but it certainly hasn’t taken any action. As Mosheddy’s crusade to troll DePass and any of her friends or associates went on and on with no resolution, DePass took a step that few streamers are able to: While in San Francisco for business earlier this month, she went to Twitch’s office to discuss the matter in person with its Trust and Safety team. She came away from the meeting, she said, with more questions than answers.
“No one could really answer why this guy still had a channel,” said DePass. “He was just skirting the line of what’s actionable.” She thinks that if she was a bigger streamer, things might be different. “It’s not like I make Twitch thousands of dollars a month, because I’m sure if I was a white dude playing Fortnite, PUBG, or some other game making them money hand over fist, this wouldn’t have gone on for hours, let alone a month,” she said.
DePass worries that her community and mental health are fracturing, and is considering heading for less Twitch-purple pastures, like Mixer or YouTube, even if that means losing followers and subscribers. “I do love streaming,” she said, “but I don’t love anything enough—be that the work I do as a diversity advocate at a non-profit, streaming, whatever it is—to sacrifice my mental health and well-being.”
DePass wants to see Twitch take its tools and terms of service more seriously. Even though Facebook is a sputtering toilet explosion in countless other ways, she wishes others would learn from its block system. “Banning needs to be more than ‘You just can’t talk in a channel,’” she said. “I wish it could be like Facebook, where if you block someone on Facebook, they don’t exist anymore. They can’t find you, they can’t search for you, they can’t interact with you, even if you still have people in common. It’s like you don’t exist.”