“We don’t want to be knee-jerk”: YouTube responds to Vox on its harassment policies

A conservative commentator has been harassing a Vox journalist for a long time. YouTube still isn’t quite sure what to do about it.

Big Tech is facing a moment of reckoning. YouTube’s is coming, due in no small part to Vox.

Vox journalist Carlos Maza’s decision to speak out about harassment he’s been facing for years on YouTube has resulted in a highly publicized battle between the two media companies and broadened the public debate about how YouTube polices what is and isn’t allowed on its platform. At the Code Conference Monday, YouTube CEO Susan Wojcicki said the company is sorry to those hurt by the situation — but is standing by its actions.

“When we look at harassment there are a number of things we look at. First, we look at the context,” she said. “Was this video a one-hour political video that had, say, a racial slur in it? There are different kinds of videos. We looked at the context, and that’s really important. We also looked to see if it’s a public figure and if it’s malicious with the intent of harassment. Right now, malicious is a high bar for us.”

Here’s what’s going on: This all started in late May, when Carlos Maza, a Vox writer and host of the video series Strikethrough, tweeted a supercut of clips showing how conservative YouTube host Steven Crowder has harassed him for two years on YouTube with basically no consequences, including using racist and homophobic slurs in YouTube videos about Maza. Last fall, Maza was inundated with text messages calling for him to debate Crowder, and he has faced severe harassment online.

He called for YouTube to take action, noting that while YouTube had made a publicity-driven show of its support for the LGBTQ community during Pride Month, it wasn’t protecting its LGBTQ creators, including him. Multiple Vox journalists, including me, backed him up on it.

YouTube’s handling of the situation has been inconsistent. It initially said Crowder hadn’t violated its policies, only to later announce that it had decided to suspend his channel’s monetization because “a pattern of egregious actions has harmed the broader community and is against our YouTube Partner Program policies.”

In the middle of this, YouTube also announced that it would ban content that promotes white supremacist views on its platform (think neo-Nazis) and content that denies the history of well-documented violent events such as the Holocaust and the Sandy Hook Elementary School shooting. Some people have conflated those policy changes with what’s going on with Maza and Vox, but they’re two different things.

Vox editor-in-chief Lauren Williams and head of video Joe Posner last week wrote an open letter to Wojcicki calling on her to clarify and enforce the platform’s harassment policy. YouTube has not responded to the letter, and a company spokesperson did not respond to a request for comment.

The timing of the back-and-forth is especially interesting because Wojcicki has for weeks been scheduled to speak with Recode’s Peter Kafka at the Code Conference in Scottsdale, Arizona, on Monday. (Recode is part of Vox.) Wojcicki told Kafka that the company could probably have handled the situation better but largely stood by its actions. “We don’t want to be knee-jerk,” she said.

YouTube has for years faced questions about its content moderation practices and how it decides what sort of speech is and isn’t allowed on its platform. What’s more, YouTube seemingly profits from its algorithms funneling users toward extreme and inflammatory content.

It’s a very public culmination of social media companies’ struggles to define and enforce their policies and find a balance between promoting diverse voices, protecting against harassment and hate speech, and deciding what they’re willing to let slide in the pursuit of capitalistic motives.

Steven Crowder has been harassing Carlos Maza for a long time. YouTube still isn’t sure what to do about it.

Crowder is a conservative comedian and commentator who hosts Louder With Crowder, a political show that airs on the conservative network Blaze TV. Maza hosts the Vox YouTube series Strikethrough, which covers media in the era of Trump. He says that Crowder has been going after him for years — and has the receipts to prove it.

Maza first talked about the harassment he’s faced from Crowder on Twitter on May 30; he also published a video showing Crowder describing him as, among other things, a “lispy queer” and a “gay Mexican.” (Maza is Cuban American.)

Maza also noted that people had pointed out to him that in some of the clips, Crowder was wearing a “Socialism Is for F*gs” shirt, which he also sells.

On June 4, YouTube responded to Maza, saying that it takes allegations of harassment “very seriously” but that it ultimately decided Crowder hadn’t violated its policies, even though his language was “clearly hurtful.”

“As an open platform, it’s crucial for us to allow everyone — from creators to journalists to late-night TV hosts — to express their opinions w/in the scope of our policies,” the company wrote. “Opinions can be deeply offensive, but if they don’t violate our policies, they’ll remain on our site.”

On June 5, YouTube announced it would ban white supremacist content and content that denies well-documented violent events. But it’s still allowing videos it classifies as “borderline,” which isn’t clearly defined, as well as a host of other problematic content. Many people — some in good faith, some not — conflated this announcement with Maza’s harassment complaint, but they’re not really related. The issues with Crowder’s videos were never about white supremacism, but rather, Maza said, violations of YouTube’s existing policies.

Also on June 5, YouTube replied to Maza yet again and said it would demonetize Crowder’s channel. That means his videos won’t be eligible for ads through YouTube’s AdSense network, and it could also mean the channel’s content won’t be recommended by the site. It’s a sort of middle ground for YouTube: a way to make Crowder’s channel less lucrative for him without taking him off the platform altogether. All the while, Crowder has continued to criticize Maza and Vox, and his supporters have continued to harass Maza. Some of Crowder’s backers even started to sell shirts directly targeting Maza.

YouTube told Vox’s Aja Romano that it found that Crowder didn’t directly incite his followers to attack Maza, despite the harassment he is clearly facing. Per Romano:

According to YouTube, the platform considers the context of all criticism when reviewing harassment claims — that is, it scrutinizes whether the criticism is coupled with a larger debate or whether it’s intended mainly to target an individual.

In Crowder’s case, YouTube decided that since Crowder’s main goal was ostensibly to respond to Maza’s opinions on various contemporary issues, as expressed in Maza’s Strikethrough videos, his videos were not instances of hate speech; instead, they qualified as analysis.

Vox, activists, and journalists are asking YouTube to take a look at its policies. Some conservatives are rallying behind Crowder.

The debacle has generated an enormous amount of media attention and an intense conversation about YouTube’s content policies, its commitment to protecting LGBTQ creators, and what speech should and shouldn’t be allowed online.

BuzzFeed News reported that within Google, which owns YouTube, LGBTQ employees had begun to circulate a petition asking management to remove Pride branding from its social media accounts. Wojcicki told Recode’s Kafka on Monday that she is “sorry” and recognizes that “the decision we made was hurtful to the LGBTQ community.”

Crowder, meanwhile, began pushing the hashtag #VoxAdpocalypse, and he and his supporters began to claim that this was some sort of big media conspiracy by Vox Media and NBCUniversal, an investor in Vox, to silence him. Even Sen. Ted Cruz (R-TX) came to Crowder’s defense and criticized Maza on Twitter.

Because other creators were seeing their content censored or banned under YouTube’s new community guidelines on white supremacist and violent-event-denying content, some people conflated those guidelines with YouTube’s separate response to Maza’s harassment complaint.

Then on June 7, Williams and Posner published their open letter to Wojcicki in which they criticized her and YouTube for not taking action on the harassment Maza has been facing at Crowder’s encouragement.

“To Carlos, us, and many of your creators and users, this behavior is in clear violation of your company’s community guidelines,” Williams and Posner wrote.

They later added: “To YouTube, however, Crowder’s behavior — while worthy of demonetization — is not in violation of these policies, as long as the offending language is not ‘the primary purpose’ of a video. If the repeated harassment in these videos doesn’t cross the line by YouTube’s standards, then your line needs to be moved.”

Many observers criticized YouTube for its mixed messaging on the matter — it said Crowder’s actions didn’t violate its policies, but then after more complaints, it decided to demonetize him anyway. On Monday, Wojcicki said YouTube has a “higher standard” for monetization but acknowledged the company had been “too subtle” in explaining its reasoning there.

San Francisco Pride’s board has reportedly expressed concerns to Google about its handling of the situation, and activists have pushed the organization to drop Google from its annual march, set for the last weekend in June.

It’s true that content moderation is a complex topic. It’s also true that YouTube could probably do better.

YouTube, Facebook, and Twitter have all faced questions about how they police the content on their platforms and decide what does and doesn’t cross the line.

In March, for example, Facebook banned white nationalist and white separatist content on Facebook and Instagram. Twitter is reportedly researching how white supremacists and white nationalists use its service in an effort to decide whether to allow them on the platform. Motherboard, which has been tracking the issue at Twitter closely, reported in April that Twitter is hesitant to change its policies in part because some Republican politicians might be flagged by algorithms that identify and remove supremacist content.

As NPR notes, hundreds of hours of video are uploaded to YouTube every minute, and it’s really difficult for the platform to track all of that content. It uses algorithms and users to flag potentially inappropriate content, but the system, obviously, is imperfect. Kafka pushed Wojcicki on that point at Code and asked whether YouTube should have to okay content before it goes out. “I think we would lose a lot of voices,” she replied.

And more extreme, inflammatory content is often more engaging to users — and therefore better for business. YouTube’s algorithms often push users toward more extreme content. (The New York Times’s Kevin Roose over the weekend published a story about a young man who was led to the alt-right through YouTube.) So YouTube can demonetize Crowder’s videos, for example, but it has an economic incentive not to kick him off the platform entirely.

And even if the tech giant decided it was in its financial interest to deplatform a video creator in this one instance, it would likely face political backlash for such a decision. Many conservatives have long cast social media companies as biased against them and criticized them for allegedly infringing on their speech, even though there is no evidence of any coordinated effort to suppress conservative voices.

In 2016, Gizmodo reported that workers at Facebook routinely suppressed news stories of interest to conservative readers, citing a former Facebook journalist. Facebook CEO Mark Zuckerberg subsequently met with conservative leaders to discuss how the social network handles conservative content. President Trump and other Republicans last year seized on the narrative that Twitter was “shadowbanning” conservatives after a Vice News report that some Republican officials weren’t showing up in automatic search results. The president has also complained about his declining follower count, which is the result of Twitter’s broad efforts to clean up its platform. CEO Jack Dorsey met with Trump to try to explain that.

Social media companies, including YouTube and Google, say they want to promote different kinds of speech, but they are also private companies with the right to police their platforms as they choose. The First Amendment bars the government from restricting citizens’ speech; it places no such obligation on private companies. Internet platforms are also protected under Section 230 of the Communications Decency Act of 1996, which prevents them from being held liable for content posted by their users and gives them space to police their sites and restrict and take down material as they see fit.

Might YouTube be facing some tough choices about what it does and doesn’t want to allow on its platform, including when it comes to Maza, Crowder, and Vox? Sure.

But its bungled responses are evidence that its issues go far beyond this one instance.