As 2018 was ushered in, Logan Paul, a 22-year-old YouTube star with 15 million subscribers—many of them teens and younger—posted a video of a dead body hanging from a tree to his channel.
It was eventually removed (by Paul, not YouTube), and he issued two apologies this week in an effort to move past the outrage, but something about this felt different from other YouTube scandals. It demonstrated the chasm between content and creator, and the results of the raw impulse to document everything. To viewers, it was insensitive and sickening. To Paul, it was “a moment in YouTube history.”
YouTube isn’t experiencing growing pains. It has reached a point where it’s breeding cognitive dissonance and accountability is absent. Advertisers have pulled money after ads were shown next to extremist content; elsewhere, the LGBTQ community spoke out about YouTube apparently restricting content from certain channels as inappropriate. YouTube is attempting to trot out original content and live TV, with ad revenue to match. Its YouTube Red originals are reportedly seeing millions of views. But its house is not in order.
‘Please the algorithm’
A few months ago, I fell asleep to a YouTube video about aliens. It was one of those oddly automated clips categorizing the different “kinds” of aliens who have visited Earth. The tone was light, and the video featured cartoonish illustrations. When I awoke approximately 20 minutes later, YouTube’s algorithm had brought me to a channel that trades in videos about hidden symbols in entertainment and secret reptilian celebrities, as well as more nebulous videos about “hidden messages” in Pepe the Frog and repurposed news stories about Vladimir Putin. (The channel lists its country as Russia.)
It’s not surprising this content exists, but the swiftness with which I arrived there caught me off guard. YouTube has always leaned on its algorithms to “learn,” but they’ve taken over in more ominous ways.
While creator-first content and community-shaping personalities still prosper, cracks have started to show. Most troubling, reports revealed that there are thousands of videos that contain abusive content aimed at children or involving kids. Elsewhere, pedophiles were allegedly using YouTube to post illicit clips of children. (YouTube declined to comment for this article. Instead, we were pointed to two blog posts by CEO Susan Wojcicki about combating harassment and abuse in the community.)
In a BuzzFeed report from December, it was revealed that YouTube channels depicting child abuse or endangerment under the guise of “family” entertainment were lucrative. In late 2017, the popular channel Toy Freaks, which features Greg Chism and his two daughters engaged in distressing activities, was shut down and Chism was investigated though not charged. Elsewhere, the weird subgenre of superhero videos that is targeted at children but contains very adult (and often violent) topics was flagged by rapper B.o.B., among others. Reached for comment on those videos in July, a YouTube rep said: “We’re always looking to improve the YouTube experience for all our users and that includes ensuring that our platform remains an open place for self-expression and communication. We understand that what offends one person may be viewed differently by another.”
In that same BuzzFeed report, a father who runs a “family” channel and had his account demonetized because of questionable videos offered a telling quote: “In terms of contact and relationship with YouTube, honestly, the algorithm is the thing we had a relationship with since the beginning. That’s what got us out there and popular. We learned to fuel it and do whatever it took to please the algorithm.”
This is the beating heart of YouTube’s weirdness: it’s dishing out TV-style ad $$$ with precisely none of the responsibility or accountability of TV https://t.co/On8XhnU43y

— Tom Gara (@tomgara) November 22, 2017
To reverse this new kind of worship, YouTube ostensibly needs more humans and has said it’s hiring a team of 10,000 moderators to review offensive content. But, as proven by Facebook’s attempts at moderation, what humans see in the course of reviewing content has lasting effects. And the guidelines YouTube gives its human reviewers are apparently flawed: BuzzFeed recently interviewed people tasked with reviewing and rating videos, and found that it was often unclear what they were rating; YouTube’s guidelines put emphasis on high-quality videos, even if they were disturbing.
Free speech vs. hate speech
While Twitter has finally started purging hate accounts, YouTube’s scale and automation make that difficult. Fake reports and conspiracy videos about the Las Vegas shooting showed up at the top of search results in October. An August NYT report outlined the growing far-right presence on YouTube, one that’s young and “acquainting viewers with a more international message, attuned to a global resurgence of explicitly race-and-religion-based, blood-and-soil nationalism.”
Felix Kjellberg, aka PewDiePie, one of the more popular personalities on YouTube with more than 50 million subscribers and brand recognition, yelled the N-word during a gaming livestream and paid two men to hold up a sign that said “Death To All Jews.” The Wall Street Journal found nine videos on his channel that include Nazi imagery or anti-Semitic jokes. This led to his contract with Disney being terminated and his show Scare PewDiePie canceled. Many fans claimed this was a violation of free speech, but nearly a year later his subscriber count has not dwindled. (Kjellberg did not respond to a request for comment.) In a piece for BuzzFeed, Jacob Clifton argued that Kjellberg isn’t some lone monster; he’s just one “symptom of a larger illness” that’s sweeping online platforms.
In July, poet and musician Joshua Idehen tweeted a lengthy thread about the “YouTube thingy” and the state of gaming culture, racism, and harassment on YouTube, including PewDiePie’s stunt.
Hi, I’m a YouTube Thingy. I talk abt gaming, geek culture, the only two genders and Hitler’s good ideas. Feminism is cancer
— Joshua Idehen (@BeninCitizen) July 12, 2017
He told the Daily Dot he used to be an “edgelord,” a term for those people (and it’s typically men) who engage in provocative behavior and ideologies for laughs, but that he “only made it out with my humanity thanks to many, many good women.” Asked what YouTube should do to combat this sweeping tide, he offers: “Ban Nazis. Ban hate speech. Ban targeted harassment. Hire a dedicated moderation staff with all that world domination money.”
YouTuber Megan MacKay says “we’re getting a glimpse of the dark side of the democratization of content creation.”
“When anyone can pick up a camera and make a video without being beholden to anyone, we’re bound to eventually get content that crosses the uncrossable line,” she says. “I believe clarifying and enforcing the terms of service is really the only way major platforms can ensure the safety of their community while also stemming the flow of hate speech and alt-right garbage, but even when these corporations take a performative stand against this type of content, they seem to fail or waffle when it comes to actually doing something about it. I can’t speak to why exactly they still drop the ball, whether it’s fear of losing big-name users or struggles with scale, but it’s a major problem that only ever seems to be halfheartedly addressed.”
This was reflected in YouTube’s official response to the Logan Paul video on Tuesday, which falls in line with so many of its other reactions: A YouTube spokesperson said, “Our hearts go out to the family of the person featured in the video,” but beyond that, they just reiterated the Community Guidelines. It doesn’t offer any answers or solutions, and Paul will likely continue making money.
But YouTube has always banked on drama, and in the process has promoted fanbases that know no boundaries. In November, fans of Alissa Violet and Ricky Banks trashed a Cleveland bar online, and threatened innocent residents of the city, after the two were thrown out of the venue. Jake Paul, the younger brother of Logan Paul, is also wildly popular with children and teens, but people who don’t enjoy his sophomoric, braying brand of humor weren’t too happy about living next to his chaotic prank house. He was called out for being a racist after he joked that a fan from Kazakhstan might “blow someone up” and a bully after fellow YouTubers the Martinez twins accused him of abuse. (His “pranks” on the twins involved airhorns and leaf blowers.) After Hurricane Harvey, he showed up in San Antonio to “help” and his rabid fans generated even more chaos in a Walmart parking lot.
In a 2015 profile of Logan Paul, who got his start on Vine, he expressed his desire to move past “clean” content and expand his fanbase: “I want to be the biggest entertainer in the world. That’s my deal. I’ll do whatever it takes to get that.” Two years later, he posted a video of a dead body and YouTube’s response was basically the shrug emoji.
YouTube has built a massive global audience in its decade-plus of existence, one that hinges on self-starting. We’re discovering what happens when self-starting has no boundaries, when an open platform has to keep revising its Community Guidelines instead of ripping them up and starting over. In her December blog post, Wojcicki claimed that YouTube’s goal for 2018 “is to stay one step ahead of bad actors, making it harder for policy-violating content to surface or remain on YouTube.”
But what if some bad actors are your most popular creators? And is one step really enough?