Originally published on September 25, 2017
As someone who follows YouTube drama, I'd always felt like "it would never happen to me." I'm not a big YouTuber by any means. I watched the Adpocalypse unfold and heard about videos being demonetized and taken down left, right, and center, hitting the big creators on the platform and threatening their livelihoods.
Then, last Sunday, I uploaded a video I had spent six hours the night before finishing: a Game of Thrones edit I was incredibly proud of. Within half an hour, I went to check whether it had gotten any views and found it had disappeared. It had violated YouTube's Community Guidelines (YouTube never said which one specifically, but I guessed it was the "violence" one) and had been removed from the site. I had one strike against my channel in the Community Guidelines section, though it would expire in three months. From following other channels' troubles, I knew I could appeal, so I did. Last Thursday the video was back up.
From my research, I know it is often an algorithm that initially takes down videos. There's no way humans could review the thousands of minutes of video uploaded to YouTube every minute. But no algorithm is perfect, and far too many videos have been dinged for things taken out of context. A robot doesn't know the difference between comedy and reality, fantasy and journalism.
I understand why advertisers started getting snippy. No one at Wal-Mart wants a Wal-Mart ad playing next to footage of a journalist being beheaded by the Taliban. Obviously that does not mean Wal-Mart supports the Taliban, but that juxtaposition is what advertisers fear. So YouTube lost many sponsors and was pressured to crack down on content, tightening its Community Guidelines.
Is YouTube getting too sensitive?
YouTube is a company, and companies need policies that partners must follow to stay on the platform. If a partner violates those policies, they will no longer be a partner. And before I hear the "free speech" argument: even though YouTube is based in the United States, it is a worldwide company with contributors from around the world. There is a line.
THAT BEING SAID, there is a big difference between fantasy violence knowingly consumed by the viewer and very real journalism from our very real world, journalism that needs to spread to raise awareness of events that would otherwise be just words and numbers to us.
Some of the Guidelines make sense. Threats, hateful content, and spam should be stopped in their tracks. But others make less sense. Language, violence, and dangerous content appear in movies and TV shows, and people have the choice to watch them. At the very least, I would recommend YouTube do what Facebook has done and put a warning before a video: "this video has graphic content; would you like to proceed?" Age-gating could work to an extent as well, if necessary.
I will be watching even more carefully in the future to see what YouTube comes up with next. Will there be a solution, or will a new platform be ready to take its place?