YouTube is now a network on which individuals can carve out a career from scratch.
Game playthroughs, make-up tutorials, general rants, reviews, how-to guides, and satire all account for millions of videos uploaded to YouTube, some of which can generate a substantial revenue stream for content creators.
However, the Google-owned platform has made a number of changes in recent years to make the website more family-friendly and more appealing to advertisers, putting the company on a direct collision course with some content uploaders.
When YouTube opened its doors over five years ago to anyone wishing to make money on the platform by embedding adverts in video footage, the popularity of the platform grew -- but as users uploaded content that was the intellectual property of others and opened the door to court cases, YouTube began tightening up the rules.
Users can no longer monetize unless they reach 10,000 views on their channel, and only last year, a number of high-profile YouTubers exited the platform after being told their content -- including "vulgar language," "violence," and "controversial or sensitive subjects and events" -- was not "advertiser-friendly."
The rules had not actually changed; rather, YouTube had tweaked the notification system it uses to warn content creators when their content cannot be used by advertisers and therefore will not bring in any revenue.
However, many found the rule changes -- now suddenly brought to the forefront -- stifling.
With household names including AT&T and Verizon pulling out of YouTube over their advertisements being linked to "hate" videos in March, YouTube has decided to take further action in order to retain remaining advertisers -- but content creators are once again left unhappy.
On Thursday, YouTube said in a blog post that, after a number of discussions with advertisers, the company will give them greater control over where their adverts are placed. To make content creators aware of what can -- and cannot -- make them money, YouTube has also updated its overall guidelines once more.
"We've heard loud and clear from the creator community and from advertisers that YouTube needs to broaden our advertiser-friendly guidelines around a few additional types of content," YouTube says. "While it's not possible for us to cover every video scenario, we hope this additional information will provide you with more insight into the types of content that brands have told us they don't want to advertise against and help you to make more informed content decisions."
The new guidelines are below:
- Hateful content: Content that promotes discrimination or disparages or humiliates an individual or group of people on the basis of the individual's or group's race, ethnicity, or ethnic origin, nationality, religion, disability, age, veteran status, sexual orientation, gender identity, or other characteristic associated with systematic discrimination or marginalization.
- Inappropriate use of family entertainment characters: Content that depicts family entertainment characters engaged in violent, sexual, vile, or otherwise inappropriate behavior, even if done for comedic or satirical purposes.
- Incendiary and demeaning content: Content that is gratuitously incendiary, inflammatory, or demeaning. For example, video content that uses gratuitously disrespectful language that shames or insults an individual or group.
To be clear, the videos are still allowed on YouTube; they simply will not generate any income. To some, this may be considered a form of censorship, as the firm is hitting content creators where it hurts most -- the bank balance.
YouTube may be attempting to clean up its act and keep the website family-friendly, but not every quarter approves of the changes.
Some YouTube subscribers and content creators met the news with dismay, leveling heavy criticism at the company in the blog post's comment section.
One member of the community dubbed the move "YouTube's suicide note," and others raised concerns over how the new guidelines appear to censor users by making them think twice about what content to post.
In addition, gaming was called into question. As so many games include adult themes, violence, and inflammatory content -- in whatever context -- this could mean that video content referring to them, such as walkthroughs or guides, may no longer be monetized.
"This used to be about us as creators, but now it's all about business," user Anime k22x commented. "Why are you destroying us?"
Considering how many gamers make their living on YouTube -- such as PopularMMOs, DanTDM, and Markiplier, who account for over 50 million subscribers between them and millions of dollars in revenue -- unless YouTube clarifies what counts as "hateful," "incendiary," and "demeaning," they, along with thousands of other game-related content creators, could find their videos de-monetized.
"Broadcast Yourself but based on these guidelines that restrict you from doing so," commenter Tom Clinton said.
As another commenter noted, the question of context has begun to cause great concern among some in the YouTube community.
"Context around many words is incredibly important and needs to be addressed," says Captain Sauce. "Being a YouTuber means you need a thesaurus to make a title now since 'killing a Goomba' in Super Mario and 'killing a police officer' in real life are both flagged because the word 'killing'."
"Death, Stab, Kiss, Sexy, Fight, Shoot etc. are all monetization killers regardless if the video is depicting a gruesome murder or a family friendly video game. If you're going to heavily invest in YouTube gaming then this is a hugely important fix, as both gamers and developers are being negatively impacted," the commenter added.
There needs to be a balance between keeping advertisers and content creators happy for the business to stay afloat.
The best thing YouTube can do in this circumstance is to pay closer attention to what content creators are saying and spend slightly less time appeasing advertisers -- making sure not only that ad-friendly topics are straightened out so they don't affect other topics en masse, but also clearly stipulating just what these rule changes will mean for content creators.
Otherwise, YouTube may find less and less content to monetize in the future.
I doubt many would mind a racist, sexist, hate-filled rant by someone screaming in front of their smartphone camera not making money. But if these broad rules also impact a gamer who happens to swear at a jumpscare or includes a scene from a game which shows a violent fight, the line into needless censorship and restriction may, indeed, be crossed.
ZDNet has reached out to YouTube and will update if we hear back.