YouTube faced backlash over another content moderation failure when videos of young children with exposed private parts began surfacing on the platform. Advertising from major brands appeared alongside this content, leading companies such as Nestle, Disney, and Epic Games (publisher of Fortnite) to pull their YouTube ads from the identified videos. The issue first came to light on Sunday, when Matt Watson, a video blogger, posted a 20-minute clip detailing how comments on YouTube were being used to flag videos in which young girls were engaged in activities that could be construed as sexually suggestive, such as posing in front of a mirror or doing gymnastics.
YouTube received heavy criticism from companies and individuals alike for recommending videos of minors and allowing pedophiles to comment on them, often with timestamps pointing to the exact moments in a video where a child's private parts were exposed. YouTube was also condemned for monetizing these videos, allowing advertisements for major brands such as Alfa Romeo, Fiat, Fortnite, Grammarly, L'Oreal, Maybelline, Metro: Exodus, Peloton, and SingleMuslims.com to be displayed alongside them.
Companies pull ads from YouTube
Following this news, a large number of companies pulled their advertising spending from YouTube. Grammarly told Wired, “We’re absolutely horrified and have reached out to YouTube to rectify this immediately, we have a strict policy against advertising alongside harmful or offensive content. We would never knowingly associate ourselves with channels like this.”
A spokesperson for Fortnite publisher Epic Games told Wired that it had paused all pre-roll advertising on YouTube. “Through our advertising agency, we have reached out to YouTube to determine actions they’ll take to eliminate this type of content from their service,” the spokesperson added.
Disney and Nestle have also paused advertising on YouTube.
Responding to these accusations, a YouTube spokesperson said in an email, “Any content – including comments – that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube. We took immediate action by deleting accounts and channels, reporting illegal activity to authorities and disabling violative comments.”
People on Twitter have strongly condemned YouTube’s handling of the issue.
This video is distressing, and it should be. YouTuber "mattswhatitis" uncovered the full extent of pedophilia on @YouTube, and how pedophiles timestamp inappropriate moments, share links and details to child porn. https://t.co/1WJI0lMhZo
— 𝕋𝕙𝕖 𝔾𝕠𝕤𝕤𝕚𝕡 𝔾𝕒𝕣𝕕𝕖𝕟 (@gossip_garden) February 18, 2019
Looks like Youtube has become a platform of choice for softcore child porn fans, and has been for a while: https://t.co/oGRIEOREmu #youtubewakeup?
— Tadeusz Sośnierz (@tsosnierz) February 18, 2019
You fucking kidding me @YouTube? You’ve been facilitating soft core child porn on your platform & running ads on the videos! A firearms instructor can make a vid of safe gun maintenance & you shut it down immediately. But THIS! Fuck you, you sacks of shit. #youtubewakeup
— Justin (@justin_ksu) February 21, 2019
@TeamYouTube are you guys gonna do anything about the disgusting child porn on your site? We know you dont care about the kids but plenty of them are monetized. #youtubewakeup https://t.co/TYTNZFLxwb
— rep_turd (@rep_turd) February 19, 2019
YouTube also recently introduced a new strikes system intended to make enforcement of its community guidelines more transparent and consistent. The update brings more opportunities for creators to understand YouTube’s policies, a consistent penalty for each strike, and better notifications. Last month, YouTube announced a change to its recommendation system aimed at reducing recommendations of videos that promote misinformation and conspiracy theories.