Statistics for YouTube community guidelines enforcement are now available for the period April to June 2022, via Google’s Transparency Report. YouTube channels are terminated if they accrue three community guideline strikes in 90 days, commit a single case of severe abuse (predatory behaviour, for example), or are determined to be wholly dedicated to violating YouTube’s community guidelines.
The guidelines are pretty much what you’d expect. Separate categories exist for spam and deceptive practices; sensitive content; violent / dangerous content; regulated goods; and, just recently, misinformation, which covers things like election misinformation, COVID-19 medical misinformation, and vaccine misinformation.
The latest report appears to be the first time YouTube has offered a glimpse at misinformation takedowns, and it makes for interesting reading.
Running the numbers
Of the almost four million channels removed in Q2 2022, 89 percent were removed under “Spam, misleading, and scams”. Within this tally, 122,000 videos (not channels) violated misinformation policies of one form or another.
On a similar note, by far the biggest source of video comment removal is, once again, “Spam, misleading, and scams”. This time around, we’re talking a staggering 458,784,993 comments, which accounts for 61 percent of overall removals, with “Harassment and cyberbullying” weighing in a distant second at 130,998,769 (17 percent).
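As a back-of-the-envelope check, the two comment counts and the 61 percent figure quoted above let you estimate the overall removal tally. A quick sketch (the implied total is derived from the reported numbers, not a figure YouTube publishes directly):

```python
# Per-category comment removal counts, as quoted from the Q2 2022 report.
spam_comments = 458_784_993        # "Spam, misleading, and scams"
harassment_comments = 130_998_769  # "Harassment and cyberbullying"

# If spam accounts for 61% of all comment removals, the implied overall
# total is roughly 752 million comments (an estimate, not a reported figure).
implied_total = spam_comments / 0.61

harassment_share = harassment_comments / implied_total

print(f"Implied total comment removals: {implied_total:,.0f}")
print(f"Harassment share: {harassment_share:.0%}")  # ≈ 17%, matching the report
```

The harassment share working out to roughly 17 percent is a useful sanity check that the two reported percentages are consistent with the raw counts.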
Misinformation has been a major issue for many years online. Everything from conflict to COVID has been caught up at some point, often with potentially serious results. It’s not so long ago that the word “Infodemic” was bouncing around. Elsewhere, deepfakes have tried (badly) to interfere with the invasion of Ukraine and US elections.
Delayed virality and rapid shutdowns
The biggest reason for videos being removed was child safety, clocking 1,383,028 removals (31 percent of the overall tally). While misinformation is clearly a small chunk of YouTube’s video removals, it’s nevertheless good to see the video giant doing something about it.
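To put “small chunk” in rough numbers: the child safety count and its 31 percent share imply an overall video removal tally, against which the 122,000 misinformation takedowns can be compared. A hedged sketch (the implied total is an approximation derived from the reported figures):

```python
# Figures quoted from the Q2 2022 transparency report.
child_safety_removals = 1_383_028  # stated to be 31% of all video removals
misinfo_removals = 122_000         # videos violating misinformation policies

# Implied overall video removal total, roughly 4.46 million (an estimate).
implied_total_videos = child_safety_removals / 0.31

misinfo_share = misinfo_removals / implied_total_videos

print(f"Implied total video removals: {implied_total_videos:,.0f}")
print(f"Misinformation share: {misinfo_share:.1%}")  # ≈ 2.7%
```

In other words, misinformation takedowns amount to only a few percent of video removals overall.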
One welcome statistic is how quickly YouTube is able to shut down bogus videos before anyone even sees the content. Where misinformation is concerned, this is vital to prevent mistruths and other nonsense going viral. Some 32 percent of the content it removed had precisely zero views, with 37 percent of videos receiving between one and ten views. The remaining 31 percent had received more than ten views, and the report does not appear to offer any more detail beyond that simple statistic. “More than 10 views” could obviously be pretty much anything.
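Taken together, those buckets mean roughly two thirds of misinformation removals happen before a video reaches double-digit views. A quick sketch using the percentages from the report:

```python
# View counts at time of removal for misinformation videos (percent),
# as reported for Q2 2022.
removed_by_views = {"0 views": 32, "1-10 views": 37, ">10 views": 31}

# Sanity check: the three buckets cover the whole tally.
assert sum(removed_by_views.values()) == 100

# Share removed at ten views or fewer, i.e. before any real spread.
caught_early = removed_by_views["0 views"] + removed_by_views["1-10 views"]
print(f"Removed at ten views or fewer: {caught_early}%")  # 69%
```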
Sizing up the problem
We are very much in a time where infodemics (there’s that word again!) are a thing. Indeed, misinformation can easily become harmful or outright malicious. Social media echo chambers merely exacerbate the problem. We’ve reached the point where people are developing AI tools that fact-check their own work.
This is clearly a huge problem which won’t be solved anytime soon, but technology giants opening up the door on their largely quiet work on misinformation may help to slowly point us in the right direction.