YouTube is relying more on AI to flag videos that violate its community guidelines while most of its human reviewers remain out of the office.
COVID-19 has changed the workplace forever. Offices have realized that work-from-home is a viable option moving forward, and more and more companies have allowed flexible working arrangements. Among them is Google-owned YouTube.
Throughout the pandemic, YouTube has continued to clamp down on videos that violate its guidelines on pornography and misinformation. At the same time, YouTube offices around the world have had to adjust to their local situations.
The majority of office staff hasn't been allowed to return to work, so Google has had to rely on artificial intelligence to enforce its rules. The approach has its pros and cons, but Google maintains that it needs to uphold certain standards to keep the site free of violating content.
YouTube takes down nearly 11.4m videos
In a blog post yesterday, YouTube explained why it had to take down almost 11.4 million videos in recent months. The number is a record high, but this isn't largely YouTube's fault: the company admits it is having a hard time conducting human reviews of uploaded videos.
As such, it has had to lean more heavily on automated video screening. The challenge was choosing between looser policing and stringent enforcement of its policies. Ultimately, YouTube decided on the latter. It said,
“For certain sensitive policy areas, such as violent extremism and child safety, we accepted a lower level of accuracy to make sure that we were removing as many pieces of violative content as possible.”
YouTube is the world’s largest video upload and streaming site, so it has a huge obligation to its users and to nations around the world. It reinforced this by saying,
“Through these challenging times, our commitment to responsibility remains steadfast.”
What about those wrongfully deleted?
As a result of the strict policing, YouTube admits that some videos may have been wrongfully taken down. To remedy this, creators simply have to appeal to have their videos reinstated. As the company relies more on automated reviews, it has reallocated more staff to act on appeals.
YouTube has also relaxed its policy on strikes. Videos removed by the automated review system will not earn a strike; instead, they will go through another stage of review to determine whether a strike is warranted. YouTube commits to the following,
“We are continuing to improve the accuracy of our systems and, as reviewers are able to come back to work, we are deploying them to the highest impact areas. We’ll continue to regularly update the community on our progress.”
Image courtesy of BigTunaOnline/Shutterstock