
YouTube follows rigorous protocol to protect its community, says top official

Between October and December 2021, over 3.7 million videos were removed for violating YouTube’s community guidelines

Published: Tue 12 Apr 2022, 5:34 PM

Updated: Wed 13 Apr 2022, 2:27 PM


YouTube’s Community Guidelines clearly list what is, and is not, allowed on YouTube and YouTube Kids - Supplied

Being part of a dynamic, ever-evolving ecosystem means staying constantly vigilant about protecting users from threats.

This is a mission that YouTube takes very seriously, says Tarek Amin, director of YouTube MENA. “With so many incredible stories to share, and billions of people using YouTube to learn new skills and be entertained, YouTube’s top priority is to protect the community from harmful content.”

“We’re constantly thinking about how to make YouTube the best storytelling tool for everyone to share their stories,” he told Khaleej Times. “YouTube is a place of community, collaboration and commerce. Every minute, 500 hours of video are uploaded to YouTube, ranging from study tips to powerful stories about people in the Middle East, North Africa and beyond. Content creators are able to earn a living and build businesses on YouTube that contribute to the wider community.”

To protect the community, Amin shared how the platform follows the ‘4 Rs’ and strictly enforces its Community Guidelines. The 4 Rs are remove, raise, reward and reduce. First, content that violates YouTube’s policies is removed as quickly as possible, and authoritative, high-quality voices are raised, especially when people look for breaking news and information. YouTube also makes sure that trusted, eligible media partners, creators and artists are rewarded, and reduces the spread of what is referred to as borderline content, meaning content that comes close to, but does not violate, YouTube’s community guidelines.
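Read as a priority ordering, the framework lends itself to a simple decision rule. The Python sketch below is purely illustrative: the function, field names and ordering are assumptions made for this article, not YouTube’s actual code.

```python
# Illustrative only: a toy routing of content under the '4 Rs'.
# All flags and the ordering are assumptions, not YouTube's systems.
def apply_four_rs(content: dict) -> str:
    if content.get("violates_policy"):
        return "remove"   # taken down as quickly as possible
    if content.get("authoritative_source"):
        return "raise"    # surfaced for breaking news and information
    if content.get("borderline"):
        return "reduce"   # spread limited, though not removed
    if content.get("trusted_eligible_partner"):
        return "reward"   # eligible for monetisation
    return "no_action"

print(apply_four_rs({"borderline": True}))  # -> reduce
```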

In addition, YouTube’s Community Guidelines clearly list what is, and is not, allowed on YouTube and YouTube Kids. These policies address all types of content, including videos, comments, links, thumbnails and ads. Amin explained that they constantly evolve to adapt to new realities and to pre-empt emerging trends, such as the policies launched to address medical misinformation related to Covid-19.

“Content is flagged either by our automated systems or by users,” Amin pointed out. “Through a combination of trained teams made up of thousands of people from around the world and YouTube’s machine learning systems, content that has been flagged is reviewed. If the content violates YouTube’s guidelines, it is removed; content that may not be appropriate for all audiences is age-restricted; and content that is not found to be violative is left up.”
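In code terms, each flagged video ends in one of exactly three states. The following sketch is a hypothetical rendering of that triage, with invented names and inputs; the real review combines thousands of trained reviewers with machine learning systems.

```python
from enum import Enum

class ReviewOutcome(Enum):
    REMOVE = "removed"               # violates Community Guidelines
    AGE_RESTRICT = "age-restricted"  # not appropriate for all audiences
    LEAVE_UP = "left up"             # no violation found

def review_flagged_video(violates_guidelines: bool,
                         suitable_for_all_audiences: bool) -> ReviewOutcome:
    # Map a review verdict to one of the three outcomes Amin describes.
    if violates_guidelines:
        return ReviewOutcome.REMOVE
    if not suitable_for_all_audiences:
        return ReviewOutcome.AGE_RESTRICT
    return ReviewOutcome.LEAVE_UP

print(review_flagged_video(False, False))  # -> ReviewOutcome.AGE_RESTRICT
```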

Such a system helps ensure wider coverage and effectiveness: machines are better at finding content at scale, while human reviewers decide what should be removed based on YouTube’s guidelines and the context. Ads, meanwhile, are governed by a separate set of strict policies, which prohibit certain types of ads from running on the platform, especially those linking to sites that may be unsafe due to malware downloads, spam or adult content. If an advertiser fails to comply, YouTube immediately revokes their ability to run ads and reserves the right to disable their Google Ads account, barring them from advertising on its platforms altogether.
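That escalation could be sketched as follows, under the same caveat that the function and its flags are invented for illustration and bear no relation to Google Ads’ actual API.

```python
# Hypothetical escalation path for the ad-policy enforcement described;
# the function and its flags are assumptions, not Google Ads' API.
def enforce_ad_policy(links_to_unsafe_site: bool,
                      disable_account: bool) -> list:
    actions = []
    if links_to_unsafe_site:   # malware downloads, spam or adult content
        actions.append("revoke ability to run ads on the platform")
        if disable_account:    # a right YouTube reserves for itself
            actions.append("disable the advertiser's Google Ads account")
    return actions

print(enforce_ad_policy(True, True))
```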

According to YouTube’s latest quarterly Community Guidelines Enforcement report, between October and December 2021, over 3.7 million videos were removed for violating YouTube’s community guidelines. Out of these videos, 38 per cent had 1-10 views and almost 32 per cent had no views at all, which shows that YouTube’s machine learning systems took action against violative videos before their impact was felt.
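Taken together, those two figures imply that roughly 70 per cent of removed videos, about 2.59 million, were taken down before reaching more than 10 viewers. A quick back-of-the-envelope check:

```python
# Back-of-the-envelope arithmetic on the enforcement-report figures above.
total_removed = 3_700_000
share_1_to_10_views = 0.38
share_zero_views = 0.32          # "almost 32 per cent"

caught_early = total_removed * (share_1_to_10_views + share_zero_views)
print(f"~{caught_early:,.0f} of {total_removed:,} removed videos "
      f"({share_1_to_10_views + share_zero_views:.0%}) "
      "had 10 views or fewer")
# -> ~2,590,000 of 3,700,000 removed videos (70%) had 10 views or fewer
```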

When it comes to child safety, Amin revealed that YouTube accepts a lower level of accuracy when reviewing flagged videos, to make sure that as much violative content as possible is removed as quickly as possible. “This means that a higher amount of content that does not violate our policies is also removed. In the same time period, over 1.1 million videos were removed due to child safety concerns, including videos that could be potentially harmful to children, such as dares, challenges, or even innocently posted content that might be a target for predators.”
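In machine-learning terms, accepting lower accuracy to maximise removals is a recall-over-precision trade-off: lowering the decision threshold catches more violative videos at the cost of also removing some harmless ones. A toy illustration, with invented classifier scores:

```python
# Toy illustration of the recall-over-precision trade-off implied by
# accepting "a lower level of accuracy". The scores below are invented:
# probabilities that a video violates the child-safety policy.
violative = [0.95, 0.80, 0.65, 0.55]   # truly violative videos
benign = [0.70, 0.40, 0.20, 0.10]      # truly harmless videos

def recall_and_precision(threshold):
    true_pos = sum(s >= threshold for s in violative)
    false_pos = sum(s >= threshold for s in benign)
    return true_pos / len(violative), true_pos / (true_pos + false_pos)

for t in (0.75, 0.50):
    r, p = recall_and_precision(t)
    print(f"threshold {t}: recall {r:.0%}, precision {p:.0%}")
# threshold 0.75: recall 50%, precision 100%
# threshold 0.50: recall 100%, precision 80%
```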

Looking back, Amin said that YouTube’s policies have come a long way over the years, and that the world and people’s expectations are ever-changing. “YouTube’s policies continue to evolve, pre-empt and adapt in the face of new challenges and our community plays an important role by flagging content. Regardless of the circumstances, the commitment to protecting our community from harmful content on both YouTube and YouTube Kids is unwavering.”

rohma@khaleejtimes.com


