

Posted: 08 Aug 2017 21:16
An update on our commitment to fight terror content online
Tuesday, August 1, 2017

A little over a month ago, we told you about the four new steps we're taking
to combat terrorist content on YouTube: better detection and faster removal
driven by machine learning, more experts to alert us to content that needs
review, tougher standards for videos that are controversial but do not violate
our policies, and more work in the counter-terrorism space. 

We wanted to give you an update on these commitments: 

Better detection and faster removal driven by machine learning: We've always
used a mix of technology and human review to address the ever-changing
challenges around controversial content on YouTube. We recently began
developing and implementing cutting-edge machine learning technology designed
to help us identify and remove violent extremism and terrorism-related content
in a scalable way. We have started rolling out these tools and we are already
seeing some positive progress:

Speed and efficiency: Our machine learning systems are faster and more
effective than ever before. Over 75 percent of the videos we've removed for
violent extremism over the past month were taken down before receiving a
single human flag.

Accuracy: The accuracy of our systems has improved dramatically due to our
machine learning technology. While these tools aren't perfect, and aren't
right for every setting, in many cases our systems have proven more accurate
than humans at flagging videos that need to be removed.

Scale: With over 400 hours of content uploaded to YouTube every minute,
finding and taking action on violent extremist content poses a significant
challenge. But over the past month, our initial use of machine learning has
more than doubled both the number of videos we've removed for violent
extremism and the rate at which we've taken this kind of content down.

We are encouraged by these improvements, and will continue to develop our
technology in order to make even more progress. We are also hiring more people
to help review and enforce our policies, and will continue to invest in
technical resources to keep pace with these issues and address them
responsibly. 

More experts: Of course, our systems are only as good as the data they're
based on. Over the past weeks, we have begun working with more than 15
additional expert NGOs and institutions through our Trusted Flagger program,
including the Anti-Defamation League, the No Hate Speech Movement, and the
Institute for Strategic Dialogue. These organizations bring expert knowledge
of complex issues like hate speech, radicalization, and terrorism that will
help us better identify content that is being used to radicalize and recruit
extremists. We will also regularly consult these experts as we update our
policies to reflect new trends. And we'll continue to add more organizations
to our network of advisors over time. 

Tougher standards: We'll soon be applying tougher treatment to videos that
aren't illegal but have been flagged by users as potential violations of our
policies on hate speech and violent extremism. If we find that these videos
don't violate our policies but contain controversial religious or supremacist
content, they will be placed in a limited state. The videos will remain on
YouTube behind an interstitial, won't be recommended, won't be monetized, and
won't have key features including comments, suggested videos, and likes. We'll
begin to roll this new treatment out to videos on desktop versions of YouTube
in the coming weeks, and will bring it to mobile experiences soon thereafter.
These new approaches entail significant new internal tools and processes, and
will take time to fully implement. 

Early intervention and expanding counter-extremism work: We've started rolling
out features from Jigsaw's Redirect Method to YouTube. When people search for
sensitive keywords on YouTube, they will be redirected towards a playlist of
curated YouTube videos that directly confront and debunk violent extremist
messages. We also continue to amplify YouTube voices speaking out against hate
and radicalization through our YouTube Creators for Change program. Just last
week, the U.K. chapter of Creators for Change, Internet Citizens, hosted a
two-day workshop for 13-18 year-olds to help them find a positive sense of
belonging online and learn skills on how to participate safely and responsibly
on the internet. We also pledged to expand the program's reach to 20,000 more
teens across the U.K. 

And over the weekend, we hosted our latest Creators for Change workshop in
Bandung, Indonesia, where creators teamed up with Indonesia's Maarif Institute
to teach young people about the importance of diversity, pluralism, and
tolerance. 

Altogether, we have taken significant steps over the last month in our fight
against online terrorism. But this is not the end. We know there is always
more work to be done. With the help of new machine learning technology, deep
partnerships, ongoing collaborations with other companies through the Global
Internet Forum to Counter Terrorism, and our vigilant community, we are
confident we can continue to make progress against this ever-changing threat.
We look forward to sharing more with you in the months ahead. 

The YouTube Team