We often wonder what the Facebook rules are: how Facebook decides which content is allowed, and how much content violates its Standards. Now, for the first time, Facebook itself explains them.
For years the social network has had Community Standards that explain what does and does not belong on Facebook, and three weeks ago, for the first time, it also published the internal guidelines it uses to apply those Standards.
Facebook rules explained by Facebook
In its Community Standards Enforcement Report, Facebook also publishes the numbers behind its enforcement of these Standards, so that everyone can judge its work.
Alex Schultz, Facebook's Vice President of Data Analytics, explained in detail how Facebook measures what happens on the platform in a Hard Questions blog post and in a guide to understanding the Community Standards Enforcement Report.
Alex Schultz's post: the Facebook rules explained by Facebook
"This report includes information and data on the enforcement of our Standards between October 2017 and March 2018 and covers six areas: graphic violence, adult nudity and sexual activity, terrorist propaganda, hate speech, spam, and fake accounts. The numbers illustrate:
- How much content that violates our Standards (the Facebook rules) was seen by people;
- How much content we removed;
- How much content we proactively detected using our technology, before people on Facebook reported it.
Most of the action we take against content that violates Facebook's rules concerns fake accounts and the large amount of spam they generate. For example:
- We removed 837 million pieces of spam in the first quarter of 2018 – nearly 100% of which was found and flagged before anyone reported it;
- The key to fighting spam is removing the fake accounts that spread it. In the first quarter of 2018 we disabled about 583 million fake accounts – most of them blocked within minutes of their creation. This is in addition to the millions of attempts to create fake accounts that we stop on Facebook every day. Overall, we estimate that during this period fake accounts still made up about 3–4% of active Facebook accounts.
Content that violates Facebook's rules
- We removed 21 million pieces of adult nudity or pornographic content in the first quarter of 2018, 96% of it detected by our technology before being reported. We estimate, however, that out of every 10,000 pieces of content viewed on Facebook, 7 to 9 views involved content that violated our Adult Nudity and Pornography standards.
- For sensitive areas such as graphic violence and hate speech, our technology is still not fully effective and the intervention of our review teams remains necessary. We removed, or labeled as potentially disturbing, around 3.5 million pieces of violent content in the first three months of 2018, 86% of which was identified by our technology before being reported. We also removed 2.5 million pieces of hate speech, 38% of which was detected directly by our technology."
Abuse of the Facebook rules
As Mark pointed out at F8, we still have a lot of work to do to prevent abuse. This is partly because artificial intelligence, while a promising technology, is still far from effective against much of the content that violates our Standards, since context matters so much. For example, artificial intelligence cannot yet determine whether someone is inciting hatred or simply describing something they experienced in order to denounce the problem publicly.
More generally, as we explained last week, the technology needs a large amount of supporting data to recognize meaningful patterns of behavior, data we often lack for less widely used languages or for cases that are rarely reported. Furthermore, in many areas – whether spam, pornography or fake accounts – we face sophisticated adversaries who constantly change tactics to circumvent our controls and break Facebook's rules, which means we must continually adapt our efforts.
This is why we are investing heavily in more people and better technology to make Facebook safer for everyone.
Why Facebook explains the Facebook rules
This is also why we publish this information. We believe that greater transparency increases accountability over time, and that publishing this information can push us to improve more quickly. These are the same data we use internally to evaluate our work, and now everyone can see them and judge our progress. We look forward to your feedback.