Facebook, by far the most popular social media network in Pakistan, has taken extensive measures ahead of general elections in the country.
In an exclusive conversation with ProPakistani, Sarim Aziz, who leads Public Policy for Pakistan at Facebook, explained the steps the social media giant is taking ahead of elections, including content monitoring, the removal of fake accounts, enhanced security for users, and greater ad transparency.
Q) How does Facebook view its role ahead of elections in Pakistan?
Around the world, social media is making it easier for people to have a voice in government — to discuss issues, organize around causes, and hold leaders accountable. This is particularly true during key moments of civic discourse, such as elections.
We are committed to ensuring the integrity of elections around the world, including in Pakistan. We have taken a number of steps to protect elections from abuse and exploitation, including enhanced security measures to protect pages of political parties and candidates, improving the enforcement of our ads policies and greater Ads and Page transparency, better use of machine learning to combat fake accounts, and working to reduce the spread of false news.
We’ve also dramatically increased the number of people working on this area, with dedicated teams focused on preventing abuse on our platform during elections.
Q) How does Facebook plan to identify fake news during Pakistan’s elections?
Soon, we will begin a pilot of our Third-Party Fact-Checking program for our community in Pakistan, in partnership with AFP. Third-party fact-checking is one of the ways we are fighting misinformation, and our partners are well-respected fact-checkers who have been certified by Poynter’s non-partisan International Fact-Checking Network. Here’s how we work with our fact-checking partners, using a combination of technology and human review to detect and demote false news on Facebook:
We use signals, including feedback from people on Facebook and clickbait or sensationalist headlines, to predict potentially false stories for fact-checkers like AFP to review.
When fact-checkers rate a story as false, we significantly reduce its distribution in News Feed — dropping future views on average by more than 80%. Pages and domains that repeatedly share false news will also see their distribution reduced and their ability to monetize and advertise removed.
We also want to empower people to decide for themselves what to read, trust, and share. When third-party fact-checkers write articles about the accuracy of a news story, we show these articles in Related Articles immediately below the story in News Feed. We also send people and Page Admins notifications if they try to share a story or have shared one in the past that’s been determined to be false.
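The demotion workflow described above — cutting a false-rated story’s reach by more than 80% and further reducing distribution for repeat-offender Pages — can be sketched roughly as follows. This is an illustrative sketch only, not Facebook’s actual system; the function names, thresholds, and the page-level penalty factor are all hypothetical.

```python
# Illustrative sketch of the demotion logic described above.
# All names and numeric values below are hypothetical assumptions,
# except the "more than 80%" drop, which comes from the interview.

FALSE_STORY_DEMOTION = 0.2       # keep ~20% of views, i.e. an 80%+ drop
REPEAT_OFFENDER_THRESHOLD = 3    # assumed: strikes before a Page is penalized
PAGE_PENALTY = 0.5               # assumed: page-level demotion factor

def demote_story(base_score: float, rating: str) -> float:
    """Return a story's adjusted News Feed distribution score."""
    if rating == "false":
        return base_score * FALSE_STORY_DEMOTION
    return base_score

def page_multiplier(false_shares: int) -> float:
    """Pages that repeatedly share false news see reduced distribution."""
    if false_shares >= REPEAT_OFFENDER_THRESHOLD:
        return PAGE_PENALTY
    return 1.0

# A false-rated story from a repeat-offender Page is demoted twice over:
score = demote_story(100.0, "false") * page_multiplier(false_shares=4)
```

The two multipliers compose: a story rated false by a fact-checker loses most of its reach on its own, and sharing it from a repeat-offender Page shrinks that reach further.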
Improving news literacy
We want to empower people to decide for themselves what to read, trust, and share, and we do this by promoting news literacy and providing people with more context.
In Pakistan, we’ve partnered with Media Matters for Democracy and EngagePakistan to develop and distribute localized versions of our false news tips, with the aim of helping our community here learn how to recognize and avoid false news and misinformation. Soon, we will be posting a Public Service Announcement at the top of News Feed – visible to our entire Facebook community in Pakistan – with a link to these false news tips.
Q) How does Facebook deal with fake accounts?
Fake accounts can be a major distributor of harmful and misleading content, and we work hard to keep them off the platform.
We block millions of fake accounts at registration every day, and we continuously build and update our technical systems to make it easier to respond to reports of abuse, detect and remove spam, identify and eliminate fake accounts, and prevent accounts from being compromised.
We’ve made recent improvements to recognize these inauthentic accounts more easily by identifying patterns of activity — without assessing account contents themselves. For example, our systems may detect repeated posting of the same content, or aberrations in the volume of content creation.
In the first quarter of this year, we removed nearly 6 million fake accounts globally, 98.5% of which we detected before anyone reported them to us.
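The content-agnostic signals mentioned above — repeated posting of identical content and aberrant posting volume — can be sketched as a simple pattern check. This is a hypothetical illustration, not Facebook’s detection system; the thresholds and the idea of hashing posts into opaque fingerprints are assumptions for the sketch.

```python
# Hypothetical sketch of pattern-based fake-account signals: flag accounts
# that repeatedly post identical content or post at an abnormal volume,
# without ever inspecting what the content actually says.
from collections import Counter

DUPLICATE_THRESHOLD = 5    # assumed: same post repeated this often is suspicious
VOLUME_THRESHOLD = 100     # assumed: posts per day considered aberrant

def looks_inauthentic(post_hashes: list) -> bool:
    """Flag an account from activity patterns alone.

    `post_hashes` are opaque fingerprints of one day's posts; the check
    uses only repetition and volume, never the content itself.
    """
    if len(post_hashes) > VOLUME_THRESHOLD:      # aberrant creation volume
        return True
    counts = Counter(post_hashes).most_common(1)
    return bool(counts) and counts[0][1] >= DUPLICATE_THRESHOLD

# An account posting the same fingerprint six times trips the duplicate check.
flagged = looks_inauthentic(["h1"] * 6)
```

Because the check never reads the content, it matches the interview’s point that such accounts can be recognized “without assessing account contents themselves.”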
Q) Tell us about how Facebook deals with requests from governments to restrict access to content.
We are a global platform, with a mission to give people the power to build community and engage with each other, regardless of where they live.
People come to Facebook to share their stories, see the world through the eyes of others, and connect with friends and family. We also want everyone to feel safe when using Facebook. Our Community Standards and related policies aim to strike the right balance between giving people a place to express themselves and promoting a welcoming and safe environment for everyone.
At times, we do receive requests from governments to restrict access to specific content because it violates local law. We carefully review every request we receive, and if we conclude that a piece of content violates local law, we may restrict access to that content in that particular country or territory. We are transparent about all requests we receive from governments, and publish them in our Transparency Report.
Q) What are you doing to combat false information and misleading content in ads?
Increasing Transparency for Ads and Pages
We’re taking significant steps to bring more transparency to ads and Pages on Facebook.
Anyone can now view active ads from Pages on Facebook. The feature allows our community in Pakistan – and around the world – to see ads running across Facebook, Instagram, Messenger and our partner network, even if those ads weren’t shown to them. People can also learn more about Pages, even if those Pages don’t advertise. For example, they can see any recent name changes and the date a Page was created.
Strengthening enforcement against improper ads
We require our community on Facebook to respect our Community Standards, and we hold advertisers to even stricter guidelines. We use both automated and human review, and we’re taking aggressive steps to strengthen both. Reviewing ads means assessing not just the ad’s content, but the context in which it was bought and the intended audience, and we’re changing our ads review system to pay more attention to these signals.
This year, we’ve added more people to our global ads review teams, and we’re investing more in machine learning to better understand when to flag and take down ads. Last year, we also announced that we would no longer allow Pages that repeatedly share false news to advertise on Facebook.
“Facebook is committed to making sure that our community has a positive, meaningful and safe experience, including the millions of people who use our services here in Pakistan. We’re proud of the opportunities Facebook gives people in Pakistan to connect, have a voice on the issues they care about, and take advantage of social and economic opportunities,” Sarim Aziz concluded.