On April 1, Facebook announced that it had removed hundreds of pages, groups, and accounts that were violating its policies on ‘coordinated inauthentic behaviour’ or spam. Coordinated inauthentic behaviour, as defined by Facebook, occurs ‘when groups of people or pages work together to mislead others about who they are and what they are doing.’ It is not the content but the collective behaviour of these groups or individuals, undertaken to mislead others, that triggers removal.
Facebook continues to take action against fake accounts and pages and disables “millions of fake accounts every day.” Over the past few months, Facebook has acted against such accounts in many countries, including the United Kingdom, Indonesia, Russia, Romania, India, Macedonia, and Iran.
In its most recent move, Facebook removed 103 Pages, Groups, and accounts on Facebook and Instagram from Pakistan, and 687 Pages and accounts linked to the IT Cell of the Indian National Congress. In addition, Facebook removed 321 Pages and accounts from India that were found to be violating its rules; 15 of these were linked to individuals and an IT company, ‘Silver Touch,’ which hosts portals for the Indian Army and Navy.
On the Pakistan-based accounts, Facebook stated that “we are taking down these Pages and accounts based on their behaviour, not the content they posted.” The social media giant believes that the individuals behind the fake accounts were operating Pages related to the Pakistani military, Kashmir, and political news about the Indian government, and were linked to the Pakistani military’s public relations wing, Inter-Services Public Relations (ISPR). The ISPR has denied the claim, saying that these pages were not “ISPR-managed.”
Facebook is taking measures against fake accounts, but its own ‘inauthentic behaviour’ remains under international scrutiny. It is accused of “data sharing deals with technological companies,” which are being probed by U.S. prosecutors and could result in a record fine. Facebook was also recently condemned for its failure to stop the live streaming of the terrorist attack in New Zealand in which 50 Muslim worshippers were killed. Earlier, Facebook was accused of not doing enough to prevent its platform from being used to influence the U.S. elections, an issue that became more serious after the ‘Cambridge Analytica’ scandal.
In an apparent attempt to shift responsibility away from its own inaction or selective application of community rules, Facebook’s Mark Zuckerberg has called for stricter regulatory measures and the development of common global standards. Instead of waiting for such standards to be negotiated by the international community, it remains Facebook’s responsibility to build its capacity to ensure that the platform is not misused through selective application of its own rules or used to target government entities without clear evidence.
While formulating global standards, Facebook must take into consideration the religious, social, and political sensitivities that could have implications for people from different backgrounds who are part of this global social platform.