Perspectives

Together We Are Stronger: Social Media Companies, Civil Society, and the Fight against Disinformation

Social media companies must cooperate with local actors to fight election-related disinformation, and international organizations must encourage and support these efforts.

A hanging sign warns of "Danger Due To Misinformation." Image Credit: 3dpete via Flickr.

Written by
Lolita Berzina
Europe & Eurasia Research Fellow


In February 2019, only days before Moldova’s parliamentary elections, Facebook removed more than 100 accounts and pages for engaging in “coordinated inauthentic behavior targeting people in Moldova.” Facebook noted that tips from a local civil society organization had been a key resource in identifying the accounts. The effort represented a proactive, collaborative partnership between local actors and a social media giant to improve the quality of preelection debate, and thus the integrity of democratic processes in Moldova.

Yet the results of this partnership, while encouraging, were insufficient when set against the scale of election-related disinformation on Facebook, which remains an ongoing problem for the platform and its users. The effects of disinformation campaigns are especially harmful in fragile democracies like Moldova, where pro-Russian actors use traditional and new media as a Trojan horse to undermine the country's orientation toward the European Union (EU). What is more, disinformation makes navigating the country's dynamic, polarized, and tumultuous media landscape extremely challenging for ordinary citizens who are trying to develop informed opinions.

In recent years, Moldova has faced an onslaught of disinformation originating from numerous sources, including domestic ones, with varying motivations. Recent surveys of disinformation in Moldova note a preponderance of items targeting journalists, civil society activists, and local political candidates. Others aim to influence an ongoing debate regarding unification with Romania. Many items, generally with origins in Russia, seek to damage the reputation of the EU. Disinformation is also particularly harmful in Moldova in light of its underdeveloped media market, in which ownership by oligarchs and political figures has reached its highest level in a decade.

On one hand, social media platforms like Facebook serve as a place where diversified and damaging disinformation campaigns, such as the ones in Moldova, can be countered. On the other, the unprecedented ease with which disinformation can be spread through social networks poses grave threats to democracy in Moldova and around the world. 

Cooperation in action

Disinformation has become a common feature of election campaigns worldwide, and Facebook, with its reported 1.56 billion daily users, potentially serves as one of the biggest arenas for the fight against it. The company reports having made significant investments over the last two years to help protect the integrity of elections. It has worked with governments to ensure the transparency of election campaigns, and has fought inauthentic behaviors prior to elections in Europe and around the world. Recently, it saw some success fighting disinformation regarding the European Parliament elections in May 2019. 

According to Facebook itself, many of its investigations into election-related disinformation and other inauthentic activities have been prompted by notifications from external organizations and carried out in cooperation with civil society groups.

In Moldova, information about the suspected fake accounts operating around the 2019 elections was provided via Trolless.com, a platform created by a civil society organization to help users report profiles identified as trolls or fake accounts. The founders of Trolless revealed that when they informed Facebook of their suspicions, they learned that the social media giant already had a small task force in place working to combat inauthentic activities before the polls. However, Facebook acknowledged that the removal of the fake accounts took place because of a tip from Trolless. Facebook reportedly plans to continue cooperating with Trolless, suggesting that it has begun to more heavily prioritize the involvement of civil society in fighting disinformation.

The need for additional motivation

Beyond the stated intentions of the social networks themselves, several publications and other pieces of research have argued that social networks must cooperate with civil society in order to effectively fight inauthentic or otherwise harmful behavior. In addition, the EU Code of Conduct on Countering Illegal Hate Speech Online, a document that is not legally binding but has been signed by the biggest internet intermediaries operating in Europe, encourages partnerships between tech companies and civil society organizations in the fight against illegal content online.

Yet in many cases, such partnerships remain in their infancy. Trolless attempted for several years to contact parties at Facebook before finally catching the company's attention in early 2019. In Ukraine, political party members pointed out in advance of the May 2019 presidential election that there was no fast procedure for notifying Facebook's election "war room" of fake news and fake profiles. Furthermore, at the end of 2018, fact-checkers working for Facebook complained that the company ignored their concerns and failed to draw upon their expertise to combat disinformation. Facebook's denial of the allegations shows the complexity of the issue. It also suggests that now is an opportune moment for the international community to take a stance in this discussion.

Internationally backed incentives can help spur action

While several promising partnership initiatives have cropped up around the world, additional incentives are necessary for Facebook and other internet intermediaries, which are at their core profit-driven businesses, to ensure systematic, effective cooperation with civil society in targeting election-related disinformation.

As several speakers emphasized at Freedom House's Second Annual Media Policy Forum in Moldova, it is not the role of the government to tell media, including social media, what to say. In addition, a sweeping liability regime can pose risks to freedom of expression online. Thus, it is left to international actors to create incentives for social media companies to cooperate with civil society partners.

Such incentives could come in various shapes and forms, including additional political pressure, involvement of the largest tech companies in policymaking discussions, and the promotion of self-regulatory codes. The EU is currently leading the way in Europe, and the Council of Europe has touched on the issue as well. However, as Nika Aleksejeva, community manager at the Digital Forensic Research Lab, noted at the Media Policy Forum, further discussion and innovation are needed in order to find effective, collaborative ways to fight disinformation.