Meta has learned how to respond to crises in different countries the long and painful way. Since 2021, the company has publicly acknowledged that countries where its social networks and messengers play an important role in society face heightened risks of abuse and need special attention. Such abuses include various ways of manipulating public opinion: spreading disinformation or misinformation, running bot farms, misusing users' personal data for micro-targeting, and so on.

This recognition was a forced response to criticism rather than a proactive step by the company. In many countries, Facebook, Instagram, and WhatsApp are at the very least the main channels of online communication; at most, they are the main source of news and information in general. Ukraine is among them: in 2021, 43% of Ukrainians got their news from Facebook, and another 17% from Instagram.

Back when Meta was still called Facebook, the company positioned itself almost like a social enterprise, proclaiming its goal “to make the world more open and connected”. However, having attracted such an enormous number of users to its platforms (as of April 2022, Facebook alone had almost 3 billion users), the company was not ready for the responsibility that such popularity entails.

This piece discusses how Meta reacted to the onset of russia's full-scale invasion of Ukraine and whether the company has learned from its previous lessons.

Meta and Global Crises

Given the company's size and the central role of its products in the information field of many countries, even routine national elections turn into a crisis situation for Meta. One of the most important cases that drew attention to the impact of online communities on offline society was the 2016 presidential election in the United States. The world was shocked by the victory of the eccentric candidate Donald Trump, and it later emerged that voters' support for him had been shaped by Cambridge Analytica, a firm working for the Trump campaign that abused users' personal data to configure micro-targeting. According to Statista, advertising is Meta's key source of revenue, which is why the company (then Facebook) worked actively to make online ads more effective and, consequently, more attractive to advertisers. The global community paid attention to what this engine of the tech giant's financial growth meant for the democratic process only after the Cambridge Analytica scandal. In response to criticism, Meta began working on transparency for advertising related to social issues, elections, and politics, disclosing who the advertisers are and how much money is spent on their ads.

Having learned the sad lessons of that experience, Meta prepared in advance for the next US election. The company tightened authorization requirements for users running political advertising: advertisers even had to provide their tax numbers to the company. It also expanded its content moderation team and restricted political advertising for a week before and after the election, so as not to exacerbate social tensions.

The most tragic consequences of Facebook's dominance occurred in Myanmar. There, the feed's content-ranking algorithm helped incite animosity among the country's population, which fed into the genocide of the Rohingya. The genocide escalated sharply in 2016–2017 and continues to this day with varying intensity. Meta (then Facebook) faced heavy accusations that its algorithms amplified calls for violence and hate speech against the Rohingya among Myanmar's users, while the social network moderated content ineffectively and failed to remove posts that violated its community rules. After numerous accusations, the company commissioned an independent audit of its impact on human rights, which confirmed the tech giant's share of guilt in the extermination of the Rohingya people. In 2021, Meta was even hit with a class-action lawsuit from Rohingya refugees accusing the company of contributing to the genocide of their people. Only after that did Meta hire content moderators who are native speakers of Burmese; they began blocking the pages of the Myanmar military, which organized the genocide of Rohingya Muslims, and the pages of businesses associated with it. Meta also began preparing for Myanmar's elections in advance: it worked on automatic algorithms for recognizing Burmese, added account-protection features for Myanmar users, and stepped up content monitoring to catch and remove coordinated influence networks and hate speech.

The company faces similar accusations over India. India is Meta's largest market, so after systematic acts of violence against Indian Muslims in the Kashmir region, Meta was suspected of contributing to the incitement of hostility. Unlike in Myanmar, in India the company is accused of favoritism toward the ruling party, the BJP. In 2020, Meta announced an independent human rights audit covering India, but the results of that audit have not been published, despite public demand.

The company also took special measures ahead of elections in the Philippines, Ethiopia, and other countries where it saw risks in the information space.

One of the most resonant steps was its reaction to the assault on the US Capitol by Donald Trump supporters on January 6, 2021. After the violent riot, the platform stepped up monitoring of the network, removing content that supported the rioters, called for violence or the use of weapons in the United States (including at protests), or encouraged a repeat of the violence. Facebook's algorithms also automatically closed comments in US groups where hate speech had already occurred, and the company raised the requirements for group administrators to control posts. In addition, political advertising on the US segments of Facebook and Instagram remained banned for another two months after these events; the restrictions were lifted in early March. The decision to block Donald Trump's Facebook and Instagram accounts attracted the most public attention: at first the accounts were blocked indefinitely, and then, in response to the recommendations of the company's Oversight Board (Meta's internal "court" for decisions on monitoring and blocking content and users), the company limited the block to two years, with the possibility of extending the term if the social risks around his profiles persist after it ends. Thus, Trump's “lockdown” is due to expire next January.

Meta's Steps to Support Ukraine

Unlike in the cases described above, Meta seems to have reacted quite quickly to the beginning of the full-scale invasion of Ukraine, adopting its first special policies as early as February 26. Here is a list of the key measures Meta has taken to protect the information space and user security:
Safety

  • They gave users from Ukraine and russia the ability to lock their Facebook profiles and hide their friends lists. Likewise, Instagram removed the ability to view the followers and followings of private profiles.
  • As part of the Data for Good program, they began sharing user data with partner organizations providing humanitarian assistance to Ukrainians. The data should help organizations analyze and forecast the movements and needs of Ukrainian refugees.
  • They hid the names and other data of advertisers from Ukraine.

Content

  • They created a Special Operations Center (though they could have been better at naming) to monitor the platform around the clock. It employs native speakers of Ukrainian and russian as content moderators and moderation reviewers. In addition, the company has strengthened cooperation with Ukrainian fact-checkers, NGOs, and the government.
  • Russian users were banned from running advertising and monetizing their content.
  • They added algorithms that warn about the publication of a potential fake: Meta states that its algorithms can recognize photos taken more than a year ago, and when a user publishes such a photo, the network warns them that they are sharing an old image (an illustrative sketch of such a check follows this list).
  • Special labels were added to publications linking to russian state media and to posts from these media's own accounts, informing users that the information comes from russian government-controlled media.
  • Pages and profiles that repeatedly distribute false content are removed from user recommendations, and their posts appear lower in users' feeds.
  • They started cooperating with the Ukrainian government to restrict access to russian state media and journalists.
  • They restricted access to the RT and Sputnik pages in the EU, even before the decision was made at the EU level. The ranking of these pages on Instagram and Facebook was also lowered worldwide: RT and Sputnik publications now appear less often in people's feeds. If links to RT and Sputnik are shared in Stories on Instagram, those Stories are likewise displayed at the end of the list.
  • On March 11, Meta temporarily allowed Ukrainians to wish death upon and generally use hate speech against russians as "a way to express their resistance and rage against military forces." On the same day, criminal proceedings were opened against the company in russia, and it was soon designated an "extremist organization." Just three days later, on March 14, the company banned hate speech against russians again, probably hoping not to lose the russian market entirely.
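
Meta has not disclosed how its old-photo detection actually works. Below is a minimal, purely illustrative sketch of what such a pre-upload check might look like, assuming it relies on EXIF capture dates; the function names and the one-year threshold are invented for illustration, and a production system would more likely combine reverse image search with perceptual hashing, since EXIF metadata is easily stripped or forged.

```python
# Hypothetical sketch of an "old photo" pre-upload warning.
# Assumption: the check reads EXIF capture dates; a real system would
# likely also use reverse image search and perceptual hashing.
from datetime import datetime, timedelta

from PIL import Image  # pip install Pillow

EXIF_IFD_POINTER = 0x8769    # tag pointing to the EXIF sub-IFD
DATETIME_ORIGINAL = 0x9003   # capture timestamp
DATETIME_FALLBACK = 0x0132   # file-modification timestamp (base IFD)

def capture_date(path: str) -> datetime | None:
    """Return the photo's EXIF capture date, or None if unavailable."""
    exif = Image.open(path).getexif()
    raw = exif.get_ifd(EXIF_IFD_POINTER).get(DATETIME_ORIGINAL) or exif.get(DATETIME_FALLBACK)
    return datetime.strptime(raw, "%Y:%m:%d %H:%M:%S") if raw else None

def looks_stale(path: str, max_age_days: int = 365) -> bool:
    """Flag a photo as 'old' if it was taken more than max_age_days ago."""
    taken = capture_date(path)
    if taken is None:
        return False  # no metadata: do not warn
    return datetime.now() - taken > timedelta(days=max_age_days)

if __name__ == "__main__":
    if looks_stale("upload.jpg"):
        print("This photo appears to be over a year old. Share it anyway?")
```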

Messenger apps

  • An optional disappearing-messages feature was added in Messenger and WhatsApp, along with a WhatsApp option to automatically delete all chats after a set period, to protect users' correspondence in case a phone is lost.
  • They restricted bulk forwarding of messages in WhatsApp, Messenger, and Instagram (see the sketch after this list).
  • They added end-to-end encryption of chats in Messenger and Instagram.
  • They opened a hotline of the State Emergency Service of Ukraine, as well as a psychological support hotline, on WhatsApp.
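
Meta has not published the exact mechanics of these forwarding limits. The sketch below illustrates one plausible design, loosely modeled on WhatsApp's publicly described behavior of capping how many chats a message can be forwarded to at once and tightening the cap for frequently forwarded messages; the specific numbers and names are assumptions. The point of such friction is not to make forwarding impossible but to slow the viral spread of unverified content.

```python
# Hypothetical sketch of a bulk-forwarding limit.
# Assumptions: the caps (5 chats, 1 chat for "viral" messages) loosely
# mirror WhatsApp's publicly described limits; all names are invented.
from dataclasses import dataclass

MAX_FORWARD_TARGETS = 5        # chats per forward action (assumed)
FREQUENTLY_FORWARDED_AT = 5    # hop count at which a message counts as viral

@dataclass
class Message:
    text: str
    forward_count: int = 0  # how many times this message has been forwarded

def max_targets(message: Message) -> int:
    """Frequently forwarded messages may be sent to only one chat at a time."""
    if message.forward_count >= FREQUENTLY_FORWARDED_AT:
        return 1
    return MAX_FORWARD_TARGETS

def forward(message: Message, target_chats: list[str]) -> Message:
    """Forward a message to several chats, enforcing the bulk-forwarding cap."""
    cap = max_targets(message)
    if len(target_chats) > cap:
        raise ValueError(f"This message can be forwarded to at most {cap} chat(s) at once")
    # Each forward hop increments the counter carried with the message.
    return Message(message.text, message.forward_count + 1)
```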

Help

  • Meta helps UNICEF and other organizations distribute guides for parents on Instagram with recommendations on how to support children in wartime.
  • Informational materials in Ukrainian were added to Facebook's Emotional Health Center, including materials on providing psychological support to children.
  • They prioritized non-profit charitable organizations helping Ukraine in Instagram search.
  • They expanded features for administrators of emergency support groups.

Meta is trying to police the Ukrainian information space as effectively as possible. In the case of Ukraine, Meta has applied the largest package of special measures it has ever applied to any country. Even when content about russian war crimes was accidentally blocked (as happened with the hashtags #Bucha and #BuchaMassacre), the company responded quite quickly to its algorithms' errors and took the local context into account: unfortunately, it is now vital for Ukrainians to publish images of violence so that this violence stops as soon as possible. Still, Ukrainian Facebook and Instagram users regularly complain that their posts and Stories covering the war are deleted, again blocked automatically by the networks' algorithms.

The timeline of Meta's special decisions on Ukraine, the systematic "accidents" in content moderation, and the testimony of company employees who cannot keep up with content-policy changes even within a single day all show that Meta had no pre-designed strategy for responding to a full-scale war. This is rather naive with regard to a country where the company's products dominate, where hostilities have been ongoing since 2014, and where part of the territory is temporarily occupied by a neighboring state. So, for now, we are seeing a reactive response from the company rather than the implementation of proactive policies.

Without exaggeration, Meta has made the greatest effort in its history to protect the information space of a foreign country where its products dominate, and it has done more for Ukrainian users than any other social network. Much of this prompt response may be the result of the Ukrainian government's cooperation with the company: from the first days, the Ministry of Digital Transformation has actively engaged with Meta. The example of Ukraine once again proves the need for public oversight of private companies and for their social responsibility.

At the same time, russian disinformation keeps spreading on social media. Both in the case of the Bucha massacre and in propaganda aimed at foreign audiences in general, Meta still fails to monitor the spread of content in foreign languages. Meta is an American tech giant that has successfully extended its dominance to non-English-speaking countries, yet the disinformation about Bucha shows that the fewest barriers stand in the way of disinformation in languages the algorithms are least proficient in when monitoring compliance with community rules. The case of Ukraine shows that the information field of social media requires systematic and persistent protection by national states, and a deepening of the company's regional cooperation with national governments and civil society.

Original article in Ukrainian: Українська правда