Targeted advertising on social networks and online search engines has opened up a whole new world for public opinion manipulation, especially during election campaigns. Social networks have greatly simplified campaigning, making the distribution of advertising and viral messages cheaper, faster, more personalized, and often invisible to users. In this article, we try to understand how different countries regulate the online information space and how Ukraine can draw on their experience.
Election campaigns of the last decade around the world and the COVID-19 pandemic have clearly illustrated that social networks are gradually taking a leading role in informing society. Along with the obvious benefits of the Internet and social networks, the problem of low-quality content and disinformation (false messages disseminated for profit or to mislead society) has recently become more acute. The UN has also recognized this severe problem and introduced the term "infodemic" for the rapid, large-scale spread of information, whether accurate or false. It called on the media and social networks to work together to "prevent the spread of false messages and disseminate accurate information based on facts and data."
The lack of regulation of the Internet and social networks is not only a Ukrainian problem; around the world, fierce discussions about it are underway. Two opposing views on the legal regulation of social networks have emerged: proponents of regulation argue that social networks have too much influence on citizens' political choices, while opponents warn that regulation could set a dangerous precedent and become a threat to free speech.
The General Data Protection Regulation (GDPR), adopted in 2016, has been in force in the European Union since 2018. The regulation covers a wide range of issues related to the use of personal data, including targeted political advertising on social networks. Platforms such as Facebook, YouTube, Twitter, and Google store large amounts of data about users based on their online behavior, allowing them to deliver more personalized advertising and to micro-target advertising posts. Micro-targeting customizes the ads a user sees based not only on general data such as gender, age, or region of residence, but also on the user's interests, beliefs, and tastes, inferred, for example, from browser search history or interactions with other advertising posts. The GDPR divides personal data into two categories: ordinary data (name, education, place of residence, place of work, age, etc.) and sensitive data (religion and ethnicity, sexual orientation, political beliefs, etc.). The notorious Cambridge Analytica, implicated in the scandal around the 2016 US presidential election, built its advertising strategies on exactly this kind of sensitive data.
Under the GDPR, users' personal data may be stored and used only with their informed consent, and the consent request must be simple and clear to the user. Personal data may be used only lawfully and transparently, stored only for a previously stated purpose and for a limited time, and must remain confidential. Users may challenge the way their data is stored and used. Each EU member state has a dedicated body, the Data Protection Authority (DPA), authorized to monitor compliance with the GDPR. A DPA can fine an organization that collects, stores, or uses personal data up to 20 million euros for violating the GDPR. The total amount of fines across the EU exceeded 711.5 million euros in 2019.
The regulation prohibits processing sensitive user data, including data on political beliefs, but provides exceptions for political parties: a party (or a similar non-profit organization) may use sensitive data only about its members, former members, or persons in regular contact with the organization.
Besides that, the European Union has been improving its legal instruments against disinformation. In the summer of 2020, it announced work on the Digital Services Act, which regulates the online space and includes the fight against misinformation; the European Commission promised to establish clear rules for social networks to protect citizens' rights online. Earlier, in 2018, online platforms, social networks, advertising companies, and their associations had voluntarily signed the Code of Practice on Disinformation and committed to reporting annually to the European Commission on its implementation. The Digital Services Act is expected to institutionalize these voluntary anti-disinformation efforts and introduce legal mechanisms of state control.
In 2017, the Bundestag passed the Law on the Improvement of Law Enforcement in Social Networks, commonly known as the Network Enforcement Act (NetzDG). The law mainly aims to combat hate crimes, fake news, hate speech against particular social groups, and any content that violates German defamation law. According to the German Minister of Justice, the reason for state intervention was the inability of social networks (especially Facebook and Twitter) to moderate content themselves: the German government noted that Facebook deleted only 39% of the content users complained about, Twitter only 1%, while YouTube deleted over 90%.
First of all, the new law introduced standards for handling user complaints. Social networks operating in Germany must offer users an easily recognizable, always available tool for complaining about content. Each complaint must be reviewed within 24 hours, and content that violates the law must be blocked. Besides fake news and defamation, this rule covers any other violation of German law. In less clear-cut cases, social networks have up to seven days to block the content and may seek advice from public authorities.
In addition, every social network must appoint a representative in Germany, even if the company itself is based in the United States or elsewhere. Law enforcement agencies can contact this representative to obtain the information they need and impose a fine if the social network violates federal law. Notably, the fines are extraordinary: up to €5 million for the person responsible for the complaint-management system, and up to €50 million for the social network itself if it fails to fulfill its obligations.
The adoption of the law had quite noticeable consequences: Google, Twitter, and Facebook created special complaint forms and began publishing reports on content deleted under NetzDG. In July 2019, Facebook became the first company fined under the law, paying 2 million euros for incomplete reporting of illegal content on its platform in Germany. However, micro-targeting of political advertising on social networks in Germany remains regulated only at the level of the EU General Data Protection Regulation.
In 2018, Canada adopted the Elections Modernization Act (EMA), which also addresses political advertising online.