Issues such as political campaigning on the Internet and social media, and the regulation of artificial intelligence and its use during elections, are increasingly drawing the attention of politicians and society at large. This column was inspired by discussions at a seminar organized by the Civil Network OPORA, which brought together MPs, government representatives, media professionals, and experts.
One of the key topics of the gathering was the role of social media in political campaigning. This issue is not just a challenge for Ukraine; it is a global problem already reshaping political engagement rules worldwide.
The Cambridge Analytica scandal, which exposed the extensive use of personal data to manipulate voters during the 2016 U.S. presidential election and the Brexit referendum, highlighted the risks posed by digital platforms. In Romania during parliamentary elections, and in Moldova during presidential elections, massive online disinformation campaigns were uncovered that affected voting outcomes, mainly through targeted advertising and bots.
Similar problems have been observed in France, Germany, and other European countries, where disinformation and fake news have become tools of political competition. These efforts actively exploit not only Facebook but also Telegram and TikTok. TikTok, popular among younger audiences, has become a key channel for spreading political messages, though its algorithms remain opaque. With its “closed” groups and channels, Telegram enables fake news campaigns to be organized with little risk of being blocked. Google and YouTube also play a significant role in political campaigning, but even in countries with robust regulatory systems, oversight of advertising funding sources and the detection of fake accounts remain inadequate.
The international community is already responding to these threats. The European Union has adopted the Digital Services Act (DSA), which establishes clear rules for tech platforms, and Regulation (EU) 2024/900 on the transparency and targeting of political advertising. This regulation requires companies like Facebook and Google to ensure transparency in their algorithms, disclose the funding sources for political advertising, and remove illegal content. Another significant document is the Artificial Intelligence Act, which regulates the use of AI in sensitive areas, including political campaigning.
Ukraine remains vulnerable to the digital threats facing democracy. Russia is already conducting large-scale disinformation campaigns against us. However, the real issues run deeper: virtually every active social media user in Ukraine risks becoming a target of manipulative advertising or fake news. Vox Ukraine collaborates with Meta under the Third-Party Fact-Checking Program, but is this enough? Without effective countermeasures, new technologies could be exploited by external enemies and unethical political forces within the country. Do we want to wake up one day to realize that the decisions shaping our future were made under the influence of algorithms and bot farms?
Currently, Ukraine lacks clear regulations for digital political advertising and mechanisms for controlling the use of personal data in election campaigns. Problems with regulating digital platforms stem from the fragmented responsibilities of state bodies. The National Council for Television and Radio Broadcasting, while experienced in media regulation, lacks the authority to work with social networks. Similarly, the National Commission for the State Regulation of Electronic Communications, Radio Frequency Spectrum, and the Provision of Postal Services (NCEC) could potentially drive changes in this area, but it currently lacks both the resources and the legal framework to do so.
Another issue is the role of the Ombudsman, who is responsible for personal data protection in Ukraine. The Ombudsman’s limited powers prevent effective oversight of how voter data is used in political campaigns. For instance, there are no mechanisms enabling the Ombudsman to audit social media algorithms or to identify the illegal use of Ukrainians’ data for targeted advertising.
Thus, one of the primary tasks should be developing legislation that incorporates EU directives and practices (requiring social media platforms to report on the sources of ad funding), implements mandatory content and algorithm monitoring, and establishes accountability for spreading disinformation. Priority actions include: expanding the authority of the National Council for Television and Radio Broadcasting and the NCEC to monitor social media content and regulate political advertising; establishing a specialized department within the Ombudsman’s Office to address digital threats and audit social platform algorithms; and passing a law that requires TikTok, Telegram, Google, and YouTube to disclose data on political ad funding and targeting algorithms and to implement systems for labeling sponsored political materials.
At the same time, the risks associated with implementing these provisions must be considered. Enhanced monitoring of content and algorithms could threaten freedom of speech if the new regulations are used to restrict access to certain opinions or ideas. New rules must be accompanied by explicit guarantees to protect citizens’ right to express their views. Additionally, state bodies should have accountability mechanisms to prevent abuse of power and ensure they do not overstep their authority.
The issue of artificial intelligence (AI) deserves special attention. Ukraine currently lacks a regulatory framework to govern its use in election campaigns. At the same time, international experience demonstrates that AI misuse can lead to the mass spread of fake news, the creation of manipulative content, and targeted attacks on specific voter groups. Both in Ukraine and globally, “deepfakes” have been used to discredit politicians. Europe is already working on early detection mechanisms for such threats.
To address these challenges, Ukraine should establish a specialized regulatory body to monitor the activities of digital platforms and develop a strategy to combat disinformation. It is crucial to involve civil society organizations, media outlets, and the academic community in this process to ensure the broadest possible oversight of digital platform activities.
Globally and in Ukraine, the rapid digitalization of all spheres of life is underway. Along with convenience—such as obtaining documents or administrative services in just a few minutes via Diia instead of waiting in long queues with paperwork—these changes also bring risks. For example, they make companies and individuals more vulnerable to cybercrime. Governments worldwide are striving to protect their citizens and businesses from these risks. Ukraine can contribute to this process as a developer of solutions rather than merely a recipient of already-established regulations.
Photo: depositphotos.com/ua
Attention
The author doesn’t work for, consult for, own shares in, or receive funding from any company or organization that would benefit from this article, and has no relevant affiliations.