
Imaginary me: fake accounts on Ukrainian media pages

27 September 2019

Facebook is becoming increasingly popular among Ukrainians. The number of Ukrainian Facebook users is growing steadily, and Ukraine is among the top three countries in the world in terms of user growth.

Artellence and VoxUkraine disclaimer: The main goal of this article is to draw the attention of Ukrainian society to the number of bots on social media and their obvious influence on communication there.

But with great popularity comes great influence. Almost a quarter of Ukrainians (23.5%) name social media as their main source of news and information on politics.

Active Facebook users say they prefer this social network as an information source because the news is aggregated there: you don’t need to browse different websites if your Facebook feed includes all the news media pages.

A range of factors influences which posts appear first in a user’s news feed, and one of them is the number of comments under a post. But what if the people who leave those comments are not real? What if public interest in a topic is orchestrated? Then, instead of the news real people are actually interested in, users will see more of the news whose popularity has been artificially inflated and picked up by the ranking algorithms.

We decided to study how bots shape the agenda on the pages of Ukrainian media. A detailed description of the data we used can be found at the end of the article.

Bots on media pages

Ukrainian media are actively migrating to Facebook in an attempt to adapt to the growing popularity of social media. To get noticed, an article needs many likes and comments. But these reactions can be created artificially with fake accounts.

We investigated bot activity on the Facebook pages of Ukrainian media with the largest number of subscribers from Ukraine.

Bots were most active on the pages of little-known media with small subscriber numbers (up to 200 thousand), where they contributed more than half of all comments (Fig. 1). The top three pages were NewsFacts.com (56% bot comments, 24,832 followers), Ukraine24 (53% bot comments, 169,659 followers) and Ukrainian News (52% bot comments, 6,970 followers).

Fig. 1. Top 20 media pages with the most bot comments

The top 20 pages by share of bot comments also included two pages of popular Ukrainian media: RBC-Ukraine (399,256 followers) and Strana.UA (78,984 followers), one of the President’s favorite outlets. On both pages, 44% of comments come from bots.

The top 20 by share of fake comments also included 15 foreign pages with a large Ukrainian audience:

  • 5 international news agencies;
  • 9 media outlets from Russia (including 6 major well-known media outlets);
  • one media outlet from Israel.

The share of bot comments on these pages did not exceed 31%; on average, it was 18%.

The highest share of bot comments was on the Russian-language page of CGTN (31%), Chastime (29%) and the Ukrainian (27%) and Russian (25%) editions of DW. The lowest share of bots was in the comments of Esquire Russia (8%) and Knife magazine (5%).

In a third of the media in our sample (115 out of 332 pages), more than 30% of comments come from bots. These include the top 10 media pages by coverage:

  1. RBC-Ukraine (44% of comments on the page come from bots)
  2. STRANA.UA (43%)
  3. 112 channel (41%)
  4. Segodnya (40%)
  5. Gordon (40%)
  6. Gazeta.ua (38%)
  7. Censor.NET (35%)
  8. Obozrevatel (35%)
  9. Fakty ICTV (34%)
  10. UNIAN (31%)

Fig. 2. Top media by coverage with more than 30% bot comments

Bots actively comment on posts on media pages with large audiences, but they inflate engagement most of all on the pages of little-known media. Bots comment on posts to create the illusion of relevance and interest in a particular topic. Their purpose is ideological: they try to shape a certain agenda and thus influence public opinion.

So it’s especially interesting to see how bots behave on the pages of trusted media.

Bots on the pages of trusted media

The five most popular media that people regularly read and trust are Segodnya (39.8% of comments are from bots), Obozrevatel (34.7%), UNIAN (30.6%), Ukrainska Pravda (28.8%) and Politeka (22.7%).

We decided to see how bots comment on news about the most famous politicians on the pages of these media.

We analyzed the comments on these pages that mentioned Volodymyr Zelenskiy, Petro Poroshenko, Yulia Tymoshenko, Volodymyr Groysman and Svyatoslav Vakarchuk. We divided these comments by connotation (positive, neutral, negative) and then identified the bot comments in each group (Fig. 3).

Fig. 3. The tone of bot commentary on politicians on trusted media pages

On these media pages, bots mostly criticize politicians. For each of these politicians, most bot comments have a negative connotation (around 18% of all comments whose connotation we were able to determine).

There are 2.5 times fewer bot comments with a neutral connotation than with a negative one: 6.8% versus 18%. Positive comments are even rarer: 5.7%.

On the media pages we studied, bots paid the most attention to the current and former presidents of Ukraine, mostly criticizing them. Zelenskiy received 1.5 times more negative and neutral comments than Poroshenko. However, he also received five times more positive bot comments than the ex-president (21% of bot comments about Zelenskiy were positive versus 4% about Poroshenko).

Bots largely ignore other politicians on these pages. They mention Vakarchuk on only three pages (Segodnya, Politeka and UNIAN), and mostly negatively: 77% of all bot comments about Vakarchuk were negative.

Bots on these pages rarely mention Yulia Tymoshenko (only 4.4% of bot comments are about her), but when they do, it is mostly in a good light: 58% of bot comments about Tymoshenko are positive. The biggest share of such comments is on the Ukrainska Pravda page, and they are favorable (1.7% of all bot comments on that page are positive comments about Tymoshenko).

Bots on these media pages almost never write about Volodymyr Groysman. 

Bots in the comments under popular news articles

Bots leave comments on media Facebook pages to attract attention, change perceptions and provoke emotional reactions. Which news did bots try to draw attention to? How did they react to the most explosive news articles, the ones with the most comments from real users?

The most discussed articles on media pages from May to July 2019 were sociopolitical: the LGBT+ pride march in Kyiv and Russia’s return to PACE (Fig. 4 and 5).

The share of bot comments was highest under political news articles: Groysman’s declarations, Poroshenko’s rally and the televised argument between Tymoshenko and Groysman (we wrote about them in a previous article).

We divided bot comments under the most popular news into messages. A message is a general idea bots are trying to push; each message consists of comments with different wording but the same meaning. We describe only the messages that gathered more than 50 bot comments.
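
Purely for illustration, the thresholding step can be sketched in a few lines of Python; the assignment of comments to messages is taken as given here (the grouping approach itself is described in the methodology section at the end of the article), and the counts simply reuse the figures reported below for the pride news.

  from collections import Counter

  # Hypothetical input: the message label already assigned to each bot comment.
  comment_messages = (
      ["I don't support LGBT+"] * 180
      + ["Against the march of LGBT+ servicemen"] * 150
      + ["Nothing against LGBT+, but a pride march is too much"] * 52
      + ["Some rare message"] * 12
  )

  # Count comments per message and keep only messages with more than 50 bot comments.
  message_counts = Counter(comment_messages)
  main_messages = {msg: n for msg, n in message_counts.items() if n > 50}
  print(main_messages)  # the three large messages survive; "Some rare message" is dropped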

Fig. 4. Main bot messages under the news about the LGBT+ pride march in Kyiv (more than 50 comments per message)

The article about the LGBT+ pride march in Kyiv was the most popular. It received 21,640 comments: 15,954 written by real people and 5,686 (26%) written by bots. We identified main messages for 635 of these bot comments, a small number compared with other top news items, where bot comments were more uniform and centralized. All of these comments were negative, with varying levels of aggression. The most common message was “I don’t support LGBT+” (180 comments out of 635); another 150 comments were against the march of LGBT+ servicemen. Third place went to the message “Sexual orientation is a personal issue, I have nothing against LGBT+, but a pride march is too much”, with 52 such comments.

There were also bot comments about the sexual orientation of the government in general or of individual politicians, namely Oleh Lyashko, Volodymyr Zelenskiy, Petro Poroshenko and Ihor Smeshko.

Fig. 5. Bot messages under the news about Russia’s return to PACE (more than 50 comments per message)

The news about Russia’s return to PACE came second in terms of public reaction. It also had 26% bot comments (2,781 bot comments versus 8,078 comments from real people). We identified the main message for 37.5% (1,079) of the bot comments. These mostly criticized Europe (312 comments out of 1,079). The most frequent messages were “Europe is corrupt” (200 comments), “We need to change the PACE delegation!” (158 comments) and “We need to quit PACE and tell Europe off” (112 comments).

Under this news article, bots actively criticized the work of the current president (217 comments) and the ex-president (90 comments). They also left the rather unexpected comment “Theresa May is a political corpse” (28 comments) and posts expressing hatred towards Iryna Heraschenko (10 comments).

Both news items with the highest Facebook audience response had 26% bot comments. In both cases, the bot comments were negative and used hate speech.

Conclusions:                  

  1. The largest share of bot comments (more than half) was found on the pages of lesser-known media with fewer than 200,000 Facebook subscribers: NewsFacts.com, Ukraine24 and Ukrainian News.
  2. Among the Ukrainian media with the largest coverage, RBC-Ukraine has the biggest share of bot comments (44%), followed by STRANA.UA (43%) and 112 channel (41%).
  3. Among the five media outlets Ukrainians trust, we found the largest share of comments from fake accounts on the pages of Segodnya (39.8%) and Obozrevatel (34.7%).
  4. On media pages, bots mostly spread critical and hateful messages.
  5. On the pages of trusted media, bots mostly criticize politicians. Volodymyr Zelenskiy and Petro Poroshenko get the most hate: 48.4% and 32.9% of negative bot comments respectively. There is nothing about Volodymyr Groysman and almost nothing about Svyatoslav Vakarchuk. Yulia Tymoshenko gets mostly positive comments from bots.
  6. On the pages of the media we analyzed, users most actively discussed the LGBT+ pride march in Kyiv and the news about Russia’s return to PACE. Bots actively commented on these same news articles; the number of bot comments under them is the largest, and in both cases those comments were negative. The main type of bot comment under the news about the pride march was “I don’t support LGBT+”, and under the article about PACE it was negative comments about Europe, for example “Europe is corrupt”.

How we counted

We analyzed Facebook posts of the popular Ukrainian media that bots comment on the most, from May 1 to July 8, 2019. In total, the sample included 332 Facebook pages.

How we defined bots

The use of bots is becoming more and more commonplace: chatbots announce when train tickets become available, help us pay taxes and keep up with the news. However, alongside such “official” bots, there are bots that pose as real humans. Their goal is to change the discourse and influence public opinion. To do this, they distribute false information and propaganda and write emotionally charged posts and comments (see Fig. 6 for an example).

Fig. 6. An example of a bot comment

To identify bots, Artellence developed an algorithm that analyzed public information from the Ukrainian segment of Facebook over nine months, from November 2018 to July 2019. The algorithm studied both comments and profile information. We analyzed only users who left more than ten comments on political topics. Users to whom the algorithm assigned at least a 95% probability of being a bot were labeled as bots.
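
The classifier itself is Artellence’s and is not public; purely as an illustration, here is a minimal Python sketch of the filter-and-threshold step described above, assuming each user already has a bot-probability score from some trained model (all names and numbers are hypothetical).

  from dataclasses import dataclass
  from typing import Iterable, Set

  @dataclass
  class UserProfile:
      user_id: str
      political_comments: int   # number of comments on political topics
      bot_probability: float    # score from a trained classifier (hypothetical), 0..1

  def label_bots(users: Iterable[UserProfile],
                 min_comments: int = 10,
                 threshold: float = 0.95) -> Set[str]:
      """Keep users with more than `min_comments` political comments,
      then label those at or above the probability threshold as bots."""
      eligible = (u for u in users if u.political_comments > min_comments)
      return {u.user_id for u in eligible if u.bot_probability >= threshold}

  # Example with invented numbers:
  users = [
      UserProfile("a", political_comments=25, bot_probability=0.97),  # labeled a bot
      UserProfile("b", political_comments=40, bot_probability=0.60),  # analyzed, not a bot
      UserProfile("c", political_comments=3,  bot_probability=0.99),  # too few comments, skipped
  ]
  print(label_bots(users))  # {'a'}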

In our study, bots are fake accounts that do nothing but comment on various political topics. These are mostly accounts created and managed by people (who, in this case, can be called trolls). They can write their own comments or simply copy them from a list pre-approved by their client. There are also program-controlled bots, which minimize human involvement, but in our sample those are quite rare; most accounts are managed by people.

However, the behavior of a fake account, even a human-managed one, differs starkly from that of a normal Facebook user. We identified eight common features of bots (a toy sketch of how such signals might be combined into a score follows the list):

  1. On average, bots write comments 15 times faster than real users (measured as the average gap between two consecutive comments).
  2. More than half of the bots have “comment buddies”: other fake accounts that leave comments under the same posts. Only one in eight real people (13%) does that.
  3. Bots comment on politics four times more often than real users do.
  4. On average, bots have four times fewer friends than real users.
  5. Only 4% of bots have at least one check-in at a location; for real people, the figure is 44%.
  6. Only 50% of bots have a face in their profile picture, compared with 92% of real people.
  7. Only 20% of bots have comments under their own posts, compared with 80% of real people.
  8. Only 3% of real people have no reactions under their profile picture, while among bots this figure is 43%.
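
The production model is not public, so the following is only a toy sketch of how profile signals like the eight above might be combined into a single bot score; the feature names and the equal weighting are assumptions for the example, not Artellence’s actual method.

  from typing import Dict

  # The eight suspicious signals described above; names are hypothetical.
  FEATURES = [
      "fast_commenting",            # much shorter average gap between comments
      "comment_buddies",            # co-comments with other suspected fake accounts
      "political_focus",            # unusually high share of political comments
      "few_friends",
      "no_checkins",
      "no_face_on_avatar",
      "no_replies_under_own_posts",
      "no_reactions_under_avatar",
  ]

  def bot_score(flags: Dict[str, bool]) -> float:
      """Share of suspicious signals present (equal weights, purely illustrative)."""
      return sum(flags.get(f, False) for f in FEATURES) / len(FEATURES)

  # A made-up profile showing seven of the eight signals.
  profile_flags = {f: True for f in FEATURES}
  profile_flags["no_face_on_avatar"] = False
  print(bot_score(profile_flags))  # 0.875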

How we chose the news

Disclaimer. We studied the text of the bot comments, not the principles of their creation and management. We cannot state that a bot actively commenting on a certain politician’s posts belongs to that politician; it could be a black PR method. We were also unable to determine the bots’ country of origin; some bots might have originated from other countries (including Russia).

Artellence chose the news articles using a natural language processing machine learning method: word vector analysis (NLP word vectors). The algorithm analyzed posts and merged similar ones into a single news item.

For example, in early May 2019 there was a news story about Volodymyr Groysman commenting on average salaries in Ukraine. It appeared on different websites with different wording but the same meaning: “Groysman declared the minimum wage must be doubled within the next two years”, “When should Ukrainians expect a salary raise”, “Groysman says each Ukrainian will earn $620 in 2021”, etc. The algorithm grouped all these and similar headlines into one news item.
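
As an illustration of this grouping idea, here is a small Python sketch; it substitutes TF-IDF vectors and cosine similarity for the proprietary word-vector model, and the similarity threshold and headlines are invented for the example.

  from sklearn.feature_extraction.text import TfidfVectorizer
  from sklearn.metrics.pairwise import cosine_similarity

  # Invented headlines for the example.
  headlines = [
      "Groysman declared the minimum wage must be doubled within two years",
      "Groysman says each Ukrainian will earn $620 in 2021",
      "When should Ukrainians expect a salary raise",
      "Russia returns to PACE after a five-year absence",
  ]

  vectors = TfidfVectorizer().fit_transform(headlines)
  similarity = cosine_similarity(vectors)

  # Greedy grouping: a headline joins the first group containing a similar-enough headline.
  THRESHOLD = 0.2  # arbitrary value chosen for this toy example
  groups = []      # each group is a list of headline indices
  for i in range(len(headlines)):
      for group in groups:
          if any(similarity[i, j] >= THRESHOLD for j in group):
              group.append(i)
              break
      else:
          groups.append([i])

  for group in groups:
      print([headlines[i] for i in group])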

Authors

Attention

The author doesn’t work for, consult for, own shares in or receive funding from any company or organization that would benefit from this article, and has no relevant affiliations.