Jakub Kalenský, Atlantic Council: European influencers repeat the Kremlin's typical lies, giving them new legitimacy
Read this in Russian here.
In July, the U.S. House Subcommittee on Europe, Eurasia, Energy, and the Environment held the hearing “Russian Disinformation Attacks on Elections: Lessons from Europe”. Among those who testified was Jakub Kalenský, an expert at the Atlantic Council. He said, in particular, that between 2014 and 2019, researchers and journalists identified pro-Kremlin disinformation campaigns in 16 elections and referenda across Europe, including in Ukraine.
“I cannot guarantee that this list is exhaustive. But just from this brief overview, we can see that pro-Kremlin disinformation activity is definitely not becoming less aggressive,” Mr Kalenský said in his speech.
Detector Media spoke with him to find out the details.
– Jakub, what was the aim of the testimony? Who invited you? Who else took part?
– I was invited by Chairman Keating’s office about two weeks before the hearing. Members of the Subcommittee wanted to examine how the Kremlin’s disinformation works in Europe, particularly its influence on elections, and also to discuss how various European governments respond to the threat.
Apart from me, Ambassador Dan Fried from the Atlantic Council, the Finnish investigative journalist Jessikka Aro, and Dr Frederick Kagan from the American Enterprise Institute testified at the hearing.
– You listed elections during which pro-Kremlin disinformation campaigns were identified. Ukraine was mentioned twice: in 2014 and in 2019. How did the disinformation campaigns in our country, and their results, differ between 2014 and 2019? Were the Ukrainian government, civil society, or both more effective in countering these campaigns in 2019?
– As we mentioned in the recent report of the Ukrainian Election Task Force (it was launched in December 2018 for the period of the presidential campaign in Ukraine by the Atlantic Council, the Transatlantic Commission on Election Integrity, and the Victor Pinchuk Foundation in partnership with Detector Media, StopFake, and the Razumkov Center. – DM), there were significant improvements in 2019 compared to 2014, for example in the cyber domain. Whereas in 2014 there was a serious attack against the Central Election Commission, in 2019 Ukraine seemed already much better prepared, and we did not see similar attacks.
As for disinformation, I do believe several improvements were made in Ukraine. There are multiple projects focusing on various aspects of disinformation, information manipulation, and influence operations. I think, and I hope, that Ukrainian civil society is now much better prepared than in 2014, when the post-revolutionary situation obviously caused some difficulties. Initiatives like Ukraine World, the Ukrainian Crisis Media Centre, your own Detector Media, the well-known StopFake, or Texty.org.ua – and I am sure I am omitting other important examples, because there are so many – are all brilliant examples of the strength of Ukrainian civil society. The West has a lot to learn from Ukraine about investigating, researching, exposing, and countering disinformation.
However, five years of the disinformation campaign also mean that some of the narratives have managed to take deep root in the information space: they have found their audience, they get repeated and multiplied, and the original source of the disinformation gets laundered. I do not have proper data for this, but my feeling is that today, more than in, say, 2015, you can hear Ukrainian voices doubting whether the Maidan revolution was a step in the right direction, and more voices drawing a false parallel between what the West is doing and what Russia is doing. I perfectly understand that we cannot blame the Kremlin alone for the rise of these voices – but I am also sure that the Kremlin is trying hard to make these voices stronger, and it is happy to see that.
On the other hand, I repeat that the feeling that these voices are becoming stronger is just my impression, and I will be more than happy if that impression is wrong.
As I also highlighted in Congress, I fear that you would see this pattern – audiences becoming used to disinformation, or even “domestic” sources endorsing it – in many European countries.
– In how many of the elections on your list was Russia successful with its disinformation campaigns (meaning the result of the election was the one Russia desired)?
– Measuring the success of a disinformation campaign is always one of the hardest disciplines, and the Kremlin and its useful idiots frequently abuse this fact: “Yeah, maybe there is some Russian propaganda, but show me that it actually has an impact!” This particular tactic is very similar to the old KGB approach, as described by defectors from the Soviet bloc.
It is hard to separate the effect of the Kremlin’s campaign from the effect of other actors and events pushing in the same direction. If, for example, a particular politician is attacked by the Kremlin’s disinformation machine on a daily basis and also causes a scandal himself that costs him some votes, it is truly very hard to measure which of those circumstances played the bigger role.
However, if we focus purely on the outcome and look at whether the elections ended the way the Kremlin wanted, I think we can identify quite a few very clear wins: the Dutch referendum in 2016, the Brexit referendum, the US elections and the Italian constitutional referendum the same year, and the Czech presidential and Italian parliamentary elections in 2018.
There are also a few disputable results. The 2017 elections in Germany are not usually regarded as a success of the Kremlin’s information aggression; however, I am afraid they were. Moscow aimed to weaken Angela Merkel and support the extremists from the AfD. Merkel’s party scored its worst result since 1949, and the AfD received more votes than most opinion polls had predicted.
Am I capable of distinguishing to what degree the CDU/CSU’s and AfD’s results were influenced by the Kremlin’s influence operations, and to what degree by their own campaigns and mistakes? No, I am not. But was the Kremlin aiming its influence operations in this direction? Definitely yes. Would the result have been the same without the Kremlin’s information aggression? I am pretty sure it would not.
And then there are results that we usually consider a failure of the Kremlin’s disinformation campaign, like the Austrian presidential elections in 2016 and the French presidential elections a year later. In both cases, the candidate favoured by the Kremlin lost. However, do we know to what extent the Kremlin’s information aggression mattered and how it altered the result? Timothy Snyder argues in The Road to Unfreedom that Le Pen and Hofer might not have won, but they performed far better than they would have before Russia’s campaign against the West started – and I have a feeling he is right.
I would like to see more investigations similar to the one by Kathleen Hall Jamieson – breaking down which messages were planted and amplified by the Kremlin’s disinformation machine, and measuring what impact they had on the election campaign and on the polling of particular candidates. So far, it is the most thorough work I have seen on measuring the impact of the Kremlin’s aggression on an election – and I am sad that in Europe, where we have far more experience with the Kremlin’s hostilities, we do not see similar investigations.
– Do you see any new tactics or tools from Russia’s side at the moment?
– What worries me most is how the Kremlin’s disinformation penetrates domestic, mainstream sources. We have seen in Ukraine and in Germany how Russian actors deliberately reach out to domestic actors who will spread the Kremlin’s lies for them. We have seen in the Netherlands, in Serbia, and in many other countries how European influencers frequently repeat the Kremlin’s typical lies, giving them new legitimacy. We have seen a similar pattern in Belarus. Some of the Kremlin’s most typical lies about the language law in Ukraine, or about George Soros, are often endorsed even by the Hungarian authorities.
In my opinion, this is one of the biggest successes of the Kremlin’s disinformation campaign – that more and more actors are parroting the Kremlin’s lies, giving them new legitimacy and spreading them among new audiences. And given how various actors are focusing more on Russian activities, I believe we will see more of these efforts to hide the Russian trail and pose as a “purely domestic” problem.
– You have mentioned four directions of defense: documenting the threat, raising awareness, mitigating the weaknesses that the aggressor exploits, and punishing the aggressor. Which is the easiest and which is the most difficult?
– The easiest is surely documenting the threat. Basically, any media outlet or NGO can begin right now: just monitoring Russian TV or local Russian and pro-Russian channels and identifying the lies spread there would already yield new information.
It is easy, but it is also the necessary first step – if we do not document the information aggression, we cannot properly investigate it. If we do not measure how many information attacks there are on a daily basis, we cannot talk about an increase or decrease of the disinformation campaign, or about a change of focus. Systematically gathering the data is the necessary first step towards talking about hard data rather than impressions. This is precisely why we started this project with my previous team in Brussels (the East StratCom Task Force. – DM) – and we need the EUvsDisinfo database to be even more comprehensive, for which the team would need significantly more resources.
The most difficult is punishing the information aggressors. Whenever any state or non-state actor tries to do something about the Kremlin’s disinformation campaign, the Kremlin and its useful idiots immediately launch attacks about freedom of speech and the like – just see the Kremlin’s trolling reaction to the British decision to ban Russian disinformation organisations from a UK event. You would see similar attacks everywhere in Europe.
It is necessary to keep repeating that this is not about creating new rules; it is about enforcing the existing rules with regard to these aggressors as well. We have laws against spreading hatred, against incitement to war and violence, against defamation. In some countries, it is illegal to spread false alarms and cause panic. It is simply about applying these rules to the pseudomedia that keep violating them. This is about the rule of law. Freedom of speech does not mean freedom to lie, deceive, manipulate, and denigrate – which is exactly what the Kremlin’s disinformation machine is doing.
We should also sanction those individuals and organisations that deliberately spread lies, hatred, manipulation, and fake news. Sanctioning the Russian TV channels involved in exactly these activities would also mean they lose advertising money from Western companies – and that would hurt them a lot.
– In Ukraine, there is a discussion about establishing a new state TV channel: Russian-language, Ukrainian-centric, with entertainment content produced in Ukraine, aimed at the Russian-speaking audience all over the world. What would you say about this idea?
– I always say that we need as many solutions as possible, as quickly as possible. We need to see more from governments, from intergovernmental organisations like the EU and NATO, from the media and civil society, and from the social media platforms. I welcome every new initiative that tries to do something against the Kremlin’s information aggression – because inactivity is more dangerous than an activity that might not be perfect.
However, we cannot expect this particular measure to solve all the problems on its own. There will always be some part of the audience that prefers other Russian-language channels. The experience of Estonia, which also launched a Russian-language channel for its Russian-speaking minority, indicates that it is very hard to create a channel that can compete with the well-funded giants of the Russian Federation. And most people will watch the channel that offers them the most attractive Champions League game, the most attractive boxing match, the latest Hollywood film, and the most discussed new series – this is what attracts audiences, and the news is usually just filler.
With this in mind, it might be smart to consider whether the money would not be better spent supporting projects that have already built a Russian-speaking viewership and readership and that have a track record of reaching this particular audience. But this is for Ukrainians to decide; I do not have all the necessary details.
As mentioned before, I outlined four different areas of possible solutions in the testimony. This particular one, a new channel for a Russian-speaking audience, could be part of the third, broader area: mitigating the weaknesses of the information space that the information aggressor exploits. It is necessary to perceive it as such and not to forget that it does not solve 100 per cent of the problem; there is much more that needs to be done.
– From your point of view, what is the best approach to preventing the dissemination of disinformation messages on social networks?
– If we finally decided to punish the information aggressors, it would also be harder for them to spread their disinformation on social media. So I would say the best approach is the one we are not yet taking – sanctioning and punishing the Kremlin’s disinformation instruments and the individuals helping the disinformation campaign. Ideally, the sanctions should also forbid the social media platforms from cooperating with the identified individuals and companies, so that they are prevented from carrying out their malicious actions.
However, before we get to the point where decision-makers finally realise how serious the situation is and what needs to be done, there is still a lot that can be done at the non-governmental level. The social media platforms should work with experienced NGOs and journalists in this field to identify channels notorious for spreading disinformation. Such channels should not be promoted and recommended by the companies. In 2019, it is inexcusable that YouTube recommends videos by well-known liars and manipulators to its users – that is either striking ignorance and incompetence, or deliberate collaboration with the West’s enemy.
Articles from notorious disinformers can be pushed to lower positions in search results. In the social media feed and in search results, they could appear with a warning similar to the ones we have on tobacco products: “Dear consumer, be aware that this article comes from a site notorious for spreading disinformation. It can seriously harm your mental health and distort your perception of reality”. Then we would also hopefully see at least some companies stop buying ads on these channels – who would want to financially support a product identified as harmful to your health?
– Do you know why Russia’s Prosecutor General’s Office designated the Atlantic Council an “undesirable organization”, making its activities in Russia illegal? How did it affect the opportunities and conditions of your work?
– Let me refer to the statement of the Atlantic Council’s President Frederick Kempe. As far as I know, we have not been informed about the reasons for this decision. As for the implications, we are still figuring them out.
Photo: Jakub Kalenský