Since the start of Russia's full-scale invasion, most Ukrainians, namely 76.6%, have been getting their news from social networks, with Telegram, YouTube, and Facebook the top three in popularity. These are the results of a sociological survey on Ukrainians' media consumption during the full-scale war, presented by OPORA. In addition, experts discussed self-regulation by social networks, countering disinformation, and how these challenges could be addressed in national legislation so that systemic practices spread. The discussion took place as part of the weekly online stream OPORA.Live on June 1.
According to OPORA's data analyst Robert Lorian, in May 2022 the Kyiv International Institute of Sociology (KIIS) included a question from OPORA in its quarterly Omnibus KIIS survey to learn more about media consumption since the start of the full-scale war. The survey was conducted by telephone interviews based on a random sample of mobile phone numbers on May 3-26. It covered 2,009 respondents representing the adult population of Ukraine (excluding the territories occupied since 2014 and people who went abroad after February 24, 2022). According to the data analyst, the total statistical error of the study does not exceed 2.4%. "The aim of our study was to find out how Ukrainians have consumed news in Ukraine in general over the past two months. Therefore, we asked Ukrainians three questions: 1) what sources of information they have used in the last two months to get the news; 2) how much they trust these sources, i.e., which sources of information Ukrainians trust the most; 3) we asked those who named social networks as one of their sources of information which social networks they use to get the news," Robert Lorian said.
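For context, the reported error roughly matches what the standard sampling-error formula gives for a sample of this size. Below is a minimal sanity check in Python; it assumes a simple random sample, a 95% confidence level, and the worst-case proportion of 50%, and it is only an assumption on our part that the published 2.4% bound additionally accounts for a design effect.

```python
# Rough, illustrative check of the reported sampling error.
# Assumes a simple random sample at a 95% confidence level; any design
# effect included in the published 2.4% figure is not modeled here.
import math

n = 2009   # respondents
p = 0.5    # worst-case proportion (maximizes the error)
z = 1.96   # z-score for a 95% confidence level

margin_of_error = z * math.sqrt(p * (1 - p) / n)
print(f"{margin_of_error:.1%}")  # ~2.2%, consistent with the stated bound of 2.4%
```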
Since February 24, Ukrainians have mostly used social networks - Viber, Telegram, YouTube, TikTok, Facebook, Twitter, and Instagram - to receive news (76.6%). Television and the Internet (excluding social networks) are somewhat less preferred, at 66.7% and 61.2%, while radio and print media account for 28.4% and 15.7%. As for trust in information, citizens of Ukraine trust television the most (60.5%), followed by social networks and the Internet excluding social networks (53.9% and 48.8%). 34.3% trust radio, 23% trust print media, and 2.4% found it difficult to answer; 5.2% of respondents do not trust any of these sources. Those who said they get news from social networks were also asked which ones they use. Telegram, YouTube, and Facebook were the top three in popularity, with 65.7%, 61.2%, and 57.8%, respectively. Other social networks were chosen by less than half of the respondents: Viber - 48%, Instagram - 29.1%, TikTok - 19.5%, Twitter - 8.9%; 2.4% found it hard to say, and 2% named other networks. More information about the research is available on the OPORA website.
"We were most surprised that Telegram has become the most popular source of news for the last two months for Ukrainian users. There is a lot to say about the reasons for this because Telegram is quite fast and quite convenient, the messages there are displayed in chronological order, unlike in many other social networks, messages come quite quickly, there is a search, and there is no censorship, that is, any materials are published, even those that can not be published on other platforms. As a result, the popularity of Telegram is growing every year, especially during the war. Telegram became a popular social network in Hong Kong when there was a revolution. Telegram is gaining popularity in countries where censorship was introduced, such as Iran. And we see Telegram conquering our media market during a full-scale war because it is one of the main sources of information, including media information, i.e., watching videos and photos that users cannot watch on other platforms," Robert Lorian said.
According to the data analyst, since social networks are a source of news for more than 76% of Ukrainians, they play an important role in shaping the information agenda in Ukraine and, in particular, pose a high risk of disinformation. He notes that in the two months since russia's full-scale invasion of Ukraine, some social networks have also frequently changed their self-regulation policies, such as the rules for sharing and moderating content.
According to Olha Snopok, OPORA's social media monitoring specialist, when we talk about self-regulation and the deletion of illegal content on Facebook, the content should be divided into two categories: paid advertising posts and regular publications.
When it comes to paid content, Facebook may either decline to approve an advertising post before publication or delete it after some time. In the two months since the start of the full-scale war in Ukraine, many deleted advertisements have appeared in the Ad Library. "We can conclude that Facebook is learning to react to illegal content and delete it faster. However, we recently discovered advertising posts promoting fraudulent schemes on Facebook, and it approved them. Still, as we can see, the war in Ukraine shows that it is possible to counteract illegal advertising on Facebook, and the platform is gradually learning to do so," Olha Snopok said.
According to her, when it comes to regular posts that share illegal content, Facebook has a fairly small set of tools to regulate them. "Facebook has one distinctive feature: it closely monitors coordinated inauthentic behavior and fights it quite effectively, deleting it. On other social networks, this fight is much weaker. And that is, of course, bad, because we see, for example, coordinated groups or pages on Telegram sharing illegal information," Olha Snopok said.
In addition, Facebook actively cooperates with fact-checking organizations, which has become especially important since the COVID-19 pandemic. These organizations track and label misinformation, which helps to fight it but does not solve the problem completely.
According to the social media monitoring specialist, the main tool that Facebook and other social networks rely on is users' complaints about illegal content. But this may not be very effective, as users may not know how to complain or may not want to spend the time. "Since the war started, Facebook has shown that it has several tools it actively uses. For example, to combat misinformation in our country, Facebook downranks pages or posts and restricts certain categories of users' access to certain content. But in general, I would say there is a lot of misinformation on Facebook, and today Facebook does not have established policies to counter it effectively. Extraordinary events, such as the storming of the US Capitol a year ago or the war in Ukraine, show this especially well - these are the moments when social networks actually have to come up with policies to fight it," Olha Snopok said.
Data analyst Robert Lorian also noted that since the 2016 US presidential election, when Donald Trump was elected, the public has heavily criticized Facebook for its content regulation policies, and Facebook is now trying to regulate its content. In Ukraine, such attempts at regulation can be traced in the hiding of the hashtag about the Bucha genocide and its later restoration, or in first allowing Ukrainians to call for the deaths of putin and russians and then banning such posts. "We see Facebook's local efforts to resolve such situations. They are not always successful. In general, Facebook's policy is not long-term; we do not see a strategy for moderating content and countering disinformation," Robert Lorian said.
However, in his opinion, attention must be paid to Telegram, which does not regulate anything and has now become a platform for disseminating pro-russian narratives and disinformation; see, for example, the study by Detector Media on this topic. "Facebook and Instagram have been blocked in russia, while Vkontakte and Odnoklassniki have been blocked in Ukraine since 2017. Now Telegram is becoming the only remaining shared information space with russians, where Ukrainians and russians jointly consume and jointly generate news, and where propaganda and disinformation about the other side intersect and spread," Robert Lorian said. But this platform is not very transparent, so it is unclear how to encourage it to self-regulate. In general, it is worth talking about state regulation of the media space, and of social networks in particular, especially considering that Ukrainian media legislation has not changed since 2008-2009, while the media space has transformed significantly.
Tetiana Avdieyeva, a lawyer at the Center for Democracy and the Rule of Law and the Independent Media Council, described how the deletion of banned content on social media currently works and why it is unlikely to change. According to her, there are two ways content gets deleted on social media: 1) in response to complaints from users; 2) in response to notifications from the state, when government agencies send communications about the requirements of national legislation and "geographical" blocking is applied. This happened in Ukraine, for example, with regard to communist symbols: the relevant communication was sent stating that communist symbols are banned in Ukraine, and the corresponding content was blocked. Accordingly, such content is unavailable throughout the "Ukraine" region.
"If we talk about the proactive response of the networks themselves, then there is a problem. As soon as social networks begin moderation without a complaint procedure, they become responsible for the content published on the platform," Tetiana Avdieyeva said. And since social networks are, first and foremost, business, it is unlikely that businesses will want to be responsible for content. Especially in cases of armed conflicts, genocides, mass riots, etc. As an example, the expert recalls the genocide in Myanmar. An independent fact-finding mission said in its report that Facebook had done a lot to make the genocide happen. The same goes for the riots in the US Capitol. "We have a complex problem. On the one hand, social networks are very influential. On the other hand, they do not want to be responsible because they understand that they amplify what usually happens in society. And when it comes to state regulation, it seems to me that it is currently impossible to define misinformation as a category for regulation for social networks or to define specific regulation for social networks at the national level," Tetiana Avdieyeva said.
According to her, the Digital Services Act now offers specific regulation of social networks in the European region, but this proposed regulation is constantly being updated. If national governments actively adopt their own legislation now, they will likely face the problem of having to adapt it to the European rules later. This is one of the reasons why regulation of social networks and shared-access platforms is moving slowly. Ukrainian media legislation has not been updated since 2008. A draft law currently sits "in the drawer" and, according to the expert, is unfortunately not considered relevant enough to appear on the parliamentary agenda. Therefore, social networks operate at the level of self-regulation or are regulated by the legislation of the states where they have offices. For example, Facebook's American office is located in California, so it complies with local law, while Facebook's European office is located in Ireland, so it complies with local and EU legislation.
OPORA's data analyst Robert Lorian notes that another way content is regulated on these platforms is artificial intelligence: much content is moderated by automated programs that determine whether it is acceptable for publication. The question is how transparent these moderation algorithms are, particularly for content in different languages, since artificial intelligence handles dialects poorly and the number of social network employees who moderate content is insufficient. In his opinion, opening local offices would solve some of these problems: offices would be able to operate within Ukraine's media legal field and comply with its legislation, giving them wider space for interaction and cooperation, and Ukrainians would be more involved in regulating content.
"The problem with artificial intelligence is not only the knowledge of dialects but the development of a common regulatory framework. The problem is that social networks initially tried to develop a common regulation for all with some exceptions. For example, we have a platform you can distribute anything except - and then the list goes on. For example: except for hate speech, calls for violence, child pornography, copyright infringement, etc.," Tetiana Avdieyeva said. In the past, they interpreted these exceptions, such as "calls for violence", equally for all regions. Then, particularly in Ukraine, they realized it was impossible to apply the same regulations and tried to be specific to the context of the armed conflict. But, in her opinion, the problem is that the rules remain general, and the context is specific. For example, people understand that the word "rusnia" for Ukrainians is a way, rather, to express feelings, but algorithms do not have time to update to this situation. That's why the question is not so much in the knowledge of the language for algorithms as in the inability to adapt to a specific context.
Tetiana Avdieyeva also spoke about state restrictions on social networks and messengers. The best-known example is Vkontakte: along with blocking access, there was also a request to the App Store and Google Play to block the ability to download the application from Ukraine. "This reduces the number of users very significantly. In fact, we cannot really block any social network as such. By social network, I mean both YouTube or TikTok, where video is distributed, and messengers with a public-channels function. The question is how to limit their impact, especially when this impact is negative. And in these cases, everything generally depends on the audience - on the number of people who use these messengers," Tetiana Avdieyeva said.
According to the expert, another interesting issue is how the networks operate financially. When we talk about Meta, TikTok, and Twitter, it is clear that their business model is based on advertising. With private messengers without advertising (Signal, Viber, Telegram), it is not clear. Telegram in particular raises questions. How does this network work without advertising? Who pays the salaries of the moderators, given that there is a function for filing complaints about content, and of the people responsible for technical, legal, and other support? It is registered in the UAE, but where exactly are the servers located? Who has access to them? Could the personal data of users from Ukraine leak, in particular to the aggressor state? And so on. "How should we react if a social network is not ready to cooperate? Even democracies like Germany have already proposed outright blocking. The issue is not permanently blocking the social network as such, because we all realize that freedom of speech should prevail. The question is how to encourage social networks to communicate at all, i.e., to cooperate with the state, especially when we have such a specific context," Tetiana Avdieyeva said. This was the case with Vkontakte - it was impossible to reach an agreement and establish communication with that social network. The situation with Telegram may turn out much the same if it does not respond to requests from the Government of Ukraine.
According to the expert, the common means of counteraction are media literacy, reporting (complaining about) violations, choosing communication apps more carefully (digital literacy), and being careful with the personal data and information one shares there.
Tetiana Avdieyeva says she favors a model that combines national legislation with co-regulation. At the level of legislation, it is possible to establish definitions and basic responsibilities: for example, that personal data of children under 12 cannot be used for ad targeting, or that hate speech must be removed within 24 hours - and within 12 hours during a state of emergency or martial law. It is also possible to regulate responses to state requests: whether personal data may be transferred abroad, especially to countries where human rights are violated, and how artificial intelligence may be used both to delete and to prioritize content. "In general, I like how the EU Digital Services Act looks now. Given that Ukraine will have to harmonize its national legislation with European legislation anyway, it would be good to take it as a general framework for regulating social networks, shared-access platforms, messengers, etc. On top of that, we could think about more contextual regulation - in our case, how things should be regulated under martial law and a state of emergency. As for content restrictions and misinformation, it seems to me that they should not be regulated by the same law as social networks," Tetiana Avdieyeva said.
According to the expert, the point is to regulate not political memes but false information that leads to negative consequences, such as incitement to hatred - that is, to counter content that in itself constitutes an abuse of freedom of expression. She believes that for such cases it is necessary to develop an adequate response that is proactive rather than reactive: one that does not kick in only after the information aggression is over but allows a reaction while it is still under way. The expert also emphasizes that this path will be very difficult for Ukraine, since most international organizations oppose regulating misinformation. Therefore, Ukraine will have to advocate the idea that misinformation is dangerous and incorporate it carefully into its legislation.
OPORA's data analyst Robert Lorian agreed that the European approach to regulating social networks is the most sound, comprehensive, and relevant to Ukraine's realities, but he expects that implementing it in Ukraine's legislative field will be difficult and lengthy. As for the Government of Ukraine's proactive stance on countering disinformation, he considers blocking the profiles of russian state media and of influencers spreading russian narratives a good step.
Tetiana Avdieyeva also described the state institutions that are now, during the full-scale invasion, trying to resist russian disinformation. There is the Center for Strategic Communications under the Ministry of Culture and Information Policy, which deals not with security issues but rather with misinformation and narratives - the content of the messages the russian side distributes regularly. There is also the Center for Countering Disinformation under the National Security and Defense Council, which deals with security issues. And there is the National Security and Defense Council itself, which deals with sanctions and the distribution of specific resources.
Tetiana Avdieyeva believes this system works quite well: it operates both at the level of state measures and at the level of education. But there is still a need to cooperate actively with stakeholders and build communication, and to work on blocking propaganda resources not only in Ukraine. After all, blocking the profiles of russian influencers in Ukraine is positive, but those profiles still carry russian narratives about Ukraine around the world, including to partner countries. The expert says the main task of government agencies in this situation is to communicate adequately: with the social networks themselves, with foreign media by disseminating reliable information to them, and with consumers of information. This system must be comprehensive. "It will not be possible to simply block a social network (the example of Vkontakte has clearly shown this, by the way) and declare that we have won the information war. No single measure will help on its own; there must be a systematic approach: education, the imposition of sanctions and blocking where it is technically and legally possible, and communication in general," Tetiana Avdieyeva said.
According to the expert, not every nuance can be resolved at the state level alone, as there are too many of them, so the public sector is also involved. NGOs communicate with social networks, promptly report cases of violations, and offer ready-made solutions. Fact-checking organizations also cooperate with social networks; there are two in Ukraine, StopFake and VoxCheck, which are certified by Facebook, reliable, and whose reports are heeded. Tetiana Avdieyeva believes that the state structures working in this direction perform well, but coordination of efforts and more active cooperation with the public sector are needed.
OPORA's data analyst Robert Lorian agreed that Ukraine faces challenges in protecting its information space and combating russian narratives internationally. But in closing, he urged everyone to start with themselves and consume information consciously.