Facts not Fakes: Tackling Disinformation, Strengthening Information Integrity
3. Fostering societal resilience to disinformation
Abstract
This chapter presents policies and practices for a multi-stakeholder approach to information integrity. It discusses efforts to equip the public with the skills to navigate the evolving information environment with a discerning and critical approach, to facilitate the search for consensus through media and information literacy, and to support the necessary evolution of the role of public communication. The chapter also explores the importance of strengthening participatory measures to inform the policy-making process in this space.
3.1. Introduction
Countering disinformation and strengthening information integrity require concerted efforts to build societal resilience. Broadly, resilience is about addressing the root causes of crises while strengthening the capacities and resources of a system to cope with risks, stresses, and shocks (OECD, 2023[1]). Applied to tackling disinformation and strengthening information integrity, resilience refers to a society’s ability to understand, resist and recover from threats within the information space. Indeed, several countries have situated societal resilience to information threats as part of building a comprehensive or total defence system, in which every individual and organisation should play a role, including as checks and balances in the overall information ecosystem.
On the one hand, therefore, individuals need skills and knowledge to navigate the information space effectively and responsibly. Government investments in digital, media and information literacy – and efforts to help ensure private companies actively contribute to societal resilience efforts – are important means to prepare and inoculate people against false and misleading content. According to PISA (Programme for International Student Assessment) results, in 2018 only 47% of 15-year-old students across OECD countries reported that they were taught at school how to detect whether information is subjective or biased (OECD, 2021[2]). A person who can navigate the information space responsibly will likely be better able to critically assess the content they encounter, find higher-quality sources, identify biases, and make well-informed decisions.
Furthermore, developing a public communication function removed from politicised goals to serve as a source for accurate and relevant information, and that is responsive to citizens in the service of the common good, is an important tool to build societal resilience. More broadly, the value of access to information as a key safeguard for democracy has become more evident in the past years. Various crises, ranging from financial to health to defence, have increased the need and demand for accurate information from government itself (OECD, 2022[3]).
At the same time, fostering resilience to disinformation will require governments to strengthen public engagement mechanisms on topics related to information integrity as part of the larger undertaking to reinforce democracy and build trust. Engagement with the public and non-governmental stakeholders should ultimately be guided by efforts to protect and strengthen civic space to foster more open, transparent, and accountable governance (OECD, 2022[3]). Expanding research and understanding of the information space (namely, the convergence of the public, communication technologies, amplification algorithms, and content), and ensuring the findings inform the policymaking process, will also be essential contributions (Wanless and Shapiro, 2022[4]). Governments should therefore focus on expanding the competencies, resources, and reach of efforts in this space to facilitate participation and understanding across all segments of society.
Together, these efforts compose what is often referred to as a whole-of-society approach. An effective whole-of-society approach also requires protecting the human rights of those targeted by disinformation, promoting civic education, and clarifying processes, expected outcomes, and mechanisms both to mitigate potential risks and to take full advantage of the opportunities to engage with the public and non-governmental stakeholders. For example, the Netherlands’ 2022 government-wide strategy for effectively tackling disinformation explicitly mentions the role of civil society and academics in fighting disinformation (Government of Netherlands, 2022[5]). Similarly, the 2023 Latvian counter-disinformation programme stresses the importance of government co-operation with stakeholders across society. The 2022 updated EU Code of Practice on Disinformation, furthermore, defines stronger and formalised roles for the fact-checking community, and the EU Digital Services Act creates obligations for online platforms and search engines to co-operate with fact checkers in the framework of the Code of Practice (European Union, 2022[6]).
To reinforce societal resilience against the risks of mis- and disinformation and implement a whole-of-society approach, government efforts should focus on:
Strengthening media, information, and digital literacy skills
Helping ensure the public is well informed via proactive public communication efforts removed from politicised goals, and
Strengthening public participation in information integrity efforts and building understanding of the information space.
3.2. Media, information, and digital literacy is essential to developing a systemic approach to building societal resilience
A long-term and systemic effort to building societal resilience to the challenges posed by disinformation involves building media, digital, and information literacy to help ensure the public can participate in the information environment with a discerning view and critical approach. There are several definitions of what media, digital, and information literacy includes. For example, the EU's Audiovisual Media Services Directive (AVMSD) stipulates that media literacy refers to skills, knowledge and understanding that allow citizens to use media effectively and safely. Beyond learning about specific tools, technologies, and threats, media literacy more broadly aims to equip individuals with the critical thinking skills required to exercise judgment, analyse complex realities, and recognise the difference between opinion and fact (European Union, 2018[7]). The UK’s independent communications regulator, Ofcom, defines media literacy as “the ability to use, understand and create media and communications in a variety of contexts” (Ofcom, 2023[8]). UNESCO defines media and information literacy (MIL) as an effort to “empower people to think critically about information and use of digital tools. It helps people make informed choices about how they participate in peace building, equality, freedom of expression, dialogue, access to information, and sustainable development” (UNESCO, 2023[9]). Digital literacy, furthermore, focuses on the competencies needed to live and work in a society where communication and access to information increasingly takes place through digital technologies (OECD, 2022[10]).
A comprehensive understanding of media, information, and digital literacy focuses on the public’s skills related to accessing, analysing, evaluating, and creating content in a variety of contexts (Hill, 2022[11]). This range of skills includes both understanding the creation and distribution process, as well as developing the ability to take a critical approach to evaluating information reliability. Governments largely recognise the importance of building media and information literacy skills. Within Europe, the EU's Audiovisual Media Services Directive (AVMSD) (European Union, 2018[7]), which governs EU-wide co-ordination of national legislation on all audio-visual media, includes specific provisions requiring Member States to promote media literacy skills and to report on these actions, and obliges media service providers and video-sharing platforms to promote the development of media literacy and raise awareness of available media and digital literacy tools (European Commission, 2023[12]). The European Regulators Group for Audiovisual Media Services, furthermore, is tasked with exchanging experience and best practices on the application of the regulatory framework for audiovisual media services, including on accessibility and media literacy. As of 2022, in the United States, 18 states have passed legislation requiring education agencies to develop and include media literacy curricula in schools (Media literacy now, 2022[13]).
Taken together, governments should prioritise the following elements when considering how media and information literacy initiatives best fit into broader efforts to build societal resilience:
Media and information literacy initiatives should be seen as part of a wider effort to reinforce information integrity, including by incorporating such efforts into official curricula and reaching individuals of all ages in relevant efforts
Pro-active public communication efforts, or “pre-bunking,” can be useful media and information literacy tools to help build societal resilience
The impact of media and information literacy activities should be assessed and measured.
3.2.1. Media and information literacy initiatives should be seen as part of a wider effort to reinforce information integrity
Media, information, and digital literacy initiatives largely aim to give people the tools to make conscious choices online, identify what is trustworthy, and understand platforms’ systems in order to use them for their own benefit (Forum on Information and Democracy, 2023[14]). Media and information literacy should be part of a larger approach to building digital literacy (for example, by addressing how algorithmic recommendation systems and generative AI work) and civic education (for example, by teaching the importance of democratic principles and processes), reaching school-aged individuals as well as adults and seniors.
Ultimately, media literacy initiatives are most relevant to the extent that they reinforce broader objectives related to strengthening information integrity. For example, Portugal’s National Plan for Media Literacy highlights that media literacy is a fundamental element for the defence of freedom of expression and information, and is essential to enabling democratic participation and the “realisation of economic, social, cultural and environmental rights (Government of Portugal, 2017[15]).” A notable component of Finland’s approach, furthermore, is that their focus on media literacy has long been perceived as part of a wider effort to build societal resilience to disinformation. Media education initiatives have been present in Finnish schools since the 1950s, and the country has focused its media education efforts on promoting people’s willingness and ability to consume, use and share information in a responsible way, and, ultimately, contribute to citizens’ active participation in society (see Box 3.1).
Box 3.1. Media literacy in Finland
Finland’s approach to media literacy is outlined in the National Media Education Policy, published by the Ministry of Education and Culture in 2019, in collaboration with the National Audiovisual Institute (KAVI). The promotion of media literacy is a cross-cutting activity for the Ministry of Education and Culture and has expanded to cover other areas of society and administration.
The 2019 National Media Education Policy continues a decades-long effort to promote democratic participation and reduce polarisation in Finnish society. While the first media education curriculum was introduced in Finnish schools in 2004 through an action plan addressing violence in the media and media education, media education initiatives have been present in Finnish schools since the 1950s.
Today, the concepts of misinformation and disinformation are part of student coursework, including the study of famous propaganda campaigns, advertising, and tactics for using misleading statistics. As part of the curriculum, students create their own messages and multi-media products on different topics to share with their peers for comment and analysis.
Finnish media education involves a range of actors in developing media education plans, including civil society organisations, schools, libraries, NGOs and universities. Finland also promotes media literacy following European Union guidance, such as the Audiovisual Media Services Directive (EU 2018/1808) and the Communication from the Commission on Tackling Online Disinformation. The National Audiovisual Institute, in co-operation with the Ministry of Education and Culture, is responsible for evaluating the implementation of the action plan.
Source: Government of Finland (2019[16]), Media Literacy in Finland: National Media Education Policy, Ministry of Education and Culture, https://medialukutaitosuomessa.fi/mediaeducationpolicy.pdf.
In some OECD Member countries, media and information literacy is centrally co-ordinated, for example by the National Audio-visual Institute, KAVI, in Finland; the Centre de liaison de l'enseignement et des médias d'information, CLEMI, in France (see Box 3.2); or the National Media Regulatory Authority (ALIA) in Luxembourg, which co-ordinates media literacy activities with relevant national and European stakeholders. In Portugal, the Regulatory Authority for the Media has helped facilitate media literacy by mapping the range of existing interventions to promote and develop this space in the country (Portuguese Regulatory Authority for the Media, 2023[17]). In other countries, the responsibilities are spread across different institutions, such as ministries of education, other line ministries or national regulatory authorities.
The most common approach is for countries to provide media literacy within schools (see the example from Estonia in Box 3.3), either via a separate curriculum specifically devoted to media and information literacy or included within other topics (for example, language, mathematics, history, citizenship). In Portugal, for example, the curriculum integrates media literacy via citizenship and information and communication technology sections. The country’s Guidelines for Media Education (Referencial para a Educação para os Media), updated in December 2023, underline that media literacy is interdisciplinary and should be reinforced across learning areas, as well as through projects with the National Network of School Libraries and with external organisations.
Box 3.2. The “CLEMI”: France’s centre to promote and co-ordinate media and information literacy activities
In France, the CLEMI (Le centre pour l'éducation aux médias et à l'information) is in charge of media and information literacy throughout the French education system. Created in 1983, the CLEMI has as its mission to promote, both nationally and in France’s “académies”, the pluralistic use of information tools in education to foster a better understanding by students of the world around them while developing their critical thinking skills.
Its objectives are the following:
Training teachers and teaching pupils how to use media responsibly, whatever the information or communication medium (written press, audiovisual, Internet, social networks).
Producing or co-producing teaching resources and tools on all media to support teachers and pupils by offering MIL activities for the classroom.
Helping to create and develop school media (newspapers, websites, blogs, web radio, web TV).
Supporting families by producing and distributing media and information education tools for all.
Since the official text of January 24, 2022 (circulaire du 24-1-2022), regarding the mainstreaming of media and information literacy (MIL) in France, the CLEMI collaborates closely with the French Ministry of Education and Youth. Together, they oversee a network of 30 academic focal points, each tasked with leading cells that unite all inspection bodies and academic delegations. CLEMI's initiatives are supported by a national team of 22 individuals, a network of 200 local academic co-ordinators, and numerous media partners, all contributing to the development of projects for schools.
Source: CLEMI (n.d.[18]), CLEMI website, https://www.clemi.fr/fr/qui-sommes-nous.html.
Box 3.3. Estonia’s “Media and Manipulation” course in the high-school curriculum
Since 2010, Estonia has included a compulsory “Media and Manipulation” course in the high school curriculum. The goals of the course, which comprises 35 academic hours, are that by the end of it students can:
Understand the modern information environment and the processes that shape its development and explain the nature of communication and the conditions for its occurrence.
Identify arguments and basic persuasion techniques in media texts and explain the author's objectives and motives.
Distinguish between facts and opinions, assess the reliability of information, including changes in the meaning of translated information.
Critically analyse advertising and discuss advertising and branding topics.
Understand media channels, analyse their characteristics, and describe different media genres.
Analyse the differences between direct and mediated communication and the intentions of participants.
Critically evaluate media manipulation, and recognise propaganda, fake news, and myth-making.
Express their opinion on what they have read, heard, and seen and choose appropriate language tools for this purpose.
Critically analyse their media behaviour, including on social media, and adjust it accordingly to the situation.
Find references and clues to other texts, interpret text, and distinguish between private and public information.
Source: Data provided by the Estonian government.
OECD countries also produce manuals and guidebooks on understanding and counteracting the threat of mis- and disinformation. These are distributed on official websites and in print, to be shared with schools and public libraries. For example, in 2022, the Latvian State Chancellery published a digital book entitled “Handbook against disinformation: recognise and oppose” (Rokasgrāmata pret dezinformāciju: atpazīt un pretoties)1. The manual summarises practical recommendations for state administration and local government workers, as well as all Latvian residents, to address information manipulation. The manual is distributed to libraries throughout the country. The Ministry of Interior in the Netherlands, for its part, finances the creation and operations of the website “Is that really so?”,2 which teaches the population how to identify mis- and disinformation.
Media and information literacy activities are often developed and implemented in partnership with a wide range of civil society organisations. The tendency toward this whole-of-society approach is borne out by the number of CSOs, media and other organisations working in this field. For example, the United Kingdom has identified at least 175 organisations focused on media literacy, and in Finland, KAVI has identified almost 100 organisations promoting media literacy. For its part, the Norwegian Media Authority has established a media literacy network to provide a forum for organisations representing researchers, businesses, civil society organisations and governmental bodies to share information and identify priority issues to address. In the Netherlands, furthermore, the Dutch Media Literacy Network connects close to 1 000 non-governmental organisations (see Box 3.4).
Box 3.4. Dutch Media Literacy Network
The Ministry of Education, Culture and Science established the Media Literacy Network in 2008, and it currently has over 1 000 organisations as members, including public libraries, cultural institutions, education publishers and welfare organisations.
The Ministry funds the network’s programme activity plan. The network’s core partners deliver up-to-date media literacy programmes and support members’ activities through a ‘coordinating core’ of five committees and groups. The partners provide independent advice on developments in media literacy; conduct research; oversee staffing and funding; manage relations with the network; and perform evaluation tasks through satisfaction surveys of network members. The network’s accomplishments are traced through its press page, which also publishes statements, briefs, and research related to media literacy.
In addition to media literacy programmes, the Network increases awareness of media literacy and shares knowledge, expertise, and resources through its online resources. For example, Netwerkmediawijsheid.nl is the main online platform for the Network’s partners and other professionals working in media literacy. Mediawijsheid.nl hosts resources for school leaders and boards to permanently integrate media literacy into school education. HoeZoMediawijs.nl is a resource aimed at children older than 10 focused on protecting oneself online, information and games about using social media, and judging the reliability of information, among other topics.
Source: The Dutch Media Literacy Network (n.d.[19]), “About Dutch Media Literacy Network”, https://netwerkmediawijsheid.nl/over-ons/about-dutch-media-literacy-network/.
Governments also often partner with non-government organisations to provide media literacy initiatives, where CSOs and governments work together to prepare campaigns, informational and study materials, gamified solutions, and training videos. In Norway, the campaign “Stopp.Tenk.Sjekk” (Stop, Think, Check) was developed before the 2021 elections as a co-operation between the Norwegian Media Authority, the fact-checking service Faktisk.no, the National Association of Local Newspapers, and the Norwegian Directorate for Civil Protection (DSB), with support from Meta. The campaign recommends six questions for individuals to ask themselves when reading content online, with the aim of helping people think critically about whether an article, post, or piece of news is trustworthy. A new version of the campaign was created concerning Ukraine in 2022, as well as prior to the 2023 elections (Norwegian Media Authority, 2021[20]). Similarly, the Be Media Smart campaign in Ireland flags the importance of knowing how to verify information, provides tips and guidance on how to check the accuracy and reliability of information, and provides information on sources of support and training (see Box 3.5).
Box 3.5. Ireland’s “Be Media Smart” media literacy campaign
An initiative of Media Literacy Ireland (MLI), a largely voluntary informal network of individuals and organisations that promotes media literacy in Ireland, the “Be Media Smart” campaign encourages people to Stop, Think, and Check that the information they read, see, or hear is reliable and accurate.
First launched in 2019 as part of a European initiative to counter disinformation in advance of the 2019 European elections, the campaign evolved in 2020 to focus on accurate and reliable information about COVID-19. In 2021, the focus was on helping individuals make informed choices about the COVID-19 vaccination based on accurate and reliable information. The message was delivered in Irish and English across TV, radio, and news publications across community, commercial, public service, and social media platforms.
All TV and radio advertisements were produced, distributed, and broadcast free-of-charge by MLI members from the media sector with added visibility provided by editorials highlighting the initiative. A co-ordinated social media campaign with a diverse range of MLI members using freely available social media assets also boosted the campaign and the call to action. All Be Media Smart communication directed people to the Be Media Smart website (available in Irish and English) for advice and support, a FactCheck section, and an ‘Ask an Expert’ section, where members of the public can put media literacy related questions to a panel of experts.
In 2023, the “Be Media Smart” campaign incorporated a Be Media Smart Community Training Programme. The training programme, developed in conjunction with EDMO Ireland, trained over 100 community-based leaders, coaches, and librarians to use the Be Media Smart Workshop in a Box resource to deliver media literacy training in English and in Irish in their own communities.
Research carried out by Ipsos B&A in November 2023 noted that 23% of adults recalled the campaign, unprompted, compared to 15% before the media campaign started (for context, recall rates of between 13% and 17% are considered successful for similar campaigns). In addition, 45% of respondents to the survey in December 2023 said that they would take action if they came across information that was false or misleading, compared to 32% in April 2021.
Facilitated by Coimisiún na Meán, Ireland’s newly established media regulator, and supported by media, civil society organisations, libraries, educational, training and research institutions, and search and social platforms, this project shows the benefits that can be achieved when organisations collaborate to contribute ideas and skills. The European Platform of Regulatory Authorities (EPRA) and EDMO have highlighted the campaign as a best practice example, and the concept and elements of the campaign have been adopted in at least four other European countries.
Source: Government of Ireland; Media Literacy Ireland (n.d.[21]), “What is Media Literacy Ireland?”, https://www.medialiteracyireland.ie/; Be media smart (2023[22]), Be media smart website, https://www.bemediasmart.ie/.
Another co-operation format is “media literacy weeks”, such as those organised by UNESCO, across the European Union, and in several countries. In Finland, for example, every year around 30 different materials or campaigns are created in co-operation with more than 50 partner organisations from all sectors of society, including public institutions, NGOs, and private companies (Media Literacy Week, 2023[23]).
Media and information literacy activities may also include efforts to better understand and reach groups susceptible to mis- and disinformation, but that are not reached by more traditional initiatives, such as older populations, diasporas and second-language communities, socioeconomically disadvantaged groups, people with disabilities, and migrants. For their part, older populations often have weaker digital skills and are more prone to sharing mis- and disinformation compared to younger cohorts of the population (Guess, Nagler and Tucker, 2019[24]). Efforts to reach these groups include projects devoted to media literacy of retired people through seniors’ centres, public libraries, and other community settings. For example, the Norwegian Media Authority worked with the non-governmental organisation Seniornet to develop educational resources for seniors, including printed booklets, presentations, and in-person meetings that build media and digital literacy within that population.
Other vulnerable groups that media and information literacy activities target include diasporas and second-language communities. To that end, Baltic states have designed specific media literacy campaigns to reach Russian speakers, such as the project the Latvian government has carried out with the CSO Baltic Centre for Media Excellence. In addition to working through schools, therefore, governments should identify approaches to expand media and information literacy activities to particular groups of the population that traditional programmes might not otherwise reach (see Box 3.6 for examples from the United Kingdom).
Box 3.6. United Kingdom efforts to help vulnerable people to spot disinformation and boost online safety
The United Kingdom has funded projects with 17 organisations to pilot new ways of boosting media literacy skills for people at risk of experiencing online abuse and being deceived into believing false information, such as vaccine disinformation, deepfake videos or propaganda created by hostile states.
The Media Literacy Taskforce Fund is one of two funding schemes created to target ‘hard-to-reach’ and vulnerable groups by investing in community-led projects to ensure everyone can improve their media literacy skills and protect themselves from online disinformation through:
A social enterprise working with young people to develop their own podcasts exploring online mis- and disinformation to be aired on local radio
A project run by a charity to provide media literacy training focused on care workers
Access to digital media skills training online and in community centres for the elderly
A partnership with NewsGuard and charities that delivers workshops to older adults to support them in spotting mis- and disinformation online
The Economist Educational Foundation, to work with disadvantaged schools and boost teachers’ skills through news literacy training and support students to engage with the news and think critically about what they’re consuming online
The online safety charity Glitch, which will deliver workshops and training to vulnerable and marginalised women to support their media literacy skills including tackling online abuse.
Source: Government of the UK (2022[25]), “Help for vulnerable people to spot disinformation and boost online safety”, https://www.gov.uk/government/news/help-for-vulnerable-people-to-spot-disinformation-and-boost-online-safety.
3.2.2. Pro-active pre-bunking communication efforts can help build societal resilience to the spread of disinformation
Governments can also help prepare society to better understand disinformation flows and risks by “inoculating” the public to the potential harms. These “pre-bunking” efforts seek to “warn people of the possibility of being exposed to manipulative disinformation, combined with training them on how to counter-argue if they do encounter it,” with the idea that such activities will reduce their susceptibility to false and misleading content (Roozenbeek and van der Linden, 2021[26]; Van der Linden, 2023[27]). Pre-bunking and other pro-active communication efforts can focus on flagging disinformation actors, sources of inauthentic information, and on assessments and insight into tactics used to create and share misleading content (OECD, 2023[28]).
To this end, governments have created and distributed materials and organised internet campaigns that inform the public about the dangers of mis- and disinformation, name and shame malign actors, and share examples of how information attacks and false narratives can spread. Lithuania, Latvia, Estonia, Finland, Czechia, and others, notably through their intelligence agencies, have in recent years started to publicly disseminate analytical reports and threat assessments. These often devote considerable attention to the information environment, including malign actors, examples of relevant attacks and manipulations, and target audiences. Such reports provide the public with reliable information on the major threats (see Box 3.7).
Box 3.7. Security and Intelligence assessments – Case studies from Lithuania, Latvia, Finland and Sweden
Intelligence and security agencies in some OECD members have published public threat assessments or reports as a means of keeping policymakers and the public informed of relevant issues. Finland’s Security and Intelligence Service (SUPO) has produced reports since 2016, Latvia’s State Security Service since 2013, Lithuania’s Second Investigation Department under the Ministry of National Defence and State Security Department since 2014, and Sweden’s Security Police since 2001.
Finland, Latvia, Lithuania, and Sweden each produce annual reports that contain updates on malign information campaigns and strategies within the context of broader threats facing the country. Recent editions have highlighted the disinformation campaigns related to Russia’s war of aggression against Ukraine, which primarily seek to sway opinion in support of Russia’s invasion and justify its actions by taking advantage of perceived social tensions in the region.
Latvia’s latest report identifies long-term exposure to disinformation and propaganda, low levels of education, and the influence of “opinion leaders” as exacerbating the effect of malign information campaigns. Lithuania’s report also identifies personalities with links to Russia or Belarus as instigating disinformation. Similarly, it outlines how social issues, such as the 2020 migration crisis manufactured by Belarus, play a central role in Russian and Belarusian disinformation campaigns.
In a similar vein, Sweden’s report describes disinformation as a key factor in attempts to destabilise or undermine society and the democratic state. Narratives to this end portray Sweden as a country in “chaos and decay”, with COVID-19 described as a watershed moment for the spread of hate and distrust in society through malign information and conspiracy theories. Finland’s SUPO report also underscores outside efforts to influence security policy decisions by preventing open discussions, identifying attempts to influence public debates around NATO membership as a direct threat to national security.
The reports show that the methods and vulnerabilities malign actors exploit are similar. Detailing the messages, narratives, and techniques for spreading malign information allows readers to more effectively identify and react to potential threats.
Source: Supo (2022[29]), “Supo Yearbook 2021: Finns must be prepared for influencing efforts from Russia during NATO debate”, https://supo.fi/en/-/supo-yearbook-2021-finns-must-be-prepared-for-influencing-efforts-from-russia-during-nato-debate; Latvian State Security Service (n.d.[30]), Annual reports, https://vdd.gov.lv/en/useful/annual-reports; Republic of Lithuania (2022[31]), National Threat Assessment 2022, https://www.vsd.lt/wp-content/uploads/2022/04/ANGL-el-_.pdf; Swedish Security Service (n.d.[32]), Sweden Security Police Yearbooks, https://www.sakerhetspolisen.se/om-sakerhetspolisen/publikationer/sakerhetspolisens-arsberattelse.htm.
Based on these assessments, governments have also organised special courses for representatives of civil society, the media, academia, and business on national security and defence topics. The content of these courses includes information on threats, as well as opportunities to discuss the issues with government officials. Such efforts support societal resilience by raising participants’ awareness of threats and preparing them for co-operation in the case of a crisis. Beyond raising awareness, such endeavours help participants serve as ambassadors, spreading understanding and skills to members of their respective organisations and the wider public.
Another practical example of a public and accessible pre-bunking tool is the Go Viral! game, created through a collaboration between academic researchers, the UK Cabinet Office, the World Health Organisation and three private sector design agencies. The game exposes players to manipulation techniques and simulates real-world social media dynamics to share insights into how mis- and disinformation are spread (see Box 3.8). A strength of these pre-bunking efforts is that while they inform the public of actual disinformation threats and techniques, they do not put governments in the position of discussing specific pieces of content or serving as an arbiter of truth.
Box 3.8. Go Viral! Pre-bunking game
Funded by the UK Cabinet Office and supported by the World Health Organisation, Go Viral! was created by researchers from the University of Cambridge Social Decision-Making Laboratory and the Sciences Po Médialab. The game was built with the help of design agencies and builds on previous research showing that a similar game simulating the spread of disinformation, Bad News, can reduce susceptibility to false information for at least three months.
Launched in October 2020, the five-minute game exposes players to three common manipulation techniques used in spreading COVID-19 mis- and disinformation: emotional language, fake experts, and conspiracy theories. It aims to demystify and pre-bunk false information by simulating a real-world social media environment.
Within the game, players create a viral social media post using emotionally evocative language, share content using fake experts to gain credibility in a social media group, and create their own COVID-19 conspiracy theory, targeting an entity or organisation to spark protests. The game allows players to assess the popularity and perceived trustworthiness of their content, simulating the dynamics of real-world social media interactions.
At the start of gameplay, players are invited to take part in research questions about how they perceive certain pieces of content. They are then asked similar questions at the end. Analysis of the results found that the game increases perceived understanding of misinformation about COVID-19, improves confidence in the ability to spot false and misleading content, and reduces self-reported willingness to share such content with others.
The Go Viral! game shows how collaboration between governments, international organisations, and academic institutions can inform cutting-edge research into societal challenges. Gathering data throughout gameplay also provides an effective way to measure the game’s impact and collect user feedback.
Source: www.goviralgame.com; Maertens et al. (2021[33]), “Long-term effectiveness of inoculation against misinformation: Three longitudinal experiments”, Journal of Experimental Psychology: Applied, Vol. 27/1, pp.1–16, https://doi.org/10.1037/xap0000315; Basol et al. (2021[34]), “Towards psychological herd immunity: Cross-cultural evidence for two prebunking interventions against COVID-19 misinformation”, Big Data & Society, Vol. 8/1, https://doi.org/10.1177/20539517211013868.
3.2.3. Continued focus should be given to assessing and measuring impact of media and information literacy activities
Despite the general agreement on the necessity and value of providing media and information literacy skills, several challenges exist. First, the effectiveness of media literacy activities is heavily dependent on the capacity of teachers and trainers, as well as the quality of available materials. One way to help ensure consistent implementation of MIL activities, therefore, is for countries to establish a system of teacher training. Notably, the French centre “CLEMI” trains around 17 000 teachers on media and information literacy each year (CLEMI, 2023[35]). The consistency of training through the school system may also be hindered in countries with less centralised education systems. Such systems may, however, enable greater innovation and experimentation, though this can lead to variable quality between approaches.
Attention should also be given to the quality of partners conducting MIL activities that are funded in whole or in part by the state. Given the range of potential actors, quality control, monitoring, and cost / benefit assessments are essential, despite adding administrative costs. Particularly where partners are providing media literacy campaigns, governments should put in place effective mechanisms to ensure the content, methods, and quality of products fit general requirements and that the activities align with strategic goals.
Another challenge that all MIL efforts face relates to the difficulty of assessing and measuring the impact of these activities. Formal measurement criteria usually involve obligations to report on outputs, such as a list of events or other activities, the audience reached (for example, views on a website or social platform, or the number of participants in events), hours of training delivered, and mentions of the project in other media sources. Even where output measurements exist, such criteria often do not illustrate the actual impact of the project on its intended goals, or broader changes over time in the capacity for critical and reflective information consumption. Without careful assessment, it is not clear how activities practically change participants’ attitudes or whether the effect is long-lasting. This challenge is magnified in less formal settings, where participation is not mandatory and may be less consistent.
Such issues point to the need for a clear methodology for evaluating the effectiveness of MIL activities. In 2021, the Council of Europe analysed 68 media literacy projects and found that one third of the projects did not include any measurement parameters (Council of Europe, 2016[36]). In the United Kingdom, the national online media literacy strategy explicitly stipulates the need for better measurement in this field, noting a “distinct lack of robust evaluation of media literacy provisions.” Where evaluation measures exist, they are often very limited, using metrics such as reach, number of events, quotes from participants, or participant self-assessments, making it challenging to assess whether provisions are effective at improving media literacy capabilities on a long-term basis (UK Department for Digital, Culture, Media & Sport, 2021[37]).
Media literacy providers, furthermore, often do not have sufficient funding to be able to monitor and evaluate their initiatives. Relatedly, interventions also often operate on a short-term basis and do not facilitate working with the same beneficiaries over a long enough time frame to determine the effectiveness of the activities. To that end, many aspects of media literacy that are cemented in behavioural change can be difficult or impossible to measure over the short-term; for example, assessing whether users are able to independently apply learnings to the ‘real’ online environment, rather than just under supervision (UK Department for Digital, Culture, Media & Sport, 2021[37]).
For its part, the Norwegian Media Authority conducts an assessment of the state of media literacy in the country every two years. The latest report was published in 2021 and is based on a representative opinion poll of 2 048 Norwegian residents. Among its findings are that the oldest (aged 60+) and youngest (aged 16–24) segments of the population find it most difficult to deal with disinformation, and that while 50% of the population report that they check other sources they trust to verify information, 18% note that they do not verify information at all (Norwegian Media Authority, 2021[38]) (see Box 3.9 for additional examples of media literacy assessment tools).
Box 3.9. Media literacy assessment tools
UK Ofcom toolkit for evaluating media literacy interventions
The Making Sense of Media programme within the UK telecommunication regulator Ofcom published a toolkit in 2023 to help guide evaluations of media literacy interventions. The toolkit, which provides a series of how-to guides for planning and carrying out an evaluation of media literacy interventions, is an important element of Ofcom’s programme of work supporting media literacy in the United Kingdom.
The toolkit gives practical and straightforward guidance and advocates for an evaluation process that is part of a project from the start. It explains that evaluation both proves (demonstrating that an initiative has achieved its desired outcomes) and improves (generating insights and learnings for the organisation). The toolkit also details the importance of demonstrating impact – notably, change at an individual or societal level that can be attributed to a project – and takes organisations through the steps to evidence it.
The toolkit is divided into three sections that represent stages in the evaluation process: Preparing; Doing; and Sharing. First, it discusses how to write a theory of change and how to create an evaluation framework. Second, it provides information about research methods and proposes model questions; the third section suggests how organisations can structure evaluation reports. There is a separate evaluation framework template, as well as searchable libraries that help map media literacy research and media literacy initiatives within the United Kingdom.
European Union DIGCOMP framework
The Digital Competence Framework for Citizens, also known as DigComp, provides a common language to identify and describe the key areas of digital competence. It is an EU-wide tool designed to improve individuals’ digital competence, help policymakers formulate policies and initiatives, and plan education and training initiatives to improve the digital competence of specific target groups. The framework defines digital competence as the “confident, critical and responsible use of, and engagement with, digital technologies for learning, at work, and for participation in society.”
The DigComp framework identifies the key components of digital competence in five areas:
Information and data literacy seeks to ensure the public can articulate information needs, locate and retrieve digital data, information and content, as well as judge the relevance of data sources and content.
Communication and collaboration seeks to ensure the public can interact, communicate and collaborate through digital technologies while being aware of cultural and generational diversity; participate in society through public and private digital services and participatory citizenship; and manage one’s digital presence, identity, and reputation.
Digital content creation focuses on skills related to creating and editing digital content.
Safety focuses on protecting devices, content, personal data and privacy in digital environments and protecting physical and psychological health.
Problem solving focuses on ensuring the public can identify needs and problems and use digital tools to innovate processes and products to keep up-to-date with the digital evolution.
Source: Ofcom (2023[39]), A toolkit for evaluating media literacy interventions, https://www.ofcom.org.uk/research-and-data/media-literacy-research/approach/evaluate/toolkit; Morris (2023[40]), Ofcom’s Toolkit for Evaluating Media Literacy Interventions, https://media-and-learning.eu/type/featured-articles/ofcoms-toolkit-for-evaluating-media-literacy-interventions/; European Commission (n.d.[41]), “DigComp”, https://joint-research-centre.ec.europa.eu/digcomp_en.
The challenges related to the costs, processes, and independence of assessing media, information, and digital literacy initiatives point to the opportunity provided by working with external partners and experts who can offer independent perspectives. For example, the U.S. Department of State’s Global Engagement Center (GEC) supported the development of two web browser-based media and information literacy games. The University of Cambridge Social Decision-Making Lab independently assessed the efficacy of both games, which has enabled the GEC to monitor the games’ performance and continue to make improvements (Box 3.10).
Box 3.10. Harmony Square and Cat Park media and information literacy games
The U.S. Department of State’s Global Engagement Center (GEC) developed two measurably effective media and information literacy games to build resilience to foreign information manipulation and influence overseas: Harmony Square and Cat Park.
Harmony Square launched in November 2020 and is currently available in 18 languages. The game is intentionally apolitical (notably, the player attacks topics such as pineapple on pizza and a fictional election for bear patroller). Taking on the role of Chief Disinformation Officer, players learn how actors deploy trolling, artificial amplification on social media, emotional language, and escalation to violence to spread disinformation.
According to research by the University of Cambridge Social Decision-Making Lab, published in the Harvard Kennedy School Misinformation Review, after playing the game players are statistically significantly better at discerning between reliable and unreliable information and are much less likely to share unreliable information on social media. Thanks to ongoing monitoring and evaluation of the game’s more than 400 000 plays, the GEC determined that in some cases players were coming away sceptical of all information, not just unreliable information. The GEC and the studio behind the game developed a new feature that corrects this issue.
Cat Park launched in October 2022 and is currently available in six languages. Players take the role of a person recruited into a social media pressure campaign. Players “train” with a group of activists with different media manipulation skillsets – creating sensational headlines, memes, and synthetic media – to stop a hypothetical development of a cat park.
The game has been played more than 100 000 times and there is a lesson plan available for most of the game’s languages. Drawing on lessons from Harmony Square and research from the U.S. Agency for International Development that questioned the efficacy of media and information literacy projects in developing countries, Cat Park offers a much greater level of localisation: the plot and characters differ when the game is played in Amharic or Swahili in Sub-Saharan Africa, in Arabic in the Middle East and North Africa, or in Spanish in Latin America. Research from the University of Cambridge published in Nature found that after playing the game, players are more sceptical of unreliable information.
Note: Harmony Square game link: https://harmonysquare.game/; Cat Park game link: https://catpark.game/
Source: Roozenbeek and van der Linden (2020[42]) “Breaking Harmony Square: A game that “inoculates” against political misinformation”, Harvard Kennedy School Misinformation Review, https://doi.org/10.37016/mr-2020-47; Neylan, J. et al. (2023[43]), “How to “inoculate” against multimodal misinformation: A conceptual replication of Roozenbeek and van der Linden (2020)”, Scientific Reports, Vol. 13/1, https://doi.org/10.1038/s41598-023-43885-2.
A focus moving forward will be on developing methods for measuring the impact of these initiatives as they relate to the public’s ability to take part constructively in the information space. This will require monitoring changes in broad indicators over time, such as susceptibility to mis- and disinformation narratives and trust in governmental communications and institutions. While direct causality is difficult (or impossible) to identify, these could be seen as possible pieces of evidence of success. Such analysis would be particularly relevant for large-scale projects that reach a considerable part of a country’s population. Indeed, greater emphasis on longitudinal impact evaluations would enable comparisons against baselines, highlighting changes over time in the capacity for individuals to critically and reflectively consume information.
Analysis could also be based on the monitoring of specific behaviour of the audiences targeted by a policy or project. For example, this could include analysis of online activity such as changes in patterns of sharing mis- and disinformation materials following MIL trainings. There are clear limitations for such activities, however, including the lack of transparency of social media platforms. Finally, measurement could include self-assessments of the target audience following interventions or activities, for example via questionnaires given to participants who took part in an MIL initiative.
3.3. Public communication plays an important role in providing information
A more immediate goal of whole-of-society efforts to strengthen societal resilience focuses on ensuring individuals are informed and aware of false and misleading content. In democratic settings where government information is open to scrutiny by free and independent media, the public communication function can play a crucial role in fostering societal resilience to disinformation. This is achieved by serving as a source of timely and relevant information. This function should aim to be distinct from political communication, which is linked to elections or political parties, political debates or the promotion of the government of the day. A modern public communication function should be understood as the government function to deliver information, listen, and respond to citizens in the service of the common good (OECD, 2021[44]). To that end, government efforts to build awareness and help ensure the public has access to information include the following avenues:
In democratic environments where government information can be challenged by a free and independent press, timely information provided by governments can build awareness of relevant challenges and threats.
Engagement with external partners, with appropriate governance models and within free and democratic contexts, can help build societal resilience to the spread of disinformation.
3.3.1. Accurate and timely information provided by public communication can build societal awareness of the risks of mis- and disinformation
Information does not spread in a vacuum – traditional media and fact-checkers, online platforms, civil society, and individuals themselves are essential actors in generating and amplifying content. At the same time, governments, often via the public communication function of the centre of government or particular ministries, alongside other actors that play a healthy checks-and-balances function, can help raise awareness of the spread of false and misleading content and serve as a source of accurate information. Even where facts are unclear or still being collected, as is often the case in crises, the public will demand updates; governments should consider how to anticipate and respond to individuals’ needs honestly, transparently, and with the best information possible, while pre-empting the spread of rumours and falsehoods (OECD, 2023[28]). The public communication function therefore requires advanced and sophisticated governance to safeguard its focus on delivering for the public good, promote disclosure of sources, ensure a level of separation from political communications, as well as to build its capacity and professionalism. The OECD has conducted a comparative analysis of good practices and drawn from these a set of Good Practice Principles for Public Communication Responses to Mis- and Disinformation (Box 3.11). In most OECD countries, this function remains undervalued and underutilised as a source of information, and is still transitioning away from a focus on political communication.
Box 3.11. OECD Good Practice Principles for Public Communication Responses to Mis- and Disinformation
The OECD has developed 9 Principles of Good Practice to provide policymakers with guidance to address the spread of mis- and disinformation, and in turn strengthen information ecosystems and support democracy. They relate most directly to public communication interventions. The Principles are based on the analysis and review of relevant emerging practices in the field of countering mis- and disinformation and the factors that make them effective. The 9 principles are:
Structure and governance
1. Institutionalisation: Governments should consolidate interventions into coherent approaches guided by official communication and data policies, standards and guidelines. Public communication offices will benefit from adequate human and financial resources, a well-co-ordinated cross-government approach at national and sub-national levels, and dedicated and professional staff.
2. Public-interest-driven: Public communication should strive to be independent from politicisation in implementing interventions to counteract mis- and disinformation. Public communication should be separate and distinct from partisan and electoral communication, with the introduction of measures to ensure clear authorship, impartiality, accountability, and objectivity.
3. Future-proofing and professionalisation: Public institutions should invest in innovative research and use strategic foresight to anticipate the evolution of technology and information ecosystems and prepare for likely threats. Counter misinformation interventions should be designed to be open, adaptable and matched with efforts to professionalise the function and build civil servants’ capacity to respond to evolving challenges.
Providing accurate and useful information
4. Transparency: Governments should strive to communicate in an honest and clear manner, with institutions comprehensively disclosing information, decisions, processes and data within the limitations of relevant legislation and regulations. Transparency, including about assumptions and uncertainty, can reduce the scope for rumours and falsehoods to take root, as well as enable public scrutiny of official information and open government data.
5. Timeliness: Public institutions should develop mechanisms to act in a timely manner by identifying and responding to emerging narratives, recognising the speed at which false information can travel. Communicators can work to build preparedness and rapid responses by establishing co-ordination and approval mechanisms to intervene quickly with accurate, relevant and compelling content.
6. Prevention: Government interventions should be designed to pre-empt rumours, falsehoods, and conspiracies to stop mis- and disinformation narratives from gaining traction. A focus on prevention requires governments to identify, monitor and track problematic content and its sources; recognise and proactively fill information and data gaps to reduce susceptibility to speculation and rumours; understand and anticipate common disinformation tactics, vulnerabilities and risks; and identify appropriate actions, such as “pre-bunking”.
Democratic engagement, stronger media and information ecosystem
7. Evidence-based: Government interventions should be designed and informed by trustworthy and reliable data, testing, and audience and behavioural insights. Research, analysis and new insights can be continuously gathered and should feed into improved approaches and practices. Governments should focus on recognising emerging narratives, behaviours, and characteristics to understand the context in which they are communicating and responding.
8. Inclusiveness: Interventions should be designed and diversified to reach all groups in society. Official information should strive to be relevant and easily understood, with messages tailored for diverse publics. Channels, messages, and messengers should be appropriate for intended audiences, and communication initiatives conducted with respect for cultural and linguistic differences and with attention paid to reaching disengaged, underrepresented or marginalised groups. Adequate resources and dedicated efforts can support responsive communication and facilitate two-way dialogue that counteracts false and misleading content.
9. Whole-of-Society: Government efforts to counteract information disorders should be integrated within a whole-of-society approach, in collaboration with relevant stakeholders, including the media, private sector, civil society, academia and individuals. Governments should promote the public’s resilience to mis- and disinformation, as well as an environment conducive to accessing, sharing and facilitating constructive engagement around information and data. Where relevant, public institutions should co-ordinate and engage with non-governmental partners with the aim of building trust across society and in all parts of the country.
Source: OECD (2023[28]) “Good practice principles for public communication responses to mis- and disinformation”, OECD Public Governance Policy Papers, No. 30, OECD Publishing, Paris, https://doi.org/10.1787/6d141b44-en.
Similarly, the European Centre of Excellence for Countering Hybrid Threats has identified several good practices in countering disinformation threats: rapidly refuting lies and debunking disinformation; working with civil society; ensuring that the relevant teams within governments are in place; undermining foreign malign actors through humour and accessible messages; and learning from and supporting partners. Many of the lessons drawn from government and civil society responses in Ukraine to Russian disinformation can inform effective strategic communication efforts moving forward (Kalenský and Osadchuk, 2024[45]).
Building capacity, establishing clear frameworks and institutional mechanisms, and formalising definitions, policies and approaches can help shift from ad-hoc and fragmented public communication approaches to counteracting mis- and disinformation, to more structured and strategic approaches (OECD, 2021[44]). Along those lines, for example, the UK Government Communication Service Propriety Guidance specifies that government communication should be: relevant to government responsibilities; objective and explanatory; not represented as party political; conducted in an economic and appropriate way; and able to justify the costs as an expenditure of public funds (Government of the UK, 2022[46]).
Public communication campaigns and government websites can debunk existing disinformation narratives. Delivering clear and tailored messages can help ensure communications reach all segments of society, including groups that are less likely to be exposed to or trust official sources. To that end, preparing and implementing strategic communication campaigns and ensuring accurate content reaches target audiences are essential in counteracting the spread of mis- and disinformation (OECD, 2023[28]). For instance, in New Zealand, the “Unstoppable Summer” campaign, including television advertisements and a short musical video featuring the Director General of Health, and shown before broad audience events, is a good example of an effort to reach youth (Government of New Zealand, 2020[47]) (OECD, 2023[48]). Indeed, throughout the COVID-19 response, many countries developed processes that utilised credible messengers, such as members of a particular community, scientists and doctors, or influencers to present relevant information in a timely, authoritative, and non-politicised way to help ensure it reached as wide a segment of the population as possible.
Given their sensitive role in creating and sharing content, as well as monitoring and responding to disinformation, governments should take extra precautions to ensure their communication activities do not lead to allegations or instances of politicisation and abuse of power. In the first instance, therefore, ensuring public communication strengthens information integrity depends on free information spaces and a strong and free media environment.
A lack of transparency around the activities of the public communication function can also undermine trust. Specifically, there is a risk that public communication initiatives designed to respond to disinformation can play into the arguments of actors who may accuse the government of playing “arbiter of truth” or even of adopting disinformation techniques itself. As a reaction to changing information consumption patterns, for example, governments have collaborated with online influencers to conduct awareness raising and other campaigns to reach segments of the population that they may not otherwise be well-suited to reach. While government engagement with influencers via both paid and earned support can help strengthen the inclusiveness and reach of messages, putting in place clear guidelines, transparent processes, and independent oversight of the public communication function will help provide the necessary governance mechanisms to build trust (OECD, forthcoming[49]). More broadly, promoting access to information and open government standards, including publicly accessible open data, can help lower barriers for journalists and citizens to access public information and officials.
3.3.2. Engagement with non-government actors should be transparent and guided by clear and democratic oversight
Beyond the public communication function, how governments engage with online platforms, civil society, media, and academics needs to be carefully considered. On the one hand, facilitating open lines of communication between actors can be a fast and efficient way to identify threats and promote better functioning information spaces (see Box 3.12). It can also be important for government institutions to receive direct updates from online platforms about the spread of mis- and disinformation, such as concerted amplification operations by hostile actors or those that threaten elections and the safety of the public. Furthermore, much of the work to counteract disinformation threats remains sensitive due to national security considerations; providing too much insight into what is known about foreign information threats or efforts to counteract them also risks compromising their efficacy (OECD, forthcoming[49]).
Box 3.12. Lithuanian government co-operation with Debunk.EU and Meta on moderation policies
In 2022, the Government Chancellery of Lithuania initiated discussions with Meta on its content moderation policies related to the Russian aggression against Ukraine and activity on Facebook that appeared to filter content and block authors expressing support for Ukraine. The Lithuanian government, working with the Lithuanian CSO Debunk.eu, collected examples of accounts that had been blocked or deleted because they had expressed pro-Ukraine opinions, though had not otherwise violated Meta’s content policies.
The meeting provided critical cultural and linguistic context to better inform Meta’s content moderation policy and to ensure it considered the traditions of the Lithuanian language. Indeed, Meta was often blocking accounts for words and expressions that it treated as offensive, despite their common and well-established use in Lithuanian. The engagement also facilitated consultation with Lithuanian language institutions, leading Meta to update its target keyword list and moderation policies.
The government and CSOs alike also noted that redress mechanisms were insufficient, and that blocking the posts and accounts of influential opinion makers without the possibility of correcting the content unreasonably limits free expression, restricts public debate, and can hinder civic initiatives, such as collection campaigns for victims in Ukraine. Meta representatives offered to hold training sessions with user groups to provide additional details of content management policies to help ensure their posts would not be blocked, as well as highlight the issues with senior management.
In 2023, 63% of Lithuanian citizens named social media as the primary place where they encounter disinformation, while the same percentage indicated that social media platforms’ actions to minimise the spread of disinformation were insufficient.
Source: Data provided by the Lithuanian government.
On the other hand, government interactions with online platforms, media, and other non-governmental actors in fighting mis- and disinformation are particularly sensitive, given the risk that engagement with these external partners may enable governments to encourage content moderation beyond their formal regulatory powers and thereby infringe on freedom of expression.
Similar considerations point to the challenges of working with external partners to identify and debunk specific pieces of content. Notably, fact-checkers can be accused of political bias, and there is a risk that if they receive direct funding or other support from governments, they will be pressured or incentivised (or perceived as being pressured or incentivised) to protect the government or smear political opponents. Research has found correlations between fact-checkers’ political affiliations and their priorities and findings (Louis-Sidois, 2022[50]). The risk of perceived (or actual) politicisation of fact-checkers can also be seen in findings from the United States showing that Americans are split in their views of fact-checkers: half said fact-checking efforts by news outlets and other organisations tend to deal fairly with all sides, while about the same share (48%) said they tend to favour one side (Pew Research, 2019[51]).
In 2023, Faktograf, a Croatian fact-checking outlet, published preliminary results from a survey of 41 leading European fact-checking organisations that illustrate the hostility of the polarised environment in which they work. The research found that 90% of the outlets reported having experienced some type of harassment. More than three-quarters of the organisations surveyed (36 out of 41) have experienced harassment online, often facing verbal attacks. Of the respondents that experienced online harassment, 70% were subjected to campaigns that included prolonged or co-ordinated threatening behaviour, such as stalking, smear campaigns, “doxing”, and technology-facilitated gender-based violence, including gendered disinformation. Furthermore, 78% of the organisations confirmed that elected officials had targeted them directly (Faktograf, 2023[52]). In politically polarised environments, government engagement with these actors may amplify risks and fuel accusations of censorship and partisanship, harming both government and non-government actors in the process.
Self-regulation mechanisms put in place by media, CSOs, and other non-governmental actors involved in fact-checking and other relevant activities can help mitigate these challenges. In this regard, the active participation of media professionals can help ensure that journalistic expertise and ethical standards inform other relevant actions to promote information integrity. For instance, the International Fact-Checking Network (IFCN) has developed a code of principles signed by more than 200 fact-checking organisations from around the world (IFCN, 2023[53]). Notably, IFCN signatory status may not be granted to organisations whose editorial work is controlled by the state, a political party or a politician. It may, however, be granted to organisations that receive funding from state or political sources if the IFCN assessor determines there is clear and unambiguous separation of editorial control from state or political influence. Signatories also promise to be neutral and unbiased and commit to funding and organisational transparency. More detailed commitments are included in the “European Code of Standards for Independent Fact-Checking Organisations”, approved by the European Fact-Checking Standards Network Project (supported by the European Commission) in August 2022. This Code places particular emphasis on political impartiality and the transparency of organisations’ activities (EFCSN, 2022[54]).
Opportunities also exist for governments to be more transparent in their work with online platforms. For example, while decisions to take down content or add warning labels rest with the platforms themselves, governments may flag false or misleading content to platforms. In these cases, transparency around such discussions is critical, and relevant disclosure mechanisms should be put in place (Full Fact, 2022[55]). Transparency around how and under what circumstances governments share information with online platforms can be an important way to strengthen public confidence that freedom of expression is upheld, while enabling external scrutiny of whether such actions are necessary. In addition, governments could consider establishing independent oversight mechanisms to evaluate their actions in this space and ensure they do not limit freedom of expression (OECD, forthcoming[49]).
3.4. Strengthening public participation and building understanding of the information space through research are key to informing policymaking and implementation
Building information integrity requires greater understanding of the specific problems that policy responses look to solve. As governments seek to strengthen their ability to counter threats posed by malign interference and disinformation, as well as to reinforce the public’s ability to participate in well-informed democratic debate more widely, they will need to build an understanding of which conditions within the information environment foster democracy and encourage active citizen participation (Wanless and Shapiro, 2022[4]). Working with the public and non-governmental partners to develop this understanding, build trust, and inform effective policymaking can ultimately serve as a catalyst for good governance and democracy.
Strengthening participation and engagement suggests the following entry points on which to build:
Participatory and deliberative democracy mechanisms can help establish policy priorities to strengthen information integrity.
Government-funded research on information integrity should be conducted with clear objectives and guardrails and inform the policymaking and implementation process.
3.4.1. Participatory and deliberative democracy mechanisms can help deliver policies on strengthening information integrity
Governments can also develop participation initiatives to facilitate engagement with the public, media professionals, platforms, academics and civil society organisations more widely on strengthening information integrity and countering mis- and disinformation. If structured well, such initiatives can help raise awareness and set a policy agenda that reflects public priorities while also building trust between individuals, media and decision makers. In a field such as information integrity, in which public scrutiny about government interference in the information space is, rightfully, important, and at a time of low trust in public institutions (OECD, 2022[56]), promoting civic education and involving citizens and various stakeholders in the design of these policies will be important.
Opportunities for citizens’ and stakeholders’ participation and engagement are rooted in open and democratic governance and have multiplied significantly across OECD countries and beyond in the last decade. Indeed, the OECD Recommendation on Open Government notes that citizens should be provided “equal and fair opportunities to be informed and consulted and actively engaged in all phases of the policy-cycle,” and that “specific efforts should be dedicated to reaching out to the most relevant, vulnerable, underrepresented, or marginalised groups in society, while avoiding undue influence and policy capture” (OECD, 2017[57]). In this sense, the role of citizens refers to the public broadly, rather than the more restrictive sense of a legally recognised national of a state. Promoting the role of citizens and civil society means that governments must create the conditions for the equitable, sustained, and substantive participation of civil society in policymaking (Forum on Information and Democracy, 2023[58]), and that countries should provide a level playing field by granting all stakeholders fair and equitable access to the development and implementation of public policies (OECD, 2010[59]).
Representative democracy, where citizen preferences are expressed through elected representatives, and direct democracy, where citizens vote on specific issues, are the most common avenues for participation. Beyond representation, promoting citizen participation should incorporate methods that provide the public with the time, information, and resources to discuss and deliberate, produce quality inputs, and develop individual or collective recommendations to support more open policy-making. For example, online calls for submissions, public consultations and roundtable discussions are all examples of participatory mechanisms. Furthermore, putting in place effective deliberative democracy mechanisms that bring together a representative group of people to discuss issues and feed a “representative” view into decision-making processes can lead to better policy outcomes, enable policy makers to make hard choices, and enhance trust between citizens and government (OECD, 2020[60]).3
To date, engagement initiatives on topics of information integrity have been relatively limited, likely reflecting the need to continue building understanding of the trends, processes, and potential policy responses involved. Nevertheless, while often characterised as a technical matter, policy initiatives related to strengthening information integrity are largely understandable by, and of interest to, the public. Beyond academics and other stakeholders, such as media, CSOs, and the private sector, public consultations can help inform and support efforts to build information integrity.
In 2020, Ireland established the Future of Media Commission as an independent body to undertake a comprehensive and far-reaching examination of Ireland’s broadcast, print and online media. Notably, one of the recommendations of the report prepared by the Commission was for the government to create a National Counter-Disinformation Strategy (see Box 3.13), illustrating how public engagement can direct government actions and interventions. A similar example can be found in France with the organisation of the General Assembly on Information (“les États généraux de l’information”), launched at the initiative of the President of the Republic in July 2023 with the aim of establishing a diagnosis of the key challenges related to the information space and proposing concrete actions that can be deployed at the national, European, and international levels. The final output of this process, taking place between autumn 2023 and summer 2024, will be a set of proposals to anticipate future developments in the information space. Five working groups will develop these proposals, integrating feedback from citizens’ assemblies and debates organised in person in France as well as from an online consultation carried out by the French Economic, Social and Environmental Council (EESC).
Box 3.13. Ireland’s Future of Media Commission
Established by the Irish government in September 2020, the Future of Media Commission is an independent body that explored, among other topics, how Ireland’s media can remain sustainable and resilient in delivering on public service aims until 2030, including ensuring access to high-quality and impartial journalism.
Published in July 2022, the Future of Media Commission Report reflects the commission’s core mission to develop recommendations on sustainable public funding of Irish media and to ensure its viability, independence, and capacity. The Commission’s consultative efforts engaged the public, media organisations and industry stakeholders, regulators, and policymakers, and helped facilitate wide-ranging involvement in the drafting process.
The Commission’s public consultation process received more than 800 written submissions, while its series of six online Thematic Dialogues saw more than 1 000 members of the public and 50 expert panellists engage in detailed discussions and debate. In addition, the Commission undertook a comprehensive survey to examine what the public consumes and values in terms of media content and what can be anticipated about future trends.
The report contains 50 recommendations, 49 of which were adopted in principle by the government upon publication, showing the value and relevance of the process and outputs. Notably, the report recommended creating a National Counter-Disinformation Strategy to tackle mis- and disinformation and improve general trust in information and media. The report also notes that the wider context of changing funding models in Ireland threaten to centralise information distribution, making the media landscape less plural, as advertising revenues move from media organisations to technology companies.
Source: Government of Ireland (2022[61]), Report of the Future of Media Commission, https://www.gov.ie/pdf/?file=https://assets.gov.ie/229731/2f2be30d-d987-40cd-9cfe-aaa885104bc1.pdf#page=null.
In 2022, Spain created the "Forum against Disinformation Campaigns in the Field of National Security", a platform for public-private collaboration to promote debate and reflection on the risks posed by disinformation campaigns in the field of national security.
The complexity of policymaking around building information integrity and the need to respond to the challenges faced also point to the value of deliberative democracy initiatives as a promising tool. These refer to the “direct involvement of citizens in political decision making, beyond choosing representatives through elections”. When conducted effectively, such deliberative processes can lead to better policy outcomes, enable policymakers to make hard choices, and enhance trust between citizens and government (OECD, 2020[60]).
For example, the Canadian government worked with civil society organisations to organise three citizen assemblies on Democratic Expression, involving 90 Canadians who together contributed 6 000 volunteer hours to explore how the government should strengthen the information environment in which Canadians can freely express themselves. The Canadian Commission on Democratic Expression, in its report informed by the assemblies, recommended that the government establish an independent Digital Services Regulator to set standards for the safe operation of digital services and to require platforms to conduct regular risk assessments. The Commission also recommended that the government appoint a special envoy to liaise at the international level on issues related to disinformation and foster dialogue with social media platforms, foreign governments, and multilateral bodies; promote interdisciplinary research on how content spreads; and support media literacy efforts and invest in quality journalism at the national, regional and community levels (Citizens’ Assembly on Democratic Expression, 2022[62]). In addition to informing policymaking, deliberative processes can also help counteract polarisation and disinformation, as research suggests that deliberation can be an effective way to overcome ethnic, religious, or ideological divisions between groups (OECD, 2020[60]).
3.4.2. Government-funded research on information integrity should be conducted with clear objectives and guardrails and inform the policymaking and implementation process
The aim of research in this space should be to better understand the conditions within the information environment that can foster healthy democratic societies and encourage active citizen participation (Wanless and Shapiro, 2022[4]). OECD members have responded to information threats in part by funding research activities to analyse trends, including the susceptibility to mis- and disinformation by different sectors of the population, content consumption patterns, and the threats posed by foreign actors producing and intentionally spreading false and misleading information. Governments are also supporting research to develop methodologies to assess the efficiency of various policy measures such as awareness campaigns and regulatory interventions. For example, Luxembourg financially supports the University of Luxembourg in its activities regarding the conduct of surveys for the European Media Pluralism Monitor and the “Local Media Project for Democracy”, in full accordance with the principles of academic freedom and scientific independence.
Internal research conducted by or for the government can play an important role in supporting a better-informed policymaking process, particularly if it involves access to sensitive, private, or classified data. For example, the Government of Canada, in partnership with the OECD and the French Government, conducted an experiment to investigate Canadians’ intentions to share different types of content on social media to better understand vulnerable populations and to design innovative policy solutions to mitigate the spread of misinformation (see Box 3.14).
Box 3.14. An international collaboration to tackle misinformation with behavioural science
In partnership with the OECD and the French government, the Government of Canada implemented a Randomised Controlled Trial (RCT) embedded within the longitudinal COVID-19 Snapshot Monitoring Study (COSMO Canada) to test ways to reduce the spread of misinformation online. The study tested the effect of two behaviourally informed policy interventions. Both interventions were drawn from a rapidly growing research literature, and both aimed at improving the quality of news shared online (that is, the preference for sharing verifiably true over verifiably false news links) while prioritising individuals’ autonomy. The first intervention was a simple accuracy evaluation prompt, attuning respondents’ attention to accuracy by asking them to rate the accuracy of a single random headline prior to engaging with Facebook-style headlines online. The second intervention was a list of media literacy tips. This international collaboration found:
First, the data indicate a disconnect between participants’ (N = 1 872) beliefs and sharing behaviours. People rate verifiably true headlines as significantly more accurate than verifiably false headlines (as determined by third-party fact-checkers), but are much less discerning in their sharing intentions – in other words, people share news headlines they believe to be false or questionable.
Second, experimental results show that prompting participants with digital media literacy tips reduces their intention to share fake news online by more than 20%. While exposure to both the simple attention-to-accuracy prompt and the digital media literacy tips significantly increased participants’ intentions to share true over false headlines, the effectiveness of the media literacy intervention far exceeded the effectiveness of the accuracy prompt. The digital media literacy tips had the greatest impact on reducing intentions to share false headlines online, reducing intentions to share by 21% compared to the control group (see figure below).
The findings from this RCT indicate that behavioural interventions can significantly reduce intentions to share false news headlines in online settings. The key insights from this report are the following:
1. A comprehensive policy response to mis- and disinformation should thus include an expanded understanding of human behaviour.
2. By empowering users, behavioural science offers effective and scalable policy tools that can complement system level policy to better respond to misinformation.
3. International experimentation across governments is vital for tackling global policy challenges and generating sustainable responses to the spread of mis- and disinformation.
These results provide compelling support for how simple and scalable online interventions presented to individuals before they engage with news stories may improve the quality of information circulating online. For some, it may be surprising that individuals are (sometimes) willing to share news that they believe to be false or questionable. This study provides evidence that this does indeed happen, likely due to a failure to pay attention to the accuracy of news content encountered in the social media context. Although additional research and analysis is required to determine why individuals may choose to share false or misleading headlines online, studies like these remain vital for challenging assumptions about human behaviour, creating more effective and scalable solutions based on the behaviour of those they aim to serve, and indicating areas of future exploration that can enhance the robustness of knowledge on global behavioural challenges like mis- and disinformation.
Source: OECD (2022[63]), “Misinformation and disinformation: An international effort using behavioural science to tackle the spread of misinformation”, OECD Public Governance Policy Papers, No. 21, OECD Publishing, Paris, https://doi.org/10.1787/b7709d4f-en.
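The headline result above, a 21% relative reduction in intentions to share false headlines, amounts to a simple two-arm comparison between treatment and control groups. The short Python sketch below illustrates that calculation; the arm sizes and counts are hypothetical placeholders, not the study’s data, and only the form of the calculation mirrors the analysis described.

```python
# Illustrative two-arm comparison for an RCT like the one described above.
# All counts are hypothetical; only the calculation itself is the point.

def share_rate(would_share: int, n: int) -> float:
    """Proportion of participants intending to share a false headline."""
    return would_share / n

def relative_reduction(control: float, treatment: float) -> float:
    """Relative drop in sharing intentions versus the control arm."""
    return (control - treatment) / control

# Hypothetical arm sizes and counts of "would share a false headline"
control_rate = share_rate(300, 750)   # 40.0% in the control arm
tips_rate = share_rate(237, 750)      # 31.6% after media literacy tips

print(f"Relative reduction: {relative_reduction(control_rate, tips_rate):.0%}")
# prints "Relative reduction: 21%"
```

A full analysis would of course also test statistical significance (for example, with a two-proportion z-test) rather than report the point estimate alone.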
Though governments may not disseminate the results of such research publicly, it can nonetheless play an important role in building understanding of the information space. Co-operation with external researchers to produce public outputs, on the other hand, allows governments to receive diverse insights and advice. Continuing to develop partnerships that are transparent, well-resourced, and that serve clear objectives will be important moving forward.
For example, Canada’s Digital Citizen Initiative focuses on helping Canadians understand online disinformation and its impact on Canadian society, and on building the evidence base to identify possible actions and future policymaking in this space (see Box 3.15; Government of Canada, 2023[64]). In the Netherlands, the Ministry of the Interior and Kingdom Relations is one of the partners collaborating in the AI, Media and Democracy Lab, an alliance between the University of Amsterdam, the Amsterdam University of Applied Sciences, and the Research Institute for Mathematics & Computer Science in the Netherlands, which works with media companies and cultural institutions to increase knowledge related to the development and application of generative AI tools (in 2022, the project received EUR 2.1 million).
Box 3.15. Canada’s Digital Citizen Initiative
Initiated by the Federal Government, Canada’s Digital Citizen Initiative (DCI) funds civil society organisations, educational institutions, and research institutions to better understand and strengthen resilience against online disinformation and other online harms.
Since its inception in 2020, the DCI’s Digital Citizen Contribution Program has provided over CAD 21 million in support of 110 projects. These projects include developing awareness and learning materials for the public, students, and educators, and supporting research to investigate the creation and spread of disinformation across Canada.
Ten separate calls for proposals have prioritised specific issues related to online disinformation and online harms. In the immediate wake of the COVID-19 pandemic, two calls for proposals provided CAD 3.5 million to amplify the efforts of organisations supporting individuals’ abilities to identify and limit the spread of health-related mis- and disinformation. Following Russia’s war of aggression against Ukraine, a targeted call in 2022 funded initiatives to help individuals identify online mis- and disinformation related to this issue.
In the November 2022 Fall Economic Statement, the Government of Canada announced an extended investment of CAD 31 million over four years. In 2024-25, the programme will provide financial assistance for proposals that:
develop and publish tools to support digital media and civic literacy skills among people in Canada outside of educational institutions and/or among seniors in Canada;
develop and publish tools to help people in Canada identify content created and spread by bots and/or artificial intelligence;
develop and publish tools to prevent and address online violence against women, girls and 2SLGBTQI+ communities, and other forms of technology-facilitated violence;
create resources to support children and parents in Canada to address and prevent cyberbullying;
build technical capacity and expertise among small and medium sized civil society organisations seeking to address mis- and disinformation, hate speech, and cyberbullying;
develop and publish tools to build resilience to mis- and disinformation stemming from foreign governments targeting people in Canada, including diaspora communities; and
conduct research, testing and evaluation of tools or interventions related to any of the above priorities.
Source: Government of Canada (2023[64]), “Digital Citizen Initiative”, https://www.canada.ca/en/canadian-heritage/services/online-disinformation.html.
European Union institutions also illustrate whole-of-society models for long-term funding of research projects related to fighting disinformation, notably during the funding cycle of the Horizon 2020 programme (European Commission, 2023[65]). Indeed, the fight against mis- and disinformation is one of the main priorities of the current (2021-2027) funding round of the Horizon Europe programme. For example, the EUR 7 million vera.ai project (2022-2025) connects 14 partner organisations, including the European Broadcasting Union and Deutsche Welle, as well as research institutes, universities, private companies, and the news agency AFP. Together, the consortium aims to develop AI solutions that can help unmask and neutralise advanced disinformation techniques (VERA.AI, 2023[66]).
Another important, though less direct, approach to supporting research is illustrated by the EU’s funding of the European Digital Media Observatory (EDMO), which connects civil sector organisations and academics in joint efforts to strengthen information integrity. The second phase of the project has funded the creation of national and multinational digital media research hubs across Europe with EUR 11 million through the Connecting Europe Facility. There are currently 14 regional EDMO hubs covering the 27 EU member states and Norway. One of the most important strands of EDMO’s work is mapping, supporting, and co-ordinating research activities on disinformation at the European level, including the creation and regular updating of a global repository of peer-reviewed scientific articles on disinformation. Similarly, Canada has made a USD 4 million (CAD 5.5 million) investment to create the Canadian Digital Media Research Network (CDMRN), bringing together a range of Canadian research institutions to further strengthen Canadians’ information resilience by researching how the quality of information, including disinformation narratives, affects Canadians’ attitudes and behaviours and by supporting strategies for Canadians’ digital literacy.
Moving forward, the role and impact of closed groups and messages shared on encrypted services such as WhatsApp will need to be better understood. These platforms provide users with valuable privacy and safety functions, but they can also be important channels for spreading mis- and disinformation, while their private and encrypted nature makes content spread on these channels difficult to analyse (OECD, 2022[67]). Another challenge in supporting research in this space is that research tools, such as specialised software or the application programming interfaces (APIs) used to facilitate content and data sharing between applications, are often prohibitively expensive, particularly for smaller research groups with limited budgets. Access to data from social media platforms is also increasingly difficult to obtain.
In response to these challenges, the European Union Digital Services Act (DSA) partially addresses the issue of data availability for researchers (as discussed further in Chapter 2). Specifically, Article 40 of the DSA stipulates that “providers of very large online platforms or of very large online search engines shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the (specified) requirements, for the sole purpose of conducting research that contributes to the detection, identification and understanding of systemic risks in the Union” (European Union, 2022[6]).
A fundamental issue regarding research in this space is that there is often a disconnect between the research being conducted and the ability for governments to use evidence collected in policymaking and implementation. Researchers and governments have identified a shortage of efficient information exchange and co-operation formats between relevant actors at both the national and international level. To that end, the French government has supported the International Observatory on Information and Democracy, which is modelled on the Intergovernmental Panel on Climate Change (IPCC) to aggregate and synthesise existing research to better understand the information and communication space (see Box 3.16).
Box 3.16. The International Observatory on Information and Democracy
Under the stewardship of Reporters Without Borders (RSF) and with the support of the French government, the Partnership on Information and Democracy was established in 2019. Today, with 52 state signatories, this non-binding international governance process advances safeguards for the information and communication space to ensure the right to reliable information, the cornerstone of democratic discourse and a critical underpinning of democratic institutions. The Forum on Information and Democracy is the civil society-led implementing entity of the Partnership, working to advance policy change, enhance civic voice and participation in agenda-setting and policy discourse, and strengthen the information ecosystem.
A core project of the Forum, the International Observatory on Information and Democracy was established to advance a common understanding of the structure of the information and communication space and its impact on democracy. The Observatory brings together the research community, civil society, states, regulators, and representatives of private corporations, and is modelled on the IPCC, applied in this case to the information and communication space. In this way, the Observatory facilitates interaction between knowledge producers and policymakers.
Through this democratic lens, the Observatory aggregates and synthesises existing research and available data via regular reports, which provide civil society leaders, researchers, academics, and policymakers a periodic global assessment of the information and communication space and its impact on democracy. The Observatory’s work will inform the international community’s efforts to foster the adoption of effective and proportionate regulatory and non-regulatory measures for the protection of human rights – including the right to reliable information – and democracy in the digital space.
Reports of the Observatory explain existing state-of-the-art research. The aim is to help ensure stakeholders share a common understanding of critical impacts, while also revealing gaps in research and data and important variation across regions. The Observatory employs a robust methodology that ensures the inclusion of perspectives and expertise from the Global Majority. With a governance structure led by international experts from the scientific community and civil society, and through direct consultations with private and public officials, the Observatory contributes to creating shared knowledge benchmarks that help realign regulatory policy to ensure technology serves the public interest.
Observatory reports are addressed to governments, policymakers, regulatory bodies, NGOs, public information bodies, and technology corporations, to provide a shared understanding of how the current structure of the information and communication space is undermining democracies around the world. In turn, the Observatory’s ambition is to help stimulate meaningful dialogue, inform evidence-based policy decisions, and support innovative research in the field of the digital information space and democracy. The first work cycle will be completed in December 2024.
Source: Interview with Forum on Information and Democracy, February 2024.
Ultimately, decision makers can find it difficult to turn the results of academic studies into practical policies. This suggests that the feedback loop between researchers and governments can be improved, both to determine which conditions within the information environment are beneficial for democracy and to help measure the success of policy interventions (Wanless and Shapiro, 2022[4]).
3.5. Considerations and path forward
Strengthening participation by and engagement with the public, civil society, and media workers will be essential as countries look to strengthen information integrity, reinforce democracy, and build trust. A whole-of-society approach, grounded in the protection and promotion of civic space, democracy, and human rights, will be necessary given the fundamental role that individuals and non-governmental partners have in promoting healthy information and democratic spaces.
Notably, citizens and stakeholders often have relevant and needed experience, human capital, and qualifications that can provide complementary perspectives to governmental policy making and help identify and respond to disinformation threats. Non-government actors may also have easier access to, and greater experience working with, groups that governments cannot reach as easily – for example, migrants, diasporas, and other minority, marginalised, or socially excluded groups who may be particularly affected by targeted disinformation. To the extent that non-governmental actors are seen as more reliable sources of trustworthy information than governmental institutions, the public may also be more receptive to projects and other initiatives managed by civil society organisations.
Governments are advancing steadily in this area, increasingly putting in place frameworks for successful engagement and partnership with the public and non-government partners, recognising that groups have different needs. As governments develop multi-stakeholder approaches, they should be guided by the following questions:
How can participatory initiatives that engage citizens and non-government stakeholders be best designed and carried out to build understanding of the information space and develop effective policy responses?
What are the benefits and potential drawbacks of partnerships and collaboration with non-government partners, including the private sector? How can any drawbacks or risks – to government and non-government partners – be mitigated?
How can governments best decide which initiatives to strengthen information integrity should be carried out in partnership with CSOs, media, academia, the private sector (not only online platforms) and where can – or should – governments act alone?
How can whole-of-society efforts designed to strengthen information integrity be measured to track their effectiveness and value?
To that end, governments should consider the following efforts to pursue a whole-of-society approach to strengthening societal resilience and citizen and stakeholder participation:
Enhance public understanding of – and skills to operate in – a free information space conducive to democratic engagement. Governments should ensure that civic, media, and digital information literacy education and initiatives form part of a broader effort to build societal resilience, and should measure the effectiveness of these initiatives. Promoting media and information literacy in school curricula from primary and secondary school to higher education, developing training programmes for teachers, conducting impact evaluations of media and information literacy programmes (including longitudinal studies), and supporting research to better understand which segments of the population are most vulnerable to disinformation and to better target media and information literacy programmes should form key pillars of governments’ toolbox.
Implement information access laws and open government standards, including publicly accessible open data, to lower barriers for journalists and citizens to access public information and officials.
Build capacity and work with partners from across society (notably academics, CSOs, media, and online platforms) to monitor and evaluate changes to and policy impacts on the information space. Beyond output measurements, methods for understanding the impact of disinformation and counter-disinformation efforts should also include monitoring changes in broad indicators over time, such as behavioural indicators and susceptibility to mis- and disinformation narratives.
Provide clear and transparent guidelines and oversight mechanisms for government engagement with other actors, to ensure that when governments partner with, fund, or otherwise co-ordinate with or support the activities of non-government partners on issues related to information integrity, they cannot unduly influence the work of these actors or restrict freedom of expression. Unclear rules, exclusions, or decisions could create distrust in the process. Such guidelines and oversight mechanisms are particularly valuable in avoiding actual and perceived politicisation of governments’ engagement with non-government actors.
Build the capacity of the still largely underdeveloped public communication function to play a constructive role in supplying timely information and raising awareness of threats, while developing more solid governance arrangements for its own functioning, insulated from politicised communication. In the short term, the function can serve as an important source of information, including in times of crisis. Over the longer term, building the function’s capacity to provide citizens with the skills necessary to better understand the information environment, for example through pre-bunking, can be an important tool for societal resilience.
Strengthen mechanisms to avoid real or suspected conflict of interest with respect to the public communication function. Transparent, accountable, and professional management of the public communication function can help ensure it plays an important role in providing timely information that can build awareness of relevant challenges and threats and provide proactive communication that helps build societal resilience to the spread of disinformation.
Expand understanding of the information space by supporting research activities to better understand trends in information and content consumption patterns, the threats posed and tactics used by foreign actors spreading false and misleading information, and methodologies for assessing the impact of risk mitigation measures. Strengthen opportunities and mechanisms for research to inform the policy-making process.
Design and put in place effective participatory mechanisms with citizens, journalists, social media platforms, academics, and civil society organisations to help establish policy priorities and clarify needs and opportunities related to strengthening information integrity. Building more meaningful democratic engagement around policy design and implementation related to information integrity, including through deliberative citizens’ assemblies, will contribute to broader efforts to strengthen democratic resilience.
Clearly identify government collaboration on information integrity with non-government partners, including journalists, academia, the private sector, and other relevant non-governmental organisations. Engagement activities and outputs, including those related to funding, the goals of the co-operation, and impact on content decisions, should be clearly identifiable by the public. Similarly, the public should be able to identify whether a communication campaign, media literacy activity, or research product is financed or guided by government institutions.
Take steps to clarify funding sources to mitigate the risks of malign interfering groups gaining access to data or being able to manipulate a country’s information space.
Mitigate the risk to governmental staff, academics, CSOs, private sector, and other actors engaged in information integrity initiatives when they become targets of disinformation campaigns, other threats, and harassment. When necessary, enable appropriate measures to protect the human rights of affected individuals.
References
[34] Basol, M. et al. (2021), “Towards psychological herd immunity: Cross-cultural evidence for two prebunking interventions against COVID-19 misinformation”, Big Data & Society, Vol. 8/1, https://doi.org/10.1177/20539517211013868.
[22] Be media smart (2023), “Be media smart website”, https://www.bemediasmart.ie/ (accessed on 15 February 2024).
[62] Citizens’ Assembly on Democratic Expression (2022), https://static1.squarespace.com/static/5f8ee1ed6216f64197dc541b/t/632c7bdbe8994a793e6256d8/1663859740695/CitizensAssemblyOnDemocraticExpression-PPF-SEP2022-FINAL-REPORT-EN-1.pdf.
[35] CLEMI (2023), Bilan de formation 2021-2022, https://www.clemi.fr/fr/bilans-de-formation.html.
[18] CLEMI (n.d.), CLEMI website, Centre pour l’éducation aux médias et à l’information, https://www.clemi.fr/fr/qui-sommes-nous.html (accessed on 15 February 2024).
[36] Council of Europe (2016), Mapping of media literacy practices and actions in EU-28, https://rm.coe.int/media-literacy-mapping-report-en-final-pdf/1680783500.
[54] EFCSN (2022), “The European Fact-Checking Standards Network Project”, European Fact-Checking Standards Network, https://eufactcheckingproject.com/.
[65] European Commission (2023), “Funded projects in the fight against disinformation”, https://commission.europa.eu/strategy-and-policy/coronavirus-response/fighting-disinformation/funded-projects-fight-against-disinformation_en.
[12] European Commission (2023), Guidelines pursuant to Article 33a(3) of the Audiovisual Media Services Directive on the scope of Member States’ reports concerning measures for the promotion and development of media literacy skills, https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A52023XC0223%2801%29.
[41] European Commission (n.d.), “DigComp”, https://joint-research-centre.ec.europa.eu/digcomp_en.
[6] European Union (2022), Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act), Publications Office of the European Union, https://eur-lex.europa.eu/legal-content/EN/TXT/?ur.
[7] European Union (2018), Directive (EU) 2018/1808 of the European Parliament and of the Council of 14 November 2018 amending Directive 2010/13/EU on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive), Publications Office of the European Union, https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A32018L1808.
[52] Faktograf (2023), Harassment of Fact-checking Media Outlets in Europe, https://faktograf.hr/site/wp-content/uploads/2023/03/preliminary-survey-report-final.pdf.
[58] Forum on Information and Democracy (2023), OECD Tackling disinformation: Strengthening democracy through information integrity conference.
[14] Forum on Information and Democracy (2023), Pluralism of news and information in curation and indexing algorithms, https://informationdemocracy.org/wp-content/uploads/2023/08/Report-on-Pluralism-Forum-on-ID.pdf.
[55] Full Fact (2022), Full Fact Report 2022: Tackling online misinformation in an open society - what law and regulation should do, https://fullfact.org/media/uploads/full-fact-report-2022.pdf.
[64] Government of Canada (2023), Digital Citizen Initiative, https://www.canada.ca/en/canadian-heritage/services/online-disinformation.html.
[16] Government of Finland (2019), Media Literacy in Finland: National Media Education Policy, Ministry of Education and Culture, https://medialukutaitosuomessa.fi/mediaeducationpolicy.pdf.
[61] Government of Ireland (2022), Report of the Future of Media Commission, https://www.gov.ie/pdf/?file=https://assets.gov.ie/229731/2f2be30d-d987-40cd-9cfe-aaa885104bc1.pdf#page=null.
[5] Government of Netherlands (2022), Government-wide strategy for effectively tackling disinformation, https://www.government.nl/documents/parliamentary-documents/2022/12/23/government-wide-strategy-for-effectively-tackling-disinformation.
[47] Government of New Zealand (2020), Make summer unstoppable by hitting COVID-19 for six, https://www.beehive.govt.nz/release/make-summer-unstoppable-hitting-covid-19-six.
[15] Government of Portugal (2017), Resolução do Conselho de Ministros n.º 142/2023.
[46] Government of the UK (2022), Government Communication Service Propriety Guidance, https://gcs.civilservice.gov.uk/publications/propriety-guidance/.
[25] Government of the UK (2022), “Help for vulnerable people to spot disinformation and boost online safety”, https://www.gov.uk/government/news/help-for-vulnerable-people-to-spot-disinformation-and-boost-online-safety.
[24] Guess, A., J. Nagler and J. Tucker (2019), “Less than you think: Prevalence and predictors of fake news dissemination on Facebook”, Science Advances, Vol. 5/1, https://doi.org/10.1126/sciadv.aau4586.
[11] Hill, J. (2022), “Policy responses to false and misleading digital content: A snapshot of children’s media literacy”, OECD Education Working Papers, No. 275, OECD Publishing, Paris, https://doi.org/10.1787/1104143e-en.
[53] IFCN (2023), “Commit to transparency — sign up for the International Fact-Checking Network’s code of principles”, International Fact-Checking Network, https://ifcncodeofprinciples.poynter.org/.
[45] Kalenský, J. and R. Osadchuk (2024), How Ukraine fights Russian disinformation: Beehive vs mammoth, https://www.hybridcoe.fi/wp-content/uploads/2024/01/20240124-Hybrid-CoE-Research-Report-11-How-UKR-fights-RUS-disinfo-WEB.pdf.
[30] Latvian State Security Service (n.d.), “Annual reports”, https://vdd.gov.lv/en/useful/annual-reports (accessed on 15 February 2024).
[50] Louis-Sidois, C. (2022), “Checking the French Fact-checkers”, SSRN Electronic Journal, https://doi.org/10.2139/ssrn.4030887.
[33] Maertens, R. et al. (2021), “Long-term effectiveness of inoculation against misinformation: Three longitudinal experiments”, Journal of Experimental Psychology: Applied, Vol. 27/1, pp. 1-16, https://doi.org/10.1037/xap0000315.
[21] Media Literacy Ireland (n.d.), “What is Media Literacy Ireland?”, https://www.medialiteracyireland.ie/ (accessed on 15 February 2024).
[13] Media literacy now (2022), Media Literacy Policy Report 2022, https://medialiteracynow.org/policyreport/.
[23] Media Literacy Week (2023), “Media Literacy Week celebrates diversity in creating and developing a better media environment for all”, https://mediataitoviikko.fi/in-english/.
[40] Morris, K. (2023), Ofcom’s Toolkit for Evaluating Media Literacy Interventions, Media & Learning Association, https://media-and-learning.eu/type/featured-articles/ofcoms-toolkit-for-evaluating-media-literacy-interventions/.
[43] Neylan, J. et al. (2023), “How to “inoculate” against multimodal misinformation: A conceptual replication of Roozenbeek and van der Linden (2020)”, Scientific Reports, Vol. 13/1, https://doi.org/10.1038/s41598-023-43885-2.
[38] Norwegian Media Authority (2021), Critical Media Understanding in the Norwegian Population, https://www.medietilsynet.no/globalassets/publikasjoner/kritisk-medieforstaelse/211214-kmf_hovudrapport_med_engelsk_2021.pdf.
[20] Norwegian Media Authority (2021), Stop, think, check: How to expose fake news and misinformation, https://www.medietilsynet.no/english/stop-think-check-en/.
[48] OECD (2023), Drivers of Trust in Public Institutions in New Zealand, Building Trust in Public Institutions, OECD Publishing, Paris, https://doi.org/10.1787/948accf8-en.
[28] OECD (2023), “Good practice principles for public communication responses to mis- and disinformation”, OECD Public Governance Policy Papers, No. 30, OECD Publishing, Paris, https://doi.org/10.1787/6d141b44-en.
[1] OECD (2023), “What is resilience and how to operationalise it?”, OECD, Paris, https://www.oecd.org/dac/conflict-fragility-resilience/risk-resilience.
[67] OECD (2022), Building Trust and Reinforcing Democracy: Preparing the Ground for Government Action, OECD Public Governance Reviews, OECD Publishing, Paris, https://doi.org/10.1787/76972a4a-en.
[56] OECD (2022), Building Trust to Reinforce Democracy: Main Findings from the 2021 OECD Survey on Drivers of Trust in Public Institutions, Building Trust in Public Institutions, OECD Publishing, Paris, https://doi.org/10.1787/b407f99c-en.
[63] OECD (2022), “Misinformation and disinformation: An international effort using behavioural science to tackle the spread of misinformation”, OECD Public Governance Policy Papers, No. 21, OECD Publishing, Paris, https://doi.org/10.1787/b7709d4f-en.
[68] OECD (2022), OECD Guidelines for Citizen Participation Processes, OECD Public Governance Reviews, OECD Publishing, Paris, https://doi.org/10.1787/f765caf6-en.
[3] OECD (2022), The Protection and Promotion of Civic Space: Strengthening Alignment with International Standards and Guidance, OECD Publishing, Paris, https://doi.org/10.1787/d234e975-en.
[10] OECD (2022), Trends Shaping Education 2022, OECD Publishing, Paris, https://doi.org/10.1787/6ae8771a-en.
[2] OECD (2021), 21st-Century Readers: Developing Literacy Skills in a Digital World, OECD Publishing, https://doi.org/10.1787/a83d84cb-en.
[44] OECD (2021), OECD Report on Public Communication: The Global Context and the Way Forward, OECD Publishing, Paris, https://doi.org/10.1787/22f8031c-en.
[60] OECD (2020), Innovative Citizen Participation and New Democratic Institutions: Catching the Deliberative Wave, OECD Publishing, Paris, https://doi.org/10.1787/339306da-en.
[57] OECD (2017), “Recommendation of the Council on Open Government”, OECD Legal Instruments, OECD/LEGAL/0438, OECD, Paris, https://legalinstruments.oecd.org/en/instruments/OECD-LEGAL-0438.
[59] OECD (2010), Recommendation of the Council on Principles for Transparency and Integrity in Lobbying, https://legalinstruments.oecd.org/en/instruments/OECD-LEGAL-0379.
[49] OECD (forthcoming), Unlocking public communication’s potential for stronger democracy and increased trust.
[39] Ofcom (2023), A toolkit for evaluating media literacy interventions, https://www.ofcom.org.uk/research-and-data/media-literacy-research/approach/evaluate/toolkit.
[8] Ofcom (2023), Making Sense of Media, https://www.ofcom.org.uk/research-and-data/media-literacy-research.
[51] Pew Research (2019), Republicans far more likely than Democrats to say fact-checkers tend to favor one side, https://www.pewresearch.org/short-reads/2019/06/27/republicans-far-more-likely-than-democrats-to-say-fact-checkers-tend-to-favor-one-side/.
[17] Portuguese Regulatory Authority for the Media (2023), Media Literacy in Portugal: 1st Report under No. 2 of Article 33.A of the Audiovisual Media Services Directive, https://www.erc.pt/en/reports/media-literacy/1st-report-under-n-2-of-article-33-a-of-the-audiovisual-media-services-directive-eu-/.
[31] Republic of Lithuania (2022), National Threat Assessment 2022, State Security Department (VSD)/Defence Intelligence and Security Service under the Ministry of National Defence (AOTD), https://www.vsd.lt/wp-content/uploads/2022/04/ANGL-el-_.pdf.
[26] Roozenbeek, J. and S. van der Linden (2021), Don’t Just Debunk, Prebunk: Inoculate Yourself Against Digital Misinformation, https://www.spsp.org/news-center/blog/roozenbeek-van-der-linden-resisting-digital-misinformation.
[42] Roozenbeek, J. and S. van der Linden (2020), “Breaking Harmony Square: A game that “inoculates” against political misinformation”, Harvard Kennedy School Misinformation Review, https://doi.org/10.37016/mr-2020-47.
[29] Supo (2022), “Supo Yearbook 2021: Finns must be prepared for influencing efforts from Russia during NATO debate”, SUPO Finnish Security and Intelligence Service, https://supo.fi/en/-/supo-yearbook-2021-finns-must-be-prepared-for-influencing-efforts-from-russia-during-nato-debate.
[32] Swedish Security Service (n.d.), “Sweden Security Police Yearbooks”, https://www.sakerhetspolisen.se/om-sakerhetspolisen/publikationer/sakerhetspolisens-arsberattelse.htm (accessed on 15 February 2024).
[19] The Dutch Media Literacy Network (n.d.), “About Dutch Media Literacy Network”, https://netwerkmediawijsheid.nl/over-ons/about-dutch-media-literacy-network/ (accessed on 15 February 2024).
[37] UK Department for Digital, Culture, Media & Sport (2021), Online media literacy strategy, https://www.gov.uk/government/publications/online-media-literacy-strategy.
[9] UNESCO (2023), Media and information literacy, United Nations Educational, Scientific and Cultural Organization, https://www.unesco.org/en/media-information-literacy.
[27] Van der Linden, S. (2023), Foolproof: Why we fall for Misinformation and How to Build Immunity, 4th Estate.
[66] VERA.AI (2023), Project Summary: Facts & Figures, https://www.veraai.eu/project-summary (accessed on 19 October 2023).
[4] Wanless, A. and J. Shapiro (2022), A CERN Model for Studying the Information Environment, https://carnegieendowment.org/2022/11/17/cern-model-for-studying-information-environment-pub-88408.
Notes
← 1. For more information, see: https://www.mk.gov.lv/lv/media/14255/download
← 2. For additional information, see: https://www.isdatechtzo.nl/
← 3. For additional information, see OECD (2022[68]), OECD Guidelines for Citizen Participation Processes.