Countering Disinformation Effectively: An Evidence-Based Policy Guide

Key takeaways:
There is significant evidence that media literacy training can help people identify false stories and unreliable news sources. However, variation in pedagogical approaches means the effectiveness of one program does not necessarily imply the effectiveness of another. The most successful variants empower motivated individuals to take control of their media consumption and seek out high-quality information—instilling confidence and a sense of responsibility alongside skills development. While media literacy training shows promise, it faces challenges of speed, scale, and targeting. Reaching large numbers of people, including those most susceptible to disinformation, is expensive and takes many years.
Key sources:
- Monica Bulger and Patrick Davison, “The Promises, Challenges, and Futures of Media Literacy,” Data & Society, February 21, 2018, https://datasociety.net/library/the-promises-challenges-and-futures-of-media-literacy.
- Géraldine Wuyckens, Normand Landry, and Pierre Fastrez, “Untangling Media Literacy, Information Literacy, and Digital Literacy: A Systematic Meta-review of Core Concepts in Media Education,” Journal of Media Literacy Education 14, no. 1 (2022): https://digitalcommons.uri.edu/cgi/viewcontent.cgi?article=1531&context=jmle.
- Erin Murrock, Joy Amulya, Mehri Druckman, and Tetiana Liubyva, “Winning the War on State-Sponsored Propaganda: Gains in the Ability to Detect Disinformation a Year and a Half After Completing a Ukrainian News Media Literacy Program,” Journal of Media Literacy Education 10, no. 2 (2018): https://digitalcommons.uri.edu/cgi/viewcontent.cgi?article=1361&context=jmle.
Description and Use Cases
Increasing individuals’ media literacy through education and training is one of the most frequently recommended countermeasures against disinformation.1 Proponents argue that “media literacy and critical thinking are the first barrier to deception” and that teaching people these skills therefore enables them to better identify false claims.2 The National Association for Media Literacy Education defines media literacy as “the ability to access, analyze, evaluate, create, and act using all forms of communication.” However, scholars point to conceptual confusion around the term, and practitioners take many different approaches.3 Common goals include instilling knowledge of the media industry and journalistic practices, awareness of media manipulation and disinformation techniques, and familiarity with the internet and digital technologies.
Media literacy education initiatives target a range of different audiences, occur in multiple settings, and use a variety of methods—including intensive classroom-based coursework as well as short online videos and games. Many programs focus on children and adolescents,4 with research suggesting that young people are less familiar with the workings of the internet and digital media and more susceptible to online hoaxes and propaganda than commonly assumed.5 For example, a 2016 study of over 7,800 students found that many failed to distinguish sponsored content from news stories or to recognize untrustworthy websites in search results.6 Public education is therefore one major vehicle to reach large numbers of people early in their lives, alongside other kinds of youth programs. Aspects of media literacy have long been embedded in general education and liberal arts curricula in advanced democracies, especially in subjects that emphasize critical reading and thinking, such as language arts, essay writing, civics, and rhetoric. Public libraries have also historically promoted media literacy.
Not all media literacy programs target young people. After all, people don’t necessarily age out of their susceptibility to disinformation; in fact, older individuals seem more likely to share false stories on Facebook.7 Media literacy training for adults may take place at libraries, senior citizen centers, and recreational events, or in professional settings. Civil society organizations and government agencies have also run public awareness campaigns and released gamified education tools. For example, Sweden established a Psychological Defence Agency in 2022. Its responsibilities include leading “training, exercises and knowledge development” to help residents “identify and counter foreign malign information influence, disinformation and other dissemination of misleading information directed at Sweden.”8
One valuable case study is the International Research and Exchanges Board (IREX)’s Learn to Discern program, which has used a “train the trainers” approach in Ukraine and a number of other countries since 2015. This program equips volunteers to deliver a media literacy curriculum to members of their community.9 Reaching more vulnerable adults (for example, racial and ethnic minorities and those with fewer economic resources, less education, or less experience with the internet) is a policy priority for governments focused on media literacy.10
How Much Do We Know?
The body of scholarship on media literacy is large relative to that on most other disinformation countermeasures. For example, a 2022 literature review on digital literacy—one component of media literacy—found forty-three English-language studies since 2001, with thirty-three of these published since 2017, when interest in the topic swelled.11 The existence of dedicated journals and conferences is another indicator of growth in this subfield. For example, the National Association for Media Literacy Education published the first issue of the Journal of Media Literacy Education in 2009.12 Other major repositories of research on media literacy include a database maintained by the United Nations Alliance of Civilizations.13
Review of this literature shows that specific media literacy approaches have a strong theoretical basis and a large body of experimental evidence. However, variation in pedagogical approaches means the effectiveness of one program does not necessarily imply the effectiveness of another.14 Moreover, the lack of robust mechanisms for collecting data on classroom activities is a recognized gap. In 2018, the Media Literacy Programme Fund in the United Kingdom (considered a leader in media literacy education) cited grants to support evaluation as a priority.15 Since then, several studies have conducted real-time evaluation and sought to measure lasting improvements in student performance. Additional studies could expand the menu of possible approaches to evaluation; it would also be useful to further examine the effectiveness of media literacy training for atypical individuals at the extremes, such as those who are especially motivated by partisanship, conspiracy theories, or radical ideologies.
How Effective Does It Seem?
There is significant evidence that media literacy training can help people identify false stories and unreliable news sources.16 Scholars sometimes refer to this as inoculation, because “preemptively exposing, warning, and familiarising people with the strategies used in the production of fake news helps confer cognitive immunity when exposed to real misinformation.”17 One experiment found that playing an online browser game designed to expose players to six different disinformation strategies reduced subjects’ susceptibility to false claims, especially among those users who were initially most vulnerable to being misled. Such laboratory findings are bolstered by studies of larger, real-world interventions. An evaluation of IREX’s Learn to Discern program found durable increases in good media consumption habits, such as checking multiple sources, lasting up to eighteen months after delivery of the training.18 Other studies support teaching students to read “laterally”—using additional, trusted sources to corroborate suspect information.19
Because media literacy comes in many forms, it is important to assess which variants are most effective at reducing belief in false stories so trainers and educators can prioritize them. Research suggests that the most successful variants empower motivated individuals to take control of their media consumption and seek out high-quality information. This has been described as “actionable skepticism,” or sometimes simply as “information literacy.”20 For example, a 2019 study in American Behavioral Scientist examined various factors that might enable someone to recognize false news stories. The authors found that people’s “abilities to navigate and find information online that is verified and reliable”—for example, differentiating between an encyclopedia and a scientific journal—were an important predictor. In contrast, subjects’ understanding of the media industry and journalistic practices or their self-reported ability to “critically consume, question, and analyze information” were not predictive.21 Later research based on survey data also supported these findings.22
Importantly, multiple studies have shown that effective media literacy depends not only on people’s skills but also on their feelings and self-perceptions. Specifically, individuals who feel confident in their ability to find high-quality news sources, and who feel responsible for proactively doing so, are less likely to believe misleading claims. This factor is often called an individual’s “locus of control,” and it has been identified as important in studies of multiple nationally and demographically diverse populations.23 People who purposefully curate their information diet are less likely to be misled; passive consumers, on the other hand, are more vulnerable. However, this may be truer of typical news consumers than of outliers like extremists and very motivated partisans. The latter groups might self-report confidence in curating their media diet while nevertheless selecting for misleading, radical, or hyper-partisan sources.
A growing body of recent literature based on large-scale classroom studies shows how specific techniques can provide news consumers with greater agency and ability to seek out accurate information.24 Whereas past forms of online media literacy education often focused on identifying markers of suspicious websites—like typographical errors or other indicators of low quality—these signs are less useful in the modern information environment, where sources of misinformation can achieve the appearance of high production value at low cost.25 Recent studies have shown that lateral reading is more effective.26 In one study of students at a public college in the northeastern United States, only 12 percent of subjects used lateral reading before receiving training on how to do so; afterward, more than half did, and students showed an overall greater ability to discern true claims from fictional ones.27 A similar study of university students in California found that these effects endured after five weeks.28 Another one-day exercise with American middle school students found that students had a difficult time overcoming impressions formed from “superficial features” on websites; its authors concluded that students should be trained to recognize different types of information sources, question the motivation behind them, and—crucially—compare those sources with known trustworthy sites.29
Teaching people to recognize unreliable news sources and common media manipulation tactics becomes even more effective when participants are also able to improve their locus of control, according to academic research and program evaluations. In a study of media literacy among 500 teenagers, researchers found that students with higher locus of control were more resilient against false stories. In another study based on survey data, researchers found that individuals who exhibited high locus of control and the ability to identify false stories were more likely to take corrective action on social media, such as reporting to the platform or educating the poster.30 (The participatory nature of social media increases the importance of educating users not only on how to recognize untrustworthy content but also on how to respond to and avoid sharing it.31)
Evaluations of IREX’s Learn to Discern program in Ukraine and a similar program run by PEN America in the United States shed further light on locus of control. These curricula’s focus on identifying untrustworthy content led subjects to become overly skeptical of all media. While trainees’ ability to identify disinformation and their knowledge of the news media increased, their locus of control changed only slightly. Ultimately, trainees’ ability to identify accurate news stories did not improve, and they remained distrustful of the media as a whole.32 A major challenge, then, is news consumers who feel under threat from the information environment rather than empowered to inform themselves. One potential intervention point could be social media platforms, which can provide tools and make other design choices to help users compare on-platform information with credible external sources (see case study 4). This could reinforce users’ locus of control while assisting them in exercising it.
Educators should be mindful of media literacy expert Paul Mihailidis’s warning that “critical thought can quickly become cynical thought.”33 In a 2018 essay, media scholar danah boyd argued that individuals who are both cynical about institutions and equipped to critique them can become believers in, and advocates for, conspiracy theories and disinformation. To avoid this trap, media literacy education must be designed carefully. This means empowering people to engage with media critically, constructively, and discerningly rather than through the lenses of undifferentiated paranoia and distrust.34
How Easily Does It Scale?
While media literacy training shows promise, it faces challenges of speed, scale, and targeting. Many approaches will take years to reach large numbers of people, including many vulnerable and hard-to-reach populations. Attempts to reach scale through faster, leaner approaches, like gamified online modules or community-based efforts to train the trainers, depend on voluntary participation and are therefore most likely to reach already motivated individuals rather than large percentages of the public.
Many media literacy projects are not particularly expensive to deliver to small audiences. However, achieving wide impact requires high-scale delivery, such as integrating media literacy into major institutions like public education—a costly proposition. When a proposed 2010 bill in the U.S. Congress, the Healthy Media for Youth Act, called for $40 million for youth media literacy initiatives, leading scholars deemed the amount insufficient and advocated for larger financial commitments from the government, foundations, and the private sector.35
Once the resources and curricula are in place, it will still take time to develop the necessary infrastructure to implement large-scale media literacy programs. For example, hiring skilled educators is a critical yet difficult task. Studies from the European Union and South Africa both identified major deficiencies in teachers’ own abilities to define core media literacy concepts or practice those concepts themselves.36
Notes
1 For examples, see Lucas and Pomerantsev, “Winning the Information War”; Katarína Klingová and Daniel Milo, “Countering Information War Lessons Learned from NATO and Partner Countries: Recommendations and Conclusions,” GLOBSEC, February 2017, https://www.globsec.org/what-we-do/publications/countering-information-war-lessons-learned-nato-and-partner-countries; Claire Wardle and Hossein Derakhshan, “Information Disorder: Toward an Interdisciplinary Framework for Research and Policy Making,” Council of Europe, September 2017, https://rm.coe.int/information-disorder-toward-an-interdisciplinary-framework-for-researc/168076277c; Daniel Fried and Alina Polyakova, “Democratic Defense Against Disinformation,” Atlantic Council, February 2018, https://www.atlanticcouncil.org/wp-content/uploads/2018/03/Democratic_Defense_Against_Disinformation_FINAL.pdf; “A Multi-Dimensional Approach,” European Commission; Erik Brattberg and Tim Maurer, “Russian Election Interference: Europe’s Counter to Fake News and Cyber Attacks,” Carnegie Endowment for International Peace, May 2018, https://carnegieendowment.org/files/CP_333_BrattbergMaurer_Russia_Elections_Interference_FINAL.pdf; “Action Plan Against Disinformation,” European Commission, May 2018, https://www.eeas.europa.eu/node/54866_en; Fly, Rosenberger, and Salvo, “The ASD Policy Blueprint”; Jean-Baptiste Jeangène Vilmer, Alexandre Escorcia, Marine Guillaume, and Janaina Herrera, “Information Manipulation: A Challenge for Our Democracies,” French Ministry for Europe and Foreign Affairs and the Institute for Strategic Research, August 2018, https://www.diplomatie.gouv.fr/IMG/pdf/information_manipulation_rvb_cle838736.pdf; Todd C. Helmus et al., “Russian Social Media Influence: Understanding Russian Propaganda in Eastern Europe,” RAND Corporation, 2018, https://www.rand.org/pubs/research_reports/RR2237.html; and Paul Barrett, “Tackling Domestic Disinformation: What the Social Media Companies Need to Do,” New York University, March 2019, https://issuu.com/nyusterncenterforbusinessandhumanri/docs/nyu_domestic_disinformation_digital?e=31640827/68184927.
2 Klingová and Milo, “Countering Information War.”
3 “Media Literacy Defined,” National Association for Media Literacy Education, accessed February 13, 2023, https://namle.net/resources/media-literacy-defined/. See also Monica Bulger and Patrick Davison, “The Promises, Challenges, and Futures of Media Literacy,” Data & Society, February 21, 2018, https://datasociety.net/library/the-promises-challenges-and-futures-of-media-literacy/; and Géraldine Wuyckens, Normand Landry, and Pierre Fastrez, “Untangling Media Literacy, Information Literacy, and Digital Literacy: A Systematic Meta-review of Core Concepts in Media Education,” Journal of Media Literacy Education 14, no. 1 (2022): https://digitalcommons.uri.edu/cgi/viewcontent.cgi?article=1531&context=jmle.
4 Renee Hobbs, “Digital and Media Literacy: A Plan of Action,” Aspen Institute, 2010, https://www.aspeninstitute.org/wp-content/uploads/2010/11/Digital_and_Media_Literacy.pdf; and “Online Media Literacy Strategy,” UK Department for Digital, Culture, Media, & Sport, July 2021, https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/1004233/DCMS_Media_Literacy_Report_Roll_Out_Accessible_PDF.pdf.
5 Tiffany Hsu, “When Teens Find Misinformation, These Teachers Are Ready,” New York Times, September 8, 2022, https://www.nytimes.com/2022/09/08/technology/misinformation-students-media-literacy.html; “A Global Study on Information Literacy: Understanding Generational Behaviors and Concerns Around False and Misleading Information Online,” Poynter Institute, August 2022, https://www.poynter.org/wp-content/uploads/2022/08/A-Global-Study-on-Information-Literacy-1.pdf; and Elena-Alexandra Dumitru, “Testing Children and Adolescents’ Ability to Identify Fake News: A Combined Design of Quasi-Experiment and Group Discussions,” Societies 10, no. 3 (September 2020): https://www.mdpi.com/2075-4698/10/3/71/htm.
6 Sam Wineburg, Sarah McGrew, Joel Breakstone, and Teresa Ortega, “Evaluating Information: The Cornerstone of Civic Online Reasoning,” Stanford Digital Repository, November 22, 2016, https://purl.stanford.edu/fv751yt5934.
7 For evidence that older users are more likely to share false stories on Facebook, see Andrew Guess, Jonathan Nagler, and Joshua Tucker, “Less Than You Think: Prevalence and Predictors of Fake News Dissemination on Facebook,” Science Advances 5, no. 1 (2019): https://www.science.org/doi/10.1126/sciadv.aau4586.
8 Elisabeth Braw, “Create a Psychological Defence Agency to ‘Prebunk’ Fake News,” Prospect, December 8, 2022, https://www.prospectmagazine.co.uk/politics/60291/create-a-psychological-defence-agency-to-prebunk-fake-news; and Adela Suliman, “Sweden Sets Up Psychological Defense Agency to Fight Fake News, Foreign Interference,” Washington Post, January 6, 2022, https://www.washingtonpost.com/world/2022/01/06/sweden-fake-news-psychological-defence-agency.
9 Erin Murrock, Joy Amulya, Mehri Druckman, and Tetiana Liubyva, “Winning the War on State-Sponsored Propaganda: Gains in the Ability to Detect Disinformation a Year and a Half After Completing a Ukrainian News Media Literacy Program,” Journal of Media Literacy Education 10, no. 2 (2018): https://digitalcommons.uri.edu/cgi/viewcontent.cgi?article=1361&context=jmle.
10 “Online Media Literacy Strategy,” UK Department for Digital, Culture, Media, & Sport; and Kara Brisson-Boivin and Samantha McAleese, “From Access to Engagement: Building a Digital Media Literacy Strategy for Canada,” MediaSmarts, 2022, https://mediasmarts.ca/research-reports/access-engagement-building-digital-media-literacy-strategy-canada.
11 Hasan Tinmaz, Yoo-Taek Lee, Mina Fanea-Ivanovici, and Hasnan Baber, “A Systematic Review on Digital Literacy,” Smart Learning Environments 9 (2022), https://slejournal.springeropen.com/articles/10.1186/s40561-022-00204-y.
12 “History,” National Association for Media Literacy Education, accessed February 13, 2023, https://namle.net/about/history/.
13 “Media & Information Literacy,” UN Alliance of Civilizations, accessed March 26, 2023, https://milunesco.unaoc.org/mil-organizations/acma-digital-media-literacy-research-program.
14 “Media & Information Literacy,” UN Alliance of Civilizations.
15 “Online Media Literacy Strategy,” UK Department for Digital, Culture, Media, & Sport; “Media Literacy Programme Fund,” Government of the United Kingdom, accessed March 26, 2023, https://www.gov.uk/guidance/media-literacy-programme-fund; and Bulger and Davison, “Promises, Challenges, and Futures.”
16 Consider Bulger and Davison, “Promises, Challenges, and Futures,” as well as Theodora Dame Adjin-Tettey, “Combating Fake News, Disinformation, and Misinformation: Experimental Evidence for Media Literacy Education,” Cogent Arts & Humanities 9 (2022): https://www.tandfonline.com/doi/full/10.1080/23311983.2022.2037229.
17 Jon Roozenbeek and Sander van der Linden, “Fake News Game Confers Psychological Resistance Against Online Misinformation,” Palgrave Communications 5 (2019): https://www.nature.com/articles/s41599-019-0279-9.
18 Murrock, Amulya, Druckman, and Liubyva, “Winning the War.”
19 Carl-Anton Werner Axelsson, Mona Guath, and Thomas Nygren, “Learning How to Separate Fake From Real News: Scalable Digital Tutorials Promoting Students’ Civic Online Reasoning,” Future Internet 13, no. 3 (2021): https://www.mdpi.com/1999-5903/13/3/60.
20 Jennifer Fleming, “Media Literacy, News Literacy, or News Appreciation? A Case Study of the News Literacy Program at Stony Brook University,” Journalism & Mass Communication Educator 69, no. 2 (2013): https://journals.sagepub.com/doi/abs/10.1177/1077695813517885.
21 Because the measurement of media literacy was self-reported, the study posits this as an example of the “Dunning-Kruger effect”: an individual’s (over)confidence in their ability to critically consume media is related to their susceptibility to deception. See Mo Jones-Jang, Tara Mortensen, and Jingjing Liu, “Does Media Literacy Help Identification of Fake News? Information Literacy Helps, but Other Literacies Don’t,” American Behavioral Scientist (August 2019): https://www.researchgate.net/publication/335352499_Does_Media_Literacy_Help_Identification_of_Fake_News_Information_Literacy_Helps_but_Other_Literacies_Don’t.
22 Brigitte Huber, Porismita Borah, and Homero Gil de Zúñiga, “Taking Corrective Action When Exposed to Fake News: The Role of Fake News Literacy,” Journal of Media Literacy Education 14 (July 2022): https://www.researchgate.net/publication/362513295_Taking_corrective_action_when_exposed_to_fake_news_The_role_of_fake_news_literacy.
23 Murrock, Amulya, Druckman, and Liubyva, “Winning the War”; and “Impact Report: Evaluating PEN America’s Media Literacy Program,” PEN America & Stanford Social Media Lab, September 2022, https://pen.org/report/the-impact-of-community-based-digital-literacy-interventions-on-disinformation-resilience. See also Yan Su, Danielle Ka Lai Lee, and Xizhu Xiao, “‘I Enjoy Thinking Critically, and I’m in Control’: Examining the Influences of Media Literacy Factors on Misperceptions Amidst the COVID-19 Infodemic,” Computers in Human Behavior 128 (2022): https://www.sciencedirect.com/science/article/pii/S0747563221004349, a study based on subjects in China. The similarity of findings across the United States, Ukraine, and China—despite significant differences in the three countries’ media systems and histories—is noteworthy.
24 See generally: Folco Panizza et al., “Lateral Reading and Monetary Incentives to Spot Disinformation About Science,” Scientific Reports 12 (2022): https://www.nature.com/articles/s41598-022-09168-y; Sam Wineburg et al., “Lateral Reading on the Open Internet: A District-Wide Field Study in High School Government Classes,” Journal of Educational Psychology 114, no. 5 (2022): https://www.studocu.com/id/document/universitas-kristen-satya-wacana/social-psychology/lateral-reading-on-the-open-internet-a-district-wide-field-study-in-high-school-government-classes/45457099; and Joel Breakstone et al., “Lateral Reading: College Students Learn to Critically Evaluate Internet Sources in an Online Course,” Harvard Kennedy School Misinformation Review 2 (2021), https://misinforeview.hks.harvard.edu/article/lateral-reading-college-students-learn-to-critically-evaluate-internet-sources-in-an-online-course.
25 For more on this method, its success in classroom trials, and its departure from previous forms of media literacy education, see D. Pavlounis, J. Johnston, J. Brodsky, and P. Brooks, “The Digital Media Literacy Gap: How to Build Widespread Resilience to False and Misleading Information Using Evidence-Based Classroom Tools,” CIVIX Canada, November 2021, https://ctrl-f.ca/en/wp-content/uploads/2021/11/The-Digital-Media-Literacy-Gap-Nov-7.pdf.
26 Axelsson, Guath, and Nygren, “Learning How to Separate.”
27 Jessica E. Brodsky et al., “Associations Between Online Instruction in Lateral Reading Strategies and Fact-Checking COVID-19 News Among College Students,” AERA Open (2021): https://journals.sagepub.com/doi/full/10.1177/23328584211038937.
28 Sarah McGrew, Mark Smith, Joel Breakstone, Teresa Ortega, and Sam Wineburg, “Improving University Students’ Web Savvy: An Intervention Study,” British Journal of Educational Psychology 89, no. 3 (September 2019): https://bpspsychub.onlinelibrary.wiley.com/doi/10.1111/bjep.12279.
29 Angela Kohnen, Gillian Mertens, and Shelby Boehm, “Can Middle Schoolers Learn to Read the Web Like Experts? Possibilities and Limits of a Strategy-Based Intervention,” Journal of Media Literacy Education 12, no. 2 (2020): https://digitalcommons.uri.edu/cgi/viewcontent.cgi?article=1457&context=jmle.
30 Adam Maksl, Seth Ashley, and Stephanie Craft, “Measuring News Media Literacy,” Journal of Media Literacy Education 6 (2015), https://digitalcommons.uri.edu/jmle/vol6/iss3/3/; and Huber, Borah, and Gil de Zúñiga, “Taking Corrective Action.”
31 Bulger and Davison, “Promises, Challenges, and Futures.”
32 Like Jones-Jang, Mortensen, and Liu, the authors of the IREX evaluation suggest that the “false sense of control” already felt by individuals who did not receive media literacy training may also partially explain the relatively small improvements in these subjects’ locus of control.
33 Paul Mihailidis, “Beyond Cynicism: Media Education and Civic Learning Outcomes in the University,” International Journal of Learning and Media 1, no. 3 (August 2009): https://www.researchgate.net/publication/250958225_Beyond_Cynicism_Media_Education_and_Civic_Learning_Outcomes_in_the_University.
34 danah boyd, “You Think You Want Media Literacy… Do You?,” apophenia (blog), March 9, 2018, https://www.zephoria.org/thoughts/archives/2018/03/09/you-think-you-want-media-literacy-do-you.html.
35 Hobbs, “Digital and Media Literacy.”
36 Sandy Zinn, Christine Stilwell, and Ruth Hoskins, “Information Literacy Education in the South African Classroom: Reflections from Teachers’ Journals in the Western Cape Province,” Libri 66 (April 2016): https://www.degruyter.com/document/doi/10.1515/libri-2015-0102/html; and Maria Ranieri, Isabella Bruni, and Anne-Claire Orban de Xivry, “Teachers’ Professional Development on Digital and Media Literacy. Findings and Recommendations From a European Project,” Research on Education and Media 9, no. 2 (2017): https://sciendo.com/article/10.1515/rem-2017-0009.