CPHA Canvax

Beware the public opinion survey’s contribution to misinformation and disinformation in the COVID-19 pandemic

Noni E. MacDonald, Eve Dubé, Devon Greyson, Janice E. Graham


MacDonald NE MD MSc1, Dubé E PhD2, Greyson D PhD3, Graham JE PhD1,4.
1.    Department of Pediatrics, Dalhousie University, IWK Health Centre, Halifax, Nova Scotia, Canada
2.    Institut national de santé publique du Québec and Université Laval, Québec, Québec, Canada
3.    Department of Communication, College of Social and Behavioral Sciences, University of Massachusetts, Amherst, Massachusetts, USA
4.    Technoscience and Regulation Research Unit, Faculty of Medicine, Dalhousie University, Halifax, Nova Scotia, Canada

Corresponding Author: Noni MacDonald
Funding: None for this commentary
Declaration of Interests: All authors are salaried by their respective universities or government departments.

Keywords: public surveys, misinformation, disinformation, COVID-19, science denialism


The COVID-19 pandemic has been accompanied by an “infodemic” of misinformation and disinformation (1). Given the large degree of uncertainty, the complexity of the science, and rapidly evolving knowledge, well-intentioned misinformation is not surprising. As scientists race to understand a new disease, partial information and guesswork fill the gap until reliable research evidence is established. Unfortunately, disinformation, defined as deliberately false or misleading information (2), can be expected when crises are used as opportunities to make money (3) or to undermine existing institutions, including education and health care systems. Regardless of intention, misleading information can spread rapidly in the era of social media and 24/7 news coverage, aided by the influence of fear, anxiety, and stress on learning, beliefs, and health decisions. It is therefore incumbent on those conducting rapid COVID-19 research, especially research associated with public awareness and knowledge translation, to avoid contributing to the spread of misinformation and disinformation through their work.

Numerous surveys aiming to assess how COVID-19 health messages are received by the public are being conducted by governments, academics, and opinion polling companies around the globe. One needs merely to land on a news media website to be met with the results of yet another survey or to be invited to vote in a poll. Surveillance of public opinions and awareness is an important part of public health countermeasures against COVID-19 (4). It is critical, however, that such surveys be carefully designed to avoid contributing to “information overload” and the inadvertent propagation of inaccurate messaging. While many surveys are rigorously designed and grounded in the social and behavioural science literature, such rigour is hardly universal, and we fear that the rush to collect time-sensitive data risks overlooking survey-design and educational messaging principles. For example, a poll by Leger and the Association for Canadian Studies, reported on April 28, 2020, found that 60% of respondents believed the vaccine should be mandatory (Canadians divided over making COVID-19 vaccine mandatory: Poll). Asking about mandatory vaccination leaves the erroneous impression that such a policy is being contemplated, even though vaccine supplies will be inadequate to cover Canada’s population when vaccines first become available, making mandatory vaccination impossible. The mandatory vaccination question nonetheless resurfaced in news stories and discussions for several weeks.

Assessing public knowledge about SARS-CoV-2 and COVID-19 when the science is rapidly evolving poses unique difficulties. One of these is disambiguating “good” and “bad” uncertainty. Survey designers must be cautious not to conflate a lack of confidence in public health messaging (an undesirable outcome) with science-based uncertainty (a reasonable stance, given the current state of knowledge). Questions about lack of confidence in public health information should be carefully worded to target trust in institutions such as government, medicine, science, and corporations, so that accurate assessments of an emerging science are not misclassified as a lack of trust in public health. The distinction matters because the effective responses to these two types of uncertainty differ: science-based uncertainty calls for clear communication about the science as it becomes less uncertain, whereas lack of confidence in public health requires community outreach to build trust in science.

A second potential pitfall comes with knowledge-testing questions aiming to identify public knowledge gaps and false beliefs. While this information is extremely valuable for tailoring public health messages, it is critical to avoid reinforcing incorrect answers to knowledge questions. Multiple-choice knowledge-testing questions, however, are prone to invalid responses, as test-takers are biased toward familiar answers, even if incorrect (5). Further, such questions risk leading respondents to later misremember the false response options as true (6, 7). In particular, we caution against asking the public to respond to questions for which the scientific data are still being collected (e.g., on the length of the incubation period, modes and risks of transmission, or symptoms of COVID-19). Not only do such questions risk creating an inaccurate picture of public knowledge, but they may also reinforce incorrect answers.

Science involves hypothesis-testing and study replication; a degree of uncertainty is expected within the scientific community. To the public, however, newly discovered evidence that changes previous public health recommendations can be perceived as ignorance, confusion, or even deliberate obfuscation – particularly against a background media environment that whirls in a cacophony of misinformation and disinformation. We must, therefore, ask ourselves with each survey: Are we being helpful or harmful when we ask knowledge questions early in a pandemic, while the guidance itself is still changing?

Given the challenges inherent in successfully correcting misinformation (8), and particularly in correcting entrenched health myths (9), surveys should be designed carefully in consultation with experts in the psychology of survey design and in countering misinformation. There are other means to gather this information than true/false or multiple-choice questionnaires that risk planting seeds of confusion in those unaware of the misleading information – especially if the survey comes from trusted authorities. Observational qualitative research or social media listening (10) can identify trends in rumours and false information circulating among the public without contributing to misinformation. Returning to the mandatory vaccination survey question, a more helpful query might have been the open-ended question "Who should have first access to the COVID-19 vaccine when it becomes available, and why?"

As COVID-19 science grows, public health, health care, and policy guidance must be linked to the emerging evidence (11). Misinformation and disinformation must be vigorously countered. Public health should be forthright and not apologize for changes in policy as new evidence becomes available. Building a research base for public health emergency knowledge-testing on COVID-19, in this time of discordant messages, is essential. Current studies should themselves be studied to refine our public health information research methods. For those undertaking these COVID-19 surveys, please be careful not to contribute to misinformation and disinformation. In keeping with the core principle of the Hippocratic Oath, to abstain from whatever is deleterious and mischievous, “first do no harm” needs to apply to what is included in health surveys as well.


References

  1. World Health Organization. Novel Coronavirus (2019-nCoV) Situation Report - 13 [Internet]. 2020 [cited 2020 Apr 29]. Available from: https://www.who.int/docs/default-source/coronaviruse/situation-reports/20200202-sitrep-13-ncov-v3.pdf
  2. Jack C. Lexicon of Lies: Terms for Problematic Information [Internet]. New York, NY: Data & Society Research Institute; 2017 [updated 2017 Aug 9; cited 2020 Apr 13]. Available from: https://datasociety.net/library/lexicon-of-lies/
  3. Europol. Pandemic profiteering: How criminals exploit the COVID-19 crisis [Internet]. Europol Report; 2020 [updated 2020 Mar 27, cited 2020 Mar 27]. Available from: https://www.europol.europa.eu/publications-documents/pandemic-profiteering-how-criminals-exploit-covid-19-crisis
  4. Betsch C, Wieler LH, Habersaat K, and the COSMO group. Monitoring behavioural insights related to COVID-19. The Lancet [Internet]. 2020 April 2;395(10232):1255–6. Available from: https://www.thelancet.com/journals/lancet/article/PIIS0140-6736(20)30729-7/fulltext
  5. Xu X, Kauer S, Tupy S. Multiple-choice questions: Tips for optimizing assessment in-seat and online. Scholarship of Teaching and Learning in Psychology. 2016;2(2):147–158.
  6. Roediger HL III, Marsh EJ. The positive and negative consequences of multiple choice testing. Journal of Experimental Psychology: Learning, Memory, and Cognition. 2005;31:1155–1159.
  7. Marsh EJ, Roediger HL III, Bjork RA, Bjork EL. The memorial consequences of multiple-choice testing. Psychonomic Bulletin & Review. 2007;14:194–199.
  8. Lewandowsky S, Ecker UK, Seifert CM, Schwarz N, Cook J. Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest. 2012;13(3):106–131.
  9. Nyhan B, Reifler J. Does correcting myths about the flu vaccine work? An experimental evaluation of the effects of corrective information. Vaccine. 2015;33(3):459–464.
  10. Taylor J, Pagliari C. Comprehensive scoping review of health research using social media data. BMJ Open. 2018;8(12):e022931.
  11. Rochwerg B, Parke R, Murthy S, Fernando SM, Leigh JP, Marshall J, Adhikari NKJ, Fiest K, Fowler R, Lamontagne F, Sevransky JE. Misinformation during the coronavirus disease 2019 outbreak: how knowledge emerges from noise. Critical Care Explorations. 2020 Apr;2(4).