Article published on the ASSET website
Some things just do not want to die. In public health, anti-vaccination movements keep stirring up debate, just as they did in the 19th century. At the same time, the “deficit model” of science communication – the myth that the “public” is simply ignorant, and would support science if spoon-fed information from the ivory tower – still haunts the relationship between health, science and the community, despite having been repeatedly debunked. The two zombies are more closely related than one might think. Vaccine hesitancy and anti-vaccination movements grow in the cracks between trust and knowledge, and these are the fault lines that communication should heal – or rip apart, if it fails.
An interesting case in point is Italy, where vaccination is the talk of the town. On Friday, May 19, in response to falling immunization rates, lawmakers decreed that compliance with a number of vaccinations would become a requirement for children to be allowed in school, sparking a significant debate. Many feel that the decree is too weak, likely to yield easily when challenged in court by parents demanding a right to education regardless of their personal health choices. Others instead worry that such a heavy-handed approach, without a focused vaccine awareness campaign, could further increase distrust in health and political institutions.
In the last couple of years, a prominent media figure has emerged in the heated vaccination debate in Italy. Professor Roberto Burioni, an expert academic microbiologist, has amassed a surprising media following for a scientist (and even more so for an Italian one) by sternly but rigorously and clearly rebuking anti-vaccination myths on social media. He is now regularly invited on TV programs to discuss vaccinations, and has a strong core of supporters. Burioni also employs a distinctly top-down approach – an approach that, in vaccine communication, is known not to work, or even to be counterproductive. His recent popular science book on vaccines features the not exactly soothing title Vaccines are not an opinion: Vaccinations, explained to those who just refuse to understand them. After a rush of comments in January blaming immigrants for introducing meningitis, Burioni snapped on his Facebook page with the following remark, which quickly went viral and was widely debated online and in the press:
“I want to clarify that this page is not a place where people who know nothing can have a 'civilized discussion' on the same footing as myself. […] Anyone can personally check the truth of what I report. However, they cannot debate it with me. I hope to settle the issue: here, only those who have studied have the right to speak, not the common citizen. Science is not democratic.”
So much for all the painstaking attempts at a dialogue-based approach to health communication, and for those who believe that health decisions should not be shoved down citizens' throats without their understanding. When the post went viral, the science and health communication community in Italy panicked, trying to remind everyone that a top-down attitude can further polarize public opinion, entrenching anti-vaccination attitudes and pushing fence-sitters away from vaccination. On the other hand, many scientists and science fans consider Burioni a charismatic figure who speaks his mind clearly and “puts anti-vaccinationists in their place”, accusing his critics of simply being envious of his success. The debate smouldered for months on social media, and is currently being revived by the recent decree on compulsory vaccination. It has split a camp that fights for the same aim: a society that makes sensible health choices. What went wrong?
It is true that vaccine communication is still in trouble. While we know what does not work, there is still little consensus on what actually works: a review of reviews on strategies intended to address vaccine hesitancy found “no strong evidence to recommend any specific intervention”. It is also true that many health communication interventions happen away from the media spotlight, giving an impression of absence, while scientist media stars seem to do all the hard work. But perhaps science communicators must acknowledge failure on their own turf – educating the scientific community on how to communicate. No matter how much evidence shows that it fails at counteracting anti-scientific sentiments in the public(s), the deficit model is still standing – so much so that the journal Public Understanding of Science devoted half of its May 2016 issue to essays trying to explain its persistence. The attitude of the Italian microbiologist, indeed, is hardly unique. Almost identical declarations that “science is not a democracy” have been made repeatedly by scientists and science writers. Most remarkably, scientists worldwide gathered in a March for Science on April 22, 2017, apparently a huge public outreach success. Yet critics pointed out that the March aimed more at instructing than engaging, citing as evidence, for example, the AAAS statement endorsing it.
Ironically, papers lamenting this stubborn survival tend to assume a deficit model themselves, calling for more education of scientists and medical professionals on the issue (e.g. Besley and Tanner, 2011). This is not necessarily wrong – there is evidence that communication training helps. Yet at its core the problem is not a knowledge deficit (see the pattern here?). It is cultural. As pointed out by Simis et al., scientists support a deficit model because of their cultural assumptions. Trained to be rational agents, they assume their public will act as rational agents as well. They often assume an “us vs. them” distinction between themselves and the rest of the public. They lack formal training in communication. Finally, researchers in STEM tend to have a dim view of “soft” disciplines such as the social sciences, and such a dim view is correlated with endorsement of the deficit model. In short, these scientists do not trust the experts – in this case, the experts on communication. Scientists choose which science to trust and which not to, like anyone else, depending on their own experience, knowledge and beliefs.
There is a remarkable parallel with the dynamics of anti-vaccination and other pseudoscientific movements. Scientists and physicians are often frustrated at how the public distrusts experts – and, consequently, they do not trust the public. When Burioni declares that “science is not democratic”, he means that only what experts say matters. However, it is not true that people do not believe experts at all: more often, they choose which experts to trust. In this case, of course, “expert” means “whoever the public feels has trustworthy knowledge on the subject”. Nonetheless, more often than not, these are people who would be considered experts by many definitions of the word. Andrew Wakefield, after all, was a physician who published his bogus vaccine-autism correlation study in The Lancet. Vaccine skepticism is often endorsed by healthcare professionals. Worse, there are plenty of Nobel Prize winners who publicly hold pseudoscientific beliefs, from Luc Montagnier's belief in homeopathy to Kary Mullis's HIV/AIDS denialism – so much so that the expression “Nobel disease” has been coined. Why should a layperson not believe them? Are they not as qualified as humanly possible?
The deficit model thus sows the seeds of its own demise. If we communicate medicine and science as a one-way interaction between omniscient self-declared experts and a passive public, the public will internalize this hierarchy. The moment the omniscient expert fails (being, in fact, hardly omniscient), the feeling of betrayal will readily destroy trust in that expert. But the overarching frame of blind belief in a figure of trust will lead to the search for another authority – more or less real – more in tune with personal narratives or experiences. Communication becomes a competition between magic pipers, with the public left with few or no tools to understand which one to follow. This is further exacerbated by contemporary social media dynamics, which promote polarization and isolated echo chambers or bubbles, incapable of communicating with each other.
Indeed, there are concerns that such polarization is transforming support for science, in popular culture, into an ideological faction rather than a basis on which to build consensus. Consequently, the current vaccine debate in Italy is increasingly fractured along political lines. Some comments on social media are even beginning to conflate support for vaccination with unrelated issues such as support for abortion, euthanasia and even same-sex marriage.
Is there a way out? An alternative would be to reposition trust: away from experts or qualifications and towards the knowledge-generating process of science and medicine. The attitude should therefore shift from “believe us, because we are experts” to “believe this, because it is the result of a robust process”. It is hard to trust knowledge that appears from behind curtains; it is easier, perhaps, if we show the public(s) how it is made and engage them in the making.
This means more than many current attempts at “citizen science” do, where citizens are “engaged” by helping researchers with menial, highly parallelizable tasks. It requires engaging the public in a conversation about the practical process of building scientific knowledge. It would show, for example, that science is self-correcting in the long run, while pseudosciences tend to be static and deaf to evidence against them. Concerns about political, societal and economic bias in medicine and science should be explicitly acknowledged and addressed, not hidden or dismissed. Scientists should learn to trust the public, and to understand that they are a public too: that even if the public holds bizarre or false beliefs, the concerns behind them are all too real. Of course, this means we must act at a much deeper level than healthcare providers communicating about vaccines: in schools, in the media, in the general science-society relationship.
The same should happen in the internal debate between scientists and communicators. Science communication should engage with scientists, taking into account the cultural differences between the social sciences and STEM disciplines. The rift between “hard” and “soft” sciences is perhaps the worst outcome of the infamous “Two Cultures” split that C.P. Snow described more than half a century ago. Gaining the trust of scientists, often frustrated by sloppy descriptions of their work in the media and by the rise of pseudoscientific attitudes, should therefore be a top priority for science communication research. This could also help public discussion, giving science fans the tools to channel their enthusiasm and frustration into useful forms of engagement, possibly breaking the curse of social media polarization. And then science and health care might really become democratic.
Massimo Sandal
Science journalist