Recently, I contributed to a project that transformed healthy human tissue into an early stage of pancreatic cancer—a disease that carries a dismal 5-year survival rate of 5 percent.
When I described our project to a friend, she asked, “why in the world would you want to grow cancer in a lab?” I explained that by the time a patient learns that he has pancreatic cancer, the tumor has spread throughout the body. At that point, the patient typically has less than a year to live, and his tumor cells have racked up a number of mutations, making clinical trials and molecular studies of pancreatic cancer evolution downright difficult. For this reason, we made our laboratory model of pancreatic cancer available to scientists who want to find the biological buttons that turn healthy cells into deadly cancer. By sharing our discovery, we hoped to enable others to develop drugs to treat cancer and screening tests to diagnose patients early. The complexity of this process demonstrates that science is a team effort that involves lots of time, money, and the brainpower of highly trained individuals working together toward a single goal.
Many of the challenges we face today—from lifestyle diseases, to the growing strains of antibiotic-resistant superbugs in hospitals, to the looming energy crisis—require scientific facts and solutions. And although there’s never a guarantee of success, scientists persist in hopes that our collective discoveries will reverberate into the future. However, as a corollary, hindering scientific progress means a loss of possibilities.
Unfortunately, the deceleration of scientific progress now seems a likely possibility. In March, the White House released a document called “America First: A Budget Blueprint to Make America Great Again,” which describes deep cuts to some of the country’s most important funding agencies for science.
As it stands, the National Institutes of Health is set to lose nearly a fifth of its budget; the Department of Energy’s Office of Science, $900 million; and the Environmental Protection Agency, $2.6 billion (a 31.5 percent cut). Imagine the discoveries that could have saved our lives or created jobs, but that will instead languish as unsupported hypotheses in the minds of underfunded scientists.
Scientists cannot remain idle on the sidelines; we must actively make the importance of scientific research known. Last weekend’s March for Science drew tens of thousands of people to more than 600 rallies around the world, but the challenge now lies in harnessing that momentum and energy into sustained efforts to maintain government funding for a wide range of scientific projects.
The next step is to get involved in shaping public opinion and policy. As it stands, Americans on both sides of the political spectrum have expressed ambivalence about the validity of science on matters ranging from climate change to childhood vaccinations. Academics can start tempering the public’s unease toward scientific authority and increasing public support for the sciences by stepping out of the ivory tower. Many researchers are already engaging with the public by posting on social media, penning opinion articles, and appearing on platforms aimed at general audiences (YouTube channels, TED talks, etc.). A researcher is her own best spokesperson in explaining the importance of her work and the scientific process; unfortunately, a scientist’s role as an educator in the classroom and community is often crowded out by the all-encompassing imperative to publish or perish. As a profession, we must become more willing to step out of our laboratories to engage with the public and educate the next generation of science-savvy citizens.
In addition, many scientists have expressed interest in running for office, including UC Berkeley’s Michael Eisen (who is also a co-founder of PLOS). When asked by Science why he was considering a run for the Senate, Eisen responded:
“My motivation was simple. I’m worried that the basic and critical role of science in policymaking is under a bigger threat than at any point in my lifetime. We have a new administration and portions of Congress that don’t just reject science in a narrow sense, but they reject the fundamental idea that undergirds science: That we need to make observations about the world and make our decisions based on reality, not on what we want it to be. For years science has been under political threat, but this is the first time that the whole notion that science is important for our politics and our country has been under such an obvious threat.”
If scientists enter the House and Senate in greater numbers, they will be able to inject scientific sense into the discussions of legislators whose primary backgrounds are in business and law.
Science is a bipartisan issue that should not be bogged down by the whims of political machinations. We depend on research to address some of the most pressing problems of our time, and America’s greatness rests in part on its leaders’ use of science as an exploration of physical truths and a means of overcoming our present limitations and challenges.
Marching is much in the air of late. After the Women’s March, which engaged many millions and was motivated in part by misogynistic statements and proposed policies from various politicians, we find ourselves faced with a range of anti-science behaviors, remarks, and proposed policy changes that have encouraged a similar March for Science. The March for Science has garnered the support of a wide range of scientific organizations, including the American Association for the Advancement of Science (AAAS) and a range of more specialized professional science organizations, including the Public Library of Science (PLoS). There have been a number of arguments for and against marching for science, summarized in this PLoS On Science blog post, so I will not repeat them here. What is clear is that science does not exist independently of humanity, and this implies a complex interaction between scientific observations and ideas, the scientific enterprise, politics, economics, and personal belief systems; it seems evident that not nearly enough effort is spent in our educational systems helping people understand these interactions (see the PLoS SciEd post: From the Science March to the Classroom: Recognizing science in politics and politics in science).
What I want to do here is to present some reflections on the relationship between science and politics, a term I use broadly to include various belief systems (ideologies).
Giordano Bruno, burnt at the stake by the Roman Catholic Church as a heretic in 1600, is sometimes put forward as a patron saint of science, mistakenly in my view. Bruno was a mystic whose ideas were at best loosely grounded in the observable and in no way scientific as we understand the term. His type of magical thinking is similar to that of modern anti-vaccinationists who claim vaccination can cause autism (it does not)(1) or that GMOs are somehow innately “unhealthy” and more dangerous than “natural” organisms (see: The GMO safety debate is over). A better model, particularly in the context of current political controversies, would be the many Soviet geneticists who suffered exile and often death (the famed geneticist N.I. Vavilov starved to death in a Soviet gulag in 1943) as a result of the state/party-driven politicization of science, specifically genetics, carried out by Joseph Stalin (1878–1953) and the Communist party/state of the Soviet Union (see: The tragic story of Soviet genetics shows the folly of political meddling in science). In response to the implications of genetic and evolutionary mechanisms, Stalin favored the Lamarckism (inheritance of acquired traits) promoted by Ivan Michurin (1855–1935) and Trofim Lysenko (1898–1976) [see link]. Communist ideology required (or rather demanded) that traits, including human traits, be seen as malleable, that the “nature” of plants and people could be altered permanently with appropriate manipulations (vernalization for plants, political re-education for people) [see: The consequences of political dictatorship for Russian science]. There was no need to wait for the messy, multi-generational processes associated with conventional plant breeding (and Darwinian evolution). In both cases, the unforgiving realities of the natural world intervened, but not without the intense human suffering and starvation associated with both efforts.
It is worth noting explicitly that there are, and likely always will be, pressures to politicize science, due in large measure to science’s success in explaining the natural world and providing the basis for its technology-based manipulation. Giordano Bruno was an early martyr to a highly ideological world view (whose influence is also illustrated by the house arrest of Galileo and the suppression of heliocentric models of the solar system)(2). Eventually such forms of natural theology were replaced by the apolitical and empirical ideals implicit in Enlightenment science. Aspects of ideological (racist) influence can be seen in 19th-century science, most dramatically illustrated by Gould’s analysis of Morton’s ranking of races by cranial capacity, an example of how unconscious manipulation of data may be a scientific norm (see link). How racist policies were initially embraced, and then rejected, by American geneticists during the course of the 20th century is described by Provine (Geneticists and the Biology of Race Crossing).
More recent events remind us of the pressures to politicize science. A number of states (Kentucky in 1976, Mississippi in 2006, Louisiana in 2008, and Tennessee in 2012) have passed bills that allow teachers to present non-scientific ideas to students (think intelligent design creationism and climate change denial). Such bills continue to come up with depressing frequency. Most recently, an admitted creationist has been appointed to lead a federal higher education reform task force in the United States [see link]. Is creationism simply alt-science, a position explicitly or tacitly supported both by the religiously orthodox and by those of a post-modernist persuasion, such as left-leaning college instructors who claim that science is a social construct? [see: Is Science ‘Forever Tentative’ and ‘Socially Constructed’?]
While such recent anti-science/alt-science attitudes have not had quite the draconian effects found in the Soviet Union, Nazi Germany, or eugenicist America, I would argue that they have a role in eroding the public’s faith in the scientific understanding of complex processes, a faith that is largely justified even in the face of the so-called “reproducibility crisis”, which in a sense is no crisis at all, but an expected outcome of the size, complexity, and competing forces acting on scientists and the scientific enterprise. That said, laws and various forms of coercion dictating right-wing/religious or left-wing/political correctness in science threaten to impact the education of a generation of students. Predictions of climate change based on human-driven (anthropogenic) increases in atmospheric CO2 levels, or the effects of lead in public water systems on human health [link], cannot simply be discarded or discounted based on ideological positions on the role of government in protecting the public interest, a role that neither unfettered capitalism nor fundamentalist communism seems particularly good at addressing. Similarly, the lack of any demonstrable connection between autism and vaccination (see above), the physicochemical impossibility of homeopathic treatments (or various versions of “Christian Science”), and the lack of evidence for the therapeutic claims made for a rather startling array of nutritional supplements serve to inject a political, ideological, and economic dimension into scientific discourse. In fact, science is constantly under pressure to distort its message. Consider the European response to GMOs in favor of the “organic” (non-GMO); most GMOs have been banned from the EU for what appear to be ideological (non-scientific) reasons, even though the same organisms have been found safe and are grown in the US and most of Asia (see this Economist essay).
It is clear that the rejection of scientific observations is widespread on both the left and the right, occurring basically whenever scientific observations, ideas, or models lead to disturbing or discomforting conclusions or implications (link). Consider the violent response when Charles Murray was invited to speak at Middlebury College (see Andrew Sullivan’s Is intersectionality a religion?). That human populations might (and in fact can be expected to) display genetic differences, the result of their migration history and subsequent evolutionary processes, both adaptive and non-adaptive (see Henn et al., The great human expansion), is labelled racist and by implication beyond the pale of scientific discourse, even though it is tacitly recognized by the scientific community to be well established. No one, I think, gets particularly upset at the suggestion that noses are shaped by evolutionary processes and reflect genetic differences between populations (see Climate shaped the human nose), or that nose shape might play a role in human sexual selection (see Facial Attractiveness and Sexual Selection; and sexual dimorphism). One might even speculate that studies of the role of nose shape in mate selection could form the basis of an interesting research project (see Beauty and the beast: mechanisms of sexual selection in humans).
What often goes undiscussed is whether differences in specific traits (different alleles and allele frequencies) between populations have any meaningful significance in the context of public policy (I would argue that they do not). What is clear is that in a pre-genomic era, recognizing such differences can be of practical value, for example in the treatment of diseases (see Ethnic Differences in Cardiovascular Drug Response). That said, the era of genomics-based personalized diagnosis and treatment is rapidly making such population-based considerations obsolete (see: Genetic tests for disease risks and ethical debate on personal genome testing), while at the same time raising serious issues of privacy and discrimination based on the presence of the “wrong” alleles (see: genome sequencing–ethical issues). In a world of facile genomic engineering, the dangers of unfettered technological manipulation move more and more rapidly from science fiction to the boutique (intelligent?) design of people (see: CRISPR gene-editing and human evolution).
So back (about time, you may be thinking) to the original question – if we “march for science”, what exactly are we marching for [link]? Are we marching to defend the apolitical nature of science and the need to maintain economic support (increased public funding levels) for the scientific enterprise, or are we conflating support for science with a range of social and political positions? Are we affirming our commitment to a politically independent (skeptical) community of practitioners who serve to produce, reproduce, critically examine, and extend empirical observations and explanatory (predictive) models?
This is not to ignore the various pressures acting on scientists as they carry out their work. These pressures act to tempt (and sometimes reward) practitioners to exaggerate (if not fabricate) the significance of their observations and ideas in order to capture the resources (funds and people) needed to carry out modern science, as well as the public’s attention. Since resources are limited, extra-scientific forces have an increasing impact on the scientific enterprise – enticing scientists to make exaggerated claims and to put forth extra-scientific arguments and various semi-hysterical scenarios based on their observations and models. In the context of an inherently political event (a march) the apolitical ideals of science can seem too bland to command attention and stir action, not to mention the damage that politicizing science does to the integrity of science.
At the end of the day, my decision is not to march, because I believe that science must be protected from the political and the partisan (see: The pernicious effects of disrespecting the constraints of science), and that the ultimate working nature (as opposed to delivered truth) of scientific observations and conclusions must be respected, something rarely seen in any political movement and certainly not on display in the Lysenkoist, climate change, anti-vaccination, or eugenics movements (see this provocative essay: The Disgraceful Episode Of Lysenkoism Brings Us Global Warming Theory).
(1) While there is no doubt that vaccinations can, like all drugs and medical interventions, lead to side effects in certain individuals, there is unambiguous evidence against any link between autism and vaccination.
(2) It is worth noting that, as originally proposed, the Copernican (Sun-centered) model of the solar system was more complex than the Ptolemaic (Earth-centered) system it was meant to replace. It was Kepler’s elliptical, rather than circular, orbits that made the heliocentric model dramatically simpler, more accurate, and more aesthetically compelling.
Purely scientific discussions are hallmarked by objective, open, logical, and skeptical thought; they can describe and explain natural phenomena or provide insights into broader questions. At the same time, scientific discussions are generally incomplete and tentative (sometimes for well-understood reasons). True advocates of the scientific method appreciate the value of its skeptical and tentative approach, and are willing to revise even long-held positions in response to new, empirically derived evidence or logical contradictions. Over time, science’s scope and conclusions have expanded and evolved dramatically; they provide an increasingly accurate working model of a wide range of processes, from the formation of the universe to the functioning of the human mind. The result is that science’s impacts on society are ubiquitous, clear, and growing. However, discussing and debating the details of how science works, and the current consensus view on various phenomena, such as global warming or the causes of cancer or autism, is very different from discussing and debating how a scientific recommendation fits into a societal framework. As described in a recent National Academies Press report on Communicating Science Effectively [link], “the decision to communicate science [outside of academia] always involves an ethical component. Choices about what scientific evidence to communicate and when, how, and to whom, are a reflection of values.”
Over the last ~150 years, the accelerating pace of advances in science and technology has made future sustainable development possible, but it has also disrupted traditional social and economic patterns. Closing coal mines in response to climate predictions (and government regulations) may be sensible when viewed broadly, but it is disruptive to those who have, for generations, made a living mining coal. Similarly, a number of prognosticators have speculated on the impact of robotics and artificial intelligence on traditional socioeconomic roles and rules. Whether such impacts are worth the human costs is rarely explicitly considered and discussed in the public forum, or the classroom. As members of the scientific community, our educational and outreach efforts must go beyond simply promoting an appreciation of, and public support for, science. They must also consider its limitations, as well as its potential ethical and disruptive effects on individuals, communities, and societies. Making policy decisions with large socioeconomic impacts based on often tentative models risks alienating the public upon which modern science largely depends.
Citizens, experts or not, are often invited to contribute to debates and discussions surrounding science and technology at the local and national levels. Yet many people are not provided with the tools to fully and effectively engage in these discussions, which require critically analyzing the scope, resolution, and stability of scientific conclusions. As such, the acceptance or rejection of scientific pronouncements is often framed as an instrument of political power, casting a shadow on core scientific principles and processes and framing scientists as partisan players in a political game. The watering down of the role of science and science-based policies in the public sphere, and the broad public complacency associated with (often government-based, regulatory) efforts, is currently being challenged by the international March for Science effort. The core principles and goals of this initiative [link] are well articulated and, to my mind, representative of a democratic society. However, a single march on a single day is not sufficient to promote a deep social transformation, or widespread dispassionate argumentation and critical thinking. Perspectives on how scientific knowledge can help shape current and future events, and on the importance of recognizing both the implications and the limits of science, must be taught early, often, and explicitly. Social or moral decisions are not mutually exclusive of scientific evidence or ideas, but the overlap is constrained by the values people hold.
In this light, I strongly believe the sociopolitical nature of science in practice must be taught alongside traditional science content. Understanding the human, social, economic, and broader (ecological) costs of action AND inaction can be used to highlight the importance of framing science in a human context. If the expectation is for members of our society to be able to evaluate and weigh in on scientific debates at all levels, I believe we are morally obligated to supply future generations with the tools required for full participation. This implies that scientists and science educators, together with historians, philosophers, economists, and others, need to go beyond the teaching of simple facts and theories by considering how these facts and theories developed over time, their impact on people’s thinking, and the socioeconomic forces that shape societies. Highlighting the sociopolitical implications of science-based ideas in classrooms can also motivate students to take a greater interest in scientific learning in particular, and in related social and political topics in general. It can help close the gap between what is learned in school and what is required for the critical evaluation of scientific applications in society, and of how scientific ideas can and should be evaluated when it comes to social policy or personal beliefs.
A “science in a social context” approach to science teaching may also address the common student question, “When will I ever use this?” All too often, scientific content in schools is presented in ways that are abstract, decontextualized, and seemingly irrelevant to students. Such an approach can leave a student unable or unwilling to engage in meaningful and substantive discussions on the applications and limitations of science in society. The entire concept of including cost-benefit analyses when considering the role of science in shaping decisions is often overlooked, as if scientific conclusions were black and white. Furthermore, the current culture of science in classrooms leaves little room for students to assess how scientific information does and does not align with their cultural identities, often framing science as inherently conflicting or alien, forcing a choice of one way of seeing the world over another, when a creative synthesis seems more reasonable. Shifting science education paradigms toward a strategy that promotes “education through science” (as opposed to “science through education”) recognizes student needs and motivations as critical to learning, and opens up channels for introducing science as something that is relevant and enriching to students’ lives. Centered on the German concept of Allgemeinbildung [link], which describes “the competence for participation in critical dialogue on currently important matters,” this approach has been found to be effective in motivating students to develop the skills needed to use empirical evidence when forming arguments and making decisions.
In extending the idea of the perceived value of science in sociopolitical debates, students can build important frameworks for effectively engaging with society in the future. A relevant example is the increasing accessibility of genome editing technology, which represents an area of science poised to deeply impact the future of society. A recent report [link] on the ethics of genome editing, assembled by a panel of clinicians and scientists (experts), recommends that the United States proceed — cautiously — with genome editing studies on human embryos. However, as pointed out [link], this panel failed to include ANY public participation in this decision. This effort fundamentally ignores “a more conscious evaluation of how this impacts social standing, stigma and identity, ethics that scientists often tend to cite pro forma and then swiftly scuttle.” As this discussion increasingly shifts into the mainstream, it will be essential to engage with the public in ways that promote a more careful and thoughtful analysis of scientific issues [link], as opposed to hyperbolic fear mongering (as seen in most GMO discussions) [link] or reserving genetic engineering for the hyper-affluent. Another, more timely example involves the degree to which an individual’s genome can be used to predict a future outcome or set of outcomes, and whether this information can be used by employers in any capacity [link]. By incorporating a clear description of how science is practiced (including the factors that influence what is studied, and what is done with the knowledge generated), alongside the transfer of traditional scientific knowledge, we can help provide future citizens with tools for critical evaluation as they navigate these uncharted waters.
It is also worth noting that the presentation of science in a sociopolitical context can emphasize learning of more than just science. Current approaches to education tend to compartmentalize academic subjects, framing them as standalone lessons and philosophies. Students go through the school-day motions, attending English class, then biology, then social studies, then trigonometry, etc., and the natural connections among subject areas are often lost. When framing scientific topics in the context of sociopolitical discussions and debates, students have more opportunities to explore aspects of society that are, at face value, unrelated to science.
Drawing from lessons commonly taught in American History class, the Manhattan Project [link] offers an excellent opportunity to discuss the fundamentals of nuclear chemistry as well as the sociopolitical implications of a scientific discovery. At face value, harnessing nuclear fission marked a dramatic milestone for science. However, when this technology was pursued by the United States government during World War II — at the urging of the famed physicist Albert Einstein and others — it opened up the possibility of an entirely new category of warfare, impacting individuals and communities at all levels. The reactions set off by the Manhattan Project, and the consequent 1945 bombings of Hiroshima and Nagasaki, are still felt in international power politics, agriculture, medicine, ecology, economics, research ethics, transparency in government, and, of course, the Presidency of the United States. The Manhattan Project thus represents an excellent case study of the relationship between science, technology, and society, and of the project’s ongoing influence on these relationships. The double-edged nature of many scientific discoveries is an important consideration in the scientific enterprise, and should be taught to students accordingly.
A more meaningful approach to science education requires including the social aspects of the scientific enterprise. When considering a heliocentric view of the solar system, it is worthwhile recognizing its social impacts as well as its scientific foundations (particularly before Kepler). If we want people to see science as a human enterprise that can inspire rather than dictate decisions and behaviors, we will need to reshape how science — and scientists — are viewed in the public eye. As written here [link], we need to restore the relationship between scientific knowledge and social goals by specifically recognizing how science can be used, inappropriately, to drive public opinion. As an example, in the context of CO2-driven global warming, one could (with equal scientific validity) seek to reduce CO2 generation or increase CO2 sequestration. Science does not tell us which is better from a human perspective (although it could tell us which is likely to be easier, technically). While science should inform relevant policy, we must also acknowledge the limits of science and how it fits into many human contexts. There is clearly a need for scientists to increase participation in public discourse, and to explicitly consider the uncertainties and risks (social, economic, political) associated with scientific observations. Additionally, scientists need to recognize the limits of their own expertise.
[Cartoon caption: “So… cutting my funding, eh? Well, I’ve got a pair of mutant fists that say otherwise!”]
A pertinent example was the call by Paul Ehrlich to limit, in various draconian ways, human reproduction – a political call well beyond his expertise. In fact, recognizing when someone has gone beyond what science can legitimately tell us [link] could help rebuild respect for the value of science-based evidence. Scientists and science educators need to be cognizant of these limits, and genuinely listen to the valid concerns and hesitations held by many in society, rather than dismiss them. The application of science has been, and will always be, a sociopolitical issue, and the more we can do to prepare future decision makers, the better society will be.
Jeanne Garbarino, PhD, Director of Science Outreach, The Rockefeller University, NY, NY
Jeanne earned her Ph.D. in metabolic biology from Columbia University, followed by a postdoc in the Laboratory of Biochemical Genetics and Metabolism at The Rockefeller University, where she now serves as Director of Science Outreach. In this role, she works to provide K-12 communities with equitable access to authentic biomedical research opportunities and resources. You can find Jeanne on social media under the handle @JeanneGarb.
Developing a coherent understanding of a scientific idea is neither trivial nor easy, and it is counterproductive to pretend that it is.
For some time now, the idea of “active learning” (as if there is any other kind) has become a mantra in the science education community (see Active Learning Day in America: link). Yet the situation is demonstrably more complex, and depends upon what exactly is to be learned, something rarely stated explicitly in many published papers on active learning (an exception, with respect to understanding evolutionary mechanisms, can be found here: link). The best of such work generally relies on results from multiple-choice “concept tests” that provide, at best, a limited (low-resolution) characterization of what students know. Moreover, it is clear that, much as in other areas, research into the impact of active learning strategies is rarely reproduced (see: link, link & link).
As is clear from the level of aberrant and nonsensical talk about the implications of “science” currently on display in both public and private spheres (link: link), the task of effective science education and rigorous, scientific (data-based) decision making is not a simple one. As noted by many, there is little about modern science that is intuitively obvious; most of it is deeply counterintuitive or actively disconcerting (see link). In the absence of a firm religious or philosophical perspective, scientific conclusions about the size and age of the Universe, the various processes driving evolution, and the often grotesque outcomes they can produce can be deeply troubling; one can easily embrace a solipsistic, ego-centric, and/or fatalistic belief/behavioral system.
There are two videos of Richard Feynman that capture much of what is involved in, and required for, understanding a scientific idea and its implications. The first involves the basic scientific process, in which the path to a scientific understanding of a phenomenon begins with a guess, but a special kind of guess, namely one that implies unambiguous (and often quantitative) predictions about what future (or retrospective) observations will reveal (video: link). This scientific discipline (link) implies a willingness to accept that scientifically meaningful ideas need to have explicit, definable, and observable implications, while those that do not are non-scientific and need to be discarded. Witness the stubborn adherence to demonstrably untrue ideas (such as where past Presidents were born or how many people attended an event or voted legally), which marks superstitious and non-scientific worldviews. Embracing a scientific perspective is not easy, nor is letting go of a favorite idea (or prejudice). The difficulty of thinking and acting scientifically needs to be kept in mind by instructors; it is one of the reasons that peer review continues to be important – it reminds us that we are part of a community committed to the rules of scientific inquiry and its empirical foundations, and that we are accountable to that community.
The second Feynman video (video: link) captures his description of what it means to understand a particular phenomenon scientifically, in this case why magnets attract one another. The take-home message is that many (perhaps most) scientific ideas require a substantial amount of well-understood background information before one can even begin a scientifically meaningful consideration of the topic. Yet all too often such background information is not considered by those who develop (and deliver) courses and curricula. To use an example from my own work (in collaboration with Melanie Cooper @MSU), it is very rare to find course and curricular materials (textbooks and such) that explicitly recognize (or illustrate) the underlying assumptions involved in a scientific explanation. Often the “central dogma” of molecular biology is taught as if it were simply a description of molecular processes, rather than explicitly recognizing that information flows from DNA outward (link) (and into DNA through mutation and selection). Similarly, it is rare to see stated explicitly that random collisions with other molecules supply the energy needed for chemical reactions to proceed or to break intermolecular interactions, or that the energy released upon complex formation is transferred to other molecules in the system (see: link), even though these events control essentially all aspects of the systems active in organisms, from gene expression to consciousness.
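To make the thermal-collision point concrete, here is a minimal sketch using the standard Arrhenius/Boltzmann relation (textbook physical chemistry offered as an illustrative aside, not something drawn from the Feynman videos or the curricula discussed above; the symbols A, E_a, R, T and the 50 kJ/mol figure are assumptions chosen for illustration):

% k: rate constant; A: frequency (collision) prefactor; E_a: activation energy;
% R: gas constant (8.314 J mol^-1 K^-1); T: absolute temperature
\[
  k \;=\; A\,e^{-E_a/RT},
  \qquad
  \frac{k(T_2)}{k(T_1)} \;=\; \exp\!\left[\frac{E_a}{R}\left(\frac{1}{T_1}-\frac{1}{T_2}\right)\right]
\]
% Illustrative numbers only: for E_a = 50 kJ/mol, warming from 298 K to 308 K gives
% exp[(50000/8.314)(1/298 - 1/308)] ≈ 1.9, i.e. the rate roughly doubles.

The exponential factor is the quantitative face of the claim above: only the small, thermally fluctuating tail of sufficiently energetic collisions carries a reaction over its activation barrier, which is why modest changes in temperature have outsized effects on rates.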
The basic conclusion is that achieving a working understanding of a scientific idea is hard, and that, while it requires an engaging and challenging teacher and a supportive and interactive community, it is also critical that students be presented with conceptually coherent content that acknowledges and presents all of the ideas needed to actually understand the concepts and observations upon which a scientific understanding is based (see “now for the hard part”: link). Bottom line: there is no simple or painless path to understanding science – it involves a serious commitment on the part of the course designer as well as the student, the instructor, and the institution (see: link).
This brings us back to the popularity of the “active learning” movement, which all too often ignores course content and the establishment of meaningful learning outcomes. Why then has it attracted such attention? My own guess is that it provides a simple solution that circumvents the need for instructors (and course designers) to significantly modify the materials that they present to students. The current system rarely rewards or provides incentives for faculty to carefully consider the content that they are presenting to students, asking whether it is relevant or sufficient for students to achieve a working understanding of the subject presented, an understanding that enables the student to accurately interpret and then generate reasoned and evidence-based (plausible) responses.
Such a reflective reconsideration of a topic will often result in dramatic changes in course (and curricular) emphasis; traditional materials may be omitted or relegated to more specialized courses. Such changes can provoke a negative response from other faculty, based on often inherited (and uncritically accepted) ideas about course “coverage”, as opposed to desired and realistic student learning outcomes. Given the resistance of science faculty (particularly at institutions devoted to scientific research) to investing time in educational projects (often a reasonable strategy, given institutional reward systems), there is a seductive lure to easy fixes. One such fix is to leave the content unaltered and simply to “adopt a pose” in the classroom.
All of which brings me to the main problem – the frequency with which superficial (low-cost, but often ineffectual) strategies can act to inhibit and distract from significant, but difficult, reforms. One cannot help but be reminded of other quick fixes for complex problems, the most recent being the idea, promulgated by Amy Cuddy (Harvard: link) and others, that adopting a “power pose” can overcome various forms of experience- and socioeconomic-based prejudices and injustices, as if overcoming a person’s experiences and situation were simply a matter of will. The message is that those who do not succeed have only themselves to blame, because the way to succeed is (basically) so damn simple. So imagine one’s surprise (or not) upon discovering that the underlying biological claims associated with “power posing” are not true, or at least cannot be replicated, even by the co-authors of the original work (see Power Poser: When big ideas go bad: link). It seems the lesson that needs to be learned, both in science education and more generally, is that claims that seem too easy or universal are unlikely to be true. It is worth remembering that even the most effective modern (and traditional) medicines all have potentially dangerous side effects. Why? Because they lead to significant changes to the system, and such modifications can discomfort the comfortable. This stands in stark contrast to non-scientific approaches; homeopathic “remedies” come to mind, which rely on placebo effects (which is not to say that taking ineffective remedies does not itself involve risks).
As in the case of effective medical treatments, the development and delivery of engaging and meaningful science education reform requires challenging current assumptions and strategies, which are often rooted in outdated traditions and influenced more by the constraints of class size and the logistics of testing than by the importance of achieving demonstrable enhancements of students’ working understanding of complex ideas.
The ubiquitous impact of science-based information and technologies in everyday life suggests that misunderstanding how science works can have serious consequences. Yet people’s decisions and strongly held beliefs are often uncoupled from, or at odds with, the conclusions and recommendations of empirical studies and scientific consensus.
In some cases, the implications of misunderstanding or rejecting science are more or less harmless – does it really matter if someone believes the Earth is the center of the Universe? In other cases, they can be critical. Perhaps nowhere is understanding science more important than in the context of the anti-vaccination (anti-VAX) movement. While infant and child vaccination rates have increased for the major vaccine-preventable diseases (2000–2014), there are growing pockets of vaccine refusal in multiple locations across the United States. The serious real-world implication of vaccine refusal is the disruption of local herd immunity, resulting in vaccine-preventable disease outbreaks. For example, regions of California, including Marin, Napa, and Sonoma counties, have seen significant increases in the number of pertussis and measles cases in recent years, largely related to increased intentional vaccine refusal among parents. Similarly, we have seen a revival of measles and mumps cases in and around Brooklyn, NY.
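As a rough sketch of why localized refusal matters (a textbook epidemiological approximation, not a calculation made in the original post; the R_0 ranges quoted are commonly cited estimates), the herd-immunity threshold p_c for a disease with basic reproduction number R_0 is approximately:

% p_c: fraction of a population that must be immune to block sustained transmission
\[
  p_c \;\approx\; 1 - \frac{1}{R_0}
\]
% e.g. measles, with R_0 commonly estimated around 12-18, gives p_c ≈ 0.92-0.94;
% pertussis estimates (R_0 roughly 12-17) imply a similarly high threshold.

On this back-of-the-envelope view, even modest clusters of intentional refusal can drop local coverage below the roughly 92–94 percent needed for measles, which is why outbreaks tend to concentrate in the kinds of communities described above.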
According to survey data, vaccine refusal usually relates to concerns about vaccine safety and perceived efficacy. Specifically, parents either misunderstand or flat-out reject the data-based debunking of a claimed causal relationship between vaccination and autism, a claim based on a seriously flawed, and now retracted, study (see PLoS Blog: Public Health Takes on Anti-Vaccine Propaganda). Additionally, parents may not see the point of vaccination, since many vaccine-preventable diseases, and their serious health implications, have become much less common in the developed world. We now live in a world free of smallpox and almost free of polio, making it difficult for people to connect with the painful consequences of their reappearance.
Parents who choose to forgo or hesitate in vaccinating their children tend to be white, well educated, and living in households making $75,000 or more annually. Presumably, such parents have ready access to relevant information, spend a lot of time researching the topic of vaccines, and, according to Seth Mnookin in The Panic Virus, “take pride in being intellectually curious, thoughtful, and rational.” These are individuals and communities that are generally assumed to support science and to be scientifically literate by various measures (for example, they know that the Earth is not flat and that it travels around the Sun; they might even know that antibiotics do not “kill” viruses, although they may not know exactly why this is the case). But, for some reason, these parents become fully invested in exploring the “debate” over vaccine safety and efficacy, and are not able to differentiate scientifically supported conclusions from those that are not.
Maybe this mindset is influenced by the “healthcare consumerism” movement, which encourages patients to be more involved in their healthcare decisions, marking a shift away from the authoritarian, doctor-knows-best default. Furthermore, given the growing strain on the already faulty infrastructure of our healthcare system, this “look it up, dear” trend has become an accepted norm. The ability to conduct “research” on an aspect of healthcare, and to appreciate what “research” truly entails, is a direct application of science literacy. Assuming the “research” is done correctly, the internet can be an excellent source of information for patients, with the potential to fill in healthcare knowledge gaps.
One immediate driver of the anti-VAX movement, however, appears to be that internet searches on vaccines yield a very mixed bag, potentially compromising attempts at “research.” For instance, when I plugged “are vaccines safe” into Google (December 2016), sites like the CDC’s Vaccine Safety and the US Health and Human Services’ Vaccine Safety pages appeared in the results. These sites, however, were outnumbered by anti-VAX (aka fake science) sites that can have the look and feel of a legitimate resource. In fact, the internet is the main platform for anti-VAX messaging; parents who exempt children from vaccination are likely to have made their decision based on information they read online. Interestingly, anti-VAX information is often couched in non-science messages demonizing “lying big pharma,” while simultaneously promoting the value of alternative (that is, non-science-based) medicine – as of 2015, an unregulated (and not completely honest) $30B-per-year industry. One might suspect that anti-VAX sentiments are reinforced by alternative medicine and nutraceutical lobbying, further demonstrating an inability to correctly identify conflicts of interest and a misunderstanding or flat-out rejection of evidence-based medicine.
While I value the opportunity for access to scientific and medical information, and believe that patients owe it to themselves to research areas specifically related to their health and illnesses, such information is often presented in a context that assumes a general understanding of the underlying processes fundamental to the given explanation. For instance, when delivering information about vaccine science, is it to be assumed that readers are familiar with how the immune system works? Are parents aware of the impact of withholding vaccination on their own and their neighbors’ children’s well-being, or of the potential immediate and long-term effects of contracting the disease the vaccine protects against? Do readers have a handle on autism spectrum disorder, and on what we know – and don’t know – about related genetic and environmental influences? When explaining a “why,” such as why someone should vaccinate their child, “you have to be in some framework that you allow something to be true, otherwise you are perpetually asking why” (Feynman on BBC, 1983). In presenting materials on vaccines, have we clearly established what needs to be understood and accepted? Does our audience have the foundational knowledge to understand the arguments for, and mechanisms behind, vaccination? These questions extend beyond vaccination, and generally go well beyond what is meant by scientific literacy.
Confounding the difficulties associated with being a critical consumer of healthcare information is the human tendency to connect with stories – particularly stories with negative outcomes – regardless of their factual content. According to a McKinsey research summary on healthcare consumerism, “there is often a disconnect between what consumers believe matters most and what influences their opinions most strongly.” This also relates to the concept of biased assimilation, where people unknowingly engage in cherry picking to find the arguments that best support the values held by those in their community. In the context of the vaccine discussion, parents can be influenced by the (abundant) stories linking vaccines to the onset of a spectrum of health issues, even if they doubt the validity of the story, and even more so if they are part of a community that supports anti-VAX sentiments. This natural human behavior, reinforced by celebrity-backed, widely publicized campaigns that falsely link vaccines to autism – despite retraction of the original study AND numerous subsequent studies disproving these claims – likely sustains anti-VAX momentum. Given the incoming presidential administration’s suggested openness to anti-VAX arguments, this momentum could be amplified.
Vaccine hesitancy and rejection are not restricted to the United States. In fact, nearly every country experiences pockets of vaccine-preventable disease outbreaks resulting from vaccine hesitancy or refusal, as described in a recent WHO working group report on the topic. Similar to trends seen in the United States, the choice to align with anti-VAX sentiments has complex underpinnings, often relating, in part, to personal and community belief systems and distrust of healthcare providers and institutions. As an extreme example, the Taliban [link] are vehemently opposed to vaccinations, viewing vaccination programs as antithetical to their traditional beliefs and culture; such programs can be seen as nefarious plots to “inject” Western culture into traditional societies. One result has been terrorist attacks targeting polio workers in Pakistan, killing 60 since 2012.
At the risk of oversimplifying the issues related to vaccine hesitancy and rejection, people’s decisions for themselves and their children might have less to do with the message itself, and more to do with how – and in what context – the message is delivered. In fact, a meta-analysis of persuasive techniques common to anti-VAX websites suggests that these sites go far beyond simply providing (inaccurate) information on vaccine safety. These sites make genuine connections to parents’ values (e.g., freedom of choice) and lifestyles (e.g., healthy eating), adeptly contextualizing anti-VAX sentiments as part of holistic well-being. Such sites skillfully cultivate feelings of trust and credibility by aiming their message at the more human side of things. These sites understand human behavior, while sites promoting objective evidence often do not. When looking at the anti-VAX movement, we see the power of personal stories and of presenting anti-VAX “science” alongside related messages that promote the values and ideals of the target population.
So how do we apply these lessons to improving the public’s understanding of particular science-based decisions? While it may feel counterintuitive, perhaps we should stop trying to win arguments using the traditional academic approach, with data, error bars, and p-values, as these risk strengthening the emotional appeal of anti-evidence, anti-scientific viewpoints. Instead, we can present data-based conclusions in compelling and effective ways, keeping in mind the connections and disconnections between human emotion and rationality. As the world’s population continues to soar, the importance of humanizing our messages, arguments, and conclusions is paramount.
There is no universal equation governing scientifically literate decision-making, and it is unlikely that one will ever be identified, simply because human behavior is difficult to predict. From a practical perspective, this may mean that “scientific literacy” as an overarching concept is less useful than fostering a deeper understanding of relevant issues in science and medicine. Recognizing the need to provide people with specific knowledge to make informed, data-based decisions implies that facts (empirical observations) can be clearly differentiated from non-facts. However, the reality is more complicated, and the interpretation of “facts” is shaped by the context of individual experiences (the Saigon, 1965 episode of the Revisionist History podcast provides an excellent example of such an analysis). The inability to predict how someone will interpret empirical evidence is the giant wrench stuck in the gears of the science literacy machine. With regard to the anti-VAX movement, we are dealing with an emotionally charged phenomenon that is deeply intertwined with human nature (the need to protect ourselves and our children), and the process goes beyond answering general true-false questions. While difficult, the effort to better understand and meaningfully correct points of failure in communicating any controversial science issue is likely to be beneficial, particularly when we consider the consequences of scientific illiteracy.