The pernicious effects of disrespecting the constraints of science

By Mike Klymkowsky

Recent political events, the proliferation of “fake news,” and the apparent futility of fact checking in the public domain have led me to obsess about the role played by the public presentation of science. “Truth” can often trump reality; or, better put, passionately held beliefs can overwhelm a circumspect worldview based on a critical and dispassionate analysis of empirically established facts and theories. Those driven by apocalyptic visions of the world, whether religious or political, can easily overlook or trivialize evidence that contradicts their assumptions and conclusions. While historically there have been periods during which non-empirical presumptions were called into question, more often than not such periods have been short-lived. Some may claim that the search for absolute truth, truths significant enough to sacrifice the lives of others for, is restricted to the religious; they are sadly mistaken. Political (often explicitly anti-religious) movements are also susceptible, often with horrific consequences: think of Nazism and communist-inspired apocalyptic purges. The history of eugenics and forced sterilization, based on flawed genetic premises, has similar roots.

Cartoon copyright Sidney Harris; http://sciencecartoonspplus.com/ (not a CC-BY image; contact the copyright holder for permission to reuse).

Given the seductive nature of belief-based Truth, many have turned to science as a bulwark against wishful and arational thinking. The scientific enterprise is an evolving social and empirical (data-based) process: it begins with a guess as to how the world (or rather, some small part of the world) works; follows the guess’s logical implications; tests those implications through experiment or observation; and revises (or abandons) the original guess accordingly, moving it toward a hypothesis and then, as it becomes more explanatory and accurately predictive, and as those predictions are confirmed, toward a theory. Science, then, is a dance between speculation and observation. In contrast to a free-form dance, the dance of science is controlled by a number of rigid, and to some oppressive, constraints [see Feynman].

Perhaps surprisingly, this scientific enterprise has converged on a small set of over-arching theories and universal laws that appear to explain much of what is observable; these include the theory of general relativity, quantum and atomic theory, the laws of thermodynamics, and the theory of evolution. With the notable exception of general relativity and quantum mechanics, which have yet to be reconciled with each other, these conceptual frameworks appear to be compatible with one another. As an example, organisms, and phenomena such as consciousness, obey, and are constrained by, well-established and (apparently) universal physical and chemical rules.

https://en.wikipedia.org/wiki/Last_universal_common_ancestor

A central constraint on scientific thinking is that what cannot, even in theory, be known is not a suitable topic for scientific discussion. This leaves a number of interesting topics outside the scope of science, ranging from what came before the “Big Bang” to the exact steps in the origin of life. In the latter case, the apparently inescapable conclusion that all terrestrial organisms share a complex “Last Universal Common Ancestor” (LUCA) places theoretically unconfirmable speculations about pre-LUCA living systems outside of science. While we can generate evidence that the various building blocks of life can be produced abiogenically (a line of work begun with Wöhler’s synthesis of urea), we can only speculate about the systems that preceded LUCA.

Various pressures have led many who claim to speak scientifically (or to speak for science) to ignore the rules of the scientific enterprise; they often act as if there are no constraints, no boundaries to scientific speculation. Consider the implications of establishing “astrobiology” programs based on speculation (rather than observation), presented with various levels of certainty, as to the ubiquity of life outside of Earth [the speculations of Francis Crick and Leslie Orgel on “directed panspermia” and the Drake equation come to mind; see Michael Crichton’s famous essay on aliens and global warming]. Such public pronouncements appear to ignore (or dismiss) the fact that we know (and can study) only one type of life, the descendants of LUCA. Those who make them appear untroubled by breaking the rules and abandoning the discipline that has made science a powerful, but strictly constrained, human activity.
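
The Drake equation mentioned above is worth writing out, because doing so makes the essay’s point visible: it is a product of factors, almost none of which has ever been measured. (The rendering below is the common textbook form, not taken from any source cited in this post.)

```latex
% The Drake equation: N, the estimated number of detectable
% civilizations in our galaxy, as a product of seven factors.
%   R_* : rate of star formation               (measured)
%   f_p : fraction of stars with planets       (increasingly measured)
%   n_e : habitable planets per such system    (speculative)
%   f_l : fraction on which life arises        (unknown; sample size of one)
%   f_i : fraction that evolves intelligence   (unknown)
%   f_c : fraction producing detectable signals (unknown)
%   L   : lifetime of such civilizations       (unknown)
N = R_{*} \cdot f_{p} \cdot n_{e} \cdot f_{l} \cdot f_{i} \cdot f_{c} \cdot L
```

Everything from f_l onward rests on a sample size of one (the descendants of LUCA), which is precisely why the equation quantifies speculation rather than constraining it.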

Determining whether life is unique to Earth will require future explorations and discoveries that may (or, given the technological hurdles involved, may not) occur. Similarly, postulating theoretically unobservable alternative universes, or the presence of some form of consciousness in inanimate objects [such unscientific speculation as illustrated here], crosses a dividing line between belief for belief’s sake and the scientific; it distorts and obscures the rules of the game, the rules that make the game worth playing [again, the Crichton article cited above makes this point]. A recent, rather dramatic proposal from some in the physical-philosophical complex is the claim that the rules of prediction and empirical confirmation (or rejection) are no longer valid, that we can stop requiring scientific ideas to make observable predictions [see Ellis & Silk]. It is as if objective reality were no longer the benchmark against which scientific claims are made; as if mathematical elegance or spiritual comfort were more important. And well they might be (more important), but they are also outside the limited domain of science. At the 2015 “Why Trust a Theory” meeting, the physicist Carlo Rovelli concluded “by pointing out that claiming that a theory is valid even though no experiment has confirmed it destroys the confidence that society has in science, and it also misleads young scientists into embracing sterile research programs” [quote from Massimo Pigliucci’s Footnotes to Plato blog].

While the examples above are relatively egregious, it is worth noting that pressures for glory, fame, and funding impact science more routinely, leading to claims that are less obviously non-scientific but that bend (and often break) the scientific charter. Take, for example, claims about animal models of human diseases. Often the expediencies associated with research make the use of such animal models necessary and productive, but they remain a scientific compromise. While mice, rats, chimpanzees, and humans are related evolutionarily, they also carry distinct traits that reflect each lineage’s evolutionary history and the adaptive and non-adaptive processes and events of that history. A story from a few years back illustrates how the differences between the immune systems of mice and humans help explain why the search, in mice, for drugs to treat sepsis in humans was so relatively unsuccessful [Mice Fall Short as Test Subjects for Some of Humans’ Deadly Ills]. A similar situation occurs when studies in the mouse fail to explicitly acknowledge how genetic background influences experimental phenotypes [Effect of the genetic background on the phenotype of mouse mutations], or how the details of experimental scenarios influence human relevance [Can Animal Models of Disease Reliably Inform Human Studies?].

Speculations that go beyond science, while hiding under the mantle of science (see any of a number of articles on quantum consciousness), may seem just plain silly, but by abandoning the rules of science they erode the status of the scientific process. How, exactly, would one distinguish a conscious from an unconscious electron?

In science (again, as pointed out by Crichton) we do not agree through consensus but through data (and respect for critically analyzed empirical observations). The laws of thermodynamics, general relativity, the standard model of particle physics, and evolutionary theory are conceptual frameworks that we are forced (if we are scientifically honest) to accept. Moreover, the implications of these scientific frameworks can be annoying to some: there is no free lunch (no perpetual motion machine), no efficient, intelligently designed evolutionary process (just blind variation and differential reproduction), and no zipping around the galaxy. The apparent limitation of motion to the speed of light means that a “Star Wars” universe is impossible – happily, I would argue, given the number of genocidal events that appear to be associated with that fictional vision.

Whether our models of the behavior of Earth’s climate or the human brain can ever be completely accurate (deterministic), given the roles of chaotic and stochastic events in these systems, remains to be demonstrated; until it is, there is plenty of room for conflicting interpretations and prescriptions. That atmospheric levels of greenhouse gases are increasing due to human activities is unarguable; what this implies for future climate is less clear, and what to do about it (a social, political, and economic discussion informed, but not determined, by scientific observations) is another question altogether.

Courtesy NASA.

As we discuss science, we must teach (and remind ourselves, even if we are working scientific practitioners) about the limits of the scientific enterprise. As science educators, one of our goals is to help students develop an appreciation of the importance of an honest and critical attitude toward observations and conclusions, and a recognition of the limits of scientific pronouncements. We need to explicitly identify, acknowledge, and respect the constraints under which effective science works, and to label honestly the moments when we leave the domain of scientific statements, lest we begin to walk down the path of little lies that morph into larger ones. In contrast to politicians and to mystics of the religious and secular varieties, we should know better than to be seduced into abandoning scientific discipline, and all that it entails.


M.W. Klymkowsky  web site:  http://klymkowskylab.colorado.edu  email: klym@colorado.edu


Book Review: An Astronaut’s Guide to Life on Earth

Commander Chris Hadfield captured the world’s imagination last year when, from 13 March to 13 May 2013, he served as the first Canadian Commander of the International Space Station. While aboard the ISS, Commander Hadfield did a series of “experiments,” both for scientists and, perhaps most importantly, for youth. These included genuinely interesting questions like “How do you cry in space?” and “How do you cut your nails?” and the always important “How do you go to the bathroom?” His amicable nature and genuinely infectious enthusiasm brought science to the masses and helped inspire thousands of youth.

Recently, Chris Hadfield released his book – “An Astronaut’s Guide to Life on Earth.” My sister waited in line for 3 hours at our local Costco to get me a signed copy for my birthday, and I finally got around to reading it for this review. The book follows the life of Chris Hadfield as he becomes the commander of Expedition 35, detailing his attitude and the path he took to become the first Canadian Commander of the ISS. The book is split into three broad sections leading up to Expedition 35 titled “Pre-Launch,” “Liftoff” and “Coming Down to Earth,” with several chapters within each section.

The book was fascinating to me – Hadfield is a hybrid pilot-engineer-scientist-lab rat. His expertise is in engineering and as a test pilot, but throughout the book he references how his work is interdisciplinary, and he has to have a broad understanding of several domains in order to be effective. In addition to his role as an astronaut and Commander, he is also a fully fledged lab rat, and people on the ground will ask him questions about how he’s feeling, take samples while he’s in space and after he returns, as well as measure how quickly he recovers to life back on Earth in order to further our understanding about how life in space impacts the human body. Since, at some point, we hope to explore the stars, any data we can get on how astronauts respond to life in space is valuable.

One of my favourite parts of the book was how it didn’t just acknowledge the mundane details, it relished them. He spends pages describing the drills he went through, and how important having a strong grasp of the fundamentals was to his success. I found this refreshing – too often in science we glorify the achievements but ignore all the hard work that produced them. A breakthrough in the lab might take months or even years of work before things go right, and having someone acknowledge that things often don’t work, and that their not working is not the end of the world, was a refreshing take on the scientific method. It really highlighted the value in “the grind” of slowly perfecting your skills.


He also has a certain brand of “folksy wisdom” that is inspiring in its own way. It’s not inspirational in the nauseating sense in which these things are often written, but in a more practical one. He notes the importance of reading the team dynamic before getting involved, for example, and of really understanding the nuts and bolts of what you’re doing, but at no point does it feel patronizing or “hey, look at me, I’m an astronaut!” For many budding scientists, the idea of trudging through another page of equations, or washing beakers, or just doing the mundane, less exciting parts of science breeds apathy and boredom. Hadfield takes these moments and stresses just how important it is to learn from them, and to understand exactly why they matter. I highly recommend the book to anyone interested in STEM careers, and especially those early in their careers.

To purchase, check out Chris Hadfield’s official website.


Featured image: Commander Hadfield performing at the 2013 Canada Day celebrations in Ottawa, ON | Picture courtesy David Johnson

Do lemurs like to move it-move it? (video)

Lemurs had their 15 minutes of fame back when DreamWorks’ Madagascar came out in 2005. This year, it’s time for the IMAX film Island of Lemurs: Madagascar to shine a spotlight on these primates.

We discussed before how nature documentaries influence the public’s understanding of science, and mostly increase the general public’s science literacy. That’s why I was curious to test the effect of the Madagascar movie: what did it teach the general public? Did it give the public a new understanding of lemurs? During my visit to the Duke Lemur Center, I had the perfect opportunity to find out. Over the 40-minute car ride, I put these questions to acting driver and education specialist Chris Smith, and here’s what he told me:

This is the second installment of our participation on Lemur Week. For Part I, click here.

Sci-Ed joins Lemur Week (video)

Their ghostly eyes are lovely windows to their souls.

Lemurs are primates – they have long tails, tree-climbing hands, and incredible curiosity. At least that’s what I encountered on my visit to the Duke Lemur Center (sponsored by Owen Software). Education specialist Chris Smith led me on an amazing tour. See below:

The Duke Lemur Center offers tours, similar to the one above. Their goal is to raise funds for research (Smith estimated that 10% of the center’s funds come from tours). Most of all, the center aims to educate the public and raise awareness about lemur conservation. And it seems to be paying off: in 2013, they received 18,000 visitors (5,000 more than the previous record-breaking year). In addition to tours, the educational department is expanding to bring in even younger visitors, so conservation education can start earlier. The Duke Lemur Center now has a “primates for pre-schoolers” program for kids ages 3-5, and a “leaping lemurs” summer science camp for 6th to 8th graders from all over the country. For the grown-ups, there’s an “evening with the experts” series with such curious topics as “are you smarter than a lemur?”.

Come back Wednesday for another video on Duke Lemur Center, when we’ll explore some of Chris Smith’s strategies when talking lemur science to the public.

Childhood obesity drops 40% in the last decade. Or not really, but who’s checking?

“A lie that is half-truth is the darkest of all lies.”
― Alfred Tennyson

Last week, a new study published in the Journal of the American Medical Association received a lot of media attention. The study, performed by Cynthia Ogden and colleagues at the CDC, aimed to describe the prevalence of obesity in the US and look at changes between 2003 and 2012. The study itself had several interesting findings, not least among them that the prevalence of obesity seems to have stabilized in many segments of the US population. However, they made one observation that caught the media’s attention:

“There was a significant decrease in obesity among 2- to 5-year-old children (from 13.9% to 8.4%; P = .03)”

This is where things get interesting, because the focus was not on the 5.5-percentage-point difference. Instead of reporting the absolute difference, i.e. how much the prevalence changed, news outlets focused on the relative difference, i.e. how large the change was compared to the starting value. On that measure, the drop is (5.5/13.9 ≈) 40%, which sounds much more impressive than the 5.5-percentage-point change reported in the study. So you can guess what the headlines loudly proclaimed:

Headlines from Bloomberg, the LA Times and the WSJ
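
To make the absolute-versus-relative distinction concrete, here is a minimal sketch using the numbers quoted above (the function names are mine, for illustration; they are not from the study or any particular library):

```python
def absolute_change(before, after):
    """Change in percentage points between two prevalence estimates."""
    return after - before

def relative_change(before, after):
    """Change expressed as a fraction of the starting value."""
    return (after - before) / before

# Obesity prevalence among 2- to 5-year-olds (Ogden et al., 2014)
before, after = 13.9, 8.4

print(f"Absolute change: {absolute_change(before, after):+.1f} percentage points")
# -> Absolute change: -5.5 percentage points

print(f"Relative change: {relative_change(before, after):+.1%}")
# -> Relative change: -39.6%  (rounded to the headlines' "40% drop")
```

Neither number is wrong; they answer different questions, and only one of them made for a good headline.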

The media latched onto this “40%” statistic and ran with it, despite the researchers themselves framing the results far more cautiously. In fact, in the conclusions of the paper itself, they wrote:

Overall, there have been no significant changes in obesity prevalence in youth or adults between 2003-2004 and 2011-2012. Obesity prevalence remains high and thus it is important to continue surveillance. (emphasis mine)

This makes me wonder how many journalists read the article, how many got to the end, and how many just saw what other people had reported and ran with the same headline and narrative.

Here’s the thing – they’re technically correct (the best kind of correct). Yes, childhood obesity in that age group dropped 40% based on that report, and if that is true, it is a dramatic decrease. However, that is one group among many, and even the researchers themselves caution that the finding may be meaningless. It raises the question of why the drop appeared, and whether it is a real association or just an artifact of something else, like the type and number of statistical tests used. But since the narrative had already been written, everyone followed suit, and the next thing you know we’re all slapping high fives and proclaiming a drop in childhood obesity that may not actually be something worth celebrating.

Now, had the results been portrayed fairly, two things would have happened. For one, the findings would not have seemed as positive as they do now. In fact, the headlines would have read “Business as usual: Obesity the same for the last decade” or “Obesity up 20% among elderly women!” (The latter refers to the finding that the prevalence of obesity among women aged 60 years and older went up from 31.5% to 38.1%: a 6.6-percentage-point absolute increase which, by the same relative-difference arithmetic, is 6.6/31.5 ≈ 21%.) Secondly, a much more detailed discussion of the study findings would have happened: why has the prevalence stabilized? Have we finally reached saturation? Are all the people who could be obese now obese? Or is something else going on? But these weren’t the findings that were focused on.

The worst outcome of this media exposure won’t be felt right now. It will be felt when the next study comes out. You see, this study in JAMA was reported all over the media, and millions will have heard about how we’ve finally “turned a corner in the childhood obesity epidemic” (to quote the New York Times). Unfortunately, that may not be the case, and if a new study comes out saying the opposite, it will further undermine the public’s confidence in science, even though the researchers in question never made any such claim.

And that, dear readers, is the darkest of all lies.

References
Ogden, C. L., Carroll, M. D., Kit, B. K., & Flegal, K. M. (2014). Prevalence of Childhood and Adult Obesity in the United States, 2011-2012. JAMA, 311(8), 806-814.

Creation vs Evolution: Why science communication is doomed

Last Tuesday night, Bill Nye the Science Guy debated Ken Ham on creationism vs evolution. I watched part of the debate, and have conflicted feelings about it. I’ll start by saying I think it was a brilliant marketing move. For one, it suddenly brought the Creation Museum to the forefront of public attention for next to nothing. While before only a handful of people had heard of it, it has now risen to national prominence, and I’m sure their visitor numbers will reflect that in the near future.

As for the substance itself, I don’t think this is a very good topic for a debate. Any time you bring religion into a discussion, it turns into an “us vs them” argument in which neither party is willing to change their view. Even the advertising and marketing billed it as a debate of “creationism vs evolution” – effectively presupposing that one cannot believe in both (a point I’ll come back to). At best, the exchange is snarky and offhanded; at worst, antagonistic and ad hominem. I should point out, though, that this goes for both sides – neither is willing to reconcile.

And why should they? Each side views itself as being right, and weighs the available information differently. So all this accomplishes is that both sides become further polarized and further entrenched, and any chance of meaningful dialogue becomes less and less likely with every angry jab back and forth. It turns into a 21st-century war of angry op-eds, vindictive tweets, and increasingly hostile and belligerent Facebook posts shared back and forth. This isn’t limited to religion, though – many discussions end this way, with people forced to take sides on an issue that is more complicated than simply black and white. Rather than discuss the details and come to an understanding of what we agree and disagree on, we’re immediately placed into teams at loggerheads with each other.

What is most interesting is what happens to extreme viewpoints when they are criticized. Rather than prompting people to take in new information and evaluate it on its merits, criticism actually consolidates those perspectives. In lay language, if you have an extreme viewpoint, you dig in your heels, build a trench, and get ready to defend yourself against all attackers. This isn’t entirely surprising – when someone attacks you, and in particular attacks you *personally*, why wouldn’t you get defensive? Studies have looked at this from a political perspective, comparing extreme conservatives to extreme liberals. To quote Psychology Today:

Extreme conservatives believed that their views about three topics were more superior: (1) the need to require voters to show identification when voting; (2) taxes, and (3) affirmative action. Extreme liberals, on the other hand, believed that their views were superior on (1) government aid for the needy; (2) the use of torture on terrorists, and (3) not basing laws on religion.

But wait! Aren’t these just fringe opinions being amplified in the media? The good news is yes. The bad news is that the extremes are what people hear. If you imagine opinion distributed along a normal curve – with the extreme views at the edges – the vast majority of people exist in the gulf between those extremes, yet it is the tails that get heard. In fact, this is what led Popular Science to shut down their comments, based on findings by Brossard and Scheufele. They asked people to read a study; while the article remained the same, one group was exposed to civil comments, and the other to uncivil ones. What they found was striking:

In the civil group, those who initially did or did not support the technology — whom we identified with preliminary survey questions — continued to feel the same way after reading the comments. Those exposed to rude comments, however, ended up with a much more polarized understanding of the risks connected with the technology.

So seeing negative comments not only made people more skeptical of the article, it made them more skeptical of the science itself! That’s a huge concern for us, and for how science is written about and discussed. Seeing negative comments, no matter how poorly written or ill-informed they are, makes people fundamentally view the science as being of lower quality – and that is why Popular Science closed their commenting section.

So, to bring it all full circle: the “debate” was a microcosm of science and the public. Scientists sit back, do their work, then turn around and say “Hey! You should do this,” and wonder why no one listens and why people fight them. We saw this with the New York soda ban, we’re seeing it in other spheres as well, and unless we change how we approach these hot-button issues, we’ll lose not only the support of those with fringe opinions (which we have already lost), but also the support of the moderates (which we can still win). I was having this discussion with my friend Steve Mann, one of the smartest men I know, and he sums it up best:

“It’s easier to poke fun at people with whom you disagree, particularly if you can imply that they are childish, old-fashioned, religious, or uneducated, than to honestly examine whether there is any merit to what they’re saying, and I think that’s a shame.”

I’m not taking sides – that wasn’t the aim of this piece. The aim is to tell you to listen with an open mind, discuss issues with others, and at all costs avoid ad hominem and personal attacks. If we want to bring people together, we have to avoid language that drives us apart. If we want to promote science, we have to discourage hate. And if we want to educate others, we first have to start by understanding them.

Reference:
K. Toner, M. R. Leary, M. W. Asher, K. P. Jongman-Sereno. Feeling Superior Is a Bipartisan Issue: Extremity (Not Direction) of Political Views Predicts Perceived Belief Superiority. Psychological Science, 2013; DOI: 10.1177/0956797613494848

On overcoming writer’s block

The setting is your office. You’re bathed in the dull glow of your computer screen, staring at a blank page in Word, trying to write a paper.

Blink.

Blink.

The cursor is watching you, mocking you, laughing at your inability to get words out.

Blink.

Blink.

Your mind locks up as you wonder “what do I have to say?” The more you try to force out words, the harder it becomes, and eventually the frustration leads to you sitting there, at your desk with your head in your hands, wondering how you’ll ever finish.

Blink.

You then Google “how to overcome writers block” and end up on this post.

The official name for this is the “? block”

Writer’s block is a tough thing to deal with, but one we’ll all have to tackle at some point – either at the start of our training while we’re writing outlines and proposals, at the end when we’re writing up manuscripts and theses, or afterwards, as we’re working on papers and other documents. As science communicators, the toughest part is often figuring out exactly how to begin, and how to frame the core message that we want to get across – a process that can be incredibly frustrating. So the question becomes, how do you deal with it?

Now, I’m going to state the obvious here, but it’s a necessary point: the hardest part of writing is starting to write. Once you start, it becomes infinitely easier to get content out onto the page. To help you kick-start your writing process, I’m going to give you a few tips – and, as always, I’d love to hear in the comments what you do to overcome writer’s block when it hits.

1) Isolate yourself. Remove all distractions – phone, coworkers, cats, get rid of it all. You want to be able to focus exclusively on writing. The fact is that if you have an easy out, you’re more likely to take it, i.e. “I’m stuck, I wonder if anything has changed on Facebook in the past 3 minutes? And this Buzzfeed article seems great, and look at what this cat is doing…” It’s tough to start writing, and removing distractions means you’ll struggle through those tough parts rather than put it off and do something else. You need to power through this part.

2) Talk it out. This one sounds strange, but it’s one of my favourites and has been hugely effective for me. Occasionally, I’ll close my office door, stand up, and pretend I’m giving a talk about whatever it is I’m writing about. Not only does this get you thinking about the topic at hand, but without the intimidation of the cursor and the blank Word document staring at you, it’s easier to just get your ideas out. Be organic: stand up, pace back and forth, talk like you normally would, and don’t focus on the minutiae of your project. Talk about the broad strokes and the flow of your arguments, and see if that helps you over the initial hurdle.

Alternative: Grab a coworker, go for coffee, and outline your paper/idea to them. Tell them their job is not to have a conversation with you – their job is to ask questions and prod you when you get stuck, and help you jump start your writing. Obviously, you owe them coffee/donut(s) for listening to you 🙂

Tupac Shakur released a song called “My Block”

3) Write an outline. For those who don’t like talking things out, this is an effective alternative. Jot down the key points you want to make in each paragraph, and write as much information about each paragraph as you can without losing momentum. Even if you do talk it out, this is a good way to conceptualize your work. By the end, you should have something like this:

Paragraph 1: Open with a scene about writer’s block
Paragraph 2: Describe writer’s block, transition into list
Paragraph 3: Start outlining key points
etc

This is an engine block.

4) Start writing. Don’t think about grammar, phrasing, punctuation or language rules. Just get words out. Ignore word choices, ignore making things sound “professional.” Just get those ideas out and onto the page. At this point you want to have something out there to look at and critique, and hopefully, if you followed steps 1 through 3, you’ve got a few ideas up your sleeve now. Remember: the ideas don’t have to flow. You can write two distinct paragraphs, making two very different points, and that’s fine. You can go back later and fine tune things. Again, all you’re trying to do here is get something out onto the page that you can work with.

5) Do something else. Up until this point, I’ve talked about isolating yourself and focusing on writing. Here, I’m going to suggest leaving it, with one caveat: go and do something that gets you moving but doesn’t engage you entirely – cooking, cleaning, going for a run, lifting weights, etc. Something that gets you up and about without taking your full attention. There’s a reason we have our best ideas in the shower: it turns out that the combination of 1) the release of dopamine, 2) being relaxed, and 3) being distracted enough that your subconscious can engage and work on a problem results in you being more creative (science here).

Dikembe Mutombo was famous for his ability to block

Before you know it, you’ve got an outline, some body text, and a fleshed-out idea of what you want to say – and that’s half the battle right there. Once you’ve got a skeleton to work with, it becomes a lot easier to start writing and begin building your arguments.

How do you deal with writer’s block?

This post originally appeared on MrEpidemiology.com