M. Jackson Group Update – June 2018 – Pseudoscience

This month’s post is again from Ken Pope’s listserv, where he kindly provides daily summaries of current articles in the field.  His post is as follows:
Scott Lilienfeld has written an excellent foreword (“Navigating a Post-Truth World: Ten Enduring Lessons from the Study of Pseudoscience”) to an excellent book: Pseudoscience: The Conspiracy Against Science, edited by Allison B. Kaufman & James C. Kaufman, published by MIT Press (2018).
Here’s how the foreword opens: “We find ourselves living increasingly in a ‘post-truth’ world, one in which emotions and opinions count for more than well-established findings when it comes to evaluating assertions.  In much of contemporary Western culture, such catchphrases as ‘Don’t confuse me with the facts,’ ‘Everyone is entitled to my opinion,’ and ‘Trust your gut’ capture a troubling reality, namely, that many citizens do not—and in some cases, apparently cannot—adequately distinguish what is true from what they wish to be true.  This overreliance on the ‘affect heuristic,’ the tendency to gauge the truth value of a proposition based on our emotional reactions to it (Slovic, Finucane, Peters, and MacGregor, 2007), frequently leads us to accept dubious assertions that warm the cockles of our hearts, and to reject well-supported assertions that rub us the wrong way.  We are all prone to this error, but one hallmark of an educated person is the capacity to recognize and compensate for it, at least to some degree.”
Here are some excerpts from the 10 enduring lessons:
[begin excerpts]
(1) We are all biased. Yes, that includes you and me.
(2) We are largely unaware of our biases. Research on bias blind spot (Pronin, Lin, and Ross, 2002) demonstrates that most of us can readily identify cognitive biases in just about everyone except for one person—ourselves. As a consequence of this metabias, we often believe ourselves largely immune to serious errors in thinking that afflict others.
(3) Science is a systematic set of safeguards against biases. Despite what most of us learned in high school, there is probably no single “scientific method”—that is, a unitary recipe for conducting science that cuts across all research domains (McComas, 1996). Instead, what we term “science” is almost certainly an exceedingly diverse, but systematic and finely honed, set of tools that humans have developed over the centuries to compensate for our species’ biases (Lilienfeld, 2010). Perhaps foremost among these biases is confirmation bias, the propensity to selectively seek out, selectively interpret, and recall evidence that supports our hypotheses, and to deny, dismiss, and distort evidence that does not (Nickerson, 1998). As social psychologists Carol Tavris and Elliot Aronson (2007) have observed, science is a method of arrogance control; it helps to keep us honest.
(4) Scientific thinking does not come naturally to the human species. As many authors have noted, scientific thinking is unnatural (McCauley, 2011). It needs to be acquired and practiced assiduously.
(5) Scientific thinking is exasperatingly domain-specific. Findings in educational psychology suggest that scientific thinking skills generalize slowly, if at all, across different domains. This point probably helps to explain why it is so difficult to teach scientific thinking as a broad skill that can be applied to most or all fields (Willingham, 2007). This sobering truth probably also helps to explain why even many Nobel Prize winners and otherwise brilliant thinkers can easily fall prey to the seductive sway of pseudoscience.
(6) Pseudoscience and science lie on a spectrum. As I noted earlier, there is almost surely no bright line distinguishing pseudoscience from science. Like many pairs of interrelated concepts, such as hill versus mountain and pond versus lake, pseudoscience and science bleed into each other imperceptibly.
Still, as I have pointed out, the fact that there is no categorical distinction between pseudoscience and science does not mean that we cannot differentiate clear-cut exemplars of each concept.
(7) Pseudoscience is characterized by a set of fallible, but useful, warning signs.
Such warning signs differ somewhat across authors, but often comprise an absence of self-correction, overuse of ad hoc maneuvers to immunize claims from refutation, use of scientific-sounding but vacuous language, extraordinary claims in the absence of compelling evidence, overreliance on anecdotal and testimonial assertions, avoidance of peer review, and the like (Lilienfeld, Lynn, and Lohr, 2014).  Despite their superficial differences, these warning signs all reflect a failure to compensate for confirmation bias, an overarching characteristic that sets them apart from mature sciences.
(8) Pseudoscientific claims differ from erroneous claims.  Intuitively, we all understand that there is a fundamental difference between fake news and false news. The latter is merely incorrect, and typically results from the media getting things wrong.  In contrast, the former is deceptive, often intentionally so. Similarly, many and arguably most assertions in science are surely erroneous, but that does not render them pseudoscientific.
(9) Scientific and pseudoscientific thinking are cut from the same basic psychological cloth. In many respects, this is one of the most profound insights imparted by contemporary psychology.  Heuristics—mental shortcuts or rules of thumb—are immensely valuable in everyday life; without them, we would be psychologically paralyzed.  Furthermore, in most cases, heuristics lead us to approximately correct answers. 
Still, when misapplied, heuristics can lead to mistaken conclusions.  For example, many unsubstantiated complementary and alternative medical remedies draw on the representativeness heuristic as a rationale for their effectiveness (Nisbett, 2015).  Many companies market raw brain concentrate in pill form to enhance memory and mood (Gilovich, 1991).  The reasoning, apparently, is that because psychological difficulties stem from an inadequately functioning brain, “more brain matter” will somehow help the brain to work better.
(10) Skepticism differs from cynicism. Skepticism has gotten a bad rap in many quarters, largely because it is commonly confused with cynicism. The term “skeptic” derives from the Greek word “skeptikos,” meaning “to consider carefully” (Shermer, 2002). Skepticism requires us to keep an open mind to new claims but to insist on compelling evidence before granting them provisional acceptance. In this respect, skepticism differs from cynicism, which implies a knee-jerk dismissal of implausible claims before we have had the opportunity to investigate them carefully (Beyerstein, 1995). In fairness, some individuals in the “skeptical movement” have at times blurred this crucial distinction by rejecting assertions out of hand. Skeptics need to be on guard against their propensities toward disconfirmation bias, a variant of confirmation bias in which we reflexively reject assertions that challenge our preconceptions (Edwards and Smith, 1996).
[end excerpts]
Ken Pope
“Science is more than a body of knowledge; it is a way of thinking.  I have a foreboding of an America in my children’s or grandchildren’s time–when the United States is a service and information economy; when nearly all the key manufacturing industries have slipped away to other countries; when awesome technological powers are in the hands of a very few, and no one representing the public interest can even grasp the issues; when the people have lost the ability to set their own agendas or knowledgeably question those in authority; when, clutching our crystals and nervously consulting our horoscopes, our critical faculties in decline, unable to distinguish between what feels good and what’s true, we slide, almost without noticing, back into superstition and darkness.”
—Carl Sagan, The Demon-Haunted World: Science as a Candle in the Dark (Random House, 1996, p. 25)