This month’s post is again from Ken Pope’s listserv, where he kindly provides daily summaries of current articles in the field. His post is as follows:
I just finished a fascinating new book in the MIT Press Essential Knowledge series: Post-Truth by Lee McIntyre.
It provides a carefully documented account of how more and more people are moving away from respecting and learning from science and evidence, and toward trusting feelings, dogma, and in-groups. It explores the move toward a world of “alternative facts” and the implications of that move for different aspects of our individual, social, cultural, and political lives.
Here’s an excerpt:
[begin excerpt]
In his work on the psychology of emotion and moral judgment, David DeSteno, a psychologist at Northeastern University, has studied the effect of such “team affiliation” on moral reasoning. In one experiment, subjects who had just met were randomly divided into teams by giving them colored wristbands. Then they were separated. The first group was told that they would be given the option of performing either a fun ten-minute task or a difficult forty-five-minute one. Each subject was then placed alone in a room and told that he or she should choose which to do—or decide by a coin flip—but that in either case the person who entered the room afterward would be left with the remaining task. What subjects didn’t know is that they were being videotaped. Upon exiting the room 90 percent said that they had been fair, even though most had chosen the easier task for themselves and never bothered to flip the coin. But what is absolutely fascinating is what happened next. When the other half of the subjects were asked to watch a videotape of the liars and cheaters, they condemned them—unless they were wearing the same color wristband.7 If we are willing to excuse immoral behavior based on something as trivial as a wristband, imagine how our reasoning might be affected if we were really emotionally committed.
Motivated reasoning has also been studied by neuroscientists, who have found that when our reasoning is colored by affective content a different part of our brain is engaged. When thirty committed political partisans were given a reasoning task that threatened their own candidate—or hurt the opposing candidate—a different part of their brain lit up (as measured by a functional-MRI scan) than when they were asked to reason about neutral content. It is perhaps unsurprising that our cognitive biases would be instantiated at the neural level, but this study provided the first experimental evidence of such differential function for motivated reasoning.8 With this as background we are now ready to consider two of the most fascinating cognitive biases that have been used to explain how our post-truth political beliefs can affect our willingness to accept facts and evidence.
[end excerpt]
Another excerpt: “The ‘backfire effect’ is based on experimental work by Brendan Nyhan and Jason Reifler, in which they found that when partisans were presented with evidence that one of their politically expedient beliefs was wrong, they would reject the evidence and ‘double down’ on their mistaken belief. Worse, in some cases the presentation of refutatory evidence caused some subjects to increase the strength of their mistaken beliefs.”
Another excerpt:
[begin excerpt]
Post-truth was foreshadowed by what has happened to science over the last several decades. Once respected for the authority of its method, scientific results are now openly questioned by legions of nonexperts who happen to disagree with them. It is important to point out that scientific results are routinely scrutinized by scientists themselves, but that is not what we are talking about here. When a scientist puts forth a theory, it is expected that it will be put through the paces of peer review, attempts at replication, and the highest order of empirical fact checking that can be performed by one’s scientific peers. The rules for this are fairly transparent, since they are in service of the scientific value that empirical evidence is paramount in evaluating the worth of a scientific theory. But mistakes can occur even with the most scrupulous safeguards in place. The process can be quite brutal, but it is necessary to make sure that, insofar as is possible, only good work gets through. Thus, failures to disclose any potential sources of bias—conflicts of interest, the source of one’s funding—are taken especially seriously.
Given this high level of scientific self-scrutiny, why would nonscientists feel it necessary to question the results of science? Do they really think that scientists are lax? In most cases, no; yet this is exactly the sort of claim that is routinely spread by those who find their ideological beliefs in conflict with the conclusions of science.1 In some instances laypersons feel it is in their interest to question both the motives and the competence of scientists. And this is where “science denialism” is born.
[end excerpt]
Here’s a shortened version of the About the Author section: “Lee McIntyre is a Research Fellow at the Center for Philosophy and History of Science at Boston University and an Instructor in Ethics at Harvard Extension School. He is the author of Dark Ages: The Case for a Science of Human Behavior (MIT Press).”
The Amazon webpage for the paperback & Kindle versions is at:
The Barnes & Noble page for the book is at:
Ken Pope
POPE & VASQUEZ: ETHICS IN PSYCHOTHERAPY AND COUNSELING: A PRACTICAL GUIDE (5th EDITION)—John Wiley & Sons
Print—Kindle—Nook—eBook—Apple iBook—Google Book
POPE: FIVE STEPS TO STRENGTHEN ETHICS IN ORGANIZATIONS AND INDIVIDUALS:
EFFECTIVE STRATEGIES INFORMED BY RESEARCH AND HISTORY—Routledge (imprint of Taylor & Francis)
Hardbound—Kindle—Nook—eBook—Google Book
POPE: “AWARD ADDRESS: THE CODE NOT TAKEN: THE PATH FROM GUILD ETHICS TO TORTURE AND OUR CONTINUING CHOICES”—
Canadian Psychology/Psychologie canadienne article free online at:
In times of universal deceit, telling the truth will be a revolutionary act.
—George Orwell