Flip through the cartoons: on a Starbucks-like paper coffee cup, these check-offs, checked off: Anguish; Dread; Angst; Extra Foam.
Then I flip to "The Mail," commenting on a review about the psychology of human reasoning. A letter: Kolbert discusses studies which "demonstrate that reasonable-seeming people are often totally irrational." [Yes!] This work identifies that people have a tendency "to embrace information that supports their beliefs and reject information that contradicts them." [Yes!] Psychologists call this "confirmation bias." Many people refuse to entertain the possibility that the scientists who create and oversee these studies may suffer from confirmation biases of their own, believing that the replication process in the scientific method will uncover any incorrect theses. But the fallibility of this assumption comes to light when Kolbert writes that the authors "probe the gap between what science tells us and what we tell ourselves." Of course, "science" doesn't tell us anything. Scientists do. [YES!] And, presumably, they are no less human than the rest of us. [Presumably!]
(signed) Bernard P. Dauenhauer [Thank you for your letter!]
UPDATE: Bernard sent the original of his letter to the New Yorker. As an editor, I can see why they cut what they did. But teachers of rhetoric, philosophers, theologians, fourth-grade teachers, and epistemologists, what say you? And thank you, Bernard.
My Letter to the New Yorker: In her “That’s What You Think: Why Reason and Evidence Won’t Change Our Minds” (2/27), Elizabeth Kolbert discusses the plethora of studies that “demonstrate that reasonable-seeming people are often irrational.” Prominent among the forms of faulty thinking these studies identify is “the tendency people have to embrace information that supports their beliefs and reject information that contradicts them.” Psychologists call this tendency “confirmation bias.”
One regularly overlooked feature of many of these studies is their failure to entertain the possibility that the scientists who conduct them may suffer from some sort of “confirmation bias” of their own. A clue to this possibility appears in Kolbert’s remark that the scientists Jack and Sara Gorman “probe the gap between what science tells us and what we tell each other.”
Of course, “science” doesn’t tell us anything. Scientists do. And presumably they are neither more nor less human than the rest of us. For brevity’s sake, let me offer one “bias” that seems to affect any number of social scientists. For them, according to what I would call the “quantification bias,” unless some putative feature of human existence (e.g., the capacity to exercise free choice) can be empirically detected and measured, it cannot actually exist. Kolbert tells us that the scientists Hugo Mercier and Dan Sperber propose to explain why so many of us suffer from what they call “myside bias” by claiming that it must have some adaptive function, and that function, they maintain, is related to our “hypersociability.” Notice that “must have” and that “hypersociability.” Do Mercier and Sperber themselves suffer from this “necessity” and this condition? Or just the rest of us?