Saturday, March 11, 2017

Flipping Through the New Yorker UPDATE

Opened the New Yorker (March 13). Must read Anthony Lane on Jane Austen's last, unfinished novel, Sanditon (unclear that was her title). Read it. Conclusion (mine): No, I do not have to read the unfinished novel.

Flip through the cartoons: on a Starbucks-like paper coffee container, these check-offs, checked off: Anguish; Dread; Angst; Extra Foam.

Then I flip to "The Mail," commenting on a review about the psychology of human reasoning. A letter: Kolbert discusses studies which "demonstrate that reasonable-seeming people are often totally irrational." [Yes!] This work identifies that people have a tendency "to embrace information that supports their beliefs and reject information that contradicts them." [Yes!] Psychologists call this "confirmation bias." Many people refuse to entertain the possibility that the scientists who create and oversee these studies may suffer from confirmation biases of their own, believing that the duplication process in the scientific method will uncover any incorrect theses. But the fallibility of this assumption comes to light when Kolbert writes that the authors "probe the gap between what science tells us and what we tell ourselves." Of course, "science" doesn't tell us anything. Scientists do. [YES!] And, presumably, they are no less human than the rest of us. [Presumably!]
(signed) Bernard P. Dauenhauer   [Thank you for your letter!]

UPDATE: Bernard sent the original of his letter to the New Yorker. As an editor, I can see why they cut what they did. But teachers of rhetoric, philosophers, theologians, fourth-grade teachers, and epistemologists, what say you? And thank you, Bernard.
My Letter to the New Yorker:    In her “That’s What You Think: Why Reason and Evidence Won’t Change Our Minds” (2/27), Elizabeth Kolbert discusses the plethora of studies that “demonstrate that reasonable-seeming people are often irrational.” Prominent among the forms of faulty thinking these studies identify is “the tendency people have to embrace information that supports their beliefs and reject information that contradicts them.” Psychologists call this tendency “confirmation bias.”
    One regularly overlooked feature of many of these studies is their failure to entertain the possibility that the scientists who conduct them may suffer from some sort of “confirmation bias” of their own. A clue to this possibility appears in Kolbert’s remark that the scientists Jack and Sara Gorman “probe the gap between what science tells us and what we tell ourselves.”
    Of course, “science” doesn’t tell us anything. Scientists do. And presumably they are neither more nor less human than the rest of us. For brevity’s sake, let me offer one “bias” that seems to affect any number of social scientists. For them, according to what I would call the “quantification bias,” unless some putative feature of human existence (e.g., the capacity to exercise free choice) can be empirically detected and measured, it cannot actually exist. Kolbert tells us that the scientists Hugo Mercier and Dan Sperber propose to explain why so many of us suffer from what they call “myside bias” by claiming that it must have some adaptive function, and that function, they maintain, is related to our “hypersociability.” Notice that “MUST HAVE” and the “hypersociability.” Do Mercier and Sperber themselves suffer from this “necessity” and this condition? Or just the rest of us?


14 comments:

  1. One thing that must be kept in mind is that scientists are very conservative when it comes to asserting something proven. It wasn't until around 1999 that they claimed the human warming signal was proven, where proof means 95% statistical probability. Of course, if we thought a plane had a 5%, let alone 95%, chance of crashing, we'd make a U-ie and stay off the plane. Naomi Oreskes has written books on the psychology of denialism, and claims that scientists' conforming to 95% conservatism contributes to political inaction on climate change and the disasters that will befall civilization.
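
    (To make the 95% convention concrete: a minimal sketch in Python, using scipy's linregress on invented temperature numbers, of the decision rule involved. A trend counts as "proven" only when the chance of seeing it randomly falls below 5%.)

        # Illustration only: made-up yearly anomalies, not real climate data.
        from scipy.stats import linregress

        years = list(range(2000, 2010))
        anomalies = [0.10, 0.14, 0.12, 0.18, 0.16, 0.21, 0.19, 0.25, 0.24, 0.28]

        fit = linregress(years, anomalies)
        print(f"slope = {fit.slope:.3f} per year, p = {fit.pvalue:.4f}")
        # The conventional 95% standard: call the trend real only if p < 0.05.
        print("proven by the 95% convention" if fit.pvalue < 0.05 else "not yet proven")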

  2. Is it possible that "scientists" are a breed apart from "science writers"? The article cited was about psychology. There is the eternal debate about whether psychology is "science."

  3. For two decades I did applied research in the mental health system, mostly using data collected by clinicians on clients. At the beginning I wondered if I could tell them anything that they didn’t already know. I discovered that most clinicians cannot see the forest for the trees. It would be good if they were focusing on each individual client, but in fact they are focusing on the particular problem in front of them without considering everything else that is going on with the client (which computer systems are better at doing).
    In our system, at both intake and closing of a case, the clinician would check whether a sexual abuse or a physical abuse problem was involved. No other information: not whether the person was a victim, perpetrator, or family member, or whether anything was done. I could aggregate the check marks for the same person over all episodes at all agencies (roughly the aggregation sketched at the end of this comment). People who had a check mark were very much more expensive, even if I made the unlikely assumption that the same person would otherwise have had depression (or whatever) and cost what the average depressive cost. So sexual abuse is a very costly mental health problem (and probably a health problem and a social services problem), but it is really not much on the radar screen except for the Catholic Church.
    On the individual-clinician level, an agency had one therapist who was extremely good at group therapy. People loved her groups; one group even discussed the possibility of buying a house together! In order to be in group therapy you first have to be in individual therapy. This therapist lost 90% of her clients because they dropped out in the first three sessions; the rest went on to group therapy and stayed forever! She wasn’t aware of this; her supervisor wasn’t aware of this. Fortunately she made a lot of money for the agency.
    We all walk around with blinders on, selectively seeing things. I spent a lot of time processing data with people, starting with clinicians, then going up to supervisors and senior administrators, and clients and family members, and people in other systems. I always tried to organize and simplify data as much as possible with making conclusions. I liked to bring it just to the point where someone would say, “Is this what you say is going on?” I would respond, “So is that how you interpret it? Let’s hear from someone else.” I found data very useful in bringing people together and getting them to think beyond their blinders.
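
    (A rough sketch in Python, with pandas, of the kind of per-person aggregation described above; the data and column names here are invented, not the real system's fields.)

        import pandas as pd

        # Hypothetical episodes: abuse_flag is the intake/closing check mark.
        episodes = pd.DataFrame({
            "person_id":  [1, 1, 2, 2, 3, 3],
            "abuse_flag": [1, 0, 0, 0, 1, 1],
            "cost":       [900, 1100, 300, 250, 1500, 1700],
        })

        # Aggregate check marks per person over all episodes at all agencies,
        # then compare average total cost with and without any check mark.
        per_person = episodes.groupby("person_id").agg(
            any_abuse=("abuse_flag", "max"),
            total_cost=("cost", "sum"),
        )
        print(per_person.groupby("any_abuse")["total_cost"].mean())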

  4. Should have been "without making conclusions." I am a terrible proofreader.

  5. Margaret, that's a good question. I have found that there is certainly a difference between statistical research findings and how doctors (who I guess are at least scientific thinkers) apply the research and present it to patients: "This drug is safe for most people to take" vs. "This drug may have a 10 percent chance of worsening your disease in eight years, but without it, you have a 50 percent chance of being dead in under five years, whaddya wanna do?" I prefer knowing all the odds, but some people don't and become frightened when they get the second approach.

    Replies
    1. Doctors are more like technicians who use scientific tests and statistical information than people who think like scientists.

      I was diagnosed with severe sleep apnea after an overnight sleep test at a motel run by Hospital A. I tried using the CPAP machine but it never worked well.

      A couple of years later, after losing weight, I was retested in a hospital bed run by Hospital B. Sleep apnea gone; the doctor attributed it to the weight loss (which often happens).

      But I am a scientist. The bed used by Hospital A was flat; the bed at Hospital B was inclined. I got myself an inclined bed and used the app SNORELAB to measure my snoring. When I incline the bed, no snoring; when it is flat, terrible snoring, like during the test at Hospital A. Conclusion: the angle of the bed is far more important than the weight loss (a toy version of this comparison is sketched at the end of this comment).

      Why don’t they do all the diagnostic tests in an adjustable bed? First half of the night flat, second half adjusted to minimize snoring. Too much technology, not enough scientific thinking.

      I like Ellen Langer’s Counterclockwise: Mindful Health and the Power of Possibility. A fellow social psychologist, she has a scientific approach. She advocates observing yourself, and manipulating the circumstances and what you do, to see if you can get more positive results. That may work for some of us scientist types, but there is a great possibility of encouraging superstitious behavior, e.g., all the things baseball players do to get ready to bat, some of which may help and some of which may be completely unrelated.
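
      (A toy version of the bed-angle comparison above, in Python with invented nightly scores, just to show the shape of the self-experiment.)

          # Invented SnoreLab-style nightly scores; higher means worse snoring.
          from statistics import mean

          flat_nights = [68, 72, 65, 70, 74]    # bed flat, as in Hospital A's test
          inclined_nights = [8, 5, 11, 6, 9]    # bed inclined, as at Hospital B

          print(f"flat bed, mean score:     {mean(flat_nights):.1f}")
          print(f"inclined bed, mean score: {mean(inclined_nights):.1f}")
          # A gap this large, repeated night after night under both conditions,
          # is the evidence that bed angle mattered more than the weight loss.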

  6. I guess I become annoyed when findings in psychology or theories in evolutionary psychology are used to excuse bad behavior. I still think we can stand outside ourselves and decide a change is in order. Eco and climate-change websites have cited the studies referenced in Peggy's post, explaining why some people love to be wrong. I think it is more respectful of a climate denier to tell him he's being a jerk (or something more Abe-ish) rather than psychoanalyzing him like a specimen. Interestingly, these same eco sites have referenced a book that uses Jesus, among others, as examples of great communicators, able to overcome the prejudices of their hearers. Wish I could do that. All I can do is argue the science, and you've seen how well that works.

  7. Hi guys. I've been having trouble posting comments but maybe now I've fixed that.

  8. https://www.nytimes.com/2017/03/03/opinion/sunday/why-we-believe-obvious-untruths.html

  9. I think this is the New Yorker story Bernard refers to: http://www.newyorker.com/magazine/2017/02/27/why-facts-dont-change-our-minds

    Interesting bit here: "If your position on, say, the Affordable Care Act is baseless and I rely on it, then my opinion is also baseless. When I talk to Tom and he decides he agrees with me, his opinion is also baseless, but now that the three of us concur we feel that much more smug about our views. If we all now dismiss as unconvincing any information that contradicts our opinion, you get, well, the Trump Administration."

    But isn't the opposite true? If the OMB and other analysts say that GOPcare will be more affordable than Obamacare while helping roughly the same number of people, I will still have to hate GOPcare. That the GOP could do anything humane does not fit the general world view of my "tribe." I would be compelled to point out that GOPcare is not much of an improvement, that it poses vast inconvenience and expense in the form of legislative time and bureaucratic switchover, and that it must be making somebody undeservedly rich or the GOP wouldn't be pushing it.

  10. Jean,

    I don't think what you describe is an opposite. It is more like a converse. I don't deny a clear bias in the article in favor of Obamacare, Democrats, or whatever. But the key point is about baseless opinions. You can rewrite the passage and substitute Trumpcare or Ryancare for the Affordable Care Act, and substitute something like "Pelosi Democrats" for "the Trump Administration," and you will still have a valid example of the phenomenon being described.

    That the GOP could do anything humane does not fit the general world view of my "tribe."

    Just because we're biased does not mean we aren't right!

  11. "Just because we're biased does not mean we aren't right!"

    I dunno. I've never had a hard time making judgments and decisions, but I do accept that my vision is limited and that being "right" is a hazy concept in human life. Kolbert's thesis, that we will sacrifice rationality to tribal affiliation, might be overstated, but I think it's there.

    Replies
    1. Kolbert's thesis, that we will sacrifice rationality to tribal affiliation, might be overstated, but I think it's there.

      I didn't mean to imply confirmation bias and all of those other distortions of rationality were not real phenomena. I am currently reading Michael Lewis's The Undoing Projects, which illustrates countless ways human beings can be irrational. There is a point in the book at which someone takes a course in behavioral economics. The first thing the teacher has them do is write down the last two digits of their cell phone number. Then she asks them to estimate the percentage of African countries in the UN. Those who wrote down higher two-digit numbers from their cell phones gave higher estimates of the percentage of African countries in the UN. Asked to make a quick estimate of the value of 8×7×6×5×4×3×2×1, people give a higher estimate than if asked to estimate 1×2×3×4×5×6×7×8, even though they are exactly the same (the arithmetic is checked in the sketch at the end of this comment). People are more likely to opt for surgery if told the operation gives them a 94% chance of survival than a 6% chance of dying.

      As for my "tribe," I am a liberal Democrat, and no doubt the affiliation brings with it a certain bias. But regarding my assessment of the Trump administration, I feel on solid ground. I have always made a certain effort to pay attention to the "other side," mainly by regularly reading a number of conservative columnists (George Will, Charles Krauthammer, Jennifer Rubin, David Brooks) and also by trying to be fair-minded in assessing what the conservatives say on the roundtables on the Sunday shows. When it comes to Trump, I find the people on the "other side" saying things I am largely in agreement with.
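
      (The arithmetic behind those estimation examples, checked in a few purely illustrative lines of Python.)

          # Both products are the same number (8! = 40320), so any systematic
          # difference in quick estimates is anchoring on the first factor.
          from math import prod

          descending = prod([8, 7, 6, 5, 4, 3, 2, 1])
          ascending = prod([1, 2, 3, 4, 5, 6, 7, 8])
          print(descending, ascending, descending == ascending)  # 40320 40320 True

          # Likewise the surgery framing: "94% survive" and "6% die" are one fact.
          print(94 == 100 - 6)  # True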

    2. It's The Undoing Project, not Projects, as I wrote above. As I pressed "Publish" for the above message, an alert popped up on my computer: 24 million would lose insurance under the G.O.P. health bill within a decade, the nonpartisan Congressional Budget Office found.
