Saturday, December 30, 2017

Errata and a Conundrum

After I pushed 'Publish' on my last blog post I did something I should have done first... namely, I read up a bit more on the real history of the things I wrote about, and now shamefacedly acknowledge what I got wrong. On Wikipedia of course! Who else? So please accept my apologies for the mix-up.

The pre-WWI Austro-Hungarian Empire

This is what I got wrong. For some reason I didn't pay attention to the map. Derr! The light blue at the top showing Bohemia and Moravia is essentially the new Czech Republic and the muddy brown bit to its bottom right is Slovakia. I had it around the other way. Sorry.

Anyway, do read some more fascinating history from Wikipedia about the dissolution here, and follow the links to the Prague Spring of 1968 and the wartime story of Slovakia, including its fantastically brave but ultimately disastrous uprising against the Nazis in the summer of 1944. It broadly coincided with the Warsaw Uprising, and the failure of both was ultimately due to the Soviets stopping in their tracks to wait for the mainly middle-class (and therefore anti-Soviet) fighters to be crushed by the Nazis. Those guys have a lot to answer for.

But that isn't the reason I chose to write this new post. It's because of an article I read today from the New Yorker called "Why Facts Don't Change Our Minds". You can find the full article here, but do note that it is quite long and, for me anyway, quite difficult to follow in parts, as I am of course precisely the kind of person they are talking about. This post is my attempt to show that I am not!

The vaunted human capacity for reason may have more to do with winning arguments than with thinking straight -- New Yorker Magazine, Feb 2017

This is the bit that got me:

A recent experiment performed by Mercier and some European colleagues neatly demonstrates this asymmetry. Participants were asked to answer a series of simple reasoning problems. They were then asked to explain their responses, and were given a chance to modify them if they identified mistakes. The majority were satisfied with their original choices; fewer than fifteen per cent changed their minds in step two.

In step three, participants were shown one of the same problems, along with their answer and the answer of another participant, who’d come to a different conclusion. Once again, they were given the chance to change their responses. But a trick had been played: the answers presented to them as someone else’s were actually their own, and vice versa. About half the participants realized what was going on. Among the other half, suddenly people became a lot more critical. Nearly sixty per cent now rejected the responses that they’d earlier been satisfied with.

Even when presented with incontrovertible proof that we are wrong about something, we simply do not believe it and carry on believing what we want to believe. How on earth does education work, then?

The article ends with the following conundrum:

Providing people with accurate information doesn’t seem to help; they simply discount it. Appealing to their emotions may work better, but doing so is obviously antithetical to the goal of promoting sound science.

I don't know the answer either, I'm afraid. I'm still trying to digest the conundrum.
