Wednesday, May 5, 2010

BS

Here's the video of Jon Stewart interviewing Harry Frankfurt about his book On Bullshit (which you can read online for free here).



What do you think? Is not caring about whether you're telling the truth worse than deliberately lying?

Sunday, May 2, 2010

Homework #3

Homework #3 is due at the beginning of class on Monday, May 10th. Your assignment is to choose an ad (on TV or from a magazine or wherever) and evaluate it from a logic & reasoning perspective.
  • First, very briefly explain the argument that the ad offers to sell its product.
  • Then, list and explain the mistakes in reasoning that the ad commits.
  • Then, list and explain the psychological ploys the ad uses (what psychological impediments does the ad try to exploit?).
  • Attach the ad if it's from a newspaper or magazine; otherwise, briefly describe it.

Friday, April 30, 2010

Intellectual Humility

I think there’s an important connection between intellectual honesty and humility. A simple goal of this class is to get us all to recognize what counts as good evidence and what counts as bad evidence for a claim. I think we've gotten pretty good at this so far. But this doesn’t guarantee that we’ll care about the difference once we figure it out.

Getting us to care is the real goal. We should care about good evidence. We should care about evidence and arguments because they get us closer to the truth. When we judge an argument to be overall good, THE POWER OF LOGIC COMPELS US to believe the conclusion. If we are presented with decent evidence for some claim, but still stubbornly disagree with this claim for no strong reason, we are just being irrational. Worse, we’re effectively saying that the truth doesn’t matter to us.

Instead of resisting, we should be open-minded. We should be willing to challenge ourselves--seriously challenge ourselves--and allow new evidence to change our current beliefs when it warrants it. We should be open to the possibility that we've currently gotten something wrong. This is how comedian Todd Glass puts it:



Here are the first two paragraphs of an interesting article on this:

Last week, I jokingly asked a health club acquaintance whether he would change his mind about his choice for president if presented with sufficient facts that contradicted his present beliefs. He responded with utter confidence. “Absolutely not,” he said. “No new facts will change my mind because I know that these facts are correct.”

I was floored. In his brief rebuttal, he blindly demonstrated overconfidence in his own ideas and the inability to consider how new facts might alter a presently cherished opinion. Worse, he seemed unaware of how irrational his response might appear to others. It’s clear, I thought, that carefully constructed arguments and presentation of irrefutable evidence will not change this man’s mind.

Ironically, extreme confidence in one's own beliefs is often a sign of ignorance: in many cases, such stubborn certainty is simply unwarranted.

Certainty Is a Sign of Ignorance

Wednesday, April 28, 2010

Metacognition

Next We Can Think About the Way We Think About Thinking

There's a name for all the studying of our natural thinking styles we've been doing in class lately: metacognition. When we think about the ways we think, we can vastly improve our learning abilities. This is what the Owning Our Ignorance club is about.

I think this is the most valuable concept we'll learn all semester. So if you read any links, I hope it's these two:

Tuesday, April 27, 2010

Practical Advice

How can we counteract these cognitive biases we're learning about? One big point is to own our fallibility. Awareness of our limits and biases should lead us to lower our degree of confidence in our beliefs. Simply put, we should admit (and sincerely believe) that there's a real chance that we're wrong.

Here are two other big, simple points that I think make for great practical advice:
  1. Actively seek out sources that you disagree with. We tend to surround ourselves with like-minded people and consume like-minded media. This hurts our chances of discovering that we've made a mistake. In effect, it puts up a wall of rationalization around our preexisting beliefs to protect them from any countervailing evidence.
  2. When we do check out our opponents, focus on their best points. We tend to seek out the obviously fallacious straw men rather than the sophisticated sources that could legitimately challenge our beliefs. But this is bad! Our opponents' good points deserve more attention than their obviously bad points. Yet we often focus on their mistakes rather than on the reasons that hurt our case the most.

Kirk & His Straw Banana

Monday, April 26, 2010

Let's All Nonconform Together

If you like these links, I'll let you into my exclusive club:

Friday, April 23, 2010

Status Quo Bias

Lazy, inert humans:
  • If it already exists, we assume it's good.
  • Our mind works like a computer that depends on cached responses to thoughtlessly complete common patterns (see the short code sketch at the end of this post).
  • NYU psychologist John Jost does a lot of work on system justification theory: our tendency to unconsciously rationalize the status quo, especially unjust social institutions. Scarily, those of us oppressed by such institutions have a stronger tendency to justify their existence.
  • Jost has a new book on this stuff. Here's a video dialogue about his research:

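To make that "cached responses" metaphor concrete, here's a minimal Python sketch of memoization; the scenario and names are invented purely for illustration, not taken from Jost's research:

```python
# A minimal illustration of "cached thoughts": once an answer is stored,
# it gets reused without ever being recomputed or re-examined.
cache = {}

def cached_judgment(question, think_hard):
    """Return a stored answer if we have one; only think when we must."""
    if question in cache:
        return cache[question]      # pattern completed from memory, no thought
    answer = think_hard(question)   # genuine reasoning, performed exactly once
    cache[question] = answer
    return answer

# The first call does real work; every later call just replays the old answer,
# even if thinking again would now produce something different.
print(cached_judgment("Is the status quo good?", lambda q: "probably"))
print(cached_judgment("Is the status quo good?", lambda q: "worth rechecking"))  # still "probably"
```

The analogy to status quo bias: once a response is cached, retrieving it is so cheap that we rarely stop to ask whether it's still (or ever was) correct.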
Wednesday, April 21, 2010

Wished Pots Never Boil

Here is a hodgepodge of links on some of the psychological impediments we've been discussing recently:
Does Wishful Thinking Work Yet?

Monday, April 19, 2010

Second-Hand News

Angelo heard it through the grapevine:

Sunday, April 18, 2010

The Smart Bias

Oddly, the I'M-SPECIAL-ism bias seems to increase the more intelligent you are. Studies suggest that the smarter and more experienced you are, the more overconfident you're likely to become. In particular, we seem to believe that our intelligence makes us immune to biases. But that's just not true! The philosopher Nigel Warburton puts it nicely:
“Many of us would like to believe that intellect banishes prejudice. Sadly, this is itself a prejudice.”
Like You All, I'm Better Than You All

Saturday, April 17, 2010

No, You're Not

One of my favorite topics is I'M-SPECIAL-ism. Psychological research has repeatedly shown that most Americans overestimate their own abilities. This is one of the biggest hurdles to proper reasoning: the natural tendency to think that we're more special--smarter, or more powerful, or prettier, or whatever--than we really are.

You've probably noticed that one of my favorite blogs is Overcoming Bias. Their mission statement is sublimely anti-I'M-SPECIAL-ist:

"How can we better believe what is true? While it is of course useful to seek and study relevant information, our minds are full of natural tendencies to bias our beliefs via overconfidence, wishful thinking, and so on. Worse, our minds seem to have a natural tendency to convince us that we are aware of and have adequately corrected for such biases, when we have done no such thing."

This may sound insulting, but one of the goals of this class is getting us to recognize that we're not as smart as we think we are. All of us. You. Me! That one. You again. Me again!

So I hope you'll join the campaign to end I'M-SPECIAL-ism.

Anti-I'M-SPECIAL-ism: No, You're Not

Friday, April 16, 2010

The Importance of Being Stochastic

Statistical reasoning is incredibly important. The vast majority of advancements in human knowledge (all sciences, social sciences, medicine, engineering...) are the result of using some kind of math. If I had to recommend one other course that could improve your ability to learn in general, it'd be Statistics.
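To give a small taste of why this matters, here's a worked example in Python; the disease-testing numbers are made up for illustration, but the structure (Bayes' theorem) shows how far intuition can overshoot:

```python
# Made-up example: how likely is a disease given a positive result from a
# "99% accurate" test, when only 1 in 1,000 people actually has the disease?
prevalence = 0.001           # P(disease)
sensitivity = 0.99           # P(positive | disease)
false_positive_rate = 0.01   # P(positive | no disease)

# Total probability of testing positive, sick or not.
p_positive = sensitivity * prevalence + false_positive_rate * (1 - prevalence)

# Bayes' theorem: P(disease | positive).
p_disease_given_positive = sensitivity * prevalence / p_positive

print(f"P(disease | positive) = {p_disease_given_positive:.1%}")  # about 9%, not 99%
```

Most people intuitively answer "99%"; the math says about 9%. That gap is exactly the kind of reasoning error a statistics course trains you to catch.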

Anyway, a few links:
y = mx + SCREW YOU

Thursday, April 15, 2010

Quiz #2

Quiz #2 will be held at the beginning of class on Wednesday, April 21st. It will last 25 minutes and is worth 7.5% of your overall grade. The quiz covers everything we've discussed since the midterm:
  • Fallacies (from the appeal to ignorance through the end of chapter 5)
  • Psychological Impediments (chapter 4)
The quiz will contain a mix of short-answer questions and arguments with fallacies for you to identify.

Show Your Work

Wednesday, April 14, 2010

The Conspiracy Bug

Here's an article on a physicist who pushes 9/11 conspiracy theories; it brings up a number of issues we're discussing in class (specifically appeals to authority and confirmation bias). I've quoted an excerpt from the relevant section on the lone-wolf semi-expert (a physicist) versus the overwhelming consensus of more relevant experts (structural engineers):
While there are a handful of Web sites that seek to debunk the claims of Mr. Jones and others in the movement, most mainstream scientists, in fact, have not seen fit to engage them.

"There's nothing to debunk," says Zdenek P. Bazant, a professor of civil and environmental engineering at Northwestern University and the author of the first peer-reviewed paper on the World Trade Center collapses.

"It's a non-issue," says Sivaraj Shyam-Sunder, a lead investigator for the National Institute of Standards and Technology's study of the collapses.

Ross B. Corotis, a professor of civil engineering at the University of Colorado at Boulder and a member of the editorial board at the journal Structural Safety, says that most engineers are pretty settled on what happened at the World Trade Center. "There's not really disagreement as to what happened for 99 percent of the details," he says.
And one more excerpt on reasons to be skeptical of conspiracy theories in general:
One of the most common intuitive problems people have with conspiracy theories is that they require positing such complicated webs of secret actions. If the twin towers fell in a carefully orchestrated demolition shortly after being hit by planes, who set the charges? Who did the planning? And how could hundreds, if not thousands of people complicit in the murder of their own countrymen keep quiet? Usually, Occam's razor intervenes.

Another common problem with conspiracy theories is that they tend to impute cartoonish motives to "them" — the elites who operate in the shadows. The end result often feels like a heavily plotted movie whose characters do not ring true.

Then there are other cognitive Do Not Enter signs: When history ceases to resemble a train of conflicts and ambiguities and becomes instead a series of disinformation campaigns, you sense that a basic self-correcting mechanism of thought has been disabled. A bridge is out, and paranoia yawns below.
There are a lot of graduate-educated young-earth creationists.

Tuesday, April 13, 2010

Rationalizing Away from the Truth

A big worry raised by the confirmation and disconfirmation biases is how hard it can be to tell successful, open-minded reasoning apart from after-the-fact rationalization of preexisting beliefs. Here are some links on our tendency to rationalize rather than reason: