I get a lot of puzzled looks from students in my classes. Sometimes it's because I'm just weird. Other times it's because I present them with something that is perfectly rational but wildly non-intuitive. And because it's non-intuitive, the reasoning must be wrong. They'll throw up objections and try to refute the argument, and even when I respond successfully, they'll still reject the argument.

Sometimes they'll just throw out reason. The arguments are dumb! Apparently a dumb argument is any argument whose conclusion is non-intuitive. Why is it so difficult to accept that our intuitions can be wrong? It's not a new experience for our intuitions to be wrong. I would venture to argue that all of our intuitions have been wrong several times in our lives. Unfortunately, in those times when we've been wrong, it's often an experience that refutes the intuition, rather than reason or argumentation. In philosophy, we often can't just appeal to experience, in part because sometimes the experience itself is what is in question.

But I am comforted that philosophy isn't the only discipline to suffer from this problem. Mathematics has its share of non-intuitive conclusions that are firmly backed by reason. For example: .999999999999... (an infinite string of 9s) is equal to 1.

Now intuition might tell us that it's not equal to 1, but only really close to one. But in fact, 1 and .9999999999... are the same number, just two different ways of writing it. 2/2 is 1, 4/4 is 1 and .999999999... is 1.

A simple proof of this can be given:

1/3 = .33333333333333333...

2/3 = .66666666666666666...

1/3 + 2/3 = 3/3 = 1

.33333333333... + .66666666666... = .99999999999...

By the rule of transitivity of identity, 1 = .999999999999...
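The same steps can be checked with exact rational arithmetic. Here's a minimal sketch in Python using the standard fractions module (the variable names are mine, not part of the proof); it also shows that each extra 9 shrinks the gap to 1 by a factor of 10, which is why the infinite string of 9s leaves no gap at all:

```python
from fractions import Fraction

# Partial sums of 0.9 + 0.09 + 0.009 + ...
# The gap between the partial sum and 1 shrinks tenfold at each step.
total = Fraction(0)
for n in range(1, 8):
    total += Fraction(9, 10 ** n)
    print(total, "gap to 1:", Fraction(1) - total)

# Exact arithmetic confirms the fractional version of the proof:
print(Fraction(1, 3) + Fraction(2, 3) == 1)  # True
```

Since the gap after n nines is exactly 1/10^n, and that goes to zero as n grows without bound, the infinite decimal is exactly 1, not merely close to it.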

Now if you're acquainted with who I am, you'll know I'm terrible at math, so maybe I'm making a mistake... But I'm not mistaken. I refer you to this blog post, which has two follow-up posts (linked from the original) that explicate the proof in great detail. And the blogger has to deal with the same intuitive-vs-rational argument that I deal with on a regular basis in my courses. So why is there such resistance to giving up our intuitions, and is there a better way to get people to give them up when dealing with something that can't be experienced?

## Friday, March 5, 2010


While I agree that often our intuitions are wrong when faced with either reason or experience, I don't think your math is right.

I don't agree that 1/3 = .33333333333 (etc).

They are very close to each other, and in math we typically use them interchangeably as a shortcut, but they are not the exact same number any more than .999999999999999999 is the exact same number as 1. It is mathematically and by definition true that 1/3 + 2/3 = 1 and .333 + .666 = .999, but 1/3 is not equal to .333333333333333333333; it's just a shortcut.

Nope, it's not a shorthand illusion. They really are equal. It's not even all that controversial in the math world.

http://en.wikipedia.org/wiki/0.999...

http://mathforum.org/dr.math/faq/faq.0.9999.html

I've even seen it in textbooks.

Hmmm... maybe I failed to point out that the ellipsis at the end of .333333... refers to an infinite string of 3s.

I hope I've never given the impression that I'm rejecting your points because they sound silly. I do notice it though, and I find that lack of real critical thinking annoying.

ReplyDeleteAlthough I would like to say that, after given a lot of thought, I could maybe point out a few arguments against two of the thought experiments you've presented us with, as well as the achilles/tortoise thing. (Not really arguments against them, but just things that point to how cheap they are). I just have a hard time articulating my thoughts on the spot, which is why I don't talk all the time in class.

Oh well. I have to concede the point. Reading further into the definition, .33_ is 1/3, just as .5 is 1/2. Fair enough. .99_ is 1. I'm wondering if one could extend the logic to an absurdly large number of zeros followed by a 1, e.g.: 1.000000000000000000000000000000000001 = 1. For all intents and purposes, they're practically the same, right? If you round from the tera, yotta, or "hella" scale, it is 1. Or even more interestingly, 0 = 0.00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000001. Same number? Almost the same logic, but somehow not exactly the same. A = ~exactly A. The math is saying that though we're using different symbols to represent them, the numbers are in fact identical, by definition. But my follow-on, which would seem somewhat logical, doesn't actually fly, because there is a difference between a decimal that actually repeats to infinity and one that just comes really close.

No, it wouldn't extend to the arbitrarily large. It only works with infinite sequences. Practically they may be identical, but mathematically they're not.
