Friday, June 10, 2005

Ramblings of Eric: Part 4 - What if I don't have free will?

In response to my second Ramblings post, Paul asked,

"...I think something like this is jazzed up with something more personal. Why do you care so much at an everyday level about free will? How does free will (or lack thereof) affect Eric today? tomorrow?"

Paul is on his way to becoming a psychologist, and it shows in his question. :) But it's a very reasonable question to ask. After all, there's no point in debating something like this if there isn't a payoff at the end of the argument rainbow.

To be clear, this is not why I believe in free will. These are, in my mind, the consequences of a world that truly accepted there is no free will.

Let's assume for a second that determinism really is correct. All of my decisions, all my life choices, were decided picoseconds after the Big Bang (let's ignore quantum mechanics here. Yeah, I know that quantum mechanics PROVES that true determinism does not exist. But something more complicated takes its place that only clouds what we are talking about here). To me personally, that means a few important things:

  1. Our accomplishments in life mean squat. Why feel good about the fact that you got good grades in school, or had that threesome with the twin hotties at camp? You did not make any personal decisions that led to those accomplishments. They are just a product of the environment. It would be like the Sun being proud of the fact that it shined today, or the Earth being proud that it held 6+ billion people on the ground.
  2. There is no such thing as moral superiority. Moral superiority rests firmly on the ability of some individuals to resist temptations that others do not.
  3. Our mistakes in life should not be punished. What's the point of punishing someone for choices they really had no control over? What I mean to say is, if I was hypnotized into shooting up a liquor store and could prove it, would you still punish me? In general, the law explicitly recognizes that people under duress should not be held accountable for their actions (within some important limits: you can't go on a shooting spree because someone threatens to make you listen to '80s pop music).
  4. It's very hard to reconcile the lack of free will with a God (especially The God). Why bother creating us if we are just robots? If you really set up the whole universe with perfect knowledge of all our actions, why is there so much evil in the world? (This is most certainly going to be expanded in some future post.)

Those are the biggies for me. #1 bothers me a lot more than #2 or #3, because I think I could argue there are other reasons to punish people even if determinism is true. You just can't claim moral superiority for doing it anymore, which does not bother me because I am a moral relativist anyway. #4 is only important if you believe in a God (I do, but that is tied into my belief in free will to a large degree. I can't accept one without the other. I'll save that for another argument in another post).

Note that where I see life being utterly pointless in these results, others might see something positive. For one thing, you might feel better that your poor choices were not your fault (although really, that's a small comfort since feeling better in that case is also completely determined by the universe's giant state machine). Also, without free will, you don't need a/the God to explain much of anything (proving this is an exercise left to the reader).

Once again, it's after midnight and I need some sleep...

2 comments:

R said...

It is interesting that you illustrate determinism with the idea that the universe is a gigantic state machine.

By "determinism," I'm assuming you're referring to "causal determinism," which would translate into exactly the "state machine" analogy.

Using this as a frame of reference, do you believe that any state machine could become complex enough to simulate free will?

Eric said...

I'm not sure I know of another form of determinism, so yes, I'm talking about causal determinism.

Infinity is a large space. But no, I'm not sure an infinite state machine can become conscious. That would imply something as simple as a piece of tape, a pencil, and a set of rules (i.e., a universal Turing machine) could become aware. That's something I just have trouble believing.
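[Editor's aside: the "tape, pencil, and set of rules" machine mentioned above can be sketched in a few lines of code. This is a minimal, illustrative deterministic Turing machine, not anything from the original post; the rule table (a binary increment program) and all names are made up for the example. The point it demonstrates is the determinism under discussion: given the same starting tape, state, and rules, every step — and therefore the entire run — is fixed in advance.]

```python
# A minimal deterministic Turing machine: each step is fully determined
# by the pair (current state, symbol under the head), so the whole run
# is fixed once the starting tape is chosen.

def run_turing_machine(rules, tape, state, head, blank="_", max_steps=1000):
    """Run until no rule applies. rules: (state, sym) -> (write, move, next)."""
    cells = dict(enumerate(tape))          # sparse tape, index -> symbol
    for _ in range(max_steps):
        sym = cells.get(head, blank)
        if (state, sym) not in rules:
            break                          # no rule for this situation: halt
        write, move, state = rules[(state, sym)]
        cells[head] = write
        head += {"L": -1, "R": 1}[move]
    lo, hi = min(cells), max(cells)
    return "".join(cells[i] for i in range(lo, hi + 1)).strip(blank)

# Example program: increment a binary number. The head starts on the
# least significant (rightmost) bit and propagates a carry leftward.
rules = {
    ("carry", "1"): ("0", "L", "carry"),   # 1 + carry -> 0, keep carrying left
    ("carry", "0"): ("1", "L", "done"),    # 0 + carry -> 1, carry absorbed
    ("carry", "_"): ("1", "L", "done"),    # ran off the left edge: new digit
}
print(run_turing_machine(rules, "1011", "carry", head=3))  # -> 1100
```

Run the same call twice and you get the same answer twice; there is no step at which the machine "could have done otherwise," which is exactly the picture of causal determinism the comment thread is debating.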