Saturday, June 27, 2015

FTA part 6: Defying intuition, the multiverse, or necessity (part of the Aron series)

Greg says,
Yes, the normalizability problem does seem to come up, doesn't it?  But the more I think about it, the more I think it's a cover-up.  Here's what I mean.  Just like the initial objection you raised, that the degree of fine-tuning doesn't translate into a rigorous probability, this is just another layer of subtlety, and the conclusion is the same.  In going deeper, we are essentially just pushing the problem back another level.

Another way to think about it is, the first level of fine-tuning is very intuitive, and speaks easily to the common person.  "Wow!  Look how finely-tuned these constants are!  This argues for intention in the make-up of the universe."  This is the intuitive conclusion, and sometimes intuition is right.

On the other hand, sometimes intuition is wrong.  For someone who wants to contest this conclusion (and please don't take that turn of phrase to mean I think the challenger of the FTA is disingenuous...we need to think deeply about it), there is always a way out.  There's always a door for the skeptic to exit through.  But every time you exit the door, you end up in another room that is smaller and harder to leave, with a still smaller door of its own.  Pretty soon you'll need one of Alice's mushrooms to fit through the door, it's so small.  How deep does the rabbit hole go?

Did that sound pompous?  Sorry, I thought of that word picture last night and I really liked it.  In any case, my point is that the more one plays the skeptic to deny what intuition is telling us, the harder one has to work and the more precious bits of reality one has to deny.

OK, now that I've played it up so much, do I actually have an argument?  (Hee-hee, I hope so.  We'll see if you like it or not.)

So let's start with the point I made last time.  Either the small life-permitting interval of G is improbable, or the probability distribution from which we are drawing G must itself be finely-tuned (that is, atypical).  That makes intuitive sense.  It's a bit harder to understand than the basic "G must be within one part in 10^60, therefore God did it," but it's still pretty intuitive.  The rebuttal is that we have no reason to prefer any particular probability measure.  Indeed, so the rebuttal goes, the normalizability problem destroys fine-tuning: either every finite interval of values is equally impossible, or our current value is necessary (P = 1).  What method do we have to restrict the probability distribution to some intermediate shape?
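(To make that dilemma concrete, here is a rough sketch, using an idealized uniform distribution purely for illustration.  Suppose G could in principle take any value in (0, ∞) and we try to spread probability evenly with some constant density c.  Normalization would require

$$\int_0^\infty c \, dG = 1,$$

but the integral diverges for any c > 0 and vanishes for c = 0, so no such density exists.  Pushing c toward zero hands any finite life-permitting interval [G_1, G_2] a probability

$$P(G_1 \le G \le G_2) = c \, (G_2 - G_1) \to 0,$$

which is the "everything's equally impossible" horn; the only other way out is to pile all the probability onto the actual value, which is the P = 1 horn.)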

So, we drop-kick intuition and need to go one level deeper.  (Remember how I asked, "How deep does the rabbit hole go?"  I have a feeling that to get to the bottom of this conversation, we'll eventually have to discuss properly basic beliefs and brains-in-a-vat.  It's a steep price to pay to be constantly skeptical of the intuitive power of the FTA.)  If we really want a probability distribution to draw from, we need a mechanism.  Here our discussion bifurcates into two plausible options.

(Before I do that, can I mention an aside?  Initially, I presented the FTA as a rigorous Bayesian-type proof.  Recall that you challenged my ability to say P(FT | ~G) is super-low.  Now I just want to recall the point that the probabilities in Bayesian arguments are *epistemic*.  Meaning, they're "what are the odds of that happening?"-type probabilities.  This is why the Bayesian argument goes through: most people will understand the fine-tuning of the constants and conditions of the universe and of earth and intuitively agree that P(FT | ~G) is low, even if it can't be proven rigorously.)
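(For the record, the Bayesian skeleton I have in mind is just the usual one,

$$P(G \mid FT) = \frac{P(FT \mid G)\,P(G)}{P(FT \mid G)\,P(G) + P(FT \mid \sim\!G)\,P(\sim\!G)},$$

and the numbers that follow are made up purely to show the structure, not measurements.  Even with a deliberately skeptical prior like P(G) = 0.01, if P(FT | G) is near 1 and P(FT | ~G) is something tiny like 10^-10, the posterior P(G | FT) comes out at essentially 1.  The whole dispute, of course, is over whether P(FT | ~G) really is that tiny.)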

OK, back to the bifurcation.  There are two naturalistic options (to avoid God): either (1) the universe is alone (and necessary), or (2) it is one of an ensemble of universes, commonly called the multiverse (in which case the multiverse itself is necessary).

Option 1: if the universe is necessary and alone, then all the constants and conditions could not have been other than what they are.  In that sense, all of these probabilities would be unity.  How could it have been any other way, if the universe itself is necessary?  But in that case we are again stuck asking why it had to be this way.  What is it about the universe and necessity that made it possible for life to exist?  Especially when it seems like there are so many other ways it could have been that would have precluded life.  So we are now stuck not only with asking "Why is there something rather than nothing?" (since the universe seems to have no explanation for its existence, it would be rather odd for it to be the necessary entity), but also with "Why is the universe the way that it is?" (since its being just the way it is permitted intelligent life to develop within it and ask these questions).

Now, one way you could answer these questions is flippantly.  Dr. Krauss is a famous example of this, with his "'Why' questions are silly."  But I don't regard you as thinking that.  So then: why do you think there is something rather than nothing?  Why do you think the universe is the way it is?  Remember, without God, and thus without intention, there is no explanation for these facts.

Option 2: if the universe is one of many universes in the multiverse, then plausibly this could explain how the perceived fine-tuning arose.  Returning to the normalizability problem, the main issue I have with it is this: if there really is a natural mechanism that "chooses" the values of the constants for each universe, then it cannot suffer from the normalizability problem.  It must have an actual, well-defined probability distribution, not the hypothetical, no-logical-restriction kind.  So the existence of the multiverse solves the normalizability problem in one of two ways: either the probability distribution is typical (broad), in which case our universe is rare, or the universe-generating mechanism itself has a finely-tuned probability distribution that produces a bunch of universes like ours.  In the second case, our universe is not rare (they're all like ours), but the fine-tuning has simply moved up into the multiverse itself.

Skeptics rather like the first case: our universe is rare, but the number of universes (probabilistic resources) is so high that one such as ours is bound to have been generated randomly.
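(It's worth seeing why the sheer number of universes does the work here; the numbers below are made up only to illustrate the scaling.  If each universe independently has probability p of being life-permitting, then among N universes

$$P(\text{at least one life-permitting}) = 1 - (1 - p)^N \approx 1 - e^{-pN}.$$

Even with p as small as 10^-60, an ensemble of N = 10^62 universes gives roughly 1 - e^-100, which is as close to certainty as makes no difference.  With enough probabilistic resources, a universe like ours is practically guaranteed to show up somewhere.)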

I could go on and on about the multiverse, so let me leave it here so you can respond before I go off the deep end: I don't see Option 1 as viable.  For the skeptic, the universe just can't be the necessary entity; it raises too many questions about sufficient explanation.  So Option 2, the multiverse, must be the fallback if you want to escape through the ever-shrinking skeptical door.  Furthermore, the second case of Option 2 (a finely-tuned multiverse) can't work for the skeptic either, since it would be ontologically identical to Option 1.  Therefore, as I see it, the skeptic must choose the first case of Option 2: the multiverse generates widely varying random universes, one of which is the lucky one (ours).

Is that where you think you'd go with this?

[See summary page of this discussion, with links to all the posts, here.]
