Monday, July 07, 2008

RANDOM CHANCE BOGUS

Insofar as (otherwise) scientific accounts of nature invoke chance and randomness, they admit their weakness as scientific accounts.

Chance and randomness are fudge factors. They play the same role in scientific thinking as is played by miracles in religious thinking. And they reduce scientific thought to quasi-religious thought.

The question to anyone who explains anything in terms of chance and randomness is, "What are you talking about?" Take quantum physics as an example of a discipline that leans heavily on randomness. "Random" in quantum mechanics seems only to mean that physicists can’t predict the outcome of quantum processes. But that just raises the further question: why can’t the outcome be predicted? There are only three possibilities:

1) The outcome is not predictable in practice, because, though deterministic and in principle predictable, the process involves a causal chain too complicated to unravel.

2) The outcome is not predictable, because it "just happens," that is, it is an event that occurs without having been caused. This would seem to pose a problem for science. At least miracles have causes, albeit supernatural ones. But for an event to occur with no cause whatsoever? (In quantum physics the outcome of a quantum collapse is taken to be random, even though it is triggered by an environmental factor or, in the case of Roger Penrose’s Objective Reduction, caused by gravity. In any case, the collapse can in principle produce any of a number of outcomes, only one of which actually occurs. Which one occurs from among the possibilities is not predictable, and so the spectre of randomness is invoked as a gloss that lets scientists evade the responsibility of causal elucidation.)

3) The process is not predictable, because a subjective agent decides, or at least influences, the outcome.

Scientific thinking leans on chance and randomness especially in cases where scientists observe seemingly teleological effects, such as the apparent directedness of evolution from simple to complex organisms, but scientists are professionally prohibited, and probably temperamentally inhibited, from invoking teleological explanations. Hence the usefulness of the gap-fillers, chance and randomness.

To see in more detail why chance and randomness are conceptual fuzz and need to be expunged from scientific thinking, exposed as spectres, consider the classic example of a deck of cards, shuffled in the normal way or arranged according to a rule.

Shuffling is taken to be a randomizing of the deck, so that if you were to pick a card from position thirty in a thoroughly shuffled deck, the identity of that particular card would be a matter of chance. You can’t predict it. But the card at position thirty in a nonrandom deck, one that is set up deliberately in a contrived pattern according to a rule, should be predictable. This is how these things typically are understood.

But the opposite seems actually to be the case.

The position of any card in a shuffled deck is usually figured to be given by chance, but what has actually occurred during the shuffling is that indeterminism has been removed: the process has become deterministic, with a determined outcome. The positions of the cards in the deck are fixed by physical determinants: the stiffness of the cards, how far they’re bent back, how quickly and with how much force you roll your thumbs over the edges, and so forth, starting with the original ordering of the cards in the deck. So if you knew all the physical variables, you could, in theory, predict which card would end up at position thirty. Because there’s no indeterminism involved, an algorithm will solve the problem. It’s just a matter of the physical characteristics of the cards, how much force is imparted to them, how much they bend, and so forth.

So actually you could predict the so-called chance or random result. The “chance” or “random” aspect of the shuffling—the absence of deliberate ordering—is precisely what allows card positions to be predicted. The exclusion of indeterminate causes makes the outcome predictable.
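To make the point concrete, here’s a minimal sketch in Python. It stands in for all the physical variables of the shuffle, card stiffness, thumb pressure, initial ordering, with a single seed value fed to a pseudo-random number generator, which is itself a perfectly deterministic algorithm. (The seed value and card labels are my own contrivance for illustration.) Given the same inputs, the "random" shuffle is exactly reproducible, and the card at position thirty can be named in advance:

```python
import random

def shuffle_deck(seed):
    # Build a standard 52-card deck in its original order.
    ranks = ["A"] + [str(n) for n in range(2, 11)] + ["J", "Q", "K"]
    deck = [rank + suit for suit in "SHDC" for rank in ranks]
    # The seed stands in for every physical variable of the shuffle:
    # same inputs, same deterministic outcome.
    random.Random(seed).shuffle(deck)
    return deck

# Knowing the inputs, we can name the card at position thirty
# before the shuffle is ever performed.
prediction = shuffle_deck(seed=2008)[29]
outcome = shuffle_deck(seed=2008)[29]
assert prediction == outcome
print("Card at position thirty:", outcome)
```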

Now, if you’re given a deck of cards, and you’re told that someone has arranged them in a particular pattern and then suddenly died or been abducted by aliens, then you have no way to determine—to predict—which card is at position thirty. Given no additional information, there’s no calculation you can do based on the physics of the cards and the initial arrangement that will tell you where any particular card is or which card is at any particular position. Additional information is needed to figure it out: the rule by which the cards were arranged.

The "random" arrangement is the predictable arrangement. The nonrandom arrangement is the unpredictable one.

The dead or abducted person’s choice in arranging the cards, the rule they select, is presumably not deterministic. That person had a range of choices and decided on one. They might have chosen a convoluted rule or a simple one. If the rule is a convoluted one, say, "find the first ace, double the value of its ordinal position in the deck, square that number, divide by five, and take the second digit of the remainder for the next card" . . . something like that, the result might well look like a random arrangement to someone who doesn’t know the code. Once the code and the initial arrangement are known, one can use an algorithm to determine card thirty. This is cryptography.
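A sketch in the same spirit, using a made-up rule far simpler than the one above, purely for illustration: the arrangement is fully determined by the rule, and card thirty is a trivial computation for anyone who holds the rule, yet the sequence looks patternless to anyone who doesn’t:

```python
def arrange_by_rule(step):
    # Hypothetical arranging rule (not the post's example): starting
    # from a sorted deck, repeatedly count forward 'step' places around
    # the remaining cards and deal out the card landed on.
    remaining = list(range(1, 53))   # 1..52 stand in for the sorted deck
    arranged, pos = [], 0
    while remaining:
        pos = (pos + step) % len(remaining)
        arranged.append(remaining.pop(pos))
    return arranged

deck = arrange_by_rule(step=17)
print(deck[:10])   # looks patternless to someone without the rule
print(deck[29])    # with the rule, position thirty is pure computation
```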

This example underscores a problem with the naïve view of information and information theory. There’s no way to determine whether a seemingly random set of signals actually is random (that is, "does not encode information") or whether it is highly ordered and contains a great deal of information, unless one knows that a code or rule was used to arrive at the order, and knows what the rule was, or else knows the initial physical conditions and all of the physical variables involved. In other words, it is impossible to determine a priori the signal-to-noise ratio of any set of pulses.
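Here is a hedged illustration of that impossibility: a message combined byte-by-byte with a secret keystream (the "rule") is, by a naive entropy measure at least, indistinguishable from bytes that are, so far as anyone knows, pure noise. Nothing in the pulses themselves announces which stream carries the information. (The message, keystream, and entropy test are my own stand-ins; this needs Python 3.9 or later for randbytes.)

```python
import math, os, random

def entropy_bits_per_byte(data):
    # Naive Shannon entropy estimate from byte frequencies.
    counts = {}
    for b in data:
        counts[b] = counts.get(b, 0) + 1
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

message = b"The nonrandom arrangement is the unpredictable one. " * 400
keystream = random.Random(42).randbytes(len(message))   # the secret rule
ciphertext = bytes(m ^ k for m, k in zip(message, keystream))
noise = os.urandom(len(message))

print(entropy_bits_per_byte(ciphertext))  # close to 8.0 bits per byte
print(entropy_bits_per_byte(noise))       # also close to 8.0
# The statistic cannot separate them; only possession of the keystream
# turns the first stream back into information.
assert bytes(c ^ k for c, k in zip(ciphertext, keystream)) == message
```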

Take, as another illustration, solar radiation, or sunshine, a seemingly "random" mixture of wavelengths of electromagnetic radiation. What is the information content of sunshine? It tells us about the nuclear reactions occurring inside the sun, so it has some information content. But it also tells the plants it strikes in which direction to face their leaves. Does that imply additional content? In the case of conscious minds deliberating over wavelengths so as to discern something about solar physics, sunshine is taken to contain information, and solar researchers are involved in information decoding. But the response of the plants is "automatic," not deliberate, and few people would interpret it as an instance of information processing, only energy processing. But what is the meaningful difference?

The notion that information is somehow inherent in physical nature makes sense only if we assume that there is a natural code. I suppose we WANT to believe that information inheres in nature. And so we’re led necessarily to some kind of crypto-theology. In other words, if there’s information in nature, then there’s a code (design), and then there’s a coder (designer). Any philosophy that takes information to be an inherent aspect of physical nature is necessarily a theological philosophy. One way around this, for congenital theophobes, is to deny that information inheres in nature and to admit that it inheres only in conscious minds. Same thing with computer memory. The strings of 0's and 1's are geometrical configurations of electrons in spacetime in the physical circuitry, but only a mind can determine whether any information is present.
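A small illustration of that last point about memory: the same four bytes read as entirely different "information" depending on the interpretive code a mind brings to them. Nothing in the electron configurations selects among the readings. (The particular byte values are arbitrary, chosen for illustration.)

```python
import struct

bits = bytes([0x42, 0x28, 0x00, 0x00])   # one fixed bit pattern

# Three different interpretive codes applied to the identical bytes:
print(struct.unpack(">f", bits)[0])      # as a big-endian float: 42.0
print(struct.unpack(">i", bits)[0])      # as a big-endian integer: 1109917696
print(repr(bits.decode("latin-1")))      # as text: 'B(\x00\x00'
```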

I’m not sure many scientifically minded people would sign onto that necessity.

2 comments:

  1. these issues are both interesting and exciting. the information field needed to encompass the data necessary to predict the fixed nature of events casually referred to as random is huge, fluctuating, and generally unwieldy for the formal calculations necessary to express it as other than random chance. however, just because something is extremely difficult is not an acceptable reason to ignore its importance. likewise, just because similar processes have always been carried out by formal methods in the past is not an acceptable reason to continue to do so in the future.
    the human being has amazing capacities for mental process, physical sensory stimulus, and kinetic memory; when the (no less amazing) human capacity for intuition is fully utilized in conjunction with these other factors, the possibilities are phenomenal.
