In the golden age of classical physics that followed Sir Isaac Newton’s discovery of the laws of motion, the old postulate that “behind every effect there is a cause responsible for producing that effect” was finally placed on firm mathematical ground, and the Universe all of a sudden became quite a predictable place. The motions of the planets around the Sun could be predicted, their masses could be estimated, and even the possibility of artificial satellites put into orbit by Man came into the picture. The laws proved to hold just as well for small objects, such as a cannonball being fired or a person jumping across a ravine from one ledge to the other. Once the current state of affairs of a physical system is known, its unfolding as time goes on is already determined by its obedience to the laws of Nature. Not surprisingly, this point of view came to be known as determinism. The Universe became fully mechanistic, and it was thought at the time that if we knew all of the initial conditions of a situation down to their most minute details, then long-range prediction into the future was possible in principle, the only limitation being the accuracy with which we could carry out our calculations. And since a mathematical procedure carried out correctly cannot give two different answers to the same problem (otherwise mathematics itself would be inconsistent), for a given set of initial conditions occurring in Nature there can be only one possible outcome, regardless of how much time has elapsed. This is the mechanistic, deterministic view of Nature that necessarily follows from Newton’s laws of motion.
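In modern language, the determinism described above is simply the statement that Newton’s second law is a differential equation whose solution is fixed once the initial state is given. Putting it in symbols (a standard formulation, supplied here for illustration):

\[ m\,\frac{d^{2}x}{dt^{2}} \;=\; F\!\left(x,\ \frac{dx}{dt},\ t\right) \]

For any reasonably well-behaved force F, specifying the initial position x(0) and the initial velocity dx/dt at t = 0 singles out one and only one trajectory x(t) for all later times; two identical initial states can never branch into two different futures.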
Such was the belief until the start of the twentieth century. The arrival of a new science, quantum mechanics, threw the old concepts into disarray by showing us that the state of affairs of the basic constituents of matter cannot be known with an unlimited degree of precision (this idea came into being with Heisenberg’s Uncertainty Principle), and that there is a natural limit to the amount of information we can extract from the building blocks of nature such as atoms and molecules. The stunning success of quantum mechanics in the prediction of many new phenomena, and its daily verification in the most advanced research centers of the world, leaves very little doubt as to its validity. If there is a level of uncertainty already built into the basic building blocks of nature that prevents us from determining what is going on in the microscopic realm with unlimited precision, it follows that we cannot predict the precise outcome of anything happening at the atomic and subatomic levels, be it here on Earth, on the Moon, or across the entire Universe for that matter. In effect, it is as if Nature were playing dice all over the place. The inherent uncertainty that lies at the very core of matter itself is a concept so repulsive to many philosophers that even Albert Einstein famously protested, “God does not play dice with the Universe”.
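The natural limit just mentioned can be stated compactly. Heisenberg’s Uncertainty Principle asserts that the uncertainties in a particle’s position and momentum can never both be made arbitrarily small at the same time:

\[ \Delta x \,\Delta p \;\geq\; \frac{\hbar}{2} \]

where ħ is the reduced Planck constant. Squeeze Δx toward zero and Δp must grow without bound, so the perfectly exact initial conditions demanded by the classical program simply cannot be read off from Nature.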
Still, since the postulates of quantum mechanics hold in the realm of microscopic phenomena, many supporters of the mechanistic view of Nature held steadfastly to the belief that, at least on a macroscopic scale, Newton’s laws of motion could still be applied without much regard for quantum effects.
It was back in 1961 that an MIT meteorologist, Edward Lorenz, carried out a computer simulation of the earth’s atmosphere. The computer was required to solve a set of “nonlinear equations” used to model the atmosphere, and the initial conditions he fed it included such things as wind speed, wind direction, air pressure and temperature. After the simulation was run once, Lorenz repeated it a second time, this time entering the starting data rounded off to three decimal places instead of the six employed in the first run. Much to his surprise, the result he got was not even an approximation of the first forecast; it was a completely different forecast. Upon closer examination he confirmed that the tiny rounding differences between the two runs had been steadily magnified by the repetitive procedures used to solve his set of nonlinear equations numerically. To quote his words as published in Discover magazine:
“I knew right then that if the real atmosphere behaved like this (in reference to the mathematical model he used), long-range forecasting was impossible.”
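Lorenz’s original simulation used a dozen variables, but the same behavior appears in the celebrated three-equation system he published in 1963, and his experiment is easy to reproduce today. The following sketch (our illustration, not Lorenz’s actual program) integrates that system twice from starting points differing only in the fourth decimal place of a single variable, and prints how far apart the two runs drift:

    # Sensitive dependence on initial conditions in the Lorenz system:
    #   dx/dt = s(y - x),  dy/dt = x(r - z) - y,  dz/dt = xy - bz
    # with the classic parameters s = 10, r = 28, b = 8/3.

    def lorenz_step(state, dt=0.01, s=10.0, r=28.0, b=8.0 / 3.0):
        """Advance the state one step with a simple Euler integrator."""
        x, y, z = state
        return (x + dt * s * (y - x),
                y + dt * (x * (r - z) - y),
                z + dt * (x * y - b * z))

    run_a = (1.0, 1.0, 1.0)       # first run
    run_b = (1.0001, 1.0, 1.0)    # second run: x differs by 0.0001

    for step in range(1, 2501):
        run_a = lorenz_step(run_a)
        run_b = lorenz_step(run_b)
        if step % 500 == 0:
            gap = sum((u - v) ** 2 for u, v in zip(run_a, run_b)) ** 0.5
            print(f"t = {step * 0.01:4.1f}   separation = {gap:.6f}")

Within a couple of thousand steps the separation grows as large as the trajectories themselves: the two forecasts no longer have anything to do with each other, exactly as Lorenz observed.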
If we assume that the set of equations used by Lorenz to model weather phenomena is the right one (the “correct” model may well be far more complicated than anything we can imagine), then it stands to reason that our forecast will be more precise if we increase the accuracy of our computer by using ten decimal places instead of six. But if we use one hundred decimal places instead of ten, we must accept the fact that the difference between the two (something many would consider a “small, insignificant error”) will eventually be amplified through the repetitive numerical procedures used by the computer, and again we will end up with two different outcomes. We can only guess that the long-range outcome computed with one hundred decimal places will be closer to the truth than the one computed with ten. And as time goes on, even the solution using one hundred decimal places will quickly become obsolete. The only way to compensate is to use an even more powerful computer capable of handling figures with something like one hundred million decimal places. But even then, as time goes on, the difference between using one hundred million decimal places and two hundred million will eventually be magnified by the computer, and this difference will creep up and throw our long-range forecast off course.

The only way we could come up every time with a unique and exact solution would be to use a computer with an unlimited degree of accuracy, capable of handling an infinite number of decimal figures and of delivering a final result within our lifetimes. But such a computer does not exist, nor can we build one either today or in the foreseeable future. Even the “quantum computer”, the most powerful computer yet conceived by Man, still to be built some time in the future provided some major technological obstacles can be overcome, would not come anywhere close to the performance we could expect from an “infinite computer”. In principle we cannot even conceive of such a machine, with an unlimited degree of accuracy, capable of handling numbers all the way up or down to infinity.

To complicate matters further, if we take for granted that, under a completely mechanistic framework, for a given set of exact initial conditions there can be only one possible outcome regardless of the amount of time that has elapsed, then even with a computer capable of doing math with infinite accuracy, long-range forecasting would require knowing the initial conditions themselves with unlimited accuracy. Few instruments around the world can measure anything with more than ten significant figures [see, for example, the August 1980 Scientific American article entitled “The isolated electron”, which describes how a property of the electron called the g factor was measured to be 2.0023193044, correct to eleven significant figures]. Any instrument capable of making measurements with one hundred or more significant figures is well beyond our current technological capabilities, and perhaps even beyond our understanding. And even if the accuracy of our computers and of our measurements could be extended without limit, it would not be long before quantum mechanics itself entered the picture and spoiled the party.
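The amplification of round-off error is easy to witness on a small scale. In the sketch below (our illustration; Lorenz’s equations are replaced by the one-line logistic map x → 4x(1 − x), a standard chaotic toy model), the very same iteration is carried out at 10, 30 and 50 digits of working precision using Python’s decimal module:

    # Round-off amplification: the same iteration at three different precisions.
    from decimal import Decimal, getcontext

    def logistic_orbit(digits, steps=60, x0="0.4"):
        """Iterate x -> 4x(1 - x) keeping `digits` significant figures."""
        getcontext().prec = digits
        x = Decimal(x0)
        for _ in range(steps):
            x = 4 * x * (1 - x)
        return x

    for digits in (10, 30, 50):
        print(f"{digits} digits: x after 60 steps = {logistic_orbit(digits)}")

After sixty steps the 10-digit run no longer shares a single correct digit with the 50-digit run, while the 30-digit run retains only its first dozen or so digits. Every extra digit of precision merely postpones, and never prevents, the moment the forecast goes astray.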
The results obtained by Edward Lorenz had been anticipated decades earlier by the mathematician Jules Henri Poincaré, who wrote in 1903:
“A very small cause which escapes our notice determines a considerable effect that we cannot fail to see, and then we say that the effect is due to chance. If we knew exactly the laws of nature and the situation of the universe at the initial moment, we could predict exactly the situation of the same universe at a succeeding moment. But even if it were the case that the natural laws had no longer any secret for us, we could still only know the initial situation approximately. If that enabled us to predict the succeeding situation with the same approximation, that is all we require, and we should say that the phenomenon had been predicted, that it is governed by laws. But this is not always so; it may happen that small differences in the initial conditions produce very great ones in the final phenomena. A small error in the former will produce an enormous error in the latter. Prediction becomes impossible, and we have the fortuitous phenomenon.”
It is possible that if a mathematician as talented as Poincaré had had access in his time to a modern computer, he could have verified his suspicions for himself with one of the many mathematical models he liked to play with. The implications of the observations made by Poincaré, and of their confirmation by Edward Lorenz, are more profound than either of them could have realized at the time. Many natural phenomena besides the weather can be modeled by sets of nonlinear equations similar to the ones used by Lorenz, from economic growth models all the way to models describing the evolution of ecosystems and of life itself. And just like the weather, all of these phenomena defy long-range prediction, at least by us. This puts an absolute limit upon the knowledge that Man can expect to possess, even if he could somehow manage to live for an eternity.
There is another important conclusion we can draw from all of this. If something like the Universe itself is about to be created, and if such a creation is to fulfill a certain promise, a long-range plan, with its evolutionary path inscribed from the very outset upon the initial conditions of its creation, then in order to ensure that the plan is carried out as expected, those initial conditions must be set with an infinite degree of precision, and even the uncertainties introduced by the quantum mechanical nature of matter itself must be carefully taken into account. Anything less is very likely to produce, in the long run, an outcome completely different from the one expected, as Edward Lorenz himself found out back in 1961.
The availability of an “infinite computer”, capable of doing math with an infinite level of accuracy by handling an infinite number of significant figures and coming up with exact (and not just approximate) answers in a finite length of time, would enable us at this very moment to obtain definitive answers to some of the most vexing problems faced by mathematicians today, such as proving or disproving Goldbach’s conjecture (which states that every even number greater than two is the sum of two primes) or the twin primes conjecture (which states that there are infinitely many twin primes, a twin prime pair being two primes that differ by two, such as 11 and 13). In cases such as these, an infinite computer might be the only way to settle the question, since the noted logician Kurt Gödel showed, in what we know today as Gödel’s incompleteness theorem, that there are mathematical assertions whose validity can be neither proven nor disproven within the framework of mathematics itself [a more technical way of stating the result is the following: no algorithm (a procedure or cookbook recipe for solving a problem) exists that can determine the truth or falsity of every logical proposition in a system of logic powerful enough to represent the natural numbers. For many mathematical theorems and propositions there will be an algorithm that settles them, provided the mathematicians attempting the proof are clever enough to find it. But this cannot be generalized to all theorems and propositions, for there will always be some for which no such algorithm will ever be found, since that algorithm does not exist, not even in principle].

We already know beforehand that any mathematical statement must be either true or false; it cannot be both at the same time. But there is a very uneasy feeling that comes with the knowledge that rigorous mathematical logic alone will never be able to provide answers to problems that fall into this category. And even if we had an infinite computer at our disposal, we would have no choice but to accept its conclusions at face value, since we ourselves would have no means of verifying them. There can be no doubt whatsoever that any being with access to an infinite computer, or able to grasp and comprehend infinity, would possess a great deal of knowledge (besides the solutions to Goldbach’s conjecture and the twin primes conjecture) that we ourselves will never be able to derive with logic alone, or perhaps even to comprehend. And this includes not just peeking into the distant future within a purely deterministic framework, but even more: arranging things from the very outset in such a manner that certain major events will inevitably take place as scheduled, even after the passage of billions of years.
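Both conjectures also illustrate why no finite computation can settle them: a machine can confirm them out to any bound we like, yet no finite search ever covers all the even numbers or all the primes. A small sketch of such a finite search (our illustration, written for clarity rather than speed):

    # Finite verification of Goldbach's conjecture, plus a census of twin primes.
    def is_prime(n):
        """Trial-division primality test; adequate for small n."""
        if n < 2:
            return False
        d = 2
        while d * d <= n:
            if n % d == 0:
                return False
            d += 1
        return True

    LIMIT = 10_000

    # Goldbach: every even number from 4 up to LIMIT should split into two primes.
    for even in range(4, LIMIT + 1, 2):
        if not any(is_prime(p) and is_prime(even - p) for p in range(2, even // 2 + 1)):
            print(f"Counterexample found: {even}")   # never reached for small LIMIT
            break
    else:
        print(f"Goldbach holds for every even number up to {LIMIT}")

    twins = [(p, p + 2) for p in range(3, LIMIT) if is_prime(p) and is_prime(p + 2)]
    print(f"{len(twins)} twin prime pairs below {LIMIT}, the largest being {twins[-1]}")

Searches of this kind have carried Goldbach’s conjecture past 4 × 10^18 without turning up a counterexample, yet they prove nothing about the infinitely many even numbers left untested; that is precisely the gap only an infinite computer could close.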
We close this chapter with a discussion of an interesting irrational number (an irrational number being one that cannot be represented as the ratio of two whole numbers) which we will call Ω (the Greek letter omega), and which we know beforehand has a value somewhere between zero and one, since it represents a probability. This number, discovered by Gregory J. Chaitin at the IBM Thomas J. Watson Research Center and thus known as Chaitin’s constant, is so random that in the long run no gambler would do better than break even if he were to place bets on its successive digits. In other words, no matter how many decimal figures of the number we may have written down, there is no way of predicting what the next missing digit will be, and the digits follow no discernible pattern we can uncover through any of the known statistical tests. One of the most interesting properties of Ω is that it can be defined precisely and yet cannot be computed. In his column “Mathematical Games”, published in the November 1979 issue of Scientific American, Martin Gardner quotes the following:
“Throughout history mystics and philosophers have sought a compact key to universal wisdom, a finite formula or text that would provide the answer to every question. The use of the Bible, the Koran and the I Ching for divination and the tradition of the secret books of Hermes Trismegistus and the medieval Jewish Cabala exemplify this belief or hope. Such sources of universal wisdom are traditionally protected from casual use by being difficult to find as well as difficult to understand and dangerous to use, tending to answer more questions and deeper ones than the searcher wishes to ask. The esoteric book is, like God, simple but undescribable. It is omniscient, and it transforms all who know it. The use of classical texts to foretell mundane events is considered superstition nowadays, yet in another sense science is in quest of its own Cabala, a concise set of natural laws that would explain all phenomena. In mathematics, where no set of axioms can hope to prove all true statements, the goal might be a concise axiomatization of all ‘interesting’ true statements … Ω is in many senses a Cabalistic number. It can be known of through human reason, but not known. To know it in detail one must accept its uncomputable sequence of digits on faith, like words of a sacred text. The number embodies an enormous amount of wisdom in a very small space inasmuch as its first thousand digits, which could be written on a small piece of paper, contain the answers to more mathematical questions than could be written down in the entire universe, among them all interesting finitely refutable conjectures. The wisdom of Ω is useless precisely because it is universal: the only known way of extracting the solution to one halting problem, say the Fermat conjecture, from Ω is by embarking on a vast computation that would at the same time yield solutions to all other simply stated halting problems, a computation far too large to be actually carried out. Ironically, however, although Ω cannot be computed, it might be generated accidentally by a random process, such as a series of coin tosses or an avalanche that left its digits spelled out in a pattern of boulders on a mountainside. The first few digits of Ω are probably already recorded somewhere in the universe. No mortal discoverer of this treasure, however, could verify its authenticity or make practical use of it.”
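For the record, the number Gardner describes has a perfectly precise definition, standard in the literature even though the passage above does not spell it out: Ω is the halting probability of a universal computer whose valid programs are self-delimiting strings of bits, a program p of length |p| being produced by coin tossing with probability 2^(−|p|):

\[ \Omega \;=\; \sum_{p\,:\,U(p)\ \text{halts}} 2^{-|p|} \]

Knowing the first n bits of Ω would allow one, in principle, to decide the halting of every program up to n bits long, which is why the first thousand digits would answer every “finitely refutable” conjecture Gardner mentions.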