Why Does the World Exist: An Existentia - By Jim Holt
up empty-handed. For, as the old saying goes: Nothing seek, nothing find.
Interlude
The Arithmetic of Nothingness
Mathematics has a name for nothing, and that is “zero.” It is notable that the root of “zero” is a Sanskrit word: sunya, meaning “void” or “emptiness.” For it was among Indian mathematicians that our notion of zero arose.
To the Greeks and Romans, the very idea of zero was inconceivable—how could a nothing be a something? Lacking a symbol for it in their number systems, they could not take advantage of convenient “positional” notation (in which, for example, 307 stands for 3 hundreds, no tens, and 7 ones). That’s one reason why multiplying with roman numerals is hell.
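Zero’s job as a placeholder is exactly what makes positional notation mechanical: each digit is worth its face value times a power of the base. A minimal sketch in Python (the function name and the digit-list representation are my illustration, not anything in the book):

```python
def positional_value(digits, base=10):
    """Interpret a list of digits in positional notation:
    each step shifts the running value one place left (multiply
    by the base) and adds the next digit. A zero digit holds a
    place without contributing any value."""
    value = 0
    for d in digits:
        value = value * base + d
    return value

# 307 = 3 hundreds, no tens, 7 ones -- the 0 keeps the 3 in the
# hundreds place, which Roman numerals had no way to express.
print(positional_value([3, 0, 7]))  # -> 307
```

The same loop works in any base, which is why the placeholder idea, once imported, generalized so easily.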
The idea of emptiness was familiar to Indian mathematicians from Buddhist philosophy. They had no difficulty with an abstract symbol that signified nothing. Their notation was transmitted westward to Europe during the Middle Ages by Arab scholars—hence our “arabic numerals.” The Sanskrit sunya became the Arabic sifr, which shows up in English in both the words “zero” and “cipher.”
Although European mathematicians welcomed zero as a notational device, they were at first chary of the concept behind it. Zero was initially regarded more as a punctuation mark than as a number in its own right. But it soon began to take on greater reality. Oddly enough, the rise of commerce had something to do with this. When double-entry bookkeeping was invented in Italy around 1340, zero came to be viewed as a natural dividing point between credits and debits.
Whether discovered or invented, zero was clearly a number to be reckoned with. Philosophical doubts about its nature receded before the virtuoso calculations of mathematicians such as Fibonacci and Fermat. Zero was a gift to algebraists when it came to solving equations: if the equation could be put in the form ab = 0, then one could deduce that either a = 0 or b = 0.
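To see why the ab = 0 rule is such a gift, consider an illustrative quadratic (my example, not Holt’s). Factoring puts the equation in exactly the product-equals-zero form, and the rule then splits one hard problem into two trivial ones:

```latex
x^2 - 5x + 6 = 0
\;\Longrightarrow\; (x-2)(x-3) = 0
\;\Longrightarrow\; x - 2 = 0 \ \text{or}\ x - 3 = 0
\;\Longrightarrow\; x = 2 \ \text{or}\ x = 3.
```

Without a genuine number zero to set the product equal to, this maneuver is unavailable.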
As for the origin of the numeral “0,” that has eluded historians of antiquity. On one theory, now discredited by scholars, the numeral comes from the first letter of the Greek word for “nothing,” ouden. On another theory, admittedly fanciful, its form derives from the circular impression left by a counting chip in the sand—the presence of an absence.
Suppose we let 0 stand for Nothing and 1 stand for Something. Then we get a sort of toy version of the mystery of existence: How can you get from 0 to 1?
In higher mathematics, there is a simple sense in which the transition from 0 to 1 is impossible. Mathematicians say that a number is “regular” if it can’t be reached via the numerical resources lying below it. More precisely, the number n is regular if it cannot be reached by adding up fewer than n numbers that are themselves smaller than n.
It is easy to see that 1 is a regular number. It cannot be reached from below, where all there is to work with is 0. The sum of zero 0’s is 0, and that’s that. So you can’t get from Nothing to Something.
Curiously, 1 is not the only number that is unreachable in this way. The number 2 also turns out to be regular, since it can’t be reached by adding up fewer than two numbers that are less than 2. (Try it and see.) So you can’t get from Unity to Plurality.
The rest of the finite numbers lack this interesting property of regularity. They can be reached from below. (The number 3, for example, can be reached by adding up two numbers, 1 and 2, each of which is itself less than 3.) But the first infinite number, denoted by the Greek letter omega, does turn out to be regular. It can’t be reached by summing up any finite collection of finite numbers. So you can’t get from Finite to Infinite.
But back to 0 and 1. Is there some other way of bridging the gap between them—the arithmetical gap between Nothing and Something?
As it happens, no less a genius than Leibniz thought he had found a bridge. Besides being a towering figure in the history of philosophy, Leibniz was also a great mathematician. He invented the calculus, more or less simultaneously with Newton. (The two men feuded bitterly over who was the true originator, but one thing is certain: Leibniz’s notation was a hell of a lot better than Newton’s.)
Among much else, the calculus deals with infinite series. One such infinite series that Leibniz derived is: