This post originally came about as a result of the first time I participated in a DonorsChoose fundraiser. I offered to write articles on requested topics for anyone who donated above a certain amount. I only had one taker, who asked for an article about zero. I was initially a bit taken aback by the request - what could I write about *zero*? This article which resulted from it ended up turning out to be one of the all-time reader-favorites for this blog!

### History

We'll start with a bit of history. Yes, there's an actual history to zero!

Early number systems had no concept of zero. Numbers really started out as very practical tools, primarily for measuring quantity. They were used to ask questions like "How much grain do we have stored away? If we eat this much now, will we have enough to plant crops next season?" In that context, a "measurement" of zero doesn't really mean much; even when math is applied to measurements today, leading zeros in a number - even if they're *measured* - don't count as significant digits in the measurement. (So if I'm measuring some rocks, and one weighs 99 grams, then that measurement has only two significant digits. If I use the same scale to weigh a very slightly larger rock, and it weighs 101 grams, then my measurement of the second rock has *three* significant digits. The leading zeros don't count!)

Aristotle's view is pretty typical of the reasoning behind why zero wasn't part of most early number systems - he connected the ideas of zero and infinity, and considered them nothing more than pure *ideas* related to numbers, but not actual numbers themselves. After all, you can't really *have* 0 of anything; zero of something isn't *anything*: you *don't have* any quantity of stuff. And by Aristotle's reasoning, zero had another property in common with infinity - you can't ever really *get to* zero as he understood it. If numbers are quantity, then you can start with one of something. Then you can cut it in half, and you'll have half. Cut that in half, you'll have a quarter. Keep going - and you can cut stuff in half forever. You'll get closer and closer to the concept represented by zero, but you'll never actually get there.

The first number system that we know of to have any notion of zero is the Babylonians'; but they still didn't really quite treat it as a genuine number. They had a base-60 number system, and for digit-places that didn't have a number, they left a space: the space was the zero. (They later adopted a placeholder that looked something like "//".) It was never used *by itself*; it just kept the space open to show that there was nothing there. And if the last digit was zero, there was no indication. So, for example, 2 and 120 looked exactly the same - you needed to look at the context to see which it was.

The first real zero came from an Indian mathematician named Brahmagupta in the 7th century. He was quite a fascinating guy: he didn't just invent zero, but arguably he also invented the idea of negative numbers and algebra! He was the first to use zero as a real number, and work out a set of algebraic rules about how zero, positive, and negative numbers worked. The formulation he worked out is very interesting; he allowed zero as a numerator or a denominator in a fraction.

From Brahmagupta, zero spread both west (to the Arabs) and east (to the Chinese and Vietnamese). Europeans were just about the last to get it; they were so attached to their wonderful Roman numerals that it took quite a while to penetrate: zero didn't make the grade in Europe until about the 13th century, when Fibonacci (he of the series) translated the works of a Persian mathematician named al-Khwarizmi (from whose name sprang the word "algorithm" for a mathematical procedure). As a result, Europeans called the new number system "arabic", and credited it to the Arabs; but as I said above, the Arabs didn't create it; it originally came from India. (But the Arabic scholars, including the famous poet Omar Khayyam, are the ones who adopted Brahmagupta's notions *and extended them* to include complex numbers.)

### Why is zero strange?

Even now, when we recognize zero as a number, it's an annoyingly difficult one. It's neither positive nor negative; it's neither prime nor composite. If you include it, the real numbers under multiplication aren't a group - zero has no multiplicative inverse. It's not a unit; and it breaks the closure of the real numbers under division. It's a real obnoxious bugger in a lot of ways. One thing Aristotle was right about: zero is a kind of counterpart to infinity: a concept, not a quantity. But infinity we can generally ignore in our daily lives. Zero, we're stuck with.

Still, it's there, and it's a real, inescapable part of our entire concept of numbers. It's just an oddball - the dividing line that breaks a lot of rules. But without it, a lot of rules fall apart. Addition isn't a group without 0. Addition and subtraction aren't closed without zero.

Our notation for numbers is also totally dependent on zero; and it's hugely important to making a place-value number system work. To get an idea of how valuable it is, just wait 'till later this week, when I'll be re-posting an article about multiplication in roman numerals - multiplication is vastly easier in the decimal system with zero!

Because of the strangeness of zero, people make a lot of mistakes involving it.

For example, there's one of my big pet peeves: based on the idea that zero and infinity are related, a lot of people believe that 1/0 = infinity. It doesn't. 1/0 doesn't equal *anything*; it's meaningless. You *can't* divide by 0. The intuition behind this fact comes from the Aristotelean idea about zero: concept, not quantity. Division is a concept based on quantity: asking "What is x divided by y?" is asking "What quantity of stuff is the right size so that if I take y of it, I'll get x?"

So: what quantity of apples can I take 0 of to get 1 apple? The question makes no sense; and that's exactly right: it *shouldn't* make sense, because dividing by zero makes no sense: *it's meaningless*.
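The meaninglessness of division by zero is baked into programming languages, too - a quick Python sketch (my illustration, not part of the original argument):

```python
# Division by zero is meaningless - Python refuses it outright,
# rather than returning "infinity".
try:
    result = 1 / 0
except ZeroDivisionError:
    result = None   # there is no quantity that answers the question
print(result)

# As the divisor shrinks toward zero, the quotient just grows
# without bound - Aristotle's "you never actually get there".
# (Powers of two chosen so the floating-point values are exact.)
for y in (1.0, 0.5, 0.25, 0.125):
    print(y, 1 / y)
```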

Zero is also at the root of a lot of silly mathematical puzzles and tricks. For example, there's a cute little algebraic pun that can show that 1 = 2, which is based on hiding a division by zero.

- Start with "x = y".
- Multiply both sides by x: "x^2 = xy".
- Subtract "y^2" from both sides: "x^2 - y^2 = xy - y^2".
- Factor: "(x+y)(x-y) = y(x-y)".
- Divide both sides by the common factor "x-y": "x + y = y".
- Since x = y, we can substitute y for x: "y + y = y".
- Simplify: "2y = y".
- Divide both sides by y: "2 = 1".

The problem, of course, is step 5: x - y = 0, so step 5 divides by zero. Since that's a meaningless thing to do, everything that depends on getting a meaningful result from that step is wrong - and so we get to "prove" false facts.
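A quick way to see exactly where the pun breaks is to check each step numerically - a sketch of mine, with x = y = 3 as an arbitrary choice:

```python
x = y = 3

# Steps 1-4 are genuine identities: both sides stay equal.
assert x == y
assert x**2 == x * y
assert x**2 - y**2 == x * y - y**2
assert (x + y) * (x - y) == y * (x - y)   # both sides are 0

# Step 5 "divides by the common factor x - y" - but x - y is 0,
# so the division is meaningless and the chain of equalities breaks.
print("x - y =", x - y)
try:
    ((x + y) * (x - y)) / (x - y)
except ZeroDivisionError:
    print("step 5 divides by zero")
```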

Anyway, if you're interested in reading more, the best source of information that I've found is an online article called "The Zero Saga". It covers not just a bit of history and random chit-chat like this article, but also a detailed presentation of everything you could ever want to know: from the linguistics of words meaning zero or nothing, to the cultural impact of the concept, to detailed mathematical explanations of how zero fits into algebras and topologies.


"zero spread both east (to the Arabs) and west (to the Chinese and Vietnamese.)"

I think you mean west to the Arabs, and east to the Chinese 🙂

Mark obviously forgot to carry the negative sign.

I don't think you're quite right about this. The Arabs are, according to many scholars, the inventors of the number zero.

Also, I think zero has much more credence than the concept of infinity. For one thing, you *can* "have zero" of something - zero expresses a total lack, and a lack is a quantitative, not qualitative, notion. On the other hand, one cannot have infinity of something.

Otherwise, a very good post!

NS

Yes, one of your best posts of all times. Also won its rightful place in the very first Science Blogging Anthology...

"So if I'm measuring some rocks, and one weighs 99 grams, then that measurement has only two significant digits. If I use the same scale to weigh a very slightly larger rock, and it weighs 101 grams, then my measurement of the second rock has *three* significant digits. The leading zeros don't count!"

All this shows is how silly and stupidly-defined "significant figures" are.

Maybe you could do (or have done) one on the concept of "significant digits"? I know several people who use the term, but none of them seem to mean the same thing as any of the others.

Cite on the Arabs inventing complex numbers thing? I thought it was Renaissance Italy.

I would say zero and infinity are related both conceptually and because

lim(x->0): 1/x = infinity

#4, I think the point was that you can't "have" zero of something. That is you can't be in possession of nothingness. You can be in a state of not possessing anything, but that isn't a quantity that can be possessed. (Put another way, you can trade 4 quarters for 1 dollar, but you can't trade nothing and get something; That is called being given something).

#9, Actually, the limit you have written is *still* not defined, since 1/x exhibits different behavior depending on how one takes the limit. For that limit to exist you must specify the so-called "right-handed" limit, where you demand that the limit be taken such that x > 0 is always true.

I usually explained the Natural Numbers to my students as those numbers appropriate to describing how many grains of sand are in a pot. You can't have half a grain of sand, but you may have zero grains in the pot. This seems very natural to me, and I'm disappointed that Aristotle wasn't willing to think of the counting numbers that way.
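The one-sided behavior this commenter describes is easy to see numerically - a quick sketch of mine, not part of the original comment:

```python
# Approaching zero from the right, 1/x blows up toward +infinity...
for x in (0.1, 0.01, 0.001):
    print(x, 1 / x)

# ...but approaching from the left, it plunges toward -infinity.
for x in (-0.1, -0.01, -0.001):
    print(x, 1 / x)

# The two one-sided behaviors disagree, so the two-sided limit of 1/x
# as x -> 0 does not exist; only the right-handed limit is +infinity.
right = 1 / 0.001
left = 1 / -0.001
print(right, left)
```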

Division by zero to demonstrate the inconsistency of arithmetic is unnecessary. Do it without any division at all:

4 - 10 = 9 - 15

Add 25/4 to both sides,

4 - 10 + 25/4 = 9 - 15 + 25/4

Write sides as complete squares,

(2 - 5/2)^2 = (3 - 5/2)^2

Take the square root of both sides

2 - 5/2 = 3 - 5/2,

add 5/2 to both sides

2=3

Of course Uncle Al's proof fails because his third equation is equivalent to (-1/2)^2 = (1/2)^2. Taking the square root to equate -1/2 and 1/2 is a fallacy. In general x^2=y^2 does not mean that x=y. Rather it means x=y or x=-y. (The or is the inclusive or.)

So to be correct:

2-5/2=3-5/2 or 2-5/2=-3+5/2

2=3 or 2=-3+10/2

2=3 or 2=2

Still, it is nice to have a "proof" that 1=2 that does not have a division by zero as its fallacy.
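The fallacy is easy to check mechanically (a sketch of mine): squaring discards sign, so equal squares don't imply equal values - sqrt(a^2) gives |a|, not a.

```python
import math

a = 2 - 5/2   # -0.5 (exact in binary floating point)
b = 3 - 5/2   #  0.5

assert a**2 == b**2   # the squares really are equal...
assert a != b         # ...but the values are not

# Taking the square root recovers the absolute value, not the sign:
assert math.sqrt(a**2) == abs(a) == 0.5
# So the correct conclusion from a^2 = b^2 is "a = b or a = -b":
assert a == -b
print("a =", a, "b =", b)
```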

You have in the paragraph about the Babylonians and zero that "2 and 120 looked exactly the same." Don't you mean "2 and 20"?

The concept of zero does not come naturally to some people, even nowadays. Eric Hehner gives (pdf) the following example of it being reinvented the hard way:

We will never know who the real Brahmagupta was, or even if there were several of them in those days. Although popular accounts helpfully inform us that Brahmagupta was born in present day Rajasthan and lived and worked there all his life, that's not saying much really. Many of the scholars of India in those days used pseudonyms and even the names of popular scholars of earlier times. All in an attempt to keep their personal lives a secret.

Avi Steiner: The Babylonian system was base 60, so it really should be 2 and 120.

Hi Mark,

Charles Seife, 'Zero: The Biography of a Dangerous Idea', is an excellent historical read.

The author agrees with you about the Babylonians.

There is evidence that arabic numerals were taken after the conquest of India.

The author also suggests that the Greeks had both a system similar to the Romans' and were aware of the existence of zero. Zero may have been a state secret for the Romans, who were excellent engineers - engineering would seem to be difficult without knowing about zero.

#4, If you say that we can have zero of something, why are you uncomfortable with saying you can have an infinite amount of something? When it comes to atomic size, say a rock, I CAN say it has an infinite number of particles.

My understanding of the difficulty is that zero is not a member of the group of rational numbers - that group which you get from multiplication and division using the set of prime numbers as "generators" of the group.

There's an asymmetry.

The additive group over the integers is the set of numbers you get by adding and subtracting, using 0 and 1 as generators. When you combine the two groups by identifying the positive integers with the whole numbers, it turns out that all rational numbers have well-defined operations under addition/subtraction, but not all integers have well-defined operations under multiplication/division.

I wonder if it's related to the other asymmetry, that

a(b+c) = ab + ac

but not

a + bc = (a+b)(a+c)

Oh, and note that "infinity" is not a member of either group. While it is true that there are an infinite number of integers, there is no integer that is "infinity".

I just heard this on Sportscenter of all places:

"By the way, zero times any number is zero and division by zero is ... well .. really hard and sorta meaningless."

Given the intended audience, I'm more than willing to forgive a little vacillation. I just thought it was humorously ironic to have read this article yesterday and then to hear such an accurate mathematical analysis on a sports show (albeit one known for its witty and intelligent commentary).

P.S. In case you're wondering, it was w/r to the Detroit Lions' amazing, winless season.

Brahmagupta didn't invent zero. The decimal place value number system with zero evolved within Indian mathematics between Aryabhata (born 476) and Brahmagupta (7th century), who gave the earliest set of rules for addition, multiplication, subtraction, and division with positive and negative numbers and zero - although he didn't realise that division by zero is not definable.

As far as I know this statement is rubbish! Do you have a source for your claim?

The Greeks had two different number systems. For every day purposes they used an alphanumerical system, i.e. with letters for numbers, that is in many ways similar to the Romans but for astronomical texts they used the sexagesimal place value system of the Babylonians with a placeholder zero, although unlike the Babylonians they also placed their zero at the end of numbers. The original Greek symbol for their placeholder zero was a dot that later became a small circle, which is probably the origin of our symbol for zero.

You only need a zero in a place value number system to indicate an empty place; as the Roman number system was not a place value system, they had no need for a zero.

From Krishna (#19):

This statement makes my mind split in half, between the practical mind and the precise mind. My practical mind says that yes, for darn near any application, one can treat the number of particles in a rock as infinity -- no matter how you wear it down into pieces, you'll wear out before making all the particles you can.

But my precise mind says "no". Without having a definite limit in mind of the number of particles in the rock, there is no way you could say with authority that the homeopaths have diluted their medicines beyond any particle remaining.

So, to be fair, it does make a difference -- just not very darn often.