Last time, I talked about students’ difficulties carrying out divisions that involve a zero, which reminded me of another issue that sounds almost the same but is quite different: division of a number *by* zero. Students often forget the rule they were taught, or don’t believe the rule, or just wonder about it. This is common enough that the Ask Dr. Math site has a FAQ on it. Here is a selection of questions about this, with answers on many levels.

## Arithmetic: A model, and a definition

Dividing by Zero

I cannot comprehend that a human being is not able to divide a number by zero, because by definition 0 is nothing, and if you can multiply by nothing, and add and subtract by nothing, why can't you divide by nothing? Let's say you have 10 apples and you divide them by 0 - don't you still have 10 apples? I cannot see why this cannot be done!

It is common for people (of all ages) to try to understand multiplication or division by zero in terms of a physical model, and get hung up. What does it mean to divide by zero? I started by clarifying Iain’s model, starting with a non-zero example:

You don't seem to be thinking closely about what it MEANS to divide by zero. Let's watch what happens when we try to divide those apples.

First, let's divide the 10 apples into piles of 2, so we can give 2 to each of our friends. (When we run out, we'll be out of friends!) We do this:

    oo oo oo oo oo
    -- -- -- -- --

We've managed to make 5 piles; 10 divided by 2 is 5.

Now let's try dividing the apples into piles of ZERO to give to our enemies, and see when we run out:

    -- -- -- -- -- -- -- -- --
    -- -- -- -- -- -- -- -- -- ...

This is getting hard! No matter how many empty piles I make, I haven't used up any of my apples. I guess I can have infinitely many enemies to give them to!

This is why we can't divide by zero: we can never finish the job. And your mention of human beings is interesting; it's precisely because we are human, and therefore finite, that we can't do this.
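The pile-dealing model above can be turned into a short program. Treating division as repeated subtraction makes it visible that the process simply never terminates when the divisor is zero. (This is an illustrative sketch of my own, not from the original exchange; the function name is invented.)

```python
def piles_of(dividend, divisor):
    """Division as repeated subtraction: deal out piles of size
    `divisor` until the apples run out, and count the piles."""
    piles = 0
    remaining = dividend
    while remaining > 0:
        if divisor == 0:
            # Empty piles never use up any apples, so this loop
            # would run forever -- "we can never finish the job".
            raise ZeroDivisionError("dealing empty piles never ends")
        remaining -= divisor
        piles += 1
    return piles

print(piles_of(10, 2))  # 5 piles, just as with the apples above
```

The guard inside the loop is exactly the point of the answer: without it, the program would make empty piles forever.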

Hold on to that thought about infinity! For now, let’s move from physical models (which are often misunderstood) to abstract definitions:

To put all of this into mathematical terms, dividing by 2 means finding a number (5) by which we can multiply 2 to get 10:

    10 / 2 = 5 because 10 = 2 * 5

If we could divide 10 by 0 (I'll call the answer X), we would be saying that:

    10 / 0 = X because 10 = 0 * X

But zero times anything is 0, so I will never find an X for which this is true. That's what happened when I tried dividing the apples.
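That definition can even be checked mechanically: a/b = x exactly when a = b·x, so dividing 10 by 0 amounts to hunting for an x with 0·x = 10. Here is a brute-force sketch of my own (searching only integer candidates, which is enough to make the point):

```python
# Division defined by multiplication: 10 / b = x exactly when 10 == b * x.
candidates = range(-1000, 1001)

# For b = 2, exactly one candidate works:
print([x for x in candidates if 2 * x == 10])   # [5]

# For b = 0, zero times anything is 0, so no candidate ever works:
print(any(0 * x == 10 for x in candidates))     # False
```

No matter how far the search range is widened, the second result stays `False`: there is no X with 0 · X = 10.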

## Algebra: What it means to be undefined

That last bit of thinking was almost algebra. At that level, we are doing math more formally, starting with definitions and properties. Here, the reason division by zero is “undefined” is that we can’t define it without losing the consistency that algebra depends on:

Divide by 0 Undefined?

When something is divided by 0, why is the answer undefined?

In 1996, Doctor Robert gave an answer much like mine above, but Doctor Tom took a slightly higher level:

It's because there's just no sensible way to define it. For example, we could say that 1/0 = 5. But there's a rule in arithmetic that a(b/a) = b; if 1/0 = 5, the rule would require that 0(1/0) = 1, yet 0(1/0) = 0*5 = 0. The rule doesn't work, so you could never use it. If you changed every rule to specifically say that it doesn't work for zero in the denominator, what's the point of making 1/0 = 5 in the first place? You can't use any rules on it.

That is, any definition we chose for 1/0 would lead to a contradiction with the very rules we use to solve algebraic equations; so we are forced to leave it undefined.

Now we can go back to that earlier mention of “infinitely many”. He continues:

But maybe you're thinking of saying that 1/0 = infinity. Well then, what's "infinity"? How does it work in all the other equations? Does infinity - infinity = 0? Does 1 + infinity = infinity? If so, the associative rule doesn't work, since (a+b)+c = a+(b+c) will not always hold:

    1 + (infinity - infinity) = 1 + 0 = 1, but
    (1 + infinity) - infinity = infinity - infinity = 0.

You can try to make up a good set of rules, but it always leads to nonsense, so to avoid all the trouble we just say that it doesn't make sense to divide by zero.

The point here is that if we were to say that the result of 1/0 is “infinity”, we would be treating infinity as a number. And it turns out that it doesn’t behave like a number; it, too, breaks rules. So although you might have a name for it, it still isn’t a number, which doesn’t help. Division by zero is undefined — there is no *number* that works.
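As an aside of my own (not part of the quoted answer): floating-point arithmetic actually does include an "infinity" value, and it dodges exactly these contradictions by declaring expressions like infinity minus infinity to be "not a number":

```python
# Python floats follow IEEE 754, which provides an infinity --
# and immediately runs into the problems described above.
inf = float("inf")

print(1 + inf)           # inf: so far so good
print(inf - inf)         # nan ("not a number"): no consistent value exists
print((1 + inf) - inf)   # nan as well, not 1: the usual rules break down
```

Even a standard designed around infinity refuses to assign these expressions a numeric value, which is the same conclusion the answer reaches.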

## Calculus: Limits and infinity

But we have a field of math that *does* handle infinity (or at least has sort of tamed it, so we can do some things with it): calculus. The way calculus handles infinity is by treating it not as an actual number, but as a “limit”. We talk about a variable or a function “approaching infinity”, and we give a precise definition of that. So, might we be able to say that 1/0 is infinite in that sense?

Here is an answer from 2001, where Doctor TWE first gave an explanation much like my first one above, but then came close to calculus:

Error: Division by Zero

I've been trying to help my third grader understand that a number divided by zero is undefined but am not getting much help. The calculator they use in school gives the answer 0/E (the teacher told them to write that on their homework paper, but seems not to understand what it means - I'm guessing it stands for "error"). But worse, the calculator on the Windows Accessory program gives the answer "positive infinity" when you divide a number by 0. How are we supposed to swim upstream against the teacher, the school calculators and the computer?

You can read the first part. Right now, I’m interested in that bit about infinity.

The argument above may seem like a case in favor of the Windows Accessory calculator program's answer of "positive infinity" ... To explain why it is not simply positive infinity requires a little more abstraction. Suppose we divide 1 by successively smaller values (I'll use 1, 0.1, 0.01, etc.) Our results are:

    1 / 1       = 1
    1 / 0.1     = 10
    1 / 0.01    = 100
    1 / 0.001   = 1,000
    1 / 0.0001  = 10,000
    1 / 0.00001 = 100,000

We can see that as the denominator gets closer to zero, the quotient increases without bound. (You can introduce the concept of limits and Lim[x->0+, 1/x] = +oo.) But suppose we divide -1 by successively smaller values. What happens then? Our results are:

    -1 / 1       = -1
    -1 / 0.1     = -10
    -1 / 0.01    = -100
    -1 / 0.001   = -1,000
    -1 / 0.0001  = -10,000
    -1 / 0.00001 = -100,000

Now our quotient is approaching negative infinity. So we have to make two rules: one if the numerator is positive and one if it is negative. ...

If I have the problem 1/0, how am I supposed to know whether to use a sequence of positive or negative numbers to approach zero? In fact, I can't know. That's why we say it is undefined instead of positive infinity. As we APPROACH zero from the right or from the left, our quotient approaches positive or negative infinity, but when the denominator IS zero, the quotient is neither - it's undefined.

This concept may be too abstract for a third grader. You may have to tell him to take it "on faith" for now, and that later on, when he gets older, he'll understand the reason why. When my son was 3, I didn't tell him why it was a good idea to buckle up in the car - he just had to do it. (I didn't want to scare him with the idea of having an accident.) When he was a little older (about 5, I think), I explained to him WHY it was a good idea. Now he always checks to make sure I'm buckled up, too!
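Doctor TWE's two tables are easy to reproduce. The sketch below (my own, not from the original answer) divides 1 and -1 by shrinking powers of ten, showing the quotients running off toward positive infinity on one side and negative infinity on the other:

```python
# Divide 1 and -1 by successively smaller denominators 1, 0.1, 0.01, ...
# (floating point, so results are shown rounded via the g format).
for k in range(6):
    d = 10.0 ** -k
    print(f" 1 / {d:g} = {1 / d:g}    -1 / {d:g} = {-1 / d:g}")
# The quotients grow without bound, positive in the first column and
# negative in the second, so no single value can serve as 1/0.
```

Note that in Python `1 / 0` and `1.0 / 0.0` both raise `ZeroDivisionError` rather than returning infinity; the language, like the mathematics, leaves the endpoint undefined.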

So, although there is some truth to the idea that 1/0 “equals” infinity, in the sense of a limit rather than a “number”, on close examination even that doesn’t hold up.

But let’s buckle up, because there’s one more direction we might go:

## But, couldn’t it be like imaginary numbers?

Is It Possible That x/0 is Not Really Undefined?

As we are all told, n = 1/0 is undefined, since no number n exists that, when multiplied by 0, gives the result 1 (i.e., n*0 = 1). However, drawing an example from the theory of complex numbers, there also exists no obvious number i that, when squared, gives the result -1. The square root of -1 would, in past centuries, have been described as nonsense or undefined. Mathematicians nonetheless define just such a number, enlarging the known numbers from the real to the complex, and use the number i successfully in many real-world calculations. Can we be certain that, for any nonzero x, x/0 is actually undefined (i.e. unable to be ascribed any actual value or meaning), or is it possible that an important number and new number class has so far escaped discovery?

Read that page, and the following, to see answers to this common question!

- Imaginary Numbers, Division By Zero
- Zero as Denominator

Here is the conclusion to the latter:

I think division by zero has always been confusing because there is a lot to it. The books simply weasel out of it by saying, "Division by zero is not defined," and the instant knee-jerk response is, "Well, just define it." It's in trying to find a "reasonable" definition that all the ugly problems come up.


Geoff Smith: Dear Sirs / Madams,

I disagree with your explanation above.

My answer: a number that is divided by zero has not been divided at all.

I put to you that the true meaning of ‘any number divided by 0’ is: that you are not dividing at all.

That is, I suggest that “a number that is divided by zero is not divided at all”, therefore will be the same number.

Anything that is divided by nothing has not been divided at all, so it remains its same original ‘self’.

For example: 10 oranges divided by 5 = 2 in each pile (10 / 5 = 2)

10 oranges divided by 10 = 1 in each pile (10 /10 = 1)

10 oranges divided by 0 = 10 oranges NOT divided at all, therefore, no action taken (0 = no action = nothing done). (10 / 0 = 10).

10 divided by 0 means I am not dividing it, there is 0 division, the number is not being divided. 10 divided by 0 = 10.

If you add 0 to any number you have the same original number. If you subtract 0 from any number you have the same original number. If you divide a number by zero – it means you aren’t dividing it at all, since 0 means no quantifiable amount, so if you are not dividing something by a quantifiable amount (0) then you are not dividing it at all. Something that has not been divided remains as it is. Therefore any number divided by zero is the same original number. No change.

Therefore, any number divided by 0 leaves the original number unchanged = same number. X / 0 = X

Sure, that is the same result as if you divide a number by 1 (that is X / 1 = X) but since Zero is a nothing quantity it could still have a different characteristic to the number ‘1’ but still derive the same answer in the case of division.

From Geoff Smith, Sydney, Australia

Dave Peterson: Hi, Geoff.

You’re saying essentially what the first questioner above said: “Let’s say you have 10 apples and you divide them by 0 – don’t you still have 10 apples?” The answer we gave Iain applies to you; but let’s say a little more this time.

First, if you haven’t divided at all, then … you are really saying you can’t divide by zero, and we agree. But you aren’t really saying that.

Then you say that division by zero should be just like adding or subtracting zero, which have no effect. But you don’t mention multiplying by zero, which does have an effect: it always results in zero. Division by zero is actually like that: it always has the same result, namely “infinity” (in the sense we’ve explained).

What you are missing is how we check our answer. To check the result of a subtraction, for example, you can add the same number back: to check that 10 – 3 = 7, you can observe that 7 + 3 = 10, and we get back the number we started with. And to check that 10 – 0 = 10, you add 10 + 0 = 10, and see that it works.

The same thing works for division, which is the opposite of multiplying: to check that 10 ÷ 5 = 2, you multiply 2 × 5 = 10. Now, if you claim that 10 ÷ 0 = 10, you can check that, too: multiply 10 × 0, and you get … not 10, but 0. It didn’t work.
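The check-by-multiplication described here is easy to mechanize. A small sketch of my own (not part of the original reply; the function name is invented):

```python
def check_quotient(dividend, divisor, claimed):
    """Check a claimed quotient the way we check a subtraction:
    apply the inverse operation and see if we recover the dividend."""
    return divisor * claimed == dividend

print(check_quotient(10, 5, 2))    # True:  2 * 5 == 10
print(check_quotient(10, 0, 10))   # False: 10 * 0 == 0, not 10
```

No value passed as `claimed` can ever make the second check succeed, since the product with 0 is always 0.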

Now, the one thing we want most in math is consistency. Even if something seems to make sense, if it is not consistent with the rest of math, then we can’t accept it. And that is, essentially, what the rest of this article is saying. We can’t define division by zero, because no answer you give can be consistent with the rest.


Geoff Smith: Hello Dave,

Thank you for your answer. You provide a valid explanation, and yes, the multiplication ‘cross-check’ that you explained is also valid.

I was merely trying to challenge the status quo.

However, if you consider that multiplication and division are an ‘action’, then that would mean that both multiplying and dividing by 0 can be construed to mean that in fact we are not in any way changing the number that is being ‘acted’ upon. Is that right? (Or wrong?)

Therefore if you use my meaning above then:

10 x 0 = 10 (because you haven’t actually multiplied it by anything so ’10’ remains the same),

….. and

10 divided by 0 = 10 (because you haven’t actually divided it by anything, so “10” remains the same, i.e. “0” means nothing). In this suggestion the multiplication ‘proof’ holds true for the division too.

However, 0 divided by 10 = 0

Nevertheless Dave, I am going to accept your explanation above, which is what I also learned at high school (like we all did) and I thank you for your reply.

However, I’m keeping an open mind about the possibility of some truth in what I wrote above, and hopefully some philosophers might be able to shed some light on it, and you too, Dave?

(A) If someone asked you what you were going to do about any particular problem and you answered “zero”, that would mean that you were going to effect ‘no change’, that is, do nothing. So why can’t 0 (zero) mean the same thing in relation to multiplying and dividing? That is, why can’t it simply mean that multiplying or dividing by zero means ‘effecting no change’, so that the answer remains the same as the original number?

I make a valid point.

Dave, I’m going to accept your answer above for now ….. but I want to throw my question above (A) out there to you and everyone else anyway. I’m happy to be proven wrong and I’m happy to be proven right – whatever is the truth about things.

Thank you and best regards,

Geoff S.

Dave Peterson: Hi, Geoff.

All of this is based on a false premise: “if you consider that multiplication and division are an ‘action’, then that would mean that either multiplying or dividing by 0 can be construed to mean that in fact we are not in any way changing the number that is being ‘acted’ upon.” That is wrong.

Adding or subtracting 0 does nothing; 0 is called the additive identity. Multiplying or dividing by 1 does nothing; 1 is called the multiplicative identity. But multiplying by zero does not do nothing. For an explanation, just see my latest post, Is Zero Really a Number? There we explain what zero is, and why multiplying by zero does what it does.

You ask, “why can’t it simply mean that multiplying or dividing by zero means ‘effecting no change’, therefore the answer remains the same as the original number??” One quick answer is that we already have a number that effects no change when you multiply by it, and that number is 1. Your claim is essentially that zero should do exactly what 1 does. That’s silly, in addition to not meeting the definition of multiplication. But do read the new post.

Geoff Smith: Thank you for your informative explanation, Dave.

One thought ….. and it’s pretty obvious:

What you are saying is that for any positive number(s) and fraction(s), let’s call it X:

X – X = X × 0

Best regards,

Geoff S.

Sydney, Australia.

Dave Peterson: Yes, that is one way to express what we say in the other post (and not only for “positive numbers and fractions”, but for any real number, and even any complex number):

\(x-x=x(1-1)=x\cdot0\)

That proves that, in fact, \(x\cdot0=0\) for any number x.

Geoff Smith: Thank you very much, Dave.

Just one last question if I may – I heard one explanation as to why 1/0 is undefinable, as in this example:

That is a proof that I saw that explains why dividing a number by zero is undefinable.

Is this a viable proof?

I don’t think this proof takes into account the nature of infinity, since infinity is not a number; it is a concept.

Best regards,

Geoff S.

Sydney, Australia.

Dave Peterson: What you quote is essentially what we say in this very post:

We do “take into account the nature of infinity”; our whole point is that it is not a number, because it doesn’t follow the rules of numbers; and your quotation is just another example of that.

In order for 1/0 to be defined, its result must be a number; an operation must take two numbers and produce another number. What’s needed is more than mere existence as a concept.

Geoff Smith: Yes. Thanks, Dave.

The word “infinity” comes from the word “infinite”.

“Infinite” means: “limitless or endless in space, extent, or size; impossible to measure or calculate”.

“Infinite” is also another term for “non-finite”.

Best regards,

Geoff S.

Dave Peterson: Yes; often I prefer to say “infinitely many” rather than “infinity”, to avoid giving the sense that it is something in itself.

For some interesting perspectives on what infinity is and what it isn’t, see our Hilbert’s Cookie Jar: Little Kids, Big Ideas. As I say there,

leonard paul tunstill: When I think of infinity I always consider the example of approaching a number in smaller and smaller increments, without ever getting to the number. Or: you want a pint of beer, so you order half a pint, then a quarter pint, then an eighth of a pint, etc., but you never actually get your pint. Interesting concept?

Dave Peterson: The idea you refer to is the limit of a sequence, or the sum of an infinite (geometric) series, namely \(\frac{1}{2}+\frac{1}{4}+\frac{1}{8}+\dots=1\). It’s also related to a (horizontal) asymptote of a function, which is a value it approaches as the input approaches infinity.

What we’re dealing with in this post is sort of the opposite: instead of adding infinitely many (smaller and smaller) numbers to get a finite number, we are dividing by smaller and smaller numbers to get results that approach infinity.

leonard tunstill: Infinity + 0 = finite. If you keep adding 1 to your number it keeps increasing (indefinitely?) but if you add 0 you have stopped the number increasing and now have a finite number.

Dave Peterson: No, this is not true at all. Adding zero to anything leaves it unchanged, so (to the extent we can treat infinity as a number at all) infinity plus zero has to be infinity.

If this is related to your previous comment, then you appear to be confusing repeatedly adding non-zero numbers with adding zero once. If you add 1 repeatedly, you get \(1+1+1+\dots=\infty\); to make this finite, you have to not add 0 to the result, but rather replace all but a finite number of the 1’s with 0: \(1+1+1+0+0+\dots=3\).

In any case, infinity is very slippery to think about. You might benefit from reading the page I referred to above, Hilbert’s Cookie Jar: Little Kids, Big Ideas.