Last time, I talked about students’ difficulties carrying out divisions that involve a zero, which reminded me of another issue that sounds almost the same, but is quite different: division of a number by zero. Students often forget the rule they were taught, don’t believe it, or just wonder about it. This is common enough that the Ask Dr. Math site has a FAQ on it. Here is a selection of questions about this, with answers at many levels.
Arithmetic: A model, and a definition
Dividing by Zero

I cannot comprehend that a human being is not able to divide a number by zero, because by definition 0 is nothing, and if you can multiply by nothing, and add and subtract by nothing, why can't you divide by nothing? Let's say you have 10 apples and you divide them by 0 - don't you still have 10 apples? I cannot see why this cannot be done!
It is common for people (of all ages) to try to understand multiplication or division by zero in terms of a physical model, and to get hung up on it. What does it mean to divide by zero? I began by clarifying Iain’s model, starting with a non-zero example:
You don't seem to be thinking closely about what it MEANS to divide by zero. Let's watch what happens when we try to divide those apples. First, let's divide the 10 apples into piles of 2, so we can give 2 to each of our friends. (When we run out, we'll be out of friends!) We do this:

oo oo oo oo oo
-- -- -- -- --

We've managed to make 5 piles; 10 divided by 2 is 5. Now let's try dividing the apples into piles of ZERO to give to our enemies, and see when we run out:

-- -- -- -- -- --
-- -- -- -- -- --
-- -- -- -- -- -- ...

This is getting hard! No matter how many empty piles I make, I haven't used up any of my apples. I guess I can have infinitely many enemies to give them to! This is why we can't divide by zero: we can never finish the job. And your mention of human beings is interesting; it's precisely because we are human, and therefore finite, that we can't do this.
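If it helps, here is a minimal sketch of that "dealing out piles" process in Python (the function name and the cap are my own, purely for illustration); the cap is there because, with piles of zero, the loop would otherwise never end:

```python
def count_piles(apples, pile_size, max_piles=1_000_000):
    """Deal `apples` into piles of `pile_size` and count the piles."""
    piles = 0
    while apples > 0 and apples >= pile_size:
        if piles >= max_piles:
            return None          # gave up: the apples are never used up
        apples -= pile_size      # hand out one more pile
        piles += 1
    return piles

print(count_piles(10, 2))        # 5    -- ten apples make five piles of two
print(count_piles(10, 0))        # None -- empty piles never exhaust the apples
```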
Hold on to that thought about infinity! For now, let’s move from physical models (which are often misunderstood) to abstract definitions:
To put all of this into mathematical terms, dividing by 2 means finding a number (5) by which we can multiply 2 to get 10:

10 / 2 = 5 because 10 = 2 * 5

If we could divide 10 by 0 (I'll call the answer X), we would be saying that:

10 / 0 = X because 10 = 0 * X

But zero times anything is 0, so I will never find an X for which this is true. That's what happened when I tried dividing the apples.
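That definition can be turned directly into a check. Here is a small Python illustration (the helper is hypothetical, not from the original answer) that treats division as "find the X with divisor * X = dividend":

```python
from fractions import Fraction

def quotient(dividend, divisor):
    """Return the X for which divisor * X == dividend, if one exists."""
    if divisor == 0:
        # 0 * X is 0 for every X, so no X can reach a nonzero dividend.
        raise ZeroDivisionError(f"no X satisfies 0 * X == {dividend}")
    x = Fraction(dividend, divisor)
    assert divisor * x == dividend    # the multiplication check
    return x

print(quotient(10, 2))    # 5, because 2 * 5 = 10
print(quotient(10, 0))    # raises ZeroDivisionError: no X satisfies 0 * X == 10
```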
Algebra: What it means to be undefined
That last bit of thinking was almost algebra. At that level, we are doing math more formally, starting with definitions and properties. Here, the reason division by zero is “undefined” is that we can’t define it without losing the consistency that algebra depends on:
Divide by 0 Undefined?

When something is divided by 0, why is the answer undefined?
In 1996, Doctor Robert gave an answer much like mine above, but Doctor Tom took a slightly higher level:
It's because there's just no sensible way to define it. For example, we could say that 1/0 = 5. But there's a rule in arithmetic that a(b/a) = b, and if 1/0 = 5, 0(1/0) = 0*5 = 0 doesn't work, so you could never use the rule. If you changed every rule to specifically say that it doesn't work for zero in the denominator, what's the point of making 1/0 = 5 in the first place? You can't use any rules on it.
That is, any definition we chose for 1/0 would lead to a contradiction with the very rules we use to solve algebraic equations; so we are forced to leave it undefined.
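To spell the contradiction out: take the rule a(b/a) = b with a = 0 and b = 1. If we declared 1/0 = 5, the rule would demand that 0 * (1/0) = 1; but 0 * (1/0) = 0 * 5 = 0, so we would be forced to conclude that 1 = 0. The same collapse happens no matter what value we try to assign to 1/0.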
Now we can go back to that earlier mention of “infinitely many”. He continues:
But maybe you're thinking of saying that 1/0 = infinity. Well then, what's "infinity"? How does it work in all the other equations? Does infinity - infinity = 0? Does 1 + infinity = infinity? If so, the associative rule doesn't work, since (a+b)+c = a+(b+c) will not always work:

1 + (infinity - infinity) = 1 + 0 = 1, but
(1 + infinity) - infinity = infinity - infinity = 0.

You can try to make up a good set of rules, but it always leads to nonsense, so to avoid all the trouble we just say that it doesn't make sense to divide by zero.
The point here is that if we were to say that the result of 1/0 is “infinity”, we would be treating infinity as a number. And it turns out that it doesn’t behave like a number; it, too, breaks rules. So although you might have a name for it, it still isn’t a number, which doesn’t help. Division by zero is undefined — there is no number that works.
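As a side note, floating-point arithmetic illustrates the same problem: the IEEE 754 number format does include a value named infinity, but it avoids contradictions like the one above only by making expressions such as infinity minus infinity come out as "not a number" instead of 0. A quick check in Python:

```python
import math

inf = math.inf                 # the IEEE 754 "infinity" value

print(inf - inf)               # nan -- not 0; "not a number"
print(1 + (inf - inf))         # nan
print((1 + inf) - inf)         # nan
print(inf - inf == 0)          # False: infinity refuses to behave like a number
```

So even a number system that gives infinity a name still cannot let it take part in ordinary arithmetic.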
Calculus: Limits and infinity
But we have a field of math that does handle infinity (or at least has sort of tamed it, so we can do some things with it): calculus. The way calculus handles infinity is by treating it not as an actual number, but as a “limit”. We talk about a variable or a function “approaching infinity”, and we give a precise definition of that. So, might we be able to say that 1/0 is infinite in that sense?
Here is an answer from 2001, where Doctor TWE first gave an explanation much like my first one above, but then came close to calculus:
Error: Division by Zero

I've been trying to help my third grader understand that a number divided by zero is undefined but am not getting much help. The calculator they use in school gives the answer 0/E (the teacher told them to write that on their homework paper, but seems not to understand what it means - I'm guessing it stands for "error"). But worse, the calculator on the Windows Accessory program gives the answer "positive infinity" when you divide a number by 0. How are we supposed to swim upstream against the teacher, the school calculators and the computer?
You can read the first part. Right now, I’m interested in that bit about infinity.
The argument above may seem like a case in favor of the Windows Accessory calculator program's answer of "positive infinity" ... To explain why it is not simply positive infinity requires a little more abstraction. Suppose we divide 1 by successively smaller values (I'll use 1, 0.1, 0.01, etc.) Our results are:

1 / 1       = 1
1 / 0.1     = 10
1 / 0.01    = 100
1 / 0.001   = 1,000
1 / 0.0001  = 10,000
1 / 0.00001 = 100,000

We can see that as the denominator gets closer to zero, the quotient increases without bound. (You can introduce the concept of limits and Lim[x->0+, 1/x] = +oo.) But suppose we divide -1 by successively smaller values. What happens then? Our results are:

-1 / 1       = -1
-1 / 0.1     = -10
-1 / 0.01    = -100
-1 / 0.001   = -1,000
-1 / 0.0001  = -10,000
-1 / 0.00001 = -100,000

Now our quotient is approaching negative infinity. So we have to make two rules: one if the numerator is positive and one if it is negative. ...

If I have the problem 1/0, how am I supposed to know whether to use a sequence of positive or negative numbers to approach zero? In fact, I can't know. That's why we say it is undefined instead of positive infinity. As we APPROACH zero from the right or from the left, our quotient approaches positive or negative infinity, but when the denominator IS zero, the quotient is neither - it's undefined.

This concept may be too abstract for a third grader. You may have to tell him to take it "on faith" for now, and that later on, when he gets older, he'll understand the reason why. When my son was 3, I didn't tell him why it was a good idea to buckle up in the car - he just had to do it. (I didn't want to scare him with the idea of having an accident.) When he was a little older (about 5, I think), I explained to him WHY it was a good idea. Now he always checks to make sure I'm buckled up, too!
So, although there is some truth to the idea that 1/0 “equals” infinity, in the sense of a limit rather than a “number”, on close examination even that doesn’t hold up.
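If you want to verify that one-sided behavior symbolically, the sympy library (not part of the original answer, just an illustration) computes exactly those limits:

```python
from sympy import Symbol, limit

x = Symbol('x')
print(limit(1/x, x, 0, dir='+'))   # oo   -- approaching 0 from the right
print(limit(1/x, x, 0, dir='-'))   # -oo  -- approaching 0 from the left
# The two one-sided limits disagree, so 1/x has no single limit at 0,
# and the value of 1/0 itself remains undefined.
```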
But let’s buckle up, because there’s one more direction we might go:
But, couldn’t it be like imaginary numbers?
Is It Possible That x/0 is Not Really Undefined?

As we are all told, n = 1/0 is undefined, since no number n exists that, when multiplied by 0, gives the result 1 (i.e., n*0 = 1). However, drawing an example from the theory of complex numbers, there also exists no obvious number i that, when squared, gives the result -1. The square root of -1 would, in past centuries, have been described as nonsense or undefined. Mathematicians nonetheless define just such a number, enlarging the known numbers from the real to the complex, and use the number i successfully in many real-world calculations. Can we be certain that, for any nonzero x, x/0 is actually undefined (i.e. unable to be ascribed any actual value or meaning), or is it possible that an important number and new number class has so far escaped discovery?
Read that page, and the following, to see answers to this common question!
Imaginary Numbers, Division By Zero
Zero as Denominator
Here is the conclusion to the latter:
I think division by zero has always been confusing because there is a lot to it. The books simply weasel out of it by saying, "Division by zero is not defined," and the instant knee-jerk response is, "Well, just define it." It's in trying to find a "reasonable" definition that all the ugly problems come up.
Dear Sirs / Madams,
I disagree with your explanation above.
My answer: a number that is divided by zero has not been divided at all.
I put to you that the true meaning of ‘any number divided by 0’ is: that you are not dividing at all.
That is, I suggest that “a number that is divided by zero is not divided at all”, and therefore it will remain the same number.
Anything that is divided by nothing has not been divided at all, so it remains its same original ‘self’.
For example: 10 oranges divided by 5 = 2 in each pile (10 / 5 = 2)
10 oranges divided by 10 = 1 in each pile (10 /10 = 1)
10 oranges divided by 0 = 10 oranges NOT divided at all, therefore, no action taken (0 = no action = nothing done). (10 / 0 = 10).
10 divided by 0 means I am not dividing it, there is 0 division, the number is not being divided. 10 divided by 0 = 10.
If you add 0 to any number you have the same original number. If you subtract 0 from any number you have the same original number. If you divide a number by zero – it means you aren’t dividing it at all, since 0 means no quantifiable amount, so if you are not dividing something by a quantifiable amount (0) then you are not dividing it at all. Something that has not been divided remains as it is. Therefore any number divided by zero is the same original number. No change.
Therefore, any number divided by 0 leaves the original number unchanged = same number. X / 0 = X
Sure, that is the same result as if you divide a number by 1 (that is X / 1 = X) but since Zero is a nothing quantity it could still have a different characteristic to the number ‘1’ but still derive the same answer in the case of division.
From Geoff Smith, Sydney, Australia
Hi, Geoff.
You’re saying essentially what the first questioner above said: “Let’s say you have 10 apples and you divide them by 0 – don’t you still have 10 apples?” The answer we gave Iain applies to you; but let’s say a little more this time.
First, if you haven’t divided at all, then you are really saying that you can’t divide by zero, and there we agree. But that isn’t quite what you are claiming, since you go on to give an answer.
Then you say that division by zero should be just like adding or subtracting zero, which have no effect. But you don’t mention multiplying by zero, which does have an effect: It always results in zero. Division by zero is actually like that: It always has the same result, namely “infinity” (in the sense we’ve explained).
What you are missing is how we check our answer. To check the result of a subtraction, for example, you can add the same number back: to check that 10 – 3 = 7, you observe that 7 + 3 = 10, and you get back the number you started with. And to check that 10 – 0 = 10, you add 10 + 0 = 10, and see that it works.
The same check works for division, which is the inverse of multiplication: to check that 10 ÷ 5 = 2, you multiply 2 × 5 = 10.
Now, if you claim that 10 ÷ 0 = 10, you can check that, too: Multiply 10 × 0, and you get … not 10, but 0. It didn’t work.
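In code, that check is a one-liner. Here is a short Python sketch (the names are mine) of the same test, which the proposed rule X / 0 = X fails:

```python
def check_division(dividend, divisor, claimed_quotient):
    """Verify a claimed quotient by multiplying it back by the divisor."""
    return claimed_quotient * divisor == dividend

print(check_division(10, 5, 2))     # True:  2 * 5 = 10
print(check_division(10, 0, 10))    # False: 10 * 0 = 0, not 10
# No candidate works, because anything times 0 is 0:
print(any(check_division(10, 0, x) for x in range(-10_000, 10_001)))   # False
```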
Now, the one thing we want most in math is consistency. Even if something seems to make sense, if it is not consistent with the rest of math, then we can’t accept it. And that is, essentially, what the rest of this article is saying. We can’t define division by zero, because no answer you give can be consistent with the rest.