Dividing a number by zero actually comprises two different mathematical operations:
1) Dividing zero by zero, which admits any quotient, since the product of any number by zero is again zero; mathematicians therefore call that quotient indeterminate.
2) Dividing any nonzero number by zero, which admits no quotient, since, the product of any number by zero being again zero, no number multiplied by zero gives anything other than zero; mathematicians therefore call that quotient undefined.
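To make the distinction concrete, here is a minimal Python sketch (the function name and the finite range of candidates are mine, chosen purely for illustration) that tests each candidate against the defining equation of division, x * b = a:

    def valid_quotients(a, b, candidates=range(-3, 4)):
        """Return every candidate x satisfying the defining equation x * b == a."""
        return [x for x in candidates if x * b == a]

    print(valid_quotients(6, 3))  # [2]: exactly one quotient
    print(valid_quotients(0, 0))  # [-3, -2, -1, 0, 1, 2, 3]: every candidate works (indeterminate)
    print(valid_quotients(1, 0))  # []: no candidate works (undefined)

Over any finite set of candidates, 0 / 0 passes the test for all of them, while 1 / 0 passes for none: the indeterminate/undefined split in miniature.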
The problem is that we define division as the inverse of multiplication. For instance, 6 / 3 = 2 and 6 / 2 = 3 only because 2 * 3 = 6. But if we concentrate on the division of zero by zero, setting aside for the moment the division of any other number by zero, then we find something a bit shocking: according to the very definition of division, if 1 * 0 = 0, then it must be the case that 0 / 0 = 1. We may be fine with that until we discover that, for the same reason, it is also the case that 0 / 0 = 2, since 2 * 0 = 0.
And here is the problem: the very definition of division tells us to accept both 0 / 0 = 1 and 0 / 0 = 2 as valid in themselves, yet the fact that 0 / 0 admits any quotient, being thus indeterminate, tells us to reject it as invalid, since that indeterminacy would undermine mathematics as a whole; and this last step is no mathematical reasoning. But no matter how much we love mathematics, disregarding 0 / 0 to keep mathematics from self-destructing will never itself become a mathematical argument. Historically, we all know what mathematicians chose: even good mathematicians today tend to treat 0 / 0 as if it were just the same as 1 / 0, thereby confusing being indeterminate with being undefined, which is a mathematical error.
It is all too easy to forget that mathematics itself tells us that 0 / 0 = X is valid for each and every X, by the very definition of division, which demands only that X * 0 = 0. There is no mathematical reason for treating 0 / 0 as invalid, which does not mean there are no reasons to avoid dividing zero or any other number by zero in practice (for example, in computer programs). But no matter how disastrous that division may be, and it is indeed disastrous, it is mathematically valid. Even worse, it makes numbers false, since the number 1 must never be identical to the number 2, yet both equally qualify as the quotient of 0 / 0.
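Incidentally, IEEE-754 floating-point arithmetic draws exactly this distinction in practice. A quick check in Python (using numpy here only as a convenient way to get IEEE semantics instead of Python's blanket exception) shows 0.0 / 0.0 producing NaN, the indeterminate case, while 1.0 / 0.0 produces infinity under the divide-by-zero rule, the undefined case:

    import numpy as np

    # Python's own arithmetic refuses both divisions outright:
    try:
        0 / 0
    except ZeroDivisionError as e:
        print(e)  # "division by zero"

    # IEEE-754 floats (via numpy) keep the two cases apart:
    with np.errstate(divide="ignore", invalid="ignore"):
        print(np.float64(0.0) / np.float64(0.0))  # nan: indeterminate
        print(np.float64(1.0) / np.float64(0.0))  # inf: undefined (divide-by-zero)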
It is one thing to forbid such a mathematical operation, as the Catholic Church forbade so many things in medieval times. It is another to have a rational motivation for doing so, let alone a mathematical reason.