It looks like we won't be
(•_•)
( •_•)>⌐■-■
(⌐■_■)
divided by a zero.
That would make you infinite tho.
No. You can't divide by zero and get infinity. x / 0 = NaN; it can't be defined, and dividing by zero is impossible.
https://en.m.wikipedia.org/wiki/Division_by_zero
Probs skim a wiki page before trying to correct someone :) conventions are just conventions. All maths is arbitrary (e.g. n^0 being defined as 1 is so the nice patterns around power addition and such hold at 0, but "n times n zero times" is sort of a nonsense question if the operation is taken at its most naive and direct interpretation)
Also, "not a number" isn't a mathematical construct; it's a value computer software returns to handle edge cases in a predictable fashion.
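To make that concrete, here's a minimal Python sketch. It assumes numpy (my choice, not mentioned above) because numpy follows the IEEE 754 conventions directly, whereas plain Python floats just raise ZeroDivisionError:

```python
import numpy as np

# IEEE 754 floats return special sentinel values for division by zero
# instead of raising an error the way plain Python floats do.
with np.errstate(divide="ignore", invalid="ignore"):
    print(np.divide(1.0, 0.0))   # inf   (nonzero / zero)
    print(np.divide(-1.0, 0.0))  # -inf
    print(np.divide(0.0, 0.0))   # nan   (zero / zero: "not a number")

# NaN isn't even equal to itself, which is the giveaway that it's a
# sentinel for "no meaningful result" rather than an actual number.
print(float("nan") == float("nan"))  # False
```

So even in software the cases are distinguished: nonzero/0 gives a signed infinity, and only 0/0 is literally NaN.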
I now vote for defederation due to this pedantic response. Division by zero is undefined under the rational number set, which is what pretty much anyone on earth will think of. It does not reach the same value as you take the limit from either side. It's not a convention, it is an axiom. Math is not arbitrary; it must all be proven. n^0 = 1 is not a convention, it is proven in many different ways.
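For reference, the "different value from either side" point is just the standard one-sided limit computation:

$$\lim_{x \to 0^+} \frac{1}{x} = +\infty, \qquad \lim_{x \to 0^-} \frac{1}{x} = -\infty,$$

so no single real value can consistently be assigned to 1/0.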
I guess that's fair, I'm being a bit sloppy. We prove things inside the axioms we accept, and we can develop systems consistent with different sets of axioms, but there's not necessarily any reason to choose one set over another. Doesn't 0^0 come from Euler being like "shut up, it works nicer this way" though? Or was it Russell?
We can't prove our axioms, and the rational number set isn't more true than anything else; it just tends to be more useful in normal-arse problems.
I can think of one reason to choose the set of axioms we all learned in grade school:
And anyone who's cared for a baby would tell you that lullabies are the most useful sort of music but they're hardly what I want to talk about when music comes up :P
Does everyone have to provide a disclaimer on every comment they ever make regarding math? (Note: This comment refers only to the system of mathematics every single person reading this comment is familiar with. If you make up different rules then those rules will apply instead of the ones I'm talking about.)
I didn't intend any hostility; the world is just nuanced and really fun. I often see rules of thumb asserted as factual statements without any hint that further complexity exists, and it makes me sad, because people read that and think the world is simple and makes sense.
It's much more true to say something like "usually we can't divide by zero", which leaves room for someone curious to go "huh!" and scurry off on their own to learn something fascinating.
I love that this somehow spawned a struggle session about math which is oddly enough totally on brand for both of our instances.
division is not defined for 0. it can yield multiple values, any value at all, explode to infinity, etc., but even that statement depends on taking a limit because you can't actually divide by zero. you break basic algebraic laws if you try to include it. it's such an essential fact of algebra that you only name an element of a ring 0 if it's the additive identity and always multiplies to zero. when you extend such a set to a field, you define division as multiplication by the multiplicative inverse for everything except the additive identity, because for the additive identity such an operation is never well-defined.
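A minimal sketch of the standard one-line argument behind that, using nothing beyond the ring axioms already mentioned: in any ring the additive identity multiplies everything to 0, so once 1 ≠ 0 it can never have a multiplicative inverse:

$$0 \cdot z = (0 + 0) \cdot z = 0 \cdot z + 0 \cdot z \;\Rightarrow\; 0 \cdot z = 0 \quad \text{for every } z,$$

$$\text{so } 0 \cdot z = 1 \text{ would force } 1 = 0.$$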
your Wikipedia link is discussing limits. limits are only well-defined when you can prove that every step of the limiting process is well-defined and the overall sequence converges. if I just write 5/0, there's no sequence - you can't say the limit diverges to infinity or resolves to a specific number because there is no limiting sequence to begin with. you need a function like sin(x)/x to produce a limit, so that you know for certain the limit in this very particular case is 1 even though plugging in x = 0 naively gives 0/0 (ie the discontinuity at 0 is removable).
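Spelled out for that example (the standard derivation via the Taylor series of sine, nothing specific to this thread):

$$\frac{\sin x}{x} = \frac{x - \tfrac{x^3}{6} + \tfrac{x^5}{120} - \cdots}{x} = 1 - \frac{x^2}{6} + \frac{x^4}{120} - \cdots \;\longrightarrow\; 1 \quad (x \to 0),$$

which is why the discontinuity at 0 is removable even though the naive substitution gives the meaningless 0/0.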
if you're interested in this, you're looking for ring theory. a lot of textbooks will give you the basis to prove that division by zero produces inconsistent results for any field - this is why the field axioms only require multiplicative inverses for the nonzero elements.
I wasn't talking about limits. Read the stuff about extended real lines etc. There are some (2 or 3? idk not many) systems where we do define 1/0 as infinity.
I do think you're right in the framework of ring theory though, but I haven't done much of that. It's a framework for analysing a lot of algebra and the maths we usually do, but I don't think it's the universal truth of all possible mathematics. Am I wrong there?
sure, you can extend the real line but you're basically defining a new value that behaves a lot like NaN in software. you have to be very careful about the operations because of the strange properties of the new terms introduced. what you get out loses a lot of the basic properties you're used to with arithmetic - ie afaik it's neither a field nor a ring. it's also a really misleading statement to say it's dividing by zero - you're changing what division by zero even means. and the infinity you get back isn't infinity in the usual sense of the supremum of the natural numbers; rather, it's a symbolic infinity.
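For a rough picture of how careful you have to be, these are the usual conventions on the projectively extended real line (the reals plus a single point ∞); this is only a summary of the standard definitions, not a full treatment:

$$\frac{a}{0} = \infty \ \ (a \neq 0), \qquad \frac{a}{\infty} = 0, \qquad a + \infty = \infty \ \ (a \neq \infty),$$

$$\text{while } \frac{0}{0}, \quad \frac{\infty}{\infty}, \quad 0 \cdot \infty, \quad \infty + \infty \text{ stay undefined,}$$

which is why the result isn't a field or even a ring: the operations aren't total anymore.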
basically, it's as much division by zero as -1/12 is the sum of the natural numbers. they're both true in a very particular sense but only after you change the meaning of all the words in the statement.
said another way, you can make anything you like true by introducing new axioms, but those axioms have deep impacts on what's true in the new system they generate, and it's misleading to say that the ability to introduce those axioms makes an undefined operation sensible in a system that lacks them.
Ok, I suppose that's fair. Most people are probably taking "division" in the sense it has in, what is it even called, default maths? real analysis or whatever, and while people call the operation division in other systems, it's sort of a homophone for a rather different thing that shares some characteristics.
People should still learn about the Riemann sphere though, and division by zero is a sensible operation there. We do it all the time in quantum shit mwahahahaha.
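A quick sketch of what "sensible operation" means there: on the Riemann sphere $\mathbb{C} \cup \{\infty\}$ the conventions are $z/0 = \infty$ for $z \neq 0$ and $z/\infty = 0$, and Möbius transformations treat $\infty$ like any other point (standard material, e.g.):

$$f(z) = \frac{az + b}{cz + d}, \quad ad - bc \neq 0, \qquad f\!\left(-\tfrac{d}{c}\right) = \infty, \qquad f(\infty) = \frac{a}{c} \quad (c \neq 0),$$

while $0/0$, $\infty/\infty$, $0 \cdot \infty$ and $\infty \pm \infty$ are still left undefined.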
it's a version of field theory where the rules aren't all properly explained? I wish we just taught groups, rings, and fields as soon as modular arithmetic gets introduced. it's not really that complicated and it makes sense if you have matrices, integer rings, Z, R, and Q available as examples. we just leave things poorly explained by not teaching the axioms.
Riemann spheres are awesome, I just want to be careful with my language in a space where people don't even know what a field is, generally. but god I love math. I really want to go back to grad school and finish a phd - I've been settling for teaching myself with books and free online lectures.
I know that, I was using it in the literal sense. x / 0 = not a number. It isn't a number. It is not something defined mathematically. I could just as easily have said undefined, N/A, or anything else.
you cannot take an object and divide it into 0 pieces. The expression is nonsensical to begin with.
Have you considered Wikipedia tho
Maybe learn some algebra before you start being pedantic. You can't divide by zero, and any notation that does is shorthand for something else. There is no way to define 6/0 in the same way as 6/3, and the fact that you need a whole-ass metric space so you can have convergent sequences shows that.
And n^0 is not just arbitrary; the definition of an empty sequence of an operation is that it yields the neutral element. The empty sum yields 0, the empty product yields 1, and n^0, in the most naive and direct interpretation, is n multiplied by itself no times, hence an empty product, and therefore yields 1 (there is a caveat for 0^0 but that's like a whole lecture). Sure, that is arbitrary in the sense that any axiomatic system is arbitrary, but not in the /0 sense, where it keeps changing based on context and no self-consistent axiomatic system is possible where /0 is assigned a value.
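The empty-product convention, spelled out in symbols (standard notation, nothing beyond what's already said above):

$$n^0 = \prod_{k=1}^{0} n = 1, \qquad \text{consistent with } n^{a+b} = n^a n^b \;\Rightarrow\; n^1 = n^{1+0} = n^1 \cdot n^0 \;\Rightarrow\; n^0 = 1 \ \ (n \neq 0).$$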
Not entirely correct
https://en.wikipedia.org/wiki/Projectively_extended_real_line
https://en.wikipedia.org/wiki/Riemann_sphere
a zero is gen X slang for someone totally untubular.
Which for you gen zers out there means one who is totally rizzless