  • I guarantee you're misunderstanding or accidentally misrepresenting something here. The fact that there are only countably many computable numbers is a simple consequence of the fact that there are only countably many programs: the number of programs is bounded above by the number of finite sequences of letters from a finite alphabet, which is countably infinite (a quick enumeration sketch is below).

    There may be more finitistic/computable models for the real numbers or something, but "the computable real numbers" are countable.
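
    A minimal sketch of that counting step (my own illustration, not from the comment; the two-letter alphabet is just a stand-in for a real language's character set): every program is a finite string over a finite alphabet, and those strings can be listed one after another in shortlex order.

    ```python
    from itertools import count, islice, product

    ALPHABET = "ab"  # stand-in alphabet; any finite alphabet works the same way

    def all_strings(alphabet=ALPHABET):
        """Yield every finite string over `alphabet` in shortlex order."""
        for length in count(0):
            for chars in product(alphabet, repeat=length):
                yield "".join(chars)

    # Each string gets a unique position in this enumeration, which is exactly
    # what it means for the set of all programs (and hence the set of computable
    # reals) to be at most countable.
    print(list(islice(all_strings(), 10)))
    # ['', 'a', 'b', 'aa', 'ab', 'ba', 'bb', 'aaa', 'aab', 'aba']
    ```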



  • Cantor's theorem says the power set of X has a strictly larger cardinality than X.

    When |X| is a natural number, the power set of X has cardinality 2^(|X|), since you can think of an element of the power set as a choice, for each element of X, of "in the subset" vs "not in the subset." Hence the notation 2^X for the power set of X.

    Cantor's theorem applies to all sets, not just finite ones. You can show this with a simple argument. Let X be a set and suppose there is a bijection f : X -> 2^(X). Let D be the set { x in X : x is not in f(x) }. (The fact that this is well defined is given by the axiom schema of separation, i.e. restricted comprehension, in ZFC, so we aren't running into a Russell's paradox issue.) Since f is a bijection, there is an element y of X so that f(y) = D. Now either:

    • y is in D. But then by definition y is not in f(y) = D, a contradiction.

    • y is not in D. But then by definition, since y is not in f(y), y is in D.

    Thus, there cannot exist such a bijection f, and |2^(X)| != |X|. It's easy enough to show that the inequality can only go one way: the map sending x to {x} is an injection from X into 2^(X), so |X| <= |2^(X)|, and combined with the above that gives |2^(X)| > |X|. (A finite sanity check of the diagonal argument is sketched just below.)
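
    For concreteness, here is that finite sanity check (my own sketch; the interesting case is of course infinite X, where only the proof itself applies). For a small X it exhaustively confirms that no f : X -> P(X) ever has its diagonal set in its image.

    ```python
    from itertools import combinations, product

    def powerset(X):
        """All subsets of X, as frozensets."""
        X = list(X)
        return [frozenset(c) for r in range(len(X) + 1) for c in combinations(X, r)]

    def check_cantor(X):
        """Verify that no function f : X -> P(X) hits its own diagonal set."""
        X = list(X)
        subsets = powerset(X)
        # Every f : X -> P(X) is a choice of one subset per element of X.
        for image in product(subsets, repeat=len(X)):
            f = dict(zip(X, image))
            D = frozenset(x for x in X if x not in f[x])
            assert D not in image, "diagonal set was hit, contradicting Cantor"
        return True

    print(check_cantor({0, 1, 2}))  # True: no f : X -> P(X) is surjective
    ```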


  • No, that's definitely not true. As I said, infinite cardinals (like the cardinality of the naturals, ℵ₀) are defined to be equivalence classes of sets that can be placed in bijection with one another. Whenever you have infinite sets that can't be placed in bijection, they represent different cardinals. For infinite X, the set of functions f : X -> X has cardinality 2^(|X|) too, so e.g. there are more real-valued functions of real numbers than there are real numbers (a worked line for this is sketched after this comment). You can use this technique to get an infinite sequence of distinct cardinals (via Cantor's theorem, which has a simple constructive proof). And once you have all of those, you can take their (infinite) union to get yet another, greater cardinal, and continue that way. There are in fact more cardinalities obtainable this way than we could fit into a set: the (infinite) number of infinite cardinals is too big to be an infinite cardinal.

    You might be thinking of the generalized continuum hypothesis, which says that there are no cardinals strictly in between an infinite cardinal and the cardinality of its power set, i.e. that ℵ₁ = 2^(ℵ₀), ℵ₂ = 2^(ℵ₁), and so on.
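
    A worked line for the function-space example above (my own sketch, using standard cardinal exponentiation rules; not part of the original comment):

    ```latex
    % The set of functions R -> R is strictly bigger than R itself:
    % |R| = 2^{\aleph_0}, and (2^a)^b = 2^{a b} for cardinal exponents.
    \[
      |\mathbb{R}^{\mathbb{R}}|
        = \left(2^{\aleph_0}\right)^{2^{\aleph_0}}
        = 2^{\aleph_0 \cdot 2^{\aleph_0}}
        = 2^{2^{\aleph_0}}
        > 2^{\aleph_0}
        = |\mathbb{R}|,
    \]
    % where the final inequality is Cantor's theorem.
    ```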


  • > You wouldn’t even notice if some proof is wrong because it relies on an inconsistency; that’s the issue.

    You wouldn't notice because there's no realistic chance that any meaningful result in the vast majority of math depends strictly on the particular way in which ZFC is hypothetically inconsistent.

    > And that’s before you didn’t notice because no one builds anything on axioms but instead uses fragile foundations made of intuition, hand-waving, and mass psychology.

    This is a ridiculous attitude. Nobody uses the axioms of ZFC directly because that would be stupid; it's obviously sufficient to know how to do so. For the vast majority of math it makes literally no difference which particular axiomatic formalism you decide to use, because all of those results translate trivially between them.


  • Math went on because it doesn't matter. Nobody cares about incompleteness. If you can prove ZFC is inconsistent, do it and we'll all move to a new system, and most of us won't even notice (since nobody references the axioms outside of set theory and logic anyway). If you exhibit a concrete statement ZFC can't decide, nobody will care either, since the culprit will be an arcane theorem far outside the realm of non-logic fields of math.