I am absolutely astonished that anybody with the most basic understanding of probability would ever take Yud as some kind of brain genius after he shows his entire mucky arse all over just those two opening paragraphs
Btw…
I am not going to discuss the actual experiments that have been done on calibration—you can find them in my book chapter on cognitive biases and global catastrophic risk¹—because I’ve seen that when I blurt this out to people without proper preparation, they thereafter use it as a Fully General Counterargument, which somehow leaps to mind whenever they have to discount the confidence of someone whose opinion they dislike, and fails to be available when they consider their own opinions.
lol
I didn't expect that the repetition of a banal yet occasionally useful saying like "the map is not the territory" could make a person deserve being shoved into a locker, but life will surprise us all.
Mixed in with the rank, fetid ego are amusing indications that Yud gave very little thought to what Bayesian probability actually means. I find that entertaining.
"The Map is Not the Territory" (I shall henceforth refer to this as "Teenage Mutant Ninja Turtles") is especially rich given how Yudkowsky and other lesswrong types tend to be massive metaphor fetishists.
Suppose you say that you’re 99.99% confident that 2 + 2 = 4.
Then you're a dillbrain.
Then you have just asserted that you could make 10,000 independent statements, in which you repose equal confidence, and be wrong, on average, around once. Maybe for 2 + 2 = 4 this extraordinary degree of confidence would be possible
Yes, how extraordinary that I can say every day that the guy in front of me at the bodega won't win the Powerball. Or that
[makes a list that is False in all but one spot]

P(x|y) is defined as P(x,y)/P(y). P(A|A) is defined as P(A,A)/P(A) = P(A)/P(A) = 1. The ratio of these two probabilities may be 1, but I deny that there's any actual probability that's equal to 1. P(|) is a mere notational convenience, nothing more.
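To be fair to the arithmetic (if not the conclusion), the definitional step in that quote does check out. A quick sketch in Python, using exact fractions and a made-up stand-in value for P(A):

```python
from fractions import Fraction

# Pick any nonzero probability for A; 1/6 is an arbitrary stand-in
p_a = Fraction(1, 6)

# The joint event "A and A" is just A, so P(A, A) = P(A)
p_a_and_a = p_a

# Conditional probability by definition: P(A|A) = P(A, A) / P(A)
p_a_given_a = p_a_and_a / p_a

print(p_a_given_a)  # → 1
```

The ratio comes out exactly 1 for any nonzero P(A); the quote concedes the arithmetic and then denies the conclusion anyway.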
No, you kneebiter.
I roll a fair 100-sided die.
Eliezer asks me to state my confidence that I won't roll a 1.
I say I am 99% confident I won't roll a 1, using basic math.
Eliezer says "AHA, you idiot, I checked all of your past predictions and when you predicted something with confidence 99%, it only happened 90% of the time! So you can't say you're 99% confident that you won't roll a 1"
I am impressed by the ability of my past predictions to affect the roll of a die, and promptly run off to become a wizard.
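The point is easy to check empirically. A throwaway simulation (Python, fair 100-sided die, seed chosen arbitrarily for reproducibility) shows the die keeps its own counsel, whatever anyone's past calibration record looks like:

```python
import random

random.seed(0)  # arbitrary seed so the run is reproducible
trials = 100_000

# Roll a fair 100-sided die many times; count rolls that are not 1
not_one = sum(1 for _ in range(trials) if random.randint(1, 100) != 1)

print(not_one / trials)  # hovers around 0.99, past predictions notwithstanding
```

The empirical frequency lands within a fraction of a percent of 0.99, because that is simply what a fair 100-sided die does.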
Ah but the machine gods could be tinkering with your neural wiring to make you think you're rolling a die when in reality the universe is nothing but the color pink. That's right, reality is nothing but a shade of Fuchsia and dice don't actually exist. You should take this possibility into account when adjusting your priors for some reason.
Epistemic Status: Barbie.
I had 100% certainty that 53 was prime when I was 12. Does that mean I was smarter than Yud back then? (I may have become more stupid since learning about LW)
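For the record, the 12-year-old was right. A trial-division check (Python) settles it:

```python
def is_prime(n: int) -> bool:
    """Trial division up to sqrt(n) -- more than enough for 53."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

print(is_prime(53))  # → True
```

53 isn't divisible by 2, 3, 5, or 7, and 8² already exceeds it, so that's the whole search space.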