2017-12-03

Spontaneous Trumpeting

There are some things which, after a period of contemplation, acquire a definite "I want to trumpet this from the rooftops" quality, and which I end up not saying because the world doesn't seem to be listening anyway. I also don't have enough clarity and/or writing skill to force the world to listen.

Some of these thoughts come across in comments (for example on LessWrong 2.0), where I somehow feel less like I need an explicit, defensible justification for everything I might want to say. I'm quoting two such comments below, in the vain hope that someone (anyone?) will get the message. YMMV.

Here's one on Acknowledging Rationalist Angst:

[Content warning: unpopular opinion.]

[I know that this is not the whole point of this post. I'm just responding to the part that is my personal pet peeve.]

I think it's in general harmful to make excuses for why rationalists are supposed to be weak. Because, you know, they really do wipe the floor with the competition, if they go far enough in the art. Why would you discourage yourself, and others, by saying that for this or some other reason it's reasonable to expect rationalists to be unsuited for real-world combat? No it's not! It is painful to hold oneself to high standards, because then failure feels like failure. And yet if you want to walk the path, you won't get anywhere far by going around spending effort on excuses for why these high standards don't apply to you. Even if the excuse is "everyone else around is just the same", and even when this is in fact mostly what happens. Bah! You'll still do better if you can think "I will not invent convenient excuses for failure, no matter how reasonable they sound."

And another one on Sunset at Noon (which is a very cool post, and you should totally go read it):

For you it will be a minor piece of evidence, but I hope it pushes you in the right direction on this delicate issue. This is verbatim from my personal notes, written some time ago, formulated directly from my own experience and understanding of the world, without having heard anything about "Bayesian wizardry" beyond the general background of Eliezer's sequences:

Bayesian superpower

-> the ability to intuitively get Bayesian updates on evidence very precisely right

(huge returns across the board)

(learnable though elusive)

I am personally convinced that this is a real, specific, learnable, rare, and extremely valuable mental skill, and that I have greatly improved at it over the past 2 years. I have no way of proving to anyone that this is real, and I am super vague on how one would go about teaching it, but I nevertheless want to bless anyone who declares any level of interest in acquiring it.

Upon further reflection: you learning this would be extraordinarily cool, because you might get a better shot at teaching it, even while being worse at the object-level skill than some people.
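
For anyone wondering what the update itself looks like when carried out explicitly rather than intuitively: in odds form it is simply posterior odds = prior odds × likelihood ratio. Here's a minimal sketch in Python, with every number invented purely for illustration:

```python
# Bayesian update in odds form: posterior odds = prior odds * likelihood ratio.
# All numbers below are invented purely for illustration.

prior = 0.30          # P(hypothesis) before seeing the evidence
p_e_if_true = 0.80    # P(evidence | hypothesis)
p_e_if_false = 0.20   # P(evidence | not hypothesis)

prior_odds = prior / (1 - prior)                 # 3:7, i.e. ~0.43
likelihood_ratio = p_e_if_true / p_e_if_false    # 4.0: evidence favors the hypothesis 4:1
posterior_odds = prior_odds * likelihood_ratio   # ~1.71

# Convert odds back to a probability.
posterior = posterior_odds / (1 + posterior_odds)
print(f"posterior probability: {posterior:.3f}")  # ~0.632
```

The intuitive version of the skill, as I understand it, is doing the equivalent of this arithmetic in your head, on fuzzy real-world evidence, without the numbers ever being written down.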
