Thank you for writing this much-needed piece. EA can be quick to self-flagellate under the best of circumstances. And these are not the best of circumstances.
I am not sure why, but this comment made me suddenly realize how close we used to be to doomsday (and still are)!
Edit suggestion: 'statutory' seems to be misspelled as 'statuatory'.
Great read, btw.
Upvoted both examples because I would like to see more content of this broad kind on LessWrong, even if such posts don't become the most popular ones.
The images (which make up most of the post) are missing now (2021-06-22).
https://web.archive.org/web/20200809205218/https://www.lesswrong.com/posts/A2TmYuhKJ5MbdDiwa/when-gears-go-wrong has the post with images.
I found the post well worth reading. In fact, it is a critical addition to learning about gears-level thinking, in that it helps with when and how to optimize its use.
This post was easy to read, far easier than a post like this usually manages to be. And it was definitely useful: I realized I have been messing up in some of the ways described in the post.
The ease of reading comes from the use of specific examples throughout.
I feel like this post adds a new dimension to the kind of content that LW can be a home to. I, personally, would love to see more posts like this on LW.
I expect this will be directly useful in my discussions with relatives and friends who are not the LW type.
I do have trouble getting them to 'accept' Bayesian thinking first before moving on to the topic at hand. Bypassing that step this way might give better results.
Thank you for this post!
Thank you! I needed to read this.
Also, a lot of the problems written about here are worse in communities where prosperity is recent or yet to come.