unreasonable insurance companies killing the occasional kid
If you watch 50 shorts from a YouTuber, YouTube still won’t recommend their longer videos to you (and vice versa, I think). This level of segregation between shorts and regular videos seems terrible.
He also gave an update in comments:
Next day I was fleeing the country. The journey took me more than 50 hours. I was awake for the whole trip. But now I’m safe.
High status people often aren’t great people and often don’t have enviable lives. Read their texts and see for yourself.
In 2016, The Center for Applied Rationality (CFAR, a rationality research organization connected with Less Wrong) announced:
- They care more about “AI Safety” than about rationality itself.
- They care more about influencing high status people and politics than about helping regular people be more rational.
Here’s a short explanation of our new mission:
- We care a lot about AI Safety efforts in particular, and about otherwise increasing the odds that humanity reaches the stars.
Would you rather: (a) have bad rationality skills yourself; or (b) be killed by a scientist or policy-maker who also had bad rationality skills?
I originally thought that the quote meant something like:
Would you rather: have bad rationality and be alive because important people have good rationality, OR: good rationality and be dead cuz the important people have bad rationality?
But re-reading the sentence, I noticed the “also”. So instead of choosing between you or the important people having good rationality, you’re actually only choosing whether the important people have good rationality or not; in both scenarios, you don’t.
(I think the title is a typo and they meant 2h 53m.)