Discussion about this post

Xavier Moss:

I think with a lot of the EA movement specifically, a lot of people have trouble squaring the source of funds and their use – you can buy someone like Elon Musk or Bill Gates as a philanthropist, they made money by making things, but many people (including myself) perceive crypto billionaires as having made money off false promises to everyday people. If you see crypto as a ponzi scheme, it becomes harder to believe the people in it are genuine altruists.

Obviously the classic argument exists that you can do some evil to make money to do good of greater utility, but we know human beings are self-serving, and this is just a retread of the politicians who compromise everything to win power because 'without power we can't help anyone.' Well, those people get power and usually don't help anyone anyway. You can also argue that they genuinely think crypto is good, but I'm not sure that's an argument for the effectiveness of their altruism.

I'm a believer in the principles of effective altruism – in fact, I've structured my career to do meaningful work on an important public health problem, and taken a large pay cut to do so – and I'm funded by one of the large foundations, so I see the role rich funders play. But as often happens in the modern age, being effectively altruistic and being 'in the EA community' are not the same thing, and these subcultures form, become insular, and lose sight of their goals. Long-term utility is precisely one of those areas that allows human beings to subconsciously skew their thinking around their own selfish interests, and in fact the 'rationalist' community seems to be less about thinking through real problems than about internal status-seeking around issues like AI risk, as well as making the fundamental 'the world is just an engineering problem' error common to both crypto and online rationalism.

Frankly, this is the telling sentence: 'He briefly [worked] directly for the Centre for Effective Altruism...but while there hit upon a crypto trading arbitrage opportunity.' Kind of says it all.

Doctor Memory:

Maybe it’s “not about crypto” but if the biggest name in applied consequentialism right now is, essentially, in the business of marketing Ponzi schemes to the gullible as his day job, that seems kinda relevant to me when approaching the question of how much weight I should assign his opinions about existential risks?

361 more comments...