Discussion about this post

Ethics Gradient

Candidly, I find the conflation of the FTX bankruptcy with EA's general credibility to be a big stretch: a square-peg-in-a-round-hole situation where people keep trying to put a scandal somewhere it doesn't belong. Were SBF pulling the strings at any of these charities (which seems plausible, at most, only for the never-got-off-the-ground GAP), that would be one thing, but in general we don't hold the credibility of charitable institutions hostage to their donors' moral virtue. How many dyed-in-the-wool liberals see productions at Lincoln Center in buildings named for the Koch brothers?

The fact that someone who favored a lot of the same good causes that a lot of rationalist types do turned out to be engaging in suspect transactions in a space whose only demonstrable value is facilitating illicit transactions (crypto) just isn't that intrinsically interesting (Hitler was a vegetarian and loved dogs, etc.), because there's no evidence this was the result of the dispassionate application of utilitarian values as opposed to just, like, one guy being kind of scummy.

The stuff about risk aversion is a good point, but the EA framing is masking it. While Kelsey Piper's recent piece discussing SBF makes clear he's full of it on a lot of his public pronouncements anyway, what he's reciting is just the bog-standard argument that, e.g., Daniel Kahneman makes: people don't employ proper EV analysis enough, so the appropriate corrective is to be more rational about it. The very real problems Matt points to seem less a function of diminishing marginal utility (there's probably a lot more than $15 billion of need in the world) and more about how reliance interests break the nominal parity of outcomes: telling people you'll give them money creates downstream effects that are now at risk beyond just the upstream sum of money the donor possesses (see Scott Alexander's discussion of this).

Ultimately this isn't that much of a story for EA. The takeaway for normies is "libertarian SF tech-bro nerd type makes bad," which is not only not that interesting on its own, but even less so as a minor variation on "libertarian New York finance type makes bad," which is dog-bites-man banal.

Sorry for the long post but I just can’t help but view the tie-in between SBF and EA credibility as this bizarre attempt at making fetch happen.

Nick Y

I wonder if Matt thinks time spent actually serving others might have an effect on a person. I know spending the morning at a soup kitchen 'doesn't even try to maximize blah blah' (whatever that is supposed to mean...). But maybe setting up an annual recurring payment to the Against Malaria Foundation fails to nurture something inside the donor?

