Some thoughts on the FTX collapse
Implications for EA, for me, and for linear utility theory
New episode of Bad Takes is out, about the temptation to engage in way-too-early 2024 prognostication.
Several months ago, I found myself having a few mocktails and splitting vegan snacks with Sam Bankman-Fried at a restaurant near my house.1 We touched on, among other things, his proposal to create a new publication featuring writers he liked, including me.
I declined, which obviously in retrospect was the right choice. I told him that I like my Substack just fine and make plenty of money, though he was happy to offer more. But I also told him that given the extent to which we agree on a lot of important issues, I thought it was a lot more valuable to these causes for me to maintain credibility by not accepting any of his money.
Most of the other things we talked about turned out to be duplicative of things he’s said in other on-the-record interviews that I just hadn’t read or listened to.
Until very recently, for example, I thought I had an unpublishable, off-the-record scoop about his weird idea that someone with his level of wealth should be indifferent between the status quo and a double-or-nothing bet with 50:50 odds. That I had this information made me nervous on behalf of people making plans based on his grants and his promises of money — I didn’t realize this is actually something he’s repeatedly said publicly and on the record. When FTX blew up, my initial thought was, “there goes Mr. Risk Neutral who for some reason doesn’t believe in diminishing returns, bringing to ruin a bunch of important projects.”
Bankman-Fried’s publicly stated plan was reckless. But the truth appears to be much worse than reckless, even as it’s still not fully clear exactly how much worse (did he and his circle lose the money? did they pocket it?), and for those of us who defended him against some of his critics, a reckoning is due.
SBF funded some good causes
SBF is notable in politics because he gave a lot of money. He’s notable to me in particular because he gave to causes that I think are good.
And I’d hasten to add that he gave a lot of money to causes that I believe many of the people I’ve seen sharply criticize Effective Altruism on Twitter also believe are good. He gave tens of millions of dollars to get Donald Trump out of the White House (and also quite a bit to Democrats in the 2022 cycle). In this case, the cause he favored prevailed. But it prevailed narrowly. Any political success is a highly collective endeavor and no one person is responsible for any result. But money does matter in politics, and some of these races — including very notably the 2020 electoral college — were very close. It’s plausible that without SBF’s money, Trump would still be in the White House.
That kind of spending earns you a lot of access in politics. But in some respects, it earns you less influence than you’d think.
SBF’s brother Gabe Bankman-Fried ran Guarding Against Pandemics, a group that was not very successful at getting Congress to appropriate a few tens of billions of dollars to pandemic preparedness. This was a good proposal on the merits as almost all thoughtful people agree, no matter how scornful they are about Effective Altruism or the idea of “longtermism” (an idea that I also have some serious doubts about). Eventually, GAP started focusing on the smaller idea of trying to secure a significant boost in spending for BARDA. It’s possible that Sam and Gabe were on track to have more success in the lame duck or in the next Congress, but now I fear the whole idea is going to become radioactive.
And that’s the larger disaster here.
By betraying his clients — I don’t know whether he “defrauded” or whatever else in a legal sense,2 but he certainly betrayed them — SBF is leaving many of his causes worse off than they would have been if he’d never invested in them. He’s also brought negative brand energy to the whole concept of Effective Altruism, including the people working in areas he didn't spend as much of his money on, like farm animal welfare and public health in poor countries.
One of the core tenets of EA is that results count, not intentions. Perhaps Bankman-Fried had convinced himself that whatever he was up to was for the greater good, but it just wasn’t. We are now more likely than ever to keep funding dangerous dual-use biological research and to continue under-investing in promising virus-killing technologies.
The harms done are staggeringly large, even beyond the theft and destruction of huge sums of money.
What was I thinking?
Since FTX went bankrupt, a lot of people have been passing around the Matt Levine interview in which SBF appeared to concede that crypto is mostly used for scams and Ponzi schemes. Levine recounts that this paradoxically made him think better of Bankman-Fried:
In fact, I came away from that conversation bullish on FTX and Bankman-Fried. My view was, and is, that if you talk to a crypto exchange operator and he is like “crypto is changing the world, your old-fashioned economics are just FUD, HODL,” then that’s bad. A wild-eyed crypto true believer is not the person to operate an exchange. The person you want operating an exchange is a clear-eyed trader. You want someone whose basic attitude to financial assets is, like, “if someone wants to buy and someone wants to sell, I will put them together and collect a fee.” You want someone whose perspective is driven by markets, not ideology, who cares about risk, not futurism. A certain cynicism about the products he is trading is probably healthy.
A lot of tech and VC people and a lot of crypto lobbyists in D.C. try to convince you that crypto is really good, which is very unconvincing to me. But the Levine interview made SBF sound sensible. When the two of us chatted, he didn’t try to give me a hard sell on the merits of crypto — he said it’s good business to make a good crypto exchange.
There’s an old saw to the effect that the best way to make money in a gold rush is to sell shovels. And I took that to be SBF’s strategy. It didn’t really matter whether it was a real gold rush or a fool’s gold rush; he had an opportunity to sell shovels. The fact that SBF was not a particularly strident true believer in crypto seemed not like a true confession that the whole thing was a scam, but like an appealing piece of self-awareness. His stated goal for FTX was to become a software platform for trading all kinds of securities and financial products so as to become less of a “crypto” company. That would have made perfect sense as a narrative arc — start at Jane Street doing arbitrage trades, then strike out on his own doing crypto arbitrage, then build tools for trading crypto, and then build general trading tools. The crypto itself is just a ladder that’s cast aside.
In retrospect, this sounds more like Michael Corleone promising that the family will be totally legitimate in five years. And at root, it’s a duplicitous game — you’re telling one audience that you’re not some crypto-fanatic yokel, but on the other hand you have all this marketing that fundamentally is premised on the idea that there’s gold out there in the crypto mines.
What’s EA got to do with it?
Effective Altruism existed before anyone had ever heard of Sam Bankman-Fried, and there have always been people who were deeply annoyed by it. Personally, I still find the following EA claims to be very compelling:
The typical middle-class or richer resident in the developed world is a pretty privileged person, all things considered, who could be doing tremendous good in the world by being somewhat more charitable.
An extremely large share of philanthropy goes to causes that cannot be even remotely described as maximizing benefit for humanity. If society were inclined to demand just a little bit of rigor in terms of “why are you supporting this?” we could unlock a lot of good.
Direct cash transfers to the poorest people in the world seem like a good cause with scaling potential; there also appear to be several interventions related to health and nutrition that are even more valuable at the margin than cash.
There is a lot of animal suffering in the world that could be remediated with regulations that are not too costly. There are also some promising research lines that might be able to massively reduce animal suffering if we invest in them.
Improvements in biotechnology are greatly increasing the risk of engineered pandemics, and we ought to be doing more to guard against those risks.
More speculatively, AI is progressing very rapidly, with the research mostly done by for-profit companies whose main interest is in ad sales and by Chinese firms who are trying to entrench the power of a tyrannical regime. Neither group is particularly attentive to the risks involved in this research, and we should try to promote both workable regulatory controls and more responsible research programs.
There are a number of important U.S. policy areas that are relatively neglected by mainstream U.S. political advocacy spending, including housing supply, international labor mobility, and macroeconomic stabilization.
SBF supported some but not all of those goals, which on net I thought was good. If you are tediously familiar with the details of EA institutions, I think you’ll see my list is closer to the priorities of Open Philanthropy (the Dustin Moskovitz / Cari Tuna EA funding vehicle) than to those of the FTX Future Fund. In part, that’s because as you can see in the name, SBF was very publicly affiliated with promoting the “longtermism” idea, which I find to be a little bit confused. What’s more, a good chunk of that giving went into the somewhat nebulous category of EA community building, where even separate from any allegations of scandal it’s just hard to know what’s really being accomplished.
But the fact that so much EA money was coming from a crypto guy naturally aroused suspicion that the whole thing was a front for crypto lobbying. I argued (before having met SBF) that this was a misread of the situation, given his literally lifelong involvement in utilitarianism. I did warn that SBF supporting good causes didn’t mean that we should assume his crypto agenda is benign — sincere belief that your wealth benefits humanity can be dangerous:
Of course if you’re very skeptical of cryptocurrency you might not find this reassuring — a person running a cryptocurrency exchange for the larger purpose of saving all of humanity might actually be more ruthless in his lobbying practices than someone doing it for something as banal as money.
I did not, of course, seriously consider the possibility that he would just steal his clients’ money. And given what we now know, you have to be suspicious about the downstream spending as well. All the official and unofficial EA material emphasizes the importance of integrity and does not encourage people to run scams or break faith with others. But I do think the situation poses some questions that the community as a whole will need to reckon with.
A weird risk analysis error
To end where we began, over and above any subterfuge or crimes, I’m struck that Bankman-Fried’s stated plan with FTX, Alameda, and altruism was very bad.
Here’s how it’s described by 80,000 Hours introducing their interview with him:
If you were offered a 100% chance of $1 million to keep yourself, or a 10% chance of $15 million — it makes total sense to play it safe. You’d be devastated if you lost, and barely happier if you won.
But if you were offered a 100% chance of donating $1 billion, or a 10% chance of donating $15 billion, you should just go with whatever has the highest expected value — that is, probability multiplied by the goodness of the outcome — and so swing for the fences.
This is the totally rational but rarely seen high-risk approach to philanthropy championed by today’s guest, Sam Bankman-Fried. Sam founded the cryptocurrency trading platform FTX, which has grown his wealth from around $1 million to $20,000 million.
I think if you stick with the 100 percent chance of $1 million vs. 10 percent chance of $15 million example and assume we’re talking about global health and welfare donations, the stated logic makes sense. Basically, if you had 100 EAs who all got offered this choice, you’d want all 100 to take Option B, which generates $150 million in contributions rather than Option A which generates only $100 million.
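That arithmetic is easy to check. Here's a quick sketch (the numbers are from the example above; the pooled-donors framing is mine):

```python
import random

# Sketch of the pooled-donor arithmetic: 100 donors each choose between
# a sure $1 million and a 10% shot at $15 million.
n_donors = 100
sure_thing = 1_000_000
risky_payout = 15_000_000
p_win = 0.10

ev_a = n_donors * sure_thing            # Option A: $100M guaranteed
ev_b = n_donors * p_win * risky_payout  # Option B: $150M in expectation
print(f"Option A total: ${ev_a:,.0f}")
print(f"Option B expected total: ${ev_b:,.0f}")

# With 100 independent bets, the realized total clusters near the
# expectation — which is why pooled risk-taking can make sense at this scale.
random.seed(0)
wins = sum(random.random() < p_win for _ in range(n_donors))
print(f"One simulated Option B total: ${wins * risky_payout:,.0f}")
```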
But when you push this up into the range of billions of dollars and are talking about grant-making and political influence, it doesn’t make sense. The whole calculus is based on the idea that the volume of need is so large that when it comes to helping the global poor, you don’t face diminishing marginal returns in the relevant financial range. But there clearly are diminishing returns in grant-making, community-building, and political advocacy. And the instability itself becomes costly. People don’t want to join projects that are likely to vanish without warning, and the vanishing itself leaves resources stranded and wasted. I was confused when SBF explained this to me and kind of thought it was end-of-the-bar bluster despite the lack of alcohol, but it was the actual official doctrine.
I don’t know what role excessive risk appetite played in the ultimate unraveling of his enterprise, but the fact that he articulated this line so forcefully speaks to poor judgment, and the fact that it was echoed by others as sound EA doctrine seems worse. Suppose there was no fraud and no misuse of funds but a Probability Fairy just offered him 50.00001% odds on a double or nothing bet for all his money and he lost, forcing the rapid unplanned shutdown of everything he was funding. That would be bad — much worse than the upside of doubling his money. Validating this kind of thinking, more than anything that anyone did or didn’t know about FTX, really raises doubts in my mind about the judgment of EA community leaders.
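To make the point concrete, here's a toy calculation (my framing, using standard logarithmic utility to model diminishing returns; this isn't anything SBF or 80,000 Hours endorsed): under diminishing returns, a barely-better-than-even double-or-nothing bet on the whole bankroll is a clear loser even though its expected dollar value is positive.

```python
import math

wealth = 10_000_000_000   # hypothetical bankroll
p_win = 0.5000001         # the Probability Fairy's 50.00001% odds

# Expected dollars slightly favor the bet...
ev_dollars = p_win * (2 * wealth)

# ...but with log utility, losing everything is infinitely bad, so any
# all-or-nothing bet at these odds is dominated by standing pat.
# Even betting half the bankroll loses on expected utility:
eu_hold = math.log(wealth)
eu_half = p_win * math.log(1.5 * wealth) + (1 - p_win) * math.log(0.5 * wealth)

print(f"expected dollars from the bet: {ev_dollars:,.0f}")
print(f"log utility, stand pat: {eu_hold:.4f}")
print(f"log utility, bet half:  {eu_half:.4f}")  # lower than standing pat
```

The gap between expected dollars and expected utility is the whole dispute: the "just maximize EV" doctrine only works if the marginal dollar is worth as much as the first one, which stops being true somewhere well short of betting everything.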
The bottom line for me
Nothing specific that has come out about SBF or FTX so far has shed any light on the details of the operations and activities of GAP here in D.C. But in light of what we’ve learned about misconduct at FTX, it seems really foolish to assume there’s nothing nefarious there.
I hope it will turn out that the money that flowed into Guarding Against Pandemics was all spent in reasonable ways in pursuit of a good cause, but all I can really say for sure right now is that at least some of the money was spent on good things. But the information that’s come out about FTX’s balance sheet and business practices is frankly shocking, and at this point I wouldn’t be surprised to learn new terrible things about how projects SBF funded were run.
Personally, it’s also hard not to just generally feel worse about the “EA community” as a set of social institutions distinct from the specific ideas. I always had sort of mixed feelings about this, and I gave money to GiveWell’s Top Charities Fund for years before I ever attended my first EA conference. And while I thought the conference was fine, afterward I felt more confident that I would keep donating to GiveWell than that I would ever go to another EA conference.
Knowing now what I know, I feel even more strongly about that.
If two weeks ago you found the whole scene to be obnoxious and weird and suffused with an odd mix of arrogance and credulity, recent events have tended to vindicate that. And yet it’s still true that there is a lot of suffering in the world that can be ameliorated or avoided at a relatively low cost, and it’s worth taking some time to consider what you personally can do to contribute to that. There are still lots of people working on lots of important causes, there are lots of causes that could use money, and there are even job opportunities available to help identify and evaluate scalable, cost-effective ways to help. I can only hope that one company’s catastrophic misdeeds don’t completely derail this work.
For the record, we split the check.
I’m not a lawyer and certainly not an expert in the relevant Bahamian rules or what protections a person is legally entitled to when they start dealing with offshore financial entities or any of the rest of this saga.
Candidly, I find the conflation of the FTX bankruptcy and EA’s general credibility to be a big stretch: kind of a square peg / round hole situation, where people keep trying to put a scandal somewhere it doesn’t belong. Were SBF pulling the strings at any of these charities (which seems plausible, at most, only as to the never-got-off-the-ground GAP), that would be one thing, but in general we don’t hold the credibility of charitable institutions hostage to their donors’ moral virtue — how many dyed-in-the-wool liberals see productions at Lincoln Center in buildings named for the Koch brothers?
The fact that someone who favored a lot of the same good causes that a lot of rationalist types do turned out to be engaging in suspect transactions in a space whose only demonstrable value is in facilitating illicit transactions (crypto) just isn’t that intrinsically interesting (Hitler was a vegetarian and loved dogs etc. etc.) because there’s no evidence that this was the result of the dispassionate application of utilitarian values as opposed to just, like, one guy being kind of scummy.
The stuff about risk aversion is a good point but the EA framing is masking it: while Kelsey Piper’s recent piece discussing SBF makes clear he’s full of it on a lot of public pronouncements in any event, what he’s reciting is just a bog-standard argument that e.g. Daniel Kahneman makes that people don’t employ proper EV analysis enough and so the appropriate corrective is to be more rational about it. The very real problems Matt points to seem less a function of diminishing marginal utility (there’s probably a lot more than $15 billion of need in the world) and more about how reliance interests actually break nominal parity of outcomes because telling people you’ll give them money creates downstream effects that are now at risk beyond just the upstream sum of money possessed by the donor (see Scott Alexander’s discussion thereof).
Ultimately this isn’t that much of a story for EA. The takeaway for normies is “libertarian SF tech bro nerd type makes bad,” which is not only not that interesting on its own, but even less so as a minor variation on “libertarian New York finance type makes bad,” which is dog-bites-man banal.
Sorry for the long post but I just can’t help but view the tie-in between SBF and EA credibility as this bizarre attempt at making fetch happen.
I wonder if Matt thinks time spent actually serving others might have an effect on a person. I know spending the morning at a soup kitchen ‘doesn’t even try to maximize blah blah’ (whatever that is supposed to mean...). But maybe setting up an annual recurring payment to the Against Malaria Foundation fails to nurture something inside the donor?