519 Comments

Candidly, I find the conflation of the FTX bankruptcy and EA’s general credibility to be a big stretch - kind of a square peg / round hole situation, where people keep trying to put a scandal somewhere where it doesn’t belong. Were SBF pulling the strings at any of these charities (which seems plausible only at most as to never-got-off-the-ground GAP) that would be one thing, but in general we don’t hold the credibility of charitable institutions hostage to their donors’ moral virtue — how many dyed-in-the-wool liberals see productions at Lincoln Center in buildings named for the Koch brothers?

The fact that someone who favored a lot of the same good causes that a lot of rationalist types do turned out to be engaging in suspect transactions in a space whose only demonstrable value is in facilitating illicit transactions (crypto) just isn’t that intrinsically interesting (Hitler was a vegetarian and loved dogs etc. etc.) because there’s no evidence that this was the result of the dispassionate application of utilitarian values as opposed to just, like, one guy being kind of scummy.

The stuff about risk aversion is a good point but the EA framing is masking it: while Kelsey Piper’s recent piece discussing SBF makes clear he’s full of it on a lot of public pronouncements in any event, what he’s reciting is just a bog-standard argument that e.g. Daniel Kahneman makes that people don’t employ proper EV analysis enough and so the appropriate corrective is to be more rational about it. The very real problems Matt points to seem less a function of diminishing marginal utility (there’s probably a lot more than $15 billion of need in the world) and more about how reliance interests actually break nominal parity of outcomes because telling people you’ll give them money creates downstream effects that are now at risk beyond just the upstream sum of money possessed by the donor (see Scott Alexander’s discussion thereof).

Ultimately this isn’t that much of a story for EA. The takeaway for normies is “libertarian SF tech bro nerd type makes bad,” which is not only not that interesting on its own, but even less so as a minor variation on “libertarian New York finance type makes bad,” which is dog-bites-man banal.

Sorry for the long post but I just can’t help but view the tie-in between SBF and EA credibility as this bizarre attempt at making fetch happen.

The rationalist community replicates a lot of these behaviours, which is why many of us are sceptical of the whole endeavour. And 'fetch' is already happening. I'm more worried about the opposite – that because SBF was basically a criminal and a fraud, people will say 'oh, the rest of the rationalist/EA community isn't like this,' when for the most part the problems caused by weird ingroup subcultures and massive overestimation of their own rationality are still there. EA is mostly an excuse for people to rationalize doing whatever they want and still feel (or look) like they're doing good, whether that is 'stealing billions' or 'obsessing over AI'.

The behavior best associated with effective altruism is giving a large fraction of your income to charity. I don't think that can be fairly characterized as rationalizing doing whatever you want.

(It is true that in the last year it's become fashionable for a certain subset of Bay Area person to self-identify as an effective altruist without actually doing anything remotely altruistic. As far as I can tell, this was just about accessing Sam's money. I'm hopeful that will self-correct now that the money isn't there.)

Isn't there also an element of justifying working in extremely high-paying professions as a way to maximize your income and therefore maximize your donating ability? That's where things get dicey, to me.

What seems dicey about that? The justification requires you to then give away the money!

Because every layer of abstraction is an opportunity for your human flaws to get through the rationality. 'Ok I'll spend the next ten years screwing poor people for their money to become a billionaire. Ok now I have to give it away...hmm, AI seems cool. AI risk must be the most important thing! And I mean I don't have to give it _all_ away...what a coincidence that my happiness and the world's wellbeing are maximised the same way, how lucky am I. And really, anything I give makes the world better! So maybe I'll just give 10%...'

What's missing is humility and a recognition of one's humanity. EA is just the prosperity gospel for tech libertarians.

Because it does not force you to fully account for the potential harm caused by your business/professional choices. Like, I dunno, losing $16B of investor cash.

Edit to add: I am aware that some in EA do make efforts to address this: https://80000hours.org/articles/harmful-career/#case-study-earning-to-give-in-finance I just think when push comes to shove, people's tendency to justify doing something that feels good (becoming rich and influential with the donations you can make) can fuel a biased assessment of the harm you might be causing in your job.

"This negative impact is large enough to make working in finance in a way that increases the chance of a financial crisis arguably ethically wrong if you don’t donate to effective charities. But if you donate 25% of your income to Against Malaria Foundation, the benefits are over 50 times larger than the harms." -> This doesn't age well.

“…it does not force you to fully account for the potential harm caused by your business/professional choices”

How could it? Such questions are impossibly complicated.

Wasn’t anyone who invested in crypto basically asking to be defrauded?

What is dicey about that? Working in a high paying, private, profession is simply better than working in a low-paying, or a public, profession — it does more good that way!

I can think of lots of high-paying, private professions that are ethically dicier than some low-paying, public professions. Not all, but surely lots.

Comment removed (Nov 18, 2022)

Correct, there is nothing inherently wrong with working in high-paying professions. However, some (not all!) high-paying professions involve ethically dubious behavior that is tempting to justify because of all the $. Some low-paying professions are ethically dubious too, to be sure.

What are these high-paying professions involving ethically dubious behavior?

I could see an argument that many high-paying professions have government interventions which bar competitors and increase their returns; lawyers, for example, and people in the medical profession, are beneficiaries of unjust laws, and of course, there are government positions themselves. Admittedly, the largest class of parasites, the teachers, are apparently not paid well. I would suggest that ethical dubiousness is not particularly correlated with income.

Replicates which behaviors? The behaviors of scamming people? Or something else?

The articles making this connection do seem a bit in the vein of "I already don't like these rationalist clowns or their puffed-up EA sanctimony so when someone adjacent to a cause they support goes down, I readily associate that person's failures with the people I already don't like."

A lot of people who, for self-interested reasons, don’t like the idea of charities needing to actually be useful above replacement are very happy about this news, and that’s really unfortunate.

Comment removed (Nov 17, 2022, edited)

You have seen those people, you just listed three groups of people full of them.

Yeah, my least favorite thing from the whole SBF saga has been, "SBF shows that I was right all along to hate something that is tenuously connected to SBF and not at all connected to his wrongdoing."

It appears that EA was the motive for one of the largest financial frauds in history. "Not at all connected to wrongdoing" is a stretch.

A lot of people steal money to benefit their families--does that mean “you should care for your family” is connected to wrongdoing?

If there was a whole social/political movement based on the "earn to care (for your family)" philosophy where a prominent advocate was suddenly discovered to be stealing in the name of earning, well, then, claiming to "care about your family" as justification for aggressive earning would indeed start to look suspicious.

I guess what I was saying is that "earn to care for your family" *is* the dominant philosophy; it's the default. We're willing to do a lot of shady stuff to help out our families (see also the college cheating scandal), and yes, we should all keep in mind—individually and as a society—that we may feel the impulse to cheat and steal on behalf of our families and that we should resist (and societally disincentivize) that impulse. But that doesn't mean that the phrase "family comes first" or the act of setting up a college fund should be viewed with suspicion. Similarly, the idea "we should all give more money to charity, and to the causes where it can do the most good" shouldn't now be tainted by someone who committed crimes on behalf (let's stipulate) of that principle.

I don't think that we have any evidence at all to support the idea that if two years ago, SBF got disenchanted with EA, and decided to just keep his money for himself, the outcome for FTX would have been one iota different.

It's not that FTX would have been better run (in fact it's not clear SBF ever actually believed in EA), but many of the EAs gave SBF cover for his actions. It's a similar situation with the Sacklers and museums. If EA had come out and said spending money on the Super Bowl ad to encourage people to use FTX was bad, I might give them a pass, but very few questioned it. I also think that the crash has exposed that a lot of EA organizations were poorly run, with bad risk controls.

There is definitely a lot of score settling going on here among journalists. Guys like Jeet Heer, who never seemed to be particularly interested in advocating against crypto or EA, are suddenly on a week-long tweetstorm about how damning this is to journalists that were sympathetic to EA, and Matt specifically.

The whole insistence that one's particular enemies within the commentariat were not just gullible, but responsible, is a bit sick. Never let a good tragedy go to waste, I guess.

For what it's worth SBF seems to have been a very good cheat.

Jeet Heer has less integrity than anyone else I can think of in left of center Twitter.

I don’t think it discredits your framework. I understand defensiveness over something you value that seems to be under attack. But this comment and approving replies read to me like people who could also stand to do a bit of introspection. Lots of reads on the motivations of others--not much peering in.

I don't know, the EA community seems to do a lot of introspection. Almost to a fault. See for example https://forum.effectivealtruism.org/posts/8hvmvrgcxJJ2pYR4X/announcing-a-contest-ea-criticism-and-red-teaming.

But what is the lesson that the EA community should learn from this whole debacle, exactly? Be wary of accepting large contributions from donors whose ongoing funding is uncertain? Sure, I guess. But prior to two weeks ago FTX was a company worth $32B that had all the world's largest, most sophisticated investors lining up to give them money. And the founder of said company, who was worth $16B on paper, had an apparently genuine lifelong commitment to the EA philosophy, so it seemed unlikely he would just decide one day to stop honoring his funding commitments. We can look at it ex post and see it turned out quite badly, but they would have been crazy to turn down the money ex ante. What would have been the justification?

As I said above, I think SBF is an example of the dangers of buying too heavily into utilitarianism and completely discarding common sense morality. I also think a lot of the mismanagement of FTX was caused in part by traits of SBF I see in the EA/rationalist community as well: overconfidence in smart young people over expertise and experience, and a heavy focus on theory to the detriment of practical considerations.

Pay more attention to engineers, accountants, lawyers and doctors and pay less attention to philosophers and mathematicians.

I do think the community (or parts of it) do good introspection. Just not this thread right here. And I don’t think your proposed lesson is close to the mark either.

Why? What lesson should they draw?

The clearest takeaway is the huge risk of overfitting to one's own desires in anything 'long term'.

But personally what jumps out to me is the laziness of SBF and the laziness of a philosophy that minimizes time spent doing hard things in favor of cash donations. I would like some balance.

I think it's a fair criticism of long-termism in general that it runs a risk of being a vehicle to rationalize bad behavior, but I honestly don't see what the connection is to the FTX collapse. In terms of charitable giving, his "longer termist" giving was mostly related to pandemic preparedness which seems like a pretty good thing to spend money on (and honestly, doesn't really require any special philosophy to justify).

I think the "earn to give" thing is pretty mischaracterized in the discourse. It's a heuristic on the best way to do good at the current margin, not some totalizing rule that applies under any circumstance. Like if nobody was actually running charities or NGOs then at that margin it would be good for some people to actually start them rather than earning money in finance. For people to donate to good causes there obviously need to be good causes to donate to.

But the situation we are in now is that we actually have many well-run charities that need money more than manpower to scale up. If you have some great idea for a charity that nobody else is doing (or some great idea to do some existing charity work in a new, better way) then it may make sense to do that. But if you don't have any such ideas and all you care about is "doing good" in some broad sense, then you can do more good by earning money and giving it to people who already have established organizations that can work effectively.

In general, it seems like rationalists often fail to consider that perhaps they are collectively using tools like groupthink, extended forecast time horizons, and arbitrarily assigned probabilities for things we largely cannot even define in order to give more salience to the issues they have the most self-interest in thinking about or working on, while labeling it all "altruism"...it seems like an odd discontinuity that people in the rationalist community would see themselves as free-willed agents who can operate against their own self-interest again and again, instead of being willing to reckon with the self-interests that may push them to do charitable things or to focus intently on tail-risk outcomes. I think choosing the label "altruism" for the broader movement's activities was a bit of a branding mistake (even if there is some sense that the pleasant connotation made people MORE likely to join the ranks and give), because it largely blinds the movement to all sorts of pitfalls that they (as rationalists, many of whom probably do not even believe in things like "free will") might otherwise be more open to seeing.

When your focus is on bed nets, and saving lives near term in Africa, or providing glaucoma surgeries...it's MUCH harder to do this sort of thing efficiently given how concrete the measuring tools are for efficacy. But for much of Longtermism...it all goes out the window.

Sure, and I am skeptical of long termism as well for roughly the same reasons. I just think that in the discourse people overemphasize the importance of the "wacky stuff" in EA because a lot of the online discussion is around that stuff. But most of EA activity and money is spent on global health and other things that are pretty uncontroversial. Even in the long termist bucket, most of the activity and money is on things that don't really require a special philosophy to think are important. Pandemic preparedness/prevention is important whether you buy into any of the philosophical arguments about long termism.

Basically all of the points you (and many others make) against EA and long termism apply equally well to anyone working on issues related to climate change. There is a lot of uncertainty and we are trying to avoid problems that are decades in the future. But we know enough to think, quite reasonably, that there is a high probability that climate change will have bad consequences even if we can't say precisely what they are or how bad. And as such, it makes sense to spend some resources now to try and prevent them.

I probably count as an EA sympathizer but I promise you I don’t have nearly enough of my ego at stake here to be defensive about it, I just think the takes are off base in a descriptive sense. It’s not that I find the SBF/EA conflation *offensive,* it’s that I find it *boring,* because it feels like everyone’s reaching to find a story here on the assumption that there must be one because the collapse was so vast, when basically it is what it says on the tin: banking industry regulation exists for a reason, crypto only has evil applications, sometimes people do bad things, and pre-20th Century style financial collapses are easy to recapitulate in a similar environmental context. There’s no there there, and it’s tedious to see people reaching so hard to make one.

The take I think has some pull for me is that this shows the dangers of being a pure utilitarian: it leads to insufficient risk aversion and an “ends justify the means” mindset (see SBF’s Vox interview with Kelsey Piper). If SBF had incorporated common sense morality into his worldview a bit more, it’s plausible he wouldn’t have been so reckless. And this should make us worry about the decision making and ethics of other powerful and/or wealthy EAs.

This isn’t an indictment of all EA, but I think it should make us more wary of its more extreme manifestations, and I think it demonstrates that EA culture (of which SBF was a part, and by which he was heavily influenced) needs to do more to emphasize the importance of common sense morality. Some already do, but I think there needs to be more of it.

I also think SBF shared many of the same flaws I see in EA/rationalists (as someone who has interacted with that world for years): overconfidence in the judgment of smart young people relative to expertise and experience, an excessive focus on theory relative to practice, and poor social and professional boundaries, to name a few. As more comes out about FTX’s dysfunction, I’m seeing these vices pop up again and again in the leadership. So on this too I hope EAs/rationalists take this as an example of potential flaws in the culture and some ideas of these movements.

I think with the trusting young people vs experts thing, the life cycle of the tech sector is kind of instructive here... the dotcom bust meant that a lot of people who would have been mid-career in the 2010s didn't go into tech. So you basically ended up in a situation where the young people had the best ideas.

On top of this, Facebook had a wildly successful IPO and amazing run up until ~2014 with an omnipotent wunderkind founder. Finally, the 2012-2019 period was characterized by super-cheap money, so VCs were basically giving young founders whatever they wanted, you can see this with like Snap and other stuff, when probably the google "adult supervision" model would have been better.

I just think the elements here are so glaring: the focus on ‘maximization’, and people with an intuitive talent for judging what output an algorithm will have deciding that morality can be determined algorithmically, then fine-tuning the algo to produce the results they themselves find easiest to live out moment by moment. You think that’s a tedious reach, I guess. I’ll take you at your word. I do not.

Put it this way: I think SBF did what he did because he was trying to make his numbers go up, because that is what approximately 100% of similarly situated people have done in the past, not because he felt compelled to save more orphans or buy more bed nets.

Yeah, that may be. Personally I think he was interested in EA and had a conscience (perhaps one more EA-inclined from his childhood and personality than most) before he became a scammer. And I think the philosophy really can and should point him away from the path he walked, and I would like it to think a bit more about personal growth and inventory.

I hate to be the guy who says “but you screw one lousy sheep…” but shouldn’t we have, well, a SECOND case of malfeasance by someone connected to EA before we start worrying it leads people astray?

He explicitly said in interviews that he didn’t believe in traditional risk management and that he’d take high-risk, high-reward bets essentially forever because of the expected value. And he explicitly said it was his utilitarian views that led him to that conclusion.
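
The problem with "high-risk, high-reward bets essentially forever" is easy to show with a toy simulation (my own illustration, with made-up numbers; obviously nothing SBF actually computed). A double-or-nothing bet won 60% of the time has positive expected value on every round, and expected wealth explodes if you always stake everything. Yet almost every individual path ends at zero:

```python
import random

def all_in_survival(p_win=0.6, n_bets=50, n_trials=100_000, seed=0):
    """Fraction of bettors still solvent after going all-in n_bets times
    on a double-or-nothing bet won with probability p_win."""
    rng = random.Random(seed)
    survivors = 0
    for _ in range(n_trials):
        for _ in range(n_bets):
            if rng.random() >= p_win:  # a single loss wipes out the whole stake
                break
        else:
            survivors += 1  # won every single bet
    return survivors / n_trials

# Expected wealth after 50 all-in bets is 1.2**50 (about 9,100x the stake),
# but the probability of actually holding it is 0.6**50 (about 8e-12).
```

The gap between the exploding mean and the vanishing survival probability is the whole objection to maximizing raw expected value forever.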

If the problem is “explicitly trying to maximize often leads to failure” then that sounds like something that the maximizer will care about. It’s often clear that you do best at something not by trying to do that thing but by trying to do something adjacent (and the fact that so many of the people are into meditative practices indicates they should be aware of that).

I really have a lot of admiration for some of what I’ve seen in the community, and many seem very thoughtful to me and seem so in this particular case as well. But honestly, I’ve been pretty disappointed by what I’ve seen around ‘longtermism’. I found this post by Matt disappointing as well, I didn’t expect him to endorse virtue but I expected him to talk more about humility.

I do think the maximizers would be best served as you say. But I no longer think that’s what I am seeing as a matter of course. It is easy to lose one’s way in this hall of mirrors.

Will MacAskill (an EA leader, co-founder of Giving What We Can, etc.) had a close relationship with SBF, was on the board of the FTX Future Fund, introduced him to Elon Musk for business deals, etc. 80k Hours (a major EA communications effort, also founded by MacAskill) interviewed him on a podcast (where they spread the message that he lived frugally even though he in fact owned a private jet and hundreds of millions in Bahamian luxury real estate) and continually promoted him as a role model.

EA as a movement was really heavily tied up with this guy, it was not at all just arms length donation of money.

So, firstly, this is highly relevant to the arm's length condition precedent, so thanks for the info - I'm in the position of knowing about EA in general, (and to a lesser extent 80k hours) without actually having any real knowledge of MacAskill as a person, and (I suspect like most of the world) having never heard of SBF until shortly before his spectacular downfall. I mostly think of EA in terms of the abstract implications of effectiveness- and cost-weighted giving, the relative advantage of earning to give, and then basically doing whatever GiveWell says one should (hint: it'll probably involve malaria!)

I think this makes a take to the effect of "seriously, EA as a general philosophy existed before SBF and it will exist after, the fact that this dude committed financial fraud is not relevant to bed nets except as a source of funding or to those charities he committed to funding where such funding will no longer be forthcoming" correct, at least for those situated similarly to me, but it does sound based on this and Scott Alexander's take that the SF rationalist bubble is treating this as having broader implications due to their shared and more direct social connections. I guess my off-the-cuff response is: "EA is bigger than they are," although obviously at some level someone actually does need to run all this stuff, so the impact on that sphere is relevant.

With more confidence, I can say that from a high level this just sounds like some guy with a lot of wealth -- with, as it happened, a propensity to grossly unethical and frankly stupid behavior -- buying status in the San Francisco rationalist-sphere. That's not uncharacteristic behavior for an obscenely wealthy MIT nerd in general and not something we should be surprised that MacAskill or anyone in SBF's orbit was taken in by. The fact this particular status-buyer just happened to be a shockingly successful huckster seems to be more coincidence than a commentary on EA as a movement or even on dramatis personae affiliated therewith.

Lastly I think the whole "it was utilitarian St. Petersburging[1] that made them take these risks!" angle is a complete red herring. Companies that actually *make money* in and around SF are *already* dominated by EV considerations (this is venture capital in a nutshell - most bets lose but enough win to pay for everything, as Kenny Easwaran states below in this comments section). "Maximizing expected value" isn't some kind of utilitarian-inflected black magic, it is *literally capitalism as it is practiced.* You can donate the profits to whatever the hell you like, but the *generation* of profits was, prior to SBF, already dominated by this calculus. SBF and co. were just *bad* at it[2]. Thus, I think the utilitarian angle is a red herring and just a random buzzword for people who want a boo-light for the rat-sphere.

[1] Obviously one point of the St Petersburg paradox is that, empirically, you *don't* actually want to pay arbitrary amounts of money to participate (see also Gambler's Ruin).

[2]The point is to make *positive EV bets* and if you're good at them, you end up in the black. You don't, however, bet your entire available cash on each bet (Kelly Criterion). What actually happened here is that SBF and Alameda just did a shitty job of evaluating the EV of their own bets, dipped into some fraud to try to cover their losses, continued to lose, the end - tale as old as time.
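
Footnote [2]'s contrast between positive-EV betting and bet sizing is easy to demonstrate numerically. A rough sketch with toy numbers of my own (nothing to do with Alameda's actual books): for a bet at even odds won 60% of the time, the Kelly Criterion says to stake f* = p - (1-p)/b = 0.2 of your bankroll, which compounds steadily, while staking everything goes bust almost surely:

```python
import random

def kelly_fraction(p, b):
    """Kelly stake for a bet won with probability p at net odds b:1."""
    return p - (1 - p) / b

def median_wealth(fraction, p=0.6, b=1.0, n_bets=200, n_trials=2001, seed=1):
    """Median final bankroll (starting at 1.0) when staking a fixed
    fraction of current wealth on each bet."""
    rng = random.Random(seed)
    finals = []
    for _ in range(n_trials):
        w = 1.0
        for _ in range(n_bets):
            stake = fraction * w
            w += stake * b if rng.random() < p else -stake
        finals.append(w)
    finals.sort()
    return finals[n_trials // 2]
```

For example, `median_wealth(kelly_fraction(0.6, 1.0))` comes out well above the starting bankroll, while `median_wealth(1.0)` is zero: the typical all-in bettor is broke after the first loss, positive EV or not.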

It started before he was rich. When he was in college it was EA advice to go work at Jane Street to earn-to-give. Later he quit that job (presumably rich but not rich enough to turn any heads in Palo Alto), moved to the Bay Area, and worked at an EA nonprofit for a while.

While there, he started Alameda, hiring a bunch of EAs as early employees. There are some rumors of shady business practices at that time very similar to what's come out about poor accounting at FTX, and it's unclear whether EA leadership knew or should have known about them.

After a certain point I think your description of "rich guy buying status in the Berkeley rationalist scene" has to be pretty accurate. A lot of FTX Future Fund grants look more like the donations of a scientific patron than just neutral charitable donations to research.

I tend to agree that utilitarianism here is not really the issue, but I do think certain aspects of EA ideology played some role in helping SBF con people (like the uncritical assumption that billionaires are morally neutral).

I think the EA subgroup that takes the biggest hit is the Center for Effective Altruism/Oxford/80k Hours nexus. These were very nice, respectable organizations, heavily tied up with both the persona of SBF and his money. They just got off a hugely successful PR push for MacAskill's new book.

(The Bay Area rationalists can't really get a worse reputation among the general public, and have shown some integrity in not trying to defend the guy.)

Agree. Rationally, the personal foibles of SBF or any other individual don't tell us anything about the validity of EA. But the fact that people nonetheless keep insisting on trying to make SBF a stand-in for the EA whole might, after all, tell us something about the viability of EA as a scalable idea. People just aren't rational enough for a hyperrational set of ideas to be a viable movement as long as humans are primarily social rather than rational actors.

The reason most people think communism is a bad idea is because of all the Stalins and Pol Pots through the history of communism. It may indeed be that they were not doing communism correctly, but we rightly decide that if all these mass murderers litter the history of communism then perhaps it isn't such a good idea.

And I'm afraid that if effective altruism has all these prominent crypto fraudsters and cranks at the heart of it, then that perhaps is useful information. EA strikes me as a set of good ideas plus a bunch of crazy stuff that requires people with a certain type of personality to believe. There may be sound reasons why people find EA to be warped.

I think we should really dissect this because it's important.

So, first: "The reason most people think communism is a bad idea is because of all the Stalins and Pol Pots through the history of communism."

Is that in fact "the reason" to think that communism is bad? I don't think it is. As Allan Thoen mentions, we've got despots of all ideological stripes. But more so, we can tell a pretty straightforward story about why communism produces despots: it's because a centrally planned economy requires an incredible concentration of power. There are no incentives in a communist economy to do the right thing, only punishments for doing the wrong thing, and a single entity is supposed to set up basically everything in life.

What the Stalins and Pol Pots are, I think, is a warning flag. You say, "Gosh, it sure looks like we get a lot of particularly murderous tyrants in this system, let's dive in and see what's going on there." And, diving in, you say, "Oh, okay, I see the way that this system is mechanically bad."

How does this analogy line up to SBF and EA?

Well, I mean, let's first note that SBF is not the latest example of a long line of fraudsters who have plagued EA from the start. He's pretty much an example of one.

Second, is SBF uniquely terrible? I mean, I don't think he is? The story of what exactly happened in FTX is still developing, but my read on it so far is that he's kind of a garden variety "guy who was sure that he could turn everything to gold so all these internal safety rules don't apply to him." I feel like we see a pretty regular stream of these guys. In SBF's case, the amount of money was large -- that seems like it's basically the result of the huge amount of capital sloshing around in the last decade and especially during the pandemic.

Third, having seen some kind of warning about EA, when we dive into EA, do we see a mechanical link between EA and SBF's financial malfeasance? I mean... not really? People have tried to link the idea of utilitarianism, of like, "Oh, well maybe he cynically defrauded people in order to fund charity," but I don't think that's what actually happened. Or just the general sense of, "Maybe EA just draws people who are arrogant." Maybe there's some amount of truth to that, but is EA actually more full of arrogant people than, say, "Extremely standard politics"? I don't think it is.

But the Stalins and Pol Pots aren't the reason communism is bad (murderous tyrants come in all stripes). The actual reason communism is bad is that it's unrealistic, hyperrational utopianism. For it to function, everyone at all levels of society and the system has to behave perfectly rationally in sync, sharing the same beliefs, values and calculations about how much relative weight to give the greater good versus their own personal good. Even assuming people were all operating from the same factual information and premises, that's just too much uniformity of opinion to expect from more than a small, cohesive cadre of true believers. EA suffers from a similar problem.

This seems like a ... stretched ... analogy. I would agree that if EA people were murdering millions of people, that would make EA bad even if their intentions were noble, but that's not happening. And the "crazy ideas" (AI risk and such) are a very small part of the overall EA pie. The vast majority of EA work and spending is still on global health, development, animal welfare, and "longtermist" projects that seem pretty uncontroversial, such as pandemic prevention/preparedness.

The analogy is that communism requires slavery, and the FTX business plan (the whole plan, not just the part sold to investors) requires fraud.

So that’s an argument against crypto, maybe, but not EA. There’s no reason EA requires fraud.

I don't think the FTX business plan required fraud? It was an exchange. It made money by charging fees to connect buyers and sellers. (Alameda also didn't have an inherently fraudulent business plan. It was a prop trader). It just... didn't work, and instead of letting it go under, they committed fraud instead.

I feel like the same thing happens a hundred times a day in America, just with lower amounts of money. Someone who owns a dry cleaner takes out a loan to, whatever, renovate or buy new equipment, and thinks that the operations of the business will cover the expense, but it doesn't, so they start claiming large tax deductions for fraudulent business expenses in order to make the books and the cash flow work. That doesn't mean that the business plan of a dry cleaner is inherently dependent on fraud; it just means that this particular dry cleaner decided to commit fraud instead of letting the business fail.

SBF/FTX/Alameda are interesting in that the amount of money is billions instead of tens of thousands, but people really seem to be stretching to imagine that there is some kind of fundamentally different process going on here, instead of a completely garden-variety white collar criminal response to their business not doing as well as they had hoped.

Who’s the second most notorious (in the sense of prominent wrongdoing) EA figure, after SBF?

"...the personal foibles of SPF..."

I think SPF is one factor, but mostly he was trying to limit his own exposure in order to save his skin. Alas, he spent too much time on the beach in the Bahamas, and wound up getting fried. But then, his being fried was kind of baked in.

Perhaps he was too baked while his head was buried in the sand.

The fact that humans are primarily social rather than rational actors is why the way to grow rationality is to build rational movements rather than to convert individuals. This, I believe, is why so much money is spent on "movement building" (as mentioned in the original post), even though that sort of spending is often the most questionable from an effectiveness standpoint by any direct measure.

I bet that spending is, by far, the most fun.

That's a really interesting observation.

It's bizarre that one would see the downfall of the most prominent, apparently financially successful EA practitioner as disconnected from EA's credibility. The analogy with Lincoln Center not taking reputational damage by accepting Koch money is terrible. The Kochs didn't run their business according to arts-philanthropy principles! The Kochs didn't maximize their profits in order to give more to museums, universities and hospitals. Similarly, it's not very relevant to say that we don't blame vegetarians for Hitler's monstrosities. I don't see how you arrive at the idea that EA was peripheral to SBF and SBF was peripheral to EA. He built his life around EA, and a good chunk of EA activity has been based on his money.

There's a sort of "weak EA" that Matt Yglesias has advocated for: roughly, giving more, and giving more effectively. I'd agree that this "weak EA" shouldn't be harmed by SBF's meltdown. But the "strong EA" -- the exclusive, introspective but "not a cult" community, the longtermism, the extreme risk-neutral calculation of EV over millennia with heroic assumptions, zero discount rates, etc. -- the credibility and reputation of those ideas and their proponents has taken a hammering.

I'm not sure if I agree or not with you, but I will say that I'm fairly confident that the "weak EA" is *by far* the most common type of EA counted by number of "believers", amount of money spent, number of transactions...and this weak EA is getting painted with the same brush.

It honestly reminds me of the people who use (say) the Woody Allen assault allegations to say “you know, I never liked his movies.” It’s piggybacking on a substantive allegation with an unrelated dunk.

Is it not quite an indictment that a movement that prides itself on long-term forecasting ability failed to see the inherent danger and risk of being associated with what many people in crypto realize is basically an arbitrary speculative bubble? How is this not a giant kick-in-the-shorts indictment of what so many perceive to be longtermism's strengths (projecting existential risks beyond their own and their children's lifetimes and trying to do things now to head them off)?

https://manifold.markets/NathanpmYoung/will-ftx-go-bankrupt-before-2024

Literally, the crypto winter has been ongoing for months now. There has been a broader tech recession for a year, basically. There is a long history of confidence collapses in the crypto space taking entire companies out, including a more recent one going back to June -- specifically, reporting in June that Alameda had holding companies that were sunk by Celsius collapsing, etc. Given knowledge by SBF (and surely others in crypto) that the entire house was built of cards all along (as he basically admitted to Levine), it just seems even odder that the people who pride themselves on their forecasting... just didn't see it coming and had absolutely no contingency plan or hedge for it.

Would seem like a good time to adjust their priors and reconsider just how far out the horizon is... and how far into the future they can truly consider without just literally making it up. To just try to shift blame onto the bad actions of one misguided utilitarian who didn't get it... is really just a way of trying to wash their hands of all criticism.

This seems like a reasonable indictment of the investors and backers of FTX, but a terrible indictment of the people (viz., EA) who were in the position of just choosing to either accept free money from a guy with apparently billions of dollars and the backing of firms like Blackrock, or else just....not accept said free money?

I think it's an indictment to accept free money without planning for the money not being there. Granted, depending on how big the fraud is, you might be impacted by significant clawbacks, and I don't think that should be planned for. But if someone came to me offering money from a highly speculative source (crypto), I wouldn't be spending it as if it would continue indefinitely.

Is there an existing principle that charities shouldn’t accept money that was made from overvalued assets? Did any charities circa 2000 refuse money from tech entrepreneurs, or were they discredited if they accepted it?

I think the principle is "don't accept stolen money". The debate is whether all crypto is a ponzi, or whether it is merely overvalued like internet stocks in 2000.

Did they have any good reason to believe the money was stolen though?

>while Kelsey Piper’s recent piece discussing SBF makes clear he’s full of it on a lot of public pronouncements in any event

A bit tangential, but I don't think anyone should be taking at face value that he thought the exchange with Piper was off the record; he's too smart and manipulative and has done too much media work for that to be believable.

The first rule of crisis PR is that you try to control the narrative, and by faking a "hot mic" exchange (a vintage President Bartlet move, btw) SBF has found a way to start doing that. Though the conversation makes him look like a bit of an asshole you'll notice it's oddly exculpatory: he doesn't admit to intentional fraud, he says the whole thing was basically sloppy accounting, and he says his top priority is reimbursing depositors. He also starts throwing in some bait for right wingers with references to virtue signaling "woke" folks and some salty words for regulators, foreshadowing a repositioning to a more rightist persona (common move for a left-coded figure undergoing reputational crisis).

Anyway, of course all of this only reinforces OPs point that SBF is not on the level, but thought it was an interesting thing to bring up.

I think this gives him too much credit. "I thought the journalist was my friend" is a very old/common problem.

I mean, we can't know for sure obviously! But IMO the safest thing to do is proceed on the basis that everything this guy says is a calculated lie to maximize his own interests at the expense of others.

SBF is basically just Trump with a few aesthetic variables switched: Palo Alto instead of Queens, crummy t-shirts instead of gold toilets, billboards in SF instead of buildings in NY, fake hot-mic interviews instead of fake calls to People Magazine gossip columnists (https://www.washingtonpost.com/politics/donald-trump-alter-ego-barron/2016/05/12/02ac99ec-16fe-11e6-aa55-670cabef46e0_story.html)...

But the same low cunning, the same ease of lying, the same canny ability to play the media like a fiddle to gain wealth and power.

The idea that new, better thinking can "solve" charitable giving is suspect, just as crypto "solving" finance is a joke. Both tend to undervalue the social and institutional elements and reduce things to equations. Religious organizations, while also prone to corruption, are deeply embedded in the communities they serve and provide traditional channels for giving.

Second, the most reasonable place to give is local charities where you personally know the leadership and can see the impact. There may be more effect in giving dollars to Burundi instead of the local food bank, but only if I'm reasonably confident my dollars aren't lining several pockets along the way. This isn't easily solved with audits, which can be avoided and falsified. Local charities, by contrast, can be visited and have local reputations.

Inevitably, corrupt people take root in charitable channels, and heavy, inefficient institutions are unavoidable. The tech-nerd-savant vibe is a red flag in both crypto and EA.

The effect of this, of course, is that people in Burundi continue to suffer. As far as I know there isn't evidence of bed net money being misappropriated to a significant degree, but let's grant that some of it happens. It still seems feasible that given the desperation of the people who need the bed nets, and the tiny amount of money it takes to improve their lives, the money that does get through may still do more good than money to the local art museum does. In fact that seems almost certain to me.

It's also funny to me that the critiques of EA I've seen in these comments are "you can't say objectively what charitable money does the most good" and "it doesn't make sense to send your money to places where you can't confirm it's doing good." I think the critique can either be that objectivity in giving is impossible, or that we need more objectivity in giving, but not both.

Part of it too is that I think the person who wants to benefit a cause that doesn't directly benefit them, or that is outside their daily experience, is doing something admirable over and above the objective results. The Stoic philosopher Epictetus speaks approvingly of a woman who sent gifts to a man who had been exiled by Domitian. Someone told her, "Why bother? You know Domitian is just going to seize the gifts." She responded, "I would rather he seize them than that I should fail to send them."

(Of course, an effective altruist would say it doesn't make sense sending money that you know isn't going to get where it's supposed to go. The point I'm making is that people seem to treat the urge to help those in distant places as being in some way suspect or foolish, when I think it's anything but.)

Thanks for writing this comment so I didn’t have to. Agree completely: the bad behavior of some individuals doesn’t negate the value or merit of an idea.

I agree that a lot of the criticism is opportunistic, coming from people who didn't like the community in the first place, even though they rarely address whether, say, GiveWell itself is bad and whether people should stop giving to it on the grounds of X. OTOH, I think it is also an opportunity for people involved to reflect on and improve various aspects like culture, organizational practices, transparency, etc.

I also think it would be good for other impacted parties to reflect similarly: where financial investigative journalists failed (and CoinDesk succeeded), where regulators who interacted so closely with lobbyists failed, etc. Their failings aren't (for now) criticized as vehemently, and I can speculate why, but regardless of external criticism it is fair to think more deeply.

> Where financial investigative journalists failed (where CoinDesk succeeded)

I’m adjacent to financial journalism (I work for a small but respected outlet no one outside of our very niche corner of the industry has heard of), so my guess is that CoinDesk has been reporting on crypto for waaaaay longer, so their journos have more of a source base in the industry, while the rest of us started the catch-up process pretty recently.

Would more transparency in GiveWell or other EA organizations have changed the course of the scandal?

Just thinking aloud... It is possible they already considered many of these. I have no insight into how non-profits operate.

I think GiveWell is more involved in evaluating specific programs, and I find them to be thorough and engaging to read about. If they can somehow emphasize the trustworthiness of actors/institutions they recommend, that would be reassuring to donors.

Some EA individuals/organizations seem heavily into abstract theory and philosophy. I find them less interesting, harder to evaluate, and think their influence is over-stated. Perhaps it is more than I thought, in which case they have areas of improvement.

I wonder if there is something that can be done to minimize ripple effects from future fraud. Stop accepting donations in crypto? Insurance? Many EA institutions are relatively young, so perhaps they can look towards other well-respected institutions (Red Cross has been around for 150+ years) and see if there are lessons there.

I doubt that any of these could have changed the course of the scandal. I also think there will always be bad actors even in the future, and there is no guarantee there. I would look towards improvements in investigative reporting and regulation more on that aspect.

"making fetch happen" LOL!

And yes -- I wonder how much of this is because there is a big crypto Twitter presence. Offline, I'm not sure how many people had ever even heard of SBF prior to his being in the news for this scandal.

I wonder if Matt thinks time spent actually serving others might have an effect on a person. I know spending the morning at a soup kitchen ‘doesn’t even try to maximize blah blah’ (whatever that is supposed to mean...). But maybe setting up an annual recurring payment to the Against Malaria Foundation fails to nurture something inside the donator?

"...time spent actually serving others..."

SBF may soon enjoy the personal growth opportunities of serving time with others.

Maybe even serving soup in the cafeteria?

You know, I'm sure that setting up an annual recurring payment DOES fail to nurture something inside the donator. And indeed I think that's the point. What's the goal of charity? Is it to spiritually improve the donator? Or practically improve the life of the donatee?

I grew up in Evangelical missionary circles, and when I first learned of EA, I was amused that utilitarians had reinvented ideas common in those circles. One of the strongest messages of the NT is to give all you can, to the poorest of the poor, without regard to political boundaries (see, e.g., the parables of the Good Samaritan, the Sheep and the Goats, the Wedding Feast, Jesus's encounter with the young rich man). St. Basil was an EA in the fourth century. But the idea that charity is not also a spiritual discipline for yourself would be bizarre to them.

"Do not store up for yourselves treasures on earth, where moth and rust destroy, and where thieves break in and steal. But store up for yourselves treasures in heaven, where moth and rust do not destroy, and where thieves do not break in and steal. For where your treasure is, there your heart will be also."

I think it’s both, that’s my point
