521 Comments
Nov 17, 2022 · edited Nov 17, 2022

Candidly, I find the conflation of the FTX bankruptcy and EA’s general credibility to be a big stretch - kind of a square peg / round hole situation, where people keep trying to put a scandal somewhere where it doesn’t belong. Were SBF pulling the strings at any of these charities (which seems plausible only at most as to never-got-off-the-ground GAP) that would be one thing, but in general we don’t hold the credibility of charitable institutions hostage to their donors’ moral virtue — how many dyed-in-the-wool liberals see productions at Lincoln Center in buildings named for the Koch brothers?

The fact that someone who favored a lot of the same good causes that a lot of rationalist types do turned out to be engaging in suspect transactions in a space whose only demonstrable value is in facilitating illicit transactions (crypto) just isn’t that intrinsically interesting (Hitler was a vegetarian and loved dogs etc. etc.) because there’s no evidence that this was the result of the dispassionate application of utilitarian values as opposed to just, like, one guy being kind of scummy.

The stuff about risk aversion is a good point but the EA framing is masking it: while Kelsey Piper’s recent piece discussing SBF makes clear he’s full of it on a lot of public pronouncements in any event, what he’s reciting is just a bog-standard argument that e.g. Daniel Kahneman makes that people don’t employ proper EV analysis enough and so the appropriate corrective is to be more rational about it. The very real problems Matt points to seem less a function of diminishing marginal utility (there’s probably a lot more than $15 billion of need in the world) and more about how reliance interests actually break nominal parity of outcomes because telling people you’ll give them money creates downstream effects that are now at risk beyond just the upstream sum of money possessed by the donor (see Scott Alexander’s discussion thereof).

Ultimately this isn’t that much of a story for EA. The takeaway for normies is “libertarian SF tech bro nerd type makes bad,” which is not only not that interesting on its own, but even less so as a minor variation on “libertarian New York finance type makes bad,” which is dog-bites-man banal.

Sorry for the long post but I just can’t help but view the tie-in between SBF and EA credibility as this bizarre attempt at making fetch happen.

Nov 17, 2022 · edited Nov 17, 2022

The rationalist community replicates a lot of these behaviours, which is why many of us are sceptical of the whole endeavour. And 'fetch' is already happening. I'm more worried about the opposite – that because SBF was basically a criminal and a fraud, people will say 'oh, the rest of the rationalist/EA community isn't like this,' when for the most part the problems caused by weird ingroup subcultures and massive overestimation of their own rationality are still there. EA is mostly an excuse for people to rationalize doing whatever they want and still feel (or look) like they're doing good, whether that is 'stealing billions' or 'obsessing over AI'.


The behavior best associated with effective altruism is giving a large fraction of your income to charity. I don't think that can be fairly characterized as rationalizing doing whatever you want.

(It is true that in the last year it's become fashionable for a certain subset of Bay Area person to self-identify as an effective altruist without actually doing anything remotely altruistic. As far as I can tell, this was just about accessing Sam's money. I'm hopeful that will self-correct now that the money isn't there.)


Isn't there also an element of justifying working in extremely high-paying professions as a way to maximize your income and therefore maximize your donating ability? That's where things get dicey, to me.


What seems dicey about that? The justification requires you to then give away the money!


Because every layer of abstraction is an opportunity for your human flaws to slip past the rationality. 'Ok, I'll spend the next ten years screwing poor people for their money to become a billionaire. Ok, now I have to give it away...hmm, AI seems cool. AI risk must be the most important thing! And I mean, I don't have to give it _all_ away...what a coincidence that my happiness and the world's wellbeing are maximised the same way, how lucky am I. And really, anything I give makes the world better! So maybe I'll just give 10%...'

What's missing is humility and a recognition of one's humanity. EA is just the prosperity gospel for tech libertarians.


Because it does not force you to fully account for the potential harm caused by your business/professional choices. Like, I dunno, losing $16B of investor cash.

Edit to add: I am aware that some in EA do make efforts to address this: https://80000hours.org/articles/harmful-career/#case-study-earning-to-give-in-finance I just think when push comes to shove, people's tendency to justify doing something that feels good (becoming rich and influential with the donations you can make) can fuel a biased assessment of the harm you might be causing in your job.

"This negative impact is large enough to make working in finance in a way that increases the chance of a financial crisis arguably ethically wrong if you don’t donate to effective charities. But if you donate 25% of your income to Against Malaria Foundation, the benefits are over 50 times larger than the harms." -> This doesn't age well.


“…it does not force you to fully account for the potential harm caused by your business/professional choices”

How could it? Such questions are impossibly complicated.


Wasn’t anyone who invested in crypto basically asking to be defrauded?


What is dicey about that? Working in a high paying, private, profession is simply better than working in a low-paying, or a public, profession — it does more good that way!


I can think of lots of high-paying, private professions that are ethically dicier than some low-paying, public professions. Not all, but surely lots.

Comment removed

Correct, there is nothing inherently wrong with working in high-paying professions. However some (not all!) high-paying professions involve ethically dubious behavior that is tempting to justify because of all the $. Some low-paying professions are ethically dubious too to be sure.


What are these high-paying professions involving ethically dubious behavior?

I could see an argument that many high-paying professions have government interventions which bar competitors and increase their returns; lawyers, for example, and people in the medical profession, are beneficiaries of unjust laws, and of course, there are government positions themselves. Admittedly, the largest class of parasites, the teachers, are apparently not paid well. I would suggest that ethical dubiousness is not particularly correlated with income.


Replicates which behaviors? The behaviors of scamming people? Or something else?


The articles making this connection do seem a bit in the vein of "I already don't like these rationalist clowns or their puffed-up EA sanctimony so when someone adjacent to a cause they support goes down, I readily associate that person's failures with the people I already don't like."


A lot of people who, for self-interested reasons, don’t like the idea of charities needing to actually be useful above replacement are very happy about this news, and that’s really unfortunate.

Comment removed · Nov 17, 2022 · edited Nov 17, 2022

You have seen those people, you just listed three groups of people full of them.


4. People who think that cryptocurrency is pernicious for various reasons and are happy to see its cheerleaders and architects go down


Yeah, my least favorite thing from the whole SBF saga has been, "SBF shows that I was right all along to hate something that is tenuously connected to SBF and not at all connected to his wrongdoing."


It appears that EA was the motive for one of the largest financial frauds in history. "Not at all connected to wrongdoing" is a stretch.


A lot of people steal money to benefit their families--does that mean “you should care for your family” is connected to wrongdoing?


If there was a whole social/political movement based on the "earn to care (for your family)" philosophy where a prominent advocate was suddenly discovered to be stealing in the name of earning, well, then, claiming to "care about your family" as justification for aggressive earning would indeed start to look suspicious.


I guess what I was saying is that "earn to care for your family" *is* the dominant philosophy; it's the default. We're willing to do a lot of shady stuff to help out our families (see also the college cheating scandal), and yes, we should all keep in mind—individually and as a society—that we may feel the impulse to cheat and steal on behalf of our families and that we should resist (and societally disincentivize) that impulse. But that doesn't mean that the phrase "family comes first" or the act of setting up a college fund should be viewed with suspicion. Similarly, the idea "we should all give more money to charity, and to the causes where it can do the most good" shouldn't now be tainted by someone who committed crimes on behalf (let's stipulate) of that principle.


I don't think that we have any evidence at all to support the idea that if two years ago, SBF got disenchanted with EA, and decided to just keep his money for himself, the outcome for FTX would have been one iota different.


It's not that FTX would have been better run (in fact it's not clear SBF ever actually believed in EA), but many of the EAs gave SBF cover for his actions. It's a similar situation to the Sacklers and museums. If EA had come out and said spending money on the Super Bowl ad to encourage people to use FTX was bad, I might give them a pass, but very few questioned it. I also think that the crash has exposed that a lot of EA organizations were poorly run, with bad risk controls.


There is definitely a lot of score settling going on here among journalists. Guys like Jeet Heer, who never seemed particularly interested in advocating against crypto or EA, are suddenly on a week-long tweetstorm about how damning this is to journalists who were sympathetic to EA, and Matt specifically.

The whole insistence that one's particular enemies within the commentariat were not just gullible, but responsible, is a bit sick. Never let a good tragedy go to waste, I guess.

For what it's worth SBF seems to have been a very good cheat.


Jeet Heer has less integrity than anyone else I can think of in left of center Twitter.


I don’t think it discredits your framework. I understand defensiveness over something you value that seems to be under attack. But this comment and approving replies read to me like people who could also stand to do a bit of introspection. Lots of reads on the motivations of others--not much peering in.


I don't know, the EA community seems to do a lot of introspection. Almost to a fault. See for example https://forum.effectivealtruism.org/posts/8hvmvrgcxJJ2pYR4X/announcing-a-contest-ea-criticism-and-red-teaming.

But what is the lesson that the EA community should learn from this whole debacle, exactly? Be wary of accepting large contributions from donors whose ongoing funding is uncertain? Sure, I guess. But prior to two weeks ago FTX was a company worth $32B that had the world's largest, most sophisticated investors lining up to give it money. And the founder of said company, who was worth $16B on paper, had an apparently genuine lifelong commitment to the EA philosophy, so it seemed unlikely he would just decide one day to stop honoring his funding commitments. We can look at it ex post and see it turned out quite badly, but they would have been crazy to turn down the money ex ante. What would have been the justification?


As I said above, I think SBF is an example of the dangers of buying too heavily into utilitarianism and completely discarding common sense morality. I also think a lot of the mismanagement of FTX was caused in part by traits of SBF I see in the EA/rationalist community as well: overconfidence in smart young people relative to expertise and experience, and a heavy focus on theory to the detriment of practical considerations.


Pay more attention to engineers, accountants, lawyers and doctors and pay less attention to philosophers and mathematicians.


I do think the community (or parts of it) do good introspection. Just not this thread right here. And I don’t think your proposed lesson is close to the mark either.


Why? What lesson should they draw?


The clearest takeaway is the huge risk of overfitting to one’s own desires in anything ‘long term’.

But personally what jumps out to me is the laziness of SBF and the laziness of a philosophy that minimizes time spent doing hard things in favor of cash donations. I would like some balance.


I think it's a fair criticism of long-termism in general that it runs a risk of being a vehicle to rationalize bad behavior, but I honestly don't see what the connection is to the FTX collapse. In terms of charitable giving, his "longer termist" giving was mostly related to pandemic preparedness which seems like a pretty good thing to spend money on (and honestly, doesn't really require any special philosophy to justify).

I think the "earn to give" thing is pretty mischaracterized in the discourse. It's a heuristic on the best way to do good at the current margin, not some totalizing rule that applies under any circumstance. Like if nobody was actually running charities or NGOs then at that margin it would be good for some people to actually start them rather than earning money in finance. For people to donate to good causes there obviously need to good causes to donate to.

But the situation we are in now is that we actually have many well-run charities that need money more than manpower to scale up. If you have some great idea for a charity that nobody else is doing (or some great idea to do some existing charity work in a new, better way), then it may make sense to do that. But if you don't have any such ideas and all you care about is "doing good" in some broad sense, then you can do more good by earning money and giving it to people who already have established organizations that can work effectively.


In general, it seems like rationalists often fail to consider that perhaps they are collectively using tools like groupthink, extended forecast time horizons, and arbitrarily assigned probabilities for things we largely cannot even define, to raise the salience of issues they have the most self-interest in thinking about or working on, while labeling it all "altruism". It seems like an odd discontinuity that people in the rationalist community would see themselves as free-willed agents who can operate against their own self-interest again and again, instead of reckoning with what self-interests might push them to do charitable things or focus intently on tail-risk outcomes. I think choosing the label of "altruism" for the broader movement's activities was a bit of a branding mistake (even if in some sense it made people MORE likely to join the ranks and give, because of the pleasant connotation it projects onto oneself); it largely blinds the movement to all sorts of pitfalls they (as rationalists, many of whom probably do not even believe in things like "free will") might otherwise be more open to seeing.

When your focus is on bed nets, and saving lives near term in Africa, or providing glaucoma surgeries...it's MUCH harder to do this sort of thing efficiently given how concrete the measuring tools are for efficacy. But for much of Longtermism...it all goes out the window.


Sure, and I am skeptical of long termism as well for roughly the same reasons. I just think that in the discourse people overemphasize the importance of the "wacky stuff" in EA because a lot of the online discussion is around that stuff. But most of EA activity and money is spent on global health and other things that are pretty uncontroversial. Even in the long termist bucket, most of the activity and money is on things that don't really require a special philosophy to think are important. Pandemic preparedness/prevention is important whether you buy into any of the philosophical arguments about long termism.

Basically all of the points you (and many others) make against EA and long termism apply equally well to anyone working on issues related to climate change. There is a lot of uncertainty and we are trying to avoid problems that are decades in the future. But we know enough to think, quite reasonably, that there is a high probability that climate change will have bad consequences even if we can't say precisely what they are or how bad. And as such, it makes sense to spend some resources now to try and prevent them.

Nov 17, 2022 · edited Nov 17, 2022

I probably count as an EA sympathizer but I promise you I don’t have nearly enough of my ego at stake here to be defensive about it, I just think the takes are off base in a descriptive sense. It’s not that I find the SBF/EA conflation *offensive,* it’s that I find it *boring,* because it feels like everyone’s reaching to find a story here on the assumption that there must be one because the collapse was so vast, when basically it is what it says on the tin: banking industry regulation exists for a reason, crypto only has evil applications, sometimes people do bad things, and pre-20th Century style financial collapses are easy to recapitulate in a similar environmental context. There’s no there there, and it’s tedious to see people reaching so hard to make one.

Nov 17, 2022 · edited Nov 17, 2022

The take that has some pull for me is that this shows the dangers of being a pure utilitarian: it leads to insufficient risk aversion and an “ends justify the means” mindset (see SBF’s Vox interview with Kelsey Piper). If SBF had incorporated common sense morality into his worldview a bit more, it’s plausible he wouldn’t have been so reckless. And this should make us worry about the decision making and ethics of other powerful and/or wealthy EAs.

This isn’t an indictment of all EA, but I think it should make us more wary of its more extreme manifestations, and I think it demonstrates that EA culture (of which SBF was a part, and by which he was heavily influenced) needs to do more to emphasize the importance of common sense morality. Some already do, but I think there needs to be more of it.

I also think SBF shared many of the same flaws I see in EA/rationalists (as someone who has interacted with that world for years): overconfidence in the judgment of smart young people relative to expertise and experience, an excessive focus on theory relative to practice, and poor social and professional boundaries, to name a few. As more comes out about FTX’s dysfunction, I’m seeing these vices pop up again and again in the leadership. So on this too I hope EAs/rationalists take this as an example of potential flaws in the culture and some ideas of these movements.


I think with the trusting young people vs experts thing, the life cycle of the tech sector is kind of instructive here... the dotcom bust meant that a lot of people who would have been mid-career in the 2010s didn't go into tech. So you basically ended up in a situation where the young people had the best ideas.

On top of this, Facebook had a wildly successful IPO and an amazing run up until ~2014 with an omnipotent wunderkind founder. Finally, the 2012-2019 period was characterized by super-cheap money, so VCs were basically giving young founders whatever they wanted; you can see this with Snap and other stuff, when probably the Google "adult supervision" model would have been better.


I just think the elements here are so glaring: the focus on ‘maximization’, and on people with an intuitive talent for judging what output an algorithm will produce deciding that morality can be determined algorithmically, then fine-tuning the algo to produce the results they themselves find easiest to live out moment by moment. You think that’s a tedious reach, I guess. I’ll take you at your word. I do not.

Nov 17, 2022 · edited Nov 17, 2022

Put it this way: I think SBF did what he did because he was trying to make his numbers go up, because that is what approximately 100% of similarly situated people have done in the past, not because he felt compelled to save more orphans or buy more bed nets.


Yeah, that may be. Personally I think he was interested in EA and had a conscience (perhaps one more EA-inclined from his childhood and personality than most) before he became a scammer. And I think the philosophy really can and should point him away from the path he walked, and I would like it to think a bit more about personal growth and inventory.


I hate to be the guy who says “but you screw one lousy sheep…” but shouldn’t we have, well, a SECOND case of malfeasance by someone connected to EA before we start worrying it leads people astray?


He explicitly said in interviews that he didn’t believe in traditional risk management and that he’d take high-risk, high-reward bets essentially forever because of the expected value. And he explicitly said it was his utilitarian views that led him to that conclusion.
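(To make that concrete: here is a minimal toy simulation, with made-up parameters, of why taking a positive-EV bet at full stakes "essentially forever" almost surely ends in ruin, even though the expected value keeps growing.)

```python
import random

def bust_rate(trials: int = 10_000, rounds: int = 20,
              p_win: float = 0.5, win_mult: float = 3.0) -> float:
    """Stake the entire bankroll every round on a positive-EV coin flip.

    Each round pays 3x with probability 0.5, so EV per round is 1.5x and
    EV after 20 rounds is 1.5**20 (over 3,000x the stake). But a single
    loss zeroes the bankroll, and P(surviving all 20 flips) = 0.5**20,
    about one in a million. Returns the fraction of paths that go broke.
    """
    busts = 0
    for _ in range(trials):
        bankroll = 1.0
        for _ in range(rounds):
            if random.random() < p_win:
                bankroll *= win_mult
            else:
                bankroll = 0.0
                break
        if bankroll == 0.0:
            busts += 1
    return busts / trials

print(bust_rate())  # essentially 1.0: virtually every simulated path goes broke
```

The expected value is carried entirely by a vanishingly rare run of perfect luck, which is exactly the "bet it all forever" profile described above.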


If the problem is “explicitly trying to maximize often leads to failure” then that sounds like something that the maximizer will care about. It’s often clear that you do best at something not by trying to do that thing but by trying to do something adjacent (and the fact that so many of the people are into meditative practices indicates they should be aware of that).


I really have a lot of admiration for some of what I’ve seen in the community, and many seem very thoughtful to me and seem so in this particular case as well. But honestly, I’ve been pretty disappointed by what I’ve seen around ‘longtermism’. I found this post by Matt disappointing as well, I didn’t expect him to endorse virtue but I expected him to talk more about humility.

I do think the maximizers would be best served as you say. But I no longer think that’s what I am seeing as a matter of course. It is easy to lose one’s way in this hall of mirrors.


Will MacAskill (an EA leader, co-founder of Giving What We Can, etc.) had a close relationship with SBF, was on the board of the FTX Future Fund, introduced him to Elon Musk for business deals, etc. 80k Hours (a major EA communications effort, also founded by MacAskill) interviewed him on a podcast (where they spread the message that he lived frugally even though he in fact owned a private jet and hundreds of millions in Bahamian luxury real estate) and continually promoted him as a role model.

EA as a movement was really heavily tied up with this guy, it was not at all just arms length donation of money.

Nov 17, 2022 · edited Nov 17, 2022

So, firstly, this is highly relevant to the arm's length condition precedent, so thanks for the info - I'm in the position of knowing about EA in general, (and to a lesser extent 80k hours) without actually having any real knowledge of MacAskill as a person, and (I suspect like most of the world) having never heard of SBF until shortly before his spectacular downfall. I mostly think of EA in terms of the abstract implications of effectiveness- and cost-weighted giving, the relative advantage of earning to give, and then basically doing whatever GiveWell says one should (hint: it'll probably involve malaria!)

I think this makes a take to the effect of "seriously, EA as a general philosophy existed before SBF and it will exist after; the fact that this dude committed financial fraud is not relevant to bed nets except as a source of funding, or to those charities he committed to funding where such funding will no longer be forthcoming" correct, at least for those situated similarly to me. But it does sound, based on this and Scott Alexander's take, like the SF rationalist bubble is treating this as having broader implications due to their shared and more direct social connections. I guess my off-the-cuff response is: "EA is bigger than they are," although obviously at some level someone actually does need to run all this stuff, so the impact on that sphere is relevant.

With more confidence, I can say that from a high level this just sounds like some guy with a lot of wealth -- with, as it happened, a propensity to grossly unethical and frankly stupid behavior -- buying status in the San Francisco rationalist-sphere. That's not uncharacteristic behavior for an obscenely wealthy MIT nerd in general and not something we should be surprised that MacAskill or anyone in SBF's orbit was taken in by. The fact this particular status-buyer just happened to be a shockingly successful huckster seems to be more coincidence than a commentary on EA as a movement or even on dramatis personae affiliated therewith.

Lastly I think the whole "it was utilitarian St. Petersburging[1] that made them take these risks!" angle is a complete red herring. Companies that actually *make money* in and around SF are *already* dominated by EV considerations (this is venture capital in a nutshell - most bets lose but enough win to pay for everything, as Kenny Easwaran states below in this comments section). "Maximizing expected value" isn't some kind of utilitarian-inflected black magic; it is *literally capitalism as it is practiced.* You can donate the profits to whatever the hell you like, but the *generation* of profits was, prior to SBF, already dominated by this calculus. SBF and co. were just *bad* at it[2]. Thus, I think the utilitarian angle is a red herring and just a random buzzword for people who want a boo-light for the rat-sphere.

[1] Obviously one point of the St Petersburg paradox is that, empirically, you *don't* actually want to pay arbitrary amounts of money to participate (see also Gambler's Ruin).

[2]The point is to make *positive EV bets* and if you're good at them, you end up in the black. You don't, however, bet your entire available cash on each bet (Kelly Criterion). What actually happened here is that SBF and Alameda just did a shitty job of evaluating the EV of their own bets, dipped into some fraud to try to cover their losses, continued to lose, the end - tale as old as time.
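(The Kelly point in [2] is easy to make concrete. For a bet paying b:1 that wins with probability p, the growth-optimal stake is f* = (bp − q)/b with q = 1 − p, and staking much more than f* gives negative expected log-growth even when the bet itself is positive-EV. The 60/40 coin below is a toy illustration, not anyone's actual book.)

```python
import math

def kelly_fraction(p: float, b: float) -> float:
    """Growth-optimal fraction of bankroll for a bet paying b:1 with win prob p."""
    return (b * p - (1 - p)) / b

def log_growth(f: float, p: float, b: float) -> float:
    """Expected log-growth per bet when staking fraction f of the bankroll."""
    return p * math.log(1 + b * f) + (1 - p) * math.log(1 - f)

# A 60% coin at even odds (b = 1): clearly positive-EV on every flip.
f_star = kelly_fraction(0.6, 1.0)       # 0.2 — stake 20% of the bankroll
print(log_growth(f_star, 0.6, 1.0))     # ≈ +0.020 per bet: compounds upward
print(log_growth(0.5, 0.6, 1.0))        # ≈ -0.034 per bet: overbetting shrinks it
# Staking everything (f → 1) sends log-growth to -inf: certain eventual ruin.
```

Which is the footnote's point in miniature: the same positive-EV bet builds wealth at a measured stake and destroys it when you bet the whole book.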


It started before he was rich. When he was in college it was EA advice to go work at Jane Street to earn-to-give. Later he quit that job (presumably rich but not rich enough to turn any heads in Palo Alto), moved to the Bay Area, and worked at an EA nonprofit for a while.

While there, he started Alameda, hiring a bunch of EAs as early employees. There are some rumors of shady business practices at that time very similar to what's come out about poor accounting at FTX, and it's unclear whether EA leadership knew or should have known about them.

After a certain point I think your description of "rich guy buying status in the Berkeley rationalist scene" has to be pretty accurate. A lot of FTX Future Fund grants look more like the donations of a scientific patron than just neutral charitable donations to research.

I tend to agree that utilitarianism here is not really the issue, but I do think certain aspects of EA ideology played some role in helping SBF con people (like uncritical assumption that billionaires are morally neutral).

I think the EA subgroup that takes the biggest hit is the Center for Effective Altruism/Oxford/80k Hours nexus. These were very nice, respectable organizations, heavily tied up with both the persona of SBF and his money. They had just come off a hugely successful PR push for MacAskill's new book.

(The Bay Area rationalists can't really get a worse reputation among the general public, and have shown some integrity in not trying to defend the guy.)


Agree. Rationally, the personal foibles of SBF or any other individual don't tell us anything about the validity of EA. But the fact that people nonetheless keep insisting on trying to make SBF a stand-in for the EA whole might, after all, tell us something about the viability of EA as a scaleable idea. People just aren't rational enough for a hyperrational set of ideas to be a viable movement as long as humans are primarily social rather than rational actors.


The reason most people think communism is a bad idea is because of all the Stalins and Pol Pots through the history of communism. It may indeed be that they were not doing communism correctly, but we rightly decide that if all these mass murderers litter the history of communism then perhaps it isn't such a good idea.

And I'm afraid that if effective altruism has all these prominent crypto fraudsters and cranks at the heart of it, then that perhaps is useful information. EA strikes me as a set of good ideas plus a bunch of crazy stuff that requires people with a certain type of personality to believe. There may be sound reasons why people find EA to be warped.


I think we should really dissect this because it's important.

So, first, "The reason most people think communism is a bad idea is because of all the Stalins and Pol Pots through the history of communism."

Is that in fact "the reason" to think that communism is bad? I don't think it is. As Allan Thoen mentions, we've got despots of all ideological stripes. But more so, we can tell a pretty straightforward story about why communism produces despots: it's because a centrally planned economy requires an incredible concentration of power. There are no incentives in a communist economy to do the right thing, only punishments for doing the wrong thing, and a single entity is supposed to set up basically everything in life.

What the Stalins and Pol Pots are, I think, is a warning flag. You say, "Gosh, it sure looks like we get a lot of particularly murderous tyrants in this system, let's dive in and see what's going on there." And, diving in, you say, "Oh, okay, I see the way that this system is mechanically bad."

How does this analogy line up to SBF and EA?

Well, I mean, let's first note that SBF is not the latest example of a long line of fraudsters who have plagued EA from the start. He's pretty much an example of one.

Second, is SBF uniquely terrible? I mean, I don't think he is? The story of what exactly happened in FTX is still developing, but my read on it so far is that he's kind of a garden variety "guy who was sure that he could turn everything to gold so all these internal safety rules don't apply to him." I feel like we see a pretty regular stream of these guys. In SBF's case, the amount of money was large -- that seems like it's basically the result of the huge amount of capital sloshing around in the last decade and especially during the pandemic.

Third, having seen some kind of warning about EA, when we dive into EA, do we see a mechanical link between EA and SBF's financial malfeasance? I mean... not really? People have tried to link the idea of utilitarianism, of like, "Oh, well maybe he cynically defrauded people in order to fund charity," but I don't think that's what actually happened. Or just the general sense of, "Maybe EA just draws people who are arrogant." Maybe there's some amount of truth to that, but is EA actually more full of arrogant people than, say, "Extremely standard politics"? I don't think it is.

Expand full comment

But the Stalins and Pol Pots aren't the reason communism is bad (murderous tyrants come in all stripes). The actual reason communism is bad is that it's unrealistic, hyperrational utopianism. For it to function, everyone at all levels of society and the system has to behave perfectly rationally in sync, sharing the same beliefs, values and calculations about how much relative weight to give the greater good versus their own personal good. Even assuming people were all operating from the same factual information and premises, that's just too much uniformity of opinion to expect from more than a small, cohesive cadre of true believers. EA suffers from a similar problem.

Expand full comment

This seems like a ... stretched ... analogy. I would agree that if EA people were murdering millions of people that it would make EA bad even if their intentions are noble, but that's not happening. And the "crazy ideas" (AI risk and such) are a very small part of the overall EA pie. The vast majority of EA work and spending is still on global health, development, animal welfare and "longtermist" projects that seem pretty uncontroversial such as pandemic prevention/preparedness.

Expand full comment

The analogy is that communism requires slavery, and the FTX business plan (the whole plan, not just the part sold to investors) requires fraud.

Expand full comment

So that’s an argument against crypto, maybe, but not EA. There’s no reason EA requires fraud.

Expand full comment

I don't think the FTX business plan required fraud? It was an exchange. It made money by charging fees to connect buyers and sellers. (Alameda also didn't have an inherently fraudulent business plan. It was a prop trader). It just... didn't work, and instead of letting it go under, they committed fraud instead.

I feel like the same thing happens a hundred times a day in America, just with lower amounts of money. Someone who owns a dry cleaner takes out a loan to, whatever, renovate or buy new equipment, and thinks that the operations of the business will cover the expense, but it doesn't, so they start making large tax deductions for fraudulent business expenses in order to make the books and the cash flow work. That doesn't mean that the business plan of a dry cleaner is inherently dependent on fraud; it just means that this particular dry cleaner decided to commit fraud instead of letting their business fail.

SBF/FTX/Alameda are interesting in that the amount of money is billions instead of tens of thousands, but people really seem to be stretching to imagine that there is some kind of fundamentally different process going on here, instead of a completely garden-variety white collar criminal response to their business not doing as well as they had hoped.

Expand full comment

Who’s the second most notorious (in the sense of prominent wrongdoing) EA figure, after SBF?

Expand full comment

"...the personal foibles of SPF..."

I think SPF is one factor, but mostly he was trying to limit his own exposure in order to save his skin. Alas, he spent too much time on the beach in the Bahamas, and wound up getting fried. But then, his being fried was kind of baked in.

Expand full comment

Perhaps he was too baked while his head was buried in the sand.

Expand full comment

This fact, that humans are primarily social rather than rational actors, is why the way to grow rationality is to build rational movements rather than to convert individuals. This is, I believe, why there is so much money spent on “movement building” (as mentioned in the original post), even though that sort of spending is often the most questionable from an effectiveness standpoint by any direct measure.

Expand full comment

I bet that spending is, by far, the most fun.

Expand full comment

That's a really interesting observation.

Expand full comment

It's bizarre that one would see the downfall of the most prominent, apparently financially successful EA practitioner as disconnected from EA's credibility. The analogy with the Lincoln Center not taking reputational damage by accepting Koch money is terrible. The Kochs didn't run their business according to arts philanthropy principles! The Kochs didn't maximize their profits to give more to museums, universities and hospitals. Similarly, it's not very relevant to say that we don't blame vegetarians for Hitler's monstrosities. I don't see how you arrive at the idea EA was peripheral to SBF and SBF was peripheral to EA. He built his life around EA, and a good chunk of EA activity has been based on his money.

There's a sort of "weak EA" that Matt Yglesias has advocated for, roughly giving more, and more effectively. I'd agree that this "weak EA" shouldn't be harmed by SBF's meltdown. But the "strong EA" with the exclusive, introspective but "not a cult" community, the longtermism, the extreme risk-neutral calculation of EV over millennia with heroic assumptions, zero discount rates, etc. -- the credibility and reputation of those ideas and their proponents has taken a hammering.

Expand full comment

I'm not sure if I agree or not with you, but I will say that I'm fairly confident that the "weak EA" is *by far* the most common type of EA counted by number of "believers", amount of money spent, number of transactions...and this weak EA is getting painted with the same brush.

Expand full comment

It honestly reminds me of the people who use (say) the Woody Allen assault allegations to say “you know, I never liked his movies.” It’s piggybacking on a substantive allegation with an unrelated dunk.

Expand full comment

Is it not quite an indictment that a movement that prides itself on long-term forecasting ability failed to see the inherent danger and risk of being associated with what many people in crypto realize is basically an arbitrary speculative bubble? How is this not a giant kick-in-the-shorts indictment of what so many perceive to be longtermism's strengths (projecting existential risks beyond their own and their children's lifetimes and trying to do things now to head them off)?

https://manifold.markets/NathanpmYoung/will-ftx-go-bankrupt-before-2024

Literally, the crypto winter has been ongoing for months now. There has been a broader tech recession for a year, basically. There is a long history of confidence collapses in the crypto space taking entire companies out, and a more recent one going back to June. More specifically, there was reporting in June that Alameda had holding companies that were sunk by Celsius collapsing, etc. Given knowledge by SBF (and surely others in crypto) that the entire house was built of cards all along (as he basically admitted to Levine), it just seems even odder that the people who pride themselves on their forecasting just didn't see it coming and had absolutely no contingency plan or hedge for it.

Would seem like a good time to adjust their priors and reconsider just how far out the horizon is, and how far into the future they can truly consider without just literally making it up. To just try to shift blame onto the bad actions of one misguided utilitarian who didn't get it is really just a way of washing their hands of all criticism.

Expand full comment

This seems like a reasonable indictment of the investors and backers of FTX, but a terrible indictment of the people (viz., EA) who were in the position of just choosing to either accept free money from a guy with apparently billions of dollars and the backing of firms like Blackrock, or else just....not accept said free money?

Expand full comment

I think it's an indictment to accept free money and not plan for the money not being there. Granted, depending on how big the fraud is you might be impacted by significant clawbacks, and I don't think that should be planned for, but if someone came to me offering money from a highly speculative source (crypto), I wouldn't be spending it as if it would continue indefinitely.

Expand full comment

Is there an existing principle that charities shouldn’t accept money that was made from overvalued assets? Did any charities circa 2000 refuse money from tech entrepreneurs, or were they discredited if they accepted it?

Expand full comment

I think the principle is "don't accept stolen money". The debate is whether all crypto is a ponzi, or whether it is merely overvalued like internet stocks in 2000.

Expand full comment

Did they have any good reason to believe the money was stolen though?

Expand full comment

>while Kelsey Piper’s recent piece discussing SBF makes clear he’s full of it on a lot of public pronouncements in any event

A bit tangential, but I don't think anyone should be taking at face value that he thought the exchange with Piper was off the record; he's too smart and manipulative and has done too much media work for that to be believable.

The first rule of crisis PR is that you try to control the narrative, and by faking a "hot mic" exchange (a vintage President Bartlet move, btw) SBF has found a way to start doing that. Though the conversation makes him look like a bit of an asshole you'll notice it's oddly exculpatory: he doesn't admit to intentional fraud, he says the whole thing was basically sloppy accounting, and he says his top priority is reimbursing depositors. He also starts throwing in some bait for right wingers with references to virtue signaling "woke" folks and some salty words for regulators, foreshadowing a repositioning to a more rightist persona (common move for a left-coded figure undergoing reputational crisis).

Anyway, of course all of this only reinforces the OP's point that SBF is not on the level, but I thought it was an interesting thing to bring up.

Expand full comment

I think this gives him too much credit. "I thought the journalist was my friend" is a very old/common problem.

Expand full comment

I mean, we can't know for sure obviously! But IMO the safest thing to do is proceed on the basis that everything this guy says is a calculated lie to maximize his own interests at the expense of others.

SBF is basically just Trump with a few aesthetic variables switched: Palo Alto instead of Queens, crummy t-shirts instead of gold toilets, billboards in SF instead of buildings in NY, fake hot-mic interviews instead of fake calls to People Magazine gossip columnists (https://www.washingtonpost.com/politics/donald-trump-alter-ego-barron/2016/05/12/02ac99ec-16fe-11e6-aa55-670cabef46e0_story.html)...

But the same low cunning, the same ease of lying, the same canny ability to play the media like a fiddle to gain wealth and power.

Expand full comment

The idea that new, better thinking can "solve" charitable giving is suspect, just as crypto "solving" finance is a joke. Both tend to undervalue the social and institutional elements and reduce things to equations. Religious organizations, while also prone to corruption, are deeply embedded in the communities they serve and provide traditional channels for giving.

Second, the most reasonable place to give is local charities where you personally know the leadership and can see the impact. Giving dollars to Burundi may have more effect than giving to the local food bank, but only if I'm reasonably confident my dollars aren't lining several pockets along the way. That isn't easily solved with audits, which can be avoided and falsified. Local charities, by contrast, can be visited and have local reputations.

Inevitably, corrupt people take root in charitable channels, and heavy, inefficient institutions are unavoidable. The tech-nerd-savant vibe is a red flag in both crypto and EA.

Expand full comment

The effect of this, of course, is that people in Burundi continue to suffer. As far as I know there isn't evidence of bed net money being misappropriated to a significant degree, but let's grant that some of it happens. It still seems feasible that given the desperation of the people who need the bed nets, and the tiny amount of money it takes to improve their lives, the money that does get through may still do more good than money to the local art museum does. In fact that seems almost certain to me.

It's also funny to me that the critiques of EA I've seen in these comments are "you can't say objectively what charitable money does the most good" and "it doesn't make sense to send your money to places where you can't confirm it's doing good." I think the critique can either be that objectivity in giving is impossible, or that we need more objectivity in giving, but not both.

Part of it too is that I think the person who wants to benefit a cause that doesn't directly benefit them, or that is outside their daily experience, is doing something admirable over and above the objective results. The Stoic philosopher Epictetus speaks approvingly of a woman who sent gifts to a man who had been exiled by Domitian. Someone told her, "Why bother? You know Domitian is just going to seize the gifts." She responded, "I would rather he seize them than that I should fail to send them."

(Of course, an effective altruist would say it doesn't make sense sending money that you know isn't going to get where it's supposed to go. The point I'm making is that people seem to treat the urge to help those in distant places as being in some way suspect or foolish, when I think it's anything but.)

Expand full comment

Thanks for writing this comment so I didn’t have to. Agree completely: the bad behavior of some individuals doesn’t negate the value or merit of an idea.

Expand full comment

I agree that a lot of criticism is opportunistic, from people who didn't like the community in the first place, even though they rarely address whether, say, GiveWell itself is bad and whether people should stop giving to it on the grounds of X. OTOH, I think it is also an opportunity for people involved in these to reflect and improve on various aspects like culture, organizational practices, transparency, etc.

I also think it would be good for others impacted to reflect similarly. Where financial investigative journalists failed (where CoinDesk succeeded), where regulators who interacted so closely with lobbyists failed, etc. Their failings aren't (for now) criticized as vehemently, and I can speculate why, but regardless of external criticism it is fair to think more deeply.

Expand full comment

> Where financial investigative journalists failed (where CoinDesk succeeded)

i’m adjacent to financial journalism (I work for a small but respected outlet no one outside of our very niche corner of the industry has heard of), so my guess is that CoinDesk has been reporting on crypto for waaaaay longer, so their journos have more of a source base in the industry, while the rest of us started the catch-up process pretty recently.

Expand full comment

Would more transparency in GiveWell or other EA organizations have changed the course of the scandal?

Expand full comment

Just thinking aloud... It is possible they already considered many of these. I have no insight into how non-profits operate.

I think GiveWell is more involved in evaluating specific programs, and I find them to be thorough and engaging to read about. If they can somehow emphasize the trustworthiness of actors/institutions they recommend, that would be reassuring to donors.

Some EA individuals/organizations seem heavily into abstract theory and philosophy. I find them less interesting, harder to evaluate, and think their influence is over-stated. Perhaps it is more than I thought, in which case they have areas of improvement.

I wonder if there is something that can be done to minimize ripple effects from future fraud. Stop accepting donations in crypto? Insurance? Many EA institutions are relatively young, so perhaps they can look towards other well-respected institutions (Red Cross has been around for 150+ years) and see if there are lessons there.

I doubt that any of these could have changed the course of the scandal. I also think there will always be bad actors even in the future, and there is no guarantee there. I would look towards improvements in investigative reporting and regulation more on that aspect.

Expand full comment

"making fetch happen" LOL!

And yes -- I wonder how much of this is because there is a big crypto Twitter presence. Offline, I'm not sure how many people had ever even heard of SBF prior to his being in the news for this scandal.

Expand full comment

I wonder if Matt thinks time spent actually serving others might have an effect on a person. I know spending the morning at a soup kitchen ‘doesn’t even try to maximize blah blah’ (whatever that is supposed to mean...). But maybe setting up an annual recurring payment to Against Malaria fails to nurture something inside the donator?

Expand full comment

"...time spent actually serving others..."

SBF may soon enjoy the personal growth opportunities of serving time with others.

Expand full comment

Maybe even serving soup in the cafeteria?

Expand full comment

You know, I'm sure that setting up an annual recurring payment DOES fail to nurture something inside the donator. And indeed I think that's the point. What's the goal of charity? Is it to spiritually improve the donator? Or practically improve the life of the donatee?

Expand full comment

I grew up in Evangelical missionary circles, and when I first learned of EA, I was amused that utilitarians had reinvented ideas common in those circles. One of the strongest messages of the NT is to give all you can, to the poorest of the poor, without regard to political boundaries (see, e.g., the parables of the Good Samaritan, the Sheep and the Goats, the Wedding Feast, Jesus's encounter with the young rich man). St. Basil was an EA in the fourth century. But the idea that charity is not also a spiritual discipline for yourself would be bizarre to them.

"Do not store up for yourselves treasures on earth, where moth and rust destroy, and where thieves break in and steal. But store up for yourselves treasures in heaven, where moth and rust do not destroy, and where thieves do not break in and steal. For where your treasure is, there your heart will be also."

Expand full comment

I think it’s both, that’s my point

Expand full comment

In what ratio?

Expand full comment

Altruism cannot be quantified. This is nonsensical.

Expand full comment

People benefited can be quantified.

Expand full comment

But that begs the question, assuming the goal of altruism is some measurable quantity of "people benefited" and an agreed-upon definition of "benefited."

How many people are benefited by hearing Bach or seeing Michelangelo's David? Are those benefits better or worse than another 1,000 mosquito nets? I think these are quasi-religious questions, and EA is ill-equipped to answer them.

Expand full comment

Only to a limited extent. The benefits of human interaction gained when you yourself volunteer cannot be. And certainly “benefit to your soul as a ratio to benefit to others,” which I think is what was asked here, is pure gibberish.

Expand full comment

Charitable dollars can be quantified.

Expand full comment

Telling you what?

Expand full comment

For me, you are pointing to the weakness of consequentialism. There is not a single variable to reduce ‘ethics’ to, and it simply tempts me to create some over-engineered algo for you: a story about how x hours of SBF’s life spent actually living out ethical actions might have enabled a much longer and more fruitful career of more impersonal EA.

But that story could be fruitful for some, I don’t know.

But as a rule of thumb? I’d say for a very rich person 10% of their working hours could be converted to practical, personal do gooding. That’s an amount that will make an impression in the rest of their lives, imo.

Expand full comment

I'm not a consequentialist.

And sorry, I was unclear: I'm not asking what ratio of someone's life should be spent on improving their soul, I'm asking what ratio of charitable donations should be spent on improving the donor's soul versus improving the outcomes of the less fortunate.

Expand full comment

Well, I’m trying to get at the difference between donating $ and donating one’s own time and deeds. I’m not sure any cash donations improve the donor’s soul more than others.

I am saying hours in the soup kitchen are completely different from donated $. There’s no ratio between them, they have no relationship to each other at all except that people with free money to donate probably have free time as well in some sense.

Expand full comment
Nov 17, 2022·edited Nov 17, 2022

as someone who has held with intense firmness to virtue ethics all these years despite constant haranguing about how it's not "rationally founded" and provides "ambiguous action guidance", seeing this comment gladdens me.

i can only hope that one day people will realize that moral character is not fake namby-pamby bullshit, no matter how much the consequentialists and materialists want it to be, and you can't take a shortcut around developing citizens of virtue.

Expand full comment

I think a lot of charitable activity that is "inefficient" could be described as community-building. Clearly donations of time and money to my kid's relatively middle-class public school don't pass any utility maximization analysis. I think on some level you could describe it as "selfish," either in the sense that it's an effort to help your own kid or help folks who you might come into contact with (when you are talking about donations to a coat drive or gift drive for less privileged kids).

But I think it's also a worthwhile enterprise, worth a few hours a month and a small chunk of change, to be involved in working with the school community, etc. How many thinkpieces have we read about the decline of clubs, membership orgs, etc.? I think a pure "EA" view would deem this wasteful. But I think there's real value in community.

Expand full comment

It's completely true that there's real value in community - in rich countries having a strong community is the most important thing for happiness.

But it is also true that 1.6 million people, primarily children, die every year from diarrheal diseases, and all of them could be saved with water + sugar + salt. You'd have to work extremely hard to convince me that anything in a rich country is more important than stopping these children from dying.

It is completely natural to care about building the community one is part of, but in general rich people are part of rich communities and if everyone does this it is the very poorest who suffer.

Expand full comment

I think most EA adherents agree with this.

Expand full comment

Yes. I don't know how you quantify it (and maybe you cannot) but there is something to be said for *literally* doing the work sometimes. Maybe you are not maximizing your potential (or whatever) for impacting as many people as possible, but there is something nourishing for the soul (or your analogous equivalent) about being in the trenches, helping.

Expand full comment

I don't think it's really fair to "blah blah whatever" past trying to measure how much help your donations are accomplishing. Sure, not everything can be quantified, but some things can. In terms of the problems facing modern philanthropy, spending on projects that make donors feel good or increase their prestige but don't actually improve society that much seems to be a bigger problem than donors failing to grow spiritually.

Expand full comment

You’re right that I was too flip. I found it frustrating that philosophy-major Matt made no mention of this very common type of idea in this ‘what went wrong’ retrospective. It’s certainly good to let donors at least know what different charities are actually accomplishing. I’m not sure about the ‘bigger problem’ part of what you’re saying; they are different problems. Transactional ‘charity,’ like getting your kid into school with a big donation, is not charity to me, but it’s not my opinion that anyone is confused about this. I’m not actually sure donations to other ‘ineffective’ charities, like maintaining a lovely old church, are primarily about prestige and making people feel good about themselves. I think lots of people place value in institutions and community that is not easily quantifiable, and I personally am not about to attempt to quantify it in general. I’m sure there are some limits that I would look at and say ‘no, this one was self-aggrandizement for sure!’

Expand full comment

Why in the hell do we care about the giver? Oughtn’t we care about the person in poverty?

Expand full comment

I think it helps the donor. You would like to spend the money on yourself and instead give it to help other people. That has a good effect on your heart.

Also, I think we tend to believe the average person is like us, so donating money (or time) makes us believe the average person is more charitable, and that makes us more hopeful about the world.

Expand full comment
Nov 18, 2022·edited Nov 18, 2022

In order for people like SBF to make tons of money there has to be a functioning society around them, and this necessarily involves people doing things like volunteering at soup kitchens, giving to local charities, throwing a few dollars to panhandlers, and working in jobs like nursing, teaching, and medicine that the EA/80,000 Hours types deride. It also, in the real world, involves people enjoying the fruits of their labor. That this isn't blindingly obvious just demonstrates the fundamentally autistic nature of the EA outlook.

Expand full comment

Matt actually covered this in his EA post back in October. The answer is “both.”

Expand full comment
Nov 17, 2022·edited Nov 17, 2022

Ah, my memory is that, as he briefly mentions in this column, he thinks community is undervalued by EA. I guess that’s close to what I’m saying, and it’s something I believe in too, but it’s not quite what I’m talking about here.

Expand full comment

I think you’re right, but this is the column I read to think about how society and government should be structured. I go to other columns to think about how to live a rich and fulfilling life.

Expand full comment
Comment deleted
Expand full comment

Yeah I just saw this but commented similarly above. The entire concept of charity comes from Christian ethics, which anticipate all of the non-nutty parts of EA. And there's a ton of sophisticated religious literature on the ethics of charity. But the essentials are all in the New Testament itself. EAs reinvented the moral thought of a first-century Judean peasant, but without reflection on the character of the charitable person.

Expand full comment

I think Jesus is being silly there, and totally unconcerned with the people receiving the charity. Rather, he is concerned with the effect of giving on the soul, and has a hard time grasping that poverty is not inevitable.

If giving indiscreetly leads to more people foregoing consumption to help people with higher marginal values for dollars, that is a good thing.

Expand full comment

The guidance to give discreetly is to avoid the addiction to positive attention for yourself or influence over others. Giving in large amounts is a form of power, and power corrupts.

Expand full comment

Saving 1000 children from dying of diarrhea is worth engorging your ego.

Expand full comment

If anyone wants to learn more about the FTX fiasco, I’d recommend checking out Matt Levine’s recent columns, https://www.bloomberg.com/opinion/authors/ARbTQlRLRjE/matthew-s-levine

Notably the current info on FTX’s balance sheet (its stated assets and liabilities) [1]

> But there is a range of possible badness, even in bankruptcy, and the balance sheet that Sam Bankman-Fried’s failed crypto exchange FTX.com sent to potential investors last week before filing for bankruptcy on Friday is very bad. It’s an Excel file full of the howling of ghosts and the shrieking of tortured souls. If you look too long at that spreadsheet, you will go insane. Antoine Gara, Kadhim Shubber and Joshua Oliver at the Financial Times reported on Saturday:

>> Sam Bankman-Fried’s main international FTX exchange held just $900mn in easily sellable assets against $9bn of liabilities the day before it collapsed into bankruptcy, according to investment materials seen by the Financial Times.

> But then there is the “Hidden, poorly internally labeled ‘fiat@’ account,” with a balance of negative $8 billion. I don’t actually think that you’re supposed to subtract that number from net equity — though I do not know how this balance sheet is supposed to work! — but it doesn’t matter… You cannot apply ordinary arithmetic to numbers in a cell labeled “HIDDEN POORLY INTERNALLY LABELED ACCOUNT.” The result of adding or subtracting those numbers with ordinary numbers is not a number; it is prison.

Everything we know so far suggests at minimum extreme negligence from FTX and SBF; and outright criminal fraud seems almost certain.
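Just to make the arithmetic in the FT figures concrete, here is a minimal back-of-the-envelope sketch. The dollar amounts are the rough, unaudited numbers from the press coverage quoted above, not precise figures from the bankruptcy filings.

```python
# Back-of-the-envelope sketch of FTX's reported position the day before
# bankruptcy, per the Financial Times figures quoted above (rough press
# numbers, not audited ones).

liquid_assets = 0.9e9   # ~$900mn in "easily sellable" assets
liabilities = 9.0e9     # ~$9bn in liabilities

# The gap an exchange holding customer deposits should never have at all:
shortfall = liabilities - liquid_assets
print(f"Liquid shortfall: ~${shortfall / 1e9:.1f}bn")  # ~$8.1bn
```

The point of the arithmetic is just that the hole is roughly the same size as the "hidden, poorly internally labeled" negative $8 billion account Levine flags, which is why the reporting converged so quickly on customer funds having gone to Alameda.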

[1] https://www.bloomberg.com/opinion/articles/2022-11-14/ftx-s-balance-sheet-was-bad

Expand full comment

Levine’s newsletter has been an essential read these last few weeks. I cannot think of a journalist more fit for the moment than he is for the financial madness of 2022.

Expand full comment

Levine should win a Pulitzer for his work on Musk's acquisition of Twitter and now the SBF debacle.

Expand full comment

I suspect Michael Lewis’ work is going to be solid.

Expand full comment

The amount of money I'd pay to get my hands on an early release of his account of this saga is embarrassing.

Expand full comment

New bankruptcy filing just dropped, https://pacer-documents.s3.amazonaws.com/33/188450/042020648197.pdf

Highlights, https://twitter.com/kadhim/status/1593222595390107649?t=inwFSn06il_Wp1Vk72nijA&s=19

> New CEO John Ray is scathing about Sam Bankman-Fried's management.

> "Never in my career have I seen such a complete failure of corporate controls and such a complete absence of trustworthy financial information."

More coverage, "Here Are the Wildest Parts of the New FTX Bankruptcy Filing" https://www.bloomberg.com/news/articles/2022-11-17/here-are-the-craziest-parts-from-the-new-ftx-bankruptcy-filing

Expand full comment

And I'm reading that John Ray also oversaw the bankruptcy of Enron, which says a lot given the "never in my career" comment.

Expand full comment

And Odd Lots just released an episode this morning, “Understanding the Collapse of Sam Bankman-Fried's Crypto Empire.”

> The collapse of the Sam Bankman-Fried empire is gigantic, sprawling and fast moving. While details are still coming out, it already ranks among the most prominent corporate disasters of all time and has left the entire crypto community reeling. To better understand the role that FTX played in the industry and how the exchange started to unravel, we speak with two guests on this episode. First, we have Evgeny Gaevoy, the founder and CEO of the crypto market-making firm Wintermute, to explain how he used the FTX platform and how he understood its relationship with SBF's trading firm, Alameda Research. Then we speak with independent researcher James Block, author of the Dirty Bubble Media newsletter, and one of the first observers to blow the whistle on the FTX disaster.

Bloomberg: https://www.bloomberg.com/news/articles/2022-11-17/odd-lots-podcast-understanding-sam-bankman-fried-s-ftx-crypto-collapse?srnd=oddlots-podcast

Apple: https://podcasts.apple.com/us/podcast/odd-lots/id1056200096

Spotify: https://open.spotify.com/show/1te7oSFyRVekxMBJUSethH

Listening to it now and tomorrow they’ll publish an episode with Matt Levine himself!

Expand full comment

I work in real estate finance, which means I have a lot of experience with looking at balance sheets, personal financial statements and just general cash flow analysis for potential clients.

I give that run-up because I think Matt Levine (and Bloomberg) is actually underselling how shoddy this Excel spreadsheet is. Like, I can't emphasize enough that this is the Excel spreadsheet of someone completing an assignment for a high-school-level intro to business class at the last minute in the morning before first period.

To highlight a few things left out of your summary:

- "Less liquid" assets is...not a thing. Furthermore, the largest "less liquid" asset was $2.2 billion of Serum tokens. In fact, all of it was essentially made-up cryptocurrencies with basically no resale value. It would be like if, in my basement, I created a fictitious bank called "Bank of Me, Myself and I," declared that bank held $2.2 billion in cash of a currency I made up on the spot called "me, myself and I" currency, and booked it as "less liquid" assets.

- Perhaps my favorite is $7.4 million in illiquid assets called "Trumplose" tokens, which were tokens used to bet on the outcome of the 2020 election. In some ways, this might actually be one of the more "above board" assets. It's not that different than if I went to Vegas, placed a $10K wager on Trump to lose the 2020 election at 3-1 odds, and then created investment tranches that allowed outside investors to invest in my betting slip. Still, the name of this asset alone speaks to an "unserious" company.


Great comment; minor point on TRUMPLOSE: Google and CoinDesk are failing me here, but I actually believe that's part of a side bet with a crypto trader (GCR) about the 2024 election. Nooooo idea why that's on there though

Source: A podcast. Confidence: 60%.


Where is the spreadsheet?


Probably best place to read in full. https://twitter.com/amix3k/status/1591734543538847745/photo/1


I interpreted the comments to mean that the actual file was publicly available.


That is the whole spreadsheet :P Though there are a bunch now floating around (from various places and times).


Levine's analysis of the situation has been great. I just wish he could condense his writing by ~20%. His articles seem too long for the amount of info (even accounting for the extra humor text).


MattY's even got a funny footnote a la MattL this week ("For the record, we split the check").


I regard crypto as a complete and utter trashfire. It facilitates every kind of fraud and environmental degradation while creating nothing of value. A generation of young men (for that is typically who invests in crypto, just like MLMs are typically favoured by women) think it is their passport to wealth and privilege. If SBF helps to kill it, then he will actually have done effective altruism perfectly well.


To your last sentence, I hope you read this hilariously awesome comment by dysphemistic treadmill:

https://www.slowboring.com/p/matts-mailbox-af0/comment/10471479


I think you are overrating how much buy-in crypto has among young men. It's true that crypto investors are overwhelmingly young men, but young men are not overwhelmingly crypto investors.


Taking a long-term view, SBF going to jail/fleeing and tanking crypto is probably a great outcome for EA. All his money coming from crypto meant he was almost certainly going to turn out to be some sort of fraud or scammer, and I couldn't believe how the EA community seemed fine with him even though it tainted them (to me, at least).


Will MacAskill’s promotion of longtermism is probably the worst thing to happen to EA.

Let’s assume you’re MacAskill and you genuinely believe in longtermism (I think its epistemic and effectiveness issues are damning, but I’ll leave that aside). Should you do a huge book tour and give a bunch of interviews that result in EA getting more press than it ever has, and focus it all on AGI risks?

Well, are most people going to find that credible? No. It doesn’t matter how good your argument is; most people aren’t going to believe the robots are coming to kill us, so branding a philanthropic movement with that cause area can only hurt it. FTR, I think this is somewhat true of farm animal welfare as well, despite my greater agreement with that as a high-priority cause.

Every opportunity EA gets, it should be branding itself as fighting poverty, supporting global health, and preventing pandemics. For some audiences, EA’s interest in prison reform is probably helpful too.

Once you’ve got people interested - sending some donations, reading the forum, etc. - you can start trying to persuade them on the weirder stuff, if you truly believe that weirder stuff is important.

Don’t let your freak flag fly. EA isn’t about self-expression; it’s about impact.

Nov 17, 2022·edited Nov 17, 2022

Very slight factual disagreement (? maybe more like elaboration) on farm animal welfare: I think people who think about it mostly have to maintain a certain willful blindness, and the resistance is to having to take those blinders off and really think about what farm animal welfare means (and the magnitude of suffering to which one has hitherto been a party).

This doesn't necessarily mean it should be a point of strategic emphasis--because people don't *like* having those blinders removed--but whereas AGI risk is so far out there that it's going to have most people backing away slowly and looking for a nearby constable, people are inherently *supportive* of the underlying ethos of animal welfare -- people *like* cute animals and routinely voluntarily go to petting zoos!


And I really like that you added "cute" to this: very few people will complain if a swarm of insects or a herd of sewer rats are mass killed.


It's certainly convenient from an ethical perspective that we don't raise naked mole rats for their meat.


There might be a correlation-causation thing here: cows are totally cute once you stop thinking of them as food on the hoof


Yeah, I mostly agree with this.

I *do* think there remain large segments of the population whose ethics are just fundamentally opposed to animal welfare, but for the most part, these probably aren’t people who’d be interested in EA anyway.


Or better yet, don’t even do that? People in the main have a pretty good radar for weird culty stuff, and having the Yudkowskites hovering just offstage at every EA event will have entirely predictable results.


I mean, if you genuinely believe the weird stuff is important, then you have to try to expand funding for it somehow. I can’t stop the people who believe in the weird stuff I disagree with from being part of EA, and I don’t disagree with all of it (like farm animal welfare).

Nov 17, 2022·edited Nov 17, 2022

I’m sorry, I still think that the premise of EA is stupid. The idea that you can quantify “the greater good” is idiotic, for one thing missing the fact that values are subjective (and there are a thousand other problems with this). Also, anyone who has read history (or Harry Potter) should immediately get chills from people convinced they’ve cracked the code for “the greater good” and feel entitled to do whatever with that justification (esp. if it just so happens to benefit themselves). SBF surely did us a favor in exposing this nonsense pretty early on. I’m still troubled, however, by the crisis in education and values that allowed it to happen in the first place.

P.S. Note the Vox interview with SBF where he defines his moral philosophizing as nothing more than “woke shibboleths so people like me”: probably his only honest words. One of way too many vain and morally empty sociopaths.


Flourishing is hard to quantify. Suffering is not. The number of children who die of malnutrition or diarrhea can be quantified if someone pays for the count. Ditto the number of mosquito nets distributed to poor, tropical places. Ditto the number of girls who learn to read and don’t marry before 17.


OK. And what does quantifying this actually tell you?


Whether particular governments, policies and weather events have caused suffering. I don’t need an expert to tell me that human suffering is bad. That’s pretty intuitive. If “suffering is bad” isn’t intuitive to you, I’m not sure what more I can say.


But what’s that got to do with prioritizing charitable donations?


Buy mosquito nets, vaccines and basic foods

Nov 17, 2022·edited Nov 17, 2022

Why? Why is suffering the be all and end all? Reducing suffering is good, sure, but why should that be the *only* thing we care about? Why should it be the *exclusive* good we advance? There is more to life and more worthy of our contribution. This reductionism makes no sense, or at least is based on extremely subjective premises.


What gets measured gets managed. Quantify suffering, and you have a tool to fight it. We all have limited resources in our life -- money, effort, connections -- and we must be efficient in how we expend them. You can spend your resources poorly and have little to show for it, or you can spend them wisely and leave the world a better place. That's it, that's all EA is.

Nov 17, 2022·edited Nov 17, 2022

Forgive me, but this is all nonsense. There is no way to "quantify suffering". Suffering is a feeling, by definition subjective. There is also no reason why fighting it should be our only goal. Nor is it at all clear to me why "efficiency" is put on such pedestal. A whole bunch of very questionable assumptions, to put it politely.


“your suffering is just a feeling i can’t quantify” is a step away from “your suffering doesn’t matter, fuck off”


Of course, quantifying suffering is a step away from 'your suffering is quantifiably less than Jones's, so fuck off, I'll help him instead of you.'


only if quantifiable things are the only thing you care about. Circular reasoning.


Obviously it's difficult to quantify, but here are some things you might believe: It's bad when a child dies. Being healthy is better than being sick. It would be better if fewer people went to bed hungry. I want more people to be rich enough to sleep under a real roof.

Is it "subjective" when a child starves to death? Is it "unquantifiable" when a subsistence farmer starts fertilizing their crops and makes enough money to buy livestock? You'll forgive me for thinking it's not.

It's hard to know how to make the world a better place, but we can try. I won't accept an attitude of giving up just because it's hard to quantify things.


That's not at all what I'm saying. Reduction of *poverty* or improvement of *health* are objective, quantifiable, worthwhile causes. Notice, however, that these are not the same as "reducing suffering." We know they definitely *correlate*, but we cannot quantify the latter, and the distinction is important. The subjectivity, however, comes in prioritizing different goals. Feeding the hungry is good, but improving their literacy is also good. Helping people in Africa is good, but so is helping people in your own country and your own neighborhood. Giving people enough food to live is very good, but giving them access to art and beauty is also good, and *also* makes the world a better place.

I totally agree that we should strive to make the world a better place, but let's not delude ourselves that 1. we will be in 100% agreement on what a better world looks like (some things would be easy to agree on, some not) or 2. that the prioritization could be done in a fully objective, let alone quantifiable, way. Yes, within specific, subjective goals, certain quantifiable measures can be helpful, but their role is only *part* of the story. That is all I'm saying.


Are you willing to deny entirely that some things are better than other things?


No, but I do deny that the answer about what is better is always quantifiable.


profits can be quantified! shall they be our lodestar?


Well, it helps us determine what expenditures of money are most efficacious at reducing this suffering.


perhaps, but it doesn't help you balance that against other goals such as supporting a museum or tutoring the kid next door.


Right. EA is an argument that we should care more about reducing that suffering than about supporting a museum. A lot of people find that argument compelling. You may not find it compelling, but why do you object to people making the argument? I am certain museums will not go away because of EA's influence.


I’m objecting to its clearly exclusivist thinking and the dangerous naïveté of thinking everything in the world must be quantifiable or else is unimportant. And I think this way of thinking is already doing serious damage.


I am a person incredibly skeptical (more like cynical) of the entire EA movement. SBF is a sociopath, and I would guess many who are involved with EA are not in it for good reasons. But even I would concede the *premise* is sound. Trying to maximize the impact of charitable investments is just a sensible thing to do.


EA is an outstanding case of overthinking. We're in real angels dancing on the head of a pin territory here. Want to make the world a better place? Give money to or volunteer for charitable causes you believe in. Try to give more than you have in the past. The rest is commentary; now go do it.


Are you suggesting it’s not possible to evaluate the relative effectiveness of a dollar spent? Or that we can’t compare actions in the real world against each other to see what is probably better?

To me, that’s all EA is - attempting that kind of analysis with the goal of increasing the efficiency of charity.


To repeat a comment I just made, I do think there's great value in evaluating how dollars are spent *within each individual cause*, and that we need more of that assistance. Where I get off the train is when it's suggested that there's an objective way to prioritize some causes over others.


That's only true if you define causes very broadly. One cause might be saving human lives. Within that goal, it's clear that buying bednets is better than donating to a homeless shelter in the US on a lives-per-dollar basis.

But if you're saying that it's impossible to compare the causes of "lives saved around the world" with "lives saved in the US", then we have a fundamental moral disagreement.


Sort of, but I think it depends on how we categorize causes. Are preventing malaria and malnutrition in sub-Saharan Africa different causes, or both part of the same cause of helping people in sub-Saharan Africa?

But you're right in that it does get tricky when weighing things like whether we should spend more money on democracy promotion or economic development vs. trying to help with more acute problems like malaria. I don't know that there is one true answer to that. But surely it's not bad if people argue about what is more important, right? How else would individuals decide?

It seems like some of these arguments assume that there is some EA ministry of charity which compels people to donate money to the charities it chooses. But in reality, what is happening is mostly that people are providing information and arguments about the best way to deploy resources, and donors can decide for themselves what they believe is best.

Nov 17, 2022·edited Nov 17, 2022

As noted before, any evaluation depends on goals and assumptions, so I do not buy the argument that it's all 100% subjective and that there is no point in evaluating various different options.

I don't look at it any differently that evaluating where to spend tax dollars. You can definitely evaluate and prioritize the benefits of defense spending vs social safety net spending, to name one example. It's not entirely subjective once you consider goals and your general priorities and it's no different with spending money on charity or anything else.

Edited to add:

Here's an example. Suppose my goal was to ameliorate the effects of poverty in my community. What is effective for that would be different than if my goal was, instead, to ameliorate the effects of poverty in Africa or globally. You can evaluate options against specific goals in ways that aren't completely subjective.


Agreed. The subjectivity, however, is in the goals themselves and in how to prioritize them. Effectiveness is a tricky one, though, especially if measured narrowly under the assumption that everything can be quantified, which is not the case. In other words, we can and should think about whether our money is used effectively to advance a certain cause we subjectively favor, but not expect that the answer will be entirely quantifiable. There are other ways to demonstrate it, more traditional ways donors are shown results: meet the choir you funded and hear them sing, learn about the research you funded, tour the new exhibition in the museum you became a member of, etc. These complement the financial reports or what have you.

Second, money donations are not the only way to contribute, and another fallacy is that volunteer work can be quantified.


Tax dollars are a good example, because in a democracy the way they're spent is determined by the rough aggregate of collective opinion as to what those tax dollars should be spent on.


Sure, but we still analyze policy based on how effective it might be and compared to other choices, and that analysis informs the democratic process.


And we as a society have plenty of disagreements on what the most effective way to spend is--or even whether it should be spent at all, instead of leaving it with the citizens. That drives a big chunk of the political policy debate.


Precisely!

Nov 17, 2022·edited Nov 17, 2022

I mean, there's an entire field of how we trade off subjective experiences against real resources: economics. Now, any decent economist will tell you there's significant danger in taking concepts like revealed preference and utility functions too literally. EA folks, especially longtermists, seem to have gone too far there. However, there's still some value there that I think altruistically minded individuals and institutions should take more seriously on the margin.


On the margin, perhaps. The EA folks seem literalist and totalizing (?) in their approach, though.


I wish I could upvote this many more times than once.


Crypto has never been an effective store of value; it’s as volatile as paper money during a major war. That so many smart people have taken such a ludicrous idea seriously makes me doubt faddish one-percenters when they talk about climate change, AI risk and gender.


Of course you should doubt faddish one-percenters on those issues, but there are professionals studying each of those, and hardly any of them is a one-percenter*…

(*With the possible exception of AI experts, as I hear the field is extremely lucrative. However, they are not your typical one-percenter at any rate.)


I don’t trust climate scientists making catastrophic predictions of climate change for the same reason I don’t trust investment advisors to opine on the solvency of social security and don’t trust surgeons to decide who should have a vagina installed. If climate change were a moderate, manageable risk, there wouldn’t be much need to keep training and paying hordes of climate scientists.


The catastrophic predictions are doomer leftists who have an obsession with proving that climate change is the ultimate, fatal contradiction of capitalism.

The actual science makes predictions more like 'vulnerable areas of the world where survival is already a marginal affair will continue to decline in habitability, triggering a severe refugee crisis' not 'the Earth will burn in the cleansing fire of nature's righteous justice'


Hot parts of India might become really hot. Isn’t air conditioning the least disruptive fix?


Air conditioning can’t save plants and wildlife


I guess it depends on your view of how feasible "create power generation infrastructure with enough redundancy and peak capacity to handle demand on 110 degree days with 95% humidity, while also building out and maintaining AC units" is. i mean I live in Florida so I know it is in theory possible but I'm not convinced places like Bangladesh or India's poorer states can pull it off.


"triggering a severe refugee crisis" is already beyond scientific assessment into the realm of social-science educated guess, but other than that you're spot on.


They’re not actually making catastrophic predictions though


I was initially skeptical of forcing a connection between the FTX affair and the EA philosophy, but this piece helped me grasp something more clearly that’s always bugged me about EA and utilitarianism. To be clear, I’m pretty pro-utilitarianism, not in an “ends justify the means” kind of way but in a “the way things actually turn out in the end matters more than what you thought would happen or hoped would happen” way.

And what’s always been suss about the EA vibe is the way some people used the logic to justify earning as much money as possible, so long as they felt the good they could do with the money would outweigh the bad of whatever their day job was. (I’m thinking of hedge fund managers, for example.) But this whole logic is based on an extreme amount of hubris: that you have the capacity to fully play out the likely consequences of all your actions, and that you are free enough from bias to accurately calculate good vs. bad and redirect your actions if the balance goes off.

I’m pro-rationality, but I’m skeptical of rationalists whose philosophy is based on the claim that people can be rational all or most of the time. In short, human morality evolved to give us easy rules and shortcuts that avoid the need for extreme mental calculus that’s susceptible to self-serving biases. Anyone who thinks they can or should make it up from scratch themselves just hasn’t been properly humbled (yet).

All that said, I think 95% of EA is on the right track, and I agree with all of Matt’s numbered points above!


Great comment! I have similar feelings.

I’ve been rationalist-curious for quite a few years, but am often irritated by the hubris you describe as well as the notion that “good” can be quantified and ranked with anything like certainty. Trying to be effective is good; thinking that you have found the one true path is delusional.


When people start sounding like prescriptive Marxists (i.e. revolutionaries) or End Times enthusiasts I get really suspicious. Waaayyy more uncertainty in those views than the proponents are willing to admit.


The parallel I'm finding most useful is Ayn Rand framing her ideas as "objectivism," as if everything she believed in was obviously reasoned fact.


That makes SBF the EA/consequentialist version of a trolley-car problem. :-)


I hope people have been making memes of a trolley running over stacks of investments in recent days.


Made this just for you, Trees... https://imgflip.com/i/715pg2
