This past weekend, I attended Effective Altruism Global x Berkeley, one of the small effective altruism conferences that the Centre for Effective Altruism — the main leadership group of the EA movement — runs all over the world.
I attend two or three EA Global events a year, but this one felt more than a bit different. It was the first such meeting since the collapse of Sam Bankman-Fried’s multibillion-dollar cryptocurrency empire hit the effective altruism movement like a bombshell last month.
Bankman-Fried was perhaps EA’s highest-profile champion, and had pledged to give his wealth (once estimated as high as $32 billion) to EA causes, with a focus on longtermism and pandemic prevention. Now, his company is in bankruptcy proceedings, something like $8 billion in customer deposits is missing, and the Securities and Exchange Commission and US Department of Justice are investigating.
(Disclosure: This August, Bankman-Fried’s philanthropic family foundation, Building a Stronger Future, awarded Vox’s Future Perfect a grant for a 2023 reporting project. That project is now on pause.)
Effective altruism has come under intense criticism in the wake of the FTX debacle, including from those involved with the movement. Josh Morrison, who founded an organization that promotes challenge trials for vaccines, told the New Yorker writer Gideon Lewis-Kraus in a piece last week that “the E.A. community disregarded the risks of tying itself to an aggressive businessman in a lawless industry.”
So I wondered: What will EAG x Berkeley be like? Is the movement shell-shocked? Angry? Confused? Where is effective altruism headed?
I’ve been involved with EA for 10 years, so take this with a grain of salt, but my overall impression from the conference is that the EA movement is more resilient than I’d been giving it credit for. It’s not that FTX wasn’t on people’s minds; I gave a talk discussing what I’ve observed around the crisis and it was well-attended, reflecting the intense interest in the matter. But the attendees, one got the sense, were a bit too busy to obsess over cryptocurrency. They wanted to focus on the work.
I talked to people who advised on national Covid-19 response, to the founder of a maternal health nonprofit working to improve women’s access to contraception options, to people developing new tools for understanding the behavior of powerful AI systems. I moderated a panel on nuclear risks and the state of efforts to mitigate them, and when the panel concluded, the panelists were surrounded by college students eager to know more about how to get into a career in nuclear risk. In other words, it was precisely what usually happens when I attend an EA conference.
Some lessons from a crypto disaster
But the question remains: What could a movement focused on effective charity have done about one of its biggest donors and most public adherents blowing up his company, potentially leaving hundreds of thousands of customers high and dry?
Substantially more than it did — though less, perhaps, than many of its members would like to think.
In my talk, I argued that external critics, whether from EA or anywhere else, probably weren’t going to catch Bankman-Fried’s shady business practices, which he managed to keep secret even from the legal and compliance teams, and most of the employees, at his own company. But critics could have helped ensure that stories of Bankman-Fried’s past accounting incompetence and unreasonable appetite for risk got out, as Lewis-Kraus charged in his piece. And even if rumors that he was ruthless or untrustworthy wouldn’t have taken Bankman-Fried down, they might have made some of his customers warier.
EA advocates could have pushed back harder on the wisdom of EA affiliating itself so closely with Bankman-Fried, and asked more questions — as my colleague Dylan Matthews did — about whether running Super Bowl ads for what was effectively a gambling company was morally okay even if it had been all on the up-and-up.
Those are the concrete issues, but the Bankman-Fried saga also throws some more abstract ones into stark relief.
A tiny movement of idealistic, frugal young people putting their giving toward malaria nets — as EA did in its earliest days — doesn’t need to grapple with big questions about political power, metaethics, risk appetite, or how their message might be interpreted by extremists or provide cover for wrongdoing.
The movement that effective altruism is today — big, well-funded (if less well-funded than it was before FTX’s fall), and working in many more muddy waters — cannot avoid these questions.
EA as a formalized concept is barely more than a decade old and hasn’t yet learned all of the cautionary lessons it will need. Current events are a painful introduction to just how much damage can follow when things go wrong.
What’s ahead for effective altruism
I do think that the effective altruism movement and its leaders and community institutions — especially the ones that trusted and vouched for Bankman-Fried, such as Oxford philosopher Will MacAskill — are facing a reckoning, with effects that have yet to be fully felt.
But I ultimately believe that EA — meaning the people doing the work, if not necessarily a leadership that appears to have let them down — will emerge stronger from this moment. Effective altruists are still driven to identify the most important, underserved, solvable problems in the world, and then do something about them. It’s a powerful motivator that doesn’t depend on the credibility of EA organizations or EA-associated billionaires. The young people I talked to this weekend didn’t want EA money; they wanted advice on how to live the most impactful lives possible.
People show up to these things because they like the message that they, through their work and donations, can tackle whatever matters most. It turns out it’s deeply important to people to do that, and EA is where many of them gather to do it. Faced with a choice between a painful moral and institutional reckoning and quitting to go work a normal job, a lot of people will take the reckoning.
A version of this story was initially published in the Future Perfect newsletter.