EA - FTX, 'EA Principles', and 'The (Longtermist) EA Community' by Violet Hour

The Nonlinear Library: EA Forum - A podcast by The Nonlinear Fund

Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: FTX, 'EA Principles', and 'The (Longtermist) EA Community', published by Violet Hour on November 25, 2022 on The Effective Altruism Forum.

1. Intro

Two weeks ago, I was in the process of writing a different essay. Instead, I'll touch on FTX, and make the following claims:

I think 'the principles of EA' are, at the community level, indeterminate in important ways. This makes me uncertain about the degree to which we can legitimately make statements of the form: "SBF violated EA principles".

The longtermist community — despite not having an explicit, widely agreed upon, and determinate set of deontic norms — nevertheless contains a distinctive set of more implicit norms, which I believe are worth preserving at the community level. I thus suggest an alternative self-conception for the longtermist community, centered on striving towards a certain set of moral-cum-epistemic virtues.

Section 2 discusses the first claim, and Section 3 discusses the second. Each section can probably be read independently, though I'd like it if you read them both.

2. 'EA Principles'

This section criticizes some of the comments in Will's tweet thread, published in the aftermath of FTX's collapse.

I want to say at the outset that, while I'll criticize some of Will's remarks, I recognize that expressing yourself well under conditions of emotional stress is really, really hard. Despite this difficulty, I imagine that Will nevertheless felt he had to say something, and quickly. So, while I stand behind my criticism, I hope it can be viewed as an attempt to live up to ideals I think Will and I both share: frank intellectual honesty, in service of a better world.

2.1.

From Will's response:

"If those involved deceived others and engaged in fraud (whether illegal or not) that may cost many thousands of people their savings, they entirely abandoned the principles of the effective altruism community." (emphasis mine)

Overall, I'm not convinced. In his tweet thread, Will cites various sources — one of which is Holden's post on the dangers of maximization, in which Holden makes the following claim:

"I think 'do the most good possible' is an ... important idea ... but it's also a perilous idea if taken too far ... Fortunately, I think EA mostly resists [the perils of maximization] – but that's due to the good judgment and general anti-radicalism of the human beings involved, not because the ideas/themes/memes themselves offer enough guidance on how to avoid the pitfalls."

According to Holden, one of EA's "core ideas" is a concern with maximization. And he thinks that the primary way in which EA avoids the pitfalls of its core ideas is by being tempered by moderating forces external to those core ideas themselves. If we weren't tempered by moderating forces, Holden claims that:

We'd have a community full of low-integrity people, and "bad people" as most people define it.

Here's one (to me natural) reading of Holden's post, in light of the FTX debacle. SBF was a risk-neutral Benthamite, who describes his own motivations in founding FTX as the result of a risky, but positive expected value, bet made in service of the greater good. And, indeed, there are other examples of Sam being really quite unusually committed to this risk-neutral, Benthamite way of approaching decisions.
In light of this, one may think that Sam's decision to deceive and commit fraud may well have been more in keeping with an attempt to meet the core EA idea of explicit maximization, even if his attempt was poorly executed. On this reading, Sam's fault may not have consisted in abandoning the principles of the EA community. Instead, his failings may have arisen from the absence of normal moderating forces, which are external to EA ideas themselves.

Recall Will's statement: he claimed that, conditional on Sam committing fraud...
