Thinking, writing, and tweeting from Berkeley, California. Previously, I ran programs at the Institute for Law & AI, worked on the one-on-one advising team at 80,000 Hours in London, and was a patent litigator at Sidley Austin in Chicago.
The relationship between how fun your movement is for participants and its overall effectiveness is non-linear. You need to offer selfish rewards for (most) people to join. Offer too few selfish rewards and you'll have a small, ineffective movement no matter how good your ideas are or how high the quality of the few people you do have.
I agree that entertaining EAs has no terminal value, but it has huge instrumental value. Few people seem to be trying to make EA interesting and fun on that score. People's main experience with EA seems to be getting pitched on jobs and then not getting them. Not fun!
There's real work to be done getting people excited to earn to give and spread the ideas in their spare time. Done well enough, it can even improve your direct work talent pool!
I'm surprised at the "on during presentations, off during 1:1s" advice. My intuition is the opposite because of the volume of droplets and aerosols a speaking person directs right at you in a 1:1. That seems more dangerous than sitting in a quiet room with many people breathing lightly through their noses, not directed at anyone. If you do all your 1:1s outside, I can see how this flips, but maybe the recommendation should say it depends on that.
This is assuming you go to 3-4 presentations and have ~20 1:1s.
The real solution is of course for ASB to provide us with 500 of those chlorine misters.
I keep coming back to Yeats on this topic:
"The best lack all conviction while the worst are filled with passionate intensity"
I think the exceptionally truth-seeking, analytical, and quantitative nature of EA is virtuous, but those virtues too easily translate into a culture of timidity if you don't consciously promote boldness.
Conceptually, Julia Galef talks about pairing social confidence with epistemic humility in *The Scout Mindset*. It doesn't come naturally, but it is possible and valuable when done well.
Right now I think Nicholas Decker is a great embodiment of this ethos. He says what he thinks without fear or social hesitation. He's not always right and he flagrantly runs afoul of what's considered socially acceptable or what a PR consultant would tell him to do, but there's no mistaking his good-naturedness and his self-assurance that he's on the right side of history, because generally *he is.* He doesn't make the perfect the enemy of the good or excessively play it safe to avoid criticism.
A "bring on the haters" attitude is in fact more welcoming and trust-inducing than words carefully crafted to minimize criticism because it defeats the concern that you're hiding something. And come on friends, the stuff you're "hiding" in EA's case – veganism, shrimp, future generations, etc. – is nothing to be ashamed of. And when you soft roll it, you're endorsing the social sanction on these things as weird. Fuck that. Hit back. With grace. And pride. Fire the PR consultant in your head.
I'd be doing less good with my life if I hadn't heard of effective altruism
Subjectively, I think I have done a lot more good because of EA, but I have doubts about the potentially negative sign of AI safety work (have we contributed to AI hype? Delayed beneficial AI? Something else?) and about cluelessness. On cluelessness, my alternative was a very causally minimal life, close to ~never leaving the house. I still don't leave the house, but much of what I send out of it over the internet is more effectual than it would have been otherwise.
Re commuter schools, the argument seems just as strong in principle because the would-be organizer's opportunity cost is proportionately lower. In practice, if that organizer is reading this post, there's a good chance they're a big-enough fish in a small-enough pond that they should focus on their individual development, so your point might hold nonetheless.
Maybe something that spans all the cruxes here is that there are very low-effort ways to run a group and capture a big part of the value. If no one else is doing it, it's very worth it to text the 3-4 interested people you know and substitute a group meeting for a general hangout once a month.
4-6 seem like compelling reasons to discount the intersection of AI and animals work (which is what this post is addressing), because AI won't be changing what's important for animals very much in those scenarios. I don't think the post makes any comment on the value of current, conventional animal welfare work in absolute terms.
We wanted to start our new channel with a compelling story that viewers can sink their teeth into, and that a wide audience would have reason to watch, even if they don’t yet know who we are or trust our viewpoints yet. (We think a video about “Why AI might pose an existential risk”, for example, might depend more on pre-existing trust to succeed.)
Can you say more about this? I largely have the opposite intuition: that presenting a specific set of empirical predictions (indeed "a story") requires more – rather than less – trust in the presenter as compared to a more abstract model with its assumptions and alternative explanations explicitly stated.
Really nice post and completely resonates with my experience.
Dealing with squeaky wheels is especially challenging once you're in a position where it's known you have lots of good connections. There are just a lot of people very committed to their own advancement for its own sake, who will nominally jump through whatever hoops you put in front of them even though you doubt they'll put the connections they want to good use for the world. Getting good at saying no without being mean or cagey is a real skill.
As Huw says, the video comes first. I think this puts almost anything you'd be excited about off the table: factory farming is a really aversive topic for people, and people are quite opposed to large-scale WAS interventions. The intervention in the video he did make wasn't chosen at random. People like charismatic megafauna.