I have some claim to be an “old hand” EA:[1]
- I was in the room when the creation of Giving What We Can was announced (although I vacillated about joining for quite a while)
- I first went to EA Global in 2015
- I worked on a not-very-successful EA project for a while
But I have not really been much involved in the community since about 2020. The interesting thing about this is that my withdrawal from the community has nothing to do with disagreements, personal conflicts, or FTX. I still pretty much agree with most “orthodox EA” positions, and I think that both the idea of EA and the movement remain straightforwardly good and relevant.
Hence I describe the process as “senescence”: intellectually and philosophically I am still on board and I still donate, I just… don’t particularly want to participate beyond that.
What does the community have to offer me?
I won’t sugar-coat it: the main reason I don’t engage so much with EA these days is that I find it boring. What drew me to EA was the ideas, and the idea of being able to make an impact. It was interesting and fun to talk to people about it, and there were often new and interesting things happening. Even as a non-specialist community member, there was often a lot you could contribute. I don’t feel like that today, and it's hard to see myself as having more impact than I do by donating.
There are two main sub-strands to my lack of interest, one more apparent online and the other in person.
Online: much of the content on the EA forum is quite specialized. I, in principle, absolutely love it that people are writing 10,000-word reports on shrimp sentience and posting them on the forum. That is what actually doing the work looks like - rather than speculating at a high level about whether shrimp could suffer and if so what that would mean for us, you go out and actually try to push our knowledge forward in detail. However, I have absolutely no desire to read it.
I think it’s telling that the main things I have commented on or posted about on the Forum since 2017 are all “community” stuff - i.e. things that don’t require any context to have an opinion about!
Is this a me problem? Maybe! Some people love to consume this stuff and I admire them. But also it’s unsurprising that people doing more focussed work on sub-areas will start to exclude people who aren’t focussed on that sub-area, through the gradual build-up of required context; lack of time; or just the normal inability to care about too many specific things.
In some ways it felt like the early EA community was like the old scientific Royal Society. Someone would come in and demonstrate the relationship between heat and pressure in gases - awesome! Then another person would show you what lungs looked like under a microscope - cool! Whereas now it’s a bit more like modern science: fields are much deeper (which is good!) but it is harder to maintain an interest in many of them.
Offline: a lot of what goes on in in-person EA meetups is, quite reasonably, somewhat introductory. There is a lot of emphasis on bringing in and engaging new people. This is all excellent! But I don’t have much interest in it myself: neither in the content nor, to be honest, in projecting the charisma needed to make people feel welcome and want to come back.
There are the conferences, where at least the content is a bit more novel. But the general vibe of the conferences is that they’re for people who are highly engaged and expect to be re-directing large fractions of their energy depending on what happens. And that no longer feels like me.
In the past what mostly kept me attached to the community was the social life. I knew enough people that I always enjoyed going to events for that reason alone, and there were enough parties that I could meet new people as well. The combination of moving cities and COVID more or less broke that connection for me, alas.
What do I have to offer the community?
The other side of this picture is that I don’t feel like I have much to offer the EA community these days:[2]
- I’m not deep into any particular areas of active work, so I am neither producing content nor terribly useful at reviewing it
- I do not have the spirit to throw myself into a direct work project[3]
- I don't have any particular advantage in working out where to donate over, say, the EA Funds
- I am unlikely to drastically change my career at this point
- I just have less energy than I did when I was younger
- I am not generally the one hosting all the parties!
As a senescent older EA, my virtues are pretty much that I exist, I have some earning capacity, and I’ve been through a bunch. Is that useful? I’m unsure. Perhaps my comparative advantage remains continuing more-or-less as I am and funding the people who have more to offer.
I can’t help but compare EA with the intentional community of my youth: Christianity. People don’t “senesce” out of church in the same way. There’s always a reason to go - the activity itself is regarded by everyone as valuable, regardless of whether it leads to anything in particular. The community is very welcoming. There are many roles for people to contribute something, even if it’s just greeting people or doing a bit of admin.
I fear we have yet to truly refute Robin Hanson’s claim that EA is primarily a youth movement. What could a version of the EA community look like which was compelling to people throughout their entire lives?
- ^This post is partly a personal response to Will MacAskill's claim that it would be useful to get "old-hand" EAs out of the woodwork. 
- ^I'm going to bracket the obvious response of "why don't you change those things?". That's pretty reasonable, and maybe I will, but for now let's take me as I am. 
- ^I learned this from GTP. Honestly, I kind of hated doing it. 

Fwiw, GTP seems like reasonably little evidence about whether you could find a role you enjoyed that you also thought had substantial direct impact. There are so many different roles with direct impact, and setting up your own charity start-up is very different from working at an existing charity in a more defined role, or working in government, or for a large tech company.
Obviously, changing direction in your career is a substantial undertaking, and it's reasonable to not want to do it. But the reason shouldn't be that you hated doing GTP - there are a lot of other options out there that you might find yourself intrinsically motivated by. Seems at least worth a chat with 80k about?
[1] Not at all biased by how very much I enjoyed working with you at GWWC/CEA :-p
I think your experience matches what most people interested in EA actually do: the vast majority aren't deeply involved in the 'community' year-round. The EA frameworks and networks tend to be most valuable at specific decision points (career/cause changes, donation decisions) or if you work in niche areas like meta-EA or cause incubation.
After a few years, most people find their value shifts to specific cause communities (as you noted) or other interest-based networks. I think it might actually be a bad sign if there were more expectation of people being very involved as soon as they hear about EA, and forevermore.
I'd also push back on Hanson's characterisation, which was more accurate at the time it was written but less so now. The average age continues rising (mean 32.4 years old, median 31) and more than 75% aren't students.
There are people now with 15+ years of EA engagement in senior positions across business, tech and government, and there are numerous, increasingly professional and sizable, organisations inspired by EA ideas.
The methods within EA also differ markedly from those of typical youth movements: there's minimal focus on protests or awareness-raising, except where it seems more strategic within specific cause areas.
+1, this seems more like a Task Y problem.
My impression is that if OP did want to write specialist blogposts etc. they would be able to do that (probably even better placed than a younger person, given their experience). (And conversely, 18 year olds who don't want to do specialist work or get involved in a social scene don't have that many points of attachment.)
Some of what long-time EAs might have to offer (possibly but not necessarily including you) is:
I would add to the list:
According to the Meta Coordination Forum (2024) these kinds of skills will be the most valuable to recruit for across the EA movement. So if you do have some work experience in this area, you could potentially make a very big impact by working for good.
Yeah, those are somewhat reasonable. I guess then the question becomes: how to actually deploy those?
Have you checked with a nearby local EA group if they have younger people looking for mentors? I find that sometimes the youthful optimism energizes me too - like going to church!
I have not - maybe I should!
Something that came to mind while reading this was that this is probably a somewhat common (though by no means majority) experience. EA isn't about helping other EAs, though this is often an instrumental goal; it's about doing good for the world, so I'd expect some people to be a bit bored by it sometimes.
I definitely identify with where you're coming from here, but these insights might also imply a potential partner post on "How to avoid EA senescence (if you want to)".
Based on your examples, this might look like:
These are good ideas. I think I wrote this post in a bit of a negative frame of mind, thank you for the positive lens!
When I worked at CEA, a standard question I would ask people was "what keeps you engaged with EA?" A surprisingly common answer was memes/shitposts.
This content has obvious downsides, but does solve many of the problems in OP (low time commitment, ~anyone can contribute, etc.).
Just wanted to say that I enjoyed reading this and the section starting with "Online:" and your concluding question really resonated with me.
Really liked this post, and as an oldie myself (by which I mean in my 40s, which feels like quite old compared to the average EA or EA-Adjacent), I resonated a lot with it. In my case, I am not an 'old hand EA' though: I rather arrived relatively circuitously and recently (about 3 years ago) to it.
Some have commented, here or elsewhere, that because EA puts so much emphasis on effectiveness, it generally doesn't care much about community building, general recruitment/retention, or group satisfaction, and when it half-heartedly tries to engage in these, it does so with a utilitarian logic that doesn't seem congenial to the task. One could make a good case, though, that this isn't a bug but a feature: EA is a resource-optimizer with little time to waste, given the importance of the issues it tries to solve or ameliorate, on attending to a less active, talented, and effective set of people and their needs. One senses an elitist streak inevitably tied to its moral seriousness and focus on results.
On the other hand, I feel communities tend to thrive when they manage to become hospitable and nice places where people are, in different degrees, happy to be. This is what most successful movements (and religions) manage: come for the values, stay for the group.
Passion and intellectual engagement also help a lot, but these perhaps vary a lot in a way that isn't tractable. Like the OP, I find much of the forum posts dull and uninteresting, but then again, the type of person I am, my priorities, values and interests mean I am probably badly fitted to become anything more than mildly EA-Adjacent, so I don't think I'd be a good benchmark in this regard. I think Will's recent post on EA in the age of AGI does hit the nail on the head in many respects, with interesting ideas for revitalizing and updating EA, its actions and its goals. EA might never match religion’s or some group's capacity for lifelong belonging, but recognizing that limitation, and trying to soften its edges, could make it more resilient.
FWIW my impression is that CEA have spent significantly more effort on recruiting people from universities than any other comparable subset of the population.
Wow. This is my first time reading that Robin Hanson blog post from 2015.
When I was around 18 to 20 or 21, I was swept up in political radicalism, and then I became a pretty strong skeptic of political radicalism afterward — although it always bears mentioning that such things are too complex to cram into an either/or binary, and the only way to do them justice is to try to sort the good from the bad.
I think largely because of this experience I was pretty skeptical of radicalism in EA when I got involved with my university EA group from around age 23 to 25 or 26. I don't like it when ideas become hungry and try to take over everything. Going from a view on charity effectiveness and our moral obligation to donate 10% of our income to charity to a worldview that encompassed more and more and more was never a move I supported or felt comfortable with.[1]
It has always seemed to me that the more EA tried to stretch beyond its original scope of charity effectiveness and an obligation to give, which Peter Singer articulated in The Life You Can Save in 2009,[2] the more it was either endorsing dubious, poorly-supported conclusions or trying to reinvent the wheel from first principles for no particularly good reason.
I think this paragraph from Hanson's blog post is devastatingly accurate:
If you think that effective altruism has discovered or invented radically novel and radically superior general-purpose principles for how to think, live, be rational, or be moral, I'm sorry, but that's ludicrous. EA is a mish-mash of ideas from analytic moral philosophy, international development, public health, a bit of economics and finance, and a bit of a few other things. That's all.
I think the trajectory that is healthy is when people who have strong conviction in EA start with a radical critique of the status quo (e.g. a lot of things like cancer research or art or politics or volunteering with lonely seniors seem a lot less effective than GiveWell charities or the like, so we should scorn them), then see the rationales for the status quo (e.g. ultimately, society would start to fall apart if it tried to divert too many resources to GiveWell charities and the like by taking them away from everything else), and then come full circle back around to some less radical position (e.g. as many people as possible should donate 10-20% of their income to effective charities, and some people should try to work directly in high-priority cause areas).
This healthy trajectory is what I thought of when Hanson said that youth movements eventually "moderate their positions" and "become willing to compromise".
I think the trajectory that is unhealthy is when people repudiate the status quo in some overall sense, seemingly often at least partially because it fills certain emotional needs to make the world other than oneself and to condemn its wicked ways.
Many (though not all) effective altruists seem content to accept the consensus view on most topics, to more or less trust people in general, to trust most mainstream institutions like academia, journalism, and the civil service (of liberal democratic countries), and they don't particularly seek out being contrarian or radical or to reject the world.
On the other hand, this impulse to reject the world and be other than it is probably the central impulse that characterizes LessWrong and the rationalist community. EA/rationalist blogosphere writer Ozy Brennan wrote an insightful blog post about rationalists and the "cultic milieu", a concept from sociology that refers to new religious movements rather than the high-control groups we typically think of when we think of "cults". (Read the post if you want more context.) They wrote:
In a similar vein, the EA Forum member Maniano wrote a post where they conveyed their impression of EAs and rationalists (abbreviating "rationalists" to "rats", as is not uncommon for rationalists to do):
I don't know for sure what "narrative addiction" means, but I suspect what the author meant is something similar to the sort of psychological tendencies Ozy Brennan described in the post about the cultic milieu. Namely, the same sort of tendency often seen among people who buy into conspiracy theories or the paranoid style in politics to think about the world narratively rather than causally, to favour narratively compelling accounts of events (especially those containing intrigue, secrets, betrayal, and danger) rather than awkward, clunky, uncertain, confusing, and narratively unsatisfying accounts of events.
From the linked Wikipedia article:
I think seeing oneself as other than the wicked world is not a tendency that is inherent to effective altruism or a necessary part of the package. But it is a fundamental part of rationalism. Similarly, EA can be kept safely in one corner of your life, even as some people might try to convince you it needs to eat more of your life. But it seems like the whole idea of rationalism is that it takes over. The whole idea is that it's a radical new way to think, live, be rational, and be moral and/or successful.
I wonder if the kind of boredom you described, Michael, that might eventually set in from a simpler The Life You Can Save-style effective altruism is part of what has motivated people to seek a more expansive (and eventually maybe even totalizing) version of effective altruism — because that bigger version is more exciting (even if it's wrong, and even if it's wrong and harmful).
Personally, I would love to be involved in a version of effective altruism that felt more like a wholesome, warm, inclusive liberal church with an emphasis on community, social ties, and participation. (Come to think of it, one of the main people at the university EA group I was involved in said he learned how to be warm and welcoming to people through church. And he was good at it!) I am not really interested in the postmodernist cyberpunk novel version of effective altruism, which is cold, mean, and unhappy.
I think we should be willing to entertain radical ideas but have a very high bar for accepting them, noting that many ideas considered foundational today were once radical, but also noting that most radical ideas are wrong and some can lead to dangerous or harmful consequences.
Another thing to consider is how hungry these ideas are, as I mentioned. Some radical ideas have a limited scope of application. For example, polyamory is a radical idea for romantic relationships, but it only affects your romantic relationships. Polyamory doesn't tell you to quit your current job and find a new job where you convince monogamous people to become polyamorous. Or provide services to people who are already polyamorous. Polyamory doesn't tell you to have any particular opinions about politics — besides maybe narrow things like rights (e.g. hospital visitation rights) for people in polyamorous relationships — or technology or culture or the fate of the world.
When radical ideas become totalizing and want to be the axis around which the world turns, that's when I start to feel alarmed.
The Life You Can Save is an example of a radical idea — one I think we should accept — that, similar to polyamory, may affect our lives in a significant way, but is naturally limited in scope. The Life You Can Save is an expression of a simple and straightforward version of effective altruism. As people have wanted the scope of effective altruism to get larger and larger over time, that has led to the accretion of a more complicated and eclectic version of effective altruism that I think is a lot more dubious.
I agree that we probably shouldn't just defund all arts/cancer/old people charities overnight, but there are lots of causes that plausibly 'deserve' way less funding on the margin which would be better spent by GiveWell without society falling apart.
I take a Chesterton's fence sorta view here where I imagine a world which has zero arts funding and maybe that ends up being impoverished in a hard-to-quantify way, and that seems worth avoiding. But for the time being I'm happy to tell people to stop donating to Cancer Research UK and send the money to AMF instead.
Yes, there is an important difference between doing something yourself or recommending it to others (when you don’t expect to persuade the whole world) vs. a prescription for the whole world to universally follow. So, maybe it’s good to stop donating to anything but GiveWell-recommended charities and suggest the same to others, but maybe it would end up being bad if literally the whole world suddenly did this.
It’s also different to say that society’s priorities or allocation of resources, as a whole, should be shifted somewhat in one direction or another than to say, I don’t know, developed countries should abolish their welfare systems and give the money to GiveWell.
The real life example that sticks out in my mind is when someone who was involved in our university EA group talked about volunteering with seniors and someone else told her this was self-interested rather than altruistic. To me, that is just a deeply unwise and overzealous thing to say. (In that group, we also discussed the value of novels and funding for cancer research and we had people arguing both sides of each issue.)
My attitude on those things was that there is no cost to me at least taking a cautious approach and trying to practice humility with these topics. I wasn’t trying to tell people to devote every ounce of their lives to effective altruism (not that I could convince people even if I wanted to) but actually proposing something much more modest — switching whatever they donated to a GiveWell charity, maybe pledging to give 10% of their income, things of that nature.
If we were pitching the Against Malaria Foundation to a student group planning a fundraiser, then I would see my goal as persuading them to donate to AMF and if they decided to donate to AMF, that would be success. If we did a presentation like a Giving Game, I didn’t mind trying to give people a little razzle dazzle — that was the whole idea.
But if someone came to our EA group alone, though, my attitude was more like: “Here’s the idea. What do you think?” I never felt like it was for me to try to convert anybody. (Does that actually even work?) I always wanted to respect people’s autonomy and their humanity. That felt sacred to me. And, honestly, I just don’t have the stomach to give someone a hard sell. I could never be a telemarketer.
I have long had the opposite criticism; that almost everything that gets high engagement on the Forum is lowest-common-denominator content, usually community-related posts or something about current events, rather than technical writing that has high signal and helps us make progress on a topic. So in a funny way, I have also come to the same conclusion as you:
but for the opposite reason.
I sort of see where both criticisms are coming from. The lowest-common-denominator, community-related posts get the highest engagement (including from people like the OP) because they require little context. The high context technical stuff is much harder to parse, and necessarily has a smaller audience, so gets less engagement (perhaps with the exception for AI Safety, which is currently experiencing a "boom").
There will naturally be fewer technical posts in the areas I'm interested in and, like Michael_PJ, I have no desire to read long technical posts in areas I'm not interested in, so I end up engaging disproportionately with community-related posts.
Fiddling with the forum filters helps - I personally have downweighted posts tagged "Community" and "Building effective altruism" - but I suspect few people do this.
+1 to this. It's a tough one because I do think we old-timers have something to offer in terms of perspective, but I share your sense of not having much energy to engage.
I also have a stronger interest in the direct cause areas these days, than in the meta-cause of EA.
This post somewhat resonates with me, as I'm also sort of an old hand, albeit I've always been more on the periphery of EA, and sometimes consider myself EA-adjacent rather than full on EA (even though I've done a bunch of EA-ish things like donate to AMF/Give Directly and attend an EA Global).
I've been around long enough to see a bunch of the early EAs who were part of the old Felicifia forums become more or less leaders in the movement (e.g. Peter Wildeford), as well as some sorta fade into obscurity (e.g. Brian Tomasik?). It's interesting to see, and I'm happy for the former, and a bit sad about the latter.
Weirdly, I've also moved a bit further leftish on the political spectrum in recent years, and this has led me to feel conflicted about EA, as it's very much a western liberal movement, and my sympathy for socialism seems to be an awkward fit nowadays. Though, admittedly I tend to oscillate at times, so this may be temporary.
And yeah, as I've mentioned before in other comments, I do feel like the movement is more geared towards the young university elite as well.
Just some thoughts, I guess.
Are you willing to share why you hated it?
Yes, I guess I didn't go into this so much in the project post-mortem. But in short:
I think this is a problem with for-profit startups as well. Most of the time they fail. But sometimes they succeed (in the sense of “not failing” rather than breakout success which is far rarer), and in that case you’re stuck with the thing to see it through to an exit.
Hi! I really appreciate you writing this; it offers a level of thoughtful introspection that is otherwise difficult for people like me (community builders) to get insight into.
Can you imagine a group that you would feel motivated to spend time in, that would be EA affiliated? I'd love to know what aspects would be appealing and any constraints it would have to satisfy.
There seems to be a tension: EA is exciting as a sort of intellectual project, where you read things, share thoughts, and learn new stuff, but the energy to sustain that sort of involvement is difficult to maintain.
A social scene where everyone hangs out is easier to be a part of, but then - what's the point? Plus, lots of people already have a roster full of friends and family and don't need another social group.
I think if anyone is attracted to the intellectual project, but there aren't any formats that make it easy for them to engage, this might be solvable with thoughtful event design. If the intellectual project is too taxing in an otherwise busy life (work, family, other interests etc.), I'm not sure there's a way to meaningfully re-engage the old-hand EAs.
This seems plausible to me. EA should be a community organized around doing the intellectual project, not a random social club.
(On the other hand, I am doing an EA activity, namely giving away money effectively. But there's not much to talk about with that!)
So I think a very relevant question is just: is it actually useful or desirable to "retain" people like me? It depends on what we want the EA community to look like.
I strongly agree with this point. I enjoy hanging out with EA friends, but if our objective shifts from "doing good better" to having fun, then I'd probably be less involved with the community.
I enjoyed this, and I miss bumping into you on the stairs at house parties!