There is a growing gap between the pace of technological change and society's ability to reason about it. I think this gap is now a primary bottleneck for making good decisions about AI, institutions, and human identity — and EA could be doing significantly more to address it.
Consider who is currently dominating the public conversation about technology and its consequences.
LinkedIn thought leaders telling you AI will 10x your productivity. Podcasters with confident 45-minute takes on superintelligence. Techno-optimists whose argument is essentially "trust the builders." And on the other side, a simplified AI doom discourse that has been memed and diluted into something that generates anxiety without producing understanding.
This is the dominant intellectual infrastructure through which most educated, engaged people — including policymakers, educators, journalists, and future EA-adjacents — are forming their views about the most consequential technological transition in human history.
The quality of reasoning at that layer is poor. EA has done real work here — MacAskill's books, 80,000 Hours, Ord's The Precipice, Future Perfect — and that work matters. But there is a specific layer that remains underleveraged: serious, rigorous, readable philosophy of technology for the informed general reader.
That layer is undersupported. And supporting it more is probably higher-leverage than EA currently treats it.
This Is a Philosophy of Technology Problem
The issue is not a lack of information, but a lack of good frameworks — the conceptual tools that determine what questions people even think to ask.
Most people engaging with AI right now are reasoning with borrowed equipment. They're applying frameworks built for previous technological transitions — the printing press, the industrial revolution, the internet — to something that may not resemble any of them. The result is that entire categories of important questions don't get worked on at all.
- What is authorship when the line between human and machine expression is genuinely unclear?
- What is expertise worth in a world where the appearance of expertise can be generated instantly and cheaply?
- How should we think about the democratic legitimacy of decisions increasingly mediated by systems nobody fully understands?
- When does technological acceleration outpace a society's capacity for meaningful consent, and what do we do when it already has?
These are the load-bearing questions underneath every serious policy debate, every institutional decision, every individual choice about how to live and work right now. And the people making those decisions are mostly reasoning about them with frameworks absorbed from podcasts and LinkedIn op-eds that handle these questions poorly.
Answering those questions requires philosophy, along with writers who can take ideas won from technical, jargon-heavy papers and briefs and bring them to a general but intellectually serious audience.
A Career Pipeline EA Is Missing
There's a second problem worth naming. EA is genuinely good at creating career pathways for technical AI safety researchers, policy analysts, and operations people. It is considerably less good at creating meaningful career pathways for people with philosophy and humanities training — exactly the people best equipped to do this kind of work.
Every year, sharp people graduate with philosophy degrees, care deeply about technology and its consequences, and find almost no institutional home that takes their skills seriously and pays them to apply those skills to important problems. Many drift toward adjacent careers that don't fully use what they're good at. Some leave the orbit of EA-adjacent thinking entirely.
This is a talent waste. And it's partly a structural problem — EA hasn't built many venues or institutions that create career capital for early-stage philosophy graduates working on philosophy of technology.
What Better Argument Is
Disclosure: I’m the publisher of Better Argument and therefore have a direct stake in this.
Better Argument is a new print magazine from the Institute for Classical Dialogue, based in Santa Fe. Its explicit mission is building that missing layer: applying classical philosophical methods to the transformations taking place now.
It is not an EA publication. It is something arguably better positioned: a serious intellectual venue aimed at the informed general reader, at a moment when that reader is being served mostly by LinkedIn and Joe Rogan.
But it is also, practically, a career infrastructure project. The ~20 writers currently working with Better Argument are building real publication records, developing their public voices, finding intellectual mentorship, and establishing themselves as serious thinkers at the intersection of philosophy and technology — at exactly the moment when that intersection is becoming one of the most important places to be. For early-stage philosophy graduates trying to build career capital in this space, this is a rare and concrete opportunity.
The neglectedness case is straightforward. EA has spent millions on technical AI safety. Better Argument's fundraising goal is $10,000. The counterfactual impact of marginal support here — on an intellectual infrastructure project that could reach and shape exactly the audiences EA has struggled to reach, while creating career pathways for exactly the people EA has struggled to absorb — is unusually high.
What I'm Asking
Three things:
Read it. The inaugural edition is $11.99. If you care about how ideas about technology propagate into broader culture — and you should — this is worth your time.
Write for it. We accept rolling submissions. If you have ideas about philosophy of technology that need more room and a different audience than a forum post, this is a real venue with a real editorial process.
Spread it. Particularly to the sharp, curious people in your life who aren't in EA but are trying to make sense of what's happening. Better Argument is built for them — and bringing more of them into serious thinking about these problems is something EA should want.
The gap between technological change and our ability to reason about it is getting wider. Closing it requires better intellectual infrastructure and more people with the right skills working on it. Better Argument is building both.

Direct call for articles on the following:
[visit better-argument.com/writers for details]
Economics & Political Economy
Technology, Platforms & Incentives
Education & Knowledge
Work, Identity & Intellectual Life
Governance & Power
Ethics & Human Meaning
Culture & Society