I run the Centre for Exploratory Altruism Research (CEARCH), a cause prioritization research and grantmaking organization.
Yep, and that's about 5x as cost-effective as existing GiveWell top charities. Also, GiveWell has made a $700k grant to Resolve to Save Lives (RTSL) for their salt reduction work in China; CEARCH previously evaluated and recommended RTSL for their salt policy work in Southeast & South Asia, and I'm very excited to see GiveWell make this exploratory grant.
One minor suggestion I have is to do annual donations (i.e. sit down at the end of the year / whenever you do your taxes, and decide how much disposable income you have and how much you want to give). It feels more special that way; you get more invested (if you spend time looking at potential charities recommended by GiveWell and other charity evaluators), and of course it's less of a hassle than deciding monthly.
The tractability of this will be terrible - you'll be trying to persuade poor countries to limit the quality, quantity, and variety of food available to their own people, and you can imagine how that will go. Additionally, there may be human costs in terms of nutrition and economic growth.
If you do want to improve AW in LMICs, Innovate Animal Agriculture-style work that tries to shape the kind of industrial animal agriculture that gets built seems far more tractable (in fact, it's probably more tractable there than in HICs, since e.g. there are no sunk costs in equipment and capital).
Hi Vasco,
We have standard GCR discounts when estimating long-term impact, but for AI we're generally more sceptical, both on paperclipping and on extremely rosy projections of economic growth. In any case, while there might be a theoretical case for discounting income-based interventions (if you really believe GDP growth is going to jump to 10% per annum, we're obviously moving up the diminishing marginal returns (DMR) curve more rapidly), there's much less direct impact on health. In fact, if you think income is going to drastically increase, that makes consumption greater and the DALY burden of diseases of affluence much worse, and hence preventing them more cost-effective.
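To make the DMR point concrete, here's a minimal sketch (my own illustration with placeholder numbers, not CEARCH's actual model), assuming log utility in consumption: the marginal value of an income transfer falls in proportion to how much consumption has grown, whereas nothing in this logic discounts the DALY burden of disease.

```python
# Illustrative only: with log utility u(c) = ln(c), marginal utility is u'(c) = 1/c,
# so the value of a $1 income transfer shrinks as consumption grows.
# Faster GDP growth therefore discounts income-based interventions,
# but does not by itself discount health burdens.

def marginal_value_of_transfer(baseline_consumption, growth_rate, years):
    """Marginal utility of $1 after `years` of consumption growth, assuming u(c) = ln(c)."""
    future_consumption = baseline_consumption * (1 + growth_rate) ** years
    return 1 / future_consumption

c0 = 1_000  # assumed baseline annual consumption in USD (placeholder)
for g in (0.02, 0.10):  # 2% business-as-usual growth vs a 10% AI-boom scenario
    ratio = marginal_value_of_transfer(c0, g, 20) / marginal_value_of_transfer(c0, g, 0)
    print(f"growth {g:.0%}: $1 of income in 20 years is worth {ratio:.2f}x its value today")

# Output: ~0.67x at 2% growth vs ~0.15x at 10% growth -
# i.e. rapid growth moves you up the DMR curve much faster.
```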
The percentage of EAs earning to give is too low
Money is the major bottleneck for high-impact charities, at least in GHD and AW (and some GCR causes). Meanwhile, demand for EA jobs far exceeds the supply of them (everyone knows it's extremely competitive). Hence, the marginal value of getting more money in via earning to give (loosely defined - not everyone needs to be in finance or tech) is probably higher than trying to squeeze into a direct role where replaceability is extremely high.
On the object level I agree, though ironically I think we would agree this also falls into the entertainment category (as evinced by me and a whole bunch of other people posting on this).