PhilZ

Recruiting Lead @ Open Philanthropy
117 karma · Joined

Comments (12)

I agree with a lot of this post and I'm really glad you wrote it! I would be very excited to see readers with whom this post resonates apply for the Open Phil recruiting team (which I manage). I think our team's work is extremely high-impact and we've often found it hard to hire great people onto it in the past.

One place where I slightly disagree with the post is on the value of past recruiting experience. Most of our team had ~no previous recruiting experience before joining, and that hasn't stopped them being successful in the role, so it's very far from a requirement. That said, precisely because such experience is rare on our team, a recruiting background can be a valuable differentiator for some candidates. At the risk of speaking too specifically to my past self: anyone who has a background in recruiting (in any context) and is interested enough in high-impact work to read this comment should know that their application would be extremely welcome!

Have you seen this page? It has links to a bunch of other pages that might help. The key quote is probably this one:

There isn't one main EA community in London, there are quite a few subgroups based around different causes, workplaces and interests. There are also people who have an interest but attend an event or get involved with a relevant opportunity once every few years.

FWIW, this tracks with my experience as someone based in London but doing only occasional socialising with other EAs.

Not the original responder but wanted to jump in anyway, hope that's ok! 

This seems likely to fail even in utilitarian or expected value terms. As you mentioned, the employment consequences (and/or consequences for the CCG program as a whole) would be serious, and even with a small chance of being caught, I reckon the expected value would be net negative.

But even if that weren't true, I think you should take seriously that it would feel pretty repugnant; taking common-sense morality into account alongside utilitarianism is important IMO. 

Having said all that, your efforts to find ways to do more good are commendable; thank you for thinking about this and engaging about it here! I just think you should focus those efforts on tactics that are above board. 

Hi Saramago, thanks for the question and sorry that it got missed initially! This made me curious about how Open Phil compares on this metric to the companies you mentioned, and it turns out we're actually pretty similar to Google and BCG (slightly fewer positive and negative responses, slightly more neutral responses). We also run an internal candidate survey, which shows a broadly similar picture. So I think we're doing some things right, but I agree there are pain points as well:

  • Our main sources of negative feedback are the length of our process and the lack of feedback. The former involves difficult tradeoffs against making sure we're hiring the best candidate from each hiring round, but we have been placing higher weight on it in recent months and looking for ways to shorten the process, e.g. by using one work test rather than two where feasible. The latter is difficult to solve at scale without huge investments of staff time, but where we can, we're aiming to provide more generalised feedback that's hopefully still actionable for candidates.
  • We've also invested in setting out clearer timelines and communicating these with candidates from near the start of our processes, and have found that this often mitigates many of the negative impacts of longer round timelines.

Finally, I would love to hear if you or anyone else has been put off from applying to Open Phil by perceptions about what the hiring process will be like, either due to Glassdoor or otherwise – this kind of data is extremely valuable and really hard to get! 

Thanks for doing the AMA, Tom! How is EA perceived in UK journalism circles (if it's perceived at all)? What did people in those circles make of you writing a book on GCRs? 

Thanks for the question Christoph! I’ll start with my takes on the examples you listed and then pivot to some broader thoughts:

  • Being at one company for a long time: this really depends on the specifics. The main reasons I might see this as a negative signal are a) if I worry that the long tenure indicates that someone will find it difficult to translate their experience to a different working environment or b) that they haven’t been successful in their career and have been stuck in one role for a long time (if they haven’t switched roles as well as not switching companies). I think these can both be real concerns, but there are often mitigating factors too, and frequent job-hopping can also be a relevant concern. The question of ‘would staying or leaving be better for my career’ is really the thing to worry about IMO - if you’re making the decision on that basis, you’ll probably have a convincing story to tell. 
  • CV certificates: it depends on the role! Usually a good job description will highlight skills or qualifications that are relevant to it, and certificates that demonstrate those can be helpful, especially in more technical roles. If there’s no obvious connection, they probably won’t be helpful (or harmful). 
  • Cover letters: I would recommend being guided by the application form on this front. If it includes an option for a cover letter, I’d almost always recommend including one. If it doesn’t, I’d almost always recommend not including one. This is a personal preference, but I don’t enjoy reading unsolicited cover letters myself!

Speaking more generally, the key things that often put candidates higher on the priority list are the straightforward factors of e.g. do they seem to understand the role / organization and to be excited about it for genuine reasons, are their skills and experience a good fit for our requirements, do they communicate effectively, etc. If you’re demonstrating these things in a way that’s easy for a recruiter to understand even on a quick review, you’re likely doing just fine - I don’t think there are any special tricks here.

Note that the application deadline has now been extended to Monday, November 27th at 11:59pm PST.

Thanks for the questions! Given the quantity of questions you've shared across the different roles, I think our teams might struggle to get to all of them in satisfactory detail, especially since we're past the initial answering window. Would you be able to highlight your highest-priority question(s) under each top-level comment? We'd like to make sure we're addressing the ones that are most important for your decision-making.

Generally, we try to compensate people in such a way that compensation is neither the main reason to be at Open Phil nor the main reason to consider leaving. We rely on market data to set compensation for each role, aiming to compete with a candidate’s “reasonable alternatives” (e.g., other foundations, universities, or high-end nonprofits; not roles like finance or tech where compensation is the main driving factor in recruiting). Specifically, we default to using a salary survey of other large foundations (Croner) and currently target the 75th percentile, as well as offering modest upward adjustments on top of the base numbers for staff in SF and DC (where we think there are positive externalities for the org from staff being able to cowork in person, but higher cost of living).

I can’t speak to what they’re currently doing, but historically GiveWell has used the same salary survey; I’d guess that the Senior Research role is benchmarked to Program Officer, which is a more senior role than we’re currently posting for in this GCR round, which explains the higher compensation. I don’t know which BMGF benchmarks you’re looking at, but I’d guess they’re for more senior positions that typically require more experience and, at the higher end, control larger budgets.

That said, your point about technical AI Safety researchers at various nonprofit orgs making more than our benchmarks is something we’ve been reflecting on internally, and we think it does represent a relevant “reasonable alternative” for the kinds of folks we’re aiming to hire. We’re therefore planning to create a new comp ladder for technical AI Safety roles, and in the meantime we have moderately increased the posted comp for the open TAIS associate and senior associate roles.

I don’t have specific data on this, but only a minority of our hires apply through referrals or active networking. Our process is set up to avoid favoring people who come via prior connections (e.g. by putting a lot of weight on anonymized work tests), and many of our best hires have joined without any prior connections to OP. However, we do still get a lot of value from people referring candidates to our roles, and would encourage more of this to keep expanding our networks.
