Hey. I'm Jordan. I'm currently the EA Global Content Coordinator at CEA, meaning I arrange all the talks, workshops, and meetups at EAG conferences.
Previously, I was involved in a variety of EA community-building projects, including running EA Oxford for the 23/24 academic year.
Feel free to reach out if you want to share feedback on EAG, or have ideas for content!
Thanks Jan, I appreciate this comment. I'm on the EAG team, but responding with my personal thoughts.
While it's true that we weight 1:1s heavily in assessing EAG, I don't think we're doing 'argmax prioritisation': we still run talks, workshops, and meetups, and ~1/4 of our team time goes to this. My read of your argument is that we're scoring things wrong and should give more consideration to the impact of group conversations. You're right that we don't currently explicitly track the impact of group conversations, which could mean we're missing significant value.
I do plan to think more about how we can make these group conversations happen and measure their success. I haven't yet heard a suggestion (in this thread or elsewhere) that I believe would sufficiently move the needle, but maybe this is because we're over-optimising for better feedback survey scores in the short term (e.g., we'll upset some attendees if we turn off specific 1:1 slots).
By coincidence, I just came across this layer-hen genetics project that got funding from OP. I don't know much about the work or how promising it might be.
Luke, I took part in the GWWC ambassador program you ran in 2021 and had the pleasure of interacting with you on a few occasions. Your enthusiasm and thoughtfulness were part of what pushed me to deepen my involvement in the EA community and eventually go on to do community building full-time. I really appreciate the work you've done to grow GWWC into an outstanding organisation. I wish you all the best!
In case you haven't seen it, here's a fireside chat we hosted with Ezra Klein in 2021. It might be cool to have him back at EAG though!
I share your inclination toward significant diversification. However, I'm grappling with whether there should be specific limits on this diversification. For instance, Open Philanthropy's approach seems to be "we diversify amongst worldviews we find plausible," but it's not clear to me what makes a worldview plausible. How seriously should we consider, for example, Nietzscheanism?
Thanks for sharing this! Minor feedback: I'd like to see the survey data (e.g., your average LTR) in text in addition to the graphs.