Thanks for your very thoughtful response. I'll revise my initial comment to correct the point I made about funding; I apologize for portraying this inaccurately.
Your points about the broadening of the research agenda make sense. I think GPI is, in many ways, the academic cornerstone of EA, and it makes sense for GPI's efforts to map onto the efforts of researchers working at other institutions and in a broader range of fields.
And thanks also for clarifying the purpose of the agenda; I had read it as a document describing GPI's priorities for itself, but it makes more sense to read it as a statement of priorities for the field of Global Priorities Research writ large. (I wonder if, in future iterations of the document—or even just on the landing page—it might be helpful to clarify this latter point, because the documents themselves read to me as more internally facing, e.g., "This document outlines some of the core research priorities for the economics team at GPI." Outside researchers not affiliated with GPI might be more inclined to engage with these documents if they more explicitly laid out a research agenda for researchers in philosophy, economics, and psychology aiming to do impactful work.)
Thanks for sharing this! I think these kinds of documents are super useful, including for (e.g.) graduate students not affiliated with GPI who are looking for impactful projects to focus their dissertations on.
One thing I am struck by in the new agenda is that the scope seems substantially broader than it did in prior iterations of this document; e.g., the addition of psychology and of projects related to AI/philosophy of mind in the philosophy agenda. (This is perhaps somewhat offset by what seems to be a shift away from general cause prioritization research.)
I am wondering how to reconcile this apparent broadening of mission with what seems to be a decreasing budget (though maybe I am missing something). It looks like OP granted ~$3 million to GPI roughly every six months between August 2022 and October 2023, but no OP grants have been documented in the past year; there was also no Global Priorities Fellowship this year, and my impression is that postdoc hiring is on hold.
Am I right to view the new research agenda as a broadening of GPI's scope, and could you shed some light on the feasibility of this in light of what (at least at first glance) looks like a more constrained funding environment?
EDIT: Eva, who currently runs GPI, notes that my comment paints a misleading picture of the funding environment. While she writes that "the funding environment is not as free as it was previously," the evidence I cite doesn't really bolster this claim, for reasons she elaborates on. I apologize for this.
No shade to the mods, but I'm kind of bearish on mods' ability to fairly determine which issues are "difficult to discuss rationally," because I think this is really hard and inevitably subject to bias. (The lack of moderation around the Nonlinear posts, Manifest posts, Time article on sexual harassment, and so on makes me think this standard is hard to enforce consistently.) Accordingly, I would favor relying on community voting to determine which posts/comments are valuable and constructive, except in rare cases. (Obviously, this isn't a perfect solution either, but it at least moves away from the arbitrariness of the "difficult to discuss rationally" standard.)
Yeah, just to be clear, I am not arguing that the "topics that are difficult to discuss rationally" standard should be applied to posts about community events, but instead that there shouldn't be a carveout for political issues specifically. I don't think political issues are harder to discuss rationally or less important.
This is weird to me. There are so many instances of posts on this forum having a “strong polarizing effect… [consuming] a lot of the community’s attention, and [leading] to emotionally charged arguments.” The several posts about Nonlinear last year strike me as a glaring example of this.
US presidential candidates’ positions on EA issues are more important to EA—and to our ability to make progress on those issues—than niche interpersonal disputes affecting a handful of people. In short, it seems like posts about politics are effectively being held to a higher standard than other posts. I do not think this double standard is conducive to healthy discourse, or that it better positions the EA community to achieve its goals.
Two separate points:
I think there’s a lot of truth to this; the part about sanctifying criticism and critical gadflies especially resonated with me. I think it is rational to ~ignore a fair bit of criticism, especially online criticism, though this is easier said than done.
Two pieces of advice I encountered recently that I’m trying to implement more in my life (both a bit trite, but perhaps helpful as heuristics):
Despite working in global health myself, I tend to moderately favor devoting additional funding to animal welfare over global health. There are two main reasons for this:
Importance: The level of suffering and cruelty that we inflict on non-human animals is simply unfathomable.
I think the countervailing reason to instead fund global health is:
It’s super cool to see USAID and OP partnering very publicly on such an important project. In addition to the obvious good this will do via the project’s direct impact on lead exposure, I’m glad to see such a powerful and reputable government agency implicitly endorsing OP as an organization. I hope this will help legitimize some of OP’s other important work, and pave the way for similar partnerships in other arenas.
I don't think visceral gout is an infectious disease, and I don't think chickens can vomit. Two inaccuracies in a single sentence made me wonder whether there were other inaccuracies elsewhere in the article (though I appreciate how deeply researched it is and how much work went into writing it).