Would be curious which top recommendations people have in the areas of x-risk and AI safety? Have donated to and considered:

Any other top choices that seem potentially more underfunded and impactful? Feel free to share your own effort, but state it as such, along with any other conflicts of interest.

CLR seems pretty funding-constrained: https://forum.effectivealtruism.org/posts/J7gdciCXFgqyimAAe/center-on-long-term-risk-2023-fundraiser

They focus on s-risks, Cooperative AI and acausal interactions.

Thanks, great rec, donated!

Rethink Priorities' AI Governance & Strategy team (which I co-lead) has room for more funding. There's some info about our work and the work of RP's other x-risk-focused team* here and elsewhere in that post. One piece of our public work so far is Understanding the diffusion of large language models: summary. We also have a lot of work that's unfortunately not public, either because it's still in progress or because of, e.g., information hazards. I could share some more info via a DM if you want.

We also have yet to release a thorough public overview of the team, but we aim to do so in the coming months.

(*That other team - the General Longtermism team - may also be interested in funding, but I don't want to speak for them. I could probably connect you with them if you want.)

Big fan of RP, thanks for sharing!

Glad to hear that!

Oh also, I just noticed I forgot to add info on how to donate, in case you or others are interested - that info can be found at https://rethinkpriorities.org/donate

If you want to support work in other contexts, Riesgos Catastróficos Globales is working on improving GCR management in Spain and Latin America.

I believe this project can improve food security in nuclear winter (tropical countries are very promising as last-resort global food producers), biosecurity vigilance (the recent H5N1 episode happened in Spain, and there are some easy improvements to biosecurity in LatAm), and potentially AI policy in Spain.

Funding is very constrained: we currently have runway until May, and each $10k extends the runway by one month.

We are working on a way to receive funds with our new fiscal sponsor, though we can already facilitate a donation if you write to info@riesgoscatastroficosglobales.com.

(disclaimer: I am a co-founder of the org and acting as interim director)

Thanks for sharing! The website isn't working for me. Is there a deeper write-up or independent evaluation of the org and its efforts?

Here is a write-up of the organisation's vision from one year ago:

https://forum.effectivealtruism.org/posts/LyseHBvjAbYxJyKWk/improving-the-public-management-of-global-catastrophic-risks

Not sure why the link above is not working for you. Here is the link again:

https://riesgoscatastroficosglobales.com/

As per Michael's comment, Rethink Priorities' General Longtermism team (in which I work) also has room for more funding. You can read about our work in 2022 in this post.

More recent public outputs include Does the US public support ultraviolet germicidal irradiation technology for reducing risks from pathogens? and Scalable longtermist projects: Speedrun series.
