One outstanding question is at what point AI capabilities are too close to loss of control. We propose to delegate this question to the AI Safety Institutes set up in the U.K., U.S., China, and other countries.
I consider it clickbait if you write "There Is a Solution", but then say that there are these AI safety institutes that will figure out the crucial details of the solution some time in the future.
In my opinion you have not really argued why it is neglected. As a starting point, they seem to spend roughly $35 million per year: https://projects.propublica.org/nonprofits/organizations/824506840. $14 million of those are for salaries, so I would be surprised if new features are strongly bottlenecked by money and talent.
I am just guessing on these issues, but my suspicion is that some features, such as "live location sharing" and "displaying past encrypted messages from a group chat you were not a member of but have only just joined", are not (yet) implemented because they do not fit well into Signal's approach to security/privacy.
Could you provide examples of political discussions on the EA Forum that appear to have negatively impacted the forum’s environment or impaired its ability to achieve its objectives?
As far as I remember, the political discussions on the EA Forum have been quite civilized. But I think this is because of the policies and culture the EA Forum has. If political discussions were a lot more frequent, the culture and discussion styles could get worse. For example, it might attract EA-adjacent people, or even outsiders, who want to fight their political battles on the EA Forum. Maybe this could be addressed by hiring additional moderators, though.
Also, politics can absorb a lot of attention that would be better spent elsewhere. For example, this post about Trump generated 60 comments, and I am not sure that was worth it.
That is, many (most?) people need a break-in point to move from something like "basically convinced that EA is good, interested in the ideas and consuming content, maybe donating 10%" to anything more ambitious.
I am under the impression that EAGx can be such a break-in point, and has lower admission standards than EAG. In particular, there is EAGxVirtual (Applications are open!).
Has the rejected person you are thinking of applied to any EAGx conference?
A strategy for scaling effective giving that is not mentioned here is earning to give.
Encouraging and helping people who are already bought into the idea of donating effectively to earn more could generate a lot of money and value. I think this strategy should be considered alongside encouraging high-earners to donate effectively (I am not making a claim here about which is better).
A concrete step could be to talk to people from 80,000 Hours about advertising earning to give again.
Some (or all?) Lightspeed grants are part of SFF: https://survivalandflourishing.fund/sff-2023-h2-recommendations