Isaac Dunn

406 karma
isaacdunn.com

Comments (64)

Would you be eligible for the graduate visa? https://www.gov.uk/graduate-visa

If so, would that meet your needs?

(I've just realised this is close to just a rephrasing of some of the other suggestions. Could be a helpful rephrasing though.)

The Superalignment team's goal is "to build a roughly human-level automated alignment researcher".

Human-level AI systems sound capable enough to cause a global catastrophe if misaligned. So is the plan to make sure that these systems are definitely aligned (if so, how?), or to make sure that they are deployed in such a way that they would not be able to take catastrophic actions even if they wanted to (if so, what would that look like?)

Thanks David, that's just the kind of reply I was hoping for! Those three goals do seem to me like three of the most important. It might be worth adding that context to your write-up.

I'm curious whether there's much you did specifically to achieve your third goal - inspiring people to take action based on high-quality reasoning - beyond just running an event where people might talk to others who are doing that. I wouldn't expect so, but I'd be interested if there was.

Thanks for writing this up! I'd be interested if you had time to say more about what you think the main theory of change of the event was (or should have been).

Interesting results, thanks for sharing! I think getting data from people who attend events is an important source of information about what's working and what's not.

I do worry a bit about what's best for the world coming apart from what people report as valuable to them. (This comment ended up a bit rambly, sorry.)

Two main reasons why that might be the case:

  1. If the event causes someone's goals or motivations to change in a way that's great for the world, my guess is that this doesn't feel as valuable to the person as helping them get or do things they already want.
    • Eg if someone isn't that on board with the EA project, then them getting more enthusiastic about making it an important priority in their life could be very good for the world, but feel like only a small personal benefit from the event. (Maybe it feels like "I had a good time and felt more excited" - and it's very hard to tell from a report like that whether the person is now much more likely to go on to do the EA project well, or whether it won't really affect their actions.)
    • Eg if someone is attached to a cause area or job type that they have existing connections with, but the event nudges them to seriously consider other possibilities that are in fact more valuable, then this could be a great outcome that again doesn't feel that valuable to the individual.
  2. If the people attending the event don't have that good an understanding of what's in fact counterfactually valuable for achieving their goals.
    • Eg they might report that the event caused something to happen (learning about something, getting funding, getting a job), but that thing would likely have happened anyway.
    • Eg they might just overrate or underrate the importance of specific outcomes (specific relationships, changes to motivation, specific ideas).

I think these reasons are actually important enough that people professionally involved in community building should try to beat the baseline of "let's do things that people report as valuable" by building a detailed understanding of the mechanisms that cause someone to go on to do things that are valuable for the world (weighted by their value). How exactly do their different motivations, beliefs and experiences fit together? Is there a typical "journey", or maybe several different journeys? Are there certain things that are necessary in order for people to go on to do great work, and in particular are there things that individuals are likely to underrate?

Of course this happens some amount, but I'd be keen to see more discussion of this in general among people doing EA meta work.

(Something like the 2020 OP longtermist survey, but much more focused on understanding the mechanisms that caused the good work, rather than the categories of thing that the people interacted with. Rather than a survey, maybe more like in-depth user interviews. I think 80k may have done a bit of this, but I'm not sure.)

Are there any lessons that GWWC has learnt that you think would be useful for EA community builders to know and remember?

If GWWC goes very well over the next five years (say 90th percentile), what would that look like?

Do you think that most of GWWC's impact will come from money moved, or from introducing people to EA who then change their career paths, or something else? (I can't immediately tell from your strategy, which mentions both.)

What is the best reason to think that GWWC isn't good for the world, in your view?
