ChanaMessinger

3493 karma · Joined · www.chanamessinger.com

Bio

Participation: 2

I work at CEA, usually on community epistemics and on supporting high school outreach from a community health perspective; I'm currently interim manager of the team. (Opinions here are my own by default, though I will sometimes speak in a professional capacity.)

Personal website: www.chanamessinger.com

Comments: 320

Topic contributions: 20

I liked this!

I appreciated that for the claim I was most skeptical of: "There’s also the basic intuition that more people with new expertise working on a hard problem just seems better", my skepticism was anticipated and discussed.

For me one of the most important things is:

Patch the gaps that others won’t cover

  • E.g., if more academics start doing prosaic alignment work, then ‘big-if-true’ theoretical work may become more valuable, as may high-quality work on digital sentience. 
  • There are probably predictable ‘market failures’ in any discipline – work that isn’t sexy but is still very useful (e.g., organizing events, fixing coordination problems, distilling the same argument into new language, etc.). 
  • Generally track whether top priority work is getting covered (e.g., information security, standards and monitoring) 

This, plus avoiding and calling out safety washing, keeping an eye out for overall wrongheaded activities and motions (for instance, probably a lot of regulation is bad by default and some could actively make things worse), seem like the strongest arguments against making big naive shifts because the field is in a broad sense less neglected.

More generally, I think a lot of the details of what kinds of engagements we have with the broader world will matter (and I think in many cases "guide" will be a less accurate description of what's on the table than "be one of the players in the room"). Some engagements might be a lot more impactful than others, but I don't have super fleshed out views on which yet!

I really loved this! I have basically no knowledge of the underlying context, but I think this summary gave me a feel for how detailed and complicated this is (reality has a lot of detail and a lot of societies for air conditioning engineers!), a bit of the actual science, as well as some of the players involved and their incentives.

It's helpful and interesting to look at what small scientific communities are like as analogues for EA research groups.

From Astral Codex Ten

FRI called back a few XPT forecasters in May 2023 to see if any of them wanted to change their minds, but they mostly didn’t.

I really like this concept of epistemic probation - I agree also on the challenges of keeping it private and of exiting such a state. Making it easier to exit criticism-heavy periods probably makes such probation easier to levy in the first place (since you know that it is escapable).

Did you mean for the second paragraph of the quoted section to be inside the quote? 

Thanks so much for this, I really enjoyed it! I really like this format and would enjoy seeing more of it.

This isn't the point, and there's likely so much behind each vignette that we don't see, but I so wish for some of these folks that they are able to find, e.g., people/mentors who encourage their "dumb questions", people who want to talk about consciousness, people who can help figure out what to do with doomer-y thoughts. Maybe they could tell aggregators of information about some of the things listed (community health is one such aggregator for some topics, including some cases of bad management; there are others). I wish them luck, encourage finding an information aggregator, and wonder if maybe the comments here might end up with offers to talk about the things people find hard to talk about. I just have a sense that (exempting all the complexity I don't see) there are people here who want to talk about these things and feel open to weird and heterodox views!

But I know that we're more talking about vibes and overall incentive gradients and so on. I'm pretty uncertain what systemic solutions would look like here, but I'll be curious what your poll ends up finding.

I myself have been worried about the social effects of friends working at AI labs and organizations and whether that's going to make it harder for me or others to criticize that org or have a negative sentiment towards them. Would love to talk more about that some time, especially with people who work at these places!

A small sadness I have (and not sure what there is to do about this, I appreciate the sharing) is that I think I'm pretty likely to remember the unendorsed ones about the same as the endorsed ones, because the vignettes are the memorable bits. Just an unfortunate fact about this kind of thing.

Right, right, I think on some level this is very unintuitive, and I appreciate you helping me wrap my mind around it - even secret information is not a problem as long as people are not lying about their updates (though if all updates are secret, there's obviously much less to update on).

I appreciate the reminder that "these people have done more research" is itself a piece of information that others can update on, and that the mystery of why they haven't isn't solved. (Just to ELI5, we're assuming no secret information, right?)

I suppose this is very similar to "are you growing as a movement because you're convincing people or via selection effects" and if you know the difference you can update more confidently on how right you are (or at least how persuasive you are).

I tried for a while to find where I think Oliver Habryka talked about this, but didn't find it. If someone else finds it, let me know!

I want to just appreciate the description you’ve given of interaction responsibility, and your pointing out the dual tensions. 

On the one hand, wanting to act but feeling worried that by merely getting involved you open yourself up to criticism, thereby imposing a tax on acting even when you think you would counterfactually make the situation better (a tax I think EA as a concept is, in theory, rightly opposed to). 

On the other hand, consequences matter, and if in fact your actions cause others who would have done a better job not to act, and that’s predictable, it needs to be taken into account. This is all really tough, and it bites for lots of orgs or people trying to do things that get negative feedback, and it also bites for the orgs giving negative feedback, which feels worth bearing in mind.
