Camille

Group Organizer at EA ENS Paris, @ Managing Tense Disagreements
195 karma · Joined · Pursuing other degree/diploma · Working (0-5 years) · 94110 Arcueil, France
www.effectivedisagreement.org

Bio

Participation
3

Currently building a workshop aimed at teaching methods for managing strong disagreements (including to non-EA people). Also doing community building.

Background in cognitive science.

Interested in cyborgism and AIS via debate.

https://typhoon-salesman-018.notion.site/Date-me-doc-be69be79fb2c42ed8cd4d939b78a6869?pvs=4

How others can help me

I often get tremendous amounts of help from people who know how to program and are enthusiastic about helping out over an evening.

Comments
25

Yes, I saw it, and have it in mind as well! I'll get back to you in the not-so-far future ;)

Hello Mako, thanks for your interest ^^
I'm planning to open a negotiation training module later, probably next year.

I personally agree. I'm a bit on repeat-mode on this, but outside of EA, it's actually very hard to have a productive rational conversation with someone who disagrees with us without relying on storytelling. People tend to reject arguments they view as coming from the opposite side. Stories establish positive rapport and empathy, which subsequently allows more rigorous conversations.

I may get back to you at some point for a collaboration!

Thank you very much for this work, this was a core missing piece in the efforts of those of us who are interested in X-risk communication.

I broadly wish we could get to a point where this is applicable, but I'm unsure whether the strategy outlined by the OP is the best one. I'm by no means saying it is exactly comparable, but having experienced it myself, an oppressed minority often has no option but to become transparent about its identity. If you feel that you belong to a group with a bad reputation, you might experience fear and anxiety when opening up about it.

A small detail I'd add is that, as far as I can tell, no one actually tries to debunk false ideas and over-generalizations about EA. When I suggested doing so in the past, a few people actively discouraged me from it. Some of those false ideas are emotionally hard to bear; others are completely outlandish. What made my coming-out possible, in contrast, was the wealth of resources and arguments I could point people to, or just knowing they would run into them at some point.

This might be caused by us not owning our affiliation.

Yet EA is starting to become a well-identified group, something people can have clichés about, and one that can suffer unfair ostracism. It's very hard for me to be public about my affiliation if I don't feel defended from such downsides. And so the circle continues.

Information on the current context, not directly related to the AMA:

There seem to be new, concerning claims and discussions about MrBeast and the legality of his business model.

See: https://youtu.be/DJvDLqDAM60?si=5Rc78_1GTBgmmRrP and https://youtu.be/k5xf40KrK3I?si=caiCJ0u7AaUeL2VX, plus more that will pop up with a simple YouTube search.

I'm not weighing anything in terms of credence, and I haven't done any dutiful scrutiny myself. However, this is reminiscent enough of previous mistakes that I hope a constructive conversation can be had with any willing partner before further engaging with Beast Philanthropy, if that ever becomes an option.

Related to this, Julius Bauman gave an excellent talk about inward-facing vs outward-facing communities. In general, you don't find inward-facing communities among the large social movements that succeed at their goals.

I'm still very worried that the community doesn't seem to have caught on to this and isn't promoting outward-facing practices any more than in the past. I myself am often advised to orient my work toward the EA community when I actually want to be outward-facing.

I would definitely understand if giving areas outside of short-term, human-centered philanthropy feel riskier to you or unrelated to your brand image.

I'm wondering, however: under what conditions would you engage in animal philanthropy, e.g. the Open Wing Alliance? In a parallel world where this is something you regularly do, what happened that led you to do it?

Similarly, what would be needed for you to engage in messaging about more abstract cause areas, such as AI risk/ethics or biosecurity (if you find any of them convincing)?

My commentary would be that promoting political ideologies (or tying oneself to them) usually sounds to me like bottom-lining about the nature of reality. I think that bundling concepts under the tag "socialism" or "capitalism" makes it hard to sift through them and find the occasional diamond (see this). It's hard to "check" whether socialism works, because the word refers to too many things at the same time.

Let's suppose the government spends money not on subsidies for national activities X or Y, but on interventions X or Y in the global south. Is this socialism? I don't care. The real question is: does it work? What are the costs? What are the benefits? How does it compare with my considered ethical beliefs?

Some people would be happy to denounce legislation on frontier AI models as excessive governmental regulation, and thus socialism. But that is not the important part. What matters is: does it work? Does it help reduce X-risks? What do superforecasters say?

I'm not interested in evaluating the general tendency to act like a socialist, but specific interventions, no matter which tribe identifies with them. It doesn't seem like healthy thinking to me to bundle interventions into packages, call them "socialist", and have the entire package be judged as either working or not working, worth trying or not trying. I'd be very happy to have one kind of systemic change, such as a massive governmental subsidy for a medical system, in one country, and zero subsidies in another, if the end results are equally counterfactually optimal with respect to my set of moral credences.

In contrast, charter cities experimenting with various interventions, or analyzing data resulting from the application of distinct policies, sound like more promising ideas to me, and I couldn't care less whether the end intervention is socialist, capitalist, or whatever. For a more zoomed-out alternative, something like Reasoned Politics sounds less confusing to me.

Caveat: conflict of interest

I agree. However, I also think that running more surveys does not prevent the failure mode where EAs treat "doing comms" as doing more surveys rather than actual interventions to align public opinion with more rational takes on this particular topic. Shyness and low socio-emotional skills among leaders seem commonplace in EA, far more so than in the rest of the world, to the point where the best interventions targeting communication skills seem neglected to me.

Skills in communications, and funds for paying skilled individuals responsible for communications in any particular org, are, imho, generally lacking. I have eavesdropped on a number of (non-sensitive) meetings of a small AIS org, and the general level of knowledge about how to convey any given message (especially outside of EA, especially to an opponent) is, in my opinion, insufficient, despite good knowledge of the surveys and their results. People in this org mostly generated their own ideas, judged them using their intuition, and executed them, rather than using established knowledge or empirical expertise to pick the best ones. Most of the people in the process are AIS researchers with a background in CS, rather than people with backgrounds in both AIS and communications who are also excellent communicators (to a non-EA audience). One person I met openly shared their concern about not having enough funding to pay someone responsible for PR and comms, as well as about growing tired of managing something they have no background in. Surveys didn't really help with this bottleneck.

My fear is that there is not enough money, and that most people don't care enough because they trust their intuitions too much / are afraid to actually remedy this lack of skills and would rather do surveys (on my side, I definitely feel fear and worry about talking to journalists or about carefully balancing epistemics without hurting common sense).

My only not-so-real data point is this (compare karma on LW vs the EA Forum for a better sense). In a world where people saw communications as a technical problem, I would have expected that post to have more success. In short, I'd bet that most communications-skills-related interventions/hires are usually considered with reluctance.

I do acknowledge that surveys could be an even lower-hanging fruit, of course. But I think they should not distract us from improving skills per se.
