Julia_Wise🔸

Community liaison @ Centre for Effective Altruism
Boston, MA, USA · juliawise.net

Bio

I'm one of the contact people for the effective altruism community. I work at CEA as a community liaison, trying to support the EA community in addressing problems and being a healthy and welcoming community.

Please feel free to contact me at julia.wise@centreforeffectivealtruism.org.

Besides effective altruism, I'm interested in folk dance and trying to keep up with my three children.

Sequences

2023 project on reforms in EA

Comments
There’s an asymmetry between people/orgs that are more willing to publicly write impressions and things they’ve heard, and people/orgs that don’t do much of that. You could call the continuum “transparent and communicative, vs locked down and secretive” or “recklessly repeating rumors and speculation, vs professional” depending on your views!

When I see public comments about the inner workings of an organization by people who don’t work there, I often also hear other people who know more about the org privately say “That’s not true.” But they have other things to do with their workday than write a correction to a comment on the Forum or LessWrong, get it checked by their org’s communications staff, and then follow whatever discussion comes from it.

A downside is that if an organization isn’t prioritizing back-and-forth with the community, of course there will be more mystery and more speculations that are inaccurate but go uncorrected. That’s frustrating, but it’s a standard way that many organizations operate, both in EA and in other spaces.

There are some good reasons to be slower and more coordinated about communications. For example, I remember a time when an org was criticized, and a board member commented defending the org. But the board member was factually wrong about at least one claim, and the org then needed to walk back wrong information. It would have been clearer and less embarrassing for everyone if they’d all waited a day or two to get on the same page and write a response with the correct facts. This process is worth doing for some important discussions, but few organizations will prioritize doing this every time someone is wrong on the internet.

So what’s a reader to do?

When you see a claim that an org is doing some shady-sounding thing, made by someone who doesn’t work at that org, remember the asymmetry. These situations will look identical to most readers:

  • The org really is doing a shady thing, and doesn’t want to discuss it
  • The org really is doing the thing, but if you knew the full picture you wouldn’t think it was shady
  • The claims are importantly inaccurate, but the org is not going to spend staff time coordinating a response
  • The claims are importantly inaccurate, and the org will post a comment next Tuesday that you probably won’t notice

Glad to see more attention on this area!
A little spot-checking:
"People with nothing more than a high-school diploma and a month long crash course can treat PTSD ~75% as well as a professional therapist." The meta-analysis linked doesn't attempt to compare lay counselors with professional therapists; it's only about trained lay counselors.

Thank you for writing about an important subject! I’m sorry about the ways I gather EA has been difficult for you. I’ve found EA pretty emotionally difficult myself at times.

People who fill out the EA Survey are likely to report that EA has a neutral or positive effect on their mental health. This might be because participating in a community and having a sense of purpose can be helpful for people's wellbeing. Of course, you’d expect bias here because people who find EA damaging may be especially likely to leave the community and not take the survey. An excerpt from a colleague’s summary:

“An interesting bit of information is that the 2022 EA survey asked how EA had affected the mental health of individuals in the community. While some people reported that their mental health had reduced as a result of being part of EA, on average, most people reported improved mental health. Obviously, there is some sampling bias here in who filled out the survey. Still, this was more positive than I expected. That’s not to say that we can’t do better - it would be really great if no one was in a situation where they found that this was personally harmful for them.

. . . I asked Rethink Priorities to do a more thorough analysis of this question. They’ve now done this! TL;DR: There are only small differences in responses across cause area/engagement level/location/career level/time in EA (students + newcomers were slightly more likely to say EA improved their mental health than other groups).”

Source: EA Survey 2022

About existing efforts on mental health in EA (some of which are mentioned in other comments):

  • MentNav (formerly the EA Mental Health Navigator) aims to list mental health resources that will be useful to people in EA or elsewhere
  • You mention Rethink Wellbeing, which is running projects similar to some of what you suggest
  • The EA Peer Support Facebook group is for informal peer support, and allows anonymous posts
  • The Effective Peer Support Slack is one location where people have worked on related projects, although it doesn’t seem to be very active currently
  • Some other resources, like community contact people in local groups and at EA conferences, or the community health team where I work, aren’t focused on mental health specifically but do end up assisting with some situations related to mental health.
  • Some efforts to provide volunteer support with accessing mental health services proved difficult, because of liability concerns for the volunteers.
  • On imposter syndrome, there’s enough content that there’s a Forum tag specifically on this topic.
  • You suggested mental health materials like articles or podcasts by mental health practitioners. Readers interested in this might explore the 80,000 Hours interview with psychotherapist Hannah Boettcher, writing by psychologist Ewelina Tur, writing by another mental health provider who posts as Daystar Eld on the Forum, and other writing under the Forum tag self-care and wellbeing in the effective altruism community.

I’ll note that I think it’s good to have mental health resources tailored to specific communities / populations, but this doesn’t necessarily mean much about the prevalence of problems in those populations. E.g. there are lots of therapy resources aimed at people with climate anxiety, therapists who specialize in treating medical professionals or clergy, etc.

Cross-posting Georgia Ray's / @eukaryote's "I got dysentery so you don't have to," a fascinating read on participating in a human challenge trial. 

Welfare standards on farms, like larger cage sizes, stunning before killing, etc., don't have obvious benefits for humans. Arguably there are downstream benefits from making meat more expensive and thereby causing less of whatever indirect effects meat consumption creates.

Glad this question-and-answer happened!
A meta note that sometimes people post questions aimed at an organization but don't flag it to the actual org. I think it's a good practice to flag questions to the org; otherwise you risk:
- someone not at the org answering the question, often with information that's incorrect or out of date
- the org never seeing the question and looking out-of-touch for not answering
- comms staff at the org feeling they need to comb public spaces for questions and comments about them, lest they look like they're ignoring people

(This doesn't mean you can't ask questions in public places, but also email the org a link to the question!)

(Writing personally, not organizationally)
I'm happy people are trying experiments like this!

Thinking about other ways that people incorporate each other's judgement about where to donate: often it involves knowing the specific people. 

I think some people who knew each other through early EA / GWWC did this — some had a comparative advantage in finance so went into earning to give, and others had a comparative advantage in research or founding organizations so went into nonprofits. But they made heavy use of each other's advice, because they knew each other's strengths.

It's also common to do this within a couple / family. My husband spent 10 years earning to give while I worked in social work and nonprofits, so he's earned the large majority of what we've donated. Early on, the two of us made separate decisions about where to donate our own earnings (though very informed by talking with each other). Later we moved to making a shared decision on where we'd donate our shared pot of money. This isn't necessarily the best system — people are biased toward trusting their family even in domains where the person isn't very competent, and you can see examples like the Buffett family where family members seem to make kind of random decisions.

I feel good about people pooling judgement when they know the strengths and weaknesses of the specific other people involved. I feel much less excited about pooling judgement with people whose judgement I know nothing about.

In 2020 when I asked you about lead policy work, you weren't optimistic that people without strong networks and expertise could make much progress on policy advocacy. Has your view changed?

I found this a clear explanation of the costs and benefits - thanks for writing it up!
A similar issue: lack of salt iodization in Europe, the region where children have the highest rates of iodine deficiency. https://www.who.int/publications/i/item/9789241593960
