Julia_Wise🔸

Community liaison @ Centre for Effective Altruism
13,098 karma · Boston, MA, USA · juliawise.net

Bio

I'm one of the contact people for the effective altruism community. I work at CEA as a community liaison, trying to support the EA community in addressing problems and being a healthy and welcoming community.

Please feel free to contact me at julia.wise@centreforeffectivealtruism.org.

Besides effective altruism, I'm interested in folk dance and trying to keep up with my three children.

Sequences

2023 project on reforms in EA

Comments
I'd think a better way to get feedback is to ask "What do you think of this pledge wording?" rather than encourage people to take a lifelong pledge before it's gotten much external feedback.

For comparison, you could see when GWWC was considering changing the wording of its pledge (though I recognize it was in a different position as an existing pledge rather than a new one): Should Giving What We Can change its pledge?

Sometimes people mention "expanding the moral circle" as if it's universally good. The US flag is an example of an object whose share of moral care has expanded and contracted over time.

The US Flag Code states: "The flag represents a living country and is itself considered a living thing." When I was a child, my scout troop taught us that American flags should never touch the ground, and a worn-out flag should be disposed of respectfully by burial (in a wooden box, as if it were a person) or burning (while saluting the flag and reciting the Pledge of Allegiance) and then burying. Example instructions. People from most countries find this hard to believe!

One explanation is that the veneration for this physical object is symbolic of respect for military troops and veterans, but my scout troop sure put more effort into burning the flag properly than we ever put into helping troops or veterans in any more direct way.

Which beings / objects / concepts are worthy of special care can be pretty arbitrary. Expansion isn't always good, and contraction of the moral circle isn't always bad.

Further reading: https://gwern.net/narrowing-circle

Worth a try! Different meds are activating (make you more alert / awake) vs sedating, so it's also worth noticing if an evening dose makes it harder to sleep.

Different medications get metabolized at different rates. If you were taking one that peaks after ~8 hours (like extended-release venlafaxine), evening timing might work better.

The first antidepressant I tried gave me nausea for much of the day when I took it in the morning. Taking it at night helped a lot because I slept through the part where I felt sick. This is how I learned "complain to your prescriber if you're having side effects" because sometimes they have simple ideas that help.

There’s an asymmetry between people/orgs that are more willing to publicly write impressions and things they’ve heard, and people/orgs that don’t do much of that. You could call the continuum “transparent and communicative, vs locked down and secretive” or “recklessly repeating rumors and speculation, vs professional” depending on your views!

When I see public comments about the inner workings of an organization by people who don’t work there, I often also hear other people who know more about the org privately say “That’s not true.” But they have other things to do with their workday than write a correction to a comment on the Forum or LessWrong, get it checked by their org’s communications staff, and then follow whatever discussion comes from it.

A downside is that if an organization isn’t prioritizing back-and-forth with the community, of course there will be more mystery and more speculations that are inaccurate but go uncorrected. That’s frustrating, but it’s a standard way that many organizations operate, both in EA and in other spaces.

There are some good reasons to be slower and more coordinated about communications. For example, I remember a time when an org was criticized, and a board member commented defending the org. But the board member was factually wrong about at least one claim, and the org then needed to walk back wrong information. It would have been clearer and less embarrassing for everyone if they’d all waited a day or two to get on the same page and write a response with the correct facts. This process is worth doing for some important discussions, but few organizations will prioritize doing this every time someone is wrong on the internet.

So what’s a reader to do?

When you see a claim that an org is doing some shady-sounding thing, made by someone who doesn’t work at that org, remember the asymmetry. These situations will look identical to most readers:

  • The org really is doing a shady thing, and doesn’t want to discuss it
  • The org really is doing the thing, but if you knew the full picture you wouldn’t think it was shady
  • The claims are importantly inaccurate, but the org is not going to spend staff time coordinating a response
  • The claims are importantly inaccurate, and the org will post a comment next Tuesday that you probably won’t notice

Glad to see more attention on this area!
A little spot-checking:
"People with nothing more than a high-school diploma and a month long crash course can treat PTSD ~75% as well as a professional therapist." The metastudy linked doesn't attempt to compare lay counselors with professional therapists; it's only about trained lay counselors.

Thank you for writing about an important subject! I’m sorry about the ways I gather EA has been difficult for you. I’ve found EA pretty emotionally difficult myself at times.

People who fill out the EA Survey are likely to report that EA has a neutral or positive effect on their mental health. This might be because participating in a community and having a sense of purpose can be helpful for people's wellbeing. Of course, you’d expect bias here because people who find EA damaging may be especially likely to leave the community and not take the survey. An excerpt from a colleague’s summary:

“An interesting bit of information is that the 2022 EA survey asked how EA had affected the mental health of individuals in the community. While some people reported that their mental health had reduced as a result of being part of EA, on average, most people reported improved mental health. Obviously, there is some sampling bias here in who filled out the survey. Still, this was more positive than I expected. That’s not to say that we can’t do better - it would be really great if no one was in a situation where they found that this was personally harmful for them.

. . . I asked Rethink Priorities to do a more thorough analysis of this question. They’ve now done this! TL;DR: There are only small differences in responses across cause area/engagement level/location/career level/time in EA (students + newcomers were slightly more likely to say EA improved their mental health than other groups).”


source: EA Survey 2022

About existing efforts on mental health in EA (some of which are mentioned in other comments):

  • MentNav (formerly the EA Mental Health Navigator) aims to list mental health resources that will be useful to people in EA or elsewhere
  • You mention Rethink Wellbeing, which is running projects similar to some of what you suggest
  • The EA Peer Support Facebook group is for informal peer support, and allows anonymous posts
  • The Effective Peer Support Slack is one location where people have worked on related projects, although it doesn’t seem to be very active currently
  • Some other resources, like community contact people in local groups and at EA conferences, or the community health team where I work, aren’t focused on mental health specifically but do end up assisting with some situations related to mental health.
  • Some efforts to provide volunteer support with accessing mental health services proved difficult because of liability concerns for the volunteers.
  • On imposter syndrome, there’s enough content that there’s a Forum tag specifically on this topic.
  • You suggested mental health materials like articles or podcasts by mental health practitioners. Readers interested in this might explore the 80,000 Hours interview with psychotherapist Hannah Boettcher, writing by psychologist Ewelina Tur, writing by another mental health provider who posts as Daystar Eld on the Forum, and other writing under the Forum tag self-care and wellbeing in the effective altruism community.

I’ll note that I think it’s good to have mental health resources tailored to specific communities / populations, but this doesn’t necessarily mean much about the prevalence of problems in those populations. E.g. there are lots of therapy resources aimed at people with climate anxiety, therapists who specialize in treating medical professionals or clergy, etc.

Cross-posting Georgia Ray's / @eukaryote's "I got dysentery so you don't have to," a fascinating read on participating in a human challenge trial. 

Welfare standards on farms, like larger cage sizes or stunning before slaughter, don't have obvious benefits for humans. Arguably there are downstream benefits: such standards make meat more expensive, thereby causing less of whatever indirect effects meat consumption creates.
