Joanna Michalska

freelance artist + volunteer in the people operations team @ Otwarte Klatki (Anima International)
109 karma · Joined September 2024 · Working (6-15 years) · Poland
https://cara.app/urikedi

Bio

Participation
1

Here since September 2024

How I can help others

If you need a Google Sheets table made more readable - please let me do it.
 

Comments
14

I do get the concern about the EA Forum being very serious. I myself find it intimidating to write here and very much share the sentiment of Olivia Addy's great post.

At the same time, I don't think the culture here should change.

In defense of gatekeeping:

  1. There are already EA communities that are more casual - on Reddit, Twitter, Facebook and Slack. The forum fills a unique niche.
  2. I assume most people lurking on the forum are more on the casual side (extrapolating from general internet user base rates?). I'd worry that less-serious posts would get more engagement by virtue of being less serious and more accessible to people new to EA, while more sophisticated posts would be drowned out.
  3. I'm not sure you can stop the culture shift once it starts (see: Thresholding - by Duncan Sabien - Homo Sabiens). And once it changes enough, the people who were initially the most engaged stop posting.
     

Anecdotal examples from my n=1:

  • A Discord server for negative utilitarians - I once went through all its messages spanning a few years (don't ask me why) and saw the shift from discussion not too different from the EA Forum, through gradual casualization, to the current state, where it's mostly young people discussing suicide and venting about their arguments with 'breeders'. The people who engaged with the server in the beginning, when it was EA Forum-like, have completely stopped posting.
  • Another Discord server I was in went through the same shift - some people joined and gradually shifted the culture towards something resembling Twitter. They weren't doing anything bad enough to be banned (the thresholding problem again), but they dominated the discussion and made the veterans leave. In the end, the admin closed the server, citing the culture shift.
  • Exactly the same happened to a fairly large Facebook group I was in. It got closed down as well.
  • I've observed that as subreddits grow, their culture undergoes the same shift. It takes a few people starting to post less thought-through content -> the silent majority of casual lurkers upvotes these posts onto the front page -> people see that and post more of this kind of content -> veterans leave, because it's not the same space anymore.

It may look this way, but I have nothing against casual communities. In fact, in most of the communities I'm in, I'm the casual lurker upvoting the memes and skipping over the dissertations. I think both kinds of spaces are needed; it's just that the more niche ones need some curation/protection.

Maybe putting a link to the EA Anywhere Slack somewhere visible would be a good idea? I only learned about its existence at the recent EAG, and I think it's the kind of space that a lot of people here are after.

During the EA Forum voting week I did some research into the orgs that were listed, and I was impressed by RP's detailed doc on what they'd use the funding for; I hadn't been aware of the work you're doing in the field of animal welfare. I care about invertebrate welfare, as both the scale and the neglectedness of the problem are enormous, so the projects listed got me excited.
Keep up the great work!

This year I started monthly donations to the ACE Recommended Charity Fund and Anima International, and made some individual donations to THL, Hive and others. I'll be switching to Rethink Priorities in the new year.

I think offsetting your emissions and offsetting your meat consumption are treated differently by EAs because they really are different.

I liked the two examples William MacAskill presents in 'Doing Good Better':

  1. Offsetting your contribution to climate change by donating to Cool Earth, so that all the CO2 you produce is offset before it can harm anyone - the 'undoing harm' kind
  2. Donating to an org that advocates against cheating on your spouse to offset your own cheating - the 'apology' kind

In the second case the damage is already done; by offsetting, you only prevent further harm. Eating meat while donating to animal welfare organizations is more like the second example: you harm some animals and then pay for other animals to be saved. You can't undo the harm done to the animals you harmed.

PART 1

  1. The lives of the 100 people living today aren't worth 10x more than the lives of the thousands living in the future, so I wouldn't bury the waste.

  2. I would have still donated; I don't see much of a difference, and the time when the beneficiaries are alive isn't a morally significant factor.

PART 2

My judgement is terrible, but my confidence is very low, so let's hope they cancel out.

  1. The causes I feel are the most important are factory farming, wild animal suffering and S-risks (these, I believe, cause or have the potential to cause the most suffering while being hugely neglected).

  2. Key uncertainty: The tractability of working on wild animal suffering seems to be a huge problem.

  3. What to do about the uncertainty: Read up on what is already being done (Arthropoda Foundation, Wild Animal Initiative) and what the prospects are.

  4. Aptitudes to explore: community building, organization running/boosting, supporting roles.

  5. Keep volunteering for an effective organization while also recruiting new people into EA in my free time; learn how to communicate ideas better.

  6. I'm donating monthly to effective charities, volunteering my skills and engaging with the community.

My criticisms about EA:

As a negative utilitarian I'm bitter about all the X-risk prevention enthusiasts trying to stop me from pushing the big red button

Jokes aside - I got very excited about EA when I learned about it. At some point I became aware of that excitement and had a concern pop up: it sounds too good to be true, almost like a cult. I consider myself rather impressionable/easy to manipulate, so I've learned that when I feel very hyped about something, I should get healthily suspicious.

I'm grateful for the article earlier in the chapter that presented some good-faith criticism, and I agree with some of its points.

Some thoughts:

  • EA may feel alienating to people who aren't top-of-their-field, 150-IQ professionals. I very much relate to this post: https://forum.effectivealtruism.org/posts/x9Rn5SfapcbbZaZy9/ea-for-dumb-people . Maybe it's for the better and results in higher talent density and a better reputation for the movement; maybe we're missing out on some skilled people, potential donors, or critical mass.
  • I'd love to see some statistics on why people leave the movement, and what the rate is. I suspect that moral perfectionism leading to self-neglect and burnout is an occupational hazard among EAs (like it is among animal advocates).
  • It's somewhat difficult to talk about EA with regular people. Look, there's this movement that can literally save the world from apocalypse (cultish), and we also believe that shrimp welfare is important (insane). On the other hand, maybe I just shouldn't start my conversations like that.

I may be mistaken, but I think the author was referring to positive beliefs (as opposed to normative beliefs), in which case your points 1 and 2 would be addressed. It's not made clear in this article, but that's what I believe based on the context in which I first read this essay (a series of blog posts collected in the book "Map and Territory"), which was more about seeking truth than about doing good.

There seems to be a mistake: the link at the beginning leads to the nuclear weapons problem profile.

After some reading I moved my votes around slightly, as I can't rationally justify not giving more weight to potential invertebrate suffering, and these causes likely won't attract many philanthropists from outside EA.
