I'm Jeffrey Kursonis, and I've built and co-built a number of non-profits in New York City, including The Haven, an arts and altruism collective of 300 people that gathered weekly in Manhattan for ten years; a multicultural, altruistic faith community in Harlem that is still going today; and the New York City New Sanctuary Movement, one of two main hubs of the national network of faith communities giving sanctuary protection to undocumented families pursued by federal immigration enforcement. It's a long list.
After my work in NYC, a nascent national organization, Emergent Village, tapped me to lead its early, growing network of local cohorts organizing progressive religious leaders. I formed a team and we built it up to over 100 US cities, along with many regional gatherings and other movement training and organizing (work extremely similar to CEA's). This "emerging church" movement changed the face of American religion: it moved thousands of religious leaders and their congregations to the left, spawned a whole publishing genre, helped elect Obama, helped influence our federal same-sex marriage legal framework, and sadly became a focal point of the conservative backlash unleashed by Trumpism. As a side note, I'm no longer religious, but I still deeply appreciate the proven training ground religion provides. Here is a video we produced about the national cohort network; note my name as producer in the end credits: https://www.youtube.com/watch?v=O-oaU29Z4dg
I've been an active EA for over a year now, completing the Intro and Advanced Fellowships, working as a Meta-Moderator in Virtual Programs trainings for new facilitators, and actively posting on the Forum. I recently applied to join the new CEA Virtual Programs Advisory Board.
I have had a long career in the religious world, and now that I'm no longer religious I'm rebuilding in the regular world; it's a challenge. Religion is very good at movement building and at persuading people to change their views, two things EA is also trying to do...in the internet boom of the '90s the term "evangelist" became popular for what were basically marketing pros communicating their vision. Many EAs might be surprised to learn that the word charity simply means love...in older English versions of the Bible, like the King James, verses where we now use the word love used the word charity instead. To give with no expectation of return benefit, the very core of EA, is essentially an act of charity, which is to say an act of love. These less scientific, more artful, human kinds of expression are more my style. I agree with the core EA notion of bringing the scientific method into charity work and I want to see that happen, but implementing it can still be a very human, social, and creative thing. I need help evangelizing this message. :)
If you are a young EA with any anxiety in your work and life, I am a lifelong coach and mentor to young activists, and given the culture I come from, my approach has a pastoral quality...definitely not for everyone, but very good for some.
Wow, thanks for your long and thoughtful reply. I really do appreciate your thinking, and I'm glad CU is working for you and that you're happy with it...that is a good thing.
I do think you've unfortunately given me a little boost in my argument against CU, though, with the idea that our brains just don't have enough compute. There was a post a while back from a well-known EA about their long experience starting orgs and "doing EA stuff," and the lesson they'd taken from it all is that there are just too many unknown variables in life for anything we try to build and plan outcomes for to really work out how we hoped...it's a lot of shots in the dark, and sometimes you hit. That matches my experience as well...and the reason is that we just don't have enough data or enough compute to process it all, nor adequate points or spectrums of input. The thing that better fits that kind of task is a robot with an AI mind that can do far more compute...but even they are challenged. So for me that's another good reason to doubt that CU optimizes well for humans.
And the other big thing I haven't mentioned is our mysterious inner life, the part that responds to spirituality, to emotions within human relationships, and to art...this part of us does not follow logic or compute...it is somehow organic, you could almost say quantum, in how we are connected to other people...living with it is vital for happiness. I think the attraction of CU is that it adds the logic side our inner life doesn't always have...so the answer is to live with both together: to use CU thinking for the effective things it does, but also to recognize where it is very ineffective for human thriving...and that may be similar to the difference you see between naive and mature CU. Maybe that's how we synthesize our two views.
How I would apply this to the original post here is that we should see "the gaping hole where the art should be" in EA as a form of evidence of a bug in EA that we should seek to fix. I personally hope that, as we turn this corner toward a third wave, we will include that on the list of priorities.
Yes, I very much appreciate what you're saying; I'm learning a lot from this dialogue. I think the thing I said that didn't communicate well to you and Brad West wasn't some kind of comparison of utilitarianism and communist thought...but rather a point about how people defend their ideal when it's failing, whatever it is (religion, etc.): "They're not doing it right"..."If you did it right (as I see it), then it would produce much better stuff."
EA is uniquely bereft of art compared to every other category of human endeavor: education, business, big tech, the military, healthcare, civil society, etc. In EA there have been ten years of incredible activity and massive funding, but no art in sight...so whatever is causing that is a bug and not a feature. Maybe my thesis that utilitarianism is the culprit is wrong. I'd be happy to abandon that thesis if I could find a better one.
But given that EA "attracts, creates and retains consequentialists," as you say, and that they are hopefully not the bad kind that doesn't work (naive) but the good kind that works (mature), then why the gaping hole in the center where the art should be? I think it's not naive versus mature utilitarianism; it's that utilitarianism is a mathematical algorithm and simply doesn't work for optimizing human living...it's great for robots. And it was great for the first pioneering wave of EA blazing a new path...but it is ultimately unsustainable for the future.
Erik Hoel does a far better job than I do of outlining the poison in utilitarianism that remains no matter how you dilute it or label it naive or mature (though unlike him, I am an Effective Altruist).
And of course I agree with you that "it's hard to tell one religion to be another religion," which I myself said in my reply post. In fact, I have a college degree in exactly that: Christian Ministry with an emphasis in "missions," where you go tell people in foreign countries to abandon their culture and religion and adopt yours...and you'd be surprised at how well it works. Any religious group that proselytizes usually gets decent results. I no longer agree with doing that with religion, but it is surprisingly effective...so I don't mind telling a bunch of utilitarians to stop being utilitarians. On the other hand, if I can figure out a different explanation for the debilitating lack of art in EA and the anxious mental health issues connected to guilt over not saving enough lives, I'll gladly change tactics.
If you compare EA to all those other human endeavors I listed above, what's the point of differentiation? Why do even military organizations have tons of art compared to EA?
You seem to think that if art were good for human optimization, then consequentialists would have plenty of it, so why don't they around here?
Thanks for helping me think these things through.
Well, yes, your logic is perfect, but it's a lot like the logic of communism...if humans did communism perfectly it would usher in world peace and utopia...the problem is not ideal communism, it's that somehow it just doesn't fit humanity well. Yours is the exact same argument you would hear over and over when people still argued passionately for communism..."They're just not doing it right!!"...after a while you realize it just isn't the right thing, no matter how lovely it looks on paper. Eventually almost all of them let go of that idealism, but it doggedly held on a long time, and I'm sure that will be the case with many EAs holding on way too long to utilitarianism.
Hardly anything really does fit us...the best path is to keep iterating reachable modifications wherever you are...I can see the benefits of ideal utilitarianism and I appreciate early EA embracing it with gusto...it got fantastic results, no way to argue with that. To me EA is one of the brightest lights in the world. But I've been steering and observing movements for many decades, and it's clear to me that, as in the OP, EA is transitioning into a new phase or wave. The point of resistance I come up against when I discuss there being more art in EA is the utilitarian response of "why waste money on aesthetics," or I hear about stressed, anxious EAs and significant mental health needs...the only clear answer I see to these two problems is to reform the utilitarian part of EA; that's what's blocking the move into the next era. You can run at a fast pace for a long time when you're young...but eventually it just isn't sustainable. That's my thesis...early EA's utilitarianism was awesome, but time has moved on and it's not sustainable anymore.
Changing yourself is hard; I've done it a few times, and usually it was forced on me. And I totally get that this is not obvious to most in EA...it's not popular to tell utilitarians "don't be utilitarian"...but it's true: you should not be so utilitarian, because that's for robots, and you're human. It's time to move on to a more mature and sustainable path.
I like this framing, and here are some thoughts on movements from a movement veteran. First, it's obvious EA moved from a focus on raising money for effective charities to longtermism/x-risk, and it's interesting to see all the cultural flows of that within EA. It also seems fairly obvious that the series of scandals, from FTX to sexual harassment, has sent a reverberating shock wave through EA, and to me that is the clearest sign EA is primed for a new wave, the third. I would date it not only by AI but also by averaging the dates of those big, difficult stories, from Dec. 2022 to early 2023, when FTX was still everywhere, ChatGPT debuted globally, and the sexual harassment stories broke. The fact that EA is years ahead of any other organized movement in organizing on AI safety means it can be a hub, and that has value moving forward.
I think the really big question is this: what effect will all the trauma and embarrassment, and people reassessing themselves and EA as a whole, end up having on the steering rudder of EA...where will it turn, and in what ways will it change? Comments on that would be very interesting.
Here's my comment (if you were to read all my posts and comments you would see the trend): I don't know at all what direction the rudder will steer us toward, but I hope it includes a huge cultural reformation around utilitarianism. One of the iconic quotes that gives me this idea is from an interview in which SBF says he's a Benthamite utilitarian, very shortly before he's revealed to be a historically awful fraudster who was spawned and enabled by a bunch of Benthamite utilitarians calling themselves Effective Altruists.
Now I know some leading lights have said recently, "Aw, we haven't really been hardcore Benthamites...we've always been more balanced." I would say that's classic blind-spot talk, because EA, you have no idea how strongly utilitarian you come across to anyone on the outside...you are not at all balanced, you are hardcore utilitarian...if you think you're balanced, you're just too far inside to know how things look from the outside. I think what's happening mentally is that you are smart enough to imagine the freakish, crazy side of extreme utilitarianism and you know you aren't that...but that's because nobody is that freakish except literal psychopath outliers who don't count. Instead, you are still firmly situated in a kind of utilitarianism which, though balanced with some common sense (which is both socially and practically unavoidable), is still very far from where most of your peers in non-EA cultural worlds sit.
I can imagine all the defensive comments saying it's not that bad, we're more balanced, but as I said above, that's just being too far inside to see from the outside. If there were any one major cultural thing that typified EA and EA people, it would be utilitarianism...eating Huel alone at your desk so you can grind on and be more effective is, to me, the iconic image of that.
I know this is tough love, but I do dearly love EA...and I just want everyone to be happy, to stop eating Huel alone at their desks, and to discover the joy of being with others and having an ice cream cone now and then. You'll be far better optimized for your good work that way. Utilitarianism optimizes for robots, not for humans. Effective Altruists are humans helping humanity...optimize for humanness.
Really great to hear, especially because they are using actors to record "radio dramas" telling stories relevant to their audience, i.e., using art thoughtfully to influence. It's interesting to note that in the cultures they broadcast within, radio is the preeminent medium people listen to. It's great to see the power of creative media and how its impact can be measured. Every single EA org, whether a charity, a research group, or CEA, could use more creative media to amplify the effects of its work. That's a baseline understanding outside of EA, but in this utilitarian bubble it hasn't quite made its mark yet..."wasteful aesthetics" is the false meme around here. Thanks to FEM for showing the reality.
You have a lot of great ideas. The one trend I see that aligns with some of my thoughts is a general sense that "EA culture" is not for everyone, and how that informs our outreach. I personally love EA culture, but I'm also not a STEM person, and I can clearly see how some people might call EA culture "borg-like," as Jenn reported in her forum piece just the other day: https://forum.effectivealtruism.org/posts/5oTr4ExwpvhjrSgFi/things-i-learned-by-spending-five-thousand-hours-in-non-ea. I also appreciate Vanessa's comments below; she seems to be killing it in her work. And below, zchuang seems to hit it on the head that there's been a weird confluence of events: FTX, WWOTF dropping just as LLMs hit the stage publicly, and Covid-19 changing things...if ever there was a time for rethinking and pivoting on our direction, it's now.
I have many years of experience in multiple kinds of religious communities, and I can say I've seen this kind of dynamic...this is great thinking.
I remember reading about a group of people early on at Alameda, before FTX started, including a good number of EAs, who stood up to SBF and demanded a better contract; he dismissed them and they all left...one can't help but think that helped create an evaporative concentration of people willing to go along with his craziness.
As an aside, for pondering: the evaporative cooling dynamic is something I experience in cooking almost daily, where you reduce a liquid to make a sauce with more concentrated flavors...cooks always refer to this as reduction, or a reduction sauce. In that case it's a good thing...but knowing this dynamic from cooking helped me understand your argument.