I think that the poor outcomes you listed - causing reputational damage, spreading a distorted version of EA, over- or under-focusing on certain cause areas, giving bad career advice, and so on - are on the mark, but they might not entirely stem from EA community builders failing to take the time to understand EA principles.
For example, I can imagine a scenario where an EA community builder is not adept at presenting cause areas, but understands the cause-area landscape very well. As a result of their poor communication skills (and perhaps also a lack of self-assurance), some individuals in the organization who are on the edge of adopting EA begin to doubt the community builder and eventually leave.
Back to the question: I think that group leaders, including EA community builders, often don't take the time to empathize with, or comprehend, what the group's focus means to each of its members.
The question of how a given organization, movement, or cause (in this case EA and its cause areas) fits into a member's life is useful in that the answer can predict how committed that member is, or how long they'll stick around.
In my personal experience coming to understand EA, and in helping others at my undergraduate institution understand EA principles, I have noticed that there are a few highly involved individuals and many other, less involved individuals on the periphery.
Much of the work the highly involved individuals expend trying to get the less involved individuals more engaged could be avoided by communicating the content of the group's activities more clearly. Regularly making sure that everyone is on the same page (literally just bringing it up in conversation) can help to reduce the damage caused by the EA community builder.
Practically speaking, a few exercises would likely achieve this better communication: asking each member of the group what EA means to them, having each member present their case for why their preferred cause area is more pressing than others, and running anonymous surveys to check that the group shares an understanding of EA principles and of what it means to make an impact.
"EA is an aggressive set of memes. Best handled by people firmly grounded in some other community or field or worldview."
What do you mean?
(Context: I accept the following philosophy almost hook, line, and sinker.)
I mean the amount of mental room the philosophy takes up: the uncapped demands, the absence of an act/omission distinction, the absence of a notion of "enough" or supererogation, the astronomical stakes, and the sweeping devaluation of ineffective things.
Consider the strong Singer principle:
"If it is in our power to prevent something bad from happening, without thereby sacrificing anything of comparable moral importance, we ought, morally, to do it."
and even the weak Singer principle:
"If it is in our power to prevent something very bad from happening, without sacrificing anything morally significant, we ought, morally, to do it."
This is in some sense much broader than most sets of deontological rules: it applies to more of your actions, and for practical purposes neither constraint ever stops being active. "Not helping is the same [value] as harming." The work is never done. It can take over your whole life if you let it, and I know people who did.
Throw in the vast, humbling scale of longtermism and you have something that can totally occupy you indefinitely.
What's wrong with that, if the philosophy is...