Is there a maximum effective membership size for EA?
@Joey 🔸 spoke at EAGx last night, and one of my biggest takeaways was the (perhaps controversial) take that more projects should decline money.
This resonates with my experience: constraint is a powerful driver of creativity, and removing constraint does not necessarily produce more creativity (or more positive output).
Does the EA movement, in terms of number of people, have a similar dynamic within society? At what growth rate does expanding a group's membership stop being optimal and become sub-optimal? Zillions of factors to consider, of course, but it may be something fun to ponder.
Compassion fatigue should be focused on less.
I had it hammered into me during training as a crisis supporter, and I still burnt out.
Now I train others, have seen it hammered into them, and still watch many of them burn out.
I think we need to switch at least 60% of compassion fatigue focus to compassion satisfaction.
Compassion satisfaction is the warm feeling you receive when you give something meaningful to someone. If you're 'doing good work', I think that feeling (and its absence) ought to be spoken about much more.
Thoughts on the applicant feedback problem.
Could feedback on applications (for jobs, grants, whatever) be outsourced?
A lack of feedback plausibly prevents applicants from improving their applications (for jobs, grants, whatever) and keeps the entire pool of applicants (and therefore of effective altruists) less competitive.
I wonder under what conditions people could benefit from one another's feedback being visible. For example, if I applied for a job (e.g. an EA job), was knocked back, and was not given any feedback, how could my information gap be filled in novel ways?
- Maybe outsource feedback to different people, such as a curated community of HR people incentivised to contribute to a feedback pool, so that future job applicants are more prepared.
- Maybe outsource feedback to an AI, using specific training and specific templates, so that questions are designed to surface blind spots.
- Maybe outsource feedback to structured, industry-specific mentorship or networking organisations.
- Maybe the person who didn't hire me could publish one piece of detailed constructive criticism publicly, clearly de-identified or depersonalised.
Considerations:
- Maybe organisations have strong incentives not to show constructive criticism. Perhaps those incentives are a more important place to intervene or problem-solve?
- Maybe the cost of publishing high-quality constructive criticism, done effectively with the right parameters, would offer enough return to be worth it. E.g. a grants organisation publishing 'top 10 reasons we didn't go with x proposal'.
- Is this too controversial? Could a balance be struck between informative and helpful, without being condescending or offensive?
If more people had more access to more quality feedback, would that likely improve 'things'?
Learning about EA has revealed within me two distinctly different drives, both to achieve the same outcome: belonging.
On one hand, I want to share thoughts, hunches and instincts based on little more than experience, in an attempt to start discussions and hear others' thoughts.
On the other hand, I want my thoughts to be at least logical or rational enough that sharing them lowers friction for those receiving them.
When trying to write for the EA forums it feels like I'm hosting a party for guests whose expectations I'm unfamiliar with.
I don't want to out myself as not belonging, but I have to risk that in order to a) improve my thoughts and b) better find out where I belong.
The desire to belong within EA seems like a me problem. Instinct tells me it's less EA's job to make me feel welcome than it is my job to know myself with more clarity (and thus have more confidence in the value of hunches and instincts, even if they do get downvoted to oblivion as I fear they might).