YTJ

Yelnats T.J.

276 karma · Joined

Bio

CE Incubatee 2023

Talk to me about American governance/political systems/democracy

 

My journey to EA:

  • 2010: start arriving at utilitarian-adjacent ethics
  • 2013: read Peter Singer's "Famine, Affluence, and Morality"
  • Circa 2013/14: find my way to EA through googling about Singer and FAaM
  • 2014–2019: in the orbit of EA, i.e. I'd talk to people about morality and utilitarian ideas but wasn't very engaged in the community, aside from attending uni club meetings every once in a while.
  • 2020: EAGxVirtual (I’m starting to move from the orbit closer to the actual community)
  • 2022: Dive deep into the community. And now we arrive at the present day. 

Comments
42

Also in support of the sodium tax: we've seen health taxes used as a cost-effective way to improve health and save lives for tobacco, alcohol, and sugar-sweetened beverages. For tobacco, taxation is the most cost-effective of all tobacco control measures. I'm not surprised to see the evidence point toward a sodium tax.

To be clear, I am not CE staff or a seed funder. I don't make decisions about how much is doled out or the culture upstream of those decisions. I'm an incubatee focused on the co-founder search here. Yes, I'd like to see more funding, but I'm not gonna benefit by railing here against a group of people whose circumstances and reasoning I don't know.

Also worth noting that most CE charity founders are not American. The range is much more manageable for people elsewhere around the globe. And people with more needs have asked for more, e.g. a line item for American health insurance.

I was referencing the comment for this: "We are a bit skeptical about the perception that talent increases from offering higher salaries (instead of attracting new talent, we typically see the same EA people getting job roles but just for a higher cost). "

My understanding is that the 40-60k figure is for CE itself, not the founders of CE charities.

Yeah, I'm not opposed to better compensation for founders. Note: founders with more needs have asked for more before. How the seed funders have weighed that is opaque, though.

As a principle, I'd like to see impactful roles compensated more than non-impactful ones; e.g., a marketing position figuring out how to sell more sugary beverages to kids shouldn't be better rewarded than someone trying to solve a real problem.
 

Re: "a salary range like this just can't be optimal"
Joey (co-founder of CE) had this to say about it in CE's AMA:
https://forum.effectivealtruism.org/posts/xnHnsrFEMEMPXBWqR/ask-charity-entrepreneurship-anything?commentId=w93Smttpwa8eKboT7

My two cents:
As I mentioned in the post, a co-founding team and an intervention are an experimental product. Only about a third of CE charities will go on to be stellar. So the ETG seed funders don't want to spend $80k on a nice salary for a founder who isn't fully proven yet, at an org that might go bust in a year.

You can also launch more charities (i.e. experiments) per year when more founders can take a lower salary (and co-founder salaries are the biggest line item of seed budgets).

That being said, my personal opinion is that there can be too much of a race-to-the-bottom dynamic with salaries, which can leave some founders in their first year feeling financial stress, or "saving" money by skipping conveniences and things that have an ROI for annual productivity. (But to be fair, I haven't founded a charity yet; maybe some other alums will disagree.)


It should be noted that some CE founders saw a significant bump in their salaries in their second year, following a successful fundraising round.

To my knowledge, the highest co-founder salary that has been asked of the seed funders was $54k, though even then I don't know what they did in practice. Most asks are $25k–45k. Again, what co-founders actually ended up paying themselves after they figured out the amount of seed funding received is a different question.

 

Revised to say 54k (previously said 51k)

Flagging where I think the next problem can arise: AI doomers taking matters into their own hands to slow down AI progress, or to bring attention to the issue, through tactics and strategies reminiscent of the Earth Liberation Front in the late '90s/early '00s.

Perceiving something as existential and impending, and the typical options for change as inconsequential, creates a logic for drastic escalation. I've heard this sentiment go as far as tepid support for triggering a great-power war that would likely decimate semiconductor production, because it would slow down AI development.

If you told me that five years from now a fringe EA did something like send a pipe bomb to an AI exec, I would not be surprised. We as a community should be on guard against doomer unilateralists doing something extreme in the name of an EA cause area.

I've been hearing for a year about an EAGx potentially being run in Sub-Saharan Africa. Any updates on that? I heard CEA was hesitant to back an EAGx in Africa until communities there were more developed. If CEA is hesitant, it would be great to hear why, and what they would want to see in order to get behind an EAGx on the continent.
