
Dicentra

530 karma · Joined

Comments (39)

I heard someone from Kevin Esvelt's lab talking about this + pain-free lab mice once

I upvoted this because I like the passion, and I too feel a desire to passionately defend EA and the disempowered beneficiaries EAs seek to protect, who are indirectly harmed by this kind of sloppy coverage. I do hope people respond, and I think EAs err towards being too passive about media coverage. 

But I think important parts of this take are quite wrong. 

Most people just aren't basically sympathetic to EA, let alone EAs-waiting-to-happen; they have a tangle of different moral intuitions and aren't very well-informed or thoughtful about it. Sure, they'll say they want more effective charity, but they also want to give back to their local community and follow fads and do what makes them feel good and support things that helped them in particular and keep the money for themselves and all kindsa stuff. So, I don't think this is surprising, and I think it's important for EAs to be clear-eyed about how they're different from other people.

I don't think that means EAs could never be a dominant force in philanthropy or whatever; most people throughout history didn't care about anti-racism or democracy, but those are popular now; caring about your ancestors has declined a lot; things can change. I just don't think it's inevitable or a foregone conclusion (or that it couldn't reverse). 

If someone wrote an article about a minority group and described them with a few nasty racist stereotypes, there would be massive protests, retractions, apologies and a real effort to ensure that people were well informed about the reality.

People would do this for some kinds of minorities (racial or sex/gender minorities), and for racist stereotypes. I don't think they would for people with unusual hobbies or lifestyle choices or belief sets, with stereotypes related to those things. "Not being racist" or not discriminating against some kinds of minorities is a sacred value for much of liberal elite society, but many kinds of minorities aren't covered by that.  

Crappy stereotypes are always bad, but I don't think being a minority means you shouldn't potentially be subject to serious criticism (though unfortunately, this particular criticism isn't intellectually serious). 

I don't think I saw the 80k thing in particular at the time 

I agree with some of the thrust of this question, but want to flag that I think these sources and this post kind of conflate FTX being extravagant with SBF personally being so. E.g. if you click through, the restaurant tabs were about DoorDash orders for FTX, not SBF personally. I think it's totally consistent to believe it's worth spending a lot on employee food (especially given they were trying to retain top talent in a difficult location in a high-paying field) while being personally more abstemious.

As an EA at the time (let's say mid-2022), I knew there were aspects of the FTX situation that were very plush. I still believed it was part of SBF's efforts to make as much money as possible for good causes, and I had heard SBF say things communicating that he thought it was worth spending a lot in the course of optimizing intensely for the best shot of making a ton of money in the long run, and that he was generally skeptical of the impact of aiming at frugality. My impression at the time was indeed that the Corolla was a bit of a gimmick (and that the beanbag was about working longer, not saving money), but that SBF was genuinely very altruistic and giving his wealth away extremely quickly by the standards of new billionaires. 

Yeah, re the export controls, I was trying to say "I think CSET was generally anti-escalatory, but in contrast, the effect of their export controls work was less so" (though I used the word "ambiguous" because my impression was that some relevant people saw it as a pro of that work that it also mostly didn't directly advance AI progress in the US, i.e. it set China back without necessarily bringing the US forward towards AGI). To use your terminology, my impression is that some of those people were "trying to establish overwhelming dominance over China" but not by "investing heavily in AI". 

I largely agree with this post, and think this is a big problem in general. There's also a lot of adverse selection that can't be called out because it's too petty and/or would require revealing private information. In a reasonable fraction of cases where I know the details, the loudest critic of a person or project is someone who has a pretty substantial negative COI that isn't being disclosed, like that the project fired them or defunded them, or the person used to date them and broke up with them, or something. As with positive COIs, there's a problem where being closely involved with something both gives you more information you could use to form a valid criticism (or make a good hire or grant) that others might miss and is correlated with factors that could bias your judgment. 

But with hiring and grantmaking there are generally internal processes for flagging these, whereas when people are making random public criticisms, there generally isn't such a process.

This is inconsistent with my impressions and recollections. Most clearly, my sense is that CSET was (maybe still is, not sure) known for being very anti-escalatory towards China, and did substantial early research debunking hawkish views about AI progress in China, demonstrating it was less far along than was widely believed in DC (and EAs were involved in this because they thought it was true and important, and because they thought prevailing false fears in the greater natsec community were enhancing arms race risks) (and this was when Jason was leading CSET, and OP supported its founding). Some of the same people were also supportive of export controls, which are more ambiguous in sign here.

Yeah, on second thought I think you're right that at least the argument "For a fixed valuation, potential is inversely correlated with probability of success" probably got a lot less attention than it should have, at least in the relevant conversations I remember.

I'm a bit confused about how the first part of this post connects to the final major section... I recall people saying many of the things you say you wish you had said... do you think people were unaware FTX, a recent startup in a tumultuous new industry, might fail? Or weren't thinking about it enough? 

I agree strongly with your last paragraph, but I think most people I know who bounced from EA were probably just more gold-digging, fad-following, or sensitive to public opinion, and less willing to do what's hard when circumstances become less comfortable (but of course they won't come out and say it, and plausibly don't admit it to themselves). Of the rest, it seems like they were bothered by some combination of the fraud and how EAs responded to the collapse, and updated towards the dangers of more utilitarian-style reasoning and the people it attracts. 
