Doing alignment research with Vivek Hebbar's team at MIRI.
Should the EA Forum team stop optimizing for engagement?
I heard that the EA Forum team tries to optimize the forum for engagement (i.e. tests features to see whether they improve engagement). There are positives to this, but on net it worries me. Taken to the extreme, this is a destructive practice, as it would reward content for being attention-grabbing rather than genuinely valuable.
I'm not confident that EA Forum is getting worse, or that tracking engagement is currently net negative, but we should at least avoid falling into this instance of Goodhart's Law.
I was thinking of reasons why I feel I get less value from EA Forum, but these are not the same as reasons EAF might be declining in quality. So the original list would miss the more insidious (to me) mechanisms by which EAF could actually be getting worse. For example, I often read something like: "EA Forum keeps accumulating more culture/jargon; this is of questionable value, but posts that don't use the EA dialect are received increasingly poorly." There are probably more mechanisms I can't think of, and these are harder for me to judge...
The epistemic spot checker could also notice flaws in reasoning; I think Rohin Shah has done this well.
Note that people in the US/UK, and presumably other places, can buy drugs on the grey market (e.g. here) for less than standard prices. Although I wouldn't trust these 100%, they should be fairly safe because they're certified in other countries like India; gwern wrote about this here for modafinil, and the basic analysis seems to hold for many antidepressants. The advertised shipping times are fairly long, but this is potentially still less hassle than waiting for a doctor's appointment for each one.
This currently has +154 karma on EA Forum and only +24 on LW, with similar exposure on each site, so I think it's fair to say that the reception is positive here and negative on LW. Maybe it's worth thinking about why.