I want to offer a different perspective on this post about why people use burner accounts, from someone who has used one previously. This is not intended as a rebuttal of the arguments made there; it is meant to add to the public discourse, and it made more sense as a separate post than as a comment.
I hope that any upvotes/downvotes are given based on the effort you think I put into offering a different perspective, rather than whether you agree/disagree with my comments (NB: I think there should be separate upvote/downvote and agree/disagree buttons for forum posts too).
Disclaimer
Note that...
- I used a burner account in early 2021 after finding myself unhappy with the EA Community.
- I've been visiting the EA Forum weekly since then and still occasionally go to EAGs.
- I work at an EA organisation that receives significant funding from Open Philanthropy.
Reasons for Using Burners
The two biggest reasons for using burners are the potential operation of blacklists and concerns around funding. I'll use them together (with anecdotes and some evidence) to make an overarching point at the end, so read all the way through.
Blacklists
- Unfortunately, some EA groups and organisations use blacklists (this seems more common in the Bay Area).
- Note that this is difficult to prove as...
- I'd have to directly ask them, "Are you using a blacklist? I've heard X rumour that seems to suggest this", and they're very unlikely to say yes (if they are) as it's not in their interests.
- I don't want to be seen as a "troublemaker" by suggesting an organisation is using a blacklist, even when I have strong reason to believe they are. If they do operate a blacklist, I'd likely be blacklisted from events for being a "troublemaker".
- [Anecdote suggesting the existence of a blacklist removed on request]
- People have been told that they may have been blacklisted by organisers for being "epistemically weak" and not "truth-seeking" enough.
- These are common terms used amongst Bay Area rationalists and even some funders.
- When I searched the term truth-seeking on the EA Forums, I found this comment by a funder who was later asked by the OP of the post what "truth-seeking" meant.
- Anecdotally, a friend of mine was rejected by the Open Philanthropy Undergraduate Scholarship with the grantmaker saying they weren't "truth-seeking enough" as their feedback.
- Strong Personal Opinion: I think the problem here is rationality. It provides a camouflage of formalism, dignity, and an intellectual high ground for when you want to be an absolute asshole, justify contrarian views, and quickly dismiss other people's opinions. By contrarian, I mean, for example, thinking diversity is not important (as many Bay Area rationalists do) when most of Western society and the media think it is.
- Justifying this strong personal opinion with 20+ anecdotes would take a post of its own. I may write it in the future if there's enough interest and I can do so without the risk of doxxing myself by accidentally including de-anonymizing details.
- Previously, I've pushed some rationalists on why they thought someone wasn't "truth-seeking" enough or was "epistemically weak". Around half the time, they couldn't give a clear answer, which makes me believe they're using these as buzzwords to camouflage the fact that they don't like someone for whatever reason.
Funding
- Another claim that is hard to prove is that 'there is/has been an intermingling of funding and romantic relationships'. This becomes more complicated with the prevalence of polyamory.
- I realised about two years ago that this was disturbingly common, but I chose not to speak up for fear of being blacklisted.
- I've been inspired by courageous people making posts on this (with burner accounts). I didn't have the courage to write a post when I first noticed this two years ago, and I still didn't on the many occasions since.
- A rumour about a senior program officer at Open Philanthropy and a grantee in a metamour-relationship has previously been 'verified' on the Forums.
- There seems to be an inner circle of funders and grantees (predominantly in the Bay Area) where the grantees often don't need to write grant applications and can just ask for money (often retroactively).
- Note that this is also hard to prove. I could email Open Philanthropy asking this, but my organisation receives significant funding from them, and I don't want to be seen as a "troublemaker". I like to think they don't operate a blacklist, but even if there's a 1-20% chance they do, questioning a grant only to be later put on a blacklist with my organisation being defunded is not in my interests, given I have to make a living.
- This touches on a wider point about having your entire life, relationships (professional, personal, and romantic), professional identity, and personal identity, wrapped up in EA while also needing to make a living. When this is true for hundreds of people in a tight-knit community with regular conferences and meetups, it leads to strange dynamics and decision-making.
- Therefore, I feel comfortable questioning these grants using burner accounts.
- I'll do some of this now.
- Despite Holden "pausing most new longtermist funding commitments" in November 2022 (only to unpause them in January 2023), the Atlas Fellowship (which falls under longtermism) received $1.8 million in December 2022. (Note the use of "most"; this suggests to me that the same rules don't apply to everyone, as you'll see below.)
- I find this problematic as the Atlas Fellowship shares offices with Open Philanthropy in the Bay Area. The offices are called Constellation.
- In the Forum post linked on the word "Constellation", it says, "Constellation is run by Redwood Research. Aside from the Redwood team, which is about 30 people, they have a coworking space that is more targeted at organisations rather than independent researchers or smaller projects. Thus, Constellation hosts staff from Open Phil, ARC, the FTX Future Fund, CEA, AI Impacts, Atlas Fellowship, MIRI, Lightcone, Alvea, and GCP. Access to Constellation is typically more limited than Lightcone. Currently (as of July 2022) there is no application form, and they are mostly focused on supporting members from the organisations in the space."
- Note that all these organisations (besides the FTX Future Fund) receive significant funding from Open Philanthropy. This includes Redwood Research.
- The fact that there is no application form seems to add to the cliquiness and to the sense that there may be an "inner circle" of funders and grantees who hold each other less accountable.
- I have visited Constellation before, and all three individuals in question in Rumour 4 of this post currently work there regularly or have done so regularly in the past.
- I didn't realise until this post that a Senior Program Officer at Open Philanthropy is married to the CTO of Redwood Research. I find this disturbing as Redwood Research received $10.7m from Open Philanthropy without mention of the grant investigator.
- (Encrypted in rot13; see the short decoding sketch after this list) Ol Fravbe Cebtenz Bssvpre ng Bcra Cuvynaguebcl, V'z ersreevat gb Pynver Mnory. Ol PGB bs Erqjbbq Erfrnepu, V'z ersreevat gb Ohpx Fpuyrtrevf.
- More widely, I'd be interested in how power dynamics work at Open Philanthropy. Even if the grant investigator for Redwood Research is one of the other Program Officers at Open Philanthropy, does the Senior Program Officer (who, remember, is married to the CTO of Redwood Research) have to agree to the grant amount/decision that their subordinate makes?
- I imagine there are power dynamics where a program officer wants a promotion, doesn't want to risk being fired, or wouldn't want to disappoint their superior (who is married to the CTO of Redwood Research) by giving her husband a smaller amount than requested.
- Things become significantly more complicated once you throw polyamory into the mix.
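For anyone unfamiliar with rot13: it simply rotates each letter 13 places in the alphabet, so applying it a second time recovers the original text. Below is a minimal decoding sketch, assuming a standard Python environment; the truncated string is a placeholder for the full rot13 text in the bullet above.

```python
import codecs

# rot13 shifts each letter 13 places in the alphabet, so encoding and
# decoding are the same operation; applying it twice returns the original.
encoded = "Ol Fravbe Cebtenz Bssvpre ..."  # placeholder: paste the full rot13 text here
decoded = codecs.decode(encoded, "rot13")
print(decoded)
```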
By "ask for money (often retroactively)", I am referring to thegrant made to the Future Forum(a conference held Aug 4 - 7, 2022).What is true is that the Future Forum was kicked out of the advertised venue (in theNeogenesis Group House) due to noise complaints from neighbours (an attorney showed up on the driveway and told everyone to leave). The problem was this happened on the Day 1 of the conference, Day 2 was still in the group house, but after Day 2, the volunteers had to work through the night (reportedly with no breaks) to set up a new venue for Day 3 and Day 4.From the volunteers, I've heard that the Future Forum's organisers were so bad that anyone from the CEA Events Team still in the Bay Area post-EAG SF (July 29 - 31, 2022) had to step in to clean up their mess.Cleaning up their mess included getting a new venue last minute (which was very expensive), which took them into large debt, and then, reportedly, being bailed out by Open Philanthropy (retroactively).They could have been bailed out because the organisers were on good terms with the funders.- Whilst I can't verify this, I believe this is true as I've seen instances of smaller grants (often in the $ 1000s) where a well-connected grantee will spend the money, go into debt, and ask to be bailed out by a funder instead of applying for a grant to begin with. This is a bad culture.
- NB: this grant also doesn't have a grant investigator listed. I think all OP grants should have their grant investigators listed with an indication of who made the final call and what percentage of time each grantmaker spent on the application (which shouldn't be hard to do with most time-tracking software).
My Overall Point
Ultimately, my overall point is that one reason for using a burner account (as in my case) is that if you don't belong to the "inner circle" of funders and grantees, then I believe different rules apply to you. If you want to join that inner circle, you had better not question grants by directly emailing OP. And once you're inside the inner circle but want to criticise grants, you must use a burner account or risk being defunded or blacklisted. If you ask why you were blacklisted, the real reason, "we don't like you" or "you're a troublemaker", will be camouflaged as you being "epistemically weak" or "not truth-seeking enough."
Edit 1: I edited the beginning of this post as per this comment.
Edit 2: Retracted Future Forum statement because of this comment.
Edit 3: Anecdote removed on request.
I'm Isaak, the lead organizer of Future Forum. Specifically addressing the points regarding Future Forum:
I don't know whether retroactive funding happens in other cases. However, all grants made to Future Forum were committed before the event. The event and the organization received three grants in total:
Applications for the grants were usually sent 1-3 weeks before approval. While we had conversations with funders throughout, all applications went through official routes and application forms.
I received the specific grant application approval emails on:
The event ran from August 4-7th. I.e., we never had a grant committed "retroactively".
Knowing that the event was experimental and that the core team didn't have much operat...
Hi, I'm Leilani. I run the org that was brought on to help with Future Forum in the final weeks leading to the event.
I wanted to verify that no grants were applied for retroactively by Future Forum or Canopy. All funding was approved prior to the event. All OP funding was also received prior to the event. At no point have we been in debt.
Additionally, we are eternally grateful for the CEA events team and our volunteers for all their help. It was a stressful and unexpected situation that we would not have gotten through without them.
Hi, I'm Patrick Finley. I want to chime in here, because I think there are a number of fairly ridiculous claims below about Future Forum (in addition to the original post). I also think Isaak's response above is overly generous/conservative, and I want to share my opinion on just how far the delta is between this post+comments and reality.
I attended Future Forum and it was easily the best conference/event I've been to in the Bay Area on the vectors that matter (i.e., quality of conversations, people, speakers, new connections, etc.), and frankly it's not close (I've been to EAGs & others). I don't mean this as a knock on others (the standard at EA & adjacent events seems pretty good); rather, this was unusually great.
The negatives I heard during the event and afterwards were about behind the scenes stuff that didn't seem to affect the actual value to attendees much (including the below). Eg I was at a hotel, and had a nice comfy bus take me to a new venue bc of issues w/ neighborhood. Not sure how this makes the event worse if you're talking about the purpose of the event vs things that don't matter for attendees.
- The event went from idea to reality in like 3 months. Isaak founded a...
Suggestion of consensus opinion:
People should not be dating or living with those who work for them. If they start, they should tell HR so they can be transferred.
Your grant hypo is likely moot because making the grant without the partner recused would probably be unlawful -- at least in the US and if a nonprofit were involved. That's self-dealing and improper inurement (private gain) in my book.
[EDIT: it looks like the summary in the post was wrong and there wasn't any retroactive funding; see Isaak's comment.]
In your section on asking for money retroactively you mostly rely on an event where you say:
It was organized by one group of organizers.
They messed up pretty badly and the CEA Events team took it over very late in the process (either just before or during, I can't tell) and worked really hard to keep it going.
OP agreed to fund CEA's work on this retroactively.
I'm just going on your description, but what would you have liked CEA and/or OP to do differently? Some options, none of which are great:
CEA doesn't step in. Lots of people have a bad experience at an EA-[EDIT: adjacent?] event.
CEA waits to step in until they have funding formalized. This turns into #1, because they were stepping in at the last minute and there wouldn't have been time. (Possibly CEA informally confirmed with OP that this was the kind of thing they would likely fund -- I don't know.)
CEA doesn't ask OP to fund it or OP refuses, and CEA fundraises for it independently. This isn't terrible, but it's a complicated situation that requires a lot of trust, exactly the kind of
I think stuff failing is usually not as bad as it seems, and most of the reputational harm would have fallen on the organisers (who deserved it). On the other hand there are substantial risks to creating a perception that poorly executed things will be bailed out.
So from an outside view (I don't know much about the event itself) I say they should have let it fail.
(EDIT: to be clear, in light of the new comments about the Future Forum: I think bailouts are often bad; if the Future Forum wasn't actually bailed out, then great.)
Thanks; edited! Are you saying you like (1) because it wouldn't actually have been that bad for it to implode?
I'm not going to deal with the topic of the post, but there's another reason to not post under a burner account if it can be avoided that I haven't seen mentioned, which this post indirectly highlights.
When people post under burner accounts, it makes it harder to be confident in the information that the posts contain, because there is ambiguity and it could be the same person repeatedly posting. To give one example (not the only one), if you see X number of burner accounts posting "I observe Y", then that could mean anywhere from 1 to X observations of Y, and it's hard to get a sense of the true frequency. This means it undermines the message of those posting, to post under burners, because some of their information will be discounted.
In this post, the poster writes "Therefore, I feel comfortable questioning these grants using burner accounts," which suggests in fact that they do have multiple burner accounts. I recognize that using the same burner account would, over time, aggregate information that would lead to slightly less anonymity, but again, the tradeoff is that it significantly undermines the signal. I suspect it could lead to a vicious cycle for those posting, if they repeatedly feel like their posts aren't being taken seriously.
Here's an example of a past case where a troll (who also trolled other online communities) made up multiple sock-puppet accounts, and assorted lies about sources for various arguments trashing AI safety, e.g. claiming to have been at events they were not and heard bad things, inventing nonexistent experts who supposedly rejected various claims, creating fake testimonials of badness, smearing people who discovered the deception, etc.
One thing I'd like to quickly flag on the topic of this comment: using multiple accounts to express the same opinion (e.g. to create the illusion of multiple independent accounts on this topic) is a (pretty serious) norm violation. You can find the full official norms for using multiple accounts here.
This doesn't mean that e.g. if you posted something critical of current work on forecasting at some point in your life, you can't now use an anonymous account to write a detailed criticism of a forecasting-focused organization. But e.g. commenting on the same post/thread with two different accounts is probably quite bad.
Can you name the prior burner to establish a link?
Something I agree strongly on: I think EA is far too casual as a movement about conflicts of interest and mixing romantic and personal relationships. Extremely important executives at influential billion-dollar companies are routinely fired for the kind of relationship that wouldn't even warrant a mention in EA. We should, as a community, have MUCH stronger guardrails where there is never a question of impropriety in this way.
In the case you cite, they were fired for not disclosing the relationship. My understanding is that the way this normally works is that you tell HR, and then the company figures out how to move people around so that neither is in a position to unfairly affect the other's work at the company.
(To give a non-central example, when my wife (then-fiancee) and I worked in a kitchen I reported to the head cook even in cases when my wife would normally have been my supervisor.)
Several people were confused by what I meant here
The problem goes beyond guardrails. Any attempts to reduce these conflicts of interest would have to contend with the extremely insular social scene in Berkeley. Since grantmakers frequently do not interact with many people outside of EA, and everyone in EA might end up applying for a grant from Open Phil, guardrails would significantly disrupt the social lives of grantmakers.
Let's not forget that you can improperly favor not just romantic partners, but also friends. The idea of stopping Open Phil from making grants to organizations where employees are close friends with (other) grantmakers is almost laughable because of how insular the social scene is, but that's not at all normal for a grantmaking organization.
Even if Open Phil grantmakers separated themselves from the rest of the community, anyone who ever wanted to potentially become a grantmaker would have to do so as well because the community is so small. What if you become a grantmaker and your friend or romantic partner ends up applying for a grant?
In addition, many grants are socially evaluated at least partially, in my experience. Grantmakers have sometimes asked me what I think of people applying for grants, for ex...
To be very clear: I am not saying "this can never be changed." I am saying that it would require changing the EA social scene-- that is, to somehow decentralize it. I am not sure how to do that well (rather than doing it poorly, or doing it in name only). But I increasingly believe it is likely to be necessary.
I have not (yet) known myself to ever be negatively affected for speaking my mind in EA. However, I know others who have. Some possible reasons for the difference:
- My fundamental ethical beliefs are pretty similar to the most senior people.
- On the EA Forum, I make almost extreme effort to make tight claims and avoid overclaiming (though I don't always succeed). If I have vibes-based criticisms (I have plenty) I tend to keep them to people I trust.
- I "know my audience:" I am good at determining how to say things such that they won't be received poorly. This doesn't mean "rhetoric," it means being aware of the most common ways my audience might misinterpret my words or the intent behind them, and making a conscious effort to clearly avoid those misinterpretations.
- Related to the above, I tend to "listen before I speak" in new environments. I avoid making sweeping claims before I know my audience and understand their perspective inside and out.
- I'm a techy white man working in AI safety and I'm not a leftist, so I'm less likely to be typed by people as an "outsider." I suspect this is mostly subconscious, except for the leftist part, where I think there are some community members who will...
So, to clarify: a guess from an unrelated party about why this talk might have resulted in a lack of an invitation pattern-matched to language used by other people, in a way that has no (obvious to me) relationship to blacklists...?
I'm not sure what this was intended to demonstrate.
I am curious how you would distinguish a blacklist from the normal functioning of an organization when making hiring decisions. I guess maybe "a list of names with no details as to why you want to avoid hiring them" passed around between organizations would qualify as the first but not the second? I obviously can't say with surety that no such thing exists elsewhere, but I would be pretty surprised to learn about any major organizations using one.
Thank you for a good description of what this feels like. But I have to ask… do you still "want to join that inner circle" after all this? Because this reads like your defense of using a burner account is that it preserves your chance to enter/remain in an inner ring which you believe to be deeply unethical. Which would be bad! Don't do that! Normally I don't go around demanding that people must be willing to make personal sacrifices for the greater good if they want to be taken seriously, but this is literally a forum for self-declared altruists.
Several times I've received lucrative offers and overtures from sources (including one EA fund) that seemed corrupt in ways that resemble how you think your funder is corrupt. Each t...
Just for the record, since I can imagine the comments giving people a vibe that getting retroactive funding is bad: If you run an EA project and for some unexpected reason you go over budget, please do apply for retroactive funding at least from the LTFF. Planning is hard, sometimes things go wrong.
It won't always make sense to bail you out, but I do actually prefer the world where we fund people enough to cover the 80th percentile of expected cost and then fill you up in the remaining 20% instead of a world where we fund everyone to the 98th percentile of cost and then have people try to give money back to us, or generically overfund a bunch of projects.
Like others commenting, I'm not convinced that the anecdotes here point to blacklists. I will say, if organizations do have blacklists and put people on them for reasons like "they gave a talk I didn't like", that's very bad, and I'm against it.
I do think labels like "weak epistemics", "not truth-seeking", and "not rational" are often completely contentless and are basically power moves. I basically think there are few contexts in which it makes sense to apply these to other people in the community, and if you think there's something flawed about the way a person thinks, you should state it more precisely (e.g. "this person seems to make hyperbolic and false claims a lot", or "they gave a talk and it seemed to be based on vague vibes rather than evidence", or "they value other things much more highly than utility and I value utility extremely highly, so we don't agree").
- Sorry, Constellation is not a secret, exclusive office (I've been invited, and I'm incredibly miscellaneous and from New Zealand).
- It's unlikely Claire gave a grant to Buck since (a) like you said, this is a well-known relationship, and (b) the grant-makers for AI safety are two people who are not Claire (Asya Bergal and Luke Muehlhauser).
- From personal experience, it's actually really easy to talk about diversity in EA? I literally chatted with someone who is now my friend: I said I believe in critical race theory, they responded that wokism hurt a lot of their friends, and now we're friends and talk about EA and rationalism stuff all the time. I find most rationalists are so isolated from the blue tribe nowadays that they treat diversity chat with a lot of morbid curiosity, at least if you can justify your beliefs well.
- Blacklists as I understand them have really high bars and are usually used for assault or when a person is a danger to the community. I also think not inviting someone to a rationality retreat because you don't want to hang out with them is fine. I would rather die th...
It's a WeWork, and from my understanding it doesn't accept more applications because it's full.
I don't think this is standard anywhere for grantors, but I was unsure, so I checked a few: Carnegie, Gates, and the National Council of Nonprofits guidance. All three require disclosure, some cases require recusal, and none of the three ban funding.
This seems pretty hard to put into practice. Let's say TLA gets most of its funding from OP. TLA is considering hiring someone: should they ask "are you romantically involved with any of these 80 people?" as part of their decision to hire, and weigh employing this particular person against the difficulty of making up the funding shortfall? Or after hiring someone, should TLA then ask the question, and just accept losing most of their funding if the answer is yes? Should OP be doing the same? ("Are you romantically involved with any of these ~10,000 people at these organizations we have an ongoing relationship with?")
The particular situation you're talking about is with a relatively senior person at OP, but I think not incredibly so? They're one of 27/80 people who either have "senior" in their title or are co-CEO/President. The person at the grantee org looks to be much more senior, probably the #2 or #3 person at the org. A version of your proposal that only included C-level people at the grantee org and people working on the relevant area (or people who directly or indirectly supervise people who do) at the granting org would make a lot more sense, though I'm not sure it's a good idea.
(I do think you should have COI policies, but recusal at the granting organization is the standard way to do it outside EA, and I think is pretty reasonable for EA as well.)
I appreciate where the sentiment is coming from (and I'd personally be in favour of stronger COI norms than a lot of EA funders seem to have) but the impact cost of this seems too high as stated.
There's value in being squeaky clean, but there's also value in funding impactful projects, and I think having COI policies apply across the whole large org ("If anyone from our org is dating anyone from your org, we can't fund you") will end up costing way more value than it gains.
I've heard others suggest this, but don't know what it means. Do you think Dustin should give half his money to someone else? Or that he should fund two independent grantor organizations to duplicate efforts instead of having only one? Or just that we want more EA billionaires?
This wasn't clear, but I don't think any of the three make sense as solutions. You can't really tell donors that they don't get to make decisions about what to fund; having multiple orgs creates duplication and overlap, which is really costly and wasteful (and if the organizations coordinate, you haven't really helped); and lastly, sure, I'd love for there to be more donors, but that's not really actionable, other than telling more EAs to make lots of money. (And they probably should. EA was, is, and will be funding-constrained.)
I'm concerned this post describes itself as a response to another post, but doesn't actually address the arguments made in that discussion.
Instead it reads like a continuation of posts by other burner accounts.
For this reason, I downvoted.
It's also full of insinuation and implication and "X may mean Y [which is damning]" in a way that's attempting to get the benefit of "X = Y" without having to actually demonstrate it.
In my opinion, "you have to use a burner account to put forth this kind of 'thinking' and 'reasoning' and 'argument'" is actually a point in EA culture's favor.
On the Future Forum stuff, it seems worth noting that Sam Altman was literally speaking on the morning at the new venue (I was volunteering, though I didn't help through the night as many did). It felt like a reasonably important morning for things to go well.
I don't quite know what "blacklists" means here, but I imagine that people make judgements based on backchannel information all the time. If that's what the poster means, then yeah, I'd imagine this happens in lots of little ways (60%), though I'm not sure disputing grants is a negative signal; maybe that's easy for me to say as someone who seems to be disagreeable in an acceptable way.
lol when people use this burner account, it's usually closer to "this argument could get a bit annoying" than "I feel the need to protect my anonymity for fear of retribution." please don't speak for all burners
I've been trying this out for a while now, since before the posts during the last few days on why there are burner accounts on the forum: I've been posting lots of stuff most effective altruists may not be willing to post. I cover some of the how and why of that here.
https://forum.effectivealtruism.org/posts/KfwFDkfQFQ4kAurwH/evan_gaensbauer-s-shortform?commentId=EjhBquXGiFEppdecY
I've been meaning to ask if there is anything anyone else thinks I should do with that, so I'm open to suggestions.
Of your two concerns regarding community gatekeeping, I'm not concerned about funding, because I think good projects still get funded. BUT I'm semi-concerned about "logs" of things or rumors. I don't think there are outright "blacklists" unless we are talking about banning names from EA events due to harassment, smear journalism, etc. As for the "logs" I am concerned about, the EA Forum isn't one of them, as long as you are actually being rigorous, which is totally in everyone's control.
The type of logs or rumors I am concerned about are wel...
It seems worth acknowledging a couple of points that are in tension:
- Asking difficult questions probably does affect your ability to get jobs and grants. It's not obvious to me that this is always negative, but it does take a lot more energy to frame negative comments so that they aren't taken the wrong way. Sometimes being anonymous is just easier.
- EA as a community could be better at providing ways to gather around useful critiques. The fact that they all have to be posts seems non-ideal. If people could comment on and agree/disagree with grants all listed on the...