
I wanted to offer a different perspective, as someone who has used a burner account previously, on this post about why people use burner accounts. This is not intended to be a rebuttal of the arguments made there; it is meant to add to the public discourse, and it made more sense as a separate post than as a comment.

I hope that any upvotes/downvotes are given based on the effort you think I put into offering a different perspective, rather than on whether you agree/disagree with my comments (NB: I think there should be separate upvote/downvote and agree/disagree buttons for forum posts too).

Disclaimer

Note that...

  • I used a burner account in early 2021 after finding myself unhappy with the EA Community. 
  • I've been visiting the EA Forum weekly since then, and I still occasionally go to EAGs.
  • I work at an EA organisation that receives significant funding from Open Philanthropy.

Reasons for Using Burners

The two biggest reasons for using burners are the potential operation of blacklists, and funding. I'll use them together (with anecdotes and some evidence) to make an overarching point at the end, so read all the way through.

Blacklists

  • Unfortunately, some EA groups and organisations use blacklists (this seems more common in the Bay Area).
    • Note that this is difficult to prove as...
    • I'd have to directly ask them, "Are you using a blacklist? I've heard X rumour that seems to suggest this", and they're very unlikely to say yes (if they are) as it's not in their interests.
    • I don't want to be seen as a "troublemaker" by suggesting an organisation is using a blacklist, even when I have strong reason to believe they are. If they operate a blacklist, I'd likely be blacklisted from events for being a "troublemaker".
  • [Anecdote suggesting the existence of a blacklist removed on request]
  • People have been told that they were supposedly blacklisted by organisers for being "epistemically weak" and not "truth-seeking" enough.
    • These are common terms used amongst Bay Area rationalists and even some funders.
    • Strong Personal Opinion: I think the problem here is rationality. It provides a camouflage of formalism, dignity, and an intellectual high ground for when you want to be an absolute asshole, justify contrarian views, and quickly dismiss other people's opinions. As an example of a contrarian view: thinking diversity is not important (as many Bay Area rationalists do) when most of Western society and the media think it is.
      • Justifying this strong personal opinion with 20+ anecdotes would take a post on its own. I may write this in the future if there's enough interest, and I can do so without the risk of doxing myself by accidentally de-anonymizing details.
      • Previously, I've pushed some rationalists on why they thought someone wasn't "truth-seeking" enough or was "epistemically weak". Around half the time, they couldn't give a clear answer, which makes me believe they're using these as buzzwords to camouflage the fact that they don't like someone for whatever reason.

Funding

  • Another claim that is hard to prove is that 'there is/has been an intermingling of funding and romantic relationships'. This becomes more complicated with the prevalence of polyamory.
    • I realised this to be disturbingly common about two years ago, but I chose not to speak up for fear of being blacklisted.
    • I've been inspired by courageous people making posts on this (with burner accounts). I didn't have the courage to write a post when I first noticed this two years ago, and still didn't have the courage on the many occasions since.
    • A rumour about a senior program officer at Open Philanthropy and a grantee in a metamour-relationship has previously been 'verified' on the Forums.
  • There seems to be an inner circle of funders and grantees (predominantly in the Bay Area) where the grantees often don't need to write grant applications and can just ask for money (often retroactively).
    • Note that this is also hard to prove. I could email Open Philanthropy asking this, but my organisation receives significant funding from them, and I don't want to be seen as a "troublemaker". I like to think they don't operate a blacklist, but even if there's a 1-20% chance they do, questioning a grant only to be put on a blacklist later, with my organisation defunded, is not in my interests, given that I have to make a living.
      • This touches on a wider point about having your entire life (relationships: professional, personal, and romantic; professional identity; and personal identity) wrapped up in EA, with the need to also make a living. When this is true for hundreds of people in a tight-knit community with regular conferences and meetups, it leads to strange dynamics/decision-making.
    • Therefore, I feel comfortable questioning these grants using burner accounts.
      • I'll do some of this now.
      • Despite Holden pausing "most new longtermist funding commitments" in November 2022, only to later unpause them in January 2023, the Atlas Fellowship (which falls under longtermism) received $1.8 million in December 2022. (Note the use of "most"; this suggests to me that the same rules don't apply to everyone, as you'll see below.)
        • I find this problematic as the Atlas Fellowship shares the same offices with Open Philanthropy in the Bay Area. The offices are called Constellation.
          • In the Forum post linked on the word "Constellation", it says, "Constellation is run by Redwood Research. Aside from the Redwood team, which is about 30 people, they have a coworking space that is more targeted at organisations rather than independent researchers or smaller projects. Thus, Constellation hosts staff from Open Phil, ARC, the FTX Future Fund, CEA, AI Impacts, Atlas Fellowship, MIRI, Lightcone, Alvea, and GCP. Access to Constellation is typically more limited than Lightcone. Currently (as of July 2022) there is no application form, and they are mostly focused on supporting members from the organisations in the space."
            • Note that all these organisations (besides the FTX Future Fund) receive significant funding from Open Philanthropy. This includes Redwood Research.
            • The fact that there is no application form seems to add to the cliquiness, and to the possibility of an "inner circle" of funders and grantees who hold each other less accountable.
          • I have visited Constellation before, and all three individuals in question in Rumour 4 of this post currently work there regularly or have done so in the past.
            • I didn't realise until this post that a Senior Program Officer at Open Philanthropy is married to the CTO of Redwood Research. I find this disturbing as Redwood Research received $10.7m from Open Philanthropy without mention of the grant investigator.
              • (Encrypted in rot13; see the short decoding sketch after this list) Ol Fravbe Cebtenz Bssvpre ng Bcra Cuvynaguebcl, V'z ersreevat gb Pynver Mnory. Ol PGB bs Erqjbbq Erfrnepu, V'z ersreevat gb Ohpx Fpuyrtrevf.
              • More widely, I'd be interested in how power dynamics work at Open Philanthropy. Even if the grant investigator for Redwood Research is one of the other Program Officers at Open Philanthropy, does the Senior Program Officer (who, remember, is married to the CTO of Redwood Research) get to sign off on the grant amount/decision that their subordinate makes?
              • I imagine there are power dynamics where a program officer wants a promotion, doesn't want to risk being fired, or wouldn't want to disappoint their superior (who is married to the CTO of Redwood Research) by giving her husband a smaller amount than requested.
              • Things become significantly more complicated once you throw polyamory into the mix.
    • By "ask for money (often retroactively)", I am referring to the grant made to the Future Forum (a conference held Aug 4 - 7, 2022). 
      • What is true is that the Future Forum was kicked out of the advertised venue (in the Neogenesis Group House) due to noise complaints from neighbours (an attorney showed up on the driveway and told everyone to leave). The problem was that this happened on Day 1 of the conference; Day 2 was still held in the group house, but after Day 2, the volunteers had to work through the night (reportedly with no breaks) to set up a new venue for Days 3 and 4.
        • From the volunteers, I've heard that the Future Forum's organisers were so bad that anyone from the CEA Events Team still in the Bay Area post-EAG SF (July 29 - 31, 2022) had to step in to clean up their mess.
        • Cleaning up their mess included getting a new venue last minute (which was very expensive), which took them into large debt, and then, reportedly, being bailed out by Open Philanthropy (retroactively).
          • They could have been bailed out because the organisers were on good terms with the funders.
          • Whilst I can't verify this, I believe this is true, as I've seen instances of smaller grants (often in the $1,000s) where a well-connected grantee will spend the money, go into debt, and ask to be bailed out by a funder instead of applying for a grant to begin with. This is a bad culture.
      • NB: this grant also doesn't have a grant investigator listed. I think all OP grants should have their grant investigators listed with an indication of who made the final call and what percentage of time each grantmaker spent on the application (which shouldn't be hard to do with most time-tracking software).
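For readers unfamiliar with rot13: it's a simple substitution cipher that shifts each letter 13 places, so applying it twice returns the original text. Below is a minimal decoding sketch in Python; the `encoded` string is a shortened fragment of the rot13 bullet above, and the standard-library `codecs` module does the actual work.

```python
import codecs

# rot13 shifts each letter 13 places along the alphabet, so encoding and
# decoding are the same operation; non-letter characters pass through as-is.
encoded = "Ol Fravbe Cebtenz Bssvpre ng Bcra Cuvynaguebcl, ..."  # shortened fragment
print(codecs.decode(encoded, "rot_13"))
```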

My Overall Point

Ultimately, my overall point is that one reason for using a burner account (as in my case) is that if you don't belong to the "inner circle" of funders and grantees, then I believe different rules apply to you. And if you want to join that inner circle, you had better not question grants by directly emailing OP. And once you're inside the inner circle but want to criticise grants, you must use a burner account or risk being de-funded or blacklisted. If you ask why you were blacklisted, the real reason ("we don't like you" or "you're a trouble-maker") will be camouflaged as your being "epistemically weak" or "not truth-seeking enough."

Edit 1: I edited the beginning of this post as per this comment

Edit 2: Retracted Future Forum statement because of this comment.

Edit 3: Anecdote removed on request.

Comments

I'm Isaak, the lead organizer of Future Forum. Specifically addressing the points regarding Future Forum:

By "ask for money (often retroactively)", I am referring to the grant made to the Future Forum (a conference held Aug 4 - 7, 2022). 

I don't know whether retroactive funding happens in other cases. However, all grants made to Future Forum were committed before the event. The event and the organization received three grants in total.

Applications for the grants were usually sent 1-3 weeks before approval. While we had conversations with funders throughout, all applications went through official routes and application forms. 

I received the specific grant application approval emails on: 

  • Feb 28th, 2022, 9:36 AM PT, 
  • July 5th, 2022, 5:04 PM PT, 
  • July 18th, 2022, 11:28 AM PT. 

The event ran from August 4-7th. I.e., we never had a grant committed "retroactively".

 

Cleaning up their mess included getting a new venue last minute (which was very expensive), which took them into large debt, and then, reportedly, being bailed out by Open Philanthropy (retroactively).

Knowing that the event was experimental and that the core team didn't have much operat...

Hi, I'm Leilani. I run the org that was brought on to help with Future Forum in the final weeks leading to the event. 

I wanted to verify that no grants were applied for retroactively by Future Forum or Canopy.  All funding was approved prior to the event. All OP funding was also received prior to the event. At no point have we been in debt.

Additionally, we are eternally grateful to the CEA events team and our volunteers for all their help. It was a stressful and unexpected situation that we would not have gotten through without them.

Hi, I'm Patrick Finley. I want to chime in here, because I think there are a number of fairly ridiculous claims below about Future Forum (in addition to the original post). I also think Isaak's response above is overly generous/conservative, and I want to share my opinions on just how far the delta is between this post + comments and reality.

I attended Future Forum and it was easily the best conference/event I've been to in the Bay Area on the vectors that matter (i.e. quality of conversations, people, speakers, new connections, etc.), and frankly it's not close (I've been to EAGs and others). I don't mean this as a knock on others (the standard at EA and adjacent events seems pretty good); rather, this was unusually great.

The negatives I heard during the event and afterwards were about behind the scenes stuff that didn't seem to affect the actual value to attendees much (including the below). Eg I was at a hotel, and had a nice comfy bus take me to a new venue bc of issues w/ neighborhood. Not sure how this makes the event worse if you're talking about the purpose of the event vs things that don't matter for attendees.

  • The event went from idea to reality in like 3 months. Isaak founded a
...
Zeynep
Hi, I'm Zeynep! As someone who volunteered at Future Forum I want to give my piece as well. I completely agree with Patrick's response below, the conference was exceptionally good and the majority of attendees from my experience feel the same way.  Although things did go wrong, as they easily can with any event, a lot of the volunteers worked very hard to fix the situation. This included staying up all night to fix the venue. I would like to highlight that when we found out we had to switch venues, we were given the option to not take part in moving equipment. Many volunteers agreed to help out regardless, and did not back out, and this in itself shows how much value even we as volunteers felt the event had to give. It may look trivial on paper, but staying up all night to move an entire venue is not for the faint of heart.  Although stress levels were briefly high, everything was handled exceedingly well internally and I was personally told by attendees that they did not feel that the venue change was an issue. As someone who has been to many many many conferences, I would say that the event was a huge success. Everything other than the venue change - from the food to the talks, ran without any hiccups, which is ridiculously impressive when you have witnessed things go very wrong in other events. I firmly think that the entire staff + volunteers deserve a big round of applause. 

Suggestion of consensus opinion:

People should not be dating or living with those who work for them. If they start, they should tell HR so they can be transferred.

Ivy Mazzola
Dating conflicts of interest: I generally agree, but I'd like to flag that I think there are no cases mentioned in the post where people who are romantically involved work for each other. If people have some recent case, I'm open to being proved wrong (e.g. find some specific case of a grantmaker writing a grant to a partner). But I just don't think this happens. Maybe it used to happen in early EA days, IDK, but I'd not think this has happened for years, because the reputational harm is great enough that it should incentivize grantmakers to recuse themselves and let an alternate review the application.

Now, rare exceptions:

  • I disagree that grantmakers who don't have alternate grantmakers they can call in to determine grants from their managed fund should not be allowed to grant to a partner who applies. This amounts to barring romantic partners of people from applying for some grants. Not only do I find it unethical and somewhat arbitrarily controlling ("you and only you, oh romantic partner, are not allowed to apply even though your project may very well be the best"), but this would systematically disadvantage women applicants, as the majority of EA grantmakers are straight men. It's possible this is basically a moot point, as almost every EA grantmaker could recuse themselves and someone else could do the job. But there are the rare tiny funds like Scott Alexander's regranting program. Should his wife not be allowed to apply?
  • In cases of startups and small projects, people should be allowed to hire and cofound with their romantic partners, and they can keep those positions for as long as it makes sense to them.

Simply living with: I disagree that people who work together or for each other should not be housemates, as a mandatory rule. It should be on a case-by-case basis, and I think people's incentives are generally aligned so they make the useful choice here. So until I hear something weird, I wouldn't even bother looking into it, tbh.

Your grant hypo is likely moot because making the grant without the partner recused would probably be unlawful -- at least in the US and if a nonprofit were involved. That's self-dealing and improper inurement (private gain) in my book.

Ivy Mazzola
Hm, if you say so. Then I really don't know what people think is going on with EA funding that the legal system wouldn't handle. I really have not heard of anything that I'd call actually-bad once you do your own digging.  EAs are really careful about who we give money to. That's kind of our thing.
Nathan Young
That said, this is slightly more complex if people have one-off hookups. Not really sure what the norms should be there.
Davidmanheim
I think this is generally correct, but not sure "living with" is clear enough, given that I don't necessarily have any problem with a boss and employee living in the same group house, if they aren't dating. (I'd imagine there could be some weird power dynamics for votes on house rules, etc. but I don't think it's necessarily an ethical problem, and hope that adults could work out how to deal with that.)

[EDIT: it looks like the summary in the post was wrong and there wasn't any retroactive funding; see Isaak's comment.]

In your section on asking for money retroactively you mostly rely on an event where you say:

  • It was organized by one group of organizers.

  • They messed up pretty badly and the CEA Events team took it over very late in the process (either just before or during, I can't tell) and worked really hard to keep it going.

  • OP agreed to fund CEA's work on this retroactively.

I'm just going on your description, but what would you have liked CEA and/or OP to do differently? Some options, none of which are great:

  1. CEA doesn't step in. Lots of people have a bad experience at an EA-[EDIT: adjacent?] event.

  2. CEA waits to step in until they have funding formalized. This turns into #1, because they were stepping in at the last minute and there wouldn't have been time. (Possibly CEA informally confirmed with OP that this was the kind of thing they would likely fund -- I don't know.)

  3. CEA doesn't ask OP to fund it or OP refuses, and CEA fundraises for it independently. This isn't terrible, but it's a complicated situation that requires a lot of trust, exactly the kind of

...

I think stuff failing is usually not as bad as it seems, and most of the reputational harm would have fallen on the organisers (who deserved it). On the other hand there are substantial risks to creating a perception that poorly executed things will be bailed out.

So from an outside view (I don't know much about the event itself) I say they should have let it fail.

(EDIT: to be clear, in light of the new comments about the Future Forum: I think bailouts are often bad; if the Future Forum wasn't actually bailed out, then great.)

Ivy Mazzola
But how do you know reputational harm didn't fall on the organizers? I assume it did tbh, regardless of bailing? Especially if there are blacklists (I'd prefer to call them "logs"), it seems like that is what they'd be for, eg "X screwed up that major thing even though they told us they were totally equipped and experienced in that type of thing"
Michael_PJ
Sorry, I wasn't very clear. I agree the organisers suffered reputational damage regardless. I thought that the previous comment was arguing that a reason for bailing out was to prevent greater reputational damage than actually occurred. I was saying that I think the additional damage would mostly have also accrued to the organizers rather than, say, EA as a whole.
Quadratic Reciprocity
Re 1: it wasn't really an EA-branded event though, I think.

Thanks; edited! Are you saying you like (1) because it wouldn't actually have been that bad for it to implode?

Quadratic Reciprocity
I don't have thoughts on that, just being nitpicky since the original framing was "EA-branded event" :)
Jeff Kaufman 🔸
Correction appreciated!

I'm not going to deal with the topic of the post, but there's another reason to not post under a burner account if it can be avoided that I haven't seen mentioned, which this post indirectly highlights.

When people post under burner accounts, it makes it harder to be confident in the information that the posts contain, because there is ambiguity and it could be the same person repeatedly posting. To give one example (not the only one), if you see X number of burner accounts posting "I observe Y", then that could mean anywhere from 1 to X observations of Y, and it's hard to get a sense of the true frequency. This means it undermines the message of those posting, to post under burners, because some of their information will be discounted.

In this post, the poster writes "Therefore, I feel comfortable questioning these grants using burner accounts," which suggests in fact that they do have multiple burner accounts. I recognize that using the same burner account would, over time, aggregate information that would lead to slightly less anonymity, but again, the tradeoff is that it significantly undermines the signal. I suspect it could lead to a vicious cycle for those posting, if they repeatedly feel like their posts aren't being taken seriously.

Here's an example of a past case where a troll (who also trolled other online communities) made up multiple sock-puppet accounts, and assorted lies about sources for various arguments trashing AI safety, e.g. claiming to have been at events they were not and heard bad things, inventing nonexistent experts who supposedly rejected various claims, creating fake testimonials of badness, smearing people who discovered the deception, etc. 

Lizka
Moderator Comment

One thing I'd like to quickly flag on the topic of this comment: using multiple accounts to express the same opinion (e.g. to create the illusion of multiple independent voices on this topic) is a (pretty serious) norm violation. You can find the full official norms for using multiple accounts here.

This doesn't mean that e.g. if you posted something critical of current work on forecasting at some point in your life, you can't now use an anonymous account to write a detailed criticism of a forecasting-focused organization. But e.g. commenting on the same post/thread with two different accounts is probably quite bad.

BurnerExplainer
I agree with the second paragraph of this comment. Regarding the third paragraph, in my specific case:

  1. I acknowledged in the post that I previously used a burner two years ago whose password I did not save (due to it being a burner), and therefore found myself logged out of it. I would have used the same burner otherwise.
  2. I could flip this around and point out that my next bullet point in the post was "I'll do some of this now", with me proceeding to comment on recent grants in the same post under the same burner instead of making a different post with a different burner, which suggests I don't have other active burners.
  3. The use of the plural term "burner accounts" is me talking about burner accounts in the abstract, rather than me saying I have multiple burner accounts.

Can you name the prior burner to establish a link?


Something I agree strongly on: I think EA is far too casual as a movement about conflicts of interest and the mixing of romantic and professional relationships. Extremely important executives at influential billion-dollar companies are routinely fired for the kind of relationship that wouldn't even warrant a mention in EA. We should, as a community, have MUCH stronger guardrails so that there is never a question of impropriety in this way.

routinely fired for the kind of relationship

In the case you cite, they were fired for not disclosing the relationship. My understanding is that the way this normally works is that you tell HR, and then the company figures out how to move people around so that neither is in a position to unfairly affect the other's work at the company.

(To give a non-central example, when my wife (then-fiancee) and I worked in a kitchen I reported to the head cook even in cases when my wife would normally have been my supervisor.)

Jason
Although sometimes mitigating methods aren't going to be effective. As an extreme example, there is no plan HR could have put in place to green-light encounters between Bill Clinton and a White House intern.
Jeff Kaufman 🔸
That's a good point: this doesn't always work out nicely. Often this means that the more junior person leaves, which disproportionately falls on women. (My impression is that there's usually severance and this isn't considered a negative by others the same way as being fired for an undisclosed inappropriate relationship would be? But it's still not a good situation.)
Nathan Young
Yeah, though to my knowledge most heads of EA orgs don't actually date (right?), so this isn't a problem we actually have. If it's just inside an org, then one or the other moving is pretty feasible.

Several people were confused by what I meant here

Davidmanheim
Senior people at EA orgs certainly date within the community, and I could imagine it being a problem - you can't transfer away from, say, the head of HR, or the head of operations. But I don't really know if or how this happens within EA orgs, and think organizations need policies to deal with this.

The problem goes beyond guardrails. Any attempts to reduce these conflicts of interest would have to contend with the extremely insular social scene in Berkeley. Since grantmakers frequently do not interact with many people outside of EA, and everyone in EA might end up applying for a grant from Open Phil, guardrails would significantly disrupt the social lives of grantmakers.

Let's not forget that you can improperly favor not just romantic partners, but also friends. The idea of stopping Open Phil from making grants to organizations where employees are close friends with (other) grantmakers is almost laughable because of how insular the social scene is--but that's not at all normal for a grantmaking organization.

Even if Open Phil grantmakers separated themselves from the rest of the community, anyone who ever wanted to potentially become a grantmaker would have to do so as well because the community is so small. What if you become a grantmaker and your friend or romantic partner ends up applying for a grant?

In addition, many grants are socially evaluated at least partially, in my experience. Grantmakers have sometimes asked me what I think of people applying for grants, for ex...

To be very clear: I am not saying "this can never be changed." I am saying that it would require changing the EA social scene-- that is, to somehow decentralize it. I am not sure how to do that well (rather than doing it poorly, or doing it in name only). But I increasingly believe it is likely to be necessary.

Guy Raveh
I appreciate you holding that the Bay Area EAs who "[all live/party/work] with the same very small group of people" should, umm, stop doing that. But until they do, do you think it's worth having very poor governance of large sums of money?
Ivy Mazzola
"very poor governance" Flagging that this claim needs backing. How poorly governed are the actual dollars (not the relationships) at the end of the day? You decide
Guy Raveh
This is fair, though I stand behind my words.
NickLaing
I'm genuinely not understanding this. Do you think only Bay Area people can manage large amounts of money well? Or that non-EA people won't manage it well? Or something else?
Guy Raveh
I'm saying the opposite, that the same small group shouldn't continue managing everything (and specifically, grantmaking) if they are so prone to conflicts of interest with each other.
NickLaing
Thanks I get the point now.
BurnerExplainer
I respect you for writing this comment.  This would be something I'd be uncomfortable writing under my name.

I have not (yet) known myself to ever be negatively affected for speaking my mind in EA. However, I know others who have. Some possible reasons for the difference:

  • My fundamental ethical beliefs are pretty similar to the most senior people.
  • On the EA Forum, I make almost extreme effort to make tight claims and avoid overclaiming (though I don't always succeed). If I have vibes-based criticisms (I have plenty) I tend to keep them to people I trust.
  • I "know my audience:" I am good at determining how to say things such that they won't be received poorly. This doesn't mean "rhetoric," it means being aware of the most common ways my audience might misinterpret my words or the intent behind them, and making a conscious effort to clearly avoid those misinterpretations.
  • Related to the above, I tend to "listen before I speak" in new environments. I avoid making sweeping claims before I know my audience and understand their perspective inside and out.
  • I'm a techy white man working in AI safety and I'm not a leftist, so I'm less likely to be typed by people as an "outsider." I suspect this is mostly subconscious, except for the leftist part, where I think there are some community members who will
...
Guy Raveh
Same goes for me, despite not satisfying most of your bullet points: I often comment with contrarian and controversial views, and am a leftist. But I think different orgs might have very different approaches here. I took part in a residency and in some other activities organised by Czech EAs, and I made it to advanced stages of the hiring process of Rethink Priorities and some other orgs. I hold all of those in high regard, including those who ultimately rejected me, but there are many others who seem fishy in comparison, and who I can see taking my views as expressed on the forum into account. I'm a "white" male too, though.
TW123
I certainly didn't mean to imply that if you don't have one of those bullet points, you are going to be "blacklisted" or negatively affected as a result of speaking your mind. They just seemed like contributing factors for me, based on my experience. And yeah, I agree different people evaluate differently. Thanks for sharing your perspective.

When they asked a different Bay Area rationality organiser, they were told that their talk on diversity may have been "epistemically weak" and "not truth-seeking" enough.

 

So, to clarify, a guess from an unrelated party about why this talk might have resulted in a lack of an invitation pattern-matched to language used by other people in a way that has no (obvious to me) relationship to blacklists...?

I'm not sure what this was intended to demonstrate.

I am curious how you would distinguish a blacklist from the normal functioning of an organization when making hiring decisions.  I guess maybe "a list of names with no details as to why you want to avoid hiring them" passed around between organizations would qualify as the first but not the second?  I obviously can't say with surety that no such thing exists elsewhere, but I would be pretty surprised to learn about any major organizations using one.

BurnerExplainer
  1. I should have made this clearer. My claim (also informed by other anecdotes that I should have shared) is that people are put on blacklists for trivial reasons (e.g. "I don't like what this person said", "they seem too 'woke'", "they spoke badly about a friend of mine one time") but that this is camouflaged under someone having "weak epistemics" or not being "truth-seeking enough".
  2. I'm not sure, as I haven't ever made a blacklist or seen other people's blacklists. A blacklist to me seems like something that has either (1) no reason or (2) a very weak reason, maybe camouflaged as something else (perhaps in rationalist language, as described in Point #1).

Ultimately, my overall point is that one reason for using a burner account (as in my case) is that if you don't belong to the "inner circle" of funders and grantees, then I believe different rules apply to you. And if you want to join that inner circle, you had better not question grants by directly emailing OP. And once you're inside the inner circle but want to criticise grants, you must use a burner account or risk being de-funded or blacklisted.

Thank you for a good description of what this feels like. But I have to ask… do you still "want to join that inner circle" after all this? Because this reads like your defense of using a burner account is that it preserves your chance to enter/remain in an inner ring which you believe to be deeply unethical. Which would be bad! Don't do that! Normally I don't go around demanding that people must be willing to make personal sacrifices for the greater good if they want to be taken seriously, but this is literally a forum for self-declared altruists.

Several times I've received lucrative offers and overtures from sources (including one EA fund) that seemed corrupt in ways that resemble how you think your funder is corrupt. Each t...

Milan_Griffes
Anonymity is not useful solely for preserving the option to join the critiqued group. It can also help buffer against reprisal from the critiqued group.   See Ben Hoffman on this (a):  "Ayn Rand is the only writer I've seen get both these points right jointly: 1. There's no benefit to joining the inner ring except discovering that their insinuated benefit does not exist. 2. Ignoring inner rings is refusing to protect oneself against a dangerous adversary."
NickLaing
Thanks Sarah, you crystallised a bit of what was floating around in my mind on this topic. This sentence could be considered a bit emotive and persuasive for this forum, but I loved it ;). "Normally I don't go around demanding that people must be willing to make personal sacrifices for the greater good if they want to be taken seriously but this is literally a forum for self-declared altruists."

Just for the record, since I can imagine the comments giving people a vibe that getting retroactive funding is bad: If you run an EA project and for some unexpected reason you go over budget, please do apply for retroactive funding at least from the LTFF. Planning is hard, sometimes things go wrong. 

It won't always make sense to bail you out, but I do actually prefer the world where we fund people enough to cover the 80th percentile of expected cost and then fill you up in the remaining 20% instead of a world where we fund everyone to the 98th percentile of cost and then have people try to give money back to us, or generically overfund a bunch of projects. 

Like others commenting, I'm not convinced that the anecdotes here point to blacklists. I will say, if organizations do have blacklists and put people on them for reasons like "they gave a talk I didn't like", that's very bad, and I'm against it.

I do think adjectives like "weak epistemics", "not truth-seeking", and "not rational" are often completely contentless and are basically power moves. I think there are few contexts in which it makes sense to apply these to other people in the community, and if you think there's something flawed with the way that a person thinks, you should state it more precisely (e.g. "this person seems to make hyperbolic and false claims a lot", or "they gave a talk and it seemed to be based on vague vibes rather than evidence", or "they value other things much more highly than utility and I value utility extremely highly, so we don't agree").

  1. Sorry, Constellation is not a secret exclusive office (I've been invited and I'm incredibly miscellaneous and from New Zealand). It's a WeWork and, from my understanding, it doesn't accept more applications because it's full.
  2. It's unlikely Claire gave a grant to Buck since (a) like you said, this is a well-known relationship, and (b) the grantmakers for AI safety are two people who are not Claire (Asya Bergal and Luke Muehlhauser).
  3. From personal experience, it's actually really easy to talk about diversity in EA? I literally chatted with someone who is now my friend when I said I believe in critical race theory; they responded that wokism hurt a lot of their friends, and now we're friends and talk about EA and rationalism stuff all the time. I find most rationalists are so isolated from the blue tribe nowadays that they treat diversity chat with a lot of morbid curiosity, if you can justify your beliefs well.
  4. Blacklists as I understand them have really high bars and are usually used for assault or when a person is a danger to the community. I also think not inviting someone to a rationality retreat because you don't want to hang out with them is fine. I would rather die th
...
Habryka
Constellation isn't located in a WeWork. The Lightcone Offices are located in a WeWork. Constellation doesn't really have an application process IIRC, the Lightcone Offices accepts applications (though we are also likely to shut down in March).
zchuang
Oh sorry I thought both were definitely weworks? I'll edit that in. 
Davidmanheim
The second half of Point 4 seems like a jumble of words I can't figure out the meaning or relevance of. Am I missing something?   It says, if I understand correctly, that you don't know what circling is, you would rather die than do it with people you dislike, that's why people shouldn't be invited to rationality retreats, and you don't know what rationality retreats involve, but (you assume?) they involve circling.
zchuang
Yeah, so I should have written this more clearly. It's making a few claims:

  1. Rationalist retreat-type things often require a level of intimacy and trust, which means it's probably OK for them to be more sensitive and have a lower bar for not inviting people.
  2. Often a lot of young EAs have status anxiety about being invited to things of actually low importance for their impact (e.g. circling). I'm signalling that these worries are overblown and that these social activities are often overestimated in their enjoyment and status.
Robi Rahman
People who are agreement-downvoting this: if you don't agree with part of the comment, please write a reply explaining what you disagree with before downvoting. I see this has many downvotes but I can't tell what part everyone is objecting to.
Elizabeth
I think you're conflating Lightcone Office (in WeWork, does have applications but I think did pause at one point because of space issues, run by LessWrong/Lightcone team, houses mostly lesser known orgs and independents) with Constellation (a few blocks away, run by Redwood Research, houses several major orgs)
NickLaing
On point 2, I would probably argue that philanthropic agencies shouldn't give grants to an org which a staff member's partner is involved with, regardless of whether that staff member is involved in the grant or not. I have been surprised to read a number of posts where this seems to be happening. This might seem harsh, but there is value in being squeaky clean and leaving yourself above reproach. Other funding agencies can come in if the work is valuable enough.

I don't think this is standard anywhere for grantors, but I was unsure, so I checked a few: Carnegie, Gates, and the National Council of Nonprofits guidance. All three require disclosure, some cases require recusal, and none of the three ban funding.


This seems pretty hard to put into practice. Let's say TLA gets most of its funding from OP. TLA is considering hiring someone: should they ask "are you romantically involved with any of these 80 people" as part of their decision to hire, and weigh employing this particular person against the difficulty of making up the funding shortfall? Or after hiring someone should TLA then ask the question, and just accept losing most of their funding if the answer is yes? Should OP be doing the same? ("Are you romantically involved with any of these ~10,000 people at these organizations we have an ongoing relationship with?")

The particular situation you're talking about is with a relatively senior person at OP, but I think not incredibly so? They're one of 27/80 people who either have "senior" in their title or are co-CEO/President. The person at the grantee org looks to be much more senior, probably the #2 or #3 person at the org. A version of your proposal that only included C-level people at the grantee org and people working on the relevant area (or people who directly or indirectly supervise people who do) at the granting org would make a lot more sense, though I'm not sure it's a good idea.

(I do think you should have COI policies, but recusal at the granting organization is the standard way to do it outside EA, and I think is pretty reasonable for EA as well.)

NickLaing
Thanks Jeff, you've convinced me that a zero-relationship policy wouldn't work. I think I didn't grasp the scale of these orgs and just how unrealistic it might be to avoid romantic entanglement at all levels. I think something along the lines of your steelmanning here might ensure an extremely low chance of relationship bias affecting grants: "A version of your proposal that only included C-level people at the grantee org and people working on the relevant area (or people who directly or indirectly supervise people who do) at the granting org would make a lot more sense, though I'm not sure it's a good idea." I know other orgs operate through recusal at the granting org, but romantic bias can still get a foot in the donor's door, and people may well still struggle to vote against someone's partner out of loyalty. Recusal helps, and I'm sure it's happening already, but it doesn't seem good enough in some situations. Thanks for the engagement.

I appreciate where the sentiment is coming from (and I'd personally be in favour of stronger COI norms than a lot of EA funders seem to have) but the impact cost of this seems too high as stated. 

There's value in being squeaky clean, but there's also value in funding impactful projects, and I think having COI policies apply across the whole large org ("If anyone from our org is dating anyone from your org, we can't fund you") will end up costing way more value than it gains.

NickLaing
That's a strong argument, thanks Will. It's an interesting question which has more value: being squeaky clean, or accepting that some projects will perhaps be underfunded. I would hope, though, that it wouldn't necessarily cost as much as we might think, if other funders could cover shortfalls. OpenPhil isn't the only donor fish in the sea, although they are perhaps the only Leviathan for some EA-related orgs. Perhaps this is also part of the argument in favour of having more, slightly smaller funders rather than a few huge ones, to help avoid COI. Although I didn't say it, as I was going for the "squeaky clean" argument, you could also potentially draw a line at no funding to orgs where there are relationships between those at some kind of leadership/decision-making level. This wouldn't be squeaky clean, but cleaner at least.

part of the argument in favour of having more, slightly smaller funders rather than a few huge ones

 

I've heard others suggest this, but don't know what it means. Do you think Dustin should give half his money to someone else? Or that he should fund two independent grantor organizations to duplicate efforts instead of having only one? Or just that we want more EA billionaires?

quinn
I feel NickLaing is encoding an implicit graph-theoretic belief that may not be factually accurate. The premise is that CoI opportunities fall with decentralization, but it may be the case that more diffuseness actually leads to problematic intermingling. I don't have super good graph theory intuitions so I'm not making a claim about whether this is true, just that it's a premise and that the truth value matters.
Davidmanheim
My graph-theoretic intuition is that it depends a lot on the distribution of opportunities. Because EAs tend to both fund and date other EAs, the COI increase / decrease probably depends to some extent on the relative size of the opportunity / recipient network.
NickLaing
My premise may well be wrong, but all I have heard to date is that the conflicts of interest aren't that big a problem, not a clear argument that more diffuseness could make COI worse. If we take an imaginary world where there is only one donor org and many donee organisations, within a small community like EA it seems almost impossible to avoid conflicts of interest in a high proportion of grants. But I have low confidence in this, and would appreciate someone explaining the arguments in favor of centralisation reducing the potential for COI.
Jason
I think Nick is suggesting that if we had Open Phil split into funders A and B (which were smaller than Open Phil), then A declining to fund an organization due to a COI concern would be somewhat less problematic because it could go to B instead. I'm not a graph theory person either, but it seems the risk of both A and B being conflicted out is lower. I don't think that's a good reason to split Open Phil, although I do think some conflicts are so strong that Open Phil should forward those organizations to external reviewers for determination. For example, I think a strong conflict disqualifies all the subordinates of the disqualified person as well -- eg I wouldn't think it appropriate to evaluate the grant proposal of a family member of anyone in my chain of command.
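A toy calculation makes this concrete. It is only a sketch under the simplistic assumption that each funder's conflicts are independent, and the 20% figure is purely illustrative rather than anything from the thread:

```python
# Toy model: chance an applicant is conflicted out of *every* available funder.
# Simplistic assumption: each funder independently has probability p of a
# disqualifying conflict with a given applicant.
p = 0.2  # illustrative per-funder conflict probability

one_big_funder = p        # single funder: conflicted out 20% of the time
funders_a_and_b = p * p   # two smaller funders: both conflicted only 4% of the time

print(f"one funder: {one_big_funder:.2f}, both A and B: {funders_a_and_b:.2f}")
```

If the same tight social circle staffs both funders, conflicts are correlated rather than independent, and the benefit shrinks, which is why the graph-theoretic premise discussed above matters.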
quinn
Correct: a treatment of this question that does not consider BATNAs or counterfactuals would be inaccurate. 
NickLaing
Thanks David. I think any of those 3 might work. I also didn't know just how much money Dustin was giving until I looked it up now. Great stuff!

This wasn't clear, but I don't think any of the three make sense as solutions. You can't really tell donors that they don't get to make decisions about what to fund; having multiple orgs creates duplication and overlap, which is really costly and wasteful (and if the organizations coordinate, you haven't really helped); and lastly, sure, I'd love for there to be more donors, but it's not really actionable, other than to tell more EAs to make lots of money. (And they probably should. EA was, is, and will be funding constrained.)

NickLaing
I think the first 2 options make some sense, and I don't think the donor diversity question is simple, with simple answers.

On the first option, of course people can give money where they want, but I think any smart big donor could respond to a good argument for diversification of giving. It's not about telling anyone to do anything, but about figuring out what is actually the best way to do philanthropy in the EA community in the long term. I don't think having multiple orgs is necessarily costly and wasteful. Even if donors co-ordinate to some degree, having a donor board with no major COI could make more uncompromised and rational decisions, and also avoid controversy both from within and outside the movement. Charity Entrepreneurship have invested in foundation entrepreneurship, and make a number of good arguments for why it can be good to have more funding orgs out there, even if smaller. These benefits include more exploration of different cause areas and potential access to different pools of funding and different donors.

As a side note (although I know it wasn't intentional), I don't think it's a great conversation technique on a forum to suggest 3 possible solutions that seemed to be in good faith, and then turn around and say that they don't make sense in the next comment. This would work in an in-person discussion, I think, but it makes it hard to have a discussion on a forum.
NickLaing
I've retracted the original comment as it's clear to me now that it doesn't make practical or ethical sense to completely rule out grants to orgs where there is partner entanglement. I still think it's an important discussion though!

I'm concerned this post describes itself as a response to another post, but doesn't actually address the arguments made in that discussion.

Instead it reads like a continuation of posts by other burner accounts.

For this reason, I downvoted.

It's also full of insinuation and implication and "X may mean Y [which is damning]" in a way that's attempting to get the benefit of "X = Y" without having to actually demonstrate it.

In my opinion, "you have to use a burner account to put forth this kind of 'thinking' and 'reasoning' and 'argument'" is actually a point in EA culture's favor.

BurnerExplainer
I'm not describing this post as a response to the other post. I initially wrote at the top of this post that I wanted to offer a different perspective, and I believe that this warranted a separate post. In no way was this meant to be a rebuttal of all the arguments made there. I apologise for the confusion. At the top of the post, I've now clarified that I'm offering a different perspective (to add to the public discourse), that this made more sense to me as a post than a comment, and that it is NOT a rebuttal.

On the Future Forum stuff, it seems worth noting that Sam Altman was literally speaking on the morning at the new venue (I was volunteering, though I didn't help through the night as many did). It feels like it was a reasonably important morning to have go well.

I don't quite know what "blacklists" means here, but I imagine that people make judgements based on backchannel information all the time. If that's what the poster means, then yeah, I'd imagine this happens in lots of little ways (60%). I'm not sure disputing grants is a negative signal, though maybe that's easy for me to say as someone who seems to be disagreeable in an acceptable way.

lol when people use this burner account, it's usually closer to "this argument could get a bit annoying" than "I feel the need to protect my anonymity for fear of retribution." please don't speak for all burners

Nathan Young
Naaah that's what main is for :P
burner
Your patience is admirable :)

For a while now, since before the last few days' posts on why there are burner accounts on the forum, I've been trying out this thing where I post lots of stuff most effective altruists may not be willing to post. I cover some of the how and why of that here.

https://forum.effectivealtruism.org/posts/KfwFDkfQFQ4kAurwH/evan_gaensbauer-s-shortform?commentId=EjhBquXGiFEppdecY

I've been meaning to ask if there is anything anyone else thinks I should do with that, so I'm open to suggestions.

Of your two concerns regarding community gatekeeping, I'm not concerned about funding, because I think good projects still get funded. BUT I'm semi-concerned about "logs" of things or rumors. I don't think there are outright "blacklists", unless we are talking about banning names from EA events due to harassment, smear journalism, etc. As for the "logs" I am concerned about: I am not concerned about the EA Forum, if you are actually being rigorous, which is totally in everyone's control.

The type of logs or rumors I am concerned about are wel...

It seems worth acknowledging a couple of points that are in tension:

  • Asking difficult questions probably does affect your ability to get jobs and grants. It's not obvious to me that this is always negative, but it does take a lot more energy to frame negative comments in a way that they aren't taken wrongly. Sometimes being anonymous is just easier
  • EA as a community could be better at providing ways to gather around useful critiques. The fact that all of these have to be posts seems non-ideal. If people could comment on and agree/disagree with grants, all listed on the
...