Forgive the clickbait title, but EA is as prone to clickbait as anywhere else.
It seemed at EAG that discussions focussed on two continuums:
Neartermist <---> Longtermist
Frugal spending <---> Ambitious spending
(The labels for the second one are debatable but I'm casually aiming for ones that won't offend either camp.)
Finding common ground on the first has been an ongoing project for years.
The second is much more recent, and it seems like more transparency could really help to bring people on opposite sides closer together.
Accordingly: could FTX and CEA please publish the Back Of The Envelope Calculations (BOTECs) behind their recent grants and community building spending?
(Or, if there is no BOTEC and it's more "this seems plausibly good and we have enough money to throw spaghetti at the wall", please say that clearly and publicly.)
This would help in several ways:
- for sceptics of some recent spending, it would illuminate the thinking behind it. It would also let the community kick the tires on the assumptions and see how plausible they are. This could change the minds of some sceptics; and potentially improve the BOTECs/thinking
- it should help combat misinformation. I heard several people misrepresent (in good faith) some grants, because there is not a clear public explanation of the grants' theory of change and expected value. A shared set of facts would be useful and improve debate
- it will set the stage for future evaluation of whether or not this thinking was accurate. Unless we make predictions about spending now, it'll be hard to see if we were well calibrated in our predictions later
Objection: this is time consuming, and this time is better spent making more grants/doing something else
Reply: possibly true, and maybe you could have a threshold below which you don't do this, but these things have a much higher than average chance of doing harm. Most mistaken grants will just fail. These grants carry reputational and epistemic risks to EA. The dominant theme of my discussions at EAG was some combination of anxiety and scorn about recent spending. If this is too time-consuming for the current FTX advisers, hire some staff (Open Phil has ~50 for a similar grant pot and believes it'll expand to ~100).
Objection: why drag CEA into this?
[EDIT: I missed an update on this last week and now the stakes seem much lower - but thanks to Jessica and Max for engaging with this productively anyway: https://forum.effectivealtruism.org/posts/xTWhXX9HJfKmvpQZi/cea-is-discontinuing-its-focus-university-programming]
Reply: anecdata, and I could be persuaded that this was a mistake. Several students, all of whom asked not to be named because of the risk of repercussions, expressed something between anxiety and scorn about the money their own student groups had been sent. One said they told CEA they didn't need any money and were sent $5k anyway and told to spend it on dinners. (Someone from CEA please jump in if this is just false, or extremely unlikely, or similar - I do realise I'm publishing anonymous hearsay.) It'd be good to know how CEA is thinking about spending wisely as they are very rapidly increasing their spending on EA groups (potentially to ~$50m/year).
Sidenote: I think we have massively taken Open Phil for granted, who are exceptionally transparent and thoughtful about their grant process. Well done them.
Hi Jack,
Just a quick response on the CEA’s groups team end.
We are processing many small grants and other forms of support for community building, and we do not have the capacity to publish BOTECs on all of them.
However, I can give some brief heuristics that we use in the decision-making.
Institutions like Facebook, McKinsey, and Goldman spend ~$1 million per school per year at the institutions they recruit from, trying to pull students into lucrative careers that probably at best have a neutral impact on the world. We would love for these students to instead focus on solving the world's biggest and most important problems.
Based on the current amount available in EA, its projected growth, and the value of getting people working in EA careers, we currently think that spending at least as much as McKinsey does on recruiting pencils out in expected value terms over the course of a student's career. There are other factors to consider here (i.e. double-counting some expenses) that mean we actually spend significantly less than this. However, as Thomas said - even small chances that dinners could have an effect on career changes make them seem like effective uses of money. (We do have a fair a…
Hi Jessica,
Thanks for outlining your reasoning here, and I'm really excited about the progress EA groups are making around the world.
I could easily be missing something here, but why are we comparing the value of CEA's community building grants to the value of McKinsey etc?
Isn't the relevant comparison CEA's community building grants vs other EA spending, for example GiveWell's marginally funded programs (around 5x the cost-effectiveness of cash transfers)?
If CEA is getting funding from non-EA sources, however, this query would be irrelevant.
Looking forward to hearing your thoughts :)
I'm obviously not speaking for Jessica here, but I think the reason the comparison is relevant is that the high spend by Goldman etc. suggests that spending a lot on recruitment at unis is effective.
If this is the case (which I think is also supported by the success of well-funded groups with full- or part-time organisers), and EA is in an adversarial relationship with these large firms (which I think is largely true), then it makes sense for EA to spend similar amounts of money trying to attract students.
The relevant comparison is then between the value of the marginal student recruited and malaria nets etc.
I'm surprised to see CEA making such a strong claim. I think we should have strong priors against this stance, and I don't think I've seen CEA publish conclusive evidence in the opposite direction.
Firstly, note that these three companies come from very different sectors of the economy and do very different things.
Secondly, even if you assign high credence to the problems with these firms, it seems like there is a fair bit of uncertainty in each case, and you are proposing a quite harsh upper bound - 'probably at best neutral'.
Thirdly, each of these is (broadly) a free market firm, which exists only because it is able to persuade people to continue using its services. It's always possible that they are systematically mistaken, and that CEA really does understand social network advertising, management consulting, trading and banking better than these customers... but I think our prior should be a…
Curious if you disagree with Jessica's key claim, which is "McKinsey << EA for impact"? I agree Jessica is overstating the case for "McKinsey <= 0", but seems like best-case for McKinsey is still order(s) of magnitude less impact than EA.
Subpoints:
Agree we should usually avoid saying poorly-justified things when it's not a necessary feature of the argument, as it could turn off smart people who would otherwise agree.
Sorry, I was trying to get a quick response to this post and I made a stronger claim than I intended. I was trying to say that I think that EA careers are doing much more good than the ones mentioned on average and so spending money is a good bet here. I wasn’t intending to make a definitive judgment about the overall social impact of those other careers, though I know my wording suggests that. I also generally want to note that this element was a personal claim and not necessarily a CEA endorsed one.
I consider this to be a pretty weak argument, so it doesn't contribute much to my priors, which although weak (and so the particulars of a company matter much more), are probably centered near neutral on net welfare effects (in the short to medium term). I think a large share of goods people buy and things they do are harmful to themselves or others before even considering the loss of income/time as a result, or worse for them than the things they compete with. It's enough that I wouldn't have a prior strongly in favour of what profitable companies are doing being good for us. Here are reasons pushing towards neutral or negative impacts:
- A lot of goods are mostly for sig…
Ok. Lark's response seems correct.
But surely, the spirit of the original comment is correct too.
No matter which worldview you have, the value of a top leader moving into EA is overwhelmingly larger than the social value of the same leader "rowing" in these companies.
Also, at the risk of getting into politics (and really your standard internet argument) gesturing at “free market” is really complicated. You don’t need to take the view of Matt Stoller or something to notice that the benefits of these companies can be provided by other actors. The success of these companies and their resources that allow recruitment with 7 figure campus centres probably has a root source different than pure social value.
The implication that this statement requires CEA to have a strong model of these companies seems unfair. Several senior EAs, who we won’t consider activists or ideological, have deep experiences in these or similar companies. They have opinions that are consistent with the parent comment’s statement. (Being too explicit here has downsides.)
I think the main crux here is that even if Jessica/CEA agrees that the sign of the impact is positive, it still falls in the neutral bracket because on the CEA worldview the impact is roughly negligible relative to the programs that they are excited about.
If you disagree with this, maybe you agree with the weaker claim of the impact being comparatively negligible weighted by the resources these companies consume? (There's some kind of nuance to 'consuming resources' in profitable companies, but I guess this is more gesturing at a leaving-value-on-the-table framing, as opposed to just asking whether the organisation is locally net negative or positive.)
Do you think people are better off overall than otherwise because of Facebook (and social media generally)? You may have made important connections on Facebook, but many people probably invest less in each connection and have shallower relationships because of social media, and my guess is that mental health is generally worse because of social media (I think there was an RCT on getting people to quit social media, and I wouldn't be surprised if there were multiple studies. I don't have them offhand). I'd guess social media is basically addictive for a lot of people, so people often aren't making well-informed decisions about how much to use, and it's easy for it to be net negative despite widespread use. People joining social media pressures others to join, too, making it more costly to not be on it, so FB creates a problem (induces fear of missing out) and offers a solution to it. Cancel culture, bubbles/echo chambers, the spread of misinformation, and polarization may also be aggravated by social media.
That being said, maybe FB was really important for the growth of the EA community. I mostly got into EA through FB initially, although it's not where I was first exposed to EA. If…
Thanks Jessica, this is helpful, and I really appreciate the speed at which you replied.
A couple of things that might be quick to answer and also helpful:
Overall, CEA is planning to spend ~$1.5mil on uni group support in 2022 across ~75 campuses, which is a lot less than $1mil/campus. :)
Fwiw, I personally would be excited about CEA spending much more on this at their current level of certainty if there were ways to mitigate optics, community health, and tail risk issues.
Just as a casual observation, I would much rather hire someone who had done a couple of years at McKinsey than someone coming straight out of undergrad with no work experience. So I'm not sure that diverting talented EAs from McKinsey (or similar) is necessarily best in the long run for expected impact. No EA organization can compete with the ability of McK to train up a new hire with a wide array of generally useful skills in a short amount of time.
I'm not sure how clear it is that it's much better for people to hear about EA at university, especially given there is a lot more outreach and onboarding at the university level than for professionals.
Good to see a post that loosely captures my own experience of EAG London and comes up with a concrete idea for something to do about the problem (if a little emotionally presented).
I don't have a strong view on the ideal level of transparency/communication here, but something I want to highlight is: Moving too slowly and cautiously is also a failure mode.
In other words, I want to emphasise how important "this is time consuming, and this time is better spent making more grants/doing something else" can be. Moving fast and breaking things tends to lead to much more obvious, salient problems and so generally attracts a lot more criticism. On the other hand, "Ideally, they should have deployed faster" is not a headline. But if you're as consequentialist as the typical EA is, you should be ~equally worried about not spending money fast enough. Sometimes to help make this failure mode more salient, I imagine a group of chickens in a factory farm just sitting around in agony waiting for us all to get our act together (not the most relevant example in this case, but the idea is to try to counteract the salience bias associated with the problems around moving fast). Maybe the best way fo…
Thanks so much for this comment. I find it incredibly hard not to be unwarrantedly risk averse. It feels really tempting to focus on avoiding doing any harm, rather than actually helping people as much as I can. This is such an eloquent articulation of the urgency we face, and why we need to keep pushing ourselves to move faster.
I think this is going to be useful for me to read periodically in the future - I'm going to bookmark it for myself.
A related thought: If an org is willing to delay spending (say) $500M/year due to reputational/epistemic concerns, then it should easily be willing to pay $50M to hire top PR experts to figure out the reputational effects of spending at different rates.
(I think delays in spending by big orgs are mostly due to uncertainty about where to donate, not about PR. But off the cuff, I suspect that EA orgs spend less than the optimal amount on strategic PR (as opposed to "un-strategic PR", e.g., doing whatever the CEO's gut says is best for PR).)
Yeah personally speaking, I don't have very developed views on when to go with Spaghetti-wall vs RCT, so feel free to ignore the following which is more of a personal story. I'd guess there's a bunch of 'Giving Now vs Giving Later' content lying around that's much more relevant.
I think I used to be a lot more RCT because:
Over time, however:
- I became more longtermist and there's no GiveWell for longtermism
- We grew up, and basically the more I saw of the rest o…
Out of interest, did you read the post as emotional? I was aiming for brevity and directness.
Ah, that might be it. I was reading the demanding/requesting tone ("show us your numbers!", "could FTX and CEA please publish" and "If this is too time-consuming...hire some staff" vs "Here's an idea/proposal") as emotional, but I can see how you were just going for brevity/directness, which I generally endorse (and have empathy for emotional FWIW, but generally don't feel like I should endorse as such).
It's bugged me for a while that EA has ~13 years of community building efforts but (AFAIK) not much by way of "strong" evidence of the impact of various types of community building / outreach, in particular local/student groups. I'd like to see more by way of baking self-evaluation into the design of community building efforts, and think we'd be in a much better epistemic place if this was at the forefront of efforts to professionalise community building efforts 5+ years ago.
By "strong" I mean a serious attempt at causal evaluation using experimental or quasi-experimental methods - i.e. not necessarily RCTs where these aren't practical (though it would be great to see some of these where they are!), but some sort of "difference in difference" style analysis, or before-after comparisons. For example, how do groups' key performance stats (e.g. EA's 'produced', donors, money moved, people going on to EA jobs) compare in the year(s) before vs after getting a full/part time salaried group organiser? Possibly some of this already exists either privately or publicly and the relevant people know where to look (I haven't looked hard, sorry!). E.g. I remember GWWC putting together a fu…
I'd personally be pretty excited to see well-run analyses of this type, and would be excited for you or anyone who upvoted this to go for it. I think the reason why it hasn't happened is simply that it's always vastly easier to say that other people should do something than to actually do it yourself.
I completely agree that it is far easier to suggest an analysis than to execute one! I personally won't have the capacity to do this in the next 12-18 months, but would be happy to give feedback on a proposal and/or the research as it develops if someone else is willing and able to take up the mantle.
I do think that this analysis is more likely to be done (and in a high quality way) if it was either done by, commissioned by, or executed with significant buy-in from CEA and other key stakeholders involved in community building and running local groups. This is partly a case of helping source data etc, but also gives important incentives for someone to do this research. If I had lots of free time over the next 6 months, I would only take this on if I was fairly confident that the people in charge of making decisions would value this research. One model would be for someone to write up a short proposal for the analysis and take it to the decision makers; another would be for the decision-makers to commission it (my guess is that this demand-driven approach is more likely to result in a well-funded, high quality study).
To be clear, I massively appreciate the work…
I also agree this would be extremely valuable.
I think we would have had the capacity to do difference-in-difference analyses (or even simpler analyses of pre-post differences in groups with or without community building grants, full-time organisers etc.) if the outcome measures tracked in the EA Groups Survey were not changed across iterations and, especially, if we had run the EA Groups Survey more frequently (data has only been collected 3 times since 2017 and was not collected before we ran the first such survey in that year).
As a positive example, 80,000 Hours does relatively extensive impact evaluations. The most obvious limitation is that they have to guess whether any career changes are actually improvements, but I don't see how to fix that—determining the EV of even a single person's career is an extremely hard problem. IIRC they've done some quasi-experiments but I couldn't find them from quickly skimming their impact evaluations.
I mean, sometimes you have reason to make titles into a simple demand, but I wish there were a less weaksauce justification than “because our standards here are no better than anywhere else”.
To be clear I think this instance is a fairly okay request to make as a post title, but I don’t want the reasoning to imply anyone can do this for whatever reason they like.
Candidly, I'm a bit dismayed that the top voted comment on this post is about clickbait.
Hiring is an extremely labour and time intensive process, especially if the position you're hiring for requires great judgement. I think responding to a concern about whether something is a good use of staff time with 'just hire more staff' is pretty poor form, and given the context of the rest of the post it wouldn't be unreasonable to respond to it with 'do you want to post a BOTEC comparing the cost of those extra hires you think we should make to the harms you're claiming?'
The top-voted suggestion in FTX's call for megaproject ideas was to evaluate the impacts of FTX's own (and other EA) grantmaking. It's hard to conduct such an evaluation without, at some point, doing the kind of analysis Jack is calling for. I don't have a strong opinion about whether it's better for FTX to hire in-house staff to do this analysis or have it be conducted externally (I think either is defensible), but either way, there's a strong demonstrated demand for it and it's hard to see how it happens without EA dollars being deployed to make it possible. So I don't think it's unreasonable at all for Jack to make this suggestion, even if it could have been worded a bit more politely.
That's right, and this was very casually phrased, so thanks for pulling me up on it. A better way of saying this would be: "if you're going to distribute billions of dollars in funding, in a way that is unusually capable of being harmful, but don't have the time to explain the reasoning behind that distribution, it's reasonable to ask you to hire people to do this for you (and hiring is almost certainly necessary for lots of other practical reasons)."
One generic back-of-the-envelope calculation from me:
Assume that when you try to do EA outreach, you get the following funnel:
~10% (90% CI 3%-30%) of people you reach out to will be open to being influenced by EA
~10% (90% CI 5%-20%) of people who are reached and are open to being influenced by EA will actually take the action of learning more about EA
~20% (90% CI 5%-40%) of people who learn more about EA actually become EA in some meaningful way (e.g., take GWWC pledge or equivalent)
Thus we expect outreach to a particular person to produce ~0.002 EAs on average.
Now assume an EA has the same expected impact as a typical GWWC member, and assume a typical GWWC member donates ~$24K/yr for ~6 years, making the total value of an EA worth ~$126,000 in donations, discounting at 4%. I imagine the actual mean EA is likely more valuable than that given a long right tail of impact.
Note that these numbers are pretty much made up and each number ought to be refined with further research - something I'm working on and others should too. Also keep in mind that obviously these numbers will vary a lot based on the specific type of outreach being considered and so should be modifie…
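For what it's worth, the funnel above pencils out as follows. This is a minimal sketch using the comment's own point estimates (ignoring the CIs); every number in it is an assumption from this thread, not measured data.

```python
# Sketch of the outreach funnel BOTEC, using the point estimates above.
# All percentages, the $24k/yr donation, and the 4% discount rate are
# this comment's assumptions, not established figures.

def present_value(annual, years, rate):
    """Present value of a constant annual donation stream."""
    return sum(annual / (1 + rate) ** t for t in range(1, years + 1))

p_open = 0.10      # open to being influenced by EA
p_learn = 0.10     # actually take the action of learning more
p_convert = 0.20   # become an EA in some meaningful way (e.g. GWWC pledge)

eas_per_person_reached = p_open * p_learn * p_convert  # ~0.002

value_per_ea = present_value(annual=24_000, years=6, rate=0.04)  # ~$126k

print(f"EAs per person reached: {eas_per_person_reached:.4f}")
print(f"Value per EA:           ${value_per_ea:,.0f}")
print(f"Value per person:       ${eas_per_person_reached * value_per_ea:,.2f}")
```

On these assumptions, each person reached is worth roughly $250 in expected donations, which is the kind of per-contact figure a cost comparison would need.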
I don’t think that’s how it works. Your reasoning here is basically the same as “I value having Internet connection at $50,000/year, so it’s worth it for me to pay that much for it.”
The flaw is that, taking the market price of a good/service as given, your willingness to pay for it only dictates whether you should get it, not how much you should pay for it. If you value people at a certain level of talent at $1M/career, that only means that, so long as it's not impossible to recruit such talent for less than $1M, you should recruit it. But if you can recruit it for $100,000, whether you value it at $100,001 or $1M or $10^10 does not matter: you should pay $100,000, and no more. Foregoing consumer surplus has opportunity costs.
To put it more explicitly: suppose you value 1 EA with talent X at $1M. Suppose it is possible to recruit, in expectation, one such EA for $100,000. If you pay $1M/EA instead, the opportunity cost of doing so is 10 EAs for each person you recruit, so the expected value of the action is -9 EAs per recruit, and you a…
I agree with what you are saying that yes, we ideally should rank order all the possible ways to market EA and only take those that get the best (quality adjusted) EAs per $ spent, regardless of our value of EAs - that is, we should maximize return on investment.
**However, in practice, as we do not currently yet have enough EA marketing opportunities to saturate our billions of dollars in potential marketing budget, it would be an easier decision procedure to simply fund every opportunity that meets some target ROI threshold and revise that threshold over time as we learn more about our opportunities and budget.** We'd also ideally set ourselves up to learn by doing when engaging in this outreach work.
This still sounds like a strong understatement to me – it seems that some people will have vastly more impact. Quick example that gestures in this direction: assuming that there are 5,000 EAs, Sam Bankman-Fried is donating $20 billion, and all other 4,999 EAs have no impact whatsoever, the mean impact of EAs is $4 million, not $126k. That's a factor of 30x, so a framing like "likely vastly more valuable" would seem more appropriate to me.
One reason to be lower than this per recruited EA is that you might think that the people who need to be recruited are systematically less valuable on average than the people who don't need to be. Possibly not a huge adjustment in any case, but worth considering.
My guess is this would reduce grant output a lot relative to how much I think anyone would learn (maybe it would cut grantmaking in half?), so personally I'd rather see them just push ahead and make a lot of grants, then review or write about just a handful of them from time to time.
I also wish all the EA Funds and Open Phil would do this/make their numbers more accessible.
By the way, we are not planning to spend $50m on groups outreach in the near future. Our groups budget is $5.4m this year.
Also note that our focus university program is passing to Open Philanthropy.
Just wanted to add that I did a rough cost-effectiveness estimate of the average of all past movement building efforts using the EA growth figures here. I found an average of 60:1 return for funding and 30:1 for labour. At equilibrium, anything above 1 is worth doing, so I expect that even if we 10x the level of investment, it would still be positive on average.
I've done informal BOTECs and it seems like the current funding amounts are roughly correct, though we need to be careful with deploying this funding due to concerns like optics and epistemics. Regarding the example, spending $5k on EA group dinners is really not that much if it has even a 2% chance to cause one additional career change. This seems like a failure of communication, because funding dinners is either clearly good and students weren't doing the BOTEC, or it's bad due to some optics or other concerns that the students didn't communicate to CEA.
That's fair - I'm not the earlier commenter but would suggest (as someone who's heard some of these conversations but isn't necessarily representative of others' thinking):
For dinners: Suppose offering to buy a $15 dinner for someone makes it 10% more likely than they'll go to a group dinner, and suppose that makes it 1% more likely that they'll have a very impactful career. Suppose that means counterfactually donating 10% of $100k for 40 years. Then on average the dinner costs $15 and yields $400.
For retreats: Suppose offering to subsidize a $400 flight makes someone 40% more likely to go to a retreat and that this makes them 5% more likely to have a very impactful career. Again suppose that means counterfactually donating 10% of $100k for 40 years. Then on average the flight costs $400 and yields $8,000.
(And expected returns are 100x higher than that under bolder assumptions about how much impact people will have. Although they're negative if optics costs are high enough.)
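A quick sketch of the two BOTECs above, plugging in exactly the assumed numbers from this comment (all of them hypothetical, as noted, and ignoring optics costs entirely):

```python
# Expected donations unlocked per subsidy, on this comment's assumptions.
# Undiscounted; every parameter below is a made-up illustrative number.

def expected_donations(p_attend_shift, p_career_shift,
                       donation_share=0.10, salary=100_000, years=40):
    """Expected donation value from one subsidy, given shift probabilities."""
    return p_attend_shift * p_career_shift * donation_share * salary * years

dinner_value = expected_donations(p_attend_shift=0.10, p_career_shift=0.01)
flight_value = expected_donations(p_attend_shift=0.40, p_career_shift=0.05)

print(f"$15 dinner:  ~${dinner_value:,.0f} expected")   # ~$400
print(f"$400 flight: ~${flight_value:,.0f} expected")   # ~$8,000
```

The whole argument lives in the two shift probabilities, which is exactly where the disagreement below focuses.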
Thanks - this is exactly what I think is useful to have out there, and ideally to refine over time.
My immediate reaction is that the % changes you are assigning look very generous. I doubt a $15 dinner makes someone 1% more likely to pursue an impactful career, and especially that a subsidised flight produces a 5% swing. I think these are likely orders of magnitude too high, especially when you consider that other places will also offer free dinners/retreats.
If a $400 investment in anything made someone 5% more likely to pursue an impactful career, that would be amazing.
But I guess what I'm really hoping is that CEA and FTX have exactly this sort of reasoning internally, with some moderate research into the assumptions, and could share that externally.
Unsure, but probably more than 20% if the person wouldn't be found through other means. I think it's reasonable to say there are 3 parties: CEA, the group organizers, and the person, and none is replaceable, so they get 33% Shapley each. At a 2% chance to get a career change this would be a cost of $750k per career, which is still clearly good at top unis. The bigger issue is whether the career change is actually counterfactual, because often it's just a speedup.
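The arithmetic behind that $750k figure, as I understand it (a sketch only; the $5k grant size, 2% chance, and equal three-way Shapley split are all this thread's assumptions):

```python
# Shapley-adjusted cost per counterfactual career change (thread's assumptions).
grant = 5_000            # the $5k dinner funding from the post
p_career_change = 0.02   # assumed chance the grant causes one career change
shapley_share = 1 / 3    # credit split equally: CEA / organizers / the person

raw_cost = grant / p_career_change        # ~$250k per career change
adjusted_cost = raw_cost / shapley_share  # ~$750k once CEA gets only 1/3 credit

print(f"Raw cost per career change:      ${raw_cost:,.0f}")
print(f"Shapley-adjusted cost to funder: ${adjusted_cost:,.0f}")
```

Dividing by the Shapley share triples the effective cost because the funder can only claim a third of the credit for each change.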
Someone else in this thread found a report claiming that employers spend an average of ~$6,100 to hire someone at a US university. I also found this report saying that the average cost per hire in the United States is <$5,000, $15k for an executive. At 1 career = 10 jobs that's $150,000/career for executive-level talent, or $180,000/career adjusting for inflation since the report was released.
I'm not sure how well those numbers reflect reality (the $15k/executive number looks quite low), but it seems at least fairly plausible that the market price is substantially less than $750k/career.
This line of reasoning is precisely what I'm claiming to be misguided. Giving you a gallon of water to drink allows you to live at least two additional days (compared to you having no water), which at $750k of impact/year (~$2000/day) means, by your reasoning, that EA should fund all int…
Just a list of projects and organisations FTX has funded would be beneficial and probably much less time-consuming to produce. Some of the things you mention could be deducted from that, and it would also help in evaluating current project ideas and how likely they are to get funding from FTX at some point.
I kind of like the general sentiment but I'm a bit annoyed that it's just assumed that your burden of proof is so strongly on the funders.
Maybe you want to share your BOTEC first, particularly given the framing of the post is "I want to see the numbers because I'm concerned" as opposed to just curiosity?
I think what I'm getting at is that burden of proof is generally an unhelpful framing, and an action that you could take that might be helpful is communicating your model that makes you sceptical of their spending.
Hiring consultancies to do this seems like it's not going to go well unless it's Rethink Priorities or they have a lot of context, and on the margin I think it's reasonable for CEA to say no, they have better things to do.
I feel confused about the following, but I think that as someone who runs an EA org you could easily have reached out directly to CEA/FTX to ask this question (maybe you did - if so, apologies), and this action seems kind of like outing them more than being curious. I'm not necessarily against this (in fact I think this is helpful in lots of ways) but many forum users seem to not like these kinds of adversarial actions.
Just noticed Sam Bankman-Fried's 80,000 Hours podcast episode where he sheds some light on his thinking in this regard.
I think the excerpt below is not far from the OP's request that "if there is no BOTEC and it's more 'this seems plausibly good and we have enough money to throw spaghetti at the wall', please say that clearly and publicly."
Sam:
…
I would prefer that they be less transparent so they don't have to waste their valuable time.
I strongly agree we need transparency. In lieu of democracy in funding, orgs need to be accountable to the movement in some way.
Also, what's a BOTEC?
Back when LEAN was a thing, we had a model of the value of local groups based on the estimated # of counterfactual actively engaged EAs, GWWC pledges, and career changes, taking their value from 80,000 Hours' dollar valuations of career changes of different levels.
The numbers would all be very out of date now though, and the EA Groups Surveys post 2017 didn't gather the data that would allow this to be estimated.
Good questions - I have ended up thinking about many of these topics often.
Something else where I would find improved transparency valuable would be the back-of-envelope calcs and statistics behind denied funding applications. Reading EA Funds reports, for example, doesn't give a total view into where the current bar for interventions is, because we're only seeing the project distribution above the cutoff point.
Downvoted because of the clickbait title and the terrible formatting
I know this isn't the central part of the post but I'm not sure the title is really clickbait. It seems like an accurate headline to me? I understand clickbait to be "the intentional act of over-promising or otherwise misrepresenting — in a headline, on social media, in an image, or some combination — what you’re going to find when you read a story on the web." Source.
A real clickbait title for this would be something like "The one secret fact FTX doesn't want you to know" or "Grantmakers hate him! One weird trick to make spending transparent"
Personally, I don't have a problem with the title. It clearly states the central point of the post.
Not long enough for the formatting to matter in my opinion. We can, and should, encourage people to post some low-effort posts, as long as they're an original thought.
One of the EA Forum norms that I like to see is people explaining why they downvoted a post/comment, so I'm a bit annoyed that NegativeNuno's comment that supported this norm was fairly heavily downvoted (without explanation).