I'm looking at different career choices and in particular, academic research projects. I've tried to compare their impact by using some Fermi calculations, including working out:
- the magnitude of the problem the research will attempt to solve
- the likely value of the research if it is successful
- the likelihood it will be successful
- my marginal contribution to the research if I get involved with it
My calculations seem like they could easily be off by a couple of orders of magnitude. And it makes a difference: one less order of magnitude and the project is worth no more than my marginal career impact if I simply maximized income and earned to give. Of course, who knows, I might have the order of magnitude wrong in the other direction, and the research project could be even more impactful.
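To make the sensitivity concrete, here's a toy sketch of the kind of Fermi multiplication described above. Every number in it is invented purely for illustration; the point is only that a factor-of-ten error in any single input shifts the final estimate by the same factor of ten.

```python
# Toy Fermi estimate of a research career's expected impact.
# All numbers below are made up for illustration only.

problem_scale = 1e6      # e.g. QALYs at stake in the problem area (invented)
value_if_solved = 0.10   # fraction of the problem the research would address (invented)
p_success = 0.05         # probability the research succeeds (invented)
my_contribution = 0.01   # my marginal contribution to the project (invented)

expected_impact = problem_scale * value_if_solved * p_success * my_contribution
print(f"Expected impact: {expected_impact:,.0f} QALYs")

# Sensitivity: being one order of magnitude too optimistic about any
# single factor scales the entire estimate down by ten.
print(f"If p_success were 10x lower: {expected_impact / 10:,.0f} QALYs")
```

Because the factors multiply, uncertainty compounds: being off by 3x on two factors means being off by roughly 10x overall, which is exactly the fragility described above.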
Honestly, it seems like a bad idea to make any kind of decision based on this process, but if I want to know which career path has the most impact, I don't know any other way to do it!
So I have lots of questions, but the biggest one is: if you don't use this awfully flawed method to decide what the impact of a career path will be, what else would you use?
Thanks for asking this! I'm looking forward to reading discussion of it. I feel similar to you, I think. I'm trying to decide between career options in global health, health security, and global catastrophic biological risk (GCBR) reduction. There are a lot of different inputs, both personal and external, but one aspect I've struggled with is the tension between being mostly convinced of the arguments for GCBR work (and trusting the many smart people convinced by them) and feeling that the probabilities of me making a difference on low-probability/high-consequence events are too small when multiplied together.
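That tension can be made concrete with a toy calculation. The numbers here are invented for illustration, not actual estimates for GCBR: each probability is tiny, yet the expected value can still come out non-trivial because the stakes term is so large.

```python
# Invented toy numbers for a low-probability/high-consequence cause area.
p_catastrophe = 1e-3    # chance the catastrophe occurs at all (invented)
p_field_averts = 0.10   # chance the field's work averts it, if it would occur (invented)
p_my_marginal = 1e-4    # chance my career is the difference-maker in the field (invented)
lives_at_stake = 1e9    # rough scale of the consequence (invented)

expected_lives_saved = (p_catastrophe * p_field_averts
                        * p_my_marginal * lives_at_stake)
print(f"{expected_lives_saved:.1f}")  # roughly 10, despite three tiny probabilities
```

The multiplied probabilities here come to one in a hundred million, which is why the result "feels thin" even when the arithmetic says the expected value is positive.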
Regardless of whether the math 'works out' in a Fermi calculation of a GCBR career (whether by me or others), it still feels sort of 'thin' to base a major career change on.
Here's an example to try to capture the feeling of thinness (or fragility): it seems plausible that a few papers (or blog posts) might come along that are devastatingly clever, featuring arguments or evidence I hadn't thought of, showing with high confidence that the risk of synthetic pandemics is extremely low (this specific example might not be plausible, but it captures my psychology at least). If that came along after I'd spent 20ish years narrowly focused on synthetic pandemic risk, without major transferable career capital, I'd feel like I'd made a mistake (if not ex ante, at least ex post).
The reduction in importance wouldn't necessarily have to be dramatic for it to be consequential for individual career choices. If, given my previous experience, it's a close race between my options in terms of projected ethical impact, a more minor reduction could reveal my eventual choice was actually a distant second. Also, the reduction wouldn't necessarily have to deflate the entire problem: arguments might reveal that people similar to me in the field actually have vanishingly little (or negative) impact on the issue.
Traditional career choices seem to be based more on personal preference, social connection, tradition, or chance. And there's plenty wrong with those approaches. One positive, however, is that those ways of making choices are more resistant to this kind of intellectual deflation by others.
Of course, there's also plenty of regret and uncertainty in traditional careers and career choices. It's not clear to me whether this feeling of 'thinness' is just a bias I need to work through, or is actually tracking something important. And, like you say, it's not clear what we should do otherwise.