
Hi there!

I'm looking at one of Bostrom's papers (Existential Risk Prevention as Global Priority, p. 19). He includes this expected value calculation which I just can't make sense of:

"Even if we give this allegedly lower bound on the cumulative output potential of a technologically mature civilisation [he's referring to his estimate of 10^52 future lives here] a mere 1 per cent chance of being correct, we find that the expected value of reducing existential risk by a mere one billionth of one billionth of one percentage point is worth a hundred billion times as much as a billion human lives."

When I try to repeat his calculation, I reason as follows: 10^52 lives at a 1% credence gives 10^50 expected lives, and one billionth of one billionth of one percentage point is 10^-9 * 10^-9 * 10^-2 = 10^-20. Reducing the risk of losing 10^50 expected lives by 10^-20 is the same as increasing the probability of getting 10^50 by 10^-20, so the expected value should be 10^50 * 10^-20 = 10^30 lives. However, he writes that it is worth a hundred billion times as much as a billion lives, i.e. 10^2 * 10^9 * 10^9 = 10^20 lives. It's a fairly trivial calculation, so I assume there's something obvious I've overlooked. Can you help me see what I'm missing?
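
(For what it's worth, here's the arithmetic written out as a quick Python check; the variable names are just my own labels, not anything from the paper.)

```python
# Quick sanity check of the two numbers (my own labels, not Bostrom's notation).
future_lives = 10**52                 # Bostrom's lower-bound estimate of future lives
credence = 0.01                       # "a mere 1 per cent chance of being correct"
risk_reduction = 1e-9 * 1e-9 * 0.01   # one billionth of one billionth of one percentage point

expected_value = future_lives * credence * risk_reduction
print(f"computed: {expected_value:.0e}")          # 1e+30 lives

bostrom_claim = 100 * 1e9 * 1e9                   # "a hundred billion times ... a billion human lives"
print(f"stated:   {bostrom_claim:.0e}")           # 1e+20 lives
print(f"ratio:    {expected_value / bostrom_claim:.0e}")  # 1e+10
```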


Answers

Your calculation looks correct to me. (WolframAlpha confirms that "10^52 * 1% * 1 billionth * 1 billionth * 1%" is 10^30.) It seems that Nick Bostrom is underestimating the expected value by a factor of 10^10.

A minor factor of ten billion 😉

Linch
A mere order of magnitude of an order of magnitude!

Thanks for your reply. I'm glad my calculation doesn't seem way off. Still, it feels like too obvious a mistake not to have been caught, if it indeed were a mistake...

Comments

Not an excuse, but maybe Bostrom was using the old British definition of "billion" (10^12) rather than the American and modern British definition (10^9)?

Even then it seems off?

"Even if we give this allegedly lower bound on the cumulative output potential of a technologically mature civilisation [he's referring to his estimate of 10^52 future lives here] (+52) a mere 1 per cent chance (-2) of being correct, we find that the expected value of reducing existential risk by a mere one billionth (-12) of one billionth (-12) of one percentage point (-2) is worth (=) a hundred (+2) billion (+12) times as much as a billion (+12) human lives."

52 - 2 - 12 - 12 - 2 = 24 != 26 = 2 + 12 + 12
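
Spelled out as a quick check (my own sketch, exponents of 10 only), under both readings of "billion":

```python
# Compare both sides of Bostrom's sentence, in exponents of 10,
# under the modern and the old British reading of "billion".
for name, b in [("modern billion = 10^9", 9), ("old British billion = 10^12", 12)]:
    lhs = 52 - 2 - b - b - 2   # lives * 1% credence * (billionth * billionth * 1%) risk reduction
    rhs = 2 + b + b            # "a hundred billion times as much as a billion human lives"
    print(f"{name}: computed 10^{lhs} vs stated 10^{rhs} (gap of 10^{lhs - rhs})")
```

With the modern billion the gap is 10^10; with the old British billion it shrinks, but the two sides are still 2 orders of magnitude apart (10^24 vs 10^26).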

Sure, but what's 2 OOMs between friends?

Yeah, I've had the same thought. But as far as I can tell, it still doesn't add up, so I figured there must be something else going on. Thanks for your reply, though.
