
Michał Zabłocki

208 karma · Joined

Posts: 1
Comments: 25

So, uh, does it follow that realising human extinction [or another x-risk which is not an s-risk] could be desirable in order to avoid an s-risk? (e.g. VHEMT)

Hi, as an anti-natalist: while I have seen climate change branded as the leading motivation for anti-natalism, I don't think anti-natalists should first and foremost be regarded as motivated by climate change concerns.

Paradoxically, I don't have any concrete title in mind, but perhaps a science-fiction story could be worked in somewhere in the course? 2001: A Space Odyssey, as the most obvious example.

My guess is that most people competent to review a philosophy paper either hate Rand or have never read her.

I believe that to be true, and a telling sign of what kind of ivory tower philosophy has become.

As a layman: the first correlation that pops into my mind is high reserves ~ responsible spending. A charity so rich it is dumping money into buying castles won't get this negative badge, and neither will one that isn't buying castles but is still set on a course to let its employees go the moment funding slows down.

"everyone but cis men" is a pretty vile policy

I see it is indeed page 83 in the document on arXiv; it was 82 in the PDF on the OpenAI website.

Ok, I should have been clear from the beginning: what struck me was that the first example essentially answered the question of how to do great harm with minimal spending; a truly wicked "evil EA", I would say. I found it somewhat ironic.
