@Linch Have you ever met any of these engineers who work on advancing AI in spite of thinking that the "most likely result ... is that literally everyone on Earth will die."
I have never met anyone so thoroughly depraved.
Mr. Yudkowsky and @RobBensinger think our field has many such people.
I wonder if there is a disconnect in the polls. I wonder if people at MIRI have actually talked to AI engineers who admit to this abomination. What do you even say to someone so contemptible? Perhaps there are no such people.
I think it is much more likely that these MIRI folks have worked themselves into a corner of an echo chamber than it is that our field has attracted so many low-lifes who would sooner kill every last human than walk away from a job.
I do not believe @RobBensinger's and Yudkowsky's claim that "there are also lots of people in ML who do think AGI is likely to kill us all, and choose to work on advancing capabilities anyway."
Yudkowsky claims that AI developers are plunging headlong into our research in spite of believing we are about to kill all of humanity. He says each of us continues this work because we believe the herd will just outrun us if any one of us were to stop.
The truth is nothing like this. The truth is that we do not subscribe to Yudkowsky’s doomsday predictions. We work on artificial intelligence because we believe it will have great benefits for humanity and we want to do good for humankind.
We are not the monsters that Yudkowsky makes us out to be.
Daylight Saving Time Fix: The real problem is losing the hour of sleep in the spring. The solution is to set clocks back an hour in the fall, then move them forward by 40 seconds every day for 90 days in the spring. No one is going to miss 40 seconds of sleep. Most clocks are digital and set themselves, so you don't need to adjust them, and you won't notice anything in the spring.
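A quick sanity check of the arithmetic behind this scheme (variable names are my own, chosen for illustration): 40 seconds per day over 90 days should exactly recover the hour set back in the fall.

```python
# Verify that the gradual spring adjustment adds up to the
# full hour that was set back in the fall.
shift_per_day_s = 40                    # seconds the clock advances each spring day
days = 90                               # length of the spring adjustment window
total_shift_s = shift_per_day_s * days  # total seconds recovered

print(total_shift_s)         # 3600 seconds
print(total_shift_s / 3600)  # 1.0 hour
```

So the schedule balances exactly: 40 s/day × 90 days = 3,600 s = 1 hour.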
"when I know a bunch of excellent forecasters..."
Perhaps your sampling techniques are better than Tetlock's then.