> How many people are allowed to die to prevent AGI?
He didn’t say “not everyone dying is preferable to everyone dying”. The question was about the acceptable consequences of preemptively stopping AGI, under his assumption that AGI will lead to the extinction of all humans.
Those are only the same thing under the assumptions that 1) AGI is inevitable without intervention and 2) AGI will lead to the extinction of humanity.
If he believes he is being misunderstood, his “apology” doesn’t help: it doesn’t actually deny either of the assumptions I identified, and he is widely known to hold both.
In fact, his stated reason for correcting his earlier tweet, that using nuclear weapons is taboo, is an extremely weak excuse. If you genuinely believed this was your opportunity to save humanity from AGI, it would be comical to draw the line at the first use of nukes.
No, I think Eliezer is trying to come to grips with the logical conclusion of his strident rhetoric.
What I am not saying: Yudkowsky intends to exterminate most of humanity.
What I am saying: this is a dangerous environment, and these kinds of statements will be seen as a call to action by a certain kind of person. TFA is literal proof of that. Moreover, within the community there are trained experts who might be able to plan an attack that, at the cost of millions of lives, could plausibly delay AI by many years.
The danger of this argument is that someone who reveres Yudkowsky might take it to its logical conclusion and actually act on it. (Although I can't prove it, I also think Yudkowsky knows this, and his decision to speak publicly should be viewed as an indicator of his preferences.) That's why these conversations are so dangerous, and why I'm not going to give Yudkowsky and his folks much credit for "just having an intellectual argument." It's like holding an intellectual discussion about whether the theater is on fire, while sitting in a crowded theater.
> someone who reveres Yudkowsky might take his arguments to the logical conclusion
What about Eliezer himself? Does he not believe his own rhetoric? Certainly, if he believes the future of the human race is at stake, it demands more action than writing a book about it and going on a few podcasts.
I think the whole thing is a bit like the dog who finally caught the car. It’s easy to use this strident rhetoric on an Internet forum, but LessWrong isn’t real life.