You doubt that Yudkowsky "was only advocating for state-sponsored airstrikes, not civilian airstrikes, bombs, or attacks." Let's let the reader decide.

In the article, the string "kill" occurs twice, both times describing what some future AI would do if the AI labs remain free to continue on their present course. The strings "bomb" and "attack" never occur. The strings "strike" and "destroy" occur once each, and this quote contains both occurrences:

>Shut down all the large GPU clusters (the large computer farms where the most powerful AIs are refined). Shut down all the large training runs. Put a ceiling on how much computing power anyone is allowed to use in training an AI system, and move it downward over the coming years to compensate for more efficient training algorithms. No exceptions for governments and militaries. Make immediate multinational agreements to prevent the prohibited activities from moving elsewhere. Track all GPUs sold. If intelligence says that a country outside the agreement is building a GPU cluster, be less scared of a shooting conflict between nations than of the moratorium being violated; be willing to destroy a rogue datacenter by airstrike.

>Frame nothing as a conflict between national interests, have it clear that anyone talking of arms races is a fool. That we all live or die as one, in this, is not a policy but a fact of nature. Make it explicit in international diplomacy that preventing AI extinction scenarios is considered a priority above preventing a full nuclear exchange, and that allied nuclear countries are willing to run some risk of nuclear exchange if that’s what it takes to reduce the risk of large AI training runs.

>That’s the kind of policy change that would cause my partner and I to hold each other, and say to each other that a miracle happened, and now there’s a chance that maybe Nina will live. The sane people hearing about this for the first time and sensibly saying “maybe we should not” deserve to hear, honestly, what it would take to have that happen. And when your policy ask is that large, the only way it goes through is if policymakers realize that if they conduct business as usual, and do what’s politically easy, that means their own kids are going to die too.

reply
> The strings "bomb" and "attack" never occur.

What do you think an "airstrike" is, then?

Trying to argue that certain strings don't occur in the page is the kind of argument that gets brought out when someone is desperate for any technicality to avoid having to concede a point.

This level of weaponized pedantry is what makes trying to debate anything with LessWrong-style rationalists so impossible: There's always another volley of gish gallop to be fired at you when you get too close to anything that goes against their accepted narratives.

reply
You were trying to get people to read what EY wrote in the time.com article as encouragement to engage in criminal violence (as opposed to state-sponsored violence a la an airstrike on a data center), such as the firebombing of Sam's home. In actuality, both before and after the publication of the time.com article, EY has explicitly argued against committing any crimes, particularly violent crimes, against the AI enterprise.

Knowing that most readers do not have time to read the entire article, I brought up how many times various strings occur in it to make it less plausible, in the reader's eyes, that any passage other than the one I quoted could be interpreted as advocating criminal violence. I.e., I brought it up to explain why I quoted those 3 (contiguous) paragraphs, but not any of the others.

In finding and selecting those 3 paragraphs, I was doing your work for you: if this were a perfectly efficient and fair debate, the burden of providing quotes to support your assertion that EY somehow condones the firebombing of Sam's home would fall on you.

reply