Eliezer Yudkowsky wrote an article for Time magazine: “Pausing AI Developments Isn’t Enough. We Need to Shut it All Down.” It openly calls for extreme violence (airstrikes) to stop AI software projects, and it calls for governments to track computer hardware sales. He wrote:
“Shut down all the large GPU clusters (the large computer farms where the most powerful AIs are refined). Shut down all the large training runs. Put a ceiling on how much computing power anyone is allowed to use in training an AI system, and move it downward over the coming years to compensate for more efficient training algorithms. No exceptions for governments and militaries. Make immediate multinational agreements to prevent the prohibited activities from moving elsewhere. Track all GPUs sold. If intelligence says that a country outside the agreement is building a GPU cluster, be less scared of a shooting conflict between nations than of the moratorium being violated; be willing to destroy a rogue datacenter by airstrike.”
All this is about current “AI” technologies, not anything that I think resembles real, actual AGI.
Yudkowsky won’t debate me, and he personally took moderator action against me when I posted about Popper at Less Wrong. His stated justification for that action was my unpopularity, as indicated by downvotes. He has no rational debate policy and no reasonable, organized approach to truth-seeking or to debating AI Alignment. He hasn’t made a debate tree or any other good document explaining why he’s right about AI Alignment. As far as I know, he and his allies haven’t even tried to rebut Popperian viewpoints (or various other viewpoints he contradicts, like anti-violence viewpoints), though I haven’t closely followed their publications. Yet, despite never winning anything resembling conclusive debates against a representative sample of opponents, he wants his ideas enforced by every government worldwide, and by their militaries.
Yudkowsky wrote two pretty good books. He’s done some good work. So it’s sad to see this.