Edson Reistad

9 karma · Joined

Comments (3)

Concentration of power doesn't need superintelligence. It doesn't even need AGI. That's precisely what makes it scary.

I wrote about this in The Front-Running Catastrophe. The core argument: the technologies required for stable totalitarianism (omni-surveillance, autonomous lethal systems, AI-assisted internal security) require strictly weaker AI than the technologies required for extinction via misalignment.

Thanks! I think there's a lot of thinking that still needs to be done wrt all the different contingencies conditioned on the "best-case scenario of pausing / not reaching ASI".

I have always thought that there is a lot of unpicked low-hanging fruit for animating the most popular blog posts. Examples might be "I only believe in the paranormal" and "Great minds might not think alike" from LessWrong.