Closed Source AI = Neofeudalism · the singularity is nearer
Politics & Government · Mar 30, 2026
The piece reframes open-source AI as a governance stance, "anti-feudal" rather than anti-safety. It argues that closed, API-only access uses safety rhetoric to create asymmetric dependence, turning users into "serfs," while openness, portability, and local control are political choices that preserve user ownership and legitimacy.
The document rejects a middle ground that confines AI to vetted incumbents. Because AI development is inevitable, it argues, the real choice is between universal access and unjustified exclusive control. It reframes safety as a question of governance consistency, criticizing closed-source firms that declare AI a public danger while pursuing exclusive commercialization of it.
The text argues that AI differs from nuclear weapons: because its power is generative and creative, monopoly control would compound advantages over time. This risks a permanent asymmetry, making exclusive-access business models structurally extractive rather than merely temporarily strategic.
The text warns that when a few secretive, closed-source labs concentrate compute, talent, and deployment power, intelligence becomes centralized and is translated into social authority. This risks a neofeudal hierarchy with a "permanent underclass," turning closed AI deployment into a political legitimacy crisis for its builders and investors.