Closed Source AI = Neofeudalism

the singularity is nearer

Mar 30, 2026

Open-Source AI Framed As Governance Against Dependency And Centralized Power, Emphasizing Openness, Portability, And Local Control

Politics & Government · Mar 30, 2026

The piece reframes open-source AI as a governance stance: 'anti-feudal' rather than anti-safety. It argues that closed, API-only access uses safety rhetoric to create asymmetric dependence, turning users into 'serfs', whereas openness, portability, and local control are political choices that preserve user ownership and legitimacy.


AI Safety Debate Reframed From Technical Control To Governance And Universal Access

Politics & Government · Mar 30, 2026

The document rejects a middle ground that confines AI to vetted incumbents. Because AI development is inevitable, it argues, the real choice is between universal access and unjustified exclusive control; safety thus becomes a question of governance consistency, and closed-source firms that proclaim public danger while pursuing exclusive commercialization are criticized as incoherent.


Monopoly Over Frontier AI Risks Permanent Asymmetry By Controlling A Generative Engine For Invention

Science, Technology & Innovation · Mar 30, 2026

The text argues that AI differs from nuclear weapons: because its power is generative and creative, monopoly control would compound advantages over time, risking permanent asymmetry and making exclusive-access business models structurally extractive rather than merely temporarily strategic.


Concentrated Control Of Advanced AI Could Create A Legitimacy Crisis And A Neofeudal Social Order

Politics & Government · Mar 30, 2026

The text warns that when a few secretive, closed-source labs concentrate compute, talent, and deployment power, intelligence becomes centralized and is translated into social authority. The result risks a neofeudal hierarchy with a 'permanent underclass', turning closed AI deployment into a political legitimacy crisis for builders and investors.