Who Pays for AI's Power Bill?
Data Centers & Environmental Sustainability
Data centers are the physical backbone of AI, cloud computing, and the internet. They consume enormous amounts of electricity and water — often in communities that see little of the benefit. This model ensures those communities share in the gains and aren't left with the costs.
You Own It. You Should Be Able to Fix It.
Right to Repair, Interoperability & E-Waste
Manufacturers use software locks to prevent repair, force planned obsolescence, and generate e-waste — the world's fastest-growing waste stream. This model closes the software loophole, requires spare parts and repair documentation, and holds producers responsible for end-of-life.
Public Money, Public Code
Open Source in Government
When governments buy software with public money, the result belongs to the public. Yet most publicly funded software is never released, never reused, and creates permanent dependency on a handful of vendors. This model requires open release, prohibits lock-in, and establishes an open source preference in procurement.
AI That Works For You, Not On You
AI Adoption & Governance
Governments are adopting AI in healthcare, justice, social services, and hiring — often without meaningful oversight. This model establishes what legitimate public sector AI looks like: assessed for risk, transparent to affected people, contestable, and clearly disclosed as AI.
Who's Holding the Algorithm Accountable?
Algorithmic Accountability
Algorithms decide who gets hired, who gets a loan, who gets housing assistance, and who gets flagged by police — often with no explanation and no appeal. This model establishes independent bias auditing, individual rights to challenge, and meaningful human review.
A Greener Web
Web Sustainability
The internet's carbon footprint is roughly equivalent to aviation's. Web sustainability isn't just about data centers — it's about how software is written, how long devices last, and how digital services are designed. Policy levers exist; the legislative framework is still emerging.
Your City Is Watching You
Smart Cities & Privacy
Smart city technology — cameras, sensors, connected infrastructure — can improve services. It can also enable mass surveillance without meaningful consent. This domain provides model language for procurement limits, data minimisation, and community consent mechanisms.
Whose Internet Is It?
Digital Sovereignty
Digital sovereignty means communities can understand, audit, and exit the digital systems they depend on. It's not about building walls — it's about ensuring that dependency is transparent, voluntary, and reversible.
Technology That's Safe for Kids
Children & Technology
Children are not small adults. They are among the most targeted groups in the digital economy — for manipulation, surveillance, and algorithmic amplification of harmful content. This domain draws on the UN CRC, UK Children's Code, and COPPA 2.0.
Who Decides What You Can Say Online?
Freedom of Expression & Content Governance
Content moderation done too laxly enables harm; done too aggressively, it enables censorship. This is one of the most contested domains in digital policy. This model surfaces the options and trade-offs honestly.
Digital Infrastructure for Everyone
Digital Public Infrastructure
Digital public infrastructure — identity systems, payment rails, data exchange platforms — should work like roads and water systems: open, governed in the public interest, accessible to all. The UNDP/ITU framework and the 50-in-5 campaign are defining this emerging domain.
Who's Managing the Algorithm Managing You?
Platform Work & Algorithmic Labour Rights
Platform workers — drivers, delivery couriers, freelancers — have their pay set, tasks assigned, performance monitored, and livelihoods ended by algorithms they cannot see or challenge. This model establishes rights to algorithmic transparency, human review of automated decisions, pay formula disclosure, and collective governance of the systems that manage their work.
Why Are You Paying More Than Your Neighbour?
Surveillance Pricing & Consumer Data Rights
Retailers, insurers, and platforms use your browsing history, location, financial status, and psychological profile to charge you more than they charge other people for the same product. This model establishes the right to know when your price was personalised, protection against vulnerability-based exploitation, and prohibition on discriminatory pricing through algorithmic proxy variables.
Who Is Accountable When Platforms Cause Harm?
Platform Liability & Systemic Accountability
Large platforms profit from algorithmic amplification of harmful, false, and manipulative content. Yet liability shields designed for a 1996 internet let them escape accountability for the foreseeable consequences of their design choices. This model establishes systemic risk assessments, independent audits, user due process rights, and penalties scaled to global revenue.