Open Digital Policies

Good digital policy is a public good.

Technology shapes our energy systems, our privacy, our access to public services, and our relationship with government. Most digital policy is written behind closed doors in language inaccessible to the communities most affected. Open Digital Policies (ODP) changes that.

What we stand for

We want innovation — but done right

⚡ AI without the power grab

We want AI — but not at the cost of clean water and reliable electricity for the communities hosting data centers.

🏙️ Smart cities without surveillance

We want smart cities — but not at the cost of our privacy and the right to move through public space without being tracked.

🔓 Public money, public code

Software built with taxpayer money should be available to all taxpayers — not locked inside proprietary systems that charge for access.

🔧 Own it. Fix it. Keep it.

Innovation shouldn't mean planned obsolescence. You should be able to repair what you own, and software locks shouldn't stop you.

♿ Access is a right

Digital public services must work for everyone — not just the majority. Accessibility is a precondition for digital equity, not a bonus feature.

📊 Algorithms need answers

When an algorithm affects your job, housing, or benefits, you have the right to know why — and the right to challenge it.

Policy domains

What matters to you?

Each domain follows a four-pillar structure: Principles → Standards → Implementation → Governance — with model legislative language, real-world examples, and identified gaps.

draft

Who Pays for AI's Power Bill?

Data Centers & Environmental Sustainability

Data centers are the physical backbone of AI, cloud computing, and the internet. They consume enormous amounts of electricity and water — often in communities that see little of the benefit. This model ensures communities share the benefits and aren't left with the costs.

draft

You Own It. You Should Be Able to Fix It.

Right to Repair, Interoperability & E-Waste

Manufacturers use software locks to prevent repair, force planned obsolescence, and generate e-waste, the world's fastest-growing waste stream. This model closes the software loophole, requires spare parts and repair documentation, and holds producers responsible for end-of-life management.

draft

Public Money, Public Code

Open Source in Government

When governments buy software with public money, the result belongs to the public. Yet most publicly funded software is never released, never reused, and creates permanent dependency on a handful of vendors. This model requires open release, prohibits lock-in, and establishes a preference for open source in procurement.

draft

AI That Works For You, Not On You

AI Adoption & Governance

Governments are adopting AI in healthcare, justice, social services, and hiring — often without meaningful oversight. This model establishes what legitimate public sector AI looks like: assessed for risk, transparent to affected people, contestable, and clearly disclosed as AI to those who interact with it.

draft

Who's Holding the Algorithm Accountable?

Algorithmic Accountability

Algorithms decide who gets hired, who gets a loan, who gets housing assistance, and who gets flagged by police — often with no explanation and no appeal. This model establishes independent bias auditing, individual rights to challenge, and meaningful human review.

draft

A Greener Web

Web Sustainability

The internet's carbon footprint is roughly equivalent to that of aviation. Web sustainability isn't just about data centers — it's about how software is written, how long devices last, and how digital services are designed. Policy levers exist; the legislative framework is still emerging.

draft

Your City Is Watching You

Smart Cities & Privacy

Smart city technology — cameras, sensors, connected infrastructure — can improve services. It can also enable mass surveillance without meaningful consent. This domain provides model language for procurement limits, data minimisation, and community consent mechanisms.

draft

Whose Internet Is It?

Digital Sovereignty

Digital sovereignty means communities can understand, audit, and exit the digital systems they depend on. It's not about building walls — it's about ensuring that dependency is transparent, voluntary, and reversible.

draft

Technology That's Safe for Kids

Children & Technology

Children are not small adults. They are among the most targeted groups in the digital economy — for manipulation, surveillance, and algorithmic amplification of harmful content. This domain draws on the UN Convention on the Rights of the Child, the UK Children's Code, and COPPA 2.0.

draft

Who Decides What You Can Say Online?

Freedom of Expression & Content Governance

Content moderation done too loosely enables harm. Done too aggressively, it enables censorship. This is one of the most contested domains in digital policy. This model surfaces the options and trade-offs honestly.

draft

Digital Infrastructure for Everyone

Digital Public Infrastructure

Digital public infrastructure — identity systems, payment rails, data exchange platforms — should work like roads and water systems: open, governed in the public interest, accessible to all. The UNDP/ITU framework and the 50-in-5 campaign are defining this emerging domain.

draft

Who's Managing the Algorithm Managing You?

Platform Work & Algorithmic Labour Rights

Platform workers — drivers, delivery couriers, freelancers — have their pay set, tasks assigned, performance monitored, and livelihoods ended by algorithms they cannot see or challenge. This model establishes rights to algorithmic transparency, human review of automated decisions, pay formula disclosure, and collective governance of the systems that manage their work.

draft

Why Are You Paying More Than Your Neighbour?

Surveillance Pricing & Consumer Data Rights

Retailers, insurers, and platforms use your browsing history, location, financial status, and psychological profile to charge you more than they charge other people for the same product. This model establishes the right to know when your price was personalised, protection against vulnerability-based exploitation, and a prohibition on discriminatory pricing through algorithmic proxy variables.

draft

Who Is Accountable When Platforms Cause Harm?

Platform Liability & Systemic Accountability

Large platforms profit from algorithmic amplification of harmful, false, and manipulative content. Yet liability shields designed for a 1996 internet let them escape accountability for the foreseeable consequences of their design choices. This model establishes systemic risk assessments, independent audits, user due process rights, and penalties scaled to global revenue.

View all domains, including those in development →

How to use this

From model to movement

Policy models in this repository are not off-the-shelf ordinances to adopt verbatim. They are starting points — carefully researched, ready to adapt.

Read

Find the policy domain that matters to your community. Read the Principles section to understand what values are at stake and the Standards section to see what enforceable language looks like.

Adapt

Model language needs to be adapted for your jurisdiction, legal system, and political context. The real-world examples show how others have done it. The gaps section shows what's still unresolved.

Advocate

Use the model to make the case to your elected representatives, your municipality, or your institution's procurement team. Then add what you learn back to this repository.

Open to everyone

This is your repository too

Good policy language belongs to all of us. If you know of a jurisdiction that has enacted similar language, have expertise in a gap we've identified, or want to translate a model into another language, your contribution is needed.

Flag a gap
Report an adoption
How to contribute