Surveillance Pricing & Consumer Data Rights — Model Policy

Why Are You Paying More Than Your Neighbour?

Status: Draft
Last updated: 2026-04-18
Maintainers: Open Digital Policies community
Related domains: Algorithmic Accountability; Platform Work & Algorithmic Labour Rights; Smart Cities & Privacy; Digital Sovereignty
Key sources: EU Digital Markets Act (DMA) 2022; EU Unfair Commercial Practices Directive; FTC surveillance pricing inquiry (2024); GDPR (EU) 2016/679; Consumer Financial Protection Bureau research
Overview
Surveillance pricing — also called personalised pricing, dynamic pricing, or price discrimination — uses behavioural, locational, and demographic data to charge different prices to different individuals or groups for identical goods and services. Unlike transparent dynamic pricing (e.g., airline seat availability), surveillance pricing exploits information asymmetry: the seller knows everything about the buyer’s circumstances and willingness to pay; the buyer knows nothing about how their price was set. This policy model establishes rights of transparency, challenge, and protection against manipulative pricing practices that exploit personal data, with particular attention to their discriminatory potential and their impact on wages as well as consumer prices.
The Core Tension
We want the potential efficiency benefits of dynamic pricing and personalised offers — without enabling platforms and retailers to exploit individuals’ personal data, financial vulnerability, and lack of alternatives to charge them more than others for the same goods and services.
Scope
This policy model is designed to apply at the level of: (select all that apply)
- Municipal / local government
- Regional / state / provincial government
- National government
- Public sector procurement (any level)
- Regulated industry
- Other: _______
Note: Surveillance pricing occurs in both consumer markets (retail, travel, insurance, gig services) and labour markets (algorithmic wage-setting by platforms). This model addresses both, recognising that the same data infrastructure and the same mechanisms of exploitation operate in both contexts.
Pillar 1: Principles
Foundational Values
1. Price Is Not a Trade Secret When It Exploits Personal Data The price charged to an individual consumer or worker is not a proprietary algorithm deserving protection when that price is derived from surveillance of that individual’s behaviour, location, demographics, or financial circumstances. People have the right to know how their price was determined and what data was used to set it.
2. Personalised Pricing at the Expense of the Vulnerable Is Exploitation Dynamic pricing systems that systematically charge higher prices to people who are more geographically constrained, more financially stressed, or less price-sensitive exploit vulnerability rather than serve efficiency. The same mechanism that extracts surplus from affluent consumers who value convenience extracts survival payments from low-income consumers who have no choice. These are not equivalent outcomes.
3. Behavioural Manipulation Is Not Market Efficiency Pricing systems that are designed to identify and exploit individual psychological vulnerabilities — targeting moments of heightened anxiety, urgency, or decision fatigue — are not market mechanisms. They are manipulation. Policy must distinguish between pricing that responds to legitimate supply and demand signals and pricing that engineers individual susceptibility.
4. Anti-Discrimination Principles Apply to Algorithmic Pricing Pricing algorithms that use proxy variables to charge racialised communities, women, older people, or disabled people more for equivalent goods and services are discriminatory — regardless of whether the discrimination was intended, regardless of whether protected characteristics were directly used, and regardless of whether it is the algorithm or a human who set the price. Anti-discrimination law applies to algorithmic pricing as to any other pricing practice.
5. Workers Have the Same Rights as Consumers Algorithmic wage-setting — adjusting the pay offered to workers based on their demonstrated willingness to accept lower pay, their financial circumstances, or their responsiveness to pressure — is the labour-market equivalent of surveillance pricing. Workers’ right to pay transparency and protection against data-driven wage suppression is analytically equivalent to consumer protection against surveillance pricing.
6. Market Fairness Requires Price Intelligibility A market in which sellers have complete information about buyers and use it to extract maximum willingness to pay, while buyers have no information about how prices are set, is not a competitive market. It is a system of individually targeted extraction. Restoring market fairness requires not just individual transparency rights but structural constraints on the use of behavioural data in pricing.
Equity Considerations
- Low-income consumers and workers — Surveillance pricing systems that use financial stress, limited geographic mobility, or desperation as signals to extract higher prices or suppress wages impose the greatest harm on people with the least ability to absorb it or seek alternatives. These are not edge cases; they are the primary mechanism of exploitation.
- Racialised communities — Research shows that algorithmic pricing and insurance rating systems that use postcode, purchasing pattern, or social network data systematically charge higher prices to residents of racialised communities. Algorithmic pricing is a mechanism through which redlining-equivalent practices continue in contemporary consumer markets.
- Older adults — Research on pricing algorithms for insurance, travel, and consumer goods shows that older adults — who tend to show less price-comparison behaviour and have established patterns that suggest loyalty — are frequently charged premium prices by surveillance systems.
- Rural and remote communities — Geographic constraints reduce residents’ practical alternatives. Pricing algorithms that identify low competition or high travel costs as willingness-to-pay signals exploit geographic isolation.
- Women and gender-diverse consumers — “Pink tax” phenomena — higher prices for equivalent goods marketed to women — have been documented in both traditional retail and algorithmic pricing systems. Algorithmic pricing that uses browsing, purchase history, and demographic signals reproduces and extends these patterns.
Environmental Considerations
Surveillance pricing infrastructure requires persistent data collection at scale — tracking browsing, location, purchase history, and behavioural signals across platforms and devices. This infrastructure has a significant energy and data storage footprint. Policy must not inadvertently incentivise the expansion of surveillance data collection; requirements for data minimisation in pricing systems serve both privacy and environmental goals. See the Data Centers model for standards on the energy footprint of data processing infrastructure.
Pillar 2: Standards
Mandatory Standards
Standard 1: Right to Know Your Price Was Personalised Where an operator uses personal data — including but not limited to browsing history, purchase history, location history, device type, IP address, inferred demographic or psychographic profiles, or any data derived from third-party data brokers — to set a price charged to an individual consumer or worker that differs from the price offered to other individuals in equivalent circumstances, the operator must:
(a) Inform the individual, at or before the time the price is presented, that personalised pricing has been applied;
(b) Identify the general categories of data used in the pricing decision;
(c) Provide, upon request, information about the significant factors that led to this individual’s price differing from a baseline or reference price;
(d) Disclose whether the personalised price is higher or lower than the price offered to consumers or workers not subject to personalised pricing, and by what approximate percentage.
This disclosure obligation applies to all goods and services, including insurance premiums, subscription pricing, gig platform earnings, on-demand service fees, and any other context where personal data influences the price or pay offered.
Rationale: The EU Digital Markets Act (2022) requires transparency in ranking and pricing practices for gatekeepers, and prohibits combining personal data from third-party services for targeting without consent. The EU Unfair Commercial Practices Directive prohibits commercial practices that materially distort consumers’ economic behaviour, including through manipulation and the withholding of material information. Disclosure that personalised pricing has been applied is the minimum necessary for consumers to make informed decisions; without it, surveillance pricing operates in complete information asymmetry. The FTC’s 2024 surveillance pricing inquiry ordered eight companies to provide information on tools that use customer data to personalise prices without disclosure, including price-testing tools that identify individual price sensitivity.
Reference: EU Digital Markets Act, Regulation (EU) 2022/1925, Articles 5–7; EU Unfair Commercial Practices Directive 2005/29/EC; FTC orders directing eight companies to provide information on surveillance pricing practices (2024)
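To make the disclosure obligation concrete, here is a minimal sketch of a machine-readable disclosure record covering clauses (a) through (d). The schema, field names, and reference-price arithmetic are illustrative assumptions, not a format prescribed by this standard.

```python
from dataclasses import dataclass

# Illustrative sketch only: the structure and field names are assumptions,
# not a disclosure format mandated by Standard 1.

@dataclass
class PriceDisclosure:
    """Record covering Standard 1, clauses (a) through (d)."""
    personalised: bool              # (a) personalised pricing was applied
    data_categories: list[str]      # (b) general categories of data used
    significant_factors: list[str]  # (c) provided upon request
    reference_price: float          # baseline offered without personalisation
    offered_price: float            # price shown to this individual

    def deviation(self) -> str:
        """(d) Direction and approximate size of the personalised adjustment."""
        pct = 100.0 * (self.offered_price - self.reference_price) / self.reference_price
        direction = "higher" if pct > 0 else "lower" if pct < 0 else "equal"
        return f"{direction} ({pct:+.0f}% vs. reference price)"

disclosure = PriceDisclosure(
    personalised=True,
    data_categories=["browsing history", "device type", "location history"],
    significant_factors=["repeat visits to product page", "mobile device"],
    reference_price=100.00,
    offered_price=112.00,
)
print(disclosure.deviation())  # "higher (+12% vs. reference price)"
```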
Standard 2: Prohibition on Vulnerability-Based Pricing Operators must not use personal data to identify or exploit individual vulnerability in pricing, including:
(a) Identifying that an individual is under time pressure, financial stress, or in an emergency situation and using this to increase prices above what would be charged in normal circumstances;
(b) Using individual financial distress signals — including credit score data, debt data, or spending pattern data indicating financial constraint — to charge higher prices or reduce pay on the basis that the individual has limited alternatives;
(c) Using data about individual health status, disability, or protected characteristics to charge higher prices in contexts where those characteristics indicate reduced mobility, reduced alternatives, or increased need;
(d) Using psychological profiling data — including inferred emotional states, cognitive load indicators, or identified decision-making patterns — to time pricing offers to moments of identified susceptibility.
This prohibition applies regardless of whether the operator directly holds the data used or whether it is derived from third-party data brokers.
Rationale: Vulnerability-based pricing is the clearest case of a pricing practice that exploits rather than serves consumers. Emergency ride-hailing surge pricing during disasters — explicitly prohibited in some jurisdictions under price-gouging law — is the most visible example. Insurance pricing that charges higher premiums to financially stressed consumers, and platform gig work that suppresses pay offers to workers showing high acceptance rates (indicating financial desperation), are structurally equivalent. The EU Unfair Commercial Practices Directive and the UK Consumer Protection from Unfair Trading Regulations provide a basis for this standard; this model extends those principles explicitly to algorithmic and data-driven contexts.
Reference: EU Unfair Commercial Practices Directive 2005/29/EC, Articles 5–9 (aggressive commercial practices); UK Consumer Protection from Unfair Trading Regulations 2008; Consumer Financial Protection Bureau research on personalised pricing in financial products
Standard 3: Prohibition on Protected-Characteristic Proxies in Pricing Operators must not use algorithmic pricing systems that produce systematic price differences across groups defined by race, ethnicity, sex, gender, age, disability, religion, national origin, or other characteristics protected under applicable anti-discrimination law, whether directly or through proxy variables.
(a) Operators must test pricing algorithms for disparate impact across protected groups before deployment and at minimum every 24 months thereafter;
(b) The use of postcode, neighbourhood, or geographic unit as a variable in pricing is prohibited where it produces a systematic price difference correlated with the racial or ethnic composition of that area that lacks documented justification independent of protected characteristic proxies;
(c) Evidence of systematic price differences across protected groups constitutes a rebuttable presumption of discriminatory pricing; the operator bears the burden of demonstrating that the difference is not attributable to protected characteristic proxies.
Rationale: Algorithmic pricing can reproduce discriminatory outcomes even when protected characteristics are not directly used as inputs. Insurance pricing algorithms that use postcode, purchasing patterns, and social network data as proxies produce racially disparate premiums equivalent to discriminatory insurance practices previously prohibited by law. This standard draws on the Colorado SB21-169 insurance algorithm framework and extends it to all pricing contexts. The disparate impact doctrine in US civil rights law and the EU non-discrimination directives provide the legal foundation.
Reference: Colorado SB21-169 (insurance algorithm testing for unfair discrimination, 2021); EU non-discrimination directives (including Racial Equality Directive 2000/43/EC); GDPR Article 22 (automated decisions); US Fair Housing Act (disparate impact standard reaffirmed by the Supreme Court in Texas Dept. of Housing v. Inclusive Communities, 2015)
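As one illustration of the pre-deployment testing required by clause (a), a basic disparate-impact check compares mean prices for equivalent goods across groups and flags ratios outside a tolerance band. The 0.8 to 1.25 band below borrows from the four-fifths rule used in US employment-selection practice; it is an assumption, not a threshold fixed by this standard, and a production audit would add significance testing and controls for legitimate cost factors.

```python
from collections import defaultdict

# Illustrative audit sketch. The four-fifths-style 0.8-1.25 band is an
# assumption, not a threshold fixed by Standard 3.

def disparate_impact_check(transactions, low=0.8, high=1.25):
    """transactions: iterable of (group_label, price) for equivalent goods.

    Returns (ratios, flagged): ratios maps each group to its mean price
    relative to the lowest-priced group; flagged lists groups whose ratio
    falls outside [low, high].
    """
    totals, counts = defaultdict(float), defaultdict(int)
    for group, price in transactions:
        totals[group] += price
        counts[group] += 1
    means = {g: totals[g] / counts[g] for g in totals}
    baseline = min(means.values())
    ratios = {g: m / baseline for g, m in means.items()}
    flagged = [g for g, r in ratios.items() if not (low <= r <= high)]
    return ratios, flagged

sample = [("area_A", 100), ("area_A", 104), ("area_B", 131), ("area_B", 127)]
ratios, flagged = disparate_impact_check(sample)
print(ratios)   # {'area_A': 1.0, 'area_B': ~1.26}
print(flagged)  # ['area_B'] -> triggers the rebuttable presumption in clause (c)
```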
Standard 4: Data Minimisation for Pricing Operators using algorithmic pricing must:
(a) Use only data that is directly relevant to the legitimate pricing purpose and not retain data in excess of what is needed for that purpose;
(b) Not combine personal data from third-party sources (data brokers, social media, loyalty programmes, device tracking) for pricing purposes without explicit, informed consent from the individual;
(c) Provide consumers and workers with the right to opt out of personalised pricing while retaining access to the good or service at the non-personalised rate;
(d) Delete or anonymise pricing profile data within 12 months of the last transaction or interaction, unless the individual explicitly consents to retention for stated purposes.
Rationale: The EU Digital Markets Act prohibits designated gatekeepers from combining personal data across services without consent. GDPR data minimisation (Article 5(1)(c)) and purpose limitation (Article 5(1)(b)) principles apply to pricing data. The right to opt out of personalised pricing (sub-clause c) is critical: without this right, disclosure of personalised pricing does not give consumers meaningful choice.
Reference: EU Digital Markets Act Article 5(2); GDPR Articles 5, 7, 9; EU ePrivacy Directive; California Consumer Privacy Act (CCPA) opt-out provisions
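Clause (d) lends itself to a periodic deletion sweep. The sketch below assumes a pricing-profile store keyed by individual, with a last-interaction timestamp and an explicit consent flag; the store layout and field names are hypothetical.

```python
from datetime import datetime, timedelta

RETENTION = timedelta(days=365)  # Standard 4(d): 12 months from last interaction

# Hypothetical profile store: structure and field names are illustrative,
# not prescribed by the standard.
profiles = {
    "user-1": {"last_interaction": datetime(2025, 1, 3), "retention_consent": False},
    "user-2": {"last_interaction": datetime(2026, 2, 10), "retention_consent": False},
    "user-3": {"last_interaction": datetime(2024, 6, 1), "retention_consent": True},
}

def retention_sweep(store, now):
    """Delete pricing profiles past the retention window, absent explicit consent."""
    expired = [
        uid for uid, p in store.items()
        if not p["retention_consent"] and now - p["last_interaction"] > RETENTION
    ]
    for uid in expired:
        del store[uid]  # in practice: delete or irreversibly anonymise
    return expired

print(retention_sweep(profiles, now=datetime(2026, 4, 18)))  # ['user-1']
```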
Standard 5: Wage Surveillance Prohibition In the context of algorithmic wage-setting by platform operators, the following practices are prohibited:
(a) Using data about an individual worker’s demonstrated acceptance rate for prior pay offers, financial circumstances, or geographic constraints to set pay offers below what would be offered to workers without those indicators;
(b) Using data about a worker’s response patterns to previous surge or bonus offers to reduce the frequency, magnitude, or timing of such offers to that worker;
(c) Setting different base pay rates for workers with equivalent task profiles based on inferred willingness to accept lower pay.
These prohibitions apply regardless of whether the operator characterises such mechanisms as “dynamic pricing,” “personalised incentives,” or any other term that obscures their function as individual wage suppression.
Rationale: FTC and academic research have documented that ride-hailing and food delivery platforms use acceptance-rate data and driver location and financial patterns to modulate pay offers — offering higher pay to workers with lower acceptance rates and suppressing rates for workers who consistently accept low offers. This is the labour-market equivalent of price discrimination against consumers with low price sensitivity, and it exploits the information asymmetry inherent in algorithmic management. This standard complements Standard 3 in the Platform Work domain.
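One way an auditor might probe for the clause (c) practice is to test whether pay offers for equivalent task profiles vary with workers' historical acceptance rates. The sketch below computes a simple correlation on hypothetical audit data; a real audit would control for task characteristics, time, and location.

```python
import statistics

# Hypothetical audit data: each record is (acceptance_rate, offered_pay) for
# workers completing tasks with equivalent profiles. The layout is assumed.
offers = [
    (0.95, 11.20), (0.92, 11.60), (0.90, 11.80),
    (0.55, 14.10), (0.50, 14.50), (0.48, 14.80),
]

def pay_acceptance_correlation(records):
    """Pearson correlation between acceptance rate and offered pay.

    A strongly negative value suggests workers who reliably accept offers
    are being paid less for equivalent work (Standard 5, clause (c)).
    """
    xs = [r[0] for r in records]
    ys = [r[1] for r in records]
    return statistics.correlation(xs, ys)  # requires Python 3.10+

r = pay_acceptance_correlation(offers)
print(f"correlation = {r:.2f}")  # strongly negative -> warrants investigation
```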
Aspirational Standards
Aspirational Standard 1: Algorithmic Pricing Registry Jurisdictions should require operators above a defined revenue threshold to register their algorithmic pricing systems with a designated regulatory body, including a description of the data inputs used, the types of personalisation applied, and the results of disparate impact testing. Registry entries should be publicly accessible to enable civil society monitoring.
Aspirational Standard 2: Reference Price Publication Operators using personalised pricing should be required to publish, for each product or service category, the reference price — the price charged to individuals whose data does not trigger personalisation — so that consumers can determine the premium they are paying as a result of data exploitation. Where a reference price does not exist (i.e., all pricing is personalised), operators should be required to calculate and publish a median price.
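Where no non-personalised baseline exists, the required median reference price can be computed directly from transaction records, as in this sketch (the category grouping and field layout are illustrative).

```python
import statistics
from collections import defaultdict

# Illustrative sketch: where every price is personalised, publish the
# median transaction price per product category as the reference price.
transactions = [
    ("umbrella", 12.50), ("umbrella", 14.00), ("umbrella", 19.90),
    ("kettle", 30.00), ("kettle", 34.50),
]

def median_reference_prices(records):
    by_category = defaultdict(list)
    for category, price in records:
        by_category[category].append(price)
    return {c: statistics.median(ps) for c, ps in by_category.items()}

print(median_reference_prices(transactions))
# {'umbrella': 14.0, 'kettle': 32.25}
```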
Aspirational Standard 3: Sector-Specific Prohibitions Jurisdictions should consider outright prohibitions on personalised pricing in specific high-stakes sectors where vulnerability is greatest and alternatives are fewest, including: essential utilities, healthcare goods and services, emergency transport, essential food staples, housing and mortgage products, and insurance for health or life.
Standards Cross-Reference
| Standard Referenced | Body | Version | Notes |
|---|---|---|---|
| EU Digital Markets Act | European Parliament | 2022/1925 | Pricing transparency and data combination prohibitions |
| EU Unfair Commercial Practices Directive | European Parliament | 2005/29/EC | Aggressive and deceptive practices |
| GDPR | EU | 2016/679 | Data minimisation, purpose limitation, consent |
| Colorado SB21-169 | Colorado Legislature | 2021 | Insurance algorithm testing for unfair discrimination |
| FTC Surveillance Pricing Inquiry | FTC (US) | 2024 | Regulatory investigation into personalised pricing practices |
| California Consumer Privacy Act (CCPA) | California Legislature | 2018/2020 | Consumer opt-out and data rights |
Pillar 3: Implementation
Procurement Requirements
Procurement Clause A: Prohibition on Surveillance Pricing in Public Contracts Government bodies and publicly funded entities must not contract with providers that: (a) use personal data to set prices or remuneration for individuals in ways that violate the standards established in this policy; (b) use data broker-sourced profiles to set prices without disclosed consent; (c) cannot demonstrate that their pricing algorithms have been tested for disparate impact across protected groups.
Procurement Clause B: Pay Formula Transparency in Government-Contracted Services Where government bodies contract with platform operators or gig economy intermediaries for the supply of labour or services, contractors must disclose the pay formula used to set worker compensation, demonstrate that no wage surveillance mechanisms are used, and provide aggregate pay data disaggregated by worker demographics on request.
Transition and Timeline
| Milestone | Timeframe from adoption | Notes |
|---|---|---|
| Personalised pricing disclosure required | 6 months | Notification at point of pricing |
| Opt-out right in force | 6 months | Access to non-personalised rate upon request |
| Vulnerability-based pricing prohibition in force | 12 months | |
| Disparate impact testing required for new systems | 6 months | |
| Disparate impact testing required for existing systems | 24 months | |
| Data broker data combination prohibition in force | 12 months | |
| Wage surveillance prohibitions in force | 6 months | |
Reporting and Transparency
Transparency Requirement Operators above a defined revenue threshold must report annually to the designated oversight body: (a) the categories of data used in pricing and wage-setting algorithms; (b) the results of disparate impact testing across protected groups; (c) the number and outcomes of individual disclosure requests; (d) the number and outcomes of opt-out requests; (e) any material changes to algorithmic pricing systems in the reporting period; (f) a summary of complaints received and their resolution. Summary reports must be published publicly; detailed testing methodology and results must be provided to the oversight body.
Enforcement
Enforcement Clause The designated consumer protection or competition authority may: (a) require operators to produce algorithmic pricing documentation, data input logs, and disparate impact test results; (b) commission independent algorithmic audits at operator expense where the authority has reason to believe pricing systems produce discriminatory or exploitative outcomes; (c) impose penalties scaled to operator revenue for violations of disclosure, opt-out, or data minimisation requirements; (d) seek injunctive relief to suspend pricing practices found to produce disparate impact pending remediation; (e) grant consumer and worker advocacy organisations standing to bring enforcement complaints without requiring a named individual complainant.
The burden of proof in enforcement actions concerning disparate impact rests with the operator once a pattern of differential pricing correlated with protected characteristics has been demonstrated by the complainant or regulator.
Notes on enforcement: Surveillance pricing enforcement requires technical capacity that most consumer protection bodies currently lack. Investing in algorithmic auditing capacity — or establishing a joint technical body with data protection and competition authorities — is a prerequisite for meaningful enforcement.
Pillar 4: Governance
Oversight Body
Oversight Clause Oversight of surveillance pricing shall be shared between: the designated consumer protection authority (for consumer pricing); the labour or employment regulatory body (for wage-setting); and the data protection authority (for data use in pricing). These bodies must establish a joint working group on algorithmic pricing with authority to: develop shared audit methodology for pricing algorithms; coordinate enforcement across regulatory boundaries; publish guidance on the boundary between legitimate dynamic pricing and prohibited surveillance pricing; and advise on sector-specific prohibitions. The joint working group must include representation from competition authorities given the market concentration dimensions of surveillance pricing.
Community Representation
Participation Clause The joint working group and any associated advisory bodies must include seats reserved for: consumer advocacy organisations; worker organisations and trade unions; civil rights and racial justice organisations; disability rights advocates; organisations representing low-income consumers; and academic researchers in algorithmic pricing, data economics, and consumer protection. Community representatives must be consulted before publication of guidance, revision of audit methodology, and any proposed sector-specific prohibitions.
Equity note: Surveillance pricing research has consistently shown that its most severe effects fall on low-income consumers, racialised communities, and people with disabilities. These communities’ representatives must have direct input into the governance of pricing algorithm standards.
Audit and Review
Audit Clause The joint oversight working group shall commission or require independent audits of algorithmic pricing systems for operators above a defined market share threshold at minimum every two years. Audits shall assess: accuracy of disparate impact testing; compliance with data minimisation requirements; whether opt-out mechanisms are genuine and accessible; and whether wage-setting practices comply with the wage surveillance prohibition. Audit results must be published in full.
Review Clause This policy shall be reviewed every two years to account for: new data types and sources used in pricing algorithms; developments in EU Digital Markets Act enforcement and case law; outcomes of FTC and national enforcement actions; and emerging evidence on the impact of surveillance pricing on specific demographic groups. Review must include a public consultation with targeted outreach to directly affected communities.
Real-World Examples
European Union — Digital Markets Act (DMA)
Enacted: 2022; enforcement from March 2024
Type: EU Regulation
Link: https://digital-markets-act.ec.europa.eu/
Summary: The DMA designates the largest digital platforms (“gatekeepers”) and imposes specific obligations, including: prohibiting the combination of personal data across services without consent; requiring transparency in ranking and recommender systems; prohibiting gatekeepers from leveraging data from business users to compete against them; and mandating interoperability. While the DMA does not explicitly prohibit personalised pricing, its data combination and transparency obligations significantly constrain the data infrastructure on which surveillance pricing depends. The European Commission has opened formal non-compliance proceedings against multiple gatekeepers.
Limitation: The DMA applies only to designated gatekeepers; smaller operators are not covered, and surveillance pricing by non-dominant firms falls outside its scope.
United States — FTC Surveillance Pricing Inquiry
Enacted/Proposed: FTC orders issued July 2024; investigation ongoing
Type: Regulatory investigation and orders
Link: https://www.ftc.gov/news-events/topics/competition-enforcement
Summary: The FTC ordered eight companies operating in the consumer pricing ecosystem — including Mastercard, Revionics, Bloomreach, JPMorgan Chase, Accenture, McKinsey, and PROS Holdings — to provide detailed information about their surveillance pricing practices, including what consumer data they use, how prices are personalised, and what the effects on consumers are. This is the most significant US regulatory action on surveillance pricing to date, though it has not yet produced enforcement rules. The FTC’s inquiry found that surveillance pricing tools can access browsing history, location, device type, and third-party data broker profiles to set individualised prices.
Limitation: No substantive rule has resulted yet; the inquiry is investigative rather than an enforcement action.
Colorado — SB21-169 (Insurance Algorithms)
Enacted: Signed July 2021; compliance reports from December 2024
Type: State law
Link: https://doi.colorado.gov/for-consumers/sb21-169-protecting-consumers-from-unfair-discrimination-in-insurance-practices
Summary: Colorado insurers must test algorithmic pricing systems for unfair discrimination against protected groups — including race, religion, sex, sexual orientation, and disability — and submit governance policies and compliance reports to the Division of Insurance. This is the most directly applicable enacted law targeting discriminatory algorithmic pricing. It establishes the principle that insurers bear the burden of demonstrating that their algorithms do not discriminate, and requires ongoing monitoring rather than one-time certification.
Limitation: Applies only to insurance; there is no equivalent for retail, gig work, or other consumer pricing contexts.
European Union — Unfair Commercial Practices Directive and Modernisation Directive
Enacted: 2005 (UCPD); 2019 Modernisation Directive added digital-specific provisions
Type: EU Directive
Link: https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A32005L0029
Summary: The 2019 Omnibus/Modernisation Directive (2019/2161) amended the UCPD and the Consumer Rights Directive to require traders to disclose when online search results are personalised on the basis of automated decision-making, to disclose when consumer prices are personalised on the basis of automated decision-making, and to establish additional protections against digital-specific manipulative practices. Member states were required to transpose the directive by May 2022. This is the first EU-wide requirement for consumer-facing disclosure of personalised pricing.
Limitation: Requires disclosure of personalised pricing but does not prohibit it; enforcement is member-state dependent and has been inconsistent.
Gaps and Known Weaknesses
- No comprehensive prohibition — No jurisdiction has yet enacted an outright prohibition on surveillance pricing in consumer markets. All existing frameworks require disclosure or testing but permit the practice. The gap between regulation and prohibition is significant.
- Data broker ecosystem — Surveillance pricing relies heavily on data purchased from brokers who aggregate behavioural data from thousands of sources. Regulating pricing at the point of sale does not address the upstream data infrastructure. Meaningful regulation requires action on the data broker market.
- Labour market equivalent is underregulated — Wage surveillance prohibition (Standard 5 in this model) has no equivalent in enacted law anywhere. Platform wage suppression through acceptance rate monitoring is documented but largely unaddressed.
- Bundled pricing and subscription opacity — Surveillance pricing increasingly operates through subscription tiers, bundle pricing, and loyalty programme manipulation where the personalised price is hidden within complex product structures. This model’s disclosure requirements may need extension to cover these contexts.
- Real-time price discrimination in essential services — The strongest case for sector-specific prohibition (Aspirational Standard 3) is for essential services including utilities, healthcare, food, and housing. This model identifies the need but does not provide detailed sector-specific language.
- Cross-border enforcement — Most surveillance pricing by major platforms operates across borders. Enforcement by any single national authority is constrained by jurisdictional limits. International cooperation mechanisms are underdeveloped.
Cross-Domain Dependencies
| Related Domain | Relationship |
|---|---|
| Algorithmic Accountability | Algorithmic pricing systems require bias audits equivalent to those required for hiring and credit; disparate impact standards and audit methodology apply |
| Platform Work & Algorithmic Labour Rights | Algorithmic wage-setting and surveillance pricing are two sides of the same mechanism; pay formula transparency is in both domains |
| Smart Cities & Privacy | Location and sensor data from smart city infrastructure can feed surveillance pricing; data minimisation standards apply |
| Digital Sovereignty | Data broker ecosystems and cross-border data flows are a sovereignty and governance challenge as well as a pricing one |
| Data Centers | Real-time surveillance pricing requires persistent data infrastructure; energy and environmental standards apply |
| Freedom of Expression & Content Governance | Algorithmic amplification of content is a form of personalisation with economic effects analogous to pricing |
Glossary
Algorithmic Pricing: The use of automated systems to set prices or wages based on data inputs including personal data, market conditions, supply and demand signals, and behavioural profiles. Distinct from simple dynamic pricing (which responds only to supply/demand signals) in that it incorporates individual-level data.
Data Broker: A company that collects, aggregates, and sells personal data — typically including browsing history, purchase history, location history, and inferred demographic and psychographic profiles — to third parties including retailers, insurers, and employers, without a direct relationship with the individuals whose data is sold.
Disparate Impact: A legal doctrine holding that a facially neutral practice is discriminatory if it produces outcomes that disproportionately disadvantage a protected group, regardless of intent. Applied in this domain to pricing algorithms that use proxy variables to produce race- or class-correlated pricing differences.
Dynamic Pricing: Real-time adjustment of prices based on supply and demand signals — for example, airline seat prices that change as booking windows close and availability decreases. Distinct from surveillance pricing in that it responds to market-level rather than individual-level signals, though the boundary is contested in practice.
Personalised Pricing / Price Discrimination: Setting different prices for identical goods or services for different individuals or groups. Theoretically possible without personal data (based on stated willingness to pay, for example), but in practice in digital markets primarily implemented through surveillance data.
Surveillance Pricing: The practice of using personal data — including behavioural surveillance data, location data, financial status data, and inferred psychological profiles — to set individualised prices that extract maximum willingness to pay from each consumer, or to set pay offers that minimise what workers will accept. The FTC has used this term in its 2024 inquiry.
Wage Surveillance: The use of data about workers’ acceptance behaviour, financial circumstances, and geographic constraints to set algorithmic pay offers that suppress wages by identifying and exploiting individual workers’ limited alternatives. The labour-market equivalent of surveillance pricing in consumer markets.
Contributing to This Policy Model
Priority contribution needs for this model:
- Sector-specific prohibition model clauses — Model language for outright prohibition of surveillance pricing in specific essential services sectors (healthcare, housing, utilities, emergency transport)
- Data broker regulation — Model language addressing the upstream data infrastructure on which surveillance pricing depends
- Wage surveillance evidence — Documentation and analysis of platform wage suppression practices with citations to research
- International examples — Any enacted or proposed regulation specifically targeting personalised pricing outside the EU and US
- Opt-out mechanism standards — Detailed requirements for what constitutes a genuine and accessible opt-out from personalised pricing, including standards for the non-personalised alternative
All substantive changes go through a minimum 14-day public comment period before merging.
Changelog
| Version | Date | Summary of changes |
|---|---|---|
| 0.1 | 2026-04-18 | Initial draft — four pillars, real-world examples from EU DMA, FTC, Colorado, EU UCPD |
This policy model is provided for educational and advocacy purposes. It requires adaptation by qualified legal practitioners before formal adoption. It is not legal advice.