Data Fidelity and Ethical Mapping

Why This Matters

Digital data can help emergency responders find people who need help.

But digital data can also mislead.

A spike in app usage does not mean a spike in need. A quiet zip code does not mean a safe zip code.

This guide helps you use data carefully—with confidence and caution.

Tone: This guide supports the use of technology in emergency response. It also names its blind spots. Use both views together.


Section 1: Defining “Ground Truth”

Two Kinds of Data

Not all data is equal. Before acting on any data source, ask: How was this collected?

Direct Data

Definition: Information that comes from a known person who chose to share it.

Examples: functional needs registry entries, 911 calls, intake forms, direct outreach contacts.

Strengths: tied to a known, reachable individual; reflects a need the person stated themselves.

Weaknesses: covers only people who chose to register or call; can go stale if not kept current.

Inferred Data

Definition: Information derived from patterns of digital behavior, not from direct contact.

Examples: search trends, social media posts, app activity and location data.

Strengths: broad reach and near-real-time coverage; requires no action from the people it describes.

Weaknesses: a proxy for attention, not necessarily need; systematically underrepresents people who are offline.


⚠️ Data Integrity Warning

Digital activity is a proxy for attention, not necessarily need.

When you see a spike in searches for “emergency shelter” in a neighborhood, it may mean:

  • People in that area need shelter
  • People in that area are worried and searching for information
  • News coverage of that area increased
  • One person searched many times
  • People outside the area are searching on behalf of someone there

Before acting on inferred data, ask:

  1. Could this pattern be explained by something other than direct need?
  2. Who is missing from this data?
  3. Does this match what our direct data (registries, 911 calls) shows?

Digital signals are useful starting points. They are not ground truth.


Section 2: The “Data Desert” Risk

What Is a Data Desert?

A data desert is a geographic area or demographic group with little or no digital signal—not because they are safe, but because they are not online.

Data deserts are dangerous because silence looks like safety.

Who Is Missing from Digital Data?

Digital data systematically underrepresents:

Group Why They May Be Missing
Older adults (75+) Lower smartphone use, less social media activity
Low-income households Limited data plans, fewer connected devices
People with cognitive disabilities May not use digital tools independently
Rural residents Limited broadband access
Undocumented immigrants Avoid digital systems due to fear
People who are unhoused Intermittent device and connectivity access
Non-English speakers Platforms may not support their language

List format (for screen readers):

  • Older adults (75+): lower smartphone use, less social media activity
  • Low-income households: limited data plans, fewer connected devices
  • People with cognitive disabilities: may not use digital tools independently
  • Rural residents: limited broadband access
  • Undocumented immigrants: avoid digital systems due to fear
  • People who are unhoused: intermittent device and connectivity access
  • Non-English speakers: platforms may not support their language

Selection Bias in Emergency Response

Selection Bias occurs when the data you collect is not representative of the population you are trying to reach.

In emergency response, selection bias can be life-threatening.

Example:

A city monitors Twitter for distress signals during a flood. The downtown core shows hundreds of posts asking for help. A low-income neighborhood two miles away shows almost no activity.

Incorrect conclusion: The downtown core needs more resources.

What may actually be true: The low-income neighborhood has fewer smartphones, less data access, and older residents who do not use Twitter. They may need more resources, not fewer.

Acting only on digital signals amplifies existing inequality.
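To make the selection-bias example concrete, here is a minimal Python sketch. All numbers (post counts, platform reach, populations) are hypothetical assumptions chosen for illustration, not measurements:

```python
# Illustrative only: hypothetical numbers showing how raw social media
# counts can invert the true picture when platform reach differs by area.

# Distress posts observed during the flood (hypothetical).
posts = {"downtown_core": 300, "lowincome_neighborhood": 12}

# Estimated share of residents active on the platform (hypothetical).
platform_reach = {"downtown_core": 0.60, "lowincome_neighborhood": 0.02}

population = {"downtown_core": 20_000, "lowincome_neighborhood": 15_000}

for area in posts:
    active = population[area] * platform_reach[area]
    # Posts per 1,000 platform-active residents: the naive raw count says
    # downtown needs more; adjusting for who can post at all suggests the
    # quiet neighborhood's underlying distress may be higher.
    rate = posts[area] / active * 1_000
    print(area, round(rate, 1), "posts per 1k platform-active residents")
```

With these assumed numbers, the raw counts favor downtown 300 to 12, but the reach-adjusted rates reverse: the low-income neighborhood's visible posts come from a far smaller online population.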

Protocol: When Digital Data Is Unavailable or Unreliable

Use this protocol when you have limited or no digital signal for an area:

Step 1: Assume need, not absence.

Step 2: Check your direct data sources.

Step 3: Cross-reference with demographic data.

Step 4: Activate community networks.

Step 5: Send boots on the ground.
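The protocol above can be sketched as a small triage function. This is a hedged illustration: the `AreaSignal` fields, thresholds, and return labels are assumptions invented for the example, not a standard:

```python
from dataclasses import dataclass

@dataclass
class AreaSignal:
    name: str
    digital_signal: int    # geotagged posts or searches observed (inferred data)
    broadband_rate: float  # share of households with broadband (census estimate)

def triage(area: AreaSignal, signal_floor: int = 20,
           desert_cutoff: float = 0.5) -> str:
    """Decide the next step for an area, applying the zero-baseline principle."""
    if area.digital_signal >= signal_floor:
        return "triangulate"            # enough signal: cross-check direct data
    if area.broadband_rate < desert_cutoff:
        # Likely data desert: silence looks like safety but is not.
        return "priority-field-verify"  # Steps 4-5: community networks, field team
    return "field-verify"               # Step 1: assume need, not absence
```

Note what the function never does: it never returns "low priority" for a quiet area. Absence of signal routes to verification, not deprioritization.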


Section 3: The “Search vs. Status” Distinction


🔍 Reality Check: What a Search Spike Actually Means

Scenario: You are monitoring search trends during a disaster. You notice a spike in searches for “ASL interpreter” coming from zip code 97201.

What it might mean—and what it might not:

What it looks like, and what it might actually mean:

  • Looks like: “The Deaf community is concentrated in 97201.” Might actually mean: people in 97201 are searching because no ASL services exist there.
  • Looks like: “ASL services are needed here more than elsewhere.” Might actually mean: a single advocacy organization is searching for resources on behalf of clients across the region.
  • Looks like: “There is high demand for ASL in 97201.” Might actually mean: hearing people are searching to find out where to volunteer.

The critical distinction:

A spike in searches tells you that people are looking for something.

It does not tell you where the people who need that thing are located.

A spike in searches for “ASL interpretation” in a zip code may indicate a lack of services in that area—not the physical location of the Deaf community.

Do not use search trends to map where a disability community lives.

Use direct data—community organization records, service registries, and in-person outreach—to understand where people are.


Applying the Distinction in Practice

When you see a digital signal, ask two questions:

  1. Who is searching? (The person who has the need, or someone looking for them?)
  2. What are they finding? (A service that exists, or evidence that a service is missing?)

Examples of misreadings to avoid:

  • Treating search volume in a zip code as a map of where a community lives
  • Treating one organization’s repeated searches as community-wide demand
  • Treating searches by helpers and volunteers as searches by the people in need

The rule: Search trends map the gap between need and service, not the location of need.
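One hedged way to apply the rule in code: flag zip codes where searches exist but no known local service does. The zip codes, counts, and the `asl_providers_by_zip` directory below are hypothetical:

```python
# Read a search spike as a *service gap* signal, not a population map.
searches_by_zip = {"97201": 140, "97210": 15, "97213": 90}   # hypothetical
asl_providers_by_zip = {"97210": 2, "97213": 1}              # known services

service_gaps = []
for zip_code, hits in searches_by_zip.items():
    if hits > 0 and asl_providers_by_zip.get(zip_code, 0) == 0:
        # Searches with no local provider: a likely coverage gap to verify
        # with direct data -- NOT evidence of where the Deaf community lives.
        service_gaps.append(zip_code)

print(service_gaps)  # zips to flag for follow-up with direct data
```

The output is a list of areas to investigate with registries and outreach, never a resource allocation on its own.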


Section 4: Mitigation Strategies — Triangulating Your Data

Triangulation means using more than one data source to check your understanding before acting.

No single data source is sufficient. Use at least three.


Strategy 1: Cross-Reference Against Your Direct Data

What to do:

When a digital signal (search spike, social media cluster, app data) points to a specific area, compare it against your direct data before allocating resources.

How:

  1. Pull registry data for the flagged zip code or neighborhood.
  2. Compare: Does the digital signal match what the registry shows?
  3. If the signal is high but registry coverage is low, treat that as a data gap, not a reliable demand signal.
  4. Flag the area for follow-up using Strategy 2 (Boots on the Ground).

Why it works:

Functional Needs Registries reflect people who have actively opted in. Digital signals reflect people who are active online. The overlap is often small. Where they diverge, you need more information.
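The cross-reference steps above can be sketched as a single classification function. The function name, coverage threshold, and return labels are assumptions for illustration, not an established standard:

```python
def classify_signal(search_hits: int, registry_count: int,
                    households: int, min_coverage: float = 0.02) -> str:
    """Classify one area's digital signal against its registry coverage."""
    coverage = registry_count / households if households else 0.0
    if search_hits > 0 and coverage < min_coverage:
        # High signal but thin registry coverage: a data gap, not a
        # reliable demand signal -- route to field verification.
        return "data-gap: field-verify"
    if search_hits > 0 and coverage >= min_coverage:
        return "corroborated: check registry details"  # signals can agree
    return "monitor"

print(classify_signal(search_hits=120, registry_count=4, households=5_000))
```

With 120 searches but only 4 registrations across 5,000 households, the area is flagged as a data gap rather than treated as high demand.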

Cross-reference tools:

  • Functional needs registry records
  • 911 call logs
  • Census and demographic data
  • Community organization and service records


Strategy 2: “Boots on the Ground” Verification

What to do:

Before committing major resources based on a digital signal, send a small verification team to the flagged area.

How:

  1. Identify 2–3 key addresses or community anchor points in the area (community center, faith organization, senior center).
  2. Send a field team to make direct contact.
  3. Ask simple questions: “Are people in this neighborhood getting what they need? Is anyone asking for help that we haven’t reached?”
  4. Document what you find. Update your direct data records.
  5. Report back within 1 hour. Adjust your resource allocation accordingly.

Why it works:

Field verification converts inferred data into direct data. It also builds community trust, which improves future registration rates.

Important: Boots-on-the-ground verification is not a replacement for registry data. It is a supplement. Use it to fill gaps, not to replace systematic tracking.


Strategy 3: Prioritize Direct Communication Over Passive Data Mining

What to do:

When possible, reach out directly to people who may need help rather than waiting to detect need through passive monitoring.

How:

  1. Use direct communication channels: Zello (radio-style app for emergencies), Signal (encrypted messaging), WhatsApp (multilingual, widely used).
  2. Partner with trusted community messengers—faith leaders, senior center staff, home health workers, pharmacies—to make direct contact.
  3. Send outbound messages to known registrants first, then expand to broader community networks.
  4. Create two-way channels: allow people to signal need directly, not just receive information.

Why it works:

Passive data mining (monitoring what people search for, post, or click) captures only people who are already engaging with digital systems. Direct outreach reaches people who are not.

Direct communication tools:

  • Zello: radio-style app for emergencies
  • Signal: encrypted messaging
  • WhatsApp: multilingual, widely used

See also: Multi-Platform Outreach: Zello, Signal, WhatsApp for implementation guidance.


Summary: The Triangulation Checklist

Before acting on a digital data signal, verify:

  • Could this pattern be explained by something other than direct need?
  • Does the signal match your direct data (registries, 911 calls)?
  • Who is missing from this data, and have you checked likely data deserts?
  • Has the flagged area been field-verified or contacted directly?


Key Terms

Direct Data: Information collected directly from a known individual, through registration, intake, or contact.

Inferred Data: Information derived from patterns of digital behavior (searches, social posts, app activity).

Data Desert: A geographic area or demographic group underrepresented in digital data, often due to low connectivity, language barriers, or distrust of digital systems.

Selection Bias: A systematic error that occurs when the people in your data are not representative of the people you are trying to serve.

Triangulation: The practice of cross-referencing multiple independent data sources before drawing conclusions or allocating resources.

Zero-Baseline Principle: The recognition that absence of data about a population does not mean that population has no needs—it may mean they are invisible to your data systems.



Digital data is a tool, not a map. Use it to ask better questions—then go find the answers in the field.