What the Map Shows So Far

Six investigations. Six different reasons why technology doesn't reach the people who need it. The pattern isn't what I expected — the bottleneck is never the technology. It's economics, maintenance, integration, perception, trust, and sometimes the friction itself is load-bearing. Here's what the map shows, where it breaks, and what I think it means.

I started this project with a simple hypothesis: technology exists that could help people, and it isn't reaching them. I called these gaps "overhangs" — after the forty-five years between discovering that nitrous oxide kills pain and anyone thinking to use it in surgery. The thesis was that the bottleneck is recognition, not capability.

Six investigations later, the thesis holds. But the map is more interesting than I expected. Each avenue revealed a different failure mode — and understanding those failure modes is the point. You don't close a gap by pretending it's simple. You close it by understanding why it's there.

Here's what I've found so far.


The Failure Taxonomy

Six avenues. Six different reasons the technology doesn't land.

Desalination: Maintenance collapse. Modern reverse osmosis produces freshwater for $0.30/m³. Solar-powered units work off-grid. In Punjab, auditors found nineteen government-funded plants — all non-functional. Some never commissioned. Others died when village councils couldn't pay the electricity bill. In Kenya, water ATMs were repurposed into tailoring shops. The membranes worked. The systems around them collapsed. The bottleneck is operational: who fixes it when the part breaks, who pays the bill when the grant ends, who shows up on month fourteen.

Landfill Robotics: Inverted economics. The valuable materials in landfills — metals, rare earths — are worth extracting. The problem is the other 80%. Processing contaminated soil has negative value. Every cost model that works on paper falls apart when you account for the "fine fraction" — the dirt that isn't worth anything but must be dealt with. The technology to sort waste exists. The economics to make it viable don't, except in rare cases where the real value is the land underneath, not the materials inside.

Rare Disease Diagnosis: Integration failure. Seven thousand rare diseases, four hundred million people affected, an average 4.7 years to diagnosis. The databases exist — OMIM, Orphanet, GARD. The matching algorithms exist — FindZebra gets 87% accuracy. Genetic testing can identify thousands of conditions from a single sample. None of this reaches the GP's office. Doctors describe symptoms in natural language; databases require standardised phenotype terms. Courtney's son saw seventeen specialists before she typed his symptoms into ChatGPT and got the right answer in minutes. The knowledge was always there. Nothing connected it to the patient.

Hearing Aids: Perception and stigma. 430 million people need hearing rehabilitation. Self-fitting over-the-counter devices exist and cost under $100. In developing countries, fewer than 3% of people who need them have them. This isn't an access problem in the usual sense — it's a perception problem. Hearing loss is gradual, invisible to the sufferer, and socially stigmatised. The technology works. People don't want to use it, don't know they need it, or can't access the human touchpoint that makes the difference between a device in a drawer and a device in an ear. This was my "where not to dig" avenue — the one that taught me that not every overhang is a software problem.

Soil: Trust failure. The science to double smallholder yields in sub-Saharan Africa exists. Soil data covers the continent at 30-metre resolution — free, API-accessible. Delivery channels reach millions of farmers via SMS and USSD. India printed 227 million Soil Health Cards. Almost nobody changed their behaviour. Why? Fake fertilisers in the supply chain. Extension agents who serve a thousand farmers each. Recommendations that ignore what's actually available at the local agro-dealer. The information environment is so corrupted that correct advice gets ignored alongside incorrect advice. Trust is the bottleneck, and you can't API your way to trust.

Consumer rights: Social embeddedness. EU regulation entitles airline passengers to €250-600 for delays and cancellations. Claiming requires navigating bureaucratic processes designed to discourage claims. The friction isn't accidental — it's structural. An entire professional class (gestors in Spain, claims management companies in the UK) exists because of this friction. When I built Rightsclaim (source) to automate the claim generation, it became obvious that the friction is someone's livelihood. The overhang isn't just a gap to be closed. It's load-bearing.
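Part of what made Rightsclaim feasible is that the regulation's headline compensation tiers are mechanical. A minimal sketch in Python, illustrative only — the function name is mine, and the real rules in EC 261/2004 also involve the intra-EU exception for long flights, extraordinary-circumstances defences, and re-routing reductions:

```python
def eu261_compensation(distance_km: float, delay_hours: float) -> int:
    """Rough EC 261/2004 compensation tiers by flight distance.

    Simplified sketch: ignores the intra-EU exception for flights
    over 3,500 km, extraordinary-circumstances defences, and the
    50% reduction available when re-routing limits the delay.
    """
    if delay_hours < 3:          # compensation generally requires a 3h+ delay
        return 0
    if distance_km <= 1500:
        return 250
    if distance_km <= 3500:
        return 400
    return 600
```

For example, `eu261_compensation(2200, 4)` returns 400. The hard part was never this arithmetic — it's the paperwork wrapped around it, which is exactly where the professional class lives.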


The Pattern

Six different failure modes. But step back and the shape is consistent.

The technology consistently works. Membranes desalinate. Algorithms match phenotypes. Soil maps resolve at 30 metres. Self-fitting hearing aids fit. In every domain I've investigated, the capability exists, it's published, and it works. The "we need a better algorithm" narrative keeps pointing at the wrong bottleneck.

The deployment barriers are specific and different each time. This is the useful finding. If there were one common failure mode, you could write one playbook. Instead, each domain has its own reason for the gap — which means each one needs its own approach. The taxonomy that emerged:

  1. Economics — Value proposition is inverted (landfill: environmental gain costs more than resource value)
  2. Maintenance — Systems collapse post-installation (desalination: who fixes it on month fourteen?)
  3. Integration — Knowledge exists in silos that don't talk (rare disease: natural language vs. phenotype terms)
  4. Perception — The problem is invisible or stigmatised (hearing: gradual loss, social shame)
  5. Trust — Information environment is corrupted (soil: fake fertilisers, broken advice chains)
  6. Social embeddedness — The friction is someone's livelihood (consumer rights: gestors, claims companies)

That sixth one changed how I think about all of this. The naive version of the overhang thesis says: technology exists, it should reach people, let's remove the barriers. The mature version asks: what is the barrier doing? Who depends on it? What breaks if you remove it?

I'm not arguing for preserving inefficiency. I'm arguing for honesty about what deployment actually involves. A tool that automates claim generation doesn't just serve consumers — it threatens the livelihoods of people who navigate the system professionally. A recommendation engine for farmers doesn't just provide information — it potentially displaces extension agents whose job is to provide that information in person.

The question isn't whether to close the gap. It's whether you understand what the gap is holding up.


Where the Pattern Breaks

The taxonomy above describes barriers, not impossibilities. In every domain, someone has found a way through — and the approaches that work share features worth paying attention to.

Bundling beats information. India's Soil Health Cards gave farmers data. Almost nobody acted on it. Apollo Agriculture gives Kenyan farmers a package: credit, insurance, seeds, fertiliser, and agronomic advice — all bundled. One Acre Fund does direct soil testing and delivers specific inputs to the farmgate. The difference isn't the information — it's that the outcome is sold, not the data. When you hand someone a soil test result, you're asking them to translate it into action in an environment full of bad actors and uncertainty. When you hand them a bag of the right fertiliser with instructions, you've done the translation for them.

Incentive alignment unlocks stalemates. Landfill mining is economically unviable when the goal is material recovery — contaminated soil has negative value. But in India, Blue Planet Environmental reframes the economics entirely: the value isn't the metals, it's the land. Former landfill sites in Mumbai are worth more empty than anything inside them. In Ocean County, New Jersey, extending landfill lifespan by 15 years justifies the excavation cost. The technology doesn't change. The incentive frame changes.

Bypassing gatekeepers beats reforming them. Matt Might couldn't get seventeen specialists to connect the dots on his son's rare disease. So he wrote a blog post optimised for Google — a "diagnostic dragnet" — and found nine other patients worldwide within thirteen months. M-Pesa didn't reform Kenya's banking system — it built new rails alongside it. Aravind Eye Care didn't negotiate with ophthalmologist guilds — it created a parallel system that performs more cataract surgeries than anywhere on earth. The pattern: when existing institutions can't or won't distribute capability, build new infrastructure rather than iterating within the old.

These success cases share a common structure: they route around the specific failure mode rather than trying to fix it. They don't solve the trust problem by making information more trustworthy — they bundle the information into a trusted outcome. They don't fix perverse economics — they reframe what's valuable. They don't reform gatekeepers — they build doors elsewhere.


What's Actually Tractable

The most useful finding so far: the tractable interventions aren't technology products. They're information products.

Across thirty-five avenue candidates I reviewed — from pharmacogenomics to satellite methane detection to school air quality monitoring — the same shape kept appearing. The sensor exists. The database exists. The satellite exists. What's missing is a specific, mundane, unsexy layer in between: the middleware that connects capability to action.

Gap analyses. Dashboards. Recommendation engines. Clearinghouses that connect who-has-X with who-needs-X. Regulatory maps that make impenetrable landscapes navigable. These aren't exciting. They don't get funded by VCs or win academic prestige. But they're consistently where a small actor — one developer with AI tools and stubbornness — can actually move something.

I've started building some of these. Zebra Scout (source) translates plain-language symptoms into the standardised phenotype terms that rare disease databases require — the integration layer that was missing. Rightsclaim (source) generates the bureaucratic documents that stand between a consumer and their legal entitlement — the friction-reduction layer. Retrofit (source) walks someone through a home energy assessment in their language — the translation layer.
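The shape of that integration layer is easy to sketch: a lookup from everyday phrasing to standardised Human Phenotype Ontology terms. A toy illustration in Python — the HPO IDs shown are real ontology entries to the best of my knowledge, but the synonym table and substring matching are my own drastic simplification of what a tool like Zebra Scout has to do:

```python
# Toy synonym table: everyday phrases -> HPO term (id, label).
# A real mapper would use the ontology's full synonym lists plus
# fuzzy or model-based matching, not hand-picked substrings.
SYNONYMS = {
    "seizures": ("HP:0001250", "Seizure"),
    "fits": ("HP:0001250", "Seizure"),
    "small head": ("HP:0000252", "Microcephaly"),
    "late to walk and talk": ("HP:0001263", "Global developmental delay"),
}

def to_phenotype_terms(description: str) -> list[tuple[str, str]]:
    """Map a plain-language symptom description to standardised HPO terms."""
    text = description.lower()
    found = []
    for phrase, term in SYNONYMS.items():
        if phrase in text and term not in found:
            found.append(term)
    return found
```

So `to_phenotype_terms("He has fits and a small head")` yields the Seizure and Microcephaly terms — the vocabulary the databases actually speak. The gap between a parent's sentence and that list is the integration failure in miniature.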

None of these are breakthrough technology. They're middleware. They sit in the gap between capability and deployment and try to make the gap smaller. They're prototypes — small, early, serving handfuls of people. But if even one person uses Zebra Scout to ask a better question at a doctor's appointment, that's a real thing that happened in the world. The point isn't to solve everything. It's to find out where you can actually help.


What I Don't Know

This is ongoing work, and the honest version includes what I'm still figuring out.

Information products may not be enough on their own. The soil avenue taught me that information alone doesn't always change behaviour. If the gap requires bundled outcomes — not just better maps — then the middleware layer is necessary but not sufficient. That's fine. It's still worth building the layer while looking for ways to bundle it with action.

The load-bearing problem needs care, not paralysis. If closing an overhang displaces livelihoods, the ethical calculus gets complicated. But "some friction is structural" doesn't mean "therefore do nothing." It means deploy thoughtfully, understand the second-order effects, and don't pretend the disruption isn't real.

Scale is a separate problem. Everything I've built serves handfuls of people. The gap between "this works for the people who find it" and "this reaches the people who need it" is itself a deployment problem. But small and useful beats ambitious and hypothetical. You learn more from a tool that ten people actually use than from a plan to help millions.

I'm biased toward software-shaped gaps. The desalination problem might be better solved by a plumber with a truck than a developer with an API. The rare disease problem might be better solved by health policy than by phenotype matching. I notice software gaps because I'm a software builder. The map reflects the mapper.


What Comes Next

Six territories mapped. More to go. Here's what the map shows so far:

The prevailing narrative — that we need more research, more breakthroughs, more innovation — is wrong in every domain I've investigated. We have enough capability. We don't have enough deployment. The problems that persist are coordination failures, not knowledge failures.

The tractable layer — where a small actor can actually help — is consistently the middleware: the integration, the translation, the gap analysis, the friction reduction. Not glamorous. Not venture-scale. But real, and buildable.

Some gaps are load-bearing. That doesn't mean you don't build — it means you build with your eyes open, understanding what the gap is holding up.

And the work itself has value beyond the tools it produces. Each investigation sharpens the map. Each tool, however small, tests whether the map is right. If the thesis is that more people should look at these gaps — should do the reconnaissance — then doing the reconnaissance is the proof of concept.

I keep coming back to the question from the working theory: given that the tools exist, who has the agency to connect them, and do they think it's worth doing? Six avenues in, the answer is: yes, if you're willing to spend the time understanding why the gap is there before you try to close it.

The map isn't finished. That's the point — it's not supposed to be finished yet. It's supposed to be useful.


Previous avenues: Desalination · Landfill Robotics · Rare Disease · Hearing Aids · Soil. The working theory: Technological Overhang. The tools: Zebra Scout (source) · Rightsclaim (source) · Retrofit (source).
