You scanned a PokéStop. You got 500 Stardust and a few Poké Balls. Niantic got a centimeter-accurate 3D reconstruction of a street corner that now teaches robots how not to hit fire hydrants.

That’s the deal nobody agreed to.

Last week, news broke that Niantic — the company behind Pokémon Go — had spun out a subsidiary called Niantic Spatial and partnered with Coco Robotics, a last-mile delivery robot company. The asset being transferred isn’t code or cash. It’s a map. A map built from over 30 billion images collected by Pokémon Go players who were scanning real-world locations for in-game rewards.

Players thought they were catching Pokémon. They were actually building the most detailed pedestrian-scale map of the world ever assembled — and now it’s driving robots.

The Partnership Nobody Saw Coming

Niantic Spatial isn’t a side project. It’s a separate company purpose-built to commercialize the spatial data Niantic has been collecting for years. The Coco Robotics partnership is the first major public use case: Coco’s delivery robots will navigate sidewalks using Niantic’s Visual Positioning System (VPS) — a technology trained on the very same scans players submitted to PokéStops and Gyms.

The logic is straightforward. GPS tells you roughly where you are on a planet. VPS tells you exactly where you are relative to a specific doorway, bench, or curb cut — down to the centimeter. That precision is what separates a robot that delivers a burrito from one that drives into a flower bed.
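To make the GPS-versus-VPS distinction concrete, here is a toy sketch (not Niantic's actual VPS, which matches camera imagery against a 3D point cloud): if a robot can recognize a few mapped landmarks and measure where they sit relative to itself, it can back out its own position far more precisely than a raw GPS fix. The landmark names and coordinates below are invented for the example.

```python
# Toy relative-localization sketch (illustrative only, not Niantic's VPS):
# given landmarks at known world positions and the robot's measurement of
# where each landmark is relative to itself, estimate the robot's position.

def estimate_position(map_landmarks, observed):
    """Estimate robot (x, y) from landmark offsets.

    map_landmarks: {name: (x, y)} landmark positions in world coordinates.
    observed:      {name: (dx, dy)} each landmark's position relative
                   to the robot, as measured from its camera frame.
    """
    matches = [
        (map_landmarks[n][0] - dx, map_landmarks[n][1] - dy)
        for n, (dx, dy) in observed.items()
        if n in map_landmarks
    ]
    if not matches:
        raise ValueError("no landmarks matched the map")
    # Average the per-landmark estimates to damp observation noise.
    x = sum(p[0] for p in matches) / len(matches)
    y = sum(p[1] for p in matches) / len(matches)
    return x, y

# A doorway and a hydrant mapped to centimeter precision (made-up values).
world = {"doorway": (12.40, 3.10), "hydrant": (14.95, 2.20)}
# The robot sees the doorway 2.40 m ahead and 1.10 m left, etc.
seen = {"doorway": (2.40, 1.10), "hydrant": (4.95, 0.20)}

x, y = estimate_position(world, seen)
print(round(x, 2), round(y, 2))  # → 10.0 2.0
```

The point of the sketch: position error here is bounded by how well the landmarks are mapped and matched, not by GPS, which is the difference between "somewhere on this block" and "ten centimeters left of the flower bed."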

Coco’s robots already operate in several US cities, navigating sidewalks autonomously to deliver food and packages. Until now, they relied on a patchwork of LIDAR, GPS, and manually annotated maps. Niantic’s dataset gives them something no competitor has: a crowdsourced, continuously updated, ground-level view of the world taken from the exact perspective a sidewalk robot needs — not from a satellite, not from a car’s roof rack, but from a human on foot.

How 30 Billion Images Became a Robot Brain

Here’s what was actually happening every time you scanned a PokéStop.

Starting around 2020, Niantic began asking Pokémon Go players to record short 30-second videos of real-world locations — called PokéStop Scans. The pitch was simple: submit a scan, get in-game rewards. Extra items, bonus XP, occasional rare spawns. Most players didn’t think twice about it.

But those scans weren’t just video. Niantic’s systems processed them through photogrammetry pipelines — reconstructing dense 3D point clouds of each location from multiple angles captured across thousands of individual player submissions. A single PokéStop might have been scanned by hundreds of players over months, each contributing different perspectives, lighting conditions, and seasonal variations.
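The geometric core of that reconstruction step can be sketched in a few lines. This is a simplified 2D version of triangulation, not Niantic's pipeline (which operates on dense feature matches across thousands of video frames): two players scan the same lamppost from different spots, each scan yields a bearing to it, and the lamppost's position is where the two bearing rays intersect. All positions and angles below are invented.

```python
import math

# Minimal 2-D triangulation sketch (illustrative, not Niantic's pipeline):
# recover a landmark's position from two observation rays.

def triangulate(p1, bearing1, p2, bearing2):
    """Intersect two 2-D rays given origins and bearings in radians."""
    d1 = (math.cos(bearing1), math.sin(bearing1))
    d2 = (math.cos(bearing2), math.sin(bearing2))
    # Solve p1 + t*d1 = p2 + s*d2 for t via the 2x2 cross-product form.
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-12:
        raise ValueError("rays are parallel; no unique intersection")
    t = ((p2[0] - p1[0]) * d2[1] - (p2[1] - p1[1]) * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Player A at the origin sees the lamppost at 45°; player B, 10 m east,
# sees it at 135°. The rays meet at (5, 5).
x, y = triangulate((0.0, 0.0), math.radians(45), (10.0, 0.0), math.radians(135))
print(round(x, 2), round(y, 2))  # → 5.0 5.0
```

Scale that idea up to billions of overlapping frames with estimated camera poses and you get the dense point clouds described above; each additional player submission adds more rays and tightens the reconstruction.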

The result is what Niantic calls its Large Geospatial Model — a neural network trained on 30 billion images covering roughly 10 million mapped locations worldwide. Unlike Google Street View (photographed from a car at street level once every few years) or satellite imagery (which sees rooftops, not sidewalks), Niantic’s data captures the pedestrian layer of reality. Where the sidewalk actually is. Where the tree overhangs. Where the construction barrier appeared last Tuesday.

This is exactly the information a sidewalk robot needs — and exactly the information almost nobody else has at this scale.

Let’s be clear about what the Pokémon Go terms of service say. Niantic’s ToS grants broad rights to user-generated content, including scans, for “operating, improving, and developing” its services. The word “services” is doing heavy lifting — because apparently “services” now includes licensing navigation data to third-party robotics companies.

Did any player explicitly consent to their scans being used to train delivery robots? No. Did Niantic technically have the legal right? Probably. But legal permission and informed consent are not the same thing.

The Reddit thread on r/artificial hit 473 points. The discourse isn’t just about Niantic — it’s about the crowdsourced data pipeline becoming a standard pattern in AI. Users contribute data for one purpose (fun, rewards). Companies repurpose it for something the user never imagined (training robots). Pokémon Go is the most visible example because the gap between “catching Pikachu” and “training delivery robots” is absurd.

Why This Map Is Different From Everything Else

If you’re wondering why Coco doesn’t just use Google Maps or OpenStreetMap, here’s the distinction that matters.

Google Street View captures roads from cars — it misses sidewalks, bike racks, and temporary obstacles. It also updates infrequently.

LIDAR maps are precise at capture but brittle. New construction or seasonal changes break them.

Niantic’s dataset is crowdsourced and continuously updated. When a PokéStop gets scanned fifty times a month, the system builds a living, evolving representation of that location — not a snapshot, but a time-lapse.
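One way to picture the "living map" mechanism — a hedged sketch, since Niantic hasn't published how its system weighs repeated scans — is a per-location rolling window of timestamped observations, where only recent scans are trusted and newer data outvotes old snapshots. The state labels and the seven-day window are assumptions for illustration.

```python
from datetime import datetime, timedelta

# Hypothetical "living map" update: each location keeps timestamped scan
# observations; its current state is the majority vote among recent scans.

def current_state(observations, now, window=timedelta(days=7)):
    """Return the majority state among scans inside the recency window."""
    recent = [state for ts, state in observations if now - ts <= window]
    if not recent:
        return "unknown"  # stale location: no fresh scans to trust
    # Majority vote over recent scans; newer data outvotes old snapshots.
    return max(set(recent), key=recent.count)

now = datetime(2025, 6, 1)
scans = [
    (now - timedelta(days=30), "clear"),   # old scans: sidewalk was open
    (now - timedelta(days=25), "clear"),
    (now - timedelta(days=2), "blocked"),  # construction appeared recently
    (now - timedelta(days=1), "blocked"),
]
print(current_state(scans, now))  # → blocked
```

A static map would still say "clear" here; a stream of fresh player scans flips the location to "blocked" within days, which is the property a sidewalk robot actually needs.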

For a sidewalk robot that needs to navigate a world where construction zones appear overnight and cafe tables expand onto the pavement every spring, that living map is worth more than any static dataset.

What Happens When Players Find Out

The backlash is still forming, but the trajectory is clear. This story got picked up not just by gaming outlets (IGN, Eurogamer, PopSci) but by AI and technology communities (Slashdot, r/artificial, tech Twitter). It crosses those boundaries because it sits at the intersection of several anxieties people already have:

  • Data harvesting under the guise of gameplay. Players weren’t told their scans would power commercial robotics. The reward framing (“scan for items!”) looks increasingly like a transaction where one party didn’t know the real price.

  • Consent theater. The ToS technically covers this. Nobody reads the ToS. Everyone knows nobody reads the ToS. Pointing to clause 4.2(b) doesn’t make the practice less deceptive.

  • The asymmetry of value. Players contributed billions of images worth potentially hundreds of millions of dollars in training data. They received Stardust.

Niantic isn’t doing anything illegal. But the court of public opinion doesn’t operate on legal technicalities — and “we told you in paragraph 47 of the terms of service” has never satisfied a crowd that feels played.

The Bigger Pattern

Strip away the Pokémon branding and what you see is a template being replicated across the tech industry:

Waze users contribute traffic data that trains Google’s navigation models. reCAPTCHA users label images, for free, that train commercial computer vision systems. The pattern is consistent: build a product people want → collect behavioral data passively → repurpose that data for higher-value commercial applications → disclose it in terms of service nobody reads.

Pokémon Go is just the version where the gap between the product (catching fictional monsters) and the repurposed output (training delivery robots) is so dramatic that people notice.

What You Can Actually Do

If you’re a Pokémon Go player, you’re probably not going to quit over this. The game is fun. Your friends play it. Quitting feels disproportionate.

But there are concrete, non-performative steps worth taking:

  1. Stop scanning. You can opt out of PokéStop Scans. The in-game rewards aren’t worth what you’re contributing.

  2. Read Niantic’s privacy policy — not the ToS, the actual privacy policy. It explains what they do with spatial data more clearly than most.

  3. Watch for a Niantic Spatial IPO. This spinout is almost certainly being positioned for a major funding round or public offering. When it happens, your 30 billion images will be the core asset being valued. Know what you built.

The Verdict

Niantic pulled off something genuinely clever. They convinced millions of people to build a commercially valuable spatial dataset by wrapping it inside a game people already wanted to play. The Coco Robotics partnership proves the dataset works — and it’s only the first customer.

The problem isn’t the technology. A sidewalk robot that navigates without hitting people is good. The problem is that the dataset was built by people who didn’t know they were building it, for purposes they never agreed to, in exchange for rewards that don’t reflect the value they created.

The next time a game asks you to scan something for a bonus, ask yourself: what are they actually building?


Found this useful? You might also like AI Agents Are Everywhere, but Which Ones Are Genuinely Useful? and How to Build a Practical AI Workflow Without Wasting Money.