Milo is a yellow lab with a goofy smile and a red collar. As of February 8th, 2026, he was the most important dog in America. He was also entirely fictional. Amazon invented him for a thirty-second Super Bowl ad, gave him a family that loved him, and built an entire neighborhood of cameras to bring him home. A hundred million people watched the country come together to find a dog that was never actually lost, which is a remarkable achievement in collective emotion if you think about it for even five seconds.
Meanwhile, in Tucson, an 84-year-old woman had been missing for over a week, and she had a real camera on her real door, and for a while it didn’t help at all.
Thirty Seconds of Television
Ring’s Super Bowl ad was the company’s first, and the pitch was disarmingly simple. Your dog goes missing, you upload a photo to the Ring app, and Ring cameras in your neighborhood that have opted into a feature called Search Party start scanning for a match. AI-powered, real time, automated. The ad ends with Jamie Siminoff, Ring’s founder, telling you to “be a hero in your neighborhood.” There’s a kid hugging a dog and swelling music and it works exactly the way thirty seconds of television is supposed to work.
Except what the ad actually shows you, if you can manage to watch it without the soundtrack, is a networked AI-powered visual surveillance system activating across an entire residential area. There’s a bird’s-eye shot where every camera in the neighborhood lights up simultaneously, like a grid powering on. They showed you the architecture of the thing. They just made sure you were crying when you saw it.
And none of it was dishonest, which is what makes it so effective. The ad is a perfectly accurate depiction of what the technology does: a photo goes in, cameras activate, AI matches a visual profile against live footage, a target gets flagged, boxed, tracked, and reported. The only creative decision was what to put in front of the cameras first, and they chose the one thing on earth nobody could argue with, because objecting to finding Milo would make you sound like a monster, and everybody involved in making that ad knew it.
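For readers who want the mechanics made concrete, the pipeline the ad dramatizes can be sketched in a few lines. This is a hypothetical toy, not Ring’s actual Search Party implementation: the function names, the threshold, and the stand-in “embedding” are all invented for illustration. The shape, though, is the shape the ad shows: a reference profile goes in, every opted-in camera compares what it sees against it, and anything over a similarity threshold gets flagged with a bounding box.

```python
# Toy sketch of a photo-to-match pipeline. All names are hypothetical;
# Ring has not published Search Party's design.
from dataclasses import dataclass

@dataclass
class Detection:
    camera_id: str
    box: tuple    # (x, y, w, h) bounding box in the camera frame
    score: float  # similarity to the uploaded reference photo

def similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb)

def run_search(reference_embedding, cameras, embed, threshold=0.85):
    """For each opted-in camera, embed each detected animal in the live
    frame and flag any whose similarity crosses the threshold."""
    matches = []
    for cam in cameras:
        for box, crop in cam["detections"]:  # upstream detector output
            score = similarity(reference_embedding, embed(crop))
            if score >= threshold:
                matches.append(Detection(cam["id"], box, score))
    return matches

# Demo with made-up vectors standing in for real image embeddings.
ref = [1.0, 0.0]
cameras = [
    {"id": "cam-3", "detections": [((40, 60, 120, 90), [0.9, 0.1]),
                                   ((10, 10, 50, 40), [0.0, 1.0])]},
]
hits = run_search(ref, cameras, embed=lambda v: v)
print([(h.camera_id, round(h.score, 2)) for h in hits])  # → [('cam-3', 0.99)]
```

Note what the sketch makes plain: nothing in the matching logic knows or cares that the target is a dog. The reference photo is just a vector.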
The Camera on the Door
Nancy Guthrie is 84 years old and lives alone in the Catalina Foothills north of Tucson. Her daughter is Savannah Guthrie, the co-anchor of the Today show. She was last seen on the evening of January 31st and reported missing on February 1st, a week before the Super Bowl.
She had a single Google Nest doorbell camera. Her kids had set it up so they could check that she got home safe, which is one of those small acts of love that turns out to matter enormously in retrospect. She didn’t pay for Nest Aware, the subscription tier that saves footage to the cloud. Without that subscription, the camera processes footage on Google’s servers briefly and then discards it, typically within a few hours.
Sometime in the early morning of February 1st, the power to her home was cut. The Nest camera was a wireless model with a small battery, so it kept running even after the house went dark. According to the Pima County Sheriff, it disconnected from Wi-Fi at approximately 1:47 AM. At around 2:12 AM, the camera’s software detected motion.
The Pima County Sheriff’s Department said the camera footage couldn’t be recovered. No subscription, no saved video, data overwritten. End of story.
Then the FBI got involved, and the footage appeared.
On February 11th, FBI Director Kash Patel posted six black-and-white photos and three video clips on X. The images show a masked individual approaching Guthrie’s front door, armed, wearing gloves, sneakers, and a black Ozark Trail backpack. In one clip, the person raises a gloved hand to cover the camera lens. In another, according to CBS News, they appear to be holding a flashlight in their mouth while using nearby vegetation to further conceal the device. Patel said law enforcement had “uncovered these previously inaccessible new images” and that the video was “recovered from residual data located in backend systems.”
Residual data located in backend systems. In plain language, when Google’s servers processed the footage from Guthrie’s camera they marked it for deletion, but marking something for deletion and actually deleting it are two very different operations. Modern cloud infrastructure runs on distributed systems where data gets copied across multiple servers for redundancy, cached at various processing stages, and queued for analysis before it’s queued for removal. “Deleted” really just means “scheduled to be overwritten,” and in a distributed system, garbage collection doesn’t happen instantaneously or uniformly. Data lingers on servers the way dust settles in a house you think you’ve cleaned. If someone shows up with a warrant before every copy has been swept away, the data is still sitting there, waiting.
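Here is a minimal model of that gap between “marked for deletion” and “actually deleted.” Everything below is a toy, assuming a generic replicated store with tombstones and lazy garbage collection, not Google’s actual architecture: the point is only that a delete writes a marker, while the bytes survive on every replica the garbage collector hasn’t visited yet.

```python
# Toy model of soft deletion in a replicated store. Hypothetical design,
# not Google's backend: illustrates tombstones + lazy garbage collection.
from dataclasses import dataclass, field

@dataclass
class Replica:
    blobs: dict = field(default_factory=dict)     # key -> bytes
    tombstones: set = field(default_factory=set)  # keys marked for deletion

class DistributedStore:
    def __init__(self, n_replicas=3):
        self.replicas = [Replica() for _ in range(n_replicas)]

    def put(self, key, data):
        for r in self.replicas:   # redundancy: every replica gets a copy
            r.blobs[key] = data

    def delete(self, key):
        for r in self.replicas:   # "delete" only writes a marker
            r.tombstones.add(key)

    def gc(self, replica_index):
        """Garbage-collect ONE replica; the others still hold the bytes."""
        r = self.replicas[replica_index]
        for key in r.tombstones:
            r.blobs.pop(key, None)
        r.tombstones.clear()

    def recover(self, key):
        """What a warrant-backed backend search does: check every replica."""
        for r in self.replicas:
            if key in r.blobs:
                return r.blobs[key]
        return None

store = DistributedStore()
store.put("clip-0201", b"motion event")
store.delete("clip-0201")          # user-facing view: footage is gone
store.gc(0)                        # GC has only swept replica 0 so far
print(store.recover("clip-0201"))  # → b'motion event', still recoverable
```

The footage is “deleted” the moment the tombstone lands, and recoverable until the last replica is swept. The window between those two events is where the FBI went looking.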
One cybersecurity expert told CBS News that Google’s servers may have detected the camera being tampered with and automatically flagged that last piece of footage for extended retention—a kind of tamper mode that could keep data alive well past its normal deletion window. Google hasn’t confirmed this, but the implication is significant: “deleted” may never have meant what any of us thought it meant.
The Ladder
The most important detail in the Guthrie story is the one being treated as a footnote. Local law enforcement said the footage was gone. The FBI recovered it. That gap is structural. It’s how the system was designed to work.
Think of it as a ladder. At the bottom rung, the individual homeowner: you can see your own porch, review your own footage if you pay for a subscription. One step up, the neighbors, who can see what people in their area are posting through Ring’s Neighbors app or voluntarily share footage with local police. Above that, local law enforcement, who can submit requests through official channels and serve warrants. And then above that, federal law enforcement, which can go to a company’s backend infrastructure with warrants, subpoenas, and national security letters, and extract data that everyone below them was told didn’t exist.
This is what tends to get lost in the usual debate between the “nothing to hide” crowd and the conspiracy crowd, both of whom are arguing about the wrong thing. It’s a gradient. Tiers of access to the same data, arranged by power, and you don’t get to know which tier applies to you until the moment someone decides to look.
The hierarchy becomes more concrete when you examine the infrastructure that almost expanded it further.
In October 2025, Ring announced a partnership with Flock Safety. For those who haven’t had the pleasure, Flock makes automated license plate readers: solar-powered cameras installed on streets and in neighborhoods all across the country, photographing every vehicle that passes by. Software reads the plate and logs it in a searchable, centralized database with the plate number, make, model, color, time, and location. The system checks every plate against the National Crime Information Center and local police watchlists in real time. Flock operates in over five thousand communities and performs more than twenty billion vehicle scans a month.
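The real-time check itself is conceptually simple, which is part of why it scales to billions of scans. Here is a hedged sketch of a plate-versus-hotlist comparison; the hotlist names, record fields, and normalization rule are assumptions for illustration, not Flock’s actual system.

```python
# Hypothetical sketch of the real-time check an ALPR network performs:
# normalize each scanned plate, compare it against every hot list.
import re
from datetime import datetime, timezone

def normalize(plate: str) -> str:
    """Strip separators and uppercase, so 'abc-1234' matches 'ABC1234'."""
    return re.sub(r"[^A-Z0-9]", "", plate.upper())

def check_scan(plate, camera_id, hotlists):
    """Return one alert per watchlist the plate appears on."""
    norm = normalize(plate)
    return [
        {"plate": norm, "camera": camera_id, "list": name,
         "seen_at": datetime.now(timezone.utc).isoformat()}
        for name, plates in hotlists.items()
        if norm in plates
    ]

# Invented example lists; real systems check NCIC and local watchlists.
hotlists = {
    "ncic_stolen": {"ABC1234"},
    "local_watch": {"XYZ9876", "ABC1234"},
}
alerts = check_scan("abc-1234", "cam-017", hotlists)
print([a["list"] for a in alerts])  # → ['ncic_stolen', 'local_watch']
```

A set-membership test per list, a timestamp, a camera ID. The sophistication isn’t in the lookup; it’s in the scale of the network feeding it.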
The Ring-Flock partnership would have integrated Flock’s license plate network with Ring’s Community Requests feature, the system that lets police departments ask Ring users to share doorbell footage. If it had launched, home camera footage from millions of doorbells would have been feeding into the same ecosystem as street-level vehicle tracking data, all accessible to participating law enforcement through a single pipeline.
Then came the Super Bowl ad, and the backlash. The ad put Ring’s surveillance capabilities on a screen in front of 125 million people at the exact moment the Flock partnership was already drawing fire from the EFF, the ACLU, and privacy advocates alarmed by reports that local police had been searching Flock’s camera network on behalf of ICE. On February 12th, Ring announced the partnership was done.
The coverage mostly framed this as a victory, which is understandable and also somewhat beside the point. The Flock partnership never actually launched. No footage was ever shared between the two systems. Ring still partners with Axon, another major police technology company. Community Requests is still active. The mechanism for sharing doorbell footage with law enforcement remains fully intact. The most controversial brand name got peeled off the side of an arrangement that is otherwise exactly what it was before. The infrastructure didn’t change. The press release changed.
The Proof of Concept
There’s something in the choice of a dog specifically that goes deeper than emotional manipulation, though it is obviously that too.
The entire premise of Search Party is that a living thing moving freely through a neighborhood is a problem to be solved. Milo is out there somewhere, untracked, unaccounted for, and that constitutes an emergency. When you frame it that way—as a body in unmonitored motion being treated as a body out of place—the dog starts to feel less like a mascot and more like a proof of concept.
The scholar Simone Browne wrote about this in Dark Matters: On the Surveillance of Blackness, and her work traces a line that most surveillance discourse is too polite to draw. In colonial New York, there were laws that required Black and Indigenous people to carry lit candles after dark whenever they walked through the city. They were called lantern laws, and the stated purpose was public safety, and the actual function was that an entire category of people was required to make themselves visible, to carry their own light source so that they could be identified and monitored by anyone who wanted to look. Browne argues that this is the template. Modern surveillance technologies, from CCTV to facial recognition to the Ring doorbell on your neighbor’s porch, inherit a logic that was built from the very beginning around the idea that certain bodies moving through public space need to be seen, identified, and accounted for. The technology changes constantly. The assumption underneath it doesn’t.
In the American South, slave patrols were among the first organized policing structures in the country, and they existed for one specific purpose: to monitor the movement of Black people through public space, to stop them and demand proof that they were where they were supposed to be. Growing up in the Mississippi Delta, you don’t have to go looking for this history. It’s in the layout of the towns, the architecture of visibility built into the geography itself long before anyone put a camera on a doorbell. GPS ankle monitors, no-fly lists, sex offender registries, the entire framework of “papers, please”: the through-line across centuries is remarkably consistent. A body moving through space without being tracked is a body the system treats as a threat.
Ring took that entire history and put a wagging tail on it. The lost dog is the friendliest possible version of the same underlying logic: that the natural state of a neighborhood is one in which everything moving through it is identified and accounted for, and anything unidentified is a problem the cameras should solve. The ad doesn’t just normalize the technology. It normalizes the premise.
Milo was never the point. Milo was just the reason you agreed to turn the cameras on.
Carry Your Own Lantern
On February 9th, one day after Ring’s ad aired, Discord announced that starting in March, every user on the platform worldwide would be defaulted to a restricted, teen-appropriate experience unless they verified their age. The options: let the app scan your face, or upload a photo of your government-issued ID. If you don’t comply, you lose access to servers, channels, direct messages, and features you may have been using for years. Discord has over 200 million monthly users. This is one of the largest rollouts of mandatory identity verification in the history of consumer technology.
Discord framed it as child safety. “Teen-by-Default,” they called it, announced on Safer Internet Day. And just like Ring’s ad, the justification is real. Children do need protection online. What the mechanism actually requires, though, is that you make yourself visible to the platform, by face or by papers, as a condition of full participation. If you can’t or won’t produce identification, your access gets restricted. You are, in the most literal sense possible, being asked to carry your own lantern.
Here is the detail that makes the whole thing feel like a parable someone would invent if they were trying to be heavy-handed about it, except it actually happened. On October 3rd, 2025, four months before this global rollout, Discord disclosed that hackers had breached 5CA, a third-party vendor the company used for age verification appeals. The attackers stole at least 70,000 government ID images. Discord’s response to a breach that exposed tens of thousands of identity documents collected through age verification was to expand age verification to every user on the planet.
The Electronic Frontier Foundation pointed out that Discord didn’t have to do this globally, for every user, in jurisdictions that don’t require it. Other platforms have fought age verification laws in court. Reddit sued. Pornhub blocked entire states rather than comply, which is a sentence that reveals something interesting about relative corporate courage. An EFF analyst referenced Timothy Snyder’s On Tyranny and argued that Discord was voluntarily imposing surveillance infrastructure that most of its users aren’t legally subject to, because it’s cheaper to build one system for everyone than to fight the laws country by country.
The EFF found that about fifteen million adult U.S. citizens don’t have a driver’s license. Another 2.6 million have no government-issued photo ID at all. Black and Hispanic Americans are disproportionately less likely to have current identification. Undocumented immigrants often can’t obtain state IDs. For the 43% of transgender Americans who lack ID reflecting their correct name or gender, age verification forces an impossible choice between submitting documents with a dead name or losing access to their community. The people most likely to be locked out are, consistently, the same populations that Browne traces through the history of surveillance: the people the system was designed to watch in the first place.
Legibility
There’s a book that explains the deeper pattern better than any of the Super Bowl discourse has managed, and it was written almost thirty years ago by a political scientist and anthropologist named James C. Scott.
Seeing Like a State is about one question that has obsessed governments for as long as governments have existed: how do you make a population visible enough to manage? Scott calls this legibility—the process of taking something complex, local, and organic and reorganizing it into something the state can read, count, and control.
Take surnames. Before the modern era, most people didn’t have fixed last names in any modern sense. Naming conventions were local and fluid. You might be known by your father’s name, or your occupation, or a nickname, and these shifted across generations because they were meant to be useful to the community—which is a different thing entirely from being useful to a tax collector hundreds of miles away. But states need fixed identities. You can’t tax someone you can’t find twice, and you can’t conscript someone whose name changes every generation. So governments imposed standardized, permanent, hereditary surnames. The Spanish colonial administration distributed a catalog of approved surnames to entire provinces in the Philippines, literally assigning names from a book so the population could be sorted, indexed, and made administratively readable.
Or take forests. Before scientific forestry emerged in 18th-century Prussia, a forest was an impossibly complex, layered ecosystem that served dozens of functions for the people who lived near it. The state looked at that same forest and saw a mess it couldn’t measure or efficiently extract revenue from. So it invented the managed forest: neat rows of single-species trees, optimized for timber yield, stripped of everything that wasn’t legible from a ledger. The first generation produced more timber than the wild ones ever had. But within a generation the monocultures began to fail, because all the complexity that had been eliminated turned out to be what made the forest a forest.
Scott traces this pattern across dozens of cases: grid-planned cities that turned out to be perfectly unlivable, agricultural programs that destroyed the farming practices they were supposed to modernize. The architecture is always the same. A state decides the existing arrangement is too messy to govern efficiently, imposes a scheme of simplification from above, and in the process destroys the thing that made the existing arrangement work. The scheme almost always arrives wearing something friendly: efficiency, modernization, progress, public safety.
The Product
Scott was writing about states — governments with centralized, identifiable power. The version of legibility we’re living inside now is one he didn’t quite anticipate, because the pattern succeeded so completely that it evolved.
The legibility project got privatized.
Nobody experiences buying a Ring doorbell as submitting to a regime of visibility. They experience it as getting an alert when a package arrives. Nancy Guthrie’s family set up her camera as an act of love. The tools of legibility have been so completely folded into consumer life that they’ve stopped registering as tools of legibility at all. They feel like convenience. They feel like care. They feel like a doorbell that happens to have a camera in it.
And the entity doing the seeing is no longer cleanly the state. Amazon builds Ring cameras. Google builds Nest cameras. Flock aggregates license plates. Axon equips the police. Oracle runs backend infrastructure for federal agencies. Palantir integrates with law enforcement databases. These aren’t one unified system. They’re parallel ecosystems, each with its own data, its own terms of service. They all connect to the same law enforcement pipeline, though, through contracts, procurement agreements, and memoranda of understanding that most people will never read. The access hierarchy doesn’t live inside any single company. It lives in the spaces between them.
The state doesn’t need to build a panopticon anymore. The surveillance infrastructure gets built by private companies, funded by consumers, installed voluntarily on tens of millions of homes, and pointed outward at public streets, and then when the state needs to see something, it asks. Or it issues a warrant. Or, as in the Guthrie case, it sends the FBI to compel data from backend servers that everyone else had been told were empty.
The Korean-German philosopher Byung-Chul Han calls this psychopolitics. The old model of power forced you to be watched. The new power seduces. It doesn’t need to force you to install a camera because it’s figured out how to make you want one, and Han argues this is actually more totalizing because coercion always produced resistance. People knew they were being watched and they pushed back. The seductive model produces compliance that doesn’t recognize itself as compliance. You’re not submitting to surveillance. You’re setting up your new doorbell.
Han’s model has a limit, though, and Discord shows you exactly where it is. For years, Discord seduced. Then one day it told 200 million people: submit your face or your papers, or we restrict what you can do here. The seduction gave way to the demand, and the demand looked exactly like the lantern laws Browne described, just routed through a vendor partner instead of a colonial magistrate. The power doesn’t stay invisible forever. Eventually, the product tells you what it actually needs from you.
The Loop
Which brings us back to the ad, and back to Nancy Guthrie. Scott was describing states that wanted to make populations visible enough to tax and conscript. Milo’s cameras, though, aren’t just recording. They’re scanning, matching, anticipating. And the footage from Guthrie’s door wasn’t just stored—it was processed through systems sophisticated enough that the FBI could recover images from data everyone else said was gone.
In 2019, Shoshana Zuboff published The Age of Surveillance Capitalism, and her argument takes the whole thing somewhere Scott never went. Scott’s states wanted to see you so they could manage you. Zuboff argues that the companies building today’s surveillance infrastructure want to see you so they can predict you, and eventually shape your behavior before you’ve even decided what you’re going to do.
Her argument is that what Google and Amazon and the major tech platforms have built is a new economic logic in which human experience itself—what you look at, where you walk, what you say near your doorbell camera, how long you linger on a screen—gets converted into behavioral data. That data becomes raw material for prediction products sold to whoever wants to know what you’re going to do next. The user isn’t the customer. The user is the mine.
Consider Ring’s Neighbors app. You buy a camera because you want to see your porch. Ring gives you Neighbors, a feed of crime alerts and suspicious activity from your whole area, curated by algorithm and pushed to your phone. Ring is taking the surveillance data generated by millions of cameras and feeding a filtered version of it back to the same people who generated it. And what does that feed do to you? It makes you more alert, more cautious, more likely to check the app again, more likely to share when the police ask, more likely to tell your neighbor they should get a camera too. The version of your neighborhood that the Neighbors app shows you is not the neighborhood you actually live in. It’s a highlight reel of everything that went wrong within a five-mile radius, arriving on your phone in real time, all day, every day.
So your behavior changes. You stop letting the kids walk to the park alone. You upgrade to the subscription because now you want your footage saved. And every one of those changes gets fed back into the system as more data, which makes the predictions more accurate, which makes the alerts more targeted, which makes you more anxious, which makes you more dependent on the cameras. The anxiety that justifies the surveillance is itself a product of the surveillance, and that’s the business model. The loop closes and it does not reopen, because every point on the loop is a revenue event for someone.
The Trade
There’s one more thing to sit with, and it’s the thing that makes all of this genuinely difficult rather than merely alarming. The easy version of this essay would just be “surveillance bad,” and that’s not what’s going on here.
Before Ring and Nest and Flock, home security was something rich people had: monitored alarm systems with monthly contracts that cost more than some people’s phone bills, or private security patrols in gated communities. A Ring doorbell costs $99. A Nest camera costs even less. For the first time in history, a single mother in an apartment complex and a family on a suburban cul-de-sac and an elderly woman living alone in Tucson all have access to the same basic security infrastructure, and whatever you think about where that data goes, the democratization of that access is not nothing.
Ring cameras have provided critical evidence in cases ranging from porch theft to kidnapping to the shooting near Brown University in December 2025, where seven neighbors responded to Community Requests within hours, sharing 168 videos. Flock’s readers have helped recover stolen vehicles. Insurance companies offer discounts to homeowners with doorbell cameras. For communities historically underserved by police — where a 911 call might take forty-five minutes to get a response — a camera that records what happened at least gives you something to hand to a detective when they finally show up.
These are real benefits. People who buy these cameras are making a rational calculation that the security is worth whatever abstract privacy they’re giving up, especially when the alternative is no security at all. The idea that everyone who owns a Ring doorbell has been tricked into building a surveillance state is condescending and wrong.
The problem has never been the justification. The problem is that the infrastructure built to serve the justification doesn’t disappear when the justification is satisfied. The cameras that find Milo don’t turn off when Milo comes home. The backend servers that held Nancy Guthrie’s footage don’t stop holding yours just because her case gets resolved. The $99 camera that gives a single mother peace of mind also gives whoever sits at the top of the access hierarchy a feed of everyone who walks down her street, and that variable isn’t listed on the box.
Every expansion of this infrastructure comes attached to an emotional justification so compelling that questioning the infrastructure feels like opposing the justification. You can’t worry about Search Party without sounding like you hate dogs. You can’t question the Guthrie footage without seeming callous toward a missing grandmother. You can’t push back on age verification without being the person who doesn’t care about kids online. The human need that the system serves becomes the shield the system hides behind, and the infrastructure grows while everyone argues about the dog.
The Modern Forest
An estimated 27% of American households now have a doorbell camera, and that number increases every year. We are building, voluntarily, the most comprehensive visual surveillance network in human history, motivated by the entirely genuine desire to feel safe and keep the people we love safe. The legibility project that Scott spent four hundred pages warning about is functionally complete, and it arrived as a product. It arrived as care. It arrived as a $99 doorbell and a thirty-second ad about a dog that never existed, and what we traded for the feeling of safety is a kind of complexity we might not be able to name yet: the complexity of a life lived without every movement being recorded, processed, and made available to a hierarchy of watchers whose full shape we can’t see.
There may not be a word yet for what was actually exchanged. The ordinary, unremarkable experience of walking down your own street without being part of anyone’s dataset. The ability to move through the world without being read.
As for Milo: yes, everyone saw Milo. Every camera in the neighborhood saw him. That was the whole point. The system worked exactly as designed. The dog was found, the kid stopped crying, the music swelled, and a hundred million people felt the warm glow of a problem solved.
The cameras didn’t turn off when Milo came home. They’re still on. They’re still watching. And the question was never really whether anyone had seen Milo.
The question is who else they saw while they were looking.
For the deeper infrastructure story, start with “Who TF Am I Working For??” Part 1.
Sources Cited
Nancy Guthrie Disappearance & FBI Footage Recovery
CBS News – reporting on FBI recovery of Nest doorbell footage, cybersecurity expert comments on tamper mode and extended retention — includes “residual data located in backend systems” detail.
https://www.cbsnews.com/news/cybersecurity-experts-nancy-guthrie-surveillance-footage-recovery/
Pima County Sheriff’s Department
CBS News – initial statements confirming footage could not be recovered — no subscription, data overwritten.
https://www.cbsnews.com/news/nancy-guthrie-disappearance-video-analysis-fbi/
Ring – Partnership Cancellation Announcement (Feb. 12, 2026)
Ring Blog – statement announcing the cancellation, framed as a joint decision and attributed to resource constraints.
https://blog.ring.com/about-ring/ring-and-flock-cancel-partnership/
Flock Safety — License Plate Reader Network
Flock Safety – company operates in 5,000+ communities, performs 20+ billion vehicle scans per month, real-time checks against NCIC and local watchlists.
https://www.flocksafety.com
Ring Community Requests / Axon Partnership
Axon Blog – Ring/Axon ongoing partnership for police technology integration, including Community Requests feature allowing law enforcement to request doorbell footage from Ring users.
https://www.axon.com/blog/building-safer-communities-together-axon-and-ring
Brown University Shooting — Community Requests Case Study
The Verge – Providence Police Department used Ring Community Requests following shooting near Brown University (Dec. 2025) — seven neighbors responded within hours, sharing 168 videos.
https://www.theverge.com/news/878447/ring-flock-partnership-canceled
Discord Age Verification Rollout
TechCrunch – Discord “Teen-by-Default” announcement (Feb. 9, 2026) — mandatory age verification for all 200M+ monthly users worldwide, via face scan or government ID, announced on Safer Internet Day.
https://techcrunch.com/2026/02/09/discord-to-roll-out-age-verification-next-month-for-full-access-to-its-platform/
Discord / 5CA Data Breach
Bitdefender – disclosure of 5CA vendor breach (Oct. 3, 2025) — hackers (“Scattered LAPSUS$ Hunters”) stole 70,000+ government ID images from age verification appeals.
https://www.bitdefender.com/en-us/blog/hotforsecurity/discord-data-breach-5ca-leak-70000-ids
Reddit – Age Verification Legal Challenge
CBC News – Reddit lawsuit challenging Australia’s under-16 social media ban and age verification law.
https://www.cbc.ca/news/world/reddit-lawsuit-austalia-social-media-ban-9.7012795
Pornhub – Age Verification / State Blocking
PCMag – Pornhub blocked access in multiple U.S. states rather than comply with age verification mandates.
https://www.pcmag.com/news/pornhub-blocked-23-us-states-france-uk-how-to-watch-anyway-free-vpn
Books & Academic Works
Simone Browne – Dark Matters: On the Surveillance of Blackness (Duke University Press, 2015)
https://www.dukeupress.edu/dark-matters
James C. Scott – Seeing Like a State (Yale University Press, 1998)
https://yalebooks.yale.edu/book/9780300078152/seeing-like-a-state/
Shoshana Zuboff – The Age of Surveillance Capitalism (PublicAffairs, 2019)
https://www.publicaffairsbooks.com/titles/shoshana-zuboff/the-age-of-surveillance-capitalism/9781610395694/
Byung-Chul Han – Psychopolitics: Neoliberalism and New Technologies of Power (Verso Books, 2017)
https://www.versobooks.com/products/132-psychopolitics
Timothy Snyder – On Tyranny: Twenty Lessons from the Twentieth Century (Tim Duggan Books, 2017)
https://www.penguinrandomhouse.com/books/558816/on-tyranny-by-timothy-snyder/
Corporate Infrastructure — Oracle / Palantir
Oracle Newsroom – Oracle and Palantir formal partnership to deliver AI solutions to governments and enterprises (Apr. 3, 2024).
https://www.oracle.com/news/announcement/oracle-and-palantir-join-forces-to-deliver-mission-critical-ai-solutions-to-governments-and-enterprises-2024-04-03/