On March 1, 2026, Iranian drones hit three Amazon data centers in the UAE and Bahrain. These were not military bases or weapons depots but buildings full of servers, the kind that process bank transactions, run ride-hailing apps, and store hospital records. Careem went down. Financial systems across the Gulf went dark. AWS told its customers to migrate their data to other regions immediately because the situation was, in their word, “unpredictable.”
Most people saw the headline and kept scrolling.
Here’s what they missed: those buildings weren’t just running civilian applications. They were running the Pentagon’s AI too, on the same servers, in the same physical facilities, and nobody had been told: not the governments hosting those buildings, not the customers whose data lived inside them, not the communities surrounding them. This wasn’t a bug in the system. It was a feature. And we are only now beginning to understand what that means.
So let’s actually talk about it.
First, let’s do some math.
When most people hear “data center,” they picture the infrastructure behind the apps they use every day: ChatGPT, Google, Netflix. Those things do live in data centers, but a single ChatGPT query uses about 0.3 watt-hours of energy. Even with 700 million weekly users, consumer AI is a rounding error on the power grid.
So why is Amazon building a campus in Indiana that will consume 2.2 gigawatts of electricity, roughly equivalent to powering half the homes in the entire state? Why is Meta planning a facility in Louisiana drawing 5 gigawatts, roughly half the electricity consumption of New York City? Why is OpenAI’s Stargate facility in Abilene, Texas already running at 200 megawatts and scaling toward 1.2 gigawatts?
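The gap between those two scales is easy to check with back-of-the-envelope arithmetic. The sketch below uses the 0.3 watt-hour figure and the 700 million weekly users from above; the 10 queries per user per week is an illustrative assumption, not a measured number:

```python
# Back-of-the-envelope: consumer AI demand vs. one hyperscale campus.
WH_PER_QUERY = 0.3       # watt-hours per ChatGPT query (figure cited above)
WEEKLY_USERS = 700e6     # weekly active users (figure cited above)
QUERIES_PER_USER = 10    # assumed weekly queries per user (illustrative)
HOURS_PER_WEEK = 24 * 7

weekly_wh = WH_PER_QUERY * WEEKLY_USERS * QUERIES_PER_USER  # total Wh per week
avg_power_mw = weekly_wh / HOURS_PER_WEEK / 1e6             # continuous megawatts

campus_mw = 2_200  # the planned 2.2 GW Indiana campus, expressed in MW
print(f"All consumer inference combined: ~{avg_power_mw:.1f} MW continuous")
print(f"One campus alone: {campus_mw} MW, ~{campus_mw / avg_power_mw:.0f}x larger")
```

Even if the per-user query count were ten times higher, a single one of these campuses would still dwarf the entire consumer inference load.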
Nobody running consumer applications needs that much compute. These are training and inference clusters built for frontier AI at a scale that only makes sense when you understand who the actual customer is. The military AI market is growing from $9.2 billion in 2023 to nearly $39 billion by 2028, a 33% annual growth rate. The Pentagon awarded a $9 billion contract in 2022 to AWS, Google, Microsoft, and Oracle to run cloud services at every classification level, from unclassified traffic to Top Secret. The Army has mandated that all new cloud procurement run through that contract. The Pentagon has also announced plans to build AI data centers on four military bases, specifically Fort Hood, Fort Bragg, Fort Bliss, and Dugway Proving Ground, through private partnerships with Google, Amazon, Oracle, Microsoft, and SpaceX.
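The quoted growth rate is internally consistent with those dollar figures; a quick sanity check:

```python
# Sanity check: $9.2B (2023) growing to $38.8B (2028) over five years.
start, end, years = 9.2, 38.8, 5
cagr = (end / start) ** (1 / years) - 1  # compound annual growth rate
print(f"Implied CAGR: {cagr:.1%}")       # roughly 33% per year
```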
When you see a gigawatt data center going up in rural America and the company won’t tell your mayor what it’s for, you are allowed to ask the question. Maybe it’s training the next generation of consumer AI. Or maybe it’s training the system that just helped select targets in Iran. The point is you don’t know, and nobody is legally required to tell you.
The law has a canyon-sized hole in it.
Data centers have zero explicit protection under international humanitarian law. The Geneva Conventions were written in 1949 and the Additional Protocols came in 1977, and nobody drafting those documents was thinking about cloud computing. What they were thinking about was a clear distinction between military infrastructure and civilian life, and they built an entire legal framework around maintaining that separation.
That framework has a specific vulnerability that matters here. Under Article 52(2) of Additional Protocol I, a civilian object becomes a lawful military target the moment it makes an “effective contribution to military action.” The rule is not ambiguous. The second the Pentagon runs classified AI on a commercial data center, that building, the same one processing your bank transactions and holding your hospital records, has a legal argument for being a legitimate target.
Oona Hathaway at Yale Law School published a major study last year called “The Dangerous Rise of Dual-Use Objects in War.” She built an original dataset tracking how the United States has progressively expanded what counts as a dual-use target, from bridges in 1991 to apartment buildings in Syria. Data centers are the next domino in that sequence, and we walked right into it.
The part that doesn’t get enough attention is that the law puts obligations on defenders too. Article 58 of the same protocol requires parties to a conflict to keep military assets away from civilian populations, not embed military operations inside civilian infrastructure. The United States did the exact opposite, deliberately, through the Joint Warfighting Cloud Capability (JWCC) contract, through Palantir, through a series of policy choices that nobody asked the public to weigh in on. When you hide your weapons program inside a commercial building and that building gets hit, the legal and moral question of who bears responsibility for the civilian data destroyed inside it is genuinely unresolved. The experts who wrote the Tallinn Manual, the NATO-affiliated study of how international law applies to cyber warfare, couldn’t even agree on whether data itself counts as a protected object under the laws of war, meaning a strike that destroys every medical record, every bank file, every photo stored in a bombed facility might not legally register as harming a civilian object at all.
The Red Cross has formally called on governments to update these rules. The UN Security Council unanimously passed Resolution 2573 condemning attacks on civilian infrastructure. None of it is binding, none of it defines data centers, and none of it stopped those drones.
The Anthropic timeline.
I want to walk you through a specific sequence of events, because it illustrates exactly how this works in practice and how fast it moves.
In July 2025, Anthropic, the company that makes Claude, an AI system explicitly designed with safety guardrails, received a $200 million contract to deploy Claude on classified military networks through Palantir. Seven months later, on February 24, 2026, Defense Secretary Hegseth sent Anthropic a document demanding full, unrestricted access to Claude for any lawful military purpose. The phrase “any lawful military purpose” included lethal targeting: the use of AI to assist in selecting targets for military strikes.
Anthropic said no. They wanted to maintain guardrails specifically against autonomous weapons and domestic surveillance. On February 27th, Hegseth’s deadline passed without agreement, and the Pentagon designated Anthropic a “supply chain risk to national security,” a designation normally reserved for Chinese companies suspected of espionage.
The following day, February 28th, the United States and Israel launched Operation Epic Fury against Iran. The day after that, on March 1st, drones hit the AWS facilities, the same commercial infrastructure still running Anthropic’s military AI in support of U.S. Central Command operations. The civilian data sitting alongside those workloads, the banking records, business applications, and personal files belonging to millions of people across the Gulf, became collateral damage in a conflict those people had no idea they were part of.
An AI company tried to draw a line around autonomous killing. The Pentagon called them a national security threat for drawing it. Days later, the building hosting that argument got bombed. That is the timeline, not editorializing.
I’ve been making videos about underground data centers and the strategic importance of hardened infrastructure for months now, and I got called a conspiracy theorist for it. The specific argument I made, that if AI warfare requires data centers then data centers become military targets for any adversary that can’t compete technologically, turned out not to be a conspiracy theory. It turned out to be CNBC’s headline on March 6th, 2026.
I’m not telling you that to be smug about it. I’m telling you because the gap between “fringe observation” and “mainstream reality” on this topic closed in about six months, and there are a lot of other observations that haven’t closed yet.
It’s already happening here.
The instinct might be to treat this as a Middle East problem, a distant conflict with distant infrastructure, not immediately relevant to daily life in America. That instinct is wrong.
In Hermantown, Minnesota, elected officials signed non-disclosure agreements before they were even allowed to learn what was being built on 200 acres of local land. The project was internally coded “Project Loon.” Residents couldn’t find out what company was behind it, and their own elected representatives were legally barred from telling them. When they eventually found out it was Google, a community group called Stop the Hermantown Data Center formed almost immediately. A state lawmaker proposed a bill banning officials from signing NDAs on data center projects. It failed.
In Doña Ana County, New Mexico, a predominantly Latino community of 15,000 people discovered after the fact that their new data center was part of the $500 billion Stargate Project, the OpenAI and Oracle joint venture that has received enthusiastic promotion from the current administration. In Prince George’s County, Maryland, a resident named Taylor Frazier McCollum organized 20,000 petition signatures to block a hyperscale data center one mile from her home; the county responded with a six-month moratorium. In New Brunswick, New Jersey, hundreds of residents packed a city council meeting and forced the council to scrap a planned facility entirely. They got a park instead.
The Minnesota Center for Environmental Advocacy has filed multiple lawsuits against cities that advanced data center projects with inadequate transparency and environmental review. Over 230 organizations have called for a nationwide moratorium on new construction. New York State Senator Liz Krueger introduced legislation for a two-year halt. Community opposition to data centers surged 125% in a single quarter of 2025.
These communities are fighting over water consumption, noise, power grid strain, and the fact that their elected officials signed legal documents preventing them from sharing basic information with the people who voted for them. They don’t yet know they may also be living next to undisclosed military infrastructure with a target profile that nobody disclosed. When that conversation arrives, and it will, the people already organizing are going to have a lot of very pointed questions.
So what do we actually do?
The conversation we should have had five years ago needs to happen now, with more urgency and less deference to the industry that created this situation.
The first fix is straightforward: military workloads need to be segregated from civilian cloud infrastructure. If the government wants to run AI targeting systems, it should build dedicated facilities on military land, behind military security perimeters, with appropriate classification controls. The Pentagon is already moving in this direction with its military base proposals, which means it knows this is the correct architecture. The remaining question is whether they will also commit to removing classified workloads from commercial infrastructure, or simply add military-dedicated capacity while continuing to use AWS for everything else because it’s cheaper. That loophole needs to be closed through legislation, not left to the discretion of procurement officers.
Second, the NDA problem is not a gray area. An elected official who cannot tell their constituents what is being built on 200 acres of local land because they signed a corporate confidentiality agreement has been stripped of their representative function. That should be illegal at the federal level, with real penalties for developers who make community engagement conditional on official silence. Communities also deserve to know when facilities near them are operating under military contracts. Not the classified details, but the basic fact that the risk profile of a nearby building has changed.
Third, international law needs to catch up to the infrastructure we’ve already built. Microsoft’s Brad Smith called for a Digital Geneva Convention in 2017. The Red Cross has been asking for binding protections for civilian data infrastructure for years. These are not radical proposals; they are the logical extension of protections we already extend to hospitals, water systems, and power grids. Data centers that serve civilian populations should carry protected status under international humanitarian law, and the United States should be driving that negotiation rather than quietly benefiting from the ambiguity that lets it embed military operations inside civilian buildings without legal consequence.
And finally, a pause. The over 230 organizations calling for a moratorium are right. We are building this infrastructure faster than we are writing the rules for it. A temporary halt on new construction is not an attack on technological progress; it is the same precautionary logic we applied to nuclear power plants, to chemical facilities, to every category of infrastructure where we decided the stakes were high enough to require rules before scaling. We are clearly past that threshold. The drone strikes proved it.
This conversation needed to happen before a building in Bahrain got bombed. It absolutely needs to happen before one gets bombed in Virginia. The families living next to these facilities didn’t agree to live next to military targets. The communities that signed moratoriums and filed lawsuits and packed city council meetings didn’t know the half of what they were actually fighting. And the international legal system designed to protect civilians from exactly this kind of exposure has a canyon-sized hole in it that nobody in power has any obvious incentive to close.
That’s the conversation. And the fact that we’re only having it now, after the drones, is itself part of the problem.
SOURCES CITED
Amazon cloud unit’s data centers in UAE, Bahrain damaged in drone strikes — Reuters, March 2, 2026
https://www.reuters.com/world/middle-east/amazon-cloud-unit-flags-issues-bahrain-uae-data-centers-amid-iran-strikes-2026-03-02/
Mark Zuckerberg says Meta is building a 5GW AI data center — TechCrunch, July 13, 2025
https://techcrunch.com/2025/07/14/mark-zuckerberg-says-meta-is-building-a-5gw-ai-data-center/
Artificial Intelligence (AI) in Military Market worth $38.8 billion by 2028 — MarketsandMarkets, May 2023
https://www.marketsandmarkets.com/PressReleases/artificial-intelligence-military.asp
Pentagon splits $9 billion cloud contract among Google, Amazon, Oracle, Microsoft — Reuters, December 8, 2022
https://www.reuters.com/technology/pentagon-awards-9-bln-cloud-contracts-each-google-amazon-oracle-microsoft-2022-12-07/
The Dangerous Rise of “Dual-Use” Objects in War — Yale Law Journal, June 30, 2025
https://yalelawjournal.org/article/the-dangerous-rise-of-dual-use-objects-in-war
Security Council Strongly Condemns Attacks against Critical Civilian Infrastructure, Unanimously Adopting Resolution 2573 (2021) — United Nations, April 27, 2021
https://press.un.org/en/2021/sc14506.doc.htm
Pentagon officials sent Anthropic best and final offer for military use of AI — CBS News, February 25, 2026
https://www.cbsnews.com/news/pentagon-anthropic-offer-ai-unrestricted-military-use-sources/
Minnesota Data Center Approvals Happening With Secrecy — GovTech, December 17, 2025
https://www.govtech.com/policy/minnesota-data-center-approvals-happening-with-secrecy
230+ Groups Call for National Moratorium on New Data Centers — Food & Water Watch, December 8, 2025
https://www.foodandwaterwatch.org/2025/12/08/230-groups-call-for-national-moratorium-on-new-data-centers/
The need for a Digital Geneva Convention — Microsoft On the Issues (Brad Smith), February 14, 2017
https://blogs.microsoft.com/on-the-issues/2017/02/14/need-digital-geneva-convention/