Part 12: Who TF Am I Working For?? | Oracle Optimizes Away Oversight

In Las Vegas, under the bright glare of Oracle AI World 2025, Larry Ellison walked onto the stage with the confidence of a man unveiling the future. Behind him loomed a wall of screens stretching the length of city blocks. In front of him sat what I can only assume were eager executives, investors, and tech industry insiders waiting to hear what brilliant innovation Oracle had ready for them next.

What came next made me sick.

Ellison has never been shy about his ambitions to control critical infrastructure, and honestly, it takes a lot to shock me at this point. But what he described felt less like a promise of the future and more like a confession of what’s already been done. He delivered it with the casual ease of someone describing weekend plans rather than a scheme to fundamentally reshape healthcare regulation in America.

This wasn’t a pitch. This wasn’t a proposal. This was a progress report on infrastructure that’s already built, already operational, already using your data in ways you never consented to.

The Confession

I didn’t even make it six minutes before I started dry heaving.

About six minutes into the keynote, right after explaining how Oracle powers Tesla’s systems, Ellison drops this:

“Most of the world’s high value data is already in Oracle Database. We just had to change (and in past tense, we had to change, we did change) that database so the data that’s already in the Oracle database can make it available to AI models for reasoning. So the AI model can reason on not just public data, but on private data.”

Past tense. We did change it. Already done.

They have already changed the database so your private medical data can be fed to AI models. You weren’t asked. You didn’t consent.

Oracle’s Healthcare Empire

In 2022, Oracle completed its $28 billion acquisition of Cerner. Cerner’s software runs in more than 2,000 hospitals, roughly one in five in America. Its electronic health record systems are used by the Department of Veterans Affairs, the Department of Defense, and hundreds of major hospital networks.

Statistically, that means if you’ve been hospitalized, especially as a veteran or active-duty service member, there’s a good chance the name Cerner appears somewhere in your patient portal. And that means Oracle has your data.

In 2023, just a year after the acquisition, Cerner’s systems at VA hospitals experienced data breaches that exposed veterans’ medical records. There were documented cases of incorrect medication orders, missing lab results, and systems going down during critical care situations.

Oracle didn’t pump the brakes. They doubled down. The goal was never just to run hospital software efficiently. The goal was to control the data infrastructure of American healthcare.

What “Private Data” Actually Means

So when Ellison says “private data,” what exactly is he talking about?

Your medical records aren’t just doctor’s notes and prescriptions. They contain your lab work, diagnostic imaging, heart rate monitoring, mental health diagnoses, genetic test results, family medical history, every specialist you’ve seen, every medication you’re on, every procedure you’ve had. Whether you’re pregnant. Whether you’ve had an abortion. Whether you’re HIV positive. The most intimate details of your physical and mental health.

This data is supposed to be protected by HIPAA (the Health Insurance Portability and Accountability Act) precisely because it’s attached to you as a human being. Your body. Your biology. Your most vulnerable moments.

The Loophole: De-Identification Is a Lie

So how is Oracle able to do this without violating HIPAA?

They rely on something called “de-identification” or “anonymization.” Strip your name off the data, run it through some algorithms, and suddenly it’s not “your” data anymore. Legally speaking. HIPAA says once data is de-identified, it’s no longer protected health information. Companies can do whatever they want with it.
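
To make the mechanics concrete, here’s a minimal sketch of what Safe Harbor-style de-identification looks like, using a toy record layout I invented for illustration. The field names are hypothetical, but the transformations (drop direct identifiers, truncate ZIP codes to three digits, reduce dates to the year) follow the HHS Safe Harbor method cited in the sources below.

```python
# A minimal sketch of HIPAA Safe Harbor-style de-identification.
# The record layout is invented for illustration; the transformations
# follow the HHS Safe Harbor method: remove direct identifiers,
# truncate ZIP codes to the first three digits, reduce dates to years.

DIRECT_IDENTIFIERS = {"name", "ssn", "phone", "email", "address", "mrn"}

def deidentify(record: dict) -> dict:
    """Return a 'de-identified' copy of a hypothetical patient record."""
    out = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    if "zip" in out:
        out["zip"] = out["zip"][:3] + "**"         # keep only the ZIP3 prefix
    if "birth_date" in out:
        out["birth_date"] = out["birth_date"][:4]  # keep only the year
    return out

patient = {
    "name": "Jane Doe", "ssn": "123-45-6789", "phone": "555-0100",
    "zip": "02139", "birth_date": "1984-07-31", "sex": "F",
    "diagnosis": "E11.9",
}
print(deidentify(patient))
# {'zip': '021**', 'birth_date': '1984', 'sex': 'F', 'diagnosis': 'E11.9'}
```

Notice what survives: a ZIP prefix, a birth year, a sex, and the diagnosis itself. Those leftover quasi-identifiers are exactly what re-identification attacks exploit.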

Except that’s complete horsesh*t. (Excuse my French.)

De-identification is a legal fiction that stopped working the moment we entered the age of big data and AI. In 1997, researcher Latanya Sweeney re-identified then-Massachusetts Governor William Weld’s medical records from a supposedly anonymized dataset. She bought the voter database for $20 and matched it with hospital records using just ZIP code, birth date, and gender.
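
Her attack needs nothing more than a database join. Here’s a toy reconstruction, with entirely invented records, of the same linkage: match an “anonymized” hospital extract against a public voter roll on ZIP code, birth date, and sex.

```python
# Toy reconstruction of Sweeney's linkage attack. All records below are
# invented. The join key is the quasi-identifier triple that survives
# de-identification: (ZIP, date of birth, sex).

hospital = [  # "anonymized": names removed, quasi-identifiers intact
    {"zip": "02138", "dob": "1950-03-14", "sex": "M", "diagnosis": "cardiac"},
    {"zip": "02139", "dob": "1984-01-02", "sex": "F", "diagnosis": "asthma"},
]

voters = [  # public record: names attached to the same triple
    {"name": "A. Adams", "zip": "02138", "dob": "1950-03-14", "sex": "M"},
    {"name": "B. Brown", "zip": "02139", "dob": "1984-01-02", "sex": "F"},
]

for rec in hospital:
    key = (rec["zip"], rec["dob"], rec["sex"])
    matches = [v["name"] for v in voters
               if (v["zip"], v["dob"], v["sex"]) == key]
    if len(matches) == 1:  # a unique match re-identifies the record
        print(f"{matches[0]} -> {rec['diagnosis']}")
```

In later work, Sweeney estimated that this same triple (5-digit ZIP, full date of birth, sex) uniquely identifies about 87% of the U.S. population.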

Since then, researchers have repeatedly proven the same point. In 2016, the Australian Department of Health released de-identified medical records for 10% of the population—researchers re-identified them six weeks later. As Stanford researcher Nigam Shah explains, “it is never possible to guarantee that de-identified data can’t or won’t be re-identified. That’s because de-identification is not anonymization.”

Modern AI can cross-reference millions of data points in seconds. Researchers have shown that just 15 demographic attributes are enough to re-identify 99.98% of Americans, and your medical records contain far more than 15. You are not anonymous. The crowd they claim you’re hiding in? It’s transparent.

But here’s where it gets even darker: your genetic data isn’t just about you. When they train AI on your genome, they’re using information about everyone who came before you and everyone who will come after you. Think about it: your DNA contains your parents’ genetic information. Your grandparents’. Your children’s. Your grandchildren’s. When they feed it into a machine learning model, they’re taking your entire bloodline. Three generations back. Three generations forward.
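
That isn’t a metaphor; it’s standard kinship arithmetic. On average, the fraction of autosomal DNA you share with a relative halves with each generational step, as this quick sketch shows:

```python
# Expected fraction of autosomal DNA shared with a relative n
# generational steps away: the expected share halves with each step.

def expected_shared_fraction(steps: int) -> float:
    return 0.5 ** steps

for label, steps in [
    ("parent / child", 1),
    ("grandparent / grandchild", 2),
    ("great-grandparent / great-grandchild", 3),
]:
    print(f"{label}: ~{expected_shared_fraction(steps):.1%} expected overlap")
# parent / child: ~50.0% expected overlap
# grandparent / grandchild: ~25.0% expected overlap
# great-grandparent / great-grandchild: ~12.5% expected overlap
```

So a model trained on your genome is, in expectation, also trained on half of each of your parents’ and children’s genomes, a quarter of each grandparent’s, and so on.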

And I’m willing to bet that nobody in your family consented to that.

The Most Dangerous Part: Automating Regulators

Later in the keynote, Ellison explains his philosophy of “automating entire ecosystems.” He compares it to Tesla: Elon Musk couldn’t just build electric cars; he had to build the charging infrastructure too.

As if this information wasn’t insulting enough, somehow hearing it sandwiched between free PR for Larry’s billionaire best friend’s car company was just enough to turn the dry heaving into full-on nausea.

Ellison goes on to explain that, similarly, you can’t just automate hospitals. You have to automate everything connected to them. Then he rattles off the list:

“You have to automate the patient, the provider, the payer, the regulator, the pharma companies, banks who finance hospitals, and governments who regulate hospitals.”

The most alarming item on that list? Regulators.

When a hospital harms a patient through negligence, who investigates? Regulators. When an insurance company systematically denies care to maximize profit, who intervenes? Regulators. When a pharmaceutical company lies about drug safety, who sanctions them? Regulators.

Then he says:

“If we want to automate hospitals and clinics, those hospitals and clinics are not gonna be very efficient if the people who regulate those hospitals and clinics are not also automated.”

The people who regulate. Not the regulations. Not the paperwork. The people.

Think about what happens when regulators are automated. Right now, when your insurance claim gets denied, you can appeal to a human regulator: someone with medical training, ethical obligations, and accountability. Someone who can look at your case and give you a real shot at overturning the denial.

In Ellison’s vision, you appeal to what? An AI trained on Oracle’s data, evaluated by Oracle’s models, with Oracle’s financial interests baked into every decision? When the system that processes your claims is the same system that “regulates” whether those claims are legitimate, who exactly is protecting you?

Who This Is Really For

When Ellison talks about AI figuring out “the best possible care that is fully reimbursable,” who is he talking to?

Oracle’s customers are United Healthcare. Aetna. Cigna. The insurance companies that already spend billions finding ways to deny your claims. The AI is being trained to find the care that’s cheapest for the insurance company while still defensible enough to avoid lawsuits. It’s being trained to identify “high risk” (expensive) patients to price them out of coverage before they get sick.

Your premiums are about to skyrocket because AI (trained on your medical data) decided you’re too expensive to cover based on genetic risks you didn’t even know you had.

What You Can Do

So what do we do?

Look, I don’t have all the answers. I’m figuring this out just like you. But what I find deeply ironic (and honestly, a bit disappointing) is that I had to learn this by watching the entire keynote myself. Where is Congress in all of this?

The irony is not lost on me that we’re in a government shutdown right now because they’re fighting to keep healthcare premiums from going up. But the reality is they’re going up anyway. They’re all fighting over the thermostat while Larry Ellison is changing the weather. And if we don’t do something about this right now, it won’t matter who wins that fight.

It shouldn’t be up to us to fix this. These are our public servants. But if it is, then here’s what I know works: being loud. Whenever we make noise, people pay attention.

Call your representatives. Tell them:

“My biological data is not for sale. My body is not a product. I do not consent to what Oracle is doing with medical records.”

Ask your healthcare providers:

“Do you use Oracle or Cerner? What data is collected? How is it used? Can I opt out?”

The Oracle keynote is still on YouTube. The comments are still open. Leave one. Share it. Make noise.

Your medical data is not a product. Your body is not raw material for algorithmic optimization. Your genetic code is not training data. Larry Ellison said the quiet part out loud.

Now it’s our turn to respond: loudly, persistently, and with the full weight of democratic pressure that reminds the powerful they still answer to us.


Sources Cited

Take Action:

Oracle AI World 2025 Keynote:
Go to their comment section and paste the following:

“I do not consent to the use of my private medical data for AI training or commercial purposes without my explicit written permission.”

Oracle’s Acquisition of Cerner and Healthcare Data:

• “Oracle Buys Cerner,” Oracle Press Release, December 20, 2021. https://www.oracle.com/news/announcement/oracle-buys-cerner-2021-12-20/

De-identification and Re-identification Research:

• Latanya Sweeney, “Weaving Technology and Policy Together to Maintain Confidentiality,” Journal of Law, Medicine & Ethics, 1997. https://dataprivacylab.org/projects/identifiability/paper1.pdf

• Luc Rocher, Julien M. Hendrickx, and Yves-Alexandre de Montjoye, “Estimating the success of re-identifications in incomplete datasets using generative models,” Nature Communications, July 23, 2019. https://www.nature.com/articles/s41467-019-10933-3

• Khaled El Emam et al., “A Systematic Review of Re-Identification Attacks on Health Data,” PLOS ONE, December 2, 2011. https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0028071

• Nigam Shah, “De-Identifying Medical Patient Data Doesn’t Protect Our Privacy,” Stanford HAI, November 15, 2021. https://hai.stanford.edu/news/de-identifying-medical-patient-data-doesnt-protect-our-privacy

HIPAA and Medical Data Privacy:

• “Methods for De-identification of PHI,” U.S. Department of Health and Human Services. https://www.hhs.gov/hipaa/for-professionals/privacy/special-topics/de-identification/index.html
