Chrome is quietly writing a 4 GB Gemini Nano model to your drive without consent, a prompt, or a real opt-out. If you delete the file, Chrome downloads it again. At a billion-device scale, the climate cost alone is staggering.


The Discovery

Security researcher Alexander Hanff found Chrome silently creating a weights.bin file under OptGuide/OnDeviceModel/ on macOS. The file is the Gemini Nano on-device LLM, all 4 GB of it. Chrome downloads and installs it without any user interaction. No consent dialog. No settings toggle. No mention in the installer.

Forensic analysis of macOS .fseventsd filesystem-event logs shows the full timeline:

Event                                    Time (April 24, 2026)
Chrome creates OptGuide directory        16:38:54 CEST
Unpacker writes weights.bin + metadata   16:47:22 CEST
Model moved to final location            16:53:22 CEST
Total install time                       14 minutes, 28 seconds
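You can check your own machine for these artifacts by scanning the Chrome data directory. A minimal sketch, assuming only the OptGuide/OnDeviceModel/ layout and weights.bin filename described above (the helper name and the path in the comment are illustrative, not from the researcher's tooling):

```python
from pathlib import Path

def find_on_device_models(chrome_root: Path):
    """Walk a Chrome data directory and report any on-device model
    weights (OptGuide/OnDeviceModel/.../weights.bin), with size and
    modification time -- the same artifacts the fseventsd timeline shows."""
    hits = []
    for path in chrome_root.rglob("weights.bin"):
        if "OptGuide" in path.parts or "OnDeviceModel" in path.parts:
            stat = path.stat()
            hits.append({
                "path": str(path),
                "size_gb": round(stat.st_size / 1e9, 2),
                "mtime": stat.st_mtime,
            })
    return hits

# Typical macOS starting point (adjust per platform):
# ~/Library/Application Support/Google/Chrome/
```

On an affected machine the 4 GB weights file should show up with a modification time matching the install window above.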

Zero human input. Zero consent.

The Persistence Mechanism

Here's the part that crosses the line: Chrome re-downloads the model if you delete it. Feature flags like OnDeviceModelBackgroundDownload are enabled by default and fire before users can reach the settings UI to disable them. The control component is fetched over plain HTTP from http://edgedl.me.gvt1.com/edgedl/diffgen-puffin/.
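The re-download loop is easy to verify empirically: delete the model and watch for Chrome to write it back. A minimal polling sketch (the function name and intervals are my own, not from the researcher's write-up):

```python
import time
from pathlib import Path

def watch_for_redownload(model_dir: Path, poll_seconds: float = 60,
                         timeout_seconds: float = 3600):
    """Poll a directory whose weights.bin was deleted and report when
    Chrome recreates it. Returns elapsed seconds at first reappearance,
    or None if nothing shows up before the timeout."""
    start = time.monotonic()
    while time.monotonic() - start < timeout_seconds:
        if any(model_dir.rglob("weights.bin")):
            return time.monotonic() - start
        time.sleep(poll_seconds)
    return None
```

Run this after deleting the model with Chrome still open; per the report, the background-download flag will trigger a fresh fetch without any user action.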

The Local State JSON file in Chrome's profile directory contains an optimization_guide.on_device block that confirms the model version and hardware performance class — Google knows exactly which devices have it.
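Because Local State is plain JSON, confirming that block takes a few lines. A sketch assuming only the optimization_guide.on_device key path named above (the sub-fields inside it vary by Chrome version):

```python
import json
from pathlib import Path

def read_on_device_state(local_state_path: Path):
    """Extract the optimization_guide.on_device block from Chrome's
    Local State file, or None if it is absent."""
    data = json.loads(local_state_path.read_text(encoding="utf-8"))
    return data.get("optimization_guide", {}).get("on_device")
```

A non-None result means Chrome has recorded a model version and hardware performance class for the machine.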

The Deceptive UX

Chrome 2026 prominently features an "AI Mode" pill in the omnibox. A reasonable user would assume that visible AI surface is powered by the 4 GB model already on their disk. It's not. That "AI Mode" routes queries to Google's cloud servers. The local model sits there as dead weight — a sunk cost on your SSD.

The Climate Math

At a billion-device scale, "just 4 GB" adds up fast:

Metric                       Value             Equivalent
Download energy              ~240 GWh          72,000 UK households / year
CO2 from transfer            ~60,000 tonnes    13,000 cars / year
Embodied carbon (SSD wear)   ~640,000 tonnes   140,000 cars / year

These are not abstract numbers, and every delete-and-redownload cycle adds another 4 GB of transfer, with its share of energy and carbon, to the bill.
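The first two rows of the table can be reproduced from two commonly cited intensity factors. Both factors here are my assumptions, not figures from the report: roughly 0.06 kWh per GB transferred over the network, and roughly 0.25 kg CO2 per kWh of grid electricity.

```python
DEVICES = 1_000_000_000   # one billion Chrome installs
MODEL_GB = 4              # Gemini Nano weights.bin
KWH_PER_GB = 0.06         # assumed network transfer intensity
KG_CO2_PER_KWH = 0.25     # assumed grid carbon intensity

download_kwh = DEVICES * MODEL_GB * KWH_PER_GB
download_gwh = download_kwh / 1e6            # kWh -> GWh
co2_tonnes = download_kwh * KG_CO2_PER_KWH / 1000  # kg -> tonnes

print(f"{download_gwh:.0f} GWh, {co2_tonnes:,.0f} tonnes CO2")
# consistent with the table's ~240 GWh and ~60,000 tonnes
```

Different intensity factors shift the totals, but at a billion devices the order of magnitude is hard to argue away.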

Legal Exposure

The researcher argues this violates multiple regulations:

  • ePrivacy Directive Article 5(3) — storing information on terminal equipment without consent
  • GDPR Articles 5(1) & 25 — lawfulness, fairness, transparency, data-protection-by-design
  • CSRD — potential failure to report significant environmental harm

"An engineering team at a large AI vendor decided that the user's machine is a deployment surface to be optimised for the vendor's product roadmap, not a personal device whose owner is the legal authority on what runs there." — Alexander Hanff


What Surprised Me

What gets me isn't the file size or even the consent violation. It's the re-download loop. That's not a bug or an oversight — that's an architectural decision. Some engineer wrote code that says "if the user deletes this, fetch it again." At what point does aggressive feature rollout become malware behavior?

Also: HTTP for the control fetch in 2026? Google has pushed HTTPS everywhere for two decades, yet fetches its AI-model trigger component over cleartext?

Sources

https://www.thatprivacyguy.com/blog/chrome-silent-nano-install
https://news.ycombinator.com/item?id=48019219
https://cybernews.com/security/google-chrome-ai-model-device-no-consent/
https://news.aibase.com/news/25955