Framing this as needing "consent" is deeply misguided. It's as silly as claiming that Microsoft Word installed an English language spellcheck dictionary without your consent. It's just part of the software. You consented to installing the software and having it autoupdate. That covers it.
Now we can argue whether or not it's an appropriate amount of disk space or bandwidth to use, but that's just a reasonable practical discussion to have. Framing it around consent is unnecessarily inflammatory and makes it harder to have a discussion, not easier.
If Chrome has the #optimization-guide-on-device-model and #prompt-api-for-gemini-nano flags enabled (because it's part of some Origin Trial / Early Stable Release or similar), then web pages have access to the new Prompt API, which allows any webpage to initiate the (one-time) download of the ~2.7 GiB CPU or ~4.0 GiB GPU model using LanguageModel.create().
https://developer.chrome.com/docs/ai/prompt-api
When Chrome 148 releases tomorrow, this will be the default behaviour on desktop.
Before downloading, it should check for 22 GiB of free disk space on the volume holding your Chrome data dir, and at least double the model size in free space in your tmp dir.
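Roughly, a shell version of that preflight (the 22 GiB and double-model-size figures are the ones claimed above, not verified against Chrome's source):

```shell
# Approximate the claimed preflight: 22 GiB free on the volume holding the
# Chrome data dir, and 2x the ~4 GiB model free on the tmp volume.
# (Thresholds are the figures from this thread, not verified against Chrome.)
free_gib() {
  # POSIX df -P reports 1024-byte blocks; column 4 is available space
  df -P -k "$1" | awk 'NR==2 { print int($4 / 1048576) }'
}

data_free=$(free_gib "${HOME}")
tmp_free=$(free_gib "${TMPDIR:-/tmp}")

echo "data volume: ${data_free} GiB free (want >= 22)"
echo "tmp volume:  ${tmp_free} GiB free (want >= 8)"
```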
An extra 4GB per user on our NFS home file server is going to be a huge pain (several thousand students). And for our Windows lab machines, they end up in AppData\Local (which isn’t redirected for operational reasons) so we either leave the profiles in place and let them accumulate (suboptimal) or clear out the profiles as we normally do and let it redownload, over and over again.
As much as I’m against unexpected 4GB bloat for an AI model, I’d much prefer it to install one copy, system-wide. 4GB per Windows or Linux lab machine, rather than a 4TB minimum load on our NFS server and 4GB downloads per user, per machine on our Windows labs.
Pure, unadulterated, poorly researched and factually incorrect clickbait.
Question from last November, even referring to macOS, by @paulirish: https://superuser.com/q/1930445/can-i-delete-the-chromes-opt...
With policy setting, debug url, docs in the answers.
One search away.
> Energy intensity of network data transfer: 0.06 kWh per GB, the mid-band of Pärssinen et al. (2018) "Environmental impact assessment of online advertising", Science of The Total Environment [14]. The paper reports a 0.04-0.10 kWh/GB range depending on the share of fixed-line vs mobile transfer and inclusion of end-user device energy. 0.06 is a defensible mid-point.
2018? An estimate from 8 years ago is going to be off by a factor of 10 or so.
Not sure you'd get far with the legal arguments unless you're actually a lawyer. Too easy to misunderstand the jargon (i.e. the same reason why it's dangerous to use an LLM as your lawyer).
(As an aside, the whole thing reads to me like the style LLMs use; not saying for sure it was, just giving me those vibes).
I stopped using Chrome 15 years ago and de-googled my life 5 years ago. The hardest thing to let go of, in fact, was Gmaps (most alternatives, until recently, were not great), and I'm still captured by Android, but Rome was not built in a day.
Quitting Chrome these days is the easiest thing to do. The writing is on the wall. You don't control the browser on your network, Google does. And for better or worse, Google's priority is AI at this time.
Sysadmins should take notice.
If the network is ~65% chrome and thus deemed painful, take the gradual approach. Do not push chrome on new devices or users. Watch that problem slowly go away.
"Silently installs" is misleading. They are including a file in the package which is presumably related to the functionality of the software. I don't use chrome for a long list of reasons but it is not standard or expected to get consent for that.
This might be worth it if Gemma4 E2B were a good model, but honestly, in all our testing it's useless without further training and finetuning, and those aren't use cases that fit normal web-browser use well enough to justify such overly broad and expensive infrastructure to make it happen.
Gemma 4 E4B is a much better model, but it's too large to simply download and run everywhere.
IMHO, this is jumping the gun. Google's going through a lot of effort to release a model that will give everyone a very poor first impression of what on-device models are capable of, souring it for everyone for a long time afterwards. It would be better to wait until a smaller, better model ships before doing this.
This is what I've done after spending some time looking into it; this is for Linux desktop:
Delete Chrome's silently installed 4 GB AI model file and AI features.
In Chrome, go to: chrome://flags
Search for and Disable these:
Enables optimization guide on device
Prompt API for Gemini Nano
AI Mode
Open DevTools (F12 or Ctrl+Shift+I). Click the Settings (gear icon).
Go to AI Innovations and uncheck Enable AI assistance.
For Linux, in a bash shell, the following should prevent Chrome from trying to download the file again, because root (instead of my user) will own the file/directory:

sudo rm -rf ~/.config/google-chrome/OptGuideOnDeviceModel
sudo rm -rf ~/.config/google-chrome/Default/OptGuideOnDeviceModel
sudo touch ~/.config/google-chrome/OptGuideOnDeviceModel
sudo chmod 400 ~/.config/google-chrome/OptGuideOnDeviceModel
sudo touch ~/.config/google-chrome/Default/OptGuideOnDeviceModel
sudo chmod 400 ~/.config/google-chrome/Default/OptGuideOnDeviceModel

In case they already existed from doing the above previously, make sure root owns them:

sudo chown root:root ~/.config/google-chrome/OptGuideOnDeviceModel
sudo chown root:root ~/.config/google-chrome/Default/OptGuideOnDeviceModel

List them to check:

ls -l ~/.config/google-chrome/OptGuideOnDeviceModel
ls -l ~/.config/google-chrome/Default/OptGuideOnDeviceModel

How hard would it have been to add a simple message, warning people about it and offering to opt out? Most would have clicked OK without reading anyway, and Google could pretend they give a shit about users. Unless they expected blowback, and that kind of message is the "compromise" they want to eventually land on.
Framing 4GB of data moving in a world of petabytes of traffic as a specific environmental disaster is kind of a stretch, regardless of whether we want the model.
Not on my devices. Auto update has been abused so often now that it is an embarrassment to the industry. Auto update should be for bug fixes and security issues only.
The following seems to keep Chrome from re-downloading this beast:
# From one's $HOME dir:
rm -fr ./.config/google-chrome/OptGuideOnDeviceModel
mkdir -p ./.config/google-chrome/OptGuideOnDeviceModel
touch ./.config/google-chrome/OptGuideOnDeviceModel/weights.bin
chmod 0400 ./.config/google-chrome/OptGuideOnDeviceModel/weights.bin
chmod 0500 ./.config/google-chrome/OptGuideOnDeviceModel
Adapt as appropriate for your OS. For "Chrome Unstable" installs, the dir name is google-chrome-unstable.

This has, so far, kept Chrome from (re)installing that file on my system.
Hypothetically the parts involving weights.bin aren't needed so long as the containing directory is not writable.
Why use a browser from Google or Microsoft in 2026? Why in the world?
Somebody's promotion packet depended on pushing this through the approval process.
I worked on on-device AI for 3 years. This was the prime idea we were exploring: how can someone undercut the OS providers and ship an LLM that other apps can also use on-device? If Meta decides to do this, it can serve an API to all mobile app companies for an on-device LLM long before the OS is there. This is Google's way of reaching LLM distribution on laptops, since they don't have their own.
I don't see how this is going to work when every application decides to ship and run a 4GB model, competing for video memory. It's going to be the Electron problem times 10.
The site is currently unavailable (503) so I can't read it. But I wonder: what should you consent to? Every dependency? Every dependency above 1GB?
> At Chrome's scale, the climate bill for one model push, paid in atmospheric CO2 by the entire planet, is between six thousand and sixty thousand tonnes of CO2-equivalent emissions, depending on how many devices receive the push.
Environmental analysis for operations? Not a fan of thinking in such terms.
> For users on capped mobile data plans, particularly in regions where smartphone-as-only-internet is dominant (much of Africa, much of South and Southeast Asia, most of Latin America), 4 GB of unrequested download is on the order of a month's data allowance, vapourised by Chrome on the user's behalf. Google has not, to my knowledge, published any analysis of the welfare impact of this on the populations whose internet access is metered.
THIS is a valid concern. Otherwise I'm not buying into "ask for consent because of dependency X". Users don't like questions/consents.
However, the OS (at least Windows) has a way to mark a network connection as metered so software can make informed decisions. Android also has a "Data Saver" function which software should honor.
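For scale, the quoted six-to-sixty-thousand-tonne range is consistent with simple arithmetic, assuming the article's 0.06 kWh/GB transfer figure and a rough grid intensity of ~0.4 kg CO2e/kWh (my assumption, not the article's):

```latex
% Per device: 4 GB at the article's 0.06 kWh/GB transfer figure
E_{\mathrm{device}} = 4\,\mathrm{GB} \times 0.06\,\mathrm{kWh/GB} = 0.24\,\mathrm{kWh}
% At an assumed ~0.4 kg CO2e/kWh grid intensity
m_{\mathrm{device}} \approx 0.24\,\mathrm{kWh} \times 0.4\,\mathrm{kg\,CO_2e/kWh} \approx 0.1\,\mathrm{kg\,CO_2e}
% Scaled to the fleet receiving the push
M \approx 0.1\,\mathrm{kg} \times \left(6\times10^{7}\ \text{to}\ 6\times10^{8}\ \text{devices}\right) \approx 6\,000\ \text{to}\ 60\,000\,\mathrm{t\,CO_2e}
```

So the quoted range corresponds to roughly 60 to 600 million devices receiving the push, and the whole estimate is linear in the kWh/GB figure, which other commenters in this thread dispute.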
Not too long ago, someone submitted an AI demo to HN that resulted in a 3.1GB download upon visiting the page: https://news.ycombinator.com/item?id=47823460
It reminds me of the "dialup warnings" common 2 decades ago on huge pages (often containing many images). Yes, bandwidth and storage have gotten cheaper, but the unwanted waste should still be called out. I'm not even anti-AI, having waited several hours recently to get some local models to experiment with, but that's because I wanted to and made the decision to use that bandwidth.
Looks like the site's struggling to keep up with the traffic. A couple mirror links:
https://web.archive.org/web/20260505052217/https://www.thatp...
https://archive.ph/sM7O5 (missing images and styling, but the content all seems to be there)
If you back up to "intention" it's fully insane to make a GDPR argument against on-device AI. Yes it downloads bits, but those bits are not there to identify you - they are basically a local copy of the internet. This enables private data to be kept on-device. Having no personal data leave the device is fantastic for GDPR compliance.
The good point in this article is about how the "AI" features in Chrome all use Google's cloud API and not a local model. That's true and some of it should be local. ("AI mode" uses the Web index, so it fundamentally cannot be local, but there are features that could be.)
And that's why we have, promote, and (hopefully) all use Chromium on our Linuxes.
Or Firefox of course.
If anything, I am glad about a bit of a shift to local LLMs. Their Gemma4 is pretty powerful for such a small model, so I guess that's what they are delivering.
Man the longer all this crap goes on the more I realise Stallman was right
And that will be 4GB per Chrome instance, I assume? (Not profiles, instances.) And what happens with each Electron app if it uses Chrome?
LanguageModel should be an OS service...
It is very ironic that this post comes from "The Privacy Guy", given that the whole point of this model is to run inference on your own device rather than sending queries to the cloud, which is also much less power intensive than sending a query to OpenAI.
AI generated header image and a heavy scent of LLM prose, but this guy still complains about the "insane climate costs" of google's 4GB on device LLM?
There's simply no reason to be using Google Chrome in 2026. Purge it from your computer and install a less user hostile browser.
Google Chrome just exists to make Google money at your expense, to sell your data and deplete your battery.
One upside to this is that it doesn't use Gemma and instead uses Gemini. So at least for Gemini Nano (apparently called XS internally by Google) it means that the weights are now de facto open and you no longer need a current Android phone to get the latest and best model in this class. This also makes it the only open American frontier-level model right now.
Wow, so glad to see this on HN, because yesterday, coincidentally, I told codex to figure out what was taking up space on my computer and lo and behold there was an AI model in my Chrome folder... And I certainly didn't recall downloading that myself.
Hard to believe it's over 10 years since they first started pulling crap like this by downloading a binary to listen for 'OK Google' (including on chromium builds): https://lwn.net/Articles/648392/
Alternative to archive.ph
Works without Javascript, no CAPTCHA, no DDoS, no geoblocking, etc.
https://web.archive.org/web/20260504192142if_/https://www.th...
I am trying to wrap my head around this: if I remove Chrome Browser, will I reclaim the disk space for this model? Thanks in advance.
I think this policy will disable the automatic download of the model:
https://chromeenterprise.google/policies/#GenAILocalFoundati...
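On Linux, managed policies are plain JSON files dropped into Chrome's policy directory. A sketch: the policy name below is my expansion of the truncated link above, so verify it against the policy page before relying on it (value 1 should mean "do not download"):

```shell
# Sketch: disable the model download via a managed policy on Linux.
# The policy name is an assumption based on the (truncated) link above;
# check chromeenterprise.google for the exact name and values.
# For a real deployment, run as root with:
#   POLICY_DIR=/etc/opt/chrome/policies/managed
POLICY_DIR="${POLICY_DIR:-$(mktemp -d)}"
mkdir -p "$POLICY_DIR"
cat > "$POLICY_DIR/disable-on-device-model.json" <<'EOF'
{
  "GenAiLocalFoundationalModelSettings": 1
}
EOF
echo "wrote $POLICY_DIR/disable-on-device-model.json"
```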
The prompt API can be tested here: https://chrome.dev/web-ai-demos/prompt-api-playground/
It would be really helpful if there was a way to download the model to a central location, so multiple users on a single system could easily share it.
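One untested workaround sketch: if Chrome tolerates a symlinked model directory (an assumption I have not verified, and the shared copy would have to stay in sync with the version Chrome expects), a machine-wide copy could be shared like this:

```shell
# UNTESTED sketch: share one model copy across users via a symlink.
# Assumes Chrome follows symlinks for OptGuideOnDeviceModel (unverified).
SHARED="${SHARED:-/opt/chrome-shared-model}"
PROFILE="${PROFILE:-$HOME/.config/google-chrome}"

# One time, as root, after one user has downloaded the model:
#   mv "$PROFILE/OptGuideOnDeviceModel" "$SHARED" && chmod -R a+rX "$SHARED"

# Per user: replace the per-profile copy with a link to the shared one.
mkdir -p "$PROFILE"
rm -rf "$PROFILE/OptGuideOnDeviceModel"
ln -s "$SHARED" "$PROFILE/OptGuideOnDeviceModel"
ls -ld "$PROFILE/OptGuideOnDeviceModel"
```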
What’s wrong with shipping a local LLM? This is quite nice IMO. Is there a privacy concern with running it locally? I already have a few web-based games I wrote using this, and it’s quite nice to not need a server to run my game in pure HTML from my file system.
I use Brave. Firefox doesn't work in my qemu VM with (non-passthrough) hardware acceleration; it just crashes the VM.
Brave has always just worked for me and seems light on memory usage. Dunno why anyone would use Chrome.
Chrome also silently installs a powerful relational database engine without warning or consent.
All of your history, trivially searchable. Imagine the waste heat generated by the browser bar conducting thousands of non-consensual searches every time you type.
It's funny how they steal 4gb of local storage but also will sell you cloud storage when you run low on space.
OK, I rarely use Chrome (I like Safari and only ever open Chrome on the increasingly rare occasions when a site doesn’t work in Safari and lately it’s turned out that the site is just broken) but looking at the article and the comments here, I can’t figure out where this 4GB is supposed to be stored. None of the likely cases panned out when I looked.
That's timely. I had been thinking of trying Chrome out again, but it looks like it's in my interests to remain fully de-Googled.
On one level, I can't figure out how bent out of shape to get over this (but read on). Software I use downloads updates all the time, adds new features all the time, and I mostly don't ask for any of it.
So if you see this as just a new feature that provides some on-device AI, it's a bit, so what? A new feature? The last GT7 or Flight Sim patch was bigger than this, what's the big deal, etc.
However, that's not really what's going on. In theory Chrome gives you a local LLM that can provide local AI-powered features. In practice, everything gets sent to the cloud anyway, so the local LLM seems mostly to exist as a disguise for that, which is shady AF.
As others have pointed out, the solution is https://www.firefox.com/. And whilst it's been trendy on HN for several years to slag off Firefox and Mozilla, I went back to Firefox as my daily driver several years ago, and Chrome's high-handed enforcement of Manifest V3 extensions (meaning no full fat uBlock Origin) has only served to cement that decision.
It's mostly been great. The only downside is that some sites don't work properly on Firefox, and I'm 99.999% sure that's not Firefox's fault.
For example, Paypal's post-login verification step breaks so every time I want to buy something using Paypal I have to switch to Chrome. And, no, disabling uBlock Origin and other extensions on Paypal doesn't help - I've done this already. Seriously, Paypal, it's been months: will you please just fix signing in and paying on Firefox, please?
And many sites will assume you're a bot first and ask questions later if you hit them with anything other than Chrome or Safari... which is also extremely lame and scummy.
Wait for Ladybird to come out; it'll bury all the company-controlled browsers.
I’m guessing there’s some UX metric out there, that if they pre-load downloading the model, the user is more likely to stick with trying things out; rather than have them wait for a hefty download to complete.
The future is local models. This makes sense and I wouldn't be surprised if future web standards require this to be swappable so that you can use a model of your choice as the intelligence in the various APIs. Being able to use summarization and text extraction locally will be a powerful enabler. Apple's ability to copy text out of photos etc. is really useful.
Chrome has no moat and is always evil. I advocate against it whenever it comes up.
This is total flamebait
They do this so they don't have to run the model on Google servers and then face claims of "Google spies on Chrome users and uploads all their data to Google servers, including private DMs".
I'm not a fan of this being downloaded by default. Still, I very much prefer that, if something in Chrome uses an LLM, it's done via a local LLM rather than via an API call.
https://archive.ph/vhTfm