They probably didn't even think about that. Admittedly, 4GB is quite big, but if I were in their shoes, I would have expected people to be thrilled about using a local LLM instead of sending data to a cloud-based LLM.
I am still stunned that there are people who hate AI so much that they have a problem with the weights of an LLM being on their computer. To me, that sounds rather esoteric.