Hacker News

fastball · today at 6:37 AM · 0 replies

It is very ironic that this post comes from "The Privacy Guy", given that the whole point of this model is to run inference on your own device rather than sending queries to the cloud — which is also much less power-intensive than sending a query to OpenAI.