Hacker News

2ndorderthought, today at 12:39 PM

Ollama is definitely not the way to go once your interest shifts from "how quickly can I run a new LLM" to "how do I use a local LLM in a remotely optimal way".