Hacker News

dur-randir · yesterday at 6:20 PM

How do I connect it to a local llama.cpp instance?


Replies

GodelNumbering · yesterday at 8:26 PM

It supports LMStudio, or you can start any local OpenAI-compatible endpoint and then run:

  OPENAI_COMPATIBLE_CUSTOM_KEY="xxx" dirac -y --provider "https://localhost/v1" --model <model_name> "hi..."
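For llama.cpp specifically, the usual way to get such an endpoint is `llama-server`, which ships with llama.cpp and exposes an OpenAI-compatible API under `/v1`. A minimal sketch (model path, port, and the placeholder key are assumptions; any non-empty key works for a local server that doesn't check auth):

  # Start llama.cpp's built-in OpenAI-compatible server
  # (path to the .gguf model file is an assumption — use your own)
  llama-server -m ./models/<model_name>.gguf --port 8080

  # Point the client at it; note plain http and the explicit port
  OPENAI_COMPATIBLE_CUSTOM_KEY="xxx" dirac -y --provider "http://localhost:8080/v1" --model <model_name> "hi..."

The `--provider` URL must include the `/v1` prefix, since that is where llama-server mounts the chat-completions route.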