Hacker News

anonymous908213 · yesterday at 3:43 PM

We were not a ChatGPT wrapper; we used a fine-tuned open-source model running on our own hardware, so we naturally had full control of the input parameters. I apologize if my language was ambiguous, but by "expose seeds" I simply meant that users can see the seed used for each prompt and enter their own in the UI, rather than "exposing secrets" of the frontier LLM APIs, if that's what you took it to mean.
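For concreteness, per-request seed control looks roughly like the sketch below with a self-hosted open-source model via Hugging Face transformers. The model name, sampling parameters, and helper function are illustrative, not our exact setup:

    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    MODEL_ID = "mistralai/Mistral-7B-Instruct-v0.2"  # placeholder model, not our actual one

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.float16, device_map="auto"
    )

    def generate(prompt: str, seed: int, temperature: float = 0.7) -> str:
        # Fix the RNG so the same (prompt, seed, sampling params) reproduces
        # the same output; this seed is what gets surfaced in the UI.
        torch.manual_seed(seed)
        inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
        output = model.generate(
            **inputs,
            do_sample=True,
            temperature=temperature,
            top_p=0.95,
            max_new_tokens=256,
        )
        return tokenizer.decode(output[0], skip_special_tokens=True)

Even with a fixed seed, bit-exact reproducibility also depends on keeping the same weights, software versions, and GPU kernels, which is much easier to guarantee when you run the hardware yourself.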


Replies

x_may · yesterday at 5:41 PM

I just wanted deterministic outputs and was curious how you were doing it. Sounds like it was probably temp = 0, which major providers no longer offer. Thanks for your response.
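For reference, what I had in mind was greedy decoding (do_sample=False), the effective temperature = 0 case, which needs no seed at all since the output depends only on the prompt and the weights. A minimal sketch, with a placeholder model name:

    from transformers import pipeline

    # Greedy decoding: no sampling, so repeated calls with the same prompt
    # and the same weights give the same output.
    generator = pipeline("text-generation", model="mistralai/Mistral-7B-Instruct-v0.2")
    print(generator("Hello", do_sample=False, max_new_tokens=64)[0]["generated_text"])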
