Hacker News

jcgrillo today at 1:43 AM

> people make stuff to make stuff. There don't need to be product use cases.

OK, great! So it doesn't need to be a commercial product. But does it do something (anything?) interesting? I'm interested in your games example; I'd love to see it done in real life. IIUC, game AIs are actually much more constrained and predictable for playability reasons. If you let them go fully free-form, a plurality of players have a "WTF??!?" experience, which is super Not Good.


Replies

digdugdirk today at 2:13 AM

It doesn't have to do anything interesting - it's completely fascinating all on its own. If you understand anything about the math and science behind LLMs, you'll understand that this is an achievement worth sharing with a community like HN.

That being said, small models like these have plenty of use cases. They allow for extra "slack" to be introduced into a programmatic workflow in a compute-constrained environment. Something like this could help enable the "ever present" phone assistant without scraping all your personal data and sending it off to Google/OpenAI/etc. Imagine if keywords in a chat triggered searches over your local data, pulling relevant notes/emails/documents into a cache, and that cache directly powered your autocomplete (or just a sidebar that pops up with the most relevant information). Having flexible function calling in that loop is key for fault tolerance and adaptability to new content and contexts.
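To make the idea concrete, here's a minimal sketch of that loop in Python. Everything here is hypothetical illustration, not any real assistant's API: `LOCAL_DOCS` stands in for an on-device document index, `refresh_cache` is the keyword-triggered search step, and `suggest` is the cache-backed autocomplete. A real system would use an LLM or embedding search instead of substring matching.

```python
import re

# Hypothetical on-device document store; in practice this would be an
# index over your local notes/emails/documents.
LOCAL_DOCS = {
    "meeting notes": "Q3 planning meeting: budget review on Friday",
    "travel email": "Flight to Berlin departs 9:40 AM, gate B12",
    "recipe": "Sourdough: 500g flour, 350g water, 10g salt",
}

def extract_keywords(chat_text, min_len=4):
    """Naive keyword extraction: lowercase words above a length cutoff."""
    return {w for w in re.findall(r"[a-z]+", chat_text.lower())
            if len(w) >= min_len}

def refresh_cache(chat_text, docs):
    """Keyword-triggered search: cache docs that mention a chat keyword."""
    keywords = extract_keywords(chat_text)
    cache = {}
    for title, body in docs.items():
        haystack = (title + " " + body).lower()
        if any(kw in haystack for kw in keywords):
            cache[title] = body
    return cache

def suggest(prefix, cache):
    """Autocomplete: offer cached snippets matching the typed prefix."""
    p = prefix.lower()
    return [body for body in cache.values() if body.lower().startswith(p)]

cache = refresh_cache("when is my flight to berlin?", LOCAL_DOCS)
print(sorted(cache))             # ['travel email']
print(suggest("Flight", cache))  # ['Flight to Berlin departs 9:40 AM, gate B12']
```

The point of the small local model in this picture would be to replace the brittle `extract_keywords` step with something that tolerates paraphrase and new contexts, which is where the flexible function calling comes in.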

It's cool. Enjoy it.
