Hacker News

mootothemax · yesterday at 11:28 AM · 2 replies

> > BERT isn’t a SLM
>
> Huh? BERT is literally a language model that's small and uses attention.

Astute readers will note what’s been missed here.

Fascinating, really. Your confidently stated yet factually void comments I'd previously have put down to one of the classic programmer mindsets. Nowadays, though - where do I see that kind of thing most often? Curious.


Replies

ricericerice · yesterday at 12:07 PM

After some research, I think I understand what you're getting at here: BERT is a model for encoding text, but it isn't architecturally suited to generating text, which "LLMs" (the lack of a definition here is why you two are talking past each other), maybe more accurately referred to as GPT-style models, can do.
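
To make the distinction concrete, here's a minimal sketch (assuming the Hugging Face transformers library and the bert-base-uncased / gpt2 checkpoints, none of which come from the thread): BERT is trained as a masked-language-model encoder, so it fills in a blank using context on both sides, while GPT-2 is a causal decoder that predicts the next token and can therefore generate open-ended text.

    # Illustrative sketch, not from the thread: contrast BERT's fill-in-the-blank
    # objective with GPT-2's left-to-right generation.
    from transformers import pipeline

    # BERT: masked-LM encoder - predicts a hidden token from bidirectional context.
    fill = pipeline("fill-mask", model="bert-base-uncased")
    print(fill("Paris is the [MASK] of France.")[0]["token_str"])  # likely "capital"

    # GPT-2: causal decoder - predicts the next token, so it can generate free text.
    gen = pipeline("text-generation", model="gpt2")
    print(gen("Paris is the capital of", max_new_tokens=5)[0]["generated_text"])

You can bolt a decoding head onto BERT and coax text out of it, but that's not what it was trained for, which is presumably the "what's been missed" being hinted at.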

Also, the irony of your comment being itself confidently stated yet void of any content was not missed either - consider dropping the superiority complex next time.

krisoft · yesterday at 12:19 PM

> Astute readers will note what’s been missed here.

I’m not astute enough to see what was missed here. Could you explain?