Hacker News

pphysch · yesterday at 2:48 PM

Big business LLMs even have the opposite incentive, to churn as many tokens as possible.


Replies

jjk7 · yesterday at 8:25 PM

At least token count is a rough proxy for 'thinking'... I wouldn't mind if it burned 100k tokens to output a one-line change that fixes a bug.

The problem is treating code generated per token spent as the thing to maximize. That model of "efficiency" is fundamentally broken.