Hacker News

endymion-light · yesterday at 1:55 PM

Exponential take-off is great until it stops. Genuinely, what signals show that any of the large models are achieving exponential take-off and recursive self-improvement?

Currently, a lot of that appears to be marketing hype to drive up usage. Is the progress exponential, or are the labs spending exponentially more for smaller and smaller gains from LLMs?