Hacker News

impulser_ today at 3:48 AM

People really need to stop assuming that more training data is always better. That's not how it works. LLMs thrive on consistency.

Go, for example, has significantly less training data than Python, but LLMs are at their best with it. Why? Go is almost always written the same way. You go from project to project and the code all looks the same. There are only a few ways to write Go.