Hacker News

ta20211004_1 · today at 12:42 PM · 2 replies

Couldn't agree more on point 5. I've repeatedly found that any really tricky programming problem is (eventually) solved by iterative refinement of the data structures (and the APIs they expose / are associated with). When you get it right, the control flow of the program becomes straightforward to reason about.

To address our favorite topic: while I use LLMs to assist on coding tasks a lot, I think they're very weak at this. Claude is much more likely to suggest or expand complex control flow logic on small data types than it is to recognize and implement an opportunity to encapsulate ideas in composable chunks. And I don't buy the idea that this doesn't matter since most code will be produced and consumed by LLMs. The LLMs of today are much more effective on code bases that have already been thoughtfully designed. So are humans. Why would that change?
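A toy sketch of what I mean (hypothetical state machine, not from any real codebase): the same logic written as branching control flow versus as a data structure. Once the transitions live in a table, the "control flow" collapses to a single lookup.

```python
# Tangled version: nested if/elif per (state, event) pair.
# Every new state or event grows the branching.
def next_state_tangled(state, event):
    if state == "idle":
        if event == "start":
            return "running"
    elif state == "running":
        if event == "pause":
            return "paused"
        elif event == "stop":
            return "idle"
    elif state == "paused":
        if event == "start":
            return "running"
        elif event == "stop":
            return "idle"
    return state  # unknown combinations: stay put

# Refined version: the same logic as a transition table.
# Adding behavior means adding data, not adding branches.
TRANSITIONS = {
    ("idle", "start"): "running",
    ("running", "pause"): "paused",
    ("running", "stop"): "idle",
    ("paused", "start"): "running",
    ("paused", "stop"): "idle",
}

def next_state(state, event):
    return TRANSITIONS.get((state, event), state)
```

The second version is also the one that's trivial to test, serialize, or diff against a spec, which is exactly the kind of "composable chunk" I find LLMs rarely reach for on their own.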


Replies

alain94040 · today at 4:03 PM

Agreed; in my experience, rule 5 should be rule 1. I think I also heard it said (paraphrased) as "show me your code and I'll be forever confused, show me your database schema and everything will become obvious".

Having implemented my share of highly complex, high-performance algorithms in the past, the key was always figuring out how to massage the raw data into structures that allow the algorithm to fly. It requires a decent knowledge of the various algorithm options you have, as well as the flexibility to see that the data could be presented a different way to get to the same result orders of magnitude faster.
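A small illustration of "presenting the data a different way" (invented problem and names, just a sketch): answering "how many pairs of readings are within EPS of each other?" by comparing every pair, versus sorting first so each reading only has to look at a window of neighbors.

```python
import bisect

EPS = 0.5  # assumed closeness threshold

def close_pairs_naive(readings):
    # O(n^2): compare every pair directly.
    return sum(
        1
        for i, a in enumerate(readings)
        for b in readings[i + 1:]
        if abs(a - b) <= EPS
    )

def close_pairs_sorted(readings):
    # O(n log n): sort once; in sorted order, every partner of
    # xs[i] within EPS sits in a contiguous run just after i.
    xs = sorted(readings)
    count = 0
    for i, a in enumerate(xs):
        j = bisect.bisect_right(xs, a + EPS, lo=i + 1)
        count += j - (i + 1)
    return count
```

Same answer, same inputs; the only thing that changed is the shape the data is held in while the algorithm runs.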

zer00eyz · today at 3:56 PM

> refinement of the data structures (and the APIs they expose / are associated with)

I think rule 5 is often ignored by a lot of distributed services, where you have to make several calls, each with its own HTTP, DB, and "security" overhead, when one would do. Then each of these ends up with a caching layer because they are "slow" (in aggregate).
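A back-of-the-envelope sketch of why the aggregate gets slow (the overhead numbers are assumptions for illustration, not measurements): per-call overhead is paid once per request, so a page that fans out into N small calls pays it N times, while a response shaped for the page pays it once.

```python
PER_CALL_OVERHEAD_MS = 30  # assumed: TLS + auth + routing per request
PER_ROW_QUERY_MS = 1       # assumed: actual data work per item

def page_load_chatty(n_widgets):
    # One request per widget: overhead paid n times.
    return n_widgets * (PER_CALL_OVERHEAD_MS + PER_ROW_QUERY_MS)

def page_load_batched(n_widgets):
    # One request returning data shaped for the page:
    # overhead paid once.
    return PER_CALL_OVERHEAD_MS + n_widgets * PER_ROW_QUERY_MS

# With 20 widgets: chatty pays 20x the fixed overhead, batched pays it once.
```

The usual "fix" is to cache each of the 20 calls, which treats the symptom; shaping the response data to match what the consumer actually needs removes the overhead at the source.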
