
apothegm yesterday at 3:47 PM

The JS processing and rendering time on an underpowered CPU is the issue, not the payload size. It’s difficult to describe how excruciatingly slow some seemingly simple e-commerce and content sites are to render on my 2019 laptop, how slowly they react to something as simple as a mouseover, or how they peg the CPU - while absolutely massively complex and large server-rendered HTML loads and renders in an eyeblink.


Replies

lelanthran yesterday at 7:12 PM

Which sites are you thinking of?

I can't really speak for those sites, or for why they are so slow doing things on the client, but like I said, I've written client-side processing and run it on my 2011 desktop, and there was no pegging of the CPU and no large latencies when filtering/sorting data client-side.

> while absolutely massively complex and large server-rendered HTML loads and renders in an eyeblink.

I've not had that experience - a full page refresh with about 10MB of data does not happen in an eyeblink. It takes about 6 seconds. There's a minimum amount of time for that data to download, regardless of whether it is pre-rendered into `<table>` elements or whether it is sent as JSON. Turning that JSON into a `<table>` on the client takes about 40ms on my 2011 desktop. Sorting it again takes about 5ms.

For this use-case (fairly large amounts of data), doing a full-page refresh each time the user sets a new sort criterion is unarguably a poorer experience than a bit of JS that goes through the table element and re-orders the `<tr>` elements.

In this case, using server-rendered HTML is always going to take about 6000ms whenever the user re-sorts the table. Using a client JS function takes 5ms. On a machine purchased in 2011.
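And roughly what "a bit of JS" looks like for the re-sort - again just a sketch, with the numeric-vs-text comparison being an assumption about the data:

```js
// Sketch: re-sort an existing <table> in place by re-ordering its <tr> elements.
// No refetch, no full-page refresh.
function sortTableByColumn(table, columnIndex, ascending = true) {
  const tbody = table.tBodies[0];
  const rows = Array.from(tbody.rows);
  rows.sort((a, b) => {
    const x = a.cells[columnIndex].textContent;
    const y = b.cells[columnIndex].textContent;
    const nx = parseFloat(x), ny = parseFloat(y);
    // Compare numerically when both cells parse as numbers, otherwise as text.
    const cmp = (!isNaN(nx) && !isNaN(ny)) ? nx - ny : x.localeCompare(y);
    return ascending ? cmp : -cmp;
  });
  // appendChild() on an existing node moves it, so this re-orders the rows
  // rather than copying them.
  for (const row of rows) tbody.appendChild(row);
}
```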
