Hacker News

JadeNB · 05/03/2025 · 4 replies

> The optimal solution would be using a template engine to generate static documents.

This helps the creator, but not the consumer, right? That is, if I visit 100 of your static documents created with a template engine, then I'll still be downloading some identical content 100 times.


Replies

giantrobot05/03/2025

XSLT solved this problem. But it had poor tool support (Dreamweaver etc.) and ran into a lot of anti-XML sentiment, I assume as blowback from capital-E Enterprise stacks going insane with XML for everything.

XSLT did exactly what HTML includes could do, and more. The user agent could cache stylesheets or, if it wanted, override a linked stylesheet (as with CSS) and transform the raw data any way it liked.
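
A minimal sketch of how that looked in practice (file names are placeholders). Each XML page ships only its unique data:

    <?xml version="1.0"?>
    <?xml-stylesheet type="text/xsl" href="/site.xsl"?>
    <page>
      <title>Article 42</title>
      <body>Only the unique content travels with each page.</body>
    </page>

while the shared chrome lives in the stylesheet, fetched and cached once:

    <?xml version="1.0"?>
    <xsl:stylesheet version="1.0"
        xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
      <xsl:template match="/page">
        <html>
          <head><title><xsl:value-of select="title"/></title></head>
          <body>
            <!-- shared header/footer live here, cached with the stylesheet -->
            <div id="header">Site header</div>
            <p><xsl:value-of select="body"/></p>
            <div id="footer">Site footer</div>
          </body>
        </html>
      </xsl:template>
    </xsl:stylesheet>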

iainmerrick · 05/04/2025

> I'll still be downloading some identical content 100 times.

That doesn't seem like a significant problem at all, on the consumer side.

What is this identical content across 100 different pages? Page header, footer, sidebar? The text content of those should be small relative to the unique page content, so who cares?

Usually most of the weight is images, scripts and CSS, and those don't need to be duplicated.

If the common text content is large for some reason, put the small dynamic part in an iframe, or swap it out with JavaScript (see the sketch below).

If anyone has a genuine example of a site where redundant HTML content across multiple pages caused significant bloat, I'd be interested to hear about it.
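
For the iframe/JavaScript variant above, a rough sketch (URLs and element IDs are made up): the heavy shared shell gets cached, and only the small unique fragment travels per page:

    <!-- shell.html: the large common chrome, cached by the browser -->
    <body>
      <header>...big shared header...</header>

      <!-- option 1: unique content in an iframe -->
      <iframe src="/articles/42.html"></iframe>

      <!-- option 2: swap it in with a script -->
      <div id="content"></div>
      <script>
        fetch('/articles/42.html')
          .then(function (r) { return r.text(); })
          .then(function (html) {
            document.getElementById('content').innerHTML = html;
          });
      </script>
    </body>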

Klaster_1 · 05/04/2025

Compression Dictionary Transport [0] seems like it could address this. If you squint, it looks almost like XSLT.

[0] https://developer.mozilla.org/en-US/docs/Web/HTTP/Guides/Com...
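
Roughly, the header flow looks like this (paths and the hash are placeholders; see the MDN page above for the real details):

    # A response nominates itself as a dictionary for matching URLs
    GET /app/page-1.html HTTP/1.1

    HTTP/1.1 200 OK
    Use-As-Dictionary: match="/app/*"

    # Later requests advertise the cached dictionary by hash...
    GET /app/page-2.html HTTP/1.1
    Accept-Encoding: gzip, br, zstd, dcb, dcz
    Available-Dictionary: :<base64 SHA-256 of the dictionary>:

    # ...and the server sends a response delta-compressed against it
    HTTP/1.1 200 OK
    Content-Encoding: dcb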

codr7 · 05/03/2025

True for any server-side solution, yes.

On the other hand it means less work for the client, which is a pretty big deal on mobile.