Hacker News

Lasang · today at 1:44 AM

The idea of exposing a structured crawl endpoint feels like a natural evolution of robots.txt and sitemaps.

If more sites provided explicit machine-readable entry points for crawlers, indexing could become a lot less wasteful. Right now crawlers spend a lot of effort rediscovering the same structure over and over.
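The existing pieces of this puzzle are already machine-readable. A minimal sketch (the URLs and rules are invented for illustration), using Python's standard-library robots.txt parser to read both the crawl rules and the `Sitemap` directive that points at the site's structure:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: crawl rules plus a Sitemap directive that
# acts as an explicit entry point into the site's structure.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Sitemap: https://example.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

print(rp.can_fetch("*", "https://example.com/private/page"))  # False
print(rp.site_maps())  # ['https://example.com/sitemap.xml']
```

A structured crawl endpoint would extend this idea: instead of only listing what is off-limits, the site advertises where the content actually lives.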

It also raises interesting questions about whether sites will eventually provide different views for humans vs. automated agents in a more formalized way.


Replies

_heimdall · today at 1:53 AM

I expect that if we still used REST, indexing would be even less wasteful.

I've found myself falling pretty hard on the side of making APIs work for humans and expecting LLM providers to optimize around that. I don't need an MCP for a CLI tool, for example; I just need a good man page or `--help` documentation.

berkes · today at 7:51 AM

I know that in practice this is no longer the case, if it ever was.

But semantic HTML is exactly that explicit machine-readable entry point. I am firmly entrenched in the opinion that HTML and the DOM are for machines to read; they just happen to be somewhat understandable to some humans as well. Take an average webpage and look at all the characters (bytes) in there: often two thirds will never be shown to humans.

Point being: we don't need to invent something new. We just need to realize we already have it and use it correctly. Other than requiring a better understanding of web tech, this has no downsides. The low-hanging fruit is the frameworks out there, which should really do a better job of leveraging semantics in their output.
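The point about semantic HTML being the machine-readable entry point can be illustrated with a toy sketch (the page and tag list are made up): a generic parser can pull out a page's structure from semantic tags alone, with no site-specific scraping rules.

```python
from html.parser import HTMLParser

# Hypothetical page: semantic tags carry the structure a crawler needs.
PAGE = """
<article>
  <h1>Structured crawling</h1>
  <time datetime="2024-05-01">May 1, 2024</time>
  <p>Body text.</p>
</article>
"""

class OutlineParser(HTMLParser):
    """Collect the semantic landmarks a generic crawler cares about."""

    def __init__(self):
        super().__init__()
        self.landmarks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("article", "h1", "time", "nav", "main"):
            self.landmarks.append((tag, dict(attrs)))

parser = OutlineParser()
parser.feed(PAGE)
print(parser.landmarks)
# [('article', {}), ('h1', {}), ('time', {'datetime': '2024-05-01'})]
```

The same page built from anonymous `<div>`s would give a crawler nothing to latch onto, which is the framework criticism in a nutshell.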

PeterStuer · today at 7:38 AM

The only ones benefiting from 'wasteful' crawling are the anti-bot solution vendors. Everyone else is incentivized to crawl as efficiently as possible.

Makes you think, right?

catlifeonmars · today at 1:48 AM

> It also raises interesting questions about whether sites will eventually provide different views for humans vs. automated agents in a more formalized way.

This raises a further question about whether it would exacerbate supply-chain injection attacks: show the innocuous page to the human, another to the bot.

pocksuppet · today at 3:32 AM

Apart from the obvious problem: presenting something different to crawlers than to humans.

threwaway035 · today at 1:24 PM

Isn't this already covered by sitemaps and sitemap index files, which are machine-readable XML?
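For concreteness, a minimal sketch of what that XML gives a crawler (the URL and date are invented): each entry pairs a location with change metadata, so a crawler can skip pages it has already seen.

```python
import xml.etree.ElementTree as ET

# Hypothetical sitemap with a single entry.
SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/post-1</loc>
    <lastmod>2024-05-01</lastmod>
  </url>
</urlset>
"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(SITEMAP)

# Extract (URL, last-modified) pairs a crawler could diff against its index.
entries = [
    (u.findtext("sm:loc", namespaces=NS), u.findtext("sm:lastmod", namespaces=NS))
    for u in root.findall("sm:url", NS)
]
print(entries)  # [('https://example.com/post-1', '2024-05-01')]
```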

rglover · today at 2:14 AM

I just use a query param to toggle to markdown/text when `?llm=true` is set on a route. An easy pattern that's opt-in.
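The opt-in pattern described here can be sketched as follows (the route, page bodies, and function name are hypothetical): the same route serves a markdown rendering only when the query string asks for it.

```python
from urllib.parse import parse_qs, urlparse

# Hypothetical renderings of the same page.
HTML_VIEW = "<article><h1>Post</h1><p>Hello</p></article>"
MARKDOWN_VIEW = "# Post\n\nHello"

def render_for(url: str) -> str:
    """Serve the markdown view iff the query string opts in with llm=true."""
    params = parse_qs(urlparse(url).query)
    if params.get("llm", ["false"])[0].lower() == "true":
        return MARKDOWN_VIEW
    return HTML_VIEW

print(render_for("/post?llm=true"))  # markdown view
print(render_for("/post"))           # regular HTML view
```

Because the default is the HTML view, nothing changes for human visitors; agents have to explicitly ask for the lean representation.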

pdntspa · today at 2:43 AM

They already do...

A lot of known crawlers will get a crawler-optimized version of the page
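A common way this is done is simple User-Agent branching. An illustrative sketch (the crawler names are examples, not an authoritative list, and the function names are made up):

```python
# Hypothetical list of crawler tokens a site might recognize.
KNOWN_CRAWLERS = ("Googlebot", "Bingbot", "GPTBot")

def is_known_crawler(user_agent: str) -> bool:
    """Naive substring check against known crawler User-Agent tokens."""
    return any(bot in user_agent for bot in KNOWN_CRAWLERS)

def select_view(user_agent: str) -> str:
    """Pick which rendering of the page to serve."""
    return "crawler-optimized" if is_known_crawler(user_agent) else "full"

print(select_view("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # crawler-optimized
print(select_view("Mozilla/5.0 (Windows NT 10.0)"))            # full
```

This is also exactly the mechanism the cloaking concern upthread is about: nothing stops the two views from diverging in content, not just in formatting.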
