
oceanplexian · yesterday at 3:38 AM

[flagged]


Replies

QuiDortDine · yesterday at 3:43 AM

Not sure why you're talking like OP pissed in your Cheerios. They're a victim of a broken system; it shouldn't be on them to spend more effort protecting their stuff from careless-to-malicious actors.

simonw · yesterday at 3:43 AM

A Varnish cache won't help you if you're running something like a code forge, where every commit has its own page - often more than one: there's the page for the commit itself, the page for "history from this commit", and a page for every file that existed in the repo at the time of that commit...

Then a poorly written crawler shows up and requests 10,000s of pages that haven't been requested recently enough to be in your cache.

I had to add a Cloudflare Captcha to the /search/ page of my blog because of my faceted search engine - which produces many thousands of unique URLs when you consider tags and dates and pagination and sort-by settings.

And that's despite me serving every page on my site through a 15 minute Cloudflare cache!

Static-only works fine for sites that have a limited number of pages. It doesn't work for sites that truly take advantage of the dynamic nature of the web.
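A rough back-of-the-envelope sketch of the faceted-search explosion described above, with entirely made-up facet counts (not the actual numbers for simonw's site): each combination of filters is a distinct URL a crawler can request, and almost none of them will be warm in a 15-minute cache.

```python
from itertools import product

# Hypothetical facet values for a /search/ page (illustrative numbers only).
tags = 200         # distinct tags
years = 20         # year filters
sort_orders = 2    # e.g. relevance vs. date
pages = 10         # pagination depth per result set

urls = {
    f"/search/?tag={t}&year={y}&sort={s}&page={p}"
    for t, y, s, p in product(range(tags), range(years),
                              range(sort_orders), range(pages))
}

# 200 * 20 * 2 * 10 = 80,000 unique, individually cacheable URLs
print(len(urls))
```

The point is the multiplication: even modest facet counts yield a URL space far larger than any cache's working set, so a crawler that walks every link is effectively guaranteed cache misses.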

aguacaterojo · yesterday at 3:45 AM

How would a LAMP stack help his git server?

anonnon · yesterday at 5:55 AM

Your post is pure victim-blaming, and it normalizes an exploitative state of affairs that was unheard of until just a few years ago: being aggressively DDoSed by poorly-behaved scrapers run by Big Tech that only take and never give back - unlike pre-AI search engines, which at least would send you traffic.