Ask yourself, why would a scraper DDoS? Why would a DDoS-protection vendor DDoS?
The number of git forges behind Anubis et al., and the numerous public announcements, should be enough evidence.
Scrapers seem to be exceedingly careless with public resources. The problem is often not even DDoS (as in overwhelming bandwidth usage) but rather DoS through excessive hits on expensive routes.
Ask yourself, why would everyone except you say that they do?
> Ask yourself, why would a scraper DDoS?
No need to ask anything, I can tell you exactly: because they have no regard for anything but their own profit.
Let me give you an example involving this mom-and-pop shop known as Anthropic.
You see, they have this thing called ClaudeBot, and at least initially it scraped by iterating through IPs.
Now, you have these things called shared hosting servers, typically running 1,000-10,000 domains of actual low-volume websites on 1-50 or so IPs.
Guess what happens when it's your network's turn to bend over? The whole hosting company's infrastructure goes down, as each server has hundreds of ClaudeBots crawling hundreds of vhosts at the same time.
This went on for months. It's the reason they are banned in WAFs by half the hosting industry.
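For what it's worth, many of those WAF "bans" amount to little more than a User-Agent match at the edge. A minimal sketch of what that looks like in nginx (assuming ClaudeBot still sends its advertised User-Agent string; the substring match here is illustrative, not Anthropic's exact token):

```nginx
# Hypothetical example: reject requests whose User-Agent
# contains "claudebot" (case-insensitive) before they reach
# any of the vhosts on this box.
if ($http_user_agent ~* "claudebot") {
    return 403;
}
```

Of course this only works against crawlers that identify themselves honestly; a scraper that spoofs a browser User-Agent sails right past it, which is part of why tools like Anubis exist at all.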
Because the scraper is either impatient, careless, or indifferent; and if they scrape for training data, they don't plan to come back. If they don't plan to come back, they don't care if you tighten up crawling protections after they have moved on. In fact, they are probably happy that they got their data and their competition won't.