The majority of the traffic on the web is from bots. For the most part, these bots are used to discover new content. These are RSS feed readers, search engines crawling your content, or, nowadays, AI bots.
The offense is severely disrupting other people's data processing that is of significant importance to them. Disrupting it by submitting malicious data requires intent to cause harm; physical destruction, deletion, etc. doesn't. This is about crashing people's payroll systems, DDoSing, and the like, not burning some CPU cycles and having a crawler subprocess crash with an OOM.
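To make the scale concrete: the payload being discussed is a gzip bomb, tiny on the wire but huge once inflated. A minimal sketch in Python (sizes are illustrative, not taken from the article):

    import gzip
    import io

    GIB = 1024 ** 3
    CHUNK = b"\0" * (1024 * 1024)           # 1 MiB of zeros per write

    buf = io.BytesIO()
    with gzip.GzipFile(fileobj=buf, mode="wb", compresslevel=9) as gz:
        for _ in range(GIB // len(CHUNK)):  # 1 GiB uncompressed in total
            gz.write(CHUNK)

    payload = buf.getvalue()
    # Zeros compress at roughly 1000:1 under deflate, so this is ~1 MiB.
    print(f"compressed size: {len(payload) / 1024 ** 2:.1f} MiB")

Served with a Content-Encoding: gzip header, a crawler that naively inflates the body into memory allocates the full gigabyte and may get OOM-killed. That's the "burning some cpu cycles" end of the spectrum, not sabotage of anything load-bearing.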
Why the hell would an ISP have a look at this? And even if one did, they're professional enough to detect zip bombs. Which, by the way, is why this whole thing is pointless anyway: if you classify requests as malicious, just don't serve them. If that's not enough, it's much more sensible to go the Anubis route and demand proof of work, since that catches crawlers that come from a gazillion IPs with different user agents and so on.
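For the curious, the proof-of-work idea is roughly this (a minimal sketch, not Anubis's actual protocol; the challenge format and difficulty here are made up): the server issues a random challenge, and the client must find a nonce such that SHA-256(challenge + nonce) starts with a fixed number of zero hex digits before it gets served content.

    import hashlib
    import os

    DIFFICULTY = 4  # leading zero hex digits; ~16**4 hashes expected per solve

    def make_challenge() -> str:
        # Server side: a fresh random challenge per visitor.
        return os.urandom(16).hex()

    def solve(challenge: str) -> int:
        # Client side: brute-force a nonce. Cheap for one human visitor,
        # expensive for a crawler fetching millions of pages.
        nonce = 0
        while not hashlib.sha256(f"{challenge}{nonce}".encode()) \
                .hexdigest().startswith("0" * DIFFICULTY):
            nonce += 1
        return nonce

    def verify(challenge: str, nonce: int) -> bool:
        # Server side: a single hash to check, regardless of difficulty.
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        return digest.startswith("0" * DIFFICULTY)

    challenge = make_challenge()
    nonce = solve(challenge)
    assert verify(challenge, nonce)

The asymmetry is the point: verification is one hash, solving takes tens of thousands, and the cost scales with the number of requests rather than with which IP or user agent the crawler happens to use.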