Quote:
Originally Posted by Dgc2002
The amount of traffic generated by the previous version of this PigParse feature is probably a blip on the radar compared to modern scrapers/bots. Without something like CloudFlare between your site and the open internet you'll be hammered all day by indiscriminate bots.
I would honestly be surprised if traffic volume is really taking down the Wiki. I just tested with a mirror of the wiki and each mob/item page is a few hundred kB which is nothing.
The devil is in the details. For instance, even 0 kB requests require server effort, and enough of them can take down a server.
Similarly, overloading is less about total traffic and more about many requests arriving at the same time. Wiki servers are web servers: they're designed to handle a certain number of concurrent requests, so overloading one is about making more requests than it can handle at once.
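To illustrate the point about concurrency, here's a toy back-of-the-envelope capacity model. All of the numbers (worker count, service time, request rates) are made up for illustration; real wiki server capacity would depend on hardware, caching, and the MediaWiki setup.

```python
# Toy capacity model for a web server with a fixed worker pool.
# Every number here is a hypothetical example, not a measurement.

workers = 50          # concurrent requests the server can process
service_time = 0.2    # seconds to serve one page (even a small one costs CPU)

# Sustainable throughput in requests per second:
capacity = workers / service_time
print(capacity)  # 250.0 req/s

# A guild of 60 players whose parser polls the same mob page every 2 s:
guild_rate = 60 / 2   # 30 req/s -- well under capacity on its own
print(guild_rate < capacity)  # True

# An indiscriminate scraper hitting pages with no delay:
scraper_rate = 500    # req/s -- alone exceeds capacity, so requests queue
print(scraper_rate > capacity)  # True: queue grows, pages slow, then time out
```

The takeaway matches the argument above: a tool used by one guild is a modest, bursty load, while a bot that ignores rate limits can exceed capacity by itself, and once the request rate exceeds what the worker pool can drain, the backlog grows without bound.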
It does seem unlikely to me that a tool (even a horribly written one that makes tons of requests) would consistently, over time, overload the wiki ... but if most of a guild uses it and they all spam-request the same raid mob page, that could certainly slow or crash it briefly.
I'd suspect the bigger issue is AI agents, which are overloading servers all over the Internet right now. But like I said, the devil's in the details, and since I don't have access to the wiki server logs, I don't have those details ... so this is just speculation.