A thought experiment to illustrate a question I have for those more knowledgeable:
Pretend I am the administrator of Elfwiki, which runs on MediaWiki. I want to back up the entire site, the totality of its pages and all their dependencies (tables, formatting, images) created by users, into a single container (e.g. a .zip file) at the present moment: time zero.
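For context on what such a container would hold: MediaWiki ships maintenance scripts (e.g. `maintenance/dumpBackup.php --full` for the page XML, plus a copy of the `images/` directory) that produce the raw material. As a minimal sketch of the packaging step only, using a hypothetical `build_backup` helper run over whatever dump directory those scripts produced, and embedding a SHA-256 manifest so mirrors can later be checked:

```python
import hashlib
import json
import zipfile
from pathlib import Path

def build_backup(dump_dir: str, out_zip: str) -> dict:
    """Package a dump directory (e.g. the XML dump plus the images/
    folder produced by MediaWiki's maintenance scripts) into a single
    zip, recording a SHA-256 digest per file in a MANIFEST.json so
    the container's contents can be verified later."""
    manifest = {}
    root = Path(dump_dir)
    with zipfile.ZipFile(out_zip, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in sorted(root.rglob("*")):
            if path.is_file():
                rel = path.relative_to(root).as_posix()
                manifest[rel] = hashlib.sha256(path.read_bytes()).hexdigest()
                zf.write(path, rel)
        zf.writestr("MANIFEST.json", json.dumps(manifest, indent=2))
    return manifest
```

The manifest is the piece that makes the later goals (deltas, validation) tractable: it gives every release a precise, comparable fingerprint of its contents.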
The stated goal at time zero is that visitors to Elfwiki.com could download this backup container and, with a simple FOSS toolchain, host mirrors of the wiki themselves, which would then be indexed by Elfwiki's servers.
The stated goal at times beyond time zero is that, at intervals, administrators can release supplemental container files consisting only of the changes made to pages and dependencies (necessarily smaller than a full backup), and that those hosting mirrors, still using a FOSS toolchain, can apply these small, periodic updates to their mirrors.
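One generic way to derive "only the changes", without relying on any MediaWiki-specific incremental dump feature, is to diff the manifests of two successive snapshots: everything added or modified goes into the supplemental container, and deletions are listed for mirrors to remove. A sketch, assuming each release carries a manifest mapping file paths to SHA-256 digests, with `diff_manifests` as a hypothetical helper:

```python
def diff_manifests(old: dict, new: dict) -> dict:
    """Given two manifests (file path -> SHA-256 digest) from
    successive releases, return what a supplemental container must
    carry: files that are new or changed, plus paths to delete."""
    changed = {path: digest for path, digest in new.items()
               if old.get(path) != digest}
    deleted = [path for path in old if path not in new]
    return {"changed": changed, "deleted": deleted}
```

A mirror applying the update would extract the changed files over its tree and remove the deleted paths; its manifest then equals the new official one, so updates compose across intervals.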
In a perfect scenario, all release images would be cryptographically signed by the official source, and mirrors would be validated and indexed automatically, without human intervention. Further, in the case of an outage on Elfwiki's primary host/server, there would be a means to seamlessly and automatically redirect visitors from the wiki's URL (which points to the primary server during uptime) to an online, validated mirror.
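The signing step itself would normally use an existing tool (e.g. a detached GnuPG signature via `gpg --detach-sign` over the container or its manifest); what a mirror can then automate is checking its local files against that signed manifest. A minimal validation sketch, assuming a manifest of path-to-SHA-256 digests and a hypothetical `verify_mirror` helper:

```python
import hashlib
from pathlib import Path

def verify_mirror(mirror_dir: str, manifest: dict) -> list:
    """Return the paths that fail validation: files missing from the
    mirror or whose SHA-256 digest differs from the released,
    (separately signature-checked) manifest. An empty list means the
    mirror matches the official release byte for byte."""
    root = Path(mirror_dir)
    failures = []
    for rel, digest in manifest.items():
        path = root / rel
        if (not path.is_file()
                or hashlib.sha256(path.read_bytes()).hexdigest() != digest):
            failures.append(rel)
    return failures
```

The indexing server could require a clean `verify_mirror` result (reported or re-checked remotely) before listing a mirror; the DNS-level failover part is a separate problem, usually handled outside the wiki software by health-checked DNS or a reverse proxy.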
Is there an existing, available codebase whose primary stated design purpose would let me achieve all of these goals? If not, how can I achieve them using FOSS tools?