Anon 12/16/2023 (Sat) 21:19 No.9065 del
>>9062
So with saving wikis there are basically two considerations:
- downloading Web ARChive (WARC) files, which contain all of the web pages/files from the wiki
- downloading XML data for everything, so the resulting .xml file(s) can work as a drop-in import for any MediaWiki install anyone is running; that XML export format is standard across MediaWiki-based wikis like Wikipedia, Fand*m, etc. (see the sketch right after this list)
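
For the XML route, MediaWiki itself exposes Special:Export, and the importDump.php maintenance script on the receiving install takes the result as-is; a minimal sketch (wiki.example.org and the page title are placeholders, and this grabs just one page rather than everything):
# grab one page as standard MediaWiki export XML:
curl -o Main_Page.xml 'https://wiki.example.org/wiki/Special:Export/Main_Page'
# then, on whatever MediaWiki install you want to load it into:
php maintenance/importDump.php Main_Page.xml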

>While I have some space constraints, I will check it out.
By default, grab-site writes .warc.gz files, which are gzip-compressed (level 9/max compression, IIRC). wget with its WARC-writing options writes the raw files plus the WARC file, so it uses more storage space than grab-site. "Raws" here means the ordinary non-WARC files downloaded over HTTP, saved to disk as-is.
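
Rough sketch of the two invocations (the URL is a placeholder, and the wget flags besides --warc-file are just typical mirroring options):
# wget: writes the raw file tree AND wiki-dump.warc.gz, so roughly double the disk usage
wget --mirror --page-requisites --warc-file=wiki-dump 'https://wiki.example.org/'
# grab-site: writes gzipped WARCs (plus logs) into a per-crawl directory, no separate raws
grab-site 'https://wiki.example.org/'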

>>9064
Something broke that link (spaces got inserted into it):
>https://archive.4plebs.org/pol/search/text/ %22 my%20little%20pony %22 /page/2/
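
If it's just the inserted spaces, stripping them recovers the working URL; a one-liner sketch:
printf '%s\n' 'https://archive.4plebs.org/pol/search/text/ %22 my%20little%20pony %22 /page/2/' | tr -d ' '
# -> https://archive.4plebs.org/pol/search/text/%22my%20little%20pony%22/page/2/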

>!!
Does history expansion (bash expands !! before the line is even parsed):
>wrapf() { !!; } # [1] !! = 14 commands = one screenful of commands
>wrapf; wrapf; wrapf; wrapf # [2]
History expansion happens only at [1], when the function is defined, so the expanded commands are baked into the function body; [2] then just runs those 14 commands 4 times over (56 command runs, which is indeed >28) without "messing anything up" anywhere.
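
A minimal interactive-bash sketch of the timing (echo hello stands in for the real screenful of commands):
$ echo hello                # the "previous command"
hello
$ wrapf() { !!; }           # [1] bash history-expands !! right here, at definition time
wrapf() { echo hello; }     # bash echoes the line after history expansion
$ wrapf; wrapf              # [2] plain function calls; nothing left to expand
hello
hello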