"Heinessen, the 4chan archiver" was the first website to automatically download 4chan /mlp/ images somewhat consistently for months. Copied: . torrent? related to 221.89-GiB torrent magnet:?xt=urn:btih:79760b305316b095d30e7bdfa49b97d49f236618&dn=heinessen-mlp-images.tar -> an extracted version with extra data . where? zb, offline, in MFS but not pinned /ipfs/QmcX2DkjkkXmCRBhGn6swbHrQKoY1WT1oXqSRVRYGTqYeY <- "ipfs add -wrH --raw-leaves --chunker=size-1048576 [folders]; [...]" . size? 238,142,756,028-B folder /ipfs/QmVVnfdiiTP5EoZvzJtBQjJPopwVszRwouKj49kU7SdaMK - took about >24 hours due to problems and it contains 635,770 files total . note? See below.
Extra data:
. "heinessen-mlp-images_hashes_via.txt" = "readme file"
. "heinessen-mlp-images_hashes.txt.zst" = precomputed data from months ago (which took days to compute); it contains data such as MD5 hashes, which are relevant to Desuarchive /mlp/. I recently used it to check the file count:
$ zstd -cd /z0/data/1735f69/heinessen-mlp-images_hashes.txt.zst | head -n5499000 | grep "File: " | vim -
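A more direct way to get the count, assuming (as the grep above implies) that each file in the listing is introduced by a "File: " line:

# grep -c prints the number of matching lines instead of the lines themselves:
$ zstd -cd /z0/data/1735f69/heinessen-mlp-images_hashes.txt.zst | grep -c "File: "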