Anon 01/06/2024 (Sat) 14:44 No.9238 del
(160.58 KB 2048x1402 1702124719854.jpg)
(240.17 KB 2000x2000 1703712096538.jpg)
(118.52 KB 541x709 1704433250235.png)
>>9237
>>9234
www.ponychan.net-2024-01-06-6f431dbf finished at "2024-01-06T13:50:45.555092990Z".

>>9237
>archive.org cli program being dumb
Maybe GNU Screen would help. I looked at "ia upload --help" and saw this:
> -H, --header=<key:value>... ; S3 HTTP headers to send with your request.
"nerd shit"
> -c, --checksum ; Skip based on checksum. [default: False]
So -c skips the upload when a bit-identical file already exists at the same path? The help text doesn't make that clear.
> -R, --retries=<i> ; Number of times to retry request if S3 returns a 503 SlowDown error.
> -s, --sleep=<i> ; The amount of time to sleep between retries [default: 30]
Might need to use these.
> --no-backup ; Turn off archive.org backups. Clobbered files will not be saved to history/files/$key.~N~ [default: True].
Nice option I guess, but what I really need is a way to skip files that are already up with the same path and filesize (or same path + filesize + hash); rough sketch of that below.
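Something like this sketch is what I have in mind, using the internetarchive Python package (the library the ia CLI is built on). The identifier is just this job's name and the local directory is a placeholder, so treat it as a rough idea rather than a tested script.

```python
import hashlib
import os

from internetarchive import get_item

IDENTIFIER = "www.ponychan.net-2024-01-06-6f431dbf"  # this job's item name
LOCAL_DIR = "dump"  # placeholder: wherever the local copy lives

item = get_item(IDENTIFIER)
# Map remote path -> (size, md5) for everything already on the item.
remote = {f.name: (int(f.size or 0), f.md5) for f in item.get_files()}

def local_md5(path):
    """md5 of a local file, read in 1 MiB chunks."""
    h = hashlib.md5()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

to_upload = {}
for root, _dirs, names in os.walk(LOCAL_DIR):
    for name in names:
        path = os.path.join(root, name)
        key = os.path.relpath(path, LOCAL_DIR).replace(os.sep, "/")
        size = os.path.getsize(path)
        if key in remote and remote[key][0] == size:
            # Same path + filesize is already there; confirm the hash to be safe.
            if remote[key][1] == local_md5(path):
                continue
        to_upload[key] = path

if to_upload:
    # retries / retries_sleep mirror the -R / -s flags from `ia upload --help`.
    item.upload(to_upload, retries=10, retries_sleep=30, verbose=True)
```

If -c (checksum=True in the Python upload()) already does exactly this md5 comparison against what's uploaded, then the manual check is redundant, but the help text doesn't say, and it wouldn't cover the plain path+filesize case anyway.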
