Anon 06/29/2024 (Sat) 17:36 No.10574 del
>>10573
Next step would be collecting a bunch of statuses/tweets from an account. One way to do that: manually look through https://x.com/GigiKittyDays/with_replies or https://x.com/GigiKittyDays ("167 posts") and copy the URLs. I ran this:
$ echo -n "GigiKittyDays/status/1168989468088557574 LemonFoDrizzle/status/1764052936471011657 GigiKittyDays/status/1789028236250251369 GigiKittyDays/status/1792249338283880450 GigiKittyDays/status/1792547776854630441 GigiKittyDays/status/1792665397138993607 GigiKittyDays/status/1793336889476370524 GigiKittyDays/status/1799469894905577882 GigiKittyDays/status/1800510963491479989 RAR1JACK/status/1800690674003845489 GigiKittyDays/status/1804973546919063569 GigiKittyDays/status/1794704287823757758 GigiKittyDays/status/1802422677589700859 GigiKittyDays/status/1805935560680153112 GigiKittyDays/status/1804921129145864413 GigiKittyDays/status/1806773072818524538 GigiKittyDays/status/1806740746482979204 GigiKittyDays/status/1807080594586685618 taigatism/status/1807079106023977078 GigiKittyDays/status/1807089899100598410 CandiCornArt/status/1807076088117407800 JadeTheElf/status/1806150235707551752 GigiKittyDays/status/1806700983465955801" | xargs -d " " sh -c 'for args do gallery-dl --write-pages --write-metadata --write-info-json https://twitter.com/$args; done' _ # skipped tweets where I didn't feel like downloading them or their images.
[twitter][info] No results for https://twitter.com/GigiKittyDays/status/1168989468088557574
./gallery-dl/twitter/LemonFoDrizzle/1764052936471011657_1.jpg
[twitter][info] No results for https://twitter.com/GigiKittyDays/status/1789028236250251369
./gallery-dl/twitter/GigiKittyDays/1792249338283880450_1.jpg
./gallery-dl/twitter/GigiKittyDays/1792249338283880450_2.jpg
[...]
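Pasting every status path into one echo gets unwieldy. A sketch of the same loop reading the paths from a file instead (the filename tweets.txt and the two IDs are just examples lifted from above; the echo in front of gallery-dl makes it a dry run, drop it to actually download):

```shell
# Keep one "user/status/id" path per line in a file, then feed each line
# to gallery-dl via xargs. -a reads from a file (GNU xargs), -I{} runs
# one command per line with {} replaced by that line.
printf '%s\n' \
  "GigiKittyDays/status/1792249338283880450" \
  "GigiKittyDays/status/1792547776854630441" > tweets.txt
xargs -a tweets.txt -I{} \
  echo gallery-dl --write-pages --write-metadata --write-info-json "https://twitter.com/{}"
```

One line per tweet also makes it easy to comment tweets out or diff the list later, which the single space-separated echo string doesn't.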
Does gallery-dl download tweets without images?
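By default it seems to only grab media, which would explain the "No results" lines above for text-only tweets. If I'm reading the gallery-dl docs right, the twitter extractor has a "text-tweets" option that makes tweets without images show up too (it only does anything in combination with metadata options like --write-metadata, if I recall correctly). Untested sketch for gallery-dl.conf:

```json
{
    "extractor": {
        "twitter": {
            "text-tweets": true
        }
    }
}
```

Or as a one-off on the command line: gallery-dl -o text-tweets=true --write-metadata ...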

>>10554
>>10566
>wget
GNU Wget resolves filename collisions by appending a numeric suffix: downloading "file.txt" a second time into the same folder produces "file.txt.1". That isn't great when you download hundreds of different files which all share the same filename, and the appended suffixes look ugly next to this dump, where filenames were made unique with vim commands:
https://bafybeibgtzzcb4gb362wz6n3znyiqkkw6amgovfifje4oslpbus4tgmnvq.ipfs2.eth.limo/10k_drop_1_dedup/3745/
The third download of "file.txt" to the same dir becomes "file.txt.2", and Wget keeps counting up from there.
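One way around the suffixes without vim: pick the output name yourself with wget -O, e.g. by prefixing a counter. Sketch with made-up URLs; the echo makes it a dry run, remove it to actually fetch:

```shell
# Avoid Wget's file.txt.1 / file.txt.2 suffixes by choosing each output
# name up front: a zero-padded counter plus the original basename.
i=0
for url in "https://example.com/a/file.txt" "https://example.com/b/file.txt"; do
  i=$((i + 1))
  echo wget -O "$(printf '%04d' "$i")_$(basename "$url")" "$url"
done
```

A hash of the URL (e.g. from sha1sum) instead of a counter would also work and stays stable across re-runs.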