10/20/2020 (Tue) 08:11:20
A couple of years ago I wrote a whitelist script to lock down my firewall rules so that no packet could get in or out without a matching rule. If I wanted to browse a website, I'd invoke the script with the site's domain name and it would add the necessary IP address(es) for the site plus any dependencies I'd configured, since sites often offload images and such to another domain name (for example, wikipedia.org -> upload.wikimedia.org).
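For anyone curious, the core logic was something like this rough Python sketch (not the actual script: the DEPS map is a stand-in for my real config file, it's IPv4-only, and it assumes iptables with default DROP policies on INPUT/OUTPUT):

    import socket
    import subprocess
    import sys

    # Stand-in for the config file: extra hostnames a site depends on.
    DEPS = {
        "wikipedia.org": ["upload.wikimedia.org"],
    }

    def resolve(host):
        # Collect the unique IPv4 addresses the hostname resolves to.
        infos = socket.getaddrinfo(host, None, socket.AF_INET)
        return sorted({info[4][0] for info in infos})

    def allow(host):
        for ip in resolve(host):
            # Append ACCEPT rules for this address; everything else is
            # assumed to hit the default DROP policy.
            subprocess.run(["iptables", "-A", "OUTPUT", "-d", ip, "-j", "ACCEPT"], check=True)
            subprocess.run(["iptables", "-A", "INPUT", "-s", ip, "-j", "ACCEPT"], check=True)

    if __name__ == "__main__":
        site = sys.argv[1]
        for host in [site] + DEPS.get(site, []):
            allow(host)

You'd run it as something like "python3 allow.py wikipedia.org" (as root, since iptables needs it).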
That worked basically fine for sites that didn't branch out to unknown domains. The problems started when I couldn't even download files from sourceforge.net or archive.org, because their stuff is hosted on CDNs that use a gazillion hostnames, and I can't manually track all that shit and put it into a config file. I thought about improving the script with a manual option ("Allow this host?", roughly the sketch below), but I gave up on it after moving away from x86 to ARM SBCs, where I can run the pozzed Firefox browser on its own dedicated computer and do my regular browsing with Lynx/Links on another. In fact, this way my main computer doesn't need all the crappy modern browser dependencies, so when it boots up the initial memory footprint of all running processes is only a tad over 100 MB (that doesn't include Xorg, since I startx manually if I want to run it).
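The manual option itself wouldn't have been much code, something like this (hypothetical, reusing allow() from the sketch above):

    def allow_interactive(host):
        # Prompt before whitelisting a hostname the config doesn't know
        # about, e.g. whatever random CDN host a download redirects to.
        answer = input("Allow this host? %s [y/N] " % host)
        if answer.strip().lower() == "y":
            allow(host)

You'd call that instead of allow() for any host not already in DEPS.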
On the plus side, their bullshit is web-only, so old school stuff like FTP, Gopher, IRC, BBSes, etc. is spared this crap. BTW, it's interesting that Google is in the process of removing FTP support from Chrome. They give some bullshit excuse ("outdated protocol", yada yada), but in reality it's probably because they can't run their spyware crap on those old, simple protocols. Their whole plan from the start was to lock everyone into their little walled garden, where they tightly control everything.