Anonymous ID: df7cd8 June 1, 2018, 5:46 a.m. No.1606326

>>1603342

So Vice, NPR & the other Left-led orgs want to block archiving?

I smell a fail

Archiving can be done regardless in just about every case. The following are the methods I know of (disclaimer: I am not an expert):

*****

Option 1:

PrintEdit or PrintEdit WE Firefox extension - creates PDF files directly from webpages

 

Option 2:

wget or curl, command-line tools for Linux (I'm sure something similar exists for Windows, or one can use a Linux-like environment for Win such as Cygwin). wget is VERY malleable; for example, it can be made to present itself to sites as a search-engine spider, which websites generally welcome for the sake of their SEO rankings.

Including a PDF manual for wget.
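
A minimal example of what that looks like in practice (a sketch only; example.com stands in for the real target):

# Mirror a page plus everything needed to view it offline,
# without crawling up into the rest of the site
wget --mirror --convert-links --adjust-extension --page-requisites --no-parent https://example.com/article/

# curl works page-by-page instead
curl -o article.html https://example.com/article/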

*****

Relevant Links:

>How to block web scrapers and GET requests on a server:

https:// clients.stabiliservers.com/knowledgebase.php?action=displayarticle&id=3#
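
The server-side blocking there boils down to refusing requests by user agent. A sketch of the idea, assuming Apache with mod_rewrite (the article's actual rules may differ):

# .htaccess - return 403 Forbidden to anything announcing itself as a scraper
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (wget|curl|python) [NC]
RewriteRule .* - [F,L]

Note this only matches the advertised user agent string, which is exactly why the next link calls blocking a lost cause.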

 

>Stack Exchange: Should I block wget? The responses explain it's a lost cause:

https:// webmasters.stackexchange.com/questions/60234/should-i-block-agent-wget-1-12-linux-gnu
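
Why it's a lost cause, in one line: wget will happily announce itself as anything, e.g. a normal browser (the UA string below is just an example):

wget --user-agent="Mozilla/5.0 (Windows NT 10.0; Win64; x64)" https://example.com/article/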

 

>Web scraping security concerns? Nervous about your IP being traced? Does the server you want to scrape run anomaly detection on surfing patterns? Read this:

https:// security.stackexchange.com/questions/174805/wget-what-security-issues-am-i-not-considering
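
The short version, for pacing a grab so it doesn't look machine-gunned (the numbers are guesses to tune per site):

# Wait ~10s between requests (randomized) and cap bandwidth
wget --wait=10 --random-wait --limit-rate=200k --mirror --no-parent https://example.com/

--random-wait varies the pause between 0.5x and 1.5x of --wait, so the timing isn't mechanical. IP concerns are a separate layer - a VPN or Tor, not a wget flag.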

 

>Discussions about bypassing 403 Forbidden access denials:

https:// www.linuxquestions.org/questions/linux-software-2/wget-error-403-can-i-get-around-this-606755/
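
The common fixes in those threads amount to sending the headers a browser would send (hedged; which header a given server checks varies):

# Claim a browser UA and a Referer from the site itself
wget --user-agent="Mozilla/5.0" --header="Referer: https://example.com/" https://example.com/article/

(Separately, for recursive grabs, -e robots=off stops wget from silently skipping paths disallowed in robots.txt - that's skipped files rather than a 403, but it trips people up the same way.)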

 

*****

There are certainly other options, but these two have been rather solid in my experience.

 

Once local archives are grabbed, they can be compressed and dispersed across the internet.
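
For example (filenames are placeholders):

# Bundle the grab and publish a checksum so mirrors can verify their copy
tar czf archive-2018-06-01.tar.gz ./example.com/
sha256sum archive-2018-06-01.tar.gz > archive-2018-06-01.tar.gz.sha256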

The archive sites are great indeed, but this Vice development should be a signal to get familiar with other methods as well. Expand that Utility Belt. What if those archive sites are attacked?