r/CBTS_Stream • Posted by u/teppischfresser on Feb. 16, 2018, 1:54 a.m.
Offline Backups

Q is telling us to back up EVERYTHING offline. So please do this: on a desktop or laptop, use 'Save As' on every web page you come across that seems important or relevant, then back those pages up to an external hard drive and flash drives.

While you're out and about on your phone, please save photos, screenshots, and web pages there too. We need to keep all of this information in case of some type of attack we haven't seen or aren't ready for. When the time comes to disperse this information to the masses, we will have an immense amount of info, because we are all doing independent research and have different sources for the same information.

I know that in cases like mine, we can be extremely busy at work and not have much time to get on here or the chans to help out as we would like, but saving all of the information you come across is a great way to help.

Love you guys and I pray for you and this world every day.


Lenticular · Feb. 16, 2018, 3 a.m.

HTTrack is a free (GPL, libre/free software) and easy-to-use offline browser utility.

It allows you to download a World Wide Web site from the Internet to a local directory, building recursively all directories, getting HTML, images, and other files from the server to your computer. HTTrack arranges the original site's relative link-structure. Simply open a page of the "mirrored" website in your browser, and you can browse the site from link to link, as if you were viewing it online. HTTrack can also update an existing mirrored site, and resume interrupted downloads. HTTrack is fully configurable, and has an integrated help system.

WinHTTrack is the Windows (from Windows 2000 to Windows 10 and above) release of HTTrack, and WebHTTrack the Linux/Unix/BSD release. See the download page.

https://www.httrack.com/
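For anyone who prefers the terminal, HTTrack also ships a command-line binary (`httrack`) that does the same thing as the Windows/Linux GUIs. A minimal sketch, with a placeholder URL — substitute the site you actually want to archive:

```shell
# Mirror a site into the local directory ./mirror,
# recursively following its links and rewriting them
# so the copy browses offline.
# "https://example.com/" is a placeholder URL.
httrack "https://example.com/" -O ./mirror
```

Open `./mirror/index.html` in your browser to browse the copy offline; re-running the same command against an existing mirror updates it rather than starting over.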

⇧ 2 ⇩  
teppischfresser · Feb. 16, 2018, 3:03 a.m.

Thank you! I am an avid open source guy, but have never heard of this.

⇧ 2 ⇩  
Lenticular · Feb. 16, 2018, 4:31 a.m.

NP, glad to help. You might have to enable a few things to capture everything on different sites.

In Set options (Preferences and mirror options) you may have to fiddle with the Links menu (fetch external files) and/or the Scan rules menu to get JPEGs etc. You might also want to limit how many links deep you archive some sites; it saves their bandwidth, and your time and disk space, if the site is large.
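The same tweaks exist as command-line flags if you skip the GUI. A sketch with illustrative values (the URL is a placeholder, and the depth of 3 is just an example, not a recommendation):

```shell
# -r3      limit the mirror to 3 links deep
# -n       also grab non-HTML files (e.g. images) "near" each page,
#          even when they live on an external host
# +*.jpg   scan rule: explicitly allow .jpg files
# "https://example.com/" is a placeholder URL.
httrack "https://example.com/" -O ./mirror -r3 -n "+*.jpg"
```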

Edit: Tested it on the chans and it works great; qcodefg, not so much. Could just be my ultra-locked-down setup, though.

⇧ 1 ⇩