There are a LOT of options for archiving, as listed in the graphic (1) and the article (2); synopses for Wget are in (3) and (4).
I prefer Wget, for the simple reason of power and flexibility. Those of you who use *nix probably already know this, but the mirroring tool of choice is Wget, and it works especially well if you have access to a VPS: you can queue the job up, then tgz and sftp the result when it's complete (a rough sketch of that workflow is below). Sometimes it can take days to mirror a full site if they've got aggressive leech protection.
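A minimal sketch of that VPS workflow, assuming the target is example.com, the mirror lands in ./example.com/, and "archive-box" is a host you can reach over SFTP with key auth (all of those names are placeholders):

# run the mirror in the background so it survives logout; output goes to mirror.log
# (the full flag set is discussed further down)
nohup wget --mirror https://example.com/ > mirror.log 2>&1 &

# when it's done, bundle the tree and push it off the box
tar czf example.com.tgz example.com/
echo 'put example.com.tgz' | sftp -b - archive-box

Key-based auth is assumed for the sftp step; with password auth you'd just run it interactively.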
You'll want to be aware of the robots option and the wait/retry options if you notice a server blocking your access because of too many requests in rapid succession, or a bitchy robots.txt (there's a worked example after the man page excerpt below).
MAN:: Wget - The non-interactive network downloader.

SYNOPSIS
    wget [option]... [URL]...

OPTIONS
    Download Options

    -w seconds
    --wait=seconds
        Wait the specified number of seconds between the retrievals. Use of this option is recommended, as it lightens the server load by making the requests less frequent. Instead of in seconds, the time can be specified in minutes using the "m" suffix, in hours using the "h" suffix, or in days using the "d" suffix.

        Specifying a large value for this option is useful if the network or the destination host is down, so that Wget can wait long enough to reasonably expect the network error to be fixed before the retry. The waiting interval specified by this function is influenced by "--random-wait", which see.
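Here's a sketch of how those options fit together for a slow, polite crawl that still ignores robots.txt. The specific numbers (2 seconds between requests, 5 tries, up to 30 seconds between retries) are just my own starting points, not anything the man page prescribes, and example.com is a placeholder:

wget --mirror --page-requisites --adjust-extension --convert-links -e robots=off --wait=2 --random-wait --tries=5 --waitretry=30 https://example.com/

--wait and --random-wait pace the requests so you look less like a scraper, while --tries and --waitretry keep Wget patiently retrying instead of giving up when the server temporarily drops you.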
My recommended initial configuration is below, but I'm sure you can tailor it to suit your needs.
wget --mirror --page-requisites --adjust-extension --no-parent --no-check-certificate --convert-links -e robots=off https://example.com/

(--no-clobber is deliberately left out: --mirror already implies timestamping, and Wget refuses to combine timestamping with --no-clobber.)
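For reference: --mirror turns on recursion with timestamping, --page-requisites grabs the images/CSS/JS a page needs to render, --adjust-extension saves files with matching extensions, --no-parent keeps the crawl from wandering above the starting directory, --no-check-certificate skips TLS verification (handy for dying sites with expired certs), --convert-links rewrites links for local browsing, and -e robots=off ignores robots.txt. If a host starts throttling you, fold in the wait/retry flags from the example above.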
Happy archiving.