Downloading an Entire Web Site with wget or HTTrack


If you ever need to download an entire Web site, perhaps for off-line viewing, wget can do the job—for example:

$ wget \
     --recursive \
     --no-clobber \
     --page-requisites \
     --html-extension \
     --convert-links \
     --restrict-file-names=windows \
     --domains example.com \
     --no-parent \
         https://example.com/

This command downloads the entire Web site (here `example.com` is a placeholder; substitute the site you want to mirror).

The options are:

- `--recursive`: download the entire site, following links.
- `--no-clobber`: don't overwrite files that already exist (useful if the download is interrupted and resumed).
- `--page-requisites`: also get the elements each page needs, such as images and CSS.
- `--html-extension`: save files with the `.html` extension.
- `--convert-links`: rewrite links so the pages work locally, off-line.
- `--restrict-file-names=windows`: modify filenames so they are valid on Windows as well.
- `--domains`: don't follow links outside the listed domain(s).
- `--no-parent`: don't follow links above the starting directory.

Is that all? No! There are a number of other option combinations you can use:

wget -p -k http://site/path/
wget -A pdf,jpg -m -p -E -k -K -np http://site/path/
wget -m -p -E -k -K -np http://site/path/
wget --no-clobber --convert-links --random-wait -r -p -E -e robots=off -U mozilla http://site/path/
wget --user-agent=Mozilla --content-disposition --mirror --convert-links -E -K -p http://site/path/
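The short flags in these one-liners can be hard to remember. Spelled out with their long-option equivalents (a sketch using the same `http://site/path/` placeholder), the mirror command reads:

```shell
# Long-option equivalent of: wget -m -p -E -k -K -np http://site/path/
#   -m   --mirror            recursive download with timestamping, infinite depth
#   -p   --page-requisites   also fetch the images, CSS, etc. each page needs
#   -E   --adjust-extension  save HTML/CSS files with a matching extension
#   -k   --convert-links     rewrite links for local, off-line viewing
#   -K   --backup-converted  keep a .orig copy of each converted file
#   -np  --no-parent         never ascend above the starting directory
wget --mirror --page-requisites --adjust-extension --convert-links \
     --backup-converted --no-parent http://site/path/
```

The long names make scripts self-documenting; the behavior is identical to the short-flag version.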

But is that all? Are those better? No!

The best solution I have come across is HTTrack: it is fast, and it organizes the downloaded content with working local URLs, which makes it the best tool for cloning an online website for offline use.
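HTTrack has a command-line interface alongside its GUI. A minimal invocation might look like this (with `example.com` as a placeholder for the site you want to mirror):

```shell
# Mirror a site into ./mirror, staying on the example.com domain.
#   -O sets the output (mirror) directory.
#   "+*.example.com/*" is a filter that allows only URLs matching the pattern.
#   -v prints progress to the terminal.
httrack "https://example.com/" -O ./mirror "+*.example.com/*" -v
```

Interrupted mirrors can be resumed by re-running HTTrack against the same output directory.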

© Heshan Wanigasooriya
