SimonTech Development

Knowledge Base

Linux

Behold The Power of Wget - Clone or Download a whole website


Wget is the perfect tool for transferring an old site you no longer have server access to, or simply for offline browsing and archiving.
Say you want the English version of wikipedia.org downloaded to a local folder, or to a folder on your server. Open a terminal, navigate to the folder you want the site saved in, and run the following to traverse all pages and internal links:

$ wget \
     --recursive \
     --no-clobber \
     --page-requisites \
     --html-extension \
     --convert-links \
     --restrict-file-names=windows \
     --domains en.wikipedia.org \
     --no-parent \
         en.wikipedia.org

What the options do:

  • --recursive — follow links and download the whole site
  • --no-clobber — don't overwrite files that already exist, so an interrupted run can be resumed
  • --page-requisites — also fetch the images, CSS and scripts each page needs to display properly
  • --html-extension — save pages with a .html extension (newer wget releases call this --adjust-extension)
  • --convert-links — rewrite links so the downloaded copy works offline
  • --restrict-file-names=windows — use file names that are also valid on Windows
  • --domains en.wikipedia.org — don't follow links to other domains
  • --no-parent — don't ascend above the starting directory
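If you prefer a shorter command, wget also has a --mirror option that bundles --recursive, timestamp checking and infinite recursion depth. A roughly equivalent invocation (using --adjust-extension, the newer name for --html-extension) might look like:

$ wget --mirror \
     --page-requisites \
     --adjust-extension \
     --convert-links \
     --no-parent \
         en.wikipedia.org

Because --mirror enables timestamping, re-running the same command later only fetches pages that changed since the last run.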

Beware, and brace yourself for a long wait if the site has many subpages. For a very big site like Wikipedia it could take all night, and you would need gigabytes of drive space.
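A long crawl also puts load on the remote server, so it is polite to throttle yourself. Adding these options to the command above spaces out requests and caps the download speed (the 2-second wait and 200 KB/s cap are just example values):

$ wget --wait=2 --random-wait --limit-rate=200k \
     --recursive --no-clobber --page-requisites \
     --convert-links --no-parent \
         en.wikipedia.org

--wait pauses between retrievals, --random-wait varies that pause so the traffic looks less like a bot, and --limit-rate caps bandwidth.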

Oh, and for all of you Windows users out there, you can now download a Windows build: http://gnuwin32.sourceforge.net/packages/wget.htm

© 2008 - 2025 SimonTech Development