Wget is the perfect tool for mirroring an old site you no longer have server access to, or simply for offline browsing and archiving.
Say you want the entire English version of wikipedia.org downloaded into a local folder, or into a folder on your server. Open a terminal, navigate to the folder you want the site to live in, and run the following to traverse all pages and internal links:

$ wget \
     --recursive \
     --no-clobber \
     --page-requisites \
     --html-extension \
     --convert-links \
     --restrict-file-names=windows \
     --domains en.wikipedia.org \
     --no-parent \
         en.wikipedia.org
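If you prefer something shorter, most of those long options have single-letter equivalents in GNU Wget: `-m` (`--mirror`, which implies recursion), `-k` (`--convert-links`), `-E` (`--adjust-extension`, the newer name for `--html-extension`), `-p` (`--page-requisites`), and `-np` (`--no-parent`). A roughly equivalent one-liner would be:

```shell
$ wget -mkEp -np --domains en.wikipedia.org en.wikipedia.org
```

Note that `--mirror` also turns on timestamping (`-N`), so re-running the command only fetches pages that changed since the last run.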

Beware, and brace yourself for a long wait if the site has many subpages. For a very big site like Wikipedia, the download could run all night and require gigabytes of disk space.
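For a long-running crawl like that, it is also polite (and less likely to get your IP blocked) to throttle the requests. GNU Wget supports this directly with `--wait` (seconds between retrievals), `--random-wait` (randomizes that delay), and `--limit-rate` (caps bandwidth); the values below are just a suggestion:

```shell
$ wget \
     --recursive \
     --no-clobber \
     --page-requisites \
     --html-extension \
     --convert-links \
     --restrict-file-names=windows \
     --domains en.wikipedia.org \
     --no-parent \
     --wait=1 \
     --random-wait \
     --limit-rate=200k \
         en.wikipedia.org
```

The crawl will take even longer this way, but the target server will thank you for it.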

Oh, and for all of you Windows users out there: a Windows build is available at http://gnuwin32.sourceforge.net/packages/wget.htm