
How To Transfer Files With WGET

Most administrators are familiar with wget as a command for pulling web-accessible files down to a server. If shell access is not available, but you have a way to tar or zip up all the necessary files on the other host, this is a great alternative. wget is very fast, any web server can serve the file, and it is one of the easiest commands to run. Essentially, compress it all up and place the final file at a web-accessible location on the other host's server. Then run a wget command from the directory into which you want to pull the file (with a placeholder URL here):

wget http://www.example.com/site-backup.tar.gz
By default this gives you a progress bar with information about the download and a message when the transfer finishes.
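The compress-and-extract steps around that download can be sketched as follows. The directory names and the stand-in content are placeholders for illustration, not your actual site layout:

```shell
# On the old host: compress the site into a single archive.
# "old-site" and its contents are placeholders for the demo.
mkdir -p old-site && echo "hello" > old-site/index.html   # stand-in content
tar -czf site-backup.tar.gz -C old-site .

# Place the archive somewhere web-accessible, then on the new server
# download it from the target directory (e.g. with the wget command above).

# Once the download finishes, unpack it in place:
mkdir -p new-site
tar -xzf site-backup.tar.gz -C new-site
```

The -C flag tells tar to change into the named directory first, so the archive holds relative paths and unpacks cleanly wherever you extract it.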

WGET Using The FTP Protocol

There will be a few cases, hopefully rare, where none of the above options will work for various reasons: no shell access, no way to compress files, web access to the other host has been shut off, etc. If this is the case, you are stuck using plain old FTP. If there isn't much content, it is probably best to just use an FTP client such as FileZilla to download it locally and then upload it. That is simple and works well if the total size is under a couple of GB and there are fewer than a couple thousand files. But of course, not all hosts have great download speeds, and some sites will have 5+ GB of content and thousands upon thousands of files, making this a bad option.

To get around this, we can set wget up to download all of the files over FTP directly to the server and let it run in the background for as long as necessary. The syntax may seem a little complicated, but stick with the example and you should be fine:

wget --ftp-user=username --ftp-password=ftppassword --mirror ftp://hostname/path/

A more usable example, with a placeholder hostname:

wget --ftp-user=whuser --ftp-password=tDBmcWzB1 --mirror ftp://ftp.example.com/www/

Notice that the path is not necessarily the absolute path. You will start from the FTP user's home directory on the server, not at the root, just like normal FTP. The second command would download all of the contents of the www directory in the client's home directory. Now here is the big failing point of this method: each file is transferred individually, one at a time, with a new connection for each. This is extremely slow with large numbers of files, even if the connection speed is quite fast. One way to help with this is to split the data into separate directories, if possible, and run a separate process for each directory.

For example, suppose the web root contains 4 main directories: images, blog, plugins, and members. I would run 4 separate commands:

wget --ftp-user=whuser --ftp-password=tDBmcWzB1 --mirror ftp://ftp.example.com/www/images/

wget --ftp-user=whuser --ftp-password=tDBmcWzB1 --mirror ftp://ftp.example.com/www/blog/

wget --ftp-user=whuser --ftp-password=tDBmcWzB1 --mirror ftp://ftp.example.com/www/plugins/

wget --ftp-user=whuser --ftp-password=tDBmcWzB1 --mirror ftp://ftp.example.com/www/members/

There are likely a bunch of files directly in the web root as well; typically you can get those via an FTP client separately if there aren't too many. Otherwise, you may want to use an FTP client to copy them all into a 5th folder and run a 5th process to grab them.
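Since each mirror can run for hours, it usually pays to launch them as background jobs that survive logging out. A rough loop form of the per-directory commands, again assuming the placeholder hostname and credentials from the examples:

```shell
# One wget mirror per directory, backgrounded with nohup so the transfers
# keep running after you log out. Hostname and credentials are placeholders.
for dir in images blog plugins members; do
  nohup wget --ftp-user=whuser --ftp-password=tDBmcWzB1 --mirror \
    "ftp://ftp.example.com/www/${dir}/" > "wget-${dir}.log" 2>&1 &
done
```

Each job writes to its own log file, so you can check on a transfer later with, for example, tail -f wget-images.log.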
