The wget command is used for file transfers over FTP, HTTP, and HTTPS; it can also retrieve FTP files through an HTTP proxy when one is configured.
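In its simplest form you just pass it a URL (www.example.com below is only a placeholder host):
ex:- wget http://www.example.com/file.tar.gz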
wget can download files as a background process and recursively replicate remote directories. It can also resume partially completed downloads, which saves a lot of time when connectivity is intermittent or a connection breaks.
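As a rough sketch of those three features, the -b, -r, and -c switches cover them (the hosts and paths below are placeholders):
ex:- wget -b http://www.example.com/big.iso        (download in the background; progress is written to wget-log)
ex:- wget -r ftp://ftp.example.com/pub/docs/       (recursively replicate a remote directory)
ex:- wget -c http://www.example.com/big.iso        (resume a partially downloaded file)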
ex:- wget 'ftp://phudson:mypasswd@stinky/mp3/*'
In this example, the user retrieves all files in a directory named mp3 on the remote host named stinky (the URL is quoted so the shell does not try to expand the *). The wget command first retrieves a directory listing and then downloads the specified files. Note that you can specify the username and password on the command line, but this is generally not a good idea because they are visible to other users (for example, in the output of ps). A better, though still not fully secure, approach is to save the credentials in a file named .wgetrc in your home directory.
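A minimal ~/.wgetrc for the example above might look like the following (user and password are standard .wgetrc settings; the values are placeholders), and it is worth running chmod 600 ~/.wgetrc so that other users cannot read it:
user = phudson
password = mypasswd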
Another popular use of wget is downloading complete copies of websites for offline reading. To download an entire site, specify the --mirror, --convert-links, and -p options, followed by the URL of the site. The first option tells wget to download all the pages and pictures from the site, following whatever links it can. The second tells it to rewrite the HTML so that the pages work when browsed locally. The last option, -p, tells wget to also download the files referenced by the HTML, such as sounds, CSS files, and other related documents. You might also want to add the -w option, which specifies a number of seconds to wait between individual requests; this keeps your download from overloading the web server.
ex:- wget --mirror --convert-links -p -w 2 http://www.example.com
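By default the mirrored files land in a local directory named after the host (www.example.com in this case), which you can then open directly in a browser for offline reading.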
see:-
http://www.gnu.org/software/wget/manual/wget.html
man wget