Wget command examples
Wget is a free, non-interactive command-line download tool for Linux/Unix. It downloads files from websites over the HTTP, HTTPS, and FTP protocols, and also supports retrieval through HTTP proxies. Because it is non-interactive, it can keep working in the background even after the user logs off the system.
In this post, we will walk through several examples of using the wget command.
Example 1: Download a single file
- # wget http://mirror.nbrc.ac.in/CentOS/7.0.1406/isos/x86_64/CentOS-7.0-1406-x86_64-DVD.iso
This command downloads the CentOS 7 ISO file to your current working directory.
Example 2: Resume an interrupted download
There are always scenarios where the Internet connection drops while we are downloading a large file. In that case, we can use the '-c' option of the wget command to resume the download from where it stopped.
- # wget -c http://mirror.nbrc.ac.in/centos/7.0.1406/isos/x86_64/CentOS-7.0-1406-x86_64-DVD.iso
Example 3: Background File Download
You can use the '-b' option of the wget command to download files in the background.
- linuxtechi@localhost:~$ wget -b http://mirror.nbrc.ac.in/centos/7.0.1406/isos/x86_64/CentOS-7.0-1406-x86_64-DVD.iso
- Continuing in background, pid 4505.
- Output will be written to 'wget-log'.
As shown above, the download progress is written to the 'wget-log' file in the user's current directory.
- linuxtechi@localhost:~$ tail -f wget-log
- 2300K .......... .......... .......... .......... ..........  0% 48.1K 18h5m
- 2350K .......... .......... .......... .......... ..........  0% 53.7K 18h9m
- 2400K .......... .......... .......... .......... ..........  0% 52.1K 18h13m
- 2450K .......... .......... .......... .......... ..........  0% 58.3K 18h14m
- 2500K .......... .......... .......... .......... ..........  0% 63.6K 18h14m
- 2550K .......... .......... .......... .......... ..........  0% 63.4K 18h13m
- 2600K .......... .......... .......... .......... ..........  0% 72.8K 18h10m
- 2650K .......... .......... .......... .......... ..........  0% 59.8K 18h11m
- 2700K .......... .......... .......... .......... ..........  0% 52.8K 18h14m
- 2750K .......... .......... .......... .......... ..........  0% 58.4K 18h15m
- 2800K .......... .......... .......... .......... ..........  0% 58.2K 18h16m
- 2850K .......... .......... .......... .......... ..........  0% 52.2K 18h20m
Example 4: Limit the download speed
By default, the wget command downloads files at full speed, but if you are on a shared Internet connection, a large download can slow down every other user's traffic. In that case, you can use the '--limit-rate' option to cap the download speed and avoid this situation.
- # wget --limit-rate=100k http://mirror.nbrc.ac.in/centos/7.0.1406/isos/x86_64/CentOS-7.0-1406-x86_64-DVD.iso
In the preceding example, the download speed is limited to 100 KB/s.
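The options shown so far can be combined. As a sketch, a resumable, rate-limited download running in the background (using the same mirror URL as the examples above) might look like this:

```shell
# Resume (-c) a rate-limited (--limit-rate) download in the background (-b).
# Progress goes to wget-log in the current directory, as in Example 3.
wget -c -b --limit-rate=100k http://mirror.nbrc.ac.in/centos/7.0.1406/isos/x86_64/CentOS-7.0-1406-x86_64-DVD.iso
```

This is convenient for large ISO downloads on a shared connection: if the transfer is interrupted, rerunning the same command picks up where it left off.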
Example 5: Use the '-i' option to download multiple files
If you want to use the wget command to download multiple files, first create a text file and add all the URLs to it, one per line.
- # cat download-list.txt
- url1
- url2
- url3
- url4
Now, run the following command:
- # wget -i download-list.txt
Example 6: Increase the number of retries
You can use the '--tries' option to increase the number of retries. By default, the wget command retries 20 times before giving up.
This option is useful when Internet connection problems occur while downloading a large file, since a higher retry count increases the chance that the download eventually succeeds.
- # wget --tries=75 http://mirror.nbrc.ac.in/centos/7.0.1406/isos/x86_64/CentOS-7.0-1406-x86_64-DVD.iso
Example 7: Use the '-o' option to redirect wget logs to a file
We can use the '-o' option to redirect the log messages of the wget command to a log file instead of the terminal.
- # wget -o download.log http://mirror.nbrc.ac.in/centos/7.0.1406/isos/x86_64/CentOS-7.0-1406-x86_64-DVD.iso
The preceding command creates the download.log file in the current directory.
Example 8: Download an entire website for local viewing
- # wget --mirror -p --convert-links -P ./<Local-Folder> website-url
Where:
- --mirror: enable options suitable for mirroring (recursive download with time-stamping).
- -p: download all the files necessary to correctly display the specified HTML page.
- --convert-links: after the download is complete, convert the links in the documents for local viewing.
- -P ./<Local-Folder>: save all files and directories to the specified directory.
Example 9: Reject certain file types during download
When downloading an entire website, we can use the '--reject' option to tell wget not to download certain file types, such as images.
- # wget --reject=png Website-To-Be-Downloaded
Example 10: Use the '-Q' option to set a download quota
We can use the '-Q' option to make the wget command stop downloading once the total downloaded size exceeds the specified quota.
- # wget -Q10m -i download-list.txt
Note that the quota does not affect the download of a single file. So if you specify wget -Q10m ftp://wuarchive.wustl.edu/ls-lR.gz, all of ls-lR.gz will be downloaded. The same is true even when several URLs are specified on the command line; the quota only takes effect when downloading recursively or from an input file. Therefore, you can safely type 'wget -Q10m -i download-list.txt', and the download will stop when the quota is exceeded.
Example 11: Download files from a password-protected website
- # wget --ftp-user=<user-name> --ftp-password=<password> Download-URL
Another way to specify the user name and password is to embed them in the URL itself (e.g. ftp://user:password@host/path/to/file).
Either method exposes your password to anyone who runs the "ps" command. To prevent the password from being seen, store it in .wgetrc or .netrc, and use "chmod" to set appropriate permissions so other users cannot read those files. If the password is really important, do not leave it lying in the file any longer than necessary: edit or delete the file once wget has started downloading.
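As a sketch of the .netrc approach, assuming a hypothetical host 'ftp.example.com' and throwaway credentials (a temporary file is used here for illustration; in practice the file lives at $HOME/.netrc):

```shell
# Create a .netrc-style credentials file (hypothetical host and credentials).
netrc_file=$(mktemp)
cat > "$netrc_file" <<'EOF'
machine ftp.example.com
login demo
password secret
EOF
# Restrict the file to the owner so other users cannot read the password.
chmod 600 "$netrc_file"
ls -l "$netrc_file" | cut -c1-10   # prints -rw-------
```

With the permissions set this way, wget can read the credentials from the file while the password never appears in the process list.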