wget Command
Download files from the web (HTTP, HTTPS, FTP)
📅 Updated: Mar 15, 2026
SYNTAX
wget [OPTION]... [URL]...
What Does wget Do?
wget is a non-interactive file downloader that supports HTTP, HTTPS, and FTP protocols. It is designed for reliable downloads even on unstable connections, with automatic retry and resume capabilities.
wget excels at downloading files, mirroring websites, and performing recursive downloads. Unlike curl, which writes to stdout by default, wget saves directly to a file. It handles redirects, cookies, and authentication, and can download entire website structures.
wget is ideal for scripting and automation, working in the background, downloading large files over unreliable connections, and creating offline mirrors of websites.
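As a sketch of how wget slots into a script, the function below composes a download command with resume and retry flags (all real wget options) and returns it as a string; the function name and URL are illustrative, and the command is printed rather than executed so the sketch is self-contained:

```shell
#!/bin/sh
# Illustrative wrapper: builds a wget invocation with resume/retry flags.
# In a real script you would run the command instead of printing it.
fetch_cmd() {
    url="$1"
    # -c resume, -q quiet, --tries/--waitretry for unreliable connections
    printf 'wget -c -q --tries=10 --waitretry=5 %s\n' "$url"
}

cmd=$(fetch_cmd "https://example.com/archive.tar.gz")
echo "$cmd"
```

Because wget is non-interactive and signals failure through its exit status, a wrapper like this is easy to branch on with `if`/`||` in automation.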
Options & Flags
| Option | Description | Example |
|---|---|---|
| -O | Save to specific filename | wget -O latest.tar.gz https://example.com/file |
| -c | Resume interrupted download | wget -c https://example.com/large-file.iso |
| -r | Recursive download | wget -r -l 2 https://example.com/docs/ |
| -q | Quiet mode | wget -q https://example.com/file.zip |
| -b | Download in background | wget -b https://example.com/large.iso |
| --mirror | Mirror a website (recursive + timestamps) | wget --mirror https://example.com |
| -P | Download to specific directory | wget -P /tmp https://example.com/file.zip |
| --limit-rate | Limit download speed | wget --limit-rate=1M https://example.com/large.iso |
| -i | Download URLs listed in a file | wget -i urls.txt |
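The `-i` option pairs naturally with a generated URL list. A minimal sketch, with placeholder URLs, that builds `urls.txt` and prints the command that would consume it:

```shell
#!/bin/sh
# Build a one-URL-per-line list for wget -i (URLs are placeholders).
printf '%s\n' \
    'https://example.com/a.zip' \
    'https://example.com/b.zip' > urls.txt

# Printed rather than run in this sketch:
echo "wget -i urls.txt"
```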
Practical Examples
#1 Download a file
Downloads the file and saves it with its original name.
$ wget https://example.com/archive.tar.gz
#2 Resume interrupted download
Continues a previously interrupted download from where it stopped.
$ wget -c https://example.com/large-file.iso
#3 Download with custom filename
Saves the download with a specified filename.
$ wget -O backup.sql.gz https://db.example.com/dump/latest
#4 Mirror a website
Creates an offline copy of the website with working links.
$ wget --mirror --convert-links --page-requisites https://docs.example.com
#5 Download multiple files
Downloads all URLs listed in the file, one per line.
$ wget -i download-list.txt
#6 Rate-limited download
Limits download speed to 500KB/s to avoid saturating the connection.
$ wget --limit-rate=500K https://example.com/large-file.iso
#7 Background download
Runs in background, logging progress to download.log.
$ wget -b -o download.log https://example.com/huge.iso
Tips & Best Practices
Resume large downloads: Always use wget -c for large files. If the connection drops, re-running the same command resumes from where it stopped.
wget vs curl: wget is better suited to file downloads, recursive crawling, and background operation. curl is better suited to API testing, custom headers, and its wider protocol support.
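The stdout-vs-file difference is the one that trips people up most often. A side-by-side sketch, printed rather than executed so it is self-contained (the URL is a placeholder; all flags are documented options of each tool):

```shell
#!/bin/sh
# Default behaviors of wget and curl, and how each imitates the other.
url='https://example.com/data.json'
echo "wget $url        # wget default: saves data.json to disk"
echo "curl $url        # curl default: prints the body to stdout"
echo "wget -qO- $url   # wget writing to stdout, like curl"
echo "curl -O $url     # curl saving to a file, like wget"
```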
Recursive download caution: wget -r without -l (depth limit) can download entire websites and consume massive disk space. Always set a depth limit.
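A bounded recursive command that follows this advice can be composed like the sketch below. `-l`, `--no-parent`, and `--wait` are documented wget options; the URL is a placeholder, and the command is printed rather than run here:

```shell
#!/bin/sh
# -l 2: limit depth; --no-parent: never ascend above the start directory;
# --wait=1: pause between requests to be polite to the server.
cmd="wget -r -l 2 --no-parent --wait=1 https://example.com/docs/"
echo "$cmd"
```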
Frequently Asked Questions
How do I resume an interrupted download?
Use wget -c URL. The -c flag checks the partially downloaded file and continues from where it stopped.
How do I download an entire website?
Use wget --mirror --convert-links --page-requisites URL. This creates an offline copy with working local links.
How do I download a file silently?
Use wget -q URL for quiet mode. Add -O- to output to stdout instead of a file: wget -qO- URL.
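Because `-qO-` streams the body to stdout with no progress noise, wget drops straight into pipelines. A sketch, printed rather than executed (the URL and the grep filter are placeholders):

```shell
#!/bin/sh
# Count lines containing "href" in a fetched page: wget feeds the body
# to grep through a pipe instead of writing a file.
pipeline="wget -qO- https://example.com/page.html | grep -c href"
echo "$pipeline"
```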