Looking for wget command syntax, examples, or interview questions?
This guide covers everything from basic wget usage to advanced options, including a quick cheat sheet, real-world scenarios, and commonly asked interview questions to help you master wget in Linux.
wget command - Quick Cheat Sheet
| Command | Description |
|---|---|
| wget URL | Download a file from a URL |
| wget -O file URL | Save the downloaded file under a custom name |
| wget -P /path URL | Download into a specific directory |
| wget -c URL | Resume an interrupted download |
| wget -b URL | Download in the background |
| wget -q URL | Quiet mode (suppress output) |
| wget --show-progress URL | Show a detailed progress bar |
| wget --limit-rate=1m URL | Limit download speed (here 1 MB/s) |
| wget -i file.txt | Download multiple URLs listed in a file |
| wget -m URL | Mirror an entire website |
| wget -r URL | Recursive download |
| wget -np -r URL | Recursive download without ascending to the parent directory |
| wget --level=2 -r URL | Limit recursion depth |
| wget -t 10 URL | Retry the download up to 10 times |
| wget --retry-connrefused URL | Retry even when the connection is refused |
| wget --timeout=30 URL | Set a network timeout (seconds) |
| wget --user=user --password=pass URL | Download with authentication |
| wget --ftp-user=user --ftp-password=pass URL | FTP download with credentials |
| wget --http-user=user --http-password=pass URL | HTTP authentication |
| wget -R "pattern" -r URL | Reject matching files during recursive download |
| wget -A ".zip" -r URL | Accept only matching file types |
| wget -nd URL | Do not create directories |
| wget -nH URL | Do not create a host-name directory |
| wget --cut-dirs=2 URL | Skip leading directory components when saving |
| wget -e robots=off URL | Ignore robots.txt restrictions |
| wget --no-check-certificate URL | Ignore SSL certificate errors |
| wget URL && unzip file.zip | Download and unzip in one command |
| wget --spider URL | Check whether a URL is accessible (no download) |
| wget --mirror --convert-links --adjust-extension URL | Full website mirror with usable local links |
| wget --background --tries=inf URL | Retry indefinitely, in the background |
| wget --user-agent="Mozilla" URL | Set a custom user agent |
| wget --header="Authorization: Bearer TOKEN" URL | Add a custom header |
| wget -o log.txt URL | Write log output to a file |
| wget -S URL | Print server response headers |
| wget --dns-timeout=30 URL | Set the DNS lookup timeout |
| wget --connect-timeout=30 URL | Set the connection timeout |
wget command syntax
The basic syntax of the wget command in Linux is:
wget [options] [URL]
- wget → the command used to download files from the web
- [options] → flags that control behavior (output, retries, speed, etc.)
- [URL] → the file or website location to download
Example:
wget https://example.com/file.zip
This downloads the file to the current directory.
wget command examples
Download a single file using wget
Downloads file to current directory.
wget https://example.com/file.zip
Download file with custom name (-O)
wget -O myfile.zip https://example.com/file.zip
Saves the file as myfile.zip.
Download to specific directory (-P)
Downloads file into specified directory.
wget -P /home/user/downloads https://example.com/file.zip
Download multiple files using a list (-i)
Create a file urls.txt:
https://example.com/file1.zip
https://example.com/file2.zip
Run:
wget -i urls.txt
Downloads all files listed.
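The steps above can be put into one small script. A minimal sketch, assuming the placeholder URLs from the example; the wget call is replaced by an offline stub so the sketch runs without network access:

```shell
#!/bin/sh
# Sketch: build a URL list, then hand it to wget -i.
# -nc (no-clobber) skips files that already exist, so the
# list can be re-run safely after a partial failure.
cat > urls.txt <<'EOF'
https://example.com/file1.zip
https://example.com/file2.zip
EOF

wget() { echo "would run: wget $*"; }  # offline stub for this sketch
wget -nc -i urls.txt
```

In a real script, delete the stub line and the actual wget binary is used.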
Download files silently (hide progress)
wget -q https://example.com/file.zip
- -q → quiet mode (no output)
For logging instead:
wget -o log.txt https://example.com/file.zip
Combine wget with unzip in one command
wget https://example.com/file.zip && unzip file.zip
- Downloads the file
- Immediately extracts it
Alternative (auto naming):
wget -O temp.zip https://example.com/file.zip && unzip temp.zip
wget in Linux (Real-world usage scenarios)
Resume interrupted downloads (wget retry and -c)
If a download is interrupted due to network issues, you can resume it using the -c option:
wget -c https://example.com/largefile.zip
- Continues downloading from where it stopped
- Useful for large files or unstable connections
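For unstable connections, -c combines naturally with a bounded retry policy. A sketch of such a helper (the function name and URL are illustrative, and wget is stubbed so the sketch runs offline):

```shell
#!/bin/sh
# Sketch: resumable download helper combining -c with bounded retries.
download_resume() {
    # -c resumes a partial file; --tries/--waitretry bound the retry loop
    wget -c --tries=5 --waitretry=2 "$1"
}

wget() { echo "wget $*"; }  # offline stub; remove to use the real binary
download_resume https://example.com/largefile.zip
```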
Limit download speed and bandwidth control
To avoid consuming full bandwidth, limit the download speed:
wget --limit-rate=500k https://example.com/file.zip
- 500k → limits speed to 500 KB/s
- Can also use m for MB/s (e.g., 2m)
Run wget in background (nohup, &)
Run downloads in the background so they continue after terminal is closed:
Using &:
wget https://example.com/file.zip &
Using nohup:
nohup wget https://example.com/file.zip &
- nohup ensures the process continues even after logout
- Output is saved in nohup.out
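The & form is also useful inside scripts, where you start the download, keep working, and block only when the file is actually needed. A sketch (wget stubbed to write a log so the example runs offline):

```shell
#!/bin/sh
# Sketch: start a download in the background, wait for it later.
wget() { echo "wget $*" > download.log; }  # offline stub for this sketch

wget -q https://example.com/file.zip &   # & detaches the job
pid=$!
# ... other work could happen here ...
wait "$pid"                              # block only when the file is needed
echo "background download done (exit $?)"
```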
Retry failed downloads automatically
By default, wget retries failed downloads. You can customize retries:
wget --tries=10 https://example.com/file.zip
Set infinite retries (0 or inf means retry indefinitely):
wget --tries=0 https://example.com/file.zip
Add a delay between retries:
wget --waitretry=5 --tries=10 https://example.com/file.zip
Note: --waitretry waits up to the given number of seconds between retries of the same file; --wait, by contrast, pauses between separate downloads.
Download files with timestamping (avoid duplicates)
Avoid re-downloading unchanged files using timestamping:
wget -N https://example.com/file.zip
- Downloads only if the remote file is newer
- Useful for syncing files or scripts
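A sync job built on -N might loop over a fixed set of files, fetching only what changed. A sketch (the file names and URL prefix are hypothetical, and wget is stubbed so it runs offline):

```shell
#!/bin/sh
# Sketch: periodic sync loop; -N makes re-runs skip unchanged files.
wget() { echo "checking: $*"; }  # offline stub; remove for real use

for f in report.pdf data.csv; do
    wget -N "https://example.com/exports/$f"
done
```

Scheduled via cron, this keeps a local copy current without re-downloading everything.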
Advanced wget usage and options
Recursive download and website mirroring
Download entire websites using recursive mode:
wget -r https://example.com
For full mirroring:
wget -m https://example.com
- -r → recursive download
- -m → mirror mode (recursion + timestamping + infinite depth)
Limit recursion depth and file types
Control how deep wget should crawl:
wget -r -l 2 https://example.com
- -l 2 → limit depth to 2 levels
Download only specific file types:
wget -r -A .pdf https://example.com
- -A → accept only matching extensions
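Depth limits and accept lists combine into one crawl command. A sketch pulling all PDFs at most two levels deep from a hypothetical docs path (wget stubbed so the sketch runs offline):

```shell
#!/bin/sh
# Sketch: bounded, filtered recursive download.
# -r recurse, -l 2 depth limit, -np stay below the start directory,
# -A .pdf keep only PDFs.
wget() { echo "wget $*"; }  # offline stub; remove for real use

wget -r -l 2 -np -A .pdf https://example.com/docs/
```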
Download entire directory or website
Download a directory listing:
wget -r -np https://example.com/files/
- -np → no parent (stay within the directory)
Exclude specific files and directories
Exclude unwanted file types:
wget -r -R .jpg,.png https://example.com
Exclude directories:
wget -r --exclude-directories=images,css https://example.com
Control directory structure and output
Avoid creating nested directories:
wget -r -nd https://example.com
- -nd → no directories (flat structure)
Customize directory structure:
wget -r --cut-dirs=2 https://example.com/path/to/files/
- Removes the first 2 directory levels
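--cut-dirs is commonly paired with -nH so that neither the hostname directory nor the leading path components are recreated locally. A sketch (hypothetical path, wget stubbed so the example runs offline):

```shell
#!/bin/sh
# Sketch: save files from example.com/path/to/files/ into downloads/
# without the example.com/ (-nH) or path/to/ (--cut-dirs=2) prefixes.
wget() { echo "wget $*"; }  # offline stub; remove for real use

wget -r -nH --cut-dirs=2 -P downloads https://example.com/path/to/files/
```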
Save all files to a specific directory:
wget -r -P /home/user/downloads https://example.com
wget command with authentication and security
Download files with username and password
You can authenticate while downloading files using HTTP/FTP credentials:
wget --user=username --password=password https://example.com/file.zip
For FTP:
wget ftp://username:password@example.com/file.zip
For better security (avoid exposing the password in shell history), use:
wget --ask-password --user=username https://example.com/file.zip
Use wget with cookies and session handling
Useful for downloading content behind login sessions.
Save cookies:
wget --save-cookies cookies.txt --post-data "username=user&password=pass" https://example.com/login
Use cookies:
wget --load-cookies cookies.txt https://example.com/protected-file.zip
Keep session cookies:
wget --keep-session-cookies --save-cookies cookies.txt https://example.com
Ignore SSL certificate errors (when needed)
If SSL certificate validation fails:
wget --no-check-certificate https://example.com/file.zip
- Useful for self-signed certificates
- Not recommended for production (security risk)
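A safer alternative for a known self-signed certificate is to trust that specific certificate rather than disabling verification entirely, via wget's --ca-certificate option. A sketch (the .pem path is hypothetical, and wget is stubbed so it runs offline):

```shell
#!/bin/sh
# Sketch: keep TLS verification on, but trust one known certificate.
wget() { echo "wget $*"; }  # offline stub; remove for real use

wget --ca-certificate=/etc/ssl/selfsigned.pem https://example.com/file.zip
```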
Use wget with headers and tokens
Add custom headers (e.g., API tokens):
wget --header="Authorization: Bearer YOUR_TOKEN" https://api.example.com/data
Multiple headers:
wget --header="Header1: value1" --header="Header2: value2" https://example.com
wget configuration and customization
Understanding wgetrc configuration file
wgetrc is a configuration file used to define default settings.
Example:
user_agent = MyCustomAgent
wait = 5
limit_rate = 200k
This avoids repeating options in every command.
Global vs user wgetrc settings
- Global config: /etc/wgetrc
- User config: ~/.wgetrc
User config overrides the global settings.
Customize default behavior of wget
Example customizations in ~/.wgetrc:
continue = on
tries = 5
timeout = 30
Now wget will:
- resume downloads automatically
- retry 5 times
- timeout after 30 seconds
wget vs curl
Key differences between wget and curl
| Feature | wget | curl |
|---|---|---|
| Primary use | Download files | Transfer data |
| Recursive download | Yes | No |
| Resume support | Yes | Yes |
| Protocol support | HTTP, HTTPS, FTP | Many (HTTP, FTP, SCP, etc.) |
| Ease of use | Simple | More flexible |
When to use wget vs curl
Use wget when:
- Downloading files
- Mirroring websites
- Automating downloads
Use curl when:
- Calling APIs
- Sending POST/PUT requests
- Debugging HTTP requests
Performance and use case comparison
- wget is optimized for bulk downloads and automation
- curl is optimized for data transfer and API interaction
Example (API request using curl):
curl -X GET https://api.example.com/data
Example (file download using wget):
wget https://example.com/file.zip
wget interview questions and answers
Basic wget interview questions
1. What is wget in Linux?
wget is a command-line utility used to download files from the internet using protocols like HTTP, HTTPS, and FTP.
2. What is the basic syntax of wget?
wget [options] [URL]
3. How do you download a file using wget?
wget https://example.com/file.zip
4. Which option is used to resume downloads?
wget -c https://example.com/file.zip
5. How do you download a file silently?
wget -q https://example.com/file.zip
Intermediate wget questions with answers
1. How do you download multiple files using wget?
wget -i urls.txt
2. How do you save a file with a different name?
wget -O newname.zip https://example.com/file.zip
3. How do you limit download speed?
wget --limit-rate=500k https://example.com/file.zip
4. How do you retry failed downloads?
wget --tries=10 https://example.com/file.zip
5. How do you download files in the background?
wget https://example.com/file.zip &
Advanced wget scenario-based questions
1. How do you mirror an entire website?
wget -m https://example.com
2. How do you download only specific file types (e.g., PDFs)?
wget -r -A .pdf https://example.com
3. How do you exclude certain file types?
wget -r -R .jpg,.png https://example.com
4. How do you use authentication with wget?
wget --user=username --password=password https://example.com/file.zip
5. How do you pass headers (e.g., API tokens)?
wget --header="Authorization: Bearer TOKEN" https://api.example.com/data
Real-world troubleshooting questions
1. Why is wget failing with SSL error?
Use:
wget --no-check-certificate https://example.com
2. Why is wget not resuming download?
Ensure:
- Server supports partial downloads
- Use the -c option
3. Why is download very slow?
Possible reasons:
- Server-side throttling
- Network issues
- Use --limit-rate or check available bandwidth
4. How to debug wget issues?
wget -d https://example.com/file.zip
5. How to handle redirects properly?
wget follows redirects by default; ensure the URL is correct and accessible.
Frequently Asked Questions
1. What is wget command in Linux?
wget is a command-line tool used to download files from the internet using HTTP, HTTPS, and FTP protocols.
2. What is the basic syntax of wget?
The basic syntax is wget [options] URL, where options control download behavior.
3. How to download a file using wget?
Use wget URL to download a file directly from the given link.
4. How to resume download using wget?
Use wget -c URL to continue a partially downloaded file.
5. What are common wget options?
Common options include -O (output file), -c (resume), -q (quiet mode), and --limit-rate to control speed.
Conclusion
The wget command is one of the most powerful tools in Linux for downloading files, automating tasks, and handling large-scale data transfers. From basic file downloads to advanced use cases like website mirroring and authenticated requests, wget provides flexibility and reliability. By understanding its syntax, options, and real-world scenarios, you can significantly improve your productivity in Linux environments.
Official Documentation
For complete reference and advanced usage, refer to the official GNU wget manual.
You can also access documentation directly from your system using man wget.