wget Command in Linux: 30 Practical Examples

The wget command is a reliable workhorse for pulling data over HTTP, HTTPS, and FTP without opening a browser. Whether you are downloading a Linux ISO, mirroring documentation for offline access, or automating nightly backup retrievals, wget gives you fine-grained control over retries, bandwidth limits, authentication, and logging that graphical download managers rarely match.

This guide walks through 30 practical wget examples you can apply on modern Linux distributions, covering everything from basic single-file downloads to advanced scenarios like cookie-based authentication, recursive mirroring, and API interactions. You will learn how to resume interrupted transfers, throttle bandwidth to keep networks responsive, handle server authentication, and structure wget commands for reliable automation in cron jobs or deployment scripts. Pair this with our curl command examples when you need to push data or interact with REST APIs beyond simple downloads.

What Is the wget Command?

If you are new to wget, think of it as a command-line download manager that fetches files from web servers. The basic syntax is straightforward:

wget [options] URL

Breaking this down:

  • wget: The command itself; the name combines “World Wide Web” and “get”.
  • options: Optional flags that modify wget’s behavior. For example, -c resumes interrupted downloads, -O saves with a custom filename, or -b runs in the background. You can combine multiple options in a single command.
  • URL: The web address of the file or page you want to download. This can be an HTTP, HTTPS, or FTP link.

At its simplest, wget https://example.com/file.zip downloads file.zip to your current directory. The real power comes from options that let you resume broken transfers, mirror entire sites, throttle bandwidth, or automate downloads in scripts.

Here is a quick reference of common options organized by task. Do not worry about memorizing these; the examples below show each option in action:

Task | Helpful Options | What They Do
---- | --------------- | -------------
Single downloads | -O, -P, -c, --content-disposition | Rename files, choose directory, resume, or use server-suggested names
Website mirroring | --mirror, -r, -l, -A, -R | Download entire sites, control depth, accept/reject file types
Automation | -b, --tries, --append-output, --quiet | Run in background, retry on failure, log quietly for scripts
Authentication | --user, --ask-password, --header, --load-cookies | Handle login prompts, API tokens, or session cookies
Networking | --limit-rate, --wait, -4/-6, -e http_proxy | Limit bandwidth, add delays, force IPv4 or IPv6, route through a proxy

Install wget on Major Distributions

Most Linux distributions ship with wget pre-installed, but you can add it with a single package command if your minimal server image or container lacks it. First, verify whether wget is already available:

wget --version

If the command is not found, install wget using your distribution’s package manager. Choose the command that matches your system:

Ubuntu and Debian-based distributions

sudo apt install wget

Fedora, RHEL, Rocky Linux, and AlmaLinux

sudo dnf install wget

Arch Linux and Manjaro

sudo pacman -S wget

openSUSE

sudo zypper install wget

Alpine Linux

sudo apk add wget

Gentoo Linux

sudo emerge --ask net-misc/wget

Void Linux

sudo xbps-install -S wget

In rare cases, you may need a newer build than your distribution ships, for example to pick up recent TLS features missing from frozen enterprise repositories, or you may want wget2, the successor project that adds HTTP/2 support. When that happens, download and compile from the GNU wget source, though the repository version suffices for the vast majority of use cases.
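
A minimal build sketch, assuming a C toolchain (gcc, make) and SSL development headers are already installed; the wget-latest.tar.gz link on the GNU mirror points at the current release, and depending on your system you may need to pass --with-ssl=openssl to configure:

# fetch and unpack the current release tarball
wget https://ftp.gnu.org/gnu/wget/wget-latest.tar.gz
tar -xzf wget-latest.tar.gz
cd wget-*/
# standard GNU build; add --with-ssl=openssl if GnuTLS headers are absent
./configure
make
sudo make install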

30 Practical wget Command Examples

These examples cover day-to-day tasks along with automation-friendly tricks. Adapt the URLs and file paths to match your environment.

Example 1: Download a Single File

Start with the most basic use case: downloading a single file. wget saves it using the filename provided by the server, placing it in your current working directory.

wget https://downloads.example.com/images/debian.iso

After the download completes, the file appears in whichever directory you were in when you ran the command. Verify its integrity with sha256sum before using it.

sha256sum debian.iso
1f2d3c4b5a6e7890cdef1234567890abcdef1234567890abcdef1234567890ab  debian.iso

Example 2: Run Downloads in the Background

When downloading large files, you may want to continue using your terminal for other tasks. The -b flag runs wget in the background, freeing up your command line immediately.

wget -b https://downloads.example.com/archives/app.tar.gz

wget writes progress updates to a file named wget-log in your current directory. Monitor the download with tail -f wget-log. This approach is especially useful when working on remote servers via SSH, or within terminal multiplexers like tmux or screen.
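
For example, to watch the transfer from the same shell:

tail -f wget-log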

Example 3: Limit Download Bandwidth

To prevent wget from consuming your entire network connection, use --limit-rate to throttle the download speed. This keeps your connection responsive for other applications and users.

wget --limit-rate=500k https://downloads.example.com/releases/backup.tar.gz

The rate limit accepts values with suffixes: k for kilobytes per second or m for megabytes per second. In this example, wget caps the download at 500 KB/s, leaving bandwidth available for browsing or other transfers.

Example 4: Save with a Custom Filename

By default, wget uses the filename from the URL. When you need a different name, for instance to maintain consistent naming across versioned releases, use the -O option (that’s a capital letter O, not zero).

wget -O latest.tar.gz https://downloads.example.com/releases/app-2025.1.tar.gz

This downloads app-2025.1.tar.gz but saves it as latest.tar.gz. Scripts that always extract “latest.tar.gz” work without modification when a new version arrives.

Example 5: Save into a Specific Directory

Instead of navigating to a target directory before downloading, use -P (prefix) to specify where wget should save the file.

wget -P ~/Downloads/isos https://downloads.example.com/images/fedora.iso

The file lands in ~/Downloads/isos/ regardless of your current working directory, and wget creates the directory for you if it does not already exist.
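
One caveat if you want a custom filename inside that directory: -O takes a full path but, unlike -P, does not create missing directories, so make the directory yourself first. A quick sketch with hypothetical paths:

# create the target directory, then save under a custom name inside it
mkdir -p ~/Downloads/isos
wget -O ~/Downloads/isos/fedora.iso https://downloads.example.com/images/fedora.iso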

Example 6: Resume an Interrupted Download

Network hiccups or dropped SSH connections can interrupt large downloads. Instead of starting over, use the -c flag to continue from where the transfer stopped.

wget -c https://downloads.example.com/weekly/backup.tar.gz

wget requests only the remaining bytes from the server, saving time and bandwidth. The server must support range requests (most do); otherwise wget falls back to re-downloading the entire file.

Example 7: Skip Unchanged Files with Timestamping

Use -N to download a file only when the remote copy is newer than your local version.

wget -N https://updates.example.com/daily-report.csv

This is ideal for cron jobs that should grab reports once per day without overwriting unchanged files.
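
As a sketch, a crontab entry (the schedule and target directory are placeholders) that fetches the report quietly at 06:00 each morning:

# m h dom mon dow  command
0 6 * * * wget -N -q -P /srv/reports https://updates.example.com/daily-report.csv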

Example 8: Honor Server Filenames

APIs often redirect download links. The --content-disposition flag asks wget to use the filename suggested by the server.

wget --content-disposition 'https://downloads.example.com/get?id=12345'

This prevents generic names like get?id=12345 and keeps your artifacts labeled correctly. Quoting the URL also stops the shell from treating the ? as a glob character.

Example 9: Download URLs from a List

When you have multiple files to download, save their URLs in a plain text file (one URL per line) and feed it to wget with the -i option.

wget -i /srv/config/url-list.txt

wget processes each URL sequentially, downloading all files in the order they appear. This is particularly handy for batch updates, mirror synchronization, or deploying multiple packages in configuration management systems.
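
A hypothetical /srv/config/url-list.txt might look like this; wget treats each non-blank line as a URL to fetch:

https://downloads.example.com/tools/cli.tar.gz
https://downloads.example.com/tools/docs.pdf
https://downloads.example.com/tools/checksums.txt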

Example 10: Mirror a Site for Offline Use

wget can download entire websites for offline browsing, which is useful for documentation portals, internal wikis, or knowledge bases. The --mirror option combines several features to create a complete local copy.

wget --mirror --convert-links --adjust-extension --page-requisites --no-parent https://docs.example.com/

Each flag in this command plays a role:

  • --mirror: Enables recursive downloading with timestamping to keep copies current.
  • --convert-links: Rewrites links so pages work when opened locally.
  • --adjust-extension: Adds .html where needed so browsers render files correctly.
  • --page-requisites: Pulls stylesheets, images, and scripts that pages reference.
  • --no-parent: Stops wget from crawling above the starting directory.

Check the site’s robots.txt file before mirroring to ensure you follow their crawling policies.
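
You can inspect it with wget itself, writing to stdout with -O- and suppressing progress output with -q:

wget -qO- https://docs.example.com/robots.txt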

Example 11: Mirror a Specific Subdirectory

Limit a mirror to an area you care about while staying on the same host.

wget --mirror --no-parent --include-directories=/docs/api https://docs.example.com/

This copies only content under /docs/api, so you avoid pulling down unrelated marketing pages.

Example 12: Limit Recursion Depth

Set a recursion depth when you want a quick snapshot of near-neighbor pages without crawling an entire site.

wget -r -l 2 -np https://intranet.example.com/news/

-l 2 tells wget to follow links two levels deep, which is often enough for release notes or team dashboards.

Example 13: Download Only Specific File Types

Target PDFs or EPUBs from a site without hauling everything else along.

wget -r -np -A 'pdf,epub' https://library.example.com/publications/

The -A list accepts comma-separated extensions, helping you curate download sets for research archives.

Example 14: Reject Unwanted File Types

Skip bloated archives or binaries when you only need HTML and text.

wget -r -R 'zip,tar.gz' https://downloads.example.com/datasets/

Pair rejections with -A filters to zero in on the artifacts that matter.

Example 15: Slow Down to Respect Busy Servers

Throttle requests between pages when mirroring community sites so you do not trigger rate limits.

wget -r --wait=2 --random-wait https://projects.example.org/releases/

--random-wait adds jitter, which mimics human browsing and keeps admins happy.

Example 16: Detect Broken Links with Spider Mode

Spider mode tells wget to check URLs without actually downloading files. This is perfect for auditing websites to find broken links (404 errors) or testing site structure.

wget --spider -r -l 4 -o wget-spider.log https://docs.example.com/

The --spider flag skips downloads and only requests headers. wget logs all HTTP responses to wget-spider.log. After the scan completes, search the log for “404” or “broken link” messages using grep:

grep '404' wget-spider.log

Example 17: Inspect Headers Without Downloading

Combine spider mode with --server-response to confirm metadata such as Last-Modified before pulling a huge file.

wget --spider --server-response https://downloads.example.com/images/server.img

The command prints response headers and exits, so you can spot stale mirrors or unexpected redirects.

Example 18: Authenticate with Basic Credentials

Hit intranet portals or staging servers that require HTTP basic auth.

wget --user=buildbot --password="$WGET_PASS" https://intranet.example.com/builds/report.html

Store the password in an environment variable or use a secrets manager so it does not land in your shell history.
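
One hedged pattern: keep the password in a file readable only by your user and load it into the variable at runtime (the path here is a placeholder):

# lock down the secret file, then export it for the wget invocation above
chmod 600 ~/.config/wget/portal-pass
export WGET_PASS="$(cat ~/.config/wget/portal-pass)"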

Example 19: Prompt for Passwords Interactively

Protect credentials on shared servers by prompting instead of passing a password inline.

wget --user=admin --ask-password https://vault.example.com/exports/config.yaml

wget asks for the password securely and does not expose it in process listings.

Example 20: Reuse Session Cookies

Some dashboards gate downloads behind login flows. Import cookies captured from a browser session.

wget --load-cookies cookies.txt https://portal.example.com/reports/monthly.pdf

Export cookies in Netscape format (browser extensions such as cookies.txt for Firefox or Chrome can generate this file) and keep the file secure, since it grants the same access as your login.

Example 21: Capture Cookies for Later Requests

Use --save-cookies to stash session data during a scripted login, then call wget again with --load-cookies.

wget --save-cookies=session.txt --keep-session-cookies \
  --post-data="user=admin&password=$PORTAL_PASS" \
  https://portal.example.com/login

This command posts credentials stored in $PORTAL_PASS and captures the authenticated session in session.txt.

wget --load-cookies=session.txt \
  https://portal.example.com/reports/monthly.pdf

Follow up with the resource you actually need while reusing the saved cookies. Remove the cookie file afterward so credentials do not linger on disk.

Example 22: Send Custom Headers

APIs often require bearer tokens or feature flags in headers. Add them with --header.

wget --header="Authorization: Bearer $API_TOKEN" https://api.example.com/v1/metrics

You can repeat --header for additional values like X-Request-ID or content negotiation hints.
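
For example, stacking the token with content negotiation and a tracing header (the X-Request-ID value is illustrative):

wget --header="Authorization: Bearer $API_TOKEN" \
  --header="Accept: application/json" \
  --header="X-Request-ID: deploy-42" \
  https://api.example.com/v1/metrics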

Example 23: Submit Form Data with POST

Push data to an endpoint using --post-data. wget sets the request method to POST automatically.

wget --post-data="name=backup&status=completed" https://hooks.example.com/deploy

Remember that wget is still a download tool: by default it saves the response body to a file named after the URL (here, a file called deploy), which you can then log or inspect for success indicators.
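
To see the response inline instead of hunting for that file, send it to stdout:

wget -qO- --post-data="name=backup&status=completed" https://hooks.example.com/deploy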

Example 24: POST JSON from a File

Load structured payloads from disk using --post-file alongside the correct content type header.

wget --header="Content-Type: application/json" --post-file=payload.json https://api.example.com/v1/import

Pair with --quiet or log redirection when you only need the HTTP status in automation.
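
A sketch of how a script might branch on the outcome, relying on wget returning a nonzero exit status when the server responds with an error:

if wget --quiet --header="Content-Type: application/json" \
    --post-file=payload.json https://api.example.com/v1/import; then
  echo "import accepted"
else
  # $? still holds wget's exit status here (8 indicates a server error response)
  echo "import failed (wget exit code $?)" >&2
fi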

Example 25: Validate TLS with a Custom CA Bundle

Internal services often use private certificate authorities. Point wget at the correct trust store.

wget --ca-certificate=/etc/pki/internal-ca.pem https://intranet.example.com/dashboard/

This maintains TLS validation instead of disabling checks, which keeps connections protected from man-in-the-middle attacks.

Example 26: Temporarily Skip Certificate Checks

Use --no-check-certificate only during controlled testing when you must reach a host with a broken certificate chain. The command below temporarily bypasses verification so you can confirm whether TLS is the culprit.

wget --no-check-certificate https://staging.example.com/healthz

Follow up by fixing the certificate or switching to a trusted CA bundle; leaving this flag in production is risky.

Example 27: Route Traffic Through a Proxy

Respect corporate proxies or route around geofenced resources with -e directives.

wget -e use_proxy=yes -e http_proxy=http://proxy.example.com:3128 https://downloads.example.com/tools/cli.tar.gz

Set https_proxy similarly when you need to route encrypted traffic through the proxy as well.
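
For example (the proxy host and port are placeholders):

wget -e use_proxy=yes -e https_proxy=http://proxy.example.com:3128 https://downloads.example.com/tools/cli.tar.gz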

Example 28: Force IPv4 or IPv6

Debug dual-stack issues by telling wget to use IPv4 (-4) or IPv6 (-6) explicitly.

wget -4 https://downloads.example.com/repository/index.html

If the IPv4 path fails while IPv6 works, or vice versa, you know where to focus your troubleshooting.

Example 29: Customize the User-Agent

Some CDNs tune responses based on user agents. Provide a descriptive string or mimic a browser.

wget --user-agent="LinuxCapable-Wget/1.0 (+https://linuxcapable.com/)" https://downloads.example.com/assets/theme.css

A clear user agent also helps server owners understand legitimate automation traffic in their logs.

Example 30: Quiet Mode with Persistent Logs

Script-friendly output combines --quiet with --append-output, so logs accumulate across runs.

wget --quiet --append-output=/var/log/wget-sync.log -P /srv/mirror https://mirror.example.com/latest.tar.gz

Check the log during audits without losing earlier entries, which is perfect for long-lived cron jobs.
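
A nightly crontab entry for this sync might look like the following (schedule and paths are placeholders):

0 2 * * * wget --quiet --append-output=/var/log/wget-sync.log -P /srv/mirror https://mirror.example.com/latest.tar.gz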

Conclusion

The wget command remains essential in any Linux toolkit because it delivers predictable, scriptable downloads that integrate seamlessly into automation workflows. From simple file retrieval to complex site mirroring with authentication, the 30 examples above show how wget handles real-world scenarios that graphical tools struggle with.

Start with the basics: resuming interrupted downloads with -c, limiting bandwidth with --limit-rate, and running transfers in the background with -b. As your needs grow, layer on authentication options, cookie management, and recursive crawling to build robust automation around software updates, backup retrieval, or content synchronization. The beauty of wget is that each option composes cleanly with others, so you can build exactly the download behavior your infrastructure demands without fighting the tool.

Keep this reference handy when scripting deployments or troubleshooting downloads. The more deliberately you apply wget’s options to match your network conditions and server requirements, the more reliable your automation becomes.
