cURL Command in Linux: Complete Guide with Examples

The cURL command in Linux is a dependable way to move data between your machine and remote services. Whether you are pulling release artifacts, debugging HTTP headers, or wiring a script into a Representational State Transfer (REST) API, cURL delivers reliable behavior across distributions. This guide walks you through installing cURL, understanding the syntax that powers every command, scripting API requests, and tuning network behavior beyond basic downloads.

If you want a quick comparison for file transfers, pair this guide with our wget command examples so you know when each tool fits best.

Verify and Install cURL

Check Your Existing Installation

Most Linux systems include cURL by default. Before proceeding, verify the version so you know which features and protocol support are available:

curl --version

The output shows the cURL version, supported protocols, and enabled features:

curl 8.5.0 (x86_64-pc-linux-gnu) libcurl/8.5.0 OpenSSL/3.0.13 zlib/1.3
Release-Date: 2023-12-06
Protocols: dict file ftp ftps gopher gophers http https imap imaps mqtt pop3 pop3s rtsp smb smbs smtp smtps telnet tftp
Features: alt-svc AsynchDNS HSTS HTTP2 HTTPS-proxy IPv6 Largefile libz NTLM SSL threadsafe TLS-SRP UnixSockets

If the command is missing on a minimal image, install it using the package manager for your distribution.

Each distribution family ships its own package manager: APT on Ubuntu and Debian, DNF on Fedora and RHEL-based systems, Pacman on Arch, APK on Alpine, Zypper on openSUSE, Portage on Gentoo, and XBPS on Void.

Ubuntu and Debian: Install cURL with APT

sudo apt update
sudo apt install curl

Fedora, CentOS Stream, AlmaLinux, and Rocky Linux: Install cURL with DNF

On Fedora 41 and newer, the dnf command already maps to DNF5, so you can rely on familiar syntax:

sudo dnf install curl

Arch Linux and Derivatives: Install cURL with Pacman

sudo pacman -S curl

Alpine Linux: Install cURL with APK

sudo apk add curl

openSUSE and SUSE Linux Enterprise: Install cURL with Zypper

sudo zypper install curl

Gentoo: Install cURL with Portage

sudo emerge net-misc/curl

Void Linux: Install cURL with XBPS

sudo xbps-install curl

Package repositories keep cURL patched against the latest security fixes, so fold these commands into your regular maintenance schedule.

Understand the cURL Syntax

If you are new to cURL, think of it as a command-line download manager that also speaks APIs, FTP, SMTP, and more. Simply describe what you want, and cURL handles the protocol details for you.

Almost every cURL command follows the same structure:

curl [options] [URL]

To better understand this, break the syntax down into three parts so you can combine flags confidently:

  • curl: The binary itself. Many shells already include it in PATH, so you can call it from scripts or interactive terminals.
  • [options]: Optional flags that change behavior. Chain as many as you need to handle authentication, redirects, headers, output files, or protocol tweaks.
  • [URL]: The target endpoint. cURL accepts multiple URLs in one invocation, and you can mix schemes like https://, ftp://, and dict://.

At its simplest, cURL fetches a resource and prints it to standard output:

curl https://example.com/status

From there, you can layer on additional options as needed. Some of the most useful flags include:

  • -o file: Write response data to a file of your choice.
  • -O: Save the response to a file using the remote name.
  • -L: Follow HTTP redirects automatically.
  • -H: Add custom request headers.
  • -d or --data: Send request bodies (POST, PUT, PATCH).
  • -u user:pass: Supply HTTP basic authentication credentials.
  • -I: Fetch only the response headers.
  • -v: Enable verbose output for troubleshooting.

For quick reference, memorize a few go-to flags so you can work faster:

Task | Useful Options | What They Do
Downloads | -O, -o, --remote-name-all, --limit-rate | Control filenames and bandwidth so large transfers finish reliably.
APIs & Forms | -d, --data-urlencode, --form, -H | Post data safely and mirror how browsers format requests.
Authentication | -u, --oauth2-bearer, --cert | Handle credentials for basic auth, tokens, or client certificates.
Debugging | -v, --trace, --fail-with-body | Capture verbose traffic details and fail fast when APIs break.
Networking | -x, --resolve, --interface | Route calls through proxies or custom hosts during testing.

Download and Upload Files

cURL handles one-off downloads and scripted transfers. The examples below focus on the download workflows you will use most often.

Fetch Output to the Terminal

Run a quick GET request and print the response to standard output:

curl https://example.com/status

Alternatively, pipe the output to jq, the lightweight command-line JSON processor, or similar tooling to format JSON responses directly in pipelines.

Save a File with Remote or Custom Names

Use -O to store a file using the server filename, or -o to specify your own name:

# Keep the original filename
curl -O https://downloads.example.com/app.tar.gz

# Choose a custom output path
curl -o latest-release.tar.gz https://downloads.example.com/app.tar.gz

Combine this with -L when the download link redirects through a short URL or CDN.

Add --fail (or -f) to exit with an error on HTTP failures instead of saving an error page as a release file; use --fail-with-body when you also want the error response kept for inspection:

curl -L --fail -O https://downloads.example.com/app.tar.gz

Limit the number of redirects cURL will follow to prevent infinite redirect loops:

curl -L --max-redirs 5 -O https://short.link/abc123

This protects against malicious redirect chains while still following legitimate short URLs or CDN routing.

Resume Interrupted Downloads

To resume a partial download, simply add -C -. When you do this, cURL inspects the local file and continues from the previous offset:

curl -C - -O https://downloads.example.com/app.tar.gz

This works well on unstable connections or when pulling large artifacts to remote servers. Make sure the server supports HTTP range requests by checking for an Accept-Ranges header before relying on resumes.
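
The Accept-Ranges check is easy to script. In this sketch, a canned header sample stands in for the output of curl -sI on your URL so the example runs without a network; swap in a real capture before relying on it:

```shell
#!/bin/sh
# Sketch: check for range support before relying on resumable downloads.
# The canned sample below stands in for real output from: curl -sI "$url"
headers='HTTP/2 200
accept-ranges: bytes
content-length: 104857600'

supports_ranges() {
  # Case-insensitive match so HTTP/1.1 "Accept-Ranges" also counts
  printf '%s\n' "$1" | grep -qi '^accept-ranges: *bytes'
}

if supports_ranges "$headers"; then
  echo "range support detected; resume with: curl -C - -O <url>"
else
  echo "no range support; restart the download from scratch"
fi
```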

Download Multiple Files in One Pass

Queue several URLs in a single command so you can stage a release or sync artifacts without looping in a script:

curl -O https://downloads.example.com/file1.tar.gz \
     -O https://downloads.example.com/file2.tar.gz

cURL processes the URLs from left to right. When you need nested folders created automatically, switch to -o /path/to/file.tar.gz for each URL and pair those paths with --create-dirs.
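
Looping over a URL list pairs naturally with -o and --create-dirs. This sketch derives nested output paths from each URL; run() only echoes the composed command so the example stays a dry run, and the URLs are hypothetical:

```shell
#!/bin/sh
# Sketch: stage several artifacts into nested folders derived from their
# URLs. run() echoes the composed command (a dry run); replace its body
# with "$@" to execute for real. URLs are hypothetical.
run() { echo "$@"; }

for url in \
  https://downloads.example.com/v1/file1.tar.gz \
  https://downloads.example.com/v2/file2.tar.gz
do
  name=${url##*/}                            # file1.tar.gz
  version=${url%/*}; version=${version##*/}  # v1
  run curl -o "artifacts/$version/$name" --create-dirs "$url"
done
```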

Show Download Progress

Display a progress bar for large downloads so you can monitor transfer speed and remaining time:

curl -# -O https://downloads.example.com/large-file.iso

The -# flag shows a simple progress bar instead of the detailed statistics:

######################################################################## 100.0%

Omit the flag to see detailed transfer statistics including speed, time elapsed, and bytes transferred.

Run Downloads in the Background

Kick off a long transfer and reclaim your shell by sending the command to the background:

curl -O https://downloads.example.com/image.qcow2 &

Check progress with jobs -l or bring the task back to the foreground with fg if you need an interactive progress bar. For long-running downloads on remote servers, consider running cURL inside a terminal multiplexer like tmux or screen (tools that keep terminal sessions alive even when you disconnect) so sessions survive disconnections.
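
If a script launches several background transfers, collect their exit codes with wait so failures are not lost. In this sketch a sleep stands in for the long curl call, keeping the example runnable offline:

```shell
#!/bin/sh
# Sketch: background two transfers and collect their exit codes with
# wait. transfer() is a stand-in for a long curl command such as:
#   curl -sS -O https://downloads.example.com/image.qcow2
transfer() { sleep 1; }

transfer & pid1=$!
transfer & pid2=$!

# wait returns each job's exit status, so failures surface here
wait "$pid1" || echo "first transfer failed"
wait "$pid2" || echo "second transfer failed"
echo "all transfers finished"
```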

Upload a File to an HTTP Endpoint

Send a file to a server that accepts PUT uploads or form-based file submissions:

# HTTP PUT upload
curl -T backup.tar.gz https://uploads.example.com/backups/backup.tar.gz

# Multipart form upload
curl -F "file=@backup.tar.gz" https://uploads.example.com/api/import

The -F flag mirrors browser-style form submissions, so the server receives boundary markers (delimiters that separate form fields) and metadata alongside your file.

Work with Web APIs

cURL is the quickest way to test API endpoints, experiment with headers, or script service calls.

Send Custom Headers

Add one or more headers with -H to simulate browser requests or include tokens:

curl -H "Accept: application/json" \
     -H "User-Agent: linuxcapable-cli" \
     https://api.example.com/v1/health

This pattern keeps your test calls aligned with production clients, especially when services inspect user agents or accept headers.

Manage Cookies and Sessions

Read cookies from a file or send lightweight session data inline:

# Send a simple cookie inline
curl -b "name=value" https://www.example.com

# Load cookies from a file generated by a previous request
curl -b cookies.txt https://www.example.com/dashboard

Pair cookie files with -c to capture updates from the server between requests.

# Save cookies from a login request
curl -c cookies.txt -d "user=admin&pass=secret" https://api.example.com/login

# Use saved cookies in subsequent requests
curl -b cookies.txt https://api.example.com/dashboard

The -c flag stores server-set cookies in the file, letting you maintain authenticated sessions across multiple commands without re-authenticating.

Override User Agent Strings

Override the default user agent (the string identifying your client to the server, similar to how browsers identify themselves) when services gate behavior on client identity:

curl -A "Mozilla/5.0" https://www.example.com

Keep a short list of trusted strings for browsers, CLI tools, or monitoring agents so you can reproduce issues quickly.

Send JSON in a POST Request

When an API accepts JSON payloads, set the method, content type, and body data explicitly:

curl -X POST https://api.example.com/v1/tickets \
     -H "Content-Type: application/json" \
     -d '{"title":"Network latency","priority":"high"}'

A successful POST returns the created resource:

{
  "id": 1234,
  "title": "Network latency",
  "priority": "high",
  "status": "open",
  "created_at": "2025-11-15T10:30:45Z"
}

Use single quotes around the JSON payload to keep the shell from interpreting special characters. Reach for --data-binary if the body includes escaped newlines or needs to remain untouched.
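
For larger payloads, a here-document sidesteps most quoting issues: shell variables interpolate cleanly, and the finished body can be piped to curl with -d @- (read the body from standard input). The endpoint in the comment is hypothetical:

```shell
#!/bin/sh
# Sketch: assemble a JSON body in a here-document so shell variables
# interpolate without escaping gymnastics, then hand it to curl via -d @-.
title="Network latency"
priority="high"

payload=$(cat <<EOF
{"title":"$title","priority":"$priority"}
EOF
)

printf '%s\n' "$payload"
# Real call (hypothetical endpoint):
# printf '%s' "$payload" | curl -X POST https://api.example.com/v1/tickets \
#      -H "Content-Type: application/json" -d @-
```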

Submit Form Data

Encode form fields the same way browsers do by using --data-urlencode for individual entries:

curl https://api.example.com/v1/search \
     --data-urlencode "query=error logs" \
     --data-urlencode "limit=25"

This method handles spaces, ampersands, and special characters without double quoting the entire payload. Pipe the output through grep when you need to filter large result sets.
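
To see roughly what --data-urlencode produces, here is a minimal pure-shell percent-encoder. It is a sketch for illustration only, not a replacement for the flag:

```shell
#!/bin/sh
# Sketch: percent-encode a string the way --data-urlencode does.
# Unreserved characters pass through; everything else becomes %XX.
urlencode() {
  s=$1
  out=""
  while [ -n "$s" ]; do
    c=${s%"${s#?}"}    # first character of the remaining string
    s=${s#?}           # drop that character
    case "$c" in
      [A-Za-z0-9.~_-]) out="$out$c" ;;
      *) out="$out$(printf '%%%02X' "'$c")" ;;
    esac
  done
  printf '%s\n' "$out"
}

urlencode "error logs"    # error%20logs
```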

Send Data from a File

Point cURL at a file when you keep request bodies under version control or generate them from templates:

curl -X POST https://api.example.com/v1/import \
     -H "Content-Type: application/json" \
     -d @payload.json

The @ prefix reads the file contents verbatim, which keeps large JSON blobs and newline-heavy files intact.

Send DELETE or PUT Requests

Override the default GET method when you need to interact with RESTful endpoints that update or remove resources:

# Delete a record
curl -X DELETE https://api.example.com/v1/tickets/42 \
     -H "Authorization: Bearer $API_TOKEN"

# Replace a record with PUT
curl -X PUT https://api.example.com/v1/tickets/42 \
     -H "Content-Type: application/json" \
     -d '{"status":"closed"}'

Pair these verbs with --fail-with-body when you want automation to stop on HTTP errors but still capture the response text.

Handle Authentication and Security

APIs frequently require credentials or tighter Transport Layer Security (TLS) settings. cURL provides those options through concise flags.

Authenticate with Basic Credentials

Supply a username and password with -u. cURL prompts for the password if you omit it:

curl -u admin https://api.example.com/v1/metrics

For security, use an environment variable or credential helper to avoid storing secrets in shell history.
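
One way to enforce that habit is a small guard that fails fast when a credential variable is missing. The variable name and the commented curl call are hypothetical:

```shell
#!/bin/sh
# Sketch: fail fast when a credential variable is missing instead of
# baking secrets into the command line. METRICS_USER is a hypothetical
# variable name; export it in your session or via a secrets helper.
METRICS_USER=${METRICS_USER:-admin}   # demo value; remove in real use

require_env() {
  # Look up the variable named in $1; complain if it is empty or unset
  eval "val=\${$1:-}"
  [ -n "$val" ] || { echo "error: $1 is not set" >&2; return 1; }
}

require_env METRICS_USER && echo "credentials present"
# Real call, with the password prompted rather than stored:
# curl -u "$METRICS_USER" https://api.example.com/v1/metrics
```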

Send Bearer or API Tokens

Most modern APIs expect tokens in the Authorization header:

curl https://api.example.com/v1/profile \
     -H "Authorization: Bearer $API_TOKEN"

Export the token in your session or read it from a secrets manager to keep scripts portable.

Use Client Certificates

Mutual TLS endpoints require a certificate and private key:

curl https://secure.example.com/report \
     --cert /etc/ssl/certs/client.pem \
     --key /etc/ssl/private/client.key

Before proceeding, confirm file permissions with chmod and combine the cert and key into a single PEM file if your provider expects that layout.
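
Combining the pair into a single file is a short scripted step. This sketch uses placeholder certificate text in a temporary directory; real material would come from /etc/ssl, and the curl call in the comment is illustrative:

```shell
#!/bin/sh
# Sketch: merge a client certificate and key into one PEM file with
# owner-only permissions. Placeholder text stands in for real material.
workdir=$(mktemp -d)
printf -- '-----BEGIN CERTIFICATE-----\n...\n-----END CERTIFICATE-----\n' > "$workdir/client.pem"
printf -- '-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n' > "$workdir/client.key"

umask 077   # files created below are readable by the owner only
cat "$workdir/client.pem" "$workdir/client.key" > "$workdir/combined.pem"

grep -c -- '-----BEGIN' "$workdir/combined.pem"   # expect 2: cert and key
# curl https://secure.example.com/report --cert "$workdir/combined.pem"
```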

Validate TLS Configuration

cURL validates HTTPS certificates by default. Use -k (or --insecure) only for short-lived testing on self-signed labs, and document the risk when you bypass TLS verification.

Whenever possible, provide the issuing certificate chain with --cacert instead so production scripts never skip proper validation.

Point cURL at a trusted certificate bundle to keep verification enabled:

curl --cacert /etc/ssl/certs/ca-bundle.crt https://staging.internal.example.local/health

If you must briefly bypass verification on a staging host, run:

curl -k https://staging.internal.example.local/health

Switch back to --cacert or a trusted certificate as soon as the test completes.

Troubleshoot and Inspect Responses

Verbose diagnostics save minutes when you need to inspect headers or latency.

Inspect Response Headers Without Body Content

Fetch response headers without body content to confirm caching, redirects, or status codes:

curl -I https://example.com/docs

The output shows the response headers:

HTTP/2 200
date: Fri, 15 Nov 2025 10:30:45 GMT
content-type: text/html; charset=UTF-8
server: nginx/1.24.0
cache-control: max-age=3600
etag: "33a64df551425fcc55e4d42a148795d9f25f89d4"
x-frame-options: DENY
content-length: 12450

Add -L to follow redirects until you reach the final destination and inspect the final headers.

Save Headers and Response Body to Separate Files

Store headers in one file and the payload in another, useful for scripted comparisons:

curl -D response.headers \
     -o response.json \
     https://api.example.com/v1/summary
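
With the headers on disk, ordinary text tools pull out single fields for scripted comparisons. The here-document below stands in for a file written by curl -D, so the sketch runs offline:

```shell
#!/bin/sh
# Sketch: extract individual fields from a saved header dump. The sample
# stands in for a file written by: curl -D response.headers ...
cat > response.headers <<'EOF'
HTTP/2 200
content-type: application/json
etag: "33a64df551425fcc"
cache-control: max-age=3600
EOF

# HTTP/2 header names arrive lowercase; tolower() also covers HTTP/1.1
etag=$(awk -F': ' 'tolower($1) == "etag" { print $2 }' response.headers)
max_age=$(sed -n 's/^cache-control: max-age=//p' response.headers)

echo "etag=$etag max_age=$max_age"
rm -f response.headers
```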

Inspect Latency and Traces

Turn on verbose mode to review the full request and response handshake, or print timing details with -w:

curl -v https://api.example.com/v1/health

The verbose output reveals the complete handshake:

* Trying 192.0.2.15:443...
* Connected to api.example.com (192.0.2.15) port 443 (#0)
* ALPN, offering h2
* ALPN, offering http/1.1
* successfully set certificate verify locations:
*   CAfile: /etc/ssl/certs/ca-certificates.crt
* TLSv1.3 (OUT), TLS handshake, Client hello (1):
* TLSv1.3 (IN), TLS handshake, Server hello (2):
* TLSv1.3 (IN), TLS handshake, Encrypted Extensions (8):
* SSL connection using TLSv1.3 / TLS_AES_256_GCM_SHA384
* ALPN, server accepted to use h2
> GET /v1/health HTTP/2
> Host: api.example.com
> User-Agent: curl/8.5.0
> Accept: */*
>
< HTTP/2 200
< content-type: application/json
< date: Fri, 15 Nov 2025 10:30:45 GMT
<
{"status":"healthy","uptime":3600}

Print timing details for performance analysis:

curl -o /dev/null -sS -w "Total: %{time_total}s\n" \
     https://api.example.com/v1/health

The output shows precise timing breakdown:

Total: 0.342s

Expand the format string to capture DNS resolution, connection time, TLS handshake, and transfer speed for comprehensive performance analysis. For deeper analysis, add --trace or --trace-time so you can diff request flows between environments.
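
An expanded format string can feed a simple latency gate in scripts. Here a sample line stands in for real curl -w output so the sketch runs offline, and the 1-second threshold is an arbitrary example:

```shell
#!/bin/sh
# Sketch: a fuller -w format string plus a latency threshold. "sample"
# stands in for a real measurement such as:
#   curl -o /dev/null -sS -w "$fmt" https://api.example.com/v1/health
fmt='dns=%{time_namelookup} connect=%{time_connect} tls=%{time_appconnect} total=%{time_total}\n'

sample='dns=0.012 connect=0.034 tls=0.101 total=0.342'
total=${sample##*total=}

# Flag anything slower than 1 second; awk handles the float comparison
if awk -v t="$total" 'BEGIN { exit !(t > 1.0) }'; then
  echo "slow response: ${total}s"
else
  echo "healthy: ${total}s"
fi
```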

Check HTTP Status Codes

Verify if a URL exists or track redirect chains by printing only the status code:

curl -o /dev/null -s -w "%{http_code}\n" https://example.com/page

This outputs just the status code:

200

Use this pattern in scripts to check link validity, monitor endpoint health, or verify redirects before processing responses.
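
In a script, you can branch on that code. The sketch below keeps the branching logic separate from the curl capture (shown as a comment with a hypothetical URL) so it stays testable offline:

```shell
#!/bin/sh
# Sketch: branch on an HTTP status code captured from curl.
check_health() {
  case "$1" in
    2??) echo "ok" ;;
    3??) echo "redirect" ;;
    *)   echo "down" ;;
  esac
}

# code=$(curl -o /dev/null -s -w "%{http_code}" https://example.com/page)
check_health 200    # ok
check_health 503    # down
```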

Control Network Behavior

Beyond basic transfers, cURL includes flags for proxies, protocol negotiation, throttling, and reliability.

Route Traffic Through a Proxy

Point cURL at an HTTP or SOCKS proxy, with optional authentication credentials:

curl -x http://proxy.example.com:8080 \
     -U proxyuser:proxypass \
     https://api.example.com/v1/health

Prefix the proxy URL with socks5h:// to resolve hostnames on the proxy itself rather than locally, which matters when your local DNS cannot see internal hosts.

Store proxy credentials in environment variables or a .netrc file so they stay out of shell history and process listings.

Override DNS Resolution

Test against staging hosts or force specific backends by supplying custom DNS entries:

curl --resolve api.example.com:443:192.0.2.15 \
     https://api.example.com/v1/health

In this scenario, cURL connects to 192.0.2.15 but still sends api.example.com in the Host header, which is perfect for blue/green cutovers or CDN troubleshooting.

Limit Transfer Rates

Throttle downloads or uploads to avoid saturating a shared connection:

curl --limit-rate 2m -O https://downloads.example.com/app.tar.gz

Note that the value accepts suffixes like k, m, or g for kilobytes, megabytes, or gigabytes per second.

Retry Transient Failures

Automatically retry flaky downloads:

curl --retry 5 --retry-delay 2 --retry-connrefused \
     -O https://downloads.example.com/tool.tar.gz

This command retries the download up to five times, waits a fixed two seconds between attempts (omit --retry-delay to keep cURL's default exponential backoff), and also retries on connection refusal, which helps on rate-limited mirrors.
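
When you need retry behavior beyond what curl's flags cover, for example retrying a whole pipeline, a small wrapper reproduces the pattern. retry and flaky are hypothetical helpers; flaky stands in for a curl call that fails twice before succeeding:

```shell
#!/bin/sh
# Sketch: a manual retry loop with doubling backoff around any command.
retry() {
  max=$1; shift
  delay=${base_delay:-1}    # seconds before the first retry
  n=1
  while ! "$@"; do
    [ "$n" -ge "$max" ] && return 1
    sleep "$delay"
    delay=$((delay * 2))    # double the wait after each failure
    n=$((n + 1))
  done
}

attempts=0
flaky() { attempts=$((attempts + 1)); [ "$attempts" -ge 3 ]; }

base_delay=0    # instant retries for this demo
retry 5 flaky && echo "succeeded after $attempts attempts"
```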

Request Compressed Responses

Ask the server for gzip or brotli compression to save bandwidth on large API responses:

curl --compressed https://api.example.com/metrics

Once enabled, cURL negotiates supported encodings automatically and decompresses the body locally.

Negotiate HTTP Versions

Force HTTP/2 or HTTP/3 when you need to confirm protocol support:

# Prefer HTTP/2
curl --http2 -I https://www.example.com

# Prefer HTTP/3 (requires cURL built with HTTP/3 support)
curl --http3 -I https://www.example.com

If the negotiated protocol differs from your expectation, check the Alt-Svc headers or CDN configuration.

Set Connection Timeouts

Prevent scripts from hanging on slow endpoints by capping connection and total execution time:

curl --connect-timeout 5 --max-time 20 https://api.example.com/v1/health

This example allows five seconds to establish the TCP or TLS session and 20 seconds overall before cURL exits with an error.
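
To apply the same timeouts everywhere, wrap curl once. The curl function below is a stub that echoes its arguments so the sketch runs without a network; delete it to invoke the real binary:

```shell
#!/bin/sh
# Sketch: bake standard timeouts into a wrapper so no call forgets them.
# The curl() function is an offline stub; remove it to use the real binary.
curl() { echo "curl $*"; }

curl_safe() {
  curl --connect-timeout 5 --max-time 20 "$@"
}

curl_safe -sS https://api.example.com/v1/health
```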

Use cURL Beyond HTTP

Beyond HTTP, cURL speaks several other protocols, which helps when you automate legacy systems or need to script cross-protocol workflows.

Interact with FTP Servers

List directories or transfer files when you run into legacy FTP endpoints:

curl -u user:pass ftp://ftp.example.com/public/

Add -T to upload or -O to pull down artifacts. Use ftps:// for implicit FTPS servers (typically on port 990), or keep ftp:// and add --ssl for explicit FTPS endpoints that upgrade a standard connection.

# Explicit FTPS example
curl --ssl -u user:pass ftp://ftp.example.com/public/report.csv -O

Send Mail over SMTP

Automate status updates or alerts by piping a prepared message through SMTP:

curl --url smtp://smtp.example.com \
     --mail-from sender@example.com \
     --mail-rcpt ops@example.com \
     -T email.txt \
     -u acct:Sup3rSecret

For encrypted sessions, swap in smtps:// and rotate credentials regularly.

Query DICT Dictionaries

Keep lightweight lookups in scripts without pulling in full-text dependencies:

curl dict://dict.org/d:latency

This returns concise definitions or translations over the DICT protocol, handy for CLI utilities or dashboards.

Conclusion

The cURL command in Linux gives you a single tool that can download assets, shape API requests, replay traffic through proxies, and validate TLS policies. Keep these command patterns in your shell history or scripts to move faster across infrastructure changes, staging environments, and production incidents.
