The cURL command in Linux is a dependable way to move data between your machine and remote services. Whether you are pulling release artifacts, debugging HTTP headers, or wiring a script into a REST API, cURL gives you consistent behavior across distributions.
If you want a quick comparison for file transfers, pair this guide with our wget command examples so you know when each tool fits best.
Verify and Install cURL
Most Linux systems include cURL by default. Confirm the installation and version with:
curl --version
If cURL is missing on a minimal image, install it with your package manager:
# Ubuntu or Debian
sudo apt update
sudo apt install curl
# Fedora, CentOS Stream, AlmaLinux, or Rocky Linux
sudo dnf install curl
# Arch Linux
sudo pacman -S curl
# Alpine Linux
sudo apk add curl
# OpenSUSE or SUSE
sudo zypper install curl
# Gentoo
sudo emerge net-misc/curl
# Void Linux
sudo xbps-install curl
Package repositories keep cURL patched against known vulnerabilities, so run your regular system updates alongside any API or automation work.
Understand the cURL Syntax
Almost every cURL command follows the same structure:
curl [options] [URL]
Options tweak how the request behaves, while the URL points at the resource to fetch or update. You can chain multiple options to mix authentication, custom headers, and output handling in a single call.
Some of the most useful flags include:
- -o file: Write response data to a file of your choice.
- -O: Save the response to a file using the remote name.
- -L: Follow HTTP redirects automatically.
- -H: Add custom request headers.
- -d or --data: Send request bodies (POST, PUT, PATCH).
- -u user:pass: Supply HTTP basic authentication credentials.
- -I: Fetch only the response headers.
- -v: Enable verbose output for troubleshooting.
Memorize a few go-to flags so you can work faster:
| Task | Useful Options |
|---|---|
| Downloads | -O, -o, --remote-name-all, --limit-rate |
| APIs & Forms | -d, --data-urlencode, --form, -H |
| Authentication | -u, --oauth2-bearer, --cert |
| Debugging | -v, --trace, --fail-with-body |
| Networking | -x, --resolve, --interface |
Download and Upload Files
cURL handles one-off downloads as well as scripted transfers. These examples cover common workflows.
Fetch Output to the Terminal
Run a quick GET request and print the response to standard output:
curl https://example.com/status
Add | jq or similar tooling to parse JSON responses in pipelines.
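For example, a quick health check piped into jq might look like this (assuming jq is installed and the endpoint returns JSON; -s hides the progress meter so only the response reaches the pipe):
curl -s https://example.com/status | jq .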
Save a File with Remote or Custom Names
Use -O to store a file using the server filename, or -o to specify your own name:
# Keep the original filename
curl -O https://downloads.example.com/app.tar.gz
# Choose a custom output path
curl -o latest-release.tar.gz https://downloads.example.com/app.tar.gz
Combine -L when the download link redirects through a short URL or CDN.
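For instance, a redirecting link can be followed to its final target while you pick the local filename (the short URL here is illustrative):
curl -L -o app.tar.gz https://get.example.com/latest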
Resume Interrupted Downloads
Resume a partial download by adding -C -. cURL inspects the local file and continues from the previous offset:
curl -C - -O https://downloads.example.com/app.tar.gz
This is especially handy on unstable connections or when pulling large artifacts to remote servers.
Download Multiple Files in One Pass
Queue several URLs in a single command so you can stage a release or sync artifacts without looping in a script:
curl -O https://downloads.example.com/file1.tar.gz \
-O https://downloads.example.com/file2.tar.gz
cURL processes the URLs from left to right. Pair this with --create-dirs when you want cURL to build nested folders for you.
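As a sketch, explicit -o paths plus --create-dirs let cURL build the target directories for you (the local paths are illustrative):
curl --create-dirs \
  -o releases/v2/file1.tar.gz https://downloads.example.com/file1.tar.gz \
  -o releases/v2/file2.tar.gz https://downloads.example.com/file2.tar.gz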
Run Downloads in the Background
Kick off a long transfer and reclaim your shell by sending the command to the background:
curl -O https://downloads.example.com/image.qcow2 &
Check progress with jobs -l or bring the task back to the foreground with fg if you need an interactive progress bar.
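For example, the standard shell job controls apply (job numbers depend on your session):
# List background transfers and their process IDs
jobs -l
# Bring job 1 back to the foreground
fg %1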
Retrieve Files over FTP
Legacy services often expose updates over FTP. Authenticate inline and fetch a single file or a directory listing:
curl ftp://ftp.example.com/releases/update.iso \
--user deploy:Sup3rSecret
Swap the URL for ftps:// when the server supports TLS so credentials and binaries stay encrypted in transit.
Upload a File to an HTTP Endpoint
Send a file to a server that accepts PUT uploads or form-based file submissions:
# HTTP PUT upload
curl -T backup.tar.gz https://uploads.example.com/backups/backup.tar.gz
# Multipart form upload
curl -F "file=@backup.tar.gz" https://uploads.example.com/api/import
The -F flag mirrors browser-style form submissions, so the server receives boundary markers and metadata alongside your file.
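For example, you can attach regular fields alongside the file part and override its MIME type; the field names below are illustrative:
curl -F "file=@backup.tar.gz;type=application/gzip" \
  -F "description=Nightly backup" \
  https://uploads.example.com/api/import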
Work with Web APIs
cURL is the quickest way to test API endpoints, experiment with headers, or script service calls.
Send Custom Headers
Add one or more headers with -H to simulate browser requests or include tokens:
curl -H "Accept: application/json" \
-H "User-Agent: linuxcapable-cli" \
https://api.example.com/v1/health
This pattern keeps your test calls aligned with production clients, especially when services inspect user agents or accept headers.
Manage Cookies and Sessions
Read cookies from a file or send lightweight session data inline:
# Send a simple cookie inline
curl -b "name=value" https://www.example.com
# Load cookies from a file generated by a previous request
curl -b cookies.txt https://www.example.com/dashboard
Pair cookie files with -c to capture updates from the server between requests.
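A minimal sketch of that flow, with placeholder endpoints and credentials:
# Save cookies issued by the login endpoint
curl -c cookies.txt -d "user=admin&pass=Sup3rSecret" https://www.example.com/login
# Send the saved cookies and keep the jar updated
curl -b cookies.txt -c cookies.txt https://www.example.com/dashboard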
Customize User Agents
Override the default user agent when services gate behavior on client identity:
curl -A "Mozilla/5.0" https://www.example.com
Keep a short list of known user-agent strings for browsers, CLI tools, or monitoring agents so you can reproduce issues quickly.
Send JSON in a POST Request
When an API accepts JSON payloads, set the method, content type, and body data explicitly:
curl -X POST https://api.example.com/v1/tickets \
-H "Content-Type: application/json" \
-d '{"title":"Network latency","priority":"high"}'
Use single quotes around the JSON payload to keep the shell from interpreting special characters. Reach for --data-binary when the body must reach the server byte-for-byte, such as a file whose newlines need to be preserved.
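As a sketch, this posts a payload file exactly as stored on disk (ticket.json is an assumed filename):
curl -X POST https://api.example.com/v1/tickets \
  -H "Content-Type: application/json" \
  --data-binary @ticket.json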
Submit Form Data
Encode form fields the same way browsers do by using --data-urlencode for individual entries:
curl https://api.example.com/v1/search \
--data-urlencode "query=error logs" \
--data-urlencode "limit=25"
This method URL-encodes spaces, ampersands, and other special characters for you, so you don't have to escape the payload by hand.
Send Data from a File
Point cURL at a file when you keep request bodies under version control or generate them from templates:
curl -X POST https://api.example.com/v1/import \
-H "Content-Type: application/json" \
-d @payload.json
The @ prefix reads the request body from the file, which keeps large JSON blobs off your command line. Note that -d strips newlines as it reads, so switch to --data-binary @payload.json when line breaks must be preserved.
Send DELETE or PUT Requests
Override the default GET method when you need to interact with RESTful endpoints that update or remove resources:
# Delete a record
curl -X DELETE https://api.example.com/v1/tickets/42 \
-H "Authorization: Bearer $API_TOKEN"
# Replace a record with PUT
curl -X PUT https://api.example.com/v1/tickets/42 \
-H "Content-Type: application/json" \
-d '{"status":"closed"}'
Pair these verbs with --fail-with-body when you want automation to stop on HTTP errors but still capture the response text.
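A sketch of that pattern; --fail-with-body needs cURL 7.76 or newer and makes the command exit non-zero on HTTP errors while still saving the error response (the output filename is illustrative):
curl --fail-with-body -X DELETE https://api.example.com/v1/tickets/42 \
  -H "Authorization: Bearer $API_TOKEN" \
  -o delete-response.json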
Handle Authentication and Security
APIs frequently require credentials or tighter TLS settings. cURL keeps those options available through concise flags.
Authenticate with Basic Credentials
Supply a username and password with -u. cURL prompts for the password if you omit it:
curl -u admin https://api.example.com/v1/metrics
Use an environment variable or credential helper to avoid storing secrets in shell history.
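For example, reading the password from an environment variable keeps it out of the command itself (ADMIN_PASSWORD is an assumed variable name):
curl -u "admin:$ADMIN_PASSWORD" https://api.example.com/v1/metrics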
Send Bearer or API Tokens
Most modern APIs expect tokens in the Authorization header:
curl https://api.example.com/v1/profile \
-H "Authorization: Bearer $API_TOKEN"
Export the token in your session or read it from a secrets manager to keep scripts portable.
Use Client Certificates
Mutual TLS endpoints require a certificate and private key:
curl https://secure.example.com/report \
--cert /etc/ssl/certs/client.pem \
--key /etc/ssl/private/client.key
Confirm file permissions and combine the cert and key into a single PEM file if your provider expects that layout.
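A sketch of that layout, reusing the paths from the example above and an assumed output filename:
# Concatenate the certificate and key, then lock down permissions
cat /etc/ssl/certs/client.pem /etc/ssl/private/client.key > client-combined.pem
chmod 600 client-combined.pem
curl https://secure.example.com/report --cert client-combined.pem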
Validate TLS Configuration
cURL validates HTTPS certificates by default. Use -k (or --insecure) only for short-lived testing on self-signed labs, and document the risk when you do:
curl -k https://staging.internal.example.local/health
A safer option is to supply the issuing certificate chain with --cacert when working against private certificate authorities.
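For example, pointing cURL at an internal CA bundle (the bundle path is a placeholder):
curl --cacert /etc/ssl/internal-ca.pem https://staging.internal.example.local/health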
Troubleshoot and Inspect Responses
A few diagnostic flags can save minutes when you need to inspect headers or latency.
View Only Headers
Fetch response headers without body content to confirm caching, redirects, or status codes:
curl -I https://example.com/docs
Add -L to follow redirects until you reach the final destination and inspect the final headers.
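For example, this prints the headers of every hop in the redirect chain, ending with the final destination:
curl -I -L https://example.com/docs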
Capture Headers and Body Separately
Store headers in one file and the payload in another, useful for scripted comparisons:
curl -D response.headers \
-o response.json \
https://api.example.com/v1/summary
Inspect Latency and Traces
Turn on verbose mode to review the full request and response handshake, or print timing details with -w:
curl -v https://api.example.com/v1/health
curl -o /dev/null -sS -w "Total: %{time_total}s\n" \
https://api.example.com/v1/health
For deeper analysis, add --trace or --trace-time so you can diff request flows between environments.
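As a sketch, writing a timestamped trace to a file makes it easy to diff runs between environments (trace.log is an assumed filename):
curl --trace trace.log --trace-time https://api.example.com/v1/health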
Control Network Behavior
cURL includes flags for proxies, protocol negotiation, throttling, and reliability.
Route Traffic Through a Proxy
Point cURL at an HTTP or SOCKS proxy, with optional authentication credentials:
curl -x http://proxy.example.com:8080 \
-U proxyuser:proxypass \
https://api.example.com/v1/health
Prefix the proxy URL with socks5h:// to resolve hostnames through a SOCKS proxy.
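For example, routing a request through a local SOCKS proxy such as an SSH tunnel (the address and port are illustrative):
curl -x socks5h://127.0.0.1:1080 https://api.example.com/v1/health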
Override DNS Resolution
Test against staging hosts or force specific backends by supplying custom DNS entries:
curl --resolve api.example.com:443:192.0.2.15 \
https://api.example.com/v1/health
cURL connects to 192.0.2.15 but still sends api.example.com in the Host header, which is perfect for blue/green cutovers or CDN troubleshooting.
Limit Transfer Rates
Throttle downloads or uploads to avoid saturating a shared connection:
curl --limit-rate 2m -O https://downloads.example.com/app.tar.gz
The value accepts suffixes like k, m, or g for kilobytes, megabytes, or gigabytes per second.
Retry Transient Failures
Automatically retry flaky downloads. cURL backs off exponentially between attempts by default, or waits a fixed interval when you set --retry-delay:
curl --retry 5 --retry-delay 2 --retry-connrefused \
-O https://downloads.example.com/tool.tar.gz
This command retries the download up to five times, pauses two seconds between attempts, and also retries on connection refusal, which helps on rate-limited mirrors.
Request Compressed Responses
Ask the server for gzip or brotli compression to save bandwidth on large API responses:
curl --compressed https://api.example.com/metrics
cURL negotiates supported encodings automatically and decompresses the body locally.
Negotiate HTTP Versions
Force HTTP/2 or HTTP/3 when you need to confirm protocol support:
# Prefer HTTP/2
curl --http2 -I https://www.example.com
# Prefer HTTP/3 (requires cURL built with HTTP/3 support)
curl --http3 -I https://www.example.com
If the negotiated protocol differs from your expectation, check the Alt-Svc headers or CDN configuration.
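For example, you can print the negotiated protocol version and scan for Alt-Svc advertisements (both commands use placeholder URLs):
# Show which HTTP version was actually negotiated
curl -sI --http2 -o /dev/null -w "%{http_version}\n" https://www.example.com
# Look for Alt-Svc headers advertising newer protocols
curl -sI https://www.example.com | grep -i '^alt-svc'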
Use cURL Beyond HTTP
cURL speaks several other protocols, which helps when you automate legacy systems or need to script cross-protocol workflows.
Interact with FTP Servers
List directories or transfer files when you run into legacy FTP endpoints:
curl -u user:pass ftp://ftp.example.com/public/
Add -T to upload or -O to pull down artifacts. Switch the URL to ftps:// when the service supports TLS.
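For instance, an upload to the same server might look like this (the target directory and credentials are placeholders):
curl -T build.tar.gz -u deploy:Sup3rSecret ftp://ftp.example.com/incoming/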
Send Mail over SMTP
Automate status updates or alerts by piping a prepared message through SMTP:
curl --url smtp://smtp.example.com \
--mail-from sender@example.com \
--mail-rcpt ops@example.com \
-T email.txt \
-u acct:Sup3rSecret
Swap in smtps:// for encrypted sessions and rotate credentials regularly.
Query DICT Dictionaries
Keep lightweight lookups in scripts without pulling in full-text dependencies:
curl dict://dict.org/d:latency
This returns concise definitions or translations over the DICT protocol, handy for CLI utilities or dashboards.
Conclusion
The cURL command in Linux gives you a single tool that can download assets, shape API requests, replay traffic through proxies, and validate TLS policies. Keep a handful of these command patterns in your shell history or scripts and you will move faster across infrastructure changes, staging environments, and production incidents.