
What are the performance optimizations and best practices for cURL?

March 7, 19:38

Careful use of timeouts, retries, connection reuse, and compression significantly improves cURL's efficiency and reliability. Mastering these techniques is crucial in production environments.

Timeout Settings

bash
# Connection timeout (seconds)
curl --connect-timeout 10 https://api.example.com

# Maximum total transfer time (seconds)
curl --max-time 30 https://api.example.com

# Combined usage
curl --connect-timeout 5 --max-time 30 https://api.example.com

# Sub-second precision (decimal values)
curl --connect-timeout 3.5 --max-time 10.5 https://api.example.com
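When a timeout fires, curl exits with a distinct code that scripts can branch on. A minimal sketch (the codes are from curl's manual; the helper function name is ours):

```shell
#!/bin/bash
# Map common curl exit codes to human-readable messages.
explain_curl_exit() {
  case "$1" in
    0)  echo "success" ;;
    6)  echo "could not resolve host" ;;
    7)  echo "failed to connect" ;;
    28) echo "operation timed out (--connect-timeout or --max-time hit)" ;;
    *)  echo "curl error $1" ;;
  esac
}

# Usage sketch: curl --connect-timeout 5 --max-time 30 "$url"; explain_curl_exit $?
explain_curl_exit 28
```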

Retry Mechanism

bash
# Automatic retry on transient errors (exponential backoff by default)
curl --retry 3 https://api.example.com

# Fixed retry delay in seconds (disables the default exponential backoff)
curl --retry 3 --retry-delay 2 https://api.example.com

# Cap the total time spent across all retries
curl --retry 3 --retry-max-time 10 https://api.example.com

# Also retry when the connection is refused
curl --retry 5 --retry-connrefused https://api.example.com
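When curl's built-in retry policy is not flexible enough, you can wrap any command in a hand-rolled exponential backoff loop. A sketch (the function name and delay schedule are ours):

```shell
#!/bin/bash
# Retry a command with exponential backoff: 1s, 2s, 4s, ...
retry_backoff() {
  local max=$1; shift
  local delay=1 attempt=1
  while true; do
    "$@" && return 0
    if [ "$attempt" -ge "$max" ]; then
      echo "giving up after $attempt attempts" >&2
      return 1
    fi
    sleep "$delay"
    delay=$((delay * 2))
    attempt=$((attempt + 1))
  done
}

# Usage sketch: retry_backoff 3 curl -fsS https://api.example.com
```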

Connection Reuse

bash
# TCP keepalive probes (keeps idle connections from being dropped)
curl --keepalive-time 60 https://api.example.com

# Multiple URLs in one invocation reuse the same connection
curl https://api.example.com/users https://api.example.com/orders

# Limit transfer speed (bandwidth, not connection count)
curl --limit-rate 100K https://api.example.com

# Concurrent requests (using xargs; each process opens its own connection)
cat urls.txt | xargs -P 5 -I {} curl -s {} -o /dev/null

Compressed Transfer

bash
# Request a compressed response (curl decompresses it automatically)
curl --compressed https://api.example.com

# Manual header (response stays compressed; you must decompress it yourself)
curl -H "Accept-Encoding: gzip, deflate" https://api.example.com
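To get a local feel for what compression saves, gzip a representative payload yourself; the on-the-wire saving from `Accept-Encoding: gzip` is the same ratio. A rough sketch with no network involved (file names are ours):

```shell
#!/bin/bash
# Compare raw vs gzipped size of a repetitive JSON-like payload.
yes '{"id":1,"name":"user","active":true}' | head -n 50 > body.json
gzip -c body.json > body.json.gz   # -c writes to stdout, keeping the original
orig=$(wc -c < body.json)
comp=$(wc -c < body.json.gz)
echo "original=${orig} bytes, gzipped=${comp} bytes"
```

Repetitive text like JSON typically compresses by well over half, which is where the 60-80% figure for API responses comes from.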

Performance Analysis

bash
# Detailed performance metrics
curl -w "DNS: %{time_namelookup}s\nConnect: %{time_connect}s\nSSL: %{time_appconnect}s\nTTFB: %{time_starttransfer}s\nTotal: %{time_total}s\nSize: %{size_download}B\nSpeed: %{speed_download}B/s\n" \
  -o /dev/null -s https://api.example.com

# Save performance report (format template read from a file)
curl -w "@perf-format.txt" -o /dev/null -s https://api.example.com > perf.log
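The `-w "@perf-format.txt"` form reads the write-out template from a file. One possible layout for that file (the contents are a sketch; the `%{...}` variables are curl's own):

```text
DNS lookup:    %{time_namelookup}s
TCP connect:   %{time_connect}s
TLS handshake: %{time_appconnect}s
TTFB:          %{time_starttransfer}s
Total:         %{time_total}s
Downloaded:    %{size_download} bytes at %{speed_download} B/s
```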

Batch Processing Optimization

bash
#!/bin/bash
# Batch download (parallel, plain background jobs)
urls=(
  "https://api.example.com/users"
  "https://api.example.com/products"
  "https://api.example.com/orders"
)
for url in "${urls[@]}"; do
  curl -s "$url" -o "$(basename "$url").json" &
done
wait

# Using GNU Parallel
cat urls.txt | parallel -j 4 curl -s {} -o {/}.json

# Connection reuse: several requests in one invocation via a config file on stdin
curl -K - <<EOF
url = "https://api.example.com/users"
output = "users.json"
url = "https://api.example.com/products"
output = "products.json"
EOF
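Plain `&`-backgrounding launches every job at once; for long URL lists you usually want a concurrency cap. A sketch without GNU Parallel (the `fetch` stub stands in for a real curl call):

```shell
#!/bin/bash
# Cap concurrent background fetches at $max_jobs.
fetch() {
  echo "fetched $1"   # replace with: curl -s "$1" -o "$(basename "$1").json"
}

max_jobs=4
while read -r url; do
  # Throttle: poll until a job slot is free before launching the next one.
  while [ "$(jobs -rp | wc -l)" -ge "$max_jobs" ]; do
    sleep 0.1
  done
  fetch "$url" &
done <<'EOF'
https://api.example.com/users
https://api.example.com/products
https://api.example.com/orders
EOF
wait   # let the remaining jobs finish
```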

Memory and Resource Optimization

bash
# Limit download speed
curl --limit-rate 1M https://example.com/large-file.zip -O

# Ranged (chunked) download
curl -r 0-1024000 https://example.com/large-file.zip -o part1.zip
curl -r 1024001-2048000 https://example.com/large-file.zip -o part2.zip

# Resume an interrupted download
curl -C - -O https://example.com/large-file.zip

# Stream processing (pipe instead of buffering to disk)
curl -s https://api.example.com/stream | jq '.[] | .name'
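Parts fetched with `-r` must be concatenated in byte order afterwards. A sketch using local dummy files in place of the real range downloads:

```shell
#!/bin/bash
# Reassemble ranged downloads; printf stands in for the curl -r fetches.
printf 'AAAA' > part1.zip   # stands in for: curl -r 0-1024000 ... -o part1.zip
printf 'BBBB' > part2.zip   # stands in for: curl -r 1024001-2048000 ... -o part2.zip
cat part1.zip part2.zip > large-file.zip

# Sanity check: the reassembled size equals the sum of the parts.
wc -c < large-file.zip
```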

Best Practices Checklist

1. Security

bash
# Always verify SSL certificates (the default; avoid -k/--insecure)
curl https://api.example.com

# Pin a modern TLS version range
curl --tlsv1.2 --tls-max 1.3 https://api.example.com

# Don't expose passwords on the command line; let cURL prompt
curl -u "username" https://api.example.com

# Use environment variables for tokens
curl -H "Authorization: Bearer $API_TOKEN" https://api.example.com
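Another way to keep credentials out of the command line (and shell history) is a netrc file, which curl reads with `--netrc-file`. A sketch; the host, login, and password below are placeholders:

```shell
#!/bin/bash
# Store credentials in a private netrc file instead of on the command line.
cat > netrc-demo <<'EOF'
machine api.example.com
login username
password s3cret
EOF
chmod 600 netrc-demo   # keep the file readable only by you

# Usage sketch (network call not run here):
# curl --netrc-file netrc-demo https://api.example.com
```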

2. Reliability

bash
# Set reasonable timeouts
curl --connect-timeout 10 --max-time 60 https://api.example.com

# Enable the retry mechanism
curl --retry 3 --retry-delay 2 --retry-max-time 30 https://api.example.com

# Fail on HTTP errors (4xx/5xx)
curl -f https://api.example.com || echo "Request failed"

# Check the exit code
if ! curl -s https://api.example.com; then
  echo "Error occurred"
fi
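`-f` only reports pass/fail; pairing `-w '%{http_code}'` with a small status check lets scripts react to the exact status. A sketch (the helper name is ours):

```shell
#!/bin/bash
# Treat only 2xx HTTP statuses as success.
is_success_status() {
  case "$1" in
    2??) return 0 ;;
    *)   return 1 ;;
  esac
}

# Usage sketch:
# status=$(curl -s -o /dev/null -w '%{http_code}' https://api.example.com)
# is_success_status "$status" || echo "HTTP error: $status"
```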

3. Maintainability

bash
# Use a configuration file: ~/.curlrc
verbose
connect-timeout = 10
max-time = 60
retry = 3

# Use variables and functions
#!/bin/bash
API_BASE="https://api.example.com/v1"
TOKEN="your_token"

api_call() {
  curl -s -H "Authorization: Bearer $TOKEN" "$API_BASE/$1"
}

api_call "users"
api_call "products"

4. Logging and Monitoring

bash
# Log requests
curl -v https://api.example.com 2>&1 | tee request.log

# Structured logging
curl -w "$(date '+%Y-%m-%d %H:%M:%S') | URL: %{url_effective} | Status: %{http_code} | Time: %{time_total}s\n" \
  -o /dev/null -s https://api.example.com >> api.log

# Monitoring script
#!/bin/bash
while true; do
  STATUS=$(curl -w "%{http_code}" -o /dev/null -s https://api.example.com/health)
  echo "$(date): $STATUS"
  [ "$STATUS" != "200" ] && echo "Alert: API unhealthy!" | mail -s "API Alert" admin@example.com
  sleep 60
done

Performance Comparison

| Optimization        | Before                   | After               | Improvement       |
| Connection reuse    | New connection each time | Keep-Alive          | 30-50%            |
| Compressed transfer | Original size            | Gzip compression    | 60-80%            |
| Concurrent requests | Serial execution         | Parallel processing | 3-5x              |
| DNS caching         | Resolve each time        | Cache results       | 10-20%            |
| Resume transfer     | Re-download              | Resume              | Bandwidth savings |

Complete Optimization Example

bash
#!/bin/bash
# Production-grade API call script

# Configuration
API_URL="https://api.example.com/v1/data"
TOKEN="${API_TOKEN:-$(cat ~/.api_token)}"
TIMEOUT=30
RETRY=3

# Call function
optimized_call() {
  curl -s -S \
    --connect-timeout 10 \
    --max-time "$TIMEOUT" \
    --retry "$RETRY" \
    --retry-delay 2 \
    --compressed \
    --tlsv1.2 \
    -H "Authorization: Bearer $TOKEN" \
    -H "Content-Type: application/json" \
    -H "Accept: application/json" \
    -H "User-Agent: MyApp/1.0" \
    -w "\n{\"status\":%{http_code},\"time\":%{time_total},\"size\":%{size_download}}\n" \
    "$@"
}

# Execute
response=$(optimized_call "$API_URL")

# Handle response
if [ $? -eq 0 ]; then
  echo "Success: $response" | jq '.'
else
  echo "Error: Request failed"
  exit 1
fi

Tags: cURL