
What are the redirection and pipe mechanisms in Shell scripts? How to use them?

March 6, 21:33

Redirection and pipes are fundamental Shell mechanisms for inter-process communication and for controlling where data flows.

Standard Input/Output

Three Standard Streams

```bash
stdin  (0)  # Standard input  - read from the keyboard by default
stdout (1)  # Standard output - written to the screen by default
stderr (2)  # Standard error  - written to the screen by default
```

Output Redirection

Redirect Standard Output

```bash
# Overwrite redirection (>)
echo "Hello" > file.txt      # Create or overwrite the file
ls -l > filelist.txt

# Append redirection (>>)
echo "World" >> file.txt     # Append to the end of the file
date >> log.txt
```

Redirect Standard Error

```bash
# Redirect error output
ls /nonexistent 2> error.log

# Append error output
ls /nonexistent 2>> error.log

# Redirect both output and error
command > output.txt 2>&1
command &> output.txt        # Bash shorthand (not POSIX)

# Redirect them separately
command > output.txt 2> error.txt
```

Discard Output

```bash
# Discard standard output
command > /dev/null

# Discard error output
command 2> /dev/null

# Discard all output
command > /dev/null 2>&1
command &> /dev/null
```

Input Redirection

Read from File

```bash
# Read input from a file
wc -l < file.txt

# Multi-line input combined with output redirection
cat << EOF > script.sh
#!/bin/bash
echo "Hello, World!"
EOF
```

Here Document

```bash
# Multi-line input
cat << EOF
This is line 1
This is line 2
This is line 3
EOF

# Variables are expanded by default
name="John"
cat << EOF
Hello, $name!
EOF

# Quote the delimiter to disable variable substitution
cat << 'EOF'
This is $name - not expanded
EOF
```

Here String

```bash
# Single-line input
wc -w <<< "Hello World"

# Feed a variable to a command
text="Hello World"
grep "World" <<< "$text"
```

Pipes

Basic Pipes

```bash
# Use the output of one command as the input of another
ps aux | grep nginx
cat file.txt | grep "pattern"
ls -l | sort -k5 -n

# Chain multiple pipes
cat file.txt | grep "pattern" | wc -l
```

Pipes with Redirection

```bash
# Write to the screen and to a file at the same time
command | tee output.txt

# Pipe error output along with standard output
command 2>&1 | grep "error"

# Read a command's output line by line via process substitution
while read line; do
    echo "Line: $line"
done < <(ls -l)
```

Process Substitution

Basic Syntax

```bash
# <(command) - read the command's output as if it were a file
diff <(ls dir1) <(ls dir2)

# >(command) - write to the command's input as if it were a file
tar -cf >(gzip > archive.tar.gz) directory/

# Multiple process substitutions
paste <(cut -f1 file1) <(cut -f2 file2)
```

Practical Applications

```bash
# Compare two directories
diff <(ls dir1) <(ls dir2)

# Combine two fields of the same file side by side
paste <(cut -d: -f1 /etc/passwd) <(cut -d: -f3 /etc/passwd)

# Real-time monitoring
tail -f /var/log/syslog | grep --line-buffered "ERROR" | tee errors.log
```

Advanced Redirection

File Descriptor Operations

```bash
# Open a custom file descriptor for writing
exec 3> output.txt
echo "Line 1" >&3
echo "Line 2" >&3
exec 3>&-                      # Close fd 3

# Read from a file descriptor
exec 4< input.txt
read line <&4
echo "$line"
exec 4<&-                      # Close fd 4

# Swap stdout and stderr (fd 3 is a temporary copy, closed afterwards)
command 3>&1 1>&2 2>&3 3>&-
```

Simultaneous Read/Write

```bash
# Open the same file for both reading and writing
exec 3<> file.txt
read -u 3 line
echo "New content" >&3
exec 3>&-
```

Practical Application Examples

Log Processing

```bash
# Separate normal output and error output
./script.sh 1> success.log 2> error.log

# Merge both into one log
./script.sh > all.log 2>&1

# Real-time log monitoring
tail -f /var/log/app.log | grep --line-buffered "ERROR" | tee errors.log
```

Data Processing

```bash
# Extract and deduplicate CSV columns
cut -d, -f1,3 data.csv | sort -u > output.txt

# Count occurrences of each client IP in an access log
awk '{print $1}' access.log | sort | uniq -c | sort -rn > stats.txt

# Batch processing
find . -name "*.txt" -exec cat {} \; | grep "pattern" > results.txt
```

System Administration

```bash
# Snapshot important command output with a dated filename
df -h > disk_usage_$(date +%Y%m%d).txt
ps aux > process_list_$(date +%Y%m%d).txt

# Simple periodic monitor
while true; do
    date >> monitor.log
    free -m >> monitor.log
    echo "---" >> monitor.log
    sleep 60
done
```

Script Development

```bash
# Capture command output
result=$(command)
echo "Result: $result"

# Process multi-line output safely
while IFS= read -r line; do
    echo "Processing: $line"
done < <(find . -name "*.sh")

# Create and clean up a temporary file
tmpfile=$(mktemp)
command > "$tmpfile"
# ... process the file ...
rm -f "$tmpfile"
```

Best Practices

  1. Use /dev/null to discard unwanted output: Reduce log noise
  2. Handle stdout and stderr separately: Easier debugging and log analysis
  3. Use pipes to combine commands: Improve processing efficiency
  4. Use tee to output and save simultaneously: Easy real-time monitoring
  5. Be aware of pipe buffering: Use --line-buffered for real-time data
  6. Use process substitution: Avoid creating temporary files
  7. Close file descriptors promptly: Prevent resource leaks
  8. Use mktemp for temporary files: Avoid filename conflicts
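Several of these practices can be combined in one place. The following is a minimal sketch, assuming Bash (for process substitution) and using invented log content and filenames purely for illustration:

```bash
#!/usr/bin/env bash
# Sketch combining several best practices; sample data is hypothetical.

# Practice 8: mktemp avoids filename conflicts
tmpfile=$(mktemp)
trap 'rm -f "$tmpfile"' EXIT   # remove the temp file when the script exits

# Create some sample log data
printf 'INFO started\nERROR disk full\nINFO done\n' > "$tmpfile"

# Practice 1: discard noise we expect and do not care about
ls /nonexistent 2> /dev/null || true

# Practices 3 and 6: pipes plus process substitution, no second temp file
errors=$(grep -c "ERROR" <(grep -v '^INFO' "$tmpfile"))

# Practice 4: tee shows the result and records it at the same time
echo "error lines: $errors" | tee "$tmpfile.report"
```

The `trap ... EXIT` handler stands in for practice 7: the resource (here a temp file rather than a file descriptor) is released even if the script fails partway through.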
Tags: Shell