
What are the usage methods and common scenarios of Linux pipes and redirection?

February 17, 23:36

Linux pipes and redirection are core command-line facilities that enable inter-process communication and control over where data flows.

Standard input/output:

  • Standard input (stdin, file descriptor 0): defaults to keyboard
  • Standard output (stdout, file descriptor 1): defaults to screen
  • Standard error (stderr, file descriptor 2): defaults to screen
  • /dev/stdin: standard input device
  • /dev/stdout: standard output device
  • /dev/stderr: standard error device
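As a quick sketch of the three descriptors (the messages here are illustrative), `>&2` duplicates onto file descriptor 2, and `/dev/stdin` names the current standard input:

```shell
# fd 1 is stdout, fd 2 is stderr; >&N duplicates onto descriptor N.
echo "normal output"            # written to fd 1 (stdout)
echo "error output" >&2         # written to fd 2 (stderr)

# /dev/stdin refers to the current stdin, so these are equivalent:
echo "hello" | cat /dev/stdin   # prints: hello
echo "hello" | cat              # prints: hello
```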

Redirection operators:

  • Output redirection:
    • >: redirect standard output to file (overwrite)
    • >>: append standard output to file
    • 2>: redirect standard error to file (overwrite)
    • 2>>: append standard error to file
    • &>: redirect standard output and standard error to file (overwrite)
    • &>>: append standard output and standard error to file
    • > file 2>&1: redirect both standard output and standard error to file
    • 2>&1: redirect standard error to standard output
  • Input redirection:
    • <: read input from file
    • <<: here document, read input from command line until delimiter is encountered
    • <<<: here string, read input from string
  • File descriptor redirection:
    • n>&m: redirect file descriptor n to file descriptor m
    • n<&m: redirect file descriptor n as a copy of file descriptor m
  • Discard output:
    • > /dev/null: redirect standard output to the null device (discard)
    • 2> /dev/null: redirect standard error to the null device (discard)
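A runnable sketch of the operators above (the file names are placeholders, and the script cleans up after itself):

```shell
# > overwrites, >> appends:
echo "first"  >  demo.txt
echo "second" >> demo.txt
cat demo.txt                    # prints: first, then second

# Split streams: stdout to one file, stderr to another
# (|| true keeps the nonzero exit from aborting the script):
ls demo.txt missing-file > ok.txt 2> err.txt || true

# Discard only the errors:
ls missing-file 2> /dev/null || true

# Here document: stdin until the EOF delimiter
cat << EOF
from a here document
EOF

# Here string: stdin from a single string
wc -w <<< "three short words"   # word count: 3

rm -f demo.txt ok.txt err.txt
```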

Pipe operators:

  • |: use standard output of previous command as standard input of next command
  • |&: use both standard output and standard error of previous command as standard input of next command
  • Pipe examples:
    • ps aux | grep nginx: find nginx process
    • cat file.txt | grep "pattern" | wc -l: count matching lines
    • ls -l | awk '{print $5}' | sort -n: sort by file size
    • find / -name "*.log" 2>/dev/null | xargs grep "error": find errors in logs
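A minimal sketch of both pipe forms on inline data (no real processes or logs are needed):

```shell
# Each | feeds the previous command's stdout into the next stdin:
printf 'apple\nbanana\napple\n' | sort | uniq | wc -l   # prints: 2

# In bash, cmd1 |& cmd2 is shorthand for cmd1 2>&1 | cmd2,
# so the error message itself flows down the pipe:
ls /no/such/path 2>&1 | wc -l   # the one-line error is counted: 1
```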

Common pipe commands:

  • grep: text search
  • awk: text processing
  • sed: stream editor
  • sort: sorting
  • uniq: deduplication
  • cut: extract fields or columns
  • head: display first few lines
  • tail: display last few lines
  • wc: count lines, words, characters
  • tee: read standard input and write to file and standard output
  • xargs: build and execute commands from standard input
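A small sketch chaining several of these filters over inline data (the CSV rows are made up for illustration):

```shell
# Keep the first CSV column, sort it (uniq requires sorted
# input), then count how often each name appears:
printf 'alice,30\nbob,25\nalice,31\n' | cut -d, -f1 | sort | uniq -c
# output is a count per name: 2 for alice, 1 for bob
```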

tee command:

  • tee: read standard input, write to file and standard output
  • Common options:
    • -a: append to file (do not overwrite)
    • -i: ignore interrupt signals
  • Examples:
    • command | tee output.txt: save output to file and display on screen
    • command | tee -a output.txt: append output to file and display on screen
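A sketch showing why tee is useful mid-pipeline (build.log is a placeholder name):

```shell
# tee duplicates its stdin: one copy to the file, one to stdout,
# so later pipeline stages still see the data:
echo "build ok" | tee build.log | grep -c "ok"   # prints: 1
cat build.log                                    # prints: build ok
rm -f build.log
```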

xargs command:

  • xargs: read arguments from standard input and execute commands
  • Common options:
    • -n num: use num arguments each time
    • -I str: replace str with input arguments
    • -p: prompt for confirmation before execution
    • -t: display executed commands
  • Examples:
    • find . -name "*.txt" | xargs rm -f: delete all .txt files
    • echo "file1 file2" | xargs -n 1 cat: display file contents one by one
    • ls *.jpg | xargs -I {} cp {} /backup/: copy all jpg files to backup directory
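The two most common options can be sketched on inline input (no files are touched):

```shell
# -n 1 runs the command once per argument:
printf 'a b c' | xargs -n 1 echo
# prints a, b, c on three separate lines

# -I {} substitutes each whole input line where {} appears:
printf 'one\ntwo\n' | xargs -I {} echo "item: {}"
# prints: item: one, then item: two
```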

Process substitution:

  • <(command): treat the output of command as a readable temporary file
  • >(command): treat the input of command as a writable temporary file

  • Examples:
    • diff <(sort file1) <(sort file2): compare sorted files
    • command > >(logger): send output to log system
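A minimal bash-only sketch: `<(...)` expands to a file-like path (such as `/dev/fd/63`) backed by the command's output, so diff can compare two pipelines without temporary files:

```shell
# diff normally needs two file arguments; process substitution
# supplies paths backed by pipelines (bash/zsh only):
diff <(printf 'a\nb\n') <(printf 'b\na\n' | sort) \
  && echo "identical"    # prints: identical
```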

Named pipes (FIFO):

  • mkfifo: create named pipe
  • Named pipes are special files in the filesystem
  • Allow unrelated processes to communicate
  • Examples:
    • mkfifo mypipe
    • cat > mypipe: write data to pipe
    • cat < mypipe: read data from pipe
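The three steps above combine into one script; the FIFO path is a placeholder, and the writer is backgrounded because each end of a FIFO blocks until the other side attaches:

```shell
mkfifo /tmp/demo.fifo
echo "hello via fifo" > /tmp/demo.fifo &   # writer waits for a reader
cat < /tmp/demo.fifo                       # prints: hello via fifo
wait                                       # reap the background writer
rm /tmp/demo.fifo
```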

Practical application examples:

  • Log analysis:
    • tail -f /var/log/nginx/access.log | grep "404": monitor 404 errors in real-time
    • cat access.log | awk '{print $1}' | sort | uniq -c | sort -rn | head -10: count most visited IPs
  • Data processing:
    • cat data.csv | cut -d, -f1 | sort | uniq: extract first column and deduplicate
    • ps aux | awk '{sum+=$3} END {print sum}': calculate total CPU usage
  • System monitoring:
    • vmstat 1 | awk '{print $4}': monitor free memory (column 4 of vmstat)
    • iostat -x 1 | grep -v "^$": monitor disk I/O
  • Batch operations:
    • find . -type f -name "*.sh" | xargs chmod +x: batch add execute permissions
    • cat servers.txt | xargs -I {} ssh {} "uptime": batch view server uptime
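The IP-counting pipeline above can be tried on synthetic data (the access-log lines are made up):

```shell
# Three fake access-log lines; extract the client IP (field 1),
# count occurrences, and keep the busiest one:
printf '1.1.1.1 GET /\n2.2.2.2 GET /\n1.1.1.1 GET /x\n' \
  | awk '{print $1}' | sort | uniq -c | sort -rn | head -1
# top line shows a count of 2 for 1.1.1.1
```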

Best practices:

  • Use pipes to connect simple commands and build complex functionality
  • Reasonably use redirection to save output results
  • Use /dev/null to discard unwanted output
  • Use tee to view and save output simultaneously
  • Use xargs to process large numbers of files
  • Pay attention to pipe buffer limitations
  • Use 2>&1 or |& when handling error output
Tags: Linux