Data stream redirection in Linux

Redirection operators (name, file descriptor, operator, usage)
- Standard input (stdin, fd 0): "<" uses a file's data as the input of a command; "<<" sets the string that marks the end of keyboard input (a here-document).
- Standard output (stdout, fd 1): ">" (i.e. "1>") writes the correct output to a file in overwrite mode; ">>" (i.e. "1>>") appends.
- Standard error (stderr, fd 2): "2>" writes the error output to a file in overwrite mode; "2>>" appends.
With redirection, a command's stdout and stderr can each be sent to a different file or device, and a file or device can serve as a command's stdin.

Execution sequence of multiple commands
$? holds the result of the last command; a value of 0 means it succeeded.
- cmd1; cmd2 — the two commands are unrelated; they simply run in order.
- cmd1 && cmd2 — if cmd1 finishes successfully ($? = 0), cmd2 runs; if cmd1 fails ($? != 0), cmd2 does not run.
- cmd1 || cmd2 — if cmd1 finishes successfully ($? = 0), cmd2 does not run; if cmd1 fails ($? != 0), cmd2 runs.
- cmd1 && cmd2 || cmd3 — if cmd1 succeeds ($? = 0), cmd2 runs; if cmd1 fails ($? != 0), cmd3 runs.

Pipeline commands (pipe)
In cmd1 | cmd2, the stdout of cmd1 becomes the stdin of cmd2. A pipe handles only stdout; stderr is ignored by the pipe. cmd2 must be able to accept the previous command's data on stdin and process it further; in other words, cmd2 must be a pipeline command. Common pipeline commands:
- cut — extract fields from each line
- grep — output each line that contains a matching part; regular expressions are supported
- sort — sort the lines of a file
- uniq — when adjacent lines repeat, output only the first one
- wc — count the lines, words, and characters/bytes in a file
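The redirection operators above can be tried directly in a shell; the file names here (out.txt, err.txt, here.txt) are made up for illustration:

```shell
# > (1>) overwrites stdout to a file; >> (1>>) appends
echo "first line"  > out.txt
echo "second line" >> out.txt

# 2> overwrites stderr to a file; 2>> appends
# (ls on a missing path writes its complaint to fd 2)
ls /no/such/path 2> err.txt
ls /no/such/path 2>> err.txt

# < uses a file's data as a command's stdin
wc -l < out.txt

# << sets the string that ends the input (here-document): input stops at EOF_MARK
cat << EOF_MARK > here.txt
line one
line two
EOF_MARK
```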
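A short sketch of $?, the ;/&&/|| sequencing rules, and a pipe built from the filters listed above (the demo directory name is hypothetical):

```shell
# $? holds the previous command's exit status; 0 means success
true
echo $?                                      # 0
false || echo "false returned nonzero: $?"

# cmd1; cmd2 — unrelated, run in order
echo one; echo two

# cmd1 && cmd2 — cmd2 runs only if cmd1 succeeded ($? = 0)
mkdir -p demo && echo "demo created"

# cmd1 || cmd2 — cmd2 runs only if cmd1 failed ($? != 0)
ls /no/such/path 2>/dev/null || echo "ls failed"

# cmd1 && cmd2 || cmd3 — cmd2 on success, cmd3 on failure
[ -d demo ] && echo "dir exists" || echo "no dir"

# A pipe passes cmd1's stdout to cmd2's stdin; stderr is not piped
printf 'b\na\nb\na\n' | sort | uniq | wc -l  # counts the two distinct lines
```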
- tee — read from stdin and write to both stdout and a file
- tr — replace or delete characters in a stream of text
- col — convert tabs to spaces, or convert a man page to plain text
- join — join lines from two files whose key fields are the same, concatenating them into one line
- paste — merge the corresponding lines of two files into one line, separated by a tab
- expand — convert tabs to spaces
- split — split a large file into several smaller files; they can be restored with cat pieces* > file
- xargs — read arguments from stdin, separated by spaces or newlines; many commands do not accept piped input, and xargs can be used to supply their arguments

The minus sign "-" stands for stdin or stdout. In the command below, the files under /home are packaged and the archive is written to stdout: the first "-" stands for stdout. After passing through the pipe, the second tar extracts the archive's data from stdin: the second "-" stands for stdin.

$ tar -cvf - /home | tar -xvf -
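The xargs and "-" ideas can be sketched together; the src and dest directory names here are made up, and the tar pair mirrors the /home example above on a small scale:

```shell
# Create some sample files to work with (hypothetical names)
mkdir -p src && touch src/a.txt src/b.txt

# rm, echo, and many other commands take arguments, not stdin;
# xargs turns each line of stdin into an argument
ls src | xargs -I{} echo "found {}"

# "-" is stdout for the writer and stdin for the reader:
# the first tar writes the archive to stdout, the second reads it from stdin
mkdir -p dest
tar -cf - src | (cd dest && tar -xf -)
ls dest/src    # the copied files appear here
```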