A discussion of the details of I/O, conditional, and loop processing in Linux shell programming


> and <: where is the difference?
When it comes to I/O redirection, let's first get to know the file descriptor (FD). A running program, in most cases, is processing data: where does it read the data from, and where does it send it? That is exactly what file descriptors are for.

In a shell program, the most commonly used FDs are these three:

    • 0: standard input (stdin)
    • 1: standard output (stdout)
    • 2: standard error output (stderr)

By default, these FDs are associated with the following devices:

    • stdin (0): keyboard
    • stdout (1): monitor
    • stderr (2): monitor

We can use the following command to test:

$ mail -s test root
This is a test mail.
Please skip.

^D (press Ctrl and the D key at the same time)
Obviously, the data that the mail program reads comes from stdin, that is, from the keyboard. However, a program's stdin does not necessarily read from the keyboard the way mail does, because the program author can have it read from a file passed as an argument, such as:

$ cat /etc/passwd

But what if there is no file argument after cat? Oh, please try it yourself... ^_^

$ cat

(Please note where the output data goes, and at the end don't forget to press ^D to leave...) As for stdout and stderr, um... well, let's go on and look at stderr.

In fact, stderr is not hard to understand: to put it bluntly, it is simply where the "error messages" get sent. For example, if the file argument you want to read does not exist, then we see on the monitor:

$ ls no.such.file
ls: no.such.file: No such file or directory

What if a command produces stdout and stderr at the same time? That's simple: both are just sent to the monitor:

$ touch my.file
$ ls my.file no.such.file
ls: no.such.file: No such file or directory
my.file

Okay, so far I believe you have no problem with the FDs, their names, and their associated devices? Good, then let's look at how to change these FDs' default data channels. We can use < to change the input channel (stdin) so it reads from a specified file, and we can use > to change the output channels (stdout, stderr) so they write to a specified file. For example:

$ cat < my.file

reads its data from my.file.

$ mail -s test root < /etc/passwd

reads it from /etc/passwd...

In this way, stdin will no longer be read from the keyboard, but from the file instead... Strictly speaking, the < symbol needs an FD specified in front of it (with no whitespace in between), but because 0 is the default for <, < and 0< are exactly the same! Okay, is this easy enough to understand?
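For instance, assuming the my.file from earlier, these two command lines are completely equivalent:

$ cat 0< my.file
$ cat < my.file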

So, what if you use two of them, <<? This is the so-called "here document", which lets us type in a block of text until the string specified after << is read. For example:

$ cat << FINISH
first line here
second line there
third line nowhere
FINISH

In this way, cat reads the 3 lines without having to read the data from the keyboard and wait for ^D to end the input.
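As a small aside, the mail example from earlier can be fed the same way, so the message body can live inside a script rather than being typed interactively (same subject and recipient as above, purely for illustration):

$ mail -s test root << FINISH
This is a test mail.
Please skip.
FINISH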

Okay, back to business~~~ Once you understand that 0< changes the stdin input channel, I believe the following two redirections are not hard to understand: 1> and 2>. The former changes the stdout output channel, the latter changes the stderr output channel. Both take data that would have been sent to the monitor and write it to the specified file instead.

Since 1 is the default for >, 1> is the same as >, and both change stdout. Let's use the last ls example to illustrate:

$ ls my.file no.such.file 1>file.out
ls: no.such.file: No such file or directory

So only stderr is left on the monitor, because stdout has been written into file.out.

$ ls my.file no.such.file 2>file.err
my.file

So only stdout is left on the monitor, because stderr has been written into file.err.

$ ls my.file no.such.file 1>file.out 2>file.err

So the monitor shows nothing, because both stdout and stderr went off to the files...

Oh, it seems understanding > is not difficult either! I didn't lie to you, did I? ^_^ However, there are still some places to pay attention to.
First, the problem of writing to the same file at the same time. Look at this example:

$ ls my.file no.such.file 1>file.both 2>file.both

If stdout (1) and stderr (2) are both redirected to file.both like this, they "overwrite" each other: whatever is written later covers what was written earlier. Suppose stdout and stderr write to file.both at the same time:

    • first, stdout writes 10 characters
    • then, stderr writes 6 characters

Then the first 6 of the 10 characters stdout wrote get covered by stderr's output. So, how do we solve it? As the saying goes: if the mountain does not turn, the road turns; if the road does not turn, the person turns. We can change our thinking: redirect stderr into stdout, or redirect stdout into stderr, instead of each of them opening the same file on its own. That will do it! Bingo! It goes like this:

    • 2>&1 sends stderr to wherever stdout is currently going
    • 1>&2 (or >&2) sends stdout to wherever stderr is currently going

As a result, the earlier problematic command line can be changed to:

$ ls my.file no.such.file 1>file.both 2>&1

Or

$ ls my.file no.such.file 2>file.both >&2

In the Linux file system, there is a device file located at /dev/null. A lot of people have asked me what that thing is. I tell you: that is "emptiness"! Empty emptiness, that is null... has the benefactor suddenly become enlightened? Then congratulations~~~ ^_^ This null is very useful in I/O redirection:

    • If you redirect FD1 and FD2 to /dev/null, stdout and stderr disappear without a trace.
    • If you redirect FD0 from /dev/null, stdin reads in nothing.
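The FD0 case is easy to try directly:

$ cat < /dev/null
$

cat reads nothing from /dev/null, so it prints nothing and exits immediately.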

For example, when we run a program, the screen shows both stdout and stderr.

If you don't want to see stderr (and don't want to save it to a file either), you can do this:

$ ls my.file no.such.file 2>/dev/null
my.file

Or, on the contrary: you only want to see stderr? That's easy! Just send stdout to null:

$ ls my.file no.such.file >/dev/null
ls: no.such.file: No such file or directory

Then, what if you simply want to run the program and don't want to see any output at all? Oh, here is the one trick I saved for last, a special gift for you! ^_^ Besides using >/dev/null 2>&1, you can also do this:

$ ls my.file no.such.file &>/dev/null

(Hint: changing &> to >& works too~~!)

Okay, enough chatter. Next, let's look at the following:

$ echo "1" > File.out
$ cat file.out
1
$ echo "2" > File.out
$ cat file.out
2

It seems that whenever we redirect stdout or stderr into a file, we only ever get the result of the last redirection. So, what about the earlier content? Ah~~~ solving that is very simple: just replace > with >>:

$ echo "3" >> file.out
$ cat file.out
2
3

This way, the existing content of the redirection target file is not lost, and the new content is appended at the end. Easy, eh? ^_^

However, if you then use a single > to redirect again, the old content will still be "washed away"! How do you avoid that? Backup! Yes, I hear you! But... is there anything better? Since you and I share this fate, let me pass you a little trick from the bag:

$ set -o noclobber
$ echo "4" > file.out
-bash: file.out: cannot overwrite existing file

So, how do we remove this "restriction"? Oh, just change set -o to set +o:

$ set +o noclobber
$ echo "5" > file.out
$ cat file.out
5

Ask again: is there a way to overwrite the target file "temporarily", without cancelling noclobber? Oh, the Buddha says: it cannot be told! Ah~ just kidding, just kidding~ ^_^ Alas, I knew in my heart you would ask:

$ set -o noclobber
$ echo "6" >| file.out
$ cat file.out
6

Did you notice? Just add a "|" after the > (note: there is no gap between > and |)... Phew... (take a deep breath and let it out)~~~ ^_^ Here is another problem for you to ponder:

$ echo "Some text here" > file
$ cat < file
Some text here
$ cat < file > File.bak
$ cat & Lt File.bak
Some text here
$ cat < file > file
$ cat < file

Eh?! Did you notice?!! How come the last cat command found the file empty?! Why? Why? Why? ~ ^_^
As mentioned above: after $ cat < file > file, the original contents of the file were washed away! Understanding this is not hard; it is just a question of priority:

In I/O redirection, the shell prepares all the data channels (including the truncation done by >) before the command actually starts reading from stdin. In other words, in the example above, > file empties the file before cat gets a chance to read it through < file; by then the file is already empty, so there is nothing left to read... Oh~~~ ^_^
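You can watch the truncation happen even though cat prints nothing; here wc -c is used simply to count the bytes in the file:

$ echo "Some text here" > file
$ wc -c file
15 file
$ cat < file > file
$ wc -c file
0 file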

That... What about the following two cases?

$ cat <> file
$ cat < file >> file

Well... students, these two are left as exercises; please hand in your homework before the next class! Okay, I/O redirection is also about to wrap up; sorry, that's all I know~ hehe ^_^ However, there is one thing that must still be said, ladies and gentlemen (please supply your own fanfare~!#@!$%): the pipe line!

When it comes to the pipe line, I believe many people are not unfamiliar with it: the "|" symbol we often see on the command line is the pipe line. But what exactly is a pipe line? Don't rush; first check the dictionary and see what pipe means. Yes, it means "water pipe". Well, can you imagine how water pipes are joined one after another? And what about the input and output between each pipe? Hmm?? Flash of insight: pipe line I/O works exactly like water-pipe I/O: the stdout of the previous command is connected to the stdin of the next command! No matter how many pipe lines you use on the command line, the I/O of every pair of adjacent commands is connected! (Congratulations: you finally got it! ^_^)

But... but... but... what about stderr? That's a good question, and it's also easy to understand: what if the plumbing leaks? In other words: in a pipe line, the stderr of the previous command is not connected to the stdin of the next command. Unless you redirect it with 2> to a file, its output still goes to the monitor! You must pay attention to this when using pipe lines. Then perhaps you will ask: is there a way to feed stderr into the next command's stdin too? (Greedy fellow!) Of course there is, and you have already learned it! ^_^ Just a hint: how would you merge stderr into stdout so they are output together? If you can't answer it, come ask me after class... (if you're really that thick-skinned...)
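In case the hint is too subtle, here is a minimal sketch using the same ls example (grep is only there to give the next command something to do):

$ ls my.file no.such.file 2>&1 | grep "No such"
ls: no.such.file: No such file or directory

The shell sets up the pipe first, so by the time 2>&1 is processed, stdout already points into the pipe, and stderr is sent along with it into grep's stdin.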

Perhaps you're still not satisfied! Perhaps you have run into the following problem:

In a pipe line such as cm1 | cm2 | cm3 ..., how would you save cm2's result to a file?

If you write cm1 | cm2 > file | cm3, then you will surely find that cm3's stdin is empty! (Of course: you diverted the hose to another pool!) The clever among you might solve it like this: cm1 | cm2 > file; cm3 < file. Yes, you can do that, but the worst part is that the file then gets written and read again, doubling the file I/O! File I/O is the most common performance killer in the whole execution of a command line, and any experienced shell user will try to avoid or reduce the frequency of file I/O. So, is there a better way? Yes, there is: the tee command.

The tee command copies stdout to a file without affecting the original I/O. So the command line above can be written like this:

cm1 | cm2 | tee file | cm3

By default, tee overwrites the target file; if you want to append the content instead, use the -a option.
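For example, combining it with the 2>&1 trick above to both watch a command's output and append it to a log (log.txt is just an illustrative name):

$ ls my.file no.such.file 2>&1 | tee -a log.txt
ls: no.such.file: No such file or directory
my.file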
Basically, the pipe line is used very widely in shell operations, especially for text filtering.
When word-processing tools such as cat, more, head, tail, wc, expand, tr, grep, sed, awk, ... are combined with the pipe line, your command lines really come alive! It often gives people that feeling of "having searched for him a thousand times in the crowd, I suddenly turn around, and there he is, where the lights are dim"... ^_^
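A small taste of such a pipe line, which counts the three most common login shells in /etc/passwd (just one combination among countless possibilities):

$ cut -d: -f7 /etc/passwd | sort | uniq -c | sort -rn | head -3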

Do you want an if or a case?
After a happy Spring Festival holiday, people become lazy and loose... Still, the homework promised to everyone has to be finished~~~

Do you remember the return value we introduced in chapter 10? Yes, what I'm about to introduce is related to it. If your memory was also cancelled out by the holiday fun, then I suggest you go back and review before continuing...

If you remember return value, I think you should also remember what && and || mean? With these two symbols plus command groups, we can make a shell script smarter. For example:

comd1 && {
  comd2
  comd3
  :
} || {
  comd4
  comd5
}

The meaning is: if comd1's return value is true, then comd2 and comd3 are executed; otherwise comd4 and comd5 are executed.
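A concrete sketch of the same pattern, here just testing whether an account exists in /etc/passwd (the account name is arbitrary):

grep -q "^root:" /etc/passwd && {
  echo "root account found"
} || {
  echo "root account missing"
}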

In fact, when we write shell scripts, we often need such conditional judgements to take different actions.
With && and || we can achieve the effect of conditional execution; however, read as "human language", it is not so intuitive.
More often, we still prefer keywords such as if ... then ... else ... to express conditional execution. In the bash shell, we can rewrite the previous code like this:

if comd1
then
  comd2
  comd3
else
  comd4
  comd5
fi

This is also the most commonly used if judgement in shell scripts: as long as the command line after if returns a true return value (we most often use the test command to produce that return value), the commands after then are executed; otherwise the commands after else are executed. fi is the keyword that closes the judgement.

In an if judgement, the else part may be omitted, but then is required. (If you do not want to run any command after then, use the null command : instead.) Of course, after then or else you can use a further level of conditional judgement; this is common in shell script design.
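For instance, a small sketch that uses test (in its [ ] form) to produce the return value, a null then branch, and a nested judgement (the file and directory names are made up):

if [ -f /etc/myapp.conf ]; then
  :   # the config already exists, nothing to do
else
  if [ -d /etc ]; then
    echo "creating a default config"
    touch /etc/myapp.conf
  else
    echo "no /etc directory?!" >&2
  fi
fi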

If several conditions need to be judged "in sequence", then we can use the elif keyword:

if comd1; then
  comd2
elif comd3; then
  comd4
else
  comd5
fi

The meaning is: if comd1 is true, then comd2 is executed; otherwise comd3 is tested, and if it is true, comd4 is executed; if neither comd1 nor comd3 holds, comd5 is executed.

Examples of if judgements are everywhere; you can see them in lots of shell scripts, so I won't give more examples here... The next thing to introduce is the case judgement.

Although if can already cope with most conditions, in some situations it is not flexible enough, especially when judging string patterns, such as:

QQ() {
  echo -n "Do you want to continue? (Yes/No): "
  read YN
  if [ "$YN" = Y -o "$YN" = y -o "$YN" = "Yes" -o "$YN" = "yes" -o "$YN" = "YES" ]
  then
    QQ
  else
    exit 0
  fi
}
QQ

From the example we can see that the most troublesome part is judging the value of YN, which may come in several styles. The clever among you might modify it like this:

if echo "$YN" | grep -q '^[Yy]\([Ee][Ss]\)*$'

That is, using a regular expression to simplify the code. (We'll have a chance to introduce REs some other time.) But... is there any other way to make it easier? Yes: just use a case judgement:

QQ() {
  echo -n "Do you want to continue? (Yes/No): "
  read YN
  case "$YN" in
    [Yy]|[Yy][Ee][Ss])
      QQ
      ;;
    *)
      exit 0
      ;;
  esac
}
QQ

We often use case judgements to handle a variable differently according to its different values (usually strings), for example judging a script's parameters in order to run different commands. If you are interested and you use Linux, dig around in the pile of scripts under /etc/init.d/*. Here is an excerpt as an example:

Case "in Start"
    start
    ;;
  Stop)
    stop
    ;;
  Status)
    rhstatus
    ;;
  restart|reload)
    restart
    ;;
  Condrestart)
    [-f/var/lock/subsys/syslog] && Restart | |:
    ;;
  *
    echo $ "Usage: $ {Start|stop|status|restart|condrestart}"
    exit 1
  Esac

(If your impression of positional parameters is blurry, please read chapter 9 again.) Okay, that's another of the thirteen questions dealt with, knocked out in just a few days... ^_^

for, while, and until: where is the difference?

The last thing to introduce is the "loop", which is common in shell script design. A so-called loop is a section of code in a script that is run repeatedly under certain conditions. The most commonly used loops in the bash shell are these three: for, while, until.

The for loop reads variable values one by one from a list and, "in order", runs the command lines between do and done once for each value. For example:

for var in one two three four five
do
  echo -----------
  echo '$var is '$var
  echo
done

The execution of the previous example works like this:

    • for defines a variable named var, whose value is, in turn, one two three four five.
    • Because there are 5 variable values, the command lines between do and done are run through 5 times.
    • Each pass uses echo to produce three lines of output. The second $var on the second line, the one not inside the hard quotes, is replaced in turn by one two three four five.
    • When the last variable value has been processed, the loop ends.
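Concretely, the first pass through the loop prints:

-----------
$var is one

(the third line is just the blank line produced by the bare echo); the next pass prints the same thing with two instead of one, and so on.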
It is not hard to see that in a for loop, the number of variable values determines how many times the loop runs. However, whether the variable is actually used inside the loop is not fixed; that depends on the design requirements. If the for loop does not use the in keyword to specify the list of variable values, the values are taken from $@ (or $*):

for var; do
  ...
done

(If you have forgotten positional parameters, please review chapter 9...)

The for loop is very handy for processing "lists" of items, and these lists, besides being specified explicitly or taken from positional parameters, can also be obtained from variable substitution or command substitution... (Once again: don't forget the "reorganisation" feature of the command line!)
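For example, a list obtained through command substitution (the cut command simply pulls the account names out of /etc/passwd):

for user in $(cut -d: -f1 /etc/passwd); do
  echo "found account: $user"
done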

However, for items that change "cumulatively" (such as integer increments or decrements), for can also handle them:

for ((i=1; i<=10; i++))
do
  echo "num is $i"
done

Besides the for loop, we can also use a while loop instead:

num=1
while [ "$num" -le 10 ]; do
  echo "num is $num"
  num=$(($num + 1))
done

The while loop is slightly different from the for loop: it does not process variable values from a list; instead it depends on the return value of the command line after while:

If it is true, the commands between do and done are executed, and then the command after while is tested again.
If it is false, the loop ends without executing the commands between do and done.
Analysis of the previous example:

    • Before the while, define the variable num=1.
    • Then test whether $num is less than or equal to 10.
    • The result is true, so echo is executed and the value of num is increased by one.
    • On the second round of the test, the value of num is 1+1=2, still less than or equal to 10, so it is true and the loop continues.
    • The test does not fail until num is 10+1=11... and then the loop ends.

It is not hard to see that if the while test always comes out true, the loop will run forever:

while :; do
  echo looping ...
done

In the example above, : is bash's null command: it does nothing except return a true return value. So this loop never ends; it is what we call an infinite loop, a "dead loop". A dead loop may be created deliberately (for example to run a daemon) or may be a design error. To end a dead loop, you can terminate it with a signal (for example by pressing Ctrl-C). (As for processes and signals, we'll add those later when there is a chance; the thirteen questions skip them for now.)

Once you can understand the while loop, you can understand the until loop:

In contrast to while, until enters the loop when the return value is false; otherwise the loop ends.
Therefore, the previous example can easily be rewritten with until:

num=1
until [ ! "$num" -le 10 ]; do
  echo "num is $num"
  num=$(($num + 1))
done

Or:

num=1
until [ "$num" -gt 10 ]; do
  echo "num is $num"
  num=$(($num + 1))
done

Okay, that temporarily wraps up the three commonly used loops in bash. Before closing this chapter, let me add two more loop-related commands: break and continue.

These two commands are most often used in compound loops, that is, when there are further loops nested between do ... done; of course, using them in a single-level loop is perfectly fine too... ^_^

break is used to interrupt a loop, that is, to "force the loop to end". If a value n is given after break, it breaks out of the nth loop counting "from the inside out"; the default is break 1, which breaks the current loop.
When using break, note that it is different from return and exit:

    • break ends a loop
    • return ends a function
    • exit ends a script/shell

continue is the opposite of break: it forces the next round of the loop. If that is hard to picture, you can simply think of it as skipping the statements between continue and done and jumping back to the top of the loop. Like break, continue can also take a value n to decide which level of loop (counting from the inside out) to continue; the default is continue 1, that is, continue the current loop.
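A minimal sketch showing both inside a compound loop (the values are arbitrary):

for i in 1 2 3; do
  for j in 1 2 3 4 5; do
    if [ "$j" -eq 2 ]; then
      continue      # skip the rest of this pass of the inner loop
    fi
    if [ "$j" -eq 4 ]; then
      break         # same as break 1: end only the inner loop
    fi
    echo "i=$i j=$j"
  done
done

Replacing break with break 2 would end both loops at once.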
