Original: |
http://www.javacodegeeks.com/2013/10/shell-scripting-best-practices.html#BP11
|
Translation: |
Aven |
Most programming languages have a set of best practices that anyone writing code in that language is expected to follow. However, I could not find a comprehensive one for shell scripts, so I decided to write down my own, based on my years of experience writing shell scripts.
A note on portability: since I mainly write scripts that run on systems with Bash 4.2 installed, I never need to worry about portability, but you might! The list below assumes Bash 4.2 (and other modern shells). If you are writing a portable script, some of the points may not apply. Needless to say, you should test thoroughly after making any such changes.
Here are my good programming habits for shell scripts (in no particular order):
1. Use functions
Unless you are writing a very small script, use functions to modularize your code and make it readable, reusable, and maintainable. All my scripts use the template below. As you can see, all code lives inside functions, and execution starts with the call to main at the bottom.
#!/bin/bash
set -e

usage() {
    :    # print usage here
}

my_function() {
    :    # function body here
}

main() {
    :    # script logic here
}

main "$@"
2. Write comments for your functions
Add sufficient documentation to your functions, specifying what they do and what parameters they expect.
Here is an example:
# Processes a file.
# $1 - the name of the input file
# $2 - the name of the output file
process_file() {
    :
}
3. Use shift to read function parameters
Instead of using $1, $2, etc. to read function arguments, use shift as shown below. This will make it easier to reorder or add parameters if you change your mind later.
# Processes a file.
# $1 - the name of the input file
# $2 - the name of the output file
process_file() {
    local -r input_file="$1";  shift
    local -r output_file="$1"; shift
}
4. Declare your variables
If your variable is an integer, declare it as such. Also, make your variables read-only unless you intend to modify their values later in the script. Use local for variables declared inside functions. All of this helps express your intent. If portability is a concern, use typeset instead of declare. Here are some examples:
declare -r -i port_number=8080
declare -r -a my_array=(apple orange)

my_function() {
    local -r name=apple
}
5. Double quote all variable expansions
To prevent word splitting and file globbing, you must double quote all variable expansions. This is especially important when working with filenames that contain whitespace (or other special characters). Consider this example:
# create a file containing a space in its name
touch "foo bar"

declare -r my_file="foo bar"

# try rm-ing the file without quoting the variable
rm $my_file
# it fails because rm sees two arguments: "foo" and "bar"
# rm: cannot remove 'foo': No such file or directory
# rm: cannot remove 'bar': No such file or directory

# need to quote the variable
rm "$my_file"

# file globbing example:
msg="my pattern is *.txt"
echo $msg    # this is not quoted so *.txt will undergo expansion
# will print "my pattern is foo.txt bar.txt"

# need to quote it for correct output
echo "$msg"
It is a good habit to double quote all your variable expansions. If you do need word splitting, consider using an array instead. See the next point.
6. Use arrays where appropriate
Do not store a collection of elements in a single string. Use an array instead. For example:
# using a string to hold a collection
declare -r hosts="host1 host2 host3"
for host in $hosts    # not quoting $hosts here, since we want word splitting
do
    echo "$host"
done

# use an array instead!
declare -r -a host_array=(host1 host2 host3)
for host in "${host_array[@]}"
do
    echo "$host"
done
7. Use "[email protected]" to refer to all variables
Do not use $*. Refer to my previous article difference between $*, [email protected], "$*" and "[email protected]". Here is an example:
main() {
    # print each argument
    for i in "$@"
    do
        echo "$i"
    done
}

# pass all arguments to main
main "$@"
8. Use uppercase only for environment variables
My personal preference is that all variables use lowercase unless they are environment variables. For example:
declare -i port_number=8080

# JAVA_HOME and CLASSPATH are environment variables
"$JAVA_HOME"/bin/java -cp "$CLASSPATH" app.Main "$port_number"
9. Prefer shell builtins over external programs
The shell has the ability to handle strings and simple arithmetic, so you don't have to call programs like cut and sed. Here are some examples:
declare -r my_file="/var/tmp/blah"

# instead of dirname, use:
declare -r file_dir="${my_file%/*}"

# instead of basename, use:
declare -r file_base="${my_file##*/}"

# instead of sed 's/blah/hello/', use:
declare -r new_file="${my_file/blah/hello}"

# instead of bc <<< "1 + 1", use:
echo $(( 1 + 1 ))

# instead of grepping a pattern in a string, use:
[[ $line =~ .*blah$ ]]

# instead of cut -d: , use an array:
IFS=: read -r -a arr <<< "one:two:three"
Note that external programs will perform better than the shell builtins when processing large files or large amounts of input.
10. Avoid unnecessary pipes
Pipes add overhead, so try to keep your pipelines to a minimum. The most common useless pipes involve cat and echo, as shown below:
1. Avoid unnecessary cat
If you are not familiar with the "useless use of cat" award, take a look here. The cat command should only be used for concatenating multiple files, not for sending the contents of a single file to another command.
# instead of
cat file | command

# use
command < file
2. Avoid unnecessary echo
Use echo only when you want to output text to stdout, stderr, a file, and so on. If you want to send text to another command, do not echo it through a pipe; use a here-string instead. Note that here-strings are not portable (although most modern shells support them), so use a heredoc if you are writing a portable script. (Refer to my previous article: Useless Use of Echo.)
# instead of
echo text | command

# use
command <<< text

# for portability, use a heredoc
command << END
text
END
3. Avoid unnecessary grep
Piping from grep to awk or sed is unnecessary. Since awk and sed can do their own pattern matching, there is no need to pipe grep's output into them. (Refer to my previous article: Useless Use of Grep.)
# instead of
grep pattern file | awk '{print $1}'

# use
awk '/pattern/ {print $1}' file

# instead of
grep pattern file | sed 's/foo/bar/g'

# use
sed -n '/pattern/{s/foo/bar/g;p}' file
4. Other unnecessary pipes
Here are some examples:
# instead of
command | sort | uniq

# use
command | sort -u

# instead of
command | grep pattern | wc -l

# use
command | grep -c pattern
11. Avoid parsing ls
The problem with ls is that it outputs filenames separated by newlines, so if a filename contains a newline character you will not be able to parse it correctly. It would be nice if ls could output NUL-delimited filenames, but unfortunately it cannot. Instead of ls, use file globbing or an alternative command that produces NUL-delimited output, such as find -print0.
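A rough sketch of the difference (the /var/tmp directory here is only for illustration; it is not from the original article):
# fragile: breaks on filenames containing spaces or newlines
for f in $(ls /var/tmp)
do
    echo "$f"
done

# robust: let the shell glob the names itself
for f in /var/tmp/*
do
    echo "$f"
done

# robust, and works with find's filters: NUL-delimited output
find /var/tmp -type f -print0 | while IFS= read -r -d '' f
do
    echo "$f"
done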
12. Use globbing
Globbing (or filename expansion) is the shell's way of producing a list of files that match a pattern. In bash, you can make globbing more powerful by enabling the extglob option, which turns on extended pattern matching operators. You can also enable nullglob so that a pattern with no matches expands to an empty list. Globbing can sometimes be used instead of find, and once again, do not parse ls! Here are some examples:
shopt -s nullglob
shopt -s extglob

# get all files with a .yyyymmdd.txt suffix
declare -a dated_files=( *.[0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9].txt )

# get all non-zip files
declare -a non_zip_files=( !(*.zip) )
13. Use NUL-delimited output where possible
To correctly handle filenames that contain whitespace or newline characters, you need NUL-delimited output, where each record is terminated by a NUL character (\0) instead of a newline. Most programs support this. For example, find -print0 outputs filenames terminated by a NUL character, and xargs -0 reads arguments separated by NUL characters.
# instead of
find . -type f -mtime +5 | xargs rm -f

# use
find . -type f -mtime +5 -print0 | xargs -0 rm -f

# looping over files
find . -type f -print0 | while IFS= read -r -d $'\0' filename
do
    echo "$filename"
done
14. Do not use backticks
Use $(command) instead of `command`, because it is easier to nest and easier to read. Here is a simple example:
# ugly escaping required when using nested backticks
a=`command1 \`command2\``

# $() is much cleaner
b=$(command1 $(command2))
15. Use process substitution instead of using temporary files
In most cases, if a command takes a file as input, that file can be replaced by the output of another command using process substitution: <(command). This saves you from writing to a temporary file, passing the temporary file to the command, and then deleting it. As shown below:
# using temp files
command1 > file1
command2 > file2
diff file1 file2
rm file1 file2

# using process substitution
diff <(command1) <(command2)
16. Use mktemp if you must create temporary files
Try to avoid creating temporary files. If you must, use mktemp to create a temporary directory and then write your files into it. Make sure the directory is removed when the script exits.
# set up a trap to delete the temp dir when the script exits
unset temp_dir
trap '[[ -d "$temp_dir" ]] && rm -rf "$temp_dir"' EXIT

# create the temp dir
declare -r temp_dir=$(mktemp -dt myapp.XXXXXX)

# write to the temp dir
command > "$temp_dir"/foo
17. Use (( and [[ for test conditions
Use [[ ... ]] rather than [ ... ] because it is safer and offers a richer feature set. For arithmetic conditions use (( ... )), as it lets you use familiar mathematical operators such as < and >, instead of -lt and -gt. Note that if portability is a concern, you must stick with the old-fashioned [ ... ]. Here are some examples:
[[$foo = "foo"]] && echo "Match" # don ' t need to quote variable inside [[[[$foo = = ' a ' && $bar = = ' a ' ]] && echo "Match" Declare-i num=5 (num <) && echo "Match" # don ' t need the $ on $num in ( (
18. Use commands in conditions instead of checking exit status
If you want to check whether a command succeeded before doing something else, use the command directly in the condition instead of checking the command's exit status ($?).
# don't use exit status
grep -q pattern file
if (( $? == 0 ))
then
    echo "pattern was found"
fi

# use the command as the condition
if grep -q pattern file
then
    echo "pattern was found"
fi
19. Use set -e
Put this at the top of your script. It tells the shell to exit the script as soon as any statement returns a non-zero exit code.
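A minimal sketch of the effect (the failing cp command is just an illustration):
#!/bin/bash
set -e

cp /nonexistent/file /tmp    # this command fails, so the script exits here
echo "this line is never reached"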
20. Output error messages to stderr
Error messages belong on stderr, not stdout.
echo "An error message" >&2