20 habits of UNIX experts


The 20 good habits to adopt in UNIX are:

1) Create a directory tree in a single command.
2) Change the path; do not move the archive.
3) Combine commands with control operators.
4) Exercise caution when referencing variables.
5) Use escape sequences to manage long input.
6) Group commands in a list.
7) Use xargs outside of find.
8) Know when grep should do the counting, and when it should step aside.
9) Match certain fields in output, not just lines.
10) Stop using cat in pipelines.
11) Use file name completion.
12) Use history expansion.
13) Reuse previous arguments.
14) Manage directory navigation using pushd and popd.
15) Search for large files.
16) Do not use an editor to create temporary files.
17) Use the curl command-line utility.
18) Make the most of regular expressions.
19) Determine the current user.
20) Use awk to process data.

1. Create a directory tree in a single command

Listing 1 demonstrates one of the most common UNIX bad habits: defining each directory in a tree one at a time.

Listing 1. Bad habit 1 example: define each directory separately

~ $ mkdir tmp
~ $ cd tmp
~/tmp $ mkdir a
~/tmp $ cd a
~/tmp/a $ mkdir b
~/tmp/a $ cd b
~/tmp/a/b $ mkdir c
~/tmp/a/b $ cd c
~/tmp/a/b/c $

It is much easier to use the -p option of mkdir and create all parent directories and their subdirectories in a single command. Yet even administrators who know about this option still creep down the subdirectories level by level when building a tree on the command line. It is worth taking the time to make this good habit conscious.

Listing 2. Example of Good Habit 1: Use a command to define a directory tree

~ $ mkdir -p tmp/a/b/c

You can use this option to create entire complex directory trees (which are great to use inside scripts), not just simple hierarchies.

Listing 3. Another example of good habit 1: using a command to define a complex directory tree

~ $ mkdir -p project/{lib/ext,bin,src,doc/{html,info,pdf},demo/stat}

In the past, the only excuse for defining directories individually was that your mkdir implementation did not support this option, but that is no longer true on most systems. IBM AIX mkdir, GNU mkdir, and any other mkdir that complies with the Single UNIX Specification now provide it.

For the few systems that still lack this feature, you can use the mkdirhier script (see references), which is a wrapper for mkdir that performs the same function:

~ $ mkdirhier project/{lib/ext,bin,src,doc/{html,info,pdf},demo/stat}
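A runnable sketch of the idea (the /tmp/habit1 path is invented for the demonstration, and brace expansion assumes a shell such as bash or ksh):

```shell
# Build a small tree in one command; -p creates the missing parents
# and does not complain about directories that already exist.
mkdir -p /tmp/habit1/{src,doc/{html,pdf}}
# Because of -p, re-running the same command is harmless.
mkdir -p /tmp/habit1/{src,doc/{html,pdf}}
ls -d /tmp/habit1/src /tmp/habit1/doc/html /tmp/habit1/doc/pdf
```

That idempotence is what makes -p so handy in scripts: a setup step can run on every invocation without being guarded by an existence test.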

2. Change the path; do not move the archive

Another bad usage pattern is moving a .tar archive file to a particular directory because that happens to be the directory where you want to extract it. You never need to do that. You can unpack any .tar archive into any directory you like; that is what the -C option is for. When extracting an archive, use -C to specify the directory to unpack it in:

Listing 4. Good habit 2 example: use option -C to unpack a .tar archive

~ $ tar xvf newarc.tar.gz -C tmp/a/b/c

Making a habit of using -C is preferable to moving the archive to the place where you want to unpack it, or changing to that directory before unpacking, especially when the archive belongs somewhere else.

3. Combine commands with control operators

You may already know that in most shells you can combine commands on a single command line by placing a semicolon (;) between them. The semicolon is a shell control operator, and while it is useful for stringing together discrete commands on one line, it does not work for everything. For example, suppose you use a semicolon to combine two commands where the correct execution of the second depends entirely on the successful completion of the first. If the first command does not exit as you expected, the second still runs, and it fails. Instead, use the more appropriate control operators (some are described in this article). As long as your shell supports them, they are worth getting into the habit of using.

3.1 Run a command only if another command returns a zero exit status

The && control operator combines two commands so that the second runs only if the first returns a zero exit status. In other words, if the first command succeeds, the second command runs; if the first command fails, the second command does not run at all. For example:

Listing 5. Good habit 3 example: combining commands with control operators

~ $ cd tmp/a/b/c && tar xvf ~/archive.tar

In this example, the contents of the archive are extracted into the tmp/a/b/c directory, unless that directory does not exist. If the directory does not exist, the tar command does not run, so nothing is extracted.

3.2 Run a command only if another command returns a non-zero exit status

Similarly, the || control operator separates two commands and runs the second only if the first returns a non-zero exit status. In other words, if the first command succeeds, the second does not run; if the first command fails, the second does run. This operator is often used to test whether a given directory exists and, if not, to create it:

Listing 6. Another good habit 3 example: combining commands with control operators

~ $ cd tmp/a/b/c || mkdir -p tmp/a/b/c

You can also combine the control operators described in this section, where each affects whether the final command executes:

Listing 7. A third good habit 3 example: combining commands with control operators

~ $ cd tmp/a/b/c || mkdir -p tmp/a/b/c && tar xvf ~/archive.tar -C tmp/a/b/c
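The exit-status logic is easy to verify with a throwaway sketch (the /tmp/habit3 path is made up for the demonstration):

```shell
# && runs its right-hand side only when the left side exits with status 0.
mkdir -p /tmp/habit3/a && echo "created"
# || runs its right-hand side only when the left side fails.
cd /tmp/habit3/no/such/dir 2>/dev/null || echo "cd failed, fell back"
```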

4. Exercise caution when referencing variables

Always be careful with shell expansion and variable names. It is generally best to enclose variable references in double quotation marks, unless you have a good reason not to. Similarly, if you are directly following a variable name with a letter or number, be sure to enclose the variable name in curly braces ({}) to set it apart from the surrounding text. Otherwise, the shell interprets the trailing text as part of your variable name, and most likely returns a null value. Listing 8 provides examples of quoted and unquoted variable references and their effects.

Listing 8. Example of good habits 4: Referencing (and not referencing) variables

~ $ ls tmp/
a b
~ $ VAR="tmp/*"
~ $ echo $VAR
tmp/a tmp/b
~ $ echo "$VAR"
tmp/*
~ $ echo $VARa

~ $ echo "$VARa"

~ $ echo "${VAR}a"
tmp/*a
~ $ echo ${VAR}a
tmp/a
~ $

5. Use escape sequences to manage long input

You have probably seen code examples in which a backslash (\) continues a long line onto the next line, and you know that most shells treat what you type over multiple lines joined by backslashes as one single long line. However, you might not take advantage of this on the command line as often as you could. The backslash is especially handy when your terminal does not handle multi-line wrapping properly, or when your command line is narrower than usual (for example, when a long path sits in the prompt). The backslash is also useful for making sense of long input lines as you type them, as in the following example:

Listing 9. Example of Good Habit 5: using a backslash for long input

~ $ cd tmp/a/b/c || \
> mkdir -p tmp/a/b/c && \
> tar xvf ~/archive.tar -C tmp/a/b/c

Alternatively, you can use the following configuration:

Listing 10. Alternative example of good habit 5: using a backslash for long input

~ $ cd tmp/a/b/c \
> || \
> mkdir -p tmp/a/b/c \
> && \
> tar xvf ~/archive.tar -C tmp/a/b/c

However you divide an input line over multiple lines, the shell always treats it as one continuous line, because it strips out all the backslashes and newlines when it parses the input.

Note: in most shells, when you press the up arrow to recall the input, the entire multi-line entry is redrawn as a single long input line.

6. Group commands in a list

Most shells provide ways to group commands into a list so that you can pass their aggregate output down a pipeline, or redirect any or all of its streams to the same place. You can generally do this either by running the command list in a subshell or by running it in the current shell.

6.1 Run a command list in a subshell

Use parentheses to enclose a command list in a single group. Doing so runs the commands in a new subshell and lets you redirect or otherwise collect the output of the whole group, as in the following example:

Listing 11. Good habit 6 example: run a command list in a subshell

~ $ (cd tmp/a/b/c || mkdir -p tmp/a/b/c && \
> VAR=$PWD; cd ~; tar xvf archive.tar -C "$VAR") \
> | mailx -s "Archive contents" admin

In this example, the contents of the archive are extracted into the tmp/a/b/c directory, while the output of the grouped commands, including a list of the extracted files, is mailed to the address admin.

A subshell is preferable when you redefine environment variables in your command list and you do not want those definitions to apply to your current shell.

6.2 Run a command list in the current shell

Enclose a command list in braces ({}) to run it in the current shell. Make sure you include spaces between the braces and the actual commands; otherwise, the shell may not interpret the braces correctly. Also, make sure the final command in the list ends with a semicolon, as in the following example:

Listing 12. Another good habit 6 example: run a command list in the current shell

~ $ { cp ${VAR}a . && chown -R guest.guest a && \
> tar cvf newarchive.tar a; } | mailx -s "New archive" admin
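The practical difference between the two grouping styles shows up with cd, and a small sketch makes it visible:

```shell
start=$(pwd)
( cd / )                  # subshell: the directory change vanishes on exit
[ "$(pwd)" = "$start" ] && echo "parentheses: directory unchanged"
{ cd / ; }                # current shell: the change sticks
[ "$(pwd)" = "/" ] && echo "braces: now in /"
cd "$start"               # restore the original directory
```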

7. Use xargs outside of find

Use the xargs tool as a filter for making good use of output culled from the find command. A find run usually provides a list of files that match some criteria. This list is passed on to xargs, which then runs some other useful command with that list of files as arguments, as in the following example:

Listing 13. Typical xargs usage examples

~ $ find some-file-criteria some-file-path | \
> xargs some-great-command-that-needs-filename-arguments

However, do not think of xargs as just a helper for find; it is one of those underutilized tools that, once you get into the habit of using it, you will want to try in all sorts of experiments, including the following uses.

7.1 Passing a space-delimited list

In its simplest invocation, xargs is like a filter that takes as input a list, with each member on a separate line. The tool puts those members on a single, space-delimited line:

Listing 14. Output example generated by xargs

~ $ xargs
a
b
c
Control-D
a b c
~ $

You can send the output of any tool that prints file names through xargs to obtain a list of arguments for some other tool that takes file names as arguments, as in the following example:

Listing 15. Example of xargs

~/tmp $ ls -1 | xargs
December_Report.pdf README a archive.tar mkdirhier.sh
~/tmp $ ls -1 | xargs file
December_Report.pdf: PDF document, version 1.3
README: ASCII text
a: directory
archive.tar: POSIX tar archive
mkdirhier.sh: Bourne shell script text executable
~/tmp $

The xargs command is useful for more than passing file names; use it any time you need to filter text onto a single line:

Listing 16. Good habit 7 example: using xargs to filter text onto a single line

~/tmp $ ls -l | xargs
-rw-r--r-- 7 joe 12043 Jan 27 20:36 December_Report.pdf -rw-r--r-- 1 \
root 238 Dec 03 README drwxr-xr-x 38 joe 354082 Nov 02 \
a -rw-r--r-- 3 joe 5096 Dec 14 archive.tar -rwxr-xr-x 1 \
joe 3239 Sep 30 mkdirhier.sh
~/tmp $

7.2 Exercise caution when using xargs

Technically speaking, there is trouble lurking with xargs. By default, its end-of-file string is an underscore (_); if that character is sent as a single input argument, everything after it is ignored. To guard against this, use the -e flag, which, with no arguments, turns off the end-of-file string completely.
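A related caution: plain xargs splits its input on whitespace, so file names containing spaces break it. A common safeguard, available in GNU and BSD versions though not required by POSIX, is to pair find -print0 with xargs -0 so names are NUL-delimited (the /tmp/habit7 path below is invented):

```shell
mkdir -p /tmp/habit7
touch "/tmp/habit7/a file with spaces"
# NUL-delimited names pass through xargs intact, spaces and all.
find /tmp/habit7 -type f -print0 | xargs -0 ls -l
```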

8. Know when grep should do the counting, and when it should step aside

Avoid piping grep to wc -l to count the number of lines of output. The -c option to grep gives a count of the lines that match the specified pattern, and it is generally faster than piping to wc, as in the following example:

Listing 17. Good habit 8 example: counting lines with and without grep

~ $ time grep and tmp/a/longfile.txt | wc -l
2811

real    0m0.097s
user    0m0.006s
sys     0m0.032s
~ $ time grep -c and tmp/a/longfile.txt
2811

real    0m0.013s
user    0m0.006s
sys     0m0.005s
~ $

Besides the speed factor, -c is also a better way to get the count. With multiple files, grep with -c returns a separate count for each file, one per line, whereas piping to wc gives only a combined total for all the files.

However, regardless of speed, this example points out another common error to avoid. These counting methods only give the number of lines that contain the matching pattern, and if that is the result you are looking for, fine. But when lines can contain multiple instances of a pattern, these methods do not give a true count of the number of instances actually matched. To count instances, you do need wc after all. First, run grep with the -o option, if your version supports it. It outputs only the matched patterns, one per line, not the lines themselves. You cannot combine it with the -c option, so use wc -l to count the lines, as in the following example:

Listing 18. Good habit 8 example: using grep to count pattern instances

~ $ grep -o and tmp/a/longfile.txt | wc -l
3402
~ $

In this case, a call to wc is slightly faster than a second invocation of grep with a dummy pattern put in to match and count each instance (such as grep -c).
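The difference between the two counts is easy to reproduce with a file built on the spot (the file name and contents are invented):

```shell
printf 'and and\nand\nsand\n' > /tmp/habit8.txt
grep -c and /tmp/habit8.txt           # 3: lines containing the pattern
grep -o and /tmp/habit8.txt | wc -l   # 4: individual instances
```

The first line holds two instances, so the instance count exceeds the line count.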

9. Match certain fields in output, not just lines

A tool like awk is preferable to grep when you only want to match a pattern in a specific field of the output lines.

The following simplified example shows how to list only the files modified in December.

Listing 19. Bad habit 9 example: using grep to find patterns in specific fields

~/tmp $ ls -l /tmp/a/b/c | grep Dec
-rw-r--r-- 7 joe 12043 Jan 27 December_Report.pdf
-rw-r--r-- 1 root 238 Dec 03 08:19 README
-rw-r--r-- 3 joe 5096 Dec 14 archive.tar
~/tmp $

In this example, grep filters the lines and outputs every file with Dec either in its modification date or in its name. So a file such as December_Report.pdf matches even if it has not been modified since January, which is probably not the result you want. To match a pattern in a specific field, it is better to use awk, whose relational operators can match the exact field, as in the following example:

Listing 20. Good habit 9 example: using awk to find patterns in specific fields

~/tmp $ ls -l | awk '$6 == "Dec"'
-rw-r--r-- 3 joe 5096 Dec 14 archive.tar
-rw-r--r-- 1 root 238 Dec 03 08:19 README
~/tmp $
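You can reproduce the field-matching behavior without real files by feeding awk some ls -l style lines (the names and dates below are invented for the demonstration):

```shell
printf '%s\n' \
  '-rw-r--r-- 1 joe staff 12043 Jan 27 December_Report.pdf' \
  '-rw-r--r-- 1 joe staff 5096 Dec 14 archive.tar' |
  awk '$6 == "Dec"'    # prints only the line whose month field is Dec
```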

10. Stop using cat in pipelines

A common, basic error in using grep is piping the output of cat to it to search the contents of a single file. This is absolutely unnecessary and a waste of time, because tools such as grep take file names as arguments; you simply do not need cat in this situation, as in the following example:

Listing 21. Good and bad habit 10 example: using grep with and without cat

~ $ time cat tmp/a/longfile.txt | grep -c and
2811

real    0m0.015s
user    0m0.003s
sys     0m0.013s
~ $ time grep -c and tmp/a/longfile.txt
2811

real    0m0.010s
user    0m0.006s
sys     0m0.004s
~ $

This mistake shows up with many tools. Because most tools take standard input when given a hyphen (-) as an argument, even the argument for using cat to intersperse multiple files with stdin is often invalid. Concatenation before a pipe is only really necessary when you use cat with one of its several filtering options.

11. Use file name completion

Wouldn't it be great if you didn't have to type a long, confusing file name at the command prompt? Indeed, you don't have to. Instead, you can configure the most popular UNIX shells to use file name completion. The feature works a little differently in each shell, so I show you how to use it in the most popular ones. File name completion lets you type faster and avoid errors. Lazy? Perhaps. More efficient? Certainly!

Common acronyms

1) MB: megabyte

2) HTTP: Hypertext Transfer Protocol

3) HTTPS: HTTP over Secure Sockets Layer

4) FTP: File Transfer Protocol

5) FTPS: FTP over Secure Sockets Layer

6) LDAP: Lightweight Directory Access Protocol

Which shell is running?

What if you don't know which shell you are currently using? Although this trick isn't formally part of the other good habits, it is still useful. As shown in Listing 1, you can display your shell with the echo $0 or ps -p $$ command. For me, it was the Bash shell.

Listing 1. Determine your Shell

$ echo $0
-bash
$ ps -p $$
  PID TTY           TIME CMD
 6344 ttys000    0:00.02 -bash

C Shell

The C shell offers the most straightforward file name completion. Setting the filec variable enables the feature (you can run the set filec command). After you start typing a file name, you can press Esc, and the shell completes the name, or as much of it as it can. For example, assume you have files named file1, file2, and file3. If you type f and press Esc, file is filled in, and you must type the 1, 2, or 3 yourself to complete the name.

Bash

The Bash shell also provides file name completion, but it uses the Tab key instead of Esc. You don't need to set anything to enable it in Bash; it is set by default. Bash also implements an extra feature. After typing part of a file name, press Tab once; if several files satisfy your request and more text is needed to choose one of them, you can press Tab twice more to display a list of the files that match what you have typed so far. Using the same files named file1, file2, and file3, start by typing f. Press Tab once, and Bash completes file; press Tab again, and the list file1 file2 file3 expands.

Korn Shell

For Korn shell users, file name completion depends on the value of the EDITOR variable. If EDITOR is set to vi, type part of the name, and then press Esc followed by a backslash (\). If EDITOR is set to emacs, type part of the name, and then press Esc twice to complete it.

12. Use history expansion

What happens if you are using the same file name in a series of commands? Well, there is a shortcut that quickly fetches the last file name you used. As shown in Listing 2, the !$ designator returns the file name that the previous command used. First the file this-is-a-long-lunch-menu-file.txt is searched for occurrences of the word pickles. When the search is done, the same this-is-a-long-lunch-menu-file.txt file is edited with vi, without the file name being retyped. You use the exclamation point (!) to recall the previous command and the dollar sign ($) to select its last field. If you work with long file names repeatedly, this is a great tool.

Listing 2. Using !$ to fetch the file name used by the previous command

$ grep pickles this-is-a-long-lunch-menu-file.txt
pastrami on rye with pickles and onions
$ vi !$

13. Reuse previous arguments

The !$ designator returns the last file name argument used by the previous command. But what if that command used several file names and you only want to reuse one of them? The !:1 operator returns the first argument a command used. The example in Listing 3 shows how to combine this operator with !$. In the first command, a file is renamed to a more meaningful name, kxp12.c becomes file_system_access.c, but a symbolic link is then created so that the original file name remains usable elsewhere. Here !$ returns the file name file_system_access.c, and !:1 returns kxp12.c, the first file name of the previous command.

Listing 3. Combining !$ and !:1

$ mv kxp12.c file_system_access.c
$ ln -s !$ !:1

14. Manage directory navigation using pushd and popd

UNIX supports all kinds of directory-navigation tools, and pushd and popd are two of my favorites for improving efficiency. You certainly know that cd changes your current directory. But what if you want to navigate through several directories and then quickly return to a previous location? The pushd and popd commands maintain a virtual directory stack: pushd changes your current directory and stores it on the stack, and popd removes the directory on top of the stack and returns you to that location. You can use dirs to display the current directory stack without pushing or popping anything. Listing 4 shows how pushd and popd make quick work of navigating a directory tree.

Listing 4. Using pushd and popd to navigate the directory tree

$ pushd .
~ ~
$ pushd /etc
/etc ~ ~
$ pushd /var
/var /etc ~ ~
$ pushd /usr/local/bin
/usr/local/bin /var /etc ~ ~
$ dirs
/usr/local/bin /var /etc ~ ~
$ popd
/var /etc ~ ~
$ popd
/etc ~ ~
$ popd
~ ~
$ popd

The pushd and popd commands also take parameters for manipulating the directory stack. With the +n or -n parameter, where n is a number, you can rotate the stack left or right, as shown in Listing 5.

Listing 5. Rotating the directory stack

$ dirs
/usr/local/bin /var /etc ~ ~
$ pushd +1
/var /etc ~ ~ /usr/local/bin
$ pushd -1
~ /usr/local/bin /var /etc ~

15. Search for large files

Do you need to find out what is taking up all your free disk space? The following couple of tools help you manage your storage. As shown in Listing 6, the df command shows the total number of blocks in use on each available volume and the percentage of free space.

Listing 6. Determining volume usage

$ df
Filesystem    512-blocks      Used  Available  Capacity  Mounted on
/dev/disk0s2   311909984 267275264   44122720       86%  /
devfs                224       224          0      100%  /dev
fdesc                  2         2          0      100%  /dev
map -hosts             0         0          0      100%  /net
map auto_home          0         0          0      100%  /home

Looking for large files? Use the -size parameter with the find command. Listing 7 shows how to use find to locate files larger than 10 MB. Note that the k suffix makes the -size argument count in kilobytes.

Listing 7. Finding all files larger than 10 MB

$ find / -size +10000k -xdev -exec ls -lh {} \;
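A complementary sketch (the /tmp/habit15 paths and the 200 KB file are invented for the demonstration): du summarizes usage per directory, and sorting its output puts the biggest consumers first:

```shell
mkdir -p /tmp/habit15/big /tmp/habit15/small
dd if=/dev/zero of=/tmp/habit15/big/blob bs=1024 count=200 2>/dev/null
# -s summarizes each argument and -k counts kilobytes;
# sort -rn shows the largest entry first.
du -sk /tmp/habit15/* | sort -rn | head -1
```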

16. Do not use the editor to create temporary files

Here's a simple scenario: you want to quickly create a simple temporary file, without firing up your editor. Use the cat command with the > file-redirection operator. As shown in Listing 8, cat without a file name simply echoes whatever is typed on its standard input; the > redirection captures that input into the specified file. Note that you must provide the end-of-file character when you finish typing, usually Ctrl-D.

Listing 8. Quickly creating a temporary file

$ cat > my_temp_file.txt
This is my temp file text
^D
$ cat my_temp_file.txt
This is my temp file text

Need to do the same thing, but appending to an existing file rather than creating a new one? As shown in Listing 9, use the >> operator instead. The >> file-redirection operator appends content to an existing file.

Listing 9. Quickly appending content to a file

$ cat >> my_temp_file.txt
More text
^D
$ cat my_temp_file.txt
This is my temp file text
More text
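In a script, where there is no keyboard on which to type Ctrl-D, the same effect is usually achieved with a here-document; a sketch with an invented file name:

```shell
# <<'EOF' feeds the following lines, up to EOF, to cat's standard input.
cat > /tmp/habit16.txt <<'EOF'
This is my temp file text
EOF
# >> appends, exactly as at the interactive prompt.
cat >> /tmp/habit16.txt <<'EOF'
More text
EOF
cat /tmp/habit16.txt
```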

17. Use the curl command line utility

The curl command lets you retrieve data from a server over the HTTP, HTTPS, FTP, FTPS, Gopher, DICT, TELNET, LDAP, or FILE protocol. As shown in Listing 10, I can use curl to get the current weather conditions for my location (Buffalo, New York) from the United States National Weather Service. When combined with grep, it retrieves just the conditions for Buffalo. The -s command-line option silences curl's progress output.

Listing 10. Using curl to retrieve current weather conditions

$ curl -s http://www.srh.noaa.gov/data/ALY/RWRALY | grep BUFFALO
BUFFALO        MOSUNNY   43  22  43 NE13      30.10R

As shown in Listing 11, you can also use curl to download a file hosted over HTTP. Use the -o parameter to specify where to save the output.

Listing 11. Using curl to download an HTTP-hosted file

$ curl -o archive.tar http://www.somesite.com/archive.tar
$ ls archive.tar
archive.tar

This is really just a hint of what you can do with curl. Simply type man curl at the command prompt to display curl's complete usage information.

18. Make the most of regular expressions

A great number of UNIX commands take regular expressions as arguments. Technically speaking, a regular expression is a string of characters (that is, letters, numbers, and symbols) representing a pattern that is used to match strings of zero or more characters. Regular expressions use metacharacters (for example, the asterisk [*] and the question mark [?]) to match parts of, or entire, other strings. A regular expression doesn't have to contain wildcards, but wildcards make regular expressions far more useful for searching patterns and processing files. Table 1 shows some basic regular expression sequences.

Table 1. Regular expression sequences

Sequence       Description
^              Matches an expression at the beginning of a line, for example, ^A
$              Matches an expression at the end of a line, for example, A$
\              Cancels the special meaning of the next character, for example, \^
[]             Matches any one of the enclosed characters, for example, [aeiou] (use a hyphen [-] to indicate a range, for example, [0-9])
[^]            Matches any one character except those enclosed, for example, [^0-9]
. (period)     Matches any single character except end of line
* (asterisk)   Matches zero or more of the preceding character or expression
\{x,y\}        Matches x to y occurrences of the preceding item
\{x\}          Matches exactly x occurrences of the preceding item
\{x,\}         Matches x or more occurrences of the preceding item

Listing 12 shows some basic regular expressions used with the grep command.

Listing 12. Using Regular Expressions and grep

$ # Lists your mail
$ grep '^From:' /usr/mail/$USER
$ # Any line with at least one letter
$ grep '[a-zA-Z]' search-file.txt
$ # Anything not a letter or number
$ grep '[^a-zA-Z0-9]' search-file.txt
$ # Find phone numbers in the form 999-9999
$ grep '[0-9]\{3\}-[0-9]\{4\}' search-file.txt
$ # Find lines with exactly one character
$ grep '^.$' search-file.txt
$ # Find any line that starts with a period "."
$ grep '^\.' search-file.txt
$ # Find lines that start with a "." and 2 lowercase letters
$ grep '^\.[a-z][a-z]' search-file.txt
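These patterns are easy to check against a throwaway file (the file name and contents below are invented):

```shell
printf 'From: alice\nhello\n555-1234\n.x\n.ab ok\n' > /tmp/habit18.txt
grep '^From:' /tmp/habit18.txt                  # anchors at line start
grep '[0-9]\{3\}-[0-9]\{4\}' /tmp/habit18.txt   # the 999-9999 phone form
grep '^\.[a-z][a-z]' /tmp/habit18.txt           # a period plus two lowercase letters
```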

For an in-depth description of command-line regular expressions, read the developerWorks article "Speaking UNIX, Part 1: Regular Expressions."

19. determine the current user

Sometimes you want to find out whether a specific user is running your administrative script. To find out, you can use the whoami command to return the name of the current user. Listing 13 shows whoami running on its own; Listing 14 shows an excerpt from a Bash script that uses whoami to make sure the current user is not root.

Listing 13. Using whoami from the command line

$ whoami
john

Listing 14. Using whoami in a script

if [ $(whoami) = "root" ]
then
    echo "You cannot run this script as root."
    exit 1
fi
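On most systems the id utility reports the same information; id -un prints the effective user name, which makes the whoami result easy to cross-check:

```shell
# whoami and id -un should agree on the effective user name.
whoami
id -un
[ "$(whoami)" = "$(id -un)" ] && echo "consistent"
```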

20. Use awk to process data

The awk command always seems to live in the shadow of Perl, but it is a quick, practical tool for simple, command-line-based data processing. Listing 15 shows how to get started with it. To get the length of each line of a text file, use the length() function. To see whether the string ing appears in the file's text, use index(), which returns the position of the first occurrence of ing, so that you can use it for further string processing. To tokenize a string (that is, to split a line into word-length pieces), use the split() function.

Listing 15. Basic awk processing

$ cat text
testing the awk command
$ awk '{ i = length($0); print i }' text
23
$ awk '{ i = index($0, "ing"); print i }' text
5
$ awk 'BEGIN { i = 1 } { n = split($0, a, " "); while (i <= n) { print a[i]; i++; } }' text
testing
the
awk
command

Printing specified fields of a text file is a simple awk task. In Listing 16, the sales file contains each salesperson's name followed by monthly sales numbers. You can use awk to quickly total each person's sales. The -F, option tells awk to treat commas as the field separator (by default, it splits fields on whitespace); each field is then accessible through the $n operator.

Listing 16. Using awk to summarize data

$ cat sales
Gene,12,23,7
Dawn,10,25,15
Renee,15,13,18
David,8,21,17
$ awk -F, '{ print $1, $2+$3+$4 }' sales
Gene 42
Dawn 50
Renee 46
David 46
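The same computation runs anywhere with a small sample file (the data and file name are invented):

```shell
printf 'Gene,12,23,7\nDawn,10,25,15\n' > /tmp/habit20.csv
# -F, splits on commas; $2+$3+$4 totals the numeric fields per line.
awk -F, '{ print $1, $2+$3+$4 }' /tmp/habit20.csv
```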

Becoming a command-line expert takes some practice. It is easy to keep solving problems the same old way simply because you are used to it. Expanding your command-line repertoire can significantly improve your efficiency and set you on your way to becoming a UNIX command-line expert!

From:

http://www.ibm.com/developerworks/cn/aix/library/au-badunixhabits.html

https://www.ibm.com/developerworks/cn/aix/library/au-unixtips/

This article is from the CSDN blog: http://blog.csdn.net/tianlesoftware/archive/2011/01/15/6140900.aspx
