How to use multicore CPUs to speed up your Linux commands: awk, sed, bzip2, grep, wc, etc.


http://www.vaikan.com/use-multiple-cpu-cores-with-your-linux-commands/

Have you ever had to process a very large data set (hundreds of gigabytes)? Or search through it, or run some other operation on it that seemingly can't be sped up? Data specialists, I'm talking to you. You may have a CPU with four or more cores, but the usual tools, such as grep, bzip2, wc, awk, and sed, are single-threaded and can use only one CPU core.

To borrow the words of the cartoon character Cartman: "How can I use these cores?"

To make Linux commands use all the CPU cores, we need the GNU parallel command, which lets all of our CPU cores perform magical map-reduce operations on a single machine. It relies on the rarely used --pipe parameter (also called --spreadstdin), which spreads the standard input across the jobs. This way, the load is distributed evenly over your CPUs.
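As a sketch of the idea (run sequentially here for illustration; GNU parallel runs the per-block commands concurrently, and its -k flag keeps the output in input order; file names are made up):

```shell
# Emulate what parallel --pipe does, but sequentially: cut the input into
# blocks, feed each block to a fresh copy of the command, and concatenate
# the outputs in input order.
printf 'alpha\nbeta\ngamma\ndelta\n' > /tmp/in.txt
split -l 2 /tmp/in.txt /tmp/blk_        # two blocks of two lines each
for f in /tmp/blk_*; do
    tr 'a-z' 'A-Z' < "$f"               # the "map" step, one block at a time
done
# prints ALPHA, BETA, GAMMA, DELTA on separate lines
```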

bzip2

bzip2 compresses better than gzip, but it is very slow! Don't worry, we have a way to solve this problem.

Previous practice:

cat bigfile.bin | bzip2 --best > compressedfile.bz2

Now this:

cat bigfile.bin | parallel --pipe --recend '' -k bzip2 --best > compressedfile.bz2

Especially for bzip2, GNU parallel on a multicore CPU is dramatically faster. Before you know it, it's done.
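The reason this block-wise scheme produces a valid archive at all is that the bzip2 format allows concatenated streams: decompressing a file made of several independently compressed pieces yields the original data. A minimal sketch, assuming bzip2 is installed (file names are made up):

```shell
# Concatenated bzip2 streams form one valid .bz2 file, which is why each
# block can be compressed independently and the results simply joined.
printf 'hello ' | bzip2 --best >  /tmp/parts.bz2
printf 'world\n' | bzip2 --best >> /tmp/parts.bz2
bunzip2 -c /tmp/parts.bz2
# prints: hello world
```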

grep

If you have a very large text file, you might have done this before:

grep pattern bigfile.txt

Now you can do this:

cat bigfile.txt | parallel --pipe grep 'pattern'

or this:

cat bigfile.txt | parallel --block 10M --pipe grep 'pattern'

The second form uses the --block 10M parameter, which means each grep job receives a block of about 10 megabytes of input; you can use this parameter to tune how much data each CPU core processes at a time.
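Splitting the input into blocks does not change what grep finds, because parallel --pipe cuts on line boundaries by default, so every line lands in exactly one block. A sequential sketch of that invariant (file names hypothetical):

```shell
# Per-block grep, concatenated in order, matches whole-file grep.
seq 1 100 > /tmp/g.txt
split -l 40 /tmp/g.txt /tmp/g_          # blocks of 40 lines
for f in /tmp/g_*; do grep '9' "$f"; done > /tmp/chunked.txt
grep '9' /tmp/g.txt > /tmp/whole.txt
diff /tmp/chunked.txt /tmp/whole.txt && echo identical
# prints: identical
```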

awk

Here is an example of using the awk command to sum the numbers in a very large data file.

General usage:

cat rands20M.txt | awk '{s+=$1} END {print s}'

Now this:

cat rands20M.txt | parallel --pipe awk \'{s+=\$1} END {print s}\' | awk '{s+=$1} END {print s}'

This is a bit more involved: the --pipe parameter splits the output of cat into blocks and hands each block to its own awk invocation, and each of those produces a partial sum. These partial sums flow through the second pipe into a final awk command, which adds them up into the result. The quotes and the dollar sign of the first awk program are backslash-escaped so that the shell passes them through intact to the awk processes that GNU parallel launches.
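The two-stage pattern, partial sums followed by a reducing sum, can be checked without parallel at all. A sketch with a small made-up input:

```shell
# Sum 1..100 in two stages: per-block partial sums, then a final reduce.
seq 1 100 > /tmp/nums.txt
split -l 25 /tmp/nums.txt /tmp/part_    # four blocks of 25 numbers
for f in /tmp/part_*; do
    awk '{s+=$1} END {print s}' "$f"    # "map": one partial sum per block
done | awk '{s+=$1} END {print s}'      # "reduce": sum of the partials
# prints 5050
```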

wc

Want to count the lines of a file as fast as possible?

Traditional practices:

wc -l bigfile.txt

Now you should:

cat bigfile.txt | parallel --pipe wc -l | awk '{s+=$1} END {print s}'

Very clever: the parallel command "maps" the input into many wc -l calls, each producing a partial count, and finally pipes them to awk to be summed.
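The correctness of this trick rests on line counts being additive across blocks. A quick sequential check (file names hypothetical):

```shell
# Per-block line counts, summed, equal the whole-file line count.
seq 1 1000 > /tmp/lines.txt
split -l 300 /tmp/lines.txt /tmp/lc_    # blocks of 300+300+300+100 lines
for f in /tmp/lc_*; do wc -l < "$f"; done | awk '{s+=$1} END {print s}'
# prints 1000
```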

sed

Want to use the sed command to perform a huge number of substitutions in a huge file?

General Practice:

sed s^old^new^g bigfile.txt

Now you can:

cat bigfile.txt | parallel --pipe sed s^old^new^g

... You can then redirect the output into a file of your choice.
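A note on the s^old^new^g syntax: sed accepts any character as the delimiter after s, so ^ here plays the role of the usual /. A tiny sketch:

```shell
# ^ serves as the substitution delimiter, equivalent to s/old/new/g.
echo 'the old value stays old' | sed s^old^new^g
# prints: the new value stays new
```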
