Use shell scripts

1. Generating URLs with a regular pattern.

While searching for an English article on the Internet today, I found this PDF: http://web.eecs.umich.edu/~silvio/teaching/EECS598/lectures/lecture10_3.pdf. I guessed there would be other PDFs at similar addresses, tried a few such as lecture10_1 and lecture10_2, and then realized I could generate all the similar addresses and use wget to download them in batch.

#!/bin/sh
# download links http://web.eecs.umich.edu/~silvio/teaching/EECS598/lectures/*
link_prefix="http://web.eecs.umich.edu/~silvio/teaching/EECS598/lectures/"
for lec_num in `seq 0 1 10`
do
    for pdf_num in `seq 0 1 5`
    do
        wget ${link_prefix}lecture${lec_num}_${pdf_num}.pdf
    done
done
exit 0
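If your shell is bash rather than plain sh, brace expansion can generate the same URL list without nested loops; this one-liner is just a sketch that assumes the same address pattern (wget simply reports an error for any combination that does not exist):

# bash only: brace expansion yields lecture0_0.pdf through lecture10_5.pdf
wget "http://web.eecs.umich.edu/~silvio/teaching/EECS598/lectures/lecture"{0..10}"_"{0..5}".pdf"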

2. Obtain the absolute path of the directory where the shell script is located.

Original article: http://www.zeali.net/entry/497

To obtain the absolute path of the executing program/script itself: in PHP, for example, you can use dirname(realpath(__FILE__)); C# has System.Windows.Forms.Application.StartupPath; Java does not seem to have a direct method and can only resort to CodeSource. In a Linux shell script there is no ready-made command to call for the absolute path of the current script file, but the following statement obtains it:

baseDirForScriptSelf=$(cd "$(dirname "$0")"; pwd)
echo "full path to currently executed script is : ${baseDirForScriptSelf}"

In most cases we do not need such an absolute path to get the job done. But if we want to package multiple scripts, data files, and other content together for delivery to others, and we want the package to find its own contents correctly no matter which directory the user copies it to, then using such a statement to automatically locate the root directory of the package should be robust.
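As a sketch of that pattern, a packaged script can address its companion files relative to its own location rather than the caller's current directory (data/config.txt is a made-up example file):

#!/bin/sh
# resolve the directory containing this script, then reference
# bundled files relative to it; data/config.txt is hypothetical
baseDirForScriptSelf=$(cd "$(dirname "$0")"; pwd)
cat "${baseDirForScriptSelf}/data/config.txt"

Because $0 carries the path used to invoke the script, this works regardless of the directory the user runs it from.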

3. Add a new environment variable

My old working directory did not have enough space, so I moved my work to another partition and wanted to set a new environment variable, work, so that cd $work switches to the new working directory.

$ vim ~/.bashrc

work=/run/media/huntinux/f/huntinux_work

Save, then exit and reopen the terminal (or run source ~/.bashrc, which avoids restarting the terminal):

$ echo $work

/run/media/huntinux/f/huntinux_work

$ cd $work

It works.
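One caveat worth noting: a plain assignment in ~/.bashrc creates an ordinary shell variable, which is enough for cd $work in an interactive shell; if programs started from the shell should also see it, export it instead:

# in ~/.bashrc: export makes $work an environment variable,
# visible to child processes started from this shell
export work=/run/media/huntinux/f/huntinux_work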

(The differences between ~/.bashrc, ~/.bash_profile, and the like are covered here.)

4. Delete unused variables in a C source file. If there are many unused variables, deleting them by hand is obviously too troublesome, so I wanted the shell to help.
for i in `gcc -Wall main.c -lm 2>&1 | sed "1,2d" | tr '‘’' '  ' | cut -d' ' -f8`
do
    sed -i '/'"$i"'/d' main.c
done

GCC's error output must be redirected to standard output (2>&1) before it can go through the pipeline. sed then deletes the first and second lines of the output, tr replaces the quote characters ‘ and ’ with spaces, and cut extracts the unused variable names from field 8. Finally, sed deletes the lines containing those variables and updates the file in place.
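Field 8 is fragile, since the warning layout varies across GCC versions. A more defensive sketch (assuming warnings of the form main.c:5:9: warning: unused variable ‘x’ [-Wunused-variable]) extracts the quoted identifier directly and anchors the sed pattern on word boundaries:

#!/bin/sh
# delete lines mentioning variables that GCC reports as unused;
# grep -o pulls out the quoted identifier, sed strips the quotes
for i in `gcc -Wall main.c -lm 2>&1 \
          | grep 'unused variable' \
          | grep -o '‘[^’]*’' \
          | sed 's/[‘’]//g'`
do
    sed -i "/\<$i\>/d" main.c
done

Deleting every line that contains the name is still crude, though: it also removes other statements that merely mention the same identifier.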

5. Test your own programs. For POJ 1328, for example, save the test data to a file such as testdata, then execute the following script so you do not have to type the test data by hand each time.
#!/bin/sh
#
# recompile and test the program
#
gcc -Wall main.c -lm && cat testdata | ./a.out
exit 0
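A small extension (expected.out is a hypothetical file holding the known-correct answers): diff the actual output against it so the test prints PASS or FAIL instead of relying on reading the output by eye:

#!/bin/sh
# recompile, run against the saved input, and compare with expected output
gcc -Wall main.c -lm || exit 1
./a.out < testdata > actual.out
if diff expected.out actual.out > /dev/null
then
    echo "PASS"
else
    echo "FAIL: output differs from expected.out"
fi
exit 0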

6. Download the pages of daomubiji.com and extract the novel content, using wget + sed + tr + cut + grep + basename + cat. The code is pasted below.

#!/bin/sh
# extract complete novels from daomubiji.com
# v1.1
#
mainurl="www.daomubiji.com"        # index.html's location
PWD=`pwd`
TMP="$PWD/tmp"                     # temp dir
pagesdir="$TMP/pages"              # each chapter page's local location
noveldir="$TMP/novel"              # each chapter
mainpg="$TMP/index.html"           # index.html's local location
urls="$TMP/urls"                   # contains all the pages' urls
suffix=".txt"                      # suffix of novel

# extract novel content from the downloaded webpage
# parameter 1 indicates the webpage
extract_page()
{
    echo "extract page $1"
    # obtain the chapter name
    title=`grep "

The new version supports interrupt handling, color output, and resumable downloading.

#!/bin/sh
# extract complete novels from daomubiji.com
# color output and interrupt handling are supported, plus resumable downloading
# huntinux
# 2013-8-20
# v1.2
#
mainurl="www.daomubiji.com"        # index.html's location
PWD=`pwd`
TMP="$PWD/tmp"                     # temp dir
pagesdir="$TMP/pages"              # each chapter page's local location
noveldir="$TMP/novel"              # each chapter
mainpg="$TMP/index.html"           # index.html's local location
urls="$TMP/urls"                   # contains all the pages' urls
urlsint="$TMP/urlsint"             # contains all the pages' urls
suffix=".txt"                      # suffix of novel
intfile="$TMP/continue"            # if a signal terminates the download,
                                   # delete the last page being downloaded,
                                   # complete or not, to guarantee that every
                                   # saved page is fully downloaded; FILENAME
                                   # is defined in down_extract(), and the
                                   # interrupt point is saved to $intfile
                                   # so the download can be resumed
trap 'red "interrupt occurred while downloading: $FILENAME"
      red "delete file: $pagesdir/$FILENAME"
      red "save interrupt point in $intfile"
      rm -f $pagesdir/$FILENAME
      echo "$FILENAME" > $intfile
      exit -1' INT

# color output
#
normal=$(tput sgr0)
green=$(tput setaf 2; tput bold)
yellow=$(tput setaf 3)
red=$(tput setaf 1)
function red() {
    echo -e "$red$*$normal"
}
function green() {
    echo -e "$green$*$normal"
}
function yellow() {
    echo -e "$yellow$*$normal"
}

# extract novel content from the downloaded webpage
# parameter 1 indicates the page to extract content from
extract_page()
{
    green "extract page $1"
    # obtain the chapter name
    # title=`grep "
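The script above is cut off before down_extract() appears, but the resume logic described in its comments is simple enough to sketch separately (the plain-text url list and the file layout are assumptions based on the variable names): skip every url up to the one recorded in $intfile, then download from that point onward:

#!/bin/sh
# hypothetical sketch of the resume loop: $intfile holds the basename of
# the page whose download was interrupted (trap deleted that file)
urls="tmp/urls"
intfile="tmp/continue"
pagesdir="tmp/pages"
skip=""
[ -f "$intfile" ] && skip=`cat "$intfile"`
while read -r url
do
    FILENAME=`basename "$url"`
    if [ -n "$skip" ]
    then
        if [ "$FILENAME" = "$skip" ]
        then
            skip=""        # reached the interrupt point: resume from here
        else
            continue       # an earlier page, already fully downloaded
        fi
    fi
    wget -q -P "$pagesdir" "$url"
done < "$urls"
rm -f "$intfile"           # finished cleanly, forget the interrupt point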
