Common commands for Linux learning notes


The following notes collect and summarize information about some commonly used Linux commands.

Command index:

1.yum

2.wget

3.tar

1.yum command:

yum (full name: Yellowdog Updater, Modified) is a shell front-end package manager used in Fedora, Red Hat, and SUSE. Built on top of RPM package management, it can automatically download RPM packages from a specified repository and install them, automatically resolving dependencies and installing all dependent packages in one go, without the tedium of downloading and installing each one by hand. yum provides commands to find, install, and remove a single package, a group of packages, or even all packages, and the commands are concise and easy to remember.
Yum's command form is generally: yum [options] [command] [package ...]
The options are optional and include -h (help), -y (answer "yes" to all prompts during installation), -q (do not show the installation process), and so on. [command] is the action to perform, and [package ...] is the object of the operation.
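For example, combining these options (package1 here is only the placeholder package name used throughout these notes): yum -y -q install package1 installs package1 without prompting and without showing the installation process.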
Some commonly used commands are summarized below:
Automatically search for the fastest mirror (plugin): yum install yum-fastestmirror
Install the yum graphical front-end: yum install yumex
View the list of package groups available for bulk installation: yum grouplist
1. Installation
yum install    installs all packages
yum install package1    installs the specified package package1
yum groupinstall group1    installs the program group group1
2. Updates and upgrades
yum update    updates all packages
yum update package1    updates the specified package package1
yum check-update    checks for available updates
yum upgrade package1    upgrades the specified package package1
yum groupupdate group1    upgrades the program group group1
3. Finding and displaying
yum info package1    displays information about the package package1
yum list    lists all installed and installable packages
yum list package1    shows the installation status of the specified package package1
yum groupinfo group1    displays information about the program group group1
yum search string    finds packages matching the keyword string
4. Removing programs
yum remove | erase package1    removes the package package1
yum groupremove group1    removes the program group group1
yum deplist package1    lists the dependencies of the package package1
5. Clearing the cache
yum clean packages    clears packages from the cache directory
yum clean headers    clears headers from the cache directory
yum clean oldheaders    clears old headers from the cache directory
yum clean all (= yum clean packages; yum clean oldheaders)    clears cached packages and old headers
For example, to install a games program group, first look it up:
#: yum grouplist
You will find that the installable games group is named "Games and Entertainment", so it can be installed with:
#: yum groupinstall "Games and Entertainment"
All the game packages are then installed automatically. The name Games and Entertainment must be enclosed in double quotes, because Linux treats a space as the end of a name, so you must tell the system that the group name is "Games and Entertainment" rather than just "Games".
In addition, you can edit the configuration file /etc/yum.conf to choose the installation source. As you can see, configuring yum is very easy. For more detailed options and commands, just run man yum at the command prompt.
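For reference, the repositories yum draws from are defined in /etc/yum.conf or in .repo files under /etc/yum.repos.d/. A minimal repository entry generally looks like the following sketch; the name and baseurl are placeholders rather than a real mirror:
[base]
name=CentOS Base
baseurl=http://mirror.example.com/centos/$releasever/os/$basearch/
enabled=1
gpgcheck=1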

2.wget command:

wget is a command-line tool for downloading files on Linux. It is an essential tool for Linux users, and especially for network administrators, who often need to download software or restore backups from a remote server to a local one. With a shared virtual host you can only download a file from the remote server to your own computer and then upload it to the server with an FTP tool, which wastes time and effort. On a Linux VPS, the file can be downloaded directly to the server, skipping the upload step. wget is small but full-featured: it supports resuming interrupted downloads, FTP and HTTP downloads, and proxy servers, and it is easy to set up. Below we explain how to use wget through examples.

1. Use wget to download individual files

The following example downloads a file from the network and saves it in the current directory:

wget http://cn.wordpress.org/wordpress-3.1-zh_CN.zip

A progress bar is displayed during the download, showing the percentage complete, the number of bytes downloaded so far, the current download speed, and the estimated remaining time.

2. Use wget -O to save the download under a different file name

By default wget names the saved file after the last part of the URL (everything after the final "/"), which is usually wrong for dynamically generated links.
Wrong: the following example downloads a file and saves it under the name download.php?id=1080

wget http://www.centos.bz/download?id=1
Even though the downloaded file is actually a zip archive, it is still saved under the name download.php?id=1080.
Correct: to solve this problem, use the -O parameter to specify a file name:

wget -O wordpress.zip http://www.centos.bz/download.php?id=1080

3. Use wget --limit-rate to limit the download speed
When you run wget, it uses all available bandwidth by default. When you need to download a large file and still want to download other files at the same time, it is worth limiting the speed.

wget --limit-rate=300k http://cn.wordpress.org/wordpress-3.1-zh_CN.zip

4. Use wget -c to resume an interrupted download
To restart the download of an interrupted file, use wget -c:

wget -c http://cn.wordpress.org/wordpress-3.1-zh_CN.zip
This is very helpful when the download of a large file is interrupted by network problems or other causes: we can continue the download instead of starting over. Use the -c parameter whenever you need to resume an interrupted download.

5. Use wget -b to download in the background
For very large files, the -b parameter moves the download into the background.

wget -b http://cn.wordpress.org/wordpress-3.1-zh_CN.zip
Continuing in background, pid 1840.
Output will be written to 'wget-log'.
You can check the download progress with the following command:

tail -f wget-log

6. Download with a disguised user-agent name
Some websites reject your download request because the agent name does not look like a browser. You can disguise it with the --user-agent parameter.

wget --user-agent="Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.204 Safari/534.16" download-link

7. Use wget --spider to test a download link
If you plan to download at a scheduled time, you should test beforehand whether the download link is valid. Add the --spider parameter to check.

wget --spider URL
If the download link is correct, the output looks like this:

wget --spider URL
Spider mode enabled. Check if remote file exists.
HTTP request sent, awaiting response ... 200 OK
Length: unspecified [text/html]
Remote file exists and could contain further links,
but recursion is disabled -- not retrieving.
This ensures that the download will succeed at the scheduled time. If you give a wrong link, the following error is shown instead:

wget --spider URL
Spider mode enabled. Check if remote file exists.
HTTP request sent, awaiting response ... 404 Not Found
Remote file does not exist -- broken link!!!
You can use the --spider parameter in the following situations (a small sketch follows this list):

checking a link before a scheduled download
periodically checking whether a site is available
checking a site's pages for dead links
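As a rough sketch of the first use case (the URL and file name are placeholders, not taken from these notes), a scheduled job could verify the link with --spider and download only when the remote file actually exists:

# check the link first; resume the download only if the check succeeds
wget -q --spider http://example.com/backup.tar.gz && wget -c http://example.com/backup.tar.gz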

8. Use wget --tries to increase the number of retries
A download can still fail when the network has problems or the file is very large. By default wget retries a download 20 times. If necessary, use --tries to increase the number of retries.

wget --tries=40 URL

9. Use wget -i to download multiple files
First, save the download links in a file:

cat > filelist.txt
url1
url2
url3
url4
Then download using this file with the -i parameter:

wget -i filelist.txt

10. Use wget --mirror to mirror a website
The following example downloads an entire website to the local machine.

wget --mirror -p --convert-links -P ./local URL
--mirror: enable mirror (recursive) downloading
-p: download all files needed to display the HTML pages properly
--convert-links: after the download, convert the links so they point to the local copies
-P ./local: save all files and directories to the specified local directory

11. Use wget --reject to filter out a file format during download
If you want to download a website but do not want to download its images, you can use the following command.

wget --reject=gif URL

12. Use wget -o to save the download information to a log file
If you do not want the download information displayed in the terminal but written to a log file instead, you can use the following command:

wget -o download.log URL

13. Use wget -Q to limit the total size of downloaded files
When you want to stop downloading once more than 5 MB has been fetched, you can use the following command:

wget -Q5m -i filelist.txt
Note: this parameter has no effect on single-file downloads; it is only valid for recursive downloads.

14. Use wget -r -A to download files of a specified format
This feature is useful in situations such as:

downloading all the pictures on a website
downloading all the videos on a website
downloading all the PDF files on a website
wget -r -A.pdf URL

15. Use wget for FTP downloads
wget can also download files from FTP links.
Anonymous FTP download with wget:

wget ftp-url

FTP download with username and password authentication:

wget --ftp-user=username --ftp-password=password URL

wget is open-source software developed for Linux by Hrvoje Niksic and later ported to many platforms, including Windows. It has the following features:

(1) Support for resuming interrupted downloads. This used to be the biggest selling point of NetAnts and FlashGet back in the day; now that wget has it too, users with unreliable networks can relax.
(2) Support for both FTP and HTTP downloads. Although most software can now be downloaded over HTTP, in some cases FTP is still needed.
(3) Proxy server support. Systems with strict security requirements generally do not expose themselves directly to the Internet, so proxy support is an essential feature of a download tool.
(4) Easy setup. Users accustomed to a GUI may not be comfortable with the command line, but configuration on the command line actually has advantages: at the very least it saves many mouse clicks, and there is no worrying about clicking the wrong thing.
(5) Small and completely free. Small size hardly matters any more now that disks are so large, but being completely free is still worth considering; even though there is plenty of so-called free software, the advertising bundled with it is not what we want.

Although wget is powerful, it is relatively simple to use. The basic syntax is: wget [parameter list] URL. The following examples illustrate how to use wget.

1. Download an entire HTTP or FTP site.
wget http://place.your.url/here
This command downloads the home page of http://place.your.url/here. The -x option forces the creation of a directory hierarchy identical to the server's; with the -nd parameter, everything downloaded from the server is placed in the local current directory instead.

wget -r http://place.your.url/here
This command recursively downloads all directories and files on the server, essentially the entire site. Use it with caution: during the download, every address the downloaded site points to is fetched as well, so if the site references other sites, the referenced sites are downloaded too! For this reason the parameter is not used often. You can use the -l number parameter to limit the recursion depth; for example, to download only two levels, use -l 2.
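For instance, reusing the placeholder URL above, a recursive download limited to two levels would be:

wget -r -l 2 http://place.your.url/here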

If you want to create a mirror of a site, use the -m parameter, for example: wget -m http://place.your.url/here
wget will then automatically choose the appropriate parameters for mirroring. It will log on to the server, read robots.txt, and follow the rules in robots.txt.

2. Resuming interrupted downloads.
When a file is particularly large or the network is particularly slow, the connection is often cut off before the file has finished downloading; at that point the download needs to be resumed. Resuming with wget is automatic: just add the -c parameter, for example:
wget -c http://the.url.of/incomplete/file
Resuming requires the server to support it. The -t parameter sets the number of retries: to retry 100 times, write -t 100, and -t 0 means retry indefinitely until the connection succeeds. The -T parameter sets the timeout; for example, -T 120 means give up if no connection is made within 120 seconds.
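Putting these parameters together (with the same placeholder URL), a resilient resume of a large download might look like:

wget -c -t 100 -T 120 http://the.url.of/incomplete/file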

3. Batch downloads.
If there are several files to download, you can create a file with one URL per line, for example a file named download.txt, and then run: wget -i download.txt
This downloads every URL listed in download.txt. (If a line points to a file, that file is downloaded; if it points to a site, the site's home page is downloaded.)

4. Selective downloads.
You can tell wget to download only certain types of files, or to skip certain types. For example:
wget -m --reject=gif http://target.web.site/subdirectory
downloads http://target.web.site/subdirectory but ignores GIF files. --accept=LIST lists the file types to accept, and --reject=LIST lists the file types to reject.

5. Passwords and authentication.
wget can only handle sites that restrict access with a username and password; it provides two parameters for this:
--http-user=USER sets the HTTP user
--http-passwd=PASS sets the HTTP password
For sites that require certificate-based authentication, you can only use another download tool, such as curl.

6. Downloading through a proxy server.
If your network access goes through a proxy server, you can let wget download files through the proxy. To do so, create a .wgetrc file in the current user's home directory and set the proxy server in it:
http-proxy = 111.111.111.111:8080
ftp-proxy = 111.111.111.111:8080
These specify the proxy server for HTTP and for FTP, respectively. If the proxy server requires a password, use:
--proxy-user=USER sets the proxy user
--proxy-passwd=PASS sets the proxy password
these two parameters.
Use the parameter --proxy=on/off to turn use of the proxy on or off.
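Putting the pieces together, a download through an authenticating proxy might look like the following rough sketch; the proxy address mirrors the example above and the credentials are placeholders (recent wget versions spell the second option --proxy-password):

# in ~/.wgetrc
http-proxy = 111.111.111.111:8080
# on the command line
wget --proxy-user=user --proxy-passwd=pass http://place.your.url/here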
wget has many more useful features waiting for users to explore.

Appendix:

Command format:
wget [parameter list] [target software, Web page URL]

-V, --version    display the software version number and exit
-h, --help    display help information
-e, --execute=COMMAND    execute a ".wgetrc"-style command

-o, --output-file=FILE    save the program (log) output to FILE
-a, --append-output=FILE    append the program output to FILE
-d, --debug    print debug output
-q, --quiet    do not print output
-i, --input-file=FILE    read URLs from FILE

-t, --tries=NUMBER    number of download retries (0 = unlimited)
-O, --output-document=FILE    save the downloaded document under another file name
-nc, --no-clobber    do not overwrite files that already exist
-N, --timestamping    only download files newer than the local copy
-T, --timeout=SECONDS    set the timeout period
-Y, --proxy=on/off    turn the proxy on or off

-nd, --no-directories    do not create directories
-x, --force-directories    force creation of directories

--http-user=USER    set the HTTP user
--http-passwd=PASS    set the HTTP password
--proxy-user=USER    set the proxy user
--proxy-passwd=PASS    set the proxy password

-r, --recursive    download recursively (an entire site or directory; use with caution)
-l, --level=NUMBER    recursion depth

-A, --accept=LIST    file types to accept
-R, --reject=LIST    file types to reject
-D, --domains=LIST    domains to accept
--exclude-domains=LIST    domains to reject
-L, --relative    follow relative links only
--follow-ftp    follow FTP links from HTML documents
-H, --span-hosts    allow downloads from other hosts when recursing
-I, --include-directories=LIST    list of allowed directories
-X, --exclude-directories=LIST    list of excluded directories

Chinese file names are normally saved URL-encoded, but they come out correctly when --cut-dirs is used:
wget -r -np -nH --cut-dirs=3 ftp://host/test/
test.txt
wget -r -np -nH -nd ftp://host/test/
%B4%FA%B8%D5.txt
wget "ftp://host/test/*"
%B4%FA%B8%D5.txt

For reasons that are not entirely clear, probably to avoid problematic file names, wget automatically runs the captured part of the file name through encode_string, so the patch turns things like ":" into "%3A"; decode_string restores them back to ":" and is applied to both the directory part and the file-name part. decode_string is a built-in wget function.

wget -t0 -c -nH -x -np -b -m -P /home/sunny/nod32view/ http://downloads1.kaspersky-labs.com/bases/ -o wget.log

3.tar command:

  

Packing and unpacking
Syntax: tar [main options + auxiliary options] file or directory

When using this command, one main option is required; it tells tar what to do. The auxiliary options are optional and assist the operation.

Main options:

c    create a new archive. Choose this option if you want to back up a directory or some files. Equivalent to packing.

x    extract files from an archive. Equivalent to unpacking.

t    list the contents of an archive, to see which files have been backed up.

Note in particular that only one of c/x/t can be used at a time! They cannot appear together, because it is impossible to pack and unpack at the same time.

Auxiliary options:

-z: does the archive also have gzip properties? That is, should gzip be used to compress or decompress it? The usual file name is xx.tar.gz or xx.tgz.

-j: does the archive also have bzip2 properties? That is, should bzip2 be used to compress or decompress it? The usual file name is xx.tar.bz2.

-v: show the files being processed during packing or extraction! Commonly used.

-f: use an archive file name. Note that the file name must come immediately after f; do not put any other options after it!

-p: keep the original file attributes (attributes are not changed according to the current user).

--exclude FILE: do not include FILE when packing!

Example:

Example one: pack all the files in the /etc directory into /tmp/etc.tar

[root@linux ~]# tar -cvf /tmp/etc.tar /etc <== pack only, no compression!

[root@linux ~]# tar -zcvf /tmp/etc.tar.gz /etc <== packed and compressed with gzip

[root@linux ~]# tar -jcvf /tmp/etc.tar.bz2 /etc <== packed and compressed with bzip2

# Note that the file name after the f parameter is chosen by yourself; by convention we use .tar so the archive is easy to recognize.

# If the z parameter is added, .tar.gz or .tgz marks a gzip-compressed tar file.

# If the j parameter is added, use .tar.bz2 as the file name.

# When the commands above are executed, a warning message is displayed:

# "tar: Removing leading `/' from member names"; this is tar's special handling of absolute paths.

Example two: what files are inside the /tmp/etc.tar.gz archive created above?

[root@linux ~]# tar -ztvf /tmp/etc.tar.gz

# Because we compressed with gzip, the z parameter must also be added

# when listing the files inside the tar file. This is important!

Example three: extract /tmp/etc.tar.gz under /usr/local/src

[root@linux ~]# cd /usr/local/src

[root@linux src]# tar -zxvf /tmp/etc.tar.gz

# By default, we can extract the archive anywhere we like! In this example,

# I change my working directory to /usr/local/src and unpack /tmp/etc.tar.gz there,

# so the unpacked directory will be /usr/local/src/etc. Also, if you go into /usr/local/src/etc,

# you will find that the file attributes in that directory may differ from those under /etc/.

Example four: under /tmp, I only want to extract etc/passwd from /tmp/etc.tar.gz

[root@linux ~]# cd /tmp

[root@linux tmp]# tar -zxvf /tmp/etc.tar.gz etc/passwd

# You can list the file names inside the tarfile with tar -ztvf; if you only want a single file,

# you can extract it this way! Note that the leading / inside etc.tar.gz has been removed!

Example five: I want to back up /home and /etc, but not /home/dmtsai

[root@linux ~]# tar --exclude /home/dmtsai -zcvf myfile.tar.gz /home/* /etc

In addition: the -C parameter of the tar command

$ tar -cvf file2.tar /home/usr2/file2
tar: Removing leading `/' from member names
home/usr2/file2
This command packs the file /home/usr2/file2 into file2.tar in the current directory. Note that when the source file is given with an absolute path, tar stores it under that path (here home/usr2/; the leading '/' of the root directory is removed automatically). The following happens when the archive is extracted with tar:
$ tar -xvf file2.tar
$ ls
..................
The extracted file is not the file2 you might expect, but home/usr2/file2.

$ tar -cvf file2.tar -C /home/usr2 file2
The -C dir parameter in this command changes tar's working directory from the current directory to /home/usr2 and packs the file file2 (without an absolute path) into file2.tar. Note: the -C dir parameter stays in effect until the next -C dir parameter appears on the command line.
Using tar's -C dir parameter, it is also possible, from the current directory /home/usr1, to extract files into another directory, for example:
$ tar -xvf file2.tar -C /home/usr2
Without the -C dir parameter, tar cannot do this:
$ tar -xvf file2.tar /home/usr2
tar: /tmp/file: Not found in archive
tar: Error exit delayed from previous errors
