SSH command overview, usage, and usage tips

Source: Internet
Author: User

rm -rf mydir    /* delete the mydir directory and everything in it */
cd mydir        /* enter the mydir directory */
cd -            /* return to the previous working directory */
cd ~            /* return to your home directory */
mv tools tool   /* rename the tools directory to tool */
ln -s tool bac

/* Create a symbolic link named bac pointing to the tool directory. The most familiar example is FTP hosts where www is linked to the public_html directory */
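As a minimal sketch of that www → public_html pattern (all names here are throwaway examples, created in a temporary directory so nothing real is touched):

```shell
#!/bin/sh
# Reproduce the www -> public_html symbolic-link setup mentioned above.
cd "$(mktemp -d)"
mkdir public_html
echo "hello" > public_html/index.html
ln -s public_html www        # www is now a symbolic link to public_html
cat www/index.html           # reading through the link prints "hello"
```

Anything written under www/ actually lands in public_html/, which is exactly why FTP setups use this trick.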

cp -a tool /home/leavex/www    /* copy the tool directory, preserving attributes, into the www directory */
rm go.tar                      /* delete the go.tar file */
find . -name mt.cgi            /* find the file named mt.cgi */
df -h                          /* check the remaining disk space */
tar xvf wordpress.tar          /* extract a tar archive */
tar tvf myfile.tar             /* list the files contained in a tar archive */
gzip -d ge.tar.gz              /* decompress a .tar.gz file into a .tar file */
unzip phpbb.zip                /* decompress a ZIP file; creating .tar.gz archives on Windows is a bit troublesome */
tar cf toole.tar tool          /* pack the tool directory into toole.tar */
tar cfz geek.tar.gz tool
/* pack and gzip-compress the tool directory; the resulting file is about 10 MB, which saves upload time */
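The pack/extract pair above can be round-tripped end to end; this is a sketch using a throwaway tool directory in a temp dir (file contents and names are illustrative):

```shell
#!/bin/sh
# Round-trip the tar cfz / tar xfz commands from the list above.
cd "$(mktemp -d)"
mkdir tool
echo "data" > tool/file.txt
tar cfz geek.tar.gz tool      # pack and gzip-compress the directory
rm -rf tool                   # remove the original
tar xfz geek.tar.gz           # extract it again from the archive
cat tool/file.txt             # prints "data"
```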

Related Articles: Wget command parameters and usage

wget http://www.sevenapart.com/download/wp.tar.gz
/* download a file from a remote server directly to your own server, saving upload time; with the server's bandwidth, a 2-3 MB archive takes only seconds */
wget -c http://www.eightapart.com/undone.zip
/* resume downloading a previously interrupted file */
There are also some useful VIM commands. Let's list them as well:

Movement:
h/j/k/l: move one character left/down/up/right
w: move forward one word (prefix a number to move that many words)
b: move backward one word (prefix a number to move that many words)
e: move forward to the end of the word
ge: move backward to the end of the previous word
$: end of the line
0: beginning of the line
tx: move right within the line to just before the character x (uppercase T moves left)
33G: move to line 33 of the file
gg: first line of the file
G: last line of the file
33%: move to the point 33% of the way through the file
H/M/L: top, middle, and bottom of the screen
zt/zz/zb: scroll the current line to the top/center/bottom of the screen

Jumps:
'': return to the position before the last jump
CTRL-O: jump to an older position in the jump list
CTRL-I: jump to a newer position in the jump list

Search:
/: search forward (followed by a keyword)
?: search backward (followed by a keyword)
n: jump to the next match

Editing:
i: switch to insert mode
x: delete the character under the cursor
.: repeat the last change (like pressing Ctrl+F in Photoshop to re-apply the last filter)
u: undo
CTRL-R: redo
p: put (paste) the last deleted or yanked text at the cursor

Exit and save:
:q  quit
:q!  quit without saving
ZZ  save and quit
:e!  discard changes and re-edit the file

Log out of SSH and keep a job running:
# nohup wget http://www.phpv.net/file.tar.gz &
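A minimal sketch of the nohup pattern, with a short echo standing in for the long wget download so it runs without network access:

```shell
#!/bin/sh
# nohup detaches the job from the terminal's HANGUP signal, so it keeps
# running after you log out; output is redirected to a log file.
cd "$(mktemp -d)"
nohup sh -c 'echo download-finished' > wget.log 2>&1 &
wait $!                       # here we wait; in practice you would just log out
cat wget.log                  # the log now contains "download-finished"
```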

Wget is a Linux tool for retrieving files from the World Wide Web. It is free software under the GPL license, written by Hrvoje Niksic. Wget supports the HTTP and FTP protocols, proxy servers, and resumable downloads. It can recursively walk a remote host's directories, locate the files that match your criteria, and download them to the local disk; if necessary, it rewrites the hyperlinks in the downloaded pages so that the resulting local mirror can be browsed directly. Because it has no interactive interface, wget can run in the background, intercepting and ignoring the HANGUP signal, so a download can keep running after the user logs out. Wget is typically used to batch-download files from websites or to create mirrors of remote sites.

Syntax:

wget [options] [URL-list]
URL format: URLs of the following forms are accepted:
http://host[:port]/path
For example:
http://fly.cc.fer.hr/
ftp://ftp.xemacs.org/pub/xemacs/xemacs-19.14.tar.gz
ftp://username:password@host/dir/file
In the last form, the user name and password for the FTP host are provided as part of the URL (they can also be supplied via parameters, as shown below).

Parameter description:

Wget has many parameters, but most uses need only the following common ones:
-r recursive: for an HTTP host, wget first downloads the file specified by the URL, then (if it is an HTML file) recursively downloads every file it references via hyperlinks, to the depth given by the -l parameter. For an FTP host, this parameter downloads all files in the directory specified by the URL, recursing in a similar way.

-N timestamping: download only updated files; a local file whose length and last-modification date match the remote one is not downloaded again.

-m mirror: equivalent to using both -r and -N.

-l sets the recursion depth; the default is 5. -l1 is equivalent to no recursion, and -l0 means infinite recursion. Note that as the depth increases, the number of files can grow exponentially.

-t sets the number of retries. When a connection is interrupted (or times out), wget tries to reconnect; -t0 sets the retry count to infinity.

-c resumes an interrupted download. Wget actually resumes by default; this parameter is only needed when another FTP tool has downloaded part of a file and you want wget to finish the job.

Example:
wget -m -l4 -t0 http://www.xker.com/
Creates a mirror of http://www.xker.com/ on the local disk, saving the mirrored files in a subdirectory named www.xker.com under the current directory (with the -nH parameter you can skip creating this subdirectory and build the mirror's directory structure directly in the current directory). The recursion depth is 4 and the number of retries is infinite (if a connection fails, wget retries persistently until the task completes!).

Some other, less frequently used parameters are as follows:
-A acclist / -R rejlist:
These two parameters specify the file extensions wget accepts or rejects; separate multiple extensions with commas. For example, to skip MPEG video files and .au audio files, use:
-R mpg,mpeg,au

Other parameters include:
-L follow relative links only. This parameter is useful when mirroring a specific site, since it avoids spreading to other directories on the host. For example, for a personal site at http://www.xys.org/~ppfl/, the command line:
wget -L http://www.xys.org/~ppfl/
retrieves only the personal site without touching other directories on www.xys.org.

-k convert links: when saving HTML files, convert non-relative links to relative ones.

-X exclude the specified directories when downloading from an FTP host.

In addition, the following parameters control wget's output:
-v makes wget print detailed progress information.
-q makes wget print no output at all.

If the links to the files we want are stored in an HTML document (or a plain text file), we can have wget read them from that file directly instead of supplying URLs on the command line. The parameter format is:
-i filename
The address file can be an HTML file or a plain text file containing a list of URLs to download. We can use the following technique to increase download speed: since Linux is a multitasking system, we can run multiple wget processes at once. For example, download the main page file (index.html) first, then download each address listed in it with an independent wget process.
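That multi-process technique can be sketched as a small script. Here `download` is a stand-in for `wget -q` so the sketch runs without network access, and urls.txt is an assumed file name:

```shell
#!/bin/sh
# Launch one background process per URL, then wait for all of them,
# mirroring the "multiple wget processes" trick described above.
cd "$(mktemp -d)"
printf '%s\n' http://example.com/a.zip http://example.com/b.zip > urls.txt
download() {
    # in real use, replace the echo with: wget -q "$1"
    echo "fetched $1" >> fetched.log
}
while read -r url; do
    download "$url" &         # one background job per URL
done < urls.txt
wait                          # block until every job has finished
```

The `&` puts each download in the background and `wait` blocks until all of them complete, so the total time is roughly that of the slowest download rather than the sum of all of them.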


