Common commands for SSH in CentOS

Source: Internet
Author: User
Tags: ftp, protocol

Directory operations:

Shift + Insert /* paste copied text into the terminal */


rm -rf mydir /* delete the mydir directory */

cd mydir /* enter the mydir directory */

cd - /* go back to the previous directory */

cd .. /* go up to the parent directory (note the space in the middle) */

cd ~ /* return to your home directory */

mv tools tool /* rename the tools directory to tool */

ln -s tool bac /* create a symbolic link named bac pointing to the tool directory; the most familiar example is www linked to the public_html directory on FTP hosts */

cp -a tool /home/vpser/www /* copy all files in the tool directory to the www directory */
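The directory commands above can be exercised end to end in a scratch directory, so nothing outside it is touched. The names (tool, bac, the www path) follow the examples in the text:

```shell
set -e
workdir=$(mktemp -d)
cd "$workdir"

mkdir tool                   # create a working directory
mkdir -p home/vpser/www      # destination used by the cp example
echo "hello" > tool/a.txt

ln -s tool bac               # symbolic link: bac -> tool
cat bac/a.txt                # prints "hello" -- the link acts like the directory

cp -a tool home/vpser/www    # copy the directory, preserving attributes
mv tool tools                # rename (the text renames tools -> tool; reversed here)
```

Note that after the `mv`, the symlink `bac` still exists but now dangles, since its target was renamed.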


File operations:

rm go.tar /* delete the go.tar file */

find . -name mt.cgi /* find the file named mt.cgi */

df -h /* check the remaining disk space */
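A quick sketch of find and df, using a throwaway file. Note that the portable form of find is `find <dir> -name <pattern>`, not just `find <name>`:

```shell
set -e
workdir=$(mktemp -d); cd "$workdir"
mkdir -p cgi-bin
touch cgi-bin/mt.cgi

find . -name 'mt.cgi'        # prints ./cgi-bin/mt.cgi
find . -name '*.cgi' -type f # same search by wildcard, files only

df -h .                      # remaining space on the filesystem holding "."
```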


Decompression:

tar xvf wordpress.tar /* extract a tar archive */

tar tvf myfile.tar /* list the files contained in a tar archive */

tar cf toole.tar tool /* pack the tool directory into a toole.tar archive */

tar cfz vpser.tar.gz tool /* pack and gzip-compress in one step */

tar jcvf /var/bak/www.tar.bz2 /var/www /* create a .tar.bz2 archive; higher compression ratio */

tar xjf www.tar.bz2 /* extract a .tar.bz2 archive */

gzip -d ge.tar.gz /* decompress a .tar.gz file into a .tar file */

unzip phpbb.zip /* extract a ZIP file; handy since creating a .tar.gz on Windows is a bit troublesome */
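The archive commands above fit together as a round trip: create, list, compress, decompress, extract. The file names here are illustrative:

```shell
set -e
workdir=$(mktemp -d); cd "$workdir"
mkdir tool; echo data > tool/file.txt

tar cf toole.tar tool          # pack the tool directory into a plain tar
tar tvf toole.tar              # list the archive contents
tar cfz vpser.tar.gz tool      # pack and gzip in one step

gzip -d vpser.tar.gz           # .tar.gz -> .tar (leaves vpser.tar)

mkdir extract && cd extract
tar xvf ../toole.tar           # extract the plain tar here
```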


Download:

wget http://soft.vpser.net/web/nginx/nginx-0.8.0.tar.gz

/* download a file from a remote server directly to your server, saving the cost of uploading it yourself; even if your own connection is slow, the server's bandwidth means a 2-3 MB file takes only seconds */

wget -c http://soft.vpser.net/web/nginx/nginx-0.8.0.tar.gz

/* resume a previously interrupted download */

Reprints should indicate the source: VPS detective http://www.vpser.net


Process management:

ps aux /* ps: process status query command */

Description of the ps output fields:

[List]

[*] USER: user name of the process owner.

[*] PID: process ID, which uniquely identifies a process.

[*] %CPU: percentage of CPU time used by the process since the last refresh.

[*] %MEM: percentage of memory used by the process.

[*] VSZ: virtual memory used by the process, in KB.

[*] RSS: physical memory occupied by the process, in KB.

[*] TTY: terminal associated with the process.

[*] STAT: process state, one of: R - running or runnable; S - sleeping; I - idle; Z - zombie; D - uninterruptible sleep; W - process has no resident pages; T - stopped or traced.

[*] START: time the process started.

[*] TIME: total CPU time used by the process.

[*] COMMAND: command line being executed.

[/List]

ps aux | grep nginx /* search for nginx processes among all processes */

kill 1234 /* 1234 is the process ID, i.e. the PID column in ps aux */

killall nginx /* killall kills processes by program name; nginx is the process name */
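The find-then-kill workflow above can be demonstrated safely with a throwaway background sleep standing in for a real daemon such as nginx:

```shell
set -e
sleep 300 &                    # stand-in for a long-running daemon
pid=$!

ps aux | grep "[s]leep 300"    # the [s] trick keeps grep itself out of the list

kill "$pid"                    # same as "kill <PID from ps aux>"
wait "$pid" 2>/dev/null || true
```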


Vim operations:

Movement:

h/j/k/l: move one character left/down/up/right

w: move forward one word (prefix with a number to move that many words)

b: move backward one word (prefix with a number to move that many words)

e: move forward to the end of the word

ge: move backward to the end of the previous word

$ <End>: end of the line

0 <Home>: beginning of the line

tx: search right within the line for x and move there (uppercase T searches left)

33G: move to line 33 of the file

gg: first line of the file

G: last line of the file

33%: 33% of the way through the file

H/M/L: top, middle, and bottom of the screen

zt/zz/zb: move the current line to the top/center/bottom of the screen

Jumps:

'': return to the position before the last jump

CTRL-O: jump to an older position

CTRL-I <Tab>: jump to a newer position

Search:

/: search forward (followed by a keyword)

?: search backward (followed by a keyword)

n: next match

Editing:

i: switch to insert mode

x: delete the current character

.: repeat the last change (much like Ctrl+F in Photoshop re-applies the last filter)

u: undo

CTRL-R: redo

p: put (paste) the deleted text at the current position

Exit and save:

:q: quit

:q!: quit without saving

ZZ: save and quit

:e!: abandon changes and re-edit


Contents:

1. BasicInstructions: basic commands

2. wget: download tool

3. crontab: scheduled tasks

4. tar/tar.gz: compressed files

5. Viewing file sizes

cd [directory name]: change to the given directory

cd ..: go back to the parent directory

ls: list all files in the current directory

rm [-r] -f [file name]: delete a file; add -r to also delete everything inside a directory, e.g. rm -rf abc deletes the abc folder and all files in it

tar: extract a downloaded package

unzip [file name]: extract a zip file

cp -rpf A/* B: copy all the files in folder A to folder B

wget: one of the handiest commands in Linux; quickly downloads files from the network


1. BasicInstructions: basic operation commands

Generally, you can run "$ [instruction] --help" to get help for a command, including its parameters.

- ls: list the contents of the current folder

$ ls -o: list the contents of the current folder with details, but without group information

$ ls -l: same as above, including group information

$ ls -a: list all contents of the current folder, including files starting with "."

$ ls -t: sort by modification time

$ ls -v: sort by version


- cd [dir]: enter the folder

cd .. leaves the current folder and returns to the parent directory

- pwd: display the current path

- mkdir [dir]: create a folder

- chmod: change file/folder permissions

$ chmod [mode] [dir], where the mode looks like "755" or "777"

$ chmod [mode] [file]

$ chmod -R [mode] [dir]: recursively change permissions for everything in the target folder

Mode has another notation: "755" corresponds to "rwxr-xr-x"; the full mapping is not listed here.
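chmod in action, on a scratch tree: a numeric mode on a single file, then the recursive -R form on a whole directory. The file names are illustrative:

```shell
set -e
workdir=$(mktemp -d); cd "$workdir"
mkdir -p site/sub; touch site/run.sh site/sub/page.html

chmod 755 site/run.sh          # rwxr-xr-x: only the owner may write; everyone may execute
chmod -R 755 site              # apply the same mode to the whole tree

[ -x site/run.sh ] && echo "executable"   # prints "executable"
```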


- rm [file]: delete a file/folder

$ rm -f [file]: force deletion; ignore nonexistent files and show no prompt

$ rm -r [file]: recursively delete all contents

$ rm -rf [dir]: force-delete a folder and everything in it


- cp: copy

$ cp [options] [source] [destination]

[options] can be -f (forced copy) or -r (recursive copy)

- mv: rename or move

$ mv [options] [source] [destination]

Commonly used [options]: -f (force move/rename), -i (prompt before overwriting), -u (update: move only when the source is newer)

For example:

$ mv wwwroot/cgi-bin .

moves the cgi-bin directory into the current directory

$ mv cronfile.txt myfile.txt renames cronfile.txt to myfile.txt
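cp and mv side by side, using throwaway names (A, B, the file names) in a scratch directory:

```shell
set -e
workdir=$(mktemp -d); cd "$workdir"
mkdir A B; echo x > A/one.txt; echo y > A/two.txt

cp -rpf A/* B                  # copy A's contents into B, preserving attributes
mv -f A/one.txt A/renamed.txt  # rename a file in place
mv B/two.txt .                 # move a file into the current directory
```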


2. wget: download tool

wget is a non-interactive network file download tool. It can be used in Linux to quickly download files from the network without manual intervention.

wget [parameter list] URL

The simplest usage:

$ wget http://targetdomain.com/file.tar


Common wget parameters:

-t [number of times]: number of retry attempts allowed when wget cannot establish a connection with the server. For example, "-t 120" means 120 attempts; when set to "0", wget retries infinitely until the connection succeeds. This is very useful: if the remote server suddenly goes down or the network is interrupted, the download can resume once things recover.

-c: resume an interrupted download; also a very useful setting. If a large download is interrupted, the transfer continues from where it stopped instead of starting over, provided the remote server supports resuming. Most UNIX/Linux web/FTP servers do.

-T [number of seconds]: timeout. If the remote server does not respond within this time, the connection is dropped and the next attempt begins. For example, "-T 120" means that if the remote server sends no data for 120 seconds, wget tries again. If your network is fast you can set a shorter time; otherwise set a longer one. A value of at most 900 and at least 60 is usual; around 120 is generally suitable.

-w [number of seconds]: seconds to wait between two attempts. For example, "-w 100" means wait 100 seconds between attempts.

-nd: do not recreate the directory structure; all files downloaded from the specified directories on the server are placed in the current directory.

-x: the opposite of "-nd": create the complete directory structure. For example, "wget -nd http://www.gnu.org/" puts everything in the current directory, whereas with -x the directory structure is created from the top level down until all files are saved.

-nH: do not create a directory named after the target host's domain name; descend directly into the target host's directory structure in the current directory.

-r: recursive download within the current directory structure.

-l [depth]: depth of the remote directory structure to download. For example, "-l 5" downloads directory structures and files at depth 5 or less.

-m: option for creating a site mirror. If you want to mirror a site, use this option and wget sets the other required options automatically.

-np: only download content under the specified directory of the target site and its subdirectories. This is also very useful: if one personal homepage links to another personal homepage on the same site and you only want the first one, omitting this option may end up crawling the entire site.

--http-user=username, --http-passwd=password: set these two options if the web server requires a username and password.

-O: write the downloaded data to the given file.


3. crontab: scheduled task execution

On the DreamHost system, you can use the shell to create your own crontab. Usage:

Using a terminal that supports shell login (such as fterm or putty), enter username@qiran.org:22 to log on to the server via SSH.

Common crontab commands:

crontab -l: display all existing cron jobs.

crontab -r: delete the current cron jobs.

crontab -e: edit the current crontab file; nano is recommended on DreamHost.

Note that your crontab contains all your cron jobs, one per line, each line ending with a line break. A normal cron entry looks like this:

45 2 * * * /home/user/script.pl

Field description:

The first number is the minute of the hour,

the second number is the hour of the day,

the third is the day of the month,

the fourth is the month of the year,

the fifth is the day of the week.

Examples:

32 * * * *: the 32nd minute of every hour.

12,42 * * * *: twice per hour, at the 12th and 42nd minutes.

*/15 */2 * * *: every 15 minutes during every even hour (0:00, 0:15, 0:30, 0:45, 2:00, ...).

43 18 * * 7: at 18:43 every Sunday.

After editing the file with nano on DreamHost, use Ctrl+O to save and Ctrl+X to exit.
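Putting the five fields together, a small crontab file (the kind crontab -e opens) might look like this; the paths and times are illustrative:

```
# min  hour  dom  month  dow  command
45     2     *    *      *    /home/user/script.pl    # 02:45 every day
32     *     *    *      *    /home/user/hourly.sh    # minute 32 of every hour
43     18    *    *      7    /home/user/weekly.sh    # 18:43 every Sunday
```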


4. The tar command

The tar command is used as follows:

tar [parameter list] [file name]

Parameter list:

-c: create a new archive, overwriting an old archive file

-x: extract from the archive file

-t: list the file directory of the archive file

-v: display a list of all files operated on

-f: use the given file name (or location) for the archive

-u: add files to the archive that are not already in it, or that have been modified

Examples:

tar cvf filename.tar /* create an archive */

tar cvf tarfile.tar ./filename /* add the file filename to tarfile.tar */

tar tvf filename.tar /* list the contents of the tar archive */

tar xvf filename.tar /* extract files from the tar archive */

tar zxpvf filename.tar.gz /* extract files from a tar.gz archive, preserving permissions */

tar zxvf filename.tar.gz /* same as above, without preserving permissions */

tar xvf tarfile.tar ./filename /* extract a single file from the tar archive */

tar xzf filename.tar.gz /* unzip a .tar.gz archive */

To package a directory for download, e.g. pack the directory ./www into the file www.tar.gz:

QUOTE:

tar czvf www.tar.gz ./www

or simply zip it: zip -r www.zip ./www

To pack all .php files in the current directory and compress them into the bak.tar.gz file:

tar czvf bak.tar.gz *.php
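The two packaging cases above, run against scratch files (the www directory and the .php files are created here just for the demonstration):

```shell
set -e
workdir=$(mktemp -d); cd "$workdir"
mkdir www; echo hi > www/index.html
echo '<?php ?>' > a.php; echo '<?php ?>' > b.php

tar czvf www.tar.gz ./www      # whole directory -> www.tar.gz
tar czvf bak.tar.gz *.php      # just the .php files -> bak.tar.gz

tar tzf bak.tar.gz             # lists a.php and b.php
```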


5. View folder sizes

du -s discuz /* total size in KB */

du -sh discuz /* same total, in human-readable units (KB/MB/GB) */
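du on a scratch directory: -s summarizes instead of listing every subdirectory, and -h makes the unit human-readable. The directory name discuz follows the example above:

```shell
set -e
workdir=$(mktemp -d); cd "$workdir"
mkdir discuz
head -c 204800 /dev/zero > discuz/big.bin   # a ~200 KB file

du -s discuz        # total size in block units (KB)
du -sh discuz       # same total, printed as e.g. 200K
```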






Log out of SSH and keep the job running:

# nohup wget http://www.phpv.net/file.tar.gz &
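nohup in miniature: the child process ignores the hangup signal, so it survives the end of the login shell. A quick local command stands in for the wget download above:

```shell
set -e
workdir=$(mktemp -d); cd "$workdir"

nohup sh -c 'sleep 1; echo finished > result.txt' > nohup.out 2>&1 &
pid=$!

wait "$pid"          # in a real session you would log out instead of waiting
cat result.txt       # prints "finished"
```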


wget is a tool used in Linux to fetch files from the World Wide Web. It is free software; its author is Hrvoje Niksic. wget supports the HTTP and FTP protocols, supports proxy servers and resumable transfers, and can automatically recurse through remote host directories, find the matching files, and download them to the local disk. If necessary, wget will properly convert the hyperlinks in downloaded pages into local links to generate a browsable mirror. Because it has no interactive interface, wget can run in the background, intercepting and ignoring the HANGUP signal, so it keeps running after you log out. wget is generally used for batch downloads of Internet files or for mirroring remote websites.


Syntax:

wget [options] [URL-list]

URL format: you can use URLs in the following forms:

http://host[:port]/path

For example:

http://fly.cc.fer.hr/

ftp://ftp.xemacs.org/pub/xemacs/xemacs-19.14.tar.gz

ftp://username:password@host/dir/file

In the last form, the user name and password are provided to the FTP host in URL-encoded form (of course, they can also be supplied via parameters, as shown later).


Parameter description:

wget has many parameters, but most applications only need the following common ones:

-r: recursive. For an HTTP host, wget first downloads the file specified by the URL and then recursively downloads all files referenced (hyperlinked) by that file, down to the level specified by the -l parameter. For an FTP host, this parameter means downloading all files in the directory specified by the URL; the recursion works much as in the HTTP case.

-N: timestamping. This parameter tells wget to download only updated files; files with the same last-modified date and the same length as the local copy will not be downloaded.

-m: mirror. Equivalent to using both the -r and -N parameters.

-l: sets the recursion level; the default is 5. "-l 1" is equivalent to non-recursive; "-l 0" is infinite recursion. Note that as the depth increases, the number of files can grow exponentially.

-t: sets the number of retries. When the connection is interrupted (or times out), wget tries to reconnect. If "-t 0" is specified, the number of retries is infinite.

-c: resumable download. In fact, wget resumes by default: if you downloaded part of a file with another ftp tool and want wget to finish the job, you only need to specify this parameter.


Example:

wget -m -l 4 -t 0 http://oneweb.com.cn/

This creates a mirror of http://oneweb.com.cn/ on the local hard disk. The mirrored files are saved in a subdirectory of the current directory named oneweb.com.cn (you can use the -nH parameter to skip creating this subdirectory and download directly into the current directory), the recursion depth is 4, and the number of retries is infinite (if the connection fails, wget will retry forever until the task is complete!).


Some other, less frequently used parameters:

-A acclist / -R rejlist:

These two parameters specify the file extensions wget accepts or rejects; multiple names are separated by commas. For example, if you do not want to download MPEG video files or .AU audio files, use the following parameter:

-R mpg,mpeg,au


Other parameters include:

-L: follow relative links only. This parameter is useful for mirroring a specific site, because it avoids crawling other directories on the host. For example, for a personal website at http://www.xys.org/~ppfl/, the command line is:

wget -L http://www.xys.org/~ppfl/

Only the personal website is retrieved; other directories on the host www.xys.org are not touched.

-k: convert links: when saving HTML files, convert non-relative links to relative links.

-X: exclude the specified directories when downloading from an FTP host.


In addition, the following parameters control wget's output:

-v: verbose; wget outputs detailed job information.

-q: quiet; wget outputs no information.


If the links to the files to be retrieved are stored in an HTML document (or a plain text document), you can have wget read them directly from the file instead of providing URLs on the command line. The parameter format is:

-i filename

The address file can also be a non-HTML file, for example a plain text file containing the list of URLs to download.


We can use the following technique to increase download speed: since Linux is a multitasking system, we can run multiple wget processes at once. For example, download a main page file (index.html) first, then download every address listed in that file with its own independent wget process.

For other parameters, refer to the wget man page; the command is:

man wget


Using wget to create website mirrors

Use the wget command line in a shell to create a website mirror. This method downloads all files (including images and CSS) and converts the links on the pages into relative links; without the conversion, the links in the mirror would still point to the original website and would not work locally.

This method requires only one command line:

$ wget -mk -w 20 http://www.example.com/

The 20 in the command line means one file is downloaded every 20 seconds, which avoids hitting the website too frequently. You can tune it down, but when backing up someone else's website, be considerate of their server.


How to back up and restore a MySQL database using SSH (shell)

Example database parameters:

MySQL address: mysql.dh.net

MySQL name: mysql_dbname

MySQL user: mysql_dbuser

MySQL password: mysql_dbpass

We want to back up the database to bak.sql.

Steps:

Use telnet, or download putty on Windows; after logging on to the server, cd to a suitable directory (confirm that the current directory is writable).

Enter the following command:

CODE: mysqldump -h mysql.dh.net -u mysql_dbuser -p mysql_dbname > bak.sql

Then press Enter; you will be prompted for the database password. Enter the password and press Enter. OK! The database is backed up to the current directory.
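The backup and restore command lines, assembled from the example parameters above. No MySQL server is contacted here; the script only builds and prints the exact commands (with -p and no inline value, mysqldump and mysql prompt for the password interactively):

```shell
# Example parameters from the text; replace with your own.
DB_HOST=mysql.dh.net
DB_USER=mysql_dbuser
DB_NAME=mysql_dbname

backup_cmd="mysqldump -h $DB_HOST -u $DB_USER -p $DB_NAME > bak.sql"
restore_cmd="mysql -h $DB_HOST -u $DB_USER -p $DB_NAME"

echo "$backup_cmd"
echo "$restore_cmd   # then run: source bak.sql at the mysql> prompt"
```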


----------

How to restore the bak.sql database:

Enter the following command and press Enter:

CODE: mysql -h mysql.dh.net -u mysql_dbuser -p mysql_dbname

You will be prompted for the password; enter it and press Enter.

The MySQL prompt is displayed. Enter the following command:

CODE: source bak.sql

(Make sure the bak.sql file is in the current directory before proceeding.)

Press Enter. OK, restoring...

Note: if the database is relatively large, this will take some time (my DZ database took about 2 minutes).

In fact, any hosting space that supports SSH and does not block the mysql/mysqldump/source commands can use this method; 99% of the time there are no garbled characters, even if the database versions differ!!

If you have other space that supports remote MySQL (generally, cPanel space databases support remote connections), you can log on to DH via SSH, connect to your database remotely, and have the DH server restore or back up data to your remote database.
