Using MySQL to Learn MongoDB: Import and Export


In the previous article, we used MySQL as a reference point to learn about MongoDB authorization and permissions. In this article, we continue the journey and look at import and export in both databases.

1. MySQL Import and Export

(1) mysqlimport

This tool is located in the mysql/bin directory and is a very effective way to load (import) data into MySQL. It is a command-line tool that takes two arguments, the database name and the text file to load, plus a large number of options. The target table name is derived from the file name. For example, to import data from the file student.txt into the student table of the database class:

mysqlimport class student.txt
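Fleshing this out a little, here is a hedged sketch of a full round trip. The sample rows, the tab delimiter, and the assumption that database class and table student already exist on a running server are all illustrative choices, not part of the original example:

```shell
# Build a sample tab-separated data file; mysqlimport derives the target
# table name from the file name (student.txt -> table `student`).
printf '1\tTom\n2\tJerry\n' > student.txt

# Import it into database `class`; --local reads the file on the client
# side. A running server, the database and the table are assumed to exist
# (add -u/-p credential options as needed for your setup).
mysqlimport --local --fields-terminated-by='\t' class student.txt
```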

(2) load data infile

This command is very similar to mysqlimport, but it is issued from within the MySQL command line rather than the shell. Like mysqlimport, it also accepts a number of optional parameters. For example, to load data from a file on your own machine into a remote database server, run the following command:

LOAD DATA LOCAL INFILE 'd:/student.txt' INTO TABLE student;

The LOCAL keyword indicates that the file lives on the client machine rather than on the server you are logged on to. This way you do not need to ftp the file to the server first; MySQL handles the transfer for you.
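The same statement can also be driven from the shell by piping it through the mysql client. A small sketch, assuming the same student table and a tab-separated file; the path and delimiters are illustrative:

```shell
# Write the LOAD DATA statement to a script file so it can be inspected
# and replayed later; the delimiters are assumptions about the file format.
cat > load_student.sql <<'SQL'
LOAD DATA LOCAL INFILE '/tmp/student.txt'
INTO TABLE student
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n';
SQL

# Replay it against database `class`; LOCAL loads additionally require
# --local-infile to be enabled (add -u/-p credentials for your server).
mysql --local-infile=1 class < load_student.sql
```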

(3) mysqldump

mysqldump is in many respects the opposite of mysqlimport, and the two share many of the same options. However, mysqldump can do more: it can dump an entire database into a single text file containing all the SQL commands needed to recreate that database. It reads the schema and converts it into DDL statements, then reads the data and generates INSERT statements from it. In effect, it reverse-engineers your whole database design into one text file, which can later be imported back into MySQL simply by replaying the SQL statements it contains. The tool is remarkably simple and fast, with no headaches involved. For example, to dump the contents of the entire database mydb to a file, use the following command:

bin/mysqldump -p mydb > mydb.txt
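Restoring such a dump is symmetric: the file is plain SQL, so it can simply be replayed through the mysql client. A minimal sketch with a made-up two-line stand-in for the dump (a real mysqldump file would of course be much larger):

```shell
# For illustration, stand in a tiny "dump" file; a real mydb.txt produced
# by mysqldump contains the full DDL and INSERT statements for the database.
printf 'CREATE TABLE t (id INT);\nINSERT INTO t VALUES (1);\n' > mydb.txt

# Replay it into an (already created) target database; add -u/-p
# credential options as needed for your server.
mysql mydb < mydb.txt
```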

2. MongoDB Import and Export

(1) Export Tool

MongoDB provides the mongoexport tool to export a collection to a file in JSON or CSV format. You can specify which fields to export, or export only the documents matching a given query. The tool's help output is as follows:

[root@localhost bin]# ./mongoexport --help
options:
  --help                  produce help message
  -v [ --verbose ]        be more verbose (include multiple times for more
                          verbosity e.g. -vvvvv)
  -h [ --host ] arg       mongo host to connect to ( <set name>/s1,s2 for sets)
  --port arg              server port. Can also use --host hostname:port
  --ipv6                  enable IPv6 support (disabled by default)
  -u [ --username ] arg   username
  -p [ --password ] arg   password
  --dbpath arg            directly access mongod database files in the given
                          path, instead of connecting to a mongod server -
                          needs to lock the data directory, so cannot be used
                          if a mongod is currently accessing the same path
  --directoryperdb        if dbpath specified, each db is in a separate
                          directory
  -d [ --db ] arg         database to use
  -c [ --collection ] arg collection to use (some commands)
  -f [ --fields ] arg     comma separated list of field names e.g. -f name,age
  --fieldFile arg         file with fields names - 1 per line
  -q [ --query ] arg      query filter, as a JSON string
  --csv                   export to csv instead of json
  -o [ --out ] arg        output file; if not specified, stdout is used
  --jsonArray             output to a json array rather than one object per
                          line
[root@localhost bin]#

The following is a practical example to illustrate the usage of this tool:

Export collection t1 in the foo database to JSON format:

[root@localhost bin]# ./mongoexport -d foo -c t1 -o /data/t1.json
connected to: 127.0.0.1
exported 1 records
[root@localhost bin]#

After the export succeeds, check whether the /data/t1.json file looks the way we expect:

[root@localhost data]# more t1.json
{ "_id" : { "$oid" : "4f927e2385b7a6814a0540a0" }, "age" : 2 }
[root@localhost data]#
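The -q option shown in the help output can restrict an export to matching documents. A hedged sketch; the filter value and output path below are made up for illustration:

```shell
# A query filter is just a JSON string; here we ask mongoexport to emit
# only the documents where age >= 2 (requires a running mongod).
QUERY='{ "age" : { "$gte" : 2 } }'
./mongoexport -d foo -c t1 -q "$QUERY" -o /tmp/t1_filtered.json
```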

The export succeeded, but one question remains: how do we migrate data between heterogeneous databases? For example, how would we import MongoDB data into MySQL? For this, MongoDB provides the CSV export format. The following exports the age and name fields of collection t2 in the foo database:

[root@localhost bin]# ./mongoexport -d foo -c t2 --csv -f age,name -o /data/t2.csv
connected to: 127.0.0.1
exported 1 records
[root@localhost bin]#

View the export result in /data/t2.csv:

[root@localhost data]# more t2.csv
age,name
1,"wwl"
[root@localhost data]#

As you can see, MongoDB gives us a convenient data export tool.
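To complete such a migration into MySQL, one option is to turn the CSV into INSERT statements before feeding them to the mysql client. A sketch using awk; the target table t2(age, name) and its column types are assumptions, not from the original:

```shell
# Recreate the CSV produced by mongoexport above (header + one data row).
printf 'age,name\n1,"wwl"\n' > t2.csv

# Convert every data row (skipping the header) into an INSERT statement.
# \047 is a single quote; we swap the CSV double quotes for SQL quoting.
# Assumes a MySQL table `t2(age INT, name VARCHAR(32))` already exists.
awk -F',' 'NR > 1 { gsub(/"/, "\047", $2); print "INSERT INTO t2 (age, name) VALUES (" $1 ", " $2 ");" }' t2.csv > t2.sql

cat t2.sql
# -> INSERT INTO t2 (age, name) VALUES (1, 'wwl');
```

The resulting file can then be loaded with the mysql client, e.g. `mysql -u root -p foo < t2.sql`.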

(2) Import Tool

MongoDB provides the mongoimport tool to import the contents of a file in a specific format into a collection. The tool's help output is as follows:

[root@localhost bin]# ./mongoimport --help
options:
  --help                  produce help message
  -v [ --verbose ]        be more verbose (include multiple times for more
                          verbosity e.g. -vvvvv)
  -h [ --host ] arg       mongo host to connect to ( <set name>/s1,s2 for sets)
  --port arg              server port. Can also use --host hostname:port
  --ipv6                  enable IPv6 support (disabled by default)
  -u [ --username ] arg   username
  -p [ --password ] arg   password
  --dbpath arg            directly access mongod database files in the given
                          path, instead of connecting to a mongod server -
                          needs to lock the data directory, so cannot be used
                          if a mongod is currently accessing the same path
  --directoryperdb        if dbpath specified, each db is in a separate
                          directory
  -d [ --db ] arg         database to use
  -c [ --collection ] arg collection to use (some commands)
  -f [ --fields ] arg     comma separated list of field names e.g. -f name,age
  --fieldFile arg         file with fields names - 1 per line
  --ignoreBlanks          if given, empty fields in csv and tsv will be ignored
  --type arg              type of file to import. default: json (json,csv,tsv)
  --file arg              file to import from; if not specified stdin is used
  --drop                  drop collection first
  --headerline            CSV,TSV only - use first line as headers
  --upsert                insert or update objects that already exist
  --upsertFields arg      comma-separated fields for the query part of the
                          upsert. You should make sure this is indexed
  --stopOnError           stop importing at first error rather than continuing
  --jsonArray             load a json array, not one item per line. Currently
                          limited to 4MB.

The following example shows how to use this tool:

Let's take a look at collection t1 in the foo database:

> db.t1.find();
{ "_id" : ObjectId("4f937a56450beadc560feaa9"), "age" : 5 }
>

t1 currently contains one record, with age = 5. Now let's see what the data in the JSON file looks like:

[root@localhost data]# more t1.json
{ "_id" : { "$oid" : "4f937a56450beadc560feaa7" }, "age" : 8 }
[root@localhost data]#

The t1.json file contains one record, with age = 8. Next we use the mongoimport tool to import that record into collection t1:

[root@localhost bin]# ./mongoimport -d foo -c t1 /data/t1.json
connected to: 127.0.0.1
imported 1 objects

The tool reports that one object was imported. Let's verify it in the database:

[root@localhost bin]# ./mongo
MongoDB shell version: 1.8.1
connecting to: test
> use foo
switched to db foo
> db.t1.find();
{ "_id" : ObjectId("4f937a56450beadc560feaa9"), "age" : 5 }
{ "_id" : ObjectId("4f937a56450beadc560feaa7"), "age" : 8 }
>

The record with age = 8 is now in the collection, exactly as expected.
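Going the other way, the --type csv, --headerline and --drop options from the help output above let us load a CSV file like the one exported earlier back into a collection. A sketch; the /tmp path and the choice to drop the collection first are illustrative:

```shell
# Recreate the CSV exported earlier (header line + one data row).
printf 'age,name\n1,"wwl"\n' > /tmp/t2.csv

# Import it back; --headerline uses "age,name" as the field names, and
# --drop empties the collection first (requires a running mongod).
./mongoimport -d foo -c t2 --type csv --headerline --drop --file /tmp/t2.csv
```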
