MongoDB 3.4 on CentOS 6.5: configuration, plus importing/exporting a MySQL .sql dump to MongoDB via CSV or JSON (64-bit system)

Source: Internet
Author: User
Tags: mongodb, import, database



Installation steps under CentOS via Yum are as follows:



Note: compared with installing from source, this approach needs less configuration and no manual creation of the log and data directories; it is simple and direct.


1. Create the repository file:
       vi /etc/yum.repos.d/mongodb-org-3.4.repo
2. Copy in the following configuration, then save and exit:
      [mongodb-org-3.4]
      name=MongoDB Repository
      baseurl=https://repo.mongodb.org/yum/redhat/$releasever/mongodb-org/3.4/x86_64/
      gpgcheck=1
      enabled=1
      gpgkey=https://www.mongodb.org/static/pgp/server-3.4.asc
  3. Install with yum:
       yum install -y mongodb-org

  4. Modify the configuration file so that other machines can also connect, not only 127.0.0.1:
      vi /etc/mongod.conf -----> change bindIp from the default 127.0.0.1 to 0.0.0.0
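The edit above can also be done non-interactively. A minimal sketch, assuming the YAML-style config file that the 3.4 yum package installs; it is shown against a local stand-in file here, so it runs anywhere — on a real machine, point sed at /etc/mongod.conf (it keeps a .bak backup):

```shell
# Local stand-in for /etc/mongod.conf (illustration only)
printf 'net:\n  port: 27017\n  bindIp: 127.0.0.1\n' > mongod.conf

# Change bindIp so remote machines can connect; a backup is kept as mongod.conf.bak
sed -i.bak 's/bindIp: 127\.0\.0\.1/bindIp: 0.0.0.0/' mongod.conf
grep 'bindIp' mongod.conf
```

Remember to restart the service after the change so it takes effect.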

  5. Start and stop the service:
      service mongod start, service mongod stop

  6. Log file location -----> cat /var/log/mongodb/mongod.log
     Database files -----> ls /var/lib/mongo
  
  7. Uninstall MongoDB -----> yum erase $(rpm -qa | grep mongodb-org)
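Steps 1–7 above can be collected into one script. A sketch assuming a RHEL/CentOS 6 box: the repo file is written to the current directory here for illustration (copy it to /etc/yum.repos.d/ for real use), and the privileged commands are left commented out:

```shell
# Steps 1-2: write the yum repo definition without an interactive editor
# (quoted heredoc keeps $releasever literal for yum to expand)
cat > mongodb-org-3.4.repo <<'EOF'
[mongodb-org-3.4]
name=MongoDB Repository
baseurl=https://repo.mongodb.org/yum/redhat/$releasever/mongodb-org/3.4/x86_64/
gpgcheck=1
enabled=1
gpgkey=https://www.mongodb.org/static/pgp/server-3.4.asc
EOF

# Steps 3 and 5 need root; uncomment on a real machine:
# cp mongodb-org-3.4.repo /etc/yum.repos.d/
# yum install -y mongodb-org
# service mongod start
grep -q '^baseurl=' mongodb-org-3.4.repo && echo "repo file ready"
```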

I use Robo 3T as a visualization tool to connect to MongoDB. Note that MongoVUE does not support MongoDB 3.x.




Common MongoDB commands:


View databases
     > show dbs
   Switch to a database
      > use database-name
    View collections (tables)
      > show tables
    Query data
       > db.table-name.find()
    Query with a condition
       > db.table-name.find({'name': 'zhangsan'})
    Create an index
       > db.table-name.ensureIndex({'name': 1})    1 is ascending, -1 is descending
    Create a compound index
      > db.table-name.ensureIndex({'name': 1, 'age': 1})    A query on name alone will use this index, but a query on age alone will not: only a leading prefix of the index columns can use a compound index, so the query must include the leading column(s).
    Check whether the index was built
      > db.table-name.getIndexes()
    Drop an index
      > db.table-name.dropIndex({'name': 1})
    List all indexes in the database:
     > db.system.indexes.find()    (deprecated in 3.x; db.table-name.getIndexes() is the supported way)
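These shell commands can also be run non-interactively by feeding a script file to the mongo shell. A minimal sketch, assuming a mongod reachable on the default port and using placeholder database/collection names; writing the script itself runs anywhere, so the actual invocation is left commented:

```shell
# In a .js script the interactive "use testdb" becomes getSiblingDB()
cat > common.js <<'EOF'
var db = db.getSiblingDB('testdb');        // use testdb
db.people.ensureIndex({name: 1, age: 1});  // compound index: the name prefix is usable alone
printjson(db.people.getIndexes());         // confirm the index was built
EOF

# Run it against a live mongod (uncomment when one is available):
# mongo localhost:27017/testdb common.js
```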





Practice: importing JSON or CSV into MongoDB


Summary: importing a 4 GB .sql file (20,050,144 rows) into MongoDB took 28 + 25 + 15 minutes in total:
            ①, 28 minutes: import the .sql file into MySQL on Linux with the source command
            ②, 25 minutes: export to CSV or JSON format with Navicat for MySQL
            ③, 15 minutes: load the CSV or JSON into MongoDB with the mongoimport command

The first two steps are straightforward; the only thing to watch out for is step ②:
                   JSON format: the 4 GB .sql file grows to about 11 GB when Navicat exports it to JSON, and importing that 11 GB JSON into MongoDB fails with an error; a 200 MB file imported without problems in testing.
                   CSV format: the 4 GB .sql file exports to a CSV of roughly the same size. The CSV may look garbled when opened; ignore that — it imports into MongoDB without problems.
                   When exporting with Navicat, tick the 'Include column titles' option; the header row then supplies the field names, so you can pass fewer parameters to mongoimport.
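One common cause of a CSV export that "looks garbled" is a UTF-8 byte-order mark (BOM) at the start of the file — that cause is an assumption here, not something the original verifies. A sketch that detects and strips a BOM so the header row parses cleanly with --headerline; the sample file is fabricated for illustration:

```shell
# Fabricated sample: a CSV whose header row is preceded by a UTF-8 BOM
printf '\xef\xbb\xbfid,province,city\n1,Zhejiang,Hangzhou\n' > export.csv

# If the first 3 bytes are the BOM (ef bb bf), drop them
if [ "$(head -c 3 export.csv)" = "$(printf '\xef\xbb\xbf')" ]; then
  tail -c +4 export.csv > export.nobom.csv
else
  cp export.csv export.nobom.csv
fi
head -n 1 export.nobom.csv   # id,province,city
```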
The import commands are as follows:
     JSON:
          mongoimport --db table-name --collection table-name --file /test.json
     CSV:
          mongoimport --db table-name --collection table-name --type csv --headerline --ignoreBlanks --file /test.csv --numInsertionWorkers 4
The export commands are as follows:
     JSON:
          mongoexport --db table-name --collection table-name -o /test.json
     CSV:
          mongoexport --db table-name --collection table-name --csv -f id,province,city -o /test.csv
 Parameter descriptions:
     --csv                   export in CSV format; when exporting CSV you must specify the columns to export
     -f                      which columns to export
     --db                    which database to use, in this case "table-name"
     --collection            the collection (table) to export, in this case "table-name"
     --type                  the import type; the default is json
     --headerline            applies only to CSV/TSV imports; treats the first row of the file as the field names
     --ignoreBlanks          ignore empty fields in the input
     --file                  the location of the file
     --numInsertionWorkers   to improve insert throughput, MongoDB recommends running the import with multiple insertion workers; the insert task is essentially split across this many threads.
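The flags above can be exercised end to end with a tiny fabricated CSV. Building the file runs anywhere; the mongoimport line itself needs mongoimport on the PATH and a running mongod, so it is left commented (db/collection names are placeholders):

```shell
# Fabricated CSV matching the id,province,city columns used above
cat > /tmp/test.csv <<'EOF'
id,province,city
1,Zhejiang,Hangzhou
2,Jiangsu,Nanjing
EOF
wc -l < /tmp/test.csv   # 3 lines: one header + two data rows

# Import it (uncomment with a live mongod):
# mongoimport --db table-name --collection table-name --type csv \
#   --headerline --ignoreBlanks --file /tmp/test.csv --numInsertionWorkers 4
```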



