MongoDB's Bulk Write operation


The Bulk Write operation is a feature introduced in MongoDB 3.2, with the following syntax:


db.collection.bulkWrite( [ <operation 1>, <operation 2>, ... ], { writeConcern: <document>, ordered: <boolean> } )
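
For context, insertOne is only one of the operation types bulkWrite accepts; updateOne, updateMany, replaceOne, deleteOne and deleteMany can be mixed in the same call. A minimal sketch (the characters collection and its field values are made up for illustration):

    > db.characters.bulkWrite( [
    ...     { insertOne : { "document" : { "_id" : 10, "char" : "Erdos", "class" : "wizard", "lvl" : 1 } } },
    ...     { updateOne : { "filter" : { "_id" : 10 }, "update" : { $set : { "lvl" : 2 } } } },
    ...     { deleteOne : { "filter" : { "lvl" : { $lt : 1 } } } }
    ... ], { ordered : true, writeConcern : { w : "majority" } } );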


The ordered option deserves attention. According to the official documentation:


    1. The default is true: the operations are executed serially, in order, and if one of them fails the remaining operations are not executed.

    2. If false, MongoDB executes the operations in parallel, and an error in one operation has no effect on the remaining operations.


Examples are as follows


  • Initialize the data by inserting 3 documents


    > db.log.count();
    0
    > db.log.bulkWrite( [
    ...     { insertOne : { "document" : { "_id" : 1, "char" : "Dithras", "class" : "barbarian", "lvl" : 4 } } },
    ...     { insertOne : { "document" : { "_id" : 2, "char" : "Dithras", "class" : "barbarian", "lvl" : 4 } } },
    ...     { insertOne : { "document" : { "_id" : 3, "char" : "Dithras", "class" : "barbarian", "lvl" : 4 } } }
    ... ], { ordered : true });
    {
            "acknowledged" : true,
            "deletedCount" : 0,
            "insertedCount" : 3,
            "matchedCount" : 0,
            "upsertedCount" : 0,
            "insertedIds" : {
                    "0" : 1,
                    "1" : 2,
                    "2" : 3
            },
            "upsertedIds" : {
            }
    }
    > db.log.count();
    3
  • ordered left at its default (true): the second document's primary key conflicts, so only the first document is inserted; the collection now holds 4 documents


    The second document's primary key conflicts, so only one of the three documents is inserted:
    > db.log.bulkWrite( [
    ...     { insertOne : { "document" : { "_id" : 4, "char" : "Dithras", "class" : "barbarian", "lvl" : 4 } } },
    ...     { insertOne : { "document" : { "_id" : 2, "char" : "Dithras", "class" : "barbarian", "lvl" : 4 } } },
    ...     { insertOne : { "document" : { "_id" : 5, "char" : "Dithras", "class" : "barbarian", "lvl" : 4 } } }
    ... ], { ordered : true });
    2017-04-10T17:48:37.960+0800 E QUERY    [thread1] BulkWriteError: write error at item 1 in bulk operation :
    BulkWriteError({
            "writeErrors" : [
                    {
                            "index" : 1,
                            "code" : 11000,
                            "errmsg" : "E11000 duplicate key error collection: c_log.log index: _id_ dup key: { : 2.0 }",
                            "op" : {
                                    "_id" : 2,
                                    "char" : "Dithras",
                                    "class" : "barbarian",
                                    "lvl" : 4
                            }
                    }
            ],
            "writeConcernErrors" : [ ],
            "nInserted" : 1,
            "nUpserted" : 0,
            "nMatched" : 0,
            "nModified" : 0,
            "nRemoved" : 0,
            "upserted" : [ ]
    })
    BulkWriteError@src/mongo/shell/bulk_api.js:372:48
    BulkWriteResult/this.toError@src/mongo/shell/bulk_api.js:336:24
    Bulk/this.execute@src/mongo/shell/bulk_api.js:1173:1
    DBCollection.prototype.bulkWrite@src/mongo/shell/crud_api.js:191:20
    @(shell):1:1
    > db.log.count();
    4
  • ordered set to false: the first document's primary key conflicts, but the second and third are inserted without issue; the collection now holds 6 documents


    > db.log.bulkWrite( [
    ...     { insertOne : { "document" : { "_id" : 4, "char" : "Dithras", "class" : "barbarian", "lvl" : 4 } } },
    ...     { insertOne : { "document" : { "_id" : 6, "char" : "Dithras", "class" : "barbarian", "lvl" : 4 } } },
    ...     { insertOne : { "document" : { "_id" : 5, "char" : "Dithras", "class" : "barbarian", "lvl" : 4 } } }
    ... ], { ordered : false });
    2017-04-10T17:49:36.539+0800 E QUERY    [thread1] BulkWriteError: write error at item 0 in bulk operation :
    BulkWriteError({
            "writeErrors" : [
                    {
                            "index" : 0,
                            "code" : 11000,
                            "errmsg" : "E11000 duplicate key error collection: c_log.log index: _id_ dup key: { : 4.0 }",
                            "op" : {
                                    "_id" : 4,
                                    "char" : "Dithras",
                                    "class" : "barbarian",
                                    "lvl" : 4
                            }
                    }
            ],
            "writeConcernErrors" : [ ],
            "nInserted" : 2,
            "nUpserted" : 0,
            "nMatched" : 0,
            "nModified" : 0,
            "nRemoved" : 0,
            "upserted" : [ ]
    })
    BulkWriteError@src/mongo/shell/bulk_api.js:372:48
    BulkWriteResult/this.toError@src/mongo/shell/bulk_api.js:336:24
    Bulk/this.execute@src/mongo/shell/bulk_api.js:1173:1
    DBCollection.prototype.bulkWrite@src/mongo/shell/crud_api.js:191:20
    @(shell):1:1
    > db.log.count();
    6
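
The stack traces above come from running bulkWrite interactively; in a script you would normally catch the thrown BulkWriteError and inspect it instead. A minimal sketch, assuming the shell's getWriteErrors() accessor on the error object (the _id values are arbitrary):

    try {
        db.log.bulkWrite( [
            { insertOne : { "document" : { "_id" : 4, "char" : "Dithras", "class" : "barbarian", "lvl" : 4 } } },
            { insertOne : { "document" : { "_id" : 7, "char" : "Dithras", "class" : "barbarian", "lvl" : 4 } } }
        ], { ordered : false } );
    } catch (e) {
        // With ordered:false the non-conflicting inserts are still applied;
        // each entry returned by getWriteErrors() describes one failed operation.
        e.getWriteErrors().forEach(function (err) {
            print("item " + err.index + " failed: " + err.errmsg);
        });
    }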


This article is from the "www.wangerbao.com" blog; please contact the author before reproducing it.
