Back in the EF era, bulk INSERT was something I had to encapsulate myself. The principle is to reduce the number of SQL connections and the number of commands sent to the server: send everything in a single batch, or one batch per 1000 records, instead of one statement at a time. For the statements EF generates this is a clear performance win, because EF sends each statement to SQL Server individually, and even within a single connection, sending n separate commands performs far worse than sending one.
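The batching idea described above can be sketched independently of any ORM: split the entity stream into chunks (1000 here, matching the post) and issue one round trip per chunk. This is a minimal illustration of the principle only; the `Chunk` helper name and batch size are my own, not part of the original post's code:

```csharp
using System;
using System.Collections.Generic;

public static class BatchHelper
{
    // Split a sequence into batches of the given size, so a caller can
    // send one command per batch instead of one command per element.
    public static IEnumerable<List<T>> Chunk<T>(IEnumerable<T> source, int size)
    {
        var batch = new List<T>(size);
        foreach (var item in source)
        {
            batch.Add(item);
            if (batch.Count == size)
            {
                yield return batch;
                batch = new List<T>(size);
            }
        }
        if (batch.Count > 0)
            yield return batch; // flush the final, possibly smaller, batch
    }
}
```

With 2500 entities and a batch size of 1000 this yields three batches (1000, 1000, 500), i.e. three round trips instead of 2500.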
The same applies to MongoDB: reducing the number of round trips to the Mongo server is the key to faster inserts. Fortunately, the official driver has this built in for us: a collection of WriteModel objects holds the batch of operations to perform, and each object to be inserted is wrapped in an InsertOneModel. It is all quite intuitive and the code is clear!
```csharp
public void Insert(IEnumerable<TEntity> items)
{
    var list = new List<WriteModel<TEntity>>();
    foreach (var item in items)
    {
        list.Add(new InsertOneModel<TEntity>(item));
    }
    _table.BulkWriteAsync(list).Wait();
}
```
After testing the batch insert, I tried a batch update and a batch delete as well, but unfortunately both failed. The failing code is published below; if you have a solution, please leave a comment!
```csharp
public void Update(IEnumerable<TEntity> items)
{
    var list = new List<WriteModel<TEntity>>();
    foreach (var item in items)
    {
        var query = new QueryDocument(
            "_id",
            new ObjectId(typeof(TEntity).GetProperty(EntityKey).GetValue(item).ToString()));
        list.Add(new UpdateOneModel<TEntity>(
            query,
            Builders<TEntity>.Update.Combine(GeneratorMongoUpdate(item))));
    }
    _table.BulkWriteAsync(list).Wait();
}

public void Delete(IEnumerable<TEntity> items)
{
    var list = new List<WriteModel<TEntity>>();
    foreach (var item in items)
    {
        var query = new QueryDocument(
            "_id",
            new ObjectId(typeof(TEntity).GetProperty(EntityKey).GetValue(item).ToString()));
        list.Add(new DeleteOneModel<TEntity>(query));
    }
    _table.BulkWriteAsync(list).Wait();
}
```
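One thing worth noting about the failing code above: `QueryDocument` belongs to the legacy 1.x driver API, while `UpdateOneModel`/`DeleteOneModel` are 2.x types that expect a `FilterDefinition<TEntity>`. A sketch of the same methods using the 2.x `Builders<TEntity>.Filter` API follows; this is an untested guess at a fix, not a confirmed solution, and `EntityKey`, `GeneratorMongoUpdate`, and `_table` are the post author's own members, assumed to behave as in the original code:

```csharp
public void Update(IEnumerable<TEntity> items)
{
    var list = new List<WriteModel<TEntity>>();
    foreach (var item in items)
    {
        // Build the _id value via reflection, as the original code does.
        var id = new ObjectId(
            typeof(TEntity).GetProperty(EntityKey).GetValue(item).ToString());

        // Use the 2.x filter builder instead of the legacy QueryDocument.
        var filter = Builders<TEntity>.Filter.Eq("_id", id);
        var update = Builders<TEntity>.Update.Combine(GeneratorMongoUpdate(item));

        list.Add(new UpdateOneModel<TEntity>(filter, update));
    }
    _table.BulkWriteAsync(list).Wait();
}

public void Delete(IEnumerable<TEntity> items)
{
    var list = new List<WriteModel<TEntity>>();
    foreach (var item in items)
    {
        var id = new ObjectId(
            typeof(TEntity).GetProperty(EntityKey).GetValue(item).ToString());
        var filter = Builders<TEntity>.Filter.Eq("_id", id);

        list.Add(new DeleteOneModel<TEntity>(filter));
    }
    _table.BulkWriteAsync(list).Wait();
}
```

If `_id` is stored as an `ObjectId` in the collection but the mapped property is a string, a type mismatch in the filter value is another common reason such bulk updates and deletes silently match nothing.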
MongoDB Learning Notes ~ Implementing a Bulk Insert Method