Migrate data from MongoDB to PostgreSQL

It happened just last month: it took me a week to migrate the project's database from MongoDB to PostgreSQL. Here I briefly describe the reason for the migration and the process. Since the project has not been launched yet, I can't go into the business logic in much detail.

Migration Reason

Over the past few months I had seen plenty of complaints about MongoDB, mainly that its memory usage is excessive and that its performance is poor in certain scenarios. Still, considering that our project had already been in development for a while and that MongoDB does improve development efficiency, we had no plans to replace it. Even when the notorious "Don't use MongoDB" troll post appeared, I stayed fairly calm.

That changed when I finally ran into the fact that MongoDB does not support transactions. Some new features have strict data-consistency requirements, and in certain cases data must be rolled back. MongoDB cannot guarantee this by default, so unless a reliable workaround could be found, we would have to switch to an RDBMS.
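To make the requirement concrete, here is a minimal sketch, with hypothetical Order and Account models rather than the project's actual code, of the database-level guarantee we needed, written against ActiveRecord:

```ruby
# Hypothetical models for illustration only: either both writes commit,
# or an exception rolls the whole transaction back at the database level.
ActiveRecord::Base.transaction do
  order.state = "paid"
  order.save!                   # raises on failure

  account.balance -= order.total
  account.save!                 # if this raises, the order update is rolled back too
end
```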

The various solutions I found fall roughly into two categories: simulate a transaction, or design around the need for one. "Perform Two Phase Commits" in MongoDB's official Cookbook describes how to use an extra collection to record the state of each data operation and thereby simulate a transaction. The "Master Detail Transactions in MongoDB" and "E-commerce" articles both rely on the atomicity of single-document operations in MongoDB to ensure consistency.
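As an illustration of the second approach, here is a hedged sketch, assuming Mongoid as the ODM and using hypothetical Order/LineItem models: the line items are embedded in the order document, so one atomic document write covers all of them.

```ruby
# Line items are embedded in the order document, so creating or updating an
# order together with its items is a single atomic document operation.
class Order
  include Mongoid::Document
  field :state, type: String, default: "pending"
  embeds_many :line_items
end

class LineItem
  include Mongoid::Document
  field :sku,      type: String
  field :quantity, type: Integer
  embedded_in :order
end

order = Order.new(state: "placed")
order.line_items.build(sku: "ABC-1", quantity: 2)
order.line_items.build(sku: "XYZ-9", quantity: 1)
order.save!   # one document insert: all items persist together or not at all
```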

Unfortunately, neither solution is reliable. The first is too hacky: it simulates transactions at the application level rather than at the database level, which makes it hard to guarantee data safety. The second imposes too many design constraints on the models; it is unrealistic to embed every related model inside a single document. In short, MongoDB was not up to the job.

In addition, the Stack Overflow question "Are there any e-commerce websites that use NoSQL databases?" covers this kind of case and likewise concludes that an RDBMS is more reliable than a NoSQL solution. An excerpt from the answer:

The overhead that makes RDBMS's so slow is guaranteeing atomicity, consistency, isolation, and durability, also known as ACID. Some of these properties are pretty critical for applications that deal with money. You don't want to lose a single order when the lights go out.

NoSQL databases usually sacrifice some or all of the ACID properties in return for severely reduced overhead. For many applications this is fine: if a few "diggs" go missing when the lights go out, it's no big deal.

For an e-commerce site, you need to ask yourself what you really need. Do you really need a level of performance that an RDBMS can't deliver? Do you need the reliability that an RDBMS provides? Unless you're dealing with traffic levels comparable to Amazon.com's, an RDBMS, even on modest hardware, will probably satisfy your performance needs just fine, especially if you limit yourself to simple queries and index properly.

After completing this investigation, I finally made up my mind to migrate the data.

Migration Process
  1. Remove everything related to Mongoid from the Rails configuration files and switch back to ActiveRecord. At this point the project can no longer start normally.

  2. Rewrite the migrations for all models. At this stage I skip the association-related code and only cover each model's basic attributes. Mongoid-specific features used in the models, as well as the more complicated methods, are commented out for now.

  3. Once the project starts properly again, rewrite the code for each model and its associations. All the original HABTM associations are changed to has_many :through (see the sketch after this list).
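As an illustration of step 3, here is a hedged sketch using hypothetical Post/Tag models rather than the project's actual code: the old has_and_belongs_to_many association is replaced with has_many :through plus an explicit join model, together with a basic join-table migration in the style of step 2.

```ruby
# ActiveRecord models: the former HABTM association now goes through
# an explicit Tagging join model.
class Post < ActiveRecord::Base
  has_many :taggings, dependent: :destroy
  has_many :tags, through: :taggings
end

class Tagging < ActiveRecord::Base
  belongs_to :post
  belongs_to :tag
end

class Tag < ActiveRecord::Base
  has_many :taggings, dependent: :destroy
  has_many :posts, through: :taggings
end

# Join-table migration (basic columns only, as in step 2).
class CreateTaggings < ActiveRecord::Migration
  def change
    create_table :taggings do |t|
      t.references :post
      t.references :tag
      t.timestamps
    end
    add_index :taggings, [:post_id, :tag_id], unique: true
  end
end
```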

This migration approach is fairly crude. Fortunately, our project had not been launched yet, so we did not have to worry about migrating production data. If you need to migrate a site that is already live, you can refer to the article on moving from Mongoid to ActiveRecord; its author recorded the entire process in detail and provides code for reference.

Original article: Migrate data from MongoDB to PostgreSQL. Thanks to the original author for sharing.
