Tutorial: migrating a Django app's database from MySQL to PostgreSQL

We have just finished migrating a Django 1.6 app's database from MySQL to PostgreSQL. If your environment is clean, the process is simple: let syncdb or migrate create the tables, truncate them, then run the dumpdata and loaddata management commands to finish.
The first step is to create an empty database in your PostgreSQL instance:

CREATE DATABASE dbname OWNER rolename;
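
If the owner role does not exist yet, you may need to create it first. A minimal sketch (the role name and password are placeholders):

CREATE ROLE rolename WITH LOGIN PASSWORD 'secret';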

The second step is to configure the newly created database in your Django settings.

In settings.py, we configure this:

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'dbname',
        'USER': 'dbuser',
        'PASSWORD': 'dbpass',
        'HOST': 'mysql.example.com',
        'PORT': '',
    },
    'postgresql': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'dbname',
        'USER': 'dbuser',
        'PASSWORD': 'dbpass',
        'HOST': 'postgresql.example.com',
        'PORT': '',
    },
}

This lets us specify, by name, which database each later command should operate on.
Step three: create the tables in the PostgreSQL instance

python manage.py syncdb --migrate --no-initial-data --database=postgresql

This runs syncdb and the migrations against the PostgreSQL database without loading any initial data.
Step four: truncate the newly created tables

Although we passed --no-initial-data in the previous step, it is best to truncate the new tables anyway, to guard against custom data having been added to them unnoticed during the migration. We can generate the SQL script to run:

python manage.py sqlflush --database=postgresql
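
Note that sqlflush only prints the truncation SQL; it does not execute it. One way to apply it, assuming psql is installed and the connection details match settings.py (a sketch):

python manage.py sqlflush --database=postgresql > flush.sql
psql -h postgresql.example.com -U dbuser dbname -f flush.sql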

Step five: back up the MySQL data to a JSON file

Django has a dumpdata command that lets you generate a database-independent dump; the default format is JSON.

python manage.py dumpdata --all --natural --indent=4 > dbname.json

The --all flag makes dumpdata use Django's base manager, so records are exported even if you have custom managers that filter or modify the data. The --natural flag tells Django to use natural keys where available, and --indent makes the output more readable.

You might want to export only the data of certain applications, or to leave out things such as celery logs; for that you can use the --exclude parameter, for example:

python manage.py dumpdata --all --natural --exclude=djcelery --exclude=search.SearchLog --indent=4 > dbname.json

Step six: load the JSON data into the PostgreSQL database

python manage.py loaddata dbname.json --database=postgresql
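
One thing to watch for: the rows are inserted with explicit primary keys, so PostgreSQL's sequences may still be at their initial values afterwards. Django's sqlsequencereset command prints the SQL to realign them; the app label below is a placeholder:

python manage.py sqlsequencereset myapp --database=postgresql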

Basically the migration process is now over; all that is left is to modify your database configuration so that PostgreSQL becomes the default database.

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'dbname',
        'USER': 'dbuser',
        'PASSWORD': 'dbpass',
        'HOST': 'postgresql.example.com',
        'PORT': '',
    },
}

In our case the database was not that clean. It was originally created for a legacy PHP codebase that we are still stepping away from, so some tables live outside of Django and are used by that program. To migrate those to PostgreSQL I used this tool, and the process was still simple.
Precautions
Django signals

You may want to disable these. When a database record is created, your application may, for example, send notification emails to users; to avoid disturbing them, you need to make sure such signals do not fire while the data is being loaded. One way to handle this is to have the receivers detect that a fixture is being loaded and do nothing.
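
A minimal sketch of that approach (the send_welcome_email receiver and its User sender are hypothetical): Django passes raw=True to pre_save/post_save receivers while loaddata is saving deserialized objects, so a receiver can simply bail out in that case.

from django.contrib.auth.models import User
from django.db.models.signals import post_save
from django.dispatch import receiver

@receiver(post_save, sender=User)
def send_welcome_email(sender, instance, created, raw=False, **kwargs):
    # loaddata saves fixture objects with raw=True, so we skip side
    # effects such as sending mail during the import.
    if raw or not created:
        return
    # ... send the welcome email here ...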
Constraints (like non-null, unique, and foreign keys)

Our migration ran into a lot of problems in this area: foreign key constraints pointing at rows that do not exist in the other table, empty values in fields the model definition does not allow to be null, and duplicates that violated a unique constraint during the copy. These required manual intervention, and I had to clean them out with SQL statements. dumpdata and loaddata will detect such problems, so your database must be in a clean, consistent state.
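
For example, orphaned foreign keys can be hunted down with a query along these lines before dumping (a sketch; the table and column names are placeholders):

SELECT c.id
FROM child_table c
LEFT JOIN parent_table p ON c.parent_id = p.id
WHERE c.parent_id IS NOT NULL
  AND p.id IS NULL;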

Hard-coded primary keys

This was painful because our test suite hard-coded primary keys everywhere, so a lot of tests failed: PostgreSQL handles sequences differently from MySQL's auto-increment columns. I had to modify more than 700 test cases by hand; mostly simple changes, but very time consuming.
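
The typical fix looked like this (a sketch; the Item model and URL are hypothetical): create the object inside the test and use whatever primary key it is actually assigned, instead of assuming a fixed ID.

# Before: assumed the row would get a predictable auto-increment ID
# response = self.client.get('/items/1/')

# After: create the object and use its real primary key
item = Item.objects.create(name='example')
response = self.client.get('/items/%d/' % item.pk)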

Native SQL statements

Fortunately, we use native SQL in only one place. Some of the functions available in MySQL do not exist in PostgreSQL, so we rewrote the query to use the equivalent PostgreSQL functions.
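
As an illustration (not the author's actual query), MySQL's YEAR() function has to be rewritten with PostgreSQL's EXTRACT():

-- MySQL
SELECT YEAR(created_at) FROM orders;

-- PostgreSQL
SELECT EXTRACT(YEAR FROM created_at) FROM orders;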

Case sensitivity

String comparisons are case-sensitive in PostgreSQL but not under MySQL's default collations. I also hit a problem migrating the non-Django tables: an index-creation statement referred to a column as ID while the field was actually named Id (with a lowercase i), so I simply renamed it to ID.
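
Relatedly, if your queries depended on MySQL's case-insensitive comparisons, Django's __iexact lookup gives the same behavior on PostgreSQL (the query below is just an illustration):

# Matches 'alice', 'Alice', 'ALICE', ... on both backends
User.objects.filter(username__iexact='alice')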
