How to export/import data in Oracle databases (1)

So far, the export/import toolset has been the utility of choice for transferring data across multiple platforms with minimal effort, despite frequent complaints that it is too slow. Import simply reads each record from the export dump file and inserts it into the target table using an ordinary INSERT INTO statement, so it is no surprise that the import process can be slow.

Enter Oracle Data Pump. This newer, much faster counterpart of the export/import toolkit in Oracle Database 10g is designed to speed up the process many times over.

Data Pump is a complete overhaul of the entire export/import process. Rather than using ordinary SQL statements, it uses dedicated APIs to load and unload data at a much higher speed. In my tests, export performance was 10-15 times better than direct-mode export, and import performance was about 5 times better. In addition, unlike the original export utility, Data Pump can extract only particular types of objects (such as procedures).
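
As a quick aside on that last point, the INCLUDE parameter of expdp is what restricts an export to particular object types. The sketch below is only illustrative (the schema and file names are placeholders, and it assumes a directory object such as the one we create in the next section):

expdp ananda/abc123 schemas=ANANDA include=PROCEDURE directory=DPDATA1 dumpfile=procs.dmp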

Data Pump Export

This new utility is called expdp, to distinguish it from the original export utility, exp. In this example, we will use Data Pump to export a large table, CASES, about 3 GB in size. Data Pump uses server-side file handling to create and read files, so directories are used as locations. Here we will use the file system /u02/dpdata1 to hold the dump file.

create directory dpdata1 as '/u02/dpdata1';
grant read, write on directory dpdata1 to ananda;
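
Before exporting, you can verify that the directory object is in place; DBA_DIRECTORIES is the standard data dictionary view for this (queried here as a suitably privileged user):

select directory_name, directory_path from dba_directories where directory_name = 'DPDATA1';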

Next, we will export the data:

expdp ananda/abc123 tables=CASES directory=DPDATA1 dumpfile=expCASES.dmp job_name=CASES_EXPORT

Let's analyze each part of this command. The user ID/password combination, table name, and dump file parameters are self-explanatory. Unlike the original export, the file is created on the server, not on the client. The location is specified by the DIRECTORY parameter value DPDATA1, which points to the /u02/dpdata1 created earlier. The process also creates a log file on the server, at the location specified by the DIRECTORY parameter. By default, the process uses a directory object named DPUMP_DIR, so you can create that instead of DPDATA1.
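
If you would rather name the log file explicitly than rely on the default, the LOGFILE parameter of expdp does so; the log file name below is purely illustrative:

expdp ananda/abc123 tables=CASES directory=DPDATA1 dumpfile=expCASES.dmp logfile=expCASES.log job_name=CASES_EXPORT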

Note that the job_name parameter above is a special parameter that does not exist in the original export. All Data Pump work is done through jobs. Data Pump jobs, unlike DBMS jobs, are simply server processes that process data on behalf of the main process. The main process, known as the master control process, coordinates this effort through Advanced Queuing (AQ); it does so through a special table created at runtime, known as the master table. In our example, if you check ANANDA's schema while expdp is running, you will notice a table named CASES_EXPORT, corresponding to the job_name parameter. When expdp finishes, this table is dropped.
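
You can also confirm the job from another session while it runs; DBA_DATAPUMP_JOBS is the standard dictionary view for active Data Pump jobs (a minimal check, assuming DBA privileges):

select owner_name, job_name, state from dba_datapump_jobs;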

Export Monitoring

While Data Pump Export (DPE) is running, pressing Control-C stops the display of messages on the screen, but it does not stop the export process itself. Instead, it displays the DPE prompt (shown below). The process is now said to be in "interactive" mode:

Export>

This approach lets you enter several commands against the running DPE job. To view a summary of the job, run the STATUS command at the prompt:

    Export> status
    Job: CASES_EXPORT
      Operation: EXPORT
      Mode: TABLE
      State: EXECUTING
      Degree: 1
      Job Error Count: 0
      Dump file: /u02/dpdata1/expCASES.dmp
        bytes written = 2048

    Worker 1 Status:
      State: EXECUTING
      Object Schema: DWOWNER
      Object Name: CASES
      Object Type: TABLE_EXPORT/TBL_TABLE_DATA/TABLE/TABLE_DATA
      Completed Objects: 1
      Total Objects: 1
      Completed Rows: 4687818

Remember, this is only the status display; the export is still running in the background. To continue seeing the messages on the screen, run the CONTINUE_CLIENT command at the Export> prompt.
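
Even if the client session is closed entirely (for example, with the EXIT_CLIENT command), the job keeps running on the server, and you can reattach to it later; ATTACH is a standard expdp parameter that takes the job name:

expdp ananda/abc123 attach=CASES_EXPORT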

Parallel Operation

You can use more than one thread for the export through the PARALLEL parameter, significantly speeding up the job. Each thread creates a separate dump file, so the dumpfile parameter should have as many entries as the degree of parallelism. Instead of entering each file name explicitly, you can specify a wildcard character in the file name. For example:

expdp ananda/abc123 tables=CASES directory=DPDATA1 dumpfile=expCASES_%U.dmp parallel=4 job_name=Cases_Export
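
When these dump files are later loaded with impdp, the Data Pump import counterpart, the same wildcard and degree of parallelism can be reused; the job name below is only illustrative:

impdp ananda/abc123 directory=DPDATA1 dumpfile=expCASES_%U.dmp parallel=4 job_name=cases_import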

