Use expdp (non-local) to remotely export data

Source: Internet
Author: User
Tags: import, database


Background:

Some time ago, I needed to import test data (one table) from a remote test database into a local database. The table has about 5 million rows and more than 160 columns. Using exp/imp, the export was intolerably slow, so I switched to expdp/impdp instead.

Expdp/impdp introduction:

In addition to the traditional exp/imp import and export tools, Oracle provides the expdp/impdp Data Pump import and export utilities starting with Oracle 10g.

From the official documents, Oracle Data Pump consists of three parts:

> The command-line clients, expdp and impdp

> The DBMS_DATAPUMP PL/SQL package (also known as the Data Pump API)

> The DBMS_METADATA PL/SQL package (also known as the Metadata API)

The Data Pump clients, expdp and impdp, invoke the Data Pump Export utility and Data Pump Import utility, respectively.


The expdp and impdp clients use the procedures provided in the DBMS_DATAPUMP PL/SQL package to execute export and import commands, using the parameters entered at the command line. These parameters enable the exporting and importing of data and metadata for a complete database or for subsets of a database.

When metadata is moved, Data Pump uses functionality provided by the DBMS_METADATA PL/SQL package. The DBMS_METADATA package provides a centralized facility for the extraction, manipulation, and re-creation of dictionary metadata.


The DBMS_DATAPUMP and DBMS_METADATA PL/SQL packages can be used independently of the Data Pump clients.
 

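As a rough, untested sketch of what driving the DBMS_DATAPUMP API directly (rather than through the expdp client) might look like for a table-mode export — the job name, directory object, file names, schema, and table below are all hypothetical:

```sql
-- Sketch only: a table-mode export driven directly by DBMS_DATAPUMP
-- instead of the expdp client. All names here are made up.
DECLARE
  h  NUMBER;
  js VARCHAR2(30);
BEGIN
  h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT',
                          job_mode  => 'TABLE',
                          job_name  => 'MY_EXP_JOB');
  DBMS_DATAPUMP.ADD_FILE(h, 'my_exp.dmp', 'EXPDP_DIR');
  DBMS_DATAPUMP.ADD_FILE(h, 'my_exp.log', 'EXPDP_DIR',
                         filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
  DBMS_DATAPUMP.METADATA_FILTER(h, 'SCHEMA_EXPR', 'IN (''DEPKF15'')');
  DBMS_DATAPUMP.METADATA_FILTER(h, 'NAME_EXPR', 'IN (''T_CP'')');
  DBMS_DATAPUMP.START_JOB(h);
  DBMS_DATAPUMP.WAIT_FOR_JOB(h, js);  -- blocks until the job finishes
END;
/
```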

All Data Pump Export and Import processing, including the reading and writing of dump files, is done on the system (server) selected by the specified database connect string. This means that for unprivileged users, the database administrator (DBA) must create directory objects for the Data Pump files that are read and written on that server file system. (For security reasons, DBAs must ensure that only approved users are allowed access to directory objects.) For privileged users, a default directory object is available. See "Default Locations for Dump, Log, and SQL Files" for more information about directory objects.
 


Summary:

While running, the Data Pump tools expdp/impdp actually call two PL/SQL packages (one for data and one for metadata), and the dump files they read and write are stored in the server-side folder that the directory object points to.

Here is the problem: without an account on that server, how do you get hold of the dump file even if the export itself succeeds?

The requirement: without a logon account on the remote server, use expdp on the local server to export a table's data from the remote server, then import it into the local database with impdp.

Of course, exp/imp can import and export a remote table directly; now let's see how to do the same with expdp/impdp.

Solution: Use dblink to import and export remote data.

1. Create a dblink for the local database to point to the remote database.

create database link gf_local connect to username identified by password using 'tnsname';
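To sanity-check the dblink before exporting (my own suggestion, not in the original), a quick query over the link is enough:

```sql
-- If the link is good, this returns the remote instance's date.
select sysdate from dual@gf_local;
```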

2. Create a directory object in the local database.

create directory expdp_dir as '/home/oracle11g/expdp_dir/';
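As the documentation quoted above notes, a non-privileged user also needs to be granted access to the directory object. A sketch, assuming the local account running expdp is not the user who created the directory:

```sql
-- Run as a DBA in the local database; 'username' is the local
-- account that will run expdp.
grant read, write on directory expdp_dir to username;
```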

3. Create a data pump export parameter file in the local database.

vi exp.par:

userid=username/password // username and password of the local database; this user can use the dblink and directory created above.

directory=expdp_dir // directory object created in the local database.

dumpfile=gf_expdp.dump // dump file name.

logfile=gf_expdp.log // log file name.

tables=depkf15.t_cp // table to be exported.

query='"where rownum < 1000001"' // export condition: the first 1 million rows.

network_link=gf_local // dblink name.

Note: one small point worth mentioning is tables=depkf15.t_cp. If the table to be exported belongs to a schema other than the login user, the schema (user) that owns the exported object must be prefixed here; otherwise an error is reported.

4. Execute the Export command.

expdp parfile=exp.par

Running it reports errors:

ORA-31631: privileges are required

ORA-39149: cannot link privileged user to non-privileged user

In this case, grant the exp_full_database privilege to the user in the remote database:

GRANT exp_full_database TO username;

Then just wait. The dump file will land in the path that the local directory object expdp_dir points to.
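The original stops at the export; the matching import step might look like the following sketch. The parameter file name imp.par and the remap to a local schema localuser are my assumptions, not part of the original:

```
imp.par:

userid=username/password // local database username and password.

directory=expdp_dir // same directory object that holds the dump file.

dumpfile=gf_expdp.dump // the dump file produced above.

logfile=gf_impdp.log // import log file name.

tables=depkf15.t_cp // table to import.

remap_schema=depkf15:localuser // optional: import into a different local schema (hypothetical name).
```

Then run: impdp parfile=imp.par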

Summary:

1. expdp/impdp is the Data Pump tool provided since Oracle 10g. It calls two PL/SQL packages while it runs. Presumably you can also drive those two PL/SQL packages directly; I have not tried it, and if you have, please share.

2. The importing database account needs the imp_full_database privilege, and the exporting database account needs the exp_full_database privilege.

3. expdp is more capable than exp. For example, help=y shows that expdp has more optional parameters, among them a COMPRESSION parameter described as follows:

Reduce the size of a dump file.

Valid keyword values are: ALL, DATA_ONLY, [METADATA_ONLY] and NONE.
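For example, the parameter can simply be appended to the command line. A sketch only: note that the ALL and DATA_ONLY values are 11g features and, as far as I know, require the Advanced Compression option:

```
expdp parfile=exp.par compression=metadata_only
```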

Refer to secooler's article () for a comparison of the compression ratios of the four keyword values.

4. In my experiment, the network between the remote and local databases was poor, so in practice both exp and expdp were rather slow; since a dblink is involved, I did not dig further. The point here is only to show how expdp/impdp can export from a remote database.

In a colleague's test, exporting a MB file took about one hour with exp, while expdp took less than 10 minutes. Note that the COMPRESSION parameter was not used here.

Some say this is determined by the export mechanisms of exp and expdp. exp loads data with SELECT statements through the buffer cache and then sends it to the export client, which writes the dump file. expdp uses direct-path reads: data goes straight from disk into the PGA and then to the export client, which writes the dump file. Skipping the buffer cache accounts, to some extent, for the faster export.

5. Some people online claim that "expdp/impdp is a server-side program and only disk I/O matters." From the introduction above, that is not entirely accurate: expdp/impdp are client tools that call two PL/SQL packages on the database server during execution. The experiment above also ran remotely through a dblink rather than on the server itself.
