Oracle Data Pump tool series: Data Pump permission configuration details

Source: Internet
Author: User

1. Data Pump concepts

Oracle Data Pump technology enables fast movement of data and metadata from one database to another. Oracle Data Pump consists of three distinct parts:

1. The command-line clients expdp and impdp
2. The DBMS_DATAPUMP PL/SQL package (also known as the Data Pump API)
3. The DBMS_METADATA PL/SQL package (also known as the Metadata API)

Unlike the traditional client-based Export/Import utilities, Data Pump is a server-based utility, and its dump files are not compatible with those produced by Export/Import. Data Pump can also use the NETWORK_LINK parameter to move data and metadata directly between two databases over a database link, without an intermediate dump file.

2. Installation, configuration, and required privileges

Certain privileges are required to run Data Pump. In Oracle 10g and later, the minimum privileges required to run an Export DataPump or Import DataPump job are:

- The CREATE SESSION system privilege (or the CONNECT role)
- The CREATE TABLE system privilege
- READ and WRITE object privileges on a valid directory object (one created beforehand by a user with the CREATE DIRECTORY privilege)
- Sufficient quota on the user's default tablespace (the master table used by the DataPump job is created there)

In addition, the EXP_FULL_DATABASE role is required to perform the following tasks:

- Run a full-database Export DataPump job
- Run a transportable-tablespace Export DataPump job
- Run an Export DataPump job with the TRACE parameter
- Run an Export DataPump job that exports another schema

Note that these privilege requirements apply to the user who connects to the database to run the Export DataPump or Import DataPump job, not to the users being exported or imported. The privileges can be granted explicitly or through roles.
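Once these privileges are in place, a job can be started from the operating system prompt. The commands below are a minimal sketch: the directory object, dump file names, and the scott/tiger credentials are illustrative, and the database link name source_db_link is an assumption for the network-mode case.

```shell
# Schema-level export as scott, writing the dump and log files to MY_DIR
expdp scott/tiger DIRECTORY=my_dir DUMPFILE=scott.dmp LOGFILE=scott_exp.log SCHEMAS=scott

# Network-mode import over a database link, with no intermediate dump file
# (assumes a database link named source_db_link exists on the target database)
impdp scott/tiger DIRECTORY=my_dir NETWORK_LINK=source_db_link SCHEMAS=scott LOGFILE=scott_imp.log
```

In network mode only a LOGFILE is written to the directory object; no DUMPFILE parameter is given, because the data travels directly over the link.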
The privileges can be set up as follows:

CONNECT system/manager
CREATE DIRECTORY my_dir AS 'full_pre_existing_directory_path_here';
GRANT create session, create table TO scott identified by tiger;
GRANT read, write ON DIRECTORY my_dir TO scott;
ALTER USER scott QUOTA unlimited ON users;

Or, using a role:

CONNECT system/manager
CREATE DIRECTORY my_dir AS 'full_pre_existing_directory_path_here';
CREATE ROLE expdp_role;
GRANT create session, create table TO expdp_role;
GRANT read, write ON DIRECTORY my_dir TO expdp_role;
GRANT expdp_role TO scott;
ALTER USER scott DEFAULT ROLE all;
ALTER USER scott QUOTA unlimited ON users;

Privileged users: a privileged user is one who has the EXP_FULL_DATABASE role when running an Export DataPump job, the IMP_FULL_DATABASE role when running an Import DataPump job, or the DBA role, which includes both:

SET lines 80
COL privilege FOR a40
SELECT grantee, granted_role, default_role
  FROM dba_role_privs
 WHERE granted_role IN ('DBA', 'EXP_FULL_DATABASE', 'IMP_FULL_DATABASE')
 ORDER BY 1, 2;

GRANTEE              GRANTED_ROLE                   DEF
-------------------- ------------------------------ ---
DBA                  EXP_FULL_DATABASE              YES
DBA                  IMP_FULL_DATABASE              YES
SCOTT                EXP_FULL_DATABASE              YES
SCOTT                IMP_FULL_DATABASE              YES
SYS                  DBA                            YES
SYS                  EXP_FULL_DATABASE              YES
SYS                  IMP_FULL_DATABASE              YES
SYSMAN               DBA                            YES
SYSTEM               DBA                            YES

You can use the DataPump LOGFILE parameter to write all messages about the ongoing operation, including any errors, to a log file, so check the Data Pump log file first when troubleshooting. You can also specify the undocumented METRICS=Y parameter to record additional information about the number of objects processed and the time taken. An error message in the log file does not necessarily mean that the Data Pump job failed; some messages are only warnings or notices, for example that an object already exists and was skipped. In that case you can adjust the Import DataPump parameters to re-create those objects or to append data to the existing objects.
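The logging and "object already exists" behavior described above maps onto a few command-line parameters. The file names below are illustrative; METRICS is undocumented but widely used, and TABLE_EXISTS_ACTION is the standard Import DataPump parameter for handling pre-existing tables.

```shell
# Record per-object counts and timing information in the export log
expdp scott/tiger DIRECTORY=my_dir DUMPFILE=scott.dmp LOGFILE=scott_exp.log METRICS=Y

# On import, append rows to tables that already exist instead of skipping them
# (other values: SKIP, TRUNCATE, REPLACE)
impdp scott/tiger DIRECTORY=my_dir DUMPFILE=scott.dmp LOGFILE=scott_imp.log TABLE_EXISTS_ACTION=APPEND
```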
3. Practical SQL scripts related to DataPump job privileges

-- Check whether the user who connects to the database to run the DataPump job
-- has the CREATE SESSION and CREATE TABLE privileges
SET lines 80 pages 50
COL privilege FOR a40
SELECT grantee, privilege
  FROM dba_sys_privs
 WHERE (grantee IN ('SCOTT', 'PUBLIC')
    OR grantee IN (SELECT granted_role FROM dba_role_privs
                    WHERE grantee IN ('SCOTT', 'PUBLIC')))
   AND privilege IN ('CREATE SESSION', 'CREATE TABLE')
 ORDER BY 1, 2;

GRANTEE              PRIVILEGE
-------------------- ----------------------------------------
SCOTT                CREATE SESSION
SCOTT                CREATE TABLE

-- View the roles granted to the user running the DataPump job,
-- and whether they are default roles
SET lines 80
SELECT grantee, granted_role, default_role
  FROM dba_role_privs
 WHERE grantee IN ('SCOTT', 'PUBLIC')
 ORDER BY 1, 2;

GRANTEE              GRANTED_ROLE                   DEF
-------------------- ------------------------------ ---
SCOTT                EXP_FULL_DATABASE              YES
SCOTT                IMP_FULL_DATABASE              YES

-- Query the directory privileges granted to the user running the DataPump job
SET lines 100
COL privilege FOR a10
COL grantee FOR a20
COL owner FOR a20
SELECT p.grantee, p.privilege, p.owner, d.directory_name
  FROM dba_tab_privs p, dba_directories d
 WHERE p.table_name = d.directory_name
   AND (grantee IN ('SCOTT', 'PUBLIC')
    OR grantee IN (SELECT granted_role FROM dba_role_privs
                    WHERE grantee IN ('SCOTT', 'PUBLIC')))
 ORDER BY 1, 2;

GRANTEE              PRIVILEGE  OWNER                DIRECTORY_NAME
-------------------- ---------- -------------------- ------------------------------
SCOTT                READ       SYS                  MY_DIR
SCOTT                WRITE      SYS                  MY_DIR

-- View the default tablespace of the user running the DataPump job
SET lines 80
SELECT username, default_tablespace
  FROM dba_users
 WHERE username IN ('SCOTT');

USERNAME                       DEFAULT_TABLESPACE
------------------------------ ------------------------------
SCOTT                          USERS

-- View the tablespace quotas of the user running the DataPump job
SET lines 100 numwidth 12
SELECT q.username, q.tablespace_name, q.bytes, q.max_bytes
  FROM dba_ts_quotas q, dba_users u
 WHERE q.username = u.username
   AND q.username IN ('SCOTT');

USERNAME             TABLESPACE_NAME                       BYTES    MAX_BYTES
-------------------- ------------------------------ ------------ ------------
SCOTT                USERS                                393216           -1

A MAX_BYTES value of -1 means the quota on that tablespace is unlimited.
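As a quick end-to-end check that does not require access to the DBA views, the job owner can also verify the effective privileges from their own session before starting the job. This is a minimal sketch; the scott/tiger account is illustrative.

```sql
-- Run as the user who will own the DataPump job
CONNECT scott/tiger

-- System privileges active in this session: expect CREATE SESSION and CREATE TABLE
SELECT privilege FROM session_privs
 WHERE privilege IN ('CREATE SESSION', 'CREATE TABLE');

-- Roles active in this session: EXP_FULL_DATABASE / IMP_FULL_DATABASE
-- appear here only for privileged users
SELECT role FROM session_roles;

-- Object privileges received by this user, including READ/WRITE on directory objects
SELECT owner, table_name, privilege FROM user_tab_privs_recd;
```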
