Oracle 11g Logminer Redo Log Analysis Practice


1. Installing Logminer

Run the following two scripts as a DBA user; a quick way to verify that the packages were created is shown after the list.

  • dbmslm.sql creates the DBMS_LOGMNR package, which is used to parse log files.
SQL> @$ORACLE_HOME/rdbms/admin/dbmslm.sql
Package created.
Grant succeeded.
Synonym created.
  • dbmslmd.sql creates the DBMS_LOGMNR_D package, which is used to build a data dictionary file.
SQL> @$ORACLE_HOME/rdbms/admin/dbmslmd.sql
Package created.
Synonym created.
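
To confirm that both packages were installed, a query like the one below against DBA_OBJECTS should show them with a VALID status. The query is an added illustration, not part of the original installation steps:

SQL> SELECT object_name, object_type, status FROM dba_objects WHERE object_name IN ('DBMS_LOGMNR', 'DBMS_LOGMNR_D') AND object_type = 'PACKAGE';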


2. Setting the parameter Utl_file_dir

Set the parameter utl_file_dir to the directory on the server where the data dictionary file will be placed.

SQL> CREATE DIRECTORY utlfile AS '/home/oracle/logminer';
Directory created.
SQL> ALTER SYSTEM SET utl_file_dir='/home/oracle/logminer' SCOPE=spfile;
System altered.

Restart the database for the settings above to take effect.

SQL> shutdown immediate;
SQL> startup;
SQL> show parameters utl_file_dir;

3. Enable supplemental logging

Supplemental logging is not strictly required for Logminer to parse logs, but if it is not enabled, much of the information in the analysis result set is displayed as "UNKNOWN". The following enables minimal supplemental logging.

SQL> ALTER DATABASE ADD SUPPLEMENTAL LOG DATA;
Database altered.
SQL> SELECT supplemental_log_data_min FROM v$database;

SUPPLEME
--------
YES

4. Create a dedicated Logminer user

This is not required, but a dedicated account is safer and more convenient to manage; a minimal sketch is shown below.
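
The sketch below shows what such a user could look like. The user name and password are placeholders, and the grants are the commonly cited minimum for 11g (EXECUTE_CATALOG_ROLE covers the DBMS_LOGMNR packages, SELECT ANY TRANSACTION allows querying V$LOGMNR_CONTENTS); adjust them to your own security policy:

SQL> CREATE USER logmnr_usr IDENTIFIED BY logmnr_pwd;
SQL> GRANT CREATE SESSION TO logmnr_usr;
SQL> GRANT EXECUTE_CATALOG_ROLE TO logmnr_usr;
SQL> GRANT SELECT ANY TRANSACTION TO logmnr_usr;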

5. Create a data dictionary

SQL> EXECUTE dbms_logmnr_d.build(dictionary_filename => 'dictionary.ora', dictionary_location => '/home/oracle/logminer');
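
To confirm that the dictionary file was actually written, it can be listed from the operating system; the SQL*Plus HOST command below is just an added convenience check:

SQL> HOST ls -l /home/oracle/logminer/dictionary.ora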

6. Add an online log or archive log that needs to be analyzed

SQL> EXECUTE DBMS_LOGMNR.ADD_LOGFILE(LogFileName => '/home/oracle/flash_recovery_area/primary/archivelog/2016_04_25/o1_mf_1_199_ckv04o6w_.arc', OPTIONS => DBMS_LOGMNR.NEW);
PL/SQL procedure successfully completed.
SQL> EXECUTE DBMS_LOGMNR.ADD_LOGFILE(LogFileName => '/home/oracle/flash_recovery_area/primary/archivelog/2016_04_25/o1_mf_1_201_ckv08jyp_.arc', OPTIONS => DBMS_LOGMNR.ADDFILE);
PL/SQL procedure successfully completed.
SQL> EXECUTE DBMS_LOGMNR.ADD_LOGFILE(LogFileName => '/home/oracle/flash_recovery_area/primary/archivelog/2016_04_25/o1_mf_1_200_ckv05pmp_.arc', OPTIONS => DBMS_LOGMNR.ADDFILE);
PL/SQL procedure successfully completed.
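
The files registered with the current Logminer session can be checked by querying V$LOGMNR_LOGS; this verification is an added illustration rather than part of the original procedure:

SQL> SELECT log_id, filename FROM v$logmnr_logs;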

7. Analyzing archived log files using a dictionary

SQL> EXECUTE DBMS_LOGMNR.START_LOGMNR(dictfilename => '/home/oracle/logminer/dictionary.ora');
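
After START_LOGMNR completes, the parsed records of the added archive logs are exposed through V$LOGMNR_CONTENTS. The query below is an illustrative example; the SEG_OWNER filter 'NN' is borrowed from the next section and should be replaced with the schema you are interested in:

SQL> SELECT scn, timestamp, operation, sql_redo FROM v$logmnr_contents WHERE seg_owner = 'NN';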

8. Online Log Analysis Example

    • Inserting data

SQL> INSERT INTO NN.T1 VALUES (+, 'succ');
1 row created.
SQL> commit;
Commit complete.

    • Querying the current log sequence number

SQL> SELECT group#, sequence#, status, first_change#, first_time FROM v$log ORDER BY first_change#;

    GROUP#  SEQUENCE# STATUS           FIRST_CHANGE# FIRST_TIM
---------- ---------- ---------------- ------------- ---------
         1        208 INACTIVE               2825768 25-APR-16
         2        209 INACTIVE               2825872 25-APR-16
         3        210 CURRENT                2845771 25-APR-16

    • Add the current log group

SQL> EXECUTE DBMS_LOGMNR.ADD_LOGFILE(LogFileName => '/home/oracle/oradata/orcl/redo03.log', OPTIONS => DBMS_LOGMNR.ADDFILE);
PL/SQL procedure successfully completed.

    • Using Logminer for analysis

SQL> EXECUTE dbms_logmnr.start_logmnr(dictfilename => '/home/oracle/logminer/dictionary.ora');
PL/SQL procedure successfully completed.

    • Query the analysis results

SQL> SELECT sql_redo, sql_undo, seg_owner FROM v$logmnr_contents WHERE seg_name = 'T1';

SQL_REDO                                   SQL_UNDO                                    SEG_OWNER
------------------------------------------ ------------------------------------------- ---------
insert into "NN"."T1"("TID","TNAME")       delete from "NN"."T1" where "TID" = '"'     NN
values ('"','succ');                       and "TNAME" = 'succ' and ROWID =
                                           'Aaar7yaaeaaaacraad';
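
When the mined log covers a long period, the analysis window can be narrowed at start time: DBMS_LOGMNR.START_LOGMNR also accepts STARTTIME/ENDTIME (or STARTSCN/ENDSCN) parameters in addition to DICTFILENAME. The time values below are only placeholders:

SQL> EXECUTE DBMS_LOGMNR.START_LOGMNR(dictfilename => '/home/oracle/logminer/dictionary.ora', starttime => TO_DATE('2016-04-25 09:00:00', 'YYYY-MM-DD HH24:MI:SS'), endtime => TO_DATE('2016-04-25 10:00:00', 'YYYY-MM-DD HH24:MI:SS'));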


9. Close Logminer Session

SQL> EXECUTE DBMS_LOGMNR.END_LOGMNR;


