Oracle log mining

Oracle log mining with LogMiner (dbms_logmnr). Overview of the steps:

① Set the utl_file_dir path in the parameter file

② Restart the instance

③ Create the data dictionary file with dbms_logmnr_d.build

④ Add log files with dbms_logmnr.add_logfile

or remove a file from the list with the dbms_logmnr.removefile option

⑤ Start LogMiner with dbms_logmnr.start_logmnr

⑥ Query v$logmnr_contents (sql_redo / sql_undo)

Set the utl_file_dir directory.

Alter system set utl_file_dir = 'd:\app\Leo\utl_file_dir' scope = spfile;
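utl_file_dir is a static parameter, so the new value only takes effect after the instance is restarted. A minimal restart sequence from SQL*Plus as SYSDBA (not spelled out in the original steps) would be:

Shutdown immediate;

Startup;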

Create a data dictionary for log mining.

Execute dbms_logmnr_d.build('shwdict.ora', 'd:\app\Leo\utl_file_dir'); -- the second parameter must match the utl_file_dir setting

If this package is not available, run @d:\oracle\ora8\rdbms\admin\dbmslmd.sql;
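To check whether the package is already installed before running the script, a generic data-dictionary query (not part of the original steps) can be used:

Select object_name, object_type, status from dba_objects where object_name = 'DBMS_LOGMNR_D';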

Create the list of files to analyze (these can be online redo logs or archived logs)

Execute dbms_logmnr.add_logfile('d:\app\Leo\archivelog1\ARC00011_0720116156.001', dbms_logmnr.new);

Execute dbms_logmnr.add_logfile('d:\app\Leo\archivelog1\ARC00012_0720116156.001', dbms_logmnr.addfile);

Execute dbms_logmnr.add_logfile('d:\app\Leo\archivelog1\ARC00013_0720116156.001', dbms_logmnr.addfile);

Options: new creates a new log list; addfile adds a file to the existing list; removefile removes a file from the list.
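As an illustration of the remove option, the first file added above could be dropped from the list again like this (same file name, only the option changes):

Execute dbms_logmnr.add_logfile('d:\app\Leo\archivelog1\ARC00011_0720116156.001', dbms_logmnr.removefile);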

You can view the added log list in the view v$logmnr_logs.

Select log_id, filename from v$logmnr_logs;

Start Analysis

Execute dbms_logmnr.start_logmnr(dictfilename => 'd:\app\Leo\utl_file_dir\shwdict.ora');

Execute dbms_logmnr.start_logmnr(dictfilename => 'd:\oracle\shwdict.ora', startscn => xxxxxx, endscn => xxxxx, starttime => to_date('2017 12:15:00', 'yyyymmdd hh24:mi:ss'), endtime => to_date('2014 15:40:30', 'yyyymmdd hh24:mi:ss'));
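The startscn/endscn and starttime/endtime values must lie within the range covered by the added logs. One way to look up that range is to query the standard range columns of v$logmnr_logs (a sketch, not part of the original steps):

Select filename, low_scn, next_scn, low_time, high_time from v$logmnr_logs;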

If no dictionary file has been built, LogMiner can be started with the dbms_logmnr.dict_from_online_catalog option, which tells it to read the data dictionary from the online catalog of the current database instead of from a file built by dbms_logmnr_d.build.

Exec sys.dbms_logmnr.start_logmnr(options => sys.dbms_logmnr.dict_from_online_catalog);
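Options can be combined with +. For example, assuming you also want only committed transactions returned, the standard committed_data_only option can be added:

Exec sys.dbms_logmnr.start_logmnr(options => sys.dbms_logmnr.dict_from_online_catalog + sys.dbms_logmnr.committed_data_only);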

Query analysis content

Select to_char(timestamp, 'yyyy-mm-dd hh24:mi:ss') time, sql_redo from v$logmnr_contents where seg_owner = 'SCOTT';
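sql_undo holds the statement that reverses each change, which is useful for undoing accidental DML. A sketch that also filters by table (the table name EMP is only an example):

Select scn, operation, sql_redo, sql_undo from v$logmnr_contents where seg_owner = 'SCOTT' and table_name = 'EMP';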

End Analysis

Execute dbms_logmnr.end_logmnr;

Some changes may not be recorded in the redo logs (and therefore cannot be mined); check the following:

① Put the database into force logging mode

Alter database force logging;

Select force_logging from v$database;
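If force logging is only needed while mining, it can be switched back off afterwards:

Alter database no force logging;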

② Check whether logging is enabled for the table

Select table_name, logging from user_tables;
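If a table shows NOLOGGING here, it can be switched back to logging mode (the table name scott.emp is only an example):

Alter table scott.emp logging;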

③ Check whether supplemental logging is enabled for the database (with supplemental logging, the primary key / unique index column values are also written to the redo for every update, so the changed rows can be identified)

SQL> select supplemental_log_data_pk, supplemental_log_data_ui from v$database;

SUPPLEMENTAL_LOG_DATA_PK SUPPLEMENTAL_LOG_DATA_UI
------------------------ ------------------------
NO                       NO

SQL> alter database add supplemental log data (primary key, unique index) columns;

Database altered

SQL> select supplemental_log_data_pk, supplemental_log_data_ui from v$database;

SUPPLEMENTAL_LOG_DATA_PK SUPPLEMENTAL_LOG_DATA_UI
------------------------ ------------------------
YES                      YES
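Database-level minimal supplemental logging can be checked and enabled in the same way (supplemental_log_data_min is a standard v$database column); LogMiner generally needs at least this minimal level:

Select supplemental_log_data_min from v$database;

Alter database add supplemental log data;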
