Understanding and Using Oracle's Log Analysis Tool: LogMiner

This article is a repost; the original author is unknown. If you are the author, please contact me.

Oracle LogMiner is a very practical analysis tool that Oracle has shipped with its products since Oracle 8i. With it you can easily extract the specific contents of Oracle redo log files (including archived log files). In particular, the tool can reconstruct all DML operations (INSERT, UPDATE, DELETE, etc.) performed against the database, and it can also produce the SQL statements needed to undo them. This makes it especially suitable for debugging, auditing, or rolling back a particular transaction.

The LogMiner analysis tool is actually a set of PL/SQL packages and dynamic views (built into Oracle 8i), and as part of the Oracle 8i product it is a completely free tool. However, it seems somewhat more complicated to use than other Oracle built-in tools, mainly because it does not provide any graphical user interface (GUI). This article explains in detail how to install and use the tool.

I. Uses of LogMiner

Log files store all the data needed for database recovery and record every change made against the database structure, in other words, all DML statements executed against the database.

Prior to Oracle 8i, Oracle provided no tool to help database administrators read and interpret the contents of redo log files. When a system problem occurred, all an ordinary DBA could do was package up all the log files, send them to Oracle technical support, and quietly wait for Oracle's final answer. Since 8i, however, Oracle has provided this powerful tool: LogMiner.

The LogMiner tool can analyze log files online or offline; it can analyze its own database's redo log files, and it can also be used to analyze the redo log files of other databases.

In general, the LogMiner tool is mainly used to:

1. Track database changes: track changes offline, without affecting the performance of the online system.

2. Roll back database changes: roll back specific changes to data, reducing the need for point-in-time recovery.

3. Tune and plan capacity: analyze the data in the log files to measure data growth.

II. Installing LogMiner

To install the LogMiner tool, you must first run the following two scripts:

1. $ORACLE_HOME/rdbms/admin/dbmslm.sql

2. $ORACLE_HOME/rdbms/admin/dbmslmd.sql

Both scripts must be run as the SYS user. The first script creates the DBMS_LOGMNR package, which is used to analyze log files. The second creates the DBMS_LOGMNR_D package, which is used to create data dictionary files.
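The two installation steps can be sketched in a SQL*Plus session as follows (a minimal sketch; the connection line is a placeholder you must adapt to your own system):

```sql
-- Connect as SYS (SYSDBA privileges are required)
-- CONNECT sys/your_password AS SYSDBA

-- Create the DBMS_LOGMNR package (log analysis)
@$ORACLE_HOME/rdbms/admin/dbmslm.sql

-- Create the DBMS_LOGMNR_D package (dictionary file creation)
@$ORACLE_HOME/rdbms/admin/dbmslmd.sql
```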

III. Using LogMiner

The following sections explain in detail how to use the LogMiner tool.

1. Creating a data dictionary file

As already mentioned, the LogMiner tool actually consists of two PL/SQL built-in packages (DBMS_LOGMNR and DBMS_LOGMNR_D) and four V$ dynamic performance views (created when LogMiner is started with DBMS_LOGMNR.START_LOGMNR). Before using LogMiner to analyze redo log files, you can use the DBMS_LOGMNR_D package to export the data dictionary as a text file. The dictionary file is optional, but without it, the parts of the statements LogMiner produces that relate to the data dictionary (such as table names and column names), as well as the values, will appear in internal hexadecimal form and will not be directly readable. For example, consider the following SQL statement:

INSERT INTO dm_dj_swry (rydm, rymc) VALUES (00005, 'Joe Smith');

Without a dictionary file, LogMiner would render it as something like this:

insert into Object#308(col#1, col#2) values (hextoraw('c30rte567e436'), hextoraw('4a6f686e20446f65'));

The purpose of creating a data dictionary file is to let LogMiner refer to internal objects by their actual names rather than by the system's internal hexadecimal representation. The data dictionary file is a text file, created with the DBMS_LOGMNR_D package. If the tables of the database being analyzed have changed, so that the database's data dictionary has changed, the dictionary file must be recreated. Likewise, to analyze the redo log files of another database, the data dictionary file must be regenerated from that database.

First, in the init.ora initialization parameter file, specify the location of the data dictionary file by adding the UTL_FILE_DIR parameter, whose value is the server directory in which the data dictionary file will be placed. For example:

UTL_FILE_DIR = (e:\Oracle\logs)

Restart the database so that the new parameter takes effect, then create the data dictionary file:

SQL> EXECUTE dbms_logmnr_d.build(
dictionary_filename => 'v816dict.ora',
dictionary_location => 'e:\oracle\logs');

2. Creating the list of log files to analyze

Oracle redo logs come in two forms: online redo log files and offline (archived) log files. The following discusses how to build the analysis list for each of these two kinds of log file.

(1) Analyzing online redo log files

A. Create the list:

SQL> EXECUTE dbms_logmnr.add_logfile(
LogFileName => 'e:\Oracle\oradata\sxf\redo01.log',
Options => dbms_logmnr.new);

B. Add further log files to the list:

SQL> EXECUTE dbms_logmnr.add_logfile(
LogFileName => 'e:\Oracle\oradata\sxf\redo02.log',
Options => dbms_logmnr.addfile);

(2) Analyzing offline (archived) log files

A. Create the list:

SQL> EXECUTE dbms_logmnr.add_logfile(
LogFileName => 'E:\Oracle\oradata\sxf\archive\ARCARC09108.001',
Options => dbms_logmnr.new);

B. Add further log files to the list:

SQL> EXECUTE dbms_logmnr.add_logfile(
LogFileName => 'E:\Oracle\oradata\sxf\archive\ARCARC09109.001',
Options => dbms_logmnr.addfile);

How many log files to put in the list is entirely up to you, but the best practice here is to add one file at a time, analyze it, and then add the next file after the analysis is finished.

Corresponding to adding files to the list, the option dbms_logmnr.removefile can be used to remove a log file from the list. The following example removes the file e:\Oracle\oradata\sxf\redo02.log that was added above.

SQL> EXECUTE dbms_logmnr.add_logfile(
LogFileName => 'e:\Oracle\oradata\sxf\redo02.log',
Options => dbms_logmnr.removefile);

With the list of log files to analyze in place, the analysis itself can begin.
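Before starting the analysis, you can check which files are currently registered in the list by querying the V$LOGMNR_LOGS dynamic view. A minimal sketch:

```sql
-- List the log files currently registered for LogMiner analysis
SELECT log_id, filename
FROM v$logmnr_logs;
```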

3. Running the LogMiner analysis

(1) Without constraints:

SQL> EXECUTE dbms_logmnr.start_logmnr(
DictFileName => 'e:\oracle\logs\v816dict.ora');

(2) With constraints:

Through several optional parameters of the DBMS_LOGMNR.START_LOGMNR procedure (see Table 1 for their meanings), you can narrow the range of log data to be analyzed. By setting the start-time and end-time parameters, we can restrict the analysis to logs from a particular time range. In the following example, we analyze only the logs from September 18, 2001:

SQL> EXECUTE dbms_logmnr.start_logmnr(
DictFileName => 'e:\oracle\logs\v816dict.ora',
StartTime => to_date('2001-9-18 00:00:00', 'YYYY-MM-DD HH24:MI:SS'),
EndTime => to_date('2001-9-18 23:59:59', 'YYYY-MM-DD HH24:MI:SS'));

You can also set a start SCN and an end SCN to limit the range of logs to analyze:

SQL> EXECUTE dbms_logmnr.start_logmnr(
DictFileName => 'e:\oracle\logs\v816dict.ora',
StartScn => 20,
EndScn => 50);

Table 1: Parameters of the DBMS_LOGMNR.START_LOGMNR procedure

Parameter      Type            Default     Meaning
StartScn       Number          0           Analyze only the part of the redo log whose SCN ≥ StartScn
EndScn         Number          0           Analyze only the part of the redo log whose SCN ≤ EndScn
StartTime      Date            1998-01-01  Analyze only the part of the redo log whose timestamp ≥ StartTime
EndTime        Date            2988-01-01  Analyze only the part of the redo log whose timestamp ≤ EndTime
DictFileName   VARCHAR2                    Dictionary file containing a snapshot of the database catalog; with it, the results are readable text rather than the system's internal hexadecimal form
Options        BINARY_INTEGER  0           System debugging parameter, rarely used in practice

4. Viewing the analysis results (v$logmnr_contents)

At this point, we have obtained the analyzed contents of the redo log files. The dynamic performance view v$logmnr_contents contains all the information LogMiner has extracted.

SELECT sql_redo FROM v$logmnr_contents;

If we only want to know what a particular user did to a particular table, we can use a query like the following, which retrieves everything the user DB_ZGXT did to the table SB_DJJL:

SQL> SELECT sql_redo FROM v$logmnr_contents WHERE username = 'DB_ZGXT' AND seg_name = 'SB_DJJL';
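The view also has a SQL_UNDO column, which pairs each statement with the statement that reverses it; this is what supports the rollback use case mentioned earlier. A minimal sketch, reusing the same example user and table:

```sql
-- Retrieve the undo statements needed to reverse changes made
-- by user DB_ZGXT to the table SB_DJJL
SELECT sql_undo
FROM v$logmnr_contents
WHERE username = 'DB_ZGXT'
  AND seg_name = 'SB_DJJL';
```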

It must be stressed that the analysis results in the view v$logmnr_contents exist only for the lifetime of the session in which we ran the procedure dbms_logmnr.start_logmnr. This is because LogMiner keeps everything in PGA memory: other processes cannot see it, and when the session ends, the results disappear.

Finally, terminate the log analysis with the procedure DBMS_LOGMNR.END_LOGMNR; at that point the PGA memory area is cleared and the analysis results cease to exist.
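Because the results vanish with the session, a common workaround (a sketch, not part of LogMiner itself; the table name logmnr_saved is a placeholder) is to copy them into a permanent table before ending the session:

```sql
-- Persist the analysis results before ending the LogMiner session
CREATE TABLE logmnr_saved AS
  SELECT * FROM v$logmnr_contents;

-- Now it is safe to end the session; the copy survives
EXECUTE dbms_logmnr.end_logmnr;
```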

IV. Other Notes

We can use LogMiner to analyze redo log files generated by other database instances, not just the redo log files of the instance where LogMiner is installed. When using LogMiner to analyze other database instances, there are a few points to note:

1. LogMiner must use the dictionary file created by the database instance being analyzed, not a dictionary file generated by the database where LogMiner is installed; in addition, the character set of the database where LogMiner runs must be the same as that of the analyzed database.

2. The platform of the analyzed database must be the same as the platform of the database where LogMiner is running. That is, if the files to be analyzed were produced by Oracle 8i running on a UNIX platform, then LogMiner must also be run in an Oracle instance on a UNIX platform, not on, say, Microsoft NT. The hardware, however, does not need to be identical.

3. LogMiner can only analyze log files from Oracle 8 onward; for products before Oracle 8, the tool can do nothing.

V. Conclusion

For database administrators (DBAs), LogMiner is a very powerful tool, and one often used in daily work; with it, you can obtain a great deal of information about activity in the database. One of its most important uses is to undo specific changes to the database without restoring the whole database. The tool can also be used to monitor or audit user activity, for example to see who changed particular data and what state the data was in before the modification. It can analyze the redo log files generated by any Oracle release from 8 onward. In short, this is a highly effective tool for database administrators, and a thorough understanding and mastery of it is very helpful in every DBA's practical work.
Tags: database structure, oracle database, database operations, rollback, interface gui, graphical user interface, pl sql, analysis tool, sql package, dynamic view, log analysis tools, article details, database release, database administrators, structure changes, oracle 8i, oracle corporation, database recovery, final answer, tool system
Category: Database
Date: 2010-04-29

