Because the Db2 log records all data changes, it is well suited as a basis for producing audit reports. Exploiting the log means that established applications need not be modified or extended, and the well-known operational drawbacks of triggers are avoided. Compared with building a costly mechanism that continuously extracts and supervises every data change, ongoing log analysis of the tables in question may therefore be the better solution.
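For illustration, the trigger-based alternative the text argues against might look roughly like the sketch below; the table and column names are purely hypothetical, and every audited table would need such a trigger for each change type, which is exactly the kind of application supplement that log analysis makes unnecessary:

    -- Hypothetical audit trigger: copies the before/after image of each
    -- update into a separate audit table (names are illustrative only).
    CREATE TRIGGER ORDERS_UPD_AUDIT
      AFTER UPDATE ON ORDERS
      REFERENCING OLD AS O NEW AS N
      FOR EACH ROW MODE DB2SQL
      INSERT INTO ORDERS_AUDIT
             (CHANGE_TS, CHANGE_USER, ORDER_ID, OLD_STATUS, NEW_STATUS)
      VALUES (CURRENT TIMESTAMP, CURRENT SQLID, N.ORDER_ID, O.STATUS, N.STATUS);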
As control data, a log analyzer or log extractor program needs at least the start and end timestamps of the observation period and the name of the object to be observed. This sounds simple enough, but in practice the devil is, as so often, in the details. Specifying and executing a process that continuously delivers the changes to a set of tables or databases takes some effort. It is preferable to deploy a tool with an online interface that supports the definition of the process and facilitates operation and housekeeping.
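As a rough sketch, this minimum control data could be kept in a small control table; the names and layout below are purely illustrative and do not reflect the interface of any particular tool:

    -- Hypothetical control table: one row per extraction run, holding the
    -- observation period and the object to be observed.
    CREATE TABLE AUDIT.LOG_EXTRACT_CTL
      (RUN_ID       INTEGER      NOT NULL,
       FROM_TS      TIMESTAMP    NOT NULL,    -- start of observation period
       TO_TS        TIMESTAMP    NOT NULL,    -- end of observation period
       OBJECT_NAME  VARCHAR(128) NOT NULL);   -- table or database to observe

    INSERT INTO AUDIT.LOG_EXTRACT_CTL
      VALUES (1, '2023-05-01-00.00.00.000000',
                 '2023-05-02-00.00.00.000000', 'PROD.ORDERS');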
The tool should offer a flexible way to specify the scope of the process, i.e. to select the set of objects to be observed, ideally by generic declaration of databases, tablespaces, or packages. For each monitored table a file is needed to hold the extracted information, formatted either as SQL or as LOAD input. Next, the target tables into which the detected changes are to be inserted or loaded must be created. Regularly scheduled execution of a process that monitors a growing set of objects must take into account naming conventions, possible restarts (dataset full), housekeeping (empty files), the appearance of new tables (when monitoring an entire database), structural changes to the tables involved, and so on. A log analysis tool that automates these tasks and, for example, automatically resumes at the end point of the previous run is well suited to typical data center requirements.
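For instance, a target table for the detected changes might mirror the columns of the source table and add a few audit attributes. The layout below is only a sketch under the assumption that the source table is PROD.ORDERS and that the extracted changes are applied with INSERT or LOAD; the audit columns are assumptions about what a log extractor can typically supply:

    -- Hypothetical target table: one row per logged change to PROD.ORDERS.
    CREATE TABLE AUDIT.ORDERS_CHANGES
      (LOG_TS       TIMESTAMP    NOT NULL,          -- when the change was logged
       CHANGE_TYPE  CHAR(1)      NOT NULL,          -- I(nsert), U(pdate), D(elete)
       UNIT_OF_WORK CHAR(12)     FOR BIT DATA,      -- unit-of-recovery id from the log
       CHANGED_BY   VARCHAR(128),                   -- authorization id, if available
       ORDER_ID     INTEGER      NOT NULL,          -- columns copied from the source table
       STATUS       CHAR(8),
       AMOUNT       DECIMAL(11,2));

A tool that resumes automatically would, in effect, record the end point of each run (for example the last processed log timestamp or RBA) and use it as the start point of the next run, so that no change is missed and none is extracted twice.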