
Exporting and Importing AWR snapshot data

  • Posted by Gavin Soorma
  • On July 25, 2009
  • 0 Comments
  • awr, awrextr, awrload, snapshot, dba_hist, wrh$

The AWR tables contain a wealth of important performance data which can be very useful in performance tuning, trend analysis, and also when comparing performance between two separate periods of time.

AWR data is stored in the WRH$ and DBA_HIST tables in the SYSAUX tablespace. There could be performance implications if these tables were to grow too large or if the retention was increased beyond the default of 7 days.

A good solution is to have a central repository database and periodically move AWR statistical data to it using the Oracle-supplied awrextr.sql and awrload.sql scripts, which can be found in the $ORACLE_HOME/rdbms/admin directory.

By default, AWR history is retained for 7 days and snapshots are gathered into the AWR repository tables every hour.

The current snapshot retention and gathering interval can be determined by the query shown below. Note that in this case the default settings of 7 days and 1 hour are displayed.

SQL> select to_char(snap_interval,'DD'),to_char(retention,'DD') FROM dba_hist_wr_control;

TO_CHAR(SNAP_INTER TO_CHAR(RETENTION,
------------------ ------------------
+00000 01:00:00.0  +00007 00:00:00.0
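The two columns above are of type INTERVAL DAY TO SECOND. As a sketch, the same settings can be displayed in plainer units with EXTRACT:

```sql
-- A sketch: show retention and snapshot interval in plain units
SELECT EXTRACT(DAY FROM retention)      AS retention_days,
       EXTRACT(HOUR FROM snap_interval) AS interval_hours
FROM   dba_hist_wr_control;
```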

The AWR default settings can be modified using the DBMS_WORKLOAD_REPOSITORY package as shown below. In this case the retention is being increased to 30 days and the interval to every 30 minutes.

BEGIN
  DBMS_WORKLOAD_REPOSITORY.modify_snapshot_settings(
    retention => 43200,  -- in minutes (30 days)
    interval  => 30);    -- in minutes
END;
/
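If a workload of interest falls between scheduled snapshots, the same package can also take a snapshot on demand (a minimal sketch):

```sql
-- Take an on-demand AWR snapshot outside the regular interval
BEGIN
  DBMS_WORKLOAD_REPOSITORY.create_snapshot;
END;
/
```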
Extracting AWR data

Create a directory

SQL> CREATE DIRECTORY AWR_DATA AS '/u01/oracle/';

Directory created.
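Before running the extract, we can confirm that the directory object resolves to the intended path, for example:

```sql
-- Verify the directory object just created
SELECT directory_name, directory_path
FROM   dba_directories
WHERE  directory_name = 'AWR_DATA';
```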

SQL> @?/rdbms/admin/awrextr.sql

The script will list the information we need to provide:

AWR EXTRACT
~~~~~~~~~~~~~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
~  This script will extract the AWR data for a range of snapshots  ~
~  into a dump file.  The script will prompt users for the         ~
~  following information:                                          ~
~     (1) database id                                              ~
~     (2) snapshot range to extract                                ~
~     (3) name of directory object                                 ~
~     (4) name of dump file                                        ~

After entering the range of snapshot IDs, we need to provide the directory object pointing to the location where the Data Pump export file will be written, followed by the dump file name.

Note: the script will automatically append a '.dmp' extension to the Data Pump export file name, so we just enter the dump file name without any extension.
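To choose a sensible snapshot range on the source database, the available snapshot IDs and their times can first be listed, for example:

```sql
-- List available snapshots to pick the begin/end snap ids
SELECT snap_id,
       TO_CHAR(begin_interval_time, 'DD-MON-YY HH24:MI') AS begin_time,
       TO_CHAR(end_interval_time,   'DD-MON-YY HH24:MI') AS end_time
FROM   dba_hist_snapshot
ORDER  BY snap_id;
```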

Specify the Directory Name
~~~~~~~~~~~~~~~~~~~~~~~~~~

Directory Name                 Directory Path
------------------------------ -------------------------------------------------
ADMIN_DIR                      /u01/oracle/product/10.2.0/rmand/md/admin
AWR_DATA                       /u01/oracle/
DATA_PUMP_DIR                  /u01/oracle/product/10.2.0/rmand/admin/rmand/dpdump/

ORACLE_OCM_CONFIG_DIR          /u01/oracle/product/10.2.0.4/rmand/ccr/state
WORK_DIR                       /u01/oracle/product/10.2.0/rmand/work

Choose a Directory Name from the above list (case-sensitive).

Enter value for directory_name: AWR_DATA

Using the dump directory: AWR_DATA

Specify the Name of the Extract Dump File
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The prefix for the default dump file name is awrdat_11369_11383.
To use this name, press <return> to continue, otherwise enter
an alternative.

Enter value for file_name: awrexp

After the export completes, we will need to transfer the Data Pump dump file awrexp.dmp (for example via FTP) to the target server where our central repository database is located.

Loading AWR Data

On the repository database, we create a directory AWR_DATA as well, ensuring that its DIRECTORY_PATH corresponds to the directory where the awrexp.dmp file is located.

Also, a temporary staging schema AWR_STAGE is created. Objects are first imported into this staging schema and then inserted into the WRH$ and DBA_HIST AWR historical tables.

We will need to provide the directory location, the dump file name, and the staging schema name.

@?/rdbms/admin/awrload.sql

AWR LOAD
~~~~~~~~~~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
~  This script will load the AWR data from a dump file. The   ~
~  script will prompt users for the following information:    ~
~     (1) name of directory object                            ~
~     (2) name of dump file                                   ~
~     (3) staging schema name to load AWR data into           ~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

When specifying the dump file name, we again provide only the file name without the .dmp extension.

Specify the Name of the Dump File to Load
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Please specify the prefix of the dump file (.dmp) to load:

We need to specify the default and temporary tablespaces for this staging user. Note: this user will be dropped once the load is completed.

Choose the AWR_STAGE user's default tablespace.  This is the
tablespace in which the AWR data will be staged.

TABLESPACE_NAME                CONTENTS  DEFAULT TABLESPACE
------------------------------ --------- ------------------
MGMT_ECM_DEPOT_TS              PERMANENT
MGMT_TABLESPACE                PERMANENT
PATROL                         PERMANENT
SYSAUX                         PERMANENT *
USERS                          PERMANENT

Pressing <return> will result in the recommended default
tablespace (identified by *) being used.

Enter value for default_tablespace:

Using tablespace SYSAUX as the default tablespace for the AWR_STAGE


Choose the Temporary tablespace for the AWR_STAGE user
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Choose the AWR_STAGE user's temporary tablespace.

TABLESPACE_NAME                CONTENTS  DEFAULT TEMP TABLESPACE
------------------------------ --------- -----------------------
TEMP                           TEMPORARY *

Pressing <return> will result in the database's default temporary
tablespace (identified by *) being used.

Enter value for temporary_tablespace:

Using tablespace TEMP as the temporary tablespace for AWR_STAGE


... Creating AWR_STAGE user

If we look at the import log file, we will see that the data is first imported into the AWR_STAGE schema and then inserted from there into the WRH$ and other DBA_HIST tables.

Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
. . imported "AWR_STAGE"."WRH$_SQL_PLAN"                 3.845 MB   11494 rows
. . imported "AWR_STAGE"."WRH$_SQLTEXT"                  1.012 MB     569 rows
. . imported "AWR_STAGE"."WRH$_SYSMETRIC_SUMMARY"        156.8 KB    2025 rows
.......
........
	Append Data for "AWR_STAGE".WRH$_SGASTAT.
INSERT /*+ APPEND */ INTO SYS.WRH$_SGASTAT (SNAP_ID, DBID, INSTANCE_NUMBER, NAME, POOL, BYTES) SELECT SNAP_ID,
3228342000, INSTANCE_NUMBER, NAME, POOL, BYTES FROM "AWR_STAGE".WRH$_SGASTAT WHERE DBID = :c_dbid
... appended 388 rows

If we now query the DBA_HIST_SNAPSHOT table, we see that it contains data for two DBIDs, which shows that AWR history data is available in the repository database for two databases.

SQL> select distinct dbid from dba_hist_snapshot;

      DBID
----------
3228342000
3892233981

We can query the repository AWR tables, filtering on the DBID value, to obtain data for a specific database.
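For example, to see the range of snapshots now held in the repository for the imported database (using the DBID 3228342000 seen above):

```sql
-- Snapshots held in the repository for the imported database
SELECT MIN(snap_id), MAX(snap_id), COUNT(*)
FROM   dba_hist_snapshot
WHERE  dbid = 3228342000;
```

AWR reports for a specific DBID can then be generated from the repository with the awrrpti.sql script, which prompts for the DBID and instance number in addition to the snapshot range.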

 
