Thursday, December 4, 2014

Data Pump Architecture

Data Pump consists of the following components:

• expdp (Data Pump export utility)
• impdp (Data Pump import utility)
• DBMS_DATAPUMP PL/SQL package (Data Pump API)
• DBMS_METADATA PL/SQL package (Data Pump Metadata API)

The expdp and impdp utilities use the DBMS_DATAPUMP and DBMS_METADATA built-in PL/SQL packages when exporting and importing data and metadata. The DBMS_DATAPUMP package moves entire databases or subsets of data between database environments. The DBMS_METADATA package exports and imports information about database objects.

Note: You can call the DBMS_DATAPUMP and DBMS_METADATA packages independently (outside of expdp and impdp) from SQL*Plus. I rarely call these packages directly from SQL*Plus, but you may have a specific scenario in which it's desirable to interact with them directly.
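For example, a schema-mode export can be driven entirely from PL/SQL via the Data Pump API. Here's a minimal sketch, assuming a directory object named DP_DIR already exists, the HR schema is the one being exported, and the calling user has the required Data Pump privileges:

```sql
SET SERVEROUTPUT ON
DECLARE
  h NUMBER;
BEGIN
  -- Open a schema-mode export job (returns a job handle)
  h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'SCHEMA');

  -- Write the dump file via the DP_DIR directory object (assumed to exist)
  DBMS_DATAPUMP.ADD_FILE(handle => h, filename => 'exp_api.dmp',
                         directory => 'DP_DIR');

  -- Limit the export to the HR schema (illustrative schema name)
  DBMS_DATAPUMP.METADATA_FILTER(handle => h, name => 'SCHEMA_EXPR',
                                value => 'IN (''HR'')');

  DBMS_DATAPUMP.START_JOB(h);
  DBMS_DATAPUMP.DETACH(h);
END;
/
```

After START_JOB, the job runs in the background on the server; DETACH releases the session's handle without stopping the job.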

When you start a Data Pump export or import job, a master operating-system process is initiated on the database server. This master process name has the format ora_dmNN_<SID>. On Linux/Unix systems, you can view this process from the operating-system prompt using the ps command:

$ ps -ef | grep ora_dm
oracle 14950 717 0 10:59:06 ? 0:10 ora_dm00_STAGE


Depending on the degree of parallelism and the work specified, a number of worker processes are also started. The master process coordinates the work between master and worker processes. The worker process names have the format ora_dwNN_<SID>.
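You can see the worker processes from the operating system in the same way as the master process (ps -ef | grep ora_dw). You can also observe Data Pump sessions from inside the database; a hedged example using the DBA_DATAPUMP_SESSIONS view (the SESSION_TYPE column, which distinguishes master and worker sessions, is assumed from recent Oracle releases):

```sql
-- Which Data Pump sessions are active, and what role does each play?
SELECT owner_name, job_name, session_type
FROM   dba_datapump_sessions;
```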

Also, when a user starts an export or import job, a database status table is created (owned by the user who starts the job). This table exists for the duration of the Data Pump job. The name of the status table is dependent on what type of job you’re running. The table is named with the format SYS_<OPERATION>_<JOB_MODE>_NN, where OPERATION is either EXPORT or IMPORT. JOB_MODE can be FULL, SCHEMA, TABLE, TABLESPACE, and so on.

For example, if you’re exporting a schema, a table is created in your account with the name SYS_EXPORT_SCHEMA_NN, where NN is a number that makes the table name unique in the user’s schema. This status table contains information such as the objects exported/imported, start time, elapsed time, rows, error count, and so on. The status table has over 80 columns.
The status table (Data Pump's master table) is dropped by Data Pump upon successful completion of an export or import job. If you use the KILL_JOB interactive command, the status table is also dropped. If you stop a job with the STOP_JOB interactive command, the table isn't removed and is used in the event you restart the job. If your job terminates abnormally, the status table is retained. You can delete the status table if you don't plan to restart the job.
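For example, to restart a job that was stopped with STOP_JOB, you can reattach to it by job name and issue START_JOB from the interactive prompt (the job name SYS_EXPORT_SCHEMA_01 here is illustrative):

```
$ expdp user/pwd attach=SYS_EXPORT_SCHEMA_01

Export> START_JOB
```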

When Data Pump runs, it uses a database-directory object to determine where to write and read dump files and log files. Usually, you specify which directory object you want Data Pump to use. If you don’t specify a directory object, a default is used. A Data Pump export creates an export file and a log file. The export file contains the objects being exported. The log file contains a record of the job activities.
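A directory object is created and granted with SQL such as the following (the /oradump path and the dpuser account name are illustrative; creating the directory requires the CREATE ANY DIRECTORY privilege):

```sql
-- Map a directory object to an operating-system path on the database server
CREATE DIRECTORY dp_dir AS '/oradump';

-- Allow a user to read and write files through the directory object
GRANT READ, WRITE ON DIRECTORY dp_dir TO dpuser;
```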

The figure below shows the architectural components related to a Data Pump export job. Here's how you initiate the job from the command line:

$ expdp user/pwd dumpfile=exp.dmp logfile=exp.log directory=dp_dir

In this example, a database-directory object named DP_DIR is defined to reference the /oradump operating-system directory. The output files are defined via the command line to be exp.dmp and exp.log.

[Figure: Data Pump export job architecture]

The figure below displays the architectural components of a Data Pump import job. Here's how you initiate the job from the command line:

$ impdp user/pwd dumpfile=exp.dmp logfile=imp.log directory=dp_dir

In the above command, the Data Pump import reads from a dump file named exp.dmp, which is located in the operating-system directory referenced by the database-directory object named DP_DIR. The import job reads the dump file and populates database objects.

[Figure: Data Pump import job architecture]

For each Data Pump job, you must ensure that you have access to a directory object: write access for an export job's dump and log files, and read access for an import job's dump file.
