Saturday, August 2, 2008

10G DATAPUMP

Database export/import - Using data pump

Oracle Data Pump is the replacement for the original Export/Import utilities. This utility is available from Oracle Database 10g onwards. Oracle Data Pump enables very high-speed movement of data and metadata from one database to another.
The Data Pump Export and Import utilities have a similar look and feel to the original utilities, but they are much more efficient and give you greater control and management of export and import jobs.


Directory Objects
=================
Description of feature: Data Pump differs from the original Export and Import in that all jobs run primarily on the server using server processes. These server processes access files for the Data Pump jobs using directory objects that identify the location of the files. The directory objects enforce a security model that can be used by DBAs to control access to these files.

Interactive Command-Line Mode
===================
Description of feature: Besides the regular operating system command-line mode, there is now a powerful interactive command-line mode which allows the user to monitor and control Data Pump export and import operations.

Definitions:

Definition: Oracle 10g's new Data Pump utility is designed as the eventual replacement for the original Oracle Import and Export utilities. It provides a suite of tools for extracting, transforming, and loading data within an Oracle database.


User Procedures:
================

In order to use the Data Pump feature, you must perform the following steps:
• Change from the original Export/Import to Oracle Data Pump.
• Choose among the different modes of Export/Import using Data Pump.

Each of these steps is described in more detail below.

STEP1:
======
If you want to export to a file, the first thing you must do is create a database DIRECTORY object for the output directory, and grant access to the users who will be doing exports and imports.

Why Directory Objects?
They are needed to ensure data security and integrity. Without them, users would be able to read data that they should not have access to and perform unwarranted operations on the server.
As an example, we will use the SCOTT user account and create a directory object it can access. (Steps for using Data Pump:)

Connect to SQL*Plus as the SYS user:

$ sqlplus / as sysdba

Make sure that the SCOTT user account is unlocked by connecting to SQL*Plus as SCOTT.
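
If the account turns out to be locked (or its password has expired), it can be unlocked from the SYS session. A minimal sketch, assuming the default SCOTT/tiger account:

ALTER USER scott ACCOUNT UNLOCK;
ALTER USER scott IDENTIFIED BY tiger;  -- only needed if the password has expired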

Assign the CREATE ANY DIRECTORY privilege to SCOTT, then create the directory object:

GRANT CREATE ANY DIRECTORY TO Scott;
CREATE OR REPLACE DIRECTORY test_dir AS '/u01/logicalbkp';

Assign read and write privileges to SCOTT on the directory where the backup will be taken:

GRANT READ, WRITE ON DIRECTORY test_dir TO Scott;
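
To confirm the directory object was created, you can query DBA_DIRECTORIES from a privileged session (an optional check, assuming you are connected as SYS or another DBA user):

SELECT directory_name, directory_path
FROM dba_directories
WHERE directory_name = 'TEST_DIR';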


STEP2:
======
Full Export/Import Mode: You can invoke the Data Pump export using a command line. Export/Import parameters can be specified directly on the command line. A full export is specified using the FULL parameter. In a full database export, the entire database is unloaded. This mode requires that you have the EXP_FULL_DATABASE role. Shown below is an example:

expdp system/password full=Y directory=TEST_DIR dumpfile=DB10G.dmp logfile=expdpDB10G.log




impdp system/password full=Y directory=TEST_DIR dumpfile=DB10G.dmp logfile=impdpDB10G.log

Schema Export/Import Mode: The schema export mode is invoked using the SCHEMAS parameter. If you do not have the EXP_FULL_DATABASE role, you can only export your own schema. If you have the EXP_FULL_DATABASE role, you can export several schemas in one go. Optionally, you can include the system privilege grants as well.

expdp scott/tiger schemas=SCOTT directory=TEST_DIR dumpfile=SCOTT.dmp logfile=expdpSCOTT.log

impdp scott/tiger schemas=SCOTT directory=TEST_DIR dumpfile=SCOTT.dmp logfile=impdpSCOTT.log

Table Export/Import Mode: This export mode is specified using the TABLES parameter. In this mode, only the specified tables, partitions, and their dependent objects are exported. If you do not have the EXP_FULL_DATABASE role, you can export only tables in your own schema, and you can only specify tables in the same schema.

expdp scott/tiger tables=EMP,DEPT directory=TEST_DIR
dumpfile=EMP_DEPT.dmp logfile=expdpEMP_DEPT.log



impdp scott/tiger tables=EMP,DEPT directory=TEST_DIR
dumpfile=EMP_DEPT.dmp logfile=impdpEMP_DEPT.log


The TABLE_EXISTS_ACTION=APPEND parameter allows data to be imported into existing tables.
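
For example, the earlier table-mode import could be re-run against tables that already contain rows (a sketch reusing the dump file from the example above):

impdp scott/tiger tables=EMP,DEPT directory=TEST_DIR
dumpfile=EMP_DEPT.dmp table_exists_action=APPEND logfile=impdpEMP_DEPT.log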


Data Pump performance can be improved by using the PARALLEL parameter. This should be used in conjunction with the "%U" wildcard in the DUMPFILE parameter to allow multiple dump files to be created or read:

expdp scott/tiger schemas=SCOTT directory=TEST_DIR
parallel=4 dumpfile=SCOTT_%U.dmp logfile=expdpSCOTT.log



The DBA_DATAPUMP_JOBS view can be used to monitor the current jobs:
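
For example (the column names below are from the documented DBA_DATAPUMP_JOBS view):

SELECT owner_name, job_name, operation, job_mode, state
FROM dba_datapump_jobs;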



The INCLUDE and EXCLUDE parameters can be used to limit the export/import to specific objects. When the INCLUDE parameter is used, only those objects specified by it will be included in the export. When the EXCLUDE parameter is used, all objects except those specified by it will be included in the export.

expdp scott/tiger schemas=SCOTT include=TABLE:"IN ('EMP', 'DEPT')" directory=TEST_DIR dumpfile=SCOTT.dmp logfile=expdpSCOTT.log


expdp scott/tiger schemas=SCOTT exclude=TABLE:"= 'BONUS'" directory=TEST_DIR dumpfile=SCOTT.dmp logfile=expdpSCOTT.log




All Data Pump actions are performed by multiple jobs (server processes, not DBMS_JOB jobs). These jobs are controlled by a master control process which uses Advanced Queuing. At runtime, an advanced queue table, named after the job name, is created and used by the master control process.

The table is dropped on completion of the Data Pump job. The job and the advanced queue can be named using the JOB_NAME parameter. Cancelling the client process does not stop the associated Data Pump job. Issuing "ctrl+c" on the client during a job stops the client output and presents a command prompt. Typing "status" at this prompt allows you to monitor the current job.
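
For example, a job can be given an explicit name with JOB_NAME and then re-attached to from another client session. A sketch, where SCOTT_EXPORT is just an illustrative job name:

expdp scott/tiger schemas=SCOTT directory=TEST_DIR
dumpfile=SCOTT.dmp logfile=expdpSCOTT.log job_name=SCOTT_EXPORT

expdp scott/tiger attach=SCOTT_EXPORT

Once attached, interactive commands such as STATUS, STOP_JOB, START_JOB, and KILL_JOB can be issued at the Export> prompt.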
