Tuesday, October 13, 2009

Using Network_Link in Datapump
===============================

Oracle's export and import utilities have historically used a disk file as intermediate storage when unloading or reloading the database. For large databases, this "dump file" was an issue because operating system limits on file size could be exceeded, making export impossible.

Creative DBAs have used file compression utilities, such as compress on UNIX, to get the most capacity from the dump file. Later versions of import and export allowed the use of multiple dump files to get around the limits.

In Oracle 10g, the Data Pump version of import (impdp) can eliminate the dump file entirely by pulling the data directly from another database instance over a database link, using the NETWORK_LINK parameter.
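
In its general form, a network-mode import looks like the sketch below; the angle-bracket placeholders stand for your own values. Note that the directory object is needed only for the Data Pump log file, since no dump file is written:

$ impdp <user>/<password> network_link=<dblink_name> directory=<dir_object> tables=<table_list>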

Example:
--------

1. Create a tnsnames.ora entry for the remote database from which the data is to be imported; a quick resolution test follows the entry. [Target]
conn_151 =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = 192.9.200.151)(PORT = 1521))
    (CONNECT_DATA =
      (SERVER = DEDICATED)
      (SERVICE_NAME = wisedba.com)
    )
  )
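
To confirm that the new entry resolves, test it with the tnsping utility:

$ tnsping conn_151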

2. Create a directory object for the user to access during the Data Pump job. [Target]

SYS> create directory defdir as '/home/ez10g/wisedba';
SYS> grant read, write on directory defdir to scott;

3. Grant the user the privilege to create a database link. [Target]

SYS> grant create database link to scott;

4. As the SCOTT user, create a database link to the remote database; a verification query follows. [Target]

SCOTT> create database link conn_151
  2  connect to scott identified by tiger
  3  using 'conn_151';
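
Before importing, it is worth checking that the link actually works; any simple remote query will do, for example:

SCOTT> select sysdate from dual@conn_151;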

5. Check the existing tables in the SCOTT schema (in the local database). [Target]
SCOTT> select * from tab;

TNAME                          TABTYPE  CLUSTERID
------------------------------ ------- ----------
BONUS                          TABLE

6. Import the EMP table from the remote database (without creating any dump file). [Target]

$ impdp scott/tiger tables=emp directory=defdir network_link=conn_151

Import: Release 10.2.0.1.0 - Production on Monday, 02 April, 2007 18:04:53

Copyright (c) 2003, 2005, Oracle. All rights reserved.

Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
Starting "SCOTT"."SYS_IMPORT_TABLE_01": scott/******** tables=emp directory=defdir network_link=conn_151
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 6 MB
Processing object type TABLE_EXPORT/TABLE/TABLE
. . imported "SCOTT"."EMP" 114688 rows
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Job "SCOTT"."SYS_IMPORT_TABLE_01" successfully completed at 18:05:12

7. Check that the table has been imported into the local database. [Target]

SCOTT> select * from tab;

TNAME                          TABTYPE  CLUSTERID
------------------------------ ------- ----------
EMP                            TABLE               <<-- Table has been imported
BONUS                          TABLE



-----------------------------------FULL LEVEL IMPORT---------------------------------------


Example:
--------

1. Create a tnsnames.ora entry for the remote database from which the data is to be imported. [Target]
qadb =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = 192.9.200.151)(PORT = 1521))
    (CONNECT_DATA =
      (SERVER = DEDICATED)
      (SERVICE_NAME = qadb)
    )
  )

2. Create a directory object for the user to access during the Data Pump job. [Target]

SYS> create directory data_pump_dir as '/home/ez10g/wisedba';
SYS> grant read, write on directory data_pump_dir to system;

3. Grant the user the privilege to create a database link. [Target]

SYS> grant create database link to system;

4. As the SYSTEM user (which will run the full import), create a database link to the remote database; a verification query follows. [Target]

SYSTEM> create database link qadb
  2  connect to system identified by qadb
  3  using 'qadb';
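
A quick query over the link confirms that it works; v$database, for example, returns the remote database name:

SYSTEM> select name from v$database@qadb;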

5. Check the existing users in the local database.
SYS> select * from all_users;

USERNAME                          USER_ID CREATED
------------------------------ ---------- ---------
CTXSYS                                 36 03-JUL-08
EXFSYS                                 34 03-JUL-08
WMSYS                                  25 03-JUL-08
DBSNMP                                 24 03-JUL-08
TSMSYS                                 21 03-JUL-08
DMSYS                                  35 03-JUL-08
DIP                                    19 03-JUL-08
OUTLN                                  11 03-JUL-08
SYSTEM                                  5 03-JUL-08
SYS                                     0 03-JUL-08

6. Import the full database from the remote database (without creating any dump file). [Target] A full network import requires the EXP_FULL_DATABASE role on the remote side and IMP_FULL_DATABASE on the local side; the SYSTEM user holds both.

$ impdp system/esbgui2008 directory=data_pump_dir network_link=qadb full=y


NOTE: The same tablespaces must already exist in the target database (the local database into which you are importing); objects stored in tablespaces missing locally will fail to import. Alternatively, the REMAP_TABLESPACE parameter can map them onto existing tablespaces. A quick pre-check is shown below.
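
One simple pre-check is to compare DBA_TABLESPACES locally and over the database link (run as SYSTEM, which owns the qadb link):

SYSTEM> select tablespace_name from dba_tablespaces;
SYSTEM> select tablespace_name from dba_tablespaces@qadb;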


-----------------SCHEMA LEVEL-----------------------


1. Create a tnsnames.ora entry for the remote database from which the data is to be imported. [Target]
qadb =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = 192.9.200.151)(PORT = 1521))
    (CONNECT_DATA =
      (SERVER = DEDICATED)
      (SERVICE_NAME = qadb)
    )
  )

2. Create a directory object for the user to access during the Data Pump job. [Target]

SYS> create directory data_pump_dir as '/home/ez10g/wisedba';
SYS> grant read, write on directory data_pump_dir to scott;

3. Grant the user the privilege to create a database link. [Target]

SYS> grant create database link to scott;

4. As the SCOTT user, create a database link to the remote database. [Target]

SCOTT> create database link qadb
  2  connect to scott identified by tiger
  3  using 'qadb';

5. Check the existing tables in the SCOTT schema (in the local database).
SCOTT> select * from tab;


6. Import all of the schema's tables from the remote database (without creating any dump file). [Target] Run as SCOTT, impdp defaults to schema mode; REMAP_SCHEMA=scott:scott maps the schema onto itself here and simply marks where a different target schema would be named (see the variation after the command).

$ impdp scott/tiger directory=data_pump_dir network_link=qadb remap_schema=scott:scott
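
As a variation, the same schema could be pulled into a different local schema. The sketch below is hypothetical: it assumes a target schema name SCOTT_COPY and that it is run as SYSTEM using the qadb link created in the full-import example, since remapping into another schema needs import privileges that SCOTT lacks:

$ impdp system/<password> schemas=scott directory=data_pump_dir network_link=qadb remap_schema=scott:scott_copy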


The following dictionary views are useful for checking database links:

View           Purpose
-------------  ----------------------------------------------------------
DBA_DB_LINKS   Lists all database links in the database.
ALL_DB_LINKS   Lists all database links accessible to the connected user.
USER_DB_LINKS  Lists all database links owned by the connected user.
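
For example, to list the links created in these examples as the owning user:

SCOTT> select db_link, username, host from user_db_links;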
