1. SKIP: The default value for this parameter is SKIP. This option is exactly the same as the IGNORE=Y option in the conventional import utility.
2. APPEND: This option appends the data from the dump. The rows in the dump are appended to the table, and the existing data remains unchanged.
3. TRUNCATE: This option truncates the existing rows in the table and inserts the rows from the dump.
4. REPLACE: This option drops the current table and re-creates it as it is in the dump file.
Both the SKIP and REPLACE options are not valid if you set CONTENT=DATA_ONLY for impdp.
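On the impdp command line the parameter is passed like any other; a minimal sketch (the directory, dump file, and log file names match the examples that follow):

```shell
# Import the employee table, appending the dump's rows to the existing table.
# exp_dir, emp.dmp and imp1.log are the names used in the runs shown below.
impdp \"/ as sysdba\" directory=exp_dir dumpfile=emp.dmp logfile=imp1.log \
      table_exists_action=append
```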
DEPT   SAL
10     5000   Hero
10     5500   Jain
10     4000   John
20     6000
Username: / as sysdba
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 64bit Production
With the Partitioning, Real Application Clusters, Automatic Storage
Management, OLAP,
Data Mining and Real Application Testing options
Starting "SYS"."SYS_EXPORT_TABLE_01": /******** AS SYSDBA directory=exp_dir
tables=scott.employee dumpfile=emp.dmp logfile=emp.log
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 64 KB
Processing object type TABLE_EXPORT/TABLE/TABLE
. . exported "SCOTT"."EMPLOYEE"    5.953 KB    4 rows
Master table "SYS"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for SYS.SYS_EXPORT_TABLE_01 is:
/home/oracle/shony/emp.dmp
Job "SYS"."SYS_EXPORT_TABLE_01" successfully completed at 23:31:20
A.
TABLE_EXISTS_ACTION=SKIP
Username: / as sysdba
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 64bit Production
With the Partitioning, Real Application Clusters, Automatic Storage
Management, OLAP,
Data Mining and Real Application Testing options
Master table "SYS"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
Starting "SYS"."SYS_IMPORT_FULL_01": /******** AS SYSDBA directory=exp_dir
dumpfile=emp.dmp logfile=imp1.log table_exists_action=skip
Processing object type TABLE_EXPORT/TABLE/TABLE
B.
TABLE_EXISTS_ACTION=APPEND
I have deleted the old rows and inserted 4 new rows into the employee table. So, as of now, the rows in the dump and in the table are different, and I am going to import the dump with the APPEND option.
Username: / as sysdba
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 64bit Production
With the Partitioning, Real Application Clusters, Automatic Storage
Management, OLAP,
Data Mining and Real Application Testing options
Master table "SYS"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
Starting "SYS"."SYS_IMPORT_FULL_01": /******** AS SYSDBA directory=exp_dir
dumpfile=emp.dmp logfile=imp1.log table_exists_action=append
Processing object type TABLE_EXPORT/TABLE/TABLE
C.
TABLE_EXISTS_ACTION=TRUNCATE
Now let's try the table_exists_action=truncate option. With TRUNCATE, the import truncates the contents of the existing table and inserts the rows from the dump. Currently my employee table has 8 rows: the 4 rows we inserted plus the 4 appended from the dump.
Username: / as sysdba
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 64bit Production
With the Partitioning, Real Application Clusters, Automatic Storage
Management, OLAP,
Data Mining and Real Application Testing options
Master table "SYS"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
D.
TABLE_EXISTS_ACTION=REPLACE
This option drops the current table in the database, and the import re-creates the table as it is in the dump file.
Username: / as sysdba
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 64bit Production
With the Partitioning, Real Application Clusters, Automatic Storage
Management, OLAP,
Data Mining and Real Application Testing options
Master table "SYS"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
Starting "SYS"."SYS_IMPORT_FULL_01": /******** AS SYSDBA directory=exp_dir
dumpfile=emp.dmp logfile=imp1.log table_exists_action=replace
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
. . imported "SCOTT"."EMPLOYEE"    5.953 KB    4 rows
Oracle Data Pump technology enables very high-speed movement of data and metadata from one
database to another.
* Full Export Mode
* Schema Mode
* Table Mode
* Tablespace Mode
Full Export Mode: A full export is specified using the FULL parameter. In a full database export, the entire database is
unloaded. This mode requires that you have the EXP_FULL_DATABASE role.
Example:
The following is an example of using the FULL parameter. The dump file, expfull.dmp, is written to the dpump_dir2 directory.
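The command itself is not shown above; it would look something like this (running it as the hr user is an assumption):

```shell
# Full database export; the user needs the EXP_FULL_DATABASE role.
expdp hr FULL=YES DUMPFILE=dpump_dir2:expfull.dmp NOLOGFILE=YES
```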
Schema Mode: A schema export is specified using the SCHEMAS parameter. This is the default export mode.
If you have the EXP_FULL_DATABASE role, then you can specify a list of schemas and optionally
include the schema definitions themselves, as well as system privilege grants to those schemas.
If you do not have the EXP_FULL_DATABASE role, you can export only your own schema.
Example:
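A representative schema-mode command (the schema, directory, and dump file names are assumptions):

```shell
# Export the hr schema; SCHEMAS accepts a comma-separated list.
expdp hr DIRECTORY=dpump_dir1 DUMPFILE=expschema.dmp SCHEMAS=hr
```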
Table Mode: A table export is specified using the TABLES parameter. In table mode, only a specified set of tables,
partitions, and their dependent objects are unloaded. You must have the EXP_FULL_DATABASE role to
specify tables that are not in your own schema. All specified tables must reside in a single schema.
Example:
The following example shows a simple use of the TABLES parameter to export three tables found
in the hr schema: employees, jobs, and departments. Because user hr is exporting tables found
in the hr schema, the schema name is not needed before the table names.
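The command described above could be written as follows (the dump file location is an assumption):

```shell
# All three tables reside in the exporting user's (hr's) own schema,
# so no schema prefix is needed before the table names.
expdp hr TABLES=employees,jobs,departments DUMPFILE=dpump_dir1:table.dmp NOLOGFILE=YES
```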
2. What is the use of the DIRECT=Y option in exp?
Setting DIRECT=Y extracts data by reading it directly, bypassing the SGA and the SQL command-processing layer (the evaluating buffer), so it should be faster. The default value is N.
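A sketch of a direct-path export (the credentials and file names are assumptions):

```shell
# DIRECT=Y bypasses the SQL evaluation buffer; RECORDLENGTH tunes the I/O buffer size.
exp scott/tiger FILE=emp.dmp TABLES=employee DIRECT=Y RECORDLENGTH=65535
```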
3. What is the use of the COMPRESS option in exp?
COMPRESS specifies how export manages the initial extent for the table data; with COMPRESS=Y, the data is imported into one extent. This parameter is helpful during database re-organization. Export the objects (especially tables and indexes) with COMPRESS=Y: if a table was spanning 20 extents of 1M each (which is not desirable from a performance point of view) and you export it with COMPRESS=Y, the generated DDL will have an INITIAL extent of 20M, so the extents are coalesced when you import. Sometimes it is desirable to export with COMPRESS=N, in situations where you do not have contiguous space on disk (in the tablespace) and do not want the import to fail.
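For example, to consolidate a fragmented table's extents during a reorganization (the credentials and file names are assumptions):

```shell
# COMPRESS=Y writes DDL whose INITIAL extent equals the table's current total size,
# so the re-imported table lands in a single extent.
exp scott/tiger FILE=emp.dmp TABLES=employee COMPRESS=Y
```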
4. How to improve exp performance?
5. How to improve imp performance?
7. Set INDEXES=N and build the indexes after the import. Indexes can easily be recreated after the data has been successfully imported.
8. Use STATISTICS=NONE
9. Disable the INSERT triggers, as they fire during import.
10. Set the parameter COMMIT_WRITE=NOWAIT (in Oracle 10g) or COMMIT_WAIT=NOWAIT (in Oracle 11g) during the import.
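Several of the tips above can be combined on one imp command line (a sketch; the credentials and file names are assumptions):

```shell
# Skip index creation and statistics gathering during the load;
# IGNORE=Y continues past "object already exists" errors.
imp scott/tiger FILE=emp.dmp TABLES=employee IGNORE=Y INDEXES=N STATISTICS=NONE
```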
6. What is the use of the INDEXFILE option in imp?
It writes the DDL of the objects in the dump file into the specified file.
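For instance, to extract the DDL without loading any data (the file names are assumptions):

```shell
# Writes the CREATE statements from the dump to emp_ddl.sql; no rows are imported.
imp scott/tiger FILE=emp.dmp FULL=Y INDEXFILE=emp_ddl.sql
```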
7. What is the use of the IGNORE option in imp?
It ignores errors during the import and continues importing.
8. What are the differences between expdp and exp (Data Pump or normal exp/imp)?
Data Pump is server centric (files will be at server).
Data Pump has APIs, from procedures we can run Data Pump jobs.
In Data Pump, we can stop and restart the jobs.
Data Pump will do parallel execution.
Tapes & pipes are not supported in Data Pump.
Data Pump consumes more undo tablespace.
Data Pump import will create the user, if the user doesn't exist.
9. Why is expdp faster than exp (or why is Data Pump faster than conventional export/import)?
Data Pump is block mode, exp is byte mode.
Data Pump will do parallel execution.
Data Pump uses direct path API.
10. How to improve expdp performance?
Use the PARALLEL option, which increases the number of worker threads. It should be set based on the number of CPUs.
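With PARALLEL greater than 1, the dump file clause should use the %U substitution variable so that each worker gets its own file (a sketch; the schema, directory, and file names are assumptions):

```shell
# Four workers, each writing to its own scott_NN.dmp file via the %U wildcard.
expdp system schemas=scott directory=exp_dir parallel=4 \
      dumpfile=scott_%U.dmp logfile=scott.log
```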
11. How to improve impdp performance?
Use the PARALLEL option, which increases the number of worker threads. It should be set based on the number of CPUs.
12. In Data Pump, where is the job information stored (or, if you restart a Data Pump job, how does it know where to resume)?
Whenever a Data Pump export or import job is running, Oracle creates a master table named after the JOB_NAME, which is dropped once the job completes. From this table, Oracle finds out how much of the job has completed and where to continue.
Default export job name will be SYS_EXPORT_XXXX_01, where XXXX can be FULL or
SCHEMA or TABLE.
Default import job name will be SYS_IMPORT_XXXX_01, where XXXX can be FULL or
SCHEMA or TABLE.
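A stopped job can be resumed by attaching to it by name (here the default job name pattern described above is assumed):

```shell
# Re-attach to a stopped schema-level export job and continue it interactively.
expdp system ATTACH=SYS_EXPORT_SCHEMA_01
```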
13. What is the order of importing objects in impdp?
Tablespaces
Users
Roles
Database links
Sequences
Directories
Synonyms
Types
Tables/Partitions
Views
Comments
Packages/Procedures/Functions
Materialized views
14. How to import only metadata?
CONTENT=METADATA_ONLY
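For example (the directory and dump file names are assumptions):

```shell
# Imports only object definitions; no table rows are loaded.
impdp system directory=exp_dir dumpfile=emp.dmp content=metadata_only
```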
15. How to import into a different user/tablespace/datafile/table?
REMAP_SCHEMA
REMAP_TABLESPACE
REMAP_DATAFILE
REMAP_TABLE
REMAP_DATA
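These parameters take old:new pairs; a combined sketch (the schema, tablespace, and table names are assumptions):

```shell
# Load scott's objects into hr, relocating segments and renaming the table.
impdp system directory=exp_dir dumpfile=emp.dmp \
      remap_schema=scott:hr remap_tablespace=users:example \
      remap_table=employee:employee_copy
```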
16. How to export/import without using an external directory?
17. Using Data Pump, how to export from a higher version (11g) and import into a lower version (10g)? Can we import into 9i?
18. Using normal exp/imp, how to export from a higher version (11g) and import into a lower version (10g/9i)?
19. How to transport tablespaces (including across platforms) using exp/imp or expdp/impdp?