Managing Exports and Imports
Main tasks:
Data archival
Upgrading to new releases
Backing up an Oracle database
Moving data between Oracle databases
Export’s basic function is to extract the object definitions and table data from an Oracle database and store them in an Oracle binary-format dump file.
Export parameters:
• BUFFER – size of the data buffer
• FILE – output file (default expdat.dmp)
• COMPRESS – consolidate table data into one extent on import (default Y)
• GRANTS – export grants (default Y)
• INDEXES – export indexes (default Y)
• ROWS – export data rows (default Y)
• CONSTRAINTS – export constraints (default Y)
• LOG – log file of screen output
• FULL – export the entire database (default N)
• OWNER – list of owner names
• TABLES – list of table names
• TRIGGERS – export triggers (default Y)
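Several of these parameters are usually combined in a parameter file rather than typed on the command line. A minimal sketch (the file and owner names here are illustrative, not from the text above):

```
FILE=scott.dmp
OWNER=scott
GRANTS=y
INDEXES=y
ROWS=y
COMPRESS=y
LOG=scott_exp.log
```

Saved as, say, params.dat, it would be used with exp scott/tiger PARFILE=params.dat, where PARFILE names the file holding the settings.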
Export Examples
• Incremental Exports:
• An incremental export backs up only tables that have changed since the last incremental, cumulative, or complete export.
• An incremental export exports the table definition and all of its data, not just the changed rows (i.e., the entire table).
Cumulative Export
A cumulative export backs up tables that have changed since the last cumulative or complete export. It consolidates the work of all
incremental exports taken since the last cumulative or complete export into a single cumulative export file.
Complete Exports
A complete export establishes the base for incremental and cumulative exports. It is similar to a full database export, except that it also updates
the tables that track incremental and cumulative exports.
A typical schedule: a complete export every three weeks, and a cumulative export every Sunday.
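The three export levels are selected with exp's INCTYPE parameter (incremental exports are full-database exports, so FULL=y is required). As a sketch, with illustrative file names:

```
exp system/manager FULL=y INCTYPE=COMPLETE    FILE=complete.dmp
exp system/manager FULL=y INCTYPE=CUMULATIVE  FILE=cum_sunday.dmp
exp system/manager FULL=y INCTYPE=INCREMENTAL FILE=inc_daily.dmp
```

Note that INCTYPE applies only to the original exp utility and was deprecated in later releases.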
Import Parameters
• BUFFER – size of the data buffer
• FILE – file to import from (default expdat.dmp)
• GRANTS – import grants
• INDEXES – import indexes
• ROWS – import data rows
• LOG – log file of screen output
• FULL – import the entire export file
• TABLES – list of table names
• SHOW – list the contents of the export file without importing it
• TOUSER – user into whose schema the objects are imported
• COMMIT – commit after each array insert
• FROMUSER – user whose objects are read from the export file
Exports are also used for table reorganization (reorg).
Cold Backup
Hot Backup
• A hot backup is taken while the database is up and running in ARCHIVELOG mode. A hot backup is taken tablespace by tablespace, which is
the only recommended method.
• You must put each tablespace into begin backup mode, and after finishing the backup you must set it to end backup mode. While a tablespace
is in backup mode it generates a lot of extra redo entries.
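The begin/end backup sequence described above can be sketched as follows (the tablespace name is an assumption for illustration):

```sql
-- Put the tablespace into backup mode before copying its datafiles.
ALTER TABLESPACE users BEGIN BACKUP;

-- ... copy the tablespace's datafiles with an OS utility
--     while the database stays open ...

-- Take it out of backup mode as soon as the copy finishes,
-- to limit the extra redo generated.
ALTER TABLESPACE users END BACKUP;
```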
How can one monitor how fast a table is being imported?
Method 1:
select substr(sql_text,instr(sql_text,'INTO "'),30) table_name,
rows_processed,
round((sysdate-to_date(first_load_time,'yyyy-mm-dd hh24:mi:ss'))*24*60,1) minutes,
trunc(rows_processed/((sysdate-to_date(first_load_time,'yyyy-mm-dd hh24:mi:ss'))*24*60))
rows_per_min
from sys.v_$sqlarea
where sql_text like 'INSERT %INTO "%'
and command_type = 2
and open_versions > 0;
For this to work, you need to be on Oracle 7.3 or higher (7.2 might also be OK). If the import has more than one table, this statement will only
show information about the table currently being imported.
Method 2:
Use the FEEDBACK=n import parameter. This parameter tells imp to display a dot for every n rows imported. For example,
FEEDBACK=1000 will show a dot after every 1000 rows.
Can one export to multiple files? / Can one beat the Unix 2 GB limit?
From Oracle 8i, the export utility supports multiple output files. This feature enables large exports to be divided into files whose sizes will not
exceed any operating system limits (the FILESIZE= parameter). When importing from a multi-file export, you must provide the same filenames in
the same sequence in the FILE= parameter. Look at this example:
exp SCOTT/TIGER FILE=D:F1.dmp,E:F2.dmp FILESIZE=10m LOG=scott.log
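The matching import then names the same files in the same order. As a sketch (the log file name is an assumption):

```
imp SCOTT/TIGER FILE=D:F1.dmp,E:F2.dmp FILESIZE=10m LOG=scott_imp.log
```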
Use the following technique if you are on an Oracle version prior to 8i:
Create a compressed export on the fly. Depending on the type of data, you can probably export up to 10 gigabytes to a single file. This example
uses gzip, which offers the best compression I know of, but you can also substitute zip, compress, or whatever.
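The usual way to do this is through a named pipe, so the dump is compressed as it is written and never exists uncompressed on disk. A sketch, with illustrative file names; since exp needs a live database, a stand-in writer feeds the pipe here, and the commented-out line shows where the real exp command would go:

```shell
rm -f exp.pipe
mkfifo exp.pipe                         # named pipe ('mknod exp.pipe p' on old systems)
gzip < exp.pipe > expdat.dmp.gz &       # background reader compresses the stream
# exp scott/tiger FILE=exp.pipe ...     # real usage: point exp at the pipe
printf 'sample dump data\n' > exp.pipe  # stand-in writer, in place of exp
wait                                    # let gzip drain the pipe and exit
gunzip -c expdat.dmp.gz                 # prints: sample dump data
rm -f exp.pipe
```

Importing works the same way in reverse: gunzip into the pipe in the background, and point imp's FILE= parameter at the pipe.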
EXPORT:
Set the BUFFER parameter to a high value (e.g. 2 MB, entered as the integer 2000000)
Set the RECORDLENGTH parameter to a high value (e.g. 64 KB, entered as the integer 64000)
Use DIRECT=yes (direct mode export)
Stop unnecessary applications to free up resources for your job.
If you run multiple export sessions, ensure they write to different physical disks.
DO NOT export to an NFS-mounted filesystem. It will take forever.
IMPORT:
Create an index file so that you can create indexes AFTER you have imported the data. Do this by setting INDEXFILE to a filename and
then running the import. No data will be imported, but a file containing the index definitions will be created. You must edit this file
afterwards and supply the passwords for the schemas on all CONNECT statements.
Place the file to be imported on a physical disk separate from the Oracle datafiles.
Increase DB_CACHE_SIZE (DB_BLOCK_BUFFERS prior to 9i) considerably in the init$SID.ora file.
Set LOG_BUFFER to a big value and restart Oracle.
Stop redo log archiving if it is running (ALTER DATABASE NOARCHIVELOG;).
Create a BIG tablespace with a BIG rollback segment inside. Set all other rollback segments offline (except the SYSTEM rollback
segment, of course). The rollback segment should be about as big as your biggest table.
Use COMMIT=N in the import parameter file if you can afford it.
Use STATISTICS=NONE in the import parameter file to avoid the time-consuming import of statistics.
Remember to run the index file created earlier.
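The index-file workflow above can be sketched as three passes (file names are assumptions for illustration):

```
# Pass 1: write the index DDL to a file; no data is imported
imp scott/tiger FILE=scott.dmp FULL=y INDEXFILE=scott_indexes.sql

# Pass 2: load the data without building any indexes
imp scott/tiger FILE=scott.dmp FULL=y INDEXES=n

# Pass 3: after editing in the schema passwords, build the indexes
sqlplus scott/tiger @scott_indexes.sql
```

Deferring index creation this way avoids maintaining indexes row by row during the load, which is usually the slowest part of a large import.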
If exp and imp are used to move data between Oracle databases of different versions, then the following
rules apply:
Command-Line Method
exp FILE=scott.dmp OWNER=scott GRANTS=y ROWS=y COMPRESS=y
Export Messages
Information is displayed about the release of Export you are using and the release of Oracle Database that you are connected to. Then, status
messages similar to the following are shown:
.
.
. about to export SCOTT's tables via Conventional Path ...
. . exporting table                          BONUS          0 rows exported
. . exporting table                           DEPT          4 rows exported
. . exporting table                            EMP         14 rows exported
. . exporting table                       SALGRADE          5 rows exported
.
.
.
Export terminated successfully without warnings.