How to Import an Oracle Schema with Data Pump
Oracle Data Pump (impdp) imports logical exports created by expdp. Beyond a straight restore, it supports remapping schemas and tablespaces, making it the primary tool for database migrations, cloning environments, and selective data recovery. This guide covers the common import scenarios.
Prerequisites
You need a dump file created by expdp and a directory object pointing to the location containing that file.
```sql
-- Verify the directory object exists and points to the dump location
SELECT directory_name, directory_path
FROM dba_directories
WHERE directory_name = 'DP_DIR';
```
```sql
-- Grant access if importing as a non-DBA user (app_user is a placeholder name)
GRANT READ, WRITE ON DIRECTORY dp_dir TO app_user;
```
```sql
-- Check the target tablespaces exist and have space
SELECT tablespace_name,
       ROUND(SUM(bytes)/1024/1024/1024, 2) AS total_gb
FROM dba_data_files
GROUP BY tablespace_name
ORDER BY tablespace_name;
```

The importing user typically needs the IMP_FULL_DATABASE role for full database imports, or object-level privileges for schema imports.
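As a sketch, the setup for a non-DBA importing user might look like the following. The user name, password, and role choice are illustrative; adjust them to your security policy.

```sql
-- Illustrative privilege setup for a schema-level import (names are placeholders)
CREATE USER app_user IDENTIFIED BY "ChangeMe#123";
GRANT CREATE SESSION TO app_user;
GRANT READ, WRITE ON DIRECTORY dp_dir TO app_user;
-- DATAPUMP_IMP_FULL_DATABASE allows importing objects across schemas
GRANT DATAPUMP_IMP_FULL_DATABASE TO app_user;
```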
Step 1: Basic Schema Import
```shell
# Import a schema directly (same schema name, same tablespace names)
impdp system/manager@ORCL \
  SCHEMAS=app_schema \
  DIRECTORY=dp_dir \
  DUMPFILE=app_schema_%U.dmp \
  LOGFILE=app_schema_imp.log \
  PARALLEL=4
```

Step 2: Remap Schema (Import to Different Schema Name)
This is the most common migration pattern: import the source schema under a different name on the target.
```shell
# Remap source schema to a different target schema
impdp system/manager@ORCL \
  SCHEMAS=app_schema \
  DIRECTORY=dp_dir \
  DUMPFILE=app_schema_%U.dmp \
  LOGFILE=app_schema_imp.log \
  REMAP_SCHEMA=app_schema:app_schema_dev \
  PARALLEL=4
```
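Long impdp command lines are easy to mistype or mis-quote. Data Pump's standard PARFILE option lets you keep the parameters in a file instead; the sketch below mirrors the remap example above (the /tmp path is illustrative).

```shell
# Sketch: drive the same remap import from a parameter file
cat > /tmp/app_schema_imp.par <<'EOF'
SCHEMAS=app_schema
DIRECTORY=dp_dir
DUMPFILE=app_schema_%U.dmp
LOGFILE=app_schema_imp.log
REMAP_SCHEMA=app_schema:app_schema_dev
PARALLEL=4
EOF

# The import then shortens to (not run here):
# impdp system/manager@ORCL PARFILE=/tmp/app_schema_imp.par
echo "wrote $(wc -l < /tmp/app_schema_imp.par) parameters"
```

A parfile also avoids shell quoting problems with parameters like QUERY that contain embedded quotes.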
```shell
# Multiple schema remaps in one import
impdp system/manager@ORCL \
  FULL=Y \
  DIRECTORY=dp_dir \
  DUMPFILE=full_db_%U.dmp \
  LOGFILE=full_db_imp.log \
  REMAP_SCHEMA=prod_schema1:dev_schema1 \
  REMAP_SCHEMA=prod_schema2:dev_schema2 \
  PARALLEL=8
```

Step 3: Remap Tablespaces
When the target database has different tablespace names, use REMAP_TABLESPACE.
```shell
# Remap all objects from source tablespace to target tablespace
impdp system/manager@ORCL \
  SCHEMAS=app_schema \
  DIRECTORY=dp_dir \
  DUMPFILE=app_schema_%U.dmp \
  LOGFILE=app_schema_imp.log \
  REMAP_TABLESPACE=prod_data:dev_data \
  REMAP_TABLESPACE=prod_idx:dev_idx \
  PARALLEL=4
```

Step 4: Handle Object Conflicts
When objects already exist in the target, control how conflicts are resolved with TABLE_EXISTS_ACTION.
```shell
# SKIP — leave existing objects unchanged (default)
impdp system/manager@ORCL \
  SCHEMAS=app_schema \
  DIRECTORY=dp_dir \
  DUMPFILE=app_schema_%U.dmp \
  LOGFILE=app_schema_imp.log \
  TABLE_EXISTS_ACTION=SKIP
```

```shell
# REPLACE — drop and recreate existing tables with imported data
impdp system/manager@ORCL \
  SCHEMAS=app_schema \
  DIRECTORY=dp_dir \
  DUMPFILE=app_schema_%U.dmp \
  LOGFILE=app_schema_imp.log \
  TABLE_EXISTS_ACTION=REPLACE
```

```shell
# APPEND — keep existing rows, add imported rows
impdp system/manager@ORCL \
  SCHEMAS=app_schema \
  DIRECTORY=dp_dir \
  DUMPFILE=app_schema_%U.dmp \
  LOGFILE=app_schema_imp.log \
  TABLE_EXISTS_ACTION=APPEND
```

```shell
# TRUNCATE — delete existing rows, then import
impdp system/manager@ORCL \
  SCHEMAS=app_schema \
  DIRECTORY=dp_dir \
  DUMPFILE=app_schema_%U.dmp \
  LOGFILE=app_schema_imp.log \
  TABLE_EXISTS_ACTION=TRUNCATE
```

Step 5: Import Specific Tables Only
```shell
# Import only specific tables from a full or schema dump
impdp system/manager@ORCL \
  TABLES=app_schema.orders,app_schema.customers \
  DIRECTORY=dp_dir \
  DUMPFILE=app_schema_%U.dmp \
  LOGFILE=tables_imp.log \
  TABLE_EXISTS_ACTION=REPLACE
```

Step 6: Extract DDL Without Importing Data
Use SQLFILE to generate the DDL statements without loading any data, which is ideal for reviewing what will be imported.
```shell
# Extract DDL only to a SQL file
impdp system/manager@ORCL \
  SCHEMAS=app_schema \
  DIRECTORY=dp_dir \
  DUMPFILE=app_schema_%U.dmp \
  SQLFILE=dp_dir:app_schema_ddl.sql

# The SQLFILE output can be reviewed and run manually
# cat /u01/datapump/exports/app_schema_ddl.sql
```

Advanced Examples
Full Database Import with Remapping
```shell
# Full database import to a different server/database
impdp system/manager@TARGET_DB \
  FULL=Y \
  DIRECTORY=dp_dir \
  DUMPFILE=full_db_%U.dmp \
  LOGFILE=full_db_imp.log \
  REMAP_TABLESPACE=prod_data:dev_data \
  REMAP_TABLESPACE=prod_idx:dev_idx \
  EXCLUDE=STATISTICS \
  PARALLEL=8
```

Import Only Metadata (No Data)
```shell
# Import structure only — tables, indexes, procedures, views, no rows
impdp system/manager@ORCL \
  SCHEMAS=app_schema \
  DIRECTORY=dp_dir \
  DUMPFILE=app_schema_%U.dmp \
  LOGFILE=app_schema_meta_imp.log \
  CONTENT=METADATA_ONLY \
  PARALLEL=4
```

Import Only Data (No DDL)
```shell
# Import rows into existing tables, skip all DDL
impdp system/manager@ORCL \
  SCHEMAS=app_schema \
  DIRECTORY=dp_dir \
  DUMPFILE=app_schema_%U.dmp \
  LOGFILE=app_schema_data_imp.log \
  CONTENT=DATA_ONLY \
  TABLE_EXISTS_ACTION=APPEND \
  PARALLEL=4
```

Import with Row Filtering
```shell
# Import only rows matching a condition
# (the nested quoting passes WHERE status = 'COMPLETED' through the shell intact)
impdp system/manager@ORCL \
  TABLES=app_schema.orders \
  DIRECTORY=dp_dir \
  DUMPFILE=app_schema_%U.dmp \
  LOGFILE=orders_imp.log \
  QUERY=app_schema.orders:'"WHERE status = '"'"'COMPLETED'"'"'"'
```

Network Mode Import (No Dump File)
Network mode imports directly from a source database over a database link; no dump file is needed.
```sql
-- Create a database link to the source (run on the target database)
CREATE DATABASE LINK source_db
  CONNECT TO system IDENTIFIED BY password
  USING 'SOURCE_TNSNAME';
```

```shell
# Import directly from source database via database link
impdp system/manager@TARGET_DB \
  SCHEMAS=app_schema \
  NETWORK_LINK=source_db \
  REMAP_SCHEMA=app_schema:app_schema_copy \
  LOGFILE=dp_dir:network_import.log \
  PARALLEL=4
```
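Before starting a network import, it helps to confirm the link resolves from the target side. A quick sanity check (schema name is illustrative):

```sql
-- Confirm the database link works before starting the import
SELECT 1 FROM dual@source_db;

-- Optionally, count the source objects you expect to arrive
SELECT COUNT(*) AS src_objects
FROM all_objects@source_db
WHERE owner = 'APP_SCHEMA';
```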
Importing into the wrong container — In a CDB, always specify the PDB in the connect string: impdp system@pdb_dev, not impdp system@cdb_root.
Missing user or tablespace on target — If the schema or tablespace in the dump does not exist on target, import fails. Create the user and tablespace first, or use REMAP_SCHEMA and REMAP_TABLESPACE.
Not excluding statistics — Importing stale production statistics into a development database causes the optimizer to make poor choices. Use EXCLUDE=STATISTICS and gather fresh stats.
Forgetting TABLE_EXISTS_ACTION — The default is SKIP, which silently ignores tables that already exist. If you expect to replace data, you must explicitly set REPLACE, TRUNCATE, or APPEND.
Import fails on constraint violations with APPEND — When appending rows, foreign key constraints may fail if parent rows do not exist. Import in dependency order or defer constraints temporarily.
Not checking the log file — Data Pump considers an import successful even if some objects failed. Always review the log file for ORA- errors.
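Following up on the last point, a small shell sketch that fails loudly when a Data Pump log contains ORA- errors. The sample log written to /tmp is fabricated for demonstration; point the function at your real LOGFILE.

```shell
# check_imp_log: scan a Data Pump log for ORA- errors and the completion line
check_imp_log() {
  if grep -q 'ORA-' "$1" 2>/dev/null; then
    echo "errors found in $1:"
    grep 'ORA-' "$1"
    return 1
  fi
  if grep -qi 'successfully completed' "$1" 2>/dev/null; then
    echo "import completed cleanly"
    return 0
  fi
  echo "no completion line found; job may still be running or aborted"
  return 2
}

# Demo against a fabricated sample log (use your real LOGFILE path instead)
cat > /tmp/sample_imp.log <<'EOF'
Job "SYSTEM"."SYS_IMPORT_SCHEMA_01" successfully completed at Mon Jan 1 12:00:00 2024
EOF
check_imp_log /tmp/sample_imp.log
```

The function distinguishes three outcomes (errors, clean completion, no completion line) so it can gate a migration script on the exit code.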
Verification Queries
```shell
# Check the import job completed (look for "successfully completed")
grep -i 'completed\|error\|ORA-' /u01/datapump/exports/app_schema_imp.log
```
```sql
-- Verify tables were imported
SELECT table_name, num_rows, last_analyzed, status
FROM dba_tables
WHERE owner = 'APP_SCHEMA'
ORDER BY table_name;
```
```sql
-- Verify row counts match expectations
SELECT table_name,
       TO_NUMBER(EXTRACTVALUE(DBMS_XMLGEN.GETXMLTYPE(
         'SELECT COUNT(*) c FROM app_schema.' || table_name), '//c')) AS row_count
FROM dba_tables
WHERE owner = 'APP_SCHEMA'
  AND num_rows > 0
ORDER BY table_name;
```
```sql
-- Check for invalid objects after import
SELECT object_name, object_type, status
FROM dba_objects
WHERE owner = 'APP_SCHEMA'
  AND status = 'INVALID'
ORDER BY object_type, object_name;
```
```sql
-- Recompile any invalid objects
EXEC DBMS_UTILITY.COMPILE_SCHEMA(schema => 'APP_SCHEMA', compile_all => FALSE);
```

Related Topics
- How to Export an Oracle Schema with Data Pump - Create the dump file
- How to Create a Database Link - For network mode imports
- Data Pump Documentation - Full Data Pump reference
- How to Gather Statistics - Gather fresh stats after import