How to Use the Oracle 12c Data Pump with a Parameter File

By Chris Ruel, Michael Wessler

A parameter file is a plain-text file that lists the Oracle 12c Data Pump Export or Import parameters you want to use, along with the value you've chosen for each. Data Pump Export and Import parameter files are constructed the same way.

Follow these steps to run a Data Pump Export with this parameter file:

  1. Type the parameter file into a text editor and save it to a directory.

    This example parameter file performs a schema-mode export of HR and uses an INCLUDE filter to limit the job to the DEPARTMENTS and EMPLOYEES tables (the SCHEMAS and TABLES parameters select mutually exclusive modes, so you can't combine them in one job):

    # File: /u01/app/oracle/scripts/datapump/my_data_pump_parfile.par
    DIRECTORY=my_data_pump_dir
    DUMPFILE=my_data_pump_file.dmp
    LOGFILE=my_data_pump_logfile.log
    SCHEMAS=HR
    INCLUDE=TABLE:"IN ('EMPLOYEES', 'DEPARTMENTS')"
    COMPRESSION=ALL
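
    The DIRECTORY parameter names an Oracle directory object, not an operating-system path, so the object must exist (and you must have access to it) before the export runs. If it doesn't, a DBA can create it with something like this sketch (the filesystem path and the HR grantee are illustrations; use your own dump-file location and export user):

    -- Directory object pointing at the dump-file location on disk
    CREATE DIRECTORY my_data_pump_dir AS '/u01/app/oracle/dumpfiles';
    -- Let the exporting user read and write dump and log files there
    GRANT READ, WRITE ON DIRECTORY my_data_pump_dir TO hr;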
  2. Open a command-line prompt and go to the directory where your parameter file is saved.

  3. Launch Data Pump Export with your parameter file:

    expdp parfile=my_data_pump_parfile.par

    You should see this:

    Export: Release 12.1.0.1.0 - Production on Sat Jul 20 06:51:40 2013
    Copyright (c) 1982, 2013, Oracle and/or its affiliates. All rights reserved.
    Username:
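
    If you'd rather not wait for the prompt, you can put the username on the command line and Data Pump prompts only for the password; for example (SYSTEM here is just an illustration; use whichever account you export with):

    expdp system parfile=my_data_pump_parfile.par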
  4. Enter the username and password of the account you want to run the export as.

    You should see something like this:

    Connected to: Oracle Database 12c Enterprise Edition Release 12.1.0.1.0 - 64bit Production
    With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options
    Starting "SYS"."SYS_EXPORT_SCHEMA_01": /******** AS SYSDBA parfile=my_data_pump_parfile.par
    Estimate in progress using BLOCKS method...
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 128 KB
    ...output snipped...
    . . exported "HR"."DEPARTMENTS"             5.437 KB   27 rows
    . . exported "HR"."EMPLOYEES"              8.726 KB   107 rows
    Master table "SYS"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
    ******************************************************************************
    Dump file set for SYS.SYS_EXPORT_SCHEMA_01 is:
     /u01/app/oracle/dumpfiles/MY_DATA_PUMP_FILE.DMP
    Job "SYS"."SYS_EXPORT_SCHEMA_01" successfully completed at 06:52:25
  5. Create the new user and its tablespace.

    Make sure the new user has the same privileges as the user you exported from.
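
    A minimal sketch of this step, run as a DBA (the datafile path, size, and password are assumptions; adjust them for your environment):

    -- Tablespace that will hold the imported copy of the data
    CREATE TABLESPACE hr2_data
      DATAFILE '/u01/app/oracle/oradata/hr2_data01.dbf' SIZE 100M;

    -- New user that will own the imported tables
    CREATE USER hr2 IDENTIFIED BY "ChangeMe#12"
      DEFAULT TABLESPACE hr2_data
      QUOTA UNLIMITED ON hr2_data;

    GRANT CREATE SESSION, CREATE TABLE TO hr2;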

  6. Create a parameter file that imports the data into a new user in its own tablespace.

    In this example, HR2 is imported to its own tablespace, HR2_DATA.

    Because this export contains only part of the HR data model, you exclude constraints and triggers: they reference objects that aren't in the export dump file. You don't have to exclude them, but if you don't, Data Pump logs errors as it tries to create them.

    Such a parameter file might look like this:

    # File: /u01/app/oracle/scripts/datapump/my_HR2_data_pump_parfile.par
    DIRECTORY=my_data_pump_dir
    DUMPFILE=my_data_pump_file.dmp
    LOGFILE=my_HR2_data_pump_logfile.log
    EXCLUDE=CONSTRAINT
    EXCLUDE=TRIGGER
    REMAP_SCHEMA=HR:HR2
    REMAP_TABLESPACE=EXAMPLE:HR2_DATA
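
    The REMAP_TABLESPACE line maps the source tablespace (EXAMPLE, where the sample HR schema's tables are stored) to the new HR2_DATA tablespace. If you aren't sure which tablespace the source tables live in, a quick query as a DBA tells you:

    SELECT table_name, tablespace_name
      FROM dba_tables
     WHERE owner = 'HR';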
  7. Run the import:

    impdp parfile=my_hr2_data_pump_parfile.par

    You should see something like this:

    Import: Release 12.1.0.1.0 - Production on Sat Jul 20 07:00:17 2013
    Copyright (c) 1982, 2013, Oracle and/or its affiliates. All rights reserved.
    Username: / as sysdba
    Connected to: Oracle Database 12c Enterprise Edition Release 12.1.0.1.0 - 64bit Production
    With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options
    Master table "SYS"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
    Starting "SYS"."SYS_IMPORT_FULL_01": /******** AS SYSDBA parfile=my_hr2_data_pump_parfile.par
    Processing object type SCHEMA_EXPORT/USER
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    . . imported "HR2"."DEPARTMENTS"             5.437 KB   27 rows
    . . imported "HR2"."EMPLOYEES"              8.726 KB   107 rows
    ...output snipped...
    Job "SYS"."SYS_IMPORT_FULL_01" successfully completed at 07:02:53

By default, the log file is created in the same directory as your dump file. The log file is a text file that any text editor can read.

If you connect as a user other than the owner of the schema you're exporting or importing, you need one of these two things:

  • Database administrator (DBA) privileges

  • The DATAPUMP_EXP_FULL_DATABASE and DATAPUMP_IMP_FULL_DATABASE roles
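
For example, a DBA could grant both Data Pump roles to a hypothetical account named OPS_USER like this:

    GRANT DATAPUMP_EXP_FULL_DATABASE, DATAPUMP_IMP_FULL_DATABASE TO ops_user;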

If you’re working as an administrator in a container database (CDB), you also need the CDB_DBA role.