Basics of the Oracle 12c Data Pump

By Chris Ruel, Michael Wessler

Data Pump is modeled after Oracle's original Export/Import tools, which were the standard prior to Oracle 10g. Export/Import is still available, but Data Pump has taken over the tasks traditionally done by Export/Import and added many more options and much more flexibility.

Data Pump is useful for

  • Moving data from one schema to another

  • Moving data from one version of Oracle to another

  • Moving data from one OS to another

  • Creating logical backups

You can use Data Pump to transport an entire database from one instance to another. This capability includes new Oracle 12c pluggable databases as well as older nonpluggable databases. You can use this capability to move a database to a new server environment or to upgrade to a higher database version.
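For example, a full database move might look something like this. This is a minimal sketch; the connect strings, the directory object (dp_dir), and the file names are placeholders you'd replace with your own:

expdp system@prod full=y directory=dp_dir dumpfile=full.dmp logfile=full_exp.log
impdp system@dev full=y directory=dp_dir dumpfile=full.dmp logfile=full_imp.log

The first command exports the entire source database to a dump file; the second imports that dump file into the target database.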

You should be familiar with these files:

  • Dump file: This file is created by a Data Pump Export; it's also the input file when you do a Data Pump Import. It's binary, so you can't open it to see anything useful.

  • Parfile: This optional text file lists the parameters that control the Data Pump Import or Export. You create this file yourself; see the sample parfile after this list.

  • Log file: This output is written for all Data Pump Import and Export jobs unless you specify otherwise. You can name it yourself or let it have a default name. It's useful for getting job statistics and for troubleshooting.
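For example, a minimal parfile for an import might look like this (the directory object, file names, and schema here are assumptions you'd replace with your own):

SCHEMAS=scott
DIRECTORY=dp_dir
DUMPFILE=scott.dmp
LOGFILE=scott_imp.log

You'd save those lines in a text file such as scott_imp.par and point the client at it with impdp system parfile=scott_imp.par.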

You can interactively do these things with Data Pump jobs:

  • Start

  • Stop

  • Pause

  • Restart

  • Kill

The result is that you can start a job from the command line, detach from it to do something else (while it’s still running), and re-attach later to check progress or make changes.
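For example, re-attaching to a running import and managing it interactively might look like this. It's a sketch that assumes the job got the default name SYS_IMPORT_FULL_01:

impdp system attach=SYS_IMPORT_FULL_01
Import> status
Import> stop_job
Import> start_job

STATUS, STOP_JOB, START_JOB, and KILL_JOB are the standard interactive commands; EXIT_CLIENT detaches from the job while leaving it running.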

When a Data Pump job runs into a problem, it suspends itself automatically; that way, you have a chance to fix the problem before the job fails altogether. Prior to Data Pump, if an Export/Import job ran into a problem, it failed immediately, sometimes wasting hours of work.

The command-line program impdp launches Data Pump Import. From the OS command line, launch Data Pump Import and have it show a list of the parameters:

impdp help=y

You see something like this:

Import: Release 12.1.0.1.0 - Production on Sat Jul 20 06:54:52 2013
Copyright (c) 1982, 2013, Oracle and/or its affiliates. All rights reserved.
The Data Pump Import utility provides a mechanism for transferring data objects
between Oracle databases. The utility is invoked with the following command:
   Example: impdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp
You can control how Import runs by entering the 'impdp' command followed
by various parameters. To specify parameters, you use keywords:
   Format: impdp KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
   Example: impdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp
...output snipped...

Like Data Pump Export, Import lists the parameters that can be used with the import portion of Data Pump. Many of these parameters behave the same way they do when you’re using Data Pump Export.

Take a closer look at some of the Data Pump Import parameters:

  • CONTENT: If you have a full content export file, you can choose to import only the metadata (CONTENT=METADATA_ONLY). For example, you might want to create all the tables with no rows. Obviously, if you didn't include the rows in the export dump file, you can't tell Data Pump Import to put them in!

  • ESTIMATE: This parameter estimates the size of the Data Pump Import.

  • DIRECTORY: This one tells Data Pump Import where it can find the dump file. It doesn't have to be the same directory the file was exported to, but if it isn't, you must move the dump file there yourself. This parameter comes in handy when you're moving the file to another machine or OS.

  • DUMPFILE: This parameter lists all the files created by Data Pump Export that you want to import.

  • EXCLUDE: This parameter works much like its Data Pump Export counterpart, but it tells Data Pump Import which objects in the dump file to leave out.

  • INCLUDE: This parameter is another way of controlling what objects are put into the target database.

  • FLASHBACK_SCN, FLASHBACK_TIME: Use these parameters with the Data Pump Import tool only when connecting through a NETWORK_LINK. Data Pump Import can connect directly to a remote database across a database link and write the data directly into the target system. Use these parameters to pull data from the past.

  • NETWORK_LINK: You can connect from one database and import into another by setting up a database link and specifying it with this parameter. No files are created when this method is used. This parameter is very handy for logical recovery and cloning.

  • PARALLEL: This parameter sets the maximum number of worker processes for the job, which helps speed up your import.

  • REMAP_SCHEMA: This parameter is handy for copying the objects and data from one schema to another; see the example after this list.

  • REMAP_TABLESPACE: This parameter moves the objects into a new tablespace. By default, they go into the same tablespace they came from. It's useful in conjunction with REMAP_SCHEMA and when moving data from one database to another.

  • SCHEMAS: This parameter gives Data Pump a list of schemas and tells it what to import. By default, Data Pump imports everything in the file. In essence, you can have a full export but then pick and choose what you want to import.

  • TABLES: As with SCHEMAS, you can choose from your dump file what to import.

  • TABLESPACES: You can choose which tablespaces you want to import from the dump file.
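For example, the following sketch imports one schema from a dump file and remaps both the schema and its tablespace. The connect string, directory object, file, schema, and tablespace names are all placeholders:

impdp system directory=dp_dir dumpfile=scott.dmp logfile=scott_imp.log schemas=scott remap_schema=scott:scott_dev remap_tablespace=users:dev_data

Here, the objects owned by scott in the dump file are created under the scott_dev schema, and anything that lived in the users tablespace is created in dev_data instead.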