Export/Import DataPump Parameter VERSION - Compatibility of Data Pump Between Different Oracle Versions (Doc ID 553337.1)


In this Document

  Purpose
  Scope
  Details
  1. Summary
  2. Introduction
  3. Basic Data Pump Compatibility
  4. Export Data Pump Compatibility
  5. Import Data Pump Compatibility
  6. Database Migration with a Full Database Data Pump Export/Import
  7. Examples
  8. Possible Error Messages
  9. Known Defects
  @ 10. For Support: Enhancement Requests.
  Additional Resources
  References

 

APPLIES TO: 

Enterprise Manager for Oracle Database - Version 10.1.0.2 and later
Oracle Database Backup Service - Version N/A and later
Oracle Database Cloud Service - Version N/A and later
Oracle Database - Standard Edition - Version 10.1.0.2 and later
Oracle Database Exadata Cloud Machine - Version N/A and later
Information in this document applies to any platform.

PURPOSE

This article describes issues related to the compatibility of the different versions of the Oracle Export Data Pump and Import Data Pump utilities.

SCOPE

The article is intended for users of Oracle Database 10g and later releases who wish to use Export Data Pump (expdp) and Import Data Pump (impdp) to export data from an Oracle database release x and import this data into an Oracle database release y. The article explains how to create an Export Data Pump dumpfile that can be imported into a higher or lower release database, and how compatibility applies when Data Pump is used over a database link. 
For the compatibility of the original export and import clients (exp and imp, respectively), see: 
Note:132904.1 "Compatibility Matrix for Export & Import Between Different Oracle Versions"
For the transportable tablespaces export/import, see: 
Note:291024.1 "Compatibility and New Features when Transporting Tablespaces with Export and Import" 
In the text below, the following terminology is used: 

  • Source database = the database where the data is exported from. 
  • Target database = the database where the data is imported into. 
  • Database compatibility level = the value of init.ora/spfile parameter COMPATIBLE. 
  • Data Pump client version = the release version of the Data Pump client (software version). 
  • Database version = the release version (the first four digits) of the ORACLE_HOME (= version of database dictionary).

DETAILS

1. Summary

The following are the most important guidelines regarding Data Pump compatibility:

1. The compatibility level of the Data Pump dumpfile set is determined by the compatibility level of the source database.

2. Use the Export Data Pump client (expdp) that matches the version of the source database (up to one major version lower expdp client can be used, but this is not recommended).

3. Use the Import Data Pump client (impdp) that matches the version of the target database (up to one major version lower impdp client can be used, but this is not recommended).

4. Use the Export Data Pump parameter VERSION if the target database has a lower compatibility level than the source database (see the example after this list).

5. Transferring data over a database link is supported even if the compatibility level of the (remote) source database differs from the (local) connected database (up to one major version difference is supported).

6. Import Data Pump can always read Export Data Pump dumpfile sets created by older versions of the database.

7. Export dumpfiles created with the Export Data Pump client (expdp) cannot be read by the original Import client (imp).

8. Export dumpfiles created with the original Export client (exp) cannot be read by the Import Data Pump client (impdp).
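
The following is a minimal worked example of guidelines 1 to 4 above. It is only a sketch: the connect strings, directory object, schema name, and file names are placeholders that have to be adapted to your own environment.

-- Source database has COMPATIBLE=12.1.0, target database has COMPATIBLE=11.2.0
-- (check each database with: show parameter compatible).
-- Export with the expdp client that matches the source database, and set VERSION
-- to the compatibility level of the lower release target database:

%expdp system/<password>@<source_tns> SCHEMAS=<schema_name> DIRECTORY=dp_dir DUMPFILE=expdp_s.dmp LOGFILE=expdp_s.log VERSION=11.2

-- After transferring the dumpfile to a directory on the target server, import with
-- the impdp client that matches the target database; no VERSION parameter is needed:

%impdp system/<password>@<target_tns> SCHEMAS=<schema_name> DIRECTORY=dp_dir DUMPFILE=expdp_s.dmp LOGFILE=impdp_s.log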

Overview of Data Pump dumpfile compatibility. 

    Export              Use Export Data Pump parameter VERSION=...
      From                 if dumpfile needs to be imported into a
    Source                Target Database with compatibility level
  Database        (value of init.ora/spfile parameter COMPATIBLE):
      With
COMPATIBLE    10.1.0.x.y    10.2.0.x.y    11.1.0.x.y    11.2.0.x.y    12.1.0.x.y    12.2.0.x.y     18.x.y.z     19.x.y.z
---------- ------------- ------------- ------------- ------------- ------------- ------------- ------------ ------------
10.1.0.x.y             -             -             -             -             -             -            -            -
---------- ------------- ------------- ------------- ------------- ------------- ------------- ------------ ------------
10.2.0.x.y  VERSION=10.1             -             -             -             -             -            -            -
---------- ------------- ------------- ------------- ------------- ------------- ------------- ------------ ------------
11.1.0.x.y  VERSION=10.1  VERSION=10.2             -             -             -             -            -            -
---------- ------------- ------------- ------------- ------------- ------------- ------------- ------------ ------------
11.2.0.x.y  VERSION=10.1  VERSION=10.2  VERSION=11.1             -             -             -            -            -
---------- ------------- ------------- ------------- ------------- ------------- ------------- ------------ ------------
12.1.0.x.y  VERSION=10.1  VERSION=10.2  VERSION=11.1  VERSION=11.2             -             -            -            -
---------- ------------- ------------- ------------- ------------- ------------- ------------- ------------ ------------
12.2.0.x.y  VERSION=10.1  VERSION=10.2  VERSION=11.1  VERSION=11.2  VERSION=12.1             -            -            -
---------- ------------- ------------- ------------- ------------- ------------- ------------- ------------ ------------
18.x.y.z    VERSION=10.1  VERSION=10.2  VERSION=11.1  VERSION=11.2  VERSION=12.1  VERSION=12.2            -            -
---------- ------------- ------------- ------------- ------------- ------------- ------------- ------------ ------------
19.x.y.z    VERSION=10.1  VERSION=10.2  VERSION=11.1  VERSION=11.2  VERSION=12.1  VERSION=12.2 VERSION=18.x            -
---------- ------------- ------------- ------------- ------------- ------------- ------------- ------------ ------------

Overview of Data Pump client/server compatibility. 

Data Pump client compatibility.
===============================

   expdp and  Connecting to Database version
impdp client     10gR1      10gR2      11gR1      11gR2      12cR1      12cR2        18c        19c
     version  10.1.0.x   10.2.0.x   11.1.0.x   11.2.0.x   12.1.0.x   12.2.0.x   18.x.y.z   19.x.y.z
----------- ---------- ---------- ---------- ---------- ---------- ---------- ---------- ----------
   10.1.0.x  supported  supported  supported  supported         no         no         no         no
   10.2.0.x         no  supported  supported  supported  supported         no         no         no
   11.1.0.x         no         no  supported  supported  supported  supported         no         no
   11.2.0.x         no         no         no  supported  supported  supported  supported  supported
   12.1.0.x         no         no         no         no  supported  supported  supported  supported
   12.2.0.x         no         no         no         no         no  supported  supported  supported
   18.x.y.z         no         no         no         no         no  supported  supported  supported
   19.x.y.z         no         no         no         no         no         no         no  supported

For details about generic interoperability between Oracle client and server versions, see also:
Note:207303.1 "Client / Server / Interoperability Support Between Different Oracle Versions"

Overview of Data Pump dumpfile set file versions. 

Data Pump file version.
=======================

     Version      Written by     Can be imported into Target:
   Data Pump   database with      10gR1      10gR2      11gR1      11gR2      12cR1      12cR2        18c        19c
Dumpfile Set   compatibility   10.1.0.x   10.2.0.x   11.1.0.x   11.2.0.x   12.1.0.x   12.2.0.x   18.x.y.z   19.x.y.z
------------ --------------- ---------- ---------- ---------- ---------- ---------- ---------- ---------- ----------
         0.1          10.1.x  supported  supported  supported  supported  supported  supported  supported  supported
         1.1          10.2.x         no  supported  supported  supported  supported  supported  supported  supported
         2.1          11.1.x         no         no  supported  supported  supported  supported  supported  supported
         3.1          11.2.x         no         no         no  supported  supported  supported  supported  supported
         4.1          12.1.x         no         no         no         no  supported  supported  supported  supported
         5.1          12.2.x         no         no         no         no         no  supported  supported  supported
         5.1          18.x.y         no         no         no         no         no         no  supported  supported
         5.1          19.x.y         no         no         no         no         no         no         no  supported

Overview of Data Pump client/feature compatibility.

Data Pump New Features.
=======================

Version   New Feature                        Remark:
--------- ---------------------------------  ----------------------------------------------
10.1.0.1  Data Pump Technology               Introduction of high-speed data movement.
---------
10.2.0.1  COMPRESSION                        Compress metadata in export dumpfile.
10.2.0.1  ENCRYPTION_PASSWORD                Allows encrypted column data in dumpfile.
10.2.0.1  SAMPLE                             Specify a percentage of data to be unloaded.
10.2.0.1  TRANSFORM                          Change DDL for OID's and space allocation.
10.2.0.1  VERSION                            Create file compatible with earlier release.
---------
11.1.0.1  COMPRESSION                        Both data and metadata can be compressed.
11.1.0.1  DATA_OPTIONS (expdp)               Specify handling of XMLType columns.
11.1.0.1  DATA_OPTIONS (impdp)               Specify handling of constraint violations.
11.1.0.1  ENCRYPTION                         Both data and metadata can be encrypted.
11.1.0.1  ENCRYPTION_ALGORITHM               Specify a specific encryption algorithm.
11.1.0.1  ENCRYPTION_MODE                    Specify the type of security to be used.
11.1.0.1  PARTITION_OPTIONS                  Specify how to handle partitioned tables.
11.1.0.1  REMAP_DATA                         Change column values based on a function.
11.1.0.1  REMAP_TABLE                        Rename tables during an import operation.
11.1.0.1  REUSE_DUMPFILES                    Option to overwrite existing dumpfiles.
11.1.0.1  TRANSPORTABLE                      Transfer table data by copying datafiles.
---------
11.2.0.1                                     Legacy mode: accept exp and imp parameters.
11.2.0.1  ABORT_STEP                         Stop job after initializing to query master.
11.2.0.1  ACCESS_METHOD                      Specify a particular method to (un)load data.
11.2.0.1  CLUSTER                            Control whether workers use all RAC instances.
11.2.0.1  DATA_OPTIONS (impdp)               Specify to disable the APPEND hint.
11.2.0.1  KEEP_MASTER                        Specify whether to retain master table.
11.2.0.1  MASTER_ONLY                        Only import the master table.
11.2.0.1  METRICS                            Report additional information in logfile.
11.2.0.1  SERVICE_NAME                       Use with CLUSTER to specify a service name.
11.2.0.1  SOURCE_EDITION (expdp)             Specify the edition from which to exp objects.
11.2.0.1  TABLES                             Now specify tables in multiple schemas.
11.2.0.1  TABLES                             Now specify % for multiple tables and part.
11.2.0.1  TARGET_EDITION (impdp)             Specify the edition into which to import objects.
11.2.0.2                                     Default first segment now 8M for part. table.
11.2.0.2  TRANSFORM                          Specify how to handle SEGMENT CREATION.
---------
12.1.0.1                                     Support for CDB and Pluggable db's (PDB).
12.1.0.1  COMPRESSION_ALGORITHM              Specify the algorithm when compressing data.
12.1.0.1  ENCRYPTION_PWD_PROMPT              Specify whether Data Pump prompts for the encryption password.
12.1.0.1  FULL                               Can now be used together with TRANSPORTABLE.
12.1.0.1  LOGTIME                            Provide timestamp for messages in the logfile.
12.1.0.1  TRANSFORM                          Specify to disable logging during import.
12.1.0.1  TRANSFORM                          Specify to change the LOB storing on import.
12.1.0.1  TRANSFORM                          Specify to change table compression type.
12.1.0.1  VIEWS_AS_TABLES                    Export views as tables.
---------
12.2.0.1  REMAP_DIRECTORY                    Remap directories when moving databases between platforms.
12.2.0.1  TRUST_EXISTING_TABLE_PARTITIONS    Enable data from multiple partitions to be loaded in parallel into a pre-existing table. This is a flag on Data Pump Import
                                             DATA_OPTIONS
12.2.0.1  VALIDATE_TABLE_DATA                Verify the format number and date data types in table data columns. This is a flag on Data Pump Import DATA_OPTIONS
12.2.0.1  ENABLE_NETWORK_COMPRESSION         Tell Data Pump to compress data before sending it over the network. This is a flag on DATA_OPTIONS parameter.
12.2.0.1  GROUP_PARTITION_TABLE_DATA         Enable data for all partitions to be loaded at once and in parallel. This is a flag on the Data Pump Export DATA_OPTIONS parameter.
12.2.0.1  VERIFY_STREAM_FORMAT               Validate the format of a data stream before it is written to the Data Pump dump file. This is a flag on the Data Pump Export 
                                             DATA_OPTIONS parameter.
---------
18.1.X.X  CONTINUE_LOAD_ON_FORMAT_ERROR      This is a new value for the DATA_OPTIONS parameter for impdp. When it is set, Data Pump jumps ahead and continues loading from the
                                             next granule when an inconsistency is found.
---------
19c       See Document 2457955.1             Data Pump new features in 19c are described in that document.

2. Introduction

2.1. Data Pump.
With Oracle10g Release 1 (10.1.0.x), we introduced the new Export Data Pump (expdp) and Import Data Pump (impdp) utilities. These utilities offer better performance and increased flexibility compared to the original export (exp) and import (imp) clients. All Data Pump Export and Import processing, including the reading and writing of dump files, is done on the system (server) selected by the specified database connect string.
With the Oracle Export Data Pump and Import Data Pump utilities you can transfer data objects between Oracle databases, even if they reside on platforms with different hardware and software configurations. With Export Data Pump the object definitions and table data can be extracted from an Oracle database and stored in an Oracle binary-format export Data Pump dumpfile located on the server.
The dumpfile set can then be transferred using FTP in binary mode or physically transported to a different site. With the Import Data Pump client the object definitions and table data can be read from the dumpfile set and then inserted into the Oracle database.


2.2. How to determine the database compatibility level?
To determine the compatibility level of the database, run the following in SQL*Plus:

col value for a20
col description for a49
select * from database_compatible_level;
col value clear
col description clear

-- Sample output:

VALUE                DESCRIPTION
-------------------- -------------------------------------------------
12.1.0.0.0           Database will be completely compatible with this
                     software version

-- or:

show parameter compatible

NAME                  TYPE        VALUE
--------------------- ----------- -------------
compatible            string      12.1.0.0.0

When checking for compatibility, Data Pump examines the major database release number and the database maintenance release number (i.e.: 12.1 in the example output above).
For details about version numbers, see also:
Note:39691.1 "VERSION NUMBER - Oracle version numbers explained"


2.3. How to determine the software version of the database and the Data Pump client?
The software version of the database can be obtained by querying the view V$VERSION. The software version of the Export Data Pump and Import Data Pump clients can be obtained by invoking the utilities with the parameter HELP=Y; the banner of the help page shows the version of the utility:

SELECT * FROM v$version;

BANNER                                                                            CON_ID
----------------------------------------------------------------------------- ----------
Oracle Database 12c Enterprise Edition Release 12.1.0.1.0 - 64bit Production           0
PL/SQL Release 12.1.0.1.0 - Production                                                 0
CORE    12.1.0.1.0      Production                                                     0
TNS for Linux: Version 12.1.0.1.0 - Production                                         0
NLSRTL Version 12.1.0.1.0 - Production                                                 0

-- And for the Data Pump clients:

%expdp HELP=Y

Export: Release 12.1.0.1.0 - Production on Wed May 28 09:58:07 2014
Copyright (c) 1982, 2013, Oracle and/or its affiliates.  All rights reserved.
... (other output)

and:

%impdp HELP=Y

Import: Release 12.1.0.1.0 - Production on Wed May 28 09:59:53 2014
Copyright (c) 1982, 2013, Oracle and/or its affiliates.  All rights reserved.
... (other output)

For details, see also: 
Note:175627.1 "Export-Import iSR - How to Find the Database and Export/Import Version"


2.4. How to obtain the version of an export Data Pump dumpfile?
Every export dumpfile starts with a header (usually 4 kb in size) which contains details about the dumpfile. Starting with Oracle10g Release 2 (10.2.0.1.0), this header information can be extracted from the dumpfile by calling the procedure DBMS_DATAPUMP.GET_DUMPFILE_INFO. 
Possible output based on an example procedure SHOW_DUMPFILE_INFO:

SET serveroutput on SIZE 1000000
exec show_dumpfile_info(p_dir=> 'my_dir', p_file=> 'expdp_f.dmp')

----------------------------------------------------------------------------
Purpose..: Obtain details about export dumpfile. Version: 18-DEC-2013
Required.: RDBMS version: 10.2.0.1.0 or higher
. Export dumpfile version: 7.3.4.0.0 or higher
. Export Data Pump dumpfile version: 10.1.0.1.0 or higher
Usage....: execute show_dumfile_info('DIRECTORY', 'DUMPFILE');
Example..: exec show_dumfile_info('MY_DIR', 'expdp_s.dmp')
----------------------------------------------------------------------------
Filename.: ftest.dmp
Directory: dpdir
Disk Path: <DIRECTORY_PATH>/expdp
Filetype.: 1 (Export Data Pump dumpfile)
----------------------------------------------------------------------------
...Database Job Version..........: 12.02.00.00.00
...Internal Dump File Version....: 5.1
...Creation Date.................: Tue Jun 06 11:04:57 2017
...File Number (in dump file set): 1
...Master Present in dump file...: 1 (Yes)
...Master in how many dump files.: 1
...Master Piece Number in file...: 1
...Operating System of source db.: x86_64/Linux 2.4.xx
...Instance Name of source db....: db4
...Characterset ID of source db..: 873 (AL32UTF8)
...Language Name of characterset.: AL32UTF8
...Job Name......................: "<SCHEMA_NAME>"."SYS_EXPORT_FULL_01"
...GUID (unique job identifier)..: 51470FA2BE241FE4E0539395400A4A02
...Block size dump file (bytes)..: 4096
...Metadata Compressed...........: 1 (Yes)
...Data Compressed...............: 0 (No)
...Compression Algorithm.........: 3 (Basic)
...Metadata Encrypted............: 0 (No)
...Table Data Encrypted..........: 0 (No)
...Column Data Encrypted.........: 0 (No)
...Encryption Mode...............: 2 (None)
...Internal Flag Values..........: 514
...Max Items Code (Info Items)...: 23
----------------------------------------------------------------------------

PL/SQL procedure successfully completed.

In the example above, the dumpfile set has the compatibility level 12.2 (12.02.00.00.00) and is using a file version format 5.1. This dumpfile set can only be imported into a database with the same or a higher compatibility level. 
For details, and the example procedure SHOW_DUMPFILE_INFO which can be called to obtain the export dumpfile header details, see: 
Note:462488.1 "How to Gather the Header Information and the Content of an Export Dumpfile ?"
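
As an alternative to the example procedure SHOW_DUMPFILE_INFO, the header items can also be retrieved directly with a small anonymous PL/SQL block. The block below is only a sketch: the directory object and dumpfile name are placeholders, and the numeric item codes that it prints are decoded in Note:462488.1.

SET serveroutput ON
DECLARE
  v_info     sys.ku$_dumpfile_info;   -- nested table that receives the header items
  v_filetype NUMBER;                  -- 1 = Export Data Pump dumpfile
BEGIN
  -- positional parameters: filename, directory object, info table, filetype
  DBMS_DATAPUMP.GET_DUMPFILE_INFO('expdp_s.dmp', 'MY_DIR', v_info, v_filetype);
  DBMS_OUTPUT.PUT_LINE('Filetype: '||v_filetype);
  FOR i IN 1 .. v_info.COUNT LOOP
    DBMS_OUTPUT.PUT_LINE('Item code '||v_info(i).item_code||': '||v_info(i).value);
  END LOOP;
END;
/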

3. Basic Data Pump Compatibility

3.1. Data Pump dumpfile file version versus Data Pump job version.
When an Export Data Pump dumpfile set is created, we store information in the header block of each dumpfile. Besides the version of the Data Pump job, we also store the version of the dumpfile. The version of the dumpfile determines the internal structure of the dumpfile:

  • In Oracle10g Release 1 we create an Export Data Pump dumpfile with version 0.1.
  • In Oracle10g Release 2 we create an Export Data Pump dumpfile with version 1.1.
  • In Oracle11g Release 1 we create an Export Data Pump dumpfile with version 2.1.
  • In Oracle11g Release 2 we create an Export Data Pump dumpfile with version 3.1.
  • In Oracle12c Release 1 we create an Export Data Pump dumpfile with version 4.1.
  • In Oracle12c Release 2 we create an Export Data Pump dumpfile with version 5.1.
  • In Oracle18c we create an Export Data Pump dumpfile with version 5.1.
  • In Oracle19c we create an Export Data Pump dumpfile with version 5.1.

 

Note that the version of Data Pump dumpfile set is used internally to keep track of Data Pump specific features that result in a change of the internal structure of the dumpfile.
It is not possible to create a dumpfile with a higher version, e.g.: an 11.2.0.4.0 source database cannot create a version 4.1 dumpfile because that dumpfile version has a structure that was introduced in Oracle12c. It is possible though to create a dumpfile with a lower version, e.g.: an 11.2.0.4.0 source database can also create a version 1.1 dumpfile by specifying the Export Data Pump parameter VERSION (see section 4.2. "Export Data Pump parameter: VERSION" below). The reason for doing this is to make the dumpfile set compatible with the lower release target database so that it can be imported there.

With the VERSION parameter you can specify the Data Pump job version and indirectly control the version of the dumpfile set. E.g.: if you specify VERSION=11.2 then Data Pump will create a dumpfile set that can be imported into an Oracle11g Release 2 database (dumpfile version: 3.1). When specified, you also determine which version of the objects will be exported. Database objects or attributes that are incompatible with the specified version will not be exported.
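
A brief sketch of this (connect string, directory object, schema name, and file names are placeholders): a schema export taken on, for example, a 12.1.0.x source database with VERSION=11.2 produces a dumpfile with file version 3.1.

%expdp system/<password>@<source_tns> SCHEMAS=<schema_name> DIRECTORY=dp_dir DUMPFILE=expdp_v112.dmp LOGFILE=expdp_v112.log VERSION=11.2

-- The header of the resulting dumpfile can be inspected as shown in section 2.4;
-- the internal dump file version should be reported as 3.1.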

During the initial stage of an Import Data Pump job, the header block of each file in the dumpfile set is examined. We compare the value of the dumpfile version to the server's current file_version to make sure that the version of the dumpfile is equal to or less than that of the server.

  • Oracle10g Release 1 can import from a Data Pump dumpfile set with the file version 0.1.
  • Oracle10g Release 2 can import from a Data Pump dumpfile set with the file version 0.1 or 1.1.
  • Oracle11g Release 1 can import from a Data Pump dumpfile set with the file version 0.1 or 1.1 or 2.1.
  • Oracle11g Release 2 can import from a Data Pump dumpfile set with the file version 0.1 or 1.1 or 2.1 or 3.1.
  • Oracle12c Release 1 can import from a Data Pump dumpfile set with the file version 0.1 or 1.1 or 2.1 or 3.1 or 4.1.
  • Oracle12c Release 2 can import from any Data Pump dumpfile set, except dump files created by an 18.x or 19.x database.
  • Oracle18c can import from any Data Pump dumpfile set, except dump files created by a 19.x database.
  • Oracle19c can import from any Data Pump dumpfile set.

An overview of the Data Pump dumpfile versions:

     Version      Written by     Can be imported into Target:
   Data Pump   database with      10gR1      10gR2      11gR1      11gR2      12cR1      12cR2        18c        19c
Dumpfile Set   compatibility   10.1.0.x   10.2.0.x   11.1.0.x   11.2.0.x   12.1.0.x   12.2.0.x   18.x.y.z   19.x.y.z
------------ --------------- ---------- ---------- ---------- ---------- ---------- ---------- ---------- ----------
         0.1          10.1.x  supported  supported  supported  supported  supported  supported  supported  supported
         1.1          10.2.x         no  supported  supported  supported  supported  supported  supported  supported
         2.1          11.1.x         no         no  supported  supported  supported  supported  supported  supported
         3.1          11.2.x         no         no         no  supported  supported  supported  supported  supported
         4.1          12.1.x         no         no         no         no  supported  supported  supported  supported
         5.1          12.2.x         no         no         no         no         no  supported  supported  supported
         5.1          18.x.y         no         no         no         no         no         no  supported  supported
         5.1          19.x.y         no         no         no         no         no         no         no  supported

Remark: 
Due to a regression in DataPump after installation of 12.1.0.2.180417DBBP or 12.1.0.2.180717DBBP, the dump file version written to the external table dump file in SQL mode will be 4.2, instead of 4.1. More details and the solution to fix the 12.1.0.2 DataPump regression can be found from Document 2422236.1 - Alert - Regression in DataPump After Applying 12.1.0.2.180417DBBP or 12.1.0.2.180717DBBP.

3.2. Data Pump server versus Data Pump clients.
Data Pump is server based and not client based. This means that most Data Pump specific defects will be fixed on the server side (changes in packages in source and target database) and not on the client side (Export or Import Data Pump client). If a Data Pump defect that occurs during an import job is fixed in a later patchset (e.g. 11.2.0.4.0) and the target database is still on the base release (11.2.0.1.0) then the defect will still occur when importing with a 11.2.0.4.0 Import Data Pump client into this 11.2.0.1.0 target database. The same applies to export jobs. It is therefore recommended that both the source database and the target database have the latest patchset installed.


3.3. Data Pump dumpfiles versus original export dumpfiles.

Data Pump versus original export/import clients:

1. Export dumpfiles created with the Export Data Pump client (expdp) can only be read by the Import Data Pump client (impdp).

2. Export dumpfiles created with the original Export client (exp) cannot be read by the Import Data Pump (impdp) client.

For details about the compatibility of the original export and import clients (exp and imp, respectively), see: 
Note:132904.1 "Compatibility Matrix for Export & Import Between Different Oracle Versions"

4. Export Data Pump Compatibility

4.1. Export Data Pump Compatibility Details.

Export Data Pump compatibility:

1. The compatibility level of the Data Pump dumpfile set is determined by the compatibility level of the source database.

2. When exporting data, use the same version Export Data Pump client as the version of the source database (up to one major version lower expdp client is possible, but this is not recommended).

3. If the dumpfile needs to be imported into a lower version target database, use the Export Data Pump parameter VERSION to match the compatibility level of the target database.

Client/server compatibility:

Export Data   Connecting to Source Database:
Pump client      10gR1      10gR2      11gR1      11gR2      12cR1      12cR2        18c        19c
    version   10.1.0.x   10.2.0.x   11.1.0.x   11.2.0.x   12.1.0.x   12.2.0.x   18.x.y.z   19.x.y.z
----------- ---------- ---------- ---------- ---------- ---------- ---------- ---------- ----------
   10.1.0.x  supported  supported  supported  supported         no         no         no         no
   10.2.0.x         no  supported  supported  supported  supported         no         no         no
   11.1.0.x         no         no  supported  supported  supported  supported         no         no
   11.2.0.x         no         no         no  supported  supported  supported  supported  supported
   12.1.0.x         no         no         no         no  supported  supported  supported  supported
   12.2.0.x         no         no         no         no         no  supported  supported  supported
   18.x.y.z         no         no         no         no         no  supported  supported  supported
   19.x.y.z         no         no         no         no         no         no         no  supported

Remarks:
1. Connecting with a lower version Export Data Pump client (e.g. 10.2.0.4.0) to a higher version source database (e.g. 11.2.0.4.0) is possible, but not recommended.

  • We support a compatibility between the database and the Data Pump client where the database may have a higher version with a maximum difference of one major database release. This means that any 10.x Data Pump client can start a Data Pump job on any higher release database (up to 11.x.). For example, you can use a 10.1.0.2.0 Export Data Pump client to start an export Data Pump job on an 11.2.0.4.0 database.
  • If the source database had COMPATIBLE=11.1.0, then the resulting dumpfile set will have the same compatibility level as the source database, even if the Export Data Pump job was started with a 10.1.0.2.0 or 10.2.0.4.0 Export Data Pump client.
  • It is not recommended to connect with a lower version Export Data Pump client (e.g. 10.2.0.4.0) to a higher version source database (e.g. 11.2.0.4.0).
    • If you are connecting with an older Export Data Pump client, certain Export Data Pump features may be unavailable. For example, if you connect with a 10.2.0.4.0 Export Data Pump client to an 11.2.0.4.0 database, an error will occur if you have also specified COMPRESSION=all because the value 'all' from the COMPRESSION parameter is not recognized by the 10.2.0.4.0 Export Data Pump client.
    • Certain Data Pump client specific defects may not have been fixed in the earlier release of the Export Data Pump client. Note however, that most Data Pump defects are fixed on the server side (source database) and not on the client side (export client).

2. Connecting with a higher version Export Data Pump client (e.g. 11.2.0.4.0) to a lower version source database (e.g. 10.2.0.4.0) is not supported. Attempting to run such a job will fail with UDE-18 (Data Pump client is incompatible with database version 11.2.0.4.0).

  • As an exception to the above: within the same maintenance release (e.g. 10.2.0), it is possible to connect with an Export Data Pump client of a later patchset (e.g. 10.2.0.3.0) to a lower version source database (e.g. 10.2.0.1.0).
  • Note that Data Pump is server based and not client based. This means that most Data Pump specific defects are fixed on the server side (source database) and not on the client side (export client). If a Data Pump defect is fixed in a later patchset (e.g. 10.2.0.4.0) and the source database is still on the base release (10.2.0.1.0) then the defect will still occur when exporting with a 10.2.0.4.0 Export Data Pump client from this 10.2.0.1.0 source database.
  • Beware that there is a known defect when exporting with a 10.2.0.4.0 Export Data Pump client from a 10.2.0.3.0 or lower release database. For details, see also section 9.4. "Bug:7489698 - Schema Export with 10.2.0.4.0 expdp client from 10.2.0.3.0 or lower source fails". 

3. In a downgrade scenario, perform the Export Data Pump job with the same version Export Data Pump client as the version of the source database (recommended), and specify the VERSION parameter which is set to the lower compatibility level of the target database.

  • If the source database already had the same compatibility level as the target database (e.g.: an 11.1.0.6.0 source database with COMPATIBLE=10.2.0, and a 10.2.0.4.0 target database with COMPATIBLE=10.2.0), then it is not required to specify the parameter VERSION=10.2.0 for the Export Data Pump job. However, to avoid confusion we recommend using the VERSION parameter whenever the target database has a lower compatibility or release number.
  • See also section 4.2. "Export Data Pump parameter: VERSION" below.

4. If the source database is a read-only database, then an Export Data Pump job that is started on that database will fail because Data Pump cannot create its so-called Master table. To work around this, consider the read-only source database as a remote database and start the export Data Pump job on a different local database which has a database link to the remote read-only source database. Specify the NETWORK_LINK parameter when connecting with the Export Data Pump client to that different local database. The data from the remote read-only source database instance is transferred over the database link and written to a dump file set on the connected (local) database instance. See also section 4.3. "Export Data Pump NETWORK_LINK compatibility." below.


4.2. Export Data Pump parameter: VERSION
With the Export Data Pump parameter VERSION, you can control the version of the dumpfile set, and make the set compatible to be imported into a database with a lower compatibility level. Note that this does not mean that the dumpfile set can be used with versions of Oracle Database prior to 10.1.0, because Export and Import Data Pump only work with Oracle Database 10g release 1 (10.1) or later. 
When specified, internal Data Pump components like the Metadata Application Program Interface (API) make sure that the metadata objects are compatible with the specified version. As a result, you can determine which version of the objects will be exported. Database objects or attributes that are incompatible with the specified version will not be exported. For example, tables containing new datatypes that are not supported in the specified version will not be exported.

Syntax Export Data Pump parameter: VERSION

VERSION={COMPATIBLE | LATEST | version_string}

COMPATIBLE = The version of the metadata corresponds to the database compatibility level. Database compatibility must be set to 9.2 or higher. This is the default value.
LATEST = The version of the metadata corresponds to the database release version.
version_string = A specific database version (e.g.: 11.1.0).

Remarks: 
1. The minimum value of 'version_string' that can be specified is: 9.2 (e.g. if the dumpfile set needs to be imported into a 10.1.0.5.0 database which has COMPATIBLE=9.2 in the init.ora).

2. The maximum value of 'version_string' that can be specified is equal to the release version of the source database. Note that this would be similar to specifying: VERSION=LATEST

3. If you specify a VERSION for an Export Data Pump job that is older than the current database version, certain Export Data Pump features may be unavailable. For example, specifying VERSION=10.1 will cause an error if data compression is also specified for the job (e.g.: COMPRESSION=all), because compression was not supported in 10.1. See also section 8.6. "ORA-39055 (The AAA feature is not supported in version xx.yy.zz)" below.

4. If you specify a VERSION for an Export Data Pump job that is older than the source database version, then a dumpfile set is created that you can import into that older version target database. However, this dumpfile set will not contain any objects that the older specified version does not support. For example, if you export from a version 10.2 database in order to import into a version 10.1 database (i.e. by specifying VERSION=10.1), comments on indextypes will not be exported into the dumpfile set.

5. For an overview of the new database features introduced, see: 

  • Manual Part No. B10763-01 "Oracle Database Upgrade Guide 10g Release 1 (10.1)", Chapter 5 "Compatibility and Interoperability", Section "Compatibility and Interoperability Issues Introduced in Oracle Database 10g Release 10.1"
  • Manual Part No. B14238-01 "Oracle Database Upgrade Guide 10g Release 2 (10.2)", Chapter 5 "Compatibility and Interoperability", Section "Compatibility and Interoperability Issues Introduced in Oracle Database 10g Release 10.2"
  • Manual Part No. B28300-02 "Oracle Database Upgrade Guide 11g Release 1 (11.1)", Chapter 5 "Compatibility and Interoperability", Section "Compatibility and Interoperability Issues Introduced in 11g Release 1 (11.1)"

Example:
To create an Export Data Pump dumpfile set from a 11.1.0.6.0 source database with default compatibility level (COMPATIBLE=11.1.0.0.0) and which needs to be imported into a 10.2.0.4.0 target database that has a default compatibility level (COMPATIBLE=10.2.0.4.0), run the export Data Pump job with the 11.1.0.6.0 (or 10.2.0.4.0) Export Data Pump client that connects to the 11.1.0.6.0 source database and provide the Export Data Pump parameter: VERSION=10.2.0.4 (or VERSION=10.2 which means the same).
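
A command-line sketch of this example (connect strings, directory object, schema name, and file names are placeholders):

-- Run with the 11.1.0.6.0 expdp client against the 11.1.0.6.0 source database:

%expdp system/<password>@<source_tns> SCHEMAS=<schema_name> DIRECTORY=dp_dir DUMPFILE=expdp_s.dmp LOGFILE=expdp_s.log VERSION=10.2

-- The resulting dumpfile (file version 1.1) can then be imported into the
-- 10.2.0.4.0 target database with the 10.2.0.4.0 impdp client:

%impdp system/<password>@<target_tns> SCHEMAS=<schema_name> DIRECTORY=dp_dir DUMPFILE=expdp_s.dmp LOGFILE=impdp_s.log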


4.3. Export Data Pump NETWORK_LINK compatibility. 
Export Data Pump can be started with the NETWORK_LINK parameter which enables an export from a remote (source) database that is identified by a valid database link. The data from the source database instance is transferred over the database link and written to a dump file set on the connected (local) database instance. 
The compatibility rules are similar to those of the standard Export Data Pump compatibility:

Export Data Pump NETWORK_LINK compatibility: 

1. The compatibility level of the Data Pump dumpfile set is determined by the compatibility level of the (remote) source database or the (local) connected database, whichever is the lowest. 

2. When exporting data, use the same version Export Data Pump client as the version of the (local) connected database (up to one major version lower expdp client can be used, but this is not recommended). 

3. The compatibility level of the (remote) source database can differ from the (local) connected database.

4. If the dumpfile needs to be imported into a lower version target database, use the Export Data Pump parameter VERSION to match the compatibility level of the target database.

Remarks: 
1. See section 4.1. "Export Data Pump Compatibility Details" above for the standard Export Data Pump compatibility details.

2. Exporting data over a database link that connects to a remote source database with a lower compatibility level is supported.

  • The matrix from Document 207303.1 - Client / Server Interoperability Support Matrix for Different Oracle Versions summarizes client and server combinations that are supported for the most commonly used product versions.
    For a NETWORK_LINK between different Oracle versions, connections must be supported in BOTH directions. Example: because 11.2 -> 10.1 is not supported, database links between these versions are not supported in either direction.
  • It is not required to specify the VERSION parameter for such an Export Data Pump job, as we automatically use the lowest COMPATIBLE value of the two databases involved, which in this configuration is the (remote) source database. For example, if the remote source database has COMPATIBLE=10.1.0, the Data Pump dumpfile will have compatibility level 10.1 (dumpfile version: 0.1).
  • If the remote source database has a lower compatibility level, certain Export Data Pump features may be unavailable. For example, if the remote source database has COMPATIBLE=10.1.0 then an error will be reported if data compression is also specified for the job (e.g.: COMPRESSION=all), because compression was not supported in 10.1.

3. Exporting data over a database link that connects to a remote source database with a higher compatibility level is also supported.

  • The matrix from Document 207303.1 - Client / Server Interoperability Support Matrix for Different Oracle Versions summarizes client and server combinations that are supported for the most commonly used product versions.
    For a NETWORK_LINK between different Oracle versions, connections must be supported in BOTH directions. Example: because 11.2 -> 10.1 is not supported, database links between these versions are not supported in either direction.
  • It is not required to specify the VERSION parameter for such an Export Data Pump job, as we automatically use the lowest COMPATIBLE value of the two databases involved, which in this configuration is the (local) connected database. For example, if the local connected database has COMPATIBLE=10.2.0, the Data Pump dumpfile will have compatibility level 10.2 (dumpfile version: 1.1).
  • Due to several defects, Export Data Pump jobs with the NETWORK_LINK parameter fail to export from a remote source database with a higher compatibility level. These defects are fixed in 10.2.0.3.0 and higher; as a result, it is possible to export data over a database link that connects to a remote source database with a higher compatibility level if the release version of the local database is 10.2.0.3.0 or higher. In addition, the VERSION parameter has to be specified to work around two of these defects. Trying to export data over a database link that connects to a remote source database with a higher compatibility level while the release version of the local database is 10.2.0.2.0 or lower will fail due to these defects. See also section 8.9. "ORA-39022 (Database version xx.yy.zz is not supported)" below.

4. When you perform an export over a database link, the data from the remote source database instance is written to dump files on the connected local database instance. The remote source database can be a read-only database.

5. Export Data Pump supports the following types of database links: public, fixed-user, and connected-user. Current-user database links are not supported.

6. If an export operation is performed over an unencrypted network link, then all data is transferred over the database link without any encryption, even if the data was encrypted in the remote source database.

Restrictions:
1. Tables with a LONG column are not supported in an Export Data Pump job with NETWORK_LINK parameter. An ORA-31679 error will be generated and the export will move on to the next table. See also section 8.11. "ORA-31679 (Table data object "aaa"."bbb" has long columns, and longs can not be loaded/unloaded using a network link)" below.

2. Tables with object_type columns are not supported in an Export Data Pump job with NETWORK_LINK parameter. An ORA-22804 error will be generated and the export will move on to the next table. See also section 8.12. "ORA-22804 (remote operations not permitted on object tables or user-defined type columns)" below.

3. When both the NETWORK_LINK and the TABLES parameters are used, then only whole tables can be exported (not partitions of tables). Trying to export a partition of a table over a database link will fail with ORA-39203 (Partition selection is not supported over a network link) or in older releases with ORA-14100 (partition extended table name cannot refer to a remote object). See also section 8.13. "ORA-39203 (Partition selection is not supported over a network link)" below.

Example: 
An Export Data Pump client (11.1.0.6.0) connects to a local 11.1.0.6.0 database (with COMPATIBLE=11.1.0) with the NETWORK_LINK parameter. The referred database link connects to a remote 10.2.0.4.0 database with COMPATIBLE=10.2.0.4.0 (i.e.: the source database). The Export Data Pump dumpfile set will be written to the server where the 11.1.0.6.0 database is installed, and will have a file version 1.1 and compatibility level of 10.2.0.4.0. 
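
A sketch of this configuration (the database link name, directory object, schema name, and file names are placeholders; the database link must resolve to the remote 10.2.0.4.0 source database):

-- On the local 11.1.0.6.0 database, create a database link to the remote source:

SQL> CREATE DATABASE LINK source_link CONNECT TO <remote_user> IDENTIFIED BY <password> USING '<remote_tns>';

-- Start the export against the local database; the dumpfile is written to a
-- directory object on the local server:

%expdp system/<password>@<local_tns> SCHEMAS=<schema_name> NETWORK_LINK=source_link DIRECTORY=dp_dir DUMPFILE=expdp_net.dmp LOGFILE=expdp_net.log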

Note: Starting with 11.2, it is possible to perform a network mode import of an individual partition with data pump.

5. Import Data Pump Compatibility

5.1. Import Data Pump Compatibility Details.

Import Data Pump compatibility:

1. When importing data, use the same version Import Data Pump client as the version of the target database (up to one major version lower impdp client is possible, but this is not recommended).

2. Import Data Pump can always read Export Data Pump dumpfile sets created by older versions of the database.

3. Import Data Pump cannot import dumpfile sets if the compatibility level of the dumpfile set is higher than the compatibility level of the target database.

Client/server compatibility:

Import Data   Connecting to Target Database:
Pump client      10gR1      10gR2      11gR1      11gR2      12cR1      12cR2        18c        19c
    version   10.1.0.x   10.2.0.x   11.1.0.x   11.2.0.x   12.1.0.x   12.2.0.x   18.x.y.z   19.x.y.z
----------- ---------- ---------- ---------- ---------- ---------- ---------- ---------- ----------
   10.1.0.x  supported  supported  supported  supported         no         no         no         no
   10.2.0.x         no  supported  supported  supported  supported         no         no         no
   11.1.0.x         no         no  supported  supported  supported  supported         no         no
   11.2.0.x         no         no         no  supported  supported  supported  supported  supported
   12.1.0.x         no         no         no         no  supported  supported  supported  supported
   12.2.0.x         no         no         no         no         no  supported  supported  supported
   18.x.y.z         no         no         no         no         no  supported  supported  supported
   19.x.y.z         no         no         no         no         no         no         no  supported

Remarks:
1. Connecting with a lower version Import Data Pump client (e.g. 10.2.0.4.0) to a higher version target database (e.g. 11.1.0.6.0) is possible, but not recommended.

  • We support a compatibility between the database and the Data Pump client where the database may have a higher version with a maximum difference of one major database release. This means that any 10.x Data Pump client can start a Data Pump job on any higher release database (up to 11.x.). For example, you can use a 10.1.0.2.0 Import Data Pump client to start an import Data Pump job on an 11.1.0.6.0 database.
  • It is not recommended to connect with a lower version Import Data Pump client (e.g. 10.2.0.4.0) to a higher version target database (e.g. 11.1.0.6.0).
    • If you are connecting with an older Import Data Pump client, certain Import Data Pump features may be unavailable. For example, if you connect with a 10.1.0.5.0 Import Data Pump client to a 10.2.0.4.0 database, an error will occur if TRANSFORM is also specified for the job (e.g.: TRANSFORM=oid:n), because the value 'oid' for the TRANSFORM parameter is not recognized by the 10.1.0.5.0 Import Data Pump client.
    • Certain Data Pump client specific defects may not have been fixed in the earlier release of the Import Data Pump client. Note however, that most Data Pump defects are fixed on the server side (target database) and not on the client side (import client).

2. Connecting with a higher version Import Data Pump client (e.g. 11.1.0.6.0) to a lower version target database (e.g. 10.2.0.3.0) is not supported. Attempting to run such a job will fail with UDI-18 (Data Pump client is incompatible with database version 10.2.0.4.0).

  • As an exception to the above: within the same maintenance release (e.g. 10.2.0), it is possible to connect with an Import Data Pump client of a later patchset (e.g. 10.2.0.3.0) to a lower version target database (e.g. 10.2.0.1.0).
  • Note that Data Pump is server based and not client based. This means that most Data Pump specific defects are fixed on the server side (target database) and not on the client side (import client). If a Data Pump defect is fixed in a later patchset (e.g. 10.2.0.4.0) and the target database is still on the base release (10.2.0.1.0) then the defect will still occur when importing with a 10.2.0.4.0 Import Data Pump client into this 10.2.0.1.0 target database.
  • Beware that there is a known defect when exporting with a 10.2.0.4.0 Export Data Pump client from a 10.2.0.3.0 or lower release database. For details, see also section 9.4. "Bug:7489698 - Schema Export with 10.2.0.4.0 expdp client from 10.2.0.3.0 or lower source fails". 

3. Import Data Pump can always import from Export Data Pump dumpfile sets created by older versions of the database. I.e.: an export Data Pump dumpfile set created on 10.1.0.2.0 can be imported into any higher release database.

4. Import Data Pump cannot import dumpfile sets with a higher compatibility level than the target database. I.e.: an export Data Pump dumpfile created with VERSION=COMPATIBLE (default) from a 11.1.0.6.0 source database with COMPATIBLE=11.1.0 can never be imported into a 10.2.0.4.0 target database.

  • As an exception to the above: new features are usually only introduced with a new main release (like Oracle10g Release 2 (10.2.0.1.0)), and not with a patchset (like 10.2.0.4.0 versus the base release of Oracle10g Release 2 (10.2.0.1.0)). Therefore it is possible to import a dumpfile created by any patchset of a main release into any other patch level of that same main release (e.g.: import a 10.2.0.4.0 Export Data Pump dumpfile set into a 10.2.0.1.0 target database without specifying VERSION=10.2.0.1.0 for the Export Data Pump job).

An overview of the Data Pump dumpfile versions:

     Version      Written by     Can be imported into Target:
   Data Pump   database with      10gR1      10gR2      11gR1      11gR2      12cR1      12cR2        18c        19c
Dumpfile Set   compatibility   10.1.0.x   10.2.0.x   11.1.0.x   11.2.0.x   12.1.0.x   12.2.0.x   18.x.y.z   19.x.y.z
------------ --------------- ---------- ---------- ---------- ---------- ---------- ---------- ---------- ----------
         0.1          10.1.x  supported  supported  supported  supported  supported  supported  supported  supported
         1.1          10.2.x         no  supported  supported  supported  supported  supported  supported  supported
         2.1          11.1.x         no         no  supported  supported  supported  supported  supported  supported
         3.1          11.2.x         no         no         no  supported  supported  supported  supported  supported
         4.1          12.1.x         no         no         no         no  supported  supported  supported  supported
         5.1          12.2.x         no         no         no         no         no  supported  supported  supported
         5.1          18.x.y         no         no         no         no         no         no  supported  supported
         5.1          19.x.y         no         no         no         no         no         no         no  supported


5.2. Import Data Pump parameter: VERSION
With the Import Data Pump parameter VERSION, you can control the version of the objects that are loaded into the target database. When specified, internal Data Pump components like the Metadata Application Program Interface (API) make sure that the metadata objects are compatible with the specified version. As a result, you can determine which version of the objects will be imported: database objects or attributes that are incompatible with the specified version will not be transferred to the target database.

Syntax Import Data Pump parameter: VERSION 

VERSION={COMPATIBLE | LATEST | version_string} 

COMPATIBLE = The version of the metadata corresponds to the database compatibility level. Database compatibility must be set to 9.2 or higher. This is the default value. 
LATEST = The version of the metadata corresponds to the database release version. 
version_string = A specific database version (e.g.: 11.1.0).

Remarks:
1. It is important to realize that the Import Data Pump parameter VERSION cannot be used to allow a higher version dumpfile to be imported into a lower release database. If the dumpfile needs to be imported into a lower release target database, you have to use the Export Data Pump parameter VERSION to match the compatibility level of the target database. For example, an 11.1.0.6.0 dumpfile from an 11.1.0.6.0 source database with COMPATIBLE=11.1.0 has the file version 2.1. This dumpfile cannot be imported into a 10.2.0.x target database, not even if Import Data Pump is started with VERSION=10.2.0. If the Export Data Pump job had been run with VERSION=10.2.0, then the dumpfile could be imported into the 10.2.0.x target database, because that dumpfile would already have the compatibility level 10.2.0 and the corresponding file version 1.1.

2. The minimum value of 'version_string' that can be specified is: 9.2. 

3. The maximum value of 'version_string' that can be specified is equal to the release version of the target database. Note that this would be similar to specifying: VERSION=LATEST

4. If you specify a VERSION for an Import Data Pump job that is older than the version of the dumpfile set, certain Import Data Pump features may be unavailable. For example, specifying VERSION=10.1 will cause an error if transform is also specified for the job (e.g.: TRANSFORM=oid:n), because transforming the Object Identifiers (OID's) of types was not supported in 10.1. See also section 8.6. "ORA-39055 (The AAA feature is not supported in version xx.yy.zz)" below. 

5. If you specify a VERSION for an Import Data Pump job that is older than the target database version, then Data Pump will skip the import of any objects from the dumpfile that the older specified VERSION does not support. For example, if you export from a version 10.2 database, and import the data back into another 10.2 database while specifying VERSION=10.1 then comments on indextypes will not be imported into the target database.
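
A brief sketch of remark 5 (connect string, directory object, schema name, and dumpfile name are placeholders):

-- Import a 10.2 dumpfile into a 10.2 target database, but restrict the job to
-- objects supported in 10.1; objects that 10.1 does not support (such as the
-- comments on indextypes mentioned above) are skipped:

%impdp system/<password>@<target_tns> SCHEMAS=<schema_name> DIRECTORY=dp_dir DUMPFILE=expdp_s.dmp LOGFILE=impdp_s.log VERSION=10.1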


5.3. Import Data Pump NETWORK_LINK compatibility.
Import Data Pump can be started with the NETWORK_LINK parameter, which enables an import into the connected local target database while extracting the data from a remote source database that is identified by a valid database link. This means that the system to which the Import Data Pump client is connected contacts the remote source database referenced by the NETWORK_LINK parameter. SQL is used directly to move the data with an INSERT ... SELECT statement: the SELECT clause retrieves the data from the remote database over the network link, and the INSERT clause inserts the data into the target database. There are no dump files involved.
The compatibility rules are similar to those of the standard Import Data Pump compatibility:

Import Data Pump NETWORK_LINK compatibility:

1. The compatibility level of the imported objects is determined by the compatibility level of the (remote) source database or the (local) target database, whichever is the lowest.

2. When importing data, use the same version Import Data Pump client as the version of the (local) target database (up to one major version lower impdp client can be used, but this is not recommended). 

3. The compatibility level of the (remote) source database can differ from the (local) target database. 

4. Use the Import Data Pump parameter VERSION if a lower version of the objects needs to be exported and imported.

Remarks:
1. See section 5.1. "Import Data Pump Compatibility Details" above for the standard Import Data Pump compatibility details.

2. Importing data over a database link that connects to a remote source database with a lower compatibility level is supported.

  • The matrix from Document 207303.1 - Client / Server Interoperability Support Matrix for Different Oracle Versions summarizes client and server combinations that are supported for the most commonly used product versions. 
    For a NETWORK_LINK between different Oracle versions, connections must be supported in BOTH directions. Example: because 11.2 -> 10.1 is not supported, database links between these versions are not supported in either direction.
  • It is not required to specify the VERSION parameter for such an Import Data Pump job, as we automatically use the lowest COMPATIBLE value of the two databases involved, which in this configuration is the (remote) source database. For example, if the remote source database has COMPATIBLE=10.1.0, the objects transferred over the database link will have version 10.1.
  • If the remote source database has a lower compatibility level, certain Import Data Pump features may be unavailable. For example, if the remote source database has COMPATIBLE=10.1.0 then an error will be reported if TRANSFORM is also specified for the job (e.g.: TRANSFORM=oid:n), because transforming the Object Identifiers (OID's) of types was not supported in 10.1.

3. Importing data over a database link that connects to a remote source database with a higher compatibility level is also supported.

  • The matrix from Document 207303.1 - Client / Server Interoperability Support Matrix for Different Oracle Versions summarizes client and server combinations that are supported for the most commonly used product versions.
    For a NETWORK_LINK between different Oracle versions, connections must be supported in BOTH directions. Example: as 11.2 -> 10.1 is not supported, database links between these versions are not supported in either direction.
  • It is not required to specify the VERSION parameter for such an Import Data Pump job: Data Pump automatically uses the lowest COMPATIBLE value of the two databases involved, which in this configuration is that of the (local) target database. For example, if the local target database has COMPATIBLE=10.2.0, the objects are transferred over the database link as version 10.2.
  • Due to several defects, Import Data Pump jobs with the NETWORK_LINK parameter fail when importing from a remote source database with a higher compatibility level. These defects are fixed in 10.2.0.3.0 and higher; as a result, it is possible to import data over a database link that connects to a remote source database with a higher compatibility level if the release version of the local target database is 10.2.0.3.0 or higher. In addition, the VERSION parameter has to be specified to work around two of these defects (a minimal sketch follows below). Trying to import data over a database link that connects to a remote source database with a higher compatibility level while the local target database is 10.2.0.2.0 or lower will fail due to these defects. See also section 8.9. "ORA-39022 (Database version xx.yy.zz is not supported)" below.
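
As a sketch of the workaround mentioned in the last bullet (all names are placeholders), a network-mode import from a remote source database with a higher compatibility level could be started on a 10.2.0.3.0 or higher local target database as follows:

-- Local target database is 10.2.0.3.0 or higher with COMPATIBLE=10.2.0;
-- the database link points to the higher release remote source database;
-- VERSION is specified explicitly to work around the defects mentioned above:

% impdp <LOGIN>/<PASSWORD> LOGFILE=my_dir:impdp_s.log \
SCHEMAS=<SCHEMA_NAME> NETWORK_LINK=my_dblink VERSION=10.2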

4. If the USERID that is executing the Import Data Pump job has the IMP_FULL_DATABASE role on the local target database, then that user must also have the EXP_FULL_DATABASE role on the remote source database.
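
A minimal sketch of the grants described in this remark (the user name is a placeholder; the grants must be issued by a suitably privileged user such as SYS or SYSTEM):

-- On the local target database:
GRANT IMP_FULL_DATABASE TO <USER_NAME>;

-- On the remote source database (the same user, as described in the remark above):
GRANT EXP_FULL_DATABASE TO <USER_NAME>;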

5. If the remote source database is read-only, then the connected user must have a locally managed tablespace assigned as the default temporary tablespace on the remote source database. Otherwise, the job will fail.

6. Import Data Pump supports the following types of database links: public, fixed-user, and connected-user. Current-user database links are not supported.

7. If an import operation is performed over an unencrypted network link, then all data is transferred over the database link without any encryption, even if the data was encrypted in the remote source database.

Restrictions:
1. Tables with a LONG column are not supported in an Import Data Pump job with the NETWORK_LINK parameter. An ORA-31679 error will be generated and the import will move on to the next table (a dictionary query to identify such tables in advance is sketched after this list). See also section 8.11. "ORA-31679 (Table data object "aaa"."bbb" has long columns, and longs can not be loaded/unloaded using a network link)" below.

2. Tables with evolved object_type columns are not supported in an Import Data Pump job with NETWORK_LINK and TABLE_EXISTS_ACTION parameters (i.e.: in case those tables and types already exist in the target database). An ORA-22804 error will be generated and the import will move on to the next table. See also section 8.12. "ORA-22804 (remote operations not permitted on object tables or user-defined type columns)" below.

3. When both the NETWORK_LINK and the TABLES parameters are used, then only whole tables can be imported (not partitions of tables). Trying to import a partition of a table over a database link will fail with ORA-39203 (Partition selection is not supported over a network link) or in older releases with ORA-14100 (partition extended table name cannot refer to a remote object). The only exception to this is if TRANSPORTABLE=ALWAYS is also specified, in which case single or multiple partitions of a specified table can be imported. See also section 8.13. "ORA-39203 (Partition selection is not supported over a network link)" below.
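
As a sketch of how restriction 1 can be checked in advance (assuming access to the DBA_TAB_COLUMNS dictionary view; the schema name is a placeholder), the following query on the remote source database lists the tables with a LONG column that would be skipped:

-- Run on the (remote) source database:
SELECT owner, table_name, column_name
  FROM dba_tab_columns
 WHERE data_type = 'LONG'
   AND owner = '<SCHEMA_NAME>'
 ORDER BY owner, table_name;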

6. Database Migration with a Full Database Data Pump Export/Import

6.1. Upgrade Method with Data Pump.
When upgrading or migrating a database, you can use the 'Database Upgrade Assistant', perform a manual upgrade, or use the Export Data Pump and Import Data Pump utilities. The Oracle Data Pump utilities physically copy the data from the source database to a new target database. Be aware that this also means that the time requirements for upgrading a database with Export/Import are higher than for the other upgrade methods (i.e.: it is the slowest migration method). A minimal full database export/import is sketched after the list below. If you use the Data Pump utilities, you may gain the following benefits:

  • defragments the data (you can compress the imported data to improve performance);
  • restructures the database (you can create new tablespaces or modify existing tables, tablespaces, or partitions to be populated by imported data);
  • facilitates side-by-side testing of the old and new versions of Oracle Database because an entirely new database is created;
  • enables the copying of specified database objects or users (you can import only the objects, users, and other items that you wish; this is useful for establishing a test environment for the new software on only a subset of the production data);
  • serves as a backup archive (you can use a full database export as an archive of the current database);
  • allows the new database to be established on an Operating System or hardware platform that is different from that supporting the database being upgraded;
  • network-based Data Pump Import allows the new Oracle database to be directly loaded across the network from the old database being upgraded (no intervening dump files are required).
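
A minimal sketch of such a full database migration with a dumpfile set (the directory, file and login names are placeholders; for these releases a full export/import requires the EXP_FULL_DATABASE role for the export and the IMP_FULL_DATABASE role for the import, see Note:351598.1):

-- Step 1: full export on the source database:

% expdp <LOGIN>/<PASSWORD> DIRECTORY=my_dir DUMPFILE=expdp_full.dmp \
LOGFILE=expdp_full.log FULL=y

-- Step 2: transfer the dumpfile set to the target server

-- Step 3: full import into the newly created target database:

% impdp <LOGIN>/<PASSWORD> DIRECTORY=my_dir DUMPFILE=expdp_full.dmp \
LOGFILE=impdp_full.log FULL=y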


6.2. Patchsets.
When migrating a database with a full database Export/Import (Data Pump), we recommend that both the original source database and the new target database have the latest patchset installed before the full database Export Data Pump job and the full database Import Data Pump job, respectively, are started.
This means:
- for Oracle10g Release 1 source or target database: have the 10.1.0.5.0 patchset installed (see: Patch:4505133);
- for Oracle10g Release 2 source or target database: have the 10.2.0.3.0 patchset installed (see: Patch:5337014) or the 10.2.0.4.0 patchset (see: Patch:6810189);
- for Oracle11g Release 1 source or target database: have the 11.1.0.7.0 patchset installed (see: Patch:6890831);
- for Oracle11g Release 2 source or target database: have the 11.2.0.4.0 patchset installed (see: Patch:13390677);
- for Oracle12c Release 1 source or target database: have the 12.1.0.2.0 patchset installed (see: Patch:17694377).

For details about the latest patchsets and an overview of known issues, see also:
Note:263719.1 "ALERT: Oracle 10g Release 1 (10.1) Support Status and Alerts" 
Note:316900.1 "ALERT: Oracle 10g Release 2 (10.2) Support Status and Alerts" 
Note:454507.1 "ALERT: Oracle 11g Release 1 (11.1) Support Status and Alerts"
Note 880782.1 "ALERT: Oracle 11g Release 2 (11.2) Support Status and Alerts"
Note 1565065.1 "ALERT: Oracle 12c Release 1 (12.1) Support Status and Alerts"
Note 2239821.1 "ALERT: Oracle 12c Release 2 (12.2) Support Status and Alerts"


6.3. Migration to Oracle10g Release 2 (10.2.0.x.0). 
A direct migration with a full database Export Data Pump job and full database Import Data Pump job is supported from the following versions of the source database.

Migration with Data Pump to 10gR2 (10.2.0.x.0) if source is:
============================================================
Oracle10g Release 1: 10.1.0.2.0 (10.1.0.5.0 recommended)

Note that we also recommend that the latest patchset is installed on the target database. For the 10.2.0.3.0 patchset see: Patch:5337014 and for the 10.2.0.4.0 patchset see: Patch:6810189.


6.4. Migration to Oracle11g Release 1 (11.1.0.x.0).
A direct migration with a full database Export Data Pump job and full database Import Data Pump job is supported from the following versions of the source database.

Migration with Data Pump to 11gR1 (11.1.0.x.0) if source is:
============================================================
Oracle10g Release 1: 10.1.0.2.0 (10.1.0.5.0 recommended)
Oracle10g Release 2: 10.2.0.1.0 (10.2.0.4.0 recommended)

7. Examples

 

7.1. Import data into a target database with a higher compatibility level.

Sample configuration 1:

Source database: 10.1.0.5.0 with COMPATIBLE=10.1.0 
Target database: 10.2.0.4.0 with COMPATIBLE=10.2.0

Solution: 

  • Start the export job using the 10.1.0.5.0 Export Data Pump client that connects to the 10.1.0.5.0 source database.
  • Transfer the dumpfile set to the server where the target database is located.
  • Start the import of the data using the 10.2.0.4.0 Import Data Pump client that connects to the 10.2.0.4.0 target database.
-- Step 1: export from 10.1.0.5 source database 
-- with 10.1.0.5 Export Data Pump client:

% expdp <LOGIN>/<PASSWORD> DIRECTORY=my_dir DUMPFILE=expdp_s.dmp \
LOGFILE=expdp_s.log SCHEMAS=<SCHEMA_NAME>

-- Step 2: transfer dumpfile to target server

-- Step 3: import into 10.2.0.4 target database  
-- with 10.2.0.4 Import Data Pump client: 

% impdp <LOGIN>/<PASSWORD> DIRECTORY=my_dir DUMPFILE=expdp_s.dmp \ 
LOGFILE=impdp_s.log SCHEMAS=<SCHEMA_NAME>

Remarks:
1. Instead of the 10.1.0.5.0 Export Data Pump client, any other lower release Export Data Pump client can be used to start the export job, e.g. a 10.1.0.2.0 Export Data Pump client; however, this is not recommended. The dumpfile set has the same compatibility level as the COMPATIBLE value of the source database (10.1.0) and can be imported into any Oracle10g or Oracle11g database with a COMPATIBLE value of 10.1.0 or higher. Likewise, instead of the 10.2.0.4.0 Import Data Pump client, any other lower release Import Data Pump client can be used to start the import job, e.g. a 10.1.0.5.0 Import Data Pump client that connects to the 10.2.0.4.0 target database; this is also not recommended.

2. Note that an alternative solution would be to create a database link (e.g.: my_dblink) from the 10.2.0.4.0 target database to the 10.1.0.5.0 source database and start the export job using the 10.2.0.4.0 Export Data Pump client that connects to the 10.2.0.4.0 target database with the parameter NETWORK_LINK=my_dblink (the VERSION parameter is not required here, as it defaults to the lowest compatibility level of the two databases involved, i.e. 10.1). The dumpfile will be written on the server where the target database is located and can be imported with the 10.2.0.4.0 Import Data Pump client that connects to the 10.2.0.4.0 target database. Example:

-- Step 1: create database link on local 10.2.0.4 target
-- database that connects to remote 10.1.0.5 source database:

CONNECT <LOGIN>/<PASSWORD>
CREATE DATABASE LINK my_dblink 
   CONNECT TO <USER_NAME> IDENTIFIED BY <PASSWORD>
   USING '<HOST_NAME>';

-- Step 2: export with 10.2.0.4 Export Data Pump client
-- from 10.1.0.5 source database over database link: 

% expdp <LOGIN>/<PASSWORD> DIRECTORY=my_dir DUMPFILE=expdp_s.dmp \ 
LOGFILE=expdp_s.log SCHEMAS=<SCHEMA_NAME> NETWORK_LINK=my_dblink

-- Step 3: import into 10.2.0.4 target database   
-- with 10.2.0.4 Import Data Pump client:  

% impdp <LOGIN>/<PASSWORD> DIRECTORY=my_dir DUMPFILE=expdp_s.dmp \  
LOGFILE=impdp_s.log SCHEMAS=<SCHEMA_NAME>

3. It is also possible to import without any dumpfile set: create a database link (e.g.: my_dblink) from the 10.2.0.4.0 target database to the 10.1.0.5.0 source database and start the import job using the 10.2.0.4.0 Import Data Pump client that connects to the 10.2.0.4.0 target database with the parameter NETWORK_LINK=my_dblink (the VERSION parameter is not required here, as it defaults to the lowest compatibility level of the two databases involved, i.e. 10.1). The data will be imported directly into the 10.2.0.4.0 target database. Example:

-- Step 1: create database link on local 10.2.0.4 target 
-- database that connects to remote 10.1.0.5 source database: 

CONNECT <LOGIN>/<PASSWORD> 
CREATE DATABASE LINK my_dblink  
   CONNECT TO <USERNAME> IDENTIFIED BY <PASSWORD> 
   USING '<HOSTNAME>'; 

-- Step 2: import with 10.2.0.4 Import Data Pump client 
-- from 10.1.0.5 source database over database link:  

% impdp <LOGIN>/<PASSWORD> LOGFILE=my_dir:impdp_s.log \
SCHEMAS=<SCHEMA_NAME> NETWORK_LINK=my_dblink


7.2. Import data into a target database with a lower compatibility level.

Sample configuration 2: 

Source database: 11.1.0.6.0 with COMPATIBLE=11.1.0 
Target database: 10.2.0.4.0 with COMPATIBLE=10.2.0

Solution:

  • Start the export job using the 11.1.0.6.0 Export Data Pump client that connects to the 11.1.0.6.0 source database. Use the parameter VERSION=10.2 to create a dumpfile set with a lower compatibility level.
  • Transfer the dumpfile set to the server where the target database is located.
  • Start the import of the data using the 10.2.0.4.0 Import Data Pump client that connects to the 10.2.0.4.0 target database.
-- Step 1: export from 11.1.0.6 source database  
-- with 11.1.0.6 Export Data Pump client and specify VERSION: 

% expdp <LOGIN>/<PASSWORD> DIRECTORY=my_dir DUMPFILE=expdp_s.dmp \ 
LOGFILE=expdp_s.log SCHEMAS=<SCHEMA_NAME> VERSION=10.2

-- Step 2: transfer dumpfile to target server 

-- Step 3: import into 10.2.0.4 target database   
-- with 10.2.0.4 Import Data Pump client:  

% impdp <LOGIN>/<PASSWORD> DIRECTORY=my_dir DUMPFILE=expdp_s.dmp \  
LOGFILE=impdp_s.log SCHEMAS=<SCHEMA_NAME>

Remarks:
1. Instead of the 11.1.0.6.0 Export Data Pump client, any other lower release Export Data Pump client can be used to start the export job, e.g. a 10.1.0.2.0 Export Data Pump client; however, this is not recommended. The dumpfile set has the same compatibility level as the value specified with the VERSION parameter (10.2) and can be imported into any Oracle10g or Oracle11g database with a COMPATIBLE value of 10.2.0 or higher. Likewise, instead of the 10.2.0.4.0 Import Data Pump client, any other lower release Import Data Pump client can be used to start the import job, e.g. a 10.1.0.5.0 Import Data Pump client that connects to the 10.2.0.4.0 target database; this is also not recommended.

2. Note that an alternative solution would be to create a database link (e.g.: my_dblink) from the 10.2.0.4.0 target database to the 11.1.0.6.0 source database and start the export job using the 10.2.0.4.0 Export Data Pump client that connects to the 10.2.0.4.0 target database with the parameters NETWORK_LINK=my_dblink VERSION=10.2. The VERSION parameter is required here to work around two defects; see also section 8.9. "ORA-39022 (Database version xx.yy.zz is not supported)" below. The dumpfile will be written on the server where the target database is located and can be imported with the 10.2.0.4.0 Import Data Pump client that connects to the 10.2.0.4.0 target database. See the previous section for a detailed example of the statements.

3. It is also possible to import without any dumpfile set: create a database link (e.g.: my_dblink) from the 10.2.0.4.0 target database to the 11.1.0.6.0 source database and start the import job using the 10.2.0.4.0 Import Data Pump client that connects to the 10.2.0.4.0 target database with the parameters NETWORK_LINK=my_dblink VERSION=10.2. The VERSION parameter is required here to work around two defects; see also section 8.9. "ORA-39022 (Database version xx.yy.zz is not supported)" below. The data will be imported directly into the 10.2.0.4.0 target database. See the previous section for a detailed example of the statements.

4. Note that due to several defects, a Data Pump job with the NETWORK_LINK parameter will fail if the remote source database has a higher compatibility level. These defects are fixed in 10.2.0.3.0 and higher; as a result, it is only possible to transfer data over a database link that connects to a remote source database with a higher compatibility level if the release version of the local database is 10.2.0.3.0 or higher. In addition, the VERSION parameter has to be specified to work around two of these defects. See also section 8.9. "ORA-39022 (Database version xx.yy.zz is not supported)" below.


7.3. Export data from a read-only database.

Sample configuration 3: 

Source database: 10.2.0.4.0 with COMPATIBLE=10.2.0 which is READ-ONLY
Target database: 10.1.0.5.0 with COMPATIBLE=10.1.0 
Local database : 11.1.0.6.0 with COMPATIBLE=11.1.0

Solution:

  • Create a database link (e.g.: my_dblink) from the local 11.1.0.6.0 database to the 10.2.0.4.0 read-only source database.
  • Start the export job using the 11.1.0.6.0 Export Data Pump client that connects to the 11.1.0.6.0 local database. Use the parameters NETWORK_LINK=my_dblink VERSION=10.1 to create a dumpfile set with a lower compatibility level.
  • Transfer the dumpfile set to the server where the target database is located.
  • Start the import of the data using the 10.1.0.5.0 Import Data Pump client that connects to the 10.1.0.5.0 target database.
-- Step 1: create database link on local 11.1.0.6 database
-- that connects to remote 10.2.0.4 read-only source database: 

CONNECT <LOGIN>/<PASSWORD> 
CREATE DATABASE LINK my_dblink  
   CONNECT TO <USERNAME> IDENTIFIED BY <PASSWORD>
   USING '<HOSTNAME>'; 

-- Step 2: export with 11.1.0.6 Export Data Pump client 
-- from 10.2.0.4 read-only source database over database link:  

% expdp <LOGIN>/<PASSWORD> DIRECTORY=my_dir DUMPFILE=expdp_s.dmp \  
LOGFILE=expdp_s.log SCHEMAS=<SCHEMA_NAME> NETWORK_LINK=my_dblink \
VERSION=10.1

-- Step 3: transfer dumpfile to target server 

-- Step 4: import into 10.1.0.5 target database    
-- with 10.1.0.5 Import Data Pump client:   

% impdp <LOGIN>/<PASSWORD> DIRECTORY=my_dir DUMPFILE=expdp_s.dmp \   
LOGFILE=impdp_s.log SCHEMAS=<SCHEMA_NAME>

Remarks:
1. Instead of the 11.1.0.6.0 local database, any other 10.2.0.4.0 or higher release database with a database link to the read-only source database can be used as an interim database on which the export job is initiated with the NETWORK_LINK and VERSION=10.1.0 parameters.

2. Note that it is not possible to start the export job by connecting directly to the 10.2.0.4.0 source database as we cannot create the internal Data Pump master table for this job because the source database is read-only.

3. Also note that a Data Pump job with NETWORK_LINK and VERSION=10.1.0 that is started on the 10.1.0.5.0 target database is not possible, because the remote read-only source database has a higher compatibility level (10.2.0) and Bug 5115034 "NETWORK INTEROPERABILITY WON'T WORK FOR VERSIONS MORE THAN A MINOR RELEASE APART" (not a public bug) is not fixed in 10.1.0.5.

8. Possible Error Messages

8.1. UDE-18 (Data Pump client is incompatible with database version xx.yy).

%expdp <LOGIN>/<PASSWORD>@<SERVICE_NAME> DIRECTORY=my_dir \ 
DUMPFILE=expdp_s.dmp LOGFILE=expdp_s.log SCHEMAS=<SCHEMA_NAME>

Export: Release 10.2.0.3.0 - Production on Tuesday, 26 February, 2008 14:46:22
Copyright (c) 2003, 2005, Oracle.  All rights reserved. 
Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Production 
With the Partitioning, OLAP and Data Mining options 

UDE-00018: Export utility version can not be more recent than the Data Pump server. 

or: 

%expdp <LOGIN>/<PASSWORD>@<SERVICE_NAME> DIRECTORY=my_dir \ 
DUMPFILE=expdp_s.dmp LOGFILE=expdp_s.log SCHEMAS=<SCHEMA_NAME>

Export: Release 11.1.0.6.0 - Production on Tuesday, 26 February, 2008 14:49:43 
Copyright (c) 2003, 2007, Oracle.  All rights reserved.  

UDE-00018: Data Pump client is incompatible with database version 10.2.0.3.0

Cause: An attempt was made to connect with a higher version Export Data Pump client (e.g. 11.1.0.6.0) to a lower version database (e.g. 10.2.0.3.0), where the Export Data Pump client used Oracle Net to connect to the lower version database. This is not supported. 
Action: Use the Export Data Pump client that matches the database version of the source database (i.e.: 10.2.0.3.0 in this case, or any other lower release Export Data Pump client).
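
A sketch of how this can be done on a Unix-like system, assuming the matching 10.2.0.3.0 Oracle home is installed under the hypothetical path /u01/app/oracle/product/10.2.0/db_1:

-- Point the environment to the matching Oracle home so that its
-- expdp executable is used, then re-run the export:

% export ORACLE_HOME=/u01/app/oracle/product/10.2.0/db_1
% export PATH=$ORACLE_HOME/bin:$PATH
% expdp <LOGIN>/<PASSWORD>@<SERVICE_NAME> DIRECTORY=my_dir \
DUMPFILE=expdp_s.dmp LOGFILE=expdp_s.log SCHEMAS=<SCHEMA_NAME>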

8.2. UDI-18 (Data Pump client is incompatible with database version xx.yy).

%impdp <LOGIN>/<PASSWORD>@<SERVICE_NAME> DIRECTORY=my_dir \ 
DUMPFILE=expdp_s.dmp LOGFILE=impdp_s.log SCHEMAS=<SCHEMA_NAME>

Import: Release 11.1.0.6.0 - Production on Tuesday, 26 February, 2008 14:56:16
Copyright (c) 2003, 2007, Oracle.  All rights reserved. 

UDI-00018: Data Pump client is incompatible with database version 10.2.0.3.0

Cause: An attempt was made to connect with a higher version Import Data Pump client (e.g. 11.1.0.6.0) to a lower version database (e.g. 10.2.0.3.0), where the Import Data Pump client used Oracle Net to connect to the lower version database. This is not supported. 
Action: Use the Import Data Pump client that matches the database version of the target database (i.e.: 10.2.0.3.0 in this case, or any other lower release Import Data Pump client).

8.3. ORA-39142 (Incompatible version number xx.yy in dump file "zzz").

%impdp <LOGIN>/<PASSWORD>@<SERVICE_NAME> DIRECTORY=my_dir DUMPFILE=expdp_s.dmp \ 
LOGFILE=impdp_s.log SCHEMAS=<SCHEMA_NAME>

Import: Release 10.1.0.2.0 - Production on Tuesday, 26 February, 2008 15:02:16
Copyright (c) 2003, Oracle.  All rights reserved. 
Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Production 
With the Partitioning, OLAP and Data Mining options 

ORA-39001: invalid argument value 
ORA-39000: bad dump file specification 
ORA-31619: invalid dump file "/expdp/expdp_s.dmp" 

or: 

%impdp <LOGIN>/<PASSWORD>@<SERVICE_NAME> DIRECTORY=my_dir DUMPFILE=expdp_s.dmp \ 
LOGFILE=impdp_s.log SCHEMAS=<SCHEMA_NAME>

Import: Release 10.2.0.3.0 - Production on Tuesday, 26 February, 2008 15:07:27
Copyright (c) 2003, 2005, Oracle.  All rights reserved. 
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - Production 
With the Partitioning, OLAP and Data Mining options 

ORA-39001: invalid argument value 
ORA-39000: bad dump file specification 
ORA-39142: incompatible version number 2.1 in dump file "/expdp/expdp_s.dmp"

Cause: An attempt was made to import an Export Data Pump dumpfile into a lower version target database (e.g. 10.2.0.3.0) while the export dumpfile was created from a higher version source database (e.g. 11.1.0.6.0) which had a higher compatibility level (e.g. COMPATIBLE=11.1.0), and the Export Data Pump job was run without the VERSION parameter. 
Action: Re-create the export dumpfile from the source database (11.1.0.6.0) with the Export Data Pump client (11.1.0.6.0) and the VERSION parameter (VERSION=10.2.0). Re-run the import with the Import Data Pump client (10.2.0.3.0) into the target database (10.2.0.3.0). If the source database does not exist anymore, this situation can be worked around by importing the data into a temporary database (11.1.0.6.0 or higher) and then re-creating the dumpfile set with the correct VERSION parameter (i.e. VERSION=10.2.0).

8.4. ORA-39083 (Object type PROCACT_SCHEMA failed to create with error).

%impdp <LOGIN>/<PASSWORD> DIRECTORY=my_dir DUMPFILE=expdp_s.dmp \ 
LOGFILE=impdp_s.log SCHEMAS=<SCHEMA_NAME>

Import: Release 11.1.0.6.0 - Production on Tuesday, 26 February, 2008 15:09:24
Copyright (c) 2003, 2007, Oracle.  All rights reserved. 
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - Production 
With the Partitioning, OLAP, Data Mining and Real Application Testing options 
Master table "<LOGIN_SCHEMA>"."SYS_IMPORT_FULL_01" successfully loaded/unloaded 
Starting "<LOGIN_SCHEMA>"."SYS_IMPORT_FULL_01":  <LOGIN>/******** DIRECTORY=my_dir 
DUMPFILE=expdp_s.dmp LOGFILE=impdp_s.log SCHEMAS=<SCHEMA_NAME> 
Processing object type SCHEMA_EXPORT/USER 
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT 
Processing object type SCHEMA_EXPORT/ROLE_GRANT 
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE 
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA 
ORA-39083: Object type PROCACT_SCHEMA failed to create with error:  
ORA-06550: line 2, column 1: 
PLS-00201: identifier 'DBMS_CUBE_EXP.SCHEMA_INFO_IMP_BEG' must be declared  
ORA-06550: line 2, column 1: PL/SQL: Statement ignored  
ORA-06550: line 3, column 1:  
PLS-00201: identifier 'DBMS_CUBE_EXP.SCHEMA_INFO_IMP_LOOP' must be declared 
ORA-06550: line 3, column 1: PL/SQL: Statement ignored 
ORA-06550: line 7, column 1: 
PLS-00201: identifier 'DBMS_CUBE_EXP.SCHEMA_INFO_IMP_END' must be declared  
ORA-06550: line 7, c 
Processing object type SCHEMA_EXPORT/TABLE/TABLE 
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA 
...

Cause: An attempt was made to import an Export Data Pump dumpfile back into an 11.1.0.6.0 database, but the 11gR1 source database had COMPATIBLE=11.1.0 while the 11gR1 target database had COMPATIBLE=10.2.0. 
Action: Re-create the export dumpfile from the source database (11.1.0.6.0) with the Export Data Pump client (11.1.0.6.0) and the VERSION parameter (VERSION=10.2.0). Re-run the import with the Import Data Pump client (11.1.0.6.0) into the target database (11.1.0.6.0).

8.5. ORA-39023 (Version xx.yy.zz is not supported).

%expdp <LOGIN>/<PASSWORD> DIRECTORY=my_dir DUMPFILE=expdp_s.dmp \ 
LOGFILE=expdp_s.log SCHEMAS=<SCHEMA_NAME> VERSION=11.1.0 

Export: Release 10.2.0.3.0 - Production on Tuesday, 26 February, 2008 15:11:09
Copyright (c) 2003, 2005, Oracle.  All rights reserved. 
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - Production 
With the Partitioning, OLAP and Data Mining options 

ORA-39001: invalid argument value 
ORA-39023: Version 11.1.0 is not supported.

Cause: An attempt was made to specify a dumpfile set version (e.g. VERSION=11.1.0) that is higher than the compatibility level of the source database (e.g. a 10.2.0.3.0 source database). Note that a similar error would be reported if the version that is specified is too low (lower than 9.2.0). 
Action: Re-run the job and specify a value for the VERSION parameter that matches the compatibility level of the target database (e.g. VERSION=10.1.0), or run the job with the default value (VERSION=compatible), which corresponds to the compatibility level of the source database.
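
To check the value that the VERSION parameter may not exceed, the compatibility level of the source database can be queried, for example (a sketch assuming SQL*Plus access and the necessary privileges):

-- Check the init.ora/spfile parameter COMPATIBLE:
CONNECT <LOGIN>/<PASSWORD>
SHOW PARAMETER compatible

-- or query it directly:
SELECT name, value FROM v$parameter WHERE name = 'compatible';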

8.6. ORA-39055 (The AAA feature is not supported in version xx.yy.zz).

%expdp <LOGIN>/<PASSWORD> DIRECTORY=my_dir DUMPFILE=expdp_s.dmp \ 
LOGFILE=expdp_s.log SCHEMAS=<SCHEMA_NAME> NETWORK_LINK=my_dblink \ 
COMPRESSION=all 

Export: Release 11.1.0.6.0 - Production on Tuesday, 26 February, 2008 15:26:43
Copyright (c) 2003, 2007, Oracle.  All rights reserved. 
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - Production 
With the Partitioning, OLAP, Data Mining and Real Application Testing options 
ORA-39005: inconsistent arguments 
ORA-39055: The COMPRESSION feature is not supported in version 10.1.0.5.0.

Cause: An attempt was made to specify an Export Data Pump parameter (COMPRESSION=all) that was not supported in the earlier version of the dumpfile set (e.g.: an Export Data Pump job with NETWORK_LINK is started on a local 11.1.0.6 database and the database link connects to a remote 10.1.0.5.0 source database). 
Action: Re-run the Export Data Pump job and omit the parameter that was introduced in the later release.

8.7. LRM-101 (unknown parameter name 'aaa').

%expdp <LOGIN>/<PASSWORD>@<SERVICE_NAME> DIRECTORY=my_dir \ 
DUMPFILE=expdp_s.dmp LOGFILE=expdp_s.log SCHEMAS=<SCHEMA_NAME> COMPRESSION=all 

LRM-00101: unknown parameter name 'COMPRESSION' 

or: 

LRM-00121: 'all' is not an allowable value for 'COMPRESSION'

Cause: An attempt was made to specify an Export Data Pump parameter or parameter value that is not known by that specific Data Pump client release. E.g.: the value 'all' for the parameter COMPRESSION was introduced in Oracle11g and is not known by a 10.2.0.3.0 Export Data Pump client. 
Action: Re-run the Export Data Pump job and omit the parameter that was introduced in the later release. In this specific case a 10.2.0.3.0 Export Data Pump client was used to export from an 11.1.0.6.0 source database. Therefore, an alternative solution would be to re-run the export with the 11.1.0.6.0 Export Data Pump client.

8.8. ORA-39143 (dump file xxx may be an original export dump file).

%impdp <LOGIN>/<PASSWORD> DIRECTORY=my_dir DUMPFILE=exp_s.dmp \  
LOGFILE=impdp_s.log SCHEMAS=<SCHEMA_NAME> 

Import: Release 11.1.0.6.0 - Production on Tuesday, 26 February, 2008 15:33:48
Copyright (c) 2003, 2007, Oracle.  All rights reserved.  
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - Production  
With the Partitioning, OLAP, Data Mining and Real Application Testing options  

ORA-39001: invalid argument value  
ORA-39000: bad dump file specification  
ORA-39143: dump file "D:\exp_s.dmp" may be an original export dump file

Cause: An attempt was made to use the Import Data Pump client (impdp) to import a dumpfile that was created with the original Export client (exp). 
Action: Re-run the import with the original Import client (imp), or re-run the import Data Pump job (impdp) after re-creating the dumpfile on the source database with the Export Data Pump client (expdp).

8.9. ORA-39022 (Database version xx.yy.zz is not supported).

%expdp <LOGIN>/<PASSWORD> DIRECTORY=my_dir DUMPFILE=expdp_s.dmp \  
LOGFILE=expdp_s.log SCHEMAS=<SCHEMA_NAME> NETWORK_LINK=my_dblink

-- when expdp/impdp is connected to a local 10.1.0.x database
...
ORA-39006: internal error  
ORA-39022: Database version 11.1.0.6.0 is not supported.  


-- when expdp/impdp is connected to a local 10.2.0.1 database 
...  
ORA-39001: invalid argument value  
ORA-39169: Local version of 10.2.0.1.0 cannot work with remote version  
of 11.1.0.6.0.  


-- when expdp/impdp is connected to a local 10.2.0.3 database  
...  
ORA-39127: unexpected error from call to export_string   
:=SYS.DBMS_CUBE_EXP.schema_info_exp('<SCHEMA>',1,1,'11.01.00.00.00',newblock)  
ORA-37118: The OLAP API library was not preloaded.  
ORA-06512: at "SYS.DBMS_CUBE_EXP", line 205  
ORA-06512: at "SYS.DBMS_CUBE_EXP", line 280  
ORA-06512: at line 1  
ORA-06512: at "SYS.DBMS_METADATA", line 5980  
ORA-39125: Worker unexpected fatal error in KUPW$WORKER.UNLOAD_METADATA   
while calling DBMS_METADATA.FETCH_XML_CLOB [TABLE_STATISTICS]  
ORA-06502: PL/SQL: numeric or value error: invalid LOB locator specified:   
ORA-22275   
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 105  
ORA-06512: at "SYS.KUPW$WORKER", line 6234
...

-- when expdp/impdp is connected to a local 10.2.0.4.0 database
...
ORA-39126: Worker unexpected fatal error in KUPW$WORKER.MAIN [INSERT 
INTO sys.ku$_list_filter_temp@my_dblink1 SELECT process_order, duplicate,
object_name, base_process_order FROM  "<LOGIN_SCHEMA>"."SYS_EXPORT_SCHEMA_01"    
WHERE process_order = :1]  
ORA-00947: not enough values  
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95  
ORA-06512: at "SYS.KUPW$WORKER", line 6345
...

Cause: An attempt was made to export data over a database link that connects to a database with a higher compatibility level (e.g.: an Export Data Pump job with NETWORK_LINK is started on a local 10.x database and the database link connects to a remote 11.x source database). If the local connecting database is 10.2.0.4 or below, then the Data Pump job may fail due to several defects:
- Bug 5115034 "NETWORK INTEROPERABILITY WON'T WORK FOR VERSIONS MORE THAN A MINOR RELEASE APART" (not a public bug), fixed in 10.2.0.3.0 and 11.1.0.6.0
- Bug 5442398 "VERSIONS NEED TO MINIMIZED BETWEEN SIDES OF A NETWORK JOB" (not a public bug), fixed in 10.2.0.4.0 and 11.1.0.6.0
- Bug 6936831 "10.2.0.4 DATA PUMP WITH NETWORK_LINK TO ORACLE11G FAILS WITH ORA-947", not fixed yet.
The errors ORA-39006 and ORA-39022 will be reported if a Data Pump job with NETWORK_LINK is started on a local 10.1.x database and the database link connects to a remote 10.2.x or 11.x source database. 
The errors ORA-39001 and ORA-39169 will be reported if a Data Pump job with NETWORK_LINK is started on a local 10.2.0.1 or 10.2.0.2 database and the database link connects to a remote 11.x source database (even if the remote Oracle11g source database has compatibility set to 10.2). 
The errors ORA-39127 and ORA-37118 will be reported if a Data Pump job with NETWORK_LINK is started on a local 10.2.0.3 database and the database link connects to a remote 11.x source database.
The errors ORA-39126 and ORA-947 will be reported if a Data Pump job with NETWORK_LINK is started on a local 10.2.0.4 database and the database link connects to a remote 11.x source database. 
Action: Re-run the Export Data Pump job without the NETWORK_LINK parameter while connecting to the remote source database (if needed use Oracle Net), create an Export Data Pump dumpfile set that is written on the remote server, and transfer the dumpfile set to the local server. Use the Export Data Pump parameter VERSION in case the target database has a lower compatibility level than the source database.
Alternatively, make use of a local connecting database of release 10.2.0.3.0 or higher and explicitly use the Data Pump parameter VERSION=10.2 for this kind of job (only a valid workaround if the release of the local connecting database is 10.2.0.3.0 or 10.2.0.4.0).
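
A sketch of this second workaround (all names are placeholders; the local connecting database is release 10.2.0.3.0 or 10.2.0.4.0 and the database link points to the higher release source database):

% expdp <LOGIN>/<PASSWORD> DIRECTORY=my_dir DUMPFILE=expdp_s.dmp \
LOGFILE=expdp_s.log SCHEMAS=<SCHEMA_NAME> NETWORK_LINK=my_dblink \
VERSION=10.2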

8.10. ORA-39139 (Data Pump does not support XMLSchema objects. TABLE_DATA:"aaa"."bbb" will be skipped).

%expdp <LOGIN>/<PASSWORD> DIRECTORY=my_dir DUMPFILE=expdp_s.dmp \  
LOGFILE=expdp_s.log SCHEMAS=<SCHEMA_NAME>  

Export: Release 10.2.0.3.0 - Production on Tuesday, 26 February, 2008 16:12:21
Copyright (c) 2003, 2005, Oracle. All rights reserved.  
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - Production  
With the Partitioning, OLAP and Data Mining options  
FLASHBACK automatically enabled to preserve database integrity.  
Starting "<LOGIN_SCHEMA>"."SYS_EXPORT_SCHEMA_01": <LOGIN>/******** DIRECTORY=my_dir  
DUMPFILE=expdp_s.dmp LOGFILE=expdp_s.log SCHEMAS=<SCHEMA_NAME> 
Estimate in progress using BLOCKS method...  
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA  
ORA-39139: Data Pump does not support XMLSchema objects. 
TABLE_DATA:"OE"."PURCHASEORDER" will be skipped.  
Total estimation using BLOCKS method: 4.937 MB  
...

Cause: An attempt was made to export a table with an XMLSchema object. This is not supported in Oracle10g Release 2 and below. 
Action: Upgrade to Oracle11g (11.1.0.6.0 and higher), or use the original Export and Import clients when exporting a table with an XMLSchema object.
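
A sketch of the second alternative using the original export and import clients (the login, schema and table names are placeholders):

-- Export the XMLSchema-based table with the original Export client:
% exp <LOGIN>/<PASSWORD> FILE=exp_s.dmp LOG=exp_s.log \
TABLES=<SCHEMA_NAME>.<TABLE_NAME>

-- Import it with the original Import client:
% imp <LOGIN>/<PASSWORD> FILE=exp_s.dmp LOG=imp_s.log FULL=y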

8.11. ORA-31679 (Table data object "aaa"."bbb" has long columns, and longs can not be loaded/unloaded using a network link).

%expdp <LOGIN>/<PASSWORD> DIRECTORY=my_dir DUMPFILE=expdp_s.dmp \  
LOGFILE=expdp_s.log SCHEMAS=<SCHEMA_NAME> NETWORK_LINK=my_dblink  

Export: Release 10.2.0.3.0 - Production on Tuesday, 26 February, 2008 16:14:11
Copyright (c) 2003, 2005, Oracle.  All rights reserved.  
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - Production  
With the Partitioning, OLAP and Data Mining options  
Starting "<LOGIN_SCHEMA>"."SYS_EXPORT_TABLE_01":  <LOGIN>/******** DIRECTORY=my_dir  
DUMPFILE=expdp_s.dmp LOGFILE=expdp_s.log SCHEMAS=<SCHEMA_NAME> NETWORK_LINK=my_dblink  
...  
ORA-31679: Table data object "<SCHEMA_NAME>"."MYTAB" has long columns, and longs   
can not be loaded/unloaded using a network link  
...  

or:  

ORA-31693: Table data object "<SCHEMA_NAME>"."MYTAB" failed to load/unload and   
is being skipped due to error:   
ORA-00997: illegal use of LONG datatype

Cause: An attempt was made to export a table with a LONG column over a database link. This is not supported as you cannot create a table while selecting from a different table which has a LONG column. Reference: 
Bug 3369197 "DPIMP : NETWORK MODE HR.PER_ADDRESSES ORA-997: ILLEGAL USE OF LONG DATATYPE" (not a public bug). 
Action: Re-run the Export Data Pump job while connecting to the remote source database (if needed use Oracle Net), create an Export Data Pump dumpfile set that is written on the remote server, and transfer the dumpfile to the local server.

8.12. ORA-22804 (remote operations not permitted on object tables or user-defined type columns).

%expdp <LOGIN>/<PASSWORD> DIRECTORY=my_dir DUMPFILE=expdp_s.dmp \  
LOGFILE=expdp_s.log SCHEMAS=<SCHEMA_NAME> NETWORK_LINK=my_dblink  

Export: Release 10.2.0.3.0 - Production on Tuesday, 26 February, 2008 16:37:42
Copyright (c) 2003, 2005, Oracle.  All rights reserved.  
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - Production  
With the Partitioning, OLAP and Data Mining options  
FLASHBACK automatically enabled to preserve database integrity.  
Starting "<LOGIN_SCHEMA>"."SYS_EXPORT_FULL_01":  <LOGIN>/******** DIRECTORY=my_dir  
DUMPFILE=expdp_s.dmp LOGFILE=expdp_s.log SCHEMAS=<SCHEMA_NAME> NETWORK_LINK=my_dblink  
...  
ORA-31693: Table data object "<SCHEMA_NAME>"."MGMT_JOB_CRED_PARAMS" failed to   
load/unload and is being skipped due to error:  
ORA-22804: remote operations not permitted on object tables or user-  
defined type columns  
...

Cause: An attempt was made to export a table (SYSMAN.MGMT_JOB_CRED_PARAMS) with an object type column (SYSMAN.MGMT_JOB_VECTOR_PARAMS) over a database link. This is not supported: you cannot access a table that contains a user defined type across a database link. Reference: 
Bug 3067966 "EXPDP FULL=Y WITH NETWORK LINK GIVES ORA-22804" (not a public bug). 
Action: Re-run the Export Data Pump job while connecting to the remote source database (if needed use Oracle Net), create an Export Data Pump dumpfile set that is written on the remote server, and transfer the dumpfile to the local server.

8.13. ORA-39203 (Partition selection is not supported over a network link). 

%expdp <LOGIN>/<PASSWORD> DIRECTORY=my_dir DUMPFILE=expdp_s.dmp \   
LOGFILE=expdp_s.log TABLES=<SCHEMA_NAME>.<TABLE_NAME>:<PARTITION_NAME> NETWORK_LINK=my_dblink   

Export: Release 10.2.0.3.0 - Production on Tuesday, 26 February, 2008 17:44:26
Copyright (c) 2003, 2005, Oracle. All rights reserved.   
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - Production   
With the Partitioning, OLAP and Data Mining options   
ORA-39001: invalid argument value   
ORA-39203: Partition selection is not supported over a network link.

Cause: An attempt was made to export a specific partition of a table over a database link. This is not supported: only whole tables can be exported (not partitions of tables). Note that in older releases a different error can be reported: ORA-14100 (partition extended table name cannot refer to a remote object). This also explains the main reason why partition names cannot be specified for the TABLES parameter when used in conjunction with the NETWORK_LINK parameter: ORA-14100 will be reported for a select statement like: SELECT * FROM tc.ptab@my_dblink PARTITION (p1);
Action: Either re-run the Export Data Pump job and specify the complete table, or re-run the Export Data Pump job while connecting to the remote source database (if needed use Oracle Net), create an Export Data Pump dumpfile set that is written on the remote server, and transfer the dumpfile to the local server.

Note: Starting with 11.2, it is possible to perform a network mode import of an individual partition.

 

8.14. ORA-904 (invalid identifier ORIGINAL_OBJECT_NAME)

%impdp <LOGIN>/<PASSWORD> DIRECTORY=my_dir DUMPFILE=expdp_s.dmp \    
LOGFILE=imdp_s.log SCHEMAS=<SCHEMA>

Import: Release 11.1.0.6.0 - Production on Wednesday, 28 November, 2008 8:47:41 
Copyright (c) 2003, 2007, Oracle.  All rights reserved. 
Connected to: Oracle Database 11g Release 11.1.0.6.0 - Production 

ORA-39002: invalid operation 
ORA-31694: master table "<LOGIN_SCHEMA>"."SYS_IMPORT_SCHEMA_01" failed to load/unload 
ORA-02354: error in exporting/importing data 
ORA-02373: Error parsing insert statement for table 
"<LOGIN_SCHEMA>"."SYS_IMPORT_SCHEMA_01". 
ORA-00904: "ORIGINAL_OBJECT_NAME": invalid identifier

Cause: An attempt was made to import a Data Pump dumpfile that was created from an 11.1.0.7.0 source database into an 11.1.0.6.0 target database. Reference of defect: 
Bug:7590679 "11.1.0.6 IMPDP OF A 11.1.0.7 DUMP FAILS WITH ORA-904 ORIGINAL_OBJECT_NAME". 
Action: Re-run the export from the 11.1.0.7.0 source database and specify VERSION=10.2 so we create a lower version dumpfile (i.e.: 10.2.x compatible). Note that this export job won't export objects that are 11.1 specific (see also section 4.3 above).  Another workaround would be to apply the 11.1.0.7.0 patchset also on the target database. See also: 
Note:752374.1 "11.1.0.6 Datapump Import of an 11.1.0.7 Dump Fails With ORA-904 Original_object_name"

8.15. ORA-39358  (Export dump file version 18.1.0 not compatible with target version 12.2.0.0)

bash-4.1$ impdp <LOGIN>/<PASSWORD> directory=dpdir dumpfile=tc.dmp full=yes

Import: Release 12.2.0.1.0 - Production on Tue Jul 31 02:14:25 2018

Copyright (c) 1982, 2018, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 12c Enterprise Edition Release 12.2.0.1.0 - 64bit Production
ORA-39002: invalid operation
ORA-39358: Export dump file version 18.1.0 not compatible with target version 12.2.0.0

Cause: An attempt was made to import a Data Pump dumpfile that was created from an 18.1.0.0.0 source database into a 12.2.0 target database.
Action: Re-run the export from the 18.1.0.0.0 source database and specify VERSION=12.2 so we create a lower version dumpfile (i.e.: 12.2.x compatible).
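
A sketch of the corrective export on the 18c source database (the dumpfile and logfile names are placeholders):

% expdp <LOGIN>/<PASSWORD> DIRECTORY=dpdir DUMPFILE=tc_122.dmp \
LOGFILE=expdp_tc.log FULL=y VERSION=12.2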

9. Known Defects

9.1. Bug 5115034 - ORA-39022 or ORA-39169 when exporting higher release over NETWORK_LINK
Defect:  Bug 5115034 "NETWORK INTEROPERABILITY WON'T WORK FOR VERSIONS MORE THAN A MINOR RELEASE APART" (not a public bug)
Symptoms:  Data Pump from local 10.1.0.x with NETWORK_LINK to higher release source database aborts with ORA-39022 (Database version is not supported) or Data Pump from local 10.2.0.x with NETWORK_LINK to higher release source database aborts with ORA-39169 (Local version cannot work with remote version). 
Releases:  10.1.0.x and 10.2.0.2.0 and below (local connecting database).
Fixed in:  10.2.0.3.0 and higher  (fix is on local connecting database).
Patched files:  (in patchset)
Workaround:  none (run the Export Data Pump job directly on the remote source database).
Cause:  internal problem in version checking that showed up with availability of first maintenance release (10.2).
Trace:  not applicable (errors reported in expdp session).
Remarks:  only reproduces when database link connects to a higher release database.

9.2. Bug 6630677 - IMPDP may skip table if export was done with NETWORK_LINK
Defect:  Bug:6630677 "IMPDP skips table exported just before table with LONG column using NETWORK_LINK" 
Symptoms:  if an export dumpfile set is created by expdp over a database link, and the export correctly skips a table which contains a LONG column, then on import from that dumpfile set the importing session does not load the table which was exported PRIOR to the error. 
Releases:  10.1.0.x and 10.2.0.4.0 and lower.
Fixed in:  10.2.0.5.0 and higher; generic fix available for 10.2.0.3.0 with Patch:6630677 
Patched files:  prvtbpw.plb 
Workaround:  explicitly exclude tables with a LONG column from the Export Data Pump job (parameter EXCLUDE; a sketch is given below), or re-run an Export Data Pump job for the missing table. 
Cause:  the export of both the prior table and the table with LONG column were rolled back.
Trace:  the Import Data Pump logfile does not show line with import of missing table.
Remarks:  only reproduces if table was exported with NETWORK_LINK parameter.
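
A sketch of the first workaround using a parameter file (the excluded table names are hypothetical placeholders):

-- exclude_long.par (parameter file):
DIRECTORY=my_dir
DUMPFILE=expdp_s.dmp
LOGFILE=expdp_s.log
SCHEMAS=<SCHEMA_NAME>
NETWORK_LINK=my_dblink
EXCLUDE=TABLE:"IN ('<TABLE_WITH_LONG_1>', '<TABLE_WITH_LONG_2>')"

% expdp <LOGIN>/<PASSWORD> PARFILE=exclude_long.par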

9.3. Bug 6749942 - ORA-942 when exporting 10.2.0.3 with 11.1.0.6.0 expdp client over NETWORK_LINK

...
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE  
ORA-39126: Worker unexpected fatal error in   
KUPW$WORKER.CREATE_OBJECT_ROWS [PROCACT_SCHEMA]   
TABLE:"<SCHEMA>"."LOGSTDBY$APPLY_PROGRESS"  
ORA-00942: table or view does not exist  
ORA-02063: preceding line from MY_DBLINK  
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 105  
ORA-06512: at "SYS.KUPW$WORKER", line 7715  
----- PL/SQL Call Stack -----  
  object      line  object  
  handle    number  name  
2877E134     18051  package body SYS.KUPW$WORKER  
2877E134      7736  package body SYS.KUPW$WORKER  
2877E134      6945  package body SYS.KUPW$WORKER  
2877E134      2519  package body SYS.KUPW$WORKER  
2877E134      8342  package body SYS.KUPW$WORKER  
2877E134      1477  package body SYS.KUPW$WORKER  
286B2130         2  anonymous block  

Job "<LOGIN_SCHEMA>"."SYS_EXPORT_FULL_01" stopped due to fatal error at 13:52:15

Defect:  Bug:6749942 "ATTEMPTING TO EXPORT 10.2.0.3 DATABASE FAILS WHEN USING 11.1.0.6 DATA PUMP" 
Symptoms:  export Data Pump 11.1.0.6 with NETWORK_LINK aborts with ORA-942 (table or view does not exist) when exporting an earlier release database over a database link.
Releases:  11.1.0.6.0 (local connecting database)
Fixed in:  11.1.0.7.0 and higher (fix is on local connecting database).
Patched files:  prvtbpw.plb 
Workaround:  none (run the Export Data Pump job directly on the remote source database).
Cause:  the export tried to access two views that were introduced in Oracle11g: sys.ku$_tab_subname_view and sys.ku$_ind_subname_view
Trace:  error stack showing ORA-942 and ORA-2063 
Remarks:  only reproduces if remote source database has table with partition/subpartitions.

9.4. Bug 7489698 - Schema Export with 10.2.0.4.0 expdp client from 10.2.0.3.0 or lower source fails
Defect:  Bug:7489698 "EXPDP 10204 CAN'T EXPORT FROM 10203 DB WITHOUT USING FULL=Y" 
Symptoms:  if a schema level Export Data Pump job is started and connects to a 10.2.0.3.0 or any lower release source database using Oracle Net, then the Export Data Pump job is aborted without any errors before the job can be created on the source database.
Releases:  10.2.0.4.0 expdp client that connects to a 10.2.0.3.0 and lower release source database. 
Fixed in:  10.2.0.5.0
Patched files:  Export Data Pump client (expdp).
Workaround:  create a database link from 10.2.0.4.0 to lower release database, and run the expdp job on the 10.2.0.4.0 local database while specifying the NETWORK_LINK parameter to the remote source database. 
Cause:  problem is introduced with the fix for Bug:5714205 "EXPDP / IMPDP do not support long lists for parameters" (fixed in 10.2.0.4.0 and 11.1.0.6.0). 
Trace:  the Master trace file shows that DBMS_DATAPUMP.SET_PARAMETER is called, after which the job aborts with DBMS_DATAPUMP.STOP_JOB and the reason for this: "dropping master since job never started". 
Remarks:  problem is specific for the 10.2.0.4.0 expdp client.

@ 10. For Support: Enhancement Requests.

 

Additional Resources

Community: Database Utilities

REFERENCES

NOTE:132904.1 - Compatibility Matrix for Export And Import Between Different Oracle Versions [Video]
NOTE:175627.1 - How to Find the Database and Export/Import Version
NOTE:207303.1 - Client / Server Interoperability Support Matrix for Different Oracle Versions
NOTE:263719.1 - ALERT: Oracle 10g Release 1 (10.1) Support Status and Alerts
NOTE:291024.1 - Compatibility and New Features when Transporting Tablespaces with Export and Import
NOTE:30699.1 - Init.ora Parameter "COMPATIBLE" Reference Note
NOTE:316900.1 - ALERT: Oracle 10g Release 2 (10.2) Support Status and Alerts
NOTE:341733.1 - Export/Import DataPump Parameters INCLUDE and EXCLUDE - How to Load and Unload Specific Objects
NOTE:351598.1 - Export/Import DataPump: The Minimum Requirements to Use Export DataPump and Import DataPump (System Privileges)
NOTE:39691.1 - VERSION NUMBER - Oracle version numbers explained
NOTE:454507.1 - ALERT: Oracle 11g Release 1 (11.1) Support Status and Alerts
NOTE:462488.1 - How to Gather the Header Information and the Content of an Export Dumpfile ?
NOTE:752374.1 - 11.1.0.6 DataPump Import Of An 11.1.0.7 Dump Fails With Error ORA-904 ORIGINAL_OBJECT_NAME
NOTE:2249901.1 - Data Pump: Oracle Database 12c Release 2 (12.2) New Features
NOTE:2422236.1 - Alert - Regression in DataPump After Applying 12.1.0.2.180417DBBP or Higher
NOTE:2457955.1 - 19c DataPump New Features
BUG:5714205 - DATAPUMP IS NOT UPWARD COMPATIBLE TO EXP
BUG:6630677 - IMPDP SKIPS TABLE EXPORTED JUST BEFORE TABLE WITH LONG COLUMN USING NETWORK_LINK
BUG:6749942 - ATTEMPTING TO EXPORT 10.2.0.3 DATABASE FAILS WHEN USING 11.1.0.6 DATA PUMP
BUG:6788714 - IS DATA PUMP EXPORT CERTIFIED TO BE BACKWARD COMPATIBLE ?
BUG:6936831 - 10.2.0.4 DATA PUMP WITH NETWORK_LINK TO ORACLE11G FAILS WITH ORA-947
BUG:7489698 - EXPDP 10204 CAN'T EXPORT FROM 10203 DB WITHOUT USING FULL=Y
BUG:7590679 - 11.1.0.6 IMPDP OF A 11.1.0.7 DUMP FAILS WITH ORA-904 ORIGINAL_OBJECT_NAME