Summary: in this tutorial, you will learn how to use Oracle Data Pump Import to load an export dump file set into a target Oracle Database system.
If the INMEMORY transform is set to Y (the default) on import, then Oracle Data Pump keeps the IM column store clause for objects that have one. When those objects are recreated at import time, Data Pump generates the IM column store clause that matches the setting for those objects at export time.
If N is specified on import, then Data Pump drops the IM column store clause from all objects that have one. If there is no IM column store clause for an object that is stored in a tablespace, then the object inherits the IM column store clause from the tablespace. So, if an object is imported into a different, pre-created tablespace, the object then inherits the IM column store clause from the new pre-created tablespace.
This transform is useful when you want to override the IM column store clause for an object in the dump file. Alternatively, you can put the parameters in a parameter file; quotation marks in the parameter file are maintained during processing. See Oracle Database Reference for a listing and description of parameters that can be specified in an IM column store clause. A related transform, LOB_STORAGE, changes LOB storage for all tables in the job, including tables that provide storage for materialized views.
For the OID transform, if Y (the default value) is specified on import, then the exported OIDs are assigned to new object tables and types. Oracle Data Pump also performs OID checking when looking for an existing matching type on the target database. If N is specified, then the assignment of the exported OID during the creation of new object tables and types is inhibited; a new OID is assigned instead. Inhibiting assignment of exported OIDs can be useful for cloning schemas, but it does not affect referenced objects. Before loading data for a table associated with a type, Oracle Data Pump skips normal type OID checking when looking for an existing matching type on the target database. Other checks, using a hash code for the type, the version number, and the type name, are still performed.
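As a minimal sketch of the cloning case (the schema names, directory object, and dump file are assumptions, not from the original examples), combining a schema remap with suppressed OID assignment:

> impdp hr DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp REMAP_SCHEMA=hr:hr_clone TRANSFORM=OID:N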
If set to Y, this transform directs Oracle Data Pump to suppress column encryption clauses; columns that were encrypted in the source database are not encrypted in imported tables. If set to N (the default), column encryption clauses are created as in the source database. For the PCTSPACE transform, the value supplied must be a number greater than zero.
It represents the percentage multiplier used to alter extent allocations and the size of data files. For the SEGMENT_ATTRIBUTES transform, if the value is specified as Y, then segment attributes (physical attributes, storage attributes, tablespaces, and logging) are included, with appropriate DDL.
The default is Y. Set this transform to N to use the default segment creation attributes for the tables being loaded; this functionality is available starting with Oracle Database 11g Release 2. For the STORAGE transform, if the value is specified as Y, then the storage clauses are included, with appropriate DDL. A combined sketch follows.
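A minimal sketch combining the transforms described above (the directory object and dump file name are assumptions): scale extent allocations and data file sizes to 70 percent of their exported size and omit storage clauses:

> impdp hr DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp TRANSFORM=PCTSPACE:70 TRANSFORM=STORAGE:N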
If NONE is specified for the TABLE_COMPRESSION_CLAUSE transform, then the table compression clause is omitted, and the table is given the default compression for the tablespace. Otherwise, tables are created with the specified compression clause. If the table compression clause is more than one word, then it must be contained in single or double quotation marks.
Also, your operating system can require you to enclose the clause in escape characters, such as the backslash character. Specifying this transform changes the type of compression for all tables in the job, including tables that provide storage for materialized views. For the following example, assume that you have exported the employees table in the hr schema and want to exclude segment attributes (both storage and tablespace) from the table on import. Sketches of both commands appear below.
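Both cases sketched, with assumed directory object and dump file names; the first overrides the compression clause (note the backslash-escaped quotation marks around the multi-word clause), and the second excludes segment attributes for the imported hr.employees table:

> impdp hr DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp TRANSFORM=TABLE_COMPRESSION_CLAUSE:\"COLUMN STORE COMPRESS FOR QUERY HIGH\"
> impdp hr TABLES=hr.employees DIRECTORY=dpump_dir1 DUMPFILE=hr_emp.dmp TRANSFORM=SEGMENT_ATTRIBUTES:N:table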
The data files must already exist on the target database system. In the file name portion of the path, a question mark (?) wildcard matches exactly one character, and an asterisk (*) matches zero or more characters. You cannot use wildcards in the directory portions of the absolute path specification.
If a wildcard is used, then all matching files must be part of the transport set. If any files are found that are not part of the transport set, then an error is displayed, and the import job terminates. At some point before the import operation, you must copy the data files from the source system to the target system.
You can copy the data files by using any copy method supported by your operating system. If desired, you can rename the files when you copy them to the target system. See Example 2.
Depending on your operating system, the use of quotation marks when you specify a value for this parameter can also require that you use escape characters. Oracle recommends that you place this parameter in a parameter file, which can reduce the number of escape characters that you would otherwise be required to use on the command line. As noted above, only the file name portion of the absolute file path can contain wildcards.
Example 2 illustrates the renaming of data files as part of a transportable tablespace export and import operation. Assume that you have a data file named employees.dat on your source system. First, using a method supported by your operating system, manually copy the data file employees.dat from your source system to the target system, renaming it to workers.dat as part of the copy operation. Next, perform a transportable tablespace export; the actual data was already copied over to the target database in the first step. Finally, perform a transportable tablespace import, specifying an absolute directory path for the data file named workers.dat.
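A sketch of that final step (the directory object, dump file name, and absolute path are assumptions):

> impdp hr DIRECTORY=dpump_dir1 DUMPFILE=tts.dmp TRANSPORT_DATAFILES='/db1/hrdata/payroll/workers.dat'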
The metadata contained in the tts dump file is imported, and the import uses the renamed workers.dat data file. Example 4 illustrates use of the question mark (?) wildcard character in the file name; for example, a specification such as myemp?.dat matches file names that differ only in the single character position occupied by the question mark. The TRANSPORT_FULL_CHECK parameter specifies whether to verify that the specified transportable tablespace set is being referenced by objects in other tablespaces.
When TRANSPORT_FULL_CHECK is set to YES, the check addresses two-way dependencies. For example, if a table is inside the transportable set but its index is not, then a failure is returned and the import operation is terminated.
Similarly, a failure is also returned if an index is in the transportable set but the table is not. When TRANSPORT_FULL_CHECK is set to NO, the check addresses only a one-way dependency: a table is not dependent on an index, but an index is dependent on a table, because an index without a table has no meaning.
Therefore, if the transportable set contains a table but not its index, then this check succeeds. However, if the transportable set contains an index but not the table, then the import operation is terminated. The following example also assumes that a data file named tbs6.dbf has already been copied to the target system.
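A sketch of such an import over a network link (the link name, tablespace name, and file path are assumptions):

> impdp hr TRANSPORT_TABLESPACES=tbs_6 NETWORK_LINK=dblink1 TRANSPORT_FULL_CHECK=YES TRANSPORT_DATAFILES='/wkdir/data/tbs6.dbf'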
Because this import is a transportable-mode import, the tablespaces into which the data is imported are automatically created by Data Pump.
You do not need to pre-create them. However, copy the data files to the target database before starting the import. There are no dump files involved in this situation.
Specifying dump files in this situation would result in an error. When transportable jobs are performed, it is best practice to keep a copy of the data files on the source system until the import job has successfully completed on the target system. If the import job fails, then you still have uncorrupted copies of the data files.
The target database into which you are importing must be at the same or later release level as the source database. Oracle recommends that you place this parameter in a parameter file, which can reduce the number of escape characters that you might otherwise need on the command line. Suppose you have a parameter file named tablespaces.par.
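A sketch of what tablespaces.par might contain (the link, tablespace, and data file values are assumptions):

DIRECTORY=dpump_dir1
NETWORK_LINK=source_database_link
TRANSPORT_TABLESPACES=tbs_6
TRANSPORT_FULL_CHECK=NO
TRANSPORT_DATAFILES='/wkdir/data/tbs6.dbf'

The import is then run as: > impdp hr PARFILE=tablespaces.par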
For the TRANSPORTABLE parameter, ALWAYS instructs the import job to use the transportable option; if transportable is not possible, then the job fails. In a table-mode import, using the transportable option results in a transportable tablespace import in which only metadata for the specified tables, partitions, or subpartitions is imported. In a full-mode import, using the transportable option results in a full transportable import in which metadata for all objects in the specified database is imported.
In both cases, you must copy (and possibly convert) the actual data files to the target database in a separate operation. If the import job fails for some reason, you will still have uncorrupted copies of the data files. NEVER instructs the import job to use either the direct path or the external table method to load data, rather than the transportable option.
That is, all the metadata for the complete table is present, so the table definition looks the same on the target system as it did on the source, but only the data for the specified partitions is inserted into the table. Not reclaiming unused data segments reduces the time of the import operation. Separately, as a result of this setting, tables containing TSTZ (TIMESTAMP WITH TIME ZONE) column data cannot be updated and are dropped from the import.
In addition, data file bitmaps cannot be rebuilt. Storage for a single object cannot straddle the two kinds of tablespaces. The required data files are reported by the transportable tablespace export. When determining the applicable release, Oracle Data Pump uses whichever of the relevant values is earlier.
The VERSION parameter specifies the version of database objects that you want to be imported; that is, only database objects and attributes that are compatible with the specified release will be imported. Note that this does not mean that Oracle Data Pump Import can be used with releases of Oracle Database earlier than 10.1. This parameter can be used to load a target system whose Oracle Database is at an earlier compatibility release than that of the source system. When the VERSION parameter is set, database objects or attributes on the source system that are incompatible with the specified release are not moved to the target.
For example, tables containing new data types that are not supported in the specified release are not imported. Legal values for this parameter are COMPATIBLE (the default), LATEST, and a specific version string. With COMPATIBLE, the version of the metadata corresponds to the database compatibility level.
Database compatibility must be set to 9.2 or later. Turning to the VIEWS_AS_TABLES parameter (network import): Oracle Data Pump imports a table with the same columns as the view and with row data fetched from the view.
Oracle Data Pump also imports objects dependent on the view, such as grants and constraints. Dependent objects that do not apply to tables (for example, grants of the UNDER object privilege) are not imported.
VIEWS_AS_TABLES can be used alone or along with the TABLES parameter; if either is used, then Oracle Data Pump performs a table-mode import. If a schema name is not supplied, it defaults to the user performing the import. The view must exist, and it must be a relational view with only scalar, non-LOB columns. If you specify an invalid or non-existent view, the view is skipped and an error message is returned. By default, Oracle Data Pump automatically creates a temporary "template table" with the same columns and data types as the view, but no rows.
If the database is read-only, then this default creation of a template table fails. In such a case, you can specify a table name. The table must be in the same schema as the view. It must be a non-partitioned relational table with heap organization. It cannot be a nested table. If the import job contains multiple views with explicitly specified template tables, then the template tables must all be different.
For example, in a job in which two views use the same template table, one of the views is skipped. Template tables are automatically dropped after the import operation is completed. The following example performs a network import of the contents of a view in the hr schema; the view is imported as a table named view1, with rows fetched from the view.
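A sketch of that network import (view1 comes from the text; the database link name dblink1 is an assumption):

> impdp hr VIEWS_AS_TABLES=view1 NETWORK_LINK=dblink1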
In dump-file mode, VIEWS_AS_TABLES specifies that you want to import one or more tables in the dump file that were exported as views. Again, if either VIEWS_AS_TABLES or TABLES is used, Data Pump performs a table-mode import. The following example imports such a table from the scott1 dump file.
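A sketch of that dump-file import (the directory object and view name are assumptions; scott1 is the dump file referenced in the text):

> impdp scott1 DIRECTORY=data_pump_dir DUMPFILE=scott1.dmp VIEWS_AS_TABLES=view1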
Learn how to run Oracle Data Pump commands from an attached client, or from a terminal other than the one on which the job is running. This feature is useful when you start a job at one location and must check on it at a later time from a different location. The following activities can be performed for the current job from the Oracle Data Pump Import prompt in interactive-command mode. The PARALLEL command increases or decreases the number of active worker processes for the current job.
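As a sketch, you can reach this prompt by attaching to a running job from any terminal (the job name shown follows the default naming pattern Data Pump generates, and is an assumption here):

> impdp hr ATTACH=sys_import_schema_01

Once attached, commands such as STATUS or PARALLEL can be entered at the Import> prompt.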
The PARALLEL command is valid only in Oracle Database Enterprise Edition. CONTINUE_CLIENT returns to logging mode; in logging mode, the job status is continually output to the terminal. EXIT_CLIENT stops the import client session, exits Import, and discontinues logging to the terminal, but leaves the current job running. HELP provides information about the Oracle Data Pump Import commands available in interactive-command mode.
KILL_JOB detaches all currently attached client sessions and then terminates the current job; it exits Import and returns to the terminal prompt. After all clients are detached, the job's process structure is immediately run down, and the Data Pump control job table is deleted. Log files are not deleted. The PARALLEL setting enables you to increase or decrease the number of active child processes, parallel query (PQ) child processes, or both, for the current job.
You set it to the desired number of parallel processes. An increase takes effect immediately if there are enough resources and if there is enough work requiring parallelization. A decrease does not take effect until an existing process finishes its current task; if the integer value is decreased, then child processes are idled but not deleted until the job exits. An example appears below.
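For example, at the Import> prompt (the value 10 is arbitrary, for illustration):

Import> PARALLEL=10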
With START_JOB=SKIP_CURRENT, the failing statement or current object being processed is skipped, and the job is restarted from the next work item. For parallel jobs, this option causes each worker to skip whatever it is currently working on and to move on to the next item at restart. The STATUS command displays the cumulative status of the job, a description of the current operation, and an estimated completion percentage; it also allows you to reset the display interval for logging mode status. You have the option of specifying how frequently, in seconds, this status should be displayed in logging mode.
If no value is entered, or if the default value of 0 is used, then the periodic status display is turned off, and status is displayed only once. This status information is written only to your standard output device, not to the log file (even if one is in effect). The following example displays the current job status and changes the logging mode display interval to two minutes (120 seconds).
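A sketch of that interaction:

Import> STATUS=120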
STOP_JOB stops the current job. To attach to and restart a stopped job later, the master table and dump file set must not be disturbed, either when you issue the command or afterward. If an immediate shutdown is requested, a warning requiring confirmation is issued. An orderly shutdown stops the job after worker processes have finished their current tasks, and they are then detached. With an immediate shutdown, after all clients are detached, the process structure of the job is immediately run down.
That is, the Data Pump control job process does not wait for the worker processes to finish their current tasks. After an immediate shutdown, however, you may be required at restart time to redo some tasks that were incomplete when the job stopped.
You can use these common scenario examples to learn how you can use Oracle Data Pump Import to move your data. The first is a table-mode import of the employees table, using the dump file created in "Performing a Table-Mode Export"; only table row data is loaded. The second is a schema-mode import of the dump file set created in "Performing a Schema-Mode Export"; for the given mode of import, all the objects contained within the source, and all their dependent objects, are included, except those specified in an EXCLUDE statement. Sketches of both commands appear below.
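Minimal sketches of the two commands, assuming a directory object named dpump_dir1 and dump files named expfull.dmp and expschema.dmp (all three names are assumptions):

> impdp hr TABLES=employees DIRECTORY=dpump_dir1 DUMPFILE=expfull.dmp CONTENT=DATA_ONLY
> impdp hr SCHEMAS=hr DIRECTORY=dpump_dir1 DUMPFILE=expschema.dmp EXCLUDE=CONSTRAINT,REF_CONSTRAINT,INDEX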
A third example imports the employees table from the hr schema into the scott schema, where the database link references a source database that is different from the target database. A fourth example shows how to use wildcards in the file name when importing multiple dump files into Oracle Autonomous Database from the Oracle Object Store Service. How to Read Graphic Syntax Diagrams: syntax diagrams are drawings that illustrate valid SQL syntax. To read a diagram, trace it from left to right, in the direction shown by the arrows.
SYSDBA is used internally and has specialized functions; its behavior is not the same as for general users. The redo that is generated in such a case is generally for maintenance of the Data Pump control table, or related to underlying recursive space transactions, data dictionary changes, and index maintenance for indexes on the table that require logging.
This additional time is required because the database must check to determine whether the new time zone rules change the values being loaded. The mode is specified on the command line, using the appropriate parameter. Note: When you import a dump file that was created by a full-mode export, the import operation attempts to copy the password for the SYS account from the source database.
Note: You cannot export transportable tablespaces and then import them into a database at a lower release level.
Filtering During Import Operations Oracle Data Pump Import provides data and metadata filtering capability, which can help you limit the type of information that you import. The same filter name can be specified multiple times within a job.
About Import Command-Line Mode Learn how to use Oracle Data Pump Import parameters in command-line mode, including case sensitivity, quotation marks, escape characters, and information about how to use examples. For background information on setting up the necessary environment to run the examples, see: Using the Import Parameter Examples. Specifying Import Parameters For parameters that can have multiple values specified, the values can be separated by commas or by spaces.
Case Sensitivity When Specifying Parameter Values For tablespace names, schema names, table names, and so on that you enter as parameter values, Oracle Data Pump by default changes values entered as lowercase or mixed-case into uppercase.
Using the Import Parameter Examples If you try running the examples that are provided for each parameter, then be aware of the following: After you enter the username and parameters as shown in the example, Import is started and you are prompted for a password. Unless specifically noted, these parameters can also be specified in a parameter file.
See Also: Oracle Database Sample Schemas, and your Oracle operating system-specific documentation for information about how special and reserved characters are handled on your system. For the ABORT_STEP parameter, the default is null, and the result of using each number is as follows: n — if the value is zero or greater, then the import operation is started. Restrictions: none.
ACCESS_METHOD — Purpose: instructs Import to use a particular method to load data. ATTACH — Default: if there is only one running job, then the current job in the user's schema. Purpose: attaches the client session to an existing import job, and automatically places you in interactive-command mode. You cannot attach to a job in another schema unless it is already running.