This example exports metadata and data for the REGIONS table, which is part of the HR sample schema, so that it can be created, along with its data, in another schema (either in the same Oracle database or another Oracle database).

Follow these steps, using SQL Developer, to export metadata and data for the REGIONS table:

1. Connect as user HR.
2. Deselect the option to include the schema name, so that the HR schema name is not included in CREATE and INSERT statements in the .sql script file that will be created. (This enables you to re-create the table in a schema with any name, such as one not named HR.)
3. For the output file, specify C:\temp\export.sql. The script file containing CREATE and INSERT statements will be created in this location.
4. For the format, select insert, which causes SQL INSERT statements to be included to insert the data. Other values include loader, which causes SQL*Loader files to be created, and xls, which causes a Microsoft Excel .xls file to be created.
5. Double-click the REGIONS table on the left to move it to the right-hand column. Figure 10-2 shows the result of these actions. (To limit the rows exported, you could specify one or more 'WHERE clauses' in the bottom part of this page.)
6. Complete the remaining pages of the wizard to perform the export. (The script file is created as C:\temp\export.sql.)
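As a point of reference, the resulting script might look like the following. This is a sketch, not exact SQL Developer output (DDL formatting varies by version); the column definitions and rows are those of the REGIONS table in the HR sample schema:

```sql
-- Illustrative content of C:\temp\export.sql (the HR schema name is omitted
-- because the option to include the schema name was deselected):
CREATE TABLE REGIONS
  ( REGION_ID   NUMBER NOT NULL,
    REGION_NAME VARCHAR2(25)
  );

INSERT INTO REGIONS (REGION_ID, REGION_NAME) VALUES (1, 'Europe');
INSERT INTO REGIONS (REGION_ID, REGION_NAME) VALUES (2, 'Americas');
INSERT INTO REGIONS (REGION_ID, REGION_NAME) VALUES (3, 'Asia');
INSERT INTO REGIONS (REGION_ID, REGION_NAME) VALUES (4, 'Middle East and Africa');
```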
This example re-creates the REGIONS table that you exported in 'Example: Exporting Metadata and Data for a Table', but in a different schema. This other schema can be an existing one or one that you create, such as a user NICK created by following the instructions in 'Example: Creating a User'. To re-create the REGIONS table in the schema of user NICK by invoking the script in C:\temp\export.sql, follow these steps using SQL Developer:

1. If you have not already created a database connection for NICK, create the connection.
2. Open the NICK connection.
3. In the SQL Worksheet for the NICK connection, type the following, and run it as a script:

   @C:\temp\export.sql

   The REGIONS table has been created and four rows have been inserted.

4. Expand the Tables node under the NICK connection. You now see the REGIONS table.
5. Click the REGIONS table in the Connections navigator, and examine the information under the Columns and Data tabs in the main display area.
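As a quick check (not part of the original steps), you could also query the re-created table from the NICK connection:

```sql
-- REGIONS was re-created without a schema prefix, so it belongs to NICK.
SELECT region_id, region_name
FROM   regions
ORDER  BY region_id;
-- Expect the four rows inserted by the script.
```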
This example exports only the data for the REGIONS table, which is part of the HR sample schema, so that the data can be imported into a table with the same column definitions. This might be a REGIONS table in another schema (either in the same Oracle database or another Oracle database).

Follow these steps, using SQL Developer, to export the data for the REGIONS table:

1. Connect as user HR.
2. This export does not need CREATE statements, but only INSERT statements.
3. For the output file, specify C:\temp\export.xls.
4. Double-click the REGIONS table on the left to have it appear in a row in the bottom part of the page. Figure 10-2 shows the result of these actions. (To limit the rows exported, you could specify one or more 'WHERE clauses' in the bottom part of this page.)
5. Complete the remaining pages of the wizard to perform the export. (This causes the data for the REGIONS table to be exported to the file C:\temp\export.xls.)
This example imports the data from the Excel file into a new (NEW_REGIONS) table. Assume that you have created a user NICK following the instructions in 'Example: Creating a User'. This user wants to take the exported data, add one row in the Excel file, and import it into a new table that has the same column definitions as the REGIONS table. (This example is trivial, and adding a row to the Excel file may not be typical, but it is presented merely to illustrate some capabilities.)

To import the data, follow these steps using SQL Developer:

1. If you have not already created a database connection for NICK, create the connection.
2. Open the NICK connection.
3. In the SQL Worksheet for the NICK connection, type and run a CREATE TABLE statement that defines NEW_REGIONS with the same column definitions as REGIONS. The NEW_REGIONS table has been created.
4. Expand the Tables node under the NICK connection. You now see the NEW_REGIONS table.
5. If you do not see the NEW_REGIONS table, disconnect from NICK (right-click NICK in the Connections navigator and select Disconnect) and connect again, and expand the Tables node.
6. Under NICK, right-click the NEW_REGIONS table and select Import Data.
7. Go to the c:\temp folder, select export.xls, and click Open.
8. Complete the import wizard. The data from the .xls file is loaded into the NEW_REGIONS table and is committed.

Feature or Utility | Description |
---|---|
SQL*Loader utility | Loads data from external files (such as flat text files) into tables of an Oracle database. |
Data Pump Export and Data Pump Import utilities | Export and import data and metadata between Oracle databases, using a binary dump-file format. |
Export and Import utilities | The original export and import utilities, invoked with the exp and imp commands; unlike Data Pump, they support XMLType data. |
Import/Export Scenario | Recommended Option |
---|---|
You have to load data that is not delimited. The records are fixed length, and field definitions depend on column positions. | SQL*Loader |
You have tab-delimited text data to load, and there are more than 10 tables. | SQL*Loader |
You have text data to load, and you want to load only records that meet certain selection criteria (for example, only records for employees in department number 3001). | SQL*Loader |
You want to import or export an entire schema from or to another Oracle database. There is no XMLType data in any of the data. | Data Pump Export and Data Pump Import |
You want to import or export data from or to another Oracle database. The data contains XMLType data and contains no FLOAT or DOUBLE data types. | Import (imp) and Export (exp) |
A conventional path load executes SQL INSERT statements to populate tables in an Oracle database. This method can sometimes be slower than other methods because extra overhead is added as SQL statements are generated, passed to Oracle, and executed. It can also be slower because when SQL*Loader performs a conventional path load, it competes equally with all other processes for buffer resources.

An external table load creates an external table for data that is contained in a datafile and executes INSERT statements to insert the data from the datafile into the target table. An external table load allows modification of the data being loaded by using SQL functions and PL/SQL functions as part of the INSERT statement that is used to create the external table.
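The external-table technique can be sketched as follows. Everything here is hypothetical (the regions_ext table, regions.dat file, and dmpdir directory object are invented for illustration); it shows an external table over a comma-delimited file and a SQL function applied in the INSERT statement:

```sql
-- Hypothetical external table over a comma-delimited data file.
CREATE TABLE regions_ext (
  region_id   NUMBER,
  region_name VARCHAR2(25)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY dmpdir          -- a directory object the loader can read
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('regions.dat')
);

-- The data can be modified with SQL functions as it is loaded:
INSERT INTO regions (region_id, region_name)
  SELECT region_id, INITCAP(region_name) FROM regions_ext;
```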
In this example, a new table named dependents will be created in the HR sample schema. It will contain information about dependents of employees listed in the employees table of the HR schema. After the table is created, SQL*Loader will be used to load data about the dependents from a flat data file into the dependents table.

1. Create a data file named dependents.dat, in your current working directory. You can create this file using a variety of methods, such as a spreadsheet application or by simply typing it into a text editor. It contains the records (one for each dependent) that are to be loaded.
2. Create a control file named dependents.ctl, in your current working directory. You can create this file with any text editor. It describes to SQL*Loader how the fields in dependents.dat map to the columns of the dependents table.
3. Log in to the database host as the oracle user account.
4. Start the SQL Command Line and connect as user hr by entering the following at the command prompt:

   sqlplus hr

5. Create the dependents table. The NOT NULL constraint on the last_name column indicates that a value must be provided. The constraint on the relative_id column indicates that it must match a value in the employee_id column of the employees table. The benefits column has a datatype of CLOB so that it can hold large blocks of character data. (In this example, there is not yet any benefits information available, so the column is shown as NULL in the data file, dependents.dat.)
6. After you receive the Table created message, enter exit to exit the SQL Command Line.
7. Run SQL*Loader from the system command prompt, specifying dependents.ctl as the control file. The dependents.dat file is loaded into the dependents table, and a message confirming the load is displayed.
8. SQL*Loader also writes a log file named dependents.log, whose content records the details of the load.
9. You can now query the dependents table, as you would any other table.
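Neither the data file and control file contents nor the CREATE TABLE statement survive in this excerpt. The following is a hypothetical reconstruction, for illustration only: the column names, sample rows, and date format are assumptions chosen to be consistent with the constraints described above.

```sql
-- dependents.dat (assumed comma-delimited sample; the benefits field is left
-- empty so that the CLOB column is NULL):
--   1,King,Sara,15-JAN-2002,daughter,100,
--   2,Kochhar,Asha,21-MAR-2000,son,101,
--
-- dependents.ctl (a minimal control file for such a data file):
--   LOAD DATA
--   INFILE 'dependents.dat'
--   INTO TABLE dependents
--   FIELDS TERMINATED BY ','
--   TRAILING NULLCOLS
--   (dependent_id, last_name, first_name, birthdate DATE "DD-MON-YYYY",
--    relationship, relative_id, benefits CHAR(100000))

-- A CREATE TABLE statement consistent with the constraints described in the text:
CREATE TABLE dependents (
  dependent_id NUMBER(6),
  last_name    VARCHAR2(25) NOT NULL,  -- a value must be provided
  first_name   VARCHAR2(20),
  birthdate    DATE,
  relationship VARCHAR2(15),
  relative_id  NUMBER(6) REFERENCES employees (employee_id),  -- must match an employee
  benefits     CLOB                    -- holds large blocks of character data
);

-- SQL*Loader invocation, run at the system command prompt (not in SQL*Plus):
--   sqlldr hr CONTROL=dependents.ctl LOG=dependents.log
```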
Suppose you want to make changes to the HR sample schema and then test those changes without affecting the current HR schema. You could export the HR schema and then import it into a new HRDEV schema, where you could perform development work and conduct testing. To do this, take the following steps:

1. Log in to the database host as the oracle user account.
2. Create a tmp directory to hold the dump and log files, if one does not already exist.
3. Start the SQL Command Line and connect as user SYSTEM by entering the following at the command prompt:

   sqlplus system/password

   where password is the password that you specified for the SYS and SYSTEM user accounts upon installation (Windows) or configuration (Linux) of Oracle Database XE.

4. Issue commands to create a directory object named dmpdir for the tmp directory that you just created, and to grant read and write access to it for user HR.
5. Export the HR schema to a dump file named schema.dmp by running the Data Pump Export utility (expdp) at the system command prompt, supplying the password for the SYSTEM user. The schema.dmp file and the expschema.log file are written to the dmpdir directory.
6. Import the dump file, schema.dmp, into another schema, in this case, HRDEV. You use the REMAP_SCHEMA command parameter to indicate that objects are to be imported into a schema other than their original schema. Because the HRDEV user account does not already exist, the import process automatically creates it. In this example, you will import everything except constraints, ref_constraints, and indexes. If a table already exists, it is replaced with the table in the export file. Run the Data Pump Import utility (impdp) at the system command prompt, supplying the password for the SYSTEM user.
7. When the import operation completes, you can review the import messages in the impschema.log file in the dmpdir directory. The HRDEV schema is now populated with data from the HR schema.
8. The import operation creates the HRDEV user account, but without a usable password, so you must assign it a password before you can connect as HRDEV. To do so, start the SQL Command Line and connect as user SYSTEM (as you did earlier), and then at the SQL prompt, enter an ALTER USER statement that assigns a password to HRDEV. You can then connect to the database as hrdev.

You can now make changes in the HRDEV schema without affecting your production data in the HR schema.
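The statements and commands referred to in these steps are not reproduced in this excerpt. The following is a hypothetical reconstruction for illustration: the parameter values are taken from the file and object names mentioned in the text, but the /tmp path and the exact parameter list are assumptions.

```sql
-- As user SYSTEM in the SQL Command Line: create the directory object and
-- grant access to HR (the operating-system path is an assumption):
CREATE OR REPLACE DIRECTORY dmpdir AS '/tmp';
GRANT READ, WRITE ON DIRECTORY dmpdir TO hr;

-- Data Pump Export, run at the system command prompt:
--   expdp SYSTEM/password SCHEMAS=HR DIRECTORY=dmpdir
--     DUMPFILE=schema.dmp LOGFILE=expschema.log

-- Data Pump Import, remapping HR to HRDEV, excluding constraints,
-- ref_constraints, and indexes, and replacing tables that already exist:
--   impdp SYSTEM/password SCHEMAS=HR DIRECTORY=dmpdir DUMPFILE=schema.dmp
--     REMAP_SCHEMA=HR:HRDEV EXCLUDE=CONSTRAINT,REF_CONSTRAINT,INDEX
--     TABLE_EXISTS_ACTION=REPLACE LOGFILE=impschema.log

-- Assign HRDEV a password so that you can connect as hrdev:
ALTER USER hrdev IDENTIFIED BY hrdev;
```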
The original Export and Import utilities are invoked with the exp and imp commands, respectively. These utilities provide support for XMLType data, whereas the Data Pump Export and Import utilities do not. However, they do not support the FLOAT and DOUBLE data types. If your data contains these types and does not contain XMLType data, you must use Data Pump Export and Import, described in 'Exporting and Importing with Data Pump Export and Data Pump Import'.

Dump files created by the Export (exp) utility can only be imported by the Import (imp) utility; they cannot be imported with the Data Pump Import (impdp) utility.