The import utility populates a table, typed table, or view
with data using an SQL INSERT statement. If the table or view receiving
the imported data already contains data, the input data can either
replace or be appended to the existing data.
Like export, import is a relatively simple data movement utility.
It can be activated through the Control Center, by issuing CLP commands,
by calling the ADMIN_CMD stored procedure, or by calling its API, db2Import,
through a user application.
There are a number of data formats that import supports, as well
as features that can be used with import:
- Import supports IXF, WSF, ASC, and DEL data formats.
- Import can be used with file type modifiers to customize the import
operation.
- Import can be used to move hierarchical data and typed tables.
- Import logs all activity, updates indexes, verifies constraints,
and fires triggers.
- Import allows you to specify the names of the columns within the
table or view into which the data is to be inserted.
- Import can be used with DB2 Connect™.
Important: Support for the WSF file
format is deprecated and might be removed in a future release. It
is recommended that you start using a supported file format instead
of WSF files before support is removed.
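For example, a basic CLP invocation might look like the following sketch, where the input file myfile.del and the table STAFF are illustrative names:

```
db2 import from myfile.del of del insert into staff
```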
Import modes
Import has five modes that determine how the data is imported. The first three, INSERT, INSERT_UPDATE, and REPLACE, are used when the target tables already exist. All three support the IXF, WSF, ASC, and DEL data formats. However, only INSERT and INSERT_UPDATE can be used with nicknames.
Table 1. Overview of INSERT, INSERT_UPDATE, and REPLACE import modes
Mode | Best practice usage
INSERT | Inserts input data into the target table without changing existing data
INSERT_UPDATE | Updates rows whose primary key values match those of input rows; where there is no matching row, inserts the imported row into the table
REPLACE | Deletes all existing data and inserts the imported data, while keeping the table and index definitions
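As a sketch of these three modes, the following hypothetical CLP commands assume an IXF input file empdata.ixf and an existing table EMP with a primary key (both names are illustrative):

```
-- Append rows, leaving existing data unchanged
db2 import from empdata.ixf of ixf insert into emp

-- Update rows that match on the primary key; insert the rest
db2 import from empdata.ixf of ixf insert_update into emp

-- Empty the table first, then insert the imported rows
db2 import from empdata.ixf of ixf replace into emp
```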
The other two modes, REPLACE_CREATE and CREATE, are used when the target tables do not exist. They can only be used with input files in the PC/IXF format, which contains a structured description of the table that is to be created. Imports cannot be performed in these modes if the object table has any dependents other than itself.
Note: Import's CREATE and REPLACE_CREATE modes are being deprecated. Use the db2look utility instead.
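Because CREATE and REPLACE_CREATE are being deprecated, a common replacement pattern is to extract the table definition with db2look, create the table, and then import in INSERT mode. A sketch, assuming a source database SRCDB and a table MYTABLE (both names are illustrative):

```
-- Extract the DDL statements for MYTABLE to a script file
db2look -d srcdb -t mytable -e -o mytable.ddl

-- Create the table in the target database, then import the data
db2 -tvf mytable.ddl
db2 import from mytable.ixf of ixf insert into mytable
```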
Table 2. Overview of REPLACE_CREATE and CREATE import modes
Mode | Best practice usage
REPLACE_CREATE | Deletes all existing data and inserts the imported data, while keeping the table and index definitions; creates the target table and index if they do not exist
CREATE | Creates the target table and index; can specify the name of the table space where the new table is created
For importing data, you can
use the task assistant available in IBM® Data Studio Version
3.1 or later. Task assistants can guide you through the process of
setting options, reviewing the automatically generated commands to
perform the task, and running these commands. For more details, see Administering databases with task assistants.
How import works
The number of steps and
the amount of time required for an import depend on the amount of
data being moved and the options that you specify. An import operation
follows these steps:
- Locking tables
Import acquires either an exclusive
(X) lock or a nonexclusive (IX) lock on existing target tables, depending
on whether you allow concurrent access to the table.
- Locating and retrieving data
Import uses the FROM
clause to locate the input data. If your command indicates that XML
or LOB data is present, import will locate this data.
- Inserting data
Import either replaces existing data
or adds new rows of data to the table.
- Checking constraints and firing triggers
As the data
is written, import ensures that each inserted row complies with the
constraints defined on the target table. Information about rejected
rows is written to the messages file. Import also fires existing triggers.
- Committing the operation
Import saves the changes made and releases the locks on the target table. You can also specify that periodic commits take place during the import.
The following items are mandatory for a basic import operation:
- The path and the name of the input file
- The name or alias of the target table or view
- The format of the data in the input file
- The method by which the data is to be imported
- The traverse order, when importing hierarchical data
- The subtable list, when importing typed tables
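Mapping the first four items onto a CLP command, a minimal invocation supplies the input path and file (FROM), the data format (OF), the mode, and the target; a target column list may also be given. The path, file, table, and column names below are illustrative:

```
db2 import from /home/user/data/sales.del of del insert into sales (saledate, amount)
```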
Additional options
There are a number of options that allow you to customize an import operation. You can specify file type modifiers in the MODIFIED BY clause to change the format of the data, to tell the import utility what to do with the data, and to improve performance.
The import utility, by default,
does not perform commits until the end of a successful import, except
in the case of some ALLOW WRITE ACCESS imports. This
improves the speed of an import, but for the sake of concurrency,
restartability, and active log space considerations, it might be preferable
to specify that commits take place during the import. One way of doing
so is to set the COMMITCOUNT parameter to "automatic,"
which instructs import to internally determine when it should perform
a commit. Alternatively, you can set COMMITCOUNT to
a specific number, which instructs import to perform a commit once
that specified number of records has been imported.
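As a sketch of the two COMMITCOUNT styles (the file and table names are illustrative):

```
-- Let import decide internally when to commit
db2 import from bigfile.del of del commitcount automatic insert into orders

-- Commit after every 1000 imported records
db2 import from bigfile.del of del commitcount 1000 insert into orders
```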
There are
a few ways to improve import's performance. As the import utility
is an embedded SQL application and does SQL fetches internally, optimizations
that apply to SQL operations apply to import as well. You can use the compound file type modifier to insert a specified number of rows at a time, rather than using the default row-by-row insertion. If you anticipate that a large number of warnings would be generated during the import (and would, therefore, slow down the operation), you can also specify the norowwarnings file type modifier to suppress warnings about rejected rows.
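For example, the two modifiers can be combined in one MODIFIED BY clause; the value and names below are illustrative, and compound accepts a row count of up to 100:

```
db2 import from data.del of del modified by compound=100 norowwarnings insert into orders
```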
Messages file
During an import, standard ASCII text message files are written to contain the error, warning, and informational messages associated with that operation. If the utility is invoked through the application programming interface (API) db2Import, you must specify the name of these files in advance with the MESSAGES parameter; otherwise, specifying them is optional. The messages file is a convenient way of monitoring the progress of an import, because you can access it while the import is in progress. In the event of a failed import operation, message files can be used to determine a restarting point by indicating the last row that was successfully imported.
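For example, a CLP invocation that names a messages file might look like the following sketch (the file and table names are illustrative); the named file can then be inspected while the import runs:

```
db2 import from data.del of del messages import.msg insert into staff
```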
Note: If the volume of
output messages generated by an import operation against a remote
database exceeds 60 KB, the utility will keep the first 30 KB and
the last 30 KB.