IBM Migration Toolkit (MTK) FAQ

Product documentation


This document answers many common IBM Migration Toolkit (MTK) questions.


Frequently asked questions

Here are some MTK frequently asked questions.

What is MTK?
MTK is a utility that converts source DBMS constructs and data into forms compatible with IBM® DBMSs. MTK is a free downloadable utility:
  • No part number.
  • Internally used by IBM migration specialists and externally used by business partners, ISVs, and more.
Why is the IBM Migration Toolkit needed?
Because SQL is not all the same:
  • Different standards levels
  • Differences in ANSI compliance
  • Differences in implementations:
    • Proprietary source extensions
    • Data types and type semantics
    • Procedural language syntax
    • Built-in functions
What does MTK convert?
For supported source database platforms, MTK converts:
  • DDL statements:
    • ALTER TABLE and more
    • Constraints
  • SQL statements:
    • Triggers
    • Procedures
    • Functions
What does MTK not convert?
  • Applications
  • Replication schemes
  • OLAP specific features
  • Catalog/system tables
  • Statements specific to system administration
  • Disk Partitioning Schemes
What is involved in a database migration?
1. Assess source database – size, complexity
2. Educate and train
3. Setup environment – develop, administer, test, production
4. Obtain source structure
5. Port source structure
6. Deploy objects to target
7. Migrate data
8. Verify target
9. Tune performance
10. Modify applications
Installation directory name
Here are some restrictions on the installation directory name:
  • MTK must not be installed into a directory whose name contains a space. Everything works until deployment, when the DB2 database product commands that load the files into the DB2 database product fail because of the space in the file name.
MTK won’t start at all
Java 1.4.2 or later must be installed and accessible through the PATH environment variable.
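As a quick diagnostic, you can check from a shell whether a Java runtime is reachable through PATH (a sketch; MTK itself does not ship this script):

```shell
# Report whether a Java runtime is on the PATH, and which version.
if command -v java >/dev/null 2>&1; then
  # Print only the first line of the version banner.
  java -version 2>&1 | head -n 1
else
  echo "java not found on PATH - add your JRE's bin directory to PATH"
fi
```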
Source metadata extraction
The problem is probably with the connection to the source database. Either the ODBC or JDBC connection is not set up correctly, or the user ID does not have read access to the system catalog. See "DB2 database product connection problem" for details about the requirements for the JDBC connection.
Problems with this step are best diagnosed from the mtk.log which is found in the MTK installation directory. Because the mtk.log file is overwritten every time MTK starts, you need to:
1. Reproduce the problem.
2. Close MTK.
3. Send the mtk.log file to the MTK development team for further analysis.
Translation problems
Most of the common translation problems are caused by the use of context files, schema handling, and translation of statements that do not map easily to the DB2 database product.

The extractor puts statements in the file to set the database or schema environment for the objects in the file. Imported files often lack those USE, CONNECT, or SETUSER statements. The translator falls back to a default schema for the objects, but that default does not always help the translator resolve object names (especially if you are using a mixture of imported and extracted files), and it rarely works for data extraction.

MTK users are sometimes confused about the default schema entry field. When it is set to "from_first_object," the translator takes the schema for the first object as the schema to be replaced by the DB2 database product user ID used in deployment. When it is set to "specify_schema_for_all_object," the translator keeps the schema on every object through deployment. The user can also type in a specific schema, which is then replaced by the IBM target server user ID used during deployment.

If the problem is not caused by user error or is an exception in the translator code, take the source and try to isolate the problem to the smallest test case possible. Provide this test case to the MTK development team.

Refine problems
When the default schema entry field in the Convert step is set to "from_first_object," the translator takes the schema for the first object as the schema to be replaced by the IBM target user ID used in deployment. Because of this, even if the schema for the first object is refined, the refinement does not take effect as the IBM target user ID that is used during the deployment and verification steps.
Deployment to a DB2 database product

Make sure that your IBM target and Java environments are set up correctly.

To diagnose problems with data extraction, you want to look at mtk.log and the Verify_*.out file. To diagnose problems with deployment into a DB2 database product, you want to look at the Deploy_*.log files.

What source databases are supported?

  • Sybase 11, 12, and 12.5; SQL Server 7 and SQL Server 2000.
  • SQLServer 6.5 has been reported to work, but is not tested.
  • Oracle 7, 8, and 9.
  • Informix 7.3 and Informix 9. Data movement from Informix 7.2 is limited by the lack of a TO_CHAR function to put dates into a format that the DB2 database product consumes easily.
Generally, MTK does not process invalid source code successfully.
On what platforms does MTK run?

Migrating from a DB2 database product to a DB2 database product
The IBM Migration Toolkit is not designed to migrate from one DB2 database product to another DB2 database product. Please consider the following options to get proper help:
DB2 database product connection problem
A customer received the following errors while trying to deploy their database from a local machine to a remote server, although they were able to connect to the database by using the DB2 Client. They said it appeared that the message was looking for a DB2 Database for Linux, UNIX, and Windows JDBC driver, but they were not sure where to check or what to do to correct the problem.

ERROR MTKDB2ConnectionUNO DB2 UDB JDBC Driver not

ERROR DeploymentExtensionUNO The DB2 Connection to the database ASDTTST1 failed (rc=-6) with the following error message:No suitable driver

MTK uses the JDBC driver in SQLLIB\java\, which must be in the class path.

Check your CLASSPATH by opening a command window and typing SET CLASSPATH. The class path that is displayed should include D:\SQLLIB\java\;D:\SQLLIB\java\;D:\SQLLIB\java\;D:\SQLLIB\bin; with D:\SQLLIB replaced by the drive and path for the local installation of DB2 Database for Linux, UNIX, and Windows.

The class path should not include any *.dll files.

You can either change the class path for the system or add the DB2 database product jar files to the class path in MTKMain.bat file.
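One quick way to spot a stray .dll entry is to scan the class path value. The sketch below uses an illustrative CLASSPATH value rather than your real one; substitute "$CLASSPATH" in practice:

```shell
# Example CLASSPATH value for illustration only.
CP='D:\SQLLIB\java\;D:\SQLLIB\bin;D:\some\library.dll'
case "$CP" in
  *.dll*) echo "remove .dll entries from CLASSPATH" ;;
  *)      echo "no .dll entries found" ;;
esac
```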

If you are running a local 32-bit application (such as MTK) in a 64-bit DB2 database product instance, LIBPATH, SHLIB_PATH, and LD_LIBRARY_PATH should contain sqllib/lib32 instead of sqllib/lib.
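On UNIX systems that adjustment can look like the following sketch; the instance directory /home/db2inst1/sqllib is an assumed example, not a value from your installation:

```shell
# Point the library search paths at sqllib/lib32 for a 32-bit MTK
# running against a 64-bit DB2 instance. DB2DIR is a hypothetical path.
DB2DIR=${DB2DIR:-/home/db2inst1/sqllib}
export LIBPATH="$DB2DIR/lib32:$LIBPATH"                   # AIX
export SHLIB_PATH="$DB2DIR/lib32:$SHLIB_PATH"             # HP-UX
export LD_LIBRARY_PATH="$DB2DIR/lib32:$LD_LIBRARY_PATH"   # Linux, Solaris
echo "$LD_LIBRARY_PATH"
```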

DB2 database product-specific questions

  • A request to size an xSeries® hardware configuration is not within the expertise or scope of MTK or the SMPO team. Techline can provide this service. Check out the following website:
  • A C++ compiler is required for stored procedures in a DB2 database product before version 8.2. Alternatives are listed in the DB2 database product documentation.
Source database connectivity errors
Verify installation of the source database client.
Out of memory errors
The IBM Migration Toolkit runs in the available Java memory space. For large migrations that require more memory, you can allocate more memory space.
To allocate more memory space:
1. In the directory where the IBM Migration Toolkit is installed, locate and edit the launching file (MTK.bat, MTKMain, or MTKMainOra).
2. Add the -XmxNNNm flag to the Java command, where NNN is the amount of memory, in megabytes, that you want to allocate. For example, to allocate 256 MB of memory for the migration, change the command to:
java -Xmx256m -classpath "..."
Do not change any of the other flags. The default size is 1/3 or 1/2 of the physical memory on the machine.
Continuing out of memory errors
The best workaround for this is to extract half of the objects into one file, then extract the other half of the objects into a second file. During conversion, you can select both files at the same time to convert all of the stored procedures. If you encounter memory restrictions during conversion, then you should convert by sections.
What are the steps for conversion?
See the MTK help documentation. From inside the MTK interface, go to Help > Help Content and "The GUI process."
In Sybase or SQLServer conversion, CREATE DEFAULT and sp_bindefault statements are not replaced by DB2 database product statements
The CREATE DEFAULT statement is translated by adding a DEFAULT clause to the column named in the sp_bindefault statement. The DB2 database product does not name defaults, so the default name is not migrated.
Why does MTK produce separate files for TEXT, IMAGE, CLOB, or BLOB data?
There are potential problems with putting LOB data inline if the LOB values are long. With ASCII format, each row has to have enough space for the longest LOB value.

With delimited format, binary data (BLOB) or strings of undefined content, as is often the case with CLOB data, can cause problems. If the target is specified as a DB2 database product, MTK uses LOB locator strings and puts the LOB data for many rows into a single file.

Why isn't the data moved?
Assuming you have checked the "Extract and store data on this system" and "Load data to target database by using generated scripts" boxes on the deployment page, the problem is probably that you are connected to Oracle with a user ID that does not have SELECT permission on the tables you are trying to migrate. There will be error messages saying this in the log file (mtk.log in the installation directory).

MTK can extract the metadata because that requires only SELECT permission on the system catalog; however, extracting data requires SELECT permission on the specific table.

Common setup issues
Cannot connect to the source database
Refer to the MTK documentation for information on setup of source database clients.
MTK on UNIX does not support connections to source database through ODBC
Make sure you have the required JDBC clients on UNIX.
Common usage issues
SQL 1042C error on DB2 on Linux when running MTK Java UDFs
Make sure your Java environment for Linux is correctly set. For more information, refer to DB2 Information center at
DB2 Database for Linux, UNIX, and Windows JDBC Driver not found
Make sure SQLLIB\java\ is included in the class path.
MTK fails to load LOB data into the DB2 database product table
You have run into a DB2 database product limitation that afflicts remote deployment. Remote deployment does not work with LOB data, because DB2 database product requires a path on the server for the LOBSPATH parameter.

If the DB2 database product server can connect to the source database server, either MTK can be installed on the DB2 database product server to run the migration there, or the scripts that MTK creates for moving data manually can be executed on the DB2 database product server, which gets around the LOBSPATH problem. There are detailed instructions for doing this in the MTK topic "Manually transferring data by using files."

If the DB2 database product server cannot connect to the source database server, things get more complicated:
1. You can use the Directory for data extraction field on the Generate Data Transfer Scripts page to put the extracted data into a directory on the DB2 database product server.
2. After deploying the objects and extracting the data on the Deploy to target panel, you can move the DataMove_*_db2.bat or DataMove_* file to the DB2 database product server.
3. Edit DataMove_*_db2.bat or DataMove_* to correct the path.
4. Run DataMove_*_db2.bat or DataMove_* to load the data into the DB2 database product.
When I selected from my table I saw references (such as "D161/C20_807113.out") instead of the actual LOB data
The LOAD statement used should have two parameters to instruct the DB2 database product on where to get the LOB data. For example:

LOBS FROM C:\MTK\projects\SQL2000\DataOutScripts\dbo_Categories\ MODIFIED BY LOBSINFILE

If either is missing, then references are written instead of LOB data.
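A complete LOAD command carrying both clauses might look like the sketch below, written to a CLP script first; the input file, LOB path, and table names are illustrative, not taken from your project:

```shell
# Write a LOAD command that names both LOB options to a CLP script.
cat > load_categories.clp <<'EOF'
LOAD FROM dbo_Categories.out OF DEL
  LOBS FROM C:\MTK\projects\SQL2000\DataOutScripts\dbo_Categories\
  MODIFIED BY LOBSINFILE
  INSERT INTO dbo.Categories;
EOF
# Run it later with: db2 -tvf load_categories.clp
grep -q 'LOBS FROM' load_categories.clp && grep -q 'LOBSINFILE' load_categories.clp \
  && echo "both LOB clauses present"
```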
I want to distribute MTK UDF's with my migrated application
Given that IBM provides you the UDFs "as-is" without warranty or indemnification, you can redistribute them with your application.
I encountered a "transaction log full" problem when deploying data to a DB2 database product by selecting the "import" function
To fix this:
1. In the Generate Data Transfer Script panel, select Import and then select the Advanced options button.
2. In the dialog box that appears, enter a number of rows in the commit count field. If the DB2 database product is configured with "Log retain for recovery status" set to no (the default), the log space will be recovered at each commit.
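The generated IMPORT command should then carry a COMMITCOUNT modifier, as in this sketch (the file, count, and table names are examples):

```shell
# Write an IMPORT command with a commit count to a CLP script.
cat > import_orders.clp <<'EOF'
IMPORT FROM dbo_Orders.out OF DEL
  COMMITCOUNT 1000
  INSERT INTO dbo.Orders;
EOF
# Run it later with: db2 -tvf import_orders.clp
grep -q 'COMMITCOUNT' import_orders.clp && echo "commit count present"
```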
I have some stored procedures in MSSQL with default values. It looks like this functionality is not supported in the DB2 database product - is that correct?
MTK strips off the default values without issuing a message. Default parameter values are not supported by the DB2 database product. However, the converter does not lose the default value when it removes it from the procedure statement; instead, it adds the default value to the calling statement.
Problem with moving Unicode data from SQL Server to a DB2 database product
The load or import statement for a Unicode file needs the "no check lengths" option selected on the advanced options page before creating the data transfer scripts. The code page may also need to be set on that page.
Codepage issues into DB2 database product
In the Advanced Options for Generate Data Transfer Scripts panel there is a field labeled "code page." This field needs to be filled in with the appropriate value so that the generated LOAD command contains the "codepage=xxx" modifier that tells DB2 database product to interpret the special characters in the transfer file correctly.

See this article for more information:

I am getting this error when I try to run the SQL Translator: --* [200040] "C:\MTK\projects\testsql\SQLTranslator.input"(24:1)-(24:14) Input Error: card_trans_vue is not a valid table, view, or table function name.
The SQL Translator does not have a definition of card_trans_vue available. You can include the definition of card_trans_vue in the script window. If you have run a script containing the definition of card_trans_vue through the Convert step, you can select "Use all files" in the drop-down list next to the Paste button. Then the SQL Translator will have the definition of card_trans_vue available and will not issue the error.
Whenever I try to migrate the objects from Sybase to the DB2 database product, by default it is moving all of the UDT data types also. How can I avoid this?
MTK always extracts the UDT data types. The only way to avoid migrating those to the DB2 database product is to edit the source file after extraction and delete the unneeded UDT data types in the file before conversion.
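For a Sybase source, user-defined types are typically declared with sp_addtype calls, so one way to drop them before conversion is a simple filter. This is a sketch: the file names are examples, and it assumes each sp_addtype call sits on its own line:

```shell
# Build a small sample extracted script, then strip the UDT lines.
printf '%s\n' \
  "exec sp_addtype ssn_t, 'char(9)'" \
  "CREATE TABLE emp (id int)" > extracted.src
grep -vi 'sp_addtype' extracted.src > extracted_noudt.src
cat extracted_noudt.src
```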
How do we specify a specific table space name for DB2 database product tables?
The MTK team recommends that customers use the DB2 Control Center wizard to create their table spaces.

MTK provides an initial CREATE TABLESPACE statement at the beginning of the DB2 database product file when you choose that option in the converter advanced options. Providing this CREATE TABLESPACE statement causes the CREATE TABLE statements to keep the IN table space parameter and put the tables into table spaces with the correct names.

You need to add index and long table space clauses to the DB2 database product file after the column list:
  INDEX IN tablespace-name2  LONG IN tablespace-name3
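Applied to a CREATE TABLE statement, the result would look like this sketch (the table, column, and table space names are invented for illustration):

```shell
# Write a CREATE TABLE statement with data, index, and long table
# space clauses to a CLP script.
cat > create_orders.clp <<'EOF'
CREATE TABLE myschema.orders (
  id    INTEGER NOT NULL,
  note  CLOB(1M)
)
IN data_ts INDEX IN index_ts LONG IN long_ts;
EOF
# Deploy later with: db2 -tvf create_orders.clp
grep -n 'INDEX IN' create_orders.clp
```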

MTK creates a lot of functions with SYB, MS7, ORA, or INFX schema. How do I migrate those from one DB2 database product to another?
The functions with the SYB schema are being created by mtksyb.udf during the deployment. If you look in the Deploy_conversion.bat file, you will see a block of lines like this (with cmp replaced by your database name):
@ECHO Installing JAVA UDFs file under name syb.cmp...
DB2 -v -td! -f "C:\MTK\mtksybdrop.udf" >null
DB2 CALL SQLJ.REMOVE_JAR('syb.cmp') >null
DEL null
DB2 -v CALL SQLJ.INSTALL_JAR('file:C:\MTK\sybUDFs.jar','syb.cmp') >>%UDFLOGFILE%
DB2 -v -td! -f "mtksyb.udf" >>%UDFLOGFILE%
@ECHO Creation of MTK UDFs done.

Here is a guide for repeating the deployment of the UDFs into another database:

  • SYB from mtksyb.udf or mtksybISeries.udf
  • MS7 from mtkms7.udf or mtkms7ISeries.udf
  • ORA from mtkora8.udf or mtkora8ISeries.udf
  • INFX from mtkinfx.udf or mtkinfxISeries.udf
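To repeat that deployment against another database, you can adapt the same block of commands; this sketch writes them to a script first (the database name otherdb, the JAR ID syb.otherdb, and the paths are examples):

```shell
# Capture the redeploy commands for the SYB UDFs into a script.
cat > redeploy_syb_udfs.sh <<'EOF'
db2 CONNECT TO otherdb
db2 -v "CALL SQLJ.INSTALL_JAR('file:C:\MTK\sybUDFs.jar','syb.otherdb')"
db2 -v -td! -f mtksyb.udf
EOF
wc -l < redeploy_syb_udfs.sh
```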
The installation of the JAR file completed successfully, but creation of each JAVA UDF fails with DB21034E
The command was processed as an SQL statement because it was not a valid Command Line Processor command. During SQL processing it returned:

SQL20204N The user defined function or procedure "PROCID" was unable to map to a single Java method. LINE NUMBER=3. SQLSTATE=46008

The JAR file is installed under a different name than is used in the external name of the procedure. For example, the JAR file might have been installed with CALL SQLJ.INSTALL_JAR('file:/db2home/db2inst2/worksp/scripts/sybUDFs.jar','syb.cmp').

When MTK prepares to deploy the UDF file to DB2 database product, it changes the instances of udfjar in the external name to match the DB2 database product name.
I want to keep the original comments on the translation of my stored procedures
MTK either preserves all comments (both those created by MTK and the original stored procedure comments) or removes all comments. Most of the comments produced by MTK are the original source statements. Because the translation to the DB2 database product cannot be a one-for-one mapping of statements, there is no way to copy only the source comments into the DB2 database product code with any assurance that the comments would be placed somewhere that makes sense. Therefore, MTK keeps source comments only with the original source statements.
I have my tables in one file and my procedures in another. When I translate the procedures with the tables file set as the context file, the translator says it cannot find the tables.
The translator maps each database or schema combination into a schema for the DB2 database product, because all the objects go into a single DB2 database product database:

  • The objects in the context file all go to dbo schema.
  • Objects with dbo schema in the next database go into dbo1 schema.
  • If the procedures file contains a USE database or SETUSER username statement with a different database or user name than appears in the tables file, the translator treats the following objects as belonging to a new database or schema combination, which maps to dbo1.
The referenced tables are not found in schema dbo1, only in schema dbo, so the names do not resolve.

You can either remove the USE or SETUSER statement from the procedures file, or add the same statement to the tables file to tell the translator to treat all of the objects as being in the same database or schema combination. The translator will then successfully find the tables.


Why are nullable character columns converted to variable-length types?
Except for the special case of character types of length one (CHAR(1)), nullable character types are converted to variable-length character types. This is because in Sybase a nullable CHAR behaves the same as a VARCHAR: trailing space padding is truncated and NULL values are allowed. However, in a DB2 database product, even a nullable CHAR column remains fixed length for all values except NULL. Also, in a DB2 database product columns are nullable by default, whereas in Sybase the default is non-nullable.

I cannot connect to the Oracle database
The machine running MTK needs either an ODBC connection to the Oracle server, or an Oracle client and ojdbc14.jar in the class path, in order for MTK to access the Oracle server. The Oracle native driver connection uses a service name and relies on the Oracle client to use the information in the tnsnames.ora file to translate that into the details of the connection string.
The entry in the tnsnames.ora file on that machine supplies the host and port information. For example (the host is a placeholder; 1521 is the usual Oracle listener port):
oracle =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = myhost)(PORT = 1521))
    (CONNECT_DATA = (SERVICE_NAME = oracle))
  )

I do not have an ODBC connection and the native driver option is grayed out
MTK needs ojdbc14.jar to be found in the class path.
MS SQLServer 2000
MTK needs three MS jar files to be found in the class path. The jar files needed are:

  • msbase.jar
  • mssqlserver.jar
  • msutil.jar
You can either add them to the system class path, or edit MTKMain.bat to add them. It will be easier to add them to the class path if you first copy them to your c:\MTK directory.
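A quick presence check for the three jar files before editing the class path might look like this (the c:/MTK directory is the example location suggested above):

```shell
# Report which of the Microsoft JDBC jars are present in c:/MTK.
for jar in msbase.jar mssqlserver.jar msutil.jar; do
  if [ -f "c:/MTK/$jar" ]; then
    echo "found $jar"
  else
    echo "missing $jar"
  fi
done
```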
MS SQLServer 7
There is no native driver option. There is a Microsoft® document titled "HOW TO: Get Started with Microsoft JDBC".
Sybase
MTK needs jconn2.jar to be found in the class path. This can be downloaded as the jConnect tool from Sybase.
Informix
MTK needs ifxjdbc.jar, which is included in the MTK installation and in the MTKMain.bat class path.
How do I resolve the Java warning: WARNING: Could not create system preferences directory. System preferences are unusable.
Follow these steps to provide the necessary permissions:
1. Create /etc/.java/.systemPrefs and use chmod to change the mode to 777.
2. Create $HOME/.java/.userPrefs and use chmod to change the mode to 777.
3. Provide the executable permissions to JDBC driver (for example: chmod 777 jconn3.jar).
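The steps above can be sketched as follows (the /etc directory normally requires root, so the sketch skips it when it is not writable):

```shell
# Create the Java preferences directories and open their permissions.
if [ -w /etc ]; then
  mkdir -p /etc/.java/.systemPrefs && chmod 777 /etc/.java/.systemPrefs
fi
mkdir -p "$HOME/.java/.userPrefs" && chmod 777 "$HOME/.java/.userPrefs"
ls -ld "$HOME/.java/.userPrefs"
```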
For more information, see
(C) Copyright IBM Corporation 1999, 2007. All Rights Reserved.


Document information

More support for: IBM Migration Toolkit
Operating system(s): HP-UX, Linux, Solaris, Windows, z/OS
Software edition: All Editions