IBM Support

Information Server: Oracle BLOB issue

Technote (troubleshooting)


Problem processing an Oracle table with a BLOB field containing large values (around 25 MB of data).


The error message originally reported was:

Field,0: Fatal Error: Virtual data set.; output of "Field": the record is too big to fit in a block; the length
requested is: 249555, the max block length is: 131072.

This error is resolved by setting the environment variable
$APT_DEFAULT_TRANSPORT_BLOCK_SIZE to a larger value for the jobs processing this BLOB field. The default value of this environment variable is 131072 bytes.
Increasing $APT_DEFAULT_TRANSPORT_BLOCK_SIZE works well up to 10485760 (10 MB). However, when it is set above 10 MB to 12 MB, job performance drops drastically.

Resolving the problem

The environment variables that are needed for jobs processing large Oracle Blob datatypes are:


-          APT_TSORT_STRESS_BLOCKSIZE=16000000000

Not setting the APT_TSORT_STRESS_BLOCKSIZE variable causes job performance to drop dramatically. Its default value is 20 MB.

NOTE: If this environment variable does not already exist, create it first.
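As a sketch, the variables can be exported in the project's dsenv file or set as job-level parameters; the exact values below (other than APT_TSORT_STRESS_BLOCKSIZE, which comes from this technote) are illustrative and should be sized to the largest record in your data.

```shell
# Illustrative settings for jobs processing large Oracle BLOB values.
# Raise the transport block size above the 131072-byte default so the
# largest record fits in one block (1 MB shown here as an example).
APT_DEFAULT_TRANSPORT_BLOCK_SIZE=1048576
export APT_DEFAULT_TRANSPORT_BLOCK_SIZE

# Value recommended in this technote; create the variable if it does
# not already exist.
APT_TSORT_STRESS_BLOCKSIZE=16000000000
export APT_TSORT_STRESS_BLOCKSIZE

echo "$APT_DEFAULT_TRANSPORT_BLOCK_SIZE $APT_TSORT_STRESS_BLOCKSIZE"
```

After editing dsenv, restart the jobs (and, if required in your environment, the engine) so the new values take effect.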

Document information

More support for: InfoSphere DataStage

Software version: 8.0.1, 8.1, 8.5, 8.7

Operating system(s): AIX, HP-UX, Linux, Solaris

Reference #: 1611564

Modified date: 30 November 2012