Information Server: Oracle BLOB issue

Technote (troubleshooting)


Problem (Abstract)

Problem processing an Oracle table with a BLOB field containing large values, around 25 MB of data.

Symptom

The original error message that was reported was:

Field,0: Fatal Error: Virtual data set.; output of "Field": the record is too big to fit in a block; the length
requested is: 249555, the max block length is: 131072.

This error is resolved by setting the environment variable
$APT_DEFAULT_TRANSPORT_BLOCK_SIZE to a larger value for the jobs processing this BLOB field. The default value of this environment variable is 131072 bytes.
Increasing $APT_DEFAULT_TRANSPORT_BLOCK_SIZE works well up to values of 10485760 (10 MB). However, when it is set to something larger than 10 MB to 12 MB, the performance of the jobs degrades drastically.
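
As a rough sizing check (the 249555-byte record length comes from the error above; the 25 MB figure comes from the abstract, so treat this as an illustrative estimate rather than a documented rule):

   default block size:  131072 bytes (128 KB)  -> too small for the 249555-byte record in the error
   ~25 MB BLOB record:  25 * 1024 * 1024 = 26214400 bytes -> the transport block must be at least this large,
   which is presumably why the resolution below uses 80000000 bytes (~76 MB), leaving headroom above the largest expected record.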



Resolving the problem

The environment variables that are needed for jobs processing large Oracle BLOB data types are:

- APT_DEFAULT_TRANSPORT_BLOCK_SIZE=80000000

- APT_TSORT_STRESS_BLOCKSIZE=16000000000

Not setting the APT_TSORT_STRESS_BLOCKSIZE variable causes job performance to drop dramatically. The default value of this variable is 20 MB.

NOTE: If this environment variable does not already exist, create it first.
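
A minimal sketch of how these variables might be set, assuming a UNIX engine tier where the parallel engine picks the values up from the environment. They can equally be defined at the project level in the DataStage Administrator or as job-level environment variable parameters; the dsenv path below is an assumption based on a default install, not something stated in this technote.

   # Example: added to $DSHOME/dsenv (or exported in the session that starts the jobs)
   APT_DEFAULT_TRANSPORT_BLOCK_SIZE=80000000      # transport block large enough for ~25 MB BLOB records (value from the list above)
   APT_TSORT_STRESS_BLOCKSIZE=16000000000         # prevents the performance drop described above (value from the list above)
   export APT_DEFAULT_TRANSPORT_BLOCK_SIZE APT_TSORT_STRESS_BLOCKSIZE

After changing dsenv, restart the DataStage engine (or re-source the file) so that newly started jobs see the new values.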



Document information


More support for:

InfoSphere DataStage

Software version:

8.0.1, 8.0.1.1, 8.0.1.2, 8.0.1.3, 8.1, 8.1.0.1, 8.1.0.2, 8.5, 8.5.0.1, 8.5.0.2, 8.5.0.3, 8.7, 8.7.0.1

Operating system(s):

AIX, HP-UX, Linux, Solaris

Reference #:

1611564

Modified date:

2012-11-30
