IBM Support

DataStage Job aborts with error: "The record is too big to fit in a block"

Troubleshooting


Problem

Jobs that process a large amount of data in a single column can abort with the following error: the record is too big to fit in a block; the length requested is: xxxx, the max block length is: xxxx.

Resolving The Problem

To fix this error, increase the transport block size so that it can accommodate the record size:

  1. Log in to Designer and open the job.

  2. Open Job Properties --> Parameters --> Add Environment Variable and select APT_DEFAULT_TRANSPORT_BLOCK_SIZE.

  3. Set the variable's value. You can set it as high as 256 MB, but you should rarely need to go above 1 MB.
    NOTE: the value is specified in bytes.

    For example, to set the value to 1 MB:
    APT_DEFAULT_TRANSPORT_BLOCK_SIZE=1048576

    The default value is 131072 bytes (128 KB). A sketch of this byte arithmetic follows the list.
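
The value is set in bytes, so it can help to work out the smallest block that still holds the record length reported in the abort message. The Python sketch below is illustrative only; it is not a DataStage utility, and the function name and double-from-the-default strategy are assumptions:

    # Illustrative sketch: compute a transport block size, in bytes, that is
    # large enough for the record length reported in the abort message.
    def suggested_transport_block_size(requested_length_bytes, default_block_bytes=131072):
        block = default_block_bytes
        # Double the 128 KB default until the failing record fits.
        while block < requested_length_bytes:
            block *= 2
        return block

    # Example: the abort message reported "the length requested is: 200000"
    print(suggested_transport_block_size(200000))   # 262144 bytes (256 KB)
    print(suggested_transport_block_size(900000))   # 1048576 bytes (1 MB)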

When setting APT_DEFAULT_TRANSPORT_BLOCK_SIZE, use the smallest value that works, because this value is used for all links in the job.

For example, if your job fails with APT_DEFAULT_TRANSPORT_BLOCK_SIZE set to 1 MB and succeeds at 4 MB, do further testing to find the smallest value between 1 MB and 4 MB that allows the job to run, and use that value. Using 4 MB could cause the job to use more memory than needed, since all the links would use a 4 MB transport block size.
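
As a rough illustration of that testing loop, the Python sketch below narrows the range with a binary search. It is a hypothetical helper, not an IBM tool: run_job_with_block_size stands for however you rerun the job (for example, from Director or with dsjob) after setting APT_DEFAULT_TRANSPORT_BLOCK_SIZE to the candidate value, returning True when the job completes without the block-size abort.

    # Illustrative sketch: binary-search for the smallest block size (in bytes)
    # between a known-failing value and a known-working value.
    def smallest_working_block_size(failing_bytes, working_bytes,
                                    run_job_with_block_size, step_bytes=131072):
        low, high = failing_bytes, working_bytes    # low fails, high works
        while high - low > step_bytes:
            mid = (low + high) // 2
            if run_job_with_block_size(mid):
                high = mid                          # mid works: try smaller
            else:
                low = mid                           # mid fails: go bigger
        return high

    # Example: the job fails at 1 MB and succeeds at 4 MB; test in 128 KB steps.
    # best = smallest_working_block_size(1048576, 4194304, run_job_with_block_size)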

NOTE: If this error is reported for a Data Set, use APT_PHYSICAL_DATASET_BLOCK_SIZE instead.

[{"Product":{"code":"SSZJPZ","label":"IBM InfoSphere Information Server"},"Business Unit":{"code":"BU059","label":"IBM Software w\/o TPS"},"Component":"Not Applicable","Platform":[{"code":"PF002","label":"AIX"},{"code":"PF010","label":"HP-UX"},{"code":"PF016","label":"Linux"},{"code":"PF027","label":"Solaris"},{"code":"PF033","label":"Windows"}],"Version":"8.1;8.0;7.5","Edition":"","Line of Business":{"code":"LOB10","label":"Data and AI"}}]

Document Information

Modified date:
23 June 2018

UID

swg21416107