Jobs that process a large amount of data in a column can abort with this error:
the record is too big to fit in a block; the length requested is: xxxx, the max block length is: xxxx.
Resolving the problem
To fix this error, you need to increase the transport block size to accommodate the record size:
- Log into Designer and open the job.
- Open Job Properties --> Parameters --> Add Environment Variable and select APT_DEFAULT_TRANSPORT_BLOCK_SIZE.
- You can set this as high as 256 MB, but you should rarely need to go over 1 MB.
NOTE: The value is specified in bytes; the default is 131072 bytes (128 KB).
For example, to set the block size to 1 MB:
APT_DEFAULT_TRANSPORT_BLOCK_SIZE=1048576
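As a run-time alternative to saving the value in the job design, you can pass the variable as a job parameter when the job is started. The sketch below is a minimal example, assuming Python is available on the engine tier and using the dsjob command-line client; the project name, job name, and the exact syntax for environment-variable parameters (typically referenced with a leading $) are assumptions you may need to adjust for your installation.

```python
import subprocess

# Placeholder project and job names -- replace with your own.
PROJECT = "MyProject"
JOB = "MyParallelJob"

# 1 MB transport block size, expressed in bytes (the default is 131072 / 128 KB).
BLOCK_SIZE = 1024 * 1024

# dsjob -run accepts -param name=value overrides for parameters defined in the job.
# Environment-variable parameters are usually referenced with a leading '$'
# (assumption -- check how the parameter appears in your job's parameter list).
cmd = [
    "dsjob", "-run",
    "-param", f"$APT_DEFAULT_TRANSPORT_BLOCK_SIZE={BLOCK_SIZE}",
    "-jobstatus",          # wait for the job to finish and reflect its status in the exit code
    PROJECT, JOB,
]

result = subprocess.run(cmd)
print("dsjob exit code:", result.returncode)
```

This only overrides the block size for that run; APT_DEFAULT_TRANSPORT_BLOCK_SIZE must already have been added to the job's parameter list as described in the steps above.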
When setting APT_DEFAULT_TRANSPORT_BLOCK_SIZE, use the smallest value that works, because it is applied to every link in the job.
For example, if the job fails with APT_DEFAULT_TRANSPORT_BLOCK_SIZE set to 1 MB but succeeds at 4 MB, do further testing to find the smallest value between 1 MB and 4 MB that allows the job to run, and use that value. Using 4 MB could make the job consume more memory than necessary, since every link would use a 4 MB transport block.
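If you want to automate that testing, one option is to bisect between a size that is known to fail and one that is known to succeed. The sketch below is only an illustration: run_job_with_block_size() is a hypothetical helper (for example, a wrapper around a dsjob invocation like the one above) that runs the job with the given block size in bytes and returns True if it completes successfully, and the bounds and granularity are illustrative.

```python
def find_smallest_block_size(run_job_with_block_size, low_kb=1024, high_kb=4096, step_kb=256):
    """Search for the smallest transport block size that lets the job run.

    run_job_with_block_size(size_bytes) -- hypothetical helper that runs the job
        with APT_DEFAULT_TRANSPORT_BLOCK_SIZE set to size_bytes and returns True
        on success.
    low_kb  -- a size (in KB) known to fail.
    high_kb -- a size (in KB) known to succeed.
    step_kb -- stop once the range is narrowed to this granularity.
    """
    while high_kb - low_kb > step_kb:
        mid_kb = (low_kb + high_kb) // 2
        if run_job_with_block_size(mid_kb * 1024):
            high_kb = mid_kb          # succeeded: try a smaller block
        else:
            low_kb = mid_kb           # failed: a larger block is needed
    return high_kb * 1024             # smallest known-good size, in bytes
```

Starting from a failing 1 MB and a working 4 MB, a few runs narrow the answer down to 256 KB granularity; you would then set the job to the returned value.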
NOTE: If this error occurs for a Data Set, use APT_PHYSICAL_DATASET_BLOCK_SIZE instead.