Incorporating server job functionality

You can incorporate Server job functionality in your Parallel jobs by using Server Shared Container stages.

Server Shared Container stages allow you, for example, to use Server job plug-in stages to access data sources that are not directly supported by Parallel jobs. (Some plug-ins have parallel versions that you can use directly in a parallel job.)

You create a new shared container in the Designer, add Server job stages as required, and then add the Server Shared Container stage to your Parallel job and connect it to the Parallel stages. Server Shared Container stages used in Parallel jobs have extra pages in their Properties dialog box, which let you specify details about parallel processing and about partitioning and collecting data.
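
The model below is purely illustrative and does not show any DataStage API: the hypothetical Python classes simply sketch what you assemble in the Designer, that is, a container of Server stages plus links to the surrounding Parallel stages, with the partitioning and collecting settings that you make on those extra Properties pages.

    # Hypothetical model only; these classes are not part of any DataStage API.
    from dataclasses import dataclass, field

    @dataclass
    class ServerSharedContainer:
        """A reusable container of Server job stages placed in a Parallel job."""
        name: str
        stages: list[str] = field(default_factory=list)  # Server stages added in the Designer

    @dataclass
    class ContainerLink:
        """A link between the container and a Parallel stage. Whether the partitioning
        or the collecting setting applies depends on the direction of the link."""
        target: str
        partitioning: str = "Auto"  # e.g. Auto, Round Robin, Hash
        collecting: str = "Auto"    # e.g. Auto, Round Robin, Sort Merge

    # Build the container, then wire it between Parallel stages.
    container = ServerSharedContainer("LegacyPluginAccess", stages=["ODBC_Plugin", "Transformer"])
    inbound = ContainerLink(target=container.name, collecting="Sort Merge")
    outbound = ContainerLink(target="Downstream_Parallel_Stage", partitioning="Round Robin")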

You can only use Server Shared Containers in this way on SMP systems (not MPP or cluster systems).

The following limitations apply to the contents of such Server Shared Containers:

  • There must be zero or one container input, zero or more container outputs, and at least one of either.
  • There can be no disconnected flows: all stages must be linked to the input or to an output of the container, either directly or via an active stage. When the container has an input and one or more outputs, each stage must connect to the input and to at least one of the outputs.
  • There can be no synchronization by having a passive stage with both input and output links.
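
The sketch below restates two of these rules, the limit on container inputs and outputs and the rule against passive stages with both input and output links, as plain Python checks. The Stage class and validate_container function are hypothetical and exist only to illustrate the constraints; they are not DataStage objects.

    # Hypothetical illustration of the container rules; not a DataStage API.
    from dataclasses import dataclass

    @dataclass
    class Stage:
        name: str
        passive: bool           # passive stages read or write data; active stages transform it
        has_input_link: bool
        has_output_link: bool

    def validate_container(num_inputs: int, num_outputs: int, stages: list[Stage]) -> list[str]:
        """Return the rule violations found in the contents of a Server Shared Container."""
        problems = []
        if num_inputs > 1:
            problems.append("at most one container input is allowed")
        if num_inputs + num_outputs == 0:
            problems.append("the container needs at least one input or one output")
        for stage in stages:
            # A passive stage may not be used for synchronization, that is,
            # carry both an input link and an output link inside the container.
            if stage.passive and stage.has_input_link and stage.has_output_link:
                problems.append(f"passive stage {stage.name} has both input and output links")
        return problems

    print(validate_container(1, 1, [Stage("Hashed_File", True, True, True)]))
    # ['passive stage Hashed_File has both input and output links']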

All of the columns supplied by a Server Shared Container stage must be used by the stage that follows the container in the parallel job.
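
As a simple illustration of this rule, the hypothetical check below compares the columns supplied by the container with the columns used by the stage that follows it; the column names are invented for the example.

    # Hypothetical column lists; every container column must be used downstream.
    container_output_columns = {"CUSTOMER_ID", "NAME", "BALANCE"}
    columns_used_by_next_stage = {"CUSTOMER_ID", "NAME"}   # BALANCE is not used

    unused = container_output_columns - columns_used_by_next_stage
    if unused:
        print(f"Invalid design: container columns {sorted(unused)} are not used by the next stage")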

For details on how to use Server Shared Containers, see the InfoSphere® DataStage® Designer Client Guide. That guide also tells you how to use Parallel Shared Containers, which enable you to package parallel job functionality in a reusable form.