
Detailed System Requirements for Tivoli Dynamic Workload Console, V8.3

Release Notes


Abstract

This document describes the detailed system requirements for Tivoli Dynamic Workload Console on the supported platforms for version 8.3.

Content

Tivoli Dynamic Workload Console prerequisites and software limitations

The Tivoli(R) Dynamic Workload Console is a Web-based user interface for Tivoli Workload Scheduler. It provides you with a means of viewing and controlling scheduling activities in production in both the Tivoli Workload Scheduler distributed and z/OS environments.

Using the Tivoli Dynamic Workload Console, you can access the Tivoli Workload Scheduler environment from any location in your network through one of the supported browsers.

These are the activities that you can perform from the Tivoli Dynamic Workload Console user interface in your Tivoli Workload Scheduler production environments:

  • Browsing and managing scheduling objects involved in current plan activities.
  • Creating and controlling connections to Tivoli Workload Scheduler environments.
  • Submitting jobs and job streams in production.
  • Setting user preferences.

Tivoli Dynamic Workload Console must reside on a server that can reach the Tivoli Workload Scheduler nodes using network connections.

This section explains how to install Tivoli Dynamic Workload Console and is divided into the following subsections:

  • Tivoli Dynamic Workload Console installation media
  • Tivoli Dynamic Workload Console prerequisites
  • Choosing installation methods
  • Product limitations
Tivoli Dynamic Workload Console installation media

Tivoli Dynamic Workload Console is packaged on six CDs, as follows:

CD_1 (Windows(R))

  • Launchpad
  • Tivoli Dynamic Workload Console
  • Installation and Troubleshooting Guide
CD_2 (Linux(R) on IA32)
  • Launchpad
  • Tivoli Dynamic Workload Console
  • Installation and Troubleshooting Guide
CD_3 (Linux on Power64)
  • Launchpad
  • Tivoli Dynamic Workload Console
  • Installation and Troubleshooting Guide
CD_4 (AIX(R))
  • Launchpad
  • Tivoli Dynamic Workload Console
  • Installation and Troubleshooting Guide
CD_5 (Linux on s/390 and zSeries(R))
  • Launchpad
  • Tivoli Dynamic Workload Console
  • Installation and Troubleshooting Guide
CD_6 (Solaris)
  • Launchpad
  • Tivoli Dynamic Workload Console
  • Installation and Troubleshooting Guide
What the Tivoli Dynamic Workload Console installable image contains

The IBM(R) Integrated Solutions Console (ISC) is the base for the Tivoli Dynamic Workload Console application itself. It provides the capabilities of a single platform for consolidating administrative console functions to control and manage other application resources, such as Tivoli Workload Scheduler resources. The Tivoli Dynamic Workload Console can be installed on top of either of the following:

Integrated Solutions Console version 6.0.1 installed stand-alone
The Integrated Solutions Console installed on top of the Embedded Version of IBM WebSphere(R) Application Server.
Integrated Solutions Console version 6.0.1 installed on an external WebSphere Application Server
The Integrated Solutions Console installed on top of a version of WebSphere Application Server that is not embedded. Refer to the Integrated Solutions Console version 6.0.1 documentation for more information on the supported versions of WebSphere Application Server.

The Tivoli Dynamic Workload Console installable images contain:

  • The IBM Integrated Solutions Console (ISC) 6.0.1 bundle, installable stand-alone. It contains the following components:
      • Cloudscape(TM) 5.1.60.24
      • IBM Eclipse Help System 3.0.1
      • Embedded Version of IBM WebSphere Application Server - Express 6.0.2
      • WebSphere Portal Technology 5.1.0.2
      • Presentation Services Web Component Library (WCL) 5.0
  • The IBM Tivoli Dynamic Workload Console module.

Table 1 describes the different configurations of the Tivoli Dynamic Workload Console that you can obtain on a system.

Table 1. Components installed during Tivoli Dynamic Workload Console installation

If you want to install on a system where no instance of Integrated Solutions Console is installed, you install the following components:
  • IBM Integrated Solutions Console (ISC) 6.0.1 bundle
  • IBM Tivoli Dynamic Workload Console module
If you want to install on a system where an Integrated Solutions Console installed on top of an external WebSphere Application Server exists, but you do not want to integrate the two environments, you install the following components:
  • IBM Integrated Solutions Console (ISC) 6.0.1 bundle
  • IBM Tivoli Dynamic Workload Console module
If you want to integrate with a preexisting Integrated Solutions Console installed on an external WebSphere Application Server, you install the following component:
  • IBM Tivoli Dynamic Workload Console module
If you want to integrate with a preexisting Integrated Solutions Console installed stand-alone, you install the following component:
  • IBM Tivoli Dynamic Workload Console module

Note:
You cannot install two instances of the stand-alone Integrated Solutions Console on the same system, but you can install a stand-alone Integrated Solutions Console on a system with a preexisting Integrated Solutions Console installed on top of an external WebSphere Application Server.

For more information on installation options refer to the Installation and Troubleshooting Guide.

Tivoli Dynamic Workload Console prerequisites

Because of the strict relationship with the Integrated Solutions Console, Tivoli Dynamic Workload Console inherits the Integrated Solutions Console 6.0.1 prerequisites. This section contains the list of prerequisites divided into:

  • Java(TM) Virtual Machine additional requirements
  • Prerequisites for installing Tivoli Dynamic Workload Console
  • Additional prerequisites for using the launchpad
  • Product limitations
Java Virtual Machine additional requirements

The Tivoli Dynamic Workload Console uses a Java Virtual Machine and requires the following disk space to be available:

At installation time
100 MB of free space in the temporary directory.
At run time
Additional disk space to manage the data displayed in the browse task result table. This additional space depends on:
  • the number of objects to be displayed
  • the number of queries run concurrently
  • the number of users logged in to the Tivoli Dynamic Workload Console user interface
On average, one object displayed as the result of one query run by a user requires 5 KB of disk space in the Application Server profile directory. This disk space is released when the panel showing the browse task result is closed.
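For example (the figures below are purely illustrative and not taken from the product documentation), 10 users each running 2 concurrent queries that return 500 objects apiece would require approximately 10 x 2 x 500 x 5 KB = 50 MB of free space in the Application Server profile directory while the corresponding result panels are open.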
Prerequisites for installing Tivoli Dynamic Workload Console

If you install Tivoli Dynamic Workload Console on top of an existing Integrated Solutions Console, Tivoli Dynamic Workload Console inherits the Integrated Solutions Console installation path.

If you install both Tivoli Dynamic Workload Console and the Integrated Solutions Console, by default they are installed in the following directory, referred to as the installation_directory:

Table 2. Default installation directories

IBM Tivoli Dynamic Workload Console default installation directory:
  • Windows: %Program Files%\IBM\ISC
  • UNIX(R): /opt/IBM/ISC

You can choose a different installation directory when you install Tivoli Dynamic Workload Console. For more information, see the Installation and Troubleshooting Guide.

The Integrated Solutions Console version 6.0.1, and therefore also the IBM Tivoli Dynamic Workload Console, has the following installation prerequisites:

Table 3. Hardware requirements for the Integrated Solutions Console, listed by operating system

AIX
  Processor: Power 4 minimum
  Memory: 1 GB or more available per processor. Swap space should be double the physical memory.
  Memory footprint: Idle state memory footprint, after the system has been restarted, of 350 MB with a single ISC user logged in.
  Operating system:
    • AIX V5.2 ML 5
    • AIX V5.3 ML 1
  Install authority: Log in as root
  Disk space: The following disk space must be available before proceeding with the installation:
    • installation_directory: 550 MB
    • temporary directory: 100 MB

Sun Solaris
  Processor: Power 4 minimum
  Memory: 1 GB or more available per processor. Swap space should be double the physical memory.
  Memory footprint: Idle state memory footprint, after the system has been restarted, of 350 MB with a single ISC user logged in. If the JVM is running in 64-bit mode, the memory footprint is doubled to 700 MB under the same conditions.
  Operating system: Sun Solaris 9 on SPARC
  Install authority: Log in as root
  Disk space: The following disk space must be available before proceeding with the installation:
    • installation_directory: 550 MB
    • temporary directory: 100 MB

Linux on IA32
  Processor: 1.0 GHz minimum
  Memory: 1 GB or more available per processor. Swap space should be double the physical memory.
  Memory footprint: Idle state memory footprint, after the system has been restarted, of 350 MB with a single ISC user logged in. If the JVM is running in 64-bit mode, the memory footprint is doubled to 700 MB under the same conditions.
  Operating system:
    • Red Hat Enterprise Linux 3.0, either AS or ES, on IA32 with the following packages:
        compat-gcc-7.3-2.96.122
        compat-libstdc++-7.3-2.96.122
        compat-libstdc++-devel-7.3-2.96.122
        compat-glibc-7.x-2.2.4.32.5
        compat-gcc-c++-7.3-2.96.122
        compat-db-4.0.14-5
        rpm-build-4.2.1-4.2
    • Red Hat Enterprise Linux 4.0, either AS or ES, on IA32
    • SuSE Linux Enterprise Server 8 on IA32
    • SuSE Linux Enterprise Server 9 on IA32
  Install authority: Log in as root
  Disk space: The following disk space must be available before proceeding with the installation:
    • installation_directory: 550 MB
    • temporary directory: 100 MB

Linux on Power64
  Processor: Power 4 minimum
  Memory: 1 GB or more available per processor. Swap space should be double the physical memory.
  Memory footprint: Idle state memory footprint, after the system has been restarted, of 350 MB with a single ISC user logged in. If the JVM is running in 64-bit mode, the memory footprint is doubled to 700 MB under the same conditions.
  Operating system:
    • Red Hat Enterprise Linux 3.0 AS on Power64
    • Red Hat Enterprise Linux 4.0 AS on Power64
    • SuSE Linux Enterprise Server 8 on Power64
    • SuSE Linux Enterprise Server 9 on Power64
  Install authority: Log in as root
  Disk space: The following disk space must be available before proceeding with the installation:
    • installation_directory: 550 MB
    • temporary directory: 100 MB

Linux on s/390(R) and zSeries
  Processor: G5 minimum, G6 recommended
  Memory: 1 GB available recommended. Swap space should be double the physical memory.
  Memory footprint: Idle state memory footprint, after the system has been restarted, of 350 MB with a single ISC user logged in. If the JVM is running in 64-bit mode, the memory footprint is doubled to 700 MB under the same conditions.
  Operating system:
    • Red Hat Enterprise Linux 3.0 AS on s/390 (31-bit) with the following packages:
        compat-db-4.0.14-5
        compat-pwd-0.62-3
        compat-libstdc++-7.2.2-2.95.3.77
        rpm-build-4.2.1-4.2
    • Red Hat Enterprise Linux 4.0 AS on s/390 (31-bit)
    • SuSE Linux Enterprise Server 8 on s/390 (31-bit)
    • SuSE Linux Enterprise Server 9 on s/390 (31-bit)
    • Red Hat Enterprise Linux 3.0 AS on zSeries (64-bit) with the following packages:
        compat-db-4.0.14-5
        compat-pwd-0.62-3
        compat-libstdc++-7.2.2-2.95.3.77
        rpm-build-4.2.1-4.2
    • Red Hat Enterprise Linux 4.0 AS on zSeries (64-bit)
    • SuSE Linux Enterprise Server 8 on zSeries (64-bit)
    • SuSE Linux Enterprise Server 9 on zSeries (64-bit)
  Install authority: Log in as root
  Disk space: The following disk space must be available before proceeding with the installation:
    • installation_directory: 550 MB
    • temporary directory: 100 MB

Windows
  Processor: 1.0 GHz minimum
  Memory: 1 GB or more available per processor. Swap space should be double the physical memory.
  Memory footprint: Idle state memory footprint, after the system has been restarted, of 350 MB with a single ISC user logged in.
  Operating system:
    • Windows 2000 Professional with SP4
    • Windows 2000 Server with SP4
    • Windows 2000 Advanced Server with SP4
    • Windows XP Professional with SP2
    • Windows Server 2003 - Standard Edition
    • Windows Server 2003 - Enterprise Edition
  Install authority: Log in as Administrator
  Disk space: The following disk space must be available before proceeding with the installation:
    • installation_directory: 550 MB
    • temporary directory: 100 MB

See Checking the version of Java Runtime libraries on Linux operating systems for more information on using 32-bit versions of the libraries on 64-bit machines (Linux on Power64 and Linux on zSeries).

Additional prerequisites for using the launchpad

If you are installing the product using the launchpad, you need to have one of the following browsers installed:

  • Mozilla Version 1.7, or later
  • Firefox Version 1.0, or later
  • Microsoft(R) Internet Explorer (only for Microsoft Windows operating systems) Version 5.5, or later
Note:
On UNIX and Linux operating systems make sure you export the browser location to the BROWSER environment variable.
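For example, on a Linux system where the Firefox executable is in /usr/bin (the path shown here is only an assumption; adjust it to your actual browser location), you could run the following before starting the launchpad:
export BROWSER=/usr/bin/firefox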
Supported Client Browsers

To access the machine where you installed the Tivoli Dynamic Workload Console from the systems across your network, you must use one of the following client browsers:

  • Microsoft Internet Explorer Version 6.x.
  • Mozilla Version 1.7.8.
Checking the version of Java Runtime libraries on Linux operating systems

The Integrated Solutions Console uses a 32-bit Java Runtime. For this reason, to use the Integrated Solutions Console, and therefore the Tivoli Dynamic Workload Console, on 64-bit machines you must have the 32-bit versions of the libraries used by the Java Runtime. On Linux operating systems installed on 64-bit machines, check the version of the libraries as follows:

For Linux on Power64:
Run the command:
rpm -qa --qf "%{NAME}~%{ARCH}~%{VERSION}~%{RELEASE}\n" | grep compat
This is a sample output of this command:
compat-libstdc++-33~ppc~3.2.3~47.3
compat-db~ppc~4.1.25~9
compat-libstdc++-33~ppc64~3.2.3~47.fc4

If the output contains the string ppc, the package is 32-bit; if the string is ppc64, the package is 64-bit and you need to install the compatibility pack as described in Table 3.
For Linux on s/390 or on zSeries:
Run the command:
rpm -qa --qf "%{NAME}~%{ARCH}~%{VERSION}~%{RELEASE}\n" | grep compat
This is a sample output of this command:
compat-libstdc++-33~s390~3.2.3~47.3
compat-libstdc++-33~s390x~3.2.3~47.3
compat-libstdc++-295~s390x~2.95.3~81

If the output contains the string s390, the package is 31-bit; if the string is s390x, the package is 64-bit and you need to install the compatibility pack as described in Table 3.
Note:
On Linux on s/390 or on zSeries, the library version is 31-bit, not 32-bit.

If the library in use is 64-bit, apply the compatibility pack listed in Table 3.
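As a quick check, you can reuse the query shown above and filter on the 32-bit architecture string; the extra grep filter below is only a suggestion and is not part of the product documentation. For example, on Linux on Power64:

rpm -qa --qf "%{NAME}~%{ARCH}~%{VERSION}~%{RELEASE}\n" | grep compat | grep "~ppc~"

An empty result means that no 32-bit compatibility package is installed. On Linux on s/390 or zSeries, replace ~ppc~ with ~s390~ to list the 31-bit packages.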

Compatibility tables for the Tivoli Dynamic Workload Console

Compatibility table for Tivoli Workload Scheduler:

Table 4. Compatibility table for Tivoli Workload Scheduler

  • Tivoli Workload Scheduler 8.3.0.02, with distributed connector 8.3.0.02, is compatible with Tivoli Dynamic Workload Console 8.3.

Compatibility table for Tivoli Workload Scheduler for z/OS:

Table 5. Compatibility table for Tivoli Workload Scheduler for z/OS

  • Tivoli Workload Scheduler for z/OS 8.2 + APAR PK33565, with z/OS connector 8.3 or 8.3.0.02, is compatible with Tivoli Dynamic Workload Console 8.3.
  • Tivoli Workload Scheduler for z/OS 8.3, with z/OS connector 8.3.0.02, is compatible with Tivoli Dynamic Workload Console 8.3.
Note:
Version 8.3 of the scheduler is used in compatibility mode with 8.2. This means that even though Tivoli Workload Scheduler for z/OS version 8.3 is installed, only the version 8.2 functions can be used.

Choosing installation methods

You can install Tivoli Dynamic Workload Console using one of the following methods:

InstallShield for Multiplatforms (ISMP) wizard graphical interface in interactive mode.
You access the graphical user interface of the wizard by invoking a setup command and entering the configuration settings needed to install and configure the product. Using this method you can synchronously monitor the installation processing and results.
InstallShield for Multiplatforms (ISMP) wizard in silent mode.
You customize a response file by adding all the configuration settings to be used during installation, and then invoke the setup command from the command line with the -silent keyword (a sample invocation is sketched after this list). Using this method you can run the installation unattended and in the background.
InstallShield for Multiplatforms (ISMP) wizard invoked from the launchpad.
You run the launchpad which, in turn, invokes the setup command to perform the installation in interactive mode. Using this method, the launchpad guides you through a set of screens where you become familiar with the product documentation and with the information you need to provide during the installation itself. The use of the launchpad adds some prerequisites, as described in Additional prerequisites for using the launchpad.
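A minimal sketch of a silent installation on UNIX is shown below. Only the -silent keyword is documented above; the executable name SETUP.bin and the -options flag used to point to the customized response file are assumptions based on common InstallShield for Multiplatforms conventions, so check the Installation and Troubleshooting Guide for the exact syntax on your platform:

./SETUP.bin -silent -options /tmp/TDWC_response.txt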
Post-installation steps

Before creating a new engine connection to access a Tivoli Workload Scheduler environment, you must run an additional configuration task to enable that Tivoli Workload Scheduler workstation to work with Tivoli Dynamic Workload Console. This task must be run on the system where the Tivoli Workload Scheduler engine that you want to connect to is installed, and therefore:

In a Tivoli Workload Scheduler distributed environment

  • On the master domain manager.
  • On a full status fault-tolerant agent (FTA) workstation where the Tivoli Workload Scheduler connector is installed.
In a Tivoli Workload Scheduler z/OS environment
On the distributed system where you installed the Tivoli Workload Scheduler z/OS connector.

These are the steps you must run on that system:


1. Make sure that the WebSphere Application Server is started on the Tivoli Workload Scheduler workstation and then run the following script:
On Windows:
As Administrator, from the directory TWS_home\wastools:
webui -operation enable -user username -password password -port port [-server ServerName]
On UNIX
As root, from the directory TWS_home/wastools:
./webui.sh -operation enable -user username -password password -port port [-server ServerName]
where:
user
Is the TWS_user user.
password
Is the TWS_user password.
port
Is the SOAP port of the WebSphere Application Server. This is a mandatory setting when using the enable flag. Its default values are 31118 for distributed environments, and 31128 for z/OS environments.
server
Is the name of the WebSphere Application Server profile used. By default the value assigned to this field is server1.
2. Stop and start the WebSphere Application Server on the Tivoli Workload Scheduler system where you ran the script.
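For example, on a UNIX master domain manager you might run the following, where the user name and password are placeholders and 31118 is the default SOAP port for distributed environments mentioned above:

./webui.sh -operation enable -user twsuser -password mypassword -port 31118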

When you have completed these steps you are ready to create engine connections towards that Tivoli Workload Scheduler workstation and to manage your Tivoli Workload Scheduler production environment.

If you want to prevent that Tivoli Workload Scheduler workstation from working with the Tivoli Dynamic Workload Console, do the following:


1. Run the following script:
On Windows:
As Administrator, from the directory TWS_home\wastools:
webui -operation disable
On UNIX
As root, from the directory TWS_home/wastools:
./webui.sh -operation disable
2. As a final step, restart the WebSphere Application Server on the Tivoli Workload Scheduler system where you ran the script.

By doing so, you prevent the Tivoli Dynamic Workload Console from establishing engine connections to that Tivoli Workload Scheduler workstation.

Additional configuration step to improve multiple access to the Tivoli Workload Scheduler

Before using the Tivoli Dynamic Workload Console with Tivoli Workload Scheduler, you should run the modifyThreadPool script on master domain managers or connectors after you have installed Tivoli Workload Scheduler Fix Pack 2. This script raises the limit of concurrent Tivoli Workload Scheduler users from 20 to a number that is limited only by the resources available in your system. The script is located in the wastools subdirectory of the Tivoli Workload Scheduler home directory.

Run the script with no arguments to enable as many concurrent Tivoli Workload Scheduler users as your system resources allow for.
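A minimal sketch on UNIX follows; the .sh extension is an assumption based on the webui.sh naming shown earlier, so verify the actual script name in your TWS_home/wastools directory:

cd TWS_home/wastools
./modifyThreadPool.sh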

As a final step, restart the WebSphere Application Server on the Tivoli Workload Scheduler system where you ran the script.

Product limitations

This section contains product limitations and problems that could not be resolved with this release of IBM Tivoli Dynamic Workload Console:


1. With the current release, the following Job Scheduling Console actions are not supported in the panel used to manage browsed distributed job streams:
  • Select all jobs for monitoring
  • Deselect all jobs for monitoring
Suggested action: from this panel, select the required job stream and click List Jobs. In the displayed panel, you can open the properties of a single job and change the Monitored Job property from the Monitored Job hyperlink.
2. In the Tivoli Dynamic Workload Console for z/OS there is no job stream editor available to manage jobs within a job stream in a plan. As a consequence, on z/OS, there is no Submit and Edit option available within the Predefined Job Stream action.
3. The current version of the Tivoli Dynamic Workload Console allows you to view the dependencies of z/OS jobs in read-only mode.
4. On the Tivoli Dynamic Workload Console for z/OS, the Find Resource and Find Workstation panels related to a z/OS engine are displayed in English only.
5. Occasionally, when a calendar field in a wizard panel has been set to a specific value, the field might appear to be reset to its initial value if the active window changes. However, the input data previously provided is still taken into account.
6. When using the launchpad on Linux and Solaris 9 platforms, some warning messages might appear on the standard output. These messages can be ignored because they do not indicate any malfunction.
7. In the panel to manage browsed workstations, the List Jobs button returns the list of all the jobs that belong to job streams defined on the selected workstation, and not the list of jobs defined on the selected workstation.
8. Occasionally, a memory leak occurs when one of the following situations takes place:
  • The browser is closed while one or more Tivoli Dynamic Workload Console tabs are still open.
  • You log out without closing all the Tivoli Dynamic Workload Console tabs with the close page link.
  • The session expires, while one or more Tivoli Dynamic Workload Console tabs are still open.
The problem can be mitigated by closing the active page using the close page link.
9. From the Quick Start panel it is not possible to see the engine connection associated with a task.
10. In z/OS environments, during task creation or editing, a few special characters might not be validated by the Tivoli Dynamic Workload Console. As a consequence, an error might occur on the z/OS connector, causing the result table of the task to be empty. The problem occurs when the code page of the z/OS engine is not the same as the code page of the z/OS connector client. The invalid characters are those that map to the hexadecimal values 0x7B, 0x5B, and 0x7C (the EBCDIC codes of #, $, and @) in all supported code pages.
11. When you run a query and obtain a result that is fewer than 12 lines long, the status row might not be displayed.
12. If you uninstall the Tivoli Dynamic Workload Console without uninstalling the Integrated Solutions Console (it remains installed as long as at least one application uses it), the three user groups created by the Tivoli Dynamic Workload Console are not removed from the Integrated Solutions Console; however, their presence does not have any impact.
13. If you installed the Integrated Solutions Console together with the Tivoli Dynamic Workload Console, when you uninstall the Tivoli Dynamic Workload Console but one or more applications still use the Integrated Solutions Console, the latter remains installed, even if the summary message incorrectly says that it will be uninstalled.
14. During the Tivoli Dynamic Workload Console installation with the Traditional Chinese language, the License Agreement panel is displayed in the Simplified Chinese language. The Traditional Chinese License Agreement and License Information files are the following:
LI_zh_TW
LA_zh_TW

Original Publication Date

01 December 2006

[{"Product":{"code":"SSGSPN","label":"IBM Workload Scheduler"},"Business Unit":{"code":"BU053","label":"Cloud & Data Platform"},"Component":"Tivoli Dynamic Workload Console","Platform":[{"code":"PF002","label":"AIX"},{"code":"PF016","label":"Linux"},{"code":"PF027","label":"Solaris"},{"code":"PF033","label":"Windows"}],"Version":"8.3","Edition":"Advanced","Line of Business":{"code":"LOB45","label":"Automation"}}]

Document Information

Modified date:
24 January 2019

UID

swg27008715