IBM Support

How to manually shrink the growing staging tables database table 'xstagefact'



Customer uses the "imports data from staging tables" feature of Controller. As part of this, they periodically place new data inside the Controller database table 'xstagefact', and then send a command to import that data into the system.

Over time, the database table 'xstagefact' increases in size.

The customer therefore has enabled the option "Number of Days Before Records in the Staging Tables are Deleted".

- However, this does not seem to automatically delete records/data inside the xstagefact table. In other words, the amount of data (in that database table) keeps growing.

=> Is there a method to manually delete that old/historic data (from the staging area)?


Inside 'Maintain - Configuration - General' - 'General 3', the setting 'Import from Staging Tables - Number of Days Before Records in the Staging Tables are Deleted' is set to 'Days' (not 'Never'), for example with a value of 31:


In theory, this should mean that data older than the configured number of days (for example, 31 days) inside 'xstagefact' is automatically deleted.

- However, the customer has noticed that this mechanism does not work (the table 'xstagefact' gets bigger and bigger over time).


The setting 'Import from Staging Tables - Number of Days Before Records in the Staging Tables are Deleted' does not work because of a defect (reference APAR PH01895) in Controller.

Resolving The Problem


Upgrade to a future version of Controller, where the menu item 'Import from Staging Tables - Number of Days Before Records in the Staging Tables are Deleted' works correctly.

  • TIP: to be automatically notified when this version is released, subscribe to APAR PH01895.


Alternatively, manually delete the data from the xstagefact table.


There are several methods to manually delete data from xstagefact:

Method #1 - (easiest for small numbers of job deletions) Delete the batch jobs from Transfer/External Data/Import From Staging Table

This will remove the corresponding data from xstagefact too.


As a precaution, before making any changes:

  • Create a backup of the database
  • Ideally, try the procedure inside a 'test' (or 'development') database first (to make sure you understand and have tested it).

1. Inside the Controller application, click "Transfer - External Data - Import From Staging Table"

2. Highlight the relevant old jobs (for example any job older than 1 month)

3. Click the 'delete' icon.

Method #2 - (easiest for large number of row deletions) Run a 'delete' (or 'truncate') SQL query.

This method is typically the quickest/easiest for many customers, because they often have a very large number (for example, several thousand) of batch jobs.

  • However, care should be taken to make sure that your SQL Query only deletes data that you no longer need.
  • Customers should talk with their I.T. department's database server administrator ('DBA') for assistance with how to delete old data.


In one real-life customer example:

  • database hosted on DB2
  • database schema owner (user) = fastnet
  • Customer wanted to delete all the entries inside xstagefact
The steps were therefore:
1. As a precaution, create a backup copy of the Controller database
2. Ask the I.T. department's DBA to run the following DB2 script (on the Controller database): 
  truncate table fastnet.xstagefact IMMEDIATE
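
If you need to keep recent rows rather than emptying the whole table, a date-bounded DELETE is an alternative to TRUNCATE. The sketch below uses Python with SQLite purely to illustrate the pattern; the column name 'import_date' and the sample schema are assumptions for illustration only - verify the real xstagefact schema with your DBA before running anything similar against the Controller database.

```python
import sqlite3
from datetime import datetime, timedelta

# Illustrative only: SQLite stand-in for the Controller database.
# The column name 'import_date' is an assumption -- check the real
# xstagefact schema with your DBA first.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE xstagefact (id INTEGER, import_date TEXT)")

today = datetime(2019, 4, 3)
rows = [
    (1, (today - timedelta(days=60)).strftime("%Y-%m-%d")),  # old row
    (2, (today - timedelta(days=10)).strftime("%Y-%m-%d")),  # recent row
]
conn.executemany("INSERT INTO xstagefact VALUES (?, ?)", rows)

# Delete rows older than the configured retention period (31 days),
# mirroring what the broken setting was supposed to do automatically.
cutoff = (today - timedelta(days=31)).strftime("%Y-%m-%d")
deleted = conn.execute(
    "DELETE FROM xstagefact WHERE import_date < ?", (cutoff,)
).rowcount
conn.commit()

remaining = conn.execute("SELECT COUNT(*) FROM xstagefact").fetchone()[0]
print(deleted, remaining)  # one old row deleted, one recent row kept
```

As with the TRUNCATE approach, back up the database first and run the statement in a test environment before touching production.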


Document information

More support for: Cognos Controller

Component: --

Software version: 8.x, 10.1.x, 10.2.0, 10.2.1, 10.3.0, 10.3.1

Operating system(s): Windows

Reference #: 1458934

Modified date: 03 April 2019