z/OS DFSMSdfp Storage Administration


Testing procedures


When the setup tasks are complete (see Setting up the test environment), you can begin NaviQuest testing. The initial testing establishes the baseline test set against data sets that will never be SMS-managed. Because they are not managed, each data set has an expected result of null ('') for storage class, storage group, data class, and management class.

Requirement: For these test cases, you must use the subtype prefix NEVR.

After the baseline test is complete, you can test each phase, or cycle, of the SMS implementation, one subtype at a time. Normally, an SMS implementation phase consists of either a single data type or several data subtypes. Each data subtype is tested independently. When each data subtype tests correctly, you can begin converting that phase's data to SMS management.

Recommendation: Put all test cases into the initial baseline test set. This saves you from having to add data types or subtypes one at a time later.

Use the following procedure to test your baseline test set or any other phase.

  1. Collect data set information for input.
    With NaviQuest, you can create many test cases at once using input from the following sources:
    • ISMF lists
    • DCOLLECT data
    • SMF data created by a storage class ACS exit
    • VMA data
    The data set test cases must all be representative of the data type that you will migrate to SMS management, and they must all require the same SMS services.
    From the ISMF Primary Option menu, "Data Set" (option 1) offers you two ways to generate data set samplings:
    • From a saved listing
    • From a new listing created from the criteria you specify, such as a VTOC or catalog.

    The multivolume variable is always set to "Yes" in an ISMF table if the data set is not open at the time the table is saved. The value is set correctly only when the data set is open, which can sometimes cause errors in the bulk test case generator.

    Recommendations: 
    • Generate SMF test cases from the ACSTST program for temporary data sets, because saving tables of temporary data sets might produce errors in bulk test case creation ("Test Case Generation from Saved ISMF List", option 11.1.1).
    • Set the ACQUIRE DATA FROM VOLUME and ACQUIRE DATA IF DFHSM MIGRATED options under the ISMF "Data Set Selection Entry" panel (ISMF option 1) to Y before generating the list.

    After you have created the list, enter the SAVE command on the command line to save the list into a table. For information on the SAVE command, see z/OS DFSMS Using the Interactive Storage Management Facility.

    _______________________________________________________________

  2. Generate the test cases.

    Use the Test Case Generation Selection Menu panel to convert ISMF lists, DCOLLECT data, SMF data generated by the storage class ACS exit, and VMA data into standard SMS test cases.

    To generate test cases from saved ISMF tables, select option 1, Saved ISMF List, and then use the Test Case Generation from Saved ISMF List Entry panel to enter the following information:
    • Name of the list that you saved in step 1
    • Member name prefix (subtype prefix)
    • PDS that contains the test cases
    • Whether you want to replace the existing test cases with the output test cases
    Also select additional values that you want included in the test cases.

    Recommendation: If you do not enter a PDS name, NaviQuest will generate one based on the format userid.Tnn.TESTCASE. Instead, specify a name so that the test case library conforms to your installation’s naming standards.

    To generate test cases from DCOLLECT data, select option 2, DCOLLECT Data, and then use the Test Case Generation from DCOLLECT Data Entry panel.

    Enter the following information:
    • Data set name
    • Number of test cases you want included
    • Member name prefix (subtype prefix) of the DCOLLECT test cases
    • PDS that contains the test cases
    • Whether to replace the existing test cases with the output test cases

    Before you can use this function, you must have DCOLLECT data that includes D (data set) records.
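
    If you need to create this data, a minimal IDCAMS job such as the following can produce it. This is an illustrative sketch only: the job card, output data set name, and volume serial are placeholders, and your installation might require additional DCOLLECT keywords.

      //DCOLJOB  JOB (ACCT),'DCOLLECT',CLASS=A,MSGCLASS=X
      //* Collect data set (D) records from one volume as NaviQuest input
      //COLLECT  EXEC PGM=IDCAMS
      //SYSPRINT DD  SYSOUT=*
      //DCOUT    DD  DSN=YOUR.DCOLLECT.DATA,DISP=(NEW,CATLG),
      //             UNIT=SYSDA,SPACE=(CYL,(5,5)),
      //             RECFM=VB,LRECL=644
      //SYSIN    DD  *
        /* D records are written for each data set on the volume */
        DCOLLECT OFILE(DCOUT) VOLUMES(VOL001)
      /*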

    To generate test cases from the ACSTST program, select option 3, SMF Data, and then use the Test Case Generation from SMF Data Entry panel.

    Enter both the data set name containing the system management facility (SMF) data and the name of the test case PDS.

    Requirement: This function requires that you have the IGDACSSC storage class exit installed and have extracted the SMF type 127 records. The ACSTST program is also required. Both are available in the sample library SYS1.SACBCNTL.

    To generate test cases from a VMA extract file, select option 4, VMA Extract Data, and then use the Test Case Generation from VMA Extract Data Entry panel.

    Enter the following information:
    • Name of the data set containing the VMA Extract data
    • Number of test cases you want generated
    • Member name prefix (subtype prefix) of the VMA test cases
    • Program name, if you want to test the implementation for a particular program
    Also include the name of the test case PDS and whether to replace the existing test cases.

    Requirement: To use this function, you must have already run GFTAXTR from your saved SMF records (types 14, 15, 21, and 30). JCL for GFTAXTR can be found in SYS1.SAMPLIB member GFTAXTRP.

    Whichever source you use, add a 1- to 4-character subtype prefix to each test case member. The prefix must be unique for each data subtype. For example, the first group of TSO data could have subtype prefix TSOA, the second TSOB, and so on.

    You can also create test cases in batch: ACBJBAG2, ACBJBAG1, ACBJBAOW, and ACBJBAI1 in the SYS1.SACBCNTL JCL library perform this task (see How to run storage administration tasks in batch). See step 1 for creating the ISMF table. A minimal sketch of the pattern these jobs follow appears below.
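
    The supplied members are complete jobs that you edit and submit. The following shows only the general ISPF-in-batch pattern that such jobs follow; all data set names are placeholders, the EXEC name is a stand-in, and the actual members in SYS1.SACBCNTL supply the correct statements and parameters.

      //NAVBATCH JOB (ACCT),'NAVIQUEST',CLASS=A,MSGCLASS=X
      //* Generic pattern: run a NaviQuest EXEC under ISPF in batch
      //GENTC    EXEC PGM=IKJEFT01,DYNAMNBR=50
      //SYSPROC  DD  DSN=SYS1.SACBCNTL,DISP=SHR        NaviQuest EXECs
      //ISPPLIB  DD  DSN=ISP.SISPPENU,DISP=SHR         ISPF panels
      //ISPMLIB  DD  DSN=ISP.SISPMENU,DISP=SHR         ISPF messages
      //ISPSLIB  DD  DSN=ISP.SISPSENU,DISP=SHR         ISPF skeletons
      //ISPTLIB  DD  DSN=ISP.SISPTENU,DISP=SHR         ISPF tables
      //ISPPROF  DD  DSN=&&PROF,DISP=(NEW,DELETE),UNIT=SYSDA,
      //             SPACE=(TRK,(1,1,5)),RECFM=FB,LRECL=80
      //SYSTSPRT DD  SYSOUT=*
      //* The EXEC name and parameters below are placeholders
      //SYSTSIN  DD  *
        ISPSTART CMD(%YOUREXEC)
      /*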

    _______________________________________________________________

  3. Make ACS routine and construct changes.

    You must change (but not activate) the ACS code and constructs to reflect the new phase of implementation that you want to test. Before you change the ACS code and construct definitions contained in the source control data set (SCDS), save the old source in case it is needed for recovery.

    For information on recovering the ACDS, refer to Recovering Storage Management Subsystem information.

    You can now update the ACS routines to reflect the new data subtype you want migrated to SMS.
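
    For example, to route a hypothetical new TSOB data subtype to a storage class, you might add logic such as the following to the storage class routine. This is a sketch only; the data set mask and class name are illustrative, not taken from this procedure:

      PROC STORCLAS                          /* storage class routine  */
        FILTLIST TSOB_DSNS INCLUDE(TSOB.**)  /* new subtype under test */
        SELECT
          WHEN (&DSN = &TSOB_DSNS)           /* new data subtype       */
            SET &STORCLAS = 'SCSTD'          /* illustrative class     */
          OTHERWISE
            SET &STORCLAS = ''               /* remain non-SMS-managed */
        END
      END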

    _______________________________________________________________

  4. Update the FILTLISTs.

    When the ACS code is changed, you might want to use the COPYFILT function of NaviQuest to update all the ACS routines from a common definition of the filter lists. You will be prompted to provide a change log entry that reflects changes you are making to the ACS routines. This entry will be automatically placed into the change log in the ACS routines.

    To use the COPYFILT macro, see COPYFILT macro: COPYLIB facility for FILTLISTs.
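
    As an illustration, a common FILTLIST definition kept in a shared member might look like the following; COPYFILT propagates it into each ACS routine so that all routines filter on identical masks. The name and masks here are hypothetical:

      FILTLIST TSOB_DSNS INCLUDE(TSOB.**, USER%%.TSOB.*)
                         EXCLUDE(TSOB.TEMP.**)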

    _______________________________________________________________

  5. Translate and validate the ACS routines.

    You must translate and validate (but not activate) the ACS routines. The ISMF translate function checks the ACS routines for syntax errors and transforms them into a table format suitable as input to validation.

    The ISMF validate function verifies that all possible constructs that can be assigned by the ACS logic have been defined in the SCDS used for testing. ACS routines must be translated before they can be validated; however, validation of ACS routines is optional.
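
    The distinction matters in practice. For example, the following statement translates cleanly because its syntax is valid, but it fails validation if the named storage class (an illustrative name) is not defined in the test SCDS:

      SET &STORCLAS = 'SCFAST'   /* translation: valid syntax          */
                                 /* validation: fails unless SCFAST is */
                                 /* defined in the SCDS under test     */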

    To translate and validate, you can either use the online ISMF functions or you can use the NaviQuest ISMF-in-batch EXEC.

    For online translation and validation, choose option 7 (ACS Class Selection) from the ISMF Primary Option menu. To translate, choose option 2 (Translate). To validate, choose option 3 (Validate).

    To use the translate facility in batch, see ACS routine translate: ACBQBAO1.

    For more information on translation, see Translating ACS routines. For more information on validation, see Validating ACS routines or an entire SCDS.

    _______________________________________________________________

  6. Run the test cases.

    Create a new ACS listing by using the ISMF Test ACS Routines option (option 7.4.3). The testbed library contains the test cases; specify an asterisk (*) to run all test cases in the library.

    The new ACS listing represents the SMS configuration after the ACS routines have been changed for the new data subtype.

    Recommendations:
    1. Include the prefix of the subtype tested in the ACS listing data set name, to make it easier to identify which data subtype the listing represents.
    2. Run test cases in batch whenever possible. For sample JCL for testing ACS routines in batch, see Test ACS routines: ACBQBAIA; a sketch of the batch invocation follows this list.
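
    Reusing the ISPF-in-batch skeleton sketched in step 2, a batch test run would start the test EXEC from SYSTSIN, roughly as follows. The invocation is a placeholder; the sample member documented under Test ACS routines: ACBQBAIA supplies the actual statements and parameters:

      //* The parameters are placeholders -- see the sample JCL
      //SYSTSIN  DD  *
        ISPSTART CMD(%ACBQBAIA)
      /*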

    _______________________________________________________________

  7. Compare the results of the regression testing.

    After the baseline test, every test includes both testing of new data subtypes and regression testing of previously tested data subtypes, including the baseline test set.

    At this time, you use the NaviQuest ACS comparison test function to compare the results of all test cases in the testbed library with their expected results. The ACS comparison test produces a report of exceptions. Because you have not yet stored the expected results of the test cases for this data subtype, these test cases appear as exceptions. Later, in step 9, you will store the expected results for the current data subtype test cases. For now, the exceptions you get are either these valid initial (that is, first-run) test cases or errors.

    To run the ACS comparison test, choose option 2 from the NaviQuest Primary Option Menu.

    On the ACS Comparison Report panel, enter the following information:
    • Name of your base test case results
    • Name of your new test case results
    • PDS that contains the test cases
    • PDS that contains the exception test cases
    • Name of the comparison results data set
    After you run the comparison from the ACS Test Listings Comparison Entry panel, verify the following items:
    • The number of exceptions should be the same as the number of test cases you are currently testing.
    • Exceptions should all have the same subtype prefix.
    • Each listed test case should have the listed results that you expect.
    If changes have been made correctly to the ACS routines, the differences between the two should be only the data subtype that is being initially tested.

    Specify a comparison data set name to be used to store the results of the comparison. Also indicate whether you want to write over the specified data set if it already exists. If you specify N and the data set already exists, an error message is returned. If you specify Y, the data set is deleted, a new data set with the same name is allocated, and the report is written to this data set. Then press the Enter key.

    You will be automatically placed into ISPF "browse" when the comparison completes. The comparison data set you are browsing lists only the test cases identified as exceptions.

    If exceptions other than the test cases for the subtype you are initially testing are listed, you have probably made an error in coding the revisions to your ACS routines. You must correct these coding errors and repeat the procedure until the only exceptions listed are the test cases for the subtype being initially tested.

    The following files are created or updated as output:
    • Exception PDS
    • Comparison data set

    Important: Each test is an initial test for one data subtype but might include many regression tests for previously tested data subtypes. Expected values are not stored for the initially tested data subtype until its testing completes successfully.

    The ACS comparison performed in this step has two functions:
    • It validates the regression tests.

      Current test results of each previously tested data subtype should match the saved expected results previously stored with the test cases. If the results are the same, the regression test is successful. If the results differ, there is an error in the new ACS logic; that is, the ACS routine is assigning different values.

    • It indicates the subtype test set that is being initially tested.

      Because this is an initial test, these test cases have no stored expected results other than null. Thus, during the comparison, all test cases for this new data subtype show an exception; that is, the new results are no longer null.

    For more information about running the ACBQBAC1 EXEC in batch, see ACS test listings comparison: ACBQBAC1.

    _______________________________________________________________

  8. Validate the test results and determine errors.

    You must manually compare the new test cases to their expected results for the single data subtype that was initially tested. This comparison determines whether there are initial test errors. If the exceptions contain any test cases from data subtypes that previously tested correctly (in regression testing), those exceptions are also errors.

    It is the manual verification of the results that ensures that the values are the expected results. When all test cases are correct, the test values are stored in the test cases as saved expected results, to be used for later regression testing.

    If you find errors, you can generate the NaviQuest ACS cross-reference report for additional information about the specific test cases that produced the errors. Use this report to help you debug the ACS logic. If you find errors (from either step 7 or step 8), you must correct the ACS code, return to step 3, and retest until the data subtype results have no errors.

    If you do not find errors, the test is complete, as all the test cases in the subtype test set have the correct expected results.

    To create an ACS cross-reference report, choose the Enhanced ACS Test Listing option (option 3) from the NaviQuest Primary Option Menu. On the Enhanced ACS Test Listing Entry panel, fill in the following fields:
    • ISMF test case listing (generated through option 7.4.3)
    • Data set for cross-reference listing (name of data set to contain cross-reference)

    Indicate whether the specified data set should be written over if it already exists. If you specify N and the data set already exists, an error message is returned. If you specify Y, the data set is deleted, a new data set with the same name is allocated, and the report is written to this data set.

    Specify, with Y or N, which variables you want included in your report. When you have specified all the variables that you want, press the Enter key to produce the report.

    _______________________________________________________________

  9. Save the expected results.

    Once the subtype test is correct, you can use NaviQuest to place the results of the test into the test case definitions as the saved expected results. Test results are saved only after all test cases in the subtype test set have completed with the expected results for that data subtype. The saved expected results are used for later regression testing, as explained in step 7.

    To save the test results, choose the Test Case Update With Test Results option (option 4) from the NaviQuest Primary Option Menu. This takes you to the Test Case Update With Test Results Entry panel.

    Enter the names of the testbed PDS library, the exception test case PDS, the PDS created in the ACS comparison report, and the new ACS test case listing.

    The test case members for the exceptions are read and copied into the testbed library. The saved expected results are obtained from the comparison report and are also saved in the testbed library.

    You have now completed testing for this data subtype and can now start testing the next data subtype.

    Delete the following data sets at the end of this step:
    • Comparison report generated in step 7
    • Exception PDS created in step 7
    • Base and new ACS listings (you can print them before deleting them)

    For more information about running the ACBQBAU1 EXEC in batch, see Update test cases with expected results: ACBQBAU1.

    _______________________________________________________________

  10. Test the next data subtype in the current phase.

    Continue NaviQuest testing for each data subtype in the current SMS implementation phase. This testing either repeats steps 1 through 10 or repeats steps 3 through 10, depending on whether all subtype test sets were initially placed into the testbed.

    After the initial test of the baseline, all additional tests include regression testing along with initial testing.

    _______________________________________________________________

  11. Activate your new SMS configuration.

    After an entire phase (that is, all the subtypes within the implementation phase) has tested correctly, you can activate the new configuration by using the SETSMS command at an MVS console.
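
    For example, if the configuration you have been testing is in an SCDS named SMS.SCDS1.SCDS (an illustrative name), enter the following command at the console:

      SETSMS SCDS(SMS.SCDS1.SCDS)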

    For more information on activating your configuration, see Activating Storage Management Subsystem configurations.

    You might want to use the NaviQuest reporting capabilities to determine the amount of DASD space required to convert the data in each phase, prior to attempting conversion. Use this information to ensure that enough DASD is available for the conversion.

    _______________________________________________________________

  12. Convert data to SMS management.
    After you activate your new configuration, you can migrate the data to SMS management. There are several options for this data migration (a CONVERTV sketch follows the list):
    • DFSMSdss COPY
    • DUMP/RESTORE
    • Normal allocation processing
    • MIGRATE/RECALL
    • CONVERTV
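
    As a sketch of the last option, the following DFSMSdss job converts the data sets on a single volume in place. The job card and volume serial are placeholders; consider running CONVERTV with the TEST keyword first to simulate the conversion before committing to it.

      //CONVJOB  JOB (ACCT),'CONVERTV',CLASS=A,MSGCLASS=X
      //CONVERT  EXEC PGM=ADRDSSU
      //SYSPRINT DD  SYSOUT=*
      //SYSIN    DD  *
        CONVERTV -
          SMS -
          DYNAM(VOL001)
      /*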





Copyright IBM Corporation 1990, 2014