

Suggested AutoCollect Implementation
The following is a suggested implementation for first-time users of AutoCollect.
Users must first do the following research and testing at their site.
- Determine which MUFs are selected for AutoCollect processing.
For first-time users, we suggest selecting one MUF and using it as both the Source and the Repository MUF.
- Verify that the repository MUFs have the Snapshot and Delta databases installed in the CXX. A simple CXX report should be able to validate their presence and also provide the current allocation sizes.
- If the databases are not present, check the r12 installation process for the steps to install the databases. If you cannot determine why the databases were not installed, contact CA Support.
- If the dataset sizes are not large enough or if you have not allocated and initialized the two datasets for each database, follow the instructions earlier in this guide to allocate and initialize the AutoCollect databases.
- Before starting data collection, run a DBUTLTY LOAD FORMAT=NONE to ensure the databases are empty and ready to receive AutoCollect data.
Important! We strongly recommend that you implement the AutoCollect databases as 1019 (Snapshot) and 1020 (Delta). If you choose to use other DBIDs for these databases, the various AutoCollect functions of DBUTLTY need to include the DBIDs on every AutoCollect function execution.
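As an illustration only — the load library and AREA names are placeholders, and the exact LOAD operands should be verified against the DBUTLTY documentation for your release — a step to empty both AutoCollect databases might look like this sketch. The DBIDs assume the recommended 1019 (Snapshot) and 1020 (Delta); take the actual area names from your CXX report.

```jcl
//CLEARAC  EXEC PGM=DBUTLTY,REGION=4M
//* Empty the AutoCollect databases before collection begins.
//* DBIDs assume the recommended 1019 (Snapshot) and 1020 (Delta).
//STEPLIB  DD  DISP=SHR,DSN=your.datacom.loadlib
//SYSPRINT DD  SYSOUT=*
//SYSIN    DD  *
 LOAD AREA=xxx,DBID=1019,FORMAT=NONE
 LOAD AREA=xxx,DBID=1020,FORMAT=NONE
/*
```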
- Create a set of DBUTLTY jobs to execute and test the various AutoCollect functions:
- Execute OPTION=SNAPSHOT several times.
- Execute OPTION=SNAPRPT to report on the created Snapshot rowsets.
- Execute OPTION=DELTACRE to create the Delta rowsets.
- Execute OPTION=DELTARPT to report on the created Delta rowsets.
- Execute OPTION=SUMMARY to summarize all the INTERVAL delta rowsets.
- Execute OPTION=BASELINE to summarize and baseline all the INTERVAL delta rowsets.
- Execute OPTION=AVGPERF to summarize and average all the INTERVAL delta rowsets.
- Execute OPTION=DELTARPT to report on the created Delta rowsets, including the user-created Delta rowsets.
- Execute OPTION=DSVOUT to create output datasets for the SUMMARY and BASELINE delta rowsets.
- View the sequential output datasets through TSO or Roscoe.
- Use an available FTP protocol to transfer the files from the mainframe to .txt files on your PC, allowing EBCDIC-to-ASCII translation to occur.
- Follow the instructions provided to import the .txt files into an Excel spreadsheet.
- (For sites with SQL) Follow the examples for using DBSQLPR to print a report of the data in a Delta rowset.
- (For sites with SQL) Follow the examples for using DBSQLPR to create a sequential output dataset for upload to a PC .txt file.
- Use the SNAPRPT and SNAPDEL functions to report on selected Snapshot rowsets and then delete them. Practice using selection criteria such as DATERG, TIMERG and MUFNAME.
- Use the DELTARPT and DELTADEL functions to report on selected Delta rowsets and then delete them. Practice using selection criteria such as TYPE, DATERG, TIMERG and MUFNAME.
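The test functions above could be packaged as one small DBUTLTY step, sketched below. This assumes the AutoCollect functions are invoked as DBUTLTY control statements using the OPTION= keywords listed above, and that the databases use the recommended DBIDs 1019 and 1020 (so no DBID operands appear); confirm the exact statement syntax in the DBUTLTY reference for your release.

```jcl
//ACTEST   EXEC PGM=DBUTLTY,REGION=4M
//STEPLIB  DD  DISP=SHR,DSN=your.datacom.loadlib
//SYSPRINT DD  SYSOUT=*
//SYSIN    DD  *
 AUTOCOLLECT OPTION=SNAPRPT
 AUTOCOLLECT OPTION=DELTACRE
 AUTOCOLLECT OPTION=DELTARPT
 AUTOCOLLECT OPTION=SUMMARY
 AUTOCOLLECT OPTION=BASELINE
 AUTOCOLLECT OPTION=AVGPERF
/*
```

OPTION=SNAPSHOT is best run as its own job, executed several times with real elapsed time between runs so that the Snapshots differ, and OPTION=DSVOUT additionally needs output DD statements for the sequential datasets it creates.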
When done testing, execute a DBUTLTY LOAD FORMAT=NONE to clear out the test data. Save the various JCLs so that you have working examples for your site.
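For sites with SQL, the DBSQLPR report exercised during testing might be sketched as follows. The load library, the authorization ID, and the table name DELTA_STATS are all hypothetical placeholders — use the actual Delta database table names documented for your release.

```jcl
//SQLRPT   EXEC PGM=DBSQLPR,REGION=4M
//STEPLIB  DD  DISP=SHR,DSN=your.datacom.loadlib
//SYSPRINT DD  SYSOUT=*
//SYSIN    DD  *
  SELECT * FROM SYSADM.DELTA_STATS ;
/*
```

Directing SYSPRINT to a sequential dataset instead of SYSOUT produces a file that can be transferred to the PC as a .txt file, as described above.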
Once the initial testing phase is complete, plan the AutoCollect Snapshot collection process. Key to this process are the time periods in which the Source MUF is normally executing. As the examples in chapter 2 show, the timing of MUF startup and shutdown affects when the Snapshot collection points occur.
The following process provides a simple implementation for collecting weekly Snapshot data.
- Select a weekly Snapshot collection:
- If the MUF is cycled weekly, take the Snapshot just before the MUF is brought down.
- If the MUF stays up for longer periods, select a specific quiet time during the weekend and execute the Snapshot at the same time each week.
- If the MUF is cycled at non-weekly intervals, combine the two points above: take a snapshot at a fixed time each week and another just before the MUF is shut down.
- Collect 4 weeks of normal activity Snapshots.
- Create Delta rowsets.
- Produce Snapshot and Delta reports.
- Create Summary rowsets for each week, if needed.
- Create a weekly baseline record from the four weeks' Summary rowsets.
- Create the DSVOUT output files.
- Download the DSV files into Microsoft Excel spreadsheets.
- Review the spreadsheets for any performance opportunities:
- Look for trouble spots, such as low resource, waits, buffer not available, and so on.
- Look for ways to improve performance, such as buffers, covered, virtual, and so on.
- Build a list of suggested changes.
- Implement first performance change.
- Wait a week to collect new performance information.
- Create a new weekly summary Delta rowset.
- Create a new DSV output with all summary rowsets.
- Upload it into an Excel spreadsheet.
- Compare all statistics:
- Statistics where we expected change
- Other statistics that changed
- Determine if the change was positive.
- Decide to keep or reset the change.
- Repeat the process until all noted changes have been made.
- Continue weekly review process.
- Each week, statistics should be reviewed for the following:
- Overall performance
- Changing workloads
- Error conditions
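The ongoing weekly cycle might then reduce to two scheduled jobs plus periodic housekeeping. The sketch below again assumes the AUTOCOLLECT statement form and the recommended DBIDs, and omits site-specific STEPLIB library names (placeholders shown); verify the statement syntax in the DBUTLTY reference.

```jcl
//* Job step 1 - run at the same quiet time each week
//SNAP     EXEC PGM=DBUTLTY,REGION=4M
//STEPLIB  DD  DISP=SHR,DSN=your.datacom.loadlib
//SYSPRINT DD  SYSOUT=*
//SYSIN    DD  *
 AUTOCOLLECT OPTION=SNAPSHOT
/*
//* Job step 2 - run after the Snapshot completes
//DELTA    EXEC PGM=DBUTLTY,REGION=4M
//STEPLIB  DD  DISP=SHR,DSN=your.datacom.loadlib
//SYSPRINT DD  SYSOUT=*
//SYSIN    DD  *
 AUTOCOLLECT OPTION=DELTACRE
 AUTOCOLLECT OPTION=SUMMARY
/*
```

Periodically, the SNAPDEL and DELTADEL functions with DATERG, TIMERG, or MUFNAME selection criteria can prune rowsets that are no longer needed, keeping the Snapshot and Delta databases within their allocated sizes.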
Copyright © 2014 CA.
All rights reserved.
 