Mainframe Testing - Complete Tutorial
Before learning Mainframe Testing, let's first understand:
What is a Mainframe?
A mainframe is a high-performance, high-speed computer system. It is used for large-scale computing that requires high availability and security. It is mostly used in sectors like finance, insurance, and retail, and in other critical areas where huge volumes of data are processed repeatedly.
What is Mainframe Testing?
Mainframe Testing is defined as the testing of Mainframe Systems and is similar to web-based testing. The Mainframe application (otherwise called a batch job) is tested against test cases developed from the requirements.
● Mainframe Testing is usually performed on the deployed code, using various combinations of data set up in the input file.
● Applications that run on the mainframe can be accessed through a terminal emulator. The emulator is the only software that needs to be installed on the client machine.
● While performing Mainframe Testing, the tester only needs to know the navigation of the CICS screens. They are custom-built for specific applications.
For any changes made to the code in COBOL, JCL, etc., the tester does not have to worry about the emulator setup on the machine: changes that work through one terminal emulator will also work on the others.
In this tutorial, you will learn-
● Mainframe Attributes
● Classification of Manual Testing in Mainframe
● Mainframe Testing Approach
● Mainframe Automation Testing Tools
● Methodology in Mainframe Testing
● Steps involved in Batch testing
● Steps involved in Online Testing
● Steps involved in Online Batch Integration testing
● Commands used in Mainframe Testing
● Pre-requisites to start mainframe testing
● Best Practices
● Mainframe testing Challenges and Troubleshooting
● Common Abends encountered
● Common issues faced during mainframe testing
Mainframe Attributes
- Virtual Storage
  - It is a technique that lets a processor simulate main storage larger than the actual amount of real storage.
  - It is a technique to use memory effectively to store and execute tasks of various sizes.
  - It uses disk storage as an extension of real storage.
- Multiprogramming
  - The computer executes more than one program at the same time, but at any given moment only one program can have control of the CPU.
  - It is a facility provided to make efficient use of the CPU.
- Batch Processing
  - It is a technique by which any task is accomplished in units known as jobs.
  - A job may cause one or more programs to execute in sequence.
  - The job scheduler decides the order in which the jobs should be executed. To maximize average throughput, jobs are scheduled as per their priority and class.
  - The necessary information for batch processing is provided through JCL (Job Control Language). JCL describes the programs, data, and resources a batch job needs (a minimal JCL sketch follows this list).
- Time Sharing
  - In a time-sharing system, each user has access to the system through a terminal device. Instead of submitting jobs that are scheduled for later execution, the user enters commands that are processed immediately.
  - Hence this is called "Interactive Processing". It enables the user to interact directly with the computer.
  - Time-share processing is known as "Foreground Processing", and batch job processing is known as "Background Processing".
- Spooling
  - SPOOLing stands for Simultaneous Peripheral Operations OnLine.
  - A SPOOL device is used to store the output of a program/application. The spooled output is directed to output devices like a printer (if needed).
  - It is a facility that exploits the advantage of buffering to make efficient use of output devices.
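To make the JCL attribute concrete, below is a minimal sketch of a batch job. The job, program, and dataset names (TESTJOB1, MEMVAL, TEST.MEMBER.INPUT, etc.) are hypothetical placeholders, not part of any real application.

```
//TESTJOB1 JOB (ACCT),'BATCH SKETCH',CLASS=T,MSGCLASS=X,
//         NOTIFY=&SYSUID,TIME=(,30)
//* Run a (hypothetical) program MEMVAL against a test input file
//STEP01   EXEC PGM=MEMVAL
//STEPLIB  DD DSN=TEST.LOADLIB,DISP=SHR
//INFILE   DD DSN=TEST.MEMBER.INPUT,DISP=SHR
//OUTFILE  DD DSN=TEST.MEMBER.OUTPUT,
//            DISP=(NEW,CATLG,DELETE),
//            SPACE=(TRK,(5,5)),RECFM=FB,LRECL=80
//SYSOUT   DD SYSOUT=*
```

The JOB statement carries the scheduling attributes discussed above (CLASS, MSGCLASS, TIME), while each DD statement describes a dataset the program needs.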
Classification of Manual Testing in Mainframe
Mainframe Manual Testing can be classified into two types:
- Batch Job Testing
  ○ The testing process involves the execution of batch jobs for the functionality implemented in the current release.
  ○ The test results extracted from the output files and the database are verified and recorded.
- Online Testing
  ○ Online Testing refers to the testing of CICS screens, which is similar to the testing of a web page.
  ○ The functionality of the existing screens could be changed, or new screens could be added.
  ○ Various applications can have enquiry screens and update screens. The functionality of these screens needs to be checked as part of the online testing.
Mainframe Testing Approach
- The Business team prepares requirement documents, which determine how a particular item or process is going to be modified in the release cycle.
- The testing team and the development team receive the requirement document. They figure out how many processes will be affected by the change. Usually, in a release, only 20-25% of the application is directly affected by the customized requirement. The remaining 75-80% of the release will be for the out-of-the-box functionalities, like testing the applications and processes.
- So, a Mainframe application has to be tested in two parts:
  - Testing Requirements - Testing the application for the functionality or the change mentioned in the requirement document.
  - Testing Integration - Testing the whole process or the other applications which receive or send data to the affected application. Regression Testing is the primary focus of this testing activity.
Mainframe Automation Testing Tools
Below is the list of tools which can be used for Mainframe Automation Testing:
● REXX
● Excel
● QTP
Methodology in Mainframe Testing
Let us consider an example: an XYZ insurance company has a member enrollment module. It takes data both from the member enrollment screen and from offline enrollment. As we discussed earlier, Mainframe testing takes two approaches: online testing and batch testing.
● Online testing is done on the member enrollment screen. Just like on a web page, the database is validated with the data entered through the screens.
● Offline enrollment can be paper enrollment or enrollment on a third-party website. The offline data (also referred to as a batch) is entered into the company database through batch jobs. An input flat file is prepared as per the prescribed data format and fed to a sequence of batch jobs. So, for mainframe application testing, we can use the following approach:
○ The first job in the line of batch jobs validates the data entered: for example, special characters, alphabets in number-only fields, etc.
○ The second job validates the consistency of the data based on business conditions: for example, a child enrollment should not contain dependent data, a member ZIP code which is not available for service by the enrolled plan, etc.
○ The third job modifies the data into the format that can be entered into the database: for instance, deleting the plan name (the database will store only the plan ID, not the insurance plan name), appending the date of entry, etc.
○ The fourth job loads the data into the database.
● Batch job testing is done on this process in two phases:
○ Each job is validated separately, and
○ The integration between the jobs is validated by providing the input flat file to the first job and validating the database. (Intermediary results have to be validated for extra caution.) A sketch of such a job chain is shown below.
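The four-job chain above could be set up as a single JCL stream in which each step runs only if the previous steps ended cleanly. The sketch below assumes hypothetical program names (ENRVALID, ENRCONS, ENRFMT, ENRLOAD) and passes temporary datasets between the steps:

```
//ENRLTEST JOB (ACCT),'ENROLL CHAIN',CLASS=T,MSGCLASS=X,NOTIFY=&SYSUID
//* Step 1: field-level edits (special characters, numeric fields)
//VALIDATE EXEC PGM=ENRVALID
//INFILE   DD DSN=TEST.ENROLL.INPUT,DISP=SHR
//OUTFILE  DD DSN=&&VALID,DISP=(NEW,PASS),SPACE=(TRK,(5,5)),
//            RECFM=FB,LRECL=80
//* Step 2: business-rule consistency checks; COND=(0,NE) skips the
//* step if any earlier step ended with a nonzero return code
//CONSIST  EXEC PGM=ENRCONS,COND=(0,NE)
//INFILE   DD DSN=&&VALID,DISP=(OLD,PASS)
//OUTFILE  DD DSN=&&CONS,DISP=(NEW,PASS),SPACE=(TRK,(5,5)),
//            RECFM=FB,LRECL=80
//* Step 3: reformat records into the database load layout
//REFORMAT EXEC PGM=ENRFMT,COND=(0,NE)
//INFILE   DD DSN=&&CONS,DISP=(OLD,PASS)
//OUTFILE  DD DSN=&&LOAD,DISP=(NEW,PASS),SPACE=(TRK,(5,5)),
//            RECFM=FB,LRECL=80
//* Step 4: load the reformatted records into the database
//DBLOAD   EXEC PGM=ENRLOAD,COND=(0,NE)
//INFILE   DD DSN=&&LOAD,DISP=(OLD,DELETE)
```

During testing, the intermediate datasets would normally be cataloged rather than passed as temporaries, so that the tester can inspect them between steps.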
The following is the method followed for Mainframe testing:
Step 1): Shakedown/Smoke Testing
The main focus in this stage is to validate whether the deployed code is in the right test environment. It also ensures that there are no critical issues with the code.
Step 2): System Testing
Below are the types of testing done as part of System Testing.
- Batch Testing - This testing is done by validating the test results on the output files and the data changes made by the batch jobs under testing scope, and recording them.
- Online Testing - This testing is done on the front end of the mainframe application. Here the application is tested for correct entry fields, like an insurance plan, interest on the plan, etc.
- Online-Batch Integration Testing - This testing is done on systems having batch processes and online applications. The data flow and interaction between the online screens and the batch jobs are validated.
- (Example for this type of testing: Consider an update on plan details, like an increase in the interest rate. The change of interest is made on an update screen, and the balance details on the affected accounts will be modified only by a nightly batch job. Testing in this case is done by validating the plan details screen and the batch job run for updating all the accounts.)
- Database Testing - The databases used by the mainframe application (IMS, IDMS, DB2, VSAM/ISAM, sequential datasets, GDGs) are validated for their layout and data storage.
Step 3): System Integration Testing
The primary purpose of this testing is to validate the functionality of the systems which interact with the system under test.
These systems are not directly affected by the requirements; however, they use data from the system under test. It is important to test the interface and the different types of messages (like Job Successful, Job Failed, Database Updated, etc.) that can possibly flow between the systems, and the resulting actions taken by the individual systems.
Types of testing done in this stage are:
- Batch Testing
- Online Testing
- Online Batch Integration Testing
Step 4): Regression Testing
Regression Testing is a common phase in any type of testing project. In Mainframes, this testing ensures that batch jobs and online screens which do not directly interact with the system under test (or do not come within the scope of the requirements) are not affected by the current project release.
In order to have effective regression testing, a particular set of test cases should be shortlisted depending on their complexity, and a regression bed (test case repository) should be created. This set should be updated whenever new functionality is rolled out into the release.
Step 5): Performance Testing
This testing is done to identify the bottlenecks in high-hit areas like front-end data entry and online database updates, and to project the scalability of the application.
Step 6): Security Testing
This testing is done to evaluate how well the application is designed and developed to counter security attacks.
Two-fold security testing should be done on the system: Mainframe security and Network security.
The features which need to be tested are:
- Integrity
- Confidentiality
- Authorization
- Authentication
- Availability
Steps involved in Batch testing
- After the QA team receives the approved package (the package contains procedures, JCL, control cards, modules, etc.), the tester should preview and retrieve the contents into a PDS as required.
- Convert the production JCL or development JCL into QA JCL, otherwise called JOB SETUP (a before/after example is sketched below).
- Copy the production files and prepare the test files.
- For every functionality, there will be a job sequence defined (as explained in the example in the Methodology in Mainframe Testing section). The jobs should be submitted using the SUB command with the test data files.
- Check the intermediate files in order to identify the reasons for missing or errored-out data.
- Check the final output file, the database, and the spool to validate the test results.
- If the job fails, the spool will have the reason for the job failure. Address the error and resubmit the job.
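The JOB SETUP in the second step often amounts to repointing production datasets and libraries to test copies. A hypothetical DD statement before and after setup (the dataset names are placeholders):

```
//* As received in the package (production qualifier):
//POLFILE  DD DSN=PROD.POLICY.MASTER,DISP=SHR
//* After QA job setup (repointed to a test copy):
//POLFILE  DD DSN=QA.POLICY.MASTER,DISP=SHR
```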
Test Reporting - A defect should be logged if the actual result deviates from the expected result.
Steps involved in Online Testing
- Select the online screen in a test environment.
- Test each field for the acceptable data.
- Test the test scenario on the screen.
- Verify the database for the data updates from the online screen.
Test Reporting - A defect should be logged if the actual result deviates from the expected result.
Steps involved in Online Batch Integration testing
- Run the job in a test environment and validate the data on the online screens.
- Update the data on the online screens and validate whether the batch job runs properly with the updated data.
Commands used in Mainframe Testing
- SUBMIT - Submit a background job.
- CANCEL - Cancel a background job.
- ALLOCATE - Allocate a dataset.
- COPY - Copy a dataset.
- RENAME - Rename a dataset.
- DELETE - Delete a dataset.
- JOB SCAN - Bind the JCL with the program, libraries, files, etc. without executing it.
There are many other commands used when required, but they are not that frequent.
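As an illustration, here is roughly how a few of these TSO commands might look at the READY prompt. The dataset and job names are hypothetical, and exact operands can vary by installation:

```
ALLOCATE DATASET('TSTUSER.QA.INPUT') NEW SPACE(1,1) TRACKS RECFM(F B) LRECL(80)
SUBMIT 'TSTUSER.QA.JCL(ENRLTEST)'
CANCEL TSTUSERA(JOB01234)
DELETE 'TSTUSER.QA.INPUT'
```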
Pre-requisites to start mainframe testing
Basic details needed for mainframe testing are:
● Login ID and password for logging into the application.
● Brief knowledge of ISPF commands.
● Names of the files, file qualifiers, and their types.
Before starting mainframe testing, the below aspects should be verified:
- Job
  - Do a job scan (JOBSCAN command) to check for errors before executing it.
  - The CLASS parameter should point to the test class.
  - Direct the job output to the spool or a JHS, or as required, by using the MSGCLASS parameter.
  - Reroute the emails in the job to the spool or to a test mail ID.
  - Comment out the FTP steps for initial testing, and then point the job to a test server.
  - In case an IMR (Incident Management Record) is generated by the job, just add the comment "TESTING PURPOSE" in the job or param card.
  - All the production libraries in the job should be changed to point to test libraries.
  - The job should not be left unattended.
  - To prevent the job from running in an infinite loop in case of any error, the TIME parameter should be added with a specified time.
  - Save the output of the job, including the spool. The spool can be saved using XDC.
- File
  - Create test files of the needed size only. Use GDGs (Generation Data Groups - files with the same name but with sequential version numbers: MYLIB.LIB.TEST.G0001V00, MYLIB.LIB.TEST.G0002V00, and so on) when necessary to store data in consecutive files with the same name (a sample GDG definition follows this checklist).
  - The DISP parameter (Disposition - tells the system whether to keep or delete the dataset after normal or abnormal termination of the step or job) for the files should be coded correctly.
  - Ensure that all the files used for the job execution are saved and closed properly, to prevent the job from going into HOLD.
  - While testing using GDGs, make sure that the right version is pointed to.
- Database
  - While executing the job or the online program, ensure that unintended data is not inserted, updated, or deleted.
  - Also, ensure that the correct DB2 region is used for testing.
- Test cases
  - Always test for boundary conditions like an empty file, first-record processing, last-record processing, etc.
  - Always include both positive and negative test conditions.
  - In case standard procedures are used in the program, like checkpoint restart, abend modules, control files, etc., include test cases to validate that the modules have been used correctly.
- Test Data
  - Test data setup should be done before the beginning of testing.
  - Never modify the data on the test region without notifying; there may be other teams working with the same data, and their tests would fail.
  - In case the production files are needed during the execution, proper authorization should be obtained before copying or using them.
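To illustrate the GDG point above, here is a minimal sketch that defines a GDG base with the standard IDCAMS utility and then writes a new generation. The base name reuses the placeholder from the checklist (MYLIB.LIB.TEST); the LIMIT value, program name MYPGM, and file attributes are assumptions:

```
//DEFGDG   JOB (ACCT),'DEFINE GDG',CLASS=T,MSGCLASS=X,
//         NOTIFY=&SYSUID,TIME=(,30)
//* Define a GDG base that keeps at most 5 generations
//STEP01   EXEC PGM=IDCAMS
//SYSPRINT DD SYSOUT=*
//SYSIN    DD *
  DEFINE GDG (NAME(MYLIB.LIB.TEST) LIMIT(5) NOEMPTY SCRATCH)
/*
//* A later step writes the next generation as (+1)
//STEP02   EXEC PGM=MYPGM
//OUTFILE  DD DSN=MYLIB.LIB.TEST(+1),DISP=(NEW,CATLG,DELETE),
//            SPACE=(TRK,(5,5)),RECFM=FB,LRECL=80
```

Note that the job card also reflects the Job checklist above: a test CLASS, a MSGCLASS directing output to the spool, and a TIME limit to guard against loops.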
Best Practices
- In case of a batch job run, MAXCC 0 is an indicator that the job has run successfully. It does not mean that the functionality is working fine. The job will run successfully even when the output is empty or not as per the expectation. So it is always expected to check all the outputs before declaring the job successful.
- It is always a good practice to do a dry run of the job under test. A dry run is done with empty input files. This process should be followed for the jobs which are impacted by the changes made for the test cycle. (A sketch of allocating an empty input file follows this list.)
- Before the test cycle begins, the test job setup should be done well in advance. This will help in finding any JCL errors in advance, hence saving time during execution.
- While accessing DB2 tables through SPUFI (an option on the emulator to access DB2 tables), always set autocommit to "NO" in order to avoid accidental updates.
- Test data availability is the primary challenge in batch testing. The required data should be created well in advance of the test cycle and should be checked for completeness.
- Some online transactions and batch jobs may write data into MQs (Message Queues) for transmitting data to other applications. If the data is not valid, it may disable/stop the MQs, and this will affect the whole testing process. It is a good practice to check that the MQs are working fine after testing.
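For the dry run mentioned above, an empty test input file can be allocated with the standard IEFBR14 utility (which does no processing itself, but lets the JCL allocate the dataset). The dataset name and attributes here are hypothetical:

```
//MKEMPTY  JOB (ACCT),'EMPTY FILE',CLASS=T,MSGCLASS=X,NOTIFY=&SYSUID
//* IEFBR14 performs no processing; the DD below simply allocates
//* an empty, cataloged test input file for the dry run
//STEP01   EXEC PGM=IEFBR14
//EMPTY    DD DSN=QA.ENROLL.INPUT.EMPTY,DISP=(NEW,CATLG,DELETE),
//            SPACE=(TRK,(1,1)),RECFM=FB,LRECL=80
```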
Mainframe testing Challenges and Troubleshooting
- Incomplete / Unclear Requirements
  ○ Challenge: There may be access to a user manual/training guide, but those are not the same as documented requirements.
  ○ Approach: Testers should be involved in the SDLC from the requirements phase onwards. This will help to verify whether the requirements are testable.
- Data Setup / Identification
  ○ Challenge: There may be situations where existing data should be reused as per the requirement. It is sometimes difficult to identify the required data from the existing data.
  ○ Approach: For data setup, homegrown tools can be used as per the need. For fetching existing data, queries should be built in advance. In case of any difficulty, a request can be placed with the data management team for creating or cloning the required data.
- Job Setup
  ○ Challenge: Once the jobs are retrieved into a PDS, they need to be set up in the QA region so that the jobs are not submitted with a production qualifier or path details.
  ○ Approach: Job setup tools should be used so as to overcome human errors made during setup.
- Ad-hoc Requests
  ○ Challenge: There may be situations when end-to-end testing needs to be supported due to problems in upstream or downstream applications. These requests increase the time and effort in the execution cycle.
  ○ Approach: The use of automation scripts, regression scripts, and skeleton scripts could help in reducing the time and effort overhead.
- On-Time Releases for Scope Changes
  ○ Challenge: There may be a situation where the code impact may completely change the look and feel of the system. This may require a change to test cases, scripts, and data.
  ○ Approach: A scope change management process and impact analysis should be in place.
Common Abends encountered
- S001 - An I/O error occurred.
  - Reason - Reading at the end of the file, a file length error, or an attempt to write into a read-only file.
- S002 - Invalid I/O record.
  - Reason - An attempt to write a record longer than the record length.
- S004 - An error occurred during OPEN.
  - Reason - Invalid DCB.
- S013 - Error opening a dataset.
  - Reason - The PDS member does not exist, or the record length in the program does not match the actual record length.
- S0C1 - Operation Exception.
  - Reason - Unable to open a file, or a missing DD card.
- S0C4 - Protection exception / storage violation.
  - Reason - Trying to access storage not available to the program.
- S0C7 - Program Check Exception - Data.
  - Reason - A change in the record layout or file layout.
- Sx22 - Job has been canceled.
  - S222 - Job canceled by the user without a dump.
  - S322 - Job or step time exceeded the specified limit, the program is in a loop, or the time parameter is insufficient.
  - S522 - TSO session timeout.
- S806 - Unable to link or load.
  - Reason - The job is unable to find the specified load module.
- S80A - Not enough virtual storage to satisfy a GETMAIN or FREEMAIN request.
- S913 - Trying to access a dataset which the user is not authorized to.
- Sx37 - Unable to allocate enough storage to the dataset.
Error Assist - A very popular tool to get detailed information on various types of abends.
Common issues faced during mainframe testing
● Job Abends - For successful completion of the job, you should check whether the data, the input file, and the modules are present at the specific location or not. Abends can be faced due to multiple reasons, the most common being invalid data, an incorrect input field, a date mismatch, environmental issues, etc.
● Output file empty - Though the job might run successfully (MAXCC 0), the output might not be as expected. So before passing any test case, the tester has to make sure that the output is cross-verified. Only then proceed further.
● Input file empty - In some applications, files will be received from the upstream processes. Before using the received file for testing the current application, the data should be cross-verified to avoid re-execution and rework.