Source: Software Testing Life Cycle by Vinayak Rao
Software Testing Theory Index
Software Testing Life Cycle
- Test Planning
- Test Development
- Test Execution
- Result Analysis
- Bug Tracking
- Reporting
Planning is the first phase of testing.
Test Planning
Test Plan Document
1.1 Objective
The purpose of the document is specified here.
1.2 Reference Documents
The list of all the documents that were referred to while preparing this document (e.g., SRS and Project Plan) is given in this section.
2.0 Coverage of Testing
2.1 Features to be tested
The list of all the features that are to be tested, based on the implicit and explicit requirements from the customer, is mentioned in this section.
2.2 Features not to be tested
The list of all the features that can be skipped from the testing phase is mentioned here. Generally, out-of-scope features such as incomplete modules are listed here. If the risk is low and time constraints are high, low-risk features such as GUI or database style sheets may be skipped. Features that are to be incorporated in the future are also kept out of testing temporarily.
3.0 Test Strategy
3.1 Levels of testing
This is a project-level term that describes the testing procedures in an organization. All the levels to be performed, such as Unit, Module, Integration, System and UAT (User Acceptance Testing), are mentioned here.
3.2 Types of testing
All the various types of testing, such as compatibility and regression testing, are mentioned here along with the module divisions.
3.3 Test design techniques
The list of all the techniques that are followed and maintained in the company is given in this section. Two of the most widely used techniques are Boundary Value Analysis (BVA) and Equivalence Class Partitioning (ECP).
Boundary Value Analysis:
BVA says that whenever there is a range for an input, concentrate on the boundaries rather than on all the values in between. LB (lower bound) is the minimum of the range and UB (upper bound) is the maximum, so the values considered in the BVA technique are LB, LB + 1, LB - 1, UB, UB + 1, UB - 1. One MV (mid-range value) can be added if the range is lengthy [MV = (LB + UB) / 2].
Equivalence Class Partitioning:
Identify the different classes of inputs based on their properties and then develop the test cases for each class.
To understand these two techniques, let us take an example of a test case based on the requirements of a particular feature.
Example: The Email ID field of a web application needs to be developed to the following specifications or requirements:
a) The email text box field must accept a minimum of 4 characters and a maximum of 20 characters.
b) Only lowercase alphabets are allowed.
c) It should not accept special characters except @ and _.
Once we understand the requirements, the BVA values are: 3 characters (LB - 1), 4 (LB), 5 (LB + 1), 12 (MV), 19 (UB - 1), 20 (UB), 21 (UB + 1).
ECP states that the input data can be divided into different classes, determined by the type of data that needs to be provided for testing a particular feature. In the above example, by also considering the BVA values, the ECP table can be divided into two categories: Valid and Invalid.
Example: Valid = 4 char, 5 char, 12 char, 19 char, 20 char, a-z, @, _, etc.
Example: Invalid = 3 char, 21 char, A-Z, all special characters except @ and _, 0 to 9, alphanumeric text, empty spaces, decimals, etc.
Based on these two classes of inputs, you can create two tables, one each for valid and invalid data, and write positive and negative test cases against them, as shown in the sketch below.
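As an illustration, here is a minimal Python sketch of how the BVA lengths and ECP classes for the email field above could be generated and checked. The is_valid_email_field() function and the class lists are assumptions made only for this example; they are not part of any standard library or of the original document.

```python
import re

# Boundary values for the email field: LB = 4, UB = 20 characters.
LB, UB = 4, 20
MV = (LB + UB) // 2  # mid-range value, used when the range is lengthy

bva_lengths = [LB - 1, LB, LB + 1, MV, UB - 1, UB, UB + 1]  # 3, 4, 5, 12, 19, 20, 21

# A hypothetical validator implementing the stated requirements:
# 4-20 characters, lowercase alphabets only, plus @ and _.
def is_valid_email_field(value: str) -> bool:
    return bool(re.fullmatch(r"[a-z@_]{4,20}", value))

# ECP classes: one representative input per class is usually enough.
valid_class = ["abcd", "user@mail_com"]                       # lowercase, @ and _ allowed
invalid_class = ["abc", "ABCD", "user#1", "a" * 21, "    "]   # too short, uppercase, digit/#, too long, spaces

for length in bva_lengths:
    candidate = "a" * length
    print(length, "chars ->", "valid" if is_valid_email_field(candidate) else "invalid")

for value in valid_class + invalid_class:
    print(repr(value), "->", "valid" if is_valid_email_field(value) else "invalid")
```

Running the sketch prints one verdict per BVA length and per ECP representative, which is exactly the data that later goes into the valid/invalid test data tables.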
3.4 Configuration Management
3.5 Test Metrics
3.6 Terminology
3.7 Automation Plan
3.8 List of Automated tools
4.0 Base Criteria
4.2 Suspension Criteria
5.0 Test deliverables
6.0 Test environment
7.0 Resource planning
8.0 Scheduling
9.0 Staffing and Training
10.0 Risks and Contingencies
11.0 Assumptions
12.0 Approval Information
Test Development Stage
Arguably, this is the most important stage of the testing life cycle. In this phase the testers develop the test cases based on the requirements of the customer. There are usually three levels of requirements that the testers need to understand before they can proceed to write the test cases for the product:
- HLI (High Level Information)
- LLI / Use Cases (Low Level Information)
- Snapshots (prototype or images of a similar product or framework)
For example, the requirements for a login screen might be given as follows:
- The login screen should contain Username, Password and Connect to fields, and Login, Clear and Cancel buttons.
- The Connect to field should not be a mandatory field, but it must allow the user to connect to a database whenever required.
- Upon entering a valid username and a valid password and clicking on the Login button, the corresponding page according to the level of the user (admin, member, guest, etc.) must be displayed.
- Upon entering some information into any fields and clicking on Clear button, all the fields must be cleared and the cursor must be placed in the username field.
- Upon clicking on the Cancel button, the login screen must close.
- Initially whenever the login screen is invoked the Login and Clear buttons must be disabled.
- Cancel button must be always enabled.
- Upon entering some information into any of the fields the Clear button must be enabled.
- Upon entering any information into the username and password field the Login button must be enabled.
- Tabbing order (hitting Tab on the keyboard should highlight the fields in the specified sequence): Username, Password, Connect to, Login, Clear, Cancel.
Use Case Document
Name of the Use Case: Login Screen
Brief Description: This document describes the functionality of the Login screen.
Implicit requirements
- Initially whenever the login screen is invoked the cursor must be available in the username field.
- Upon entering an invalid username and a valid password and clicking Login, the following message must be displayed: "Invalid username. Please try again."
- Upon entering a valid username and an invalid password and clicking Login, the following message must be displayed: "Invalid password. Please try again."
- Upon entering an invalid username and an invalid password and clicking Login, the following message must be displayed: "Invalid username/password. Please try again."
Explicit requirements
- Initially whenever the login screen is invoked, the Login and Clear button must be disabled.
- Cancel button must be always enabled.
- Upon entering information in any field, Clear button must be enabled.
- Upon entering username and password details, Login button should be enabled.
- Tabbing order must be Username, Password, Connect to, Login, Clear and Cancel.
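To show how such explicit requirements translate into something testable, here is a minimal Python sketch of a login-screen state model together with a few assertions. The LoginScreen class and its attribute names are hypothetical, invented only for this illustration; a real application would expose this state through its GUI.

```python
class LoginScreen:
    """Hypothetical model of the login screen's button-enabling rules."""

    def __init__(self):
        self.username = ""
        self.password = ""
        self.connect_to = ""

    @property
    def login_enabled(self) -> bool:
        # Login is enabled only when both username and password have text.
        return bool(self.username) and bool(self.password)

    @property
    def clear_enabled(self) -> bool:
        # Clear is enabled as soon as any field contains information.
        return bool(self.username or self.password or self.connect_to)

    @property
    def cancel_enabled(self) -> bool:
        # Cancel must always be enabled.
        return True


# Assertions mirroring the explicit requirements above.
screen = LoginScreen()
assert not screen.login_enabled and not screen.clear_enabled   # disabled initially
assert screen.cancel_enabled                                   # always enabled

screen.username = "user1"
assert screen.clear_enabled and not screen.login_enabled       # Clear enables first

screen.password = "pswd1"
assert screen.login_enabled                                    # both fields filled
```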
Guidelines for Test engineer
- Identify the module to which the use case belongs. In our example the login screen use case generally belongs to the security module.
- Identify the functionality of the use case with respect to the total functionality (example: authentication for the login screen).
- All the look and feel (GUI) related test cases need to be written by the test engineers directly, even if the HLI, LLI and snapshots are not available. (Important)
- Identify the functional points and prepare the Functional Points Document (FPD).
- Identify the actors involved in the use case, whether normal user, administrator, etc.
- Identify the inputs required to perform testing; valid and invalid inputs need to be identified with respect to the functionality of the features.
- Identify whether this use case is linked with any other use case, such as the homepage, admin page or database connections page, to confirm authentication.
- Identify the preconditions and ensure that the released build version is the correct one and can be used for executing the test cases hereafter.
Methodology
- Understand the main flow of the application. Usually all the valid inputs and normal actions of a valid user fall under the main flow.
- Understand the alternative flow. Generally, spontaneous and unpredictable inputs from the actors, such as invalid entries or interchanging the order of inputs, create different responses from the application. Such scenarios fall under the alternative flow.
- Understand the special requirements. Most of the time default settings and standards are decided by the developers, but in some cases the customer may request a non-conventional approach to the application's behaviour. In such cases, understanding the requirement and creating a proper test case is crucial.
- Documentation is very important, and all the documents need to be created separately. The different versions and reference tables make the test cases lengthy, but they help later to trace any defect back to the requirements using the Traceability Matrix.
- A Functional Points Document (FPD) is maintained in order to understand the features that need to be tested and the features that can be skipped. A point where the user can perform some action on the application is called a functional point.
Chronology of Documents in Testing
Traceability Matrix / Cross Reference Matrix
The traceability (cross reference) matrix links the IDs of the documents produced in sequence, so that every defect can be traced back to the requirement it originated from. For example:

UCD | FPD | TSD | TCD | DPD
1   | 3   | 4   | 25  | 1

UC id | TC id
7.1   | 5
7.1   | 6

DPD id | TCD id
2      | 5
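A traceability matrix can be kept as simply as a pair of mappings. The following Python sketch uses the illustrative IDs from the small tables above to look up which use case a defect ultimately traces back to; the dictionary layout is an assumption for this example, not a prescribed format.

```python
# Traceability matrix kept as plain mappings (IDs are illustrative).
uc_to_tc = {
    "7.1": [5, 6],          # use case 7.1 is covered by test cases 5 and 6
}
dpd_to_tc = {
    2: [5],                 # defect 2 was found while executing test case 5
}

def trace_defect_to_use_case(defect_id: int) -> list[str]:
    """Return the use case IDs that a defect ultimately traces back to."""
    failing_tcs = dpd_to_tc.get(defect_id, [])
    return [uc for uc, tcs in uc_to_tc.items()
            if any(tc in tcs for tc in failing_tcs)]

print(trace_defect_to_use_case(2))   # ['7.1'] -> defect 2 traces to use case 7.1
```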
Test Execution Phase
- GUI test cases
Guidelines to be followed when preparing GUI test cases
- Check for availability of all the objects on the application
- Check for the alignment of all the objects, even if the customer does not specify it in the requirements.
- Check for the consistency of the objects (colour, appearance, resolution, spelling, etc.).
- Any feature that can be tested just by observation, or a defect that can be caught simply by looking at the screen and pointing it out during the development stage, falls under GUI test cases.
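The availability check in the first guideline can be partially automated. The Python/Selenium sketch below assumes a hypothetical page URL and element names (username, password, connect_to, login, clear, cancel); neither is from the original document, and both would have to match the real application under test.

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

# Hypothetical URL and element names; adjust to the real application.
LOGIN_URL = "http://example.com/login"
EXPECTED_OBJECTS = ["username", "password", "connect_to", "login", "clear", "cancel"]

driver = webdriver.Chrome()
try:
    driver.get(LOGIN_URL)
    for name in EXPECTED_OBJECTS:
        # find_element raises NoSuchElementException if the object is missing.
        element = driver.find_element(By.NAME, name)
        assert element.is_displayed(), f"{name} is present but not visible"
    print("All expected objects are available on the login screen.")
finally:
    driver.quit()
```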
Functional test cases
- The functional test cases can be classified into two categories: positive (+ve) and negative (-ve) test cases. +ve test cases are written for the steps that the user follows in order to perform the functions that the feature is supposed to do; in other words, the main flow of the application is tested with +ve test cases using valid data and inputs.
- -ve test cases are used to test irregular and unexpected actions by the end user on the application's functionality. At least one invalid input as test data is required to produce a -ve test case.
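As a minimal sketch of what a +ve and a -ve functional test case can look like in code, assume a stand-in authenticate(username, password) function that returns the landing page for valid credentials and raises an error otherwise; the function, the user store and the credentials below are invented for this illustration and are not part of the original document.

```python
import pytest

# A tiny stand-in for the system under test, invented for this sketch.
_USERS = {"user1": ("pswd1", "Home"), "admin1": ("pswd9", "Admin")}

def authenticate(username: str, password: str) -> str:
    if username not in _USERS:
        raise ValueError("Invalid username. Please try again.")
    stored_password, landing_page = _USERS[username]
    if password != stored_password:
        raise ValueError("Invalid password. Please try again.")
    return landing_page


def test_login_valid_credentials():          # +ve test case: main flow, valid data
    assert authenticate("user1", "pswd1") == "Home"


def test_login_invalid_username():           # -ve test case: invalid input
    with pytest.raises(ValueError, match="Invalid username"):
        authenticate("no_such_user", "pswd1")
```

Run with pytest: the first test exercises the main flow, the second the alternative flow described in the Methodology section.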
Non Functional test cases
- The test cases that are prepared to test the application's stability, load or performance related features fall under non-functional test cases. Performance and load testing are discussed in detail in other chapters of this knol; please refer to the complete theory index above to navigate to the other sections.
Test Case Template
- Objective : The purpose of this test case will be mentioned here.
- Project Name : The code name of the project or the product name is mentioned here; it is specific to company policy and may vary from organisation to organisation.
- Module name : The particular module the test case belongs to, for example the Login screen, is mentioned here.
- Author/ Prepared by : The test engineer, lead and other relevant names of people will be listed here who are responsible for the end document.
- Test scenarios : Based on the FPD, the features that need to be tested are shortlisted, and the scenarios and possible combinations are listed in this section.
- Revision history : Team leads and test managers have to review and approve the test document before the testers can proceed to execute the tests. The relevant authorised signatures and timestamps are present in this section.
- Test case ID : The serial numbers of the test cases are listed here.
- Req/Ref ID : The reference serial numbers or ids from the use cases will be listed here to create a cross reference matrix for the corresponding test case
- TC Type : The type of test case, whether GUI, +ve or -ve, is listed here.
- Description : The details of the action that the test engineer needs to perform on the features will be mentioned here clearly.
- Test Data : In order to perform functionality testing, it is very important to test the features with a range of data and with different techniques such as BVA and ECP. To keep the test case document tidy, test engineers prefer to create linking tables of test data in a separate document, or to provide the input data at the end of the test case document. The test data can be divided into two categories, valid and invalid inputs, in order to write +ve and -ve test cases efficiently.
- EV : Expected value after performing the action will be listed here
- AV : The actual behavior of the application after executing the test case will be recorded and listed here.
- Result : The comparison of EV and AV will be done and the result, either PASS or FAIL will be mentioned here in this section accordingly.
- Priority : A priority is assigned to each test case based on how much the underlying defect affects the testers' ability to continue executing further test cases on the application. Sometimes simple defects create navigational blocks and prevent testers from accessing entire features of the application, and such defects receive a high priority; similarly, GUI-related defects receive a low priority.
- Build No : The version of the build that is released by the development team will be listed here.
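Put together, a single row of such a test case document can be represented as a simple record. The Python dataclass below is only a sketch of the fields described above; the field values in the usage example are illustrative.

```python
from dataclasses import dataclass

@dataclass
class TestCase:
    """One row of a test case document (fields as described above)."""
    tc_id: int
    ref_id: str          # Req/Ref ID from the use case (cross reference matrix)
    tc_type: str         # "GUI", "+ve" or "-ve"
    description: str
    test_data: str
    expected_value: str  # EV
    actual_value: str    # AV, filled in during execution
    priority: str
    build_no: str

    @property
    def result(self) -> str:
        # Result is derived by comparing EV and AV.
        return "PASS" if self.expected_value == self.actual_value else "FAIL"


tc = TestCase(5, "7.1", "+ve", "Login with valid username and password",
              "user1 / pswd1", "Home", "Home", "P2", "B001")
print(tc.tc_id, tc.result)   # 5 PASS
```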
The following Excel sheet is an example of a standard test case document for our login screen functionality. Some results are PASS and some are FAIL in order to make the Bug Tracking and Result Analysis chapters easier to understand.
Objects on the login screen:

Sr. No. | Object Name         | Object Type
1       | Username / Password | Textbox
2       | Connect To          | Listbox / Combobox
3       | Login               | Button
4       | Clear               | Button
5       | Cancel              | Button

Valid test data (positive test cases):

Sr. No. | Username | Pswd  | EV    | AV
1       | user1    | pswd2 | Admin | Admin
2       | user2    | pswd2 | Home  | Home
3       | user3    | pswd3 | Home  | Home
4       | user4    | pswd4 | Home  | Home
5       | user5    | pswd5 | Admin | Admin

Invalid test data (negative test cases), where IVUPTA = "Invalid username. Please try again.", IVPPTA = "Invalid password. Please try again." and IVUPPTA = "Invalid username/password. Please try again.":

Sr. No. | Username | Password | EV (Error Message) | AV
1       | user1    | pswd1    | IVUPTA             | Admin
2       | NA       | pswd1    | IVUPTA             | IVUPTA
3       | user2    | NA       | IVPPTA             | IVPPTA
4       | NA       | NA       | IVUPPTA            | IVUPPTA
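Tables like these lend themselves to data-driven tests. The sketch below feeds the valid-data table into a parametrized pytest test against a stand-in authenticate() function of the same kind as in the earlier functional-test sketch; the user-to-landing-page mapping is invented here only to match the table above.

```python
import pytest

# Stand-in for the application, invented to match the valid-data table above.
_USERS = {"user1": ("pswd2", "Admin"), "user2": ("pswd2", "Home"),
          "user3": ("pswd3", "Home"),  "user4": ("pswd4", "Home"),
          "user5": ("pswd5", "Admin")}

def authenticate(username: str, password: str) -> str:
    if username not in _USERS:
        raise ValueError("Invalid username. Please try again.")
    stored_password, landing_page = _USERS[username]
    if password != stored_password:
        raise ValueError("Invalid password. Please try again.")
    return landing_page


@pytest.mark.parametrize("username, password, expected_page", [
    ("user1", "pswd2", "Admin"),
    ("user2", "pswd2", "Home"),
    ("user3", "pswd3", "Home"),
    ("user4", "pswd4", "Home"),
    ("user5", "pswd5", "Admin"),
])
def test_login_with_valid_data(username, password, expected_page):
    # EV vs AV comparison: the test passes when they match.
    assert authenticate(username, password) == expected_page
```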
Result Analysis and Bug Tracking
DPD ( Defect Profile Document )
- Defect ID : The defects identified are arranged serially and listed here.
- Test case ID : The number of the test case during which this defect was identified is mentioned here (an advantage of the cross reference matrix).
- Description : The brief description of the defect will be listed here.
- Steps for reproducibility : The steps that the tester followed to encounter the defect are mentioned here, in order to help the developers quickly reproduce and identify the bug.
- Submitter : The name of the test engineer who has submitted the defect will be mentioned here.
- Date of submission: The date on which the defect report was logged will be listed here
- Build No. : The corresponding build that was released from the developers to the testing department will be listed here
- Version no: The version no. to which the build belongs will be mentioned here.
- Assigned to : The development lead fills in the name of the developer to whom the defect is assigned.
- Severity : This field describes the seriousness of the defect from the tester's point of view, and can be classified into 4 types:
- Fatal / Sev1 / S1 / 1 : If the problem encountered is related to the unavailability of a functional feature, it prevents the testers from pursuing further testing and is hence rated Fatal. These defects are also called show-stopper defects. (Missing fields or features)
- Major / Sev2 / S2 / 2 : If the features are available and functional testing can be carried out but the results are not according to the expected values, the defects are termed major defects (e.g., an Add button displays 5 when 2 + 2 is entered; the function works, but not according to expectations).
- Minor / Sev3/ S3/ 3 : If the problems are related to the GUI or the Look and feel of the applications features, then these are treated as minor defects. ( inconsistent objects or spelling mistakes fall under this category.)
- Suggestions/ Sev4/ S4/ 4 : If the problems are related to the overall value of the application or can enhance the user friendliness of the application, after being rectified, then these defects are classified under the suggestions list.
- Priority : Priority describes the sequence in which the development team will look into the defects and arrange for them to be rectified. Priority can be classified into 4 types:
- Critical / P1
- High / P2
- Medium/ P3
- Low / P4
- Usually, in normal situations, the highest severity defects are given the highest priority and the lower ones accordingly, but sometimes, depending on the situation and on gaps between the development and testing teams' knowledge, the correlation may change.
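As a small illustration of the Severity and Priority fields, here is a sketch of a defect record with a default severity-to-priority mapping. Both the record layout and the default mapping are assumptions for this example; as noted above, the correlation can be changed by the development lead.

```python
from dataclasses import dataclass, field
from datetime import date

# Default mapping from severity to priority; real projects may override it.
DEFAULT_PRIORITY = {"S1": "P1", "S2": "P2", "S3": "P3", "S4": "P4"}

@dataclass
class Defect:
    defect_id: int
    test_case_id: int
    description: str
    steps_to_reproduce: list[str]
    submitter: str
    build_no: str
    severity: str                      # "S1".."S4"
    submitted_on: date = field(default_factory=date.today)
    assigned_to: str = ""
    priority: str = ""

    def __post_init__(self):
        # If no priority is set explicitly, derive it from the severity.
        if not self.priority:
            self.priority = DEFAULT_PRIORITY[self.severity]


d = Defect(1, 5, "Valid user1/pswd1 lands on Admin page instead of showing an error",
           ["Open login screen", "Enter user1 / pswd1", "Click Login"],
           "tester1", "B001", severity="S2")
print(d.priority)   # P2 by default; the development lead may reassign it
```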