Saturday, April 28, 2012

Software Testing Life Cycle - Reknolled


 

Source: Software Testing Life Cycle  by Vinayak Rao

 

Software Testing Theory Index

Part-1 Interview Based Theory (Index and Sub Index listed)
Automation testing. (Under revision, ETA 02/15/09)
QuickTest Professional (QTP) 9.0 VBScript. (Under revision, ETA 02/17/09)
Quality Center. (Unpublished draft, proofreading due. ETA ..02/09)
 

Software Testing Life Cycle

     This cycle can be understood by breaking it into stages, which are as follows:
 
    [Image: Planning is the first phase of testing (cartoonbank.com)]
  • Test Planning
  • Test development
  • Test execution
  • Result Analysis
  • Bug Tracking
  • Reporting

 

Test Planning

1) Plan : A plan is a strategic document which describes how to perform a task in an effective, efficient and optimized way. (Nrstt Reddy1)
 
2) Optimization : It is the process of utilizing the available input resources to the fullest and getting the maximum possible output.
 

Test Plan Document

 A test plan is a strategic document which describes how to perform testing on the application effectively.
 
     The test lead designs the Test Plan Document; its contents are described below.
1.0   Introduction
  1.1 Objective
      The purpose of the document is specified here.
     
  1.2 Reference Documents
      The list of all the documents referred to while preparing this document (e.g. the SRS and the Project Plan) will be listed in this section.
 
 
2.0   Coverage of Testing
  2.1 Features to be tested
      The list of all the features that are to be tested, based on the implicit and explicit requirements from the customer, will be mentioned in this section.

  2.2 Features not to be tested
      The list of all the features that can be skipped in the testing phase is mentioned here. Generally, out-of-scope features such as incomplete modules are listed here. If severity is low and time constraints are high, then low-risk features such as GUI elements or database style sheets are skipped. Features that are to be incorporated in the future are also kept out of testing temporarily.
                       
3.0   Test Strategy
  3.1 Levels of testing
      This is a project-level term which describes the testing procedures in an organization. All the levels to be performed, such as Unit, Module, Integration, System and UAT (User Acceptance Testing), are mentioned here.
  3.2 Types of testing
      All the various types of testing, such as compatibility and regression testing, are mentioned here along with the module divisions.
     
  3.3 Test design techniques
      The list of all the techniques that are followed and maintained in the company will be listed in this section. Two of the most used techniques are Boundary Value Analysis (BVA) and Equivalence Class Partitioning (ECP).
 

 Boundary Value Analysis: 

Whenever test engineers need to develop test cases for a range-type input, the best technique suggested is BVA.
      BVA says that whenever there is a range for an input, just concentrate on the boundaries and not on all the values in between. In other words, LB
      (Lower Bound) is the minimum of the range and UB (Upper Bound) is the maximum of the range, so the values that will be considered in the BVA technique are
      LB - 1, LB, LB + 1, UB - 1, UB, UB + 1. One MV (mid-range value) can be added if the range is lengthy [MV = (LB + UB) / 2].
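
As a rough illustration, the short Python sketch below generates the BVA candidate values for any given lower and upper bound; the helper name boundary_values is our own illustration, not a standard function.

# Minimal sketch: generate Boundary Value Analysis (BVA) candidate values
# for a range [lb, ub]. The helper name and structure are illustrative.
def boundary_values(lb, ub, include_mid=False):
    values = [lb - 1, lb, lb + 1, ub - 1, ub, ub + 1]
    if include_mid:
        values.append((lb + ub) // 2)   # MV = (LB + UB) / 2 for lengthy ranges
    return sorted(set(values))

# Example: the 4-20 character email field discussed below
print(boundary_values(4, 20, include_mid=True))   # [3, 4, 5, 12, 19, 20, 21]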


 Equivalence Class Partitioning:

When there are a large number of requirements for a single feature, ECP is very helpful. ECP is applied by dividing
      the inputs into different classes based on their properties and then developing the test cases from those classes.
     
      To understand these two techniques, let us take an example test case based on the requirements of a particular feature.
      Example: The Email ID field of a web application needs to be developed based on the following specifications or requirements:
      a)  The email text box field must accept a minimum of 4 characters and a maximum of 20 characters.
      b)  Only lowercase alphabets are allowed.
      c)  It should not accept special characters except @ and _.
     
      Once we understand the requirements, BVA = 3 characters (LB-1), 4 (LB), 5 (LB+1), 12 (MV), 19 (UB-1), 20 (UB), 21 (UB+1).
      ECP states that the data can be divided into different classes of inputs; these are determined based on the type of data that needs to be provided for
      testing the particular feature. In the above example, taking the BVA values into account, the ECP table can be divided into two categories:
      Valid and Invalid.
      Example: Valid = 4 char, 5 char, 12 char, 19 char, 20 char, lowercase a-z, @, _ etc.
      Example: Invalid = 3 char, 21 char, A-Z, all special characters except @ and _, 0 to 9, alphanumeric text, empty spaces, decimals etc.
      Based on these two classes of inputs, two tables (valid and invalid) can be created and used while writing the test cases.
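
To make the partitioning concrete, here is a minimal Python sketch that classifies a candidate Email ID value into the valid or invalid class based on the requirements above; the function name, the regular expression and the sample values are illustrative assumptions, not part of the original specification.

import re

# Minimal sketch of Equivalence Class Partitioning (ECP) for the email field:
# classify an input as belonging to the valid or the invalid class.
ALLOWED = re.compile(r"[a-z@_]{4,20}")   # 4-20 chars: lowercase letters plus @ and _ only

def ecp_class(value):
    return "valid" if ALLOWED.fullmatch(value) else "invalid"

valid_class   = ["abcd", "user_name", "a" * 20]            # 4-20 chars, allowed symbols only
invalid_class = ["abc", "a" * 21, "ABCD", "user#1", ""]    # too short/long, uppercase, #, empty

for sample in valid_class + invalid_class:
    print(repr(sample), "->", ecp_class(sample))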

  3.4 Configuration Management 
         All the documents that are generated during the testing process need to be updated simultaneously to keep the testers and developers aware of the proceedings. The naming conventions and the declaration of new version numbers for the software builds, based on the amount of change, are handled by the SCM (Software Configuration Management) team, and those details will be listed here.
    
  3.5 Test Metrics
           The list of all the tasks that need to be measured and maintained will be present here. Tracing back the exact requirement and test case depends on having the right metrics available at the right time.

  3.6 Terminology
            Testing-specific jargon for the project that will be used internally in the company will be mentioned in this section.

  3.7 Automation Plan
           The list of all the features or modules that are planned for automation testing will be mentioned here. The application undergoes automation testing only after being declared STABLE by the manual testing team.

  3.8 List of Automated tools
          The list of automation tools, like QTP, LoadRunner, WinRunner, etc., which will be used in this project will be mentioned here along with license details.

4.0   Base Criteria
   4.1 Acceptance Criteria
           The standards or metrics that need to be achieved by the testing team before declaring the product fit will be listed here. In other words, this section states when testing can be stopped before handing the product over to the customer.

  4.2 Suspension Criteria
          In high-risk projects, or huge projects that consist of several modules, it is necessary to minimize repetitive work in order to stay efficient. The situations in which testing needs to be suspended or temporarily halted will be listed in this section.
      
5.0   Test deliverables
         The list of all the documents that are to be prepared during the testing process will be mentioned in this section. All the copies of verification documents from each level are submitted to the customer along with the user manual and the product at the end of the project.

6.0   Test environment
          The environmental components and combinations that will be simulated for testing the product are kept as close as possible to the actual environment in which the end user will work with the product. All the environment details to be used for testing the application will be mentioned in this section.
         
7.0   Resource planning
          The roles to be performed, or in other words who has to do what, will be mentioned clearly here.

8.0   Scheduling
          The starting dates and ending dates for each and every task and module will be listed here in this section.

9.0   Staffing and Training
          How much staff is to be recruited and what kind of training is to be provided to accomplish this project successfully will be described here in a detailed fashion.

10.0  Risks and Contingencies
           The list of all the potential risks and the corresponding solution plans will be listed here :
           Risks : For example, resources may leave the organization, license or update deadlines may be missed, or the customer may change the requirements for testing or maintenance in the middle of the project.
           Contingencies : Maintaining bench strength, rechecking the initial stages of the whole process, and clearly listing and sharing the priorities for features to be tested and features to be skipped under time constraints.

11.0  Assumptions
           Some features or testing methods have to be covered mandatorily even though the customer does not mention them in the requirements document. These assumptions are listed in this section.

12.0  Approval Information
           As this document is published and circulated, the relevant and required authorities will approve the plan and will update this section with necessary details like date and department etc.
 

Test development stage

We will use examples extensively to discuss the entire test development phase in this article. Please note that these are standard methods and approaches towards testing and can vary slightly from company to company (but the core remains the same).
 

Arguably, this is the most important stage of the testing life cycle. In this phase, the testers develop the test cases based on the requirements of the customer. There are usually three levels of requirements to be understood by the testers before they can proceed to write the test cases for the product:

  • HLI (High Level Information)
  • LLI / Use Cases (Low Level Information)
  • Snapshots (prototype or images of a similar product or framework)
Use Case: These are snippets created by the Business Analyst to describe the functionality of certain features of an application. A use case briefly states the roles of the actors, and the actions and responses that are required to be covered in test cases before executing them on the product or software.
 
[Image: Snapshot of the Login Screen (a prototype can be provided)]
Example 1: Login Screen of an Application (input information required to prepare the use cases)
Functional requirements collected by the Business Analyst (HLI):
 
  • Login screen should contain username, password, connect to fields, Login, Clear and Cancel buttons.
  • Connect to field should not be a mandatory field but it must allow the user to connect to a database whenever he requires.
  • Upon entering the valid username, valid password and clicking on Login button, the corresponding page according to the level of user ( admin, member, guest etc) must be displayed.
  • Upon entering some information into any fields and clicking on Clear button, all the fields must be cleared and the cursor must be placed in the username field.
  • Upon clicking on the Cancel button, the login screen must close.
Additionally, the requirements can be classified based on their implicit or explicit natures.
 
Implicit Requirements : Sometimes the customer is unaware of the finer details and provides only a rough requirements list; in that case the business analyst produces a list of requirements on his own to improve the value of the product.
 
Explicit Requirements : Requirements that are demanded by the customer fall into this category. These requirements always receive priority in testing and cannot be listed under "Features not to be tested" in the Test Plan document.
 
Example 1 Contd...
Special Requirements / Validations / Business rules and standards.
  • Initially whenever the login screen is invoked the Login and Clear buttons must be disabled.
  • Cancel button must be always enabled.
  • Upon entering some information into any of the fields the Clear button must be enabled.
  • Upon entering any information into the username and password field the Login button must be enabled.
  • Tabbing order (hitting Tab on the keyboard should highlight the fields in the specified sequence): Username, Password, Connect to, Login, Clear and Cancel.
Example 2 : Use case template and document for the login screen application.
 
The use case template includes the following fields: Name, Description, Actors involved, Special requirements, Pre conditions, Post conditions and Flow of Events.
 

Use Case Document

Name of the Use Case   : Login Screen

Brief Description            : This document describes the functionality of Login screen.

Actors Involved             : Normal users, Administrators
 
Special Requirements    : Implicit and explicit requirements listed below.
 

Implicit requirements

  • Initially whenever the login screen is invoked the cursor must be available in the username field.
  • Upon entering an invalid username, a valid password and clicking Login, the following message must be displayed: "Invalid username. Please try again."
  • Upon entering a valid username, an invalid password and clicking Login, the following message must be displayed: "Invalid password. Please try again."
  • Upon entering an invalid username, an invalid password and clicking Login, the following message must be displayed: "Invalid username/password. Please try again."

Explicit requirements 

  • Initially whenever the login screen is invoked, the Login and Clear button must be disabled.
  • Cancel button must be always enabled.
  • Upon entering information in any field, Clear button must be enabled.
  • Upon entering username and password details, Login button should be enabled.
  • Tabbing order must be Username, Password, Connect to, Login, Clear and Cancel.
 
Pre Conditions              : Login screen must be available
 
Post Conditions            : Either homepage or admin page for valid users and error message
                                        for invalid users must be displayed.
 
Flow of Events             : There are two flows to the application behavior and responses. 
                                      a) Main flow
                                      b) Alternate flow.
 
(Sometimes diagrams/flowcharts are available to depict the flows in use cases, but here we will use a table and jot down the requirements as listed below.)
Example 2 contd.....
 
                                    Actions                                         Responses
 
1) Actor invokes the application             :    Application displays the screen with the following
                                                                      fields, username, password, connect to, login, clear
                                                                      and cancel.
 
2) Actor enters valid username, valid    :    Authenticates, application displays either homepage
    password & clicks on Login                     or Admin page depending on the actors level.
 
3) Actor enters valid username , valid   :    Authenticates, application displays either homepage
    password & selects a database, clicks      or admin depending on the level of actor and the 
    Login                                                        mentioned database connection.
 
4) Actor enters invalid username and    :    Go to Alternative Flow Table 1.1
    valid password and clicks on Login
 
5) Actor enters valid username and        :    Go to Alternative Flow Table 1.2
    invalid password and clicks Login    
 
6) Actor enters invalid username, and    :    Go to Alternative Flow Table 1.3
    invalid password and clicks Login
 
7) Actor enters some info and clicks       :     Go to Alternative Flow Table 1.4
    the Clear button
 
8) Actor clicks cancel on the screen        :     Go to Alternative Flow Table 1.5
 
(The table has been split into two divisions, one for the main flow and one for the alternative flows, for better understanding. There is a lot of documentation and methodology involved in testing when an established CMMI-level or ISO-certified company is involved.)
Example 2 contd........
 
Alternative Flow Table 1.1 ( Invalid username )
Response : Authenticates , Application displays the following message " Invalid username, Please Try Again. "
 
Alternative Flow Table 1.2 ( Invalid Password )
Response : Authenticates, Application displays the following message. " Invalid Password, Please Try Again."
 
Alternative Flow Table 1.3 ( Invalid Username and Password)
Response : Authenticates, Application displays the following message. " Invalid Username/ Password, Please Try Again."
 
Alternative Flow Table 1.4 (clicking the Clear button)
Response : All the fields are cleared and the cursor is placed in the username field.
 
Alternative Flow Table 1.5 ( clicking Cancel Button)
Response : Login screen is closed and the application exits.
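
The main flow and the alternative flows above can also be expressed as a small Python sketch; the user table, page names and function below are hypothetical placeholders built from this example, not an actual implementation.

# Minimal sketch of the login main flow and alternative flows 1.1-1.3.
# The user table and page names are hypothetical example data.
USERS = {"user1": ("pswd1", "Admin page"), "user2": ("pswd2", "Homepage")}

def login(username, password):
    known_user = username in USERS
    if known_user and USERS[username][0] == password:
        return USERS[username][1]                                  # main flow: homepage or admin page
    if not known_user and any(pwd == password for pwd, _ in USERS.values()):
        return "Invalid username, Please Try Again."               # Alternative Flow 1.1
    if known_user:
        return "Invalid Password, Please Try Again."               # Alternative Flow 1.2
    return "Invalid Username/Password, Please Try Again."          # Alternative Flow 1.3

print(login("user1", "pswd1"))   # Admin page
print(login("wrong", "pswd1"))   # Invalid username, Please Try Again.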
 

Guidelines for Test engineer

( After the Use Case Document is given to the test engineer by the business analyst.)
 
  • Identify the module to which the use case belongs. In our example, the login screen use case generally belongs to the security module.
  • Identify the functionality of the use case with respect to the total functionality. example: Authentication for login screen
  • All the Look and Feel ( GUI ) related test cases need to be written by the Test engineers directly even if the HLI, LLI and snapshots are not available. ( Important )
  • Identify the functional points and prepare the Functional Points Document ( FPD ).
  • Identify the actors involved in the use case, whether normal users, administrators, etc.
  • Identify the inputs required to perform testing, valid and invalid inputs need to be identified with respect to the functionality of the features.
  • Identify whether this use case is linked with any other use case, such as the homepage, admin page or database connections page, to confirm authentication.
  • Identify the Pre conditions and ensure that the build version released is the correct one and can be used for executing the test cases hereafter. 

Methodology

  • Understand the Main flow of the application. Usually all the valid inputs and normal actions of a valid user fall under the main flow.
  • Understand the alternative flow. Generally, spontaneous and unpredictable inputs from actors, such as invalid entries and interchanging the order of inputs, will create different responses from the application. Such scenarios fall under the alternative flow.
  • Understand the special requirements. Most of the time, default settings and standards are developed by the coders, but in some cases the customer may request a non-conventional approach to the application's behavior. In such cases, understanding the requirement and creating a proper test case is very crucial.
  • Documentation is very important and all the documents need to be created separately. The different versions and reference tables make the test cases lengthy, but they help later in tracing any defect back to the requirements by using the Traceability Matrix.
  • The Functional Points Document (FPD) is maintained in order to understand the features that need to be tested and the features that can be skipped. A point where the user can perform some action on the application is called a functional point.

Chronology of Documents in Testing

The following flowchart diagram indicates the linking of all the documents based on the login screen example.  
Each document acts as a prerequisite for the next one.
[Image 1.2: Testing Documentation Chronology]
 

Traceability Matrix / Cross Reference Matrix

It is a document that contains tables of linking information and is used for tracing back the route of the test development activities. There are many documents that need to be maintained by the test engineers and the business analysts in order to complete the inputs in these matrix tables. In case any confusing or questionable circumstance arises in the future, this TM document can be referred to in order to find the root cause.
 
 UCD | FPD | TSD | TCD | DPD
  1  |  3  |  4  | 25  |  1
(Complete Traceability Matrix (TM): provides cross references through all the documents.)
(The numbers denote the example serial numbers from the corresponding documents: UCD (Use Case Document), FPD (Functional Points Document), TSD (Test Scenarios Document), TCD (Test Case Document) and DPD (Defect Profile Document) respectively. Sometimes different traceability matrix tables are maintained according to the standards of the company, for example:)
 
 UC id | TC id
  7.1  |   5
  7.1  |   6
(Requirements Traceability Matrix (RTM) for linking final test case IDs to use case IDs)
 
 DPD id | TCD id
    2   |    5
(Defects Traceability Matrix (DTM) for linking defects to the test case IDs for easy reference)
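
As a rough sketch of how such matrices can be kept machine-readable, the Python snippet below links use case IDs to test case IDs (RTM) and defect IDs to test case IDs (DTM); the IDs mirror the example tables above and are purely illustrative.

# Minimal sketch of a Requirements Traceability Matrix (RTM) and a
# Defects Traceability Matrix (DTM) kept as simple dictionaries.
rtm = {"7.1": ["5", "6"]}   # use case id -> test case ids (RTM)
dtm = {"2": "5"}            # defect id   -> test case id  (DTM)

def trace_defect(defect_id):
    """Trace a defect back to its test case and the originating use case."""
    tc_id = dtm[defect_id]
    uc_id = next(uc for uc, tcs in rtm.items() if tc_id in tcs)
    return defect_id, tc_id, uc_id

print(trace_defect("2"))   # ('2', '5', '7.1')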
 
 

Test Execution Phase

This is the phase where the test engineers prepare and execute the test cases. One of the best techniques is to refer to the use cases, pick up the standard test case templates and prepare test cases based on different categories. Do remember to maintain the Traceability Matrix alongside test execution to avoid rework and confusion at a later stage.
 
The test cases can be divided into three types: GUI test cases, functional test cases and non-functional test cases.

Guidelines to be followed when preparing GUI test cases

  • Check for availability of all the objects on the application
  • Check for alignment of all the objects, even if the customer does not specify it in the requirements
  • Check for consistency of the objects (color, appearance, resolution, spelling, etc.)
  • Any feature that can be tested just by observation, or any defect that can be caught simply by looking at the screen and pointing it out during the development stage, falls under GUI test cases.

Functional test cases

  • The functional test cases can be classified into two categories: +ve test cases and -ve test cases. +ve test cases are written for the steps the user follows in order to perform the functions that the feature is supposed to do; in other words, the main flow of the application is tested with +ve test cases using valid data and inputs.
  • -ve test cases, on the other hand, are used to test the irregular and abnormal actions that the end user may perform on the application's functionality. At least one set of invalid input test data is required to produce a -ve test case (see the sketch below).
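
Below is a minimal Python sketch of one +ve and one -ve test case for the login feature; the login() stub stands in for the application under test and its data is hypothetical.

# Minimal sketch of a +ve and a -ve functional test case.
def login(username, password):          # stand-in for the application under test
    valid = {"user1": "pswd1"}
    if valid.get(username) == password:
        return "Admin page"
    return "Invalid Password, Please Try Again."

def test_login_positive():              # +ve case: valid input, main flow
    assert login("user1", "pswd1") == "Admin page"

def test_login_negative():              # -ve case: invalid input, alternative flow
    assert login("user1", "wrong") == "Invalid Password, Please Try Again."

test_login_positive()
test_login_negative()
print("Both test cases passed")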
 

Non Functional test cases

  • The test cases that are prepared to test the application's stability, load or performance related aspects fall under non-functional test cases. We will discuss performance and load testing in detail in other chapters of this knol. Please refer to the complete theory index above to navigate to other sections.
 

Test Case Template

The test case template is used to create the Test Case Document easily and effectively. A test case template has the following fields:
  • Objective : The purpose of this test case will be mentioned here.
  • Project Name : The code name of the project or the product name will be mentioned here which will be specific to the company policies and may vary from organisation to organisation.
  • Module name : The particular module the test case belongs to (for example, the Login screen) will be mentioned here.
  • Author/ Prepared by : The test engineer, lead and other relevant names of people will be listed here who are responsible for the end document.
  • Test scenarios : Based on the FPD the features are shortlisted that need to be tested and the scenarios and possible combinations will be listed in this section.
  • Revision history : Team leads and test managers have to review and approve the test document before the testers can proceed to execute the tests. The relevant authorised signatures and timestamps will be present in this section.
Test Case Document : Fields Explained
  • Test case ID : The serial numbers of the test cases are listed here.
  • Req/Ref ID : The reference serial numbers or IDs from the use cases will be listed here to create a cross reference matrix for the corresponding test case.
  • TC Type : The type of test case, whether GUI, +ve or -ve, will be listed here.
  • Description : The details of the action that the test engineer needs to perform on the features will be mentioned here clearly.
  • Test Data : In order to perform functionality testing, it is very important to test the features with a range of data, and also to approach them with different techniques such as BVA and ECP. To keep the test case document tidy, test engineers prefer to create linking tables of test data in a separate document or to provide the input data at the end of the test case document. The test data can be divided into two categories, valid and invalid input data, in order to write +ve and -ve test cases efficiently.
  • EV : Expected value after performing the action will be listed here
  • AV : The actual behavior of the application after executing the test case will be recorded and listed here.
  • Result : The comparison of EV and AV will be done and the result, either PASS or FAIL will be mentioned here in this section accordingly.
  • Priority : A priority is assigned to each test case based on how much it affects the testers' ability to continue executing further test cases on the application. In other words, sometimes simple defects can create navigational blocks and prevent testers from accessing entire areas of the application; such defects receive high priority, while GUI-related defects typically receive low priority.
  • Build No : The version of the build that is released by the development team will be listed here.
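
A minimal Python sketch of how these fields can be captured for a single test case record is given below; the class, field names and sample values are illustrative, not a prescribed format.

from dataclasses import dataclass

# Minimal sketch of one row of a Test Case Document, using the fields above.
@dataclass
class TestCase:
    tc_id: int
    ref_id: str           # Req/Ref ID from the use case (cross reference)
    tc_type: str          # "GUI", "+ve" or "-ve"
    description: str
    test_data: str
    expected_value: str   # EV
    actual_value: str     # AV, recorded during execution
    priority: str
    build_no: str

    @property
    def result(self):     # PASS when EV and AV match, otherwise FAIL
        return "PASS" if self.expected_value == self.actual_value else "FAIL"

tc = TestCase(1, "7.1", "+ve", "Login with valid credentials",
              "user1 / pswd1", "Admin page", "Admin page", "P2", "B001")
print(tc.tc_id, tc.result)   # 1 PASS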

The following Excel sheet is an example of a standard test case document for our login screen functionality. Some results are PASS and some are FAIL, in order to make the Bug Reporting and Result Analysis chapters easier to understand.

                                         
Login Table 1(a)
 Sr.No | Object Name         | Object Type
   1   | Username / Password | Textbox
   2   | Connect To          | Listbox/Combobox
   3   | Login               | Button
   4   | Clear               | Button
   5   | Cancel              | Button
 
Valid Inputs table 1 (b)
 Sr.No | Username | Pswd  | EV    | AV
   1   | user1    | pswd2 | Admin | Admin
   2   | user2    | pswd2 | Home  | Home
   3   | user3    | pswd3 | Home  | Home
   4   | user4    | pswd4 | Home  | Home
   5   | user5    | pswd5 | Admin | Admin
 
Invalid Inputs table 1 (c)
 Sr.No | Username | Password | Error Message (EV) | AV
   1   | user1    | pswd1    | IVUPTA             | Admin
   2   | NA       | pswd1    | IVUPTA             | IVUPTA
   3   | user2    | NA       | IVPPTA             | IVPPTA
   4   | NA       | NA       | IVUPPTA            | IVUPPTA
(IVUPTA   - Invalid username, Please try again)
(IVPPTA   - Invalid password, Please try again)
(IVUPPTA  - Invalid username/password, Please try again)
 

Result Analysis and Bug Tracking

After the successful execution of the test cases, the tester compares the expected values with the actual values and declares the result as PASS or FAIL.
 
Bug Tracking and Reporting : A very important step is to update the DPD (Defect Profile Document) and let the developers know about the defects.
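
As a minimal sketch (assuming a hypothetical list of already-executed test cases), the comparison and the hand-off to the DPD might look like this:

# Minimal sketch of result analysis: compare EV with AV for each executed
# test case and collect the failures so they can be logged in the DPD.
executed = [
    {"tc_id": 1, "ev": "Admin", "av": "Admin"},
    {"tc_id": 6, "ev": "IVUPTA", "av": "Admin"},   # the failing case from table 1(c)
]

failed_tc_ids = []
for tc in executed:
    tc["result"] = "PASS" if tc["ev"] == tc["av"] else "FAIL"
    if tc["result"] == "FAIL":
        failed_tc_ids.append(tc["tc_id"])

print(failed_tc_ids)   # [6] -> these test case ids are logged in the Defect Profile Document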
 

DPD ( Defect Profile Document )

The fields in the Defect Profile Document are as follows:
  • Defect ID : The defects identified are arranged serially and listed here.
  • Test case ID : The corresponding test case number from which this defect was identified will be mentioned here (one of the advantages of the cross reference matrix).
  • Description : The brief description of the defect will be listed here.
  • Steps to reproduce : The steps that the tester followed to encounter the defect will be mentioned here, in order to help the developers quickly identify the bug.
  • Submitter : The name of the test engineer who has submitted the defect will be mentioned here.
  • Date of submission: The date on which the defect report was logged will be listed here
  • Build No. : The corresponding build that was released from the developers to the testing department will be listed here
  • Version no: The version no. to which the build belongs will be mentioned here.
  • Assigned to : The development lead will fill in the name of the developer to whom the defect is assigned.
  • Severity : This field describes the seriousness of the defect from the tester's point of view. It can be classified into 4 types:
    • Fatal / Sev1 / S1 / 1 : If the problems encountered are related to the unavailability of a functional feature, such defects prevent the testers from pursuing further testing and hence are rated Fatal. These defects are sometimes also called show-stopper defects (e.g. missing fields or features).
    • Major / Sev2 / S2 / 2 : If the features are available and functional testing can be carried out, but the results are not according to the expected values, then these defects are termed major defects (e.g. an add button displays 5 when 2 + 2 is entered; the function works, but not according to expectations).
    • Minor / Sev3 / S3 / 3 : If the problems are related to the GUI or the look and feel of the application's features, these are treated as minor defects (inconsistent objects or spelling mistakes fall under this category).
    • Suggestions / Sev4 / S4 / 4 : If the problems are related to the overall value of the application, or rectifying them would enhance the user-friendliness of the application, these defects are classified under the suggestions list.
  • Priority : Priority describes the sequence in which the development team will look into the defects and take them up for rectification. Priority can be classified into 4 types:
    • Critical / P1
    • High / P2
    • Medium/ P3
    • Low / P4
  • Usually, the highest severity defects are given the highest priority and the lowest severity the lowest, but sometimes, depending on the situation and the gaps between the developers' and the testing team's understanding, the correlation may change.
For example : Whenever there is a customer visit on short notice, all the GUI cases (minor defects) will receive Critical or High priority from the developers. Similarly, if some part of the application is to be released to the testing team at a later stage in another build, the testers consider the missing functionality Fatal, but the developers will keep it assigned at Low priority.
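
A minimal Python sketch of one DPD entry, using the fields and the severity/priority scales described above (the class name, field names and sample values are illustrative assumptions):

from dataclasses import dataclass, field
from datetime import date

SEVERITY = {1: "Fatal", 2: "Major", 3: "Minor", 4: "Suggestion"}
PRIORITY = {1: "Critical", 2: "High", 3: "Medium", 4: "Low"}

# Minimal sketch of one Defect Profile Document (DPD) entry.
@dataclass
class Defect:
    defect_id: int
    test_case_id: int
    description: str
    steps_to_reproduce: list
    submitter: str
    build_no: str
    severity: int                        # 1 (Fatal) .. 4 (Suggestion)
    priority: int                        # 1 (Critical) .. 4 (Low)
    assigned_to: str = ""
    date_of_submission: date = field(default_factory=date.today)

d = Defect(1, 6, "Invalid credentials are accepted and the Admin page is shown",
           ["Open the login screen", "Enter user1 / pswd1", "Click Login"],
           "tester1", "B001", severity=2, priority=2)
print(SEVERITY[d.severity], PRIORITY[d.priority])   # Major High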
 
Below is the Defect Profile Document example based on our login screen test case execution. Again, it is to be noted that the template and description styles may change from company to company.

 

For a detailed explanation of Bug Reporting and Result Analysis, please visit the Bug Life Cycle chapter from the index above.
This brings us to the closure of the Software Testing Life Cycle and of our login screen test case execution example.
 
