Monday, April 30, 2012

Computer Graphics - Knol Book

Camera 2.0


Camera 2.0 provides additional capabilities beyond those of existing digital cameras. Representative new techniques include high dynamic range imaging, flash/no-flash imaging, coded aperture and coded exposure imaging, photography under structured illumination, multi-perspective and panoramic stitching, digital photomontage, all-focus imaging, and light field imaging.

Camera 2.0 is still under development in research laboratories.

The Camera 2.0 project is a Stanford project that began as a collaboration between the Stanford Computer Graphics Laboratory and the Nokia Research Center Palo Alto Laboratory. It has further received funding from Adobe Systems, Kodak, Hewlett-Packard, The Walt Disney Company, Intel, Texas Instruments, Google, NVIDIA, Sony, and the NSF.

http://graphics.stanford.edu/projects/camera-2.0/


_____________________

Stanford researchers offer open software for cameras (2010)
http://www.youtube.com/watch?v=stHNUO6-PfE
_____________________

Frankencamera 2010
http://www.youtube.com/watch?v=FHILNF6wT3g
_____________________

Frankencamera - Assembly - May 2011 - Technology Review video

_____________________

    Short urls

    http://knol.google.com/k/-/-/2utb2lsm2k7a/5473

    Narayana Rao - 22 Aug 2011

    Cloud Computing - News and Knols

    Short urls

    http://knol.google.com/k/-/-/2utb2lsm2k7a/5504

    Narayana Rao - 31 Aug 2011

    DART - A New Programming Language for Structured Web Programming



    11.9.2011

    Google engineers Lars Bak and Gilad Bracha will make presentations on the new programming language, DART, a language for structured web programming, at the international software development conference 2011 at Aarhus during 10-12 October 2011.

    Announcement of the presentations at the conference


    News items on the presentation

    http://news.cnet.com/8301-30685_3-20103843-264/google-to-debut-dart-a-new-language-for-the-web/

    http://www.informationweek.com/news/development/web/231601140

    Related Information

    Programming language rankings
    http://www.tiobe.com/index.php/content/paperinfo/tpci/index.html


    Original knol - 5658

    Computer Architecture and Organization Video Lecture 17 to 27 Processor

    Short urls

    http://knol.google.com/k/-/-/2utb2lsm2k7a/4667

    Narayana Rao - 01 Jun 2011

    Computer Architecture and Organization Video Lecture 28 to 32 Memory

    Short urls

    http://knol.google.com/k/-/-/2utb2lsm2k7a/4668

    Narayana Rao - 01 Jun 2011

    Computer Architecture and Organization Video Lecture 33 to 37 Input Output System

     
     
    Note: I/O performance is also important.

    Short urls

    http://knol.google.com/k/-/-/2utb2lsm2k7a/4669

    http://goo.gl/uXbzb

    Narayana Rao - 01 Jun 2011

    Computer Architecture and Organization Video Lecture 38 Concluding Remarks

    NPTEL Lecture Series, IIT Delhi


    Short urls

    http://knol.google.com/k/-/-/2utb2lsm2k7a/4670


    http://goo.gl/COs9J

    Narayana Rao - 02 Jun 2011

    Von Neumann Architecture and Non-Von Neumann Architecture of Computers



    In the 1946 paper "Preliminary Discussion of the Logical Design of an Electronic Computing Instrument," written with Arthur W. Burks and Herman H. Goldstine, von Neumann outlined the basis for a general-purpose computing device. The ideas in it were profound and had a great impact on the subsequent development of such machines.

    Von Neumann's design eventually led to the construction of the EDVAC computer in 1952.

    In his "Preliminary Discussion," von Neumann described the general-purpose computing machine as one containing four main "organs," identified as relating to arithmetic, memory, control, and connection with the human operator. These correspond today to the arithmetic logic unit, the memory, the control unit, and the input-output devices.
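
    As a rough illustration (not from the source), the interplay of these four organs in a stored-program machine can be sketched as a tiny fetch-decode-execute loop over a single memory holding both instructions and data; the toy instruction set below is invented for the example.

# Illustrative sketch of a von Neumann machine: one memory holds both
# program and data; the control unit fetches, decodes, and executes.
# The instruction set and its encoding are invented for this example.

def run(memory):
    acc = 0          # arithmetic "organ": a single accumulator
    pc = 0           # control "organ": the program counter
    while True:
        opcode, operand = memory[pc]      # fetch from the shared memory
        pc += 1
        if opcode == "LOAD":              # decode and execute
            acc = memory[operand]
        elif opcode == "ADD":
            acc += memory[operand]
        elif opcode == "STORE":
            memory[operand] = acc
        elif opcode == "PRINT":           # "connection with the human operator"
            print(acc)
        elif opcode == "HALT":
            return memory

# Program (addresses 0-4) and data (addresses 5-7) share one memory,
# which is the defining feature of the architecture.
memory = {
    0: ("LOAD", 5), 1: ("ADD", 6), 2: ("STORE", 7),
    3: ("PRINT", None), 4: ("HALT", None),
    5: 2, 6: 3, 7: 0,
}
run(memory)   # prints 5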



    Sources
    http://www.csupomona.edu/~hnriley/www/VonN.html
     

    Computer Engineering - Products and Projects - Knol Book


    Knol Books -  The concept

    Short urls

    http://knol.google.com/k/-/-/2utb2lsm2k7a/4337

    Narayana Rao - 23 Apr 2011

    Data Structures - Knol Book - Online Book

     
     
    Lists
        Linear Lists - Array Representation
        Linear Lists - Linked Representation
        Linear Lists - Simulated Pointers
    Arrays
    Matrices
        Sparse Matrices
    Stacks
    Queues
    Skip Lists and Hashing
    Binary and Other Trees
    Priority Queues
    Tournament Queues
    Binary Search Trees
    Balanced Search Trees
    Graphs

    Short urls

    http://knol.google.com/k/-/-/2utb2lsm2k7a/3666

    Narayana Rao - 25 Dec 2010

    Sunday, April 29, 2012

    SOFTWARE TESTING & QUALITY ASSURANCE - Mumbai University Syllabus and Related Knols


    Prerequisite: Software Engineering

    Objectives:

    This course equips the students with a solid understanding of:
    • Practices that support the production of quality software
    • Software testing techniques
    • Life-cycle models for requirements, defects, test cases, and test results
    • Process models for units, integration, system, and acceptance testing
    • Quality Models

    1. Introduction: Software Quality, Role of testing, verification and validation,
    objectives and issues of testing, Testing activities and levels, Sources of Information
    for Test Case Selection, White-Box and Black-Box Testing, Test Planning and
    Design, Monitoring and Measuring Test Execution, Test Tools and Automation, Test
    Team Organization and Management.
    2. Unit Testing: Concept of Unit Testing, Static Unit Testing, Defect Prevention,
    Dynamic Unit Testing, Mutation Testing, Debugging, Unit Testing in eXtreme
    Programming.
    3. Control Flow Testing: Outline of Control Flow Testing, Control Flow Graph, Paths
    in a Control Flow Graph, Path Selection Criteria, All-Path Coverage Criterion ,
    Statement Coverage Criterion, Branch Coverage Criterion, Predicate Coverage
    Criterion, Generating Test Input, Examples of Test Data Selection.
    4. Data Flow Testing: Data Flow Anomaly, Overview of Dynamic Data Flow Testing,
    Data Flow Graph, Data Flow Terms, Data Flow Testing Criteria, Comparison of Data
    Flow Test Selection Criteria, Feasible Paths and Test Selection Criteria, Comparison
    of Testing Techniques.
    5. System Integration Testing: Concept of Integration Testing, Different Types of
    Interfaces and Interface Errors, Granularity of System Integration Testing, System
    Integration Techniques, Software and Hardware Integration, Test Plan for System
    Integration, Off-the-Shelf Component Integration, Off-the-Shelf Component Testing,
    Built-in Testing
    6. System Test Categories: Basic Tests, Functionality Tests, Robustness Tests,
    Interoperability Tests, Performance Tests, Scalability Tests, Stress Tests, Load and
    Stability Tests, Reliability Tests, Regression Tests, Documentation Tests.
    7. Functional Testing: Equivalence Class Partitioning, Boundary Value Analysis,
    Decision Tables, Random Testing, Error Guessing, Category Partition.
    8. System Test Design: Test Design Factors, Requirement Identification,
    Characteristics of Testable Requirements, Test Design Preparedness Metrics, Test
    Case Design Effectiveness
    9. System Test Planning And Automation: Structure of a System Test Plan,
    Introduction and Feature Description, Assumptions, Test Approach, Test Suite
    Structure, Test Environment, Test Execution Strategy, Test Effort Estimation,
    Scheduling and Test Milestones, System Test Automation, Evaluation and Selection
    of Test Automation Tools, Test Selection Guidelines for Automation, Characteristics
    of Automated Test Cases, Structure of an Automated Test Case, Test Automation
    Infrastructure
    10. System Test Execution: Preparedness to Start System Testing, Metrics for Tracking
    System Test, Metrics for Monitoring Test Execution, Beta Testing, First Customer
    Shipment, System Test Report, Product Sustaining, Measuring Test Effectiveness.
    11. Acceptance Testing: Types of Acceptance Testing, Acceptance Criteria, Selection of
    Acceptance Criteria, Acceptance Test Plan, Acceptance Test Execution, Acceptance
    Test Report, Acceptance Testing in eXtreme Programming.
    12. Software Quality: Five Views of Software Quality, McCall’s Quality Factors and
    Criteria, Quality Factors, Quality Criteria, Relationship between Quality Factors and
    Criteria, Quality Metrics, ISO 9126 Quality Characteristics, ISO 9000:2000 Software
    Quality Standard, ISO 9000:2000 Fundamentals, ISO 9001:2000 Requirements.
    Text Book
    1. “Software Testing and Quality Assurance: Theory and Practice”, Sagar Naik and Piyu Tripathy, Wiley, 2008.
    References:
    1. “Effective Methods for Software Testing”, William Perry, Wiley.
    2. “Software Testing - A Craftsman’s Approach”, Paul C. Jorgensen, CRC Press, 1995.
    3. “The Art of Creative Destruction”, Rajnikant Puranik, SPD.
    4. “Software Testing”, Srinivasan Desikan and Gopalaswamy Ramesh, Pearson Education, 2006.
    5. “Introducing Software Testing”, Louis Tamres, Addison Wesley, First Edition.
    6. “Software Testing”, Ron Patton, SAMS Techmedia Indian Edition, Pearson Education, 2001.
    7. “The Art of Software Testing”, Glenford J. Myers, John Wiley & Sons, 1979.
    8. “Testing Object-Oriented Systems: Models, Patterns, and Tools”, Robert V. Binder, Addison Wesley, 2000.
    9. “Software Testing Techniques”, Boris Beizer, 2nd Edition, Van Nostrand Reinhold, 1990.
    10. “Software Quality Assurance”, Daniel Galin, Pearson Education.


    Term Work:
    Term work shall consist of at least 10 experiments covering all topics and one written
    test.
    Distribution of marks for term work shall be as follows:
    5. Laboratory work (Experiments and Journal) 15 Marks
    6. Test (at least one) 10 Marks

    __________________________________________________________________________________

    Search Link for Software Testing Knols

    __________________________________________________________________________________

    Short urls

    http://knol.google.com/k/-/-/2utb2lsm2k7a/5732
    Narayana Rao - 21 Sep 2011

    Drupal - Free Open Source CMS Platform

    Overviews, How to install Drupal, Construction Kits

    Short urls

    http://knol.google.com/k/-/-/2utb2lsm2k7a/5741

    Narayana Rao - 22 Sep 2011

    Computer Science - Computer Engineering - Information Technology - Knol Collections

    Short urls

    http://knol.google.com/k/-/-/2utb2lsm2k7a/5742

    Narayana Rao - 22 Sep 2011

    Artificial Intelligence - YouTube Video Series - Collection

    IGNOU, O'Reilly, LISP Language

      Information Security - IT Security - Knol Book








      Knol search link for Information Security



      Short urls
      http://knol.google.com/k/-/-/2utb2lsm2k7a/5823

      Narayana Rao - 08 Oct 2011

      Dennis Ritchie - Co-Creator of UNIX, Co-Inventor of the C Programming Language

      1941-2011

       

      Biography of Dennis Ritchie

       
       
      Biography by Ritchie himself
       
       
       
      "I aided Ken Thompson in creating the Unix operating system."
       
      "Early in the development of Unix, I added data types and new syntax to Thompson's B language, thus producing the new language C. "
       
       

      Home page of Dennis Ritchie on Bell Labs Website

       
       
      Contains links to some of his articles and presentations on UNIX and C.
       
      The C Family of Languages: Interview with Dennis Ritchie, Bjarne Stroustrup, and James Gosling
      This article appeared in Java Report, 5(7), July 2000 and C++ Report, 12(7), July/August 2000.
       


      Articles on Dennis Ritchie


      Economic Times, India, 18.10.2011
      Dennis Ritchie, whose creation is in your iPhone
      http://economictimes.indiatimes.com/tech/software/dennis-ritchie-the-other-man-inside-your-iphone-who-created-unix/articleshow/10395985.cms

      News of Death of Dennis Ritchie

       
       
       
      Google plus post by Rob Pike
       
       
       
       

      Professor John McCarthy - Father of Artificial Intelligence


      5.11.2011

      Professor John McCarthy, hailed as the father of artificial intelligence, passed away at the age of 84.
      http://www.dailymail.co.uk/news/article-2053617/Professor-John-McCarthy-Father-artificial-intelligence-dies-aged-84.html

      What is artificial intelligence?
      An article by John McCarthy
      http://www-formal.stanford.edu/jmc/whatisai/
      It is an elaborate introductory article on the subject.

      Programming the Social Computer - Lecture by Prof.David Robertson - Video




      http://www.youtube.com/watch?v=27L970r8J9U

      Prof. David Robertson is the Head of the School of Informatics at the University of Edinburgh.

      Final Projects - Engineering Courses




      Guidelines for Final Year Engineering Students

      Computer Science

      Software Engineering

      Final-year project guidelines for a software engineering department.
      http://www.uop.edu.jo/download/PdfCourses/se/SE_INTRODUCTION.pdf



      Mechanical Engineering
      http://www.seminarseason.com/category/mechanical-engineering-seminar-topics
      http://www.seminarseason.com/category/mechanical-engineering-seminars
      http://www.seminarseason.com/category/mechanical-seminar-topics


       http://www.seminarseason.com/


      Engineering Economics - Final Project guidelines
      http://www.cbe.csueastbay.edu/~alima/courses/3140/TermProject/TermProjectReportPreparation.pdf
      Knol - 5005

      Saturday, April 28, 2012

      The Great Indian Programming League


       
       
       
      Show your programming talent
       
       
      Participate today.
      Knol - 5029

      Software Testing Life Cycle - Reknolled


       

      Source: Software Testing Life Cycle  by Vinayak Rao

       

      Software Testing Theory Index

      Part 1: Interview-Based Theory (index and sub-index listed)
      Automation testing (under revision, ETA 02/15/09)
      QuickTest Professional (QTP) 9.0 VBScript (under revision, ETA 02/17/09)
      Quality Center (unpublished draft, proofreading due, ETA ..02/09)
       

      Software Testing Life Cycle

           This cycle can be understood by categorizing the stages in it. They are as follows:
       
        (Image: Planning is the first phase of testing. Source: Cartoonbank.com)
      • Test Planning
      • Test Development
      • Test Execution
      • Result Analysis
      • Bug Tracking
      • Reporting

       

      Test Planning

      1) Plan: A plan is a strategic document which describes how to perform a task in an effective, efficient, and optimized way. (Nrstt Reddy1)
       
      2) Optimization: The process of utilizing the available input resources to their fullest and getting the maximum possible output.
       

      Test plan Document

       A test plan is a strategic document which describes how to perform testing on the application effectively.
       
           The test lead will design the Test Plan document. Its contents are described below.
      1.0   Introduction
        1.1 Objective
            The purpose of the document is specified here.
           
        1.2 Reference Documents
            The list of all the documents that are referred to in preparing this document (SRS and Project Plan) will be listed in this section.
       
       
      2.0   Coverage of Testing
        2.1 Features to be tested
            The list of all the features that are to be tested, based on the implicit and explicit requirements from the customer, will be mentioned in this section.

        2.2 Features not to be tested
            The list of all the features that can be skipped in the testing phase is mentioned here. Generally, out-of-scope features such as incomplete modules are listed here. If severity is low and time constraints are high, then low-risk features such as GUI or database style sheets are skipped. Features that are to be incorporated in the future are also kept out of testing temporarily.
                             
      3.0   Test Strategy
        3.1 Levels of testing
            This is a project-level term which describes the testing procedures in an organization. All the levels to be performed, such as unit, module, integration, system, and UAT (User Acceptance Testing), are mentioned here.
        3.2 Types of testing
            All the various types of testing, such as compatibility testing, regression testing, etc., are mentioned here along with module divisions.
           
        3.3 Test design techniques
            The list of all the techniques that are followed and maintained in the company will be listed in this section. Two of the most used techniques are Boundary Value Analysis (BVA) and Equivalence Class Partitioning (ECP).
       

       Boundary Value Analysis: 

      Whenever the test engineers need to develop test cases for a range-type input, the suggested technique is BVA.
            BVA says that whenever there is a range for an input, just concentrate on the boundaries and not all the values in between. In other words, LB
            (lower bound) is the minimum of the range and UB (upper bound) is the maximum, so the values considered in the BVA technique are
            LB, LB + 1, LB - 1, UB, UB + 1, and UB - 1. One MV or mid-range value can be added if the range is lengthy [MV = (LB + UB) / 2]. A small sketch of this rule follows.
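
      A minimal sketch (not part of the original knol) of the BVA rule just described, generating boundary values for a numeric range; the function name and the example range of 4 to 20 are assumptions made for illustration.

# Generate BVA test values for a range [lb, ub], per the rule above.
def bva_values(lb, ub, include_mid=False):
    values = {lb - 1, lb, lb + 1, ub - 1, ub, ub + 1}
    if include_mid:                      # optional mid-range value MV
        values.add((lb + ub) // 2)
    return sorted(values)

print(bva_values(4, 20, include_mid=True))
# -> [3, 4, 5, 12, 19, 20, 21]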


       Equivalence Class Partitioning:

      When there are many requirements for a single feature, ECP is very helpful. ECP is applied by dividing
            the inputs into different classes based on their properties and then developing the test cases.
           
            To understand these two techniques, we need to take an example test case based on the requirements of a particular feature.
            Example: The Email ID field of a web application needs to be developed to the following specifications or requirements:
            a)  The length of the email text box field must be a minimum of 4 characters and a maximum of 20 characters.
            b)  Only lowercase alphabetic characters are allowed.
            c)  It should not accept special characters except @ and _.
           
            Once we understand the requirements, BVA = 3 characters (LB-1), 4 (LB), 5 (LB+1), 12 (MV), 19 (UB-1), 20 (UB), 21 (UB+1).
            ECP states that the data can be divided into different classes of inputs; these can be determined based on the type of data that needs to be provided for
            testing any particular feature. In the above example, by considering the BVA, the ECP table can be divided into two categories:
            valid and invalid.
            Example: Valid = 4 char, 5 char, 12 char, 19 char, 20 char, a-z, @, _, a_5#, etc.
            Example: Invalid = 3 char, 21 char, A-Z, all special characters except @ and _, 0 to 9, alphanumeric texts, empty spaces, decimals, etc.
            Based on these two types of inputs, you can create two tables, one for valid classes and one for invalid classes, and derive the test cases from them, as in the sketch below.
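
            A minimal sketch (not from the original knol) of how the valid and invalid equivalence classes for the hypothetical Email ID field could be recorded and sampled; the class names and the sample values are assumptions for illustration.

# Equivalence classes for the example Email ID field, one representative
# test input per class (values invented for illustration).
equivalence_classes = {
    "valid": {
        "length 4-20 chars": "abcd",
        "lowercase letters": "user",
        "contains @ or _":   "user_name",
    },
    "invalid": {
        "shorter than 4 chars": "abc",
        "longer than 20 chars": "a" * 21,
        "uppercase letters":    "USER",
        "digits":               "user99",
        "other special chars":  "user#%",
        "empty input":          "",
    },
}

def representative_inputs(classes):
    """One test input per equivalence class, labelled with its class name."""
    return [(kind, name, value)
            for kind, members in classes.items()
            for name, value in members.items()]

for kind, name, value in representative_inputs(equivalence_classes):
    print(f"{kind:7s} | {name:22s} | {value!r}")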

        3.4 Configuration Management 
               All the documents that are generated during the testing process need to be updated simultaneously to keep the testers and developers aware of the proceedings. The naming conventions, and the declaration of new version numbers for the software builds based on the amount of change, are handled by the SCM (Software Configuration Management) team, and the details will be listed here.
          
        3.5 Test Metrics
                 The list of all the tasks that need to be measured and maintained will be present here. Tracing back from a test case to the exact requirement depends on the availability of the right metrics at the right time.
 
        3.6 Terminology
                  Testing-specific jargon for the project that will be used internally in the company will be mentioned here in this section.
 
        3.7 Automation Plan
                 The list of all the features or modules that are planned for automation testing will be mentioned here. The application only undergoes automation testing after being declared stable by the manual testing team.
 
        3.8 List of Automated Tools
                The list of automated tools, like QTP, LoadRunner, WinRunner, etc., which will be used in this project will be mentioned along with license details.

      4.0   Base Criteria
         4.1 Acceptance Criteria
                 The standards or metrics that need to be achieved by the testing team before declaring the product fit will be listed here. In other words, before handover to the customer, the point at which testing can be stopped is specified here in this section.
 
        4.2 Suspension Criteria
                In high-risk projects, or huge projects that consist of several modules, it is necessary to minimize repetitive work in order to stay efficient. The situations in which testing needs to be suspended or temporarily halted will be listed here in this section.
            
      5.0   Test deliverables
               The list of all the documents that are to be prepared during the testing process will be mentioned here in this section. Copies of the verification documents from each level are submitted to the customer along with the user manual and product at the end of the project.
 
      6.0   Test environment
                The environmental components and combinations simulated for testing the product should be kept as close as possible to the actual environment in which the end user will work with the product. All the details of the environment to be used for testing the application will be mentioned here in this section.
               
      7.0   Resource planning
                The roles to be performed, or in other words who has to do what, will be mentioned clearly here.
 
      8.0   Scheduling
                The starting and ending dates for each task and module will be listed here in this section.
 
      9.0   Staffing and Training
                How much staff is to be recruited and what kind of training is to be provided to accomplish this project successfully will be described here in detail.
 
      10.0  Risks and Contingencies
                 The list of all the potential risks and the corresponding solution plans will be given here:
                 Risks: for example, resources may leave the organization, license and update deadlines may be missed, or the customer may change the requirements in terms of testing or maintenance in the middle of the project.
                 Contingencies: maintaining bench strength, rechecking the initial stages of the whole process, and clearly listing and sharing the importance and priority settings for features to be tested and features to be skipped under time constraints.
 
      11.0  Assumptions
                 Some features or testing methods have to be covered mandatorily even though the customer does not mention them in the requirements document. These assumptions are listed here in this section.
 
      12.0  Approval Information
                 As this document is published and circulated, the relevant and required authorities will approve the plan and will update this section with the necessary details such as date and department.
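
      The outline of sections 1.0 to 12.0 above can be kept as a simple data structure that a team fills in per project; the sketch below (not part of the original knol) only restates the section headings already listed.

# Skeleton of the test plan document described in sections 1.0-12.0.
test_plan = {
    "1.0 Introduction":           ["1.1 Objective", "1.2 Reference Documents"],
    "2.0 Coverage of Testing":    ["2.1 Features to be tested", "2.2 Features not to be tested"],
    "3.0 Test Strategy":          ["3.1 Levels of testing", "3.2 Types of testing",
                                   "3.3 Test design techniques", "3.4 Configuration Management",
                                   "3.5 Test Metrics", "3.6 Terminology",
                                   "3.7 Automation Plan", "3.8 List of Automated Tools"],
    "4.0 Base Criteria":          ["4.1 Acceptance Criteria", "4.2 Suspension Criteria"],
    "5.0 Test deliverables":      [],
    "6.0 Test environment":       [],
    "7.0 Resource planning":      [],
    "8.0 Scheduling":             [],
    "9.0 Staffing and Training":  [],
    "10.0 Risks and Contingencies": [],
    "11.0 Assumptions":           [],
    "12.0 Approval Information":  [],
}

# Print the outline so the team can fill in each section per project.
for section, subsections in test_plan.items():
    print(section)
    for sub in subsections:
        print("   ", sub)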
       

      Test development stage

      We will use examples extensively to discuss the entire test development phase in this article. Please note that these are just standard methods and approaches to testing and can vary slightly from company to company. (But the core remains the same.)
       
 
      Arguably, this is the most important stage of the testing life cycle. In this phase, the testers will develop the test cases against the requirements of the customer. There are usually three levels of requirements to be understood by the testers before they can proceed to write the test cases for the product:
 
      • HLI (High Level Information)
      • LLI / Use Cases (Low Level Information)
      • Snapshots (prototype or images of a similar product or framework)
      Use Case: These are snippets created by the business analyst to describe the functionality of certain features of an application. A use case briefly states the roles of the actors, the actions, and the responses that are required to be covered by test cases before executing them on the product or software.
       
      (Image: snapshot of the login screen; a prototype can be provided.)
      Example 1: Login screen of an application (input information required to prepare the use cases)
      Functional requirements collected by the business analyst (HLI):
       
      • Login screen should contain username, password, connect to fields, Login, Clear and Cancel buttons.
      • Connect to field should not be a mandatory field but it must allow the user to connect to a database whenever he requires.
      • Upon entering the valid username, valid password and clicking on Login button, the corresponding page according to the level of user ( admin, member, guest etc) must be displayed.
      • Upon entering some information into any fields and clicking on Clear button, all the fields must be cleared and the cursor must be placed in the username field.
      • Upon clicking on the Cancel button, the login screen must close.
      Additionally, the requirements can be classified based on their implicit or explicit natures.
       
      Implicit Requirements: Sometimes the customer is unaware of the finer details and provides a rough requirements list; in that case the business analyst produces a list of requirements on his own to improve the value of the product.
       
      Explicit Requirements: Requirements that are demanded by the customer fall in this category. These requirements always receive priority in testing and cannot be listed under "Features not to be tested" in the Test Plan document.
       
      Example 1 contd...
      Special requirements / validations / business rules and standards:
      • Initially, whenever the login screen is invoked, the Login and Clear buttons must be disabled.
      • The Cancel button must always be enabled.
      • Upon entering some information into any of the fields, the Clear button must be enabled.
      • Upon entering any information into the username and password fields, the Login button must be enabled.
      • Tabbing order (hitting Tab on the keyboard should highlight the fields in the specified sequence): Username, Password, Connect to, Login, Clear, and Cancel.
      Example 2: Use case template and document for the login screen application.
       
      A use case template includes the following fields: Name, Description, Actors Involved, Special Requirements, Preconditions, Postconditions, and Flow of Events.
       

      Used Case Document

      Name of the Use Case   : Login Screen

      Brief Description            : This document describes the functionality of the login screen.

      Actors Involved             : Normal users, Administrators
       
      Special Requirements    : Implicit and explicit requirements listed below.
       

      Implicit requirements

      • Initially whenever the login screen is invoked the cursor must be available in the username field.
      • Upon entering invalid username, valid password and clicking Login, the following message must be displayed  " Invalid username. Please try again."
      • Upon entering valid username, invalid password and clicking Login, the following message must be displayed. " Invalid password. Please try again."
      • Upon entering invalid username, invalid password and clicking on Login, the following message must be displayed. " Invalid username/password Please try again."

      Explicit requirements 

      • Initially whenever the login screen is invoked, the Login and Clear button must be disabled.
      • Cancel button must be always enabled.
      • Upon entering information in any field, Clear button must be enabled.
      • Upon entering username and password details, Login button should be enabled.
      • Tabbing order must be Username, Password, Connect to, Login, Clear and Cancel.
       
      Pre Conditions              : Login screen must be available
       
      Post Conditions            : Either homepage or admin page for valid users and error message
                                              for invalid users must be displayed.
       
      Flow of Events             : There are two flows to the application behavior and responses. 
                                            a) Main flow
                                            b) Alternate flow.
       
      (Sometimes diagrams or flowcharts are available to depict the flows in use cases, but we will use a table and jot down the requirements efficiently, as listed below.)
      Example 2 contd.....
       
      Actions and Responses (Main Flow)
       
      1) Actor invokes the application.
         Response: Application displays the screen with the following fields: username, password, connect to, Login, Clear, and Cancel.
       
      2) Actor enters a valid username and a valid password and clicks Login.
         Response: Authenticates; application displays either the homepage or the Admin page depending on the actor's level.
       
      3) Actor enters a valid username and a valid password, selects a database, and clicks Login.
         Response: Authenticates; application displays either the homepage or the Admin page depending on the actor's level and the selected database connection.
       
      4) Actor enters an invalid username and a valid password and clicks Login.
         Response: Go to Alternative Flow Table 1.1.
       
      5) Actor enters a valid username and an invalid password and clicks Login.
         Response: Go to Alternative Flow Table 1.2.
       
      6) Actor enters an invalid username and an invalid password and clicks Login.
         Response: Go to Alternative Flow Table 1.3.
       
      7) Actor enters some information and clicks the Clear button.
         Response: Go to Alternative Flow Table 1.4.
       
      8) Actor clicks Cancel on the screen.
         Response: Go to Alternative Flow Table 1.5.
       
      (The table has been split into two parts, one for the main flow and one for the alternative flows, for better understanding. There is a lot of documentation and methodology involved in testing when an established CMMI-level or ISO-certified company enters the arena.)
      Example 2 contd........
       
      Alternative Flow Table 1.1 ( Invalid username )
      Response : Authenticates , Application displays the following message " Invalid username, Please Try Again. "
       
      Alternative Flow Table 1.2 ( Invalid Password )
      Response : Authenticates, Application displays the following message. " Invalid Password, Please Try Again."
       
      Alternative Flow Table 1.3 ( Invalid Username and Password)
      Response : Authenticates, Application displays the following message. " Invalid Username/ Password, Please Try Again."
       
      Alternative Flow Table 1.4 ( clicking the Clear button)
      Response : All the fields are cleared and the cursor is placed in the username field.
       
      Alternative Flow Table 1.5 ( clicking Cancel Button)
      Response : Login screen is closed and the application exits.
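
      As a minimal illustration (not part of the original knol), the main flow and Alternative Flow Tables 1.1 to 1.3 above can be mirrored by a small login handler; the user store, the page names, and the rule that a password counts as "valid" when it belongs to any known user are assumptions made for the sketch.

# Illustrative only: responses mirroring the main flow and Tables 1.1-1.3.
USERS = {"user1": "pswd1", "user2": "pswd2"}   # username -> password (invented data)
PAGES = {"user1": "Admin", "user2": "Home"}    # page shown after a successful login

def login(username, password):
    """Return the page or the message the application should display."""
    valid_user = username in USERS
    if valid_user and USERS[username] == password:
        return PAGES[username]                                # main flow
    if not valid_user and password in USERS.values():
        return "Invalid username, Please Try Again."          # Table 1.1
    if valid_user:
        return "Invalid Password, Please Try Again."          # Table 1.2
    return "Invalid Username/Password, Please Try Again."     # Table 1.3

print(login("user1", "pswd1"))   # Admin
print(login("userX", "pswd1"))   # Invalid username, Please Try Again.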
       

      Guidelines for Test engineer

      (After the Use Case Document is given to the test engineer by the business analyst.)
       
      • Identify the module to which the use case belongs. In our example the login screen use case generally belongs to the security module.
      • Identify the functionality of the use case with respect to the total functionality. Example: authentication for the login screen.
      • All the look-and-feel (GUI) related test cases need to be written by the test engineers directly, even if the HLI, LLI, and snapshots are not available. (Important)
      • Identify the functional points and prepare the Functional Points Document (FPD).
      • Identify the actors involved in the use case, whether normal user, administrator, etc.
      • Identify the inputs required to perform testing; valid and invalid inputs need to be identified with respect to the functionality of the features.
      • Identify whether this use case is linked with any other use case, such as the homepage, admin page, or database connections page, to confirm authentication.
      • Identify the preconditions and ensure that the released build version is the correct one and can be used for executing the test cases hereafter. 

      Methodology

      • Understand the main flow of the application. Usually all the valid inputs and normal actions of a valid user fall under the main flow.
      • Understand the alternative flow. Generally, spontaneous and unpredictable inputs from actors, such as invalid entries and interchanging the order of inputs, will create different responses from the application. Such scenarios fall under the alternative flow.
      • Understand the special requirements. Most of the time default settings and standards are developed by the coders, but in some cases the customer may request a non-conventional approach with respect to the application's behavior. In such cases, understanding the requirement and creating a proper test case is crucial.
      • Documentation is very important and all the documents need to be created separately. The different versions and reference tables make the test cases lengthy but will help later to trace any defect back to the requirements by using the Traceability Matrix.
      • The Functional Points Document (FPD) is to be maintained in order to understand the features that need to be tested and the features that need to be skipped. A point where the user can perform some action on the application is called a functional point.

      Chronology of Documents in Testing

      The following flowchart indicates the linking of all the documents based on the login screen example.
      Each document acts as a prerequisite for the next one.
      (Image 1.2: Testing Documentation Chronology)
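
      In place of the missing flowchart, here is a minimal sketch (not from the original knol) of the document chronology as an ordered list; the ordering follows the traceability matrix columns described in the next section.

# Each document acts as a prerequisite for the next one (order as in the TM columns).
chronology = [
    "Use Case Document (UCD)",
    "Functional Points Document (FPD)",
    "Test Scenarios Document (TSD)",
    "Test Case Document (TCD)",
    "Defect Profile Document (DPD)",
]
for earlier, later in zip(chronology, chronology[1:]):
    print(f"{earlier} -> {later}")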
       

      Traceability Matrix / Cross Reference Matrix

      It is a document that contains tables of linking information and is used for tracing back the route of test development activities. There are many documents that need to be maintained by the test engineers and the business analysts in order to complete the entries in these matrix tables. In case any confusing or questionable circumstance arises in the future, this TM document can be referred to in order to find the root cause.
       
       UCD   FPD   TSD   TCD   DPD
        1     3     4     25    1
      (Complete Traceability Matrix (TM): provides a cross reference through all the documents.)
      (The numbers denote the example serial numbers from the corresponding documents: UCD (Use Case Document), FPD (Functional Points Document), TSD (Test Scenarios Document), TCD (Test Case Document), and DPD (Defect Profile Document), respectively. Sometimes different traceability matrix tables are maintained according to the standards of the company, for example:)
       
       UC id   TC id
        7.1      5
        7.1      6
      (Requirements Traceability Matrix (RTM) for linking final test case IDs to use case IDs.)
       
       DPD id   TCD id
         2        5
      (Defects Traceability Matrix (DTM) for linking defects to the test case IDs for easy reference.)
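
      A minimal sketch (not part of the original knol) of how such matrices might be kept programmatically and used to trace a defect back to its use case; the dictionary layout and the ID formats are assumptions for illustration.

# RTM maps use case IDs to test case IDs; DTM maps defect IDs to test case IDs.
rtm = {"7.1": ["TC-5", "TC-6"]}          # Requirements Traceability Matrix
dtm = {"DPD-2": "TC-5"}                  # Defects Traceability Matrix

def trace_defect(defect_id):
    """Trace a defect back to the use case(s) its test case covers."""
    test_case = dtm[defect_id]
    return [uc for uc, tcs in rtm.items() if test_case in tcs]

print(trace_defect("DPD-2"))   # ['7.1']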
       
       

      Test Execution Phase

      This is the phase in which the test engineers prepare and execute the test cases. One of the best techniques is to refer to the use cases, pick up the standard test case templates, and prepare test cases based on different categories. Do remember to maintain the traceability matrix alongside test execution to avoid rework and confusion at a later stage.
       
      The test cases can be divided into three types:
      • GUI test cases

      Guidelines to be followed when preparing GUI test cases

      • Check for availability of all the objects on the application
      • Check for alignment of all the objects, even though customer does not specify them in the requirements
      • Check for consistency of the objects ( Color, appearance, resolution, spelling etc )
      • And any such feature that can be tested just by observing or a defect that can be avoided by just looking and pointing out in the development stage will fall under GUI test cases.

      Functional test cases

      • The functional test cases can be classified into two categories: +ve test cases and -ve test cases. +ve test cases are written for the steps that the user follows in order to perform the functions that the feature is supposed to do; in other words, the main flow of the application can be tested with +ve test cases using valid data and inputs.
      • -ve test cases are used to test irregular and abnormal (but anticipated) actions by the end user on the application's functionality. At least one invalid input in the test data is required to produce a -ve test case. A small sketch of both kinds follows.
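
      A minimal sketch (not part of the original knol) of one +ve and several -ve test cases for the login example, written with pytest; the login() helper, its user data, and the messages are invented for illustration.

import pytest

USERS = {"user1": "pswd1", "user2": "pswd2"}

def login(username, password):
    """Toy implementation under test: returns a page name or an error message."""
    if USERS.get(username) == password:
        return "Home"
    return "Invalid Username/Password, Please Try Again."

# +ve test case: valid data exercising the main flow
def test_login_with_valid_credentials():
    assert login("user1", "pswd1") == "Home"

# -ve test cases: each uses at least one invalid input
@pytest.mark.parametrize("username,password", [
    ("userX", "pswd1"),   # invalid username
    ("user1", "wrong"),   # invalid password
    ("", ""),             # empty fields
])
def test_login_rejects_invalid_credentials(username, password):
    assert "Invalid" in login(username, password)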
       

      Non Functional test cases

      • The test cases that are prepared to test the application's stability, load, or performance-related features fall under the non-functional test cases. We will discuss performance and load testing details in other chapters of this knol. Please refer to the complete theory index above to navigate to other sections.
       

      Test Case Template

      A test case template is used to create the test case document easily and effectively. A test case template contains the following fields:
      • Objective : The purpose of this test case will be mentioned here.
      • Project Name : The code name of the project or the product name will be mentioned here; it is specific to company policies and may vary from organisation to organisation.
      • Module Name : The particular module the test case belongs to (for example, the login screen) will be mentioned here.
      • Author / Prepared By : The test engineer, lead, and other relevant people responsible for the end document will be listed here.
      • Test Scenarios : Based on the FPD, the features that need to be tested are shortlisted, and the scenarios and possible combinations will be listed in this section.
      • Revision History : Team leaders and test managers have to review and approve the test document before testers can proceed to execute the tests. The relevant authorised signatures and timestamps will be present in this section.
      Test Case Document: Fields Explained
      • Test Case ID : The serial numbers of the test cases are listed here.
      • Req/Ref ID : The reference serial numbers or IDs from the use cases will be listed here to create a cross reference matrix for the corresponding test case.
      • TC Type : The type of test case, whether GUI, +ve, or -ve, will be listed here.
      • Description : The details of the action that the test engineer needs to perform on the features will be mentioned here clearly.
      • Test Data : In order to perform functionality testing, it is very important to test the features with a range of data, and also to apply different techniques such as BVA and ECP. To keep the test case document tidy, test engineers prefer to create linking tables of test data in a separate document or to provide the input data at the end of the test case document. The test data can be divided into two categories, valid and invalid inputs, in order to write +ve and -ve test cases efficiently.
      • EV : The expected value after performing the action will be listed here.
      • AV : The actual behavior of the application after executing the test case will be recorded and listed here.
      • Result : EV and AV will be compared and the result, either PASS or FAIL, will be mentioned here accordingly.
      • Priority : A priority is assigned to each test case based on how much it affects the testers' ability to continue executing further test cases on the application. Sometimes simple defects can create navigational blocks and prevent testers from accessing entire features of the application; such defects receive high priority, while GUI-related defects receive low priority.
      • Build No : The version of the build that is released by the development team will be listed here.
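
      A minimal sketch (not part of the original knol) of a single test case row using the fields above, with invented values for the login screen example.

# One row of a test case document, represented as a dictionary.
test_case = {
    "Test case ID": "TC-5",
    "Req/Ref ID":   "UC-7.1",
    "TC Type":      "+ve",          # GUI, +ve, or -ve
    "Description":  "Enter valid username and password and click Login",
    "Test Data":    {"username": "user1", "password": "pswd1"},
    "EV":           "Admin page is displayed",   # expected value
    "AV":           None,                        # filled in during execution
    "Result":       None,                        # PASS or FAIL after comparing EV and AV
    "Priority":     "High",
    "Build No":     "B-1.0",
}

def record_result(tc, actual_value):
    """Fill in AV and derive Result by comparing it with EV."""
    tc["AV"] = actual_value
    tc["Result"] = "PASS" if actual_value == tc["EV"] else "FAIL"
    return tc

print(record_result(test_case, "Admin page is displayed")["Result"])   # PASS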

      The following spreadsheet is an example of a standard test case for our login screen functionality. Some results are PASS and some are FAIL in order to make the bug reporting and result analysis chapters easier to understand.

                                               
      Login Table 1(a)
       Sr.No.   Object Name            Object Type
       1        Username / Password    Textbox
       2        Connect To             Listbox/Combobox
       3        Login                  Button
       4        Clear                  Button
       5        Cancel                 Button
       
      Valid Inputs Table 1(b)
       Sr.No.   Username   Pswd     EV      AV
       1        user1      pswd2    Admin   Admin
       2        user2      pswd2    Home    Home
       3        user3      pswd3    Home    Home
       4        user4      pswd4    Home    Home
       5        user5      pswd5    Admin   Admin
       
      Invalid Inputs Table 1(c)
       Sr.No.   Username   Password   Error Message (EV)   AV
       1        user1      pswd1      IVUPTA               Admin
       2        NA         pswd1      IVUPTA               IVUPTA
       3        user2      NA         IVPPTA               IVPPTA
       4        NA         NA         IVUPPTA              IVUPPTA
      (IVUPTA  - Invalid username, Please try Again)
      (IVPPTA  - Invalid Password, Please try Again)
      (IVUPPTA - Invalid username/password, Please try Again)
       

      Result Analysis and Bug Tracking

      After the successful execution of the test cases, the tester will compare the expected values with the actual values and declare the result as pass or fail.
       
      Bug Tracking and Reporting: A very important stage is to update the DPD (Defect Profile Document) and let the developers know of the defects.
       

      DPD ( Defect Profile Document )

      The fields in the defect profile document are as follows:
      • Defect ID : The defects identified are numbered serially and listed here.
      • Test case ID : The corresponding test case number for this defect is identified and mentioned here (an advantage of the cross reference matrix).
      • Description : A brief description of the defect will be listed here.
      • Steps to reproduce : The steps that the tester followed to encounter the defect will be mentioned here, in order to assist the developers in quickly identifying the bug.
      • Submitter : The name of the test engineer who submitted the defect will be mentioned here.
      • Date of submission : The date on which the defect report was logged will be listed here.
      • Build No. : The corresponding build that was released by the developers to the testing department will be listed here.
      • Version No. : The version number to which the build belongs will be mentioned here.
      • Assigned to : The development lead will fill in the name of the developer to whom the defect is assigned.
      • Severity : This field describes the seriousness of the defect from the tester's point of view, and can be classified into 4 types:
        • Fatal / Sev1 / S1 / 1 : If the problems encountered are related to the unavailability of a functional feature, such defects prevent testers from pursuing further testing and hence are rated Fatal. Sometimes these defects are also called show-stopper defects. (Missing fields or features.)
        • Major / Sev2 / S2 / 2 : If the features are available and functional testing can be carried out, but the results are not according to the expected value, then these defects are termed major defects (e.g., an Add button displays 5 when 2 + 2 is entered, in which case the function works but not according to expectations).
        • Minor / Sev3 / S3 / 3 : If the problems are related to the GUI or the look and feel of the application's features, then these are treated as minor defects. (Inconsistent objects or spelling mistakes fall under this category.)
        • Suggestions / Sev4 / S4 / 4 : If the problems are related to the overall value of the application, or could enhance the user-friendliness of the application after being rectified, then these defects are classified under the suggestions list.
      • Priority : Priority describes the sequence in which the development team will look into the defects and arrange to rectify them. Priority can be classified into 4 types:
        • Critical / P1
        • High / P2
        • Medium / P3
        • Low / P4
      • Usually, in normal situations, the highest-severity defects will be given the highest priority and the lowest accordingly, but sometimes, depending on the situation and on gaps between the developers' and the testing team's knowledge, the correlation may change.
      For example: Whenever there is a customer visit at short notice, all the GUI cases (minor defects) will receive Critical or High priority from the developers. Similarly, if some part of the application is to be released to the testing team at a later stage in another build, the testers consider the missing functionality as Fatal, but the developers will keep it assigned a Low priority.
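
      A minimal sketch (not part of the original knol) of one defect record using the DPD fields above; the values are invented, loosely following the login screen example in which an invalid login still opened the Admin page.

# One defect record for the DPD, with illustrative (invented) values.
defect = {
    "Defect ID":          "DPD-1",
    "Test case ID":       "TC-5",
    "Description":        "Invalid username/password combination opens the Admin page",
    "Steps to reproduce": [
        "Open the login screen",
        "Enter username 'user1' and password 'pswd1'",
        "Click Login",
    ],
    "Submitter":          "Test engineer name",
    "Date of submission": "2012-04-28",     # placeholder date
    "Build No.":          "B-1.0",
    "Version No.":        "1.0",
    "Assigned to":        None,             # filled in by the development lead
    "Severity":           "Major",          # S2: feature works, but not as expected
    "Priority":           "P2",             # High, set by the development team
}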
       
      Below is an example Defect Profile document based on our login screen test case document execution. Again, it is to be noted that the template and description styles may change from company to company.

       

      For a detailed explanation of bug reporting and result analysis, please visit the Bug Life Cycle chapter from the index above.
      This brings us to the closure of the Software Testing Life Cycle, and of the example of our login screen test case execution.
       
      Reknolled under
       
       
       
      Knol Number 5053