Course Registration System

 

Test Evaluation Summary

For the

Architectural Prototype

 

Version 1.0

Revision History

Date            Version   Description                               Author
21/March/1999   1.0       Architectural prototype test evaluation   C. Smith

Table of Contents

  1. Introduction
  2. Test Results Summary
  3. Test Coverage
  4. Code Coverage
  5. Defect Analysis
  6. Suggested Actions
  7. Diagrams

  1. Introduction

    1.1 Purpose

    This Test Evaluation Report describes the results of the C-Registration Architectural Prototype tests in terms of test coverage (both requirements-based and code-based coverage) and defect analysis (i.e. defect density).

    1.2 Scope

    This Test Evaluation Report applies to the C-Registration Architectural Prototype. The tests conducted are described in the Test Plan for the Prototype [5]. This Evaluation Report is used to:

      • assess the acceptability and appropriateness of the performance behavior of the prototype
      • assess the acceptability of the tests
      • identify improvements to increase test coverage and/or test quality

    1.3 References

    Applicable references are:

        1. Course Registration System Glossary, WyIT406, V2.0, 1999, Wylie College IT.
        2. Course Registration System Software Development Plan, WyIT418, V1.0, 1999, Wylie College IT.
        3. Course Registration System Iteration Plan, Elaboration Iteration #E1, WyIT420, V1.0, 1999, Wylie College IT.
        4. Course Registration System Integration Build Plan for the Architectural Prototype, WyIT430, V1.0, 1999, Wylie College IT.
        5. Course Registration System Test Plan for the Architectural Prototype, WyIT432, V1.0, 1999, Wylie College IT.
  2. Test Results Summary

    The test cases defined in the Test Suite for the Prototype were executed following the test strategy defined in the Test Plan [5].

    Test coverage (see Section 3 below), in terms of covering the use cases and test requirements defined in the Test Plan [5], was complete.

    Code coverage is described in Section 4 and was not considered a significant measure of success for the prototype.

    Analysis of the defects (see Section 5 below) indicates that there are significant performance problems accessing the legacy Course Catalog System. The performance and loading tests that involved read or write access to the Course Catalog System fell well below the established targets. The Management Team will assign systems engineering resources to further evaluate these test results and to determine design alternatives.

  3. Test Coverage

    The tests performed on the prototype, along with their completion criteria, are defined in Section 5.1 of the Test Plan [5]. The test coverage results are as follows:

    Ratio of Test Cases Performed = 40/40 = 100%

    Ratio of Test Cases Successful = 30/40 = 75%

    The areas of testing with the highest failure rates were:

      • performance tests involving access to the Course Catalog System
      • load tests involving access to the Course Catalog System

    Further detail on test coverage is available using Rational RequisitePro and the Prototype Test Case matrix.
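
    The sketch below illustrates how the coverage ratios and failure areas above could be recomputed from a simple export of the test case matrix. It is illustrative only: the record fields, the status values, and the results.csv file name are assumptions, not artifacts of the actual Prototype Test Case matrix or of Rational RequisitePro.

      from collections import Counter
      import csv

      def coverage_summary(rows):
          """Compute performed/successful ratios and failure counts per test area.

          Each row is expected to look like:
              {"test_case": "TC-017", "area": "Course Catalog access", "status": "fail"}
          where status is one of "pass", "fail", or "not run".
          """
          planned = len(rows)
          performed = sum(1 for r in rows if r["status"] in ("pass", "fail"))
          successful = sum(1 for r in rows if r["status"] == "pass")
          failures_by_area = Counter(r["area"] for r in rows if r["status"] == "fail")
          return {
              "performed_ratio": performed / planned if planned else 0.0,
              "successful_ratio": successful / planned if planned else 0.0,
              "failures_by_area": failures_by_area.most_common(),
          }

      if __name__ == "__main__":
          # Hypothetical CSV export of the test case matrix; column names are assumed.
          with open("results.csv", newline="") as f:
              summary = coverage_summary(list(csv.DictReader(f)))
          print(f"Ratio of test cases performed:  {summary['performed_ratio']:.0%}")
          print(f"Ratio of test cases successful: {summary['successful_ratio']:.0%}")
          for area, count in summary["failures_by_area"]:
              print(f"  {count} failure(s) in: {area}")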

     

  4. Code Coverage

    Rational Visual PureCoverage was used to measure code coverage of the Prototype tests.

    Ratio of LOC executed = 12,874 / 48,916 (approximately 26%)

    Approximately 26% of the code was executed during testing. This coverage was judged adequate for the prototype tests because all interfaces were thoroughly exercised. Later iterations will require a significantly higher measure of code coverage.

  5. Defect Analysis

    This section summarizes the results of the defect analysis generated using Rational ClearQuest. Section 6 recommends actions to address the findings of the defect analysis.

    5.1 Defect Density

    Data on defect density has been generated from ClearQuest reports. Section 7 of this document includes charts that illustrate:

      • Defects by Severity Level (critical, high, medium, low)
      • Defect Source (the component in which the problem or fault resides)
      • Defect Status (logged, assigned, fixed, tested, closed)

    The Defects by Severity Level chart shows that 4 critical and 4 high-priority defects were logged. Detailed analysis of the defect logs shows that the critical and high-priority defects are all associated with the performance and loading problems accessing the legacy Course Catalog System. (Note: chart not included.)

    The Defect Source chart shows that an unusually high percentage of defects reside in the System Interface components.

    The Defect Status chart shows that many defects remain in the logged state and have not yet been assigned for analysis.
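
    As a rough illustration of how these three breakdowns could be tabulated, the sketch below counts defect records by severity, source component, and status. The record fields and sample values are assumptions for illustration; they do not reflect the actual ClearQuest schema or the Prototype's defect log.

      from collections import Counter

      # Hypothetical defect records; field names and values are illustrative only.
      defects = [
          {"id": "DEF-001", "severity": "critical", "source": "System Interface", "status": "logged"},
          {"id": "DEF-002", "severity": "high", "source": "System Interface", "status": "assigned"},
          {"id": "DEF-003", "severity": "medium", "source": "User Interface", "status": "fixed"},
      ]

      def breakdown(records, field):
          """Count defects by a single field, e.g. severity, source, or status."""
          return Counter(r[field] for r in records)

      for field in ("severity", "source", "status"):
          print(f"Defects by {field}:")
          for value, count in breakdown(defects, field).most_common():
              print(f"  {value}: {count}")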

    5.2 Defect Trend

    Defect trends (i.e. defect counts over time) were not measured for the Architectural Prototype tests.

    5.3 Defect Aging

    Tracking of defect age is not required for the Prototype. The current plan is to start tracking the age of open defects at the beginning of the Construction Phase, using ClearQuest to generate the Defect Aging charts.
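
    As a preview of that tracking, a minimal sketch of the underlying age calculation is shown below. It assumes an export of open defects with an ISO-formatted logged date; the field names and sample records are assumptions for illustration and are not drawn from ClearQuest.

      from datetime import date

      # Hypothetical export of open defects; dates are ISO formatted for illustration.
      open_defects = [
          {"id": "DEF-001", "logged": "1999-03-10"},
          {"id": "DEF-002", "logged": "1999-03-18"},
      ]

      def age_in_days(defect, today=None):
          """Number of days a defect has remained open since it was logged."""
          today = today or date.today()
          return (today - date.fromisoformat(defect["logged"])).days

      # List the oldest open defects first.
      for d in sorted(open_defects, key=age_in_days, reverse=True):
          print(f"{d['id']}: open for {age_in_days(d)} day(s)")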

  6. Suggested Actions

    The recommended actions are as follows:

      1. Assign additional systems engineering resources to further evaluate the performance and loading problems associated with access to the legacy Course Catalog System. Design alternatives will be reviewed by the Project Team prior to implementation of any design solutions.
      2. Assign engineering resources to resolve the outstanding open defects on the Prototype.
      3. Delay the start of the next iteration pending resolution of the critical and high-severity defects.
      4. Design additional tests to further exercise loads and access times for the Course Catalog System, and try using Rational Visual Quantify to identify and analyze the performance bottlenecks (a simple timing sketch follows this list).
      5. Include inspections of all design and code involving external interfaces in future iterations. These inspections should reduce the number of problems found during test.
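
    To complement profiling with Rational Visual Quantify, a simple end-to-end timing sketch such as the one below could help characterize Course Catalog access times under repeated calls. The lookup function here is a stand-in assumption, not part of the actual prototype's interface to the legacy system.

      import statistics
      import time

      def timed_calls(operation, repetitions=50):
          """Run an operation repeatedly and return elapsed wall-clock times in milliseconds."""
          samples = []
          for _ in range(repetitions):
              start = time.perf_counter()
              operation()
              samples.append((time.perf_counter() - start) * 1000.0)
          return samples

      def lookup_course_offering():
          # Stand-in for a read against the legacy Course Catalog System.
          time.sleep(0.05)  # simulated latency; replace with the real access call

      if __name__ == "__main__":
          samples = timed_calls(lookup_course_offering)
          print(f"calls:  {len(samples)}")
          print(f"mean:   {statistics.mean(samples):.1f} ms")
          print(f"median: {statistics.median(samples):.1f} ms")
          print(f"max:    {max(samples):.1f} ms")
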
  7. Diagrams

    This section is reserved for the defect analysis charts referenced in Section 5.1 (Defects by Severity Level, Defect Source, and Defect Status); the chart images are not included in this version of the document.
 
Copyright (C) IBM Corp. 1987, 2004. All Rights Reserved.

Course Registration Project Web Example
Version 2001.03