System Test Plan

TDMS Phase II

Wednesday, Jan 05, 2011

 

Revision Chart

Version        Resource                     Description of Version                                Date Completed
Draft          Naveed-Ur-Rehman Chughtai    Draft developed for the review of Project Manager     10th Jan 2010
Review
Final
Revision 1
Revision 1.1

 

Approvals

Name              Title              Date              Signature
Zaighum Yazdan

 

Change Record

Date        Author        Version        Change Reference

 

Distribution

Name              Project Role
Zaighum Yazdan    Project Manager

 

 

 Table of Contents

 

1     Introduction
1.1      Objectives
1.1.1      Purpose of System Testing
1.2      Background
1.2.1      TDMS Phase-II
1.3      Where does this Deliverable fit Within the Overall Project?
1.4      Scope of the document
2     System Test Approach
2.1      Scope
2.2      What is not included in the scope?
2.3      Inputs for System Testing
2.4      Type of Testing
3     System Testing Process
3.1      Testing Process
3.1.1      Test Objectives/Test Conditions
3.1.2      Test Cycles
3.1.3      Test Steps
3.1.4      Test data
3.1.5      Testing Environment
3.2      Testing Deliverables
3.3      Bug Reports
3.3.1      Bug Priority and Severity Levels
3.3.2      Bug Resolution
3.3.3      Status of Bug
4     Reference Documents
5     Test Platform
5.1      Hardware
5.2      Software
6     Testing Tools
7     Team Roles and Responsibilities

 

 

 

1       Introduction

 

This document describes the System Test Plan for TDMS Phase II. The QA Team of Xavor will use this document to execute the system-testing phase. The plan includes the types of testing that will be carried out for TDMS Phase II on the Xavor Corporate Portal; passing these tests will deem the system and application successful and ready for release.

 

After a brief overview and a statement of the purpose of system testing, this document defines the scope of the system test and covers areas such as functional testing, integration testing, security testing, and usability (user interface) testing. It goes on to look at the issues that relate specifically to testing the application on the Portal. The document then defines the approach for the actual testing effort: the various steps that must be taken to test the modules end to end, first stand-alone and then as a system integrated on the Portal.

 

This document does not specify the test cases and test scenarios, which will be executed during the testing activity. These will be documented separately.

1.1     Objectives

 

The objective of this document is to explain the system testing strategy and the testing processes that will be performed by the QA Department of Xavor. The intent is to provide the scope and methodology of system testing for the TDMS Phase II project.

 

The system will be tested to ensure that project specifications are met and that all the functionality is present and working. Attention will be paid to general system functions as specified in the FRS Document. Application error handling capabilities and consistent error messages will also be tested.

 

1.1.1      Purpose of System Testing 

 

System testing is performed to ensure that all of the individual components of the system work successfully together to accomplish their designated tasks. For the TDMS Phase II project this means that all the components of the application work properly and perform the expected functions according to the business requirements in the Functional Specification document.

 

Specifically, the system testing is designed to:

 

  • Identify and correct problems in the application prior to deployment
  • Ensure that the functionality defined in the functional specification and design documentation for the system is met
  • Confirm that data is processed and output is generated as specified, and verify that the interfaces between existing systems are developed as required
  • Verify interface program logic and test results using test data
  • Validate security features
  • Ensure that the system operates at acceptable performance levels as defined in the acceptance criteria

 

System testing is intended to be a complete test of all functionality of all individual components of the application.

 

It is assumed that before system testing commences, a complete test of all the components of TDMS Phase II has been carried out through unit testing, and that all items such as field-level edits, cross-validations, and functionality errors have been unit tested in detail. System testing, however, ensures that the components work according to specification when they are placed together in the system, and that end-to-end business processes can be executed successfully.

1.2     Background

This section provides brief background on the system, which comprises the corporate portal and the business application.

 

1.2.1      TDMS Phase-II

TDMS Phase II is targeted at supporting the upload of test data to a centralized database, with the ability to search and retrieve the test data for analysis and to generate reports on various parameters. The following functions will be tested (an illustrative automated check for one of them follows the list).

 

  • Login
  • Upload and Parse Test Data Files
  • Search Device
  • Export Search Device as CSV
  • Search Test Data – With Search Device filter criteria
  • Search Test Data – Without Search Device filter criteria
  • Excel Data Report
  • Delete Test Data
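The detailed test cases for these functions are documented separately. Purely as an illustration of the kind of automated check the testing team could write for one of them, the sketch below validates the shape of an "Export Search Device as CSV" output; the column names and sample data are hypothetical, not taken from the Functional Specification.

```python
import csv
import io

# Hypothetical column set for the Search Device CSV export; the real
# columns would come from the Functional Specification.
EXPECTED_COLUMNS = ["DeviceID", "DeviceName", "TestDate"]

def validate_search_device_csv(csv_text):
    """Return a list of discrepancies found in an exported CSV."""
    problems = []
    rows = list(csv.reader(io.StringIO(csv_text)))
    if not rows:
        return ["export is empty"]
    if rows[0] != EXPECTED_COLUMNS:
        problems.append("unexpected header: %r" % rows[0])
    for line_no, row in enumerate(rows[1:], start=2):
        if len(row) != len(EXPECTED_COLUMNS):
            problems.append("line %d has %d fields, expected %d"
                            % (line_no, len(row), len(EXPECTED_COLUMNS)))
    return problems

if __name__ == "__main__":
    sample = "DeviceID,DeviceName,TestDate\r\n42,DUT-A,2011-01-05\r\n"
    print(validate_search_device_csv(sample))  # -> []
```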

 

 

 

1.3     Where does this Deliverable fit Within the Overall Project?

 

The diagram below shows the overall tasks that will be performed throughout the project.

 

This document covers the System testing process and approaches for TDMS Phase II.

[Diagram: overall project task flow]

1.4     Scope of the document

 

This document covers the following:

  • System Test Approach
  • Types of Testing
  • System Testing Process
  • Testing Deliverables
  • Test Execution Process
  • Test Exit and Entry Criteria
  • Bug Report Classification
  • Team Roles and Responsibilities

 

 

 

2       System Test Approach

2.1     Scope

 

The scope of the System Test covers the development and integration of TDMS Phase II.

 

The following are some of the features that will be tested to demonstrate adherence to the specifications defined in the design stage:

 

  • Graphical user interface
  • Consistency with established standards, i.e. usability and security of the applications
  • Workflow, business processes, transactions, etc.
  • System performance and integration
  • Connections and data integrity
  • Database updates – to test the system's ability to successfully update the data without interruptions

2.2     What is not included in the scope?

 

The following is not included in the scope of testing:

 

  • Unit Testing – this is done by the developers during the development process; no automated testing tool is being used for this project.
  • Testing every possible data combination and program flow – system testing will only cover those scenarios that revolve around the business functionality.
  • Disaster Recovery Planning and Procedure.

 

2.3     Inputs for System Testing

 

  • Business system confirmed by the Project Manager/development team lead as complete and ready for testing, either standalone or as an integrated system
  • Approved system test plan
  • Test data and test cases
  • Availability of QA and testing team resources
  • Bug report database to record test problem reports

2.4     Type of Testing

 

This section outlines the types of testing that will be performed for the TDMS Phase II project.

The following table describes the type of system tests to be performed and their descriptions for a better understanding of the testing process.

 

 

Test            Description

Unit Testing
  • It is assumed that Unit Testing is the responsibility of the Development Team and will be performed by the developers during the development/construction phase of the project.
  • Unit Testing is the testing of individual modules/features of the software application to ensure that each module operates as per the design and functional specifications. This includes code review and coding standards.
  • Because it is a recursive activity, it is always done by the developers/development team during the development process.
  • Roles involved: Development Team
  • Responsibility: Development Team

Usability Testing
  • Usability testing focuses on determining whether the system provides ease of use and contains the functionality that the system users desire.
  • Usability Testing will be the responsibility of the QA and testing team. The testing will confirm the usability features (look and feel, navigation requirements, design layout, etc.).
  • Roles involved: Testing Team
  • Responsibility: Testing Team

Integration Testing
  • Integration Testing will be carried out to check all the integration points across the TDMS application and the Portal. As the Portal interface will be the focal point of integration for the business application, testing the integration of all the data is critical.
  • Roles involved: Testing Team
  • Responsibility: Testing Team

System/Functional Testing
  • Functional testing ensures that all the business rules specified in the functional specifications and design documents are followed throughout the system.
  • This involves end-to-end testing of each business process.
  • Functional testing also involves testing the security of the system.
  • Roles involved: Testing Team
  • Responsibility: Testing Team

Performance Testing
  • Performance testing is conducted to ensure that the application performs well with a number of concurrent users. The goal is to introduce the flexibility of personalization without compromising performance.
  • Roles involved: Testing Team
  • Responsibility: Testing Team

User Acceptance Testing
  • User Acceptance Testing is the responsibility of the key staff and the PM. Once system testing is completed and the business application is deemed ready for release, acceptance testing will be carried out on-site at Xavor to hand over system ownership.
  • A separate Acceptance Testing Plan exists, with defined acceptance criteria, which will be followed.
  • Roles involved: Project Manager (PM), Testing Team, End Users
  • Responsibility: End Users, PM

 

 

3       System Testing Process

 

System testing will verify TDMS Phase II's compliance with the business requirements defined in the functional specification, by focusing on the objectives and capabilities specified for that system. These capabilities are tested for functionality and performance within the boundaries of the interfaces defined for the release.

 

Defined below is our testing process that will be followed for performing various types of testing, constituting the complete system testing cycle.

[Diagram: system testing process cycle]

3.1     Testing Process

 

The system test plan provides the basis for planning, executing, and documenting system test activities. This testing process is developed to include each of the tests listed in the Types of Testing section and will be followed for the TDMS Phase II project. The testing process consists of the following elements.

 

  • Test Objectives – Establish the scope of the testing for each process.  It is critical that these objectives encompass all system functions that are a part of the process.

 

  • Test Conditions – Specific scenarios that, when tested and successfully completed, satisfy the test objectives they reference. Identification of test conditions is the prerequisite for effectively designing test data. A comprehensive set of test conditions for each process will enhance the users’ confidence that the system will function properly when brought into production.  Whilst these come from the test objectives, they expand the objective and break it down into more specific requirements, which can be later mapped to the business requirements of the application and Portal.

 

  • Test Cycles – A method of organizing process flows and test conditions into manageable units of work.  Test cycles define the order of execution for the test conditions and related programs associated with the test goal.  Each cycle attempts to simulate the flow of work in the production environment.

 

  • Test Steps – Detailed activities that must be performed to complete each test cycle.  Each step contains a description, test data that will be used to complete the task, and the expected results of the task.  Once the test has been run, it will also contain a reference to the actual results generated.  It may also contain a bug/defect number if a problem is identified in testing.  If the step has been tested successfully, it will contain verification where the tester has initialed/signed off the task.
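A minimal sketch of how one such test step could be recorded as a data structure is shown below; the field names mirror the description above, but the class itself is illustrative, not part of the plan's tooling.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TestStep:
    """One step of a test cycle, mirroring the fields described above."""
    description: str                     # what the tester must do
    test_data: str                       # data used to complete the task
    expected_result: str                 # what should happen
    actual_result: Optional[str] = None  # recorded once the step is run
    bug_ref: Optional[str] = None        # bug/defect number, if one is raised
    verified_by: Optional[str] = None    # tester's sign-off initials

    @property
    def passed(self) -> bool:
        # A step passes when it produced the expected result and raised no bug.
        return self.actual_result == self.expected_result and self.bug_ref is None
```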

 

3.1.1      Test Objectives/Test Conditions

 

Each system functionality to be tested will have defined test objectives and conditions. The functional specification and the detailed design deliverables produced will form the input to the development of the test cases that are to be used in the testing process.

 

This will fall into the following categories:

 

  • Normal functionality performed with messages logged and accurate reports produced as required.
  • Exception runs with errors handled gracefully and reported.

 

3.1.2      Test Cycles

 

In order to seamlessly recreate bugs, retest critical defects and ensure the overall accuracy of the testing process, it is recommended that the testing team complete at least 2 iterations or cycles of testing.  This means that the entire process described above will be repeated at least twice.

 

The first cycle of testing will be carried out for all system functions and processes. The second cycle will retest the fixes for the bugs reported and resolved in the first cycle.

 

The test cycles organize the test conditions into manageable processes that can be carried through from end to end. The number of test cycles needs to strike a balance: not so many that the tests take too long to run, but not so few that, if one cycle gets held up waiting for a bug to be resolved, no other cycles can continue in the meantime.

 

3.1.3      Test Steps

 

Test steps expand the test cases into a step-by-step approach to performing the test.  The steps should guide the tester through every step involved in running the test, from loading the test data to verifying the end output.  They should be written so that anyone could pick up the test cases, follow the steps and successfully run the tests, however technical or non-technical they are.

 

As each test step is executed, if a problem occurs, a bug report will be logged against that test step.  For more detail on bugs and their resolution, see the bugs section below.  If no problem has occurred and the test step is executed successfully, the tester will sign the test plan by the test step (in the ‘executed’ column) and continue with the next step in the plan.

 

3.1.4      Test data

The testing team and the developers will create the necessary test data to cover all the steps required for the end-to-end testing process. The test data for system testing will be a combination of existing business data and test data developed specifically by the testing team along with the test cases.

3.1.5      Testing Environment

 

The testing environment will be separate from the development environment so as to manage and control the versions of the application under test. The test engineer will test a specific build of the business application deployed on the server machine and will control the product during each testing cycle.

3.2     Testing Deliverables

 

The testing team will produce the following deliverables:

 

Deliverable            Description

System Test Plan
The system test plan is developed in the design phase to define the testing process and approach to be followed during the testing phase, once the first-level development cycle has been completed.

System Test Cases
The test cases are the business scenarios that are being tested. They comprise the business functionalities, the pre-conditions, and the expected results for the execution of these test cases. System test cases will be developed during the testing phase of the project, and discrepancies will be classified and resolved. These will be recorded in the Bugzilla database created specifically for each project. All bugs will be logged and their status updated in Bugzilla.

Discrepancy Reports (Bug Reports)
When a discrepancy is found during testing, a bug report must be logged in a local database. The bug reports will lead to the identification of the problem. Each bug report will provide the following information for identifying the root cause (a sketch of such a record as a data structure follows the list):

  • Business scenario/functionality being tested
  • Description of the discrepancy
  • System response to the user action
  • Steps to reproduce
  • Image of the discrepancy, if applicable (optional)
  • Developer responsible for correcting the issue
  • Due date and time spent
  • Bug status (Open, Fixed, Partially Fixed, Ready for Retest, Not Valid, Not Reproducible, Closed)
  • Severity level of the discrepancy (Cosmetic, Major, Minor, Medium, Critical, Enhancement)
  • Resolution action steps
  • Reason/root cause of the bug (unclear business requirements, design documents, missing specifications, lack of standards, tool limitation, miscellaneous)
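As a sketch only (the plan's actual bug tracking lives in the bug database, not in code), the record above could be modeled as follows; the status and severity values are copied from the list, while the class and field names are ours.

```python
from dataclasses import dataclass
from enum import Enum

class BugStatus(Enum):
    OPEN = "Open"
    FIXED = "Fixed"
    PARTIALLY_FIXED = "Partially Fixed"
    READY_FOR_RETEST = "Ready for Retest"
    NOT_VALID = "Not Valid"
    NOT_REPRODUCIBLE = "Not Reproducible"
    CLOSED = "Closed"

class Severity(Enum):
    COSMETIC = "Cosmetic"
    MAJOR = "Major"
    MINOR = "Minor"
    MEDIUM = "Medium"
    CRITICAL = "Critical"
    ENHANCEMENT = "Enhancement"

@dataclass
class BugReport:
    scenario: str                      # business scenario/functionality tested
    description: str                   # description of the discrepancy
    system_response: str               # system response to the user action
    steps_to_reproduce: str
    assigned_developer: str            # developer responsible for the fix
    status: BugStatus = BugStatus.OPEN
    severity: Severity = Severity.MINOR
    root_cause: str = ""               # e.g. "Unclear Business Requirements"
```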

 

 

3.3     Bug Reports

 

Test Engineers will log a bug report whenever a test case does not meet its pass/fail criteria during the system test. All identified bugs will be resolved before the on-site deployment of the business application.

 

3.3.1       Bug Priority and Severity Levels

 

Severity is assigned to a bug so that the correction effort can be focused first on those reports most likely to cause the system test to fail. The testing team or the development team lead can modify the severity of a bug.

 

Reports must be given a prioritization rating of ‘High’, ‘Medium’, or ‘Low’.  The following metrics should be used when prioritizing bugs for resolution:

 

Priority    Description
High        Testing on this test cycle cannot continue until the problem is fixed.
Medium      Testing on this test cycle can continue; however, the system would be unusable if it went live with this problem unresolved.
Low         Testing on this test cycle can continue, and the system would be operational if it went live with this problem unresolved.

 

The severity of the problem should also be taken into account when prioritizing bug reports. Problems can be categorized at the following severity levels:
Severity Level    Description    Expected Service Commitment

Critical (Severity 1) – Resolve Immediately
  • A critical problem causes a substantial system failure or renders the software application substantially unusable, and an immediate fix is required.
  • The system is likely to fail, be unusable, or be unstable. The hardware or software has failed or is unusable or unstable. An error has prevented, or is likely to prevent, users from using the system.

Major (Severity 2) – Resolve Immediately
  • A major problem causes a system or application feature failure that cannot be avoided by alternate methods by the user.
  • A non-key function has failed totally, or a key function has partially failed, and no workaround is available, although other parts of the system may remain accessible and usable.

Medium – Give High Attention
  • A medium problem causes an application feature or system failure that can be avoided by alternate methods by the user.
  • A non-key function has failed totally, or a key function has partially failed, but a workaround is available and other components of the system remain accessible and usable.

Minor – Normal Queue
  • A minor problem causes an inconvenience to the user of the application, including but not limited to message and documentation errors.

Enhancement/Suggestion – Normal Queue
  • An enhancement is a requested change that is not in the current scope of the project. The request must be re-evaluated by the project team for consideration (new/additional user requirements).

Cosmetic – Normal Queue
  • A cosmetic problem is not a problem in system functionality or an application feature but relates to the GUI/interface. It does not affect the application's behavior but is visible to the user.
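To make the table concrete, the sketch below restates the severity-to-commitment mapping and shows how a triage script might order bug reports; the dictionary and function are illustrative, not part of the plan's tooling.

```python
# Severity levels and service commitments restated from the table above.
SERVICE_COMMITMENT = {
    "Critical": "Resolve Immediately",
    "Major": "Resolve Immediately",
    "Medium": "Give High Attention",
    "Minor": "Normal Queue",
    "Enhancement/Suggestion": "Normal Queue",
    "Cosmetic": "Normal Queue",
}

# Most severe first, for triage ordering (our reading of the table).
TRIAGE_ORDER = ["Critical", "Major", "Medium", "Minor",
                "Enhancement/Suggestion", "Cosmetic"]

def sort_for_triage(bugs):
    """Order (severity, summary) pairs so the most severe come first."""
    return sorted(bugs, key=lambda bug: TRIAGE_ORDER.index(bug[0]))

if __name__ == "__main__":
    backlog = [("Minor", "typo in label"), ("Critical", "upload crashes")]
    print(sort_for_triage(backlog))  # the Critical bug comes first
```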

 

3.3.2       Bug Resolution

 

Once a Bug has been logged into the database, the Team Lead must be informed of the status. The Test Engineers will provide the details of the test cycles executed. The Test engineers must hand over the complete bug report for each system to the development team for resolution. Once the bug has been resolved, the testing team will begin the re-run of the test cycle in order to verify that the problem is resolved and that the test cycle has passed successfully.

 

System testing is obviously an iterative process. It may be that once a Bug is resolved, another problem comes to light as a result of the test cycle being able to progress further.

 

 

The following diagram shows the system test/bug cycle.

[Diagram: system test execution]

 

3.3.3       Status of Bug

The status of a Bug will change as it goes around this iteration.

  1. When a bug is created, its status will be Open. The System Test Engineer will then notify the development team about the bug for resolution.
  2. When the bug comes back from development after being resolved, it will move into Ready for Retest, Fixed by Developer, or Partially Fixed, according to its state.
  3. Once the test has been re-run successfully, the bug will be Fixed.
  4. If the bug has not been adequately resolved and the developer is unable to reproduce the problem, the bug's status will be Not Reproducible.
  5. The testing or development manager may decide that a particular bug cannot be resolved, or that it has a very low priority and there may not be time for resolution. In these cases, the status of the bug will be On Hold.
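One possible encoding of this lifecycle as a transition table is sketched below; the status names come from the plan, but the exact transition sets are our reading of steps 1–5 and would need to be confirmed against the bug database's workflow.

```python
# Bug statuses from the plan; transition sets are an interpretation of
# the five steps above, not an authoritative workflow definition.
ALLOWED_TRANSITIONS = {
    "Open": {"Ready for Retest", "Fixed by Developer", "Partially Fixed",
             "Not Reproducible", "On Hold"},
    "Ready for Retest": {"Fixed", "Open"},      # retest passes or fails
    "Fixed by Developer": {"Fixed", "Open"},
    "Partially Fixed": {"Open", "On Hold"},
    "Not Reproducible": {"Open", "On Hold"},
    "On Hold": {"Open"},
    "Fixed": set(),                             # terminal in this sketch
}

def advance(current, target):
    """Move a bug to a new status, rejecting transitions not listed above."""
    if target not in ALLOWED_TRANSITIONS.get(current, set()):
        raise ValueError("illegal transition: %s -> %s" % (current, target))
    return target

if __name__ == "__main__":
    status = advance("Open", "Ready for Retest")
    status = advance(status, "Fixed")
    print(status)  # Fixed
```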

 

4       Reference Documents

  1. Functional Specification

 

5       Test Platform

5.1     Hardware

The minimum hardware requirements for TDMS Phase-II are:

 

  • Pentium 4 processor
  • 2 GB RAM
  • Minimum 500 MB of free disk space

 

5.2     Software

Accessing TDMS Phase-II sites requires one of the following:

 

  • Microsoft Internet Explorer 7.0/8.0, plus the latest service pack
  • Mozilla Firefox

 

6       Testing Tools

  • SharePoint Portal Issue Tracking for Test Problem Reports

7       Team Roles and Responsibilities

The system testing team will be formally responsible for conducting the tests for all the business applications and the Portal framework. The Quality Assurance Manager and Team Lead, with relevant skills and experience, along with the Test Engineers, will be responsible for executing the test plans and test cases for all system test components. The test cycles will be executed in accordance with the system test plan.

 

Role            Responsibilities

Project Manager
  Test review responsibilities:
    • Support the testing team in test preparation and execution
    • Review test plans and test cases with QA
    • Sign off test plans
    • Review the analysis and status of bugs
    • Impact analysis for change management, if any

QA Team Lead
  Test definition and set-up:
    • Establish test scope and approach in cooperation with the Project Manager
    • Develop system test plans
    • Define and manage the testing process

  Test scheduling and progress management:
    • Monitor testing activities
    • Coordinate with technical and other support teams in test execution and bug resolution
    • Provide testing status updates to the Project Manager
    • Support the execution of test plans and test cases
    • Execute test cases and test plans
    • Prepare test schedules

  Test review responsibilities:
    • Participate in reviews of test plans
    • Review bugs and system change requests, if any
    • Perform quality assurance review of testing activities
    • Identify and escalate testing issues, risks, and problems