Test Plan for a Mobile Application
1 Introduction
1.1 Overview
This document explains the testing methodology for a mobile application and is to be used as a guide for the testing activity.
The intended audience for this document:
- Project Managers
- Product development team members
- Test engineers
1.2 Scope
The scope of testing, as explained in this document, is to test the operating characteristics of an application that runs on mobile devices supporting J2ME. The tests are organized by requirement category, such as usability, functionality, and security. The procedure for carrying out testing, covering preparation of test cases, test environment setup, and defect logging and reporting, is explained.
This document does not address the following:
- Content censorship (i.e. assessment against standards for violence, gambling, political messaging, etc.) for the purpose of preventing the deployment or sale of an application
- Distribution, DRM, etc.
- Testing requirements specific to a particular manufacturer’s (or network operator’s) device, user interface, and standards (e.g. WAP) implementation
1.3 References
List the documents referenced by this plan here.
1.4 Acronyms
Acronym    Expansion
DRM        Digital Rights Management
J2ME™      Java™ 2 Platform, Micro Edition
2 Test Plan and Strategy
2.1 Unit Testing
2.1.1 Objective
The objective of Unit Testing is to verify that a particular module of source code is working properly. Unit tests are designed to test a single class, component, or module in isolation. Developers run unit tests, and only for the components they are working on.
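As a minimal sketch of such a test, the example below exercises a small formatting class in isolation. Both the class and the test are hypothetical (they are not part of the application under test) and assume a JUnit-style framework is available in the development environment.

import org.junit.Test;
import static org.junit.Assert.assertEquals;

// Hypothetical class under test; the name and behaviour are illustrative only.
class PriceFormatter {
    // Formats an amount given in cents as a string with two decimal places.
    String format(int cents) {
        int whole = cents / 100;
        int fraction = cents % 100;
        return whole + "." + (fraction < 10 ? "0" : "") + fraction;
    }
}

// A unit test exercising the class above in isolation, run by the developer.
public class PriceFormatterTest {

    @Test
    public void formatsWholeAmountsWithTwoDecimals() {
        assertEquals("4.00", new PriceFormatter().format(400));
    }

    @Test
    public void padsSingleDigitFractions() {
        assertEquals("4.05", new PriceFormatter().format(405));
    }
}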
2.1.2 Entry Criteria
- Test cases are reviewed
- Build is complete and self-test done
- Unit Test environment is set up
2.1.3 Exit Criteria
- All planned test cases are executed
- Units are working as per the expected results
- Defects are fixed in the code and tracked to closure
2.1.4 Logging Tests and Reporting
The developer will fix the defects that
are found in unit testing. Additionally, if defects corresponding to other
modules or components are found during unit testing, these will be reported.
2.2 System Testing
In System Testing, separate units (packages, modules, components) or groups of units of the application are integrated and tested as a complete, merged application. The purpose of System Testing is to identify defects that will only surface when the complete system is assembled. Verification of the system at this stage might include functionality, usability, security, installation, etc. It is intended to validate the application as a whole.
The rest of this document mainly
explains how System Testing is performed by the testing team.
2.2.1 Testing Procedure
The steps in testing consist of:
- Creation of all the test scenarios and test cases
- Preparation of a test case document that has a brief description of each test case, the steps to conduct the test, and the expected result (an illustrative entry follows this list)
- Defect report generation
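A purely illustrative entry in such a test case document might look as follows (the identifier, feature, and steps are invented for the example):
Test Case ID: FN-001
Description: Verify that a changed user setting is retained after the application is restarted.
Steps: 1. Launch the application. 2. Change a setting. 3. Exit and relaunch the application.
Expected Result: The changed setting is still in effect after the relaunch.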
Outlined below are the main test types that will be performed:
a. Application Characteristics (AC) – Information about the application is provided to help the testing team in the testing work.
b. Stability (ST) – Focusing on the application being stable on the device.
c. Application Launch (AL) – Once an application is loaded it must start (launch) and stop correctly in relation to the device and other applications on the device (a sketch illustrating this and item f follows this list).
d. User Interface (UI)
e. Functionality (FN) – Documented features are implemented in the application and work as expected. Sources for the information are user manuals, formatted application specification documents and online documentation.
f. Connectivity (CO) – the application must demonstrate its ability to communicate over a network correctly. It must be capable of dealing with both network problems and server-side problems.
g. Personal Information Management (PI) – An application that accesses the user’s personal information must do so in an appropriate manner and must not destroy that information.
h. Security
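As a minimal sketch of what test types c (Application Launch) and f (Connectivity) exercise, the hypothetical MIDlet below shows the standard MIDP launch/pause/stop lifecycle and a connectivity check that tolerates both network and server-side failures. The class name and the TEST_URL endpoint are placeholders; only the MIDlet lifecycle methods and the javax.microedition.io HTTP API are part of the standard platform.

import java.io.IOException;
import javax.microedition.io.Connector;
import javax.microedition.io.HttpConnection;
import javax.microedition.midlet.MIDlet;
import javax.microedition.midlet.MIDletStateChangeException;

// Hypothetical MIDlet used only to illustrate the AL and CO test types.
public class ConnectivityCheckMIDlet extends MIDlet {

    // Placeholder endpoint; a real application would use its own server URL.
    private static final String TEST_URL = "http://example.com/ping";

    protected void startApp() throws MIDletStateChangeException {
        // AL: the application must reach a usable state when launched.
        // Network work is done off the lifecycle thread.
        new Thread(new Runnable() {
            public void run() {
                checkServer();
            }
        }).start();
    }

    protected void pauseApp() {
        // AL: release shared resources when the device pauses the application.
    }

    protected void destroyApp(boolean unconditional) throws MIDletStateChangeException {
        // AL: the application must stop cleanly when asked to exit.
    }

    // CO: communicate over the network and survive network or server failures.
    private void checkServer() {
        HttpConnection conn = null;
        try {
            conn = (HttpConnection) Connector.open(TEST_URL);
            if (conn.getResponseCode() != HttpConnection.HTTP_OK) {
                // Server-side problem: report it to the user rather than crashing.
            }
        } catch (IOException e) {
            // Network problem (no coverage, timeout): degrade gracefully.
        } finally {
            if (conn != null) {
                try { conn.close(); } catch (IOException ignored) { }
            }
        }
    }
}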
2.3 Regression Testing
This is an additional step, performed before system testing of new functionality. Regression testing consists of running a set of standard tests to ensure that existing functionality has not been broken by new functionality. Regression tests are also run when a new release is made after a number of defect fixes.
2.4 Pass/Fail Conditions
An application must pass all the tests in each test category to be considered successful.
2.5 Test Report
For each report, the following information is provided:
• The name of the application
• The version number of the application
• Device used for testing
• Device firmware version
For each error reported, the following information is provided:
• Description of the error
• Frequency of occurrence of the error: systematic, random, or once
• Location of the error in the application
• Steps to reproduce the error
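A purely illustrative error report entry might look as follows (the application name and defect are invented for the example):
Application: SampleNewsReader, version 1.2
Device used for testing: <device model>, firmware <firmware version>
Description of the error: The application exits unexpectedly when a news item is opened while the network is unavailable.
Frequency of occurrence: Systematic
Location of the error: News item detail screen
Steps to reproduce: 1. Disable network access. 2. Launch the application. 3. Open any news item.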
3 Schedules for Testing
This will be decided in consultation
with the project manager.
4 Risks and Assumptions
4.1 Risks:
The following may impact the test cycle:
- Device availability
- Any new feature addition/modification to the application which is not communicated in advance
- Any delay in the software delivery schedule, including defect fixes
- Any changes in the functional requirements since the requirements were signed off/formulated
4.2 Assumptions:
- Every release to QA will be accompanied by a release note specifying details of the features implemented and their impact on the module under test.
- All “Show-Stopper” bugs receive immediate attention from the development team.
- All bugs found in a version of the software will be fixed and unit tested by the development team before the next version is released.
- All documentation will be up-to-date and delivered to the system test team.
- Devices, emulators and other support tools will be fully functional prior to project commencement.
- In case of a lack of required equipment or changes in the feature requirements, the test schedules may need to be reviewed.
5 Entry and Exit Criteria
5.1 Entry Criteria
- Development of the application is complete
- Successful completion of unit testing for the applications
- Release of software to the test environment
- Dedicated resources are allocated
- Approved test bed to carry out system testing
- Test environment is up and working
5.2 Exit Criteria
The following are the criteria for stopping testing of this module:
- All test cases have been executed and at least 95% have passed successfully; the remaining 5% do not impact critical functionality
- All test results have been evaluated and accepted
- There are no unresolved or outstanding showstoppers or high-criticality defects
6 Test Metrics
The following metrics will be captured and reported as part of the Test Summary Report:
- Test design effort
- Test execution effort
- Number of test cases executed
- Number of defects and their classification
- Test coverage (number of test cases executed / number of test cases planned)
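For example, if 190 of the 200 planned test cases were executed, the reported test coverage would be 190/200, i.e. 95%.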
7 Logging Tests and Reporting
A third-party defect-tracking tool will be used for reporting bugs found during test execution. The QA team will log defects in the tool as testing progresses.
8 Roles and Responsibilities
The roles and responsibilities of the testing team for the project are as follows:
8.1 Project Manager / Test Manager
Responsibilities:
- Overall responsibility for managing the testing process
- Approval of test documents
- Approval of inspections/reviews done as per the test plan
- Providing resources for the project
8.2 Test Lead
Responsibilities:
- Requirement gathering
- Planning and estimating for testing
- Tracking and monitoring the testing as per the test plan
- Reporting the project status
8.3 Test Engineer
Responsibilities:
- Creating the test cases as per the test plan
- Executing the test cases
- Documenting the results and logging errors
9 Deliverables
- Test Plan
- Test Cases Document – A document with a description and expected result for each test case.
- Test Results – The pass/fail status of each test case and the list of issues.