Verify test plan and test design
Verify the test plans and test design to ensure that:
- The test model delivers adequate and appropriate test coverage for the risk profile of the service
- The test model covers the key integration aspects and interfaces, e.g. at the SPIs
- The test scripts are accurate and complete.
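For illustration only (this sketch is not part of the ITIL text), the coverage checks above lend themselves to simple automation. A minimal Python sketch, assuming hypothetical test-case, risk-profile and SPI inventories:

    # Illustrative sketch: flag risks and service provider interfaces (SPIs)
    # that no test case covers. All names and data shapes are assumptions.
    test_cases = {
        "TC-001": {"risks": ["R1", "R2"], "interfaces": ["SPI-billing"]},
        "TC-002": {"risks": ["R3"], "interfaces": ["SPI-identity"]},
    }
    risk_profile = {"R1", "R2", "R3", "R4"}          # risks the service carries
    spi_inventory = {"SPI-billing", "SPI-identity", "SPI-reporting"}

    covered_risks = {r for tc in test_cases.values() for r in tc["risks"]}
    covered_spis = {i for tc in test_cases.values() for i in tc["interfaces"]}

    print("Uncovered risks:", risk_profile - covered_risks)     # {'R4'}
    print("Uncovered SPIs:", spi_inventory - covered_spis)      # {'SPI-reporting'}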
Prepare test environment
Prepare the test environment using the services of the build and test environment resource, and use the release and deployment processes where possible; see paragraph 4.4.5.2. Capture a configuration baseline of the initial test environment.
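As a hedged illustration of the baseline step, the following Python sketch records a checksum for each configuration file so the environment can later be compared with its initial state. The directory and file names are assumptions, not prescribed by the process:

    # Illustrative sketch: capture a configuration baseline as a manifest of
    # SHA-256 checksums, one per file under the environment's config directory.
    import hashlib
    import json
    from pathlib import Path

    def capture_baseline(config_dir: str, baseline_file: str) -> None:
        baseline = {}
        for path in sorted(Path(config_dir).rglob("*")):
            if path.is_file():
                baseline[str(path)] = hashlib.sha256(path.read_bytes()).hexdigest()
        Path(baseline_file).write_text(json.dumps(baseline, indent=2))

    # Example call (paths are hypothetical):
    # capture_baseline("/srv/test-env/config", "baseline-initial.json")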
Perform tests
Carry out the tests using manual or automated techniques and procedures. Testers must record their findings during the tests. If a test fails, the reasons for failure must be fully documented. Testing should continue according to the test plans and scripts, if at all possible. When part of a test fails, the incident or issue should be resolved or documented (e.g. as a known error) and the appropriate re-tests should be performed by the same tester.
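A minimal sketch of how the recording requirements above might look in an automated harness; the callable-based scripts, field names and exception handling are illustrative assumptions:

    # Illustrative sketch: run each test script, record pass/fail with a
    # documented reason on failure, and queue failed scripts for re-test
    # by the same tester.
    from datetime import datetime, timezone

    def run_tests(scripts, tester):
        results, retest_queue = [], []
        for name, script in scripts.items():
            record = {"script": name, "tester": tester,
                      "ran_at": datetime.now(timezone.utc).isoformat()}
            try:
                script()
                record["outcome"] = "pass"
            except Exception as exc:          # failure reason must be documented
                record["outcome"] = "fail"
                record["reason"] = str(exc)   # e.g. later logged as a known error
                retest_queue.append(name)     # re-test by the same tester
            results.append(record)
        return results, retest_queue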
Figure 4.33 Example of perform test activities
An example of the test execution activities is shown in Figure 4.33. The deliverables from testing are:
- Actual results showing proof of testing with cross-references to the test model, test cycles and conditions
- Problems, errors, issues, non-conformances and risks remaining to be resolved
- Resolved problems/known errors and related changes
- Sign-off.
Evaluate exit criteria and report
The actual results are compared with the expected results. The results may be interpreted in terms of pass/fail; risk to the business or service provider; or a change in a projected value, e.g. a higher cost to deliver the intended benefits.
To produce the report, gather the test metrics and summarize the results of the tests. Examples of exit criteria are:
- The service, with its underlying applications and technology infrastructure, enables the business users to perform all aspects of function as defined.
- The service meets the quality requirements.
- Configuration baselines are captured in the CMS.
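As an illustrative sketch (reusing the hypothetical result-record shape from the runner sketch above), the comparison of actual with expected results can be summarized for the report like this; the pass-rate threshold is an assumed, not a prescribed, criterion:

    # Illustrative sketch: summarize test metrics and evaluate a simple
    # exit criterion over the recorded results.
    def evaluate_exit_criteria(results, required_pass_rate=1.0):
        passed = sum(1 for r in results if r["outcome"] == "pass")
        pass_rate = passed / len(results) if results else 0.0
        return {
            "total": len(results),
            "passed": passed,
            "failed": len(results) - passed,
            "pass_rate": pass_rate,
            "exit_criteria_met": pass_rate >= required_pass_rate,
        }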
Test clean up and closure
Ensure that the test environments are cleaned up or initialized. Review the testing approach and identify improvements to feed into design/build activities, buy/build decision parameters and future testing policies/procedures.
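One clean-up check can be sketched as follows (reusing the hypothetical baseline manifest from the preparation sketch): compare the environment against its captured configuration baseline to detect drift before re-initialization. This is an assumption about tooling, not part of the text:

    # Illustrative sketch: report files that are missing or changed relative
    # to the baseline manifest captured when the environment was prepared.
    import hashlib
    import json
    from pathlib import Path

    def check_drift(baseline_file: str):
        baseline = json.loads(Path(baseline_file).read_text())
        drifted = []
        for path_str, expected in baseline.items():
            path = Path(path_str)
            if not path.is_file():
                drifted.append(f"missing: {path_str}")
            elif hashlib.sha256(path.read_bytes()).hexdigest() != expected:
                drifted.append(f"changed: {path_str}")
        return drifted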
Trigger, inputs and outputs, and inter-process interfaces
Trigger
The trigger for testing is a scheduled activity on a release plan, test plan or quality assurance plan.
Inputs
The key inputs to the process are:
- The service package – This comprises a core service package and re-usable components, many of which are themselves services, e.g. supporting services. It defines the service’s utilities and warranties that are delivered through the correct functioning of the particular set of identified service assets. It maps the demand patterns for the service and user profiles to SLPs.
- SLP – One or more service level packages that provide a definitive level of utility or warranty from the perspective of outcomes, assets and patterns of business activity (PBA) of customers.
- Service provider interface definitions – These define the interfaces to be tested at the boundaries of the service being delivered, e.g. process interfaces, organizational interfaces.
- The Service Design package – This defines the agreed requirements of the service, expressed in terms of the service model and Service Operations plan. It includes:
- Operation models (including support resources, escalation procedures and critical situation handling procedures)
- Capacity/resource model and plans – combined with performance and availability aspects
- Financial/economic/cost models (with TCO, TCU)
- Service Management model (e.g. integrated process model as in ISO/IEC 20000)
- Design and interface specifications.
- Release and deployment plans – These define the order in which release units will be deployed, built and installed.
- Acceptance Criteria – These exist at all levels at which testing and acceptance are foreseen.
- RFCs – These instigate required changes to the environment within which the service functions or will function.
Outputs
The direct output from testing is the report delivered to service evaluation (see section 4.6). This sets out:
- Configuration baseline of the testing environment
- Testing carried out (including options chosen and constraints encountered)
- Results from those tests
- Analysis of the results, e.g. comparison of actual results with expected results, risks identified during testing activities.
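For illustration, the four content areas above can be assembled into a simple report structure; every field name here is an assumption rather than a prescribed schema:

    # Illustrative sketch: assemble the test report delivered to service
    # evaluation from the four content areas listed above.
    def build_test_report(baseline_ref, testing_carried_out, results, analysis):
        return {
            "configuration_baseline": baseline_ref,      # of the test environment
            "testing_carried_out": testing_carried_out,  # options and constraints
            "results": results,
            "analysis": analysis,                        # actual vs expected, risks
        }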
After the service has been in use for a reasonable time there should be sufficient data to perform an evaluation of the actual vs predicted service capability and performance. If the evaluation is successful, an evaluation report is sent to Change Management with a recommendation to promote the service release out of early life support and into normal operation.
Other outputs include: