
A good test plan is the cornerstone of a successful testing implementation. While every testing effort may be unique, most test plans include a common content framework. This article presents the components that make up this framework, and serves as a guide to writing your own test plan.

  1. This section establishes the scope and purpose of the test plan. It describes the fundamental aspects of the testing effort.

    1. Purpose - Describe why the test plan was developed and what its objectives are. These may include documenting test requirements, defining testing strategies, identifying resources, and estimating schedules and project deliverables.
    2. Background - Explain any events that caused the test plan to be developed, such as the implementation of improved processes or the addition of new environments or functionality.
    3. Technical Architecture - Diagram the components that make up the system under test. Include data storage and transfer connections, and describe the purpose each component serves, including how it is updated. Document the layers, such as presentation/interface, database, report writer, etc. A higher-level diagram showing how the system under test fits into a larger automation picture can also be included if available. (A sketch of one way to record this component inventory follows this list.)
    4. Specifications - List all required hardware and software, including vendors and versions.
    5. Scope - Briefly describe the resources the plan requires, areas of responsibility, stages, and potential risks.
    6. Project Information - Identify all the information that is available in relation to this project. User documentation, the project plan, product specifications, training materials, and executive overview materials are examples of project information.
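
For example, the component inventory behind the technical architecture diagram can be captured in a simple structured form. The sketch below is purely illustrative; the component names, layers, and update mechanisms are hypothetical placeholders, not taken from any particular system.

```python
from dataclasses import dataclass

@dataclass
class Component:
    name: str        # component as it appears in the architecture diagram
    layer: str       # e.g. presentation/interface, database, report writer
    purpose: str     # what the component does in the system under test
    updated_by: str  # how the component receives updates (releases, feeds, etc.)

# Hypothetical system under test, for illustration only.
system_under_test = [
    Component("web_ui", "presentation/interface", "customer-facing order entry", "weekly release"),
    Component("orders_db", "database", "stores orders and customer data", "nightly ETL feed"),
    Component("report_writer", "reporting", "generates management reports", "reads orders_db on demand"),
]

for c in system_under_test:
    print(f"{c.layer:25} {c.name:15} {c.purpose} (updated: {c.updated_by})")
```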
  2. This section of the test plan lists all requirements to be tested. Any requirement not listed is outside of the scope of the test plan.

    1. Business Requirements - possible questions to ask:
      1. What does the business do today?
      2. How does it do this?
      3. Who does it?
      4. When and where does it happen?
      5. Why is it needed?
    2. Functional Requirements - possible questions to ask:
      1. What are the functions?
      2. What are the qualitative descriptions for the functions?
      3. What is the context and relationship?
      4. Who performs the activities?
    3. Technical Requirements - possible questions to ask:
      1. Are there any constraints that apply?
      2. Are there any quality and performance related requirements?
      The following guidelines apply when writing requirements (a sketch of automated checks for a few of them appears after this list):
      1. Keep sentences short.
      2. Never express more than one requirement in a single sentence; avoid the use of 'and' in sentences.
      3. Avoid the use of jargon, abbreviations and acronyms unless you are completely confident that they will be understood by all readers of your documentation.
      4. Keep paragraphs short. No paragraph should be made up of more than 3-4 sentences.
      5. Use lists and tables wherever possible to present information sequences.
      6. Use terminology consistently. Make a glossary or dictionary to define terms.
      7. Use words such as 'shall', 'should', 'will', and 'must' in a consistent way with the following meanings:
        • shall - indicates that the requirement is mandatory
        • should - indicates a feature that is desirable but not mandatory
        • will - indicates something that will be externally provided
        • must - is best avoided. If used, it should be a synonym for 'shall'.
      8. Do not express requirements using nested conditional clauses (if 'x' then if 'y' then 'R1a' else if 'z' then 'R1b'...)
      9. Use the active rather than the passive voice, particularly when describing actions taken by people or the system.
      10. Express complex relationships in diagrams supported by a natural-language description. Diagrams are often more effective than prose, but can be a challenge for many readers to create and understand.
      11. Never use anonymous references. When referring to other requirements, tables, or diagrams, give a brief description of what you are referring to as well as the reference number.
      12. Pay attention to spelling and grammar.
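
Several of these guidelines lend themselves to lightweight automated checking during review. The sketch below is a hypothetical illustration; the word limit and patterns are assumptions, not part of any standard requirements-authoring tool, and it only approximates a few of the rules (sentence length, single-requirement sentences, use of 'must', nested conditionals).

```python
import re

MAX_WORDS = 25  # assumed threshold for "keep sentences short"

def check_requirement(text: str) -> list[str]:
    """Return a list of guideline violations found in one requirement sentence."""
    problems = []
    if len(text.split()) > MAX_WORDS:
        problems.append("sentence is too long")
    if re.search(r"\band\b", text, re.IGNORECASE):
        problems.append("may contain more than one requirement ('and')")
    if re.search(r"\bmust\b", text, re.IGNORECASE):
        problems.append("'must' is best avoided; prefer 'shall'")
    if len(re.findall(r"\bif\b", text, re.IGNORECASE)) > 1:
        problems.append("nested conditional clauses")
    return problems

print(check_requirement("The system shall log the user out after 15 minutes of inactivity."))  # []
print(check_requirement("The system must validate the order and email the customer."))         # 2 findings
```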
    • Well-written requirements are SMART:
      • Specific - have only one interpretation and leave no doubt in the reader's mind
      • Measurable - are quantified in a manner that can be verified
      • Attainable - can be achieved using available tools and methods at a definable cost
      • Relevant - are essential to the defined set of requirements being developed, and their removal or deletion would cause a deficiency
      • Traceable - are linked to one another within a specific document and to/between other project documents

      Questions to consider when planning how each requirement will be tested:
      1. Are the requirements verifiable when implemented?
      2. Are the requirements realistic, and how will they be implemented and tested?
      3. Where do I assign testing resources for increased efficiency and reduced risk?
      4. Is the planned testing for the requirement a good trade-off between the requirement's business value and its risk?
    • Requirements management is used to ensure that identified testing requirements are linked to test cases and tracked throughout the project (a traceability sketch follows this list). Requirements can be prioritized as:
      • Necessary - mission critical
      • Important - supports necessary system operations
      • Desirable - provides functional, quality, or usability enhancement
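
The sketch below is a hypothetical illustration of that traceability: each requirement carries one of the priorities above and the IDs of the test cases that verify it, so coverage gaps can be reported. The requirement and test case IDs are invented for the example.

```python
from dataclasses import dataclass, field

@dataclass
class Requirement:
    req_id: str
    text: str
    priority: str                                         # necessary, important, or desirable
    test_cases: list[str] = field(default_factory=list)   # linked test case IDs

requirements = [
    Requirement("REQ-001", "The system shall lock an account after 5 failed logins.",
                "necessary", ["TC-101", "TC-102"]),
    Requirement("REQ-002", "The report writer should export results to CSV.",
                "desirable", []),
]

# A requirement with no linked test cases is a coverage gap to track.
untested = [r.req_id for r in requirements if not r.test_cases]
print("Requirements without test coverage:", untested)  # ['REQ-002']
```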

  3. The components of a Test Strategy help to define the Test Plans that are created for each business area. The Test Strategy document presents global directions from which the Test Plans can then be generated.

    • Objectives - The overall objective this strategy is designed to meet. For a complete system test, this may be a statement that all functional requirements must behave as expected or as documented.

    • Quality Attributes

      Requirement characteristics that are visible to users and may be considered during test planning.

      Availability - Percentage of planned 'up time' during which the system is actually available and fully operational (a worked example follows this list)

      Efficiency - Measure of how well system resources (bandwidth, disk space, CPU, etc.) are utilized

      Flexibility - Indicates how much effort is required to add new capabilities to the product; also known as extensibility, augmentability, expandability, and scalability

      Integrity - Precludes unauthorized access, prevents information loss, protects data privacy, and protects the system from viruses

      Interoperability - Indicates how easily the system can exchange data or services with other systems

      Maintainability - Indicates how easy it is to correct a defect or make a change to a software component

      Portability - Effort required to migrate a piece of software from one operating environment to another

      Reliability - Indicates the probability that the software will execute for a long time without failure. Measures include length of time until a defect is revealed, defect density, and percentage of correctly performed operations.

      Reusability - Indicates the extent to which a software component can be used in systems other than the one in which it was originally developed

      Robustness - Degree to which a system can perform correctly when confronted with invalid data, unexpected operating conditions, or defects in connected hardware or software

      Testability - Ease with which software components or the integrated product can be tested to find defects

      Usability - Also referred to as 'ease of use'; measures the effort required to prepare input, operate the system, and interpret output
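
A few of these attributes reduce to simple arithmetic. The sketch below is a worked example with invented figures, showing how availability (as defined above) and defect density, one common reliability measure, might be computed.

```python
# All figures below are invented for illustration, not measured data.
planned_uptime_hours = 720   # planned 'up time' for the month
actual_uptime_hours = 713    # hours the system was actually available and fully operational

availability = actual_uptime_hours / planned_uptime_hours * 100
print(f"Availability: {availability:.2f}%")  # 99.03%

# Defect density: defects found per thousand lines of code (KLOC).
defects_found = 42
size_kloc = 120
print(f"Defect density: {defects_found / size_kloc:.2f} defects/KLOC")  # 0.35
```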
