  1. Test Planning Introduction
    • Purpose - Describe why the test plan was developed and what its objectives are. This may include documenting test requirements, defining testing strategies, identifying resources, and estimating schedules and project deliverables.
    • Background - Explain any events that caused the test plan to be developed. This can include implementing improved processes, or the addition of new environments or functionality.
    • Technical Architecture - Diagram the components that make up the system under test (see the sketch after this list). Include data storage and transfer connections, and describe the purpose each component serves, including how it is updated. Document the layers, such as presentation/interface, database, report writer, etc. A higher-level diagram showing how the system under test fits into a larger automation picture can also be included if available.
    • Specifications - List all required hardware and software, including vendors and versions.
    • Scope - Briefly describe the resources that the plan requires, areas of responsibility, stages, and potential risks.
    • Project Information - Identify all the information that is available in relation to this project. User documentation, the project plan, product specifications, training materials, and executive overview materials are examples of project information.
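
    A minimal sketch, assuming a hypothetical three-tier web application as the system under test; every component name, purpose, and connection below is an illustrative assumption, not taken from any real project. Capturing the architecture as data makes it easy to cross-check that every documented connection points at a documented component.

      # Hypothetical system-under-test components, grouped by layer.
      architecture = {
          "presentation": {
              "web_ui": {"purpose": "Browser front end for order-entry staff",
                         "updated_by": "weekly release pipeline",
                         "connects_to": ["app_server"]},
          },
          "application": {
              "app_server": {"purpose": "Business rules and validation",
                             "updated_by": "weekly release pipeline",
                             "connects_to": ["order_db", "report_writer"]},
          },
          "data": {
              "order_db": {"purpose": "Persistent storage for orders and customers",
                           "updated_by": "nightly schema migration job",
                           "connects_to": []},
              "report_writer": {"purpose": "Generates management reports from order_db",
                                "updated_by": "quarterly vendor upgrade",
                                "connects_to": ["order_db"]},
          },
      }

      # Every data transfer connection must point at a documented component.
      known = {name for layer in architecture.values() for name in layer}
      for layer in architecture.values():
          for name, component in layer.items():
              for target in component["connects_to"]:
                  assert target in known, f"{name} connects to undocumented component {target}"
      print("Documented components:", sorted(known))
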
  2. Requirements

    This section of the test plan lists all requirements to be tested. Any requirement not listed is outside of the scope of the test plan.

    • Business Requirements - possible questions to ask:
      • What does the business do today?
      • How does it do this?
      • Who does it?
      • When and where does it happen?
      • Why is it needed?
    • Functional Requirements - possible questions to ask:
      • What are the functions?
      • What are the qualitative descriptions for the functions?
      • What is the context and relationship?
      • Who performs the activities?
    • Technical Requirements - possible questions to ask:
      • Are there any constraints that apply?
      • Are there any quality and performance related requirements?

    Writing Good Requirements
      1. Keep sentences short.
      2. Never express more than one requirement in a single sentence; avoid the use of 'and' in sentences (a simple automated check of rules 1, 2, and 7 is sketched after this list).
      3. Avoid the use of jargon, abbreviations and acronyms unless you are completely confident that they will be understood by all readers of your documentation.
      4. Keep paragraphs short. No paragraph should be made up of more than 3-4 sentences.
      5. Use lists and tables wherever possible to present information sequences.
      6. Use terminology consistently. Make a glossary or dictionary to define terms.
      7. Use words such as 'shall', 'should', 'will', and 'must' in a consistent way with the following meanings:
        • shall - indicates that the requirement is mandatory
        • should - indicates a feature that is desirable but not mandatory
        • will - indicates something that will be externally provided
        • must - is best avoided. If used, it should be a synonym for 'shall'.
      8. Do not express requirements using nested conditional clauses (if 'x' then if 'y' then 'R1a' else if 'z' then 'R1b'...).
      9. Use the active rather than the passive voice, particularly when describing actions taken by people or the system.
      10. Express complex relationships in diagrams accompanied by a natural-language description. Diagrams are much more effective, but can be a challenge for many to create and understand.
      11. Never use anonymous references. When referring to other requirements, tables, or diagrams, give a brief description of what you are referring to as well as the reference number.
      12. Pay attention to spelling and grammar.
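
    A minimal sketch (not part of the original guidelines) of how rules 1, 2, and 7 above might be checked automatically; the 25-word limit and the keyword list are assumptions chosen for illustration.

      import re

      MAX_WORDS_PER_SENTENCE = 25                              # rule 1 (assumed limit)
      IMPERATIVE_WORDS = ("shall", "should", "will", "must")   # rule 7 keywords

      def review_requirement(req_id: str, text: str) -> list[str]:
          """Return warnings for a single requirement statement."""
          warnings = []
          sentences = [s.strip() for s in re.split(r"[.!?]+", text) if s.strip()]
          for sentence in sentences:
              if len(sentence.split()) > MAX_WORDS_PER_SENTENCE:
                  warnings.append(f"{req_id}: sentence longer than {MAX_WORDS_PER_SENTENCE} words (rule 1)")
              if " and " in f" {sentence.lower()} ":
                  warnings.append(f"{req_id}: 'and' may hide a second requirement (rule 2)")
          used = [w for w in IMPERATIVE_WORDS if re.search(rf"\b{w}\b", text.lower())]
          if len(used) > 1:
              warnings.append(f"{req_id}: mixes {', '.join(used)}; pick one keyword (rule 7)")
          if "must" in used:
              warnings.append(f"{req_id}: prefer 'shall' over 'must' (rule 7)")
          return warnings

      # Example with a hypothetical requirement that breaks rules 2 and 7.
      for warning in review_requirement(
              "REQ-001",
              "The system shall log each failed login attempt and must lock the account after five failures."):
          print(warning)
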
    SMART Requirements

      • Specific - Have only one interpretation and do not leave doubt in the reader's mind.
      • Measurable - Are quantified in a manner that can be verified.
      • Attainable - Can be achieved using available tools and methods at a definable cost.
      • Relevant - Are essential to the defined set of requirements being developed and will cause a deficiency if they are removed or deleted.
      • Traceable - Are linked to one another within a specific document and to/between other project documents.

    Requirements Prioritization Scale example

      Requirements management is used to ensure that identified testing requirements are linked to test cases and tracked throughout the project (a minimal data sketch follows the scale below).

      • Necessary - Mission critical
      • Important - Supports necessary system operations
      • Desirable - Provides functional, quality, or usability enhancement
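
    A minimal sketch, using hypothetical requirement and test-case identifiers, of how prioritized requirements can be linked to test cases so coverage can be tracked throughout the project:

      from dataclasses import dataclass, field

      @dataclass
      class Requirement:
          req_id: str
          description: str
          priority: str                                        # "Necessary", "Important", or "Desirable"
          test_cases: list[str] = field(default_factory=list)  # linked test-case IDs

      # Illustrative entries only; IDs and descriptions are not from the original document.
      requirements = [
          Requirement("REQ-001", "The system shall lock an account after five failed logins",
                      "Necessary", ["TC-101", "TC-102"]),
          Requirement("REQ-002", "The system should export reports to CSV", "Desirable"),
      ]

      # Any requirement without a linked test case is a traceability gap.
      untested = [r.req_id for r in requirements if not r.test_cases]
      print("Requirements with no linked test cases:", untested)
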

  3. Test Strategy

    The components of a Test Strategy help to define the Test Plans that are created for each business area. The Test Strategy document presents global directions from which the Test Plans can then be generated.

    • Purpose and Test Objective - The overall objective this strategy is designed to meet. For a complete system test, this may be a statement that all functional requirements must behave as expected or as documented.
    • Quality Standard for Release
    • Quality Attributes - Requirement characteristics that are visible to users and may be considered during test planning (a worked example for two of the measurable attributes follows this list):
      • Availability - Percentage of planned 'up time' during which the system is actually available and fully operational
      • Efficiency - Measure of how well system resources (bandwidth, disk space, CPU, etc.) are utilized
      • Flexibility - Indicates how much effort is required to add new capabilities to the product; also known as extensibility, augmentability, expandability, and scalability
      • Integrity - Precluding unauthorized access, preventing information loss, protecting data privacy, and protecting the system from viruses
      • Interoperability - Indicates how easily the system can exchange data or services with other systems
      • Maintainability - Indicates how easy it is to correct or make a change to a software component
      • Portability - Effort required to migrate a piece of software from one operating environment to another
      • Reliability - Indicates the probability that the software will execute for a long time without failure; measures include length of time until a defect is revealed, defect density, and percentage of correctly performed operations
      • Re-usability - Indicates the extent to which a software component can be used in systems other than the one in which it was originally developed
      • Robustness - Degree to which a system can perform correctly when confronted with invalid data, unexpected operating conditions, or defects in connected hardware or software
      • Testability - Ease with which software components or an integrated product can be tested to find defects
      • Usability - Also referred to as 'ease of use'; measures the effort required to prepare input, operate the system, and interpret output
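
    A minimal sketch, using made-up numbers, of how two of the quantifiable attributes above (availability, plus the defect-density and correct-operations measures listed under reliability) might be reported for a release:

      # Illustrative figures only; none of these values come from the original document.
      planned_uptime_hours = 720.0      # planned 'up time' for the month
      actual_uptime_hours = 713.5       # time the system was actually available and operational
      availability_pct = 100.0 * actual_uptime_hours / planned_uptime_hours
      print(f"Availability: {availability_pct:.2f}% of planned up time")          # ~99.10%

      defects_found = 18
      size_kloc = 42.0                  # thousands of lines of code under test
      print(f"Defect density: {defects_found / size_kloc:.2f} defects per KLOC")  # ~0.43

      operations_attempted = 12_500
      operations_correct = 12_460
      print(f"Correctly performed operations: {100.0 * operations_correct / operations_attempted:.2f}%")  # 99.68%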