ICS 125: QA Plans
QA Plans > Who, What, and Why?
- Who: Usually, the lead QA engineer authors the test plan, working
  with the project manager, the development team, and the rest of the
  QA team.
- What: A "QA Plan" is a plan for quality assurance.
- Sets goals
- Outlines a strategy
- Identifies actions to be done
- Identifies resources needed
- Schedules activities, allocates budget
- It is basically a subset of the overall project plan
- The test suite(s) and results are separate from the plan
- Why: QA plans are needed to coordinate development activities to
  achieve quality. There are many possible quality goals and many ways
  to pursue them, so a written plan is needed to keep everyone working
  toward the same goals.
QA Plans > Contents
- Identifying information: Project, release, etc.
- Introduction: Background, summary
- Quality goals
- List and describe quality goals
- Reference materials: Links to requirements, design,
standards.
- Resources: People, machines, and tools to be used in testing.
- Strategy:
- Choose and describe QA activities to be done
- Try to assure that quality goals are met
- Matrix of goals vs. activities
- Risks: What quality goals will not be assured
- Plan of action:
- What will be done, by whom, and when
- Links to checklists, test suites, etc.
QA Plans > QA Activities > Reviews
- Basically, perform reviews that focus on specific artifacts or quality goals:
- Requirements review
- Prototype review
- Design review
- Implementation review
- Documentation review
- UI review
- Security review
- Performance/Scalability review
- Daily or automated reviews:
- All changes to a release branch must be peer-reviewed
- All developer work must compile before check-in
- All developers must subscribe to cvs@PROJECT mailing list
- Nightly run of automated style checker
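- Example: a minimal sketch of the "must compile before check-in"
  gate above, assuming a Python codebase; the src/ directory name and
  the hook wiring that calls this script are illustrative, not a
  specific tool's API.

      import pathlib
      import py_compile
      import sys

      def check_compiles(root="src"):
          """Try to compile every source file; count the ones that fail."""
          failures = 0
          for path in pathlib.Path(root).rglob("*.py"):
              try:
                  # doraise=True turns syntax errors into exceptions
                  py_compile.compile(str(path), doraise=True)
              except py_compile.PyCompileError as err:
                  print(f"FAIL {path}: {err.msg}")
                  failures += 1
          return failures

      if __name__ == "__main__":
          # A non-zero exit status tells the version-control hook
          # to reject the check-in.
          sys.exit(1 if check_compiles() else 0)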
QA Plans > QA Activities > Testing Approaches
- Exploratory testing: Just get the product and poke around.
  Useful when the QA team needs to gain familiarity before forming a
  detailed plan, and for verifying that a product is ready for testing
  by the QA team.
- Ad hoc testing: Manually exercise the product just to see if you
  can break it. Ideas from test case design can be applied "on the fly."
- Structured testing: More systematic manual testing. Testers
work through a detailed manual test suite.
- Automated testing:
- Record-and-playback: test through the GUI or web interface with a simulated user
- Programmatic testing: write test code to run against the product
  code (see the test sketch after this list)
- Black box / specification-based testing: Test only the visible
UI or specified API, do not make use of any knowledge of the
implementation. Try to cover every specified requirement.
- White box / implementation-based testing: Design the test suite
using knowledge of the product implementation. Try to cover every
part of the implementation.
- Regression testing: Verify that solved problems remain solved and
  that fixes do not introduce unintended changes. Usually automated.
- Smoke testing / Quick tests / Nightly tests: Automated tests of
  selected features that can be run often, e.g., by developers
  before each commit.
- User acceptance testing: Final, high-level test of the entire
system to see if it is acceptable to users. Think in open-ended,
real-world terms.
- Beta testing: Usually ad hoc testing by a limited set of outside
  users. Beta test programs must be actively managed to collect
  enough results.
- Early access: Actual usage by a broader set of outside users
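- Example: a minimal sketch of programmatic, specification-based
  testing with Python's unittest; discount() stands in for real
  product code, and the defect pinned by the regression test is
  illustrative.

      import unittest

      def discount(price, percent):
          """Stand-in for the product code under test."""
          if percent < 0 or percent > 100:
              raise ValueError("percent must be between 0 and 100")
          return round(price * (1 - percent / 100), 2)

      class DiscountTest(unittest.TestCase):
          def test_correct_result_for_valid_input(self):
              self.assertEqual(discount(200.0, 25), 150.0)

          def test_rejects_invalid_input(self):
              # Robustness: invalid input should fail loudly, not silently.
              with self.assertRaises(ValueError):
                  discount(200.0, 150)

          def test_previously_fixed_rounding_defect_stays_fixed(self):
              # Regression test: re-run after every change (e.g., nightly).
              self.assertEqual(discount(19.99, 0), 19.99)

      if __name__ == "__main__":
          unittest.main()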
QA Plans > QA Activities > Testing scopes
- Testing at smaller scopes is useful because the requirements are
  more narrowly scoped and because any observed failure must be due to
  a defect in that part of the system.
- Unit testing: Test one function, method, or class at a time.
  Try to isolate that element from the rest of the product (see the
  sketch after this list). Each test is very simple. Assigning blame
  is obvious. Can be started very early in development.
- Integration testing: Test specific combinations of components.
Other components may need to be replaced with stubs or drivers.
Test cases focus on interactions between components. Can be started
during development.
- System testing: Systematic testing of the entire system. Test
  cases get larger and more complex. Harder to assign blame. Can
  only be started after the entire system is implemented.
- Staging: System testing in something very close to the system's
intended operating environment. Usually done with actual user
data. Can only be done just before deployment.
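- Example: a minimal sketch of a unit-level test that isolates one
  class from its collaborators using a stub; OrderService and
  PaymentGateway-style names are illustrative. An integration test
  would wire in the real collaborator and focus on the interaction.

      import unittest
      from unittest.mock import Mock

      class OrderService:
          """Stand-in for product code: delegates charging to a gateway."""
          def __init__(self, gateway):
              self.gateway = gateway

          def place_order(self, amount):
              if amount <= 0:
                  raise ValueError("amount must be positive")
              return self.gateway.charge(amount)

      class OrderServiceUnitTest(unittest.TestCase):
          def test_charges_gateway_for_valid_order(self):
              stub_gateway = Mock()
              stub_gateway.charge.return_value = "receipt-42"
              service = OrderService(stub_gateway)
              self.assertEqual(service.place_order(100), "receipt-42")
              # Only OrderService logic runs, so blame is easy to assign.
              stub_gateway.charge.assert_called_once_with(100)

      if __name__ == "__main__":
          unittest.main()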
QA Plans > QA Activities > Testing specific quality goals
- Remember that the goal of QA is to assure that we meet our
  quality goals. A good way to do that is to design testing activities
  that each address a specific quality goal.
- Functional testing: Verify that the system produces the correct
result. Can be done with any strategy, at any scope.
- Correctness: gives the right result for valid input
- Robustness: gracefully handles invalid input
- Accuracy: correct results are mathematically precise
- Compatibility: file formats, network protocols, browsers,
operating system versions, etc.
- Performance and scalability testing: Verify that the system will
  perform well under heavy usage. Usually done with automated,
  system-level tests (see the load-test sketch after this list).
- Load testing: time to complete operations under heavy usage load
- Stress testing: gracefully handles excessive usage load
- Volume testing: performance with very large datasets
- Longevity testing: servers should continue to satisfy long sequences of requests
- Usability testing: Verify that the system will be usable by
humans.
- Understandability: users can understand how to use the system
- Learnability: the UI gives clues to explain unfamiliar items
- Efficiency of use: users can get their work done without too many steps
- UI safety: common human errors have limited negative impact
- Security testing
- Physical, network, operating system, application
- Encrypted communications
- Authentication: you are who you say you are
- Authorization: limited access, limited actions
- Malicious inputs
- Denial of service
- Operability testing: Verify that use cases for the system administrator work.
- Install/uninstall
- Upgrade software, migrate data to new formats
- Recovery for system crashes or other errors
- Auditability: system keeps records of events for later review
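- Example: a minimal load-test sketch that times many concurrent
  calls to one operation and reports a simple latency summary;
  handle_request() and the request/worker counts are illustrative
  stand-ins, not a real benchmark harness.

      import concurrent.futures
      import statistics
      import time

      def handle_request(i):
          """Stand-in for the operation under load (e.g., one web request)."""
          time.sleep(0.01)

      def timed_call(i):
          start = time.perf_counter()
          handle_request(i)
          return time.perf_counter() - start

      def run_load(requests=200, workers=20):
          # Issue the requests from a pool of workers to simulate heavy usage.
          with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
              latencies = list(pool.map(timed_call, range(requests)))
          print(f"median latency: {statistics.median(latencies):.4f}s")
          print(f"worst latency:  {max(latencies):.4f}s")

      if __name__ == "__main__":
          run_load()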
Related templates: sample use case template, sample test plan template, example project plan template