In most software-development organizations, the testing program functions as the final "quality gate" for an application, allowing or preventing the move from the comfort of the software-engineering environment into the real world. With this role comes a large responsibility: The success of an application, and possibly of the organization, can rest on the quality of the software product.
A multitude of small tasks must be performed and managed by the testing team, so many, in fact, that it is tempting to focus purely on the mechanics of testing a software application and pay little attention to the surrounding tasks required of a testing program. Issues such as the acquisition of proper test data, testability of the application's requirements and architecture, appropriate test-procedure standards and documentation, and hardware and facilities are often addressed very late, if at all, in a project's life cycle. For projects of any significant size, test scripts and tools alone will not suffice, a fact to which most experienced software testers will attest.


1. Requirements Phase
  • Item 1: Involve Testers from the Beginning
  • Item 2: Verify the Requirements
  • Item 3: Design Test Procedures As Soon As Requirements Are Available
  • Item 4: Ensure That Requirement Changes Are Communicated
  • Item 5: Beware of Developing and Testing Based on an Existing System

2. Test Planning
  • Item 6: Understand the Task At Hand and the Related Testing Goal
  • Item 7: Consider the Risks
  • Item 8: Base Testing Efforts on a Prioritized Feature Schedule
  • Item 9: Keep Software Issues in Mind
  • Item 10: Acquire Effective Test Data
  • Item 11: Plan the Test Environment
  • Item 12: Estimate Test Preparation and Execution Time

3. The Testing Team
  • Item 13: Define Roles and Responsibilities
  • Item 14: Require a Mixture of Testing Skills, Subject-Matter Expertise, and Experience
  • Item 15: Evaluate the Tester's Effectiveness

4. The System Architecture
  • Item 16: Understand the Architecture and Underlying Components
  • Item 17: Verify That the System Supports Testability
  • Item 18: Use Logging to Increase System Testability
  • Item 19: Verify That the System Supports Debug and Release Execution Modes

5. Test Design and Documentation
  • Item 20: Divide and Conquer
  • Item 21: Mandate the Use of a Test-Procedure Template and Other Test-Design Standards
  • Item 22: Derive Effective Test Cases from Requirements
  • Item 23: Treat Test Procedures As "Living" Documents
  • Item 24: Utilize System Design and Prototypes
  • Item 25: Use Proven Testing Techniques When Designing Test-Case Scenarios
  • Item 26: Avoid Including Constraints and Detailed Data Elements within Test Procedures
  • Item 27: Apply Exploratory Testing

6. Unit Testing
  • Item 28: Structure the Development Approach to Support Effective Unit Testing
  • Item 29: Develop Unit Tests in Parallel or Before the Implementation
  • Item 30: Make Unit-Test Execution Part of the Build Process

7. Automated Testing Tools
  • Item 31: Know the Different Types of Testing-Support Tools
  • Item 32: Consider Building a Tool Instead of Buying One
  • Item 33: Know the Impact of Automated Tools on the Testing Effort
  • Item 34: Focus on the Needs of Your Organization
  • Item 35: Test the Tools on an Application Prototype

8. Automated Testing: Selected Best Practices
  • Item 36: Do Not Rely Solely on Capture/Playback
  • Item 37: Develop a Test Harness When Necessary
  • Item 38: Use Proven Test-Script Development Techniques
  • Item 39: Automate Regression Tests When Feasible
  • Item 40: Implement Automated Builds and Smoke Tests

9. Nonfunctional Testing
  • Item 41: Do Not Make Nonfunctional Testing an Afterthought
  • Item 42: Conduct Performance Testing with Production-Sized Databases
  • Item 43: Tailor Usability Tests to the Intended Audience
  • Item 44: Consider All Aspects of Security, for Specific Requirements and System-Wide
  • Item 45: Investigate the System's Implementation To Plan for Concurrency Tests
  • Item 46: Set Up an Efficient Environment for Compatibility Testing

10. Managing Test Execution
  • Item 47: Clearly Define the Beginning and End of the Test-Execution Cycle
  • Item 48: Isolate the Test Environment from the Development Environment
  • Item 49: Implement a Defect-Tracking Life Cycle
  • Item 50: Track the Execution of the Testing Program
