Test Strategy & Planning
A comprehensive test strategy ensures systematic coverage of all system functionality. Planning upfront prevents ad-hoc testing that misses critical scenarios.
Why Testing Matters
NetSuite implementations involve complex configurations, customizations, and integrations. Without structured testing, issues surface in production—causing business disruption, data errors, and user frustration.
Defects found in production cost 10-100x more to fix than defects found during testing. A misconfigured approval workflow discovered after go-live means manual intervention on every affected transaction until fixed. Invest in testing upfront.
Testing Levels
A comprehensive test strategy includes multiple testing levels, each with a distinct purpose.
| Level | Focus | Who Tests |
|---|---|---|
| Unit Testing | Individual configurations, scripts, workflows | Implementer/Developer |
| Integration Testing | End-to-end processes, integrations | Implementer/QA |
| System Testing | Complete system behavior | QA Team |
| User Acceptance (UAT) | Business requirements validation | Business Users |
| Performance | Speed, capacity, reliability | Technical Team |
Test Planning Components
A test plan documents your testing approach, scope, resources, and schedule.
Test Plan Contents
- Scope: What will and won't be tested
- Approach: Testing methodology and levels
- Resources: Who will test, their availability
- Environment: Where testing occurs (sandbox, dev)
- Schedule: Testing timeline and milestones
- Entry/Exit Criteria: When testing starts and ends
- Defect Management: How issues are logged and tracked
- Risk Assessment: What could go wrong
Create the test plan during the design phase, not after configuration is complete. Early planning ensures adequate time allocation and helps identify testing requirements that influence design decisions.
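Entry/exit criteria are most useful when they are concrete enough to check mechanically. A minimal sketch, assuming illustrative criteria and field names (this is plain JavaScript, not a NetSuite API):

```javascript
// Hypothetical entry criteria for starting UAT; each item is either met or not.
const uatEntryCriteria = [
  { criterion: "System testing complete with no open critical defects", met: true },
  { criterion: "Sandbox refreshed with a current production copy", met: true },
  { criterion: "UAT test cases reviewed and approved by business users", met: false },
];

// A phase may begin only when every entry criterion is satisfied.
function canEnterPhase(criteria) {
  return criteria.every((c) => c.met);
}

// List what still blocks the phase so the team knows what to chase.
const blockers = uatEntryCriteria
  .filter((c) => !c.met)
  .map((c) => c.criterion);

console.log(canEnterPhase(uatEntryCriteria)); // false
console.log(blockers);
```

Expressing criteria as data rather than prose makes the go/no-go decision a report, not a debate.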
Test Case Development
Test cases document specific scenarios to validate. Well-written test cases are repeatable and produce consistent results.
Test Case Components
- Test Case ID: Unique identifier for tracking
- Title: Brief description of what's being tested
- Preconditions: Setup required before executing
- Test Steps: Exact actions to perform
- Test Data: Specific data values to use
- Expected Results: What should happen
- Actual Results: What did happen (filled during execution)
- Pass/Fail: Test outcome
Map test cases to requirements. Every requirement should have at least one test case validating it. This traceability matrix proves to stakeholders that all requirements were tested and helps prioritize testing when time is limited.
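The traceability check itself is simple enough to automate. A sketch with hypothetical requirement and test-case IDs (not from any real project):

```javascript
// Requirements that must each be covered by at least one test case.
const requirements = ["REQ-001", "REQ-002", "REQ-003"];

// Each test case lists the requirements it validates.
const testCases = [
  { id: "TC-101", title: "Approval routes to manager", covers: ["REQ-001"] },
  { id: "TC-102", title: "Order total matches line sum", covers: ["REQ-001", "REQ-002"] },
];

// A requirement is covered when some test case references it;
// anything left over is a gap in the traceability matrix.
function untestedRequirements(reqs, cases) {
  const covered = new Set(cases.flatMap((tc) => tc.covers));
  return reqs.filter((r) => !covered.has(r));
}

console.log(untestedRequirements(requirements, testCases)); // ["REQ-003"]
```

Running a check like this before each test cycle surfaces uncovered requirements while there is still time to write cases for them.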
Test Environment Strategy
NetSuite provides multiple environment types for testing. Use them appropriately.
| Environment | Purpose | Data |
|---|---|---|
| Development | Building and unit testing | Minimal test data |
| Sandbox | Integration and UAT | Production copy or test data |
| Release Preview | Testing NetSuite upgrades | Production snapshot |
| Production | Live operations only | Real data (never test here) |
Sandbox refreshes overwrite all data and customizations. Plan refreshes carefully—never refresh during active testing. Document the refresh schedule and notify all testers before refreshing.
Defect Management
When tests fail, each defect needs systematic tracking from discovery through resolution.
Defect Lifecycle
A typical lifecycle moves through defined states: New → Assigned → In Progress → Fixed → Ready for Retest → Closed (or Reopened if the retest fails). Record each state change so it is always clear who owns the defect and what happens next.
Defect Severity Levels
- Critical: System unusable, no workaround—blocks go-live
- High: Major functionality broken, workaround exists
- Medium: Feature doesn't work as expected, workaround available
- Low: Minor issue, cosmetic, or enhancement request
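Severity drives work order and the go-live decision: fix most-severe first, and treat any open Critical as a release blocker. A minimal sketch using the levels above, with illustrative defect records:

```javascript
// Rank mirrors the severity levels defined above; lower rank = fix first.
const SEVERITY_RANK = { Critical: 0, High: 1, Medium: 2, Low: 3 };

// Hypothetical open defects from a test cycle.
const openDefects = [
  { id: "DEF-12", severity: "Medium", summary: "Saved search column mislabeled" },
  { id: "DEF-07", severity: "Critical", summary: "Approval workflow never fires" },
  { id: "DEF-09", severity: "High", summary: "PDF template drops line discounts" },
];

// Order the queue most-severe first; any open Critical blocks go-live.
function triage(defects) {
  const ordered = [...defects].sort(
    (a, b) => SEVERITY_RANK[a.severity] - SEVERITY_RANK[b.severity]
  );
  return { ordered, blocksGoLive: ordered.some((d) => d.severity === "Critical") };
}

const { ordered, blocksGoLive } = triage(openDefects);
console.log(ordered.map((d) => d.id)); // ["DEF-07", "DEF-09", "DEF-12"]
console.log(blocksGoLive); // true
```

Keeping the go-live rule in code (not just in a slide) makes the Critical-defect check part of every status report.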