Phase 12 of 14

Testing

Testing validates that your NetSuite implementation meets requirements and works as expected. A structured testing approach catches issues before go-live, when they're cheaper and easier to fix.

Chapter 12.1

Test Strategy & Planning

A comprehensive test strategy ensures systematic coverage of all system functionality. Planning upfront prevents ad-hoc testing that misses critical scenarios.

Why Testing Matters

NetSuite implementations involve complex configurations, customizations, and integrations. Without structured testing, issues surface in production—causing business disruption, data errors, and user frustration.

🔒 The Cost of Defects

Defects found in production cost 10-100x more to fix than defects found during testing. A misconfigured approval workflow discovered after go-live means manual intervention on every affected transaction until fixed. Invest in testing upfront.

Testing Levels

A comprehensive test strategy includes multiple testing levels, each with a distinct purpose.

Unit Testing → Integration Testing → System Testing → UAT → Go-Live

| Level | Focus | Who Tests |
| --- | --- | --- |
| Unit Testing | Individual configurations, scripts, workflows | Implementer/Developer |
| Integration Testing | End-to-end processes, integrations | Implementer/QA |
| System Testing | Complete system behavior | QA Team |
| User Acceptance | Business requirements validation | Business Users |
| Performance | Speed, capacity, reliability | Technical Team |

Test Planning Components

A test plan documents your testing approach, scope, resources, and schedule.

Test Plan Contents

  • Scope: What will and won't be tested
  • Approach: Testing methodology and levels
  • Resources: Who will test, their availability
  • Environment: Where testing occurs (sandbox, dev)
  • Schedule: Testing timeline and milestones
  • Entry/Exit Criteria: When testing starts and ends
  • Defect Management: How issues are logged and tracked
  • Risk Assessment: What could go wrong
✓ Best Practice

Create the test plan during the design phase, not after configuration is complete. Early planning ensures adequate time allocation and helps identify testing requirements that influence design decisions.

Test Case Development

Test cases document specific scenarios to validate. Well-written test cases are repeatable and produce consistent results.

Test Case Components

  • Test Case ID: Unique identifier for tracking
  • Title: Brief description of what's being tested
  • Preconditions: Setup required before executing
  • Test Steps: Exact actions to perform
  • Test Data: Specific data values to use
  • Expected Results: What should happen
  • Actual Results: What did happen (filled during execution)
  • Pass/Fail: Test outcome
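
These components map naturally onto a structured template you can keep in a spreadsheet or tracking tool. A minimal illustrative sketch in JavaScript — the IDs, field names, and values are hypothetical, not a NetSuite API:

```javascript
// Illustrative test case template -- every name and value here is a
// made-up example, shown only to make the components above concrete.
const testCase = {
  id: 'TC-O2C-001',                     // Test Case ID: unique identifier
  title: 'Sales order honors customer price level',
  requirement: 'REQ-SALES-014',         // traceability back to requirements
  preconditions: 'Customer C1001 exists with price level "Wholesale"',
  steps: [
    'Create a sales order for customer C1001',
    'Add item WIDGET-01, quantity 10',
    'Save the order'
  ],
  testData: { customer: 'C1001', item: 'WIDGET-01', quantity: 10 },
  expectedResult: 'Line rate equals the Wholesale price for WIDGET-01',
  actualResult: null,                   // filled in during execution
  status: null                          // 'Pass' or 'Fail'
};
```
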
💡 Consultant Insight

Map test cases to requirements. Every requirement should have at least one test case validating it. This traceability matrix proves to stakeholders that all requirements were tested and helps prioritize testing when time is limited.

Test Environment Strategy

NetSuite provides multiple environment types for testing. Use them appropriately.

| Environment | Purpose | Data |
| --- | --- | --- |
| Development | Building and unit testing | Minimal test data |
| Sandbox | Integration and UAT | Production copy or test data |
| Release Preview | Testing NetSuite upgrades | Production snapshot |
| Production | Live operations only | Real data (never test here) |
⚠️ Sandbox Refresh Timing

Sandbox refreshes overwrite all data and customizations. Plan refreshes carefully—never refresh during active testing. Document the refresh schedule and notify all testers before refreshing.

Defect Management

When tests fail, defects need systematic tracking through resolution.

Defect Lifecycle

New → Assigned → In Progress → Fixed → Verified → Closed

Defect Severity Levels

  • Critical: System unusable, no workaround—blocks go-live
  • High: Major functionality broken, workaround exists
  • Medium: Feature doesn't work as expected, workaround available
  • Low: Minor issue, cosmetic, or enhancement request

Test Strategy Checklist

Test Planning Checklist
  • Define testing scope and objectives
  • Identify testing resources and availability
  • Establish test environment strategy
  • Create test case templates
  • Define entry and exit criteria
  • Set up defect tracking system
  • Create requirements traceability matrix
  • Schedule testing phases in project plan

Chapter 12.2

Unit Testing

Unit testing validates individual components in isolation. Each configuration, script, and workflow should be tested independently before combining into larger processes.

What to Unit Test

Unit testing covers the building blocks of your NetSuite implementation.

Configuration Unit Tests

  • Custom Fields: Field appears, accepts valid data, rejects invalid
  • Custom Lists: Values display correctly, can be selected
  • Custom Records: Records can be created, edited, deleted
  • Saved Searches: Returns expected results
  • Forms: Layout correct, fields display as configured
  • Roles: Permissions work as intended

Script Unit Tests

  • Client Scripts: Field validation and page manipulation work as expected
  • User Event Scripts: Before/after load/submit logic executes
  • Scheduled Scripts: Batch processing completes correctly
  • RESTlets/Suitelets: Endpoints respond correctly
  • Map/Reduce: Data processing works at scale

Workflow Unit Tests

  • Workflow Triggers: Workflow initiates when conditions are met
  • State Transitions: Moves through states correctly
  • Actions: Each action executes as expected
  • Conditions: Branching logic works

Unit Test Approach

Follow a systematic approach to unit testing each component.

1. Identify Test Scenarios
   List all scenarios: happy path (expected behavior), edge cases, error conditions.

2. Prepare Test Data
   Create test records needed to exercise the component. Document data dependencies.

3. Execute Tests
   Run through each scenario, documenting actual results.

4. Log Defects
   Create defect records for any failures with clear reproduction steps.

5. Retest Fixes
   After defects are fixed, retest to verify the fix works.

Script Testing Tools

NetSuite provides tools to help test scripts during development.

Script Debugger

  • Set breakpoints in SuiteScript code
  • Step through execution line by line
  • Inspect variable values at runtime
  • Debug in sandbox or development account

Execution Log

  • View log.debug(), log.error() output
  • Check for script errors and stack traces
  • Monitor governance usage
  • Track script execution history
✓ Best Practice

Add comprehensive logging during development, then reduce logging verbosity before go-live. Debug-level logging in production consumes governance and fills logs. Keep error logging active for troubleshooting.
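
One way to make verbosity adjustable without redeploying code is to gate debug logging behind a script parameter. A minimal sketch of a SuiteScript 2.1 user event script, assuming a hypothetical checkbox parameter named custscript_verbose_logging defined on the script record:

```javascript
/**
 * @NApiVersion 2.1
 * @NScriptType UserEventScript
 */
define(['N/log', 'N/runtime'], function (log, runtime) {

  // Hypothetical checkbox script parameter -- toggle it on the script
  // deployment to raise or lower verbosity without a code change.
  function isVerbose() {
    return runtime.getCurrentScript()
      .getParameter({ name: 'custscript_verbose_logging' }) === true;
  }

  function beforeSubmit(context) {
    if (isVerbose()) {
      log.debug({ title: 'beforeSubmit', details: 'type=' + context.type });
    }
    try {
      // ... business logic ...
    } catch (e) {
      // Error logging stays active in production for troubleshooting.
      log.error({ title: 'beforeSubmit failed', details: e.message });
      throw e;
    }
  }

  return { beforeSubmit: beforeSubmit };
});
```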

Common Unit Test Scenarios

Custom Field Tests

| Scenario | Test Action | Expected Result |
| --- | --- | --- |
| Field visibility | View record form | Field appears in correct location |
| Valid data entry | Enter valid value, save | Value saved successfully |
| Invalid data | Enter invalid value, save | Appropriate error message |
| Mandatory enforcement | Leave blank, save | Error requires entry |
| Default value | Create new record | Field shows default |

Workflow Tests

| Scenario | Test Action | Expected Result |
| --- | --- | --- |
| Trigger conditions | Create record meeting criteria | Workflow initiates |
| Non-trigger | Create record NOT meeting criteria | Workflow does not initiate |
| State transition | Complete transition conditions | Moves to next state |
| Action execution | Trigger workflow action | Action completes (email sent, field set) |

Unit Testing Checklist

Unit Test Execution
  • All custom fields tested for visibility and validation
  • All custom records tested for CRUD operations
  • All saved searches return expected results
  • All scripts tested with debugger/logging
  • All workflows tested for trigger and state logic
  • Role permissions validated
  • Form layouts verified
  • All unit test defects logged and tracked

Chapter 12.3

Integration Testing

Integration testing validates that components work together correctly. This includes end-to-end business processes and external system integrations.

Process Integration Testing

Business processes span multiple transactions and records. Integration testing validates the complete flow.

Order-to-Cash Flow

Quote → Sales Order → Fulfillment → Invoice → Payment

Test Points for O2C

  • Quote → SO: Convert quote, verify data carries forward
  • SO → Fulfillment: Fulfill order, inventory decrements correctly
  • Fulfillment → Invoice: Invoice generates from fulfillment
  • Invoice → Payment: Payment applies, balance updates
  • GL Impact: All entries post to correct accounts

Procure-to-Pay Flow

Requisition → PO → Receipt → Bill → Payment

💡 Consultant Insight

Test complete business cycles, not just individual steps. Create a sales order, fulfill it, invoice, receive payment, then verify the entire GL is balanced. This catches issues that only appear when transactions interact.

External Integration Testing

When NetSuite integrates with external systems, dedicated integration testing is critical.

Integration Test Areas

  • Authentication: API credentials work, tokens refresh
  • Connectivity: Network connection stable, firewall rules allow traffic
  • Data Mapping: Fields map correctly between systems
  • Data Transformation: Values convert correctly (dates, currencies)
  • Error Handling: Failures handled gracefully, errors logged
  • Retry Logic: Transient failures retry appropriately
  • Duplicate Prevention: Same data doesn't create multiple records
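
For the duplicate-prevention point above, a common pattern is to stamp each inbound record with the source system's ID and search for it before creating. A minimal RESTlet sketch, assuming a hypothetical custom body field custbody_ext_order_id and a payload shape of your own design:

```javascript
/**
 * @NApiVersion 2.1
 * @NScriptType Restlet
 */
define(['N/search', 'N/record'], function (search, record) {

  // Hypothetical custom body field holding the external system's order ID.
  var EXT_ID_FIELD = 'custbody_ext_order_id';

  function post(payload) {
    // Look for an existing sales order already carrying this external ID.
    var existing = search.create({
      type: search.Type.SALES_ORDER,
      filters: [[EXT_ID_FIELD, 'is', payload.externalOrderId]],
      columns: ['internalid']
    }).run().getRange({ start: 0, end: 1 });

    if (existing.length > 0) {
      // Same payload replayed -- return the existing record instead of
      // creating a duplicate.
      return { id: existing[0].id, duplicate: true };
    }

    var so = record.create({ type: record.Type.SALES_ORDER, isDynamic: true });
    so.setValue({ fieldId: 'entity', value: payload.customerId });
    so.setValue({ fieldId: EXT_ID_FIELD, value: payload.externalOrderId });
    // ... set line items from the payload ...
    return { id: so.save(), duplicate: false };
  }

  return { post: post };
});
```
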
🤔 Decision: Integration Test Approach
  • Mock External System: Simulate external system responses. Good for early testing when the external system is unavailable; doesn't test the actual integration.
  • Test Environment: Connect to the external system's test/sandbox instance. Realistic testing, but requires coordination with external teams.
  • Production (Careful): Test against the production external system with test data. Use only when no test environment exists, and with extreme caution.

Common Integration Scenarios

E-commerce Integration

| Test Scenario | Validation |
| --- | --- |
| New web order | Order creates in NetSuite with correct data |
| Customer sync | New customers create, updates sync |
| Inventory sync | Available quantity accurate on web |
| Fulfillment update | Tracking info flows to web customer |
| Return/refund | RMA flows correctly between systems |

CRM Integration

| Test Scenario | Validation |
| --- | --- |
| Lead/opportunity sync | CRM opportunities appear in NetSuite |
| Won opportunity | Triggers customer/SO creation |
| Contact updates | Changes sync bidirectionally |
| Invoice data | Invoice info available in CRM |

Testing Data Considerations

Integration testing requires realistic test data that exercises all scenarios.

Test Data Requirements

  • Representative Volume: Enough records to stress integrations
  • Edge Cases: Special characters, long text, unusual values
  • Error Conditions: Invalid data to test error handling
  • Date Ranges: Historical and future dates
  • Multi-Currency: Transactions in different currencies
  • Multi-Subsidiary: Data across subsidiaries (if applicable)
⚠️ Data Privacy

Never use real customer data in test environments without proper anonymization. Mask PII (names, emails, SSNs) in test data. Ensure test environments have same security controls as production.
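
A deterministic masking pass is often enough for test data: the same input always yields the same fake value, so relationships between records survive the scrub. A minimal Node.js sketch — the field names are hypothetical:

```javascript
const crypto = require('crypto');

// Deterministic email mask: identical inputs map to identical fakes,
// so a customer referenced from two records still matches after masking.
function maskEmail(email) {
  const hash = crypto.createHash('sha256').update(email).digest('hex');
  return 'user_' + hash.slice(0, 10) + '@example.test';
}

// Mask the PII fields on a customer object; non-PII fields pass through.
function maskCustomer(customer) {
  return {
    ...customer,
    companyName: 'Test Co ' + customer.id,
    email: maskEmail(customer.email),
    phone: '555-0100' // reserved fictional number range
  };
}

// Example usage:
console.log(maskCustomer({ id: 42, email: 'jane@acme.com', balance: 120 }));
```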

Integration Testing Checklist

Integration Test Execution
  • Order-to-Cash process tested end-to-end
  • Procure-to-Pay process tested end-to-end
  • All external integrations connected and tested
  • Integration error handling validated
  • Data mapping verified across systems
  • GL entries balanced after process completion
  • Inventory quantities accurate
  • All integration test defects logged

Chapter 12.4

User Acceptance Testing (UAT)

UAT is the final validation that the system meets business requirements. Business users execute real-world scenarios to confirm the system works for their needs.

UAT Purpose and Scope

UAT validates that the system meets business requirements from the user's perspective—not just that it works technically.

UAT Objectives

  • Requirements Validation: Confirm all requirements are met
  • Usability: System is intuitive for end users
  • Process Fit: Workflows match actual business processes
  • Data Accuracy: Calculations and reports are correct
  • Training Readiness: Users can perform their jobs
🔒 UAT is Not Optional

Skipping UAT or rushing through it is the #1 cause of post-go-live issues. Business users must formally accept the system before go-live. Their sign-off confirms the system is ready for production.

UAT Planning

Successful UAT requires careful planning and preparation.

UAT Participants

  • Business Process Owners: Validate their process area
  • Subject Matter Experts: Deep knowledge testers
  • End Users: Representative users from each role
  • Power Users: Future super-users and trainers
  • Support Team: Answer questions, log issues

UAT Environment Setup

  • Fresh sandbox refresh before UAT
  • Test data loaded and verified
  • User accounts created with correct roles
  • Integrations connected to test systems
  • Issue tracking system ready
✓ Best Practice

Schedule UAT when business users have dedicated time—not during month-end close or peak season. UAT requires focus. Distracted testers miss issues and slow the process.

UAT Test Scenarios

UAT scenarios focus on business processes, not technical details. They should read like user stories.

Example UAT Scenarios

| Process | Scenario | Tester Role |
| --- | --- | --- |
| Sales | Create quote for new customer with discount | Sales Rep |
| Sales | Convert quote to order, verify pricing | Sales Rep |
| Warehouse | Pick, pack, ship order, print labels | Warehouse Staff |
| Finance | Create invoice, apply payment, reconcile | AR Clerk |
| Purchasing | Create PO, receive goods, match bill | Purchasing Agent |
| Management | Run month-end reports, verify figures | Controller |

UAT Execution

Structure UAT execution for maximum effectiveness.

1. Kickoff Meeting
   Orient testers on objectives, timeline, issue logging process. Walk through test scenarios.

2. Scripted Testing
   Testers execute assigned scenarios step-by-step. Document results, log issues.

3. Exploratory Testing
   Testers try unscripted scenarios based on experience. Often finds edge cases.

4. Issue Resolution
   Development fixes critical/high issues. Retest fixes. Defer low-priority items.

5. Sign-Off
   Process owners formally accept the system for their area. Document exceptions.

UAT Exit Criteria

Define clear criteria for UAT completion before starting.

Typical Exit Criteria

  • Test Execution: 100% of test scenarios executed
  • Pass Rate: 95%+ scenarios pass
  • Critical Defects: Zero open critical defects
  • High Defects: All high defects resolved or deferred with workaround
  • Sign-Off: All process owners signed off
💡 Consultant Insight

Don't let perfect be the enemy of good. Some low-priority issues can be deferred to post-go-live. But never go live with unresolved critical issues or without process owner sign-off—even if it means delaying.

UAT Checklist

UAT Execution
  • UAT environment prepared and verified
  • Testers trained on test execution
  • All UAT scenarios executed
  • All critical/high defects resolved
  • Deferred items documented with workarounds
  • All process owners provided formal sign-off
  • Exit criteria met
  • UAT summary report completed

Chapter 12.5

Performance Testing

Performance testing validates that the system performs acceptably under expected load. Slow pages and timeouts frustrate users and reduce adoption.

Why Performance Matters

NetSuite is a cloud application with shared resources. While Oracle manages infrastructure, your configurations and customizations significantly impact performance.

Performance Impact Areas

  • Complex Scripts: Inefficient code consumes governance
  • Saved Searches: Poorly designed searches timeout
  • Large Records: Many lines slow page load
  • Integrations: High-volume syncs strain the system
  • Concurrent Users: Peak usage periods
ℹ️ NetSuite Performance Limits

NetSuite enforces governance limits on scripts (units per execution) and concurrency limits on API calls. Performance testing helps identify where you're approaching these limits before they cause production failures.
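
Scripts can check their own remaining governance at runtime through the N/runtime module. A minimal sketch of a scheduled script that stops early rather than failing mid-batch — the work items, the stub processor, and the 200-unit threshold are placeholders:

```javascript
/**
 * @NApiVersion 2.1
 * @NScriptType ScheduledScript
 */
define(['N/runtime', 'N/log'], function (runtime, log) {

  // Placeholder batch -- in practice this would come from a saved search.
  var workItems = [1, 2, 3];

  function processItem(id) {
    // ... per-record work (loads, saves, etc.) ...
  }

  function execute(context) {
    var script = runtime.getCurrentScript();

    for (var i = 0; i < workItems.length; i++) {
      // Stop with headroom left rather than failing mid-record; the
      // 200-unit threshold is an arbitrary safety margin.
      if (script.getRemainingUsage() < 200) {
        log.audit({
          title: 'Stopping early',
          details: 'item ' + i + ' of ' + workItems.length
        });
        break; // reschedule the remainder, or redesign as map/reduce
      }
      processItem(workItems[i]);
    }
  }

  return { execute: execute };
});
```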

Performance Test Types

Load Testing

Simulate expected concurrent user load to verify acceptable response times.

  • How many users will be active simultaneously?
  • What transactions will they perform?
  • What response time is acceptable?

Stress Testing

Push beyond normal load to find breaking points.

  • At what point does performance degrade?
  • What happens when limits are exceeded?
  • How does the system recover?

Volume Testing

Test with realistic data volumes.

  • Saved searches with thousands of results
  • Reports across large date ranges
  • Transactions with many line items

Key Performance Metrics

| Metric | Description | Acceptable |
| --- | --- | --- |
| Page Load Time | Time to fully render a page | < 3 seconds |
| Transaction Save | Time to save a record | < 5 seconds |
| Search Execution | Time for saved search to return | < 10 seconds |
| Report Generation | Time to generate a report | < 30 seconds |
| Script Governance | Units consumed vs. limit | < 80% of limit |
⚠️ Timeout Risks

Operations exceeding NetSuite's timeout limits (typically 5 minutes for synchronous operations) will fail. Identify long-running processes during testing and redesign them as scheduled scripts or map/reduce jobs.
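
Redesigning a long-running job as map/reduce lets NetSuite manage yielding and parallelism, and governance resets for every map invocation. A minimal skeleton — the invoice search is a placeholder for whatever defines your work set:

```javascript
/**
 * @NApiVersion 2.1
 * @NScriptType MapReduceScript
 */
define(['N/search', 'N/log'], function (search, log) {

  // getInputData: return the work set; each result becomes one map() call.
  function getInputData() {
    return search.create({
      type: search.Type.INVOICE,
      filters: [['mainline', 'is', 'T']], // placeholder -- narrow in practice
      columns: ['internalid']
    });
  }

  // map: process one result; governance resets per invocation.
  function map(context) {
    var result = JSON.parse(context.value);
    // ... per-record work that would time out in a synchronous script ...
    log.debug({ title: 'Processed', details: result.id });
  }

  // summarize: report any per-record errors after all work completes.
  function summarize(summary) {
    summary.mapSummary.errors.iterator().each(function (key, error) {
      log.error({ title: 'Map error for ' + key, details: error });
      return true;
    });
  }

  return { getInputData: getInputData, map: map, summarize: summarize };
});
```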

Performance Optimization

When testing reveals performance issues, optimize before go-live.

Script Optimization

  • Minimize record loads and saves
  • Use search.lookupFields() for single-field lookups (see the sketch after this list)
  • Batch API calls where possible
  • Cache frequently-used data
  • Avoid nested loops over large datasets
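
A before/after sketch of the lookupFields point: loading the whole record to read one field typically costs several governance units (more for transactions), while a targeted lookup costs one. The helper name is illustrative:

```javascript
define(['N/search'], function (search) {

  // Costly pattern (avoid): load the whole record to read one field.
  //   var rec = record.load({ type: record.Type.CUSTOMER, id: customerId });
  //   var email = rec.getValue({ fieldId: 'email' });

  // Cheap pattern: lookupFields returns just the requested columns
  // for a single governance unit.
  function getCustomerEmail(customerId) {
    return search.lookupFields({
      type: search.Type.CUSTOMER,
      id: customerId,
      columns: ['email']
    }).email;
  }

  return { getCustomerEmail: getCustomerEmail };
});
```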

Search Optimization

  • Use filters to reduce result sets
  • Index custom fields used in criteria
  • Avoid formula fields in criteria
  • Use summary searches for aggregations (example after this list)
  • Limit columns to only what's needed
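
For the summary-search point above, aggregating server-side returns one row per group instead of one row per transaction. A sketch totaling invoice amounts per customer — the filters are illustrative:

```javascript
define(['N/search'], function (search) {

  // One row per customer (GROUP/SUM) instead of one row per invoice.
  function invoiceTotalsByCustomer() {
    var totals = [];
    search.create({
      type: search.Type.INVOICE,
      filters: [['mainline', 'is', 'T']],
      columns: [
        search.createColumn({ name: 'entity', summary: search.Summary.GROUP }),
        search.createColumn({ name: 'amount', summary: search.Summary.SUM })
      ]
    }).run().each(function (result) {
      totals.push({
        customer: result.getText({ name: 'entity', summary: search.Summary.GROUP }),
        total: result.getValue({ name: 'amount', summary: search.Summary.SUM })
      });
      return true; // continue iterating (each() caps at 4,000 rows)
    });
    return totals;
  }

  return { invoiceTotalsByCustomer: invoiceTotalsByCustomer };
});
```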

Form Optimization

  • Remove unused sublists from forms
  • Limit line-level scripts to necessary records
  • Use field display types (hidden, inline text) appropriately
  • Minimize sublist columns displayed
💡 Consultant Insight

NetSuite's performance tooling is your friend. Use the page-time details popup or the Application Performance Management (APM) SuiteApp on slow pages to see exactly what's consuming time—record loads, script execution, saved search queries. Target the biggest time consumers first for maximum impact.

Performance Testing Checklist

Performance Validation
  • Key pages load within acceptable time
  • Transactions save without timeout
  • Critical saved searches execute quickly
  • Reports generate within acceptable time
  • Script governance within limits
  • Integrations handle expected volume
  • System stable under concurrent load
  • Performance issues documented and addressed

Chapter 12.6

Regression Testing

Regression testing ensures that changes don't break existing functionality. Every fix or enhancement requires validation that nothing else was impacted.

When to Regression Test

Regression testing is needed whenever the system changes.

Triggers for Regression Testing

  • Defect Fixes: After fixing bugs found in testing
  • Configuration Changes: After modifying workflows, scripts, searches
  • NetSuite Updates: After twice-yearly release upgrades
  • Bundle Updates: After SuiteApp/bundle upgrades
  • Integration Changes: After modifying integration endpoints
  • Post-Go-Live Enhancements: After any production changes
⚠️ NetSuite Releases

NetSuite releases updates twice yearly. These can change UI, APIs, and functionality. Always regression test in Release Preview before updates apply to production. Oracle provides Release Preview accounts specifically for this.

Regression Test Suite

Build a reusable regression test suite covering critical functionality.

What to Include

  • Core Business Processes: O2C, P2P, basic transactions
  • Critical Calculations: Pricing, tax, commissions
  • Integration Touchpoints: Key data flows
  • Financial Reports: Balance sheet, P&L, AR/AP aging
  • Security: Role permissions, restricted data access

Regression Suite Design

| Suite Type | Scope | When to Run |
| --- | --- | --- |
| Smoke Test | Basic functionality (10-20 tests) | After any deployment |
| Sanity Test | Core processes (30-50 tests) | After significant changes |
| Full Regression | Complete coverage (100+ tests) | Before major releases |
✓ Best Practice

Prioritize regression tests by risk. A failed payment processing test is more critical than a cosmetic form issue. Run high-priority tests first so critical problems are found early.
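
One lightweight way to encode this is to tag each test with a priority and sort before running. An illustrative Node.js sketch — the test names and pass/fail checks are hypothetical stand-ins:

```javascript
// Illustrative regression runner -- replace run() stubs with real checks.
const tests = [
  { name: 'Payment applies to invoice', priority: 1, run: () => true },
  { name: 'Sales order pricing correct', priority: 1, run: () => true },
  { name: 'Form label cosmetics', priority: 3, run: () => true }
];

// Sort so critical tests execute (and fail) first.
tests.sort((a, b) => a.priority - b.priority);

for (const test of tests) {
  const passed = test.run();
  console.log(`${passed ? 'PASS' : 'FAIL'} [P${test.priority}] ${test.name}`);
  if (!passed && test.priority === 1) {
    console.log('Critical failure -- stop and triage before continuing.');
    break;
  }
}
```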

Test Automation Considerations

Manual regression testing is time-consuming. Consider automation for frequently-run tests.

Automation Candidates

  • Good for Automation: Stable processes, high-volume tests, data validation
  • Poor for Automation: UI-heavy tests, frequently-changing areas, one-time tests

Automation Tools

  • SuiteScript Tests: Script-based validation
  • RESTlet Testing: API endpoint validation
  • Third-Party Tools: Selenium, Playwright for UI testing
  • NetSuite SDF: Deploy and test customizations
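
Script-based validation works best when business rules live in pure functions that can run outside NetSuite. A sketch using Node's built-in assert with a hypothetical volume-discount rule:

```javascript
const assert = require('assert');

// Hypothetical pricing rule kept as a pure function so it can be
// exercised without a NetSuite account.
function applyVolumeDiscount(unitPrice, quantity) {
  if (quantity >= 100) return unitPrice * 0.90; // 10% off at 100+
  if (quantity >= 50)  return unitPrice * 0.95; // 5% off at 50+
  return unitPrice;
}

// Happy path, boundary, and below-threshold cases.
assert.strictEqual(applyVolumeDiscount(10, 100), 9);
assert.strictEqual(applyVolumeDiscount(10, 50), 9.5);
assert.strictEqual(applyVolumeDiscount(10, 49), 10);
console.log('All pricing tests passed');
```
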
💡 Consultant Insight

Don't over-automate. Automated tests require maintenance when the system changes. Start with a small, stable set of automated tests for critical paths. Manual tests are often more practical for complex scenarios or areas that change frequently.

Regression Testing Process

1. Identify Scope
   Based on the change, determine which areas could be impacted. Use impact analysis.

2. Select Test Cases
   Choose relevant regression tests from the suite. Add new tests if needed.

3. Execute Tests
   Run selected tests in the test environment. Document results.

4. Analyze Failures
   Investigate failures. Determine whether each was caused by the change or is a pre-existing issue.

5. Fix and Retest
   Resolve issues and retest until all tests pass.

Post-Go-Live Regression

Regression testing continues after go-live for ongoing changes.

Ongoing Testing Triggers

  • Production Fixes: Test fix plus regression before deploying
  • Enhancements: Test new feature plus impacted areas
  • NetSuite Releases: Full regression in Release Preview
  • Bundle Updates: Test bundle functionality plus integration points
ℹ️ Change Control

Establish a change control process requiring regression testing before production deployments. No change goes to production without documented testing. This prevents "quick fixes" that break existing functionality.

Regression Testing Checklist

Regression Test Process
  • Regression test suite created and documented
  • Tests prioritized by criticality
  • Automation implemented for key tests (if applicable)
  • Change control process established
  • Release Preview testing scheduled
  • Pre-go-live regression completed
  • All regression failures resolved
  • Test suite maintenance plan in place