Penetration Testing Scope of Work (SOW) Template
The scope of work is the single most important document in a penetration testing engagement. A clear SOW protects both buyer and tester: it nails down what is in scope, what is excluded, how the tester will work, what gets delivered, and what happens when something unexpected appears. A vague SOW costs both sides time, money, and trust. This guide walks through every section a production-ready pentest SOW should contain, with drop-in language you can adapt to your own engagements, plus practical guidance on scoping, rules of engagement, deliverables, retests, and pricing. If you want a copy-ready, twelve-section template you can paste straight into a draft, see the pentest statement of work template tool. Pair it with pentest pricing and the methodology guide to assemble a complete proposal.
Why the SOW Decides Whether the Engagement Goes Well
Most pentest engagements that go sideways were already going sideways at signature. The scope was ambiguous, the rules of engagement were missing, the deliverables were described in marketing language, the retest clause was absent, and the pricing model did not match the work. By the time the kickoff call happens, the disagreement is baked in and the only question is who absorbs the cost.
A well-written SOW prevents that. It serves as a checklist for the tester (so nothing is missed), a contract for the legal team (so liability is clear), a brief for engineering (so prep work is correct), and a proposal for the buyer (so the cost is justified). Time spent on the SOW is the highest-leverage hour in the entire engagement.
A Complete Pentest SOW Outline
A production-ready pentest SOW typically includes the following sections. Use this as a table of contents and adapt to the engagement type.
- Engagement summary and objectives
- Parties, signatories, and authorisation
- Scope and target inventory
- Out-of-scope items and exclusions
- Methodology and standards
- Rules of engagement
- Timeline, milestones, and testing window
- Deliverables and report structure
- Retest scope and window
- Communication and escalation
- Pricing, payment terms, and change orders
- Confidentiality, data handling, and IP
- Acceptance criteria and sign-off
- Annexes (asset list, authorisation letters, contact sheet)
1. Engagement Summary and Objectives
Open with a one-paragraph summary of the engagement and a short list of objectives. The objectives should be testable so the report can map findings back to them.
Example wording
The provider will perform an authenticated grey-box web application penetration test against the customer's production e-commerce platform. The objectives are: (a) identify vulnerabilities exploitable by an authenticated low-privilege user; (b) assess the strength of authentication, session management, and access control; (c) evaluate susceptibility to OWASP Top 10 categories; (d) provide remediation guidance prioritised by risk to the business.
Avoid objectives like "ensure the application is secure", which are not testable and create unrealistic expectations.
2. Parties, Signatories, and Authorisation
List both parties, the named signatories, and the authority under which testing is performed. If any in-scope asset is owned or hosted by a third party (for example a cloud provider, a SaaS vendor, or a payment processor), include written authorisation as an annex. Hosting providers commonly require advance notice for penetration testing against assets they host.
Cross-reference the MSA for jurisdiction, governing law, and dispute resolution rather than restating them. The SOW carries engagement-specific authorisation; the MSA carries the legal framework.
3. Scope and Target Inventory
Scope must be enumerated, not described in prose. List every target by type, identifier, environment, and any qualifying notes. Vague scope is the leading cause of engagements running over.
| Asset type | Identifier | Environment | Qualifiers |
|---|---|---|---|
| Web application | app.example.com | Production | 2 user roles, ~120 routes, 35 forms |
| REST API | api.example.com (v2) | Production | 42 endpoints, OAuth2, rate limited |
| External IPs | 203.0.113.0/29 (8 IPs) | Production | Edge load balancers, fingerprinting only (no exploitation) |
| Mobile app | iOS 17+, Android 13+ | Staging build | Single role, jailbreak/root checks excluded |
For a deeper view of how to scope different assessment types, see the web application pentest checklist, API security testing checklist, and mobile pentest checklist.
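An enumerated scope can also be checked mechanically during testing, so an out-of-scope target never gets probed by accident. A minimal sketch, assuming the inventory above is kept as structured data (the hostnames and CIDR range mirror the illustrative table, not a real engagement):

```python
import ipaddress

# Hypothetical in-scope inventory mirroring the table above.
SCOPE = {
    "hosts": {"app.example.com", "api.example.com"},
    "networks": [ipaddress.ip_network("203.0.113.0/29")],
}

def in_scope(target: str) -> bool:
    """Return True if a hostname or IP address is inside the enumerated scope."""
    if target in SCOPE["hosts"]:
        return True
    try:
        addr = ipaddress.ip_address(target)
    except ValueError:
        # A hostname that is not in the inventory is out of scope by default.
        return False
    return any(addr in net for net in SCOPE["networks"])

print(in_scope("app.example.com"))      # True
print(in_scope("203.0.113.5"))          # True: inside the /29
print(in_scope("staging.example.com"))  # False: not enumerated
```

The default-deny behaviour is the point: anything not explicitly listed is treated as out of scope, which matches the exclusions guidance in the next section.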
4. Out-of-Scope Items and Exclusions
Stating exclusions explicitly is just as important as stating scope. Without it, the buyer assumes everything is covered and the tester assumes only the listed items are. Common exclusions include:
- Subdomains, environments, or hosts not in the asset inventory
- Third-party SaaS or services owned by other suppliers
- Denial of service, volumetric attacks, and resource exhaustion
- Social engineering of staff or customers (unless explicitly scoped)
- Physical security testing
- Exploitation that risks corruption or destruction of production data
- Source code review (unless a separate code-scanning SOW exists)
- Findings discovered outside the testing window
5. Methodology and Standards
State which methodology and standards the engagement follows. Buyers compare proposals on this; auditors expect to see it. Common references include OWASP WSTG, OWASP MASTG, OWASP API Security Top 10, PTES, NIST SP 800-115, and OSSTMM.
Briefly describe the phases (reconnaissance, mapping, vulnerability discovery, exploitation, post-exploitation, reporting) so the buyer understands what each day looks like. The detailed mechanics belong in the methodology guide, not the SOW.
6. Rules of Engagement
Rules of engagement (RoE) protect both sides. They tell the tester what is permitted, tell engineering what to expect on the wire, and give legal a defensible position if something goes wrong.
- Testing window: allowed days, hours, and time zone. Many buyers restrict testing to business hours so on-call coverage is in place.
- Source IP addresses: the IPs the tester will originate from, so SOC analysts can recognise the traffic and not page on it.
- Authentication: how test accounts are provisioned, what roles they hold, and how they are rotated or disabled afterwards.
- Prohibited techniques: denial of service, attacks against third-party assets, exploitation that destroys data, and any other constraints specific to the environment.
- Critical-finding escalation: the path and SLA for raising a critical issue immediately rather than waiting for the report.
- Outage handling: who to contact and how quickly if testing causes degradation or downtime.
- Data handling: what data the tester may retain, in what form, and for how long after engagement close.
- Authorisation: a clause stating the buyer authorises testing against the named assets within the named window. This is the tester's legal cover.
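The source-IP rule above can be enforced on the buyer's side so the SOC recognises tester traffic automatically rather than paging on it. A minimal sketch, assuming hypothetical approved addresses taken from an RoE annex:

```python
import ipaddress

# Hypothetical approved tester source addresses from the RoE annex.
APPROVED_SOURCES = [
    ipaddress.ip_network("198.51.100.10/32"),
    ipaddress.ip_network("198.51.100.11/32"),
]

def is_tester_traffic(src_ip: str) -> bool:
    """True if the source address matches an approved tester IP."""
    addr = ipaddress.ip_address(src_ip)
    return any(addr in net for net in APPROVED_SOURCES)

print(is_tester_traffic("198.51.100.10"))  # True: suppress the page
print(is_tester_traffic("203.0.113.1"))    # False: treat as real traffic
```

A check like this belongs in the SOC's alert-enrichment path, keyed to the exact IPs and dates in the signed RoE, never to a broad range.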
For a copy-ready version of this section as a standalone document, with eleven structured RoE clauses including stop-test conditions, evidence handling, and severity-driven communication SLAs, use the free pentest rules of engagement template. The RoE is usually attached to the SOW as an exhibit so the contract chain is clear.
7. Timeline, Milestones, and Testing Window
Lay out the engagement phases with realistic dates. A typical web application engagement looks like this.
| Phase | Duration | Output |
|---|---|---|
| Kickoff and pre-engagement | 0.5 day | Confirmed scope, RoE, accounts, contacts |
| Active testing | 5 to 10 days | Findings logged in the portal as discovered |
| Reporting | 1.5 to 3 days | Draft report, technical and executive sections |
| Debrief and Q&A | 1 hour | Walk-through with engineering and security leads |
| Retest window | 30 to 90 days | Retest results posted per finding |
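Milestone dates follow from the phase durations by simple date arithmetic, which makes them easy to regenerate when the kickoff date slips. A sketch, assuming hypothetical calendar-day durations drawn from the table above:

```python
from datetime import date, timedelta

# Hypothetical phase durations in calendar days, based on the table above.
PHASES = [("Kickoff", 1), ("Active testing", 10), ("Reporting", 3), ("Debrief", 1)]

def milestones(kickoff: date) -> list[tuple[str, date, date]]:
    """Return (phase, start, end) dates for each phase, back to back."""
    current = kickoff
    out = []
    for name, days in PHASES:
        end = current + timedelta(days=days - 1)
        out.append((name, current, end))
        current = end + timedelta(days=1)
    return out

for name, start, end in milestones(date(2025, 3, 3)):
    print(f"{name}: {start} to {end}")
```

A real schedule would skip weekends and account for buyer change-freeze windows; the point is that the SOW's durations, not ad-hoc estimates, should drive the dates.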
8. Deliverables and Report Structure
Define what the buyer gets, in what format, and through what channel. Be specific: "a report" is not a deliverable. "A PDF report containing executive summary, scope, methodology, findings with CVSS 3.1 vectors, evidence, and remediation guidance, plus per-finding access in the client portal" is.
- Executive summary suitable for board or sponsor
- Scope, methodology, and rules of engagement summary
- Findings list with severity, CVSS 3.1 vector, evidence, and remediation
- Strategic recommendations and roadmap
- Appendices: tooling used, references, glossary
- Per-finding access in a branded client portal with comments and status
- Optional letter of attestation for procurement or compliance buyers
For deliverable structure, see the security assessment report template and the wider how to write a pentest report guide. SecPortal's AI report generation and branded client portal cover both the document and the per-finding view.
9. Retest Scope and Window
A pentest without a retest is half an engagement. Specify the terms now so verification is not a separate negotiation later.
- Eligible findings: typically all findings from the original report at low severity and above. Note whether informational findings are retested.
- Retest window: 30, 60, or 90 days from report delivery. After the window closes, retests are billed as a new engagement.
- Partial fixes: how a partially remediated finding is handled (reopen, reduce severity, route back to engineering).
- Variants discovered during retest: whether new issues found while verifying a fix are absorbed, scoped to an addendum, or treated as a new engagement.
- Delivery format: per-finding update in the portal plus a short retest addendum to the report.
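The retest-window terms above reduce to a date comparison, which is worth making unambiguous in the contract. A sketch, assuming a hypothetical 90-day window measured from report delivery:

```python
from datetime import date, timedelta

RETEST_WINDOW_DAYS = 90  # assumed window; the SOW may specify 30, 60, or 90

def retest_in_window(report_delivered: date, requested: date,
                     window_days: int = RETEST_WINDOW_DAYS) -> bool:
    """True if a retest request falls inside the contractual window."""
    deadline = report_delivered + timedelta(days=window_days)
    return report_delivered <= requested <= deadline

print(retest_in_window(date(2025, 1, 10), date(2025, 3, 1)))  # True: day 50 of 90
print(retest_in_window(date(2025, 1, 10), date(2025, 5, 1)))  # False: window closed
```

Whether the window counts calendar or business days, and whether it runs from draft or final report delivery, are exactly the ambiguities the clause should settle.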
For a deeper retest workflow, see how to retest vulnerabilities.
10. Communication and Escalation
Specify who talks to whom and how often. Most engagement-day friction is communication friction.
- Daily or end-of-day status update during active testing (channel and format)
- Weekly steering call for engagements over two weeks long
- Critical-finding escalation: SLA, channel, named contact, fallback contact
- Outage handling: who to call, response time, who decides to pause testing
- Single source of truth for findings (the client portal, not email)
11. Pricing, Payment Terms, and Change Orders
Pricing should follow effort and tie back to the scope inventory. Three pricing models are common.
- Fixed fee: a single price for the agreed scope. Most buyers prefer it, but it requires accurate scoping or a contingency buffer.
- Time and materials: day rate times agreed days. Honest for novel or exploratory work, harder for procurement to approve.
- Capped time and materials: day rate up to a cap. Combines flexibility with budget certainty.
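The three models differ only in how the total is computed, which is easy to show side by side. A sketch with illustrative figures (not pricing benchmarks):

```python
def fixed_fee(price: float) -> float:
    """One agreed price, regardless of days spent."""
    return price

def time_and_materials(day_rate: float, days: float) -> float:
    """Day rate times days actually worked."""
    return day_rate * days

def capped_tm(day_rate: float, days: float, cap: float) -> float:
    """Time and materials, but never above the agreed cap."""
    return min(day_rate * days, cap)

# Illustrative figures only.
print(time_and_materials(1200, 12))  # 14400
print(capped_tm(1200, 12, 13000))    # 13000: the cap applies
print(fixed_fee(9500))               # 9500
```

The capped model is the one procurement teams most often accept for exploratory work: the buyer gets a ceiling, the tester keeps flexibility below it.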
State payment terms (net 30 is typical), invoicing milestones (often 50 percent on kickoff, 50 percent on report delivery), currency, and tax. Define the change order process: what triggers one, who approves it, and how it affects the timeline and price. For benchmarks see how to price security services. SecPortal invoicing handles GBP, USD, and EUR via Stripe.
12. Confidentiality, Data Handling, and IP
Cross-reference the MSA for general confidentiality terms and add engagement-specific data handling rules.
- What customer data the tester may access during the engagement
- How exfiltrated data, screenshots, and request/response captures are stored
- Encryption requirements for evidence at rest and in transit
- Retention period after engagement close and the deletion process
- IP ownership: the report is typically licensed to the buyer; methodology and tools remain the provider's
- Whether the buyer may share the report with auditors, customers, or prospects
13. Acceptance Criteria and Sign-Off
State what counts as engagement completion so payment milestones can release cleanly.
- Final report delivered through the agreed channel
- All findings published in the client portal with status, evidence, and CVSS
- Debrief held with engineering and security leads
- Retest window opened (acceptance does not depend on retests being used)
- Buyer acknowledgement within an agreed window (e.g. 10 business days)
14. Annexes
- Detailed asset inventory (one row per target)
- Third-party authorisation letters (hosting providers, payment processors)
- Contact sheet with phone numbers for both sides
- Approved tester source IPs and user agents
- Test account credentials handover process
- Compliance mapping if the engagement supports a framework (PCI DSS, ISO 27001, SOC 2)
Drop-In Clauses You Can Adapt
The clauses below are starting points. Have legal review before signing.
Authorisation clause
Example wording: The Customer authorises the Provider to perform security testing against the assets enumerated in the asset inventory annex, originating from the source IP addresses listed in the annexes, during the testing window defined in this SOW. This authorisation is limited to the named assets and window and does not extend to any other system or environment.
Critical-finding escalation clause
Example wording: If the Provider identifies a finding rated critical, the Provider will notify the Customer's named security contact via the agreed escalation channel within [4] hours of confirmation, rather than waiting for the final report.
Retest clause
Example wording: The Provider will retest all findings of low severity and above, at no additional charge, within [90] days of final report delivery. Retests requested after the window closes will be scoped and priced as a new engagement.
Change order clause
Example wording: Any addition to the scoped asset inventory, or any extension of the testing window, requires a written change order signed by both parties stating the additional effort, price, and timeline impact before the additional work begins.
Common Pitfalls in Pentest SOWs
- Scope by paragraph, not inventory: "the customer-facing platform" means different things to legal, engineering, and security. List assets line by line.
- No exclusions section: if it is not listed as out of scope, the buyer assumes it is in scope. Be explicit.
- Vague deliverables: "a comprehensive report" tells nobody anything. State the structure, format, and channel.
- Missing retest clause: turns verification into a new sale at the worst possible time.
- No critical-finding escalation: testers sit on a critical for days because the contract did not require otherwise.
- Pricing without a basis of estimate: a single number the buyer cannot interrogate is the number they will negotiate first.
- Authorisation buried in the MSA: a clear engagement-specific authorisation paragraph in the SOW protects the tester legally.
- No change order process: scope shifts mid-engagement and nobody knows how to price the addition cleanly.
Templating, Reuse, and Engagement Management
Once you have a SOW you trust, the next leverage point is reusing it without copy-paste errors. Maintain a master template, version it, and keep variant clauses (web app, API, mobile, network, cloud, code review, retest-only) in a clause library so a new SOW is an assembly job, not a fresh write.
Treat the SOW as the spine of the engagement record: engagement management should pull scope, dates, and deliverables straight from the signed SOW, so kickoff, findings, retests, and invoicing all reference the same source. For consultancies running multiple engagements in parallel see managing multiple security engagements.
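A clause library can be as simple as keyed text fragments assembled per engagement, so a new SOW is generated rather than copy-pasted. A minimal sketch with hypothetical clause text and variant keys:

```python
# Hypothetical clause library: one variant per engagement type,
# assembled into a SOW draft instead of copying a previous document.
CLAUSES = {
    "scope": {
        "web": "Scope: the web application targets listed in the asset inventory annex.",
        "api": "Scope: the API endpoints listed in the asset inventory annex.",
    },
    "retest": {
        "standard": "Retest: one retest per finding within 90 days of report delivery.",
    },
}

def assemble_sow(engagement_type: str, retest_variant: str = "standard") -> str:
    """Join the selected clause variants into a draft SOW body."""
    sections = [
        CLAUSES["scope"][engagement_type],
        CLAUSES["retest"][retest_variant],
    ]
    return "\n\n".join(sections)

print(assemble_sow("api"))
```

In practice the library would hold full clause text under version control, and an unknown variant key would fail loudly, which is the safeguard against silently shipping a stale clause.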
Pentest SOW Quick Checklist
- Engagement objectives are testable
- Scope is enumerated by asset, not described in prose
- Out-of-scope items are explicit
- Methodology and standards are named
- Rules of engagement cover window, IPs, prohibited techniques, escalation
- Timeline lists kickoff, testing, reporting, debrief, retest window
- Deliverables specify structure, format, and channel
- Retest clause covers eligibility, window, partial fixes, variants
- Communication and escalation contacts are named with phone numbers
- Pricing model matches the work and a basis of estimate is shown
- Change order process is documented
- Confidentiality, data retention, and IP cross-reference the MSA
- Acceptance criteria allow milestones to release
- Annexes include asset list, third-party authorisations, contact sheet
Run engagements off the same SOW your client signed
SecPortal links scope, findings, retests, AI-generated reports, and invoicing to a single engagement record so the SOW you signed is the engagement you deliver. See pricing or start free.