Penetration Testing Statement of Work Template: turn the awarded proposal into a contract both sides can defend
A free, copy-ready penetration testing statement of work (SOW) template. Twelve structured sections covering parties and engagement reference, objectives and assumptions, scope and exclusions, methodology and depth, deliverables, project plan and milestones, evidence handling, retests, pricing and invoicing, change control, acceptance and warranty, and signatures. Sits under your master services agreement, anchors the operational rules of engagement, and aligns with PTES, NIST SP 800-115, OWASP WSTG, and CREST scheme expectations.
SecPortal stores the SOW alongside the engagement, the findings, the report, the retests, and the invoice. One audit trail from kickoff to closure. Free plan available forever; no credit card required.
Full template
Twelve sections covering parties and engagement reference, objectives and assumptions, scope and exclusions, methodology and depth, deliverables, project plan and milestones, evidence handling, retests, pricing and invoicing, change control, acceptance and warranty, and signatures. Sits under your MSA. Replace every {{PLACEHOLDER}} before executing.
1. Parties and engagement reference
Identify the contracting legal entities, the engagement reference, and the master agreement this SOW sits under. The engagement reference should match the record the work will be tracked against.
This Statement of Work ({{SOW_REFERENCE}}) is entered into on {{SOW_EFFECTIVE_DATE}} between:
Buyer (the Client): {{CLIENT_LEGAL_NAME}}, a {{CLIENT_ENTITY_TYPE}} with its registered office at {{CLIENT_ADDRESS}}.
Vendor (the Tester): {{VENDOR_LEGAL_NAME}}, a {{VENDOR_ENTITY_TYPE}} with its registered office at {{VENDOR_ADDRESS}}.
This SOW is issued under and governed by the Master Services Agreement dated {{MSA_EFFECTIVE_DATE}} between the parties (the MSA). Capitalised terms not defined in this SOW have the meanings given in the MSA. In the event of conflict between this SOW and the MSA, the MSA controls except where this SOW expressly amends a specific MSA provision for this engagement.
Engagement reference: {{ENGAGEMENT_REFERENCE}} (this reference will be used on all deliverables, invoices, and communications).
Buyer engagement stakeholder: {{CLIENT_STAKEHOLDER_NAME}}, {{CLIENT_STAKEHOLDER_TITLE}}, {{CLIENT_STAKEHOLDER_EMAIL}}.
Vendor engagement lead: {{VENDOR_LEAD_NAME}}, {{VENDOR_LEAD_TITLE}}, {{VENDOR_LEAD_EMAIL}}.
2. Objectives and assumptions
State why the engagement is happening and the assumptions baked into the price. Anything outside these assumptions becomes a change order under Section 10, not a discount conversation at delivery.
Engagement objectives:
- {{OBJECTIVE_1}} (e.g. validate the security posture of the in-scope assets against PTES Level 2 and OWASP ASVS Level {{ASVS_LEVEL}}).
- {{OBJECTIVE_2}} (e.g. produce a board-ready executive summary and a technical findings report suitable for engineering remediation).
- {{OBJECTIVE_3}} (e.g. provide evidence the Client can attach to its {{COMPLIANCE_FRAMEWORK}} audit).
Assumptions baked into the fee in Section 9:
- Asset count: {{ASSET_COUNT}} (e.g. 1 web application, 1 API, 2 user roles).
- Test depth: {{TEST_DEPTH}} (PTES level, ASVS level, OWASP WSTG coverage).
- Estimated tester-days: {{TESTER_DAYS}} (split by seniority, where applicable).
- Retest count: {{RETEST_COUNT}} included verifications inside the retest window in Section 8.
- Timeline: {{TIMELINE_ASSUMPTION}} business days from kickoff to final report sign-off.
- Environment availability: in-scope environments will be reachable from agreed source IPs during the testing window with no unannounced maintenance.
Anything outside these assumptions, including added assets, expanded depth, additional retests, or accelerated timelines, will be handled under the change control process in Section 10.
3. Scope and exclusions
List every in-scope asset by exact identifier and every prohibited asset or technique. Vagueness here is the dominant cause of mid-engagement disputes.
In-scope assets:
- Web applications: {{WEB_APP_URLS}}
- APIs (with documentation references): {{API_BASE_URLS_AND_DOCS}}
- External network ranges (CIDR notation): {{EXTERNAL_IP_RANGES}}
- Internal network ranges: {{INTERNAL_IP_RANGES}}
- Cloud accounts (AWS account IDs / Azure subscriptions / GCP projects): {{CLOUD_ACCOUNT_IDS}}
- Mobile applications (with platforms): {{MOBILE_APP_BUNDLES_AND_PLATFORMS}}
- Source code repositories (if code review is in scope): {{REPO_URLS}}
- Identity providers and SSO endpoints: {{IDP_ENDPOINTS}}
Out-of-scope assets and techniques:
- Assets not listed above are out of scope and must not be tested under this SOW.
- Destructive payloads, including disk wiping, encryption of buyer data, and irreversible configuration changes.
- Live denial of service or stress testing.
- Real social engineering of any individual not listed in a Social Engineering Annex executed alongside this SOW.
- Use of zero-day exploits whose impact cannot be predicted on production systems.
- Access to regulated data (PHI, cardholder data, financial records) beyond the minimum required to demonstrate impact.
Out-of-hours testing windows, regulated data handling, and any scheme constraints (CHECK, OVS, STAR, FedRAMP 3PAO, DORA TLPT) are addressed in the rules of engagement (ROE) attached as Annex A.
4. Methodology and depth
Anchor methodology to public standards by version. This makes the SOW defensible against later second-guessing and gives the engagement team a documented reference for test execution.
The Vendor will perform the engagement in accordance with the following methodology references:
- PTES (Penetration Testing Execution Standard), pre-engagement through reporting phases.
- NIST SP 800-115 Technical Guide to Information Security Testing and Assessment.
- OWASP Web Security Testing Guide v4.2 for web application testing.
- OWASP API Security Top 10 (current edition) for API testing.
- OWASP Mobile Application Security Testing Guide (MASTG) for mobile testing, where in scope.
- OWASP Application Security Verification Standard (ASVS) Level {{ASVS_LEVEL}} for application verification depth.
- CIS Benchmarks (current edition) for cloud configuration review, where in scope.
Test depth per asset class:
- Web applications and APIs: {{WEB_API_DEPTH}} (PTES Level 2 standard or Level 3 advanced).
- Network and infrastructure: {{NETWORK_DEPTH}}.
- Cloud configuration: {{CLOUD_DEPTH}} (CIS Benchmark alignment, identity surface, exposed services).
- Source code review (if in scope): {{CODE_REVIEW_DEPTH}} (manual review depth and SAST coverage).
Findings will be scored using CVSS 3.1 base vectors, with severity bands aligned to the Vendor's published rubric and recorded against the engagement record.
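The severity bands can be derived mechanically from the CVSS 3.1 base score. A minimal sketch using the qualitative rating scale from the CVSS v3.1 specification; a vendor's published rubric may band scores differently, so treat this as the default mapping, not the contractual one:

```python
# Map a CVSS 3.1 base score to the qualitative severity bands defined in the
# CVSS v3.1 specification (Qualitative Severity Rating Scale). A vendor's
# published rubric takes precedence where it differs.

def cvss_severity(score: float) -> str:
    if not 0.0 <= score <= 10.0:
        raise ValueError(f"CVSS base score out of range: {score}")
    if score == 0.0:
        return "Informational"  # the specification labels this band "None"
    if score <= 3.9:
        return "Low"
    if score <= 6.9:
        return "Medium"
    if score <= 8.9:
        return "High"
    return "Critical"
```

Recording the band alongside the full base vector keeps the scoring auditable if the Client's risk team later recomputes it.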
5. Deliverables
Specify the deliverables, the audiences, and the formats. Reporting quality is where engagements are judged at acceptance, so name the artefacts explicitly here.
The Vendor will deliver the following artefacts on the engagement record:
- Executive summary suitable for board and audit committee distribution (PDF).
- Technical findings report with one finding per page or per section, including severity, CVSS 3.1 vector, evidence, business impact, and remediation guidance (PDF).
- Remediation roadmap prioritised by severity and exploitability, with phased timelines (PDF).
- Compliance mapping where applicable: {{COMPLIANCE_MAPPING}} (e.g. PCI DSS, SOC 2, ISO 27001, HIPAA, NIST 800-53).
- Findings export in a structured machine-readable format (CSV or JSON) for ingestion into the Client's vulnerability management system.
- Raw evidence package (screenshots, request and response captures, configuration extracts) retained per Section 7.
Format requirements:
- PDF for the formal executive summary, technical report, and remediation roadmap.
- Editable format (Word or equivalent) on request, watermarked or controlled under the MSA.
- Structured findings export (CSV or JSON) attached to the engagement record for ingestion.
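For the structured export, one illustrative JSON record shape. The field names here are an assumption, not a defined schema; align them with whatever the Client's vulnerability management system ingests. The CVSS vector and 5.4 score are the standard pairing for a stored XSS finding:

```python
import json

# Hypothetical shape for one finding in the Section 5 structured export.
# Field names are illustrative only.
finding = {
    "engagement_reference": "{{ENGAGEMENT_REFERENCE}}",
    "finding_id": "F-001",
    "title": "Stored XSS in profile display name",
    "severity": "Medium",
    "cvss_vector": "CVSS:3.1/AV:N/AC:L/PR:L/UI:R/S:C/C:L/I:L/A:N",
    "cvss_score": 5.4,
    "affected_asset": "{{WEB_APP_URLS}}",
    "status": "open",             # open | fixed | verified_fixed | risk_accepted
    "evidence_refs": ["EV-001"],  # pointers into the raw evidence package
    "remediation": "Encode user-supplied display names on output.",
}
print(json.dumps(finding, indent=2))
```

Keeping the engagement reference on every record is what lets the export reconcile against invoices and retest reports later.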
Deliverable timeline:
- Draft report: within {{DRAFT_REPORT_DAYS}} business days of testing close.
- Final report after Client review: within {{FINAL_REPORT_DAYS}} business days of Client written feedback.
- Retest report: within {{RETEST_REPORT_DAYS}} business days of retest close.
6. Project plan and milestones
Tie testing windows, checkpoints, and reporting milestones to dates. Vague timelines turn into delivery disputes the day the buyer needs the report for an audit.
Project milestones:
- Kickoff call: {{KICKOFF_DATE}}.
- Testing window: {{TESTING_WINDOW_START}} to {{TESTING_WINDOW_END}} ({{TIMEZONE}}).
- Mid-engagement checkpoint: {{MID_CHECKPOINT_DATE}} (status, blockers, preliminary high-severity findings).
- Testing close: {{TESTING_CLOSE_DATE}}.
- Draft report delivered: {{DRAFT_REPORT_DUE}}.
- Client review window: {{CLIENT_REVIEW_DAYS}} business days from draft delivery.
- Final report delivered: {{FINAL_REPORT_DUE}}.
- Retest window opens at final report acceptance and closes at {{RETEST_WINDOW_END}}.
Communication cadence:
- Primary channel: the engagement portal on the Vendor's platform with secure messaging, evidence storage, and finding tracking.
- Status checkpoints: written status update every {{STATUS_CADENCE}} business days, plus an out-of-band call for any critical finding.
- Severity-driven communication SLA:
  - Critical findings: within the same business day, by phone plus written confirmation in the portal.
  - High findings: within one business day in the portal, with summary in the next status update.
  - Medium findings: visible in the portal as logged, summarised at scheduled checkpoints.
  - Low and informational findings: consolidated into the final report.
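The Section 6 communication SLA is simple enough to encode as data, for example to drive portal reminders. This is an illustrative encoding, not part of the template; deadlines are in business days from finding confirmation, and None means the item waits for the next scheduled checkpoint or the report:

```python
# Illustrative encoding of the Section 6 severity-driven communication SLA.
# None means no standalone notification deadline: the finding is batched
# into scheduled checkpoints or the final report.
NOTIFICATION_SLA = {
    "Critical": {"business_days": 0, "channel": "phone + written portal confirmation"},
    "High": {"business_days": 1, "channel": "portal + next status update"},
    "Medium": {"business_days": None, "channel": "portal, checkpoint summary"},
    "Low": {"business_days": None, "channel": "final report"},
    "Informational": {"business_days": None, "channel": "final report"},
}

def notification_deadline(severity: str):
    """Return the notification deadline in business days, or None if the
    finding is batched rather than individually notified."""
    return NOTIFICATION_SLA[severity]["business_days"]
```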
7. Evidence handling and data protection
These terms protect the Client when an audit asks how testing data was handled. They also protect the Vendor from holding sensitive data longer than necessary.
Evidence handling requirements:
- All proof of compromise (screenshots, traffic captures, hashes, configuration extracts) will be encrypted at rest and in transit.
- Evidence will be transmitted to the Client only via the engagement portal or another channel agreed in writing.
- Personally identifiable data, cardholder data, and other regulated data exposed during testing will be masked in the final report unless the Client requests otherwise in writing.
- Test credentials provided by the Client will be stored encrypted on the Vendor side, never shared by email, and rotated by the Client at engagement closure.
Data retention:
- Evidence and findings will be retained for {{RETENTION_PERIOD}} after final report acceptance and then securely destroyed, with written confirmation of destruction sent to the Client.
- The Client may request earlier destruction in writing in the event of an incident, regulatory enquiry, or contract termination.
Data protection:
- The legal entities that will process Client data, the locations of those entities, and any sub-processors are listed in Annex B (Data Processing Particulars).
- Cross-border data transfers will comply with the data protection terms in the MSA.
- The Vendor will notify the Client within {{INCIDENT_NOTIFICATION_HOURS}} hours of becoming aware of any incident affecting Client data handled under this SOW.
8. Retests
Retest scope is the dominant source of post-award commercial disputes. Locking it down in the SOW preserves the Client's leverage at delivery and gives the Vendor a clear billing rule for out-of-window work.
Retest scope:
- Retest window: {{RETEST_WINDOW}} from final report acceptance (defensible default: 60 to 90 days).
- Retest count: {{RETEST_COUNT}} retest rounds, each verifying the findings the Vendor identified in the engagement, at no additional fee, inside the retest window (consistent with the retest assumption in Section 2).
- Retest verification method: per finding, with evidence equivalent to the original finding (screenshots, traffic captures, configuration extracts), so the audit trail closes the original finding rather than opening a new one.
- Regression handling: any new finding discovered during retest will be logged distinctly from the original finding it relates to, with severity, evidence, and remediation guidance, and may trigger change control under Section 10 if it materially expands scope.
- Retest pricing for items outside the retest window: per the rate card in Section 9.
Retest reporting:
- The Vendor will produce a retest report listing each original finding with its verification status (verified fixed, partially fixed, not fixed, regressed) and the supporting evidence, delivered within {{RETEST_REPORT_DAYS}} business days of retest close.
- The retest report will be linked to the original engagement record so the close-out captures the original scope, the fix description, the retest evidence, and the final outcome on a single record.
9. Pricing and invoicing
State the price model, the assumptions, the rate card for change orders, the invoicing schedule, and the payment terms. Different pricing structures across SOWs are the most common reason engagement margins drift.
Engagement fee:
- Pricing model: {{PRICING_MODEL}} (fixed fee / day rate / retainer / PTaaS subscription).
- Engagement fee: {{ENGAGEMENT_FEE_AMOUNT}} {{CURRENCY}} for the scope, methodology, deliverables, and retest count described in Sections 3, 4, 5, and 8.
- Assumptions: as set out in Section 2 (asset count, test depth, retest count, timeline).
Day rate for change orders (by tester seniority):
- Junior tester: {{DAY_RATE_JUNIOR}} {{CURRENCY}} per tester-day.
- Senior tester: {{DAY_RATE_SENIOR}} {{CURRENCY}} per tester-day.
- Principal tester: {{DAY_RATE_PRINCIPAL}} {{CURRENCY}} per tester-day.
Retest rate card for items outside the retest window in Section 8:
- {{RETEST_DAY_RATE}} {{CURRENCY}} per tester-day, minimum {{RETEST_MIN_DAYS}} tester-days per retest event.
Invoicing schedule:
- {{INVOICE_SCHEDULE_1}} (e.g. 30 percent on SOW execution, payable on {{INVOICE_TERMS}}).
- {{INVOICE_SCHEDULE_2}} (e.g. 40 percent on testing close).
- {{INVOICE_SCHEDULE_3}} (e.g. 30 percent on final report acceptance).
Payment terms: {{PAYMENT_TERMS}} from invoice issue date.
Currency: {{CURRENCY}}. Pricing is exclusive of VAT or sales tax; the Vendor will add any applicable tax at the rate required in the relevant jurisdiction.
Expenses: {{EXPENSES_POLICY}} (commonly: pre-approved travel and accommodation reimbursed at cost with receipts).
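The milestone split in the invoicing schedule is worth sanity-checking before execution. A worked example using the illustrative 30/40/30 schedule and a hypothetical 48,000 engagement fee (both values are assumptions for the arithmetic, not template defaults):

```python
# Worked example of the Section 9 invoicing split: hypothetical 48,000 fee,
# illustrative 30/40/30 milestone schedule.
fee = 48_000.00
schedule = [
    ("SOW execution", 0.30),
    ("testing close", 0.40),
    ("final report acceptance", 0.30),
]
invoices = [(milestone, round(fee * pct, 2)) for milestone, pct in schedule]

# The split must total the engagement fee exactly; a schedule that does not
# sum to 100 percent is a drafting error.
assert abs(sum(amount for _, amount in invoices) - fee) < 0.01
for milestone, amount in invoices:
    print(f"{milestone}: {amount:,.2f}")
```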
10. Change control
A documented change process protects the Vendor from doing unpaid work and protects the Client from paying for work that was not authorised. Define the threshold and the path so the Slack negotiation is replaced by a one-page change request.
Change control:
- Either party may propose a change to scope, methodology, timeline, deliverables, or fee by submitting a written change request to the engagement stakeholders named in Section 1.
- Each change request will describe: the change, the justification, the impact on price (with calculation), the impact on timeline, and the impact on assumptions in Section 2.
- A change request takes effect only when signed in writing (electronic signature accepted) by an authorised representative on each side.
De minimis changes:
- Changes below {{DE_MINIMIS_THRESHOLD}} (e.g. 5 percent of the engagement fee or one tester-day) may be approved by exchange of email between the engagement stakeholders, without a separate signed change request.
- All de minimis approvals will be logged on the engagement record and consolidated into a single closing change addendum at engagement close.
The Vendor will not perform work outside the agreed scope or assumptions without an executed change request. The Client will not be invoiced for work outside the agreed scope or assumptions absent an executed change request.
11. Acceptance and warranty
Define when each deliverable is accepted and the corresponding payment milestone is triggered. Tying invoicing to acceptance keeps both sides honest about deliverable quality.
Deliverable acceptance:
- Each deliverable will be reviewed by the Client engagement stakeholder named in Section 1.
- The Client will accept or return written comments within {{CLIENT_REVIEW_DAYS}} business days of delivery.
- A deliverable is accepted on the earlier of: written acceptance by the Client, or the expiry of the review window with no written comments returned.
- Material comments returned within the review window will be addressed by the Vendor within {{REWORK_DAYS}} business days, after which the revised deliverable enters a new {{REREVIEW_DAYS}} business day review window.
Warranty:
- The Vendor warrants that the engagement will be performed in accordance with the methodology references in Section 4 and the deliverables in Section 5.
- The Vendor warrants that the testers performing the engagement hold the certifications listed in the awarded proposal.
- Other warranties, liability caps, and indemnities are governed by the MSA referenced in Section 1 and are not restated in this SOW.
Termination:
- Termination is governed by the MSA. On termination, the Client will pay for all work completed in good faith up to the date of termination notice, calculated against the day rates in Section 9. Any deliverables already accepted under this Section 11 remain accepted.
12. Signatures
Both signatories should have authority to bind the legal entity they represent. Avoid signing the SOW with only operational stakeholders on either side.
Signed for and on behalf of the Buyer (the Client):
Name: {{CLIENT_SIGNATORY_NAME}}
Title: {{CLIENT_SIGNATORY_TITLE}}
Date: {{CLIENT_SIGNATURE_DATE}}
Signature: ____________________________
Signed for and on behalf of the Vendor (the Tester):
Name: {{VENDOR_SIGNATORY_NAME}}
Title: {{VENDOR_SIGNATORY_TITLE}}
Date: {{VENDOR_SIGNATURE_DATE}}
Signature: ____________________________
Annexes:
- Annex A: Rules of Engagement (operational testing rules for this SOW).
- Annex B: Data Processing Particulars (legal entities, locations, sub-processors).
- Annex C: Awarded proposal (or relevant extracts), referenced for assumptions and methodology context.
How to use this template
Confirm the scope, depth, and tester-day budget before drafting the SOW. Use the pentest scoping calculator to validate the assumptions in Section 2 against the engagement fee in Section 9.
If the engagement was awarded through an RFP, carry the agreed scope, methodology, and pricing model from the penetration testing RFP template into Sections 2, 3, 4, and 9 so the SOW reflects the basis on which the vendor was selected.
Replace every {{PLACEHOLDER}} with engagement-specific values. Do not leave placeholders in the executed document.
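A mechanical check catches placeholders that a manual read misses. A minimal sketch that scans drafted SOW text for any remaining tokens matching this template's `{{PLACEHOLDER}}` convention:

```python
import re

# Find any unresolved {{PLACEHOLDER}} tokens in drafted SOW text before it
# circulates for signature. The pattern matches this template's convention
# of upper-case names with digits and underscores.
def remaining_placeholders(text: str) -> list[str]:
    return sorted(set(re.findall(r"\{\{([A-Z0-9_]+)\}\}", text)))

draft = "Engagement fee: {{ENGAGEMENT_FEE_AMOUNT}} {{CURRENCY}}, payable per Section 9."
leftover = remaining_placeholders(draft)
if leftover:
    print("Unresolved placeholders:", ", ".join(leftover))
```

Running this over the final draft (exported as plain text) makes "no placeholders in the executed document" a checkable gate rather than a reviewer's promise.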
Attach the operational rules of engagement as Annex A using the rules of engagement template. Keep commercial terms in the SOW and operational rules in the ROE; the two documents reference each other.
Set the severity-driven SLA in Section 6 against a published rubric. The vulnerability remediation SLA calculator produces a defensible rubric you can reference in this section.
Have the final SOW reviewed by procurement, legal, and the engagement stakeholder accountable for delivery before either side signs.
Methodology and contractual references
PTES (Penetration Testing Execution Standard) covers pre-engagement, intelligence gathering, threat modelling, vulnerability analysis, exploitation, post-exploitation, and reporting. See the SecPortal PTES framework page for an operator-first walkthrough of each phase.
NIST SP 800-115 Technical Guide to Information Security Testing and Assessment, planning phase.
OWASP Web Security Testing Guide v4.2 and OWASP API Security Top 10.
OWASP Application Security Verification Standard (ASVS) for application verification depth.
CREST Defensible Penetration Test specification and CREST CHECK / OVS / STAR scheme documentation. See the CREST penetration testing framework page for accreditation context that may belong in Section 4.
For change control discipline that prevents Section 10 from drifting mid-engagement, see the SecPortal research on pentest scope creep.
When Section 10 actually fires and a mid-engagement scope change has to be executed, the pentest scope change addendum template is the one-to-two page instrument that varies the engagement letter without breaking the authorisation chain.
Where the SOW lives in the engagement
The clean paper trail for a penetration testing engagement is RFP, proposal, SOW, ROE, and engagement letter, all referenced from the same engagement record. The SOW is the contractual document that locks in scope and price; the ROE is the operational document that governs day-to-day testing. The accepted vendor proposal is the bridge between RFP and SOW: scope, methodology, deliverables, and pricing flow from the proposal into this SOW on award. The engagement letter is the instruction-to-proceed that authorises a specific instance of this SOW to start, names the testing team, and references the executed ROE.
For retest scope and verification under Section 8, see pentest retesting.
This template is provided as a starting point for a penetration testing statement of work. It is not legal advice or procurement advice, and it is not a substitute for a master services agreement. Have the final SOW reviewed by procurement, legal, and the security stakeholder accountable for the engagement before signing.