Once a vendor is selected, SecPortal stores the agreed scope, ROE, findings, report, and retests against a single engagement record.
Full template
Twelve sections covering programme context, in-scope assets, methodology, allowed and prohibited techniques, evidence handling, communications and severity SLAs, reporting, retests, pricing, vendor qualifications, vendor risk, and a published scoring rubric. Aligned with PTES, NIST SP 800-115, OWASP WSTG, and CREST scheme expectations. Replace every {{PLACEHOLDER}} before issuing.
1. Programme context
Tell vendors why this engagement is happening. The trigger (annual cycle, regulatory deadline, post-incident review, pre-launch validation) shapes how serious vendors price and staff the work.
Buyer (the Issuing Party): {{BUYER_LEGAL_NAME}}
Procurement reference: {{RFP_REFERENCE}}
Issue date: {{ISSUE_DATE}}
Response deadline: {{RESPONSE_DEADLINE}} ({{TIMEZONE}})
Award decision date (target): {{AWARD_DATE}}
Engagement target start date: {{ENGAGEMENT_START_DATE}}
Programme context:
- What triggered this engagement: {{TRIGGER_DESCRIPTION}} (e.g. annual programme cycle, PCI DSS Requirement 11.4, SOC 2 control verification, ISO 27001 surveillance audit, pre-launch validation, post-incident review).
- Internal stakeholder for this engagement: {{INTERNAL_STAKEHOLDER_NAME}}, {{INTERNAL_STAKEHOLDER_TITLE}}.
- Procurement contact for clarifications: {{PROCUREMENT_CONTACT_NAME}}, {{PROCUREMENT_CONTACT_EMAIL}}.
- Confidentiality: this RFP and any responses are confidential. Vendors must execute the attached NDA before submitting questions or a proposal.
The Issuing Party will not reimburse costs incurred in preparing a response. The Issuing Party reserves the right to amend, withdraw, or decline to award this RFP at any point in the process.
2. In-scope assets
List every asset the engagement must cover by exact identifier. Vagueness in this section is the dominant cause of incomparable proposals: vendors price what they think they see.
In-scope assets the engagement must cover:
- Web applications: {{WEB_APP_URLS}}
- APIs (with documentation references): {{API_BASE_URLS_AND_DOCS}}
- External network ranges (CIDR notation): {{EXTERNAL_IP_RANGES}}
- Internal network ranges: {{INTERNAL_IP_RANGES}}
- Cloud accounts (AWS account IDs / Azure subscriptions / GCP projects): {{CLOUD_ACCOUNT_IDS}}
- Mobile applications (with platforms): {{MOBILE_APP_BUNDLES_AND_PLATFORMS}}
- Source code repositories (if code review is in scope): {{REPO_URLS}}
- Identity providers and SSO endpoints: {{IDP_ENDPOINTS}}
Asset volumes:
- Approximate count of distinct user roles per application: {{USER_ROLE_COUNTS}}
- Approximate count of API endpoints: {{API_ENDPOINT_COUNT}}
- Estimated lines of code, if code review is in scope: {{SLOC_ESTIMATE}}
Assets the Issuing Party may add via change order during the engagement: {{POTENTIAL_ADDITIONS}}.
Out-of-scope assets that vendors must explicitly NOT test: {{OUT_OF_SCOPE_LIST}}.
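Exact identifiers also let both sides check scope mechanically during delivery. As a minimal sketch (the CIDR ranges below are illustrative placeholders, not real engagement values), a target can be validated against the in-scope and out-of-scope ranges with the standard library, with exclusions always winning:

```python
import ipaddress

# Illustrative scope lists; these ranges are placeholders, not real values.
IN_SCOPE_RANGES = [ipaddress.ip_network(c) for c in ("203.0.113.0/24", "198.51.100.0/25")]
OUT_OF_SCOPE_RANGES = [ipaddress.ip_network(c) for c in ("198.51.100.64/26",)]

def is_in_scope(target: str) -> bool:
    """True only if target sits inside an in-scope range and outside every
    explicitly excluded range (exclusions override inclusions)."""
    addr = ipaddress.ip_address(target)
    if any(addr in net for net in OUT_OF_SCOPE_RANGES):
        return False
    return any(addr in net for net in IN_SCOPE_RANGES)

print(is_in_scope("203.0.113.10"))   # inside an in-scope range -> True
print(is_in_scope("198.51.100.70"))  # inside the excluded /26 -> False
```

The same exclusion-wins rule should be stated in the ROE at award so a host listed in both sections is never tested by accident.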
3. Methodology and depth
Anchor the methodology requirement to a public standard so proposals can be compared on the same axis. Cite PTES, OWASP WSTG, ASVS, MASVS, or NIST SP 800-115 by version.
Required methodology references:
- PTES (Penetration Testing Execution Standard), pre-engagement through reporting phases.
- NIST SP 800-115 Technical Guide to Information Security Testing and Assessment.
- OWASP Web Security Testing Guide v4.2 for web application testing.
- OWASP API Security Top 10 (current edition) for API testing.
- OWASP Mobile Application Security Testing Guide (MASTG) for mobile testing, where in scope.
- OWASP ASVS Level {{ASVS_LEVEL}} for application security verification depth, where in scope.
Test depth requirements per asset class:
- Web applications and APIs: {{WEB_API_DEPTH}} (PTES Level 2 standard or Level 3 advanced).
- Network and infrastructure: {{NETWORK_DEPTH}}.
- Cloud configuration: {{CLOUD_DEPTH}} (CIS Benchmark alignment, identity surface, exposed services).
- Source code review (if in scope): {{CODE_REVIEW_DEPTH}} (manual review depth and SAST coverage).
Vendors must describe in their proposal how they map their internal methodology to the standards above, and where they go beyond the baseline.
4. Allowed and prohibited techniques
Specify the operational envelope. Regulatory schemes (CHECK, OVS, STAR, FedRAMP, DORA TLPT) often dictate part of this; record the relevant scheme by reference.
Allowed techniques (subject to the rules of engagement that will be agreed at award):
- Authenticated and unauthenticated testing of web applications and APIs in scope.
- Network and service discovery against in-scope assets.
- Vulnerability identification, validation, and safe exploitation against in-scope assets.
- Source code review of in-scope repositories where source access is granted.
- Lateral movement and post-exploitation inside in-scope environments to demonstrate impact, with prior written authorisation.
Prohibited techniques unless explicitly authorised in writing:
- Destructive payloads, including disk wiping, encryption of buyer data, and irreversible configuration changes.
- Live denial of service or stress testing.
- Real social engineering of any individual not listed in a Social Engineering Annex executed at award.
- Use of zero-day exploits whose impact cannot be predicted on production systems.
- Access to regulated data (PHI, cardholder data, financial records) beyond the minimum required to demonstrate impact.
Scheme constraints that apply: {{SCHEME_REFERENCES}} (e.g. UK CHECK, CREST OVS, CREST STAR, FedRAMP, DORA TLPT).
5. Evidence handling and data protection
These terms protect the buyer when an audit asks how testing data was handled. Make them part of the RFP rather than a post-award negotiation.
Evidence handling requirements:
- All proof of compromise (screenshots, traffic captures, hashes, configuration extracts) must be encrypted at rest and in transit.
- Evidence must be transmitted to the Issuing Party only via the engagement portal or another channel agreed in writing.
- Personally identifiable data, cardholder data, and other regulated data exposed during testing must be masked in the final report unless the Issuing Party requests otherwise in writing.
- Test credentials provided by the Issuing Party must be stored encrypted, never shared by email, and rotated at engagement closure by the Issuing Party.
Data retention requirements:
- Evidence and findings must be retained for {{RETENTION_PERIOD}} after engagement closure, then securely destroyed with written confirmation to the Issuing Party.
- The vendor must describe in the proposal how it handles a buyer-initiated request for early destruction (incident, regulatory enquiry, contract termination).
Data protection terms:
- The vendor must declare the legal entities that will process buyer data, the locations of those entities, and any sub-processors involved.
- Cross-border data transfers must comply with the Issuing Party's data protection terms (provided as an annex).
6. Communications and severity SLAs
Define how findings will be communicated, by severity. Vendors that cannot commit to a severity-driven SLA in the proposal rarely meet one in the engagement.
Communication channels expected:
- Primary: a vendor-provided engagement portal with secure messaging, evidence storage, and finding tracking.
- Secondary: email to named stakeholders, with PGP encryption available on request.
- Out-of-band: phone numbers for critical findings and stop-test events.
Severity-driven communication SLA expected (vendor must confirm or counter-propose):
- Critical findings: communicated within the same business day, ideally within hours, by phone plus written confirmation in the portal.
- High findings: communicated within one business day in the portal, with summary in the next scheduled checkpoint.
- Medium findings: visible in the portal as logged, summarised at scheduled checkpoints.
- Low and informational findings: consolidated into the final report.
Checkpoint cadence expected: {{CHECKPOINT_CADENCE}} (e.g. weekly written status, mid-engagement review, draft report walkthrough, final report walkthrough, retest closure).
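The SLA above can be written down as a simple severity-to-deadline table that both parties reference during the engagement. This is an illustrative sketch only; the channels and windows are example values a vendor would confirm or counter-propose, not contractual terms:

```python
from datetime import timedelta

# Example severity-driven SLA table; values are placeholders for illustration.
SEVERITY_SLA = {
    "critical": {"channel": "phone + portal", "deadline": timedelta(hours=4)},
    "high":     {"channel": "portal",         "deadline": timedelta(days=1)},
    "medium":   {"channel": "portal",         "deadline": None},  # next checkpoint
    "low":      {"channel": "final report",   "deadline": None},  # consolidated
}

def notification_deadline(severity: str) -> str:
    """Describe how and by when a finding of this severity must be raised."""
    sla = SEVERITY_SLA[severity.lower()]
    if sla["deadline"] is None:
        return f"{sla['channel']} (no standalone deadline)"
    return f"{sla['channel']} within {sla['deadline']}"

print(notification_deadline("critical"))
```

Keeping the table explicit makes it easy to spot a counter-proposal that quietly weakens a deadline.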
7. Reporting and deliverables
Reporting quality is where engagements are judged. Specify the deliverables, the audiences, and the formats so proposals can be compared on report value, not just test value.
Required deliverables:
- Executive summary suitable for board and audit committee distribution.
- Technical findings report with one finding per page or per section, including severity, CVSS 3.1 vector, evidence, business impact, and remediation guidance.
- Remediation roadmap prioritised by severity and exploitability, with phased timelines.
- Compliance mapping where applicable (PCI DSS, SOC 2, ISO 27001, HIPAA, NIST 800-53).
- Raw scanner output or evidence package for the buyer's own audit trail.
Format requirements:
- PDF for the executive summary and the formal technical report.
- Editable format (Word or equivalent) on request, watermarked or controlled.
- Findings also delivered in a structured machine-readable format (CSV or JSON) for ingestion into the buyer's vulnerability management system.
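One illustrative shape for a machine-readable finding is sketched below. The field names are assumptions for the example, not a published schema; a real engagement would agree the exact schema with the vendor before testing starts:

```python
import json

# Hypothetical finding record; field names and values are illustrative only.
finding = {
    "id": "F-2024-001",
    "title": "SQL injection in order search endpoint",
    "severity": "critical",
    "cvss_31_vector": "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H",
    "affected_asset": "https://app.example.com/api/orders/search",
    "evidence_refs": ["EV-014", "EV-015"],
    "remediation": "Use parameterised queries; validate input server-side.",
    "status": "open",
}

# Serialise for ingestion into the buyer's vulnerability management system.
print(json.dumps(finding, indent=2))
```

Requiring a stable `id` per finding also makes the Section 8 retest pairing straightforward: the retest evidence references the original identifier rather than opening a new record.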
Deliverable timeline expected:
- Draft report: within {{DRAFT_REPORT_DAYS}} business days of testing close.
- Final report after buyer review: within {{FINAL_REPORT_DAYS}} business days of buyer feedback.
- Retest report: within {{RETEST_REPORT_DAYS}} business days of retest close.
8. Retests
Retest scope is the dominant source of post-award commercial disputes. Defining it in the RFP avoids losing leverage at delivery.
Retest expectations:
- Retest window: {{RETEST_WINDOW}} after final report delivery (defensible default: 60 to 90 days).
- Retest count: the vendor must include verification of all findings it identified, at no additional fee, inside the retest window.
- Retest verification method: per finding, with evidence equivalent to the original finding (screenshots, traffic captures, configuration extracts), so the audit trail closes the original finding rather than opening a new one.
- Regression handling: any new finding discovered during retest must be logged distinctly from the original finding it relates to, with severity, evidence, and remediation guidance.
- Retest pricing for items outside the retest window: included as a rate card in the proposal.
Vendors must describe in the proposal how their platform pairs retest evidence to the original finding so the engagement record reflects what was tested, what was fixed, and what was not.
9. Pricing and commercial terms
Ask every vendor for the same pricing structure. Mismatched structures across responses make line-by-line price comparison impossible.
Pricing structure requested:
- Fixed fee for the engagement as scoped in Section 2, with explicit assumptions about asset count, complexity, and depth.
- Day rate for change orders inside the engagement, broken down by tester seniority (junior, senior, principal).
- Rate card for retests outside the retest window in Section 8.
- Subscription or retainer pricing if the vendor proposes an ongoing programme model. Include the cap (assets, hours, or tester-days).
Commercial terms requested:
- Payment terms: {{PAYMENT_TERMS}} (e.g. net 30 from invoice).
- Invoicing schedule: {{INVOICING_SCHEDULE}} (e.g. on award, at testing close, at final report sign-off).
- Expenses policy: {{EXPENSES_POLICY}}.
- Currency: {{CURRENCY}}.
- Tax treatment: pricing exclusive of VAT or sales tax; vendor to confirm jurisdiction.
Vendor must declare in the proposal:
- The pricing model used.
- The assumptions baked into the fixed fee.
- The conditions under which a change order would be raised.
- Whether any third-party tooling is included in the price or charged separately.
10. Vendor qualifications
Capture the credentials and the specific experience of the testers who will execute the work. The named lead in the proposal must be the actual lead at award, not a substitute.
Required qualifications:
- Corporate accreditations: {{CORPORATE_ACCREDITATIONS}} (e.g. CREST, CHECK, OVS, STAR, FedRAMP 3PAO, ISO 27001 certification of the testing organisation).
- Tester certifications expected on the engagement team: {{TESTER_CERTIFICATIONS}} (e.g. OSCP, CREST CRT/CCT, GPEN, OSWE).
- Years of testing experience required for the named engagement lead: {{YEARS_EXPERIENCE}}.
- Prior engagements on similar asset classes (web, API, cloud, mobile, internal network) within the last {{REFERENCE_WINDOW}} months.
Vendors must include in the proposal:
- An organisational overview, including office locations and the legal entity that will contract with the Issuing Party.
- Anonymised resumes for the named engagement lead and any named senior testers.
- A statement that the named lead in the proposal will be the actual lead at award (or the substitution policy if the named lead becomes unavailable).
- Three references from comparable engagements in the last {{REFERENCE_WINDOW}} months, with permission for the Issuing Party to contact them.
The Issuing Party reserves the right to request a one-hour technical interview with the named lead before award.
11. Vendor security and risk
The vendor will hold sensitive evidence and credentials. Treat the vendor as a third party for risk management purposes and require evidence proportional to the data handled.
Vendor security questions:
- Information security management system: ISO 27001 certification status, scope of certification, and most recent audit date.
- SOC 2 Type II report availability for the testing organisation, with reporting period.
- Background check policy for testers handling buyer data.
- Cyber insurance: provider, policy limits, and coverage type.
- Subcontractor policy: whether subcontractors are used on engagements, how they are vetted, and whether the buyer can refuse a specific subcontractor at award.
- Data residency: where evidence and findings are stored, including the cloud regions used by any vendor-provided portal.
Incident notification:
- The vendor must commit to notifying the Issuing Party within {{INCIDENT_NOTIFICATION_HOURS}} hours of any incident affecting buyer data.
- The vendor must describe its incident response posture and any prior incidents involving buyer data in the last {{INCIDENT_HISTORY_WINDOW}} months.
12. Evaluation rubric and submission instructions
Publish the rubric in the RFP. Anchoring scoring to a published rubric removes evaluator bias and produces a defensible audit trail for the procurement decision.
Weighted scoring rubric:
- Technical capability and methodology fit: {{TECH_WEIGHT}}% (e.g. 30%).
- Reporting quality and deliverables: {{REPORTING_WEIGHT}}% (e.g. 15%).
- Accreditations, references, and tester experience: {{QUALIFICATIONS_WEIGHT}}% (e.g. 15%).
- Retest policy and post-engagement support: {{RETEST_WEIGHT}}% (e.g. 10%).
- Vendor security and data protection terms: {{VENDOR_RISK_WEIGHT}}% (e.g. 10%).
- Commercial terms, pricing model, and assumptions: {{COMMERCIAL_WEIGHT}}% (e.g. 20%).
Each criterion is scored 1 to 5 by at least two evaluators, multiplied by the weight, then summed for a single composite score per vendor. Technical sections are scored before commercial sections are opened.
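The composite calculation described above can be sketched in a few lines. The weights and scores below are example values for illustration, not recommendations:

```python
# Example weights for the Section 12 rubric; must sum to 1.0.
WEIGHTS = {
    "technical": 0.30, "reporting": 0.15, "qualifications": 0.15,
    "retest": 0.10, "vendor_risk": 0.10, "commercial": 0.20,
}

def composite_score(scores_by_evaluator: list[dict[str, int]]) -> float:
    """Average each criterion's 1-5 scores across evaluators, multiply by
    the criterion weight, and sum to one composite score per vendor."""
    composite = 0.0
    for criterion, weight in WEIGHTS.items():
        avg = sum(e[criterion] for e in scores_by_evaluator) / len(scores_by_evaluator)
        composite += avg * weight
    return round(composite, 2)

# Two evaluators scoring the same vendor independently.
evaluator_a = {"technical": 4, "reporting": 5, "qualifications": 4,
               "retest": 3, "vendor_risk": 4, "commercial": 3}
evaluator_b = {"technical": 5, "reporting": 4, "qualifications": 4,
               "retest": 3, "vendor_risk": 4, "commercial": 4}
print(composite_score([evaluator_a, evaluator_b]))
```

Retaining the per-evaluator inputs alongside the composite is what makes the selection defensible at audit time.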
Submission instructions:
- Submit the proposal as a single PDF to {{SUBMISSION_EMAIL}} no later than {{RESPONSE_DEADLINE}} ({{TIMEZONE}}).
- Mark the subject line {{RFP_REFERENCE}} - PROPOSAL - {{VENDOR_NAME}}.
- Questions during the open period: send to {{PROCUREMENT_CONTACT_EMAIL}}. The Issuing Party will publish anonymised answers to all bidders within {{QUESTION_TURNAROUND}} business days.
- Late submissions will not be considered.
How to use this template
Confirm the asset count, complexity, and target test depth before drafting. Use the pentest scoping calculator to validate a tester-day budget that matches the work in Section 2.
Replace every {{PLACEHOLDER}} with programme-specific values. Do not leave placeholders in the issued document.
Lock the weighted rubric in Section 12 before responses arrive. Score technical sections before opening commercial sections so price does not anchor the evaluation.
Run the same RFP across all invited vendors. Differences in the RFP, not in the responses, are the most common reason proposals turn out to be incomparable.
Expect vendors to respond using a structured proposal that mirrors this RFP back. The SecPortal penetration testing proposal template is the vendor-side counterpart to this RFP and makes evaluation against your scoring rubric materially easier. On award, move the agreed scope and methodology into the statement of work template, and the operational rules into the rules of engagement template. The RFP, the proposal, the SOW, and the ROE should reference the same engagement record.
Score the proposals using the pentest vendor evaluation scorecard. Set the weights before the responses arrive, score on a 1 to 5 scale per criterion, and retain the completed scorecard with the procurement file so the selection record is defensible at audit time.
Methodology references
PTES (Penetration Testing Execution Standard) covers pre-engagement, intelligence gathering, threat modelling, vulnerability analysis, exploitation, post-exploitation, and reporting. See the SecPortal PTES framework page for an operator-first walkthrough.
NIST SP 800-115 Technical Guide to Information Security Testing and Assessment, covering planning, execution, and post-execution of security assessments.
OWASP Web Security Testing Guide v4.2 and OWASP API Security Top 10.
OWASP Application Security Verification Standard (ASVS) for application verification depth.
CREST Defensible Penetration Test specification and CREST CHECK / OVS / STAR scheme documentation. See the CREST penetration testing framework page for accreditation context.
This template is provided as a starting point for a penetration testing request for proposal. It is not legal advice or procurement advice. Have the final RFP reviewed by procurement, legal, and the security stakeholder accountable for the engagement before issuing.