Penetration Testing Debrief Deck Template: pace the readout meeting and land on a remediation and retest plan
A free, copy-ready penetration testing debrief deck outline. Eleven structured slides covering title and engagement reference, agenda and ground rules, scope and methodology recap, executive risk summary, findings heat map, critical and high walkthrough, medium and low summary, disputed findings and severity calibration, remediation plan and SLAs, retest scope and schedule, and the decision log. Pairs with the executed engagement letter, the draft report, and the rules of engagement so the deck inherits scope, methodology, and authorisation rather than restating them. Aligned with PTES, NIST SP 800-115, and the CREST Defensible Penetration Test specification.
SecPortal stores the debrief deck alongside the engagement letter, SOW, ROE, draft and final reports, decision log, and retest evidence. One audit trail from authorisation to closure.
No credit card required. Free plan available forever.
Full template
Copy the full debrief deck outline
Eleven structured slides covering scope recap, risk summary, findings walkthrough, severity disputes, remediation plan, retest scope, and the decision log. Pair the outline with the pentest debrief meeting guide for the meeting agenda and ground rules. Replace every {{PLACEHOLDER}} with engagement-specific values before the meeting.
Slide 1. Title and engagement reference
The opening slide names the engagement so the deck is portable across audit, retest, and the next engagement. The reference matches the engagement letter, the SOW, and the report.
PENETRATION TESTING DEBRIEF
Engagement: {{ENGAGEMENT_NAME}}
Engagement reference: {{ENGAGEMENT_REFERENCE}}
Testing window: {{START_DATE}} to {{END_DATE}}
Draft report version: {{REPORT_VERSION}}, dated {{REPORT_DATE}}
Presented by: {{TESTING_FIRM_NAME}}
Engagement lead: {{ENGAGEMENT_LEAD_NAME}}, {{ENGAGEMENT_LEAD_TITLE}}
Test team: {{TEST_TEAM_NAMES}}
For: {{CLIENT_LEGAL_NAME}}
Sponsor: {{CLIENT_SPONSOR_NAME}}, {{CLIENT_SPONSOR_TITLE}}
Date of debrief: {{DEBRIEF_DATE}}
Slide 2. Agenda and ground rules
The agenda paces the meeting and signals which sections each audience cares about. Ground rules cover note-taking, recording, and how disputes will be handled live versus parked.
Agenda
1. Engagement scope and methodology recap (5 min)
2. Risk summary and headline findings (10 min)
3. Findings walkthrough (20 to 35 min)
4. Disputed findings and severity calibration (5 to 10 min)
5. Remediation plan and SLAs (5 min)
6. Retest scope and schedule (5 min)
7. Decision log and next steps (5 min)
Ground rules
- Critical and high findings are walked in full; mediums and lows are tabled and selected for detail.
- Severity disputes are captured on the disputes slide and resolved against the calibration source (CVSS 3.1 or 4.0 vector and environmental adjustment).
- Decisions are recorded on the decision log slide as we go; nothing leaves the meeting without a written note.
- The deck and the draft report were circulated 24 hours before the call. We assume the executive summary has been read.
Slide 3. Engagement scope and methodology recap
The scope and methodology recap sets the boundary for everything that follows. Restate what was tested, what was explicitly out of scope, and the methodologies the team applied. Detail lives in the report; this is the headline.
Scope tested
- Asset categories: {{ASSET_CATEGORIES}}
(for example: external web applications, internal network ranges, cloud accounts, mobile applications, source code repositories)
- Counted asset units: {{ASSET_COUNTS_PER_CATEGORY}}
- Test depth: {{TEST_DEPTH}} (PTES Level 1 reconnaissance, Level 2 standard, or Level 3 advanced; or scheme-equivalent)
Methodology applied
- {{METHODOLOGY_REFERENCES}} (PTES, NIST SP 800-115, OWASP WSTG, OWASP MASTG, OWASP ASVS, CREST as applicable)
- Tooling: {{TOOLING_LIST}} (Burp Suite, Nessus, semgrep, manual testing)
- Authentication state per asset class: {{AUTH_STATES}}
Out of scope
- {{EXPLICIT_EXCLUSIONS}} (per the executed Rules of Engagement Section 3)
- {{COMPENSATING_COVERAGE}} where exclusions had alternative coverage applied
The walkthrough that follows is bounded by this scope. Findings outside the scope listed here are not in the report and are not in this deck.
Slide 4. Risk summary at executive altitude
The risk summary is the slide the leadership audience came to see. Compress the executive summary into a single page that names the engagement risk rating, the headline number of critical and high findings, and the one-sentence story of the engagement.
Engagement risk rating: {{RISK_RATING}} ({{RISK_RATING_RATIONALE_ONE_SENTENCE}})
Findings counts
- Critical: {{CRITICAL_COUNT}}
- High: {{HIGH_COUNT}}
- Medium: {{MEDIUM_COUNT}}
- Low: {{LOW_COUNT}}
- Informational: {{INFORMATIONAL_COUNT}}
Story of the engagement
{{ONE_PARAGRAPH_NARRATIVE}}
(For example: a critical authentication bypass on the customer portal allowed account takeover without credentials. The same pattern was present in two adjacent applications. Cloud configuration was generally well managed; the principal exposures sit in the application authentication layer.)
What this means for the business
{{ONE_PARAGRAPH_BUSINESS_IMPACT}}
(For example: until the authentication bypass is remediated, customer data on the portal can be reached without authorisation. Compensating controls are limited; remediation should ship inside the next release window.)
Slide 5. Findings overview heat map
The heat map slide shows severity by domain or asset class so the engineering audience sees where to focus and the leadership audience sees the shape of the engagement. Tables work better than gradient grids in printed reports and screen-share calls.
Severity by domain (counts)
| Domain | Critical | High | Medium | Low | Info |
| Authentication | {{C_AUTH}} | {{H_AUTH}} | {{M_AUTH}} | {{L_AUTH}} | {{I_AUTH}} |
| Authorisation | {{C_AUTHZ}} | {{H_AUTHZ}} | {{M_AUTHZ}} | {{L_AUTHZ}} | {{I_AUTHZ}} |
| Input handling | {{C_INPUT}} | {{H_INPUT}} | {{M_INPUT}} | {{L_INPUT}} | {{I_INPUT}} |
| Cryptography | {{C_CRYPTO}} | {{H_CRYPTO}} | {{M_CRYPTO}} | {{L_CRYPTO}} | {{I_CRYPTO}} |
| Configuration | {{C_CONFIG}} | {{H_CONFIG}} | {{M_CONFIG}} | {{L_CONFIG}} | {{I_CONFIG}} |
| Network and transport | {{C_NET}} | {{H_NET}} | {{M_NET}} | {{L_NET}} | {{I_NET}} |
| Logging and monitoring | {{C_LOG}} | {{H_LOG}} | {{M_LOG}} | {{L_LOG}} | {{I_LOG}} |
| Dependencies | {{C_DEP}} | {{H_DEP}} | {{M_DEP}} | {{L_DEP}} | {{I_DEP}} |
Speaker notes
- Call out the top two domains by critical and high count.
- Where a previous engagement ran, compare against it as a regression check.
- Flag any domain with zero coverage if the scope justified coverage.
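The severity-by-domain counts above can be assembled mechanically from a findings export so the deck table and the report never disagree. A minimal sketch, assuming a simple list-of-dicts export with illustrative `domain` and `severity` field names (not a SecPortal schema):

```python
from collections import defaultdict

SEVERITIES = ["Critical", "High", "Medium", "Low", "Info"]

# Hypothetical finding records; field names are illustrative only.
findings = [
    {"id": "F-001", "domain": "Authentication", "severity": "Critical"},
    {"id": "F-002", "domain": "Authentication", "severity": "High"},
    {"id": "F-003", "domain": "Configuration",  "severity": "Medium"},
    {"id": "F-004", "domain": "Dependencies",   "severity": "Low"},
]

def heat_map(findings):
    """Count findings per (domain, severity) for the Slide 5 table."""
    counts = defaultdict(lambda: {s: 0 for s in SEVERITIES})
    for f in findings:
        counts[f["domain"]][f["severity"]] += 1
    return dict(counts)

table = heat_map(findings)
```

Generating the table from the same data the report uses is what keeps the regression check in the speaker notes honest: the deck shows the export, not a hand-copied version of it.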
Slide 6. Critical and high findings walkthrough (one slide each)
Every critical and high finding gets a full slide. The slide uses the same six fields the report uses so the walkthrough does not invent new claims. Drop the screenshot or proof on the slide; do not narrate evidence verbally.
Finding ID: {{FINDING_ID}}
Title: {{FINDING_TITLE}}
Severity: {{SEVERITY}} (CVSS {{CVSS_VERSION}}, vector {{CVSS_VECTOR}}, base score {{CVSS_BASE}})
Affected asset(s): {{AFFECTED_ASSETS}}
What we found
{{TWO_TO_THREE_SENTENCES_PLAIN_LANGUAGE}}
Evidence
{{SCREENSHOT_OR_REQUEST_RESPONSE_REFERENCE}}
(See report Appendix {{REPORT_APPENDIX_REFERENCE}} for full evidence.)
Why it matters
{{TWO_SENTENCES_BUSINESS_IMPACT}}
Recommended remediation
{{ONE_PARAGRAPH_FIX_GUIDANCE}}
Owner candidate: {{OWNER_TEAM_OR_NAME}}
SLA target: {{SLA_DAYS}} days from acceptance ({{SEVERITY}} per the agreed remediation policy)
Speaker notes
- Walk the finding at engineering depth; expect questions on reproduction.
- If severity is disputed, flag the dispute and capture it on the disputes slide.
- Do not negotiate severity here; calibration happens against the CVSS vector, not the slide.
Slide 7. Medium and low findings summary
Mediums and lows are tabled rather than walked one by one, with selected items highlighted because they cluster, repeat across assets, or carry compliance implications. The full detail lives in the report.
Medium and low findings (count: {{MEDIUM_PLUS_LOW_COUNT}})
| ID | Title | Severity | Affected asset | Theme |
| {{ID_1}} | {{TITLE_1}} | {{SEV_1}} | {{ASSET_1}} | {{THEME_1}} |
| {{ID_2}} | {{TITLE_2}} | {{SEV_2}} | {{ASSET_2}} | {{THEME_2}} |
| {{ID_3}} | {{TITLE_3}} | {{SEV_3}} | {{ASSET_3}} | {{THEME_3}} |
| {{ID_4}} | {{TITLE_4}} | {{SEV_4}} | {{ASSET_4}} | {{THEME_4}} |
| {{ID_5}} | {{TITLE_5}} | {{SEV_5}} | {{ASSET_5}} | {{THEME_5}} |
| ... | | | | |
Highlighted (call out one or two during the walk)
- {{HIGHLIGHTED_ID_AND_REASON_1}}
(For example: a Medium repeated on five separate endpoints suggests a shared library issue rather than five distinct fixes.)
- {{HIGHLIGHTED_ID_AND_REASON_2}}
(For example: a Low that maps directly to a PCI DSS or DORA requirement and therefore needs remediation evidence even though severity is Low.)
Informational items remain in the report and do not appear on this deck unless an attendee asks.
Slide 8. Disputed findings and severity calibration
Disputes are surfaced rather than hidden. Each disputed finding is named with the proposed resolution path so the meeting closes the dispute or commits to a path to close it.
Disputed findings
| ID | Buyer challenge | Calibration source | Proposed resolution |
| {{DID_1}} | {{CHALLENGE_1}} | {{CVSS_VECTOR_AND_ENV_1}} | {{RESOLUTION_PATH_1}} |
| {{DID_2}} | {{CHALLENGE_2}} | {{CVSS_VECTOR_AND_ENV_2}} | {{RESOLUTION_PATH_2}} |
| {{DID_3}} | {{CHALLENGE_3}} | {{CVSS_VECTOR_AND_ENV_3}} | {{RESOLUTION_PATH_3}} |
Resolution paths in scope today
- Accept tester severity (no change).
- Apply documented environmental adjustment that changes the score (record vector change).
- Accept buyer challenge and downgrade the finding (record rationale).
- Record formal risk acceptance against the original severity (record acceptance reference, owner, expiry, reassessment trigger).
Speaker notes
- Calibration happens against the CVSS vector and the environmental metrics, not against opinion.
- A risk acceptance is a written document with an owner and an expiry, not a verbal agreement to leave a finding open.
- Disputes that cannot be closed today get a named owner and a closing date on the decision log.
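The calibration mechanics behind these resolution paths can be made concrete. CVSS v3.1 publishes a fixed qualitative rating scale (None 0.0, Low 0.1 to 3.9, Medium 4.0 to 6.9, High 7.0 to 8.9, Critical 9.0 to 10.0), so an environmental adjustment only changes the severity label when it moves the score across a band boundary. A minimal sketch of that mapping:

```python
def severity_band(base_score: float) -> str:
    """Map a CVSS v3.1 score to its qualitative severity rating,
    per the CVSS v3.1 specification's qualitative severity rating scale."""
    if base_score == 0.0:
        return "None"
    if base_score <= 3.9:
        return "Low"
    if base_score <= 6.9:
        return "Medium"
    if base_score <= 8.9:
        return "High"
    return "Critical"

# An environmental adjustment that drops a 7.5 to 6.8 crosses the
# 6.9 boundary, so the label moves from High to Medium.
before, after = severity_band(7.5), severity_band(6.8)
```

The 7.5 to 6.8 pair is why the second resolution path exists as its own entry: a documented environmental adjustment downgrades the label only when the score crosses a band boundary, and the recorded vector change, not the discussion, is what justifies it.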
Slide 9. Remediation plan and SLAs
The remediation plan converts findings into work items with owners and SLA targets. Severity drives the default SLA; the agreed remediation policy sets the schedule that goes on this slide.
Remediation SLA policy applied
| Severity | SLA target | Source |
| Critical | {{CRITICAL_SLA_DAYS}} days | {{POLICY_REFERENCE}} (e.g. PCI DSS, NIST SP 800-40r4, internal policy) |
| High | {{HIGH_SLA_DAYS}} days | {{POLICY_REFERENCE}} |
| Medium | {{MEDIUM_SLA_DAYS}} days | {{POLICY_REFERENCE}} |
| Low | {{LOW_SLA_DAYS}} days | {{POLICY_REFERENCE}} |
Remediation plan
- Owners and target dates are recorded against each finding on the engagement record (not just on this slide).
- KEV-tagged findings (CISA Known Exploited Vulnerabilities) are escalated regardless of CVSS severity.
- Findings the buyer has accepted as risk are marked accepted and excluded from the SLA backlog (acceptance reference required).
Cross-references
- The remediation SLA calculator on the SecPortal tools page produces a defensible severity-to-window policy aligned with NIST SP 800-40r4.
- Aging metrics will be reported as MTTR by severity and percentage past SLA at each progress checkpoint.
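The aging metrics above reduce to simple date arithmetic once each finding carries an acceptance date and a severity. A minimal sketch, assuming illustrative SLA windows and a hypothetical findings structure (the real windows come from the agreed remediation policy, not this code):

```python
from datetime import date, timedelta

# Illustrative SLA windows in days; substitute the {{..._SLA_DAYS}}
# values from the agreed remediation policy.
SLA_DAYS = {"Critical": 15, "High": 30, "Medium": 90, "Low": 180}

def due_date(accepted: date, severity: str) -> date:
    # The SLA clock starts at acceptance, matching "days from acceptance".
    return accepted + timedelta(days=SLA_DAYS[severity])

def pct_past_sla(open_findings, today: date) -> float:
    # Percentage of open findings past their SLA target, one of the
    # checkpoint metrics named on this slide.
    if not open_findings:
        return 0.0
    overdue = sum(
        1 for f in open_findings
        if today > due_date(f["accepted"], f["severity"])
    )
    return 100.0 * overdue / len(open_findings)

# Hypothetical open findings; field names are illustrative.
open_findings = [
    {"id": "F-001", "severity": "Critical", "accepted": date(2024, 1, 10)},
    {"id": "F-002", "severity": "Medium",   "accepted": date(2024, 1, 10)},
]
past = pct_past_sla(open_findings, date(2024, 2, 1))
```

In the example, the Critical finding fell due on 25 January and is overdue by 1 February while the Medium is not, so half the open backlog is past SLA. Risk-accepted findings drop out of `open_findings` before the metric is computed, matching the exclusion rule above.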
Slide 10. Retest scope and schedule
The retest plan is on the deck so retest scope does not drift after the meeting. The plan names which findings will be retested, when, and how the retest is authorised.
Retest scope (default)
- All Critical findings.
- All High findings.
- Selected Mediums where remediation needs evidence of fix (named on this slide).
- Findings tied to compliance evidence (PCI DSS, ISO 27001 Annex A, SOC 2, DORA TLPT) regardless of severity.
Retest schedule
- Default: 30 days from remediation closure of the targeted findings.
- Compliance windows or scheme deadlines may compress the schedule (record the binding date).
- Retest results pair to the original finding so the aging clock keeps running rather than resetting.
Retest authorisation
- Authorised by an addendum to the engagement letter, not a fresh contract.
- Addendum references this debrief deck, the engagement reference, and the list of findings to be retested.
- Signed by the same Authorising Party representative who signed the original engagement letter.
Deliverable
- Retest report paired to the original finding, with retest date, evidence, and outcome (Closed, Open, Risk Accepted) on the same record.
- Retest report distributed alongside the engagement record so the audit trail stays intact.
Slide 11. Decision log and next steps
The decision log captures every decision made during the meeting so nothing leaves the room as a verbal agreement. Next steps are dated, owned, and tracked on the engagement record.
Decisions taken in this meeting
| # | Decision | Owner | Closing date |
| 1 | {{DECISION_1}} | {{OWNER_1}} | {{DATE_1}} |
| 2 | {{DECISION_2}} | {{OWNER_2}} | {{DATE_2}} |
| 3 | {{DECISION_3}} | {{OWNER_3}} | {{DATE_3}} |
Next steps (dated within 7 days of this meeting unless noted)
- Final report issued: {{REPORT_FINAL_DATE}}
- Disputed findings resolution closed: {{DISPUTE_CLOSE_DATE}}
- Remediation plan and owners confirmed in the engagement record: {{PLAN_CONFIRM_DATE}}
- Retest addendum signed: {{RETEST_ADDENDUM_DATE}}
- First progress checkpoint: {{CHECKPOINT_DATE}}
Storage
- The final deck, the final report, the decision log, and the retest addendum live against the engagement record so the next compliance audit, the next engagement, and the next retest open with the full prior context.
How to use this debrief deck template
Confirm the draft report is final enough to circulate. The debrief should never be the first time a critical finding hits the leadership audience. The pentest executive summary guide covers the front-of-report writing that the deck's risk summary slide compresses for leadership.
Send the deck and the draft report to attendees twenty-four hours ahead. Treat the executive summary as read in the meeting and use the time to walk findings, close disputes, and confirm the remediation and retest plan.
Replace every {{PLACEHOLDER}} with engagement-specific values. Pay particular attention to the engagement reference on Slide 1 (it must match the engagement letter) and the CVSS vectors on the findings walkthrough slides (they must match the report).
Walk the deck from the testing firm side rather than the buyer side, so the testing narrative carries through the meeting. The finding triage during pentest guide covers the severity calibration that the disputes slide leans on.
Capture decisions on the decision log slide as the meeting progresses. Nothing leaves the meeting as a verbal agreement; everything goes against the engagement record so the next retest, audit, and engagement open with the full prior context.
Distribute the final deck, the final report, the decision log, and any retest addendum within twenty-four hours of the call. Store all four against the engagement record alongside the original engagement letter.
Methodology and scheme references
PTES Section 7 (Reporting) treats the readout as a separate step from the written report. See the SecPortal PTES framework page for the operator-first walkthrough.
NIST SP 800-115 Technical Guide to Information Security Testing and Assessment, reporting phase.
CREST Defensible Penetration Test specification and CREST CHECK / OVS / STAR scheme documentation. See the CREST penetration testing framework page for accreditation context that often shapes the deck audience.
For severity-driven SLAs on Slide 9, the vulnerability remediation SLA calculator produces a defensible severity-to-window policy aligned with NIST SP 800-40r4. For severity calibration on Slide 8, the CVSS 3.1 and 4.0 calculator produces the vector and base score the disputes slide cites.
Where the debrief deck sits in the engagement
The deck is the working artefact for the readout meeting; the report is the durable record. The clean paper trail for a regulated penetration testing engagement runs RFP, proposal, SOW, ROE, engagement letter, draft report, debrief, final report, retest. The deck sits between draft and final report, paces the meeting, and binds the engagement to the remediation and retest plan everyone signed up to.
For the formal closeout artefact that follows the debrief and signs off the engagement, see the pentest closure letter template. The closure letter records the deliverable acceptance, the retest authorisation window, and the lapse of the engagement letter authorisation.
This template is provided as a starting point for a penetration testing debrief deck. It is not legal advice. Pair the deck with the engagement letter, the rules of engagement, the statement of work, and the draft report before the meeting, and store the final deck against the engagement record alongside those documents.