Vulnerability Remediation Worksheet: drive each finding from triage to verified close
A free, copy-ready vulnerability remediation worksheet. Ten structured sections covering finding identification, severity at detection, affected assets, remediation ownership and SLA, the remediation plan, fix evidence, the retest pair, the closure or risk acceptance decision, SLA performance and audit trail, and linked records. It pairs with the underlying finding rather than replacing it, so a successor reviewer can reconstruct the timeline without speaking to anyone who worked on it. Aligned with ISO/IEC 27001 Annex A.8 and Clause 8.3, SOC 2 CC7, PCI DSS Requirement 6, NIST SP 800-40, NIST SP 800-53 RA-5 and SI-2, and CISA KEV remediation cadence.
Run the worksheet inside the finding record, not in a spreadsheet
SecPortal stores severity, owner, SLA, retest evidence, closure timestamp, and the activity log against each finding, so the worksheet is a byproduct of the work. Free plan available forever; no credit card required.
Full worksheet
Copy the full vulnerability remediation worksheet
Ten structured sections, paired to the underlying finding rather than replacing it. Aligned with ISO/IEC 27001 Annex A.8 and Clause 8.3, SOC 2 CC7, PCI DSS Requirement 6, NIST SP 800-40, NIST SP 800-53 RA-5 and SI-2, and the CISA KEV cadence. Replace every {{PLACEHOLDER}} before the worksheet is signed off as closed.
1. Finding identification and source
Pair the worksheet to the underlying finding record. A successor reviewer should be able to reconstruct the entire remediation timeline without speaking to anyone who worked on it. ISO/IEC 27001 Annex A.8, SOC 2 CC7, and PCI DSS Requirement 6 all expect the source record to be traceable.
2. Severity at detection
Capture the severity at the moment of detection so the audit trail is unambiguous. Severity does not change as work progresses; closure lowers the residual risk after the fix or compensating control, not the severity itself.
CVSS 3.1 vector: {{CVSS_VECTOR}}
CVSS 3.1 base score: {{CVSS_BASE_SCORE}}
Severity: Critical / High / Medium / Low / Informational
CWE / OWASP mapping: {{CWE_OWASP_MAPPING}}
KEV listing (CISA Known Exploited Vulnerabilities): Yes / No / Not applicable
EPSS score (if applicable): {{EPSS_SCORE}}
Impact statement (one paragraph in plain language):
{{IMPACT_STATEMENT}}
Exploitation prerequisites:
{{EXPLOITATION_PREREQUISITES}}
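The severity fields above can be sanity-checked before the worksheet is signed off. A minimal sketch, assuming CVSS 3.1 vector strings in the standard `CVSS:3.1/...` form; the function names are illustrative, and the score bands follow the published CVSS v3.1 qualitative rating scale (this worksheet labels the 0.0 band "Informational" where CVSS calls it "None"):

```python
# CVSS 3.1 qualitative severity bands (lower bound, label), per the FIRST
# specification. The 0.0 band is labelled to match this worksheet's ladder.
SEVERITY_BANDS = [
    (9.0, "Critical"),
    (7.0, "High"),
    (4.0, "Medium"),
    (0.1, "Low"),
    (0.0, "Informational"),  # CVSS calls this band "None"
]

# The eight metrics every CVSS 3.1 base vector must carry.
BASE_METRICS = ("AV", "AC", "PR", "UI", "S", "C", "I", "A")

def is_valid_cvss31_vector(vector: str) -> bool:
    """Cheap structural check: correct prefix and all eight base metrics present."""
    if not vector.startswith("CVSS:3.1/"):
        return False
    metrics = dict(
        part.split(":", 1) for part in vector.split("/")[1:] if ":" in part
    )
    return all(m in metrics for m in BASE_METRICS)

def severity_from_base_score(score: float) -> str:
    """Map a CVSS 3.1 base score onto its qualitative severity band."""
    if not 0.0 <= score <= 10.0:
        raise ValueError(f"base score out of range: {score}")
    for threshold, label in SEVERITY_BANDS:
        if score >= threshold:
            return label
    return "Informational"
```

A mismatch between the recorded vector, base score, and severity label is worth catching here rather than during an audit.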
3. Affected assets and scope
List every system, service, repository, or business process the finding touches. A worksheet that names only one asset when three are affected leaves the other two unprotected once the first is closed.
Affected systems / applications / repositories: {{AFFECTED_ASSETS}}
Environment (production, staging, pre-production, development): {{ENVIRONMENT}}
Data classifications in scope: {{DATA_CLASSIFICATIONS}}
Business processes impacted: {{BUSINESS_PROCESSES}}
Regulatory scope (PCI DSS, ISO 27001, SOC 2, HIPAA, GDPR, NIS2, DORA, FedRAMP, CMMC, other): {{REGULATORY_SCOPE}}
Internet-facing exposure: Yes / No / Partial
Compensating control already in place: {{COMPENSATING_CONTROL_AT_DETECTION}}
4. Remediation ownership and SLA
A finding without a named owner is a finding that nobody fixes. Pair every worksheet with an accountable remediation owner, a stakeholder who can authorise the work, and an SLA target the team has agreed in advance. "The platform team" is not an owner; a named person is.
Remediation owner (named individual): {{REMEDIATION_OWNER}}
Owner team / business unit: {{OWNER_TEAM}}
Stakeholder approving the work: {{STAKEHOLDER_NAME}}
Date assigned to owner: {{DATE_ASSIGNED}}
SLA target by severity (default ladder, tune to your programme):
- Critical: 7 days from detection
- High: 30 days from detection
- Medium: 90 days from detection
- Low: 180 days from detection
- Informational: best effort
Agreed SLA target date for this finding: {{SLA_TARGET_DATE}}
Framework alignment for the SLA (PCI DSS Requirement 6, ISO 27001 Annex A.8, SOC 2 CC7, CISA KEV cadence, internal policy): {{SLA_FRAMEWORK_BASIS}}
Linked engineering ticket (if any): {{ENGINEERING_TICKET_REFERENCE}}
5. Remediation plan and working notes
The plan describes how the finding will be fixed (or compensated for), what changes, and who reviews the change before it ships. Working notes capture the rolling diary of the remediation so reviewers and auditors can see the path from triage to fix.
Remediation approach (patch, upgrade, configuration change, compensating control): {{REMEDIATION_APPROACH}}
Reviewer of the change before release: {{CHANGE_REVIEWER}}
Working notes (rolling diary, dated entries): {{WORKING_NOTES}}
6. Fix evidence
Evidence is the proof a successor reviewer would need to confirm the fix landed: a configuration excerpt, a commit hash, a change ticket reference, a screenshot of the corrected behaviour. Vague claims like "fixed in production" are not evidence.
Fix landed in (environment): {{FIX_LANDED_ENVIRONMENT}}
Fix landed on (date): {{FIX_LANDED_DATE}}
Commit, change ticket, or release reference: {{COMMIT_OR_CHANGE_REFERENCE}}
Evidence required to verify the fix:
- Configuration excerpt or diff: {{CONFIG_EVIDENCE_REFERENCE}}
- Code change reference: {{CODE_EVIDENCE_REFERENCE}}
- Test or scan output showing the previously vulnerable behaviour is no longer present: {{TEST_EVIDENCE_REFERENCE}}
- Screenshot of the corrected behaviour (if user-visible): {{SCREENSHOT_REFERENCE}}
- Logging or detection rule that would surface a regression: {{DETECTION_RULE_REFERENCE}}
Side effects or regressions introduced by the fix:
{{SIDE_EFFECTS_NOTES}}
7. Retest pair
A retest is what turns "fixed" into "verified". Pair the retest to the original finding so the close-out record captures the original scope, the fix, the retest evidence, and the final outcome on a single timeline. Without a retest pair, closure is an assertion rather than a verified fact.
Retest scheduled date: {{RETEST_SCHEDULED_DATE}}
Retest performed date: {{RETEST_PERFORMED_DATE}}
Tester or verifier (named individual, not the same person who wrote the fix): {{TESTER_NAME}}
Verification method: Manual reproduction / Automated test / Authenticated scan / External scan / Code review / All applicable
Retest outcome: Verified Closed / Partial (residual issue) / Failed (issue still present) / Regressed (re-emerged)
Retest evidence references:
- Reproduction steps used during retest: {{RETEST_REPRO_STEPS_REFERENCE}}
- Output of automated test or scan: {{RETEST_AUTOMATED_OUTPUT}}
- Notes on residual or partial issues:
{{RETEST_RESIDUAL_NOTES}}
8. Closure decision or risk acceptance
Closure is an explicit decision: verified close, accepted risk with a structured acceptance form, or false positive with a documented reason (duplicates, out-of-scope findings, and will-not-fix decisions follow the same rule). "Closed by the team" without a decision type is not a closure. Pair every closure with a timestamp and a named decision maker.
Closure decision: Verified Closed / Accepted Risk / False Positive / Duplicate / Out of Scope / Will Not Fix
Closure timestamp: {{CLOSURE_TIMESTAMP}}
Decision maker (named individual): {{CLOSURE_DECISION_MAKER}}
If Accepted Risk:
- Linked risk acceptance form reference: {{RISK_ACCEPTANCE_REFERENCE}}
- Compensating controls in place: {{COMPENSATING_CONTROLS}}
- Hard expiry of acceptance: {{ACCEPTANCE_EXPIRY_DATE}}
- Cancellation triggers: {{ACCEPTANCE_CANCELLATION_TRIGGERS}}
If False Positive:
- Reason: {{FALSE_POSITIVE_REASON}}
- Reviewer who confirmed false positive: {{FALSE_POSITIVE_REVIEWER}}
- Detection rule or scanner signature suppressed (if any): {{SUPPRESSED_RULE_REFERENCE}}
If Duplicate or Out of Scope:
- Linked canonical finding or scope statement: {{LINKED_CANONICAL_OR_SCOPE_REFERENCE}}
If Will Not Fix:
- Rationale, named approver, and review date: {{WILL_NOT_FIX_RATIONALE_AND_APPROVER}}
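The decision branches above amount to a completeness check: each closure type carries its own mandatory fields, and a worksheet missing any of them is not ready for sign-off. A minimal sketch; the field names are illustrative, chosen to mirror the placeholders above, and should be adapted to your record schema:

```python
# Fields each closure decision type must carry before sign-off.
# Names mirror the worksheet placeholders; adapt to your record schema.
REQUIRED_FIELDS = {
    "Verified Closed": {"closure_timestamp", "decision_maker", "retest_outcome"},
    "Accepted Risk": {"closure_timestamp", "decision_maker",
                      "risk_acceptance_reference", "acceptance_expiry_date"},
    "False Positive": {"closure_timestamp", "decision_maker",
                       "false_positive_reason", "false_positive_reviewer"},
    "Duplicate": {"closure_timestamp", "decision_maker",
                  "linked_canonical_reference"},
    "Out of Scope": {"closure_timestamp", "decision_maker",
                     "scope_statement_reference"},
    "Will Not Fix": {"closure_timestamp", "decision_maker",
                     "rationale", "approver", "review_date"},
}

def missing_closure_fields(decision: str, record: dict) -> set[str]:
    """Return the fields still missing (absent or empty) for this decision type."""
    required = REQUIRED_FIELDS.get(decision)
    if required is None:
        raise ValueError(f"unknown closure decision: {decision}")
    return {f for f in required if not record.get(f)}
```

An empty result means the closure record is structurally complete; it does not judge whether the evidence itself is adequate.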
9. SLA performance and audit trail
Auditors ask for four things per finding: severity at detection, named owner, closure or acceptance evidence, and timeline against the agreed SLA. Capture them on one record and the export is the audit artefact.
10. Linked records
Attach everything a successor reviewer needs to reconstruct the decision. Pair the worksheet to the original finding, the engagement, the retest, the risk acceptance form (if any), and the framework controls the work supports.
Linked records:
- Original finding record: {{LINK_TO_FINDING}}
- Engagement, scan, or report record: {{LINK_TO_ENGAGEMENT}}
- Retest record: {{LINK_TO_RETEST}}
- Risk acceptance form (if applicable): {{LINK_TO_RISK_ACCEPTANCE}}
- Engineering ticket (if applicable): {{LINK_TO_ENGINEERING_TICKET}}
- Vendor advisory or third-party reference: {{LINK_TO_VENDOR_ADVISORY}}
Framework citations supporting this remediation worksheet:
- ISO/IEC 27001:2022 Annex A.8 technological controls and Clause 8.3 information security risk treatment
- SOC 2 Trust Services Criteria CC7.1, CC7.2, CC7.3 system operations and change management
- PCI DSS v4.0 Requirement 6 develop and maintain secure systems and software
- NIST SP 800-40 patch management programme guidance
- NIST SP 800-53 RA-5 vulnerability monitoring and scanning, SI-2 flaw remediation
- CISA KEV cadence for known exploited vulnerabilities
- OWASP ASVS verification requirements (where applicable)
How to use this worksheet
Confirm the underlying finding has a CVSS 3.1 vector, evidence at detection, and remediation guidance recorded against it. The worksheet is the operating record that runs the work after the scan, not a substitute for the finding record itself. Use the SecPortal findings management feature so the worksheet, the finding, and the closure timestamp share one record.
Replace every {{PLACEHOLDER}} with the values for this specific finding. Avoid copy-paste working notes across multiple findings: every worksheet should stand on its own evidence.
Set the SLA target in Section 4 against a documented severity ladder, not a per-finding negotiation. Use the free vulnerability remediation SLA calculator to produce a defensible ladder tied to PCI DSS, ISO 27001, SOC 2, or CISA KEV expectations, then apply it consistently across the worksheet population.
Pair the retest in Section 7 to the original finding so the close-out record captures the original scope, the fix, the retest evidence, and the final outcome on one timeline. A separate retest spreadsheet is the most common reason closure timestamps drift.
Capture status transitions with timestamps so the audit log in Section 9 is a byproduct of the work rather than a separate task. The SecPortal workspace activity log records every status change, comment, evidence upload, and approval against the finding record so the audit trail does not depend on anyone remembering to log it.
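In code, the idea is an append-only log keyed to the finding, where every status change, comment, and evidence upload becomes an immutable timestamped entry. A minimal sketch of the shape of such a record (not the SecPortal implementation; all names are illustrative):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class AuditEvent:
    """One immutable entry in the finding's activity log."""
    timestamp: datetime
    actor: str
    event_type: str   # e.g. "status_change", "comment", "evidence_upload"
    detail: str

@dataclass
class FindingAuditLog:
    """Append-only trail of activity for one finding."""
    finding_id: str
    events: list[AuditEvent] = field(default_factory=list)

    def record(self, actor: str, event_type: str, detail: str) -> AuditEvent:
        event = AuditEvent(datetime.now(timezone.utc), actor, event_type, detail)
        self.events.append(event)  # append only; entries are never edited or removed
        return event

    def status_timeline(self) -> list[tuple[datetime, str]]:
        """Timestamps of every status change, in order, for SLA reconstruction."""
        return [(e.timestamp, e.detail) for e in self.events
                if e.event_type == "status_change"]
```

Because the log is written as a side effect of each transition, the Section 9 timeline falls out of `status_timeline()` rather than anyone's memory.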
Framework references
ISO/IEC 27001:2022 Annex A.8 technological controls and Clause 8.3 information security risk treatment. See the SecPortal ISO 27001 framework page for evidence expectations on remediation and risk treatment.
SOC 2 Trust Services Criteria CC7.1 (detection of vulnerabilities and configuration changes), CC7.2 (monitoring for anomalies), and CC7.3 (evaluation of security events). See the SecPortal SOC 2 framework page for the evidence pack structure.
PCI DSS v4.0 Requirement 6 develop and maintain secure systems and software. See the SecPortal PCI DSS framework page for remediation cadence expectations inside the cardholder data environment.
NIST SP 800-40 guide to enterprise patch management planning, NIST SP 800-53 RA-5 vulnerability monitoring and scanning, and NIST SP 800-53 SI-2 flaw remediation. See the SecPortal NIST SP 800-53 framework page for control-level mapping.
For the analytical view of how findings age past SLA and turn into risk debt across PCI DSS, ISO 27001, SOC 2, and CISA KEV programmes, see the research on aging pentest findings.
This worksheet is provided as a starting point for documenting the remediation of a single vulnerability. It is not legal advice and is not a substitute for your organisation's vulnerability management policy, internal audit procedures, or regulator-specific evidence requirements. Have the final worksheet structure reviewed against the policies and frameworks that govern your programme before it is rolled out.