Use Case

Pentest vendor panel management
one record across every approved provider

Run a panel of approved penetration testing vendors as a structured record rather than a folder of master service agreements and a memory of who did the last test. Capture each vendor with their capability matrix, status, and performance history; match every new engagement to the right vendor by capability and rotation rule; score delivery against the same criteria across the panel; and walk into the next renewal with the panel evidence the buyer board expects.

No credit card required. Free plan available forever.

Run the pentest panel as a record, not a procurement folder

Most security functions that run more than two pentests a year do not run them through one vendor. They run a panel: a small list of approved providers, framework agreements signed for two or three years, capability spread across web apps, infrastructure, cloud, mobile, and the occasional red team. The panel is a procurement artefact, but the work it carries is operational, and the gap between the two is where panels age into ineffectiveness. The rotation rule stops being enforced. Performance data lives in inboxes. Findings sit in per-vendor PDFs that never aggregate. By the time the panel review opens, nobody can show which vendor outperformed which, why a specific engagement went to a specific vendor, or whether the panel actually rotated at all.

SecPortal models a pentest vendor panel as a record on the buyer workspace. The panel carries the vendor directory, the rotation policy, the engagement portfolio, the performance scorecards, and the cross-vendor findings catalogue. Each engagement opens against the panel with a tagged vendor; each closed engagement produces a structured scorecard. Renewal evidence is the panel record itself rather than a recap deck written the week before the procurement review.

What the panel record carries

Five layers compose the panel record. Each layer has its own data shape and its own owner; together they replace the procurement folder, the email thread, and the spreadsheet of vendor scores that most panels actually run on.

Vendor directory and capability matrix

The list of approved vendors on the panel, with each vendor recorded as their own entity: MSA term, insurance, certifications (CREST, CHECK, OSCP-named team count), test types delivered (web app, mobile, API, infrastructure, cloud, code review, red team), industry experience, regional presence, and rate card. The directory is the panel itself; everything else hangs off it.
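
As a concrete shape, here is a minimal sketch of one directory entry in TypeScript. Every type and field name is hypothetical; SecPortal's actual schema is not published, and a real rate card is richer than a single figure.

    // Hypothetical shape for one vendor record on the panel directory.
    type TestType =
      | "web_app" | "mobile" | "api" | "infrastructure"
      | "cloud" | "code_review" | "red_team";

    interface PanelVendor {
      id: string;
      name: string;
      msaStart: Date;
      msaEnd: Date;                     // MSA term
      insuranceValidUntil: Date;
      certifications: string[];         // e.g. ["CREST", "CHECK"]
      oscpNamedTesters: number;         // count of OSCP-certified named testers
      capabilities: TestType[];         // this vendor's row in the capability matrix
      industries: string[];
      regions: string[];
      dayRate: number;                  // rate card, simplified to one number
      status: "active" | "offboarded";
    }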

Rotation and assignment policy

The rule the panel uses to match an engagement to a vendor: round-robin within the eligible subset, capability-weighted (the best match wins), named-account (a vendor owns a specific business unit), or hybrid. The policy is captured once on the panel record and applied consistently rather than negotiated engagement by engagement.
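
A sketch of how one of those rules resolves in practice, reusing the PanelVendor and TestType shapes above. The round-robin mechanics shown here are an assumption for illustration, not SecPortal's documented behaviour; capability-weighted and named-account rules would plug into the same eligible-subset step.

    // Hypothetical matching: filter the panel to the eligible subset for the
    // scope, then let the rotation rule pick within it.
    interface EngagementScope {
      testType: TestType;
      businessUnit: string;
    }

    function eligibleVendors(
      panel: PanelVendor[],
      scope: EngagementScope
    ): PanelVendor[] {
      return panel.filter(
        (v) => v.status === "active" && v.capabilities.includes(scope.testType)
      );
    }

    // Round-robin: the eligible vendor who has waited longest since their
    // last assignment takes the engagement.
    function roundRobinPick(
      eligible: PanelVendor[],
      lastAssigned: Map<string, Date>   // vendorId -> date of last assignment
    ): PanelVendor | undefined {
      return [...eligible].sort(
        (a, b) =>
          (lastAssigned.get(a.id)?.getTime() ?? 0) -
          (lastAssigned.get(b.id)?.getTime() ?? 0)
      )[0];
    }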

Engagement portfolio across vendors

Every pentest, retest, code review, and advisory pass opens as an engagement linked to the panel and tagged with the assigned vendor. The portfolio carries scope, ROE, evidence, findings, reports, and retests on the engagement record while the panel aggregates which vendor delivered what across the cycle.

Performance scorecard ledger

Every closed engagement produces a scorecard against shared criteria: SLA hit rate, severity calibration accuracy, evidence completeness, finding novelty, communication quality, and overrun rate. Scores are comparable across vendors because the criteria are identical, so the panel performance picture is data rather than impression.
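
A sketch of what structured rather than free-text means in data terms. The 1-to-5 scales and the rollup arithmetic are assumptions for illustration; the six criteria are the ones detailed under performance signals below.

    // Hypothetical scorecard captured at engagement close.
    interface VendorScorecard {
      engagementId: string;
      vendorId: string;
      slaHitRate: number;               // 0..1, severity-weighted
      severityCalibration: number;      // share of severities upheld at QA, 0..1
      evidenceCompleteness: number;     // 1..5
      findingNovelty: number;           // 1..5
      communicationQuality: number;     // 1..5
      overrunRate: number;              // actual hours / scoped hours
    }

    // Roll scorecards up to one per-vendor figure for the panel review.
    // Overrun rate is left out of the average; it drives the commercial
    // conversation rather than the quality score.
    function averageScore(cards: VendorScorecard[], vendorId: string): number {
      const own = cards.filter((c) => c.vendorId === vendorId);
      if (own.length === 0) return NaN;
      const perCard = own.map(
        (c) =>
          (c.slaHitRate * 5 +           // normalise 0..1 onto the 1..5 scale
            c.severityCalibration * 5 +
            c.evidenceCompleteness +
            c.findingNovelty +
            c.communicationQuality) / 5
      );
      return perCard.reduce((sum, x) => sum + x, 0) / perCard.length;
    }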

Cross-vendor findings catalogue

Findings persist on the asset record across every vendor that has tested it. A finding raised by Vendor A in Q1 stays open when Vendor B retests in Q3, with the same identifier and the same remediation owner. The catalogue is the durable artefact; vendor reports are snapshots of it.
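
A sketch of the catalogue entry that makes this persistence possible, with hypothetical names throughout: the finding is keyed to the asset and keeps one identifier while retests accumulate against it, whichever vendor runs them.

    // Hypothetical catalogue entry: the finding lives on the asset, and each
    // retest appends to its history regardless of which vendor ran it.
    interface CatalogueFinding {
      id: string;                       // stable identifier, never reissued
      assetId: string;
      title: string;
      severity: "critical" | "high" | "medium" | "low" | "info";
      raisedBy: string;                 // vendorId of the original engagement
      remediationOwner: string;
      status: "open" | "remediated" | "verified";
      retests: { vendorId: string; date: Date; outcome: "still_open" | "fixed" }[];
    }

    // When a different vendor retests an asset, surface its open findings
    // rather than letting them be raised again under new identifiers.
    function openFindingsForRetest(
      catalogue: CatalogueFinding[],
      assetId: string
    ): CatalogueFinding[] {
      return catalogue.filter((f) => f.assetId === assetId && f.status === "open");
    }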

How panel management differs from adjacent workflows

Panel management is often confused with single-engagement project management, retainer management, programme management, or one-time vendor selection. Each is its own workflow with its own data shape; the panel is the supplier-side governance layer that sits between procurement and delivery.

Single-engagement project management

Single-engagement project management runs one pentest from scoping to delivery with one vendor. Panel management is the governance layer above it: it spans many engagements across many vendors, applies a rotation rule, scores performance against shared criteria, and produces the renewal evidence no single engagement can.

Pentest retainer management

Retainer management is a commercial parent layer for one buyer-and-vendor relationship: contracted hours, drawdown, cadence, and renewal terms. Panel management is broader: it spans multiple vendors and asset groups, governs which vendor takes which engagement, and lets retainers exist underneath it where the buyer commits to specific volumes per vendor.

Security testing programme management

Security testing programme management owns the asset coverage map, the testing cadence, and the cross-engagement findings ledger. Panel management is the supplier-side mirror: it owns the vendor directory, the rotation rule, and the performance scorecard. The two compose: a programme decides what gets tested, the panel decides who tests it, and the engagements live underneath both.

One-time vendor selection

A one-time vendor selection runs an RFP, scores proposals, picks one vendor, and signs an MSA. Panel management is the workflow that follows: the panel exists for the next two or three years, the vendor selection event seeds it, and panel management is what keeps the panel performing rather than ossifying around the original choice.

Where panels usually go wrong

Five failure modes account for most panel attrition. Each one is silent during the cycle and loud at renewal, when the buyer and the vendors look at different numbers and the framework agreement renews on inertia rather than on evidence.

No capability matrix; the right vendor is whoever is free

The panel exists in spirit but no record captures which vendors can credibly deliver which test types. Engagements get routed by who picks up the phone, capability matches are accidental, and one vendor ends up with most of the volume regardless of fit. The fix is a capability matrix on the panel record, applied at engagement opening time rather than at procurement time.

Rotation rule that nobody enforces

The procurement document says "rotate annually" but the same vendor takes the same business unit for four years running because nobody surfaces the rotation breach until the panel review. The fix is to make rotation a first-class status on the engagement record so the rule shows up at assignment, not at audit.

Per-vendor reports that never aggregate

Each vendor delivers their own PDF, the buyer files them in a folder, and the cross-vendor view only exists when a CISO asks for it and an analyst spends a day collating spreadsheets. Aging findings, repeat issues, and coverage gaps stay invisible because they only surface when somebody goes looking. The fix is one finding catalogue across every vendor on the panel.

Performance felt rather than scored

The panel review opens with an opinion that Vendor A is sharper than Vendor B and Vendor C is slow on retests. The opinion may be right, but it cannot be defended at procurement, and the underperforming vendor renews because nobody can point at the specific failure. The fix is a shared scorecard against criteria captured at engagement close, not at panel review.

Onboarding and offboarding leak the audit trail

A vendor leaves the panel and their engagements, findings, and evidence go with them because the data lived in the vendor portal rather than the buyer workspace. The next panel cycle starts with a forensic exercise to reconstruct what was delivered. The fix is to keep delivery on the buyer engagement record from day one so the panel composition can change without losing the history.

How panel management looks in SecPortal

Panel management runs across four product surfaces: engagement management for the assignment and portfolio, findings management for the cross-vendor catalogue, team management for the role-based access controls that keep vendor data scoped to the right consultant, and AI report generation for the panel review writeup at the renewal window. The branded client portal carries the buyer identity even when multiple vendors deliver across the cycle, so internal stakeholders see one programme rather than three.

Vendor directory

Each approved vendor is a record with capability tags, certifications, MSA term, and insurance state. The directory is the panel itself; engagement assignment reads from it directly rather than from a slide deck.

Rotation engine

New engagements surface eligible vendors against the rotation policy, the assignment decision is recorded on the engagement, and overrides carry a written reason so the audit trail is complete.

Scorecard ledger

Closed engagements produce a structured scorecard against shared criteria. Scores roll up to the vendor record across the cycle, so the panel review opens with the data already aggregated.

Performance signals the scorecard captures

Six signals make a vendor scorecard comparable across the panel. Each signal is captured at engagement close as structured data rather than as a comment, so the next panel review can compare like with like rather than relying on memory of who felt sharper.

  • SLA hit rate: Scoping turnaround, kickoff schedule, report delivery, and retest delivery against the contracted SLA per engagement. Reported as a percentage and a count of breaches, severity-weighted so a critical-finding retest miss matters more than a low-severity one.
  • Severity calibration accuracy: How many vendor-claimed severities held up at QA review and how many were adjusted at calibration. A vendor whose severities consistently inflate or deflate signals a calibration gap that has to be fixed at the vendor level, not engagement by engagement.
  • Evidence completeness: Whether each finding ships with the request and response, the screenshot, the reproduction steps, and the CVSS 3.1 vector at the standard expected by the QA bar. Incomplete evidence is the early signal of a vendor that will struggle on the next retest.
  • Finding novelty against the catalogue: How many findings the vendor raised that were genuinely new versus duplicates of issues already in the catalogue. Novelty does not mean every duplicate is a failure; it means the panel can see whether a vendor is finding what others miss or repeating what others have already said.
  • Communication quality: Responsiveness during the test, clarity in the report, debrief discipline, and follow-up on questions. Recorded as a structured rating against shared criteria rather than a free-text comment so the panel review compares like with like.
  • Overrun rate: Variance between scoped hours or test slots and actual consumption. Persistent overruns flag scoping inaccuracy or delivery indiscipline. The figure is the same number that drives the change-order conversation, so commercial conversations and performance conversations stay aligned.
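
Two of those signals reduce to arithmetic worth pinning down. A sketch with illustrative weights; the specific weight values are assumptions, not SecPortal's published formula.

    // Hypothetical severity weights for SLA breach counting: a missed
    // critical-finding retest costs more than a missed low-severity one.
    const BREACH_WEIGHT = {
      critical: 5,
      high: 3,
      medium: 2,
      low: 1,
    } as const;

    interface SlaEvent {
      met: boolean;                     // did this SLA checkpoint hit its date?
      severity: keyof typeof BREACH_WEIGHT;  // e.g. the finding a retest covers
    }

    // Severity-weighted hit rate: weight every SLA checkpoint by severity,
    // then take the met share. 1.0 means no breaches at any severity.
    function weightedSlaHitRate(events: SlaEvent[]): number {
      const total = events.reduce((s, e) => s + BREACH_WEIGHT[e.severity], 0);
      const met = events
        .filter((e) => e.met)
        .reduce((s, e) => s + BREACH_WEIGHT[e.severity], 0);
      return total === 0 ? 1 : met / total;
    }

    // Overrun rate: actual consumption against scoped hours or test slots.
    // A value of 1.2 reads as a 20% overrun.
    const overrunRate = (scopedHours: number, actualHours: number) =>
      actualHours / scopedHours;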

Renewal evidence the panel record carries by default

Panel renewals are won or lost on whether the buyer can show the work. Seven artefacts come straight off the panel record without a manual recap pass and form the spine of the renewal proposal that goes to procurement and the security committee.

  • Vendor directory with current MSA term, insurance, certifications, and capability matrix per vendor.
  • Rotation policy applied per engagement with the assignment decision recorded on the engagement.
  • Engagement portfolio across the cycle: scope, vendor, hours planned, hours actual, deliverable, retest outcome.
  • Performance scorecards per closed engagement with severity-weighted SLA breach detail.
  • Cross-vendor findings catalogue with aging picture, repeat-issue rate per asset, and verification status.
  • Vendor onboarding and offboarding events with retained delivery history regardless of panel changes.
  • Renewal proposal pulled directly from the panel record rather than from a procurement memo or recap deck.

Where panel management sits across the engagement lifecycle

Panel management is the supplier-side parent layer that sits between procurement and delivery. Engagements still run through pentest project management and retesting on their own records; the panel aggregates them. Where a buyer commits volumes to one vendor, that relationship runs as a pentest retainer underneath the panel. The asset coverage map and the testing cadence sit on the parallel security testing programme record.

Onboarding new vendors

New vendors are onboarded through pentest client onboarding applied in reverse: the buyer captures the vendor profile, the contract state, and the access scope on the panel record before the first engagement is opened.

Cross-vendor remediation

Findings raised by any panel vendor flow into the same remediation tracking workflow. Aging issues stay visible across vendor changes, and retests carry the original finding identifier regardless of which vendor runs the verification.

Pair the workflow with the surrounding tools and guides

Panel management is operational; the surrounding tools and guides cover the discrete events that feed it. Use the penetration testing RFP template to issue the original procurement, the pentest vendor evaluation scorecard to rank proposals, the statement of work template for each engagement signed under the panel, and the rules of engagement template for operational governance per test. The narrative guide on choosing a security testing provider covers the buyer-side selection logic, the pentest pricing models research explains the commercial structures vendors quote against, and the aging pentest findings research is the risk-debt argument that justifies cross-vendor finding aggregation in the first place.

Buyer and operator pairing

Internal security teams running annual pentest panels

Enterprise security functions that maintain a panel of two to five approved pentest vendors over multi-year framework agreements use the panel record to apply rotation, score performance, and produce the panel review the security committee expects.

vCISOs governing client testing programmes

vCISOs serving multiple clients run a panel on behalf of each client. The panel record carries vendor capability and performance per client, so the vCISO can show a CFO why a specific vendor renewed and why another rotated out without rebuilding the case from scratch.

Compliance and procurement teams running framework agreements

Procurement and compliance teams that own the framework-agreement process for security testing use the panel record as the single source of vendor compliance, capability, and performance data. The panel review is operational rather than archaeological.

MSSPs running a panel of subcontracted specialist firms

MSSPs that subcontract specialist work (red team, OT/ICS, mobile, hardware) to named partners run those partners as a panel under the MSSP buyer record. The MSSP keeps the buyer-facing relationship while the panel record carries which subcontractor delivered which engagement and how each performed.

Who runs this workflow

Pentest panel management is the supplier-side governance layer that internal security teams, vCISOs, compliance consultants, and MSSPs run on top of standard engagement delivery. The panel is the supplier list; the engagements under it are the work.

What good panel management feels like

One supplier view, no archaeology

The buyer sees one panel record, one finding catalogue across vendors, and one scorecard ledger. Panel review stops being a forensic exercise and starts being a decision against evidence.

Renewals run on data

Engagements delivered, scorecard averages, SLA performance, and finding contribution per vendor come straight off the record. The panel renewal opens with comparable data rather than with a procurement memo written from memory.

Panel management is the workflow that decides whether a multi-vendor pentest programme compounds into a defensible supplier portfolio or churns through framework-agreement renewals on inertia. Get it right and each panel cycle ends with the data the next procurement review needs already on the record. Get it wrong and every renewal is a fresh negotiation against a missing supplier history.

Frequently asked questions about pentest vendor panel management

What is pentest vendor panel management?

Pentest vendor panel management is the operational workflow of running a list of approved penetration testing vendors over time. It covers vendor onboarding, capability capture, rotation rules, engagement assignment, performance scoring, cross-vendor finding aggregation, and panel renewal. It is distinct from a one-time vendor selection (which seeds a panel) and from running a single vendor relationship (which is a subset of a panel of one).

How is panel management different from running a retainer with one vendor?

A retainer is a commercial parent layer for one buyer-and-vendor relationship: contracted hours, drawdown, billing cadence, and renewal terms. Panel management spans multiple vendors, applies a rotation rule between them, and scores performance across them. The two compose. A panel can include retainers, where specific vendors carry committed volumes, while other engagements still rotate across the rest of the panel.

Does SecPortal replace procurement systems for pentest vendor panels?

No. SecPortal is the operational system above the engagement; it carries the panel directory, the rotation logic, the engagement portfolio, the performance scorecards, and the cross-vendor finding catalogue. Procurement systems still own contract storage, payment terms, and supplier financial controls. The panel record references the MSA and insurance state per vendor without trying to replace the contract repository.

How is rotation enforced in practice?

The rotation rule (round-robin, capability-weighted, named-account, or hybrid) is captured on the panel record. When a new engagement opens with a scope, the panel record surfaces the eligible vendors and the rotation suggestion. The buyer can override with a recorded reason; the override stays on the engagement so the panel review can see how often rotation was applied versus overridden, and why.
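
In data terms, the audit trail that makes this reviewable can be as small as the record below. The field names are hypothetical, not SecPortal's schema.

    // Hypothetical assignment decision recorded on the engagement: what the
    // rotation rule suggested, what was actually assigned, and the written
    // reason when the two differ.
    interface AssignmentDecision {
      engagementId: string;
      suggestedVendorId: string;        // what the rotation rule surfaced
      assignedVendorId: string;         // what the buyer actually chose
      overridden: boolean;
      overrideReason?: string;          // required in practice when overridden
      decidedAt: Date;
      decidedBy: string;
    }

    // The panel-review question: how often was rotation overridden?
    function overrideRate(decisions: AssignmentDecision[]): number {
      if (decisions.length === 0) return 0;
      return decisions.filter((d) => d.overridden).length / decisions.length;
    }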

How are vendor scorecards captured?

Each closed engagement produces a scorecard against shared criteria: SLA hit rate, severity calibration accuracy on QA review, evidence completeness, finding novelty, communication quality, and overrun rate. The scorecard is structured rather than free-text so cross-vendor comparison is direct. Scores accumulate against the vendor record over the panel cycle and become the spine of the renewal evidence.

What happens to a vendor that leaves the panel?

Engagements, findings, evidence, retests, and the audit trail stay on the buyer workspace because delivery has run on the buyer engagement record from day one. The vendor record on the panel is marked offboarded with the offboarding date and reason, and the historical engagements remain attributed to that vendor for the audit trail. The next panel cycle starts from the existing record rather than from a CSV reconstruction.

How does the panel work alongside a security testing programme?

A security testing programme owns the asset coverage map, the testing cadence, and the cross-engagement findings ledger; the panel owns the vendor directory, the rotation rule, and the performance scorecards. The programme decides what gets tested and when; the panel decides who tests it. Engagements sit under both records, so the buyer gets a coverage view from the programme and a supplier view from the panel.

Can the same finding be tracked when different vendors test the same asset?

Yes. Findings persist on the asset record across every engagement, regardless of which vendor delivered the test. The catalogue carries the original identifier, the original severity, the remediation owner, the retest history, and the verification status. When a different vendor retests the asset, the existing findings surface in their queue rather than being raised again under new identifiers.

How it works in SecPortal

A streamlined workflow from start to finish.

1

Compose the panel and capture the capability matrix

Onboard each approved vendor as a record on the workspace with their MSA term, insurance, certifications (CREST, CHECK, OSCP-named team count), test types they deliver (web app, mobile, API, infrastructure, cloud, code review, red team), industry experience, regional presence, and rate card. The capability matrix becomes the panel directory rather than a slide buried in the procurement folder.

2

Match engagements to vendors by capability and rotation rule

Each new engagement opens with a scope (asset group, test type, cadence, urgency). The panel record surfaces the eligible vendors, applies the rotation rule (round-robin, capability-weighted, or named-account), and assigns the vendor that fits. The match decision is recorded against the engagement so audit can see why a specific vendor ran a specific test rather than reconstructing it from email.

3

Run delivery on the engagement, not the panel record

Each engagement runs scope, ROE, kickoff, evidence, findings, reports, and retests on its own record exactly as a single-vendor pentest would. The branded client portal carries the buyer identity rather than the vendor identity, so the panel looks like one programme to internal stakeholders even when three vendors deliver across the year.

4

Score performance against shared criteria

Every engagement closes with a vendor scorecard tied to the panel record: SLA hit rate (scoping, kickoff, report turnaround, retest), severity calibration accuracy on QA review, evidence completeness, finding novelty against the existing catalogue, communication quality, and overrun rate. The score is comparable across vendors because the criteria are the same, so the panel performance picture is data rather than impression.

5

Aggregate findings across vendors without duplicating them

Findings raised by Vendor A in Q1 stay on the asset record when Vendor B retests in Q3. Duplicate detection runs across the full panel rather than per-vendor, severity calibration is normalised against the workspace policy, and aging risk debt is visible across every vendor that has touched an asset. The panel produces one finding catalogue, not three parallel ones that diverge over time.
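
One way duplicate detection across the panel can work, reusing the CatalogueFinding shape sketched earlier: fingerprint the candidate finding on asset plus normalised title and look for a match before creating a new record. Real matching is fuzzier than this (CWE, location, and affected parameter all help), so treat the sketch as the idea rather than the algorithm.

    // Hypothetical duplicate check across every vendor on the panel.
    function fingerprint(assetId: string, title: string): string {
      return `${assetId}:${title.toLowerCase().replace(/[^a-z0-9]+/g, "-")}`;
    }

    function findExisting(
      catalogue: CatalogueFinding[],
      assetId: string,
      title: string
    ): CatalogueFinding | undefined {
      const fp = fingerprint(assetId, title);
      return catalogue.find((f) => fingerprint(f.assetId, f.title) === fp);
    }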

6

Renew, rotate, or offboard with evidence

At the panel review window, pull the actual record per vendor: engagements delivered, scope coverage, scorecard averages, SLA performance, finding contribution, repeat-issue rate, and commercial overrun. Renew strong vendors against evidence, rotate underperformers, and offboard cleanly with the offboarded vendor history retained on the panel record. The next panel cycle starts from the previous panel record rather than from a blank procurement template.

Run the pentest panel on a record, not a memory

Capability matrix, rotation, scorecards, and renewal evidence in one workspace. Start free.

No credit card required. Free plan available forever.