NIST SP 800-171
CUI assessment, evidence, and POA&M tracking
NIST Special Publication 800-171 is the US federal control set for protecting Controlled Unclassified Information (CUI) in non-federal systems. Run NIST 800-171 self-assessments and assessor-led assessments end-to-end: scope the boundary, implement the 110 security requirements, capture evidence, manage POA&M items, score against the DoD Assessment Methodology, and submit to the Supplier Performance Risk System (SPRS) from one workflow.
No credit card required. Free plan available forever.
NIST 800-171 in context: protecting CUI in non-federal systems
NIST Special Publication 800-171 is the United States federal control set for protecting the confidentiality of Controlled Unclassified Information (CUI) when it sits on non-federal systems. The publication translates the federal NIST SP 800-53 baseline into a tailored set of 110 security requirements organised across 14 families, and it is the cornerstone control set for the Defense Industrial Base. NIST SP 800-171A is the companion assessment guide that defines the assessment objectives and methods for each requirement, and the DoD Assessment Methodology is the scoring rubric used to record assessment results in the Supplier Performance Risk System (SPRS).
The compliance trigger for most contractors is DFARS 252.204-7012, the clause that flows NIST 800-171 down through the DoD supply chain. DFARS 252.204-7019 and 7020 add the SPRS submission and the government assessment regime on top. The CMMC 2.0 framework builds directly on NIST 800-171 (Level 2 is the 110 NIST 800-171 controls; Level 3 adds 24 NIST SP 800-172 enhancements), so an organised 800-171 programme is the prerequisite for every CMMC certification cycle.
Who is in scope for NIST 800-171
The scope is defined by the data, not by the organisation. Any non-federal system that stores, processes, or transmits Controlled Unclassified Information falls inside the regulation through the contract clause that introduced the CUI in the first place.
DoD prime contractors and subcontractors
Any DoD contract that involves Controlled Unclassified Information triggers DFARS 252.204-7012, which contractually flows down NIST SP 800-171 to the contractor and to every subcontractor that receives CUI. The clause flows down without modification, so the supplier base inherits the same requirements as the prime.
Contractors whose environments grow beyond Federal Contract Information
Contractors handling only Federal Contract Information typically operate under FAR 52.204-21 (a smaller 15-requirement basic safeguarding set). The moment CUI enters the environment, NIST 800-171 applies and the assessment surface expands to the full 110 requirements.
NASA, GSA, and other federal civilian agency suppliers
Federal civilian agencies increasingly reference NIST 800-171 for non-federal systems handling CUI through agency-specific clauses and the planned FAR CUI rule. The control set is the same; the assessment trigger and the cadence are agency-specific.
Higher education and research institutions on federal contracts
Universities and research institutions performing federally funded work that involves CUI (export-controlled research, controlled technical data, defense research, NIH-controlled data) are routinely scoped under NIST 800-171 through the prime award terms and the institutional research compliance programme.
The 14 requirement families and how scanner output maps in
The 110 security requirements sit across 14 families. Each family carries a subset of requirements with a specific assessment objective in NIST SP 800-171A. The platform value comes from tying every confirmed finding to the requirement it evidences, with the artefact attached to the same record the assessor reads at the next cycle.
| Family | Count | Coverage notes |
|---|---|---|
| 3.1 Access Control | 22 requirements | Account management, least privilege, separation of duties, session lock and termination, remote access, wireless access, mobile device control, and CUI flow control. Most assessments lose the largest share of points here when authenticated scans reveal stale accounts, missing privilege separation, or unmanaged remote access paths. |
| 3.2 Awareness and Training | 3 requirements | Security awareness training for all users, role-based training for personnel with significant security responsibilities, and insider threat training. Tie completion records and content reviews to the assessment record so the training evidence sits alongside the technical evidence. |
| 3.3 Audit and Accountability | 9 requirements | Create, retain, review, and protect audit records sufficient to monitor and investigate unlawful activity. Authenticated scans, log monitoring evidence, and the audit retention policy are the primary artefacts; the assessor expects to see the connection between detection and the recorded event. |
| 3.4 Configuration Management | 9 requirements | Baseline configurations, change control, secure configuration settings, software usage restrictions, least functionality, and inventory of system components. Authenticated scan output provides high-signal evidence; tag baseline drift findings to 3.4.1 and 3.4.2 with a path back to the asset record. |
| 3.5 Identification and Authentication | 11 requirements | Multi-factor authentication, password complexity, replay-resistant authentication, and identifier and authenticator management. The MFA requirement (3.5.3) is one of the most heavily weighted under the DoD Assessment Methodology, so evidence should be recent, specific, and per-service. |
| 3.6 Incident Response | 3 requirements | Establish an incident handling capability, track incidents, and report to designated authorities. Tie tabletop exercises, real incident records, and the DoD reporting mailbox evidence (where applicable) to the assessment record. Cyber incident reports under DFARS 252.204-7012 must reach DoD within 72 hours of discovery. |
| 3.7 Maintenance | 6 requirements | Perform system maintenance, control tools and personnel, sanitise equipment removed for off-site maintenance, and supervise non-local maintenance. Capture vendor maintenance contracts, sanitisation records, and the access control of maintenance personnel. |
| 3.8 Media Protection | 9 requirements | Protect CUI on system media, control access to and use of media, and sanitise or destroy media before disposal or release. Pair the media inventory, the destruction certificates, and the chain of custody log with the asset register. |
| 3.9 Personnel Security | 2 requirements | Screen individuals before granting access to systems containing CUI; ensure CUI is protected during personnel actions such as transfers and terminations. Tie HR records, screening policies, and access deactivation evidence to the assessment record. |
| 3.10 Physical Protection | 6 requirements | Limit physical access to systems and the operating environment, escort visitors, monitor activity, control physical access devices, and enforce safeguarding at alternate work sites. Capture badge logs, visitor records, environmental controls, and the alternate site policy. |
| 3.11 Risk Assessment | 3 requirements | Periodically assess the risk to operations, assets, and individuals; scan for vulnerabilities; remediate vulnerabilities in accordance with risk assessments. Vulnerability scanning frequency, the risk register, and the remediation evidence are the primary artefacts here. |
| 3.12 Security Assessment | 4 requirements | Periodically assess controls, develop and implement plans of action, monitor security controls on an ongoing basis, and develop a system security plan. The SSP and POA&M evidence sit here; assessors review both at every assessment cycle. |
| 3.13 System and Communications Protection | 16 requirements | Boundary protection, separation of public components, transmission confidentiality and integrity, FIPS-validated cryptography, and protection of CUI in transit and at rest. External scan output covers a large share of these requirements directly. |
| 3.14 System and Information Integrity | 7 requirements | Identify, report, and correct system flaws; provide malicious code protection; monitor system security alerts and advisories; perform inbound and outbound communications monitoring. Patch evidence, scanner output, and detection records all map cleanly here. |
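The finding-to-requirement tying the table describes can be sketched as a crosswalk from scanner finding categories to requirement IDs. The categories and mappings below are illustrative assumptions for the sketch, not a published crosswalk:

```python
# Illustrative crosswalk: scanner finding category -> the 800-171
# requirements it most often evidences. Example mappings only;
# a real programme maintains its own validated crosswalk.
FINDING_TO_REQUIREMENT = {
    "stale_account": ["3.1.1"],            # Access Control
    "missing_mfa": ["3.5.3"],              # Identification and Authentication
    "baseline_drift": ["3.4.1", "3.4.2"],  # Configuration Management
    "weak_tls": ["3.13.8", "3.13.11"],     # System and Communications Protection
    "missing_patch": ["3.14.1"],           # System and Information Integrity
}

def requirements_for(findings: list[str]) -> set[str]:
    """Collect every requirement a set of confirmed findings touches."""
    return {req for f in findings for req in FINDING_TO_REQUIREMENT.get(f, [])}

print(sorted(requirements_for(["missing_mfa", "weak_tls"])))
# -> ['3.13.11', '3.13.8', '3.5.3']
```

Each confirmed finding then carries its requirement IDs on the same record the assessor reads, so the evidence trail runs scan output → finding → requirement rather than three disconnected documents.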
Four assessment paths and how they differ
NIST 800-171 itself does not prescribe an assessment path. The path is set by the contract clause and the customer relationship. The platform should support every path on the same evidence base so a self-assessment, a joint surveillance assessment, and a CMMC C3PAO assessment all consume the same SSP, the same POA&M, and the same finding history.
- Self-assessment with an internal team for entities under DFARS 7019/7020 that submit Basic scores into SPRS at least every three years and refresh after material changes
- Joint Surveillance Voluntary Assessment by DCMA DIBCAC for contractors seeking a higher-confidence Medium or High assessment recorded in SPRS, often as a stepping stone to CMMC Level 2
- Third-party assessment by a CMMC C3PAO under CMMC 2.0 Level 2, where the underlying control set is NIST 800-171 and the assessment guide is NIST SP 800-171A
- Customer-led assessment for primes assessing their supplier base under flow-down obligations, frequently using NIST 800-171A as the reference assessment guide regardless of contract clause
DoD Assessment Methodology: how the score is calculated
The DoD Assessment Methodology is the scoring rubric used to record NIST 800-171 assessment results in SPRS. Treat it as the canonical scoring approach even when the immediate assessment is for a non-DoD customer; primes and federal civilian agencies increasingly reference the same rubric because it is the only published method.
- Start at the maximum score of 110 and deduct points per requirement that is not fully implemented
- 5-point deductions for the most critical requirements (multi-factor authentication, FIPS-validated cryptography, restrictions on connections to external systems)
- 3-point deductions for high-impact requirements with broad systemic effect (encryption of CUI in transit, audit log protection, configuration baselines)
- 1-point deductions for the remainder, including most procedural and documentation requirements
- Partial credit available for selected 5-point and 3-point requirements where partial implementation provides meaningful protection
- POA&M items reduce the score until closed; the score, the SSP date, and the assessment date are submitted to SPRS together
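The deduction arithmetic above is simple enough to sketch in a few lines. The weights shown are illustrative examples, not the full per-requirement table from the DoD Assessment Methodology's Annex A:

```python
# Sketch of the DoD Assessment Methodology scoring arithmetic.
# Weights and partial-credit values are example entries only;
# consult Annex A for the authoritative per-requirement values.
MAX_SCORE = 110

EXAMPLE_WEIGHTS = {
    "3.5.3": 5,    # multi-factor authentication
    "3.13.11": 5,  # FIPS-validated cryptography
    "3.13.8": 3,   # CUI encrypted in transit
    "3.12.4": 1,   # system security plan (documentation)
}

# Partial credit reduces the deduction for a small set of
# requirements when partial implementation still protects CUI.
PARTIAL_CREDIT = {"3.5.3": 3, "3.13.11": 3}

def sprs_score(unimplemented: dict[str, bool]) -> int:
    """unimplemented maps requirement id -> True if fully unimplemented,
    False if partially implemented (partial credit may apply)."""
    score = MAX_SCORE
    for req, fully_missing in unimplemented.items():
        if not fully_missing and req in PARTIAL_CREDIT:
            score -= PARTIAL_CREDIT[req]
        else:
            score -= EXAMPLE_WEIGHTS.get(req, 1)
    return score

# MFA partially implemented (-3), FIPS crypto missing (-5)
print(sprs_score({"3.5.3": False, "3.13.11": True}))  # -> 102
```

Because the heavy deductions cluster on a handful of requirements, two or three gaps in the 5-point set can move a score further than dozens of 1-point documentation gaps.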
SSP, POA&M, and SPRS: the workflow in order
The System Security Plan is the document that anchors every assessment cycle. The Plan of Action and Milestones tracks remediation per requirement that is Other than Satisfied. The SPRS submission turns the assessment into a record the customer can verify. Running these three artefacts as a workflow rather than as separate documents cuts the cycle time on every reassessment and removes the most common cause of customer-driven escalation.
- Open or update the SSP describing the assessment boundary, dataflow, and per-requirement implementation status
- Run the self-assessment using NIST SP 800-171A as the assessment guide and the DoD Assessment Methodology as the scoring rubric
- Capture evidence per requirement (technical artefacts, screenshots, configuration captures, policy references) on the assessment record
- Score the assessment by applying point deductions per unimplemented requirement and recording partial credit decisions
- Open POA&M items for every Other than Satisfied requirement with weakness, owner, milestones, evidence, and the planned closeout date
- Submit the score, the SSP date, the assessment date, and the included CAGE codes to SPRS through the company's PIEE (Procurement Integrated Enterprise Environment) access
- Refresh the assessment at least every three years, after material system changes, and after a major incident
Common assessment gaps that cost real points
The DCMA DIBCAC assessment debriefs and the C3PAO Level 2 assessment patterns surface a small set of gaps that account for most of the score deductions. Knowing where the weighted points sit makes the difference between a passable score and a score that fails the customer threshold.
Multi-factor authentication implemented partially
MFA enabled for privileged users but not for non-privileged users accessing CUI, or MFA enforced only at the perimeter and bypassed by SSO with weak fallback. The 3.5.3 requirement covers both privileged and non-privileged accounts and applies to local and network access. Partial coverage costs the full 5-point deduction unless the assessment captures genuine partial credit evidence.
FIPS-validated cryptography assumed, not verified
Encryption is enabled but the cryptographic modules in use are not on the FIPS 140-2 or FIPS 140-3 validated list. The 3.13.11 requirement is specifically about FIPS-validated cryptography for protecting the confidentiality of CUI; non-validated AES-256 is non-compliant. Capture the module name, the certificate number, and the version on the assessment record.
POA&M items used as a long-term substitute for implementation
POA&M is permitted for selected requirements, but it is not a permanent shelter from implementation. Each item must have a defined closeout plan, milestones, and evidence on closure. POA&M items that roll forward across assessments without progress are a frequent reason for assessor escalation and customer audit findings.
Scope drift between SSP and reality
The SSP describes a boundary, but new systems, new SaaS connections, or new sub-processors enter the environment without the SSP being updated. The discrepancy is the assessment finding the assessor opens first, because every requirement evaluation depends on a correct boundary. Treat the SSP as a live document tied to the asset register rather than an annual export.
Vulnerability scan cadence without remediation evidence
Scans are run, but the remediation track is missing the evidence needed to close findings. The 3.11.2 and 3.11.3 requirements require both the scan and the remediation per identified vulnerability, with re-test evidence on closure. Pair every confirmed finding to a remediation event with the original detection, the action, and the verification artefact on the same record.
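Once detection and remediation share a record, the pairing check is mechanical. A minimal sketch, assuming a simple finding-id keyed remediation log (the data shape is an assumption for illustration):

```python
def unclosed_findings(findings: list[str], remediations: dict) -> list[str]:
    """Return finding ids that still lack the closure evidence
    3.11.2/3.11.3 expect: a remediation action plus a passed re-test.

    findings: finding ids from scan output
    remediations: finding_id -> {"action": str, "retest_passed": bool}
    """
    return [
        f for f in findings
        if f not in remediations or not remediations[f].get("retest_passed")
    ]

print(unclosed_findings(
    ["F-1", "F-2", "F-3"],
    {
        "F-1": {"action": "patched", "retest_passed": True},
        "F-2": {"action": "patched", "retest_passed": False},  # no re-test yet
    },
))  # -> ['F-2', 'F-3']
```

Running this check before each assessment cycle surfaces exactly the findings an assessor would flag: remediated-but-unverified and detected-but-untracked.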
Penetration testing and vulnerability assessment under 800-171
The 3.11 family (Risk Assessment) and the 3.12 family (Security Assessment) require periodic vulnerability scanning and periodic control assessment. NIST 800-171 itself does not specify a fixed cadence, but the prevailing practice in DoD assessments is monthly authenticated scanning, quarterly external scanning, and annual penetration testing on systems supporting CUI processing or transmission. Pair every confirmed finding to a remediation track with re-test evidence on closure so the audit reads as one workflow rather than separate scan reports and remediation tickets. The penetration testing workflow, the vulnerability assessment workflow, and the remediation tracking workflow are designed for this kind of programme: scope, log findings against assets, track remediation against deadlines, and re-test before closure.
Evidence the assessor (and your customer) actually want
Programmes that fail review usually fail because the artefacts are scattered across drives, ticket systems, and screenshots. Build the evidence pack as the work happens, retain raw scanner output and test reports alongside the summary, and tie every artefact back to a requirement, an asset, and an owner. The assessor narrative writes itself when the underlying record is consistent, and the customer audit answers in days rather than weeks.
- System security plan covering boundary, architecture, dataflow, and per-requirement implementation status
- Asset inventory with CUI categorisation per asset (CUI Asset, Security Protection Asset, Contractor Risk Managed Asset, Specialized Asset, Out-of-Scope)
- Vulnerability scan output (external and authenticated) with the scan cadence, ruleset, and exception register
- Penetration test report and remediation track for systems supporting CUI processing or transmission
- Configuration baselines, change records, and the deviation log for in-scope systems
- MFA configuration evidence per service with factor type, enrolment record, and bypass register
- Cryptographic module evidence covering FIPS 140 validation, key management, and the protection of CUI in transit and at rest
- Incident records, tabletop exercise results, and the DFARS 252.204-7012 reporting evidence per incident
- POA&M items with affected requirement, weakness, owner, milestones, evidence, and closure date
- SPRS submission record covering the basic, medium, or high score, the SSP date, the assessment date, and the included CAGE codes
Where SecPortal fits in the NIST 800-171 workflow
SecPortal is the operating layer for the NIST 800-171 programme: scoping, scans, findings, control mapping, POA&M tracking, and assessor-ready evidence packs. Compliance tracking covers 800-171 alongside the other frameworks the same firm has to satisfy, including CMMC 2.0, ISO 27001, SOC 2, NIST 800-53, and FedRAMP.
- Compliance tracking that maps every finding to NIST 800-171 requirements alongside CMMC, ISO 27001, SOC 2, NIST 800-53, and PCI DSS for entities under multiple regimes
- Engagement management for vulnerability assessments, penetration tests, internal control reviews, and third-party assessments, with scope, status, and re-test all on one record
- Findings management with CVSS 3.1 scoring, 300+ templates, and Nessus or Burp Suite imports so existing tooling feeds the same workflow
- 16-module external scanning and 17-module authenticated scanning to evidence boundary, configuration, and authenticated weakness findings between manual tests
- Continuous monitoring with scheduled scans (daily, weekly, monthly) so the scan cadence and trend evidence required by 3.11 are recorded automatically
- AI report generation that turns findings, control mappings, and remediation actions into assessor and customer-ready narratives without a manual rewrite
NIST 800-171 is a multi-year programme, not a one-off project. The first cycle establishes the SSP, the asset register, and the baseline score. Subsequent cycles tighten the evidence trail, close POA&M items, and prepare the programme for CMMC Level 2 assessment where the contract requires it. Running the work as a managed workflow pays off most over time: historical findings, classified incidents, remediation timelines, and SPRS scores stay linked, so each reassessment is a refresh rather than a rebuild. For consultants delivering 800-171 work to multiple clients, the security consultants workspace bundles that with branded client portals and AI report generation, so the deliverable looks as polished as the work behind it.
For programmes that want continuous detection and trend evidence between manual tests, the continuous monitoring capability and the external scanning capability produce the cadence and coverage record 3.11 and 3.14 expect to see.
Key control areas
SecPortal helps you track and manage compliance across these domains.
Scoping: CUI assets and the assessment boundary
The 800-171 assessment scope is defined by the systems and components that store, process, or transmit Controlled Unclassified Information. Identify CUI assets, Security Protection Assets, Contractor Risk Managed Assets, and Specialized Assets. Document the assessment boundary diagram, the data flows for CUI, and the inheritance of controls from cloud or shared services. Scope errors are the single most common cause of failed assessments and reauthorisation cycles, so the boundary, the diagram, and the rationale should live on the assessment record from day one.
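The asset categories translate into a scoping decision per asset. A minimal sketch; note the in-scope rule here is deliberately simplified, since Contractor Risk Managed and Specialized Assets receive tailored assessment treatment in practice:

```python
from enum import Enum

class AssetCategory(Enum):
    """Scoping categories for the 800-171/CMMC assessment boundary."""
    CUI = "CUI Asset"
    SPA = "Security Protection Asset"
    CRMA = "Contractor Risk Managed Asset"
    SPECIALIZED = "Specialized Asset"
    OUT_OF_SCOPE = "Out-of-Scope Asset"

def in_assessment_scope(category: AssetCategory) -> bool:
    # Simplified rule: everything except out-of-scope assets sits
    # inside the boundary in some form; CRMA and Specialized Assets
    # are documented in the SSP with tailored treatment.
    return category is not AssetCategory.OUT_OF_SCOPE

print(in_assessment_scope(AssetCategory.CUI))           # -> True
print(in_assessment_scope(AssetCategory.OUT_OF_SCOPE))  # -> False
```

Categorising each asset once, on the asset record, is what keeps the boundary diagram and the SSP consistent as systems are added.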
3.1 Access Control: 22 requirements
Restrict system access to authorised users, processes, and devices. Enforce least privilege, separation of duties, session lock, session termination, control of remote access, and control of CUI flow. Authenticated scan output and identity reviews provide direct evidence for 3.1.1, 3.1.2, 3.1.5, and 3.1.12. Capture the implementation, the assessment objective per requirement, and the supporting artefact on the same record so the assessor can locate proof in seconds.
3.3 Audit and Accountability + 3.14 System and Information Integrity
Create and retain audit logs sufficient to monitor, analyse, investigate, and report unlawful or unauthorised activity. Identify, report, and correct system flaws on a documented cadence; provide protection from malicious code; monitor system security alerts and advisories. Tie scanner output (missing patches, end-of-life systems, weak configurations) to the audit and integrity requirements with patch availability dates so the time-to-remediate evidence is explicit per CVE.
3.4 Configuration Management + 3.7 Maintenance + 3.8 Media Protection
Establish and maintain baseline configurations, control configuration changes, restrict use of nonessential programs, and apply the principle of least functionality. Authenticated scan output provides high-signal evidence here. Tag baseline drift, software usage findings, and least functionality gaps to 3.4.1, 3.4.2, 3.4.6, and 3.4.7 with a clear path to the asset record. Pair maintenance and media protection evidence (3.7 and 3.8) with the asset register and supplier records.
3.5 Identification and Authentication
Identify users, processes, and devices uniquely; authenticate users with multi-factor authentication for privileged accounts and for non-privileged accounts that access CUI. Manage password complexity, prevent reuse, and enforce session-level requirements. Capture MFA enrolment, authenticator strength, replay-resistant authentication evidence, and the test method per service. The 3.5 family is one of the most frequently scored at 5 points (multi-factor authentication) under the DoD Assessment Methodology, so the evidence has to be specific and recent.
3.13 System and Communications Protection
Monitor, control, and protect communications at the boundary. Implement subnetworks for publicly accessible system components, prevent unauthorised information transfer, employ cryptographic mechanisms for CUI in transit and at rest, and protect the authenticity of communications sessions. External scan output covers 3.13.1, 3.13.5, 3.13.8, and 3.13.11 directly: boundary protection, public component separation, transmission confidentiality, and FIPS-validated cryptography. Tie evidence to specific scanner modules and asset scopes.
System Security Plan (SSP) and supporting documentation
NIST 800-171 requires a System Security Plan describing the assessment boundary, system architecture, data flows, and per-requirement implementation status. Track the SSP alongside the incident response plan, configuration management plan, contingency plan, access control policy, media protection procedures, and the asset inventory. Keep these linked to the assessment record so each annual reassessment is a refresh rather than a rebuild and so the same artefact serves a self-assessment, a third-party assessment, and a customer audit.
POA&M, scoring, and the SPRS submission
The DoD Assessment Methodology assigns a starting score of 110 with point deductions per unimplemented requirement (1, 3, or 5 points each, with 5 reserved for the most critical requirements such as multi-factor authentication and FIPS-validated cryptography). A Plan of Action and Milestones is permitted for selected requirements with a documented closeout plan. Track every POA&M item with affected requirement, weakness, owner, milestones, evidence, and the SPRS submission date so the score and the closeout trail are auditable. The score, the SSP date, and the assessment date all submit to SPRS for DoD-contracting entities.
Related features
Compliance tracking without a full GRC platform
Vulnerability management software that tracks every finding
Orchestrate every security engagement from start to finish
AI-powered reports in seconds, not days
Test web apps behind the login
Vulnerability scanning tools that map your attack surface
Monitor continuously, catch regressions early
Run a defensible NIST 800-171 programme without spreadsheet sprawl
Track scoping, control implementation, evidence, vulnerability scans, POA&M items, and SPRS submissions in one workflow. Start free.