In early March 2024, the U.S. Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) released its final Secure Software Development Attestation Form instructions, sparking a renewed urgency around understanding and complying with 31 of the 42 tasks in NIST SP 800-218 Secure Software Development Framework (SSDF) version 1.1.
The SSDF standard was released in February 2022 and specifies 42 tasks that organizations should perform as part of a secure software development life cycle, along with ways to secure the development environment. SSDF draws from a wide variety of existing frameworks, and one of the most frequently referenced is the Building Security In Maturity Model (BSIMM). SSDF refers to BSIMM more often than any other framework except EO 14028, as shown in Figure 1.
| Top 10 Referenced Standards/Frameworks | Number of SSDF Tasks That Reference It |
|---|---|
| EO14028 | 42 |
| BSIMM | 39 |
| BSAFSS | 38 |
| IEC62443 | 38 |
| SP800181 | 37 |
| SP80053 | 35 |
| SP800161 | 35 |
| PCISSLC | 28 |
| OWASPSAMM | 26 |
| SCSIC | 23 |
Figure 1. SSDF references to other frameworks
While SSDF’s references to the BSIMM report can help organizations understand the intent behind those 39 tasks, the NIST standard itself references BSIMM12, which was current at the time the standard was published. The BSIMM report is updated on an annual basis, and since the NIST standard was released, BSIMM has been updated twice; the current version is BSIMM14.
SSDF refers to BSIMM activities by their activity numbers (SM1.1, SM1.4, CP2.3, SFD3.2, and so on), and readers can look up the details of each activity by number. Note, however, that because the BSIMM report has been updated, the activity numbers published in the NIST standard will not always lead to the right BSIMM activity.
Each activity number identifies one of the 12 BSIMM practices, along with a level and an activity number within that practice. For example, SM1.1 refers to the Strategy and Metrics practice, level 1, activity 1. The level reflects the activity's relative observation rate: level 1 activities are the most commonly observed in a practice, level 3 activities the least, and level 2 falls in between.
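As an illustrative sketch of this numbering scheme (the helper below is not part of BSIMM or SSDF), an activity ID can be split into its practice abbreviation, level, and activity number:

```python
import re

def parse_activity(activity_id: str) -> tuple[str, int, int]:
    """Split a BSIMM activity ID like 'SM1.1' into (practice, level, number)."""
    # Practice abbreviation, single-digit level, then the activity number
    # (which can be two digits, e.g. T2.11).
    m = re.fullmatch(r"([A-Z]+)(\d)\.(\d+)", activity_id)
    if m is None:
        raise ValueError(f"not a BSIMM activity ID: {activity_id!r}")
    practice, level, number = m.groups()
    return practice, int(level), int(number)

print(parse_activity("SM1.1"))    # ('SM', 1, 1)
print(parse_activity("CMVM3.7"))  # ('CMVM', 3, 7)
```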
Each year, every activity is reassessed to see whether it has become more or less commonly observed, and it is moved between levels and renumbered as appropriate. As a result, some activities listed in the NIST standard may not be easy to locate in the BSIMM14 report. For example, the NIST standard references SE2.6 from BSIMM12, but in a later BSIMM release that activity became more commonly observed and was renumbered SE1.3. You won't find SE2.6 in BSIMM14 unless you know how the activity number has changed.
Since the SSDF standard was released, nine activities have moved between levels, but because some are referenced by more than one SSDF task, a total of 12 SSDF tasks have been affected. The updated BSIMM mappings for these 12 tasks are shown in Figure 2.
| Task | BSIMM12 (original) | BSIMM14 |
|---|---|---|
| PO.1.1 | SE2.6 | SE1.3 |
| PO.2.2 | T3.4 | T2.11 |
| PW.1.1 | AM2.2 | AM3.4 |
| | AM2.5 | AM3.5 |
| | AA1.3 | AA2.4 |
| PW.2.1 | AA1.3 | AA2.4 |
| PW.4.1 | SR2.4 | SR1.5 |
| | SR3.1 | SR2.7 |
| PW.4.4 | SR2.4 | SR1.5 |
| | SR3.1 | SR2.7 |
| PW.7.2 | CR1.6 | CR2.8 |
| RV.2.1 | CMVM2.2 | CMVM1.3 |
Figure 2. BSIMM activity numbering changes
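For anyone scripting a compliance check against SSDF citations, the renumbering above can be expressed as a simple lookup. This is a minimal illustrative sketch, not an official mapping artifact; the dictionary just encodes the nine activity moves from Figure 2:

```python
# BSIMM12 -> BSIMM14 activity renumbering (the nine moves shown in Figure 2).
RENUMBERED = {
    "SE2.6": "SE1.3",
    "T3.4": "T2.11",
    "AM2.2": "AM3.4",
    "AM2.5": "AM3.5",
    "AA1.3": "AA2.4",
    "SR2.4": "SR1.5",
    "SR3.1": "SR2.7",
    "CR1.6": "CR2.8",
    "CMVM2.2": "CMVM1.3",
}

def to_bsimm14(activity_id: str) -> str:
    """Translate a BSIMM12 activity ID cited by SSDF to its BSIMM14 number.

    IDs that did not move are returned unchanged.
    """
    return RENUMBERED.get(activity_id, activity_id)

print(to_bsimm14("SE2.6"))  # SE1.3 -- moved up to level 1
print(to_bsimm14("SM1.1"))  # SM1.1 -- unchanged
```

A lookup like this could be run over the activity numbers in an existing SSDF gap-analysis spreadsheet before searching the BSIMM14 report.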
The overall SSDF to BSIMM mapping with the changes for BSIMM14 is shown in Table 1.
| Task | BSIMM12 | BSIMM14 | BSIMM14 Activity Name |
|---|---|---|---|
| PO.1.1 | CP1.1 | CP1.1 | Unify regulatory pressures |
| | CP1.3 | CP1.3 | Create policy |
| | SR1.1 | SR1.1 | Create security standards |
| | SR2.2 | SR2.2 | Create a standards review process |
| | SE1.2 | SE1.2 | Ensure host and network security basics are in place |
| | SE2.6 | SE1.3 | Implement cloud security controls |
| PO.1.2 | SM1.1 | SM1.1 | Publish process and evolve as necessary |
| | SM1.4 | SM1.4 | Implement security checkpoints and associated governance |
| | SM2.2 | SM2.2 | Enforce security checkpoints and track exceptions |
| | CP1.1 | CP1.1 | Unify regulatory pressures |
| | CP1.2 | CP1.2 | Identify privacy obligations |
| | CP1.3 | CP1.3 | Create policy |
| | CP2.1 | CP2.1 | Build a PII inventory |
| | CP2.3 | CP2.3 | Implement and track controls for compliance |
| | AM1.2 | AM1.2 | Use a data classification scheme for software inventory |
| | SFD1.1 | SFD1.1 | Integrate and deliver security features |
| | SFD2.1 | SFD2.1 | Leverage secure-by-design components and services |
| | SFD3.2 | SFD3.2 | Require use of approved security features and frameworks |
| | SR1.1 | SR1.1 | Create security standards |
| | SR1.3 | SR1.3 | Translate compliance constraints to requirements |
| | SR2.2 | SR2.2 | Create a standards review process |
| | SR3.3 | SR3.3 | Use secure coding standards |
| | SR3.4 | SR3.4 | Create standards for technology stacks |
| PO.1.3 | CP2.4 | CP2.4 | Include software security SLAs in all vendor contracts |
| | CP3.2 | CP3.2 | Ensure compatible vendor policies |
| | SR2.5 | SR2.5 | Create SLA boilerplate |
| | SR3.2 | SR3.2 | Communicate standards to vendors |
| PO.2.1 | SM1.1 | SM1.1 | Publish process and evolve as necessary |
| | SM2.3 | SM2.3 | Create or grow a satellite (security champions) |
| | SM2.7 | SM2.7 | Create evangelism role and perform internal marketing |
| | CR1.7 | CR1.7 | Assign code review tool mentors |
| PO.2.2 | T1.1 | T1.1 | Conduct software security awareness training |
| | T1.7 | T1.7 | Deliver on-demand individual training |
| | T1.8 | T1.8 | Include security resources in onboarding |
| | T2.5 | T2.5 | Enhance satellite (security champions) through training and events |
| | T2.8 | T2.8 | Create and use material specific to company history |
| | T2.9 | T2.9 | Deliver role-specific advanced curriculum |
| | T3.1 | T3.1 | Reward progression through curriculum |
| | T3.2 | T3.2 | Provide training for vendors and outsourced workers |
| | T3.4 | T2.11 | Require an annual refresher |
| PO.2.3 | SM1.3 | SM1.3 | Educate executives on software security |
| | SM2.7 | SM2.7 | Create evangelism role and perform internal marketing |
| | CP2.5 | CP2.5 | Ensure executive awareness of compliance and privacy obligations |
| PO.3.1 | CR1.4 | CR1.4 | Use automated code review tools |
| | ST1.4 | ST1.4 | Integrate opaque-box security tools into the QA process |
| | ST2.5 | ST2.5 | Include security tests in QA automation |
| | SE2.7 | SE2.7 | Use orchestration for containers and virtualized environments |
| PO.3.2 | SR1.1 | SR1.1 | Create security standards |
| | SR1.3 | SR1.3 | Translate compliance constraints to requirements |
| | SR3.4 | SR3.4 | Create standards for technology stacks |
| PO.3.3 | SM1.4 | SM1.4 | Implement security checkpoints and associated governance |
| | SM3.4 | SM3.4 | Integrate software-defined lifecycle governance |
| | SR1.3 | SR1.3 | Translate compliance constraints to requirements |
| PO.4.1 | SM1.4 | SM1.4 | Implement security checkpoints and associated governance |
| | SM2.1 | SM2.1 | Publish data about software security internally and use it to drive change |
| | SM2.2 | SM2.2 | Enforce security checkpoints and track exceptions |
| | SM2.6 | SM2.6 | Require security sign-off prior to software release |
| | SM3.3 | SM3.3 | Identify metrics and use them to drive resourcing |
| | CP2.2 | CP2.2 | Require security sign-off for compliance-related risk |
| PO.4.2 | SM1.4 | SM1.4 | Implement security checkpoints and associated governance |
| | SM2.1 | SM2.1 | Publish data about software security internally and use it to drive change |
| | SM2.2 | SM2.2 | Enforce security checkpoints and track exceptions |
| | SM3.4 | SM3.4 | Integrate software-defined lifecycle governance |
| PO.5.1 | | | |
| PO.5.2 | | | |
| PS.1.1 | SE2.4 | SE2.4 | Protect code integrity |
| PS.2.1 | SE2.4 | SE2.4 | Protect code integrity |
| PS.3.1 | | | |
| PS.3.2 | SE3.6 | SE3.6 | Create bills of materials for deployed software |
| PW.1.1 | AM1.2 | AM1.2 | Use a data classification scheme for software inventory |
| | AM1.3 | AM1.3 | Identify potential attackers |
| | AM1.5 | AM1.5 | Gather and use attack intelligence |
| | AM2.1 | AM2.1 | Build attack patterns and abuse cases tied to potential attackers |
| | AM2.2 | AM3.4 | Create technology-specific attack patterns |
| | AM2.5 | AM3.5 | Maintain and use a top N possible attacks list |
| | AM2.6 | AM2.6 | Collect and publish attack stories |
| | AM2.7 | AM2.7 | Build an internal forum to discuss attacks |
| | SFD2.2 | SFD2.2 | Create capability to solve difficult design problems |
| | AA1.1 | AA1.1 | Perform security feature review |
| | AA1.2 | AA1.2 | Perform design review for high-risk applications |
| | AA1.3 | AA2.4 | Have SSG lead design review efforts |
| | AA2.1 | AA2.1 | Perform architecture analysis using a defined process |
| PW.1.2 | SFD3.1 | SFD3.1 | Form a review board to approve and maintain secure design patterns |
| | SFD3.3 | SFD3.3 | Find and publish secure design patterns from the organization |
| | AA2.2 | AA2.2 | Standardize architectural descriptions |
| | AA3.2 | AA3.2 | Drive analysis results into standard design patterns |
| PW.1.3 | SFD1.1 | SFD1.1 | Integrate and deliver security features |
| | SFD2.1 | SFD2.1 | Leverage secure-by-design components and services |
| | SFD3.2 | SFD3.2 | Require use of approved security features and frameworks |
| | SR1.1 | SR1.1 | Create security standards |
| | SR3.4 | SR3.4 | Create standards for technology stacks |
| PW.2.1 | AA1.1 | AA1.1 | Perform security feature review |
| | AA1.2 | AA1.2 | Perform design review for high-risk applications |
| | AA1.3 | AA2.4 | Have SSG lead design review efforts |
| | AA2.1 | AA2.1 | Perform architecture analysis using a defined process |
| | AA3.1 | AA3.1 | Have engineering teams lead AA process |
| PW.4.1 | SFD2.1 | SFD2.1 | Leverage secure-by-design components and services |
| | SFD3.2 | SFD3.2 | Require use of approved security features and frameworks |
| | SR2.4 | SR1.5 | Identify open source |
| | SR3.1 | SR2.7 | Control open source risk |
| | SE3.6 | SE3.6 | Create bills of materials for deployed software |
| PW.4.2 | SFD1.1 | SFD1.1 | Integrate and deliver security features |
| | SFD2.1 | SFD2.1 | Leverage secure-by-design components and services |
| | SFD3.2 | SFD3.2 | Require use of approved security features and frameworks |
| | SR1.1 | SR1.1 | Create security standards |
| PW.4.4 | CP3.2 | CP3.2 | Ensure compatible vendor policies |
| | SR2.4 | SR1.5 | Identify open source |
| | SR3.1 | SR2.7 | Control open source risk |
| | SR3.2 | SR3.2 | Communicate standards to vendors |
| | SE2.4 | SE2.4 | Protect code integrity |
| | SE3.6 | SE3.6 | Create bills of materials for deployed software |
| PW.5.1 | SR3.3 | SR3.3 | Use secure coding standards |
| | CR1.4 | CR1.4 | Use automated code review tools |
| | CR3.5 | CR3.5 | Enforce secure coding standards |
| PW.6.1 | SE2.4 | SE2.4 | Protect code integrity |
| PW.6.2 | SE2.4 | SE2.4 | Protect code integrity |
| | SE3.2 | SE3.2 | Use code protection |
| PW.7.1 | CR1.5 | CR1.5 | Make code review mandatory for all projects |
| PW.7.2 | CR1.2 | CR1.2 | Perform opportunistic code review |
| | CR1.4 | CR1.4 | Use automated code review tools |
| | CR1.6 | CR2.8 | Use centralized defect reporting to close the knowledge loop |
| | CR2.6 | CR2.6 | Use custom rules with automated code review tools |
| | CR2.7 | CR2.7 | Use a top N bugs list (real data preferred) |
| | CR3.4 | CR3.4 | Automate malicious code detection |
| | CR3.5 | CR3.5 | Enforce secure coding standards |
| PW.8.1 | PT2.3 | PT2.3 | Schedule periodic penetration tests for application coverage |
| PW.8.2 | ST1.1 | ST1.1 | Perform edge/boundary value condition testing during QA |
| | ST1.3 | ST1.3 | Drive tests with security requirements and security features |
| | ST1.4 | ST1.4 | Integrate opaque-box security tools into the QA process |
| | ST2.4 | ST2.4 | Drive QA tests with AST results |
| | ST2.5 | ST2.5 | Include security tests in QA automation |
| | ST2.6 | ST2.6 | Perform fuzz testing customized to application APIs |
| | ST3.3 | ST3.3 | Drive tests with design review results |
| | ST3.4 | ST3.4 | Leverage code coverage analysis |
| | ST3.5 | ST3.5 | Begin to build and apply adversarial security tests (abuse cases) |
| | ST3.6 | ST3.6 | Implement event-driven security testing in automation |
| | PT1.1 | PT1.1 | Use external penetration testers to find problems |
| | PT1.2 | PT1.2 | Feed results to the defect management and mitigation system |
| | PT1.3 | PT1.3 | Use penetration testing tools internally |
| | PT3.1 | PT3.1 | Use external penetration testers to perform deep-dive analysis |
| PW.9.1 | SE2.2 | SE2.2 | Define secure deployment parameters and configurations |
| PW.9.2 | SE2.2 | SE2.2 | Define secure deployment parameters and configurations |
| RV.1.1 | AM1.5 | AM1.5 | Gather and use attack intelligence |
| | CMVM1.2 | CMVM1.2 | Identify software defects found in operations monitoring and feed them back to engineering |
| | CMVM2.1 | CMVM2.1 | Have emergency response |
| | CMVM3.4 | CMVM3.4 | Operate a bug bounty program |
| | CMVM3.7 | CMVM3.7 | Streamline incoming responsible vulnerability disclosure |
| RV.1.2 | CMVM3.1 | CMVM3.1 | Fix all occurrences of software defects found in operations |
| RV.1.3 | CMVM1.1 | CMVM1.1 | Create or interface with incident response |
| | CMVM2.1 | CMVM2.1 | Have emergency response |
| | CMVM3.3 | CMVM3.3 | Simulate software crises |
| | CMVM3.7 | CMVM3.7 | Streamline incoming responsible vulnerability disclosure |
| RV.2.1 | CMVM1.2 | CMVM1.2 | Identify software defects found in operations monitoring and feed them back to engineering |
| | CMVM2.2 | CMVM1.3 | Track software defects found in operations through the fix process |
| RV.2.2 | CMVM2.1 | CMVM2.1 | Have emergency response |
| RV.3.1 | CMVM3.1 | CMVM3.1 | Fix all occurrences of software defects found in operations |
| | CMVM3.2 | CMVM3.2 | Enhance the SSDL to prevent software defects found in operations |
| RV.3.2 | CP3.3 | CP3.3 | Drive feedback from software lifecycle data back to policy |
| | CMVM3.2 | CMVM3.2 | Enhance the SSDL to prevent software defects found in operations |
| RV.3.3 | CR3.3 | CR3.3 | Create capability to eradicate bugs |
| | CMVM3.1 | CMVM3.1 | Fix all occurrences of software defects found in operations |
| RV.3.4 | CP3.3 | CP3.3 | Drive feedback from software lifecycle data back to policy |
| | CMVM3.2 | CMVM3.2 | Enhance the SSDL to prevent software defects found in operations |
Table 1. SSDF to BSIMM mapping updated for BSIMM14
Hopefully, this update to the NIST SP 800-218 mapping to BSIMM can help you find the right BSIMM activities.
Building Security In Maturity Model (BSIMM) is a data-driven model developed through analysis of real-world software security initiatives. The BSIMM report represents the latest evolution of this detailed model for software security.