Cyber Resilience Review
The Cyber Resilience Review (CRR)[1] is an assessment method developed by the United States Department of Homeland Security (DHS). It is a voluntary examination of operational resilience and cyber security practices offered at no cost by DHS to the operators of critical infrastructure and to state, local, tribal, and territorial governments. The CRR has a service-oriented approach, meaning that one of its foundational principles is that an organization deploys its assets (people, information, technology, and facilities) to support specific operational missions (or services). The CRR is offered in a facilitated workshop format and as a self-assessment package.[2]

The workshop version of the CRR is led by a DHS facilitator at a critical infrastructure facility. The workshop typically takes 6–8 hours to complete and draws on a cross section of personnel from the critical infrastructure organization. All information collected in a facilitated CRR is protected from disclosure by the Protected Critical Infrastructure Information Act of 2002. This information cannot be disclosed through a Freedom of Information Act request, used in civil litigation, or used for regulatory purposes.[3]

The CRR Self-Assessment Package[4] allows an organization to conduct an assessment without direct DHS assistance. It is available for download from the DHS Critical Infrastructure Cyber Community Voluntary Program website.[5] The package includes an automated data answer capture and report generation tool, a facilitation guide, a comprehensive explanation of each question, and a crosswalk of CRR practices to the criteria of the National Institute of Standards and Technology (NIST) Cybersecurity Framework.[6][7] The questions asked in the CRR and the resulting report are the same in both versions of the assessment.

DHS partnered with the CERT Division of the Software Engineering Institute at Carnegie Mellon University to design and deploy the CRR. The goals and practices found in the assessment are derived from the CERT Resilience Management Model (CERT-RMM) Version 1.0.[8] The CRR was introduced in 2009 and received a significant revision in 2014.[9]
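The crosswalk in the self-assessment package relates each CRR practice to criteria of the NIST Cybersecurity Framework. A minimal sketch of how such a mapping can be represented and queried is shown below; the practice identifiers and the subcategory pairings are illustrative assumptions, not the contents of the official DHS crosswalk.

```python
# Illustrative sketch of a crosswalk lookup from CRR practices to NIST CSF
# subcategories. The practice identifiers and the subcategories paired with
# them are hypothetical examples, not the official crosswalk content.

CROSSWALK = {
    "AM:G1-Q1": ["ID.AM-1", "ID.AM-2"],   # assumed asset-inventory pairing
    "AM:G2-Q1": ["ID.AM-5"],              # assumed prioritization pairing
    "IM:G1-Q1": ["DE.AE-2", "RS.AN-1"],   # assumed event-analysis pairing
}

def csf_subcategories(practices_performed):
    """Return the NIST CSF subcategories touched by the performed practices."""
    covered = set()
    for practice in practices_performed:
        covered.update(CROSSWALK.get(practice, []))
    return sorted(covered)

print(csf_subcategories(["AM:G1-Q1", "IM:G1-Q1"]))
# ['DE.AE-2', 'ID.AM-1', 'ID.AM-2', 'RS.AN-1']
```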
Architecture
The CRR comprises 42 goals and 141 specific practices extracted from the CERT-RMM and organized into 10 domains:[10]
- Asset Management
- Controls Management
- Configuration and Change Management
- Vulnerability Management
- Incident Management
- Service Continuity Management
- Risk Management
- External Dependency Management
- Training and Awareness
- Situational Awareness
Each domain is composed of a purpose statement, a set of specific goals and associated practice questions unique to the domain, and a standard set of Maturity Indicator Level (MIL) questions. The MIL questions examine the institutionalization of practices within an organization. The performance of an organization is scored against a MIL scale.[11] This scale depicts capability divided into five levels: MIL1-Incomplete, MIL2-Performed, MIL3-Managed, MIL4-Measured, and MIL5-Defined.

Institutionalization means that cybersecurity practices become a deeper, more lasting part of the organization because they are managed and supported in meaningful ways. When cybersecurity practices become more institutionalized, or “embedded,” managers can have more confidence in the practices’ predictability and reliability. The practices also become more likely to be sustained during times of disruption or stress to the organization. Maturity can also lead to a tighter alignment between cybersecurity activities and the organization’s business drivers. For example, in more mature organizations, managers provide oversight to a particular domain and evaluate the effectiveness of the security activities the domain comprises.

The number of goals and practice questions varies by domain, but the set of MIL questions and the concepts they encompass are the same for all domains. All CRR questions have three possible responses: “Yes,” “No,” and “Incomplete.” The CRR measures the performance of an organization at the practice, goal, domain, and MIL levels. Scores are calculated for each individual model element and as aggregated totals. The scoring rubric establishes the following:
- Practices can be observed in one of three states: performed, incomplete, and not performed.
- A domain goal is achieved only if all of the practices related to the goal are achieved.
- A domain is fully achieved only if all the goals in the domain are achieved.
If the above conditions are met, the organization is said to be achieving the domain in a performed state: the practices that define the domain are observable, but no determination can be made about the degree to which these practices are
- repeatable under varying conditions
- consistently applied
- able to produce predictable and acceptable outcomes
- retained during times of stress
These conditions are tested for by applying a common set of 13 MIL questions to the domain, but only after MIL1 is achieved. Consistent with the architecture of the MIL scale, MILs are cumulative; to achieve a MIL in a specific domain, an organization must perform all of the practices in that level and in the preceding MILs. For example, an organization must perform all of the domain practices in MIL1 and MIL2 to achieve MIL2 in the domain.
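The scoring rules above lend themselves to a compact expression in code. The following is a minimal sketch, assuming a simple in-memory layout of goals and practice states; the function names, data structures, and the numeric level returned by `highest_mil` are illustrative assumptions, not part of the official CRR assessment tooling.

```python
# Minimal sketch of the CRR scoring rubric described above; the data layout
# and function names are illustrative, not the official CRR tooling.

PERFORMED, INCOMPLETE, NOT_PERFORMED = "performed", "incomplete", "not performed"

def goal_achieved(practice_states):
    """A goal is achieved only if every related practice is performed."""
    return all(state == PERFORMED for state in practice_states)

def domain_achieved(goals):
    """A domain is fully achieved only if all of its goals are achieved.

    `goals` maps goal names to the list of practice states for that goal.
    """
    return all(goal_achieved(states) for states in goals.values())

def highest_mil(base_achieved, higher_levels_met):
    """Apply the cumulative MIL rule for one domain.

    `base_achieved` is True once all of the domain's practices are performed,
    so the MIL questions apply. `higher_levels_met` holds one boolean per
    successive MIL level, True only if every MIL question at that level was
    answered "Yes". A level counts only if all preceding levels are also met.
    """
    if not base_achieved:
        return 0                      # MIL questions are not yet applied
    level = 1                         # base level: practices are performed
    for met in higher_levels_met:
        if not met:
            break
        level += 1
    return level

# Example: one goal fully performed, one goal with an incomplete practice.
asset_management = {
    "G1": [PERFORMED, PERFORMED],
    "G2": [PERFORMED, INCOMPLETE],
}
print(goal_achieved(asset_management["G1"]))   # True
print(domain_achieved(asset_management))       # False: G2 has an incomplete practice
print(highest_mil(domain_achieved(asset_management), [True, True, False]))  # 0
```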
Results
CRR participants receive a comprehensive report containing results for each question in all domains. The report also provides graphical summaries of the organization’s performance at the goal and domain levels, depicted in a heat-map matrix. This detailed representation allows organizations to target improvement at a fine-grained level. Organizations participating in facilitated CRRs receive an additional set of graphs depicting the performance of their organization compared to all other prior participants. The CRR report includes a potential path toward improving the performance of each practice. These options for consideration are primarily sourced from the CERT-RMM and NIST special publications. Organizations can also use CRR results to measure their performance in relation to the criteria of the NIST Cybersecurity Framework. This correlation feature was introduced in February 2014.[12]
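As a rough illustration of a goal- and domain-level heat-map summary, the sketch below renders a small text-based matrix from hypothetical goal scores; the scores, thresholds, and single-letter color buckets are assumptions for illustration and do not reproduce the format of the official CRR report.

```python
# Text-based heat-map sketch of goal-level performance by domain.
# The scores and the red/yellow/green thresholds are hypothetical.

goal_scores = {
    "Asset Management":          [1.0, 0.8, 0.5],   # fraction of practices performed per goal
    "Incident Management":       [0.9, 1.0],
    "Service Continuity Mgmt.":  [0.4, 0.6, 1.0],
}

def bucket(score):
    """Map a goal score to a coarse heat-map cell."""
    if score >= 0.9:
        return "G"   # green: goal (nearly) achieved
    if score >= 0.6:
        return "Y"   # yellow: partially achieved
    return "R"       # red: significant gap

for domain, scores in goal_scores.items():
    cells = " ".join(bucket(s) for s in scores)
    print(f"{domain:<28} {cells}")
```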
References
- ↑ "Cyber Resilience Review Fact Sheet". Retrieved 27 February 2015.
- ↑ "Cyber Resilience Review (CRR)". Retrieved 27 February 2015.
- ↑ "PCII Fact Sheet". Retrieved 27 February 2015.
- ↑ "Cyber Resilience Review (CRR)". Retrieved 27 February 2015.
- ↑ "DHS Cyber Community Voluntary Program". Retrieved 27 February 2015.
- ↑ "NIST Cybersecurity Framework Sheet". Retrieved 27 February 2015.
- ↑ "Cyber Resilience Review-NIST Cybersecurity Framework Crosswalk". Retrieved 27 February 2015.
- ↑ Caralli, R., Allen, J., & White, D. (2010). "CERT Resilience Management Model Version 1". Software Engineering Institute, Carnegie Mellon University.
- ↑ Mehravari, N. (2014). "Resilience Management Through the Use of CERT-RMM and Associated Success Stories". Software Engineering Institute, Carnegie Mellon University.
- ↑ "Cyber Resilience Method Description and User Guide". Retrieved 28 February 2015.
- ↑ Butkovic, M., & Caralli, R. (2013). "Advancing Cybersecurity Capability Measurement Using the CERT-RMM Maturity Indicator Level Scale". Software Engineering Institute, Carnegie Mellon University.
- ↑ Strassmann, P. (2014, September 8). "Cyber Resilience Review". Strassmann's Blog.