Using Military Grade Security in Traditional Business
It is one thing to talk about securing a system, but quite another to determine how much and to what depth security should be applied. All too often we talk about securing something, but do not do so proactively based on a consistent model, much less one that takes into consideration the entire system: not just the server, but the network, interactions with other systems, applications, and data stores. DIACAP is an evolutionary approach to certification and accreditation that sets a common set of security criteria, taking into account the broad, interconnected nature of today’s technology infrastructures.
Risk management is a huge part of security and, for many, is the foundation for allocating security investments and security-related activities. Although a proven process, there are hundreds of different methods for performing a risk assessment and even more for interpreting the results. What seems to be lacking for some is a framework by which risk determinations can be made actionable and, more importantly, act as a feedback mechanism for ensuring the desired security posture over time. Many enterprise organizations are good at risk management and have developed, or tuned, existing models to their own needs and culture. As risk assessments are performed, they gain better visibility into risks as they relate to the business and use the results as the foundation for implementing or maintaining security. However, interestingly, not all have a model to translate risk results into tangible security. For example, a risk assessment result may be “5”; assuming for the moment that’s bad and something needs to be done, to which everyone agrees, expressing exactly what needs to be done is still elusive and open to interpretation.
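To make the “5” example concrete, here is a minimal sketch of a pre-agreed mapping from a risk score to a specific action. The score bands and actions are purely illustrative assumptions, not part of any standard; the point is only that the translation is decided once, up front, rather than argued per result.

```python
# Illustrative mapping from a numeric risk score to a concrete action.
# The bands and action text below are hypothetical examples.
RISK_ACTIONS = {
    range(0, 3): "accept: document the risk and monitor",
    range(3, 6): "mitigate: apply compensating controls within 90 days",
    range(6, 11): "remediate: escalate to the system owner and fix before go-live",
}

def action_for(score: int) -> str:
    """Translate a risk score into a pre-agreed, actionable response."""
    for band, action in RISK_ACTIONS.items():
        if score in band:
            return action
    raise ValueError(f"score {score} is outside the defined bands")

print(action_for(5))  # the elusive "5" now maps to a specific action
```

With a table like this agreed on in advance, a “5” stops being open to interpretation: everyone already knows it means mitigation on a deadline.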
DIACAP provides a framework for defining a system’s security level. This is the Mission Assurance Category (MAC), defined in DoDD 8500.01E and assigned to all DoD systems. The MAC reflects the importance of the system to the objectives of the organization, its criticality, and the confidentiality of the information it manages. In DoD terms, it is the importance of an information system to the successful completion of a DoD mission. In enterprise terms, it is the importance of a system in ensuring the success of the company and in meeting its goals. The MAC assignment, for which the DoD defines three levels, sets the minimum system requirements for availability and integrity. In short, MACs are driven by the system description (vital to mission success, important for mission support, or not necessary for mission or mission-support activities and related only to day-to-day operations), which is then compared to definitions of integrity and availability. From these the level of protection measures is categorized.
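The three MAC levels and their availability and integrity requirements can be sketched roughly as follows. The descriptions paraphrase the DoD definitions; the exact requirements come from the governing DoD issuances, so treat this as an orienting summary, not an authoritative table.

```python
from dataclasses import dataclass

# Rough sketch of the three DoD Mission Assurance Categories.
# Descriptions are paraphrased; consult the DoD issuances for exact wording.
@dataclass(frozen=True)
class MacLevel:
    name: str
    description: str
    integrity: str
    availability: str

MAC_LEVELS = {
    "MAC I": MacLevel("MAC I", "vital to mission success", "high", "high"),
    "MAC II": MacLevel("MAC II", "important to mission support", "high", "medium"),
    "MAC III": MacLevel("MAC III", "needed for day-to-day operations only", "basic", "basic"),
}

for level in MAC_LEVELS.values():
    print(f"{level.name}: integrity={level.integrity}, availability={level.availability}")
```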
Within the enterprise we’re starting to see more and more effort around data classification. In fact, a great number of DLP (Data Loss/Leak Protection/Prevention) solutions are initially implemented to discover, locate, and categorize information assets, eventually moving into a control role, their original intent. Of course, data classification is important so that businesses can understand where important information is, who is using it, and where it is going, all driving more informed decisions on security investments. Think of it as an inside-out approach to risk. Unfortunately, the identification and classification of information is extraordinarily difficult in the enterprise. Data is very fluid, changes state many times, and, more importantly, may change in value over time. Not to mention that value is in the eye of the beholder.
The DoD takes a different tack and decided to initially emphasize the system that houses and processes information relative to the mission. It’s slightly less about the information specifically and more about any information relative to a mission, on one or more systems, being afforded a specific level of protection. OK… allow me a moment to elaborate. The DoD is concerned about information in the traditional sense; I don’t want you to think that has changed. However, they have recognized the interconnected nature of systems and the fluctuations in information value. This can therefore be seen as protecting the information’s mission-enabling environment regardless of the actual information. By existing within the system of interest, the information is assumed to have a level of value, as opposed to being restricted to its specific state or content. Therefore, one can rightly assume that not all the information on a system needs that level of security. However, the value of that information may change given the system’s role in the mission. The decision, then, is an amalgamation of information and system role.
This, of course, is nothing new, especially in the government and DoD. For example, the NSA’s Information Assurance Training and Rating Program’s (IATRP) INFOSEC Assessment Methodology (IAM) (the program was decommissioned in late 2009 and revitalized by Ed Fuller as ISATRP) is founded on defining the system, and all that this implies. It starts, at a high level, with what information is important and what the impact would be if it were lost, stolen, or damaged. This isn’t about Word documents and spreadsheets; it’s “My customer’s information and records are critical and if stolen would destroy my company.” Then the IAM moves to quantifying the system (servers, networks, applications, databases, etc.) predominately used in the production and management of that information. From there the state of that system is determined and security controls are put in place relative to the impact (i.e., criticality) that system represents based on the information. (For the record, it’s very difficult to summarize something so comprehensive in a paragraph. If you want to know more, read my friend Ed’s book, Security Assessment: Case Studies for Implementing the NSA IAM.) The point is that, given the fluidity of data, we can start to have a meaningful impact by first quantifying the system. This is the foundation of DIACAP and has proven (via the NSA IAM) extraordinarily useful in the enterprise.
Within the enterprise this is performed in different ways. For example, when building a system to be introduced into the business, we start with what is expected of that system, which is typically not aligned with a business-role and mission-definition standard and is more about performance or sales and marketing. Then we put information on the system, which provides some of the base elements for performing a risk assessment that in turn drives security decisions. Comparatively, the DoD starts with the system’s role and relevance to the mission. This one act alone sets in motion a meaningful path to security. From it we understand the system’s importance, the type of information that can be placed on the system, the applicable threats, and a myriad of other characteristics that ensure security and the system are aligned. This is possible because the definition of the system, and all the things discussed, are pre-defined and organized with clear responsibilities for management and system owners. It is this final element that does not exist in many enterprise organizations today. Yes, we have standard builds and security standards for each, but these are very much directed at the initialization of a system and, more importantly, are typically about a “system” as in a server, and may not take the environment into account.
In addition to being assigned a MAC, a DoD system is assigned a confidentiality level (CL). The CL is based on the classification of the information that is to be on the system, such as classified, sensitive, or public. The combination (a very important word) of MAC and CL defines the security controls (i.e., IA Controls) needed. I say “combination” is important because you can have a mission-critical system with public information (technically speaking). Within the private sector these are rarely intermingled, and one will usually take precedence. Or worse, the system definition is used as the guide to information classification. In working with one company, it was found that the systems with the most security and BCDR controls did not in fact house the information that was critical to the company. It was assumed that the “X” system, which was used for product management, was the critical IT feature, but in fact the heart of the business was the information on a few servers in some far-off closet.
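The independence of the two dimensions is the key idea, and a small sketch makes it concrete. The baseline descriptions below are placeholders of my own; in DoD practice the MAC/CL pairing selects a specific set of IA Controls from the governing instructions.

```python
# Sketch: MAC and CL combine, independently, to select control requirements.
# Baseline strings are illustrative placeholders, not DoD terminology.
MACS = ("MAC I", "MAC II", "MAC III")
CONF_LEVELS = ("classified", "sensitive", "public")

def control_baseline(mac: str, cl: str) -> str:
    if mac not in MACS or cl not in CONF_LEVELS:
        raise ValueError("unknown MAC or confidentiality level")
    # Availability/integrity controls come from the MAC; confidentiality
    # controls come from the CL. Neither dimension overrides the other,
    # which is why a mission-critical system can legitimately hold
    # public data and still warrant MAC I protections.
    return f"{mac} availability/integrity controls + {cl} confidentiality controls"

print(control_baseline("MAC I", "public"))
```

Contrast this with the enterprise habit described above, where one dimension usually swallows the other and a “public data” label quietly downgrades an otherwise mission-critical system.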
DIACAP organizes IA controls into eight categories. To be an IA control, a control must have certain characteristics: testable, as in the ability to validate its existence and effectiveness; measurable, as in the ability to quantify compliance with the DIACAP program; assignable, as in the implementation, management, testing, and measuring of the control must be assignable to an individual; and, finally, accountable. Seeing that an individual can be assigned responsibility for the IA control, this in turn demands accountability for the control. Within the private sector, many of the characteristics that must exist for something to qualify as an IA control may be practiced deliberately or organically. For example, the ability to test a control seems a natural characteristic of a control. However, we start to see areas where private-sector companies break down, such as measurable, assignable, and, most importantly, accountable. Measuring controls in the enterprise is getting much better and is becoming core to risk management. We also see some activity in assigning control management to people, mostly in the form of groups, which have management. Accountability, however, is far rarer. I’ve been in my share of meetings where it was impossible to determine who was accountable for a control. Yes, of course a manager would eventually step up and say it was in their “domain of responsibility,” but that’s lackluster at best. I think a great deal can be gained by implementing MAC, CL, and IA Control frameworks that reach past what and how, all the way to who.
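The four characteristics translate naturally into a record you could keep for every control. This is a sketch with field names of my own choosing, not DIACAP terminology, but it shows how forcing each field to be filled in exposes exactly the gaps described above: a control with a test procedure but no named accountable person simply does not qualify.

```python
from dataclasses import dataclass

# Sketch of the four IA-control characteristics as required fields.
# Field names are illustrative, not DIACAP terminology.
@dataclass
class IAControl:
    control_id: str
    test_procedure: str  # testable: how existence/effectiveness is validated
    metric: str          # measurable: how compliance is quantified
    assigned_to: str     # assignable: a named individual, not just a group
    accountable: str     # accountable: the person who answers for its state

    def is_complete(self) -> bool:
        """A control missing any characteristic is not a valid IA control."""
        return all([self.test_procedure, self.metric,
                    self.assigned_to, self.accountable])

ok = IAControl("AC-01", "quarterly access review", "% accounts reviewed",
               "j.smith", "j.smith")
gap = IAControl("AC-02", "firewall rule audit", "% rules justified",
                "network team", "")  # group assigned, nobody accountable
print(ok.is_complete(), gap.is_complete())
```

The second record is exactly the meeting-room scenario: a group “owns” the control, yet the accountable field is empty, and the check flags it.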
As part of DIACAP, there are a number of systems and materials dedicated to assisting DoD organizations in implementing DIACAP and the IA Controls. Some of these you can use yourself, such as the STIGs. No, not the anonymous race car driver on Top Gear, but the Security Technical Implementation Guides. The STIGs provide a great deal of information, guidance, checklists, and even tools to help with implementing security controls. Moreover, they dovetail nicely with the NSA Guides and NIST CSRC’s Special Publications (SP 800 series). You can easily download an ISO of all the STIGs and tools; it’s freely available, and a new image was published recently.
The DIACAP implementation is broken into three areas: the DIACAP processes used in certification and accreditation; the DIACAP Knowledge Services (KS), which house tools and standards for applying security controls; and the automated certification and accreditation (C&A) process used to support, manage, and report on DIACAP implementation and status, which is the Enterprise Mission Assurance Support Services (eMASS) system. The C&A process provides a mechanism to manage risk. For example, once a system is certified, based on validation processes, you must determine the residual risk, and if this is acceptable the system is accredited. In short, risk management is a critical factor in the entire DIACAP process, and this provides the first link to how it can be applied within the enterprise. While these systems, such as eMASS, are not for use in the private sector (not yet anyway), systems serving the same purpose can be readily built for the enterprise. Microsoft SharePoint is a good start, Archer provides a security product that can be tuned to this need, and there are several other publicly available products that can be used to create a system for managing system security.
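The accreditation decision itself is a simple gate, and if you build your own tracking system, it is worth encoding explicitly. This sketch assumes a numeric residual-risk score and an acceptance threshold, both of which are illustrative choices of mine rather than anything DIACAP prescribes.

```python
# Sketch of the accreditation gate: a certified system is accredited only
# if its residual risk is acceptable. The scale and threshold are
# illustrative assumptions, not DIACAP-defined values.
def accreditation_decision(certified: bool, residual_risk: float,
                           threshold: float = 3.0) -> str:
    if not certified:
        return "not accredited: certification incomplete"
    if residual_risk <= threshold:
        return "accredited"
    return "denied: residual risk exceeds acceptable threshold"

print(accreditation_decision(True, 2.0))
print(accreditation_decision(True, 7.5))
```

Encoding the gate this way also gives the feedback loop mentioned earlier: when residual risk drifts above the threshold during a periodic review, the accreditation status visibly changes instead of quietly decaying.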
The critical part, as with all things, is management, specifically program management. Once you organize the security program to support system security using some or all of the methods I’ve shared, implementation is 99% oversight. Security, in many ways, is like painting a bridge: a constant process, not an end-state. Therefore, consistency and continuous management are essential.
There is so much to DIACAP, and I tried hard to do it some justice with this short series. It is a very comprehensive process, but it is, as I’ve stated, elegant, something desperately needed in the traditional enterprise. Everything about DIACAP pretty much makes sense. More importantly, in my study of DIACAP I’ve found that what companies need today is real, solid justification for security investments. Moreover, business owners want more tangible information concerning existing investments and want to know whether the money they’ve already spent is living up to expectations. These expectations include not only advertised effectiveness, but also how well the money was applied and how efficiently security is functioning as an organization. Interestingly, for a government program you’d think features such as effectiveness, efficiency, and investment-conscious information would be impossible to extract, but the reality is that DIACAP’s focus on ensuring the correct security for a system and its information provides far more “business” visibility than what most people are practicing today. While the government is concerned with appropriate security being applied and managed, with arguably little focus on cost, the underlying reality is that it can be readily aligned to cost, and in business terms.
Today, security organizations fight hard for every dollar and use risk and compliance management as a primary method for garnering attention and funds. And in most cases it’s an “after the fact,” defensive posture for security. It’s very difficult to justify security, especially from a reactive posture. By taking the lead on quantifying security in ways the business can understand, and supporting that approach with a well-organized model that addresses the entire lifecycle, the business can be assured that it is spending exactly what is necessary on the areas of the environment that require it.
Of course, while risk management is a feature of DIACAP, it is not the only feature. Risk is open to interpretation, and there are a number of factors that can make the results of a risk assessment questionable. On its own, it is a “FUD”-based approach to security and does not necessarily give the business the entire picture of what is truly needed. However, using the risk management you have today to organize a model for categorization, within a holistic system that supports the entire lifecycle, lends greater credibility to the security group and demonstrates that there is a well-defined plan to apply what is needed, not what is assumed, best practice, or what others are doing.
Even if you think I’ve lost my marbles, or assume you’re already doing all the things I’ve explained, take the time to do some more research. Learn more about DIACAP and the nuances I haven’t covered. If you’re a business-minded security professional, you’ll quickly see past the big-brother, red-tape parade and begin to see the underlying genius of it. There are parts of DIACAP that I can see would be too expensive or simply too difficult to realize in a business environment, but you don’t have to do it exactly as prescribed. Take a complete look at what is there, and I think you’ll be surprised at what you can put into practice in your company.
Lastly, here is a set of links I recommend to learn more: