together correctly, and do so in reliable, repeatable, and deterministic ways for the overall system to have integrity.

      When we measure or assess information systems integrity, therefore, we can think of it in two ways.

       Binary: Either our information system has integrity or it does not. We can rely upon it or we cannot.

       Threshold-based: Our information system has at least a minimum level of systems and information integrity, enough to function reliably but possibly in a degraded way, either with higher-than-desired (but still acceptable) error rates or at reduced transaction throughput or volume levels; the sketch following this list shows one way such thresholds might be expressed.
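      To make the threshold idea concrete, here is a minimal sketch in Python of what such an assessment might look like; the metric names and the threshold values are hypothetical illustrations, not figures from the CBK.

          # Minimal sketch of a threshold-based integrity assessment; the
          # metrics and thresholds below are hypothetical illustrations.
          def integrity_status(error_rate: float, throughput_tps: float) -> str:
              """Classify integrity from an observed error rate and throughput."""
              MAX_ERROR_RATE = 0.001    # tolerable errors per transaction
              FULL_THROUGHPUT = 1000.0  # transactions/second at normal service
              MIN_THROUGHPUT = 250.0    # below this, we cannot rely on the system

              if error_rate > MAX_ERROR_RATE or throughput_tps < MIN_THROUGHPUT:
                  return "no integrity"              # the binary "cannot rely on it"
              if throughput_tps < FULL_THROUGHPUT:
                  return "degraded but acceptable"   # the threshold-based middle ground
              return "full integrity"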

      Note that in all but the simplest of business or organizational architectures, you'll find multiple sets of business logic and therefore business processes that interact with each other throughout overlapping cycles of processing. Some of these lines of business can function independently of each other, for a while, so long as the information and information systems that serve that line of business directly are working correctly (that is, have high enough levels of integrity).

       Retail online sales systems have customer-facing processes to inform customers about products, services, and special offers. Their shopping cart systems interact with merchandise catalog databases, as well as with order completion, payment processing, and order fulfillment. Customer sales order processing and fulfillment can occur—with high integrity—even though other systems that update the catalogs to reflect new products or services or bring new vendors and new product lines into the online store are not available.

       Computer-aided manufacturing systems have to control the flow of materials, parts, subassemblies, and finished products on the factory floor, interacting with logistics and warehousing functions on both the input and output sides of the assembly line. These systems are typically not tightly coupled with the functions of other business elements, such as finance, sales and marketing, or personnel management, even though at some point the assembly line grinds to a halt if finance hasn't paid the bills to suppliers in a timely way.

       REAL WORLD EXAMPLE: Trustworthiness Is Perceptual

      You make a decision to trust in what your systems are telling you. You choose to believe what the test results, the outputs of your monitoring systems, and your dashboards and control consoles are presenting to you as “ground truth,” the truth you could observe if you were right there on the ground where the event reported by your systems is taking place. Most of the time, you're safe in doing so.

      The operators of Iran's Natanz uranium enrichment plant believed what their control systems were reporting to them, even as the Stuxnet malware took control of both the processing equipment and the monitoring and display systems. Those displays lied to their users while Stuxnet drove the enrichment centrifuges to destroy themselves.

      An advanced persistent threat (APT) that gets deep into your systems can make them lie to you as well. Attackers have long used the techniques of perception management to disguise their actions and mislead their targets' defenders.

      Your defense: Find a separate and distinct means of verifying what your systems are telling you. Go out-of-band or out-of-channel and gather data in some other way, one as independent of your mainline systems as possible; use this alternative-source intelligence as a sanity check.
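      As a minimal sketch of that idea, the Python below compares what the primary (in-band) channel reports against an independent out-of-band measurement; read_inband and read_out_of_band are hypothetical stand-ins for your own two collection paths (say, the control console versus a separate, independently wired sensor).

          # Minimal sketch of an out-of-band sanity check: flag any divergence
          # between the in-band report and an independent measurement.
          from typing import Callable

          def sanity_check(read_inband: Callable[[], float],
                           read_out_of_band: Callable[[], float],
                           tolerance: float) -> bool:
              """Return True only if the two channels agree within tolerance."""
              reported = read_inband()          # what the dashboard claims
              observed = read_out_of_band()     # the independent ground truth
              if abs(reported - observed) > tolerance:
                  print(f"ALERT: in-band says {reported}, independent says {observed}")
                  return False
              return True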

      Integrity applies to three major elements of any information-centric set of processes: to the people who run and use them, to the data that the people need to use, and to the systems or tools that store, retrieve, manipulate, and share that data. Note, too, that many people in the IT and systems world talk about “what we know” in four very different but strongly related ways, sometimes referred to as D-I-K-W.

       Data consists of the individual facts, observations, or elements of a measurement, such as a person's name or their residential address.

       Information results when you process data in various ways; information is data plus conclusions or inferences.

       Knowledge is a set of broader, more general conclusions or principles that you've derived from lots of information.

       Wisdom is (arguably) the insightful application of knowledge; it is the “a-ha!” moment in which you recognize a new and powerful insight that you can apply to solve a problem, to take advantage of a new opportunity—or to resist the temptation to try!

      Figure 1.1 illustrates this knowledge pyramid.


       FIGURE 1.1 The DIKW knowledge pyramid
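      To make the first few layers of the pyramid concrete, here is a minimal sketch in Python; the login events and the derived rule are hypothetical illustrations.

          # Data: individual observed facts (hypothetical failed-login events).
          failed_logins = [
              {"user": "alice", "hour": 3},
              {"user": "alice", "hour": 4},
              {"user": "bob", "hour": 14},
          ]

          # Information: data plus an inference -- off-hours failures per user.
          off_hours_failures = {}
          for event in failed_logins:
              if event["hour"] < 6:
                  user = event["user"]
                  off_hours_failures[user] = off_hours_failures.get(user, 0) + 1

          # Knowledge: a broader principle distilled from lots of such
          # information, expressed here as a reusable rule.
          def looks_suspicious(user: str, threshold: int = 2) -> bool:
              return off_hours_failures.get(user, 0) >= threshold

          print(looks_suspicious("alice"))   # True; wisdom is knowing when to act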

      Professional opinion in the IT and information systems world is strongly divided over these DIKW distinctions, with roughly equal numbers of people holding that the four terms name the same basic ideas, that they are meaningfully different, and that the whole debate is unnecessary. Whether or not your organization uses knowledge management terminology like this, as an information security professional you'll be expected to combine experience, training, and the data you're observing from systems and people in real time to know whether an incident of interest is about to become a security issue. This is yet another example of just how many potentially conflicting, fuzzy viewpoints exist in IT and information security.

      Availability

      Is the data there when we need it in a form we can use?

      We make decisions based on information. Whether that is new information we have gathered (via our data acquisition systems) or knowledge and information we hold in memory, it's obvious that if the information is not where we need it, when we need it, we cannot make as good a decision as we otherwise might.

       The information might be in our files, but if we cannot retrieve it, organize it, and display it in ways that inform the decision, then the information isn't available.

       If the information has been deleted, by accident, sabotage, or systems failure, then it's not available to inform the decision.

      Note that availability means something different for a system than it does for the information the system produces for us. Systems availability is quantifiable, for example as a percentage of uptime or capacity, or as a throughput rate (see the sketch after this list). Information availability, by contrast, tells us one of three things.

       Yes, we have what we need to know to make this decision or take this action.

       No, we do not have what we need to know, so we have to decide blindly.

       We have some of what we need to know, and we cannot logically infer that what's missing won't cause our decision to be wrong and lead us to harm.
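      Because systems availability is quantifiable, it reduces to simple arithmetic; here is a minimal sketch, using a hypothetical measurement window and outage figure.

          # Minimal sketch: availability as a percentage of a measurement window.
          def availability_pct(window_minutes: float, downtime_minutes: float) -> float:
              """Return uptime as a percentage of the whole window."""
              return 100.0 * (window_minutes - downtime_minutes) / window_minutes

          month = 30 * 24 * 60                          # a 30-day month: 43,200 minutes
          print(f"{availability_pct(month, 43):.3f}%")  # 99.900% -- "three nines"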

      Accountability

      Information and information systems represent significant investments by organizations, and as a result, there's a strong bottom-line financial need to know that such investments are paying off—and that their value is not being diluted due to loss of control of that information (via a data breach or exfiltration) or loss or damage to the data's integrity or utility. Organizations have three functional or operational needs for information regarding accountability. First, they gather information about the use of corporate information and IT systems. Then they consolidate, analyze, and audit that usage information. Finally, they use the results of those reviews to inform decision-making. Due diligence needs, for example, are addressed by resource chargeback, which attributes the per-usage costs