
CS 564 Software Requirements Engineering Lecture 11

Professor Larry Bernstein
End Game: November 30: Lecture 11; December 7: Lecture 12; December 14: Final Exam.
Please study all handouts, as they supplement the material in the prerelease of the textbook I gave you.
Risk Analysis

Risk evaluation with discounted cash flow analysis


Consider this product with three phases:

Phase 1: R&D, $18M/year for 2 years; probability of success at the end: 60%.
Phase 2: Market Development, $10M/year for 2 years, starting in the second year.
Phase 3: Sales, 3 possible scenarios, starting in year 4:
1. $24M/year for 20 years (probability = .3)
2. $12M/year for 10 years (probability = .5)
3. Abandon product (probability = .2)

Cash flow for this product:
Year 1 = -$18M
Year 2 = -$28M
Year 3 = .6 x -$10M
Years 4-13 = (.6 x .3 x $24M) + (.6 x .5 x $12M)
Years 14-23 = .6 x .3 x $24M

To discount the cash flow, compute today's value of future money using this formula:

NPV = CF / (1 + IR)^N

NPV = Net Present Value
CF = Cash Flow
IR = Interest Rate (3% for example)

N = number of years

Example: Year 2: the cash flow is -$28M; the discounted cash flow at a 3% interest rate is -$26.39M. To get an idea of what the project is worth, one should discount the cash flow for each year and add them together.

Rate of return = Net Profits / Net Costs (should be at least 10%)

How do we estimate future sales? What products will take off? A blend of sales and R&D talent should be used:
- R&D favors disruptive changes, but may be too disruptive.
- Sales favors continuous improvement, but may miss big new opportunities.

Example: "A tape recorder that does not record." What became known as the "Walkman" was at first snubbed by Sony's salespeople, who did not see its potential, while R&D people were able to imagine new uses for the product and pushed to finally make it happen.
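A minimal sketch of this discounted cash flow computation, assuming the 3% rate and the illustrative expected cash flows worked out above (all dollar figures are the hypothetical ones from the example, in $M):

def present_value(cash_flow, rate, year):
    # Discount a single year's cash flow back to today's dollars.
    return cash_flow / (1 + rate) ** year

def npv(cash_flows, rate):
    # Net Present Value: sum of the discounted yearly cash flows.
    return sum(present_value(cf, rate, year) for year, cf in cash_flows)

rate = 0.03  # 3% interest rate, as in the example
cash_flows = (
    [(1, -18.0), (2, -28.0), (3, 0.6 * -10.0)]                          # R&D and market development
    + [(y, 0.6 * 0.3 * 24.0 + 0.6 * 0.5 * 12.0) for y in range(4, 14)]  # sales, years 4-13
    + [(y, 0.6 * 0.3 * 24.0) for y in range(14, 24)]                    # sales, years 14-23
)

print(round(present_value(-28.0, rate, 2), 2))  # about -26.39, matching the example
print(round(npv(cash_flows, rate), 2))          # rough worth of the whole project in $M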

Other factors increasing risk

1. Excessive Schedule Pressure (65% of projects)
2. Management Malpractice
3. Inaccurate and Inadequate Metrics
4. Poor Cost Estimates
5. Silver Bullet Syndrome
6. Creeping Features
7. Quality
8. Size

Risk Do's and Don'ts


Don't overestimate the risks: it leads to too much contingency planning.
Don't underestimate the risks: it leads to panic management later.
Don't look for scapegoats.
Do deal only with the top 10 priorities; as they get solved, add to the list.

Quantitative Computation

P(E) = m / n
P = probability, m = favorable events, n = total events

Risk = 1 - P(E)
Risk Exposure = Risk x Costs

The Spiral Model requires a risk analysis after the prototype (1st cycle).
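The formulas above translate directly into code. In the sketch below, the event counts and the cost figure are made-up illustrative values:

def probability(favorable_events, total_events):
    # P(E) = m / n
    return favorable_events / total_events

def risk(favorable_events, total_events):
    # Risk = 1 - P(E): the chance that the favorable outcome does not occur.
    return 1 - probability(favorable_events, total_events)

def risk_exposure(favorable_events, total_events, cost):
    # Risk Exposure = Risk x cost of the loss.
    return risk(favorable_events, total_events) * cost

# Hypothetical example: 7 of 10 comparable projects met their schedule,
# and a schedule slip would cost $500,000.
print(risk_exposure(7, 10, 500_000))  # 0.3 * 500000 = 150000.0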

Risk Management for IT Security


Rick Kazman, Dan Port (Information Technology Management, University of Hawaii)
David Klappholz (Computer Science, Stevens Institute of Technology)

1. Introduction
   1.1 Security Risk Assessment
   1.2 IT Security Risk Control
   1.3 Risk Management in Practice
2. Risk Assessment Methodologies
   2.1 OCTAVE
   2.2 SRMD
   2.3 FRAP
   2.4 Quantitative versus Qualitative Approaches
3. Management of Information Security Standards
   3.1 TCSEC, ITSEC, CTCPEC, Common Criteria, and ISO 15408
   3.2 BS 7799, ISO 17799, and ISO TR 13335 (GMITS)
   3.3 HIPAA
   3.4 SSE-CMM and ISO/IEC 21827
   3.5 NIST Guidance Documents
4. Risk Models
   4.1 Definitions
   4.2 Strategic Risk Models
   4.3 Strategic Risk Management Methods
   4.4 The Need for Strategic Risk Management Methods
5. Practical Strategic Risk Models
   5.1 Multi-technique Strategic Methods
   5.2 Strategic Decision Making and Competing Risks
   5.3 Risk of Delay
   5.4 Balancing Competing Risks for Strategic Planning
   5.5 Unsuitable Sweet Spots
6. Practical Risk Exposure Estimation
   6.1 Qualitative Methods
   6.2 Empirical Approaches
   6.3 Pitfalls to Avoid
7. Summary

Key words: risk, security risk, risk assessment, risk control, risk management, risk exposure, strategic methods
Abstract: Dealing with risk is critical to the success of any engineering or business endeavor. Considering the nature of IT and considering recent events, this is especially true in the case of risks to IT security. We define the various notions associated with the assessment and management of risk in general and of IT security risk in particular, and provide both concrete examples of IT security risks and categorizations of well-known risks. We also review the various IT guidelines and standards that have IT security risk as major components. Finally, we detail approaches to dealing with IT security risk, with an emphasis on strategic approaches.

1. INTRODUCTION
According to [Carr93], risks must be managed, and risk management must be part of any mature organization's overall management practices and management structure. The primary activities identified by [Carr93] for managing risk are:
- Identify: risks must first be identified before they can be managed.
- Analyze: risks must be analyzed so that management can make prudent decisions about them.
- Plan: for information about a risk to be turned into action, a detailed plan, outlining both present and potential future actions, must be created. These actions may mitigate the risk, avoid the risk, or even accept the risk.
- Track: risks, whether they have been acted upon or not, must be tracked, so that management can continue to exercise diligence.
- Control: even if a risk has been identified and addressed, it must be continually controlled, to monitor for any deviations.
The key activity tying all of these together is assessment. Assessment is considered central to the risk management process, underlying all of the other activities.
For the purposes of exposition, we will follow the generic risk taxonomy shown in Figure 1¹ [Boehm91]. In this taxonomy the activity of risk management has two major sub-activities: risk assessment and risk control. Risk assessment is further divided into risk identification, risk analysis, and risk prioritization. Risk control is divided into risk management planning, risk resolution, and risk monitoring. While we will broadly discuss several areas of risk management, our focus in this chapter is primarily on risk assessment, as it applies to IT security. Assessment is the starting point and forms the fundamental basis for all risk management activities. Many risk assessment methods and techniques have directly analogous application to risk control. In such cases we will note that this is the case, without elaboration.
The terminology used in the field of risk management varies somewhat among the different business and engineering areas in which it is used (e.g. see [Carr93, Hall98, Boehm91]). It even varies among writers in the field of IT security risk management. The generic risk management concepts that we have just introduced were created for software development (of which security is one attribute).

¹ In Figure 1, for application to security, the examples listed for Risk Analysis might include security models, threat analysis, and vulnerability factor analysis.

The reader familiar with other works on IT security risk management should have little trouble seeing the direct applications. In this section we will define terms informally, with examples; in later sections, we will formalize these definitions. Although most people are unaware that they're doing it, we all engage in risk management on a daily basis. Consider, as an example, a decision, on the way out the door, on whether to stuff an umbrella into an uncomfortably heavy bag that will be taken on a thirty-minute train ride, followed by a ten-minute walk² to the office. The decision is based on a quick, often almost unconscious, assessment of the risks involved and a decision on how to control them.

Figure 1: Boehm's Risk Management Taxonomy

On the one hand, there's the probability that the rain predicted by the TV forecaster will actually materialize, that it will be in progress during the drive to the train station and/or during the walk, and, if all goes as badly as it might, of the damage it would cause, from the point of view of both walking in drenched clothing and, possibly, losing work time during the drying-out period. Balanced against all of this, on the other hand, is the discomfort of carrying the extra weight, and the possibility of the precariously situated umbrella dropping out of the bag and, as it did last week, causing a spillage of hot carry-out coffee during the effort to pick it up.
² This example, as well as a number of others in this section, is taken, albeit with considerably more detail, from [NIST].

An additional consideration is the probability that carrying the umbrella will solve the problem, a consideration that depends upon the expected strength of prevailing winds; if the wind proves to be too strong, the umbrella will provide no relief from the rain. An alternative possibility to consider, assuming it's an option, is to work at home all morning and to go to the office only after the rain, or its un-materialized threat, has abated.

1.1 Security Risk Assessment
In its typical definition, IT security involves protection of the confidentiality, integrity, and availability of data/information critical to the success of a business or government organization. Naturally, it also involves protection, from injury and death, of the people involved in dealing with that information. The following are examples of consequences that can result from materialization of risks in the areas of confidentiality, integrity, and availability:
- loss of confidentiality:
  o personal embarrassment resulting from theft and publication of personal financial, health, or other data, and possible prosecution and fine for individuals and the organization responsible for maintaining confidentiality
  o corporate loss of earnings resulting from theft of pre-patent technical data
  o loss of life of a covert intelligence agent resulting from theft and revelation of name and address
- loss of integrity:
  o personal embarrassment and, possibly, fine and imprisonment resulting from insertion into a database of false financial data implicating the subject in fraud or embezzlement
  o loss of corporate auditors' ability to detect embezzlement, and attendant loss of funds, resulting from deliberate corruption of financial data by an embezzler
  o loss of life resulting from changes to a database indicating that the subject is a covert agent when s/he isn't
- loss of availability:
  o personal embarrassment resulting from inability to keep appointments resulting from temporary inability to use an electronic calendar
  o temporary inability of a corporation to issue weekly pay checks to employees, with attendant anger and loss of productivity, resulting from temporary unavailability of hours-worked data
  o loss of initiative, armaments, and lives resulting from a battlefield commander's inability to connect to a field-support database

The terms threat, threat source, vulnerability, impact, and risk exposure are in common use in the field of IT security risk assessment. Their application to the trip-to-work scenario is as follows:
- the threat, or threat source, is the onset of rain, at a sufficiently strong level, during an exposed part of the trip to work
- the vulnerability is the fact that the person involved will get drenched if the threat materializes and the person has no form of shelter (e.g. an umbrella)
- the impact is the damage, measured in terms of discomfort and, possibly, of loss of productivity or even health, that will occur if the threat materializes
- the risk exposure is an assessment, on either a numerical, perhaps monetary, scale or an ordinal scale (e.g., low, medium, or high), of the expected magnitude of the loss given the threat, the vulnerability to it, and its impact, should the threat materialize. In this example the risk exposure might change if the person is wearing a water-resistant coat.

The first step in risk assessment is risk identification, i.e., identification of potential threats, of vulnerabilities to those threats, and of impacts that would result should they materialize, all of which we've already done for the scenario under discussion. In the trip-to-work scenario, we are concerned with such intangibles as the threat of rain, the vulnerability of getting drenched, and the impact of discomfort, and with such tangibles as umbrellas. In the field of IT security, we are concerned with systems that store, process, and transmit data/information. Information systems are sometimes localized, and sometimes widely distributed; they involve computer hardware and software, as well as other physical and human assets. Tangibles include the various sorts of equipment and media, and the sites in which they and staff are housed. Intangibles include such notions as organizational reputation, opportunity or loss of same, productivity or loss of same, etc. Threat sources are of at least three varieties: natural, human, and environmental. Examples are:
- natural: electrical storms, monsoons, hurricanes, tornadoes, floods, avalanches, volcanic eruptions
- human: incorrect data entry (unintentional), forgetting to lock a door (unintentional), deliberately leaving a door unlocked to enable a confederate to enter after hours (intentional), denial of service attack (intentional), creation and propagation of viruses (intentional)
- environmental: failure of a roof or wall due to use of bad construction materials, seepage of toxic chemicals through a ceiling, power outage.

Vulnerabilities have various sources, including technical failings such as those reported in the public and professional presses on a daily basis.³ Fred Cohen provides an excellent, extensive taxonomy of threats and vulnerabilities in his Security Database [Cohen04].
³ A compilation of technical threats may be found at http://icat.nist.gov or http://www.cert.org

One unique aspect of this database is that the threats (or causes) are cross-referenced against the attack mechanisms to provide a linkage between the cause and the mechanisms used. The attack mechanisms are also cross-referenced against the defence mechanisms to indicate which mechanisms might be effective in some circumstances against those attack mechanisms.
The second step in risk assessment is risk analysis, i.e., estimation and calculation of the risk likelihoods (i.e. probabilities), magnitudes, impacts and dependencies. This is easy in the case of monetary impacts arising from threat-vulnerability pairs whose probability of materializing can reasonably be computed, but considerably harder in most other cases. Special care must be taken when assigning the likelihoods, as the quality of the whole risk assessment is strongly dependent on the accuracy and realism of the assigned probabilities.
The final step in risk assessment is risk prioritization, that is, prioritizing all risks with respect to the organization's relative exposures to them. It is typically necessary to utilize techniques that enable risk comparison, such as calculating risk exposure in terms of potential loss. In the trip-to-work scenario, risks other than the one discussed above might include the risks associated with not buckling the seat belt during the drive to the station, the risk of an accident during the drive, the risk of missing the train, etc. A meticulous person, one who always leaves the house earlier than necessary and who is very conscious of taking safety precautions, will likely rate these new risks as having far lower exposures than the rain risk; a less meticulous person might do otherwise. In a highly simplified version of a business situation, three threats might be volcanic eruption, late delivery of raw materials, and embezzlement. An organization located in Chicago would likely assign a lower priority to volcanic eruption than would one in south-western Washington; an organization whose suppliers have never before been late would likely assign a lower priority to late delivery than would an organization using a supplier for the first time. Be aware that there may be threats or vulnerabilities you have not included in your analysis. For this reason, you should draw upon the experiences of others to help build a library of threats and vulnerabilities.

1.2 IT Security Risk Control
During the risk control phase of the risk management process, we are concerned with safeguards, also known as controls. Safeguards fit into at least three categories: technical, management, and operational, with examples as follows:
- technical: authentication (prevention), authorization (prevention), access control (prevention), intrusion detection (detection), audit (detection), automatic backup (recovery), etc.
- management: assignment of guards to critical venues (prevention), institution of user account initiation and termination procedures (prevention), institution of a need-to-know data access policy (prevention), institution of a periodic risk re-assessment policy (prevention), institution of organization-wide security training (prevention and detection), etc.

- operational: secure network hardware from access by any but authorized network administrators and/or service personnel (prevention), bolt desktop PCs to desks (prevention), screen outsiders before permitting entry (prevention), set up and monitor motion alarms, sensors, and closed circuit TV (detection of physical threats), set up and monitor smoke detectors, gas detectors, and fire alarms (detection of environmental threats)

In the trip-to-work scenario, the safeguard that we have considered has a technical component, i.e., the umbrella, and a (fairly low-tech) operational component, i.e., carrying the umbrella. An alternative operational safeguard would be to work at home all morning, if that's an option, and to go to the office only after the rain, or the threat of rain, has abated.

During the risk control phase of the risk management process, we:
- consider alternative individual safeguards and/or complexes of safeguards that might be used to eliminate, reduce, or mitigate exposures to the various identified, analyzed, and prioritized threats (risk management planning)
- perform the cost-benefit analysis required to decide which specific safeguards to employ, and institute the relevant safeguards (risk resolution)
- institute a process for the continuous monitoring of the IT security situation to detect and resolve problems as they arise, and to decide, where and when necessary, to update or change the system of safeguards (risk monitoring)

In business and government organizations, there may be many alternative possible safeguards, including different vendors' hardware and/or software solutions to a particular threat or cluster of threats, alternative management or procedural safeguards, and alternative combinations of technical, management, and procedural safeguards. Complicating matters is the likelihood that different combinations of safeguards address different, overlapping clusters of threats. Risk resolution begins with the cost-benefit analysis of the various possible safeguards and controls in lowering, to an acceptable level, the assessed risk exposures resulting from the various identified threats, vulnerabilities, and the attendant impacts. Just as we had to consider threats, vulnerabilities, and impacts during risk assessment, we must consider safeguards, their costs, and their efficacies during risk resolution. In the trip-to-work scenario, the marginal cost of the technical component of the umbrella-carrying safeguard is likely nil, as most people already own umbrellas; the operational cost is the discomfort of carrying a heavier bag. Even in this simple example, the safeguard's efficacy must be considered. For example, if the wind proves to be too strong, the umbrella will not effectively reduce the exposure.

1.3 Risk Management in Practice
The potential consequences of the materialization of a significant threat to a business or government organization, and the obvious fact that risk management can greatly reduce those consequences, make it eminently clear that no such organization can afford not to engage in a serious risk management effort. The smaller the organization and the simpler

the threats, the less formal the organization's risk management effort need be. For a small organization, e.g., a single retail store belonging to a family, very informal risk management may suffice. In many situations legal statutes and acquisition policies give an organization no choice; see Section 3 of this chapter. The Hawthorne Principle states that productivity increases as a result of simply paying attention to workers' environments. It is likely, by analogy, that a simple concern with risk management produces significant, though certainly not optimal, results.
On the other hand, it should also be clear that risk management is rarely easy. In the case of the rare threat-vulnerability pair whose probability can easily be assigned a numerical value, whose impact can be assigned a precise monetary value, and for which there exists a safeguard whose cost and efficacy can be pinned down numerically, there is no problem. In other cases, those in which one or more of the parameters can, at best, be placed on an ordinal scale, matters are more complicated and approximate methods must be used. Consider, as examples, the quantification of the impact of:
- personal embarrassment resulting from: theft and publication of personal financial, health, or other data; insertion into a database of false financial data implicating the subject in fraud or embezzlement; inability to keep appointments resulting from temporary inability to use an electronic calendar
- corporate loss of earnings resulting from theft of pre-patent technical data
- loss of corporate auditors' ability to detect embezzlement, and attendant loss of funds, resulting from deliberate corruption of financial data by an embezzler
- temporary inability of a corporation to issue weekly pay checks to employees, with attendant anger and loss of productivity, resulting from temporary unavailability of hours-worked data
- loss of life: of a covert intelligence agent resulting from theft and revelation of name and address; resulting from changes to a database indicating that the subject is a covert agent when s/he isn't; resulting from a battlefield commander's inability to connect to a field-support database.
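Where impacts resist precise monetary quantification, one common approximation is to score likelihood and impact on ordinal scales and rank risks by the combined score. The sketch below is a hypothetical illustration of that idea; the three-level scales, the scoring, and the example risks are assumptions for illustration, not part of any particular standard:

LIKELIHOOD = {"low": 1, "medium": 2, "high": 3}
IMPACT = {"low": 1, "medium": 2, "high": 3}

def exposure(likelihood, impact):
    # Ordinal risk exposure: product of the two ordinal scores.
    return LIKELIHOOD[likelihood] * IMPACT[impact]

# Invented example risks with ordinal ratings (not from any standard).
risks = [
    ("theft of pre-patent technical data", "low", "high"),
    ("temporary loss of hours-worked data", "medium", "medium"),
    ("false data implicating the subject in fraud", "low", "high"),
]

# Rank the risks, highest combined score first, to support prioritization.
for name, lik, imp in sorted(risks, key=lambda r: exposure(r[1], r[2]), reverse=True):
    print(f"{name:45s} likelihood={lik:6s} impact={imp:6s} score={exposure(lik, imp)}")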

To aid in the identification and management of risks, a number of risk management methods and risk taxonomies have been created. The SEI's risk taxonomy, for example, divides risk into three classes: product engineering, development environment, and program constraints. The first level of decomposition of each of these classes is given in Figure 2.

A. Product Engineering
  1. Requirements: a. Stability, b. Completeness, c. Clarity, d. Validity, e. Feasibility, f. Precedent, g. Scale
  2. Design: a. Functionality, b. Difficulty, c. Interfaces, d. Performance, e. Testability, f. Hardware Constraints, g. Non-Developmental SW
  3. Code and Unit Test: a. Feasibility, b. Testing, c. Coding/Implementation
  4. Integration and Test: a. Environment, b. Product, c. System
  5. Engineering Specialties: a. Maintainability, b. Reliability, c. Safety, d. Security, e. Human Factors, f. Specifications

B. Development Environment
  1. Development Process: a. Formality, b. Suitability, c. Process Control, d. Familiarity, e. Product Control
  2. Development System: a. Capacity, b. Suitability, c. Usability, d. Familiarity, e. Reliability, f. System Support, g. Deliverability
  3. Management Process: a. Planning, b. Project Organization, c. Management Experience, d. Program Interfaces
  4. Management Methods: a. Monitoring, b. Personnel Management, c. Quality Assurance, d. Configuration Management
  5. Work Environment: a. Quality Attitude, b. Cooperation, c. Communication, d. Morale

C. Program Constraints
  1. Resources: a. Schedule, b. Staff, c. Budget, d. Facilities
  2. Contract: a. Type of Contract, b. Restrictions, c. Dependencies
  3. Program Interfaces: a. Customer, b. Associate Contractors, c. Subcontractors, d. Prime Contractor, e. Corporate Management, f. Vendors, g. Politics

Figure 2: Taxonomy of Software Development Risks (from [Carr93])

This taxonomy is used to "drive" a risk assessment method. For each class (such as Product Engineering), for each element within that class (such as Design), and for each attribute of that element (such as Performance), there is a set of questions that serve to guide the risk analyst. Depending on the answers to these questions, the analyst might be guided to still further questions, probing the nature of the risk. For example, an analyst looking at performance risks would first ask if a performance analysis has been done. If the answer is "yes", a follow-on question would ask about the level of confidence in this analysis. Note that security is just one attribute, located under the "Engineering Specialties" element. Clearly, to be able to manage security risks we need to delve more deeply into the elements and attributes that are particular to security. We will give some examples of risk management methods tailored for security in section 2.

2. Risk Assessment Methodologies

According to the ISO 17799 Information Security Standard [ISO], risk assessment consists of the:
"Assessment of threats to, impacts on and vulnerabilities of information and information processing facilities and the likelihood of their occurrence"
and risk management consists of the:
"Process of identifying, controlling and minimizing or eliminating risks that may affect information systems, for an acceptable cost."
The bulk of this chapter thus far has been addressing IT security risk assessment. Now we turn to the broader topic of integrating security risk assessment with security risk management. The notion of risk management was introduced in Section 1. In this section we will examine and compare some of the most common and widely used risk assessment methods for security management. These are: Microsoft's Security Risk Management Discipline (SRMD) [SRMD]; the OCTAVE (Operationally Critical Threat, Asset, and Vulnerability Evaluation) method, developed by the Software Engineering Institute and the CERT (Computer Emergency Response Team) at Carnegie Mellon University [Alberts02]; and the Facilitated Risk Assessment Process, developed by Peltier [Peltier03]. To facilitate our comparison of these methods, we will look at their steps and their organization. Specifically we will examine how they:
1. establish a context and goals for the analysis,
2. focus the inquiry,
3. perform the analysis, and
4. close the loop, tying analysis outcomes back to their original goals.
By mapping these categories onto the activities of the analysis methods, we can see where each method places its emphasis.

2.1 OCTAVE
The OCTAVE approach [OCTAVE04] describes a family of security risk evaluation methods that, unlike many other security analysis methods, are aimed at finding organizational risk factors and strategic risk issues by examining an organization's security practices [Alberts02]. The focus of OCTAVE is to enable an organization to consider all dimensions of security risk so that it can determine its strategic best practices, rather than to find specific security risks within specific systems. The OCTAVE approach thus needs to consider an organization's assets, threats, and vulnerabilities (as any security method would), but in addition it asks the stakeholders to explicitly consider and evaluate the organizational impact of security policies and practices. For this reason an organization's evaluation team must be multidisciplinary, consisting of both technical personnel and management. The OCTAVE process is organized into three phases that are carried out in a series of workshops. In Phase 1 (Build Asset-Based Threat Profiles) the team first determines the context and goals for the analysis by describing the information-related assets that

they want to protect. They do this via a set of structured interviews with senior management, operational management, IT staff, and general staff. The team then catalogues the current practices for protecting these assets. The OCTAVE approach then focuses the inquiry by selecting the most important of these assets as critical assets, which are the subject of the remainder of the analysis. For each critical asset the evaluation team identifies a set of threats to these assets. In Phase 2 (Identify Infrastructure Vulnerabilities) the analysis team performs the analysis, by first identifying a set of components that are related to the critical assets and then determining the resistance (or vulnerability) of each component to being compromised. They do this analysis by running tools that probe the identified components for known vulnerabilities. Finally, in Phase 3 (Develop Security Strategy and Plans) the team closes the loop. They examine the impact of the threats associated with each of the critical assets, based on the Phase 2 analysis, using a common evaluation basis (for example, a determination of "high", "medium", or "low" impact). Based on these evaluations the team determines a course of action for each: a risk mitigation plan. Instead of merely determining a tactical response to these risks, the goal of the OCTAVE analysis is to determine an organizational "protection strategy" for the critical assets. As an approach that is aimed at strategic organization-wide risk reduction, OCTAVE also includes activities to ensure that the organization monitors and improves its process. These risk reduction activities revolve around planning, in detail, how to implement the protection strategy; implementing the plan; monitoring the plans, as they are being implemented, to ensure that they are on schedule and that they are effective; and finally correcting any problems encountered. Thus the three phases of the OCTAVE approach can be seen as part of a larger picture, consisting of the activities shown in Figure 3.

Figure 3: The OCTAVE Life Cycle (the risk management activities Identify, Analyze, Plan, Implement, Monitor, and Control, mapped against OCTAVE activities)

The OCTAVE approach has been instantiated in two methods to date: OCTAVE and OCTAVE-S. The difference between the methods is that OCTAVE is aimed at large

organizations with large, complex information security requirements and infrastructures. OCTAVE-S, on the other hand, is aimed more at smaller organizations (or smaller sub-units of large organizations) with simpler information security needs.

2.2 SRMD
The Security Risk Management Discipline (SRMD) [SRMD04] provided by Microsoft Corporation combines ideas from Microsoft's solutions framework (a set of process guidelines for delivering effective software technology-centric solutions) and their operations framework, which guides organizations to make their systems more manageable, available, and supportable. The SRMD, as its name implies, is meant to assess and mitigate or manage security risks over a system's entire lifecycle. As such, the SRMD is meant to be proactive and continuous. It is meant to permeate all decision-making, rather than being a method that one enacts periodically. The SRMD, like the OCTAVE methods, is divided into three "primary processes". The first process, entitled Assessment, focuses first on identifying the assets that are of value to an organization, and assigning a specific value to each of those assets. This establishes the context and goals for the analysis. Next, security risks that might impact upon those assets are brainstormed and analyzed. This process involves identifying threats, vulnerabilities, and exploits and then considering any available countermeasures. During this process the impact of a potential threat must be quantified, as must the cost of any potential countermeasure against the risk. Given this basis of information, the security risks can be prioritized by their RE (risk exposure, although the SRMD doesn't explicitly call it that), and strategic decisions can be made on which risks will receive the most attention, and in what order. This phase therefore does the focusing of the inquiry as well as the analysis. In Phase 2 of the SRMD, Development and Implementation, the risks found in Phase 1 are addressed and, for each one, a remediation strategy is created, implemented, and tracked. Every remediation strategy needs to be tested, including being tested in a production environment, and the results of the tests are reported, to ensure institutional learning. This phase thus handles closing the loop. The third and final phase of the SRMD, Operation, recognizes that moving new processes and new assets into day-to-day operation requires effort and attention. This is another example of closing the loop. Creating new processes starts with a well-defined change management process, which includes not only moving the new assets into production, but also accompanying those new assets with new procedures as appropriate. These new and changed assets must be stabilized, and all personnel must become familiarized with them, to ensure successful transition and operation. The SRMD, like the OCTAVE approach, emphasizes that security must have its place in the software and system development life cycle, and so has described a "Security Framework Process Model". This model consists of six major processes and milestones:
- Initiation of the project definition (where a vision scope is approved)

- Security assessment and analyses (where a project plan is approved)

- Security remediation development (where the identified scope of the remediation is covered)
- Security remediation testing and resource functionality testing (where release readiness is approved)
- Security policies and countermeasure deployment (where the deployment is completed)
- Deployment complete (where preparations are made for the next iteration)

The SRMD also identifies a "Security Risk Management Discipline" which aids an organization in planning a strategy for minimizing the risks associated with security breaches. This involves guidance on how to assess risk probabilities and losses, how to analyze and prioritize risks, and how to plan, schedule, and report on security risks. For example, to determine a risk probability, the SRMD walks a user through a series of steps for determining 1) the probability of a threat, 2) the criticality of the asset, 3) the effort required to exploit the vulnerability, 4) the vulnerability factor, and 5) the asset priority. The first three steps allow one to determine the "threat level" and the final two steps allow one to determine the impact, or "loss factor". By multiplying these together we get an RE value. Similarly, the SRMD provides a set of steps and factors to consider in valuing assets.

2.3 FRAP
The Facilitated Risk Assessment Program (FRAP) is a qualitative process, developed by Thomas Peltier. In the FRAP, a system or a segment of a business process is examined by a team that includes both business/managerial and IT personnel. The FRAP guides them to brainstorm potential threats, vulnerabilities, and the potential damages to data integrity, confidentiality, and availability. Based on this brainstorming, the impacts to business operations are analyzed, and threats and risks are prioritized. The FRAP is purely qualitative, meaning that it makes no attempt to quantify risk probabilities and magnitudes. The FRAP consists of the following three phases:

Phase 1: The Pre-FRAP Meeting
In Phase 1, the review is scoped and the initial team and the mechanics of the review are agreed on. This establishes the context and goals of the inquiry. The outputs of Phase 1 are: a scope statement, an identification of the team members (typically between 7 and 15 people), a visual model of the security process being reviewed, and a set of definitions. These definitions serve as an anchor to the rest of the process. The FRAP recommends that the team agree on the following terms: integrity, confidentiality, availability, risk, control, impact, and vulnerability. Finally, the mechanics of the meeting need to be agreed upon in Phase 1: location, schedule, materials, etc.

Phase 2: The FRAP Session
Phase 2 is itself divided into three parts. These three parts serve as the focusing activity, as well as the front-end of the analysis activity. The first activity is establishing the logistics for the meeting: who will take what role (owner, team lead, scribe, facilitator,

team member). Once this is done the outputs from Phase 1 are reviewed (the definitions, scope statement, and so forth) to ensure that all team members are starting from a common basis of understanding. The second activity is brainstorming, where all of the team members contribute risks that are of concern to them. The third and final activity of Phase 2 is prioritization. Prioritization is ranked on two dimensions: vulnerability (low to high) and impact (low to high). When the risks are documented, the team also contributes suggested controls for at least the high priority risks.

Phase 3: The Post-FRAP Meeting(s)
Phase 3 may be a single meeting, or may be a series of meetings over many days. In this phase the bulk of the analysis work is done, as well as closing the loop. The outputs from the phase are: a cross reference sheet, an identification of existing controls, a set of recommendations on open risks and their identified controls, and a final report. The cross reference sheet is the most time-consuming of the activities. It shows all of the risks affected by each control, as well as any tradeoffs between controls that have been identified. The main contribution of the final report, in addition to documenting everything that has been learned in the FRAP, is an action plan describing the controls to implement.

2.4 Quantitative versus Qualitative Approaches
The security risk management methods that we have surveyed differ in whether they attempt to quantify security risks, or whether they attempt to qualitatively assess and prioritize risks. The SRMD and the OCTAVE process approach the problems of identifying, analyzing, planning for, and managing security risks quantitatively (in the OCTAVE approach, quantitative analysis of risks is an optional element of Phase 3). The FRAP and OCTAVE-S, on the other hand, are purely qualitative. What are the costs and benefits of each approach? Neither approach is obviously superior to the other. Quantitative methods tend to be more expensive and time-consuming to apply, and require more front-end work, but they can produce more precise results. Such methods tend to be associated with higher process maturity organizations. Qualitative methods, on the other hand, are less time-consuming (and therefore more agile to apply) and require less documentation. Also, it is often the case that specific loss estimates are not needed to make a determination to embark upon a risk mitigation activity. In section 4 we will show how a quantitative security risk assessment practice can be implemented that is not unduly time or resource consuming.
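As a rough illustration of what a lightweight quantitative computation in the SRMD style might look like, the sketch below multiplies a "threat level" (derived from threat probability, asset criticality, and exploit effort) by a "loss factor" (derived from the vulnerability factor and asset priority). The 1-10 scales and the simple products are assumptions made for illustration; Microsoft's published worksheets define their own scales and steps.

def threat_level(threat_probability, asset_criticality, exploit_effort):
    # Steps 1-3: higher probability and criticality, and lower required
    # exploit effort, mean a higher threat level (all inputs scored 1-10).
    return threat_probability * asset_criticality * (11 - exploit_effort)

def loss_factor(vulnerability_factor, asset_priority):
    # Steps 4-5: impact of a successful exploit (both inputs scored 1-10).
    return vulnerability_factor * asset_priority

def risk_exposure(threat_probability, asset_criticality, exploit_effort,
                  vulnerability_factor, asset_priority):
    # RE = threat level x loss factor, as described in the text.
    return (threat_level(threat_probability, asset_criticality, exploit_effort)
            * loss_factor(vulnerability_factor, asset_priority))

# Invented example: a likely threat (8) against a critical asset (9) that is
# easy to exploit (effort 2), with vulnerability factor 7 and asset priority 9.
print(risk_exposure(8, 9, 2, 7, 9))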

3. Management of Information Security Standards


Guidelines and standards exist for developing and acquiring (technical) IT security products, and for developing and acquiring operation-oriented and management-oriented IT security procedures. Along a second dimension, there are guidelines and standards that are associated with mechanisms for certification, those that are associated with laws that require due diligence, and those that are purely for "guidance." Finally, there are Capability Maturity Models (CMMs) that are most easily characterized as being at one

level higher than any of the aforementioned guidelines and standards. In this section, we introduce the most important IT security guidelines and standards in each of these categories. For each, we indicate its type, briefly describe its history and purpose, describe its major areas of concern, and indicate what it has to say about IT security risk management.

3.1 TCSEC, ITSEC, CTCPEC, Common Criteria, and ISO 15408
TCSEC, ITSEC, CTCPEC, the Common Criteria, and ISO 15408 [ISO15408] constitute a family of standards in the sense that the first three are ancestors of the final two, and ISO 15408 is the ISO standard based upon the Common Criteria. All deal with security-related COTS products. TCSEC stands for Trusted Computer System Evaluation Criteria, ITSEC for IT Security Evaluation and Certification Scheme, and CTCPEC for Canadian Trusted Computer Product Evaluation Criteria. The original TCSEC document, often referred to as the "orange book," was published in 1985 by the National Computer Security Council (NCSC), a branch of the National Security Agency (NSA). TCSEC, which deals with both security requirements and assurance (evaluation) requirements, has three objectives:
- to provide users with a yardstick with which to assess the degree of trust that can be placed in computer systems for the secure processing of classified or other sensitive information;
- to provide guidance to manufacturers as to what to build into their new, widely-available trusted commercial products in order to satisfy trust requirements for sensitive applications; and
- to provide a basis for specifying security requirements in acquisition specifications.

According to TCSEC's foreword:
"This publication is effective immediately and is mandatory for use by all DoD [United States Department of Defense] Components in carrying out Automatic Data Processing [ADP] system technical security evaluation activities applicable to the processing and storage of classified and other sensitive DoD information and applications as set forth herein."
The precursors to ITSEC, TCSEC's UK counterpart, began with work in two government agencies. In 1985, the Communications Electronics Security Group (CESG) created facilities for performing security evaluations of government computer systems. A few years later the Department of Trade and Industry (DTI) established the Commercial Computer Security Centre to evaluate security-related COTS products. The documents that resulted are known as "the Green Books". In December 1989, CESG and DTI issued a joint scheme, the UK IT Security Evaluation and Certification scheme, or, for short, the "UK ITSEC scheme". The scheme went into effect on May 1, 1991. According to ITSEC's mission statement:
"The objectives of the Scheme are to meet the needs of Industry and Government for cost effective and efficient security evaluation and certification of IT products

and systems. The Scheme also aims to provide a framework for the international mutual recognition of certificates."
Work on TCSEC, ITSEC, the Canadian Trusted Computer Product Evaluation Criteria (CTCPEC), developed by the Canadian Communications Security Establishment (CSE), and a number of European initiatives eventually led to the development of the Common Criteria for Information Technology Security Evaluation, usually simply referred to as the "Common Criteria," and often further abbreviated to "CC." The organizations that participated in the development of the Common Criteria, and that are involved in certifying evaluation laboratories, are AISEP (Australia and New Zealand), CSE (Canada), SCSSI (France), BSI (Germany), NLNCSA (Netherlands), CESG (UK), NIST (USA), and NSA (USA). An eminently readable discussion of CC's scope of applicability, and of how CC works, may be found in a report published on the web site of Canada's Communications Security Establishment (CSE) at http://www.cse-cst.gc.ca/en/documents/services/ccs/brochure.pdf. In June 1999, the Common Criteria became ISO 15408 [ISO15408]. A detailed view of the development of the Common Criteria and ISO 15408 may be found in Annex A of "Common Criteria for Information Technology Security Evaluation, Part 1: Introduction and general model, August 1999, Version 2.1, CCIMB-99-031."
CC/ISO 15408 works as follows. A security-related product to be evaluated is referred to as a Target of Evaluation (TOE). The inputs to a TOE's Common Criteria evaluation are:
- a Security Target (ST) package

- a set of producer-supplied evidence about the TOE, and
- the TOE (hardware and/or software) itself.

The ST consists of:
- a list of security threats with which the TOE is intended to deal
- the TOE's objectives in dealing with those threats
- the TOE's functional requirements, and
- a specification of the TOE's implemented security functions and assurance measures.

The following are CC's eleven functional requirements classes:
1. Class FAU: Security audit
2. Class FCO: Communication
3. Class FCS: Cryptographic support
4. Class FDP: User data protection
5. Class FIA: Identification and authentication
6. Class FMT: Security management
7. Class FPR: Privacy
8. Class FPT: Protection of the TSF
9. Class FRU: Resource utilisation
10. Class FTA: TOE access
11. Class FTP: Trusted path/channels

Each class is sub-categorized into a number of "families" and each family into a number of "components." Some classes consist of few families; for example, Class FCS: Cryptographic support consists of just:
1. Cryptographic key management (FCS_CKM)
2. Cryptographic operation (FCS_COP)
Other classes consist of considerably more families; for example, Class FDP: User data protection includes:
1. Access control policy (FDP_ACC)
2. Access control functions (FDP_ACF)
3. Data authentication (FDP_DAU)
4. Export to outside TSF control (FDP_ETC)
5. Information flow control policy (FDP_IFC)
6. Information flow control functions (FDP_IFF)
7. Import from outside TSF control (FDP_ITC)
8. Internal TOE transfer (FDP_ITT)
9. Residual information protection (FDP_RIP)
10. Rollback (FDP_ROL)
11. Stored data integrity (FDP_SDI)
12. Inter-TSF user data confidentiality transfer protection (FDP_UCT)
13. Inter-TSF user data integrity transfer protection (FDP_UIT)
The following are CC's seven assurance requirements classes:
1. Class ACM: Configuration management
2. Class ADO: Delivery and operation
3. Class ADV: Development
4. Class AGD: Guidance documents
5. Class ALC: Life cycle support
6. Class ATE: Tests
7. Class AVA: Vulnerability assessment
An important part of a TOE's ST is the Protection Profile [PP], which details requirements that the product purports to satisfy or that the potential consumer must have. The Common Criteria includes a set of "requirements of known validity" from which the preparer of the ST may choose in preparing the PP; consumers and/or developers may specify, in the PP, additional requirements that they deem necessary in a particular product or product category.
Common Criteria evaluation works as follows. In the United States, the National Information Assurance Partnership's (NIAP) Common Criteria Evaluation and Validation Scheme (CCEVS) Validation Body is jointly run by the National Institute of Standards and Technology (NIST) and the National Security Agency (NSA); according to the Common Criteria:
The Validation Body approves participation of security testing laboratories in the scheme in accordance with its established policies and procedures. During the course of an evaluation, the Validation Body provides technical guidance to those testing laboratories, validates the results of IT security evaluations for conformance to the Common Criteria, and serves as an interface to other nations for the recognition of such evaluations. IT security evaluations are conducted by commercial testing laboratories accredited by NIST's National Voluntary Laboratory Accreditation Program (NVLAP) and approved by the Validation Body. These approved testing laboratories are called Common Criteria Testing Laboratories (CCTL).

Similar arrangements are in effect in the other countries involved in the Common Criteria. The results of a product's positive CC evaluation are a confirmation that the TOE satisfies the ST, together with an indication of the Evaluation Assurance Level (EAL) at which the ST is satisfied; there are seven EALs, from EAL1 (functionally tested) and EAL2 (structurally tested) at the low end to EAL6 (semi-formally verified design and testing) and EAL7 (formally verified design and testing) at the high end.
As far as risk is concerned, the Common Criteria assumes that the organization contemplating the purchase and use of a security-related product will employ the results of the product's evaluation in performing the risk analysis required to determine if the product meets the organization's IT security-related requirements. While it naturally addresses threat- and risk-related issues in depth, the Common Criteria, being product-centered rather than management-centered or operationally centered, does not address the issue of how an organization goes about performing that risk analysis. The following quotations from Common Criteria for Information Technology Security Evaluation, Part 1: Introduction and general model, January 2004, make this point very clearly:
[Part 1, page 8] "The evaluation process establishes a level of confidence that the security functions of such products and systems and the assurance measures applied to them meet these requirements. The evaluation results may help consumers to determine whether the IT product or system is secure enough for their intended application and whether the security risks implicit in its use are tolerable."
[Page 19] "Consumers can use the results of evaluations to help decide whether an evaluated product or system fulfils their security needs. These security needs are typically identified as a result of both risk analysis and policy direction. Consumers can also use the evaluation results to compare different products or systems. Presentation of the assurance requirements within a hierarchy supports this need."
[Page 25] "The owners of the assets will analyze the possible threats to determine which ones apply to their environment. The results are known as risks. This analysis can aid in the selection of countermeasures to counter the risks and reduce them to an acceptable level."
Types of product that have received Common Criteria certification include: operating systems, database management systems, firewalls, switches and routers, certificate management software, Public Key Infrastructure (PKI)/Key Management Infrastructure software, etc. The three parts of the Common Criteria Standard are available at:
- Part 1 of the CC Standard version 2.2, January 2004: Introduction and general model, http://www.commoncriteriaportal.org/public/files/ccpart1v2.2.pdf
- Part 2 of the CC Standard version 2.2, January 2004: Security functional requirements, http://www.commoncriteriaportal.org/public/files/ccpart2v2.2.pdf
- Part 3 of the CC Standard version 2.2, January 2004: Security assurance requirements, http://www.commoncriteriaportal.org/public/files/ccpart3v2.2.pdf
The three parts of the latest (1999) version of ISO 15408 may be ordered from the International Organization for Standardization.

3.2 BS 7799, ISO 17799, and ISO TR 13335 (GMITS)
British Standard 7799 (BS 7799) and ISO 17799 form a family in the sense that the latter is the ISO standard version of the first of BS 7799's two parts; the first part is best characterized as guidelines, and the second as a standard against which certification is possible. ISO TR 13335 (GMITS) is included in this section because it provides additional guidance, especially in the area of risk management, for organizations seeking ISO 17799 certification. All three deal with operational (including, of course, site-related physical) and management aspects of IT security; as might be expected, they all deal with both the implementation and the ongoing operation of IT security activities. BS 7799 Part 1 is entitled "Information Technology: Code of Practice for Information Security Management," and BS 7799 Part 2 "Information Security Management Systems: Specification with Guidance for Use." ISO 17799 is entitled "Code of Practice for Information Security Management," and ISO TR 13335 "Guidelines for the Management of IT Security," or GMITS for short.
The earliest precursor of ISO 17799 was created by the UK Department of Trade and Industry's (DTI) Commercial Computer Security Centre (CCSC), the organization that developed ITSEC (see above). Its first incarnation was the "Users Code of Practice," published in 1989. Its second incarnation was British Standards' guidance document PD 0003, "A Code of Practice for Information Security Management," developed by the National Computing Centre (NCC) with the aid of representatives from industry. In 1995 PD 0003 evolved into British Standard BS7799:1995. A second part of the standard, BS7799-2:1998, was published in February 1998. In September 2002, BS7799-2:1998 was updated to BS7799-2:2002 for consistency with ISO 9001:2000, ISO 14001:1996, and policies of the Organization for Economic Cooperation and Development (OECD). As of this writing, BS7799 Part 2 has not become an ISO standard, and there appears to be no effort to move it in that direction. ISO 17799:2000 describes 127 security controls, each with numerous sub-sections, within the following ten domains:

1. Security Policy: to provide management direction and support for information security
2. Organizational Security: to manage information security within the organization
3. Asset Classification & Control: to maintain appropriate protection of organizational assets
4. Personnel Security: to reduce the risks of human error, theft, fraud or misuse of facilities
5. Physical Security: to prevent unauthorized access, damage and interference to business premises and information
6. Communication and Operation Management: to ensure the correct and secure operation of information processing facilities
7. Access Control: to control access to information

8. System Development and Maintenance: to ensure that security is built into information systems
9. Business Continuity Management: to counteract interruptions to business activities and to protect critical business processes from the effects of major failures or disasters
10. Compliance: to avoid breaches of any criminal and civil law, and statutory, regulatory or contractual obligations, and of any security requirements

While it is extremely thorough in its coverage, ISO/IEC 17799:2000 does not address the issues of evaluation or certification; i.e., it is, in the strict sense, a set of guidelines rather than a standard. BS 7799-2:2002 is, on the other hand, a standard in the strict sense. It specifies, in great detail, what is expected of an organization for the achievement of certification and what is expected of an assessor in the assessment of an organization for compliance. ISO 17799:2000 is intended to be used as a set of Code of Practice guidelines for organizations desirous of working toward BS 7799-2:2002 certification. It stresses risk assessment and risk management, but, as a set of guidelines, does not specify a particular approach.
BS 7799-2:2002 certification is based upon an organization's creation of a documented Information Security Management System (ISMS). The ISMS is based upon the continuous-improvement Plan, Do, Check, Act (PDCA) feedback-loop cycle invented by Walter Shewhart of Western Electric's Hawthorne Plant in the late 1930s and later popularized by W. Edwards Deming [She86]. The idea behind the cycle is to develop, implement, and continuously improve the organization's control and management of security. As can be seen from BS 7799-2:2002's high-level definition of PDCA, the management of risk drives the entire process:

PLAN
- Define the scope of the ISMS
- Define the ISMS policy
- Define the approach to risk assessment
- Identify the risks
- Assess the risks
- Identify and evaluate options for the treatment of the risks
- Select control objectives and controls
- Prepare a Statement of Applicability (SOA)

DO
- Formulate a Risk Treatment Plan
- Implement the Risk Treatment Plan
- Implement controls
- Implement training and awareness programs
- Manage operations
- Manage resources
- Implement procedures to detect and respond to security incidents

CHECK
- Execute monitoring procedures
- Undertake regular reviews of ISMS effectiveness
- Review the level of residual and acceptable risk
- Conduct internal ISMS audits
- Perform regular management reviews of the ISMS
- Record actions and events that impact on the ISMS

ACT
- Implement identified improvements
- Take corrective/preventive action
- Apply lessons learned
- Communicate results to interested parties
- Ensure that improvements achieve objectives

ISO TR 13335-3:1998, Techniques for the Management of IT Security, and ISO TR 13335-4:2000, Selection of Safeguards, go into detail on the topics of risk assessment and risk management respectively. The British Standards Institution's (BSI) PD 3002, "Guide to BS 7799 risk assessment," and PD 3005, "Guide on the selection of BS 7799-2 controls," detail how GMITS Part 3 and GMITS Part 4 may be applied, respectively, to the risk assessment and risk management aspects of ISO/IEC 17799 and BS 7799 Part 2.
http://www.bsi-global.com/ICT/Security/pd3002.xalter
http://www.bsi-global.com/ICT/Security/pd3005.xalter
Assessment for BS7799-2:2002 certification is done by an assessor working for a certification body. A list of certification bodies may be found at the web site of the International ISMS User Group (http://www.xisec.com/), as may a list of all certified organizations. To be eligible to perform BS7799-2:2002 assessments, an organization must be accredited as a certification body by a national accreditation body. Different national accreditation bodies maintain reciprocal recognition agreements. Identities of and contact information for national accreditation bodies in Europe may be found at the European (co-operation for) Accreditation (EA) web site at http://www.european-accreditation.org/, as can non-European accreditation bodies with which EA has contracts of cooperation; EA-7/03 (rev. 00, February 2000), "Guidelines for the Accreditation of Bodies Operating Certification/Registration of Information Security Management Systems," may also be found at http://www.european-accreditation.org/.
The ISMS International Users Group lists the following as accredited certification bodies:
- BM TRADA Certification Limited
- BSI
- BVQI (Bureau Veritas Quality International)
- Certification Europe
- CIS (Austria)
- DNV (Det Norske Veritas)
- DQS GmbH (Germany)
- JACO-IS (Japanese Audit and Certification Organisation)
- JQA (Japanese Quality Assurance)
- KEMA (Netherlands)
- KPMG Audit plc
- KPMG SA
- LRQA
- National Quality Assurance
- Nemko (Norway)
- PSB Certification (Singapore)
- RINA S.p.A. (Italy)
- SAI Global Limited (Australia)
- SFS Certification (Finland)
- SGS ICS Limited
- SQS (Swiss Quality System)
- STQC Certification Services (India)
- Teknologisk institutt Sertifisering (Norway)
- TÜV Rheinland Group (Germany)
- UIMCert (Germany)
- United Registrar of Systems Limited


A BS7799-2:2002 certification comes with a "scope," which specifies the part of the organization that is, in fact, certified: either the entire organization or one or more of its parts or activities.

3.3 HIPAA
The US Health Insurance Portability and Accountability Act of 1996 (HIPAA) is not a standard against which an organization is certified but, rather, a statute with which relevant organizations are required to comply, and which dictates government audit, with possibly severe consequences, in case of complaints of violations. HIPAA's purpose, as stated in the act itself, is:
"… to improve portability and continuity of health insurance coverage in the group and individual markets, to combat waste, fraud, and abuse in health insurance and health care delivery, to promote the use of medical savings accounts, to improve access to long-term care services and coverage, to simplify the administration of health insurance, and for other purposes."
The desired results are intended to come from improved utilization of IT, which accounts for the act's inclusion of a privacy standard/rule and a security standard/rule. The act applies to Protected Health Information (PHI), meaning anything to do with a patient or patients, that is electronically stored and electronically transmitted by a "covered entity," i.e., a health plan, a health care provider, or a health care clearinghouse. The term "health plan" includes health insurers, health benefit plans, HMOs, other managed care organizations, etc. The term "health care clearinghouse" includes billing services, health information providers, etc. The final version of the Security Rule was enacted in February 2003; large organizations will be required to comply by April 2005, and small ones by April 2006.
The HIPAA Security Rule is broken down into three areas: Administrative (management and operational) Safeguards, Physical (operational) Safeguards, and Technical Safeguards. The three safeguards are further broken down as follows:

Administrative Safeguards
- Security Management Process
- Security Responsibility
- Workforce Security
- Information Access Management
- Security Awareness and Training
- Security Incident Procedures
- Contingency Plan
- Evaluation
- Business Associate Contracts and Other Arrangements

Physical Safeguards
- Facility Access Controls
- Workstation Use
- Workstation Security
- Device and Media Controls

Technical Safeguards
- Access Controls
- Audit Controls
- Integrity
- Person or Entity Authentication
- Transmission Security

HIPAA's Security Rule specifies identification of the relevant IT systems, i.e., scoping of the compliance effort (a necessary precursor to risk identification), followed by risk assessment and risk management planning. Because HIPAA requires that a covered entity's implementation of the Security Rule be "comprehensive and coordinated," "scalable," and "technology neutral" (i.e., updated regularly as technology changes), the act's provisions are general rather than specific. In May 2004 NIST issued its Special Publication SP 800-66, An Introductory Resource Guide for Implementing the Health Insurance Portability and Accountability Act (HIPAA) Security Rule (csrc.nist.gov/publications/drafts/DRAFT-sp800-66.pdf), to aid the large number of organizations covered by HIPAA.
As indicated above, there is no notion of HIPAA certification or accreditation. Rather, the US Department of Health and Human Services (DHHS) has assigned its Office of Civil Rights (OCR) the responsibility for enforcing HIPAA by responding to complaints of violations. An overview of HIPAA may be found at http://privacy.med.miami.edu/glossary/xd_hipaa.htm

3.4 SSE-CMM and ISO/IEC 21827
Up to this point we have discussed standards and guidelines for IT security-related products and for the management of IT security programs. In general, an IT security standard or guideline can relate to the technical side of IT security, to the management side, or to both. A guideline/standard can simply be informational, it can specify a mechanism for certification, or it can specify due diligence and provide for complaint-driven audit, with penalties for non-compliance. A Capability Maturity Model (CMM) fits into none of these categories. Rather, a CMM deals with the process whereby an activity is performed. Some activities for which CMMs exist are software engineering [SW-CMM95] and systems engineering [SE-CMM95]. These apply to the development of arbitrary (not necessarily IT security-related) software applications or systems, in the first case, and of hardware/software systems in the second. A CMM is used to certify the process that an organization uses to produce its products rather than to certify the results of the application of that process, i.e., the products produced. The philosophy behind this notion is that if the process used by an organization in the production of products is sufficiently "mature," then it is safe to assume that:
- the organization's existing products are of sufficiently high quality (in the case of software, for example, that the product meets functional requirements/specifications to a sufficient degree, and that it has a sufficiently high level of performance, reliability, availability, maintainability, etc.);
- and that a new product, yet to be developed, will be of sufficiently high quality and will, additionally, be completed sufficiently close to schedule and sufficiently close to the projected budget.

A further aspect of the CMM notion is that an organization's maturity can be assigned a maturity level, typically on a scale of one to five, which, in some sense, qualifies the notion of "sufficient," with level 5 representing a very high level of sufficiency, level 1 representing a low level, and levels 2-4 being in between.

According to the SSE-CMM Model Description Document, Version 3.0 [with editorial comments in square brackets], the Systems Security Engineering CMM (SSE-CMM) is extremely flexible, so flexible, in fact, that it applies to:
- Product Developers: "The SSE-CMM includes practices that focus on gaining an understanding of the customer's security needs. Interaction with the customer is required to ascertain them. In the case of a [non-custom, non-contracted] product, the customer is generic as the product is developed a priori, independent of a specific customer. When this is the case, the product marketing group or another group can be used as the hypothetical customer, if one is required." In this case, the product is a hardware/software/other physical product to be used for IT security.
- Countermeasure Developers: "… The model contains practices to address determining and analyzing security vulnerabilities, assessing operational impacts, and providing input and guidance to other groups involved (such as a software group). The group that provides the service of developing countermeasures needs to understand the relationships between these practices." In this case, the product is the countermeasures themselves.
- Security Service Providers: "To measure the process capability of an organization that performs risk assessments, several groups of practices come into play. During system development or integration, one would need to assess the organization with regard to its ability to determine and analyze security vulnerabilities and assess the operational impacts. In the operational case, one would need to assess the organization with regard to its ability to monitor the security posture of the system, identify and analyze security vulnerabilities, and assess the operational impacts." [This means assessing the entire IT security system: technical, operational, and management. Note that the provider can be an outside organization/consultant/integrator or an (IT security implementation and/or management) group within the very organization that will be using the system.]

The references to risk management concepts in the quoted paragraphs (determining and analyzing security vulnerabilities, assessing operational impacts, performing risk assessments, monitoring security posture) make it clear how critical risk management is to the SSE-CMM. According to the same document, the SSE-CMM's history is as follows:
"The SSE-CMM initiative began as an NSA-sponsored effort in April 1993 with … investigation of the need for a specialized CMM to address security engineering. During this Conceive Phase, a straw-man Security Engineering CMM was developed to seed the effort. The information security community was invited to participate in the effort at the First Public Security Engineering CMM Workshop in January 1995. Representatives from over 60 organizations reaffirmed the need for such a model. As a result of the community's interest, Project Working Groups were formed at the workshop, initiating the Develop Phase of the effort. … Development of the model and appraisal method was accomplished through the work of the SSE-CMM Steering, Author, and Application Working Groups, with the first version of the model published in October 1996 and of the appraisal method in April 1997. To validate the model and appraisal method, pilots occurred from June 1996 through June 1997 … The pilots addressed various organizational aspects that contributed to the validation of the model … In July 1997, the Second Public Systems Security Engineering CMM Workshop was conducted … The workshop proceedings are available on the SSE-CMM web site. … the International Systems Security Engineering Association (ISSEA) was formed to continue the development and promotion of the SSE-CMM … ISSEA continues to maintain the model and its associated materials as well as other activities related to systems security engineering and security in general."

ISSEA has become active in the International Organization for Standardization and sponsored the SSE-CMM as an international standard, ISO/IEC 21827 [ISO21827].
The "base practices" for which the SSE-CMM assesses security engineering maturity are divided into two groups. The groups, along with their "process areas," are as follows:

SECURITY BASE PRACTICES
PA01 - Administer Security Controls
PA02 - Assess Impact
PA03 - Assess Security Risk
PA04 - Assess Threat
PA05 - Assess Vulnerability
PA06 - Build Assurance Argument
PA07 - Coordinate Security
PA08 - Monitor Security Posture
PA09 - Provide Security Input
PA10 - Specify Security Needs
PA11 - Verify and Validate Security

PROJECT AND ORGANIZATIONAL BASE PRACTICES
General Security Considerations
PA12 - Ensure Quality
PA13 - Manage Configurations
PA14 - Manage Project Risk
PA15 - Monitor and Control Technical Effort
PA16 - Plan Technical Effort
PA17 - Define Organization's Systems Engineering Process
PA18 - Improve Organization's Systems Engineering Processes
PA19 - Manage Product Line Evolution
PA20 - Manage Systems Engineering Support Environment
PA21 - Provide Ongoing Skills and Knowledge
PA22 - Coordinate with Suppliers


The maturity levels are:
Capability Level 1 - Performed Informally: Base practices of the process area are generally performed. The performance of these base practices may not be rigorously planned and tracked. Performance depends on individual knowledge and effort. Work products of the process area testify to their performance. Individuals within the organization recognize that an action should be performed, and there is general agreement that this action is performed as and when required. There are identifiable work products for the process.
Capability Level 2 - Planned and Tracked: Performance of the base practices in the process area is planned and tracked. Performance according to specified procedures is verified. Work products conform to specified standards and requirements. Measurement is used to track process area performance, thus enabling the organization to manage its activities based on actual performance. The primary distinction from Level 1, Performed Informally, is that the performance of the process is planned and managed.
Capability Level 3 - Well Defined: Base practices are performed according to a well-defined process using approved, tailored versions of standard, documented processes. The primary distinction from Level 2, Planned and Tracked, is that the process is planned and managed using an organization-wide standard process.
Capability Level 4 - Quantitatively Controlled: Detailed measures of performance are collected and analyzed. This leads to a quantitative understanding of process capability and an improved ability to predict performance. Performance is objectively managed, and the quality of work products is quantitatively known. The primary distinction from the Well Defined level is that the defined process is quantitatively understood and controlled.
Capability Level 5 - Continuously Improving: Quantitative performance goals (targets) for process effectiveness and efficiency are established, based on the business goals of the organization. Continuous process improvement against these goals is enabled by quantitative feedback from performing the defined processes and from piloting innovative ideas and technologies. The primary distinction from the Quantitatively Controlled level is that the defined process and the standard process undergo continuous refinement and improvement, based on a quantitative understanding of the impact of changes to these processes.
Considering the definitions of levels 4 and 5, it should be no surprise that the first CMM, the Software Engineering CMM, was developed under strong influence from Shewhart and Deming's ideas on statistical process control [She86]. The SSE-CMM provides documentation of both the basic model and the appraisal method. The SSE-CMM Model Description Document Version 3.0 is available at http://www.sse-cmm.org/docs/ssecmmv3final.pdf.


The SSE-CMM Appraisal Method Version 2.0 is available at http://www.sse-cmm.org/docs/SSAM.pdf. According to SSE-CMM Appraisal Method Version 2.0, any organization wishing to evaluate the capability of another organization to perform systems security engineering activities should consider using the SSAM [System Security Appraisal Method]: "The SSAM can be used to evaluate the processes of product developers, service providers, system integrators, system administrators, and security specialists to obtain a baseline or benchmark of actual practices against the standards detailed in the SSE-CMM [Model Description Document]."
The International Systems Security Engineering Association (ISSEA: http://www.sse-cmm.org/issea/issea.asp) "is a non-profit membership organization dedicated to the advancement of Systems Security Engineering as a defined and measurable discipline. Established in 1999, ISSEA and its members are tasked with the maintenance of the SSE-CMM." According to the ISSEA web site, an appraiser certification program is currently being developed.

3.5 NIST Guidance Documents
As can be seen from the discussions in sections 2.1-2.4, the US National Institute of Standards and Technology has taken a significant role in developing IT security guidelines and standards, with a special emphasis on IT security risk management. The following is a list of NIST IT security guidelines, very few of which are US-oriented rather than of general applicability, and all of which may be obtained through NIST's web site, http://csrc.nist.gov/publications/nistpubs/:
NIST Special Publication (SP) 800-12, An Introduction to Computer Security: The NIST Computer Security Handbook, February 7, 1996.
NIST SP 800-30, Risk Management Guide for Information Technology Systems, January 2002.
NIST SP 800-35, Guide to Information Technology Security Services.
NIST SP 800-16, Information Technology Security Training Requirements: A Role- and Performance-Based Model, April 1998.
NIST SP 800-18, Guide for Developing Security Plans for Information Technology Systems, December 1998.
NIST SP 800-31, Intrusion Detection Systems, November 2001.
NIST SP 800-32, Introduction to Public Key Technology and the Federal PKI Infrastructure, February 2001.
NIST SP 800-33, Underlying Technical Models for Information Technology Security, December 2001.


NIST SP 800-34, Contingency Planning Guide for Information Technology Systems, June 2002.
NIST SP 800-36, Guide to Selecting Information Technology Security Products, October 2003.
NIST SP 800-42, Guideline on Network Security Testing, October 2003.
NIST SP 800-48, Wireless Network Security: 802.11, Bluetooth, and Handheld Devices, November 2002.
NIST SP 800-50, Building an Information Technology Security Awareness and Training Program, October 2003.
NIST SP 800-55, Security Metrics Guide for Information Technology Systems, July 2003.
NIST SP 800-64, Security Considerations in the Information System Development Life Cycle, October 2003.

4. Risk Models
In section 1 the fundamental techniques of risk assessment and risk management were introduced. In this section we discuss how risk assessment applies to risk management decision making. Our focus will be on "strategic" methods, i.e., methods that plan from the outset to achieve particular goals regardless of the circumstances, rather than "tactical" methods that attempt to make the best response in a given circumstance. To do this we need to take some of the terms that we defined informally in section 1 and give them more precise definitions.

4.1 Definitions
A risk, from the security perspective, is characterized by the probability that some threat will successfully exploit a vulnerability in a system, together with the magnitude of the resulting loss. The higher the probability that a threat will succeed, and the greater the magnitude of the potential loss, the greater the risk for the organization.
A threat, then, is a stimulus for a risk; it is any danger to the system, an undesirable event. A threat, however, might not be a person. A "threat agent" is a person who initiates or instigates a threat. Typically a threat will take advantage of some system vulnerability. A vulnerability is a condition where the system is missing, or is improperly applying, a safeguard or control. The vulnerability thus increases the likelihood of the threat or its impact, or possibly both. It is important to note that the "system" includes people, not just hardware, software, and networks.
The net effect of a vulnerability being exploited by a threat agent is to expose the organization's assets to a potential loss.


This loss might be in the form of a loss or disclosure of information, a corruption of the organization's information (a loss of integrity), or a denial of service.
In the language of risk, we need to consider the magnitude of all risk exposures to which an organization is susceptible. This leads to the basic risk formula, which attempts to quantify risk in terms of risk exposure (RE) [Boehm91]:

RE = probability(loss) × magnitude(loss)

This is frequently written as RE = P(L) × S(L). For example, if organization A calculates the probability of a key server being down for 2 hours as 10% due to denial of service attacks, and the loss due to this down-time to be $20,000, then the risk exposure facing A due to this loss is $2,000.
Frequently this formula is presented as a summation over all such risks to which a system is exposed:

Total RE = Σ P(Li) × S(Li)

where Li is the loss due to the i-th risk. For example, if organization A has two other risks that it is facing, one with a probability of 50% and a loss of $1,000 and another with a probability of 0.1% and a loss of $1,000,000, then A's Total RE is:

0.1 × $20,000 + 0.5 × $1,000 + 0.001 × $1,000,000 = $2,000 + $500 + $1,000 = $3,500

Related to the notion of RE is risk reduction leverage (RRL). RRL is a way of gauging the effectiveness or desirability of a risk reduction technique. The formula for RRL is:

RRL = (REbefore - REafter) / RRCost

where RRCost stands for the risk reduction cost. A similar formula can be used to compare the relative effectiveness of technique A with respect to technique B:

RRRL(A,B) = (RE_B - RE_A) / (RRCost_A - RRCost_B)

where RE_A and RE_B are the risk exposures after using A and B respectively (we assume that the RE before applying the techniques is the same for both). We see that when RRRL(A,B) > 0, technique A is more cost-effective than technique B.
Let us consider an example. Say an organization is considering ways of lowering its defect risk on a safety-critical system, and has identified a structured walkthrough and an IV&V (independent validation and verification) activity as two possible ways of finding defects and hence reducing the system risk. It can proceed as follows. First it must establish a cost for each risk reduction technique. Next it must evaluate its current RE (which is its REbefore) and the RE that will result from the application of the technique (the REafter).

Let's assume that the organization is interested in the risk involved with the safety-critical system failing. Such a failure would result in a loss to the company of $10,000,000, and currently the company believes that there is a 5% likelihood of such an occurrence. Structured walkthroughs have, in the past, been shown to find 80% of the outstanding problems, reducing the probability of a loss to 1%. IV&V is even more effective at finding problems, and it is expected that this technique will reduce the probability of a loss to 0.01%. However, the structured walkthrough is relatively inexpensive, costing just $5,000 (the time of the employees involved). The IV&V, since it is independent, involves hiring a consultant, which will cost $100,000. The RRL for each technique can now be calculated as follows:

RRL_walkthrough = (0.05 × $10,000,000 - 0.01 × $10,000,000) / $5,000 = ($500,000 - $100,000) / $5,000 = 80

RRL_IV&V = (0.05 × $10,000,000 - 0.0001 × $10,000,000) / $100,000 = ($500,000 - $1,000) / $100,000 = 4.99

Clearly in this case the organization will want to do the walkthrough first, as its RRL is far greater than that of the IV&V activity. However, this could also be seen directly using:

RRRL(walkthrough, IV&V) = (0.0001 × $10,000,000 - 0.01 × $10,000,000) / ($5,000 - $100,000) = ($1,000 - $100,000) / (-$95,000) = (-$99,000) / (-$95,000) = 1.042 > 0
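As a worked check of these formulas, the following short Python sketch computes RE, RRL, and RRRL for the walkthrough versus IV&V comparison above. The helper names and code structure are ours (illustrative only); the figures are the ones used in the example.

# Minimal sketch of the RE / RRL / RRRL calculations above (names are illustrative).

def risk_exposure(p_loss, s_loss):
    """RE = P(L) * S(L)."""
    return p_loss * s_loss

def rrl(re_before, re_after, rr_cost):
    """Risk reduction leverage: (RE_before - RE_after) / RRCost."""
    return (re_before - re_after) / rr_cost

def rrrl(re_a, re_b, cost_a, cost_b):
    """Relative RRL of technique A vs. B; > 0 means A is more cost-effective."""
    return (re_b - re_a) / (cost_a - cost_b)

S = 10_000_000                                # loss if the safety-critical system fails
re_before = risk_exposure(0.05, S)            # 5% likelihood with no action
re_walk   = risk_exposure(0.01, S)            # after structured walkthrough
re_ivv    = risk_exposure(0.0001, S)          # after IV&V

print(rrl(re_before, re_walk, 5_000))         # 80.0
print(rrl(re_before, re_ivv, 100_000))        # 4.99
print(rrrl(re_walk, re_ivv, 5_000, 100_000))  # ~1.042 > 0, walkthrough wins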

It is often the case that a technique reduces only the likelihood of a risk and not its magnitude. In this case, the RRL reduces to the cost-benefit (CB):

CB = [Pbefore(L) - Pafter(L)] × S(L) / RRCost = ΔP(L) × S(L) / RRCost

Finally, another thing we can do with RE is to develop a risk profile (or RE profile) with respect to some measure of interest. For example, one can evaluate RE as a function of a monotonically increasing quantity such as elapsed time, cumulative effort, or cumulative cost. An example risk profile is given in Figure 4.


Figure 4: An Idealized Risk Reduction Profile

Given this basis of understanding, we are now in a position to begin looking at specific techniques for strategic risk management.

4.2 Strategic Risk Models
Every IT system in operation will have some degree of security risk [Boehm91]. Recall that security risks are possible situations or events that can cause a system harm and will incur some form of loss. Security risks range in impact from trivial to fatal and in likelihood from certain to improbable. Thus far we have only discussed risks that are "identified," in that they arise from anticipated threats and known vulnerabilities; however, there are also unidentified risks where this is not the case. Similarly, the impact of an identified risk is either known, where the expected loss-potential has been assessed, or unknown, where the loss-potential has not been or cannot be assessed. Risks that are unidentified or have unknown impacts are sometimes loosely labeled as risks due to uncertainty. In the case of IT system security, risk considerations often must focus on uncertainty since, by design, identified-known risks are either addressed or accepted as within a "tolerable" level. Managing risks due to uncertainty is essentially the focus of sound risk management.
Finally, note that a risk model describes risks and their impacts for a particular system. We consider risk profile models because risks are generally not static. Likelihoods and impacts change with a number of factors such as time, cost, system state, and so forth. As a consequence, it is often desirable to consider risks with respect to a planned set of events such as assessment effort, system operation time, development investment, etc. Models representing risks that dynamically change over planned activities are called strategic risk models. We would like to utilize these models to promote effective risk management.
As introduced previously, we will make use of risk profiling. Recall that RE is computed as the product of the probability of loss and the size of that loss, summed over all sources for a particular risk.
Since security risk considerations greatly affect a system's operational value, it is important that these risks be investigated candidly and completely. Expressing system development and operation considerations in terms of risk profiles enables quantitative assessment of attributes that are typically specified only qualitatively.
A useful property of RE is that if it is computed entirely within a particular system (i.e. no external loss sources), we may assume all RE sources are additive. This will be true regardless of any complex dependencies and is analogous to mathematical expectation calculations within classic probability theory. The additivity of RE can be exploited to analyze strategies for managing risk profiles for IT system security. Such analyses enable cost, schedule, and risk trade-off considerations that help identify effective risk management strategies. In particular, this approach can help answer difficult questions such as "what particular methods should be used to assess these vulnerabilities," "which method should be used first," and "how much is enough {risk assessment, risk mitigation, or risk control}." As in previous sections, our focus will be on risk assessment, noting that the methods presented often have analogous counterparts in risk control.

4.3 Strategic Risk Management Methods
The purpose of risk modeling is to aid in risk management decision making. Management does not necessarily mean removing risk; this is not always possible or even economically feasible. For any risk, there is usually only a limited degree to which that risk can be controlled or mitigated (i.e. its expected loss reduced). As indicated earlier, there are risks due to uncertainty that cannot be mitigated or even assessed. As such, risk management is the collection of activities used to address the identification, assessment, mitigation, avoidance, control, and continual reduction of risks within what is actually feasible under particular conditions and constraints. The goal of risk management is thus one of "enlightened gambling," where we seek an expected outcome that is positive regardless of the circumstances. Assessment is the key starting point and, as stated earlier, the focus of this chapter. This includes gaining insight into the following:
- what the risks are, and where there is risk due to uncertainty
- differentiating between development risks and operation risks
- avoidable versus unavoidable risks
- controllable versus uncontrollable risks
- the costs and benefits of risk mitigation, avoidance, and control

Assessment enables a strategy to be chosen for the mitigation and control of risk. However, it is not obvious from the outset that any given assessment strategy will be effective (or even feasible). In fact, a poorly chosen strategy may actually increase overall risk. A strategic risk management method is one that produces a risk management strategy that reduces overall risk with respect to a particular goal (e.g. most risk reduction at lowest cost). Having a particular goal here is firmly predicated on having acceptable, well-defined strategic risk models as described earlier.


We will now describe models and methods for the strategic risk management of IT system security. Such models and methods will aid in making important "pre-crisis" risk management decisions by determining how much effort (or time) should be invested in assessing security risks with respect to project risk factors such as cost, schedule, launch (or operation) window, available skills and technology, uncontrollable external events, and so forth. In this way we lay the foundation for a practical, economically feasible, empirically based approach for the strategic planning of risk management efforts.

4.4 The Need for Strategic Risk Management Methods
To illustrate the need for strategic risk management we consider a general IT security risk assessment. The risk exposure corresponding to the cumulative potential loss from security violations in the operational system (e.g. system intrusions) will be called REsecurity. In the case of security risk assessment, the more assessment that is done, the lower the REsecurity that results from unforeseen or uncontrolled vulnerabilities (i.e. uncertainty) such as those listed in Table 1. Assessed security attributes reduce both the size of loss due to "surprises" and the probability that surprises still remain.
Prior to embarking on a security risk assessment, the system will likely contain many potential vulnerabilities, either known or arising from uncertainties. This results in an initially high (relative to the project) probability of loss, P(L). Of these vulnerabilities, some may be critical, and so S(L), the size of the loss, will be high. Thus, without any assessment, REsecurity will initially be high. When assessment has been employed, the likelihood of unidentified vulnerabilities will be reduced. If assessment is done thoroughly (and identified vulnerabilities are addressed), most of the vulnerabilities remaining are likely to be minor, and REsecurity will be low.
It is generally not feasible to be totally exhaustive when performing a system assessment. As a result, the ideal assessment risk reduction profile (as illustrated in Figure 4) is one where REsecurity decreases as rapidly as possible at the beginning. This profile is ideal because it provides the maximum risk reduction for any given amount of security assessment effort. As stated previously, this profile is not a given for any strategy.

Figure 5: Example REsecurity Profiles (RE versus assessment effort for three orderings: least effort first, arbitrary, and highest cost-benefit first)

To give a concrete example of how a non-ideal risk reduction profile may occur, consider the data in Figure 5, taken from a space systems ground control project. The data was generated by assessing each relevant security attribute in Table 1 for: S(L), in terms of the percentage of the project value that would be lost as a result of the exploitation of a vulnerability in this attribute; and ΔP(L), the corresponding change in probability (as a percentage) of an exploitation occurring. These in turn are used to calculate the corresponding RE reductions if the attribute is fully assessed. The RE has been normalized to the fractional portion of the total known RE that can be reduced through assessment. The cost is the effort in hours used to perform the assessment of the attribute, and CB is the cost-benefit ratio (the more specific form of RRL mentioned earlier). The attributes were assessed using an extensive security review checklist.

Table 1: Example Security Attributes
A1: Denial of Service; A2: System Crash; A3: Message Queue Overflow; A4: System Fault; A5: Misled Operator; A6: Unauthorized Access; A7: Unauthorized Administrator; A8: File Deletions; A9: Access to Private Data; A10: Hardware Failure; A11: Access to Code; A12: Resource Utilization; A13: Requirement Consistency and Completeness; A14: Understandability

We see that some care must be taken in choosing the order (i.e. strategy) in which to perform the assessments in order to achieve the ideal risk reduction profile indicated in Figure 4. Figure 5 compares three strategies for the order in which the assessments might be performed. Each tick mark on each RE profile corresponds to the assessment of a particular attribute. Note that if the attributes are assessed in an arbitrary order, the curve will typically look like the approximately linear curve in the middle of Figure 5, which clearly does not achieve the ideal of Figure 4. Another strategy is simply to do the least-effort assessments first. This generally results in the even less desirable 'supra-linear' REsecurity reduction profile shown in the top-most curve of Figure 5. Performing the assessments in the order of highest cost-benefit (CB) achieves the desired REsecurity reduction profile, and it can be shown that, under fairly general circumstances, this will always be true.
While we have illustrated the strategic method with only risk assessment, analogous methods exist for risk control. The important point here is that without a strategic approach, you are likely to end up with a less than ideal REsecurity reduction profile, as indicated in Figure 5. The consequences of this are not fictitious and are significant, because frequently all assessment tasks are not (or cannot be) performed. This may happen when there are arbitrarily defined budgets, or when people "feel" like enough risk has been reduced, or, more commonly, when higher priority is given to other tasks over risk management.
The result is a system with a high degree of risk due to uncertainty, with the usual consequences of "blind" risk taking [Basili01, Tran97].

5. Practical Strategic Risk Models
In this section we look at how strategic methods can be applied in practice. This includes using strategic models to help make complex planning decisions such as "how much is enough risk assessment?" and extending the strategic method to account for multiple techniques.

5.1 Multi-technique Strategic Methods
To begin with, note that in our particular example the differences between the risk-reduction strategies do not appear very pronounced. One reason for this is purely an artifact of the normalization of the RE scale. If absolute RE is used (that is, the actual risk removed rather than the relative risk reduction), the difference becomes more pronounced. There is a small concern that the absolute risk reduction profile may not be consistent with the relative risk reduction profile; however, under general circumstances it can be shown that an optimal relative risk-reduction profile implies an optimal absolute risk-reduction profile.
In our example from Figure 5, the differences in RE (and, to a lesser extent, the effort) between the attributes are relatively small when using a single assessment technique (in this example, vulnerability checklists). Allowing multiple assessment techniques can make a profound difference. Consider, in our example, if the following different assessment techniques were employed:
AT1: Analysis using formal model
AT2: API test
AT3: Model checking
AT4: Code review
AT5: Lessons learned
AT6: Test suites
AT7: Vulnerability checklist
AT8: Static analysis of code
AT9: Estimation
AT10: Interview vendor
AT11: Investigation of past data
AT12: Test on emulator
AT13: Benchmark test
AT14: Attack simulation
Figure 6: Absolute RE for Multiple Assessment Techniques on the Example System (RE versus cumulative cost, comparing the maximum cost-benefit ordering with an arbitrary ordering)


A comparison of the maximum cost-benefit versus arbitrary-ordered assessment strategies using multiple assessment techniques is shown in Figure 6. The significant difference between the strategies here is clear: the same amount of risk can be reduced for one third of the effort. This kind of difference can be critical to a successful risk management effort.
The practical application of a strategic method assumes we are able to generate realistic strategic risk models. From the above discussion it is clear that we want to apply the most appropriate risk management approach for each risk. Allowing for multiple assessment techniques greatly increased the risk-reduction cost-benefit. How to do this in general is not entirely without complications. For example, in generating a multi-attribute strategy we must account for the possibility that, for some attributes, a particular assessment technique may not apply (e.g. API test for Misled Operator) or may not be cost-effective (e.g. Model Checking for Unauthorized Administrator).
We now present an algorithm that generates a practical, cost-effective strategic method (maximum risk reduction with respect to costs) with multiple assessment techniques for each attribute:

Step 1: Identify the most significant system assessment attributes. Label them 1, ..., n.
Step 2: Identify the most significant assessment techniques (e.g. product testimonials, prototyping, etc.) applicable to the project and the available resources (e.g. staff skills, tools). Label them 1, ..., m.
Step 3: Estimate the relative Si(L) quantities for attributes i = 1, ..., n before any assessment.
Step 4: Estimate the effort Cij, the size Sij(L), and the change in risk exposure ΔREij(L) = Si(L) × Pi(L) - Sij(L) × Pij(L) resulting from assessing attribute i using technique j. Henceforth we associate ij with the pair (attribute i, technique j).
Step 5: Calculate the RRL matrix, RRLij = ΔREij(L) / Cij. Let T(k) = (rk, ck) be the (attribute rk, technique ck) index of the k-th largest element in the matrix. For each k, remove the remaining entries for attribute rk from the matrix, then define T(k+1), continuing until all n attributes are covered. Set C_T(k) to be the corresponding Cij and ΔRE_T(k) to be the corresponding ΔREij(L).
Step 6: Graph the cumulative RE drop, RE(n) = REtotal - Σ(k=1..n) ΔRE_T(k), versus cumulative effort, C(n) = Σ(k=1..n) C_T(k).
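A minimal sketch of Steps 4-6 in Python follows. The data structures and function name are ours and purely illustrative: given per-pair effort and RE-reduction estimates, the sketch repeatedly picks the highest-RRL (attribute, technique) pair, drops the remaining techniques for that attribute, and returns the schedule T(1), T(2), ... together with the cumulative cost and cumulative RE drop.

# Illustrative implementation of Steps 4-6; delta_re[(attr, tech)] and
# cost[(attr, tech)] hold the estimated RE reduction and effort per pair.

def greedy_schedule(delta_re, cost):
    """Return [(pair, cumulative_cost, cumulative_re_drop)] ordered by RRL (Step 5)."""
    remaining = dict(delta_re)                 # candidate (attribute, technique) pairs
    schedule, cum_cost, cum_drop = [], 0.0, 0.0
    while remaining:
        # T(k): pair with the largest RRL = dRE / C among the remaining candidates
        pair = max(remaining, key=lambda p: remaining[p] / cost[p])
        attr = pair[0]
        cum_cost += cost[pair]
        cum_drop += remaining[pair]
        schedule.append((pair, cum_cost, cum_drop))
        # drop every other technique for this attribute so it is assessed only once
        remaining = {p: d for p, d in remaining.items() if p[0] != attr}
    return schedule

# Example shape of the inputs (values are placeholders, not project data):
delta_re = {("sysft", "pdt"): 24.0, ("sysft", "rvc"): 22.0, ("misop", "pdt"): 20.0}
cost     = {("sysft", "pdt"): 10.0, ("sysft", "rvc"): 20.0, ("misop", "pdt"): 10.0}
for step in greedy_schedule(delta_re, cost):
    print(step)

Plotting the cumulative RE drop against the cumulative cost (Step 6) gives the RE-reduction profile; stopping once the RRL of the next pair falls below 1 corresponds to the "cost outweighs the benefit" rule discussed next.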

The above process produces an ideal REsecurity risk reduction strategy as presented earlier, and it generalizes easily to risk control management activities. The strategy dictates performing T(k) for k = 1, 2, 3, ... until the cost outweighs the benefit (i.e. RRL(T(k)) < 1), unless other risk reduction goals are desired (this will be discussed further in the next section). The algorithm assumes that the entire effort allocated to each T(k) will be expended and that the attribute will then not be assessed further. As a result, there may be more optimal strategies that allow for partial effort using multiple techniques per attribute. Multi-attribute optimization techniques, such as simulated annealing, could potentially be applied to find these but will not be discussed further here.
Since the algorithm is somewhat involved, an example to illustrate it is presented in Tables 2a-2e, resulting in the REsecurity reduction strategy displayed in Figures 7a and 7b. For simplicity of exposition, in this example we will assume that the techniques change only the probability and not the size of the risks.
That is, we will consider CB rather than RRL; however, the example easily generalizes. As such, we will be calculating ΔPij(L) and CBij rather than RRLij.

Abbreviations used in the tables:
misop = misled operator; syscr = system crash; sysft = system fault; unacc = unauthorized access; denos = denial of service
pdt = product testimonials; rvc = review checklist; ppy = product prototyping

        misop  syscr  sysft  unacc  denos
S(L)      50     40     40     70     50
Table 2a: Attribute Loss Size (Steps 1, 2, 3)

Cij     misop  syscr  sysft  unacc  denos
pdt       10     10     10     10     10
rvc       30     20     20     30     30
ppy       70     70      1     80     73
Table 2b: (attribute i, technique j) Effort (Step 4)

ΔPij(L) misop  syscr  sysft  unacc  denos
pdt       40     10     60     11     20
rvc       70     30     55     30     30
ppy       90     90      0     90     90
Table 2c: (attribute i, technique j) RE Reduction (Step 4)

CBij    misop  syscr  sysft  unacc  denos
pdt      200     40    240     77    100
rvc      117     60    110     70     50
ppy       64     51      0     79     62


Table 2d: (attribute i, technique j) Cost-Benefit (Step 5)

T(k)   CBij (sorted)   ΣC_T(k)   ΔP_T(k)   REtotal - ΣΔRE_T(k)
 1        240.00          10        240         896
 2        200.00          20         40         656
 3        116.67          50         70         616
 4        110.00          70         55         546
 5        100.00          80         20         491
 6         78.75         160         90         471
 7         77.00         170         11         381
 8         70.00         200         30         370
 9         64.29         270         90         340
10         61.64         343         90         250
11         60.00         363         30         160
12         51.43         433         90         130
13         50.00         463         30          40
14         40.00         473         10          10
15          0.00         474          0           0
Table 2e: Highest Cost-Benefit, Sorted by (attribute i, technique j) (Step 5)

While this appears to be somewhat complex at first blush, the algorithm is actually fairly straightforward to implement and use. For example, the authors performed all of the analysis for this example, and for the example in Figure 6, entirely within a spreadsheet.
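The spreadsheet computation behind Tables 2d and 2e can be reproduced in a few lines. The following sketch (Python; the attribute and technique labels follow the tables, while the code itself is ours and illustrative) builds the CBij matrix from Tables 2a-2c and lists the pairs in descending cost-benefit order, which is the T(k) ordering of Table 2e; the cumulative costs it prints match the ΣC_T(k) column.

# Reproduce the CB_ij matrix (Table 2d) and the sorted T(k) ordering (Table 2e)
# from the S(L), C_ij and dP_ij values in Tables 2a-2c.
attrs = ["misop", "syscr", "sysft", "unacc", "denos"]
S  = {"misop": 50, "syscr": 40, "sysft": 40, "unacc": 70, "denos": 50}   # Table 2a
C  = {"pdt": [10, 10, 10, 10, 10],
      "rvc": [30, 20, 20, 30, 30],
      "ppy": [70, 70, 1, 80, 73]}                                        # Table 2b
dP = {"pdt": [40, 10, 60, 11, 20],
      "rvc": [70, 30, 55, 30, 30],
      "ppy": [90, 90, 0, 90, 90]}                                        # Table 2c

cb = {(a, t): dP[t][i] * S[a] / C[t][i]
      for t in C for i, a in enumerate(attrs)}                           # Table 2d

cum_cost = 0
for k, (pair, value) in enumerate(sorted(cb.items(), key=lambda kv: -kv[1]), 1):
    attr, tech = pair
    cum_cost += C[tech][attrs.index(attr)]
    print(f"T({k:2d}) = ({attr}, {tech})  CB = {value:6.2f}  cumulative cost = {cum_cost}")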
Figure 7a: Cumulative Cost-Benefit of T(k) versus cumulative cost; Figure 7b: REsecurity reduction of T(k) versus cumulative cost (Step 6)

5.2 Strategic Decision Making and Competing Risks


With a collection of strategic risk models, the strategic method can be used to provide meaningful answers to questions such as "how much is enough" risk assessment, mitigation, or control effort to invest in. This question is critically important, since in practice it is infeasible to implement exhaustive risk reduction, due to constraints on resources (e.g. budget, personnel, schedule, technology limitations). Even without such constraints, it is frequently impossible to reduce a risk to zero, or even to determine all possible risks for any given system. The best we can strive for is to reduce risk as much as possible within the given resources and uncertainties.
Recall from previous sections that ordering risk-reduction activities from highest to lowest RRL results in the "ideal" risk-reduction profile indicated in Figure 4. It can be shown that, with respect to cost considerations, this is the optimal ordering for reducing risk when only a fraction of the risk reduction activities will be performed. That is, if risk reduction activity is stopped at any point, there is no other ordering that reduces more risk and leaves less total risk when the remaining activities are not done. In this model, if cost is the only consideration, then a natural answer to "how much is enough" is: when the cost exceeds the risk reduction benefit. Since the RRL ordering is decreasing, no activity beyond this point would decrease the risk by more than the cost of any of the previous activities.
The difficulty with using this stopping point is that it accounts only for cost with respect to the investment in risk-reduction activities. Furthermore, it assumes that cost and risk are equally exchangeable. In practice this is not typical, and, moreover, management activities always imply direct or collateral risks. For example, a system patch may do a marvelous job plugging a critical system vulnerability to a type of operating system virus, but if it takes too long to implement and disseminate the patch, the virus may propagate and cause unacceptable amounts of damage to the installed base. A more prudent action may be to quickly release a means of identifying and removing the particular virus threat at hand and perhaps delay the development of a patch. In this example, the risk-reduction activity of developing and disseminating a patch may indirectly increase the overall system risk despite the worthy, cost-effective investment in its development.
The challenge here is one of competing risks. That is, while one activity is reducing risk, another (event or activity) is increasing risk. When the cumulative decreasing risk falls below the cumulative increasing risk, the overall system risk increases. When the risk due to management activities is simply proportional (i.e. to effort, cost, schedule, etc.), it can easily be accounted for in REsecurity as an additional factor. This, however, is generally not a realistic assumption to make with competing risks, as will be elaborated next.

5.3 Risk of Delay
Our example is only one instance of competing risks; there are a host of others. However, for this exposition we refer to these considerations collectively as risk due to delay, or REdelay, and these risks are of a fundamentally different nature than REsecurity. Such risks can negatively impact or even paralyze a system to the point that the system will fail to meet its intended operational goals. REdelay may result in losses due to non-use of the system when required or expected, from dissatisfied customers, or from productivity losses when system capabilities are inaccessible or unreliable. REdelay will monotonically increase, since it represents the cumulative RE due to delays.
We assume that a system starts out with no risk of delay and that any risk management effort expended contributes to the overall delay risk.
To illustrate REdelay, let us consider our operating system patch example again. We consider this example only because the REdelay model is particularly straightforward; many other models are possible. For our example, it has been empirically suggested that, due to compounding factors, once an operating system vulnerability is identified, the risk that it will be exploited increases supra-linearly. Owing to this, a reasonably good approximation to the REdelay risk profile can be generated through the identification of a few well-chosen data points. Here we consider the following three 'critical points' to be relevant:
Point 1: vulnerability identified (or identifiable). Once a system vulnerability has been identified, it is assumed that a malicious party may also have identified the vulnerability and will attempt to exploit it, thereby increasing overall security risk.
Point 2: vulnerability exploited. When it is known (or very likely) that a system vulnerability has been exploited, losses are already being realized.
Point 3: vulnerability exploits result in total loss. When the vulnerability has been exploited (e.g. massive dissemination of a highly destructive virus) to the point that the system can no longer, in confidence, operate to realize its intended value, or when the realized losses due to the exploits are unrecoverable (e.g. a virus destroys or corrupts data, shuts out users, the operating system must be reinstalled, etc.), then it is considered a total loss.

The schedule for points 1, 2, and 3 is necessarily sequential, and the time (in terms of effort expended) to arrive at these points is cumulative; hence REdelay is successively non-decreasing at these points. Moreover, the change in REdelay between points 1 and 2 can never be less than that between the starting point and point 1; similarly for points 2 and 3. The critical points for our example project are contained in Table 3, and the resulting REdelay profile is illustrated in Figure 8.
Figure 8: Example REdelay profile (RE versus effort/duration, rising through the vulnerability-identified, vulnerability-exploited, and total-loss critical points)

Point         S(L)   P(L)    RE
identified     30     15    0.045
exploited      65     40    0.26
total loss     80     70    0.56
Table 3: Example REdelay Critical Points (S(L) and P(L) as percentages; RE normalized)

As is evident in our example, creating appropriate risk models (e.g. REsecurity, REdelay) requires an organization to accumulate a fair amount of calibrated experience on the nature of the risks, their probabilities, and their magnitudes with respect to risk management effort (e.g. cost, duration). In general this can be challenging and costly; however, there are many practical approaches and software tools that address these topics
specifically (see for example [Tran97, Ochs01, Hall98, Boehm91]). Some of these we will touch upon in Section 6.

5.4 Balancing Competing Risks for Strategic Planning
The general approach to answering a "how much is enough" question in the face of competing risks is to optimize REsecurity with respect to REdelay. Depending on the models used, this amounts to "balancing" the risk-reduction benefit of risk management and control activities against the risk increases due to expenditures of dollars, effort, schedule, accessibility, and so forth.
Consider again our risk assessment example. We have noted previously that it is important to have an efficient assessment strategy for reducing REsecurity and that such a strategy may be generated by assessing the attribute-technique pairs with the highest RRL first. We have also indicated that assessment reduces REsecurity while simultaneously increasing REdelay, due to the delay in removing system vulnerabilities. Too much assessment will put the system increasingly at risk as it exceeds the REdelay critical points; however, too little assessment will leave the system with too much risk due to uncertainty in the un-assessed attributes. The ideal assessment strategy decreases REsecurity but does not expend so much effort that this reduction is dominated by REdelay. This is a formal response to the question of "how much is enough security assessment?" Note that this derives directly from the general (and often misunderstood) risk management principle: if it's risky to not manage, DO manage (e.g., uncertain, high loss potential, unprecedented); if it's risky to manage, DO NOT manage (e.g., well-established, well-known, highly tested).
The goal is to apply the above principle to balance REsecurity and REdelay and so determine a strategic amount of assessment to perform before committing to a particular management or control strategy. Assuming we have generated a strategic REsecurity profile, the optimal assessment effort to expend will be that which minimizes REsecurity + REdelay. Doubtless there are many dependencies among the risk factors; however, recall that, as mentioned in Section 4.2, the REs will be additive. As shown in Figure 9a, the decreasing REsecurity and increasing REdelay will have a minimum, the "sweet spot," at some intermediate effort point. Assuming the ideal REsecurity reduction strategy discussed earlier, a strategic stopping point for assessment is when this intermediate effort point has been reached.
Note that the location of the sweet spot will vary by type of organization. For example, in a "dot-com" company where REdelay increases rapidly due to market pressures, the resulting sweet spot will be pushed to the left, indicating that less assessment should be done. By contrast, a safety-critical product, such as one for a nuclear power plant, will have a greater REsecurity due to larger potential losses; the sweet spot is pushed to the right, indicating that more assessment should be done. The sweet spot determination for the examples previously discussed is shown in Figure 9b. A 3rd-order polynomial (shown in the figure) was used to interpolate between the critical points so that REsecurity + REdelay could be numerically estimated.

Figure 9a: Balancing REsecurity and REdelay

Figure 9b: Sweet Spot for the Example
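The sweet-spot determination of Figure 9b can be sketched numerically as follows (Python; the cubic-fit REdelay points and the illustrative REsecurity decay are stand-ins, not the project's actual data): fit a 3rd-order polynomial through REdelay critical points, add it to a decreasing REsecurity profile, and take the effort with the smallest sum as the sweet spot.

# Illustrative sweet-spot estimation: RE_total(e) = RE_security(e) + RE_delay(e).
import numpy as np

# REdelay critical points as (effort, RE) pairs -- placeholder values in the
# spirit of Table 3, plus the origin (no delay risk before any effort is spent).
delay_pts = [(0, 0.0), (100, 0.045), (275, 0.26), (400, 0.56)]
coeffs = np.polyfit(*zip(*delay_pts), deg=3)          # 3rd-order interpolation
re_delay = np.poly1d(coeffs)

# Decreasing REsecurity profile, e.g. from the greedy schedule of Section 5.1;
# here a simple exponential decay stands in for the real profile.
efforts = np.linspace(0, 400, 401)
re_security = 0.9 * np.exp(-efforts / 120.0)

re_total = re_security + re_delay(efforts)
sweet = efforts[np.argmin(re_total)]
print(f"sweet spot at effort ~{sweet:.0f}, total RE ~{re_total.min():.3f}")

Re-running this with a steeper delay curve (the dot-com case in the text) moves the reported sweet spot toward a smaller effort, while a larger loss size in REsecurity (the nuclear power plant case) moves it toward more assessment.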

5.5 Unsuitable Sweet Spots
There are situations where the sweet spot is not an acceptable determination of how much assessment to perform. Acceptability is achieved only when (as indicated in Figure 9a) the RE at the sweet spot is below a given risk tolerance and the effort at the sweet spot is less than the effort at critical point 2. While ideally the assessment effort should be less than the effort at point 1, for most projects the additional risk incurred by passing this point is tolerable [Basili01]. This risk is softened by the fact that the effort at the sweet spot cannot be far from the effort at point 1: the effort beyond point 1 needed to complete the assessment is small enough that the exploitability of the vulnerability will mostly be settled prior to passing point 2. The exception is if REdelay increases very gradually, but in that case the increase in risk will be much less pronounced.
What can be done in the event that the sweet spot is above an acceptable risk tolerance? There are two solutions: find another assessment approach that can lower REsecurity, or mitigate the potential losses due to project delay (e.g. with an insurance policy) in order to lower REdelay. It is best if both solutions are applied. If the sweet-spot effort exceeds the effort for point 2, the only reasonable approach is to find another assessment approach that can lower REsecurity faster. Paradoxically, if REdelay were increased, say by imposing greater cost, schedule, or effort constraints, this too could potentially move the sweet-spot effort in front of point 2; however, the price to pay for this is a heavy increase in the overall project risk, which will likely exceed a tolerable level. As with previous examples, there are analogous methods for other types of security risk assessment, mitigation, and control activities.

6. Practical Risk Exposure Estimation


A risk management program relies critically on the ability to estimate representative risk exposures. It is tempting to expend a large amount of effort to obtain precise results; however, this may prove untenable or impractical. Fortunately, for strategic method applications the estimates do not have to be precise. In this section we describe one practical approach to estimating risk exposures.

6.1 Qualitative Methods


Recall that calculating RE for a given risk involves estimating the probability of a potential loss, P(L), and the size of that possible loss, S(L). The challenge of estimating S(L) lies in quantifying intangibles such as "loss of reputation." There are numerous techniques that claim the ability to address this challenge; however, in our experience, estimating highly subjective loss potentials is best accomplished by choosing a tangible "standard" within the particular context at hand and then establishing the remaining values relative to this standard. For example, an organization determined that it lost $1,000 for every minute its central sales system was down, due to lost sales. If the sales system was down for more than 4 hours, its reputation as an "easy to buy from" company would be degraded and fewer customers would return. The minimum loss of 4 × 60 × $1,000 = $240,000 was considered a "3" on a scale from 0-10 in terms of loss magnitude, and the loss of return customers was considered a 5 relative to this monetary loss. These size estimates can be used for RE calculations so long as all loss potentials are translated into loss magnitudes from 0-10 relative to the system down time.
In general it is difficult to calculate P(L) directly. This is primarily due to a lack of representative models that are able to generate an appropriate probability distribution. Even though there are many wonderful candidate parametric models, frequently there is not enough available data to calibrate and validate them. One alternative to parametric models is to make use of a qualitative "betting analogy," as described in the following steps:
Step 1: For the risk under consideration, define a "satisfactory" level.
Step 2: Establish a personally meaningful amount of money, say, $100.
Step 3: Determine how much money you would be willing to risk in betting on a satisfactory level.

Step 5! >stablish proposition e.g. Qsing virus checker ill avoid infection and loss of data Step 6! >stablish betting odds e.g. <o loss of data! you data! you lose &3,, in &",,% infection and loss of

Step 7! $etermine illingness to bet! Cilling! lo probability Qn illing! high probability <ot sure! there is risk due to uncertainty% so need to buy information >xpress your illingness to bet ith respect to the risk under consideration e.g. Ce are likely to take this bet ;or hatever Fualifies your illingness to bet statement 1in our example the Fualification is DlikelyE% use an ad@ective calibration chart such as the one illustrated in ;igure ", belo to get an estimate of the risk probability!
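To connect the pieces, here is a minimal Python sketch of how a willingness-to-bet qualification and the standard-relative loss scale from the sales-system example could be combined into a risk exposure figure. The adjective-to-probability values, the $400K loss potential, and the helper name loss_magnitude are assumptions made for illustration only; in practice the probabilities would come from a calibration table such as Figure 10.

# Sketch: turning a betting-analogy result into P(L) and combining it with a
# standard-relative loss magnitude S(L) to get RE = P(L) * S(L).
# The calibration values below are invented stand-ins for a table like Figure 10.

assumed_calibration = {
    "would certainly take the bet": 0.05,   # very willing -> low risk probability
    "likely to take the bet":       0.30,
    "not sure":                     0.50,
    "unlikely to take the bet":     0.70,
    "would never take the bet":     0.90,   # very unwilling -> high risk probability
}

def loss_magnitude(dollars, standard_dollars=240_000, standard_score=3.0):
    # Map a dollar loss onto the 0-10 scale relative to the chosen "standard"
    # (here the $240,000 four-hour outage scored as a 3), capping at 10.
    return min(10.0, standard_score * dollars / standard_dollars)

p_loss = assumed_calibration["likely to take the bet"]   # "We are likely to take this bet"
s_loss = loss_magnitude(400_000)                         # hypothetical $400K loss potential
risk_exposure = p_loss * s_loss

print(f"P(L) = {p_loss}, S(L) = {s_loss:.1f} on the 0-10 scale, RE = {risk_exposure:.2f}")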

Figure 10: Probability Adjective Calibration Table

5.2 Empirical Approaches

Empirical methods rely on generating risk exposure estimates based on observations, historical data, or experiments. One of the attractive properties of risk exposure is that it represents the expectation (or mean value) of the potential losses associated with a risk event. Under quite general conditions, this value can be approximated with the sample mean. Say you have losses M1, M2, ..., Mn associated with a particular risk area; then

    RE ≈ (1/n) × (M1 + M2 + ... + Mn)

Consult any college-level statistics book for further details on this. We can make use of this result in several ways, depending on whether or not historical risk loss data and risk event models are available. Historical risk loss data is the recorded losses realized from actual projects that were exposed to the risk under consideration (i.e., the risk actually occurred). A risk event model is a dynamic representation of all the possible losses and the conditions under which those losses would occur.

Delphi Studies: no historical data, no risk event model
You may have noticed that the betting analogy relies on having a credible adjective calibration table. These tables might have been based on surveying local experts. This technique can be used directly to estimate risk exposures. If a representative group of experts is available, a Delphi study [Helmer66] often provides workable risk exposure estimates. Such a study would survey experts to estimate risk exposure values and then calculate the sample mean as an estimate of the actual risk exposure. To avoid erroneous results, Delphi studies should not be undertaken haphazardly. There are a variety of well-documented methodologies for conducting Delphi studies that should be followed to ensure valid results.
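As a minimal illustration of the sample-mean formula, the snippet below averages a handful of hypothetical expert estimates such as might come out of a Delphi round; the figures are invented purely to show the calculation.

# Sample-mean risk exposure estimate: RE is approximately the average of the elicited losses M1..Mn.
# The values below are hypothetical Delphi-round estimates in $K.
expert_estimates = [120.0, 90.0, 150.0, 110.0, 130.0]
re_estimate = sum(expert_estimates) / len(expert_estimates)
print(f"estimated RE = {re_estimate:.1f} $K")    # prints 120.0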

Risk Sampling: historical data available, no risk event model
If there is a reasonable amount of historical loss data available that is representative (unbiased), then risk sampling may be an effective means of estimating risk exposures. Calculate the estimated risk exposure by taking the sample mean of the historical data. However, it is likely that the data set will be small and the estimate may be poor. Under these conditions, bootstrapping and jackknifing methods [Effron83] may be used to improve the estimates.

Risk Event Simulation: no historical data available, risk event model available
When historical data is lacking (or non-existent) and you are able to model the particular risk events under consideration, then risk event simulation may provide a means of estimating risk exposures. Usually this amounts to creating a series of random events, running them through the risk model, and then calculating the sample mean to estimate the risk exposure. The idea is to create a dynamic simulation of the risk events and run a series of experiments to get a representative collection of loss values. There are a host of powerful tools that can be of great use in creating and running these simulations. Two examples are the Simulink package for Matlab [Mtlb] and Modelica for Dymola [Dymla].
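The Python sketch below illustrates both ideas on toy data: bootstrap resampling of a small set of historical losses (the risk sampling case) and a simple Monte Carlo risk event simulation (the no-data case). All loss figures, probabilities, and distribution shapes are assumptions chosen for illustration, and only the Python standard library is used rather than the simulation tools mentioned above.

import random

random.seed(1)                                    # reproducible toy example

# Risk sampling with bootstrap: a small set of recorded losses ($K), no event model.
historical_losses = [12.0, 5.0, 40.0, 8.0, 22.0]  # assumed historical data

def bootstrap_re(losses, trials=10_000):
    # Resample the data with replacement many times and average the resampled means;
    # the spread of those means also indicates how trustworthy the estimate is.
    means = []
    for _ in range(trials):
        sample = [random.choice(losses) for _ in losses]
        means.append(sum(sample) / len(sample))
    return sum(means) / trials

# Risk event simulation: no historical data, but an (assumed) event model is available.
def simulated_loss():
    # Toy model: an incident occurs with probability 0.2; if it does, the loss is
    # lognormally distributed with a median of roughly $20K.
    if random.random() < 0.2:
        return random.lognormvariate(3.0, 0.5)
    return 0.0

def simulated_re(trials=10_000):
    return sum(simulated_loss() for _ in range(trials)) / trials

print(f"bootstrap RE estimate:  {bootstrap_re(historical_losses):.1f} $K")
print(f"simulation RE estimate: {simulated_re():.1f} $K")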

5.3 Pitfalls to Avoid

As indicated in the discussions above, there are many possible complications in estimating risk exposures, such as using insufficient or biased data. Of all the potential complications, perhaps the most common and troublesome is the problem of compound risks. A compound risk is one that is a dependent combination of risks. Some examples are:

Addressing more than one threat
Managing threats with key staff shortages
Vague vulnerability descriptions with ambitious security plans
Untried operating system patches with an ambitious release schedule

You must watch out for compound risks. The problem is that the dependencies complicate the probabilities in ways that are difficult to determine, as the small example below illustrates. When you identify a compound risk, it is best to reduce it to non-compound risks if possible. If the dependencies are too strong or complex, this may not be possible; in that case you should plan to devote extra attention to containing such risks.
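As a small numerical illustration of why the dependence matters (the probabilities here are invented), compare the naive and the dependent estimates of two risks occurring together:

# Invented probabilities showing how dependence inflates a compound risk.
p_staff_shortage = 0.3
p_schedule_slip  = 0.4

p_both_if_independent = p_staff_shortage * p_schedule_slip          # about 0.12
p_slip_given_shortage = 0.9                                         # assumed strong dependence
p_both_with_dependence = p_staff_shortage * p_slip_given_shortage   # about 0.27

print(round(p_both_if_independent, 2), round(p_both_with_dependence, 2))

More than double the naive figure: this is the kind of hidden coupling that makes compound risks worth isolating or, failing that, watching closely.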

6. Summary
In this chapter we have shown a number of IT security risk assessment and management techniques and standards. This is a rich and growing field of inquiry, and it reaches deep into technical as well as organizational and managerial practices. The key insights the reader should take away from this chapter are that: 1) security risk management is only effective to the degree that risk assessment is effective; and 2) it is not obvious how to apply individual security techniques (of which there are many) to ensure an optimal result at the level of the organization.
To address these issues we have described and demonstrated a strategic method for quantitatively assessing IT security risk that is provably optimal with respect to cost-benefit. This gives a manager a powerful tool for managing IT security risk.

GLOSSARY
BS 7799: British Standard 7799; consists of BS 7799 Parts 1 and 2
BS 7799 Part 1: Information Technology - Code of Practice for Information Security Management
BS 7799 Part 2: Information Security Management Systems - Specification with Guidance for Use
Common Criteria: short for Common Criteria for Information Technology Security Evaluation; a standard for dealing with security-related COTS products; a descendent of ITSEC, CTCPEC, and TCSEC
CTCPEC: Canadian Trusted Computer Product Evaluation Criteria; a standard for dealing with security-related COTS products; an ancestor of Common Criteria and ISO 15408
FRAP: The Facilitated Risk Assessment Program; a qualitative process, developed by Thomas Peltier
HIPAA: (US) Health Insurance Portability and Accountability Act of 1996; a statute with which relevant organizations are required to comply, and which dictates government audit, with possibly severe consequences, in case of complaints of violations
ISO 15408: The ISO version of Common Criteria
ISO 17799: International Organization for Standardization 17799: Code of Practice for Information Security Management
ISO TR 13335 (GMITS for short): International Organization for Standardization Technical Report 13335: Guidelines for the Management of IT Security
ISO/IEC 21827: International Organization for Standardization / International Electrotechnical Commission 21827; the ISO/IEC version of SSE-CMM
ITSEC: IT Security Evaluation and Certification Scheme; a standard for dealing with security-related COTS products; an ancestor of Common Criteria and ISO 15408
NIST: (US) National Institute of Standards and Technology. Founded in 1901, NIST is a non-regulatory federal agency within the U.S. Commerce Department's Technology Administration. NIST's mission is to develop and promote measurement, standards, and technology to enhance productivity, facilitate trade, and improve the quality of life.
OCTAVE: Operationally Critical Threat, Asset, and Vulnerability Evaluation; a method developed by the Software Engineering Institute (SEI) and the CERT (Computer Emergency Response Team) at Carnegie Mellon University
Orange Book: see TCSEC
SRMD: Microsoft's Security Risk Management Discipline
SSE-CMM: Systems Security Engineering Capability Maturity Model
TCSEC: Trusted Computer System Evaluation Criteria; often referred to as the "Orange Book," it was published in 1985 by the National Computer Security Center (NCSC), a branch of the National Security Agency (NSA). TCSEC is a standard for dealing with security-related COTS products; an ancestor of Common Criteria and ISO 15408.

CROSS REFERENCES
Please insert

REFERENCES

[Abts01] Abts, C., Boehm, B., and Clark, E.B., "COCOTS: A Software COTS-Based System (CBS) Cost Model," Proceedings, ESCOM 2001, April 2001, pp. 1-8.
[Alberts02] Alberts, C., and Dorofee, A., Managing Information Security Risks: The OCTAVE Approach, Addison-Wesley, 2002.
[Alves02] Alves, C., and Finkelstein, A., "Challenges in COTS Decision-Making: A Goal-Driven Requirements Engineering Perspective," Workshop on Software Engineering Decision Support, in conjunction with SEKE'02, Ischia, Italy, July 2002.
[Basili01] Basili, V., and Boehm, B., "COTS-Based Systems Top 10 List," IEEE Computer, Vol. 34, No. 5, May 2001.
[Boehm91] Boehm, B., "Software Risk Management: Principles and Practices," IEEE Software, Vol. 8, No. 1, Jan. 1991, pp. 32-41.
[Carr93] Carr, M. J., Konda, S. L., Monarch, I., Ulrich, F. C., and Walker, C. F., "Taxonomy-Based Risk Identification," Software Engineering Institute, Carnegie Mellon University, Technical Report CMU/SEI-93-TR-6, June 1993.
[Cohen04] Cohen, F., "Fred Cohen and Associates - Security Database," http://fc.all.net, 2004.
[Dymla] http://www.dynasim.com

[Effron83] Efron, B., and Gong, G., "A Leisurely Look at the Bootstrap, the Jackknife, and Cross-Validation," The American Statistician, Vol. 37, 1983, pp. 36-48.
[Hall98] Hall, E. M., Managing Risk, Addison Wesley Longman, 1998.
[Helmer66] Helmer, O., Social Technology, Basic Books, New York, 1966.
[ISO15408] "ISO/IEC Information Technology - Security Techniques - Evaluation Criteria for IT Security," ISO/IEC 15408, 1999.
[ISO17799] "Information Technology - Code of Practice for Information Security Management," International Organization for Standardization, ISO/IEC 17799:2000, 2000.
[ISO21827] "ISO/IEC Information Technology - Systems Security Engineering - Capability Maturity Model (SSE-CMM)," ISO/IEC 21827, 2002.
[Mtlb] Information on the software can be found at http://www.mathworks.com

[NIST] "Risk Management Guide for Information Technology Systems," NIST Special Publication 800-30, July 2002.
[Ochs01] Ochs, M., Pfahl, D., Chrobok-Diening, G., and Nothelfer-Kolb, B., "A Method for Efficient Measurement-Based COTS Assessment and Selection - Method Description and Evaluation Results," Proc. of the Seventh International Software Metrics Symposium (METRICS 2001), April 2001, London, pp. 285-297.
[OCTAVE04] Operationally Critical Threat, Asset, and Vulnerability Evaluation (OCTAVE), CERT Coordination Center, Software Engineering Institute at Carnegie Mellon University, available at http://www.cert.org/octave/, 2004.
[Peltier03] Peltier, T., Peltier, J., and Blackley, J. A., Managing a Network Vulnerability Assessment, Auerbach Publications, 2003.
[SRMD04] "Understanding the Security Risk Management Discipline," in Microsoft Solution for Securing Windows 2000 Server, Chapter 3, http://www.microsoft.com/technet/Security/prodtech/win2000/secwin2k/default.mspx, 2004.
[Tran97] Tran, V., and Liu, D.-B., "A Risk-Mitigating Model for the Development of Reliable and Maintainable Large-Scale Commercial-Off-The-Shelf Integrated Software Systems," in Proceedings of the 1997 Annual Reliability and Maintainability Symposium, Jan. 1997, pp. 361-367.
[Shew86] Shewhart, Walter A., Statistical Method from the Viewpoint of Quality Control, Dover Publications, 1986 (reprint of the 1939 edition, with foreword by W. Edwards Deming).
[SW-CMM95] Paulk, Mark C., Weber, Charles V., Curtis, Bill, and Chrissis, Mary Beth, The Capability Maturity Model: Guidelines for Improving the Software Process, Addison-Wesley, 1995.
[SE-CMM95] Bate, Roger, et al., Systems Engineering Capability Maturity Model (Version 1.1), Software Engineering Institute, Carnegie Mellon University, November 1995.
