
Chapter 1: Introduction

• Components of computer security


• Threats
• Policies and mechanisms
• The role of trust
• Assurance
• Operational Issues
• Human Issues
Basic Components
• Confidentiality
– Keeping data and resources hidden
• Integrity
– Data integrity (integrity)
– Origin integrity (authentication)
• Availability
– Enabling access to data and resources

Classes of Threats
• Disclosure
– Snooping
• Deception
– Modification, spoofing, repudiation of origin, denial of
receipt
• Disruption
– Modification
• Usurpation
– Modification, spoofing, delay, denial of service

Policies and Mechanisms
• Policy says what is, and is not, allowed
– This defines “security” for the site/system/etc.
• Mechanisms enforce policies
• Composition of policies
– If policies conflict, discrepancies may create
security vulnerabilities

Goals of Security
• Prevention
– Prevent attackers from violating security policy
• Detection
– Detect attackers’ violation of security policy
• Recovery
– Stop attack, assess and repair damage
– Continue to function correctly even if attack
succeeds

Trust and Assumptions
• Underlie all aspects of security
• Policies
– Unambiguously partition system states
– Correctly capture security requirements
• Mechanisms
– Assumed to enforce policy
– Support mechanisms work correctly

Types of Mechanisms

[Figure: three set diagrams, labeled secure, precise, and broad, comparing the set of reachable states with the set of secure states]
• Secure: every reachable state is a secure state
• Precise: the reachable states are exactly the secure states
• Broad: some reachable states are not secure

Assurance
• Specification
– Requirements analysis
– Statement of desired functionality
• Design
– How system will meet specification
• Implementation
– Programs/systems that carry out design

Operational Issues
• Cost-Benefit Analysis
– Is it cheaper to prevent or recover?
• Risk Analysis
– Should we protect something?
– How much should we protect this thing?
• Laws and Customs
– Are desired security measures illegal?
– Will people do them?
Human Issues
• Organizational Problems
– Power and responsibility
– Financial benefits
• People problems
– Outsiders and insiders
– Social engineering

Tying Together
[Figure: Threats → Policy → Specification → Design → Implementation → Operation]

Key Points
• Policy defines security, and mechanisms
enforce security
– Confidentiality
– Integrity
– Availability
• Trust and knowing assumptions
• Importance of assurance
• The human factor
Chapter 2: Access Control Matrix
• Overview
• Access Control Matrix Model
• Protection State Transitions
– Commands
– Conditional Commands

Overview
• Protection state of system
– Describes current settings, values of system
relevant to protection
• Access control matrix
– Describes protection state precisely
– Matrix describing rights of subjects
– State transitions change elements of matrix

Description

• Subjects S = { s1, …, sn }
• Objects O = { o1, …, om }
• Rights R = { r1, …, rk }
• Entries A[si, oj] ⊆ R
• A[si, oj] = { rx, …, ry } means subject si has rights rx, …, ry over object oj
[Figure: the matrix A, with one row per subject s1, …, sn and one column per entity: objects o1, …, om and subjects s1, …, sn]

Example 1
• Processes p, q
• Files f, g
• Rights r, w, x, a, o
     f      g     p      q
p    rwo    r     rwxo   w
q    a      ro    r      rwxo

Example 2
• Procedures inc_ctr, dec_ctr, manage
• Variable counter
• Rights +, –, call
          counter   inc_ctr   dec_ctr   manage
inc_ctr   +
dec_ctr   –
manage              call      call      call

State Transitions
• Change the protection state of system
• ⊢ represents a transition
– Xi ⊢τ Xi+1: command τ moves the system from state Xi to Xi+1
– Xi ⊢* Xi+1: a sequence of commands moves the system from state Xi to Xi+1
• Commands often called transformation
procedures
Primitive Operations
• create subject s; create object o
– create subject adds a new row and a new column to the ACM; create object adds only a new column
• destroy subject s; destroy object o
– destroy subject deletes a row and a column from the ACM; destroy object deletes only a column
• enter r into A[s, o]
– Adds r rights for subject s over object o
• delete r from A[s, o]
– Removes r rights from subject s over object o
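The six primitive operations are easy to prototype. The following is a minimal Python sketch (the class and method names are ours, not part of the model), representing A as a map from (subject, object) pairs to sets of rights:

# Minimal access control matrix sketch; names are illustrative.
class ACM:
    def __init__(self):
        self.subjects = set()
        self.objects = set()                  # subjects are also objects
        self.A = {}                           # (s, o) -> set of rights

    def create_subject(self, s):              # adds a row and a column
        self.subjects.add(s)
        self.objects.add(s)

    def create_object(self, o):               # adds a column only
        self.objects.add(o)

    def destroy_subject(self, s):             # deletes a row and a column
        self.subjects.discard(s)
        self.objects.discard(s)
        self.A = {k: v for k, v in self.A.items() if s not in k}

    def destroy_object(self, o):              # deletes a column only
        self.objects.discard(o)
        self.A = {k: v for k, v in self.A.items() if k[1] != o}

    def enter(self, r, s, o):                 # enter r into A[s, o]
        self.A.setdefault((s, o), set()).add(r)

    def delete(self, r, s, o):                # delete r from A[s, o]
        self.A.get((s, o), set()).discard(r)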

Creating File
• Process p creates file f with r and w
permission
command create•file(p, f)
create object f;
enter own into A[p, f];
enter r into A[p, f];
enter w into A[p, f];
end

Mono-Operational Commands
• Make process p the owner of file g
command make•owner(p, g)
enter own into A[p, g];
end
• Mono-operational command
– Single primitive operation in this command

Conditional Commands
• Let p give q r rights over f, if p owns f
command grant•read•file•1(p, f, q)
if own in A[p, f]
then
enter r into A[q, f];
end
• Mono-conditional command
– Single condition in this command

Multiple Conditions
• Let p give q r and w rights over f, if p owns
f and p has c rights over q
command grant•read•file•2(p, f, q)
if own in A[p, f] and c in A[p, q]
then
enter r into A[q, f];
enter w into A[q, f];
end
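Using the ACM sketch above, this command becomes a function that tests both conditions before performing any primitive operations; a sketch, with rights written as one-character strings:

def grant_read_file_2(acm, p, f, q):
    # if own in A[p, f] and c in A[p, q], give q r and w rights over f
    if 'own' in acm.A.get((p, f), set()) and 'c' in acm.A.get((p, q), set()):
        acm.enter('r', q, f)
        acm.enter('w', q, f)

acm = ACM()
acm.create_subject('p'); acm.create_subject('q')
acm.create_object('f')
acm.enter('own', 'p', 'f')
acm.enter('c', 'p', 'q')
grant_read_file_2(acm, 'p', 'f', 'q')
assert acm.A[('q', 'f')] == {'r', 'w'}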

Key Points
• Access control matrix simplest abstraction
mechanism for representing protection state
• Transitions alter protection state
• 6 primitive operations alter matrix
– Transitions can be expressed as commands
composed of these operations and, possibly,
conditions

Chapter 5: Confidentiality Policies
• Overview
– What is a confidentiality model
• Bell-LaPadula Model
– General idea
– Informal description of rules

Overview
• Goals of Confidentiality Model
• Bell-LaPadula Model
– Informally
– Example Instantiation

Confidentiality Policy
• Goal: prevent the unauthorized disclosure
of information
– Deals with information flow
– Integrity incidental
• Multi-level security models are best-known
examples
– Bell-LaPadula Model basis for many, or most,
of these
Bell-LaPadula Model, Step 1
• Security levels arranged in linear ordering
– Top Secret: highest
– Secret
– Confidential
– Unclassified: lowest
• Levels consist of security clearance L(s)
– Objects have security classification L(o)

Example
security level   subject   object
Top Secret       Tamara    Personnel Files
Secret           Samuel    E-Mail Files
Confidential     Claire    Activity Logs
Unclassified     Ulaley    Telephone Lists
• Tamara can read all files
• Claire cannot read Personnel or E-Mail Files
• Ulaley can only read Telephone Lists
Reading Information
• Information flows up, not down
– “Reads up” disallowed, “reads down” allowed
• Simple Security Condition (Step 1)
– Subject s can read object o iff L(o) ≤ L(s) and s
has permission to read o
• Note: combines mandatory control (relationship of
security levels) and discretionary control (the
required permission)
– Sometimes called “no reads up” rule
Writing Information
• Information flows up, not down
– “Writes up” allowed, “writes down” disallowed
• *-Property (Step 1)
– Subject s can write object o iff L(s) ≤ L(o) and
s has permission to write o
• Note: combines mandatory control (relationship of
security levels) and discretionary control (the
required permission)
– Sometimes called “no writes down” rule
Basic Security Theorem, Step 1
• If a system is initially in a secure state, and
every transition of the system satisfies the
simple security condition, step 1, and the *-
property, step 1, then every state of the
system is secure
– Proof: induct on the number of transitions

Bell-LaPadula Model, Step 2
• Expand notion of security level to include
categories
• Security level is (clearance, category set)
• Examples
– ( Top Secret, { NUC, EUR, ASI } )
– ( Confidential, { EUR, ASI } )
– ( Secret, { NUC, ASI } )

Levels and Lattices
• (A, C) dom (A′, C′) iff A′ ≤ A and C′ ⊆ C
• Examples
– (Top Secret, {NUC, ASI}) dom (Secret, {NUC})
– (Secret, {NUC, EUR}) dom (Confidential,{NUC, EUR})
– (Top Secret, {NUC}) ¬dom (Confidential, {EUR})
• Let C be the set of classifications and K the set of categories.
The set of security levels L = C × K with dom forms a lattice
– lub(L) = (max(C), K)
– glb(L) = (min(C), ∅)
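The dom relation and the lattice operations translate directly into code. A sketch in Python, assuming clearances are encoded as integers (Unclassified = 0 through Top Secret = 3) and category sets as Python sets:

TS, S, C, U = 3, 2, 1, 0          # clearance encoding (assumed)

def dom(l1, l2):
    # (A, C) dom (A', C') iff A' <= A and C' ⊆ C
    (a1, c1), (a2, c2) = l1, l2
    return a2 <= a1 and c2 <= c1  # on sets, <= means "is a subset of"

def lub(l1, l2):
    return (max(l1[0], l2[0]), l1[1] | l2[1])

def glb(l1, l2):
    return (min(l1[0], l2[0]), l1[1] & l2[1])

assert dom((TS, {'NUC', 'ASI'}), (S, {'NUC'}))
assert not dom((TS, {'NUC'}), (C, {'EUR'}))   # incomparable levels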

Levels and Ordering
• Security levels partially ordered
– Any pair of security levels may (or may not)
be related by dom
• “dominates” serves the role of “greater
than” in step 1
– “greater than” is a total ordering, though

Reading Information
• Information flows up, not down
– “Reads up” disallowed, “reads down” allowed
• Simple Security Condition (Step 2)
– Subject s can read object o iff L(s) dom L(o)
and s has permission to read o
• Note: combines mandatory control (relationship of
security levels) and discretionary control (the
required permission)
– Sometimes called “no reads up” rule
Writing Information
• Information flows up, not down
– “Writes up” allowed, “writes down” disallowed
• *-Property (Step 2)
– Subject s can write object o iff L(o) dom L(s)
and s has permission to write o
• Note: combines mandatory control (relationship of
security levels) and discretionary control (the
required permission)
– Sometimes called “no writes down” rule
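Both rules pair the mandatory check (dom) with a discretionary one. A sketch reusing dom from the lattice example above; level maps entities to security levels, and perms is a hypothetical discretionary permission map:

def can_read(s, o, level, perms):
    # simple security condition: L(s) dom L(o), plus read permission
    return dom(level[s], level[o]) and 'r' in perms.get((s, o), set())

def can_write(s, o, level, perms):
    # *-property: L(o) dom L(s), plus write permission
    return dom(level[o], level[s]) and 'w' in perms.get((s, o), set())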
Basic Security Theorem, Step 2
• If a system is initially in a secure state, and every
transition of the system satisfies the simple
security condition, step 2, and the *-property, step
2, then every state of the system is secure
– Proof: induct on the number of transitions
– In actual Basic Security Theorem, discretionary access
control treated as third property, and simple security
property and *-property phrased to eliminate
discretionary part of the definitions — but simpler to
express the way done here.

Problem
• Colonel has (Secret, {NUC, EUR})
clearance
• Major has (Secret, {EUR}) clearance
– Major can talk to colonel (“write up” or “read
down”)
– Colonel cannot talk to major (“read up” or
“write down”)
• Clearly absurd!
Solution
• Define maximum, current levels for subjects
– maxlevel(s) dom curlevel(s)
• Example
– Treat Major as an object (Colonel is writing to him/her)
– Colonel has maxlevel (Secret, { NUC, EUR })
– Colonel sets curlevel to (Secret, { EUR })
– Now L(Major) dom curlevel(Colonel)
• Colonel can write to Major without violating “no writes down”
– Does L(s) mean curlevel(s) or maxlevel(s)?
• Formally, we need a more precise notation

DG/UX System
• Provides mandatory access controls
– MAC label identifies security level
– Default labels, but can define others
• Initially
– Subjects assigned MAC label of parent
• Initial label assigned to user, kept in Authorization and
Authentication database
– Object assigned label at creation
• Explicit labels stored as part of attributes
• Implicit labels determined from parent directory

MAC Regions

[Figure: MAC regions, with hierarchy levels vertical and categories horizontal, highest region at top]
• Administrative Region: A&A database, audit
• User Region: user data and applications
• Virus Prevention Region:
– VP–1: site executables
– VP–2: trusted data
– VP–3: executables not part of the TCB
– VP–4: executables part of the TCB
– VP–5: reserved for future use
• IMPL_HI is “maximum” (least upper bound) of all levels
• IMPL_LO is “minimum” (greatest lower bound) of all levels
Directory Problem
• Process p at MAC_A tries to create file /tmp/x
• /tmp/x exists but has MAC label MAC_B
– Assume MAC_B dom MAC_A
• Create fails
– Now p knows a file named x with a higher label exists
• Fix: only programs with same MAC label as
directory can create files in the directory
– Now compilation won’t work, mail can’t be delivered

Multilevel Directory
• Directory with a set of subdirectories, one per
label
– Not normally visible to user
– p creating /tmp/x actually creates /tmp/d/x where d is
directory corresponding to MAC_A
– All p’s references to /tmp go to /tmp/d
• p cd’s to /tmp/a, then to ..
– System call stat(“.”, &buf) returns inode number of
real directory
– System call dg_stat(“.”, &buf) returns inode of /tmp
Object Labels
• Requirement: every file system object
must have MAC label
1. Roots of file systems have explicit MAC
labels
• If mounted file system has no label, it gets
label of mount point
2. Object with implicit MAC label inherits
label of parent

Object Labels
• Problem: object has two names
– /x/y/z, /a/b/c refer to same object
– y has explicit label IMPL_HI
– b has explicit label IMPL_B
• Case 1: hard link created while file system on
DG/UX system, so …
3. Creating hard link requires explicit label
• If implicit, label made explicit
• Moving a file makes label explicit

Object Labels
• Case 2: hard link exists when file system
mounted
– No objects on paths have explicit labels: paths have
same implicit labels
– An object on path acquires an explicit label: implicit
label of child must be preserved
so …
4. Change to directory label makes child labels
explicit before the change
Object Labels
• Symbolic links are files, and treated as
such, so …
5. When resolving symbolic link, label of
object is label of target of the link
• System needs access to the symbolic link
itself

Using MAC Labels
• Simple security condition implemented
• *-property not fully implemented
– Process MAC must equal object MAC
– Writing allowed only at same security level
• Overly restrictive in practice

MAC Tuples
• Up to 3 MAC ranges (one per region)
• MAC range is a set of labels with upper, lower
bound
– Upper bound must dominate lower bound of range
• Examples
1. [(Secret, {NUC}), (Top Secret, {NUC})]
2. [(Secret, ∅), (Top Secret, {NUC, EUR, ASI})]
3. [(Confidential, {ASI}), (Secret, {NUC, ASI})]

MAC Ranges
1. [(Secret, {NUC}), (Top Secret, {NUC})]
2. [(Secret, ∅), (Top Secret, {NUC, EUR, ASI})]
3. [(Confidential, {ASI}), (Secret, {NUC, ASI})]
• (Top Secret, {NUC}) in ranges 1, 2
• (Secret, {NUC, ASI}) in ranges 2, 3
• [(Secret, {ASI}), (Top Secret, {EUR})] not
valid range
– as (Top Secret, {EUR}) ¬dom (Secret, {ASI})

Objects and Tuples
• Objects must have MAC labels
– May also have a MAC tuple
– If both, tuple overrides label
• Example
– Paper has MAC range:
[(Secret, {EUR}), (Top Secret, {NUC, EUR})]

MAC Tuples
• Process can read object when:
– Object MAC range (lr, hr); process MAC label pl
– pl dom hr
• Process MAC label grants read access to upper bound of range
• Example
– Peter, with label (Secret, {EUR}), cannot read paper
• (Top Secret, {NUC, EUR}) dom (Secret, {EUR})
– Paul, with label (Top Secret, {NUC, EUR, ASI}) can read
paper
• (Top Secret, {NUC, EUR, ASI}) dom (Top Secret, {NUC, EUR})

MAC Tuples
• Process can write object when:
– Object MAC range (lr, hr); process MAC label pl
– pl ∈ (lr, hr)
• Process MAC label grants write access to any label in range
• Example
– Peter, with label (Secret, {EUR}), can write paper
• (Top Secret, {NUC, EUR}) dom (Secret, {EUR}) and (Secret,
{EUR}) dom (Secret, {EUR})
– Paul, with label (Top Secret, {NUC, EUR, ASI}), cannot
write paper
• (Top Secret, {NUC, EUR, ASI}) dom (Top Secret, {NUC, EUR})
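The read and write checks on MAC ranges, sketched with dom and the clearance constants from the lattice example earlier in this chapter; the asserts reproduce the Peter and Paul outcomes above:

def range_read_ok(pl, rng):
    lr, hr = rng
    return dom(pl, hr)                  # pl must dominate the upper bound

def range_write_ok(pl, rng):
    lr, hr = rng
    return dom(hr, pl) and dom(pl, lr)  # pl must lie within the range

paper = ((S, {'EUR'}), (TS, {'NUC', 'EUR'}))
peter = (S, {'EUR'})
paul = (TS, {'NUC', 'EUR', 'ASI'})
assert not range_read_ok(peter, paper) and range_read_ok(paul, paper)
assert range_write_ok(peter, paper) and not range_write_ok(paul, paper)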
Key Points
• Confidentiality models restrict flow of
information
• Bell-LaPadula models multilevel security
– Cornerstone of much work in computer security

Chapter 6: Integrity Policies
• Overview
• Requirements
• Biba’s models
• Clark-Wilson model

Overview
• Requirements
– Very different than confidentiality policies
• Biba’s model
• Clark-Wilson model

Requirements of Policies
1. Users will not write their own programs, but will use existing
production programs and databases.
2. Programmers will develop and test programs on a non-production
system; if they need access to actual data, they will be given
production data via a special process, but will use it on their
development system.
3. A special process must be followed to install a program from the
development system onto the production system.
4. The special process in requirement 3 must be controlled and
audited.
5. The managers and auditors must have access to both the system
state and the system logs that are generated.

Biba Integrity Model
• Set of subjects S, objects O, integrity levels
I, relation ≤ ⊆ I × I holding when second
dominates first
• min: I × I → I returns lesser of integrity
levels
• i: S ∪ O → I gives integrity level of entity
• relation r ⊆ S × O: (s, o) ∈ r means s ∈ S can read o ∈ O
• w, x defined similarly
Intuition for Integrity Levels
• The higher the level, the more confidence
– That a program will execute correctly
– That data is accurate and/or reliable
• Note relationship between integrity and
trustworthiness
• Important point: integrity levels are not
security levels

Biba’s Model
• Similar to Bell-LaPadula model
1. s ∈ S can read o ∈ O iff i(s) ≤ i(o)
2. s ∈ S can write to o ∈ O iff i(o) ≤ i(s)
3. s1 ∈ S can execute s2 ∈ S iff i(s2) ≤ i(s1)
• Add compartments and discretionary controls to
get full dual of Bell-LaPadula model
• Information flow result holds
– Different proof, though
• Actually the “strict integrity model” of Biba’s set
of models
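The three rules as executable checks; a sketch with integrity levels encoded as integers and i a map from entities to their levels:

def biba_read_ok(s, o, i):      # rule 1: s reads o iff i(s) <= i(o)
    return i[s] <= i[o]

def biba_write_ok(s, o, i):     # rule 2: s writes o iff i(o) <= i(s)
    return i[o] <= i[s]

def biba_exec_ok(s1, s2, i):    # rule 3: s1 executes s2 iff i(s2) <= i(s1)
    return i[s2] <= i[s1]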
LOCUS and Biba
• Goal: prevent untrusted software from altering
data or other software
• Approach: make levels of trust explicit
– credibility rating based on estimate of software’s
trustworthiness (0 untrusted, n highly trusted)
– trusted file systems contain software with a single
credibility level
– Process has risk level or highest credibility level at
which process can execute
– Must use run-untrusted command to run software at
lower credibility level
Clark-Wilson Integrity Model
• Integrity defined by a set of constraints
– Data in a consistent or valid state when it satisfies these
• Example: Bank
– D today’s deposits, W withdrawals, YB yesterday’s
balance, TB today’s balance
– Integrity constraint: TB = D + YB – W
• Well-formed transaction move system from one
consistent state to another
• Issue: who examines, certifies transactions done
correctly?
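The bank example in code: a sketch of an IVP-style check for the constraint TB = D + YB - W and one well-formed transaction that preserves it (field names follow the slide):

def ivp_ok(state):
    # integrity constraint: TB = D + YB - W
    return state['TB'] == state['D'] + state['YB'] - state['W']

def tp_deposit(state, amount):
    # well-formed transaction: moves one valid state to another
    state['D'] += amount
    state['TB'] += amount

state = {'D': 0, 'W': 0, 'YB': 100, 'TB': 100}
tp_deposit(state, 50)
assert ivp_ok(state)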
Entities
• CDIs: constrained data items
– Data subject to integrity controls
• UDIs: unconstrained data items
– Data not subject to integrity controls
• IVPs: integrity verification procedures
– Procedures that test that the CDIs conform to the integrity
constraints
• TPs: transaction procedures
– Procedures that take the system from one valid state to
another
Certification Rules 1 and 2
CR1 When any IVP is run, it must ensure all CDIs
are in a valid state
CR2 For some associated set of CDIs, a TP must
transform those CDIs in a valid state into a
(possibly different) valid state
– Defines relation certified that associates a set of
CDIs with a particular TP
– Example: TP balance, CDIs accounts, in bank
example

Enforcement Rules 1 and 2
ER1 The system must maintain the certified
relations and must ensure that only TPs
certified to run on a CDI manipulate that CDI.
ER2 The system must associate a user with each
TP and set of CDIs. The TP may access those
CDIs on behalf of the associated user. The TP
cannot access that CDI on behalf of a user not
associated with that TP and CDI.
– System must maintain, enforce certified relation
– System must also restrict access based on user ID
(allowed relation)
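ER1 and ER2 reduce to membership tests on the two relations; a sketch in which certified holds (TP, CDI) pairs and allowed holds (user, TP, CDI) triples:

def may_run(user, tp, cdi, certified, allowed):
    # ER1: the TP must be certified for this CDI
    # ER2: and run on behalf of a user associated with (TP, CDI)
    return (tp, cdi) in certified and (user, tp, cdi) in allowed

certified = {('balance', 'accounts')}
allowed = {('teller1', 'balance', 'accounts')}
assert may_run('teller1', 'balance', 'accounts', certified, allowed)
assert not may_run('teller2', 'balance', 'accounts', certified, allowed)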
Users and Rules
CR3 The allowed relations must meet the
requirements imposed by the principle of
separation of duty.
ER3 The system must authenticate each user
attempting to execute a TP
– Type of authentication undefined, and depends on
the instantiation
– Authentication not required before use of the
system, but is required before manipulation of
CDIs (requires using TPs)

Logging
CR4 All TPs must append enough
information to reconstruct the operation
to an append-only CDI.
– This CDI is the log
– Auditor needs to be able to determine
what happened during reviews of
transactions

Handling Untrusted Input
CR5 Any TP that takes as input a UDI may
perform only valid transformations, or no
transformations, for all possible values of the
UDI. The transformation either rejects the
UDI or transforms it into a CDI.
– In bank, numbers entered at keyboard are UDIs,
so cannot be input to TPs. TPs must validate
numbers (to make them a CDI) before using them;
if validation fails, TP rejects UDI

Separation of Duty In Model
ER4 Only the certifier of a TP may change
the list of entities associated with that
TP. No certifier of a TP, or of an entity
associated with that TP, may ever have
execute permission with respect to that
entity.
– Enforces separation of duty with respect to
certified and allowed relations
Comparison With Requirements
1. Users can’t certify TPs, so CR5 and ER4
enforce this
2. Procedural, so model doesn’t directly cover it;
but special process corresponds to using TP
• No technical controls can prevent programmer from
developing program on production system; usual
control is to delete software tools
3. TP does the installation, trusted personnel do
certification
Comparison With Requirements
4. CR4 provides logging; ER3 authenticates
trusted personnel doing installation; CR5,
ER4 control installation procedure
• New program UDI before certification, CDI
(and TP) after
5. Log is CDI, so appropriate TP can
provide managers, auditors access
• Access to state handled similarly
Comparison to Biba
• Biba
– No notion of certification rules; trusted
subjects ensure actions obey rules
– Untrusted data examined before being made
trusted
• Clark-Wilson
– Explicit requirements that actions must meet
– Trusted entity must certify method to upgrade
untrusted data (and not certify the data itself)
Key Points
• Integrity policies deal with trust
– As trust is hard to quantify, these policies are
hard to evaluate completely
– Look for assumptions and trusted users to find
possible weak points in their implementation
• Biba based on multilevel integrity
• Clark-Wilson focuses on separation of duty
and transactions
Chapter 7: Hybrid Policies
• Overview
• Chinese Wall Model
• Clinical Information Systems Security
Policy
• ORCON
• RBAC

Overview
• Chinese Wall Model
– Focuses on conflict of interest
• CISS Policy
– Combines integrity and confidentiality
• ORCON
– Combines mandatory, discretionary access controls
• RBAC
– Base controls on job function

Chinese Wall Model
Problem:
– Tony advises American Bank about
investments
– He is asked to advise Toyland Bank about
investments
• Conflict of interest to accept, because his
advice for either bank would affect his
advice to the other bank
Organization
• Organize entities into “conflict of interest”
classes
• Control subject accesses to each class
• Control writing to all classes to ensure
information is not passed along in violation
of rules
• Allow sanitized data to be viewed by
everyone
Definitions
• Objects: items of information related to a
company
• Company dataset (CD): contains objects related
to a single company
– Written CD(O)
• Conflict of interest class (COI): contains datasets
of companies in competition
– Written COI(O)
– Assume: each object belongs to exactly one COI class

Example

• Bank COI Class: Bank of America, Citibank, Bank of the West
• Gasoline Company COI Class: Shell Oil, Standard Oil, Union ’76, ARCO

Temporal Element
• If Anthony reads any CD in a COI, he can
never read another CD in that COI
– Possible that information learned earlier may
allow him to make decisions later
– Let PR(S) be set of objects that S has already
read

CW-Simple Security Condition
• s can read o iff either condition holds:
1. There is an o′ such that s has accessed o′ and
CD(o′) = CD(o)
– Meaning s has read something in o’s dataset
2. For all o′ ∈ O, o′ ∈ PR(s) ⇒ COI(o′) ≠ COI(o)
– Meaning s has not read any objects in o’s conflict of
interest class
• Ignores sanitized data (see below)
• Initially, PR(s) = ∅, so initial read request
granted

Sanitization
• Public information may belong to a CD
– As is publicly available, no conflicts of interest
arise
– So, should not affect ability of analysts to read
– Typically, all sensitive data removed from such
information before it is released publicly (called
sanitization)
• Add third condition to CW-Simple Security
Condition:
3. o is a sanitized object

Writing
• Anthony, Susan work in same trading house
• Anthony can read Bank 1’s CD, Gas’ CD
• Susan can read Bank 2’s CD, Gas’ CD
• If Anthony could write to Gas’ CD, Susan
can read it
– Hence, indirectly, she can read information
from Bank 1’s CD, a clear conflict of interest

CW-*-Property
• s can write to o iff both of the following
hold:
1. The CW-simple security condition permits
s to read o; and
2. For all unsanitized objects o′, if s can read
o′, then CD(o′) = CD(o)
• Says that s can write to an object if all the
(unsanitized) objects it can read are in the
same dataset
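Both conditions can be evaluated from the access history. A sketch where CD and COI map objects to their dataset and conflict class, PR maps each subject to the set of objects it has read, and sanitized is the set of sanitized objects; it approximates “can read” in the CW-*-property by “has read”:

def cw_can_read(s, o, CD, COI, PR, sanitized):
    if o in sanitized:                               # condition 3
        return True
    if any(CD[p] == CD[o] for p in PR[s]):           # condition 1
        return True
    return all(COI[p] != COI[o] for p in PR[s]
               if p not in sanitized)                # condition 2

def cw_can_write(s, o, CD, COI, PR, sanitized):
    if not cw_can_read(s, o, CD, COI, PR, sanitized):
        return False
    # every unsanitized object s has read must be in o's dataset
    return all(CD[p] == CD[o] for p in PR[s]
               if p not in sanitized)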
Compare to Bell-LaPadula
• Fundamentally different
– CW has no security labels, B-LP does
– CW has notion of past accesses, B-LP does not
• Bell-LaPadula can capture state at any time
– Each (COI, CD) pair gets security category
– Two clearances, S (sanitized) and U (unsanitized)
• S dom U
– Subjects assigned clearance for compartments without
multiple categories corresponding to CDs in same COI
class

Compare to Bell-LaPadula
• Bell-LaPadula cannot track changes over time
– Susan becomes ill, Anna needs to take over
• C-W history lets Anna know if she can
• No way for Bell-LaPadula to capture this
• Access constraints change over time
– Initially, subjects in C-W can read any object
– Bell-LaPadula constrains set of objects that a subject
can access
• Can’t clear all subjects for all categories, because this violates
CW-simple security condition

Compare to Clark-Wilson
• Clark-Wilson Model covers integrity, so consider
only access control aspects
• If “subjects” and “processes” are interchangeable,
a single person could use multiple processes to
violate CW-simple security condition
– Would still comply with Clark-Wilson Model
• If “subject” is a specific person and includes all
processes the subject executes, then consistent
with Clark-Wilson Model

Clinical Information Systems Security Policy
• Intended for medical records
– Conflict of interest not critical problem
– Patient confidentiality, authentication of records and
annotators, and integrity are
• Entities:
– Patient: subject of medical records (or agent)
– Personal health information: data about patient’s health
or treatment enabling identification of patient
– Clinician: health-care professional with access to
personal health information while doing job

Assumptions and Principles
• Assumes health information involves 1
person at a time
– Not always true; OB/GYN involves father as
well as mother
• Principles derived from medical ethics of
various societies, and from practicing
clinicians

Access
• Principle 1: Each medical record has an
access control list naming the individuals
or groups who may read and append
information to the record. The system must
restrict access to those identified on the
access control list.
– Idea is that clinicians need access, but no-one
else. Auditors get access to copies, so they
cannot alter records
Access
• Principle 2: One of the clinicians on the
access control list must have the right to
add other clinicians to the access control
list.
– Called the responsible clinician

Access
• Principle 3: The responsible clinician must
notify the patient of the names on the
access control list whenever the patient’s
medical record is opened. Except for
situations given in statutes, or in cases of
emergency, the responsible clinician must
obtain the patient’s consent.
– Patient must consent to all treatment, and must
know of violations of security
Access
• Principle 4: The name of the clinician, the
date, and the time of the access of a
medical record must be recorded. Similar
information must be kept for deletions.
– This is for auditing. Don’t delete information;
update it (last part is for deletion of records
after death, for example, or deletion of
information when required by statute). Record
information about all accesses.

Creation
• Principle: A clinician may open a record,
with the clinician and the patient on the
access control list. If a record is opened as
a result of a referral, the referring clinician
may also be on the access control list.
– Creating clinician needs access, and patient
should get it. If created from a referral,
referring clinician needs access to get results of
referral.
Deletion
• Principle: Clinical information cannot be
deleted from a medical record until the
appropriate time has passed.
– This varies with circumstances.

Confinement
• Principle: Information from one medical
record may be appended to a different
medical record if and only if the access
control list of the second record is a subset
of the access control list of the first.
– This keeps information from leaking to
unauthorized users. All users have to be on the
access control list.
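The confinement principle is a subset test on access control lists; a sketch in which acl maps each record to the set of principals on its list (the record names are made up):

def may_append_from(src, dst, acl):
    # append from record src into record dst only if
    # ACL(dst) ⊆ ACL(src): readership can only shrink
    return acl[dst] <= acl[src]

acl = {'rec_a': {'dr_jones', 'dr_smith'}, 'rec_b': {'dr_jones'}}
assert may_append_from('rec_a', 'rec_b', acl)
assert not may_append_from('rec_b', 'rec_a', acl)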

Aggregation
• Principle: Measures for preventing aggregation of
patient data must be effective. In particular, a
patient must be notified if anyone is to be added
to the access control list for the patient’s record
and if that person has access to a large number of
medical records.
– Fear here is that a corrupt investigator may obtain
access to a large number of records, correlate them,
and discover private information about individuals
which can then be used for nefarious purposes (such as
blackmail)
Enforcement
• Principle: Any computer system that
handles medical records must have a
subsystem that enforces the preceding
principles. The effectiveness of this
enforcement must be subject to evaluation
by independent auditors.
– This policy has to be enforced, and the
enforcement mechanisms must be auditable
(and audited)
Compare to Bell-LaPadula
• Confinement Principle imposes lattice
structure on entities in model
– Similar to Bell-LaPadula
• CISS focuses on objects being accessed; B-
LP on the subjects accessing the objects
– May matter when looking for insiders in the
medical environment

Compare to Clark-Wilson
– CDIs are medical records
– TPs are functions updating records, access control lists
– IVPs certify:
• A person identified as a clinician is a clinician;
• A clinician validates, or has validated, information in the
medical record;
• When someone is to be notified of an event, such notification
occurs; and
• When someone must give consent, the operation cannot
proceed until the consent is obtained
– Auditing (CR4) requirement: make all records append-
only, notify patient when access control list changed
ORCON
• Problem: organization creating document
wants to control its dissemination
– Example: Secretary of Agriculture writes a
memo for distribution to her immediate
subordinates, and she must give permission for
it to be disseminated further. This is
“originator controlled” (here, the “originator”
is a person).

Requirements
• Subject s ∈ S marks object o ∈ O as ORCON on
behalf of organization X. X allows o to be
disclosed to subjects acting on behalf of
organization Y with the following restrictions:
1. o cannot be released to subjects acting on behalf of
other organizations without X’s permission; and
2. Any copies of o must have the same restrictions
placed on them.

DAC Fails
• Owner can set any desired permissions
– This makes restriction 2 unenforceable

MAC Fails
• First problem: category explosion
– Category C contains o, X, Y, and nothing else. If a
subject y ∈ Y wants to read o, x ∈ X makes a copy o′.
Note o′ has category C. If y wants to give z ∈ Z a copy,
z must be in Y—by definition, it’s not. If x wants to let
w ∈ W see the document, need a new category C′
containing o, X, W.
• Second problem: abstraction
– MAC classification, categories centrally controlled,
and access controlled by a centralized policy
– ORCON controlled locally
Combine Them
• The owner of an object cannot change the access
controls of the object.
• When an object is copied, the access control
restrictions of that source are copied and bound to
the target of the copy.
– These are MAC (owner can’t control them)
• The creator (originator) can alter the access
control restrictions on a per-subject and per-
object basis.
– This is DAC (owner can control it)
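A sketch of how the two halves fit together (the class and its methods are invented for illustration): the restriction set travels with every copy, and only the originator may change it:

class OrconDocument:
    def __init__(self, originator, allowed_orgs):
        self.originator = originator
        self.allowed_orgs = set(allowed_orgs)

    def copy(self):
        # MAC half: restrictions are copied and bound to the copy,
        # and the holder of the copy cannot change them
        return OrconDocument(self.originator, self.allowed_orgs)

    def set_allowed(self, who, orgs):
        # DAC half: the originator alone adjusts the restrictions
        if who != self.originator:
            raise PermissionError('only the originator may change this')
        self.allowed_orgs = set(orgs)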
RBAC
• Access depends on function, not identity
– Example:
• Allison, bookkeeper for Math Dept, has access to
financial records.
• She leaves.
• Betty hired as the new bookkeeper, so she now has
access to those records
– The role of “bookkeeper” dictates access, not
the identity of the individual.
Definitions
• Role r: collection of job functions
– trans(r): set of authorized transactions for r
• Active role of subject s: role s is currently in
– actr(s)
• Authorized roles of a subject s: set of roles s is
authorized to assume
– authr(s)
• canexec(s, t) iff subject s can execute transaction t
at current time

Axioms
• Let S be the set of subjects and T the set of
transactions.
• Rule of role assignment:
(∀s ∈ S)(∀t ∈ T) [canexec(s, t) → actr(s) ≠ ∅].
– If s can execute a transaction, it has a role
– This ties transactions to roles
• Rule of role authorization:
(∀s ∈ S) [actr(s) ⊆ authr(s)].
– Subject must be authorized to assume an active role
(otherwise, any subject could assume any role)
Axiom
• Rule of transaction authorization:
(∀s ∈ S)(∀t ∈ T)
[canexec(s, t) → t ∈ trans(actr(s))].
– If a subject s can execute a transaction, then
the transaction is an authorized one for the role
s has assumed

Containment of Roles
• Trainer can do all transactions that trainee
can do (and then some). This means role r
contains role r′ (r > r′). So:
(∀s ∈ S)[ r ∈ authr(s) ∧ r > r′ → r′ ∈ authr(s) ]

Separation of Duty
• Let r be a role, and let s be a subject such that r ∈
authr(s). Then the predicate meauth(r) (for
mutually exclusive authorizations) is the set of
roles that s cannot assume because of the
separation of duty requirement.
• Separation of duty:
(∀r1, r2 ∈ R) [ r2 ∈ meauth(r1) →
[ (∀s ∈ S) [ r1∈ authr(s) → r2 ∉ authr(s) ] ] ]
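The axioms become executable checks. A sketch assuming one active role per subject; trans maps roles to their transaction sets, authr maps subjects to authorized roles, actr gives the active role, and meauth maps a role to the roles it excludes:

def canexec(s, t, actr, authr, trans):
    # role assignment, role authorization, transaction authorization
    r = actr.get(s)
    return r is not None and r in authr[s] and t in trans[r]

def sod_ok(authr, meauth):
    # no subject may be authorized for two mutually exclusive roles
    return all(r2 not in roles
               for roles in authr.values()
               for r1 in roles
               for r2 in meauth.get(r1, set()))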

Key Points
• Hybrid policies deal with both
confidentiality and integrity
– Different combinations of these
• ORCON model neither MAC nor DAC
– Actually, a combination
• RBAC model controls access based on
functionality

