CS363 Week 8 - Friday
Last time
What did we talk about last time?
Bell-La Padula model
Clark-Wilson model
Chinese Wall model
Biba model
Questions?
Project 2
Assignment 3
Security Presentation: Taylor Ryan
Theoretical Limitations on Access Control
Determining security
How do we know if something is secure?
We define our security policy using our access control matrix.
We say that a right is leaked if it is added to an element of the access control matrix that doesn't already have it.
A system is secure if there is no way rights can be leaked.
Is there an algorithm to determine if a system is secure?
Mono-operational systems
In a mono-operational system, each command consists of a single primitive command:
Create subject s
Create object o
Enter r into a[s,o]
Delete r from a[s,o]
Destroy subject s
Destroy object o
In this system, we could see if a right is leaked with a sequence of at most k commands
Proof
Delete and Destroy commands can be ignored.
No more than one Create command is needed (in the case that there are no subjects).
Entering rights is the trouble.
We start with a set S0 of subjects and a set O0 of objects.
With n generic rights, we might add all n rights to everything before we leak a right.
Thus, the maximum length of the command sequence that leaks a right is k ≤ n(|S0|+1)(|O0|+1) + 1
If there are m different commands, how many different command sequences are possible?
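The bound can be checked numerically. A minimal sketch, where the values of n, |S0|, |O0|, and m are hypothetical, chosen only to illustrate the formulas:

```python
# Upper bound on the command-sequence length needed to leak a right
# in a mono-operational system. All concrete values below are made up
# just to exercise the formula.

def max_leak_sequence_length(n, s0, o0):
    """k <= n*(|S0|+1)*(|O0|+1) + 1 commands suffice to witness a leak."""
    return n * (s0 + 1) * (o0 + 1) + 1

n, s0, o0 = 3, 4, 5   # 3 generic rights, 4 subjects, 5 objects
m = 6                 # 6 distinct commands in the system

k = max_leak_sequence_length(n, s0, o0)
print(k)              # 3 * 5 * 6 + 1 = 91

# Brute force would examine up to m + m^2 + ... + m^k sequences:
total = sum(m**i for i in range(1, k + 1))
print(total)          # astronomically large even for this tiny system
```

Even with only 6 commands and a bound of k = 91, the search space dwarfs anything feasible, which is why decidable does not mean practical here.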
Turing machine
A Turing machine is a mathematical model for computation
It consists of a head, an infinitely long tape, a set of possible states, and an alphabet of characters that can be written on the tape
A list of rules says what the machine should write and whether it should move left or right, given the current symbol and state
[Tape illustration: 1 0 1 1 1 1 0 0 0 0, with the head in state A]
Turing machine example
3-state, 2-symbol "busy beaver" Turing machine:
Starting state: A

Tape symbol   State A               State B               State C
              Write  Move  Next     Write  Move  Next     Write  Move  Next
0             1      R     B        0      R     C        1      L     C
1             1      R     HALT     1      R     B        1      L     A
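The table can be executed directly. A small sketch of a Turing-machine simulator for this busy beaver; the dict-based sparse tape is just one convenient representation:

```python
# Simulate the 3-state, 2-symbol "busy beaver" machine from the table
# above. The tape is a dict mapping position -> symbol; unwritten
# cells default to 0.

RULES = {  # (state, symbol) -> (write, move, next_state)
    ("A", 0): (1, +1, "B"), ("A", 1): (1, +1, "HALT"),
    ("B", 0): (0, +1, "C"), ("B", 1): (1, +1, "B"),
    ("C", 0): (1, -1, "C"), ("C", 1): (1, -1, "A"),
}

def run(rules, start="A"):
    tape, pos, state, steps = {}, 0, start, 0
    while state != "HALT":
        write, move, state = rules[(state, tape.get(pos, 0))]
        tape[pos] = write
        pos += move
        steps += 1
    return steps, sum(tape.values())

steps, ones = run(RULES)
print(steps, ones)  # halts after 14 steps with six 1s on the tape
```

This particular machine is the champion 3-state busy beaver: no 3-state, 2-symbol machine that halts leaves more 1s behind.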
Church-Turing thesis
If an algorithm exists, a Turing machine can perform that algorithm
In essence, a Turing machine is the most powerful model we have of computation
Power, in this sense, means the ability to compute some function, not the speed associated with its computation
Halting problem
Given a Turing machine and input x, does it reach the halt state?
It turns out that this problem is undecidable
That means there is no algorithm that can determine whether an arbitrary Turing machine will go into an infinite loop
Consequently, there is no algorithm that can take any program and check to see if it goes into an infinite loop
Leaking Is Undecidable
Simulate a Turing machine
We can simulate a Turing machine using an access control matrix
We map the symbols, states and tape for the Turing machine onto the rights and cells of an access control matrix
Discovering whether or not the right leaks is equivalent to the Turing machine halting with a 1 or a 0
The bad news
Without heavy restrictions on the rules for an access control system, it is impossible to construct an algorithm that will determine if a right leaks
Even for a mono-operational system, the problem might take an infeasible amount of time
But, we don't give up! There are still lots of ways to model security.
Some of them offer more practical results.
Secure Design Principles
Secure design principles
Saltzer and Schroeder wrote an important paper in 1975 that gave 8 principles that should be used in the design of any security mechanism:
1. Least privilege
2. Fail-safe defaults
3. Economy of mechanism
4. Complete mediation
5. Open design
6. Separation of privilege
7. Least common mechanism
8. Psychological acceptability
These principles will be part of Project 3
Principle of least privilege
The principle of least privilege states that a subject should be given only those privileges that it needs in order to complete its task
This principle restricts how privileges are granted
You're not supposed to get any more privileges than absolutely necessary
Examples: JayWeb, Unix systems, Windows systems?
Principle of fail-safe defaults
The principle of fail-safe defaults states that, unless a subject is given explicit access to an object, it should be denied access to that object
This principle restricts how privileges are initialized
A subject should always be assumed not to have access
Examples: Airports, Unix systems, Windows systems?
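The default-deny idea can be sketched in a few lines. The ACL contents and names below are made up for illustration:

```python
# Fail-safe defaults: an access check that denies unless permission
# was explicitly granted. The ACL is a hypothetical example.

acl = {("alice", "report.txt"): {"read", "write"},
       ("bob", "report.txt"): {"read"}}

def is_allowed(subject, obj, right):
    # dict.get with an empty default means "no entry" falls through
    # to denial -- access is never granted by accident.
    return right in acl.get((subject, obj), set())

print(is_allowed("bob", "report.txt", "read"))    # True
print(is_allowed("bob", "report.txt", "write"))   # False (never granted)
print(is_allowed("eve", "report.txt", "read"))    # False (no entry at all)
```

The important case is the last one: a subject the system has never heard of gets denial, not an exception or an accidental grant.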
Principle of economy of mechanism
The principle of economy of mechanism states that security mechanisms should be as simple as possible
This principle simplifies the design and implementation of security mechanisms
The more complex a system is, the more assumptions that are built in
Complex systems are hard to test.
Examples: Die Hard, Houdini
Principle of complete mediation
The principle of complete mediation requires that all accesses to objects be checked to ensure that they are allowed
This principle restricts the caching of information (and also direct access to resources)
The OS must mediate all accesses and make no assumptions that privileges haven't changed
Examples: Banks, Unix systems
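A hedged sketch of the difference complete mediation makes, using a hypothetical ACL: because the right is re-checked on every access rather than cached, a revocation takes effect immediately.

```python
# Complete mediation: consult the ACL on *every* access, making no
# assumption that a past grant still holds. ACL and names are made up.

acl = {("alice", "ledger"): {"read"}}

def read(subject, obj):
    # Checked on every call -- a cached "yes" from an earlier call
    # would miss a revocation.
    if "read" not in acl.get((subject, obj), set()):
        raise PermissionError(f"{subject} may not read {obj}")
    return f"contents of {obj}"

print(read("alice", "ledger"))             # allowed now
acl[("alice", "ledger")].discard("read")   # the right is revoked
try:
    read("alice", "ledger")
except PermissionError as e:
    print(e)                               # denied immediately after revocation
```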
Principle of open design
The principle of open design states that the security of a mechanism should not depend on the secrecy of its design or implementation
"Security through obscurity" fallacy Examples
Enigma RSA Lock-picking
Principle of separation of privilege
The principle of separation of privilege states that a system should not grant permission based on a single condition
Security should be based on several different conditions (perhaps two-factor authentication)
Ideally, secure mechanisms should depend on two or more independent verifiers
Examples: Nuclear launch keys, PhD qualifying exams, Roaccutane (used to be Accutane)
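A toy sketch of the principle, in the spirit of the two launch keys; the factors and secret values are invented:

```python
# Separation of privilege: the action is permitted only when two
# *independent* conditions both hold. No single condition suffices.

def password_ok(password):
    return password == "hunter2"          # first factor (something known)

def token_ok(code):
    return code == "914076"               # second factor (something held)

def authenticate(password, code):
    # Both independent verifiers must agree.
    return password_ok(password) and token_ok(code)

print(authenticate("hunter2", "914076"))  # True: both factors present
print(authenticate("hunter2", "000000"))  # False: one factor alone fails
```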
Principle of least common mechanism
The principle of least common mechanism states that mechanisms used to access resources should not be shared
Sharing allows for channels for communication
Sharing also lets malicious users or programs affect the integrity of other programs or data
Examples: Virtual memory, File systems
Principle of psychological acceptability
The principle of psychological acceptability states that security mechanisms should not make the resource (much) more difficult to access than if the security mechanisms were not present
Twofold issues:
Users must not be inconvenienced, or they might fight against the system or take their business elsewhere.
Administrators must find the system easy to administer.
Examples: Windows UAC, Retina scans, Changing your password all the time
OS Security Features
Regular OS
A typical OS will make efforts to protect security in a number of ways:
User authentication
Memory protection
File and I/O device access control
Allocation and access to general objects
Enforced sharing
Guaranteed fair service
Interprocess communication and synchronization
Protection of OS data
Trusted OS
A trusted OS is similar to a normal OS, except that it puts a layer of access control around everything
A trusted OS will typically be careful about:
User identification and authentication
Mandatory access control
Discretionary access control
Object reuse protection
Complete mediation
Trusted paths
Auditing
Intrusion detection
Mandatory and discretionary access control
Mandatory access control (MAC) means that the controls are enforced by rules in the system, not by user choices.
Bell-La Padula is a perfect example of MAC.
Discretionary access control (DAC) means that the user has control over who can access the objects he or she owns.
Linux and Windows are largely DAC systems.
Most real systems have elements of both.
Object reuse
When a file is deleted, it isn't actually deleted. Its blocks are unlinked from the file system.
When you create a new file, it usually uses a block from an old deleted file.
You can examine the contents of that block and reconstruct some or all of the deleted file.
Software is available for home users to undelete files.
Digital forensics experts use more powerful tools in criminal investigations.
The problem is that object reuse allows for security violations.
A regular OS often does this and other kinds of object reuse for efficiency.
A trusted OS will sacrifice efficiency for security.
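The trusted-OS behavior can be sketched with a toy block pool; the block size and contents are invented. Scrubbing a freed block before handing it out prevents the old file's contents from leaking:

```python
# Object-reuse protection, simplified: a regular OS hands a freed
# block to the next file as-is; a trusted OS overwrites it first.

BLOCK_SIZE = 16
free_list = [bytearray(b"old secret data!")]  # block from a deleted file

def allocate_block(scrub=True):
    block = free_list.pop() if free_list else bytearray(BLOCK_SIZE)
    if scrub:
        block[:] = bytes(BLOCK_SIZE)   # zero the block before reuse
    return block

blk = allocate_block(scrub=True)
print(bytes(blk))   # all zero bytes -- nothing recoverable
```

With `scrub=False` the new file's owner would receive `b"old secret data!"` verbatim, which is exactly the forensic-recovery scenario described above.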
Complete mediation and trusted paths
Complete mediation means that every access goes through the system.
All resources are checked.
Past permissions are no guarantee of future permissions.
A trusted path means an unmistakable process for performing protected tasks.
Phishing is the opposite of a trusted path.
Some attacks on OS users rely on getting them to download a file with the same name as a system command, which will then be run instead if they execute from the same directory.
Auditing
Trusted systems also keep an audit log of all security-relevant actions that have been taken
Unfortunately, audit logs can become huge.
Even if an illegal access is known to have happened, it might be impossible to find it in the logs.
Audit reduction is the process of reducing the size of the log to critical events.
This may require sophisticated pattern recognition software.
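Audit reduction can be sketched as a simple filter; the log format and the notion of "critical" events here are invented for illustration:

```python
# Audit reduction: keep failures and security-critical operations,
# drop routine successes. Entries below are hypothetical.

log = [
    {"event": "login",  "user": "alice", "ok": True},
    {"event": "read",   "user": "alice", "ok": True},
    {"event": "login",  "user": "eve",   "ok": False},
    {"event": "login",  "user": "eve",   "ok": False},
    {"event": "delete", "user": "bob",   "ok": True},
]

CRITICAL = {"delete"}

def reduce_audit(entries):
    return [e for e in entries if not e["ok"] or e["event"] in CRITICAL]

for e in reduce_audit(log):
    print(e)   # two failed logins and one delete survive the reduction
```

Real audit reduction is far more sophisticated (correlation, pattern recognition), but the shape is the same: shrink the haystack before looking for the needle.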
Kernelized design
One approach to making a trusted system is a kernelized design
A security kernel is the low-level part of the OS that enforces security mechanisms.
It can be a unified layer sitting between hardware and the rest of the OS.
Or it can be spread throughout the entire OS.
The reference monitor is the most important part of the security kernel.
It controls accesses to objects.
It should be tamperproof, unbypassable, and analyzable.
Virtualization
Virtualization means presenting the user with a virtual machine.
The user can interact with the virtual machine but cannot directly affect the real hardware.
Virtual memory is a great example of this.
Your program sees memory starting at 0 and going up to some limit, but the OS maps this transparently to the real memory.
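The address translation behind virtual memory can be sketched with a toy page table; the page size and mappings are made up:

```python
# Virtual memory sketch: the program uses addresses starting at 0;
# a page table maps each virtual page to a physical frame, so the
# program never touches physical memory directly.

PAGE_SIZE = 4096
page_table = {0: 7, 1: 2, 2: 9}   # virtual page -> physical frame

def translate(virtual_addr):
    page, offset = divmod(virtual_addr, PAGE_SIZE)
    if page not in page_table:
        raise MemoryError("page fault: unmapped page")
    return page_table[page] * PAGE_SIZE + offset

print(translate(0))     # 7*4096 + 0 = 28672
print(translate(4100))  # 2*4096 + 4 = 8196
```

The unmapped-page branch is where a real OS would fault in a page from disk or kill the process; here it just illustrates that the virtual machine abstraction is enforced on every access.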
OS Assurance
Common OS Security Flaws
User interaction is problematic because input is often not under the direct control of the OS.
Hardware can vary, and it is hard to check all software drivers.
Sometimes security measures are bypassed for efficiency.
Ambiguity in access policy
Incomplete mediation
Generality: customizability leads to unpredictable configurations or special modules that need high-privilege access
Time-of-check to time-of-use issues
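The last flaw deserves a sketch. In a TOCTOU race, the check and the use are separate steps, and the world can change in between; the safer pattern attempts the operation directly and handles failure. The filenames below are created in a temp directory just for the demonstration:

```python
# Time-of-check to time-of-use (TOCTOU): the gap between checking a
# file and using it is a window an attacker can exploit.

import os, tempfile

path = os.path.join(tempfile.mkdtemp(), "config")
with open(path, "w") as f:
    f.write("safe")

# Vulnerable pattern: check, then use.
if os.access(path, os.R_OK):       # time of check
    os.remove(path)                # ...the file changes out from under us
    try:
        open(path).read()          # time of use -- the check is stale
    except FileNotFoundError:
        print("file changed between check and use")

# Safer pattern: collapse check and use into one step and handle
# failure at the point of use.
try:
    data = open(path).read()
except FileNotFoundError:
    data = None
print(data)   # None
```

Here the "attacker" is simulated by `os.remove`; in a real race it would be a concurrent process swapping the file for a symlink or a different file between the two steps.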
Assurance
There are many methods to provide assurance that a system has few vulnerabilities:
Testing
Penetration testing
Formal verification
Validation
Open source model
Testing
We discussed testing briefly before. It has problems:
Testing can find problems, but it can't find the lack of problems.
Testing takes time and effort because the number of states a program can undergo is exponential in its length.
Black box testing cannot be guaranteed to be complete.
Code introduced into a program to test it can change its behavior.
Complex systems can have errors that are difficult to reproduce.
It is still the most common form of assurance.
Penetration Testing
Penetration testing (or tiger team analysis or ethical hacking) is a kind of testing where experts try to use every trick they can to break a system
It is an art requiring creativity and a science requiring deep technical knowledge
It is not a panacea, but there is money to be made as a penetration tester
Formal verification
It is possible to prove that some programs do specific things.
You start with a set of preconditions.
You transform those conditions with each operation.
You can then guarantee that, with the initial preconditions, certain postconditions will be met.
Using this precondition/postcondition approach to formally describe programming languages is called Hoare semantics.
Proving things about complex programs is hard and requires automated use of programs called theorem provers.
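The precondition/postcondition idea can be sketched with ordinary assertions; the division routine and its conditions are illustrative, not output from a real theorem prover:

```python
# Hoare-style sketch: asserts encode the precondition {P} and
# postcondition {Q} around an integer-division routine, in the
# spirit of the triple {P} code {Q}.

def int_divide(a, b):
    # Precondition P: a >= 0 and b > 0
    assert a >= 0 and b > 0
    q, r = a // b, a % b
    # Postcondition Q: a == q*b + r and 0 <= r < b
    assert a == q * b + r and 0 <= r < b
    return q, r

print(int_divide(17, 5))   # (3, 2): 17 == 3*5 + 2
```

A theorem prover does at compile time what these asserts do at run time: it shows Q follows from P for every input, not just the ones you happened to try.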
Validation
Validation is checking the design against the requirements.
Verification is checking the implementation against the design.
OS validation is often done in the following ways:
Requirements checking
Design and code reviews
System testing
Open source systems
In open source systems, the software is freely available for public use and criticism.
In most cases, anyone sufficiently skilled can even add their own code to the systems.
They are popular.
Microsoft CEO Steve Ballmer said in 2008 that 60% of the web servers in the world run Linux.
The open source security advantage is that a huge number of people can look for flaws.
The open source security disadvantage is the same.
Research suggests that a product being open source or closed source is not the key determiner of security.
Upcoming
Next time…
Finish OS assurance and evaluation
Database background
Database security requirements
Claire Chambless presents
Reminders
Read Sections 6.1 and 6.2
Finish Assignment 3
Due tonight before midnight
Keep working on Project 2
Due next Friday