
UNIT – IV

• Database Security: Security Requirements of Databases, Reliability and Integrity, Database Disclosure.

• Cloud Computing Security: Introduction to Cloud Computing, Service and Deployment Models, Risk Analysis, Cloud as a Security Control,
Cloud Security Tools and Techniques, Cloud Identity Management,
Securing IaaS.
Introduction to Database
Security
• Definition: Database security involves protecting databases from
unauthorized access, misuse, or corruption.
• Importance: With the increasing amount of data collected, ensuring
database security is critical to maintain confidentiality, integrity, and
availability.
• Confidentiality ensures that sensitive data is accessible only to authorized
users.
• Integrity refers to the accuracy and consistency of data within the database.
(ACID properties)
• Availability ensures that authorized users have access to data and database
services when needed
As a result, a database offers many advantages over a simple file system:
• shared access, so that many users can use one common, centralized set of
data
• controlled access, so that only authorized users are allowed to view or to
modify data values
• minimal redundancy, so that individual users do not have to collect and
maintain their own sets of data
• data consistency, so that a change to a data value affects all users of the data
value
• data integrity, so that data values are protected against accidental or
malicious undesirable changes
Security Requirements of
Databases
• The basic problems are access control, exclusion of spurious data, authentication of users, and reliability.
• Access control refers to the mechanisms that restrict access to data within
a database to authorized users only.
• Exclusion of Spurious Data : This principle ensures that only valid and
relevant data is stored and presented in the database.
• User authentication is the process of verifying the identity of users
attempting to access the database.
• Reliability refers to the assurance that a database will function correctly
and consistently over time.
Following is a list of requirements
for database security
• 1.Physical Database Integrity: Protects data from physical damage and
ensures recoverability.
• 2.Logical Database Integrity: Maintains the structure and relationships
within the database.
• 3.Element Integrity: Ensures accuracy of individual data elements.
• 4. Auditability: Tracks access and modifications to data for
accountability
Database Integrity (1 & 2)

• Central Repository: A database serves as a central repository of data, ensuring users can trust the accuracy of data values.
• Authorized Updates: Only authorized individuals should perform updates
to maintain integrity.
• Protection from Corruption: Data must be safeguarded against corruption
from external threats (e.g., illegal programs, natural disasters).
• Situations Affecting Integrity
• Whole Database Damage: Can occur due to storage medium failures
• Individual Data Items Unreadable: Specific data points may become inaccessible,
impacting overall reliability.
• Responsibilities for Maintaining Integrity
• DBA: Manages the overall structure and integrity of the database.
• Operating System: Provides the underlying support for database operations.
• Human Computing System Manager: Oversees database management and user access.
• Backup Strategies
• Regular Backups: Essential for protecting the database against failures.
• Reconstruction Needs: Administrators must be able to restore databases to a stable
point after failures (e.g., power outages during transactions).
• The DBMS should maintain a log of all transactions for recovery purposes.
3. Element Integrity
• Definition: Element integrity refers to the correctness and accuracy of individual data
elements in a database.
• User Responsibility: Authorized users are responsible for entering accurate data, but
mistakes can occur during data collection and entry.
• Challenges in Data Entry
• Human Error: Users and programs can make mistakes when collecting or entering data.
• Impact of Errors: Incorrect data can lead to significant issues, affecting the reliability of the entire
database.
• Mechanisms for Ensuring Element Integrity
• Three Main Actions:
• Field Checks
• Access Control
• Change Logs
Field Checks

• Definition: Activities that test for appropriate values in a specific field.
• Types of Checks: Numeric requirements (e.g., age must be a number).
• Character restrictions (e.g., uppercase letters only).
• Value bounds (e.g., ensuring a value does not exceed the sum of two
other fields).
• Purpose: Prevent simple errors during data entry.
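The three kinds of field checks above can be sketched in a few lines; the field names and the specific rules are invented for illustration, not drawn from any particular DBMS.

```python
# Illustrative field checks: a numeric requirement, a character
# restriction, and a value bound tying one field to two others.

def check_age(value):
    """Numeric requirement: age must be an integer in a plausible range."""
    return isinstance(value, int) and 0 <= value <= 130

def check_state_code(value):
    """Character restriction: two uppercase letters only."""
    return len(value) == 2 and value.isalpha() and value.isupper()

def check_total(total, part_a, part_b):
    """Value bound: a total must not exceed the sum of two other fields."""
    return total <= part_a + part_b

# A record is accepted only if every field check passes.
record = {"age": 21, "state": "NY", "total": 90, "part_a": 50, "part_b": 50}
valid = (check_age(record["age"])
         and check_state_code(record["state"])
         and check_total(record["total"], record["part_a"], record["part_b"]))
print(valid)  # True
```

Such checks catch only simple entry errors (a letter in a numeric field, an impossible bound); they cannot detect a plausible but wrong value, which is why access controls and change logs are also needed.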
Access Control

• Importance of Centralization: Databases centralize data storage, reducing redundancy and ensuring consistency.
• Example Scenario: A student’s address stored in multiple files
(registrar, food service, bookstore) can lead to outdated or conflicting
information.
• Addressing Access Control Challenges
• Centralization Benefits: Ensures users have the correct information without
needing to update multiple files.
• Database Administrator Role: Responsible for resolving policy questions
related to access control and modifications.
Change Logs

• Definition: A change log records every modification made to the database, including original and modified values.
• Functionality: Allows administrators to undo erroneous changes.
• Example: Correcting an incorrectly posted library fine by referencing
the change log.
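A minimal change-log sketch, using an in-memory dictionary rather than a real DBMS API: each log entry keeps the original and modified values, so an administrator can undo an erroneous change such as the mis-posted library fine.

```python
# Toy database and change log; the record ids and fields are invented.
database = {"fine_1024": {"amount": 5.00}}
change_log = []

def update(record_id, field, new_value):
    """Apply a change, logging the original and modified values."""
    old_value = database[record_id][field]
    change_log.append((record_id, field, old_value, new_value))
    database[record_id][field] = new_value

def undo_last():
    """Restore the original value recorded in the most recent log entry."""
    record_id, field, old_value, _ = change_log.pop()
    database[record_id][field] = old_value

update("fine_1024", "amount", 50.00)    # fine posted incorrectly
undo_last()                             # administrator reverses it
print(database["fine_1024"]["amount"])  # 5.0
```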
4. Auditability

• Definition: Auditability refers to the ability to generate records of all access (read or
write) to a database.
• Purpose: Helps maintain database integrity and identifies who affected what values
and when.
• Benefits of Audit Records
• Integrity Maintenance: Audit records assist in tracking changes and maintaining data integrity.
• Incremental Access: Users can access protected data incrementally, similar to discovering clues
in a detective novel.
• Clue Tracking: Audit trails can identify which pieces of information a user has already accessed.
• Granularity in Auditing
• Granularity Issues: Audited events may not be specific enough (e.g., actions like open file vs.
specific record access).
• Record-Level Access: Effective audit trails should ideally include detailed access at the record,
field, and element levels.
• Challenges: High granularity can be prohibitive for most database applications.
2. Reliability and Integrity
• Users expect a DBMS to protect their data from loss or damage.
• Reliability refers to the ability of a DBMS to function without failure over
extended periods, while integrity ensures the accuracy and consistency of data.
• User Expectations
• Reliability Expectations: Users expect the DBMS to provide consistent access to data
without interruptions.
• Data Protection: Users trust the DBMS to safeguard their data against loss or corruption.
• General Security Issues: Concerns for reliability and integrity are common security issues
but are particularly pronounced in databases.
• A DBMS guards against loss or damage in several ways
A DBMS guards against loss or
damage in several ways
• Database integrity: concern that the database as a whole is protected
against damage, as from the failure of a disk drive or the corruption of
the master database index. These concerns are addressed by operating
system integrity controls and recovery procedures.
• Element integrity: concern that the value of a specific data element is
written or changed only by authorized users. Proper access controls
protect a database from corruption by unauthorized users.
• Element accuracy: concern that only correct values are written into the
elements of a database. Checks on the values of elements can help
prevent insertion of improper values. Also, constraint conditions can
detect incorrect values.
Protection Features from the
Operating System
• The protection features provided by the operating system for databases include several key
aspects:
• Regular Backups: A responsible system administrator periodically backs up the files of a
database along with other user files. This practice is essential for data recovery in case of
catastrophic failures.
• Access Control: During normal operation, the operating system employs standard access
control mechanisms to protect database files from unauthorized outside access. This ensures
that only authorized users can access sensitive data.
• Integrity Checks: The operating system performs integrity checks on all data as part of the
normal read and write operations for input/output (I/O) devices. These checks help maintain
the accuracy and consistency of the data stored in the database.
• While these controls provide a basic level of security for databases, it is important to note that
the database manager must implement additional measures to enhance security further
• The Two-Phase Update technique is a critical method used in database management systems (DBMS) to
ensure data integrity during modifications.
• Problem Overview
• A significant challenge for database managers is the potential failure of the computing system during data
modification.
• If a modification involves a long field or a record with multiple attributes, only part of the new data may be
saved, leading to incorrect data in the database.
• Two-Phase Update Technique
• The two-phase update consists of two distinct phases: the intent phase and the commit phase.
• Intent Phase
• Preparation: The DBMS gathers all necessary resources for the update, such as data, dummy records, and file access. It
locks out other users and calculates final answers, but no changes are made to the database at this stage.
• Repeatability: This phase can be repeated indefinitely without any permanent changes. If a system failure occurs during
this phase, all steps can be restarted once the system is back online.
• Committing: The last action in this phase is writing a commit flag to the database, indicating that the DBMS has reached a
point of no return. After this point, the DBMS will begin making permanent changes.
• Commit Phase
• Permanent Changes: In this phase, the DBMS makes the actual updates to the database.
• Actions from the intent phase cannot be repeated, but the activities of the commit phase can be repeated as needed.
• Handling Failures: If a failure occurs during this phase, the database may contain incomplete data, but the commit-phase activities can simply be repeated from the commit flag to complete the update.
• Example Scenario
• Consider a database managing the inventory of office supplies. When the accounting
department requisitions 50 boxes of paper clips, the following steps are executed:
1. Check Inventory: The stockroom verifies that 50 boxes are available. If not, the requisition is rejected.
2. Update Inventory: If sufficient stock is available, the inventory is updated (e.g., 107 - 50 = 57).
3. Charge Budget: The accounting department's budget is charged for the requisitioned items.
4. Reorder Check: The stockroom checks if the remaining quantity is below the reorder point and flags the
item as "on order" if necessary.
5. Prepare Delivery: A delivery order is prepared for the accounting department.
• Importance of Order
• All steps must be completed in the specified order to maintain database accuracy. If a failure occurs before
the first step is complete, the transaction can be restarted without issues. However, if a failure occurs
during the update steps, the database may become inconsistent, leading to potential errors such as double
deductions or charges.
• Shadow Values
• During the two-phase commit, shadow values are maintained for key data points. These values are
computed and stored locally during the intent phase and copied to the actual database during the commit
phase.
• This ensures that each step of the intent phase relies only on unmodified values, allowing for unlimited
repetitions without compromising database integrity.
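The intent phase, commit flag, and shadow values can be sketched as follows. The figures match the paper-clip example (107 boxes on hand, 50 requisitioned); the in-memory structure and function names are illustrative, not a real DBMS interface.

```python
# Toy two-phase update with a commit flag and shadow values.
database = {"inventory": 107, "budget": 1000, "commit_flag": False}

def two_phase_update(requested=50, unit_cost=2):
    # --- Intent phase: check stock and compute shadow values.
    # Nothing in the database changes yet, so this phase is repeatable.
    if database["inventory"] < requested:
        return False                      # requisition rejected
    shadow = {
        "inventory": database["inventory"] - requested,
        "budget": database["budget"] - requested * unit_cost,
    }
    # Writing the commit flag is the last intent-phase action:
    # the point of no return.
    database["commit_flag"] = True
    # --- Commit phase: copy shadow values into the database.
    # These plain copies can be rerun safely if a failure interrupts them.
    database["inventory"] = shadow["inventory"]
    database["budget"] = shadow["budget"]
    database["commit_flag"] = False
    return True

print(two_phase_update())                         # True
print(database["inventory"], database["budget"])  # 57 900
```

Because every intent-phase computation reads only unmodified database values into the shadow copies, a crash before the commit flag leaves the database untouched, and a crash after it is repaired by rerunning the commit-phase copies.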
• Redundancy and Internal Consistency in Database Management Systems
• Introduction
• Redundancy and internal consistency are crucial for maintaining data integrity in DBMS.
• Additional information is maintained to detect internal inconsistencies, ranging from check bits to duplicate fields.
• Error Detection and Correction Codes
• Types of Codes:
• Parity bits
• Hamming codes
• Cyclic redundancy checks.
• Functionality:
• Check codes are computed and stored when data items are added to the database.
• Upon retrieval, check codes are compared; discrepancies indicate errors.
• Shadow Fields
• Entire attributes or records can be duplicated to provide immediate replacements in case of errors. While
effective, redundant fields require substantial storage space.
• Recovery
• In addition to these error correction processes, a DBMS can maintain a log of user accesses, particularly changes.
• In the event of a failure, the database is reloaded from a backup copy and all later changes are then applied from
the audit log.
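The check-code mechanism above (compute a code when a data item is written, recompute and compare on retrieval) can be illustrated with a single parity bit. Parity is the weakest of the codes listed; it catches any one-bit error but not all multi-bit errors, which is why Hamming codes and cyclic redundancy checks are used in practice.

```python
# Toy check code: an even-parity bit stored alongside each value.

def parity(value: bytes) -> int:
    """Even parity over all bits of the stored bytes."""
    return sum(bin(b).count("1") for b in value) % 2

stored = {"name": b"Alice"}
check_codes = {field: parity(v) for field, v in stored.items()}

def is_intact(field):
    """On retrieval, recompute the code and compare with the stored one."""
    return parity(stored[field]) == check_codes[field]

print(is_intact("name"))       # True
stored["name"] = b"Alicd"      # simulate a one-bit storage error (e -> d)
print(is_intact("name"))       # False: the discrepancy flags an error
```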
• Concurrency/Consistency
• Database systems are often multiuser systems, so access must be managed to keep users from interfering with each other (two simultaneous reads cause no problem; a read concurrent with a write, or two writes, can).
• Access Control: When two users share the same database, their accesses must be constrained to prevent
conflicts. The DBMS implements simple locking mechanisms to manage this.
• Read Operations: If two users attempt to read the same data item, there is no conflict, as both will receive
the same value.
• Modification Conflicts: When both users try to modify the same data item, it is often assumed that there
is no conflict because each user knows what to write. However, this assumption can lead to issues, as the
value to be written may depend on the previous state of the data item.
• Example Scenario: Consider an airline reservation system where Agent A is booking a seat for passenger
Mock and finds that seat 11D is available. Simultaneously, Agent B is trying to book seats for a family and
also sees seat 11D as available. If both agents submit update commands for seat 11D, it can result in
double booking, which is problematic.
• Atomic Operations: To resolve such conflicts, the DBMS treats the entire query-update cycle as a single
atomic operation. This means that the command must read the current value of the seat and only modify
it if it is still 'UNASSIGNED'. This process must be uninterrupted to prevent other users from accessing the
same data during the update.
• Locking Mechanisms: A final issue arises with read-write operations. If one user is updating a value while another wishes to read it, the reader may receive partially updated data. To prevent this, the DBMS locks the data item for the duration of the write, so readers see only fully written values.
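The atomic query-update cycle can be sketched with a lock around the read-test-write sequence. The seat and passenger follow the reservation example; the lock-based style is illustrative, not how any particular DBMS implements atomicity.

```python
import threading

# Shared seat table guarded by a lock.
seats = {"11D": "UNASSIGNED"}
lock = threading.Lock()

def book(seat, passenger):
    """Read, test, and modify the seat as one uninterrupted operation."""
    with lock:
        if seats[seat] == "UNASSIGNED":
            seats[seat] = passenger
            return True
        return False                      # seat already taken

a = book("11D", "Mock")        # Agent A's booking succeeds
b = book("11D", "FamilyB")     # Agent B now sees the seat as taken
print(a, b, seats["11D"])      # True False Mock
```

Because the test and the write happen inside one locked region, no second agent can read the stale value 'UNASSIGNED' between A's check and A's update, which is exactly the double-booking window described above.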
Database Disclosure
• Increased Data Collection:
• More data is being collected about individuals than ever before.
• Traditionally, organizations only knew about their clients, but now data is shared
across multiple platforms.
• Role of Technology:
• Computers facilitate both the collection and sharing of vast amounts of data.
• Content of Databases:
• Databases may contain thoughts, preferences, opinions, activities, and connections.
• Inferences drawn from data can be misleading (e.g., "If Jamie is your friend and
likes frogs, you must like frogs too").
• Sensitive Data
• Definition:
• Sensitive data refers to information that should remain private and not be publicly disclosed.
• Variability in Sensitivity:
• Sensitivity is context-dependent; some databases contain no sensitive data, while others
contain entirely sensitive data.
• Access Control Challenges:
• Databases can have a mix of sensitive and non-sensitive data.
• Access controls can regulate who views what data, but the complexity increases when some
data is sensitive and some is not.
• Example:
• A university database may include:
• Less Sensitive: Name, dorm.
• Moderately Sensitive: Gender.
• Highly Sensitive: Financial aid, drug use, parking fines.
• The mere existence of certain fields (e.g., drug use) can be sensitive.
• Several factors can make data sensitive.
• Inherently sensitive.
• The value itself may be so revealing that it is sensitive.
• Examples are the locations of defensive missiles or the median income of barbers in a town with only one
barber.
• From a sensitive source.
• The source of the data may indicate a need for confidentiality.
• An example is information from an informer whose identity would be compromised if the information were
disclosed.
• Declared sensitive.
• The database administrator or the owner of the data may have declared the data to be sensitive. Examples are
classified military data or the name of the anonymous donor of a piece of art.
• Part of a sensitive attribute or record.
• In a database, an entire attribute or record may be classified as sensitive. Examples are the salary attribute of a
personnel database or a record describing a secret space mission.
• Sensitive in relation to previously disclosed information.
• Some data become sensitive in the presence of other data. For example, the longitude coordinate of a secret
gold mine reveals little, but the longitude coordinate in conjunction with the latitude coordinate pinpoints the
mine.
• All of these factors must be considered when the sensitivity of the data is being determined.
• Types of Disclosures
• 1. Exact Data Disclosure
• Definition: Release of the exact value of sensitive data.
• Risks:
• Accidental disclosure by database managers.
• Users may unknowingly request sensitive data.
• 2. Bounds Disclosure
• Definition: Revealing that a sensitive value falls within a specific range
• Enables users to narrow down to a precise value.
• Even general statistics, such as budgets or personnel numbers, can lead to serious breaches.
• Usefulness: Sometimes used to provide non-specific data ranges (e.g., salary ranges) without revealing
individual records.
• 3. Negative Result Disclosure
• Definition: Learning that a certain value does not exist (e.g., "0 is not the number")
• Risks:
• This can imply significant information, such as prior convictions or academic standing (e.g., GPA).
• 4. Existence Disclosure
• Definition: The mere existence of certain data is sensitive.
• Example: Monitoring personal phone calls; finding a related data field reveals
surveillance.
• 5. Probable Value Disclosure
• Definition: Estimating the likelihood of a certain value for sensitive data.
• Example: Determining political affiliations based on public records.
• Types of Attacks on Data
• 1. Direct Attack
• Definition: Attempting to access sensitive data directly through precise
queries.
• Strategy: Formulating queries to match a single data item.
• 2. Inference by Arithmetic
• Definition: Releasing statistics while suppressing individual identifiers.
• Risks: Indirectly revealing sensitive information.
• 3. Aggregation
• Definition: Combining less sensitive data to infer more sensitive insights.
• Example: Police investigations using intersecting data sets to identify suspects.
• Specific Attack Techniques
• 1. Sum Attack
• Description: Inferring values from reported totals.
• Example: Learning that no female student in a dormitory receives financial aid reveals the (zero) aid value of every female resident.
• 2. Count Attack
• Description: Combining count and sum data to reveal individual values.
• Example: Computing salaries based on averages and known employee counts.
• 3. Mean and Median Attacks
• Description: Deriving individual values from statistical measures.
• Example: Using the median salary to determine specific salaries.
• 4. Tracker Attacks
• Description: Manipulating queries to isolate specific data by balancing queries that cancel each other out.
• Risks: Can derive sensitive data through intelligent padding of queries.
• A tracker attack can fool the database manager into locating the desired data by using additional queries that produce
small results.
• The tracker adds additional records to be retrieved for two different queries; the two sets of records cancel each other
out, leaving only the statistic or data desired.
• 5. Linear System Vulnerability
• Description: A series of apparently harmless queries can form a system of linear equations that an attacker solves algebraically to recover individual values.
• Analysis on Data
• As we just described, the attacker has time and computing power to analyze data.
• Correlating seemingly unrelated bits of information can, as we showed, help build a
larger picture.
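The sum and count attacks above can be demonstrated on a toy table: when a released count is 1, the released "aggregate" sum is an individual's exact value, and subtracting one released sum from another isolates it just as well. All names and figures are invented.

```python
# Invented personnel records; only sums and counts are "released".
records = [
    {"name": "Adams",  "dept": "A", "salary": 50000},
    {"name": "Bailey", "dept": "A", "salary": 60000},
    {"name": "Chin",   "dept": "B", "salary": 70000},
]

def released_sum(dept):
    return sum(r["salary"] for r in records if r["dept"] == dept)

def released_count(dept):
    return sum(1 for r in records if r["dept"] == dept)

# Department B has a count of 1, so its sum IS Chin's salary.
print(released_count("B"), released_sum("B"))  # 1 70000

# Arithmetic over two permitted queries isolates the same value.
total = released_sum("A") + released_sum("B")
print(total - released_sum("A"))               # 70000
```

This is why statistical databases suppress or combine results computed over very small groups, as described under suppression techniques below.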
• Hidden Data Attributes
• 1. Metadata
• Definition: Data about data that may reveal sensitive information.
• Risks: Can inadvertently disclose personal details through digital files.
• 2. Geotagging
• Description: Photos tagged with GPS coordinates can reveal locations.
• Example: Publicly posted photos may disclose private addresses.
• 3. Tracking Devices
• Risks: Mobile devices and RFID tags can track movements and build
comprehensive profiles of user behavior.
Preventing Disclosure: Data Suppression and
Modification

• There are no perfect solutions to the inference and aggregation problems.
• The approaches to controlling them follow the three paths listed below.
• The first two methods can be used either to limit queries accepted or
to limit data provided in response to a query.
• The last method applies only to data released.
• Preventing Disclosure: Data Suppression and Modification
• Key Approaches to Control Inference and Aggregation Problems
• Suppress Sensitive Information
• Definition: Remove or withhold sensitive data from queries.
• Benefits: Provides immediate protection against sensitive data disclosure.
• Challenges:
• Often leads to excessive suppression, reducing database usability.
• Track User Knowledge
• Definition: Monitor what information users already possess to permit safe
disclosures.
• Benefits: Can allow more significant data sharing while minimizing risks.
• Challenges:
• High operational costs to maintain user knowledge.
• Complexity in accounting for combined knowledge of multiple users.
• Inability to prevent exploitation via multiple user identities.
• Disguise the Data
• Definition: Use techniques like random perturbation and
rounding to modify data.
• Benefits: Protects against statistical attacks reliant on precise
values.
• Challenges:
• Results in slightly inaccurate or inconsistent data for users.
• Importance of Awareness
• Recognizing the existence of potential inference problems is crucial
for understanding how to control them.
• Awareness alone is insufficient; organizations must implement
protective measures against database attacks.
Security Versus Precision
• Balancing Security and Data Disclosure
• Challenge: Determining which data is sensitive and how to share nonsensitive data.
• Philosophy: A conservative approach often leads to limiting data disclosures, which may reject
reasonable queries.
• Example Queries:
• Request for grades of students using drugs.
• Salary comparisons between genders.
• Goals
• Precision: Disclose as much nonsensitive data as possible while protecting sensitive information.
• Ideal Outcome: Achieve maximum precision with complete confidentiality of sensitive data.
• Visualization
• Concentric Circles:
• Inner Circle: Sensitive data that must be carefully concealed.
• Outer Band: Data that can be disclosed but poses a risk of inference.
Statistical Suppression
Techniques
• Limited Response Suppression:
• Eliminates low-frequency data that could reveal sensitive information.
• Combined Results:
• Combine rows/columns to obscure sensitive values.
• Example: Group counts into ranges to prevent identification of individuals.
• Aggregation:
• Aggregated data (e.g., sums, medians) can protect individual privacy while providing useful statistics.
• Small sample sizes are blocked to prevent identification.
• Random Data Perturbation:
• Introduces small random errors to data values for statistical results.
• Maintains overall data distribution without revealing exact values.
• Swapping:
• Randomly interchanging values (e.g., gender, race) to prevent linking individual records.
• Useful for maintaining aggregate statistics while protecting confidentiality.
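Two of the techniques above can be sketched directly: limited response suppression (drop low-frequency cells) and random data perturbation (add small noise to each value before releasing a statistic). The counts, threshold, and noise scale are invented for illustration.

```python
import random

counts = {"CS": 42, "Math": 17, "Music": 2}   # invented enrollment counts

def limited_response(table, threshold=5):
    """Suppress any cell whose count falls below the threshold."""
    return {k: v for k, v in table.items() if v >= threshold}

def perturbed_mean(values, scale=1.0, seed=0):
    """Release a mean after adding small random error to each value."""
    rng = random.Random(seed)
    noisy = [v + rng.uniform(-scale, scale) for v in values]
    return sum(noisy) / len(noisy)

print(limited_response(counts))        # {'CS': 42, 'Math': 17}
salaries = [50000, 60000, 70000]
print(round(perturbed_mean(salaries)))  # close to the true mean of 60000
```

The low Music count is withheld because releasing it would let a user identify the two individuals; the perturbed mean stays statistically useful while denying an attacker the exact values a tracker or sum attack needs.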
Query Analysis

• A sophisticated method to assess queries and their implications for potential data disclosure.
• Approach: Maintain a query history to evaluate the risk of inference
from previous results.
UNIT 4 Part 2 - Cloud Computing Security
• Introduction to Cloud Computing,
• Service and Deployment Models,
• Risk Analysis,
• Cloud as a Security Control,
• Cloud Security Tools and Techniques,
• Cloud Identity Management,
• Securing IaaS.
Introduction to cloud computing
• Cloud computing is a new way of providing services by using technology.
• NIST defines cloud computing as a model "for enabling convenient, on-demand network access to a shared pool of configurable computing resources."
• A cloud consists of networks, servers, storage, applications, and services that are connected in a loose and easily reconfigurable way.
• Cloud computing implies export of processor, storage, applications,
or other resources. Sharing resources increases security risk.
• Cloud Computing Concepts
• The cloud has five defining characteristics:
• • On-demand self-service.
• If you are a cloud customer, you can automatically ask for computing resources (such as server time
and network storage) as you need them.
• • Broad network access.
• You can access these services with a variety of technologies, such as mobile phones, laptops,
desktops, and mainframe computers.
• • Resource pooling.
• The cloud provider can put together a large number of multiple and varied resources to provide your
requested services.
• This “multitenant model” permits a single resource (or collection of resources) to be accessed by
multiple customers, and a particular resource (such as storage, processing or memory) can be assigned
and reassigned dynamically, according to the customers’ demands.
• • Rapid elasticity.
• Services can quickly and automatically be scaled up or down to meet a customer’s need.
• To the customer, the system’s capabilities appear to be unlimited.
• Measured service. Like water, gas, or telephone service, use of cloud services is metered, and customers are charged for the resources they actually consume.
Service Models
• A cloud can be configured in many ways, but there are three basic
models with which clouds provide services
• Software as a Service (SaaS): applications in the cloud. The cloud provider gives a customer access to applications running in the cloud.
• Platform as a Service (PaaS): languages and tools to support application development in the cloud. The customer has his or her own applications, but the cloud affords the languages and tools for creating them.
• Infrastructure as a Service (IaaS): the cloud offers processing, storage, networks, and other computing resources that enable customers to run any kind of software.
Deployment Models
• There are many different definitions of clouds, and many ways of
describing how clouds are deployed.
• Often, four basic offerings are described by cloud providers: private
clouds, community clouds, public clouds, and hybrid clouds.
• A private cloud has infrastructure that is operated exclusively by and
for the organization that owns it, but cloud management may be
contracted out to a third party.
• A community cloud is shared by several organizations and is usually
intended to accomplish a shared goal.
• For instance, collaborators in a community cloud must agree on its security
requirements, policies, and mission.
• It, too, may farm out cloud management to another organization.
• A public cloud, available to the general public, is owned by an
organization that sells cloud services.
• A hybrid cloud is composed of two or more types of clouds,
connected by technology that enables data and applications to be
moved around the infrastructure to balance loads among clouds.
Risk Analysis
• Before moving functionality or data to a cloud, it is important to
consider pros and cons.
• Moving to a cloud can be difficult and expensive, and it can be equally
expensive to undo.
• While every cloud offering presents its own set of risks and benefits, a
number of guidelines can help you understand whether your
functions and data should be migrated to a cloud environment, as
well as which cloud offerings will be most likely to meet your security
needs.
Risk analysis should be a part of any major security
decision, including a move to cloud services.

• 1. Identify assets.
• Moving to a cloud service generally means moving functionality and data.
• It is important that you document every function and data type that might
move to the cloud service, since it’s easy to lose track and miss something
important.
• 2. Determine vulnerabilities.
• When considering cloud services, be sure to consider cloud-specific
vulnerabilities.
• These will generally stem from having to access the system through an
Internet connection, sharing hardware and networks with potential
adversaries, and trusting a cloud provider.
• 3. Estimate likelihood of exploitation.
• Many vulnerabilities will be either more or less difficult to exploit in a cloud environment, as
well as across different cloud service models and providers.
• 4. Compute expected loss.
• Evaluate how easily vulnerabilities can be exploited in a cloud environment compared to on-
premises solutions.
• Consider whether a typical cloud provider can respond to attacks (e.g., DDoS) more
effectively than your organization.
• 5. Survey and select new controls.
• Identify necessary security controls for managing risks in the cloud, including:
• Data encryption needs.
• Logging capabilities from the provider.
• Authentication and access control options.
• 6. Project Savings
• Cost-Benefit Analysis: While moving to cloud services is often justified by projected cost
savings, hidden costs may arise.
• Understanding Total Costs: Evaluate all potential expenses, including new security controls, to
avoid unexpected financial burdens.
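Steps 3 and 4 above can be sketched as a simple annualized-expected-loss computation: each risk's expected loss is its likelihood of exploitation multiplied by the loss per incident. The vulnerabilities, probabilities, and dollar figures here are invented purely for illustration.

```python
# (vulnerability, estimated annual likelihood, loss per incident in $)
risks = [
    ("credential theft via phishing", 0.30, 200_000),
    ("DDoS outage",                   0.10,  50_000),
    ("provider data breach",          0.02, 500_000),
]

def annualized_expected_loss(entries):
    """Sum likelihood x impact over all identified risks."""
    return sum(likelihood * impact for _, likelihood, impact in entries)

ale = annualized_expected_loss(risks)
print(round(ale, 2))  # 75000.0
```

Comparing this figure for the on-premises and cloud versions of each vulnerability (step 3 notes the likelihoods differ) is what turns the risk analysis into a concrete go/no-go input for step 6's cost-benefit projection.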
• Cloud as a Security Control
• While moving data and functionality to the cloud does have its risks, cloud
services can be valuable security tools in a number of ways.
• The most obvious is that cloud services are often excellent at mitigating single
points of failure.
• Mitigating Single Points of Failure
• Geographic Diversity: Having multiple data centers reduces risks from localized threats
(natural disasters, fires, internet outages).
• Choose secondary data centers far from primary ones to minimize shared risks.
• Cloud RAID Concept
• Innovative Approach: Researchers at Cornell University proposed a method to combat
vendor lock-in by engaging multiple cloud storage providers.
• Redundant Array of Cloud Storage (RACS):Treats multiple cloud providers as a single RAID
array.
• Maintains redundancy to recreate data if one provider fails.
• Estimated cost increase of only 11% compared to traditional storage solutions.
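The RACS idea can be illustrated with a toy RAID-style scheme: stripe the data across two hypothetical providers and store an XOR parity stripe at a third, so the loss of any one provider's stripe can be reconstructed from the other two. Real RACS uses erasure coding across many providers; this is only the simplest instance of the principle.

```python
data = b"customer records"              # 16 bytes, splits evenly in two
half = len(data) // 2
provider_a = data[:half]                # stripe stored at provider A
provider_b = data[half:]                # stripe stored at provider B
parity = bytes(x ^ y for x, y in zip(provider_a, provider_b))  # provider C

# Provider B fails or locks us in: rebuild its stripe from A and parity.
recovered_b = bytes(x ^ p for x, p in zip(provider_a, parity))
print(recovered_b == provider_b)        # True
```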
• Platform Diversity
• Varied Vulnerabilities: Different operating systems, applications, and protocols used by
cloud providers can reduce the likelihood of simultaneous attacks on both your systems
and theirs.
• Decreased Risk: The diversity in platforms helps in minimizing common vulnerabilities.

• Infrastructure Diversity
• Diverse Vulnerabilities: Differences in hardware, network configuration, security controls,
and quality of security staff between your organization and the cloud provider.
• Enhanced Security Posture: This diversity can provide additional layers of security against
attacks.
• Additional Security Operations in the Cloud
• Email Filtering: Cloud providers can filter spam and dangerous attachments before
reaching user inboxes.
• DDoS Protection: Cloud-based services can absorb attack traffic and filter malicious
packets before they reach customer systems.
• Network Monitoring: Cloud solutions can handle log analysis efficiently, alleviating
resource constraints.
Remaining Topics
• Cloud Security Tools and Techniques,
• Cloud Identity Management,
• Securing IaaS.
Cloud Security Tools and Techniques
• Introduction to Cloud Security
• Definition: Cloud security is the protection of cloud-based systems, data, and
infrastructure from cyber threats.
• Relation to Information Security: While it shares principles with general
information security, cloud security introduces unique challenges due to
shared resources.
• Unique Threat Vectors
• Shared Resources: Risks arise from shared processing, storage, and
communication resources with potential adversaries.
• Adapting Security Tools: Standard tools like encryption and network security
must be tailored for cloud environments.
• Data Protection in the Cloud
• Public Cloud Services: Involves sending private data over the Internet and storing it on the
provider's servers (SaaS, PaaS, IaaS).
• Responsibility: Users must select cloud offerings that ensure data protection from
unauthorized access and modifications.
• Securing Data in Transit
• Encryption Protocols: Use TLS for SaaS/PaaS services.
• For IaaS, consider additional methods like SSH and VPNs.
• Mutual authentication allows the client and server to verify each other's identities.
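As a sketch of these recommendations, the snippet below builds a client-side TLS context in Python's standard `ssl` module that enforces a modern protocol version and server certificate verification; the client-certificate file paths shown in the comment are hypothetical placeholders a provider would supply for mutual authentication.

```python
import ssl

# Sketch: a client-side TLS context for connecting to a cloud endpoint,
# with hooks for mutual authentication via a client certificate.
ctx = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse legacy protocol versions
ctx.check_hostname = True
ctx.verify_mode = ssl.CERT_REQUIRED           # always verify the server's certificate

# For mutual authentication, load the client certificate the provider issued.
# The paths below are hypothetical placeholders:
# ctx.load_cert_chain(certfile="client.pem", keyfile="client.key")

assert ctx.verify_mode == ssl.CERT_REQUIRED
```

The same context can then be passed to `http.client` or `urllib` connections so every request to the service is both encrypted and authenticated.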
• Cloud Storage Considerations
• Data Sensitivity Assessment: Determine encryption needs based on the sensitivity of stored
data.
• Access Control Requirements: Understand sharing capabilities and their implications for
sensitive information.
• Compliance and Regulations
• Export Controls: Be aware of regulations affecting data flow across borders.
• Provider Audits: Ensure cloud providers comply with necessary regulations and standards.
• Data Confidentiality Measures
• Encryption Standards: Utilize AES-256 with individual keys for users.
• Key Management Strategy: Implement a master key system to facilitate secure
user key changes without re-encrypting large datasets.
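The master-key (envelope) structure can be sketched as follows. To keep the example self-contained it uses a toy SHA-256 counter-mode keystream in place of AES-256; this is for illustration only and is not a real cipher. The point is the key hierarchy: rotating a user's key re-wraps only the small data key, while the bulk ciphertext stays untouched.

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Toy counter-mode keystream from SHA-256 -- illustration only,
    # NOT a substitute for AES-256.
    out = bytearray()
    for block in range((len(data) + 31) // 32):
        ks = hashlib.sha256(key + block.to_bytes(8, "big")).digest()
        chunk = data[block * 32:(block + 1) * 32]
        out += bytes(c ^ k for c, k in zip(chunk, ks))
    return bytes(out)

# Envelope scheme: data is encrypted under a random data key, and the
# data key is wrapped (encrypted) under the user's key.
data_key = secrets.token_bytes(32)
ciphertext = keystream_xor(data_key, b"sensitive customer record")

old_user_key = secrets.token_bytes(32)
wrapped = keystream_xor(old_user_key, data_key)

# Rotating the user key only re-wraps the 32-byte data key;
# the (potentially huge) ciphertext is never re-encrypted.
new_user_key = secrets.token_bytes(32)
rewrapped = keystream_xor(new_user_key, keystream_xor(old_user_key, wrapped))

recovered_key = keystream_xor(new_user_key, rewrapped)
assert keystream_xor(recovered_key, ciphertext) == b"sensitive customer record"
```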
• Trust No One Philosophy (TNO)
• Definition: Providers should not have access to user encryption keys.
• Example: LastPass protects user data without storing decryption keys on its servers.
• Data Loss Prevention (DLP)
• Challenges with DLP in Cloud Environments: Traditional DLP measures may be
bypassed when accessing cloud services outside company networks.
• Solutions: Enforce VPN access for remote logins.
• Deploy DLP solutions within IaaS environments.
• Cloud Application Security
• Secure Software Development: Follow best practices for application security in shared
environments.
• Common Threats: Attacks on shared resources (e.g., SQL injection).
• Insecure APIs lead to vulnerabilities.
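The SQL injection threat mentioned above can be made concrete with a small sketch using Python's bundled `sqlite3` module; the table and data are illustrative. The defense is parameterized queries, which make the driver treat attacker input strictly as data.

```python
import sqlite3

# Sketch: why shared cloud applications must use parameterized queries.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

malicious = "alice' OR '1'='1"

# Vulnerable pattern (never do this): the attacker's input becomes SQL.
rows_bad = conn.execute(
    f"SELECT secret FROM users WHERE name = '{malicious}'").fetchall()

# Safe pattern: the placeholder keeps the input out of the SQL text.
rows_good = conn.execute(
    "SELECT secret FROM users WHERE name = ?", (malicious,)).fetchall()

assert rows_bad and not rows_good  # injection leaked a row; the safe query did not
```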
• Logging and Incident Response
• Importance of Logs: Essential for detecting and investigating security incidents in cloud
environments.
• Challenges in Public Cloud: Limited access to logs from SaaS/PaaS; IaaS offers more control
over logging.
• Best Practices for Incident Response
• SLAs with Providers: Include requirements for logging, evidence preservation, and incident
notification.
• Ensure logs are forwarded to a separate system for analysis and to protect them from tampering by intruders.
• Introduction to Cloud Identity Management
• Definition: Cloud Identity Management refers to the processes and technologies used to
manage user identities and their access to cloud resources.
• Importance: As organizations migrate sensitive data to the cloud, effective identity
management is crucial for authentication and authorization.
• Challenges in Cloud Identity Management
• Multiple Accounts: Users often need separate accounts for different cloud services,
increasing vulnerability.
• Password Management Issues: Weak passwords and reuse across services heighten
security risks.
• Loss of Control: Organizations may struggle to manage user accounts effectively across
various cloud providers.
• Risks of Individual Account Creation
• Security Vulnerabilities: Increased chances of data breaches due to poor password
practices.
• Administrative Burden: Difficulty in managing user access when employees leave or change
roles.
• Risks of Shared Accounts
• Increased Theft Risk: Shared passwords can easily be compromised.
• Accountability Issues: Hard to determine who accessed what data.
• Frequent Password Changes: Necessity to change passwords whenever a user leaves or
changes roles.
• Federated Identity Management (FIdM)
• Definition: FIdM allows identity information to be shared among multiple entities,
providing single sign-on (SSO) convenience.
• How It Works: One system maintains user identity information.
• Other systems query this central system for authentication.
• Benefits of Federated Identity Management
• Single Sign-On (SSO): Users can access multiple services with one set of credentials.
• Enhanced Security Controls: Organizations can enforce strong password policies and
multifactor authentication.
• Simplified User Lifecycle Management: Centralized control over user accounts ensures
efficient provisioning and de-provisioning.
VIEW OF FIdM
• Security Assertion Markup Language (SAML)
• Definition: SAML is an XML-based standard for securely exchanging user identity
information between systems.
• Key Components: Service Provider (SP) / Relying Party: the application requesting identity information.
• The Subject: the entity, whether user or system, that is attempting to log in to the SP.
• Identity Provider (IdP)/ Asserting Party: The system authenticating the user and
providing identity assertions.
• SAML Authentication Flow
• User attempts to access SP.
• SP redirects user to IdP for authentication.
• IdP authenticates the user and sends back a signed message with identity attributes.
• SP grants access based on assertions received.
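The flow above can be simulated in miniature. Real SAML uses XML assertions signed with the IdP's X.509 key; this toy sketch substitutes JSON and an HMAC over a hypothetical pre-shared key purely to show the trust relationship: the IdP issues a signed assertion, and the SP verifies the signature before granting access.

```python
import base64
import hashlib
import hmac
import json
import time

# Hypothetical pre-shared key; real SAML signs assertions with X.509 keys.
IDP_SP_SHARED_KEY = b"pre-shared-demo-key"

def idp_issue_assertion(subject: str) -> str:
    # The IdP authenticates the user, then signs an assertion about them.
    body = json.dumps({"subject": subject,
                       "authn_time": int(time.time()),
                       "attributes": {"role": "admin"}}).encode()
    sig = hmac.new(IDP_SP_SHARED_KEY, body, hashlib.sha256).hexdigest()
    return base64.b64encode(body).decode() + "." + sig

def sp_verify(token: str) -> dict:
    # The SP checks the signature before trusting any identity attributes.
    b64, sig = token.rsplit(".", 1)
    body = base64.b64decode(b64)
    expected = hmac.new(IDP_SP_SHARED_KEY, body, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        raise ValueError("assertion signature invalid")
    return json.loads(body)

assertion = idp_issue_assertion("alice@example.com")
claims = sp_verify(assertion)
assert claims["subject"] == "alice@example.com"
```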
• Types of SAML Assertions
• Authentication Assertion: Confirms the user was authenticated at a specific time.
• Attribute Assertion: Provides additional attributes about the user (e.g., roles,
permissions).
• Authorization Decision Assertion: Indicates whether access to a resource is granted or
denied.
• OAuth and Authentication
OAuth exchanges authorization information, not identity information; by itself it is an authorization framework rather than an authentication protocol.
• Integration with SAML: OAuth can be combined with SAML for native applications, allowing
seamless identity management across platforms.
• OpenID Connect (OIDC): A newer standard built on OAuth that supports broader use cases
beyond enterprise applications.
• Advantages of OpenID Connect
• User-Friendly Authentication: Allows users to log into multiple services using existing
credentials (e.g., Google).
• Support for Native Applications: OIDC is designed to work well with both web and mobile
applications.
• Best Practices for Cloud Identity Management
• Implement Strong Authentication Mechanisms: Utilize multifactor authentication and
enforce password complexity requirements.
• Regularly Review Access Permissions: Conduct audits to ensure that users have appropriate
access levels.
• Adopt Federated Identity Solutions: Use FIdM to streamline access management across
multiple cloud providers.
Securing IaaS
• Introduction to IaaS
• Infrastructure as a Service (IaaS) provides scalable server infrastructure on demand.
• Well suited to dynamic workloads such as massively multiplayer online (MMO) games.
• Pay only for what you use; rapid elasticity to handle varying player demand.
• Key Features of IaaS
• Virtualization: Leverages hypervisors to manage multiple VMs.
• Cloud Management: Monitors, provisions, and manages workloads.
• Flexibility: Quickly adjust resources based on player activity.
• Public IaaS vs. Private Network Security
• Shared Infrastructure: New threats arise from shared resources.
• Access Methods: More access points (APIs, consoles) than traditional setups.
• Deployment Ease: Easy to create new VMs and networks, requiring stricter security.
• Addressing Shared Infrastructure Threats
• Shared Storage Risks: Data remains on disks until overwritten.
• Mitigation: Use encryption for sensitive data.
• Shared Network Risks: Network traffic could be intercepted.
• Mitigation: Encrypt traffic with TLS, SSH, or VPN.
• Securing Host Access
• Protect management interfaces (web consoles, APIs).
• Authentication Strategies:
• Use multifactor authentication.
• Avoid shared accounts; enforce least privilege.
• Use OAuth for API access.
• Virtual Infrastructure Best Practices
• Create specialized VMs for different functions (e.g., FTP server).
• Hardening VMs:
• Disable unnecessary services and privileges.
• Implement application whitelisting.
• Configure host-based firewalls.
• Network Segregation
• Utilize private network enclaves for different system functions.
• Protect enclaves with strict firewall rules.
• Use application proxy servers to manage external access.
• Monitoring and Logging
• Limit SSH and screen-sharing access to trusted IPs.
• Collect logs from VMs; do not store them on the same infrastructure.
• Analyze logs for failed login attempts to enhance security.
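A minimal version of that failed-login analysis can be sketched as below. The log lines are illustrative samples in the common `sshd` auth-log format; in practice the lines would come from logs forwarded off the VMs to a separate collector.

```python
import re
from collections import Counter

# Illustrative sshd log samples (IPs are from documentation ranges).
LOG_LINES = [
    "Jan 10 03:12:01 vm1 sshd[812]: Failed password for root from 203.0.113.9 port 53211 ssh2",
    "Jan 10 03:12:04 vm1 sshd[812]: Failed password for root from 203.0.113.9 port 53214 ssh2",
    "Jan 10 03:15:40 vm1 sshd[901]: Accepted publickey for deploy from 198.51.100.7 port 40100 ssh2",
]

FAILED = re.compile(r"Failed password for \S+ from (\S+)")

def failed_attempts_by_ip(lines):
    # Count failed-login lines per source IP; high counts flag brute forcing.
    return Counter(m.group(1) for line in lines if (m := FAILED.search(line)))

counts = failed_attempts_by_ip(LOG_LINES)
assert counts["203.0.113.9"] == 2  # repeat offender: candidate for blocking
```

IPs crossing a threshold can then feed firewall rules or alerts, closing the loop between log collection and the access restrictions above.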
THE END