Unit 4 and 5

Utility computing is a trending IT service model that offers on-demand computing resources based on a pay-per-use method, enhancing flexibility and reducing costs. Virtualization allows multiple applications to run on the same server, increasing efficiency, while automated provisioning streamlines user access to applications, improving productivity and reducing errors. Data center challenges such as scalability, security, and compliance can be addressed through cloud-based solutions and automation, leading to more efficient and reliable operations.

UNIT IV – UTILITY COMPUTING

Utility Computing Technology


Cloud Computing Technologies

A list of cloud computing technologies is given below -

o Virtualization
o Utility Computing

Utility Computing

Utility computing is one of the most popular IT service models today. It provides on-demand computing resources (computation, storage, and programming services via API) and infrastructure on a pay-per-use basis. It minimizes associated costs and maximizes the efficient use of resources. The advantages of utility computing are reduced IT costs, greater flexibility, and easier management.

Large organizations such as Google and Amazon have established their own utility services for computing, storage, and applications.
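
As a simple illustration of the pay-per-use idea, the sketch below computes a monthly bill from metered usage. The resource names and unit rates are hypothetical values chosen for illustration, not any provider's actual pricing.

# Minimal sketch of pay-per-use billing (hypothetical rates, not real provider pricing).
USAGE = {"compute_hours": 720, "storage_gb_months": 50, "api_calls": 1_200_000}
RATES = {"compute_hours": 0.046, "storage_gb_months": 0.023, "api_calls": 0.0000004}

def monthly_bill(usage, rates):
    """Charge only for what was consumed: sum of (metered units * unit rate)."""
    return sum(units * rates[resource] for resource, units in usage.items())

print(f"Total for the month: ${monthly_bill(USAGE, RATES):.2f}")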

Virtualization
Virtualization is the process of creating a virtual environment to run multiple applications and operating systems on the same server. The virtual environment can be a single instance or a combination of many operating systems, storage devices, network application servers, and other environments.

The concept of virtualization in cloud computing increases the use of virtual machines. A virtual machine is a software implementation of a computer that behaves like a physical machine and can perform tasks such as running applications and programs on the user's demand.

Types of Virtualization
A list of types of Virtualization is given below -

i. Hardware virtualization

ii. Server virtualization

iii. Storage virtualization

iv. Operating system virtualization

v. Data Virtualization

Hyper Threading

Hyper-threading technology (sometimes also called simultaneous multithreading, or SMT) allows a single physical processor core to behave like two logical processors, essentially allowing two independent threads to run simultaneously. Unlike doubling the number of physical cores, which can roughly double performance, hyper-threading provides anywhere from a slight to a significant increase in system performance by keeping the processor pipeline busier.

On a system with hyper-threading activated, VMware Cloud on AWS assigns adjacent CPU numbers to logical processors on the same core. Thus CPUs 0 and 1 are on the first core, CPUs 2 and 3 are on the second core, and so on. VMware Cloud on AWS manages processor time intelligently to spread load smoothly across all physical cores in the system. If there is no work for a logical processor, it is put into a halted state that frees its execution resources and allows the virtual machine running on the other logical processor on the same core to use the full execution resources of the core.
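
A minimal sketch of the adjacent-numbering scheme described above: given a logical CPU number, it derives the physical core index and the sibling logical CPU that shares the core (assuming two logical processors per core, as in the text).

# Sketch: map logical CPUs to physical cores under the "adjacent numbering" scheme
# described above (two logical processors per core is an assumption from the text).
def core_of(logical_cpu: int) -> int:
    """CPUs 0 and 1 -> core 0, CPUs 2 and 3 -> core 1, and so on."""
    return logical_cpu // 2

def sibling_of(logical_cpu: int) -> int:
    """The other logical processor sharing the same physical core."""
    return logical_cpu ^ 1

for cpu in range(6):
    print(f"logical CPU {cpu}: core {core_of(cpu)}, shares core with CPU {sibling_of(cpu)}")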

Blade Servers

A blade server, sometimes referred to as a high-density server, is a compact device containing a computer used to manage and distribute data in a collection of computers and systems, called a network. Its role is to act as a conduit between computers, programs, applications, and systems.
Benefits of blade servers

 Cooling: Each blade is cooled individually by fans. Additionally, because of their stackability, the servers can be kept in smaller air-controlled areas that keep all of the mechanical parts at a proper temperature.

 Management supervisor included: Unlike their predecessors, blade servers can be controlled and managed in tandem with other server units within a data center or network.

 Seamless movement within rack and minimal wiring: Organizations using blade servers can experience a reduction in cabling for blade server housing compared to larger models like box servers.

 Low power consumption: Servers within a rack are able to share a single power source, leading to a reduction in storage and power consumption costs compared to other server types.

 Storage consolidation: Each blade in a blade server typically comes with one or two local ATA or SCSI drives. For additional storage, blade servers can connect to a storage pool facilitated by network-attached storage (NAS), Fibre Channel, or an iSCSI storage-area network (SAN).

 Compact size: Unlike traditional rack servers, blade computing does not have minimum size restrictions.

 High-trust compatibility: The nature of servers that carry out a highly individualized task makes it possible for an organization to dedicate a single server entirely to mission-critical applications, or programs without which the entire company or project could not exist.

Uses of blade servers

 File sharing: Any transfer of data between digital points or devices.

 Web page serving and caching: Delivering web pages to visitors and temporarily storing website information on the visitor's computer so it can be quickly recalled, reducing wait time and stalling.

 SSL encryption of web communication: Ensuring that information travelling over an internet connection is secure from outside parties, viruses, and attackers.

 Transcoding: Converting the code of web page content to move seamlessly between differently sized and shaped devices, or for other conversion purposes.

 Streaming: Transmitting audio and video content without interruption to allow viewing and listening in real time.

 Load balancing: Like most clustering applications, blade servers can be used to provide load balancing and failover (a minimal round-robin sketch follows this list).

 Virtualization: Blade servers can be used to host virtualized applications and environments.

 Storage: The sleeker, more compact design allows a larger amount of information to be stored to support a larger number of applications working in unison.
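
As a minimal illustration of the load-balancing use case above, the sketch below cycles incoming requests across a pool of blades in round-robin order; the blade names are hypothetical, and round-robin is only one of many possible balancing strategies.

from itertools import cycle

# Hypothetical pool of blades behind a load balancer.
BLADES = ["blade-01", "blade-02", "blade-03"]
next_blade = cycle(BLADES)

def route(request_id: str) -> str:
    """Assign each incoming request to the next blade in the rotation."""
    blade = next(next_blade)
    print(f"request {request_id} -> {blade}")
    return blade

for rid in ["r1", "r2", "r3", "r4"]:
    route(rid)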

Automated Provisioning

Automated provisioning, or automated user provisioning, is the method of granting and managing access to applications, systems, and data within an organization through automated practices. Automated provisioning is the first key tenet of identity and access management (IAM).

Benefits of automated provisioning.

Streamlined onboarding.

Imagine manually onboarding and provisioning hundreds of new employees at once. If you are an enterprise organization without IAM, that is often the case. How do you provision hundreds of new users to their respective systems and applications efficiently?

Automated provisioning allows you to take the onboarding burden off of your Human
Resources or IT department. Organizations can add their new employees, contractors,
consultants, etc. into their identity management system, and through automated
provisioning, they’ll get access to specific applications and resources needed to do their
job.
Automated Provisioning also increases productivity by giving users the access they need
on day one. Users don’t have to wait to receive access and are empowered to start their
work immediately.
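
A minimal sketch of this idea, assuming a hypothetical role-to-application mapping: when a new user is added to the identity system, the applications tied to their role are granted automatically, with no manual steps.

# Hypothetical role-to-application mapping; real IAM systems drive this from policy.
ROLE_APPS = {
    "engineer": ["email", "source-control", "ci-cd"],
    "recruiter": ["email", "hr-portal", "ats"],
}

def provision_user(username: str, role: str) -> list[str]:
    """Grant every application mapped to the user's role on day one."""
    granted = ROLE_APPS.get(role, ["email"])  # unknown roles fall back to baseline access
    for app in granted:
        print(f"granting {username} access to {app}")
    return granted

provision_user("new.hire", "engineer")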

Cost savings.

Cost reduction in the context of automated provisioning is two-fold: productivity and efficiency. By reducing manual provisioning processes, you free up time and resources for other tasks. You also avoid the downtime users experience while waiting for access, which increases productivity, lowers operational costs, and leads to greater operational efficiency.

Error reduction.

Because automated provisioning eliminates manual processes, it also greatly reduces the margin of error. There is less chance of a slip-up when adding a user to the system and provisioning access to applications.

In addition, by automating user provisioning, you reduce the risk of security threats and
data breaches, as the only way to get access to these applications is through the roles
and permissions set up by the organization.

Organizations have full visibility into who has access to what, significantly reducing the risk of a security mishap.

Policy Based Automation


Policy-based automation focuses on separating your business and operational policies from the mechanics of actually performing the automation according to those policies. You concentrate on setting your policies, and Automation Control deals with implementing them.

Automation Control's policy-based automation includes resource information, groups of resources, and relationships in the decision-making process before acting. Resource information defines the resource class and name, as well as how to start, stop, and monitor the resource. Resources can be members of system-wide groups and relationships.

The power of a policy

 You are able to define automation requirements easily: In a policy, you can define which resources belong together and are managed as one business entity. For example, a DB2® system consists of many resources. With Automation Control, you can group and aggregate resources into more meaningful, manageable entities, for example My-HumanResource-Application, and so forth. You can monitor and issue commands at this level, managing at the business level rather than at the level of single IT resources.

 In policies, you can specify how resources depend on and relate to each other, for example, which other resources must be available before a certain resource can be started. As another example, my database must be up and running before my application is started (a minimal dependency-ordering sketch follows this list).

 Policy definitions can be reused, copied, and cloned for similar applications elsewhere in the enterprise.

 Because the underlying technology is responsible for the detailed actions, these actions are performed in a consistent and reliable manner. With traditional programming solutions, the testing of abnormal conditions is difficult and prone to be incomplete; the behavior of automation under these abnormal conditions is, however, critical to the entire automation solution.
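
A minimal sketch of the dependency idea above (starting a database before its application): the policy is expressed as data, and a generic engine derives a safe start order from it. The resource names and the policy format are assumptions for illustration, not Automation Control's actual policy syntax.

from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Hypothetical policy: each resource lists the resources that must be up before it starts.
POLICY = {
    "my-database": [],
    "my-application": ["my-database"],
    "web-frontend": ["my-application"],
}

def start_order(policy: dict[str, list[str]]) -> list[str]:
    """Derive a start sequence that respects every 'must be up first' relationship."""
    return list(TopologicalSorter(policy).static_order())

for resource in start_order(POLICY):
    print(f"starting {resource}")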

The Sample Add-on policy includes automation modules for the base operating system, major middleware, and systems management software. Today's wide range of add-on policies is based on best practices and can help reduce the time and effort needed to create or update a policy.

Another way to populate your policy is to use the autodiscovery function. With this method, data about all the address spaces on your system that include UNIX System Services is gathered.

Application Management
Cloud Application Management for Platforms (CAMP) is a specification for managing applications
in the context of a platform as a service (PaaS) system. CAMP is designed to address the needs of a
high-level PaaS system; one in which the consumer (generally a developer or application
administrator) provides application artifacts (code, data, graphics, etc.) and specifies which provider-
supplied services are required to realize these artifacts as an application. The details of the
infrastructure (compute, storage, and networking) used to support these services are hidden from the
consumer by the provider of the PaaS system.

CAMP defines the following:

 A domain-specific language that describes the artifacts that make up an application, the
services that are required to execute or utilize those artifacts, and the relationship of the
artifacts to those services.
 A resource model for representing applications and their constituent components, as well as the services used by those components, along with runtime status information, configuration information, and metadata that describes the PaaS system.
 A RESTful protocol for manipulating these resources and, by so doing, changing the
state of the underlying application.
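
A minimal sketch of how a consumer might interact with a CAMP-style RESTful API using Python's requests library. The endpoint paths and payload fields here are hypothetical placeholders, not taken from the CAMP specification; a real platform publishes its own resource URLs.

import requests  # third-party package: pip install requests

# Hypothetical CAMP-style endpoint for a PaaS platform.
BASE = "https://paas.example.com/camp/v1"

def get_platform():
    """Read the top-level platform resource to discover the resources it links to."""
    resp = requests.get(f"{BASE}/platform")
    resp.raise_for_status()
    return resp.json()

def stop_application(app_url: str):
    """Change application state by updating its resource representation."""
    resp = requests.patch(app_url, json={"status": "stopped"})
    resp.raise_for_status()
    return resp.json()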

Evaluating Utility Management Technology


When evaluating utility management technology in cloud computing, there are several factors to consider:

1. Security: How does the technology ensure the security of your data? Cloud
computing introduces new security risks, such as data breaches and unauthorized
access, so it's important to ensure that the technology you choose has robust
security measures in place.
2. Scalability: Can the technology scale to meet your changing needs over time?
Cloud computing makes it easier to scale up or down as needed, so you should
look for a technology that can grow or shrink with your business.
3. Integration: Does the technology integrate with your existing systems and
workflows? Cloud-based utility management technology should be easy to
integrate with your other cloud-based tools, such as your accounting software or
CRM system.
4. Cost: What is the total cost of ownership for the technology? Cloud-based utility
management technology typically has lower upfront costs than on-premise
solutions, but you should also consider ongoing costs such as subscription fees
and support costs.
5. Reliability: How reliable is the technology? Cloud-based solutions are typically
more reliable than on-premise solutions, but you should still look for a technology
that offers high availability and uptime guarantees.

By considering these factors, you can evaluate different utility management technologies
in cloud computing and choose the solution that best fits your needs and goals.

Virtual Test and Development Environment


Virtual test and development environments in cloud computing refer to the use of cloud
resources to create an environment for building, testing, and deploying software
applications. This approach offers several benefits, such as scalability, cost savings, and
increased flexibility.

Here are some key considerations when creating a virtual test and development
environment in cloud computing:

1. Cloud Infrastructure: Choose a cloud provider that offers a robust infrastructure


with the necessary resources, such as computing power, storage, and networking
capabilities, to support your test and development needs.
2. Provisioning: Use cloud-based automation tools to quickly provision and de-provision development and test environments as needed. This allows you to create and dispose of environments on the fly, saving time and money (see the provisioning sketch after this list).
3. Version Control: Use version control tools to keep track of changes to your code
and ensure that everyone is working with the latest version. This is important in
distributed teams where multiple developers may be working on the same
codebase.
4. Collaboration: Use cloud-based collaboration tools to enable developers to work
together seamlessly, regardless of location. This can help to streamline the
development process and improve overall efficiency.
5. Testing: Leverage cloud-based testing tools to simulate and test applications in a
variety of environments, such as load testing, performance testing, and security
testing. This can help to ensure that applications are functioning as expected
before deployment.
6. Cost: Compare the costs of virtual test and development environments in the
cloud versus on-premise environments, and consider factors such as hardware,
software licenses, and personnel costs.
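
As a sketch of the provisioning point above, the snippet below spins up and later tears down a short-lived test instance with the AWS SDK for Python (boto3). The AMI ID, instance type, and tag values are placeholders, credentials are assumed to be configured, and any other cloud SDK or IaC tool could play the same role.

import boto3  # AWS SDK for Python; pip install boto3, credentials assumed configured

ec2 = boto3.client("ec2", region_name="us-east-1")

def create_test_environment() -> str:
    """Provision a short-lived instance for testing (AMI ID and type are placeholders)."""
    result = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",   # placeholder AMI
        InstanceType="t3.micro",
        MinCount=1,
        MaxCount=1,
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [{"Key": "purpose", "Value": "test-env"}],
        }],
    )
    return result["Instances"][0]["InstanceId"]

def destroy_test_environment(instance_id: str) -> None:
    """De-provision the environment when testing is done, so billing stops."""
    ec2.terminate_instances(InstanceIds=[instance_id])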
Data Center Challenges and Solutions
Data centers in cloud computing can face a number of challenges. Here are some
common challenges and potential solutions:

1. Scalability: One of the biggest challenges for data centers in cloud computing is ensuring that they can scale up or down to meet demand. To address this challenge, organizations can use cloud-based auto-scaling tools to automatically provision and de-provision resources as needed (a threshold-based scaling sketch follows this list).
2. Security: Data centers must ensure that data is secure, both from external threats
and from internal threats such as malicious employees. To address this challenge,
data centers can use a range of security measures, such as access controls,
encryption, and intrusion detection systems.
3. Compliance: Many organizations must comply with a range of regulatory
requirements around data privacy and security. To address this challenge, data
centers can implement policies and procedures that comply with relevant
regulations, and use cloud-based tools that are designed to meet compliance
requirements.
4. Reliability: Data centers must ensure that services are highly available and
reliable. To address this challenge, organizations can use multiple data centers in
different geographic regions, and implement load balancing and failover
mechanisms to ensure continuity of service.
5. Cost: Building and operating a data center can be expensive. To address this
challenge, organizations can use cloud-based infrastructure as a service (IaaS)
providers to outsource some or all of their data center operations. This allows
them to pay only for the resources they need, rather than investing in expensive
hardware and software.
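
A minimal sketch of the threshold-based auto-scaling decision referenced in point 1. The thresholds, bounds, and metric source are assumptions for illustration; managed auto-scaling services implement far richer policies.

# Hypothetical thresholds; managed auto-scaling services expose these as policy settings.
SCALE_UP_CPU, SCALE_DOWN_CPU = 75.0, 25.0
MIN_INSTANCES, MAX_INSTANCES = 2, 10

def desired_instances(current: int, avg_cpu_percent: float) -> int:
    """Add capacity under load, shed it when idle, and stay within the allowed bounds."""
    if avg_cpu_percent > SCALE_UP_CPU:
        current += 1
    elif avg_cpu_percent < SCALE_DOWN_CPU:
        current -= 1
    return max(MIN_INSTANCES, min(MAX_INSTANCES, current))

print(desired_instances(current=3, avg_cpu_percent=82.0))  # scales up to 4
print(desired_instances(current=3, avg_cpu_percent=12.0))  # scales down to 2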

By addressing these challenges, organizations can create highly scalable, secure, compliant, reliable, and cost-effective data centers in cloud computing.

Automating The Data Center

Automating the data center in cloud computing involves using software tools and
processes to streamline the management and operation of the data center. Automation
can help organizations to reduce costs, improve efficiency, and increase agility. Here are
some key considerations when automating the data center in cloud computing:

1. Infrastructure as Code: Use infrastructure as code (IaC) tools to automate the provisioning and configuration of infrastructure resources, such as servers, storage, and networks. IaC tools enable administrators to manage infrastructure resources in the same way that they manage software code, which can lead to faster provisioning times, greater consistency, and reduced errors (a desired-state sketch follows this list).
2. Orchestration: Use orchestration tools to automate the deployment and
management of applications and services across the data center. Orchestration
tools can help to reduce manual intervention, improve consistency, and enable
rapid deployment of new applications and services.
3. Monitoring: Use monitoring tools to automatically detect and respond to changes
in the data center environment. Monitoring tools can help to identify issues before
they become critical, and can enable administrators to take proactive steps to
address problems.
4. Compliance: Use compliance automation tools to ensure that the data center is
compliant with relevant regulations and standards. Compliance automation tools
can help to identify and resolve compliance issues, and can enable administrators
to enforce policies and procedures in a consistent manner.
5. Disaster Recovery: Use disaster recovery automation tools to automate the
failover and recovery of critical applications and services in the event of an
outage. Disaster recovery automation tools can help to minimize downtime and
reduce the impact of disruptions on the organization.
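
A minimal sketch of the infrastructure-as-code idea from point 1: the desired infrastructure is declared as data, and a reconcile step compares it with the current state and reports what to create or remove. The resource names are hypothetical, and real IaC tools (Terraform, CloudFormation, Pulumi, and others) do this with full provider plugins.

# Desired state is declared as data and kept in version control, like software code.
DESIRED = {"web-server-1", "web-server-2", "db-server-1"}

def reconcile(current: set[str]) -> tuple[set[str], set[str]]:
    """Compute which resources to create and which to remove to match the declaration."""
    to_create = DESIRED - current
    to_remove = current - DESIRED
    return to_create, to_remove

create, remove = reconcile(current={"web-server-1", "old-batch-host"})
print("create:", create)   # web-server-2 and db-server-1
print("remove:", remove)   # old-batch-host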

By automating the data center in cloud computing, organizations can achieve greater
efficiency, consistency, and agility, while reducing costs and improving security and
compliance.

UNIT V – CLOUD COMPUTING SECURITY ARCHITECTURE
Cloud security fundamentals
Cloud security is the set of control-based security measures and technology protections designed to protect resources stored online from leakage, theft, and data loss. Protection covers data, cloud infrastructure, and applications against threats. Security applications are delivered as software, following the same model as SaaS (Software as a Service).

Benefits of Cloud Security System

Understanding how cloud computing security operates helps you find the ways it can benefit your business.

Cloud-based security systems benefit the business by:

o Protecting the business from dangers

o Protecting against internal threats

o Preventing data loss

o Stopping malware and ransomware attacks

Top threats to the system include malware and ransomware. Malware poses a severe threat to businesses.

More than 90% of malware arrives via email. It is often convincing enough that employees download it without analysing it. Once downloaded, the malicious software installs itself on the network to steal files or damage content.

Ransomware is malware that hijacks a system's data and demands a financial ransom. Companies often end up paying the ransom because they want their data back.

Data redundancy offers an alternative to paying a ransom for your data: you can recover what was stolen with minimal service interruption.

Many cloud data protection solutions identify malware and ransomware, and firewalls help keep malicious email out of the inbox.

Vulnerability assessment tool for cloud

Vulnerability Assessment

In information technology, a vulnerability assessment is the systematic analysis of security weaknesses. It examines whether the system is exposed to any known vulnerabilities, assigns severity levels to those vulnerabilities, and, where appropriate, recommends remediation or mitigation.

Vulnerability testing, or vulnerability assessment, is thus a systematic method of discovering security loopholes in any device so that potential vulnerabilities can be fixed.

Some examples of threats that vulnerability assessment can help eliminate are:

1. SQL injection, XSS, and other code injection attacks (a brief illustration follows this list).

2. Privilege escalation caused by faulty user authentication.

3. Insecure defaults, such as a guessable admin password, in applications that ship with vulnerable configurations.
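
As a brief illustration of the first item, the sketch below contrasts a query built by string concatenation (which a scanner would flag as SQL injection) with a parameterized query. It uses Python's built-in sqlite3 module and a hypothetical users table.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "alice' OR '1'='1"  # a classic injection payload

# Vulnerable: user input is concatenated into the SQL text, so the payload alters the query.
unsafe = f"SELECT role FROM users WHERE name = '{user_input}'"
print(conn.execute(unsafe).fetchall())      # returns rows it should not

# Safe: a parameterized query treats the input purely as data.
safe = "SELECT role FROM users WHERE name = ?"
print(conn.execute(safe, (user_input,)).fetchall())  # returns nothing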

Vulnerability Assessment Tools

Vulnerability assessment tools provide multiple methods of detecting vulnerabilities across application domains. Code analysis tools look for coding flaws, while audit vulnerability toolkits can discover well-known rootkits, backdoors, and Trojan horses.

Several vulnerability scanners are available in the industry; they can be free, commercial, or open source, and many free and open-source tools are hosted on GitHub. Choosing which tool to use depends on a variety of factors, such as the category of security vulnerabilities targeted, the budget, how often the tool is updated, and so on.

Some of the best-known vulnerability scanning tools are discussed below.

1. OpenVAS

OpenVAS is a valuable vulnerability detection tool that supports large-scale scans suitable for companies. It can be used to diagnose vulnerabilities not only in web applications and application servers, but also in databases, operating systems, networks, and virtualization software.

OpenVAS receives frequent updates, which widens its vulnerability detection coverage. It also assists in assessing risk and suggests preventive measures for the identified vulnerabilities.

2. Nikto2
Nikto2 is an open-source vulnerability scanner that focuses on web application security. Nikto2 can check for roughly 6,700 potentially dangerous files and programs that cause web server problems, and it also tests for outdated server versions. In addition, Nikto2 can quickly alert you to server configuration problems and performs web server audits in a minimal amount of time.

Nikto2 does not offer preventive measures or risk management functionality for the vulnerabilities it finds. However, it is a frequently updated tool, which allows a broader range of vulnerabilities to be covered.

3. Netsparker

Netsparker is another vulnerability assessment tool for web applications, with automation features for finding vulnerabilities. It is also capable of finding vulnerabilities across a large number of web applications within a few hours.

Although it is a paid, enterprise-level vulnerability tool, it has crawling technology that discovers vulnerabilities by crawling through the application. Netsparker identifies the vulnerabilities it reports and recommends mitigation strategies, and additional security tooling is available for comprehensive vulnerability assessment.

4. Acunetix

Acunetix is a commercial web application vulnerability scanner (an open-source edition is also available) that offers several features. It maps a range of about 6,500 vulnerabilities, and it can discover network vulnerabilities in addition to those in web applications.

Acunetix makes it possible to streamline your scans, and it is suitable for large-scale organizations because it can manage many systems. HSBC, NASA, and the US Air Force are a few of the industrial titans that use the Arachni scanning tool for vulnerability testing.

5. Arachni

Arachni is another dedicated vulnerability tool, aimed at web applications. It covers a wide range of vulnerabilities, which are inspected and updated periodically. Arachni offers risk assessment facilities and suggests defensive measures for the vulnerabilities that have been identified.

Arachni is a free, open-source security vulnerability tool that supports Linux, Windows, and macOS. With its capacity to keep up with recently discovered vulnerabilities, Arachni is also useful for penetration testing.

6. Nmap

Among cybersecurity experts, Nmap is possibly the best-known free and open-source network testing tool. Nmap uses probing techniques to discover hosts on a network and to identify the software running on them.

This capability helps detect vulnerabilities across one or more distinct networks. If you are a beginner learning to search for vulnerabilities, the Nmap scanning tool is a great starting point.
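
A minimal sketch of driving Nmap from Python via subprocess, assuming the nmap binary is installed and that you only scan hosts you are authorized to test; the -sV option enables service and version detection.

import subprocess

def scan_host(target: str) -> str:
    """Run a service/version detection scan against a host you are authorized to test."""
    result = subprocess.run(
        ["nmap", "-sV", target],          # -sV: probe open ports for service/version info
        capture_output=True, text=True, check=True,
    )
    return result.stdout

# Example (scanme.nmap.org is provided by the Nmap project for test scans):
print(scan_host("scanme.nmap.org"))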

7. W3AF
W3AF, the Web Application Attack and Audit Framework, is a free and open-source platform for analyzing web applications for vulnerabilities. By identifying and evaluating bugs, it provides a mechanism that helps protect the web application, and it is recognized for its user-friendliness. W3AF also offers exploitation features for vulnerability assessment work, along with penetration testing options.

W3AF covers a broad set of vulnerabilities. This tool is a good choice for applications that are attacked repeatedly, particularly with previously unrecognized vulnerabilities.

8. GoLismero

GoLismero is a free and open-source security testing tool. GoLismero aims to identify web application threats and vulnerabilities, but it can also search for network vulnerabilities. It works efficiently with the results produced by other vulnerability tools such as OpenVAS, consolidating the findings and providing feedback.

GoLismero covers a broad variety of vulnerabilities, including storage and network vulnerabilities, and it also suggests preventive measures for the vulnerabilities it discovers.

Privacy and Security in cloud


As covered under cloud security fundamentals, cloud security comprises the control-based security measures and technology protections designed to protect resources stored online from leakage, theft, and data loss, covering data, cloud infrastructure, and applications against threats.

How to manage security in the cloud?

Cloud service providers have many methods of protecting data.

The firewall is a central part of cloud architecture. It protects the network and the perimeter of end users, and it also protects traffic between the various applications stored in the cloud.

Access control protects data by allowing us to set access lists for various assets. For example, you can allow specific employees to use an application while restricting others. The guiding rule is that employees should only access the resources they require. Maintaining strict access control helps keep essential documents out of the hands of malicious insiders and hackers.
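
A minimal access-control sketch, assuming a hypothetical access list that maps each asset to the users allowed to reach it:

# Hypothetical access lists: asset -> set of users allowed to access it.
ACCESS_LISTS = {
    "payroll-app": {"alice", "bob"},
    "crm": {"carol"},
}

def is_allowed(user: str, asset: str) -> bool:
    """Grant access only if the user appears on the asset's access list."""
    return user in ACCESS_LISTS.get(asset, set())

print(is_allowed("alice", "payroll-app"))  # True
print(is_allowed("carol", "payroll-app"))  # False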

Data protection methods include virtual private networks (VPNs), encryption, and masking. A VPN allows remote employees to connect to the network and accommodates tablets and smartphones for remote access. Data masking maintains the data's integrity by keeping identifiable information private; for example, a medical company can share masked data without violating HIPAA rules.
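
A minimal data-masking sketch, assuming a hypothetical patient record: identifiable fields are replaced with a fixed token while the analytically useful fields stay intact.

# Hypothetical record; which fields count as identifiable is a policy decision.
SENSITIVE_FIELDS = {"name", "ssn", "phone"}

def mask_record(record: dict) -> dict:
    """Replace identifiable values with a fixed token, preserving the rest of the record."""
    return {k: ("***MASKED***" if k in SENSITIVE_FIELDS else v) for k, v in record.items()}

patient = {"name": "Jane Doe", "ssn": "123-45-6789", "phone": "555-0100",
           "diagnosis": "hypertension", "age_band": "40-49"}
print(mask_record(patient))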

Threat intelligence identifies security threats and ranks them in order of importance, which helps protect mission-critical assets. Disaster recovery is also vital for security because it helps recover data that is lost or stolen.


Cloud Computing Security Architecture
Security in cloud computing is a major concern. Proxy and brokerage services should be
employed to restrict a client from accessing the shared data directly. Data in the cloud
should be stored in encrypted form.
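
As a small illustration of storing data in encrypted form, the sketch below uses the cryptography package's Fernet symmetric encryption; key management, which a provider or key management service would normally handle, is out of scope here.

from cryptography.fernet import Fernet  # third-party package: pip install cryptography

# In practice the key lives in a key management service, never alongside the data.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b"customer file to be stored in the cloud"
stored_blob = cipher.encrypt(record)      # what actually lands in cloud storage
print(stored_blob[:16], "...")

recovered = cipher.decrypt(stored_blob)   # only a key holder can read it back
assert recovered == record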

Security Planning

Before deploying a particular resource to the cloud, one should need to analyze several
aspects of the resource, such as:

o Select the resource that needs to move to the cloud and analyze its sensitivity to risk.

o Consider cloud service models such as IaaS, PaaS, and SaaS. These models require the customer to be responsible for security at different service levels.

o Consider the cloud type, such as public, private, community, or hybrid.

o Understand the cloud service provider's system regarding data storage and its transfer into and out of the cloud.

o The risk in a cloud deployment mainly depends upon the service models and cloud types.

Understanding Security of Cloud

Security Boundaries

The Cloud Security Alliance (CSA) stack model defines the boundaries between each service model and shows how different functional units relate. A particular service model defines the boundary between the service provider's responsibilities and the customer's. The CSA stack model illustrates these boundaries.

Architectural considerations
Cloud Computing Architecture

As we know, cloud computing technology is used by both small and large organizations to store information in the cloud and access it from anywhere, at any time, over an internet connection.

Cloud computing architecture is a combination of service-oriented architecture and event-driven architecture.

Cloud computing architecture is divided into the following two parts -

o Front End

o Back End


Front End

The front end is used by the client. It contains the client-side interfaces and applications that are required to access cloud computing platforms. The front end includes web browsers (such as Chrome, Firefox, Internet Explorer, etc.), thin and fat clients, tablets, and mobile devices.

Back End

The back end is used by the service provider. It manages all the resources that are required to provide cloud computing services. It includes large-scale data storage, security mechanisms, virtual machines, deployment models, servers, traffic control mechanisms, etc.
