Unit 4 and 5
o Virtualization
o Utility Computing
Utility Computing
Large organizations such as Google and Amazon established their own utility services
for computing, storage, and applications.
Virtualization
Virtualization is the process of creating a virtual environment to run multiple applications
and operating systems on the same server. The virtual environment can be anything,
such as a single instance or a combination of many operating systems, storage devices,
network application servers, and other environments.
The concept of virtualization in cloud computing increases the use of virtual machines. A
virtual machine is a software computer: a program that behaves like a physical machine
and can perform tasks such as running applications or programs on the user's demand.
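For a concrete feel of how virtual machines are driven programmatically, the minimal Python sketch below lists the virtual machines on a host using the libvirt-python bindings. It assumes libvirt and a local QEMU/KVM hypervisor are installed; it is an illustration, not a prescribed setup.

    # Minimal sketch: enumerate virtual machines on a host via libvirt.
    # Assumes the libvirt-python package and a local QEMU/KVM hypervisor.
    import libvirt

    conn = libvirt.open("qemu:///system")  # connect to the local hypervisor
    try:
        for domain in conn.listAllDomains():
            state, _reason = domain.state()
            running = "running" if state == libvirt.VIR_DOMAIN_RUNNING else "not running"
            print(f"{domain.name()}: {running}")
    finally:
        conn.close()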
Types of Virtualization
A list of types of Virtualization is given below -
i. Hardware virtualization
ii. Operating system virtualization
iii. Server virtualization
iv. Storage virtualization
v. Data virtualization
Hyper-Threading
Hyper-Threading is Intel's simultaneous multithreading technology: it makes a single
physical processor core appear to the operating system as two logical processors,
allowing two threads to be scheduled at once.
Blade Servers
Blade servers provide several benefits:
o Stackability: Because of their stackability, the servers can be kept in smaller
air-controlled areas, and each blade can be controlled and managed in tandem with
other server units within a data center or network.
o Seamless movement within rack and minimal wiring: Organizations using blade
servers can experience a reduction in cabling for blade server housing compared to
traditional rack or tower servers.
o Low power consumption: Servers within a rack are able to share a single power
supply, lowering overall energy use.
o Storage consolidation: Each blade in a blade server typically comes with one or two
local ATA or SCSI drives. For additional storage, blade servers can connect to a
storage pool such as a storage area network (SAN).
o Compact size: Unlike traditional rack servers, blade computing does not have minimum
size restrictions.
o High-trust compatibility: Blade servers by nature carry out a highly individualized
task, which suits mission-critical applications or programs without which the entire
company or project could not exist.
Common applications hosted on blade servers include:
o Caching: Temporarily storing a website's information on the visitor's computer so it
can be quickly pulled up and recalled, reducing wait time and stalling.
o SSL encryption of web communication: Ensuring that information that travels over an
open network remains private and secure.
o Transcoding: Converting the code of web page content so it moves seamlessly between
different devices and screen formats.
o Streaming: Transmitting audio and video content without interruption to allow
real-time viewing and listening.
o Load balancing: Like most clustering applications, blade servers can be managed to
include load balancing and failover capabilities.
o Storage: The sleeker, more compact design allows a larger amount of information to be
stored in a smaller physical space.
Automated Provisioning
Automated provisioning replaces manual account and access setup with automated
practices. It is the first key tenet of identity and access management.
Streamlined onboarding.
Automated provisioning allows you to take the onboarding burden off of your Human
Resources or IT department. Organizations can add their new employees, contractors,
consultants, etc. into their identity management system, and through automated
provisioning, they’ll get access to specific applications and resources needed to do their
job.
Automated Provisioning also increases productivity by giving users the access they need
on day one. Users don’t have to wait to receive access and are empowered to start their
work immediately.
Cost savings.
By eliminating manual provisioning work, organizations also save the time and labor
cost of IT staff setting up each account by hand.
Error reduction.
Because automated provisioning eliminates manual processes, it also greatly reduces the
margin of error. There is less chance of a slip-up when adding a user to the system
and provisioning access to applications.
In addition, by automating user provisioning, you reduce the risk of security threats and
data breaches, as the only way to get access to these applications is through the roles
and permissions set up by the organization.
Organizations have full visibility into who has access to what, significantly reducing risk
of a security mishap.
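To make the streamlined-onboarding idea concrete, here is a minimal Python sketch of role-based automated provisioning; the role-to-application mapping and the grant_access helper are hypothetical stand-ins for calls to a real identity management API (for example, one based on the SCIM standard).

    # Illustrative sketch of role-based automated provisioning.
    # ROLE_APPS and grant_access() are hypothetical placeholders for a
    # real identity provider's API.
    ROLE_APPS = {
        "engineer":  ["email", "vcs", "ci"],
        "recruiter": ["email", "ats"],
    }

    def grant_access(user: str, app: str) -> None:
        # Placeholder for a call into the identity management system.
        print(f"granted {user} access to {app}")

    def provision(user: str, role: str) -> None:
        """Give a new user every application mapped to their role."""
        for app in ROLE_APPS.get(role, []):
            grant_access(user, app)

    provision("alice", "engineer")  # access on day one, no manual tickets

Because access flows only through the role mapping, who has access to what remains fully visible and auditable, which is exactly the risk reduction described above.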
In a policy, you can define which resources belong together and are managed as
one (business entity). For example, a DB2® System consists of many resources.
With Automation Control, you can group and aggregate resources to more
meaningful manageable entities, for example, My-HumanResource-Application
and so forth. You can monitor on this level, issue commands on this level, and
manage at the business level rather than at the single IT resource level.
In policies, you can specify how resources are dependent and related to each
other. For example, which of the other resources must be available before a
certain resource can be started. In another example, my database must be up
and running before my application is started.
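As a small sketch of how such an ordering can be computed, the example below uses Python's standard graphlib to start the database before the application, as the policy requires; the resource names are hypothetical.

    # Illustrative sketch: derive a dependency-aware start order.
    from graphlib import TopologicalSorter  # standard library, Python 3.9+

    # "application" depends on "database": the DB must be up first.
    policy = {
        "application": {"database"},
        "database": set(),
    }

    start_order = list(TopologicalSorter(policy).static_order())
    print(start_order)  # ['database', 'application']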
Policy definitions can be reused, copied, and cloned for similar applications
elsewhere in the enterprise.
Because the underlying technology is responsible for the detailed actions, these
actions are performed in a consistent and reliable manner. With traditional
programming solutions, the testing of abnormal conditions is difficult and prone to
be incomplete. The action of automation under these abnormal conditions is,
however, critical to the entire automation solution.
The Sample Add-on policy includes automation modules for the base operating system,
major middleware, and systems management software. Today's wide range of add-on
policies is based on best practices and can help reduce the time and effort of creating
or updating a policy.
Another way to populate your policy is to use the autodiscovery function. If you use
this method, data is gathered for all address spaces on your system that use the UNIX
System Services.
Application Management
Cloud Application Management for Platforms (CAMP) is a specification for managing applications
in the context of a platform as a service (PaaS) system. CAMP is designed to address the needs of a
high-level PaaS system; one in which the consumer (generally a developer or application
administrator) provides application artifacts (code, data, graphics, etc.) and specifies which provider-
supplied services are required to realize these artifacts as an application. The details of the
infrastructure (compute, storage, and networking) used to support these services are hidden from the
consumer by the provider of the PaaS system.
CAMP defines:
o A domain-specific language that describes the artifacts that make up an application,
the services that are required to execute or utilize those artifacts, and the
relationship of the artifacts to those services.
o A resource model for representing applications and their constituent components, as
well as the services used by those components, along with runtime status information,
configuration information, and metadata that describes the PaaS system.
o A RESTful protocol for manipulating these resources and, by so doing, changing the
state of the underlying application.
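A rough sketch of what such a RESTful interaction could look like from the consumer's side, using Python's requests library; the base URL and JSON fields here are illustrative assumptions, and the actual resource names and operations are defined by the CAMP specification.

    # Illustrative sketch of a CAMP-style RESTful call.
    # The endpoint URL and JSON fields are hypothetical.
    import requests

    base = "https://paas.example.com/camp"

    # Read an application resource (runtime status, configuration, ...).
    app = requests.get(f"{base}/assemblies/my-app").json()
    print(app.get("status"))

    # Change the application's state by manipulating the resource.
    requests.post(f"{base}/assemblies/my-app", json={"operation": "stop"})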
Here are some key considerations when evaluating utility management technologies in
cloud computing:
1. Security: How does the technology ensure the security of your data? Cloud
computing introduces new security risks, such as data breaches and unauthorized
access, so it's important to ensure that the technology you choose has robust
security measures in place.
2. Scalability: Can the technology scale to meet your changing needs over time?
Cloud computing makes it easier to scale up or down as needed, so you should
look for a technology that can grow or shrink with your business.
3. Integration: Does the technology integrate with your existing systems and
workflows? Cloud-based utility management technology should be easy to
integrate with your other cloud-based tools, such as your accounting software or
CRM system.
4. Cost: What is the total cost of ownership for the technology? Cloud-based utility
management technology typically has lower upfront costs than on-premise
solutions, but you should also consider ongoing costs such as subscription fees
and support costs.
5. Reliability: How reliable is the technology? Cloud-based solutions are typically
more reliable than on-premise solutions, but you should still look for a technology
that offers high availability and uptime guarantees.
By considering these factors, you can evaluate different utility management technologies
in cloud computing and choose the solution that best fits your needs and goals.
Similar considerations apply when creating a virtual test and development environment
in cloud computing.
Automating the data center in cloud computing involves using software tools and
processes to streamline the management and operation of the data center. Automation
can help organizations reduce costs, improve efficiency, and increase agility. By
automating the data center, organizations can achieve greater efficiency, consistency,
and agility, while reducing costs and improving security and compliance.
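As an illustration of the reconcile-style logic behind data center automation, here is a small Python sketch; the CloudAPI class is a hypothetical stand-in for a provider SDK or infrastructure-as-code tool.

    # Illustrative sketch: keep the running server count at a desired target.
    class CloudAPI:
        """Hypothetical stand-in for a cloud provider's API."""
        def __init__(self):
            self.servers = ["web-1"]
        def list_servers(self):
            return list(self.servers)
        def create_server(self, name):
            self.servers.append(name)
            print(f"provisioned {name}")

    def reconcile(api: CloudAPI, desired_count: int) -> None:
        """Provision servers until the running count matches the target."""
        running = api.list_servers()
        for i in range(len(running), desired_count):
            api.create_server(f"web-{i + 1}")

    reconcile(CloudAPI(), desired_count=3)  # provisions web-2 and web-3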
Understanding how cloud computing security operates helps you find ways it can benefit
your business.
More than 90% of malware arrives via email. Email often seems so reassuring that
employees download malware without analysing it. Once downloaded, the malicious
software installs itself on the network to steal files or damage content.
Ransomware is malware that hijacks a system's data and demands a financial ransom.
Companies may be tempted to pay the ransom because they want their data back.
Data redundancy provides an alternative to paying the ransom: you can recover the data
that was stolen with minimal service interruption.
Many cloud data protection solutions identify malware and ransomware. Firewalls keep
malicious email out of the inbox.
Vulnerability Assessment
Vulnerability assessment can eliminate threats such as code injection attacks (for
example, SQL injection and XSS), privilege escalation caused by faulty authentication
mechanisms, and software that ships with insecure default settings.
Several vulnerability scanners are available in the industry; they can be free,
commercial, or open-source, and many free and open-source tools are hosted on GitHub.
Choosing which tool to use depends on a variety of factors, such as the category of
security vulnerabilities, the cost estimate, how frequently the tool is updated,
etc.
Here, we have discussed some of the best vulnerability scanning tools. They are-
1. OpenVAS
OpenVAS is a valuable tool for detecting vulnerabilities that supports the large-scale
scans appropriate for companies. This tool can be used not only on web applications
and application servers but also on databases, operating systems, networks, and
virtualization software to diagnose vulnerability problems.
2. Nikto2
Nikto2 is an open-source vulnerability scanning program that focuses on web
application security. Nikto2 can discover around 6700 dangerous files that cause web
server problems and can report outdated server versions. In addition, Nikto2 can
quickly alert you to server configuration problems and performs web server audits in a
minimal amount of time.
Nikto2 does not offer countermeasures for the vulnerabilities it finds, nor does it
provide risk assessment functionality. However, Nikto2 is a frequently updated tool
that allows vulnerabilities to be covered more broadly.
3. Netsparker
Netsparker is another web application vulnerability scanner, with automation features
available for finding vulnerabilities.
4. Acunetix
Acunetix offers the opportunity to streamline your scans. It is suitable for large-
scale organizations because it can manage many systems at once. HSBC, NASA, and the
US Air Force are a few of the industrial titans that use the Acunetix scanning tool
for vulnerability testing.
5. Arachni
Arachni is a free and open-source security vulnerability tool that supports Linux,
Windows, and macOS. With its capacity to adapt to newly discovered vulnerabilities,
Arachni also aims to assist in penetration testing.
6. Nmap
Among cybersecurity experts, Nmap is possibly the best-known free and open-source
network testing tool. Nmap uses probing techniques to explore hosts on a network and
to discover the software they run.
This capability aims to detect vulnerabilities in two or more distinct networks. If
you are a beginner learning to search for vulnerabilities, the Nmap scanning tool is a
great starting point.
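As a quick illustration, the sketch below invokes Nmap from Python with its service/version detection option (-sV) against scanme.nmap.org, the host the Nmap project explicitly permits for test scans. It assumes the nmap binary is installed; only scan hosts you are authorized to test.

    # Illustrative sketch: run a basic Nmap service/version scan from Python.
    import subprocess

    result = subprocess.run(
        ["nmap", "-sV", "scanme.nmap.org"],  # -sV probes service versions
        capture_output=True, text=True,
    )
    print(result.stdout)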
7. W3AF
W3AF, also known as the Web Application Attack and Audit Framework, is a free and
open-source platform for analyzing web applications for vulnerabilities. By
identifying and evaluating bugs, it provides a mechanism that helps protect the web
application. This software is recognized for its user-friendliness. Along with
vulnerability assessment, W3AF also offers penetration testing options.
W3AF covers a broad set of vulnerabilities. This tool is a good choice for networks
that are attacked repeatedly, particularly when previously unrecognized
vulnerabilities may be present.
8. GoLismero
GoLismero is a free and open-source security testing tool. GoLismero aims to identify
web application threats and vulnerabilities, but it can also search for network
vulnerabilities. GoLismero works efficiently with the results produced by other
vulnerability tools such as OpenVAS, consolidating the findings and giving feedback.
The firewall is the central part of cloud architecture. It protects the network and
the perimeter of end-users, and it also protects traffic between the various apps
stored in the cloud.
Access control protects data by allowing us to set access lists for various assets.
For example, you can allow specific employees access to an application while
restricting others. A good rule is that employees can access only the equipment they
require. Maintaining strict access control keeps essential documents safe from
malicious insiders or hackers.
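A minimal sketch of the access-list idea in Python; the users, assets, and ACL shape are hypothetical, and a real system would enforce this in its identity and access management layer.

    # Illustrative sketch of an access control list.
    ACL = {
        "payroll-app": {"alice", "bob"},
        "crm": {"alice"},
    }

    def can_access(user: str, asset: str) -> bool:
        """Allow a user only the assets they are explicitly granted."""
        return user in ACL.get(asset, set())

    print(can_access("bob", "payroll-app"))  # True
    print(can_access("bob", "crm"))          # False: restricted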
Data protection methods include Virtual Private Networks (VPN), encryption, and
masking. A VPN allows remote employees to connect to the network and accommodates
tablets and smartphones for remote access. Data masking maintains the data's integrity
by keeping identifiable information private; for example, a medical company can share
data using data masking without violating HIPAA laws.
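As a toy illustration of data masking, the Python sketch below blanks out most of each identifying field while leaving the record's structure intact; the field names and masking rule are hypothetical.

    # Illustrative sketch: mask identifiable information in a record.
    def mask(value: str, keep: int = 2) -> str:
        """Replace all but the last `keep` characters with asterisks."""
        return "*" * max(len(value) - keep, 0) + value[-keep:]

    record = {"patient": "John Smith", "ssn": "123-45-6789", "diagnosis": "flu"}
    masked = {**record,
              "patient": mask(record["patient"]),
              "ssn": mask(record["ssn"], keep=4)}
    print(masked)  # identity fields hidden, diagnosis left readable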
Threat intelligence spots security threats and ranks them in order of importance. It
helps to protect mission-critical assets from threats. Disaster recovery is also vital
to security because it helps recover lost or stolen data.
Security in cloud computing is a major concern. Proxy and brokerage services should be
employed to restrict a client from accessing the shared data directly. Data in the cloud
should be stored in encrypted form.
Security Planning
Before deploying a particular resource to the cloud, one should analyze several
aspects of the resource, such as:
o Select the resource that needs to move to the cloud and analyze its sensitivity to risk.
o Consider cloud service models such as IaaS, PaaS, and SaaS. These models require the
customer to be responsible for security at different service levels.
o Consider the cloud type, such as public, private, community, or hybrid.
o Understand the cloud service provider's system regarding data storage and its
transfer into and out of the cloud.
o The risk in cloud deployment mainly depends upon the service models and cloud
types.
Security Boundaries
The Cloud Security Alliance (CSA) stack model defines the boundaries between each
service model and shows how different functional units relate. A particular service
model defines the boundary between the service provider's responsibilities and the
customer's.
Architectural considerations
Cloud Computing Architecture
As we know, cloud computing technology is used by both small and large organizations
to store information in the cloud and access it from anywhere at any time using an
internet connection. Cloud computing architecture is divided into two parts:
o Front End
o Back End
Front End
The front end is used by the client. It contains the client-side interfaces and
applications that are required to access cloud computing platforms. The front end
includes web browsers (such as Chrome, Firefox, Internet Explorer, etc.), thin and fat
clients, tablets, and mobile devices.
Back End
The back end is used by the service provider. It manages all the resources that are
required to provide cloud computing services. It includes huge amounts of data
storage, security mechanisms, virtual machines, deployment models, servers, traffic
control mechanisms, etc.
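To tie the two halves together, here is a minimal sketch of a front end calling a back-end service over the internet using Python's requests library; the URL is hypothetical, and everything behind it (storage, virtual machines, traffic control) stays hidden from the client.

    # Illustrative sketch: a front-end client talking to the cloud back end.
    import requests

    def fetch_report() -> bytes:
        # The front end only sees the provider's public interface; the
        # back-end resources that serve this request are invisible to it.
        response = requests.get("https://cloud.example.com/api/files/report.pdf")
        return response.content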