Unit-1 Edge Computing
Edge computing is a computing model in which processing takes place at or near the edge of a
network. Because processing occurs within or close to the device itself, less data travels
to the central server. Most operations happen in real time near the source of the data,
which reduces latency and the bandwidth consumed by round trips to a central server.
Edge computing also helps keep workloads up to date, ensure data privacy, and adhere
to data protection laws such as HIPAA, GDPR, and PCI. This processing model also
enables further innovations with artificial intelligence and machine learning.
Edge devices collect and store data before sending information to an on-premises edge
server, which performs the initial filtering, aggregation, and analysis of that data.
The edge server sends the most complex processing requests (big data operations and
business logic) to the data center or the cloud. While the need for a central dedicated
server remains, a business can set up slower, less expensive connections without
risking latency, because operations run locally and data is pre-sorted.
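The split described above can be sketched in a few lines. The following Python sketch (the request categories are invented for illustration) shows an edge server pre-sorting work: simple operations are handled locally, while big data operations and business logic are escalated to the cloud:

```python
def route_request(request):
    """Decide at the edge whether a request is handled locally or
    escalated to the central data center.

    The categories below are hypothetical; a real deployment would
    classify workloads by latency and compute requirements.
    """
    local_kinds = {"sensor_read", "cache_lookup", "threshold_check"}
    if request["kind"] in local_kinds:
        return "edge"          # handled near the data source, low latency
    return "cloud"             # big data operations, business logic, etc.

destinations = [route_request(r) for r in (
    {"kind": "sensor_read"},
    {"kind": "batch_analytics"},
    {"kind": "threshold_check"},
)]
```

Because only escalated requests traverse the slower link to the data center, that link can be cheaper without hurting local responsiveness.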
1. Autonomous vehicles
Autonomous vehicles are a prime edge computing use case, as they can only operate
safely and reliably when they're able to analyze in real time all the data required to
drive. Real-time analysis in the cloud, however, can be problematic; the volume of
data generated by autonomous vehicles and the corresponding potential for latency --
or even a lack of needed connectivity -- when sending data to the cloud could mean
unsafe delays. The volume of data that these vehicles amass is staggering. Industry
estimates on data generation vary significantly, but they all put data generation in the
terabytes.
2. Smart cities
Civic authorities are also using edge computing to create smart communities and run
their roadways with capabilities such as intelligent traffic controls. Edge supports a
host of areas within this broad category. It helps civic authorities, such as traffic
agencies, public transportation departments and private transportation companies
better manage their vehicle fleets and overall traffic flow by enabling rapid
adjustments based on real-time, on-the-ground conditions. For example, edge
computing platforms deployed to process vehicle data can determine which areas are
experiencing congestion and then reroute vehicles to lighten traffic.
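As a toy illustration of the rerouting logic above (zone names and capacities are invented), an edge platform might flag congested zones from live per-zone vehicle counts without any round trip to the cloud:

```python
def congested_zones(vehicle_counts, capacity):
    """Return, sorted, the zones whose vehicle count exceeds capacity.

    vehicle_counts and capacity map zone name -> number of vehicles;
    zones missing from capacity are treated as having capacity 0.
    """
    return sorted(zone for zone, count in vehicle_counts.items()
                  if count > capacity.get(zone, 0))

counts = {"downtown": 480, "harbor": 120, "airport": 310}
limits = {"downtown": 400, "harbor": 200, "airport": 350}
hotspots = congested_zones(counts, limits)  # candidates for rerouting
```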
Additionally, civic authorities -- such as city workers and regional planners -- can
deploy edge devices to process data coming from sensors on power grids, public
infrastructure, public facilities, private buildings and other locales to instantly assess
needs and speed response.
3. Stronger security
Edge computing also strengthens security systems. Companies, for example, can use a
biometric security product with optical technologies to perform iris scans, with edge
devices instantly analyzing those images to confirm that workers match authorized
access lists. Meanwhile, consumer security
products, such as video doorbells and security cameras, likewise benefit from the real-
time analysis that edge computing -- often in the form of fog nodes deployed in the
home network -- delivers.
4. Healthcare
Healthcare data comes from numerous medical devices, including those in doctors'
offices and hospitals, as well as from consumer wearables bought by patients themselves. But
all that data doesn't need to be moved to centralized servers for analysis and storage -
- a process that would create bandwidth congestion and an explosion in storage needs.
Instead, edge devices can ingest and analyze data coming from endpoint medical
devices to determine what data can be discarded, what should be retained and, more
critically, what requires immediate action. Consider, for example, data from a cardiac
device; an edge device could hold a program designed to aggregate normal readings
for reporting but instantly alert to an abnormal one that requires emergency attention.
Edge computing also plays a critical role in medical care delivery, such as robot-
assisted surgery, where real-time data analysis is essential.
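The cardiac-device example above might be sketched like this (the normal range and data format are assumptions for illustration, not clinical values):

```python
NORMAL_BPM = range(50, 121)  # assumed normal range: 50-120 bpm inclusive

def triage(readings):
    """Split heart-rate readings into a routine aggregate and urgent alerts.

    Normal readings are batched for periodic reporting; any abnormal
    reading is returned separately so the edge device can alert at once.
    """
    normal = [bpm for bpm in readings if bpm in NORMAL_BPM]
    alerts = [bpm for bpm in readings if bpm not in NORMAL_BPM]
    report = {"samples": len(normal),
              "avg_bpm": sum(normal) / len(normal) if normal else None}
    return report, alerts

report, alerts = triage([72, 75, 71, 180, 74])  # 180 bpm triggers an alert
```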
5. Industrial IoT
Industrial IoT has added millions of connected devices in manufacturing plants and
other such industries to gather data on production lines, equipment performance and
finished products. However, all the data doesn't need to be handled in centralized
servers -- every temperature reading from every connected thermometer isn't
important. In some cases, moving data to the centralized servers -- whether in the cloud
or on premises -- could be prohibitively expensive or impossible because of a facility's
remote location. In such cases, edge computing brings needed processing power to
where it's required, and those edge devices can be programmed to transfer aggregated
data back to central systems, initiate required actions at the endpoint, or both.
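The "not every temperature reading matters" idea can be sketched as a simple edge filter (the band limits are invented): in-band readings are rolled into an aggregate for the central system, while out-of-band readings are returned for immediate local action:

```python
def process_batch(readings, low, high):
    """Filter a batch of temperature readings at the edge.

    Returns (aggregate, actions): the compact summary forwarded to the
    central system, and the out-of-band readings needing local action.
    """
    in_band = [t for t in readings if low <= t <= high]
    actions = [t for t in readings if t < low or t > high]
    aggregate = {"n": len(in_band),
                 "avg": round(sum(in_band) / len(in_band), 2) if in_band else None}
    return aggregate, actions

aggregate, actions = process_batch([68.1, 69.4, 70.2, 103.5], low=60.0, high=80.0)
```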
Moreover, edge computing delivers the speed required for manufacturing and
industrial operations, where automated assembly lines move rapidly and require real-
time interventions to address problems. Edge computing is often used to
support predictive maintenance efforts, energy efficiency initiatives, custom
production runs, smart manufacturing and intelligent operations. Industrial executives
are also using edge as part of an IoT ecosystem to monitor, analyze and manage energy
use in their factories, plants and offices. Energy utilities themselves can use edge for
monitoring and managing their own equipment in the field.
6. Virtual and augmented reality
Similar to other use cases, virtual reality (VR) and augmented reality (AR) both
require the real-time processing of large data sets because any lag in analysis would
delay subsequent actions. That would mean delayed images and instructions in the
case of VR and AR, creating a poor -- or in some cases even an unsafe -- user
experience at a time when use of these technologies is greatly expanding.
Workers use these technologies to guide them through their tasks and to learn new
processes. Students use them to learn complex concepts. Individuals use them for
entertainment and skills enhancement. Businesses apply the technologies to enable
unique and customized experiences, such as personalized shopping displays. Edge
computing enables those various experiences when bandwidth limitations, costs
and/or privacy concerns make using centralized processing power a poor choice.
7. Workplace safety
U.S. private industry employers reported 2.1 million nonfatal workplace injuries in
2020, according to the federal Bureau of Labor Statistics (BLS). There were 5,333
deaths due to work-related injuries in 2019, the most recent BLS figures. But industry
is using a combination of technologies -- such as endpoint sensors, computer vision
and artificial intelligence, as well as edge devices -- to power workplace safety
applications.
For example, companies can use locational data from on-site employees to enforce the
social distancing requirements brought on by the COVID-19 pandemic, alerting them
if they move and stay too close together. Because such locational data has no value
beyond that moment, the information can be collected and processed on the edge rather
than moved and stored in the corporate data center.
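A minimal sketch of such a proximity check (worker IDs, coordinates, and the two-meter threshold are invented): positions are evaluated on the edge device and can be discarded as soon as alerts are computed:

```python
from itertools import combinations
from math import hypot

def too_close(positions, min_distance):
    """Return pairs of workers closer together than min_distance.

    positions maps worker ID -> (x, y) location in meters; the raw
    locations never need to leave the edge device.
    """
    return [(a, b)
            for (a, pa), (b, pb) in combinations(sorted(positions.items()), 2)
            if hypot(pa[0] - pb[0], pa[1] - pb[1]) < min_distance]

alerts = too_close({"w1": (0.0, 0.0), "w2": (1.0, 0.0), "w3": (10.0, 0.0)},
                   min_distance=2.0)
```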
8. Content delivery and streaming
Similar to the use of edge with augmented and virtual reality use cases, edge
computing supports the low-latency requirements of video streaming and content
delivery. Furthermore, it enables a good user experience for both existing and
emerging features such as search functions, content suggestions, personalized
experiences and interactive capabilities.
In fact, with over-the-top streaming platforms becoming the standard means for
distributing content, media companies are using edge computing to deliver original
content, live events and regional content with a flawless user experience -- as
consumers now expect.
9. Retail and personalized experiences
Businesses across industries, from banking to retail, are exploring how they can use
edge computing to deliver hyperpersonalized experiences and targeted ads to
customers. They're also developing ways to use edge computing to support new
services, such as AR-enabled interactive shopping.
10. Smart homes
The volume of data being generated and transmitted by households has exploded as
homes have gone high-tech, with everything from AI-enabled virtual assistants -- such
as Amazon's Alexa -- to connected security systems to smart speakers all adding
traffic to the available bandwidth. Edge computing located within the home could ease
the strain on service provider networks, ensure real-time response and boost privacy
by keeping more of the household's data close and out of third-party systems.
BENEFITS OF EDGE COMPUTING
Reduces latency and improves speed
Hosting applications and data on centralized hosting platforms or centers can create
latency when users try to use them over the internet. The process of requesting data
from these data centers can get slow when there are internet connectivity issues. Edge
computing solves this issue by keeping the data on the edge of the devices for easier
access.
Therefore, with edge computing, businesses can avoid issues affecting speed and
connectivity, as data can be fetched at the endpoints rather than from a faraway
centralized data center and back. Reducing the time an application spends fetching
data keeps it optimized for better performance and a better user experience.
Enhances privacy protections and data security
Data security and privacy protections are burning issues in the IT world. Edge
computing provides more data security and privacy protection because data is
processed within the edge rather than from central servers.
This does not mean, however, that edge devices are invulnerable. Rather, each edge
device processes only a small slice of the data, so there is rarely a complete
collection of data for attackers to target.
In other words, privacy can easily be compromised when data hosted on centralized
servers is hacked, because those servers contain more comprehensive information about
people, locations and events. In contrast, because edge computing creates, processes
and analyzes only the set of data needed at a given moment, other pieces of data that
might compromise privacy in the event of a hack are not exposed.
Reduces operational costs
Moving data around on cloud hosting services is one of the things businesses spend a
lot of money on. The higher the volume of data being moved on these centralized
hosting providers, the more money organizations spend.
However, with edge computing, organizations spend less on operational costs due to
the minimal need to move data to the cloud. In addition, since data is processed in the
same location it’s generated, there is also a reduction in the bandwidth needed to
handle the data load.
Helps in meeting regulatory and compliance requirements
Meeting regulatory and compliance requirements can be made more difficult when
data is hosted and managed by different data centers or hosting providers. This is
because each data center has its own privacy and regulatory requirements.
However, this is not the case with edge computing because data is created, stored and
processed in one place, making it easy to meet regulatory and compliance
requirements.
Enhances reliability and resiliency
With edge computing, data can still be fetched and processed with little or no
hindrance, even when internet connectivity is poor. In addition, a failure at one edge
device won't disrupt the operation of other edge devices in the ecosystem, improving
the reliability of the entire connected system.
Supports AI/ML applications
There is no denying the growing relevance of artificial intelligence (AI) and machine
learning (ML) in modern computing. However, AI/ML applications work by fetching and
processing huge volumes of data, which can suffer latency and connectivity issues when
the data is hosted on a centralized server. By processing data close to where it is
generated, edge computing lets these applications respond in real time without
depending on the round trip to a central server.
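As a toy sketch of edge-side scoring (the z-score rule and threshold are illustrative, not a production model), an edge node can flag anomalous readings locally and forward only those, avoiding the round trip that causes latency:

```python
def zscore_anomalies(values, threshold=3.0):
    """Flag values more than `threshold` standard deviations from the mean.

    Lightweight statistical scoring like this can run entirely on an
    edge device, so raw data never travels to a central server.
    """
    n = len(values)
    mu = sum(values) / n
    sigma = (sum((v - mu) ** 2 for v in values) / n) ** 0.5
    if sigma == 0:
        return []
    return [v for v in values if abs(v - mu) > threshold * sigma]

outliers = zscore_anomalies([10] * 20 + [100])  # only the outlier is forwarded
```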
EDGE PLATFORM:
The software and applications running on the edge define its purpose. As you
scale edge devices, managing them remotely becomes the challenge. Certainly,
custom control and deployment models exist and are used in production.
However, today we have commercial off-the-shelf edge management frameworks
as well as container-based methodologies that ease the burden of deploying
software in a secure and controlled manner to remote edge computers.
In either case, we want the software and system to be:
1. Robust: Capable of receiving, reimaging, and rerunning software
as it is deployed
2. Controlled: Having a central cloud or service that manages and
monitors the deployment
3. Responsive: Reporting back information on the success or failure of
software reimaging
Virtualization
We can contrast the types of virtualization as follows:
Hardware virtualization: A hardware-level abstraction that is generally
capable of running any software that can run on bare metal. It uses a
hypervisor to manage one or more virtual machines on the processor and
can support the virtual replication of hardware to multiple virtual operating
systems through hardware IO virtualization. These techniques require
processor and hardware support for virtualization usually found on higher-
end processors like ARM Cortex A series parts.
As a subcategory, there are two types of hypervisors: Type-1 hypervisors
run directly on bare metal, and Type-2 hypervisors run on a hosted underlying
operating system. An example of a Type-1 hypervisor is Microsoft Hyper-V. An
example of a Type-2 hypervisor is Microsoft Virtual PC.
Para-virtualization: Provides an abstraction layer called a hardware
abstraction layer (HAL) and requires special drivers. These drivers are
linked through the underlying hypervisor and access the hardware through
hypercalls. It requires modifications to the guest OS to enable this form of
virtualization and offers the guest OS higher performance and the ability to
communicate directly with the hypervisor.
Containers: Manage abstraction at an application level. There is no hypervisor
or guest operating system. Rather, containers require only the hosting operating
system to provide basic services.
Containers maintain separation from each other, providing a level of protection
similar to a VM. Container managers can also adapt to changing machine
resources. For example, they can dynamically assign more memory to a
container at runtime.
For some edge computing applications, container-based abstractions are
particularly attractive. Containers do require system-level resources that need
to be considered, such as compute performance, storage capacity, and even
processor features. They offer a very lightweight and portable method of
building and deploying applications to edge computers. Since the container
approach uses no guest OS, it is naturally leaner and more resource-efficient
than traditional virtualization. This is critical for resource-constrained
edge devices.
Additionally, a qualified and working image can be containerized, and changes
and tests can be executed against that image. A container is also very portable
and can be deployed in any environment and on nearly any host OS. For this
reason, we will focus on containers as the method for edge deployment:
Figure 12: Four types of virtual abstractions: full virtualization,
paravirtualization, Type-2 hypervisors, and containers.
Containers
Containers are a method to virtualize underlying hardware and services,
like a virtual machine (VM). Whereas a traditional virtual machine
requires a hypervisor that sits above the hardware and provides a level of
abstraction, a container requires no hypervisor. Rather, its services reside
above the operating system layer.
Container architecture
The act of creating a container and running an application as a process in it is
called containerization. There are two fundamental definitions needed to
understand the basic elements of a container:
• Container: This is a single instantiation of a container image.
Multiple instances can exist on a single host.
• Image: The container image is a set of files containing no state but defines
the package (or snapshot) of a container.
To understand the container architecture, we will explore Docker. It is a
tool to build and manage containers; its free edition, Docker CE (Community
Edition), covers basic services. A container deployment will consist of an
application container management engine and a repository.
To bind an application into a container image, we start by gathering the
application code and required dependencies. These dependencies are
associated libraries, binaries, middleware, and software components that may
be needed by the application. All dependencies must be included in the
container image to ensure they are present even for functionality that may
seldom execute. This aggregation of the application and its dependencies
forms the container image.
The creation of a new container is straightforward in Docker. First, we choose
a base image to reference. Docker Hub provides many base images of various
operating systems and environments, such as Fedora or Ubuntu. Next, we create
a Dockerfile. This file details how the image will be built. An example is as
follows:
FROM ubuntu
RUN apt-get update
RUN apt-get install -y sysstat
CMD ["/usr/bin/iostat"]
In this example, the Dockerfile pulls from the Ubuntu base image, uses the
installation tool apt-get to install the sysstat package (which provides the
iostat utility), and then executes iostat.
After the dockerfile is constructed, we then need to build this Docker image.
The field dockerID is needed only if you intend to upload the image to the global
Docker Hub system, where you need to register for an account:
docker build -t <dockerID>/<image-name> .
The resulting image can then be deployed and instantiated anywhere given
the following command on the host:
docker run [options] [dockerID/image-name][command]
An edge computer device, as well as any other connected system using
Docker, can pull this new image and execute it in a similar manner. This
greatly eases the task of deployment and development. Additionally, it allows
development models for edge devices to be like processes on large SaaS
(software as a service) solutions using techniques like continuous integration
and continuous delivery (CI/CD).
The overall architecture of Azure IoT Edge works in concert with Azure's IoT
Hub running in the cloud. Consider, for example, three sensors connected to
an edge computer. The edge computer hosts the Azure IoT Edge runtime service.
The edge runtime is lightweight and the heart of the system; it manages
module/workload installs, security, health monitoring, and all communication.
The runtime comprises two components: the IoT Edge Agent and the IoT Edge
Hub. The agent service manages the modules, while the hub manages
communication and acts as a proxy for the larger Microsoft Azure IoT Hub.
This requires some further explanation.
IoT Hub running in Azure cloud performs a superset of functions, and it is
the main interface used in Azure to connect to IoT devices. IoT Edge Hub
is not a full version of IoT Hub, but it allows a programmer to interface to
the edge as they would design software to interface to the IoT Hub in the
cloud through the Azure IoT Device SDK. The IoT Edge Hub also maintains
a manifest. This manifest identifies qualified and authenticated modules
allowed to run on the edge device. It also declares the routing rules between
different modules running on the edge.
The IoT Edge Agent manages the container image for each module running
on the device, the credentials to access private container registries, and the
rules on module creation and management.
The manifest that controls the edge runtime may look like the following
example. Notice that every route needs a source and a sink. The condition is
an optional field allowing you to filter messages. The IoT Edge Hub
guarantees "at-least-once" delivery of messages: any destination in the sink
field will have its message delivered at least once. If there is a
communication issue or failure, the hub will store and cache all messages
locally. This can be fine-tuned to store messages for a set amount of time.
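A minimal routes section of such a deployment manifest might look like the following sketch (the module names tempSensor and filter are hypothetical; the FROM ... INTO route syntax and $upstream destination follow Azure IoT Edge conventions):

```json
{
  "$edgeHub": {
    "properties.desired": {
      "schemaVersion": "1.0",
      "routes": {
        "sensorToFilter": "FROM /messages/modules/tempSensor/outputs/* INTO BrokeredEndpoint(\"/modules/filter/inputs/input1\")",
        "filterToCloud": "FROM /messages/modules/filter/outputs/* INTO $upstream"
      },
      "storeAndForwardConfiguration": {
        "timeToLiveSecs": 7200
      }
    }
  }
}
```

Each route names a source (a module output) and a sink (another module's input or the upstream IoT Hub), and timeToLiveSecs is the knob for how long messages are cached locally during an outage.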
One of the most powerful concepts of Azure IoT Edge is that certain services and
features specifically designed to run on the Azure cloud can now be run locally on
the edge device if it meets the minimum requirements of the runtime. Such
functionality includes:
• Deploying and using Azure Functions in an Azure IoT Edge module
• Using Azure's Stream Analytics systems as an IoT Edge module
• Using Azure's machine learning subsystems within an IoT Edge module
• Performing image classification with Azure's Custom Vision Service as an
IoT Edge module
• Running SQL databases as an IoT Edge container
This ability to migrate cloud data center class services allows for rapid development and
ease of execution.
Fog Computing
Fog computing is an extension of cloud computing: a layer between the edge and the
cloud. When edge computers send large amounts of data, fog nodes receive it and
analyze what's important. The fog nodes then transfer the important data to the cloud
for storage and either delete the unimportant data or retain it locally for further
analysis. In this way, fog computing saves space in the cloud and moves important
data along quickly.
EDGE ROUTING:
WHAT IS AN EDGE ROUTER?
An edge router sits at the boundary of a network, connecting an internal network to
external networks such as the Internet or a WAN. Layer 3 edge routers can also be used
to connect two or more IP networks. For example, a Layer 3 edge router could be used
to connect an office network to the Internet.
EDGE NETWORKING BENEFITS
Optimized networking
Edge networking can help conserve network resources by offloading network traffic
away from the core network. This reduces the risk of network latency caused by
data-movement bottlenecks.
Flexibility
Edge networks can exist either in the cloud or on-premises, giving businesses a range
of different network architecture options that can help them meet goals in areas such as
security, compliance, and operations.
Performance
As highlighted above, edge computing ensures that workloads are placed closer to the
actual end user resulting in improved user experience and realized productivity because
of the improved performance/reduced latency.
Reduced Costs
Ensuring that you’re delivering data from closer to your user can help reduce the
infrastructure costs necessary to deliver an application as it is limiting and
consolidating requests to services residing on the network edge rather than needing to
route to data center applications and infrastructure.