12_13_Concurrent programming

Concurrency

Concurrency means executing multiple transactions simultaneously
to save time, but it can cause inconsistency in shared data. To
avoid this, concurrency control is needed.

What are the benefits of concurrency?

Concurrency has benefits such as executing multiple applications
simultaneously, increasing efficiency, and improving resource
utilization.

Pros of Concurrency
Here are the key benefits of concurrency:
 Enhanced Efficiency: Concurrency enables the simultaneous
execution of multiple applications, leading to increased
efficiency and overall productivity of workstations.
 Optimized Resource Usage: It facilitates better utilization of
resources by allowing otherwise idle resources or data to be
accessed by other applications in an organized manner. This
reduces the waiting time between threads and improves average
response time. Because available resources are used efficiently,
applications can continue their operations without waiting for
others to complete (see the sketch after this list).
 Improved System Performance: Concurrency contributes to the
improved performance of the operating system. This is
achieved by enabling various hardware resources to be
accessed concurrently by different applications or threads.
It allows simultaneous use of the same resources and supports
the parallel utilization of different resources. This seamless
integration of resources and applications helps accomplish the
main objective quickly and effectively.
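
The following short Python sketch (not part of the original notes)
illustrates the response-time benefit described above: four simulated
I/O-bound tasks are run first sequentially and then concurrently on a
thread pool. The function name io_task and the 0.5-second sleep are
illustrative assumptions standing in for real disk or network waits.

import time
from concurrent.futures import ThreadPoolExecutor

def io_task(task_id: int) -> int:
    time.sleep(0.5)              # simulated I/O wait (disk, network, ...)
    return task_id

start = time.perf_counter()
for i in range(4):
    io_task(i)                   # sequential: roughly 4 x 0.5 s
sequential = time.perf_counter() - start

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=4) as pool:
    list(pool.map(io_task, range(4)))   # concurrent: roughly 0.5 s total
overlapped = time.perf_counter() - start

print(f"sequential: {sequential:.2f}s, concurrent: {overlapped:.2f}s")

Because each task spends almost all of its time waiting, the threads
overlap their waits and the pool finishes in roughly the time of a
single task, which is exactly the reduced waiting time and improved
response time described above.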

Cons of Concurrency
Here are important points to keep in mind regarding the challenges
of concurrency when planning processes:
 Minimizing Interference: When multiple applications run
concurrently, it’s crucial to safeguard them from causing
disruptions to each other’s operations.
 Coordinated Execution: Applications running in parallel need
careful coordination, synchronization, and well-organized
scheduling. This involves allocating resources and determining
the order of execution.
 Coordinating Systems: Designing additional systems becomes
necessary to manage the coordination among concurrent
applications effectively.
 Increased Complexity: Operating systems encounter greater
complexity and performance overheads when switching
between various applications running in parallel.
 Performance Impact: Running too many processes simultaneously
can degrade overall system performance.
Issues of Concurrency
Understanding Concurrency Challenges: Non-Atomic Operations,
Race Conditions, Blocking, Starvation, and Deadlocks
In the world of software, dealing with multiple processes running at
the same time brings its own set of challenges. Some common issues
that can arise are:
 Non-Atomic Operations: When operations aren’t atomic, other
processes can interrupt them, causing issues. Atomic operations
execute completely, without interruption from any other process
during their execution. An operation that can be interleaved
with or depends on other processes is non-atomic, which can lead
to problems.
 Race Conditions: A race condition occurs when multiple threads
read and write the same variable, i.e. they have access to some
shared data and try to change it at the same time. In such a
scenario the threads are “racing” each other to access or change
the data, and the outcome depends on the order in which they run.
Race conditions can also be serious security vulnerabilities. They
often occur in software that handles multiple tasks simultaneously,
in cooperating threads, or when resources are shared (see the
counter sketch after this list).
 Blocking: Imagine a process putting its work on hold while it
waits for something else to happen, such as a resource becoming
available or an input operation finishing. It’s like waiting for
a green light to move forward. But if a process stays blocked for
a long time, responsiveness suffers, especially when regular
updates are expected.
 Starvation: Starvation occurs when a process is continuously
denied the resources it needs to do its job because other
processes keep acquiring them first. It can be caused by errors
in how resources are allocated or managed.
 Deadlock: A deadlock is a situation where a set of processes is
blocked because each process is holding a resource and waiting
for another resource acquired by some other process. Imagine
a group of friends, each waiting for another to make a move,
resulting in no one moving. That’s a deadlock. In the computing
world, it’s when processes or threads are stuck waiting for each
other to release a lock or send a message. Deadlocks can occur
in systems where processes share resources, such as parallel or
distributed systems (see the two-lock sketch after this list).
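
To make the non-atomic operation and race condition concrete, here is
a minimal Python sketch (not from the original notes) in which several
threads increment a shared counter. The read and write are deliberately
separated by time.sleep(0), an assumption used only to force a thread
switch so the lost updates are easy to observe; the lock-protected
version shows basic concurrency control.

import threading
import time

counter = 0
lock = threading.Lock()

def unsafe_increment(n: int) -> None:
    global counter
    for _ in range(n):
        current = counter        # read the shared value
        time.sleep(0)            # yield: another thread may update counter here
        counter = current + 1    # write back a possibly stale value (lost update)

def safe_increment(n: int) -> None:
    global counter
    for _ in range(n):
        with lock:               # the read-modify-write is now effectively atomic
            counter += 1

def run(target, n=1000, workers=4) -> int:
    global counter
    counter = 0
    threads = [threading.Thread(target=target, args=(n,)) for _ in range(workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return counter

print("without lock:", run(unsafe_increment))   # usually far less than 4000
print("with lock:   ", run(safe_increment))     # always 4000

The unsynchronized version loses updates because two threads can read
the same value before either writes it back; serializing the increment
with a lock removes the race at the cost of some waiting.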
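
The deadlock described above can be sketched in a few lines of Python
(again, not from the original notes): two threads acquire the same two
locks in opposite order, so each ends up holding one lock while waiting
for the other. The acquire timeout is an assumption added only so the
demo reports the deadlock instead of hanging forever.

import threading
import time

lock_a = threading.Lock()
lock_b = threading.Lock()

def worker(first: threading.Lock, second: threading.Lock, name: str) -> None:
    with first:
        time.sleep(0.1)                   # let the other thread grab its first lock
        if second.acquire(timeout=2):     # without the timeout this would block forever
            second.release()
            print(f"{name}: finished")
        else:
            print(f"{name}: deadlocked waiting for the second lock")

t1 = threading.Thread(target=worker, args=(lock_a, lock_b, "thread-1"))
t2 = threading.Thread(target=worker, args=(lock_b, lock_a, "thread-2"))
t1.start()
t2.start()
t1.join()
t2.join()

A standard fix is to make every thread acquire the locks in the same
global order, so no thread can hold one lock while waiting for a lock
that comes earlier in that order.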
By understanding these challenges and their implications, developers
can create better strategies for managing concurrent processes.
