Communication and Concurrency
CONCURRENCY
What is Concurrency?
Concurrency refers to the execution of multiple instruction sequences at the same time. It occurs in an
operating system when multiple processes or threads execute concurrently. These threads can
interact with one another via shared memory or message passing. Concurrency leads to
resource sharing, which in turn causes issues such as deadlocks and resource scarcity. It also
calls for techniques such as process coordination, memory allocation, and execution scheduling
to maximize throughput.
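As a minimal illustration of threads interacting through message passing, the Java sketch below (class and variable names are illustrative) has a producer thread hand two strings to a consumer thread through a shared BlockingQueue; the queue is the only state the two threads share.

    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.BlockingQueue;

    // Two threads exchanging data through a shared queue (message passing).
    public class MessagePassingDemo {
        public static void main(String[] args) throws InterruptedException {
            BlockingQueue<String> channel = new ArrayBlockingQueue<>(10);

            Thread producer = new Thread(() -> {
                try {
                    channel.put("hello");   // send messages
                    channel.put("world");
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });

            Thread consumer = new Thread(() -> {
                try {
                    System.out.println(channel.take());   // receive messages
                    System.out.println(channel.take());
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });

            producer.start();
            consumer.start();
            producer.join();
            consumer.join();
        }
    }

Sharing a plain variable instead of a queue would be the shared-memory style of interaction; the examples further below use that style to show the problems it can cause.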
Principles of Concurrency
Today's technology, like multi-core processors and parallel processing, allows multiple processes
and threads to be executed simultaneously. Multiple processes and threads can access the same
memory space, the same declared variable in code, or even read or write to the same file.
The amount of time a process takes to execute cannot be easily estimated, and you cannot
predict which process will complete first, so you must design techniques to deal with the
problems that concurrency creates.
Interleaved and overlapping processes are two types of concurrent processes that face the same
problems. It is impossible to predict the relative speed of execution, which depends on factors
such as the activities of other processes, the way the operating system handles interrupts, and
the operating system's scheduling policies.
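The following sketch (names are illustrative) makes this unpredictability concrete: the two threads do not coordinate, so which message is printed first can differ from one run to the next, depending entirely on how the scheduler interleaves them.

    // Each run may print the two messages in a different order, because the
    // scheduler, not the program, decides when each thread makes progress.
    public class UnpredictableOrder {
        public static void main(String[] args) throws InterruptedException {
            Thread a = new Thread(() -> System.out.println("thread A finished"));
            Thread b = new Thread(() -> System.out.println("thread B finished"));
            a.start();
            b.start();
            a.join();
            b.join();
        }
    }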
Problems in Concurrency
1. Locating programming errors
It is difficult to locate a programming error because results are usually not repeatable; the
shared components are in a different state each time the code is executed.
2. Sharing global resources
Sharing global resources is difficult. If two processes both use a global variable and both alter its
value, the order in which the changes are executed is critical, as the sketch after this list shows.
3. Locking the channel
It could be inefficient for the OS to lock the resource and prevent other processes from using it.
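The sketch below (a simplified example, not taken from any particular system) illustrates both of the last two points: two threads increment a shared global counter, and without a lock some updates are lost, while taking a lock on every increment gives the correct total at the cost of serializing the threads.

    public class SharedCounter {
        static int unsafeCount = 0;                 // shared global variable
        static int safeCount = 0;
        static final Object lock = new Object();

        public static void main(String[] args) throws InterruptedException {
            Runnable work = () -> {
                for (int i = 0; i < 100_000; i++) {
                    unsafeCount++;                  // unprotected read-modify-write: updates can be lost
                    synchronized (lock) {           // locking is correct but serializes the threads
                        safeCount++;
                    }
                }
            };
            Thread t1 = new Thread(work), t2 = new Thread(work);
            t1.start(); t2.start();
            t1.join(); t2.join();
            System.out.println("unsafe: " + unsafeCount);  // usually less than 200000
            System.out.println("safe:   " + safeCount);    // always 200000
        }
    }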
Issues of Concurrency
1. Non-atomic
A non-atomic operation can be interrupted partway through by other processes, which can leave
shared data in an inconsistent state. An atomic operation executes as a single indivisible step and
runs independently of other processes, whereas a non-atomic operation consists of several steps
whose effects can be interleaved with those of other processes.
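As a rough sketch of the difference (the class name is illustrative), the increment of a plain int below is a non-atomic load-add-store sequence that another thread can interleave with, whereas AtomicInteger.incrementAndGet() performs the whole update as one indivisible step.

    import java.util.concurrent.atomic.AtomicInteger;

    public class AtomicVsNonAtomic {
        static int plain = 0;                               // plain++ is load, add, store
        static AtomicInteger atomic = new AtomicInteger();  // incrementAndGet() is indivisible

        public static void main(String[] args) throws InterruptedException {
            Runnable work = () -> {
                for (int i = 0; i < 100_000; i++) {
                    plain++;
                    atomic.incrementAndGet();
                }
            };
            Thread t1 = new Thread(work), t2 = new Thread(work);
            t1.start(); t2.start();
            t1.join(); t2.join();
            System.out.println("non-atomic: " + plain);        // often less than 200000
            System.out.println("atomic:     " + atomic.get()); // always 200000
        }
    }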
2. Deadlock
In concurrent computing, a deadlock occurs when each member of a group of processes waits for
another member, possibly itself, to act, for example to send a message or to release a lock, so that
none of them can proceed. Software and hardware locks are commonly used to arbitrate shared
resources and implement process synchronization in parallel computing, distributed systems, and
multiprocessing.
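A minimal deadlock can be sketched as two threads that each hold one lock and wait for the other's lock (the names below are illustrative); acquiring the locks in the same order in both threads would avoid the deadlock.

    public class DeadlockDemo {
        static final Object lockA = new Object();
        static final Object lockB = new Object();

        public static void main(String[] args) {
            Thread t1 = new Thread(() -> {
                synchronized (lockA) {
                    pause(100);                    // give t2 time to grab lockB
                    synchronized (lockB) {         // waits forever: t2 holds lockB
                        System.out.println("t1 acquired both locks");
                    }
                }
            });
            Thread t2 = new Thread(() -> {
                synchronized (lockB) {
                    pause(100);
                    synchronized (lockA) {         // waits forever: t1 holds lockA
                        System.out.println("t2 acquired both locks");
                    }
                }
            });
            t1.start();
            t2.start();                            // the program typically hangs here
        }

        static void pause(long ms) {
            try { Thread.sleep(ms); } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        }
    }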
3. Blocking
A blocked process is waiting for some event, such as the availability of a resource or the
completion of an I/O operation. Processes may block while waiting for resources, and a process
may be blocked for a long time waiting for terminal input. If the process is required to update
some data periodically, such long blocking is highly undesirable.
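The sketch below (class and queue names are illustrative) shows a thread blocking: the consumer calls take() on an empty queue and makes no progress for two seconds, until another thread finally supplies the value it is waiting for.

    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.BlockingQueue;

    public class BlockingDemo {
        public static void main(String[] args) throws InterruptedException {
            BlockingQueue<String> queue = new ArrayBlockingQueue<>(1);

            Thread consumer = new Thread(() -> {
                try {
                    System.out.println("waiting for input...");
                    String item = queue.take();      // blocked here until a value arrives
                    System.out.println("got: " + item);
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });

            consumer.start();
            Thread.sleep(2000);                      // the consumer stays blocked meanwhile
            queue.put("resource is ready");          // unblocks the consumer
            consumer.join();
        }
    }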
4. Race Conditions
A race condition occurs when the output of a software application depends on the timing or
ordering of events that it does not control. Race conditions often arise in multithreaded software,
in software that runs in a distributed environment, and in software that depends on shared resources.
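A common shape of race condition is check-then-act, sketched below with illustrative names: both threads may see the key as absent and both insert, so the final value depends on which thread happens to write last (and HashMap itself is not thread-safe, which is part of the problem).

    import java.util.HashMap;
    import java.util.Map;

    public class CheckThenActRace {
        static final Map<String, String> config = new HashMap<>();   // not thread-safe

        public static void main(String[] args) throws InterruptedException {
            Runnable init = () -> {
                if (!config.containsKey("owner")) {                        // check
                    config.put("owner", Thread.currentThread().getName()); // act
                }
            };
            Thread t1 = new Thread(init, "thread-1");
            Thread t2 = new Thread(init, "thread-2");
            t1.start(); t2.start();
            t1.join(); t2.join();
            System.out.println("owner = " + config.get("owner"));  // can differ between runs
        }
    }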
5. Starvation
Starvation occurs when a process is perpetually denied the resources it needs to make progress,
for example because other processes keep acquiring those resources first.
Advantages
1. Better Performance
Concurrency improves the operating system's performance. When one application uses only the
processor and another uses only the disk drive, the time it takes to run both applications
concurrently is less than the time it takes to run them one after the other.
It enables resources that one application is not using to be used by another.
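As a rough sketch of this effect (the workloads are simulated and the numbers will vary by machine), the program below runs a CPU-bound loop and a simulated I/O wait first one after the other and then in two threads; the concurrent version takes roughly the longer of the two times rather than their sum.

    public class OverlapDemo {
        static void cpuWork() {                       // keeps one core busy
            long sum = 0;
            for (long i = 0; i < 500_000_000L; i++) sum += i;
            System.out.println("cpu done (" + sum + ")");
        }

        static void ioWork() {                        // simulated disk or network wait
            try { Thread.sleep(1000); } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
            System.out.println("io done");
        }

        public static void main(String[] args) throws InterruptedException {
            long t0 = System.currentTimeMillis();
            cpuWork();
            ioWork();
            System.out.println("sequential: " + (System.currentTimeMillis() - t0) + " ms");

            long t1 = System.currentTimeMillis();
            Thread cpu = new Thread(OverlapDemo::cpuWork);
            Thread io = new Thread(OverlapDemo::ioWork);
            cpu.start(); io.start();
            cpu.join(); io.join();
            System.out.println("concurrent: " + (System.currentTimeMillis() - t1) + " ms");
        }
    }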
Disadvantages