Process Synchronisation

The document covers key concepts in concurrent programming, including threads, symmetric multiprocessing (SMP), inter-process communication (IPC), mutual exclusion, race conditions, and semaphores. It highlights the importance of synchronization mechanisms to prevent issues like deadlock and race conditions in multi-threaded environments. Additionally, it discusses the role of semaphores in managing access to shared resources and ensuring proper execution order among processes.

1. Threads and Symmetric Multiprocessing (SMP)

Threads

 A thread is the smallest unit of a CPU's execution, sometimes referred to as a "lightweight process." It exists within a process and shares resources (such as memory) with other threads in the same process.
 Multithreading allows a CPU to execute multiple threads concurrently, improving the
overall performance of the system, especially for I/O-bound or parallel tasks.
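
As a small illustration, the following Python sketch (using the standard threading module; the task function and timings are made up for this example) starts three threads inside one process, which run concurrently and share the process's memory:

import threading
import time

def simulated_io_task(name):
    # Stand-in for an I/O-bound task: while one thread sleeps,
    # the CPU can run other threads of the same process.
    time.sleep(0.1)
    print(name, "finished")

threads = [threading.Thread(target=simulated_io_task, args=("task-" + str(i),)) for i in range(3)]
for t in threads:
    t.start()    # each thread begins executing concurrently
for t in threads:
    t.join()     # wait for all threads; they shared the same address space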

Symmetric Multiprocessing (SMP)

 SMP is a system architecture where multiple processors or cores share a common memory space and work independently on different tasks but communicate via shared memory.
 In SMP systems, all processors are treated equally and have access to the same resources.
 Key features of SMP:
o Scalability: SMP systems can scale up with the addition of more processors.
o Shared Memory: All processors can access a global shared memory, allowing
easy communication between threads and processes.
o Parallelism: SMP systems provide parallel execution for processes or threads,
enhancing performance for parallelizable tasks.

2. Inter-Process Communication (IPC) & Clock Synchronization

Inter-Process Communication (IPC)

 IPC refers to the mechanisms that allow processes to communicate with each other. This
is crucial in a multi-process system where processes need to share data and synchronize
their actions.
 IPC can be implemented using various techniques:
o Shared Memory: A region of memory shared between processes where they can
read and write data.
o Message Passing: Processes communicate by sending messages to each other.
o Sockets: Network-based IPC, used for communication between processes on
different machines.
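
As one possible sketch of the message-passing style (shared memory and sockets would look different), the following uses Python's multiprocessing.Queue to pass a message between two processes; the message contents are illustrative:

from multiprocessing import Process, Queue

def producer(q):
    q.put({"event": "reading", "value": 42})   # send a message to the other process

def consumer(q):
    msg = q.get()                              # blocks until a message arrives
    print("received:", msg)

if __name__ == "__main__":
    q = Queue()
    p1 = Process(target=producer, args=(q,))
    p2 = Process(target=consumer, args=(q,))
    p1.start(); p2.start()
    p1.join(); p2.join()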

Clock Synchronization

 Clock synchronization is crucial in distributed systems, where multiple processes or computers need to agree on the time.
 Techniques for clock synchronization include:
o NTP (Network Time Protocol): Used to synchronize clocks over a network.
o Logical Clocks: Such as Lamport's timestamps, used to order events in
distributed systems without requiring synchronized physical clocks.
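
A minimal sketch of a Lamport logical clock (the class name is illustrative): each process increments its counter on local and send events, and on a receive it jumps to max(local, received) + 1 so that causally later events get larger timestamps:

class LamportClock:
    def __init__(self):
        self.time = 0

    def tick(self):
        # Local event or message send: advance the logical clock.
        self.time += 1
        return self.time

    def receive(self, sender_timestamp):
        # Message receive: merge the sender's timestamp, then advance.
        self.time = max(self.time, sender_timestamp) + 1
        return self.time

# Example: a process whose clock reads 2 receives a message stamped 5.
clock = LamportClock()
clock.tick(); clock.tick()      # local events, clock.time == 2
print(clock.receive(5))         # prints 6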

3. Mutual Exclusion and Critical Section

Mutual Exclusion

 Mutual Exclusion (Mutex) refers to ensuring that multiple threads or processes do not
simultaneously execute a critical section of code that accesses shared resources, which
could lead to inconsistent data.
 Key requirements for mutual exclusion:
1. Mutual Exclusion: Only one process or thread can execute in the critical section
at a time.
2. Progress: If no thread is executing in the critical section and some threads wish to
enter, one of them must be allowed to enter without indefinite delay.
3. Bounded Waiting: After a thread requests entry, there is a bound on how many times
other threads may enter the critical section before that thread is admitted.

Critical Section

 The critical section is a part of the code where shared resources are accessed or
modified. Proper synchronization must be used to avoid conflicts between processes or
threads.
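
For example, the following sketch uses a mutex (Python's threading.Lock) so that only one thread at a time executes the critical section that updates a shared balance; the variable and amounts are illustrative:

import threading

balance = 0
lock = threading.Lock()

def deposit(amount):
    global balance
    with lock:                      # entry section: acquire the mutex
        current = balance           # critical section: read ...
        balance = current + amount  # ... and modify the shared resource
                                    # exit section: the lock is released automatically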

4. Race Conditions

Race Condition

 A race condition occurs when the behavior of a program depends on the relative timing
of uncontrollable events, such as thread execution order, leading to unpredictable
outcomes.
 A race condition typically occurs when multiple threads/processes access shared data
concurrently, and at least one thread modifies the data.
 Example: If two threads increment a shared counter without synchronization, the result
may be incorrect because their read-modify-write operations interleave.
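
This can be reproduced with the sketch below (the loop count and number of threads are arbitrary); because counter += 1 is a read-modify-write sequence, the two threads' operations can interleave:

import threading

counter = 0

def worker():
    global counter
    for _ in range(100000):
        counter += 1        # not atomic: load, add, store can interleave

threads = [threading.Thread(target=worker) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)              # may print less than 200000 when increments are lost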

Prevention of Race Conditions

 Using synchronization mechanisms like mutexes, semaphores, and locks ensures that
only one thread can access shared data at a time, preventing race conditions.
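
One way to repair the counter example is to protect the increment with a mutex, as in this sketch; the lock serializes the read-modify-write so no updates are lost:

import threading

counter = 0
lock = threading.Lock()

def worker():
    global counter
    for _ in range(100000):
        with lock:          # only one thread at a time performs the update
            counter += 1

threads = [threading.Thread(target=worker) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)              # always 200000
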
5. Semaphores

Semaphores

 A semaphore is a synchronization primitive used to control access to a shared resource by multiple processes or threads in a concurrent system.
 Semaphores are integer variables that are accessed using two atomic operations:
o Wait (P): Decrements the semaphore value. If the value is negative, the process is
blocked until the value becomes non-negative.
o Signal (V): Increments the semaphore value, potentially waking up a blocked
process.
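
The sketch below shows one possible implementation of the wait/signal behaviour on top of a condition variable; unlike the bookkeeping described above, this version keeps the value non-negative and simply blocks waiters while it is zero. In practice, Python's threading.Semaphore already provides these operations as acquire() and release():

import threading

class SimpleSemaphore:
    def __init__(self, value=1):
        self.value = value
        self.cond = threading.Condition()

    def wait(self):                 # P: block until the value is positive, then decrement
        with self.cond:
            while self.value == 0:
                self.cond.wait()
            self.value -= 1

    def signal(self):               # V: increment and wake one blocked thread
        with self.cond:
            self.value += 1
            self.cond.notify()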

Types of Semaphores

1. Counting Semaphores: These semaphores can take any integer value and are used for
managing a pool of resources.
2. Binary Semaphores (Mutex): These are a special case of semaphores with values
restricted to 0 and 1, typically used for mutual exclusion.

Uses of Semaphores

 Mutual Exclusion: Ensuring that only one process/thread accesses a critical section at a
time.
 Synchronization: Coordinating the order of execution between processes/threads.
 Resource Allocation: Managing a limited number of resources (e.g., file handles or
database connections).
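
For example, resource allocation with a counting semaphore might look like the following sketch, where at most three workers hold a "connection" at once (the pool size, worker count, and sleep are illustrative):

import threading
import time

pool_slots = threading.Semaphore(3)        # counting semaphore initialised to the pool size

def use_connection(worker_id):
    with pool_slots:                       # wait (P): take a slot, blocking if none are free
        print("worker", worker_id, "is using a connection")
        time.sleep(0.1)                    # hold the resource briefly
    # leaving the with-block performs signal (V), freeing the slot for another worker

threads = [threading.Thread(target=use_connection, args=(i,)) for i in range(10)]
for t in threads:
    t.start()
for t in threads:
    t.join()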

6. IPC Problems

Problems in IPC

1. Deadlock: A situation where two or more processes are unable to proceed because each
is waiting for the other to release resources.
o Conditions for Deadlock:
 Mutual Exclusion
 Hold and Wait
 No Preemption
 Circular Wait
o Solutions: Prevention, detection, and recovery strategies (a lock-ordering sketch appears after this list).
2. Starvation: A process may never get access to resources if other processes with higher
priority keep requesting resources.
o Solution: Fair scheduling algorithms, priority adjustments.
3. Race Conditions: As discussed earlier, race conditions occur when two or more
threads/processes access shared data concurrently and the final outcome depends on the
timing of execution.
4. Synchronization Overhead: The time spent on managing synchronization can reduce
the overall efficiency of a system, especially if excessive locking is involved.
5. Communication Latency: In message-passing IPC systems, the time taken for messages
to travel between processes can impact performance.
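
As a sketch of how the circular-wait condition behind deadlock arises, and of the simplest prevention (a consistent global lock order), consider two hypothetical workers that each need two locks:

import threading

lock_a = threading.Lock()
lock_b = threading.Lock()

def worker_1():
    with lock_a:            # holds A ...
        with lock_b:        # ... then waits for B
            pass

def worker_2_deadlock_prone():
    with lock_b:            # holds B ...
        with lock_a:        # ... then waits for A -> circular wait with worker_1
            pass

def worker_2_safe():
    # Prevention: acquire the locks in the same global order (A before B),
    # which breaks the circular-wait condition.
    with lock_a:
        with lock_b:
            pass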

Summary

 Threads are lightweight units of execution within a process, and Symmetric Multiprocessing (SMP) allows multiple processors to work concurrently on a shared memory system.
 Inter-process Communication (IPC) mechanisms allow processes to communicate and
synchronize, with challenges like deadlock, race conditions, and starvation.
 Mutual Exclusion ensures that only one process can access a critical section of code at a
time, preventing conflicts.
 Semaphores are essential for synchronization and preventing race conditions in
concurrent programming.

Threads and Symmetric Multiprocessing (SMP)

1. Which of the following best describes a thread in a multi-threaded process?
o a) A lightweight process that shares resources with other threads.
o b) A heavyweight process that operates independently.
o c) A memory manager for the operating system.
o d) A process that can only perform one task at a time.
2. In Symmetric Multiprocessing (SMP) systems:
o a) Each processor works on its own memory and cannot share data.
o b) All processors have equal access to shared memory.
o c) Only one processor is active at a time.
o d) The system uses a master-slave relationship between processors.
3. What is a key advantage of multithreading in modern CPUs?
o a) Increased RAM capacity.
o b) Improved power consumption.
o c) Better CPU utilization and task parallelization.
o d) Reduced processor speeds.
4. In an SMP system, how do processors communicate?
o a) Using message passing only.
o b) Through shared memory.
o c) Via disk storage.
o d) By sending signals through the network.
5. Which of the following is an example of a thread synchronization issue?
o a) Deadlock.
o b) Context switching.
o c) Memory allocation.
o d) Resource scheduling.

Inter-Process Communication (IPC) & Clock Synchronization

6. What does IPC stand for in computing?
o a) Inter-Program Communication.
o b) Internal Process Control.
o c) Inter-Process Communication.
o d) Internal Process Communication.
7. Which IPC mechanism allows processes to communicate by sharing a region of memory?
o a) Message passing.
o b) Shared memory.
o c) Sockets.
o d) File-based communication.
8. Clock synchronization is important in which type of systems?
o a) Single processor systems.
o b) Distributed systems.
o c) Systems with only one process.
o d) Systems without networking.
9. What is the role of the Network Time Protocol (NTP)?
o a) To allocate resources in a system.
o b) To synchronize clocks in a distributed system.
o c) To send messages between processes.
o d) To prevent deadlock in multi-threaded programs.
10. What is a typical use case for message passing in IPC?

 a) Synchronizing thread execution order.
 b) Sharing a memory block between multiple processes.
 c) Sending data between processes running on different machines.
 d) Scheduling tasks within a single process.

Mutual Exclusion and Critical Section

11. Mutual exclusion is required to prevent:

 a) Two processes from accessing the CPU at the same time.
 b) Two processes from accessing shared data simultaneously.
 c) Processes from being terminated by the operating system.
 d) Memory corruption in multi-threaded programs.
12. What is the critical section in a multi-threaded application?

 a) A section of code where shared resources are accessed or modified.
 b) A process that runs continuously in the background.
 c) A portion of code that ensures efficient CPU utilization.
 d) A memory location shared by all threads.

13. Which of the following techniques is used to implement mutual exclusion in a system?

 a) Semaphore.
 b) Message passing.
 c) Shared memory.
 d) Context switching.

14. The purpose of the critical section problem is to ensure:

 a) Only one process executes at a time in a shared resource.
 b) All processes are synchronized simultaneously.
 c) Processes run in parallel without accessing shared resources.
 d) Memory is shared among processes efficiently.

15. What does the "progress" condition in mutual exclusion guarantee?

 a) The system is free from deadlock.
 b) Only one thread executes in the critical section.
 c) A process will eventually be allowed to enter the critical section.
 d) All processes can access shared resources equally.

Race Conditions

16. A race condition occurs when:

 a) Two processes access shared data concurrently and cause inconsistency.
 b) A process takes too long to execute.
 c) Threads wait indefinitely for resources.
 d) A single process is terminated due to an error.

17. Which of the following is a solution to prevent race conditions?

 a) Increasing the CPU speed.
 b) Using synchronization mechanisms such as mutexes and semaphores.
 c) Adding more memory to the system.
 d) Using multi-core processors.
18. What is a classic example of a race condition?

 a) A process running out of memory.
 b) Two threads incrementing a shared counter.
 c) A thread waiting for another thread to finish.
 d) A file being read and written at the same time.

19. In which scenario would you need to use a lock or semaphore to avoid a race condition?

 a) When data is accessed by a single process.
 b) When multiple threads share access to a resource.
 c) When no processes are sharing resources.
 d) When processes are running sequentially.

20. Which of the following is a key requirement for mutual exclusion?

 a) Race conditions are allowed.
 b) Only one process can access shared resources at a time.
 c) Multiple threads can access shared resources simultaneously.
 d) Processes can access shared resources in any order.

Semaphores

21. What is a semaphore used for in synchronization?

 a) Managing time-sharing between processes.
 b) Controlling access to shared resources.
 c) Assigning CPU time to threads.
 d) Storing process states.

22. Which of the following operations are used in a semaphore?

 a) Wait and signal.
 b) Start and stop.
 c) Read and write.
 d) Lock and unlock.

23. A binary semaphore is also called:

 a) Mutex.
 b) Counting semaphore.
 c) Process semaphore.
 d) Thread semaphore.
24. Which of the following is true about semaphores?

 a) Semaphores can only be used for mutual exclusion.
 b) Semaphores are used to synchronize the order of execution in multithreading.
 c) Semaphores can only be used with shared memory.
 d) Semaphores are exclusively used in message passing.

25. A counting semaphore can hold values:

 a) Between 0 and 1.
 b) Between -1 and 1.
 c) Any integer value.
 d) Only positive integers.

IPC Problems

26. Deadlock occurs when:

 a) A process waits indefinitely for resources held by another process.
 b) Processes complete execution without any issues.
 c) All processes run in parallel without any interference.
 d) A process is terminated unexpectedly.

27. Which of the following is not a condition for deadlock?

 a) Mutual Exclusion.
 b) No Preemption.
 c) Bounded Waiting.
 d) Circular Wait.

28. What is starvation in a multi-threaded system?

 a) A process is executed continuously without interruption.
 b) A process never gets access to resources due to other processes constantly being
prioritized.
 c) A process is blocked due to a race condition.
 d) A process waits indefinitely for an available resource.

29. The bounded waiting condition ensures that:

 a) Processes will eventually be able to enter the critical section.
 b) No process will ever be allowed to enter the critical section.
 c) Only one process will execute at a time in the system.
 d) Processes will run in a sequential order.
30. The hold and wait condition in deadlock refers to:

 a) A process waiting for resources that are being held by another process.
 b) A process continuously running without any delay.
 c) A process releasing resources once it finishes execution.
 d) A process allocating resources to other processes.
