Chap 5
Threads and processes are both fundamental concepts in operating systems and concurrent
programming, but they differ in their nature and how they operate. Here are the key distinctions
between threads and processes:
1. Definition:
Process: A process is an independent instance of a program in execution, running in its own memory space with its own set of resources. It is an execution unit that contains its own code, data, and system resources.
Thread: A thread is the smallest unit of execution within a process. Threads share the
same resources (memory space, file descriptors, etc.) with other threads in the same
process.
2. Independence:
Process: Processes are independent of each other. They run in isolation and have their own memory space. Communication between processes is typically achieved using inter-process communication (IPC) mechanisms.
Thread: Threads within the same process share the same memory space and resources.
They can communicate more easily as they can directly access shared data.
3. Resource Overhead:
Process: Processes have a higher resource overhead as they maintain separate memory
space, file handles, and other resources.
Thread: Threads have lower resource overhead compared to processes because they
share resources with other threads in the same process.
4. Creation Time:
Process: Creating a new process is generally more time-consuming and resource-intensive.
Thread: Creating a new thread is faster and requires less overhead than creating a new
process.
5. Communication:
Process: Inter-process communication (IPC) mechanisms like message passing or shared memory are used for communication between processes.
Thread: Threads within the same process can communicate more easily through shared
variables and data structures.
6. Parallelism:
Process: Parallelism is achieved through the independent execution of multiple processes, each in its own address space.
Thread: Parallelism is achieved through the concurrent execution of multiple threads within a single process.
7. Fault Isolation:
Process: Processes are more robust in terms of fault isolation. If one process crashes, it
typically doesn't affect other processes.
Thread: If one thread in a process encounters an error (e.g., accessing invalid memory),
it can potentially affect the entire process and other threads within it.
Aspect            Process                                        Thread
Parallelism       Independent execution of multiple processes    Concurrent execution of multiple threads
Fault Isolation   More robust; one process crash doesn't         Errors in one thread may affect the
                  affect others                                  entire process
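To make the creation and isolation differences concrete, here is a minimal Java sketch (an illustration, not from the chapter) that starts a thread inside the current program and spawns a separate process; the child command java -version is an arbitrary choice:

import java.io.IOException;

public class ThreadVsProcess {
    public static void main(String[] args) throws IOException, InterruptedException {
        // A thread shares this program's memory and can touch its data directly.
        Thread worker = new Thread(() -> System.out.println("Hello from a thread"));
        worker.start();
        worker.join();

        // A process gets its own address space; we interact with it only
        // through IPC channels such as its standard input/output.
        Process child = new ProcessBuilder("java", "-version")
                .inheritIO()   // let the child write to this console
                .start();
        child.waitFor();
    }
}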
Thread Synchronization: Since threads in a multithreaded environment share
resources, there is a need for synchronization to avoid conflicts and ensure data
consistency. Synchronization mechanisms like locks, semaphores, and monitors
help control access to shared resources.
Communication between Threads: Communication between threads is achieved through shared variables or other inter-thread communication mechanisms. Proper handling is essential to prevent issues such as race conditions.
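As a small illustration of shared-variable communication (a hypothetical sketch, not from the chapter), one thread below signals another through a volatile boolean flag; volatile ensures the worker reliably sees the main thread's write:

public class SharedFlagDemo {
    // volatile makes the main thread's write visible to the worker
    private static volatile boolean running = true;

    public static void main(String[] args) throws InterruptedException {
        Thread worker = new Thread(() -> {
            while (running) {
                // real work would happen here
            }
            System.out.println("Worker saw the stop signal");
        });
        worker.start();

        Thread.sleep(100);   // let the worker spin briefly
        running = false;     // communicate through the shared variable
        worker.join();
    }
}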
Benefits of Multithreading:
Improved performance: Multithreading can lead to better resource
utilization and increased throughput.
Responsiveness: Multithreading can enhance the responsiveness of an
application by allowing certain tasks to run in the background while others
are executing.
Simplified program structure: Dividing a program into threads can make
the code more modular and easier to manage.
Efficient resource utilization.
Challenges of Multithreading:
Race conditions: Concurrent access to shared resources may lead to race conditions, where the outcome depends on the timing of thread execution (see the sketch after this list).
Deadlocks: Situations where two or more threads are unable to proceed because
each is waiting for the other to release a resource.
Increased complexity: Multithreading introduces complexities in terms of
synchronization and coordination between threads.
Difficulties in debugging and maintenance.
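The sketch below (hypothetical names) shows the race condition mentioned above: two threads increment a shared counter without synchronization, so updates are lost and the final value usually falls short of the expected 2,000,000; marking increment() as synchronized removes the race:

public class RaceDemo {
    private static int counter = 0;

    // counter++ is a read-modify-write; without synchronization the two
    // threads can interleave and overwrite each other's updates.
    // Declaring this method 'synchronized' eliminates the race.
    private static void increment() {
        counter++;
    }

    public static void main(String[] args) throws InterruptedException {
        Runnable task = () -> {
            for (int i = 0; i < 1_000_000; i++) increment();
        };
        Thread t1 = new Thread(task);
        Thread t2 = new Thread(task);
        t1.start();
        t2.start();
        t1.join();
        t2.join();
        System.out.println("Expected 2000000, got " + counter);
    }
}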
Thread Lifecycle
The lifecycle of a thread refers to the various states through which a thread transitions during its
execution. Different programming languages and thread management libraries might have
slightly different representations of thread states, but the general lifecycle includes several
common states. Here's an overview of the typical thread lifecycle:
Figure 1: Life cycle of a thread
1. New (or created): In this initial state, the thread object has been created, but it has not yet started its execution. Resources are allocated, and the thread is ready to be started.
2. Active (Runnable & Running): When the start() method is called on a thread, it becomes Active by transitioning from the New state to the Runnable state. A thread in this state is eligible to run, but the operating system's scheduler has not yet selected it to be the running thread; many threads may be in the Runnable (Ready) state at the same time, competing for CPU time.
Running: The thread moves to the Running state when the scheduler selects it for execution. In this state, the thread's instructions are being executed on the CPU. A thread in the Runnable state transitions to the Running state when the scheduler allocates processor time to it; the thread's run() method is executed, and it performs its designated tasks.
3. Blocked (or Waiting): A thread can enter the Blocked or Waiting state for various reasons. One common scenario is when a thread is waiting for a lock held by another thread. Another is when a thread pauses itself with methods like Thread.sleep() or Object.wait(), which place it in the waiting states described next.
4. Timed Waiting: Similar to the Blocked state, a thread enters the Timed Waiting state when it is waiting for a specified amount of time. This could be due to calling a sleep function or waiting on a timed condition, for example via methods like Thread.sleep() or Object.wait(timeout).
5. Waiting: Threads enter the Waiting state when they are waiting indefinitely for a
condition to be met. For instance, a thread might be waiting for another thread to notify it
of a change in shared data.
6. Terminated (Dead): The final state in the thread lifecycle; a thread enters the Terminated state when it completes its execution or is explicitly terminated. Once in this state, the thread cannot transition to any other state, and a terminated thread cannot be restarted. You can check whether a thread has terminated using the Thread.isAlive() method.
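These transitions can be observed directly in Java with Thread.getState(); the short sketch below (not from the chapter) prints NEW before start(), TIMED_WAITING while the thread sleeps, and TERMINATED after it finishes, assuming typical scheduling:

public class LifecycleDemo {
    public static void main(String[] args) throws InterruptedException {
        Thread t = new Thread(() -> {
            try {
                Thread.sleep(200);        // the thread sits in Timed Waiting here
            } catch (InterruptedException ignored) { }
        });

        System.out.println(t.getState()); // NEW: created but not yet started
        t.start();
        Thread.sleep(50);                 // give it time to reach sleep()
        System.out.println(t.getState()); // TIMED_WAITING (inside sleep)
        t.join();                         // wait for it to terminate
        System.out.println(t.getState()); // TERMINATED
    }
}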
Example1:

package testthread;

class ThreadDemo implements Runnable {
    private Thread t;
    private String threadName;

    ThreadDemo(String name) { threadName = name; }

    public void run() {
        System.out.println("Running " + threadName);
    }

    public void start() {
        if (t == null) {                     // create the backing thread only once
            t = new Thread(this, threadName);
            t.start();
        }
    }
}

public class TestThread {
    public static void main(String[] args) {
        ThreadDemo thread1 = new ThreadDemo("Thread1");
        ThreadDemo thread2 = new ThreadDemo("Thread2");
        thread1.start();
        thread2.start();
    }
}
Example2:

package testthread1;

class ThreadDemo implements Runnable {
    private Thread t;
    private String threadName;

    ThreadDemo(String name) { threadName = name; }

    public void run() {
        System.out.println("Thread: " + threadName + ", " + "State: Running");
        try {
            Thread.sleep(50);                // thread enters Timed Waiting briefly
        } catch (InterruptedException e) {
            System.out.println("Thread: " + threadName + " interrupted");
        }
    }

    public void start() {
        System.out.println("Thread: " + threadName + ", " + "State: Start");
        if (t == null) {
            t = new Thread(this, threadName);
            t.start();
        }
    }
}

public class TestThread1 {
    public static void main(String[] args) {
        ThreadDemo thread1 = new ThreadDemo("Thread1");
        ThreadDemo thread2 = new ThreadDemo("Thread2");
        thread1.start();
        thread2.start();
    }
}
Example3:

package testthread2;

class ThreadDemo implements Runnable {
    private Thread t;
    private String threadName;

    ThreadDemo(String name) { threadName = name; }

    public void run() {
        try {
            for (int i = 4; i > 0; i--) {    // count down from 4
                System.out.println("Thread: " + threadName + ", " + i);
                Thread.sleep(50);            // pause between iterations
            }
        } catch (InterruptedException e) {
            System.out.println("Thread: " + threadName + " interrupted");
        }
    }

    public void start() {
        if (t == null) {
            t = new Thread(this, threadName);
            t.start();
        }
    }
}

public class TestThread2 {
    public static void main(String[] args) {
        ThreadDemo thread1 = new ThreadDemo("Thread1");
        ThreadDemo thread2 = new ThreadDemo("Thread2");
        thread1.start();
        thread2.start();
    }
}
Thread priorities are a concept in concurrent programming that determines the order in which
threads are scheduled to run by the operating system's scheduler. Thread priority is an attribute
assigned to a thread that influences the order in which the threads are granted access to the CPU.
In most operating systems, threads are assigned priorities ranging from low to high. A higher-
priority thread will be scheduled to run before a lower-priority thread. Thread priorities help in
managing the execution order of threads and are essential for controlling the responsiveness and
performance of a multi-threaded application.
Low Priority: Threads with low priority are scheduled to run only when no higher-
priority threads are ready to run. These threads may experience delays in execution.
Normal Priority: Most threads operate at normal priority. They are scheduled to run in a
fair manner, and the operating system aims to provide equal access to the CPU for all
normal-priority threads.
High Priority: Threads with high priority are given preference over normal and low-
priority threads. They get scheduled more frequently and have a higher chance of running
when the CPU is available.
Real-time Priority: Some systems support real-time threads that have the highest
priority. Real-time threads are guaranteed to run within a specific time frame, ensuring
timely responses. However, overuse of real-time priority can lead to system instability, as
it may starve lower-priority threads.
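In Java, for example, a priority is set with setPriority() using constants such as Thread.MIN_PRIORITY (1), Thread.NORM_PRIORITY (5), and Thread.MAX_PRIORITY (10); note that priority is only a hint to the scheduler and its effect is platform-dependent. A minimal sketch:

public class PriorityDemo {
    public static void main(String[] args) {
        Thread low  = new Thread(() -> System.out.println("low-priority task"));
        Thread high = new Thread(() -> System.out.println("high-priority task"));

        low.setPriority(Thread.MIN_PRIORITY);    // 1
        high.setPriority(Thread.MAX_PRIORITY);   // 10

        low.start();
        high.start();   // the scheduler may favor this thread, but no guarantee
    }
}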
Critical Section: A critical section is a part of the code that should be executed by only
one thread at a time. It is the region of code where shared resources are accessed and
modified. Synchronization is necessary to prevent multiple threads from accessing the
critical section simultaneously.
Race Conditions: Race conditions occur when the final outcome of a program depends
on the order of execution of threads. This can lead to unpredictable behavior and data
corruption. Synchronization is used to eliminate or control race conditions.
Mutual Exclusion: Mutual exclusion ensures that only one thread at a time can execute
a specific section of code or access a particular resource. Techniques such as locks,
mutexes (mutual exclusion), and semaphores are used to enforce mutual exclusion.
Locks: Locks are synchronization primitives that enforce exclusive access to a resource.
When a thread acquires a lock, it gains permission to enter a critical section. Other
threads attempting to acquire the lock will be blocked until the lock is released.
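A brief Java sketch of a lock guarding a critical section, here using java.util.concurrent.locks.ReentrantLock (the built-in synchronized keyword is an equivalent alternative):

import java.util.concurrent.locks.ReentrantLock;

public class Counter {
    private final ReentrantLock lock = new ReentrantLock();
    private int count = 0;

    public void increment() {
        lock.lock();            // block until exclusive access is granted
        try {
            count++;            // critical section: only one thread at a time
        } finally {
            lock.unlock();      // always release, even if an exception occurs
        }
    }

    public int get() { return count; }
}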
Semaphore: A semaphore is a synchronization primitive that maintains a counter. It
allows multiple threads to access a resource simultaneously, up to a specified limit.
Semaphores are useful for controlling access to a resource when multiple instances are
allowed.
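For instance, java.util.concurrent.Semaphore maintains such a counter; in the sketch below (hypothetical resource, arbitrary limit of 3) at most three threads use the resource at once:

import java.util.concurrent.Semaphore;

public class ConnectionPool {
    private final Semaphore permits = new Semaphore(3);  // at most 3 users at once

    public void use() throws InterruptedException {
        permits.acquire();       // decrements the counter, blocks at zero
        try {
            System.out.println(Thread.currentThread().getName() + " using resource");
            Thread.sleep(100);   // simulate work with the shared resource
        } finally {
            permits.release();   // increments the counter, wakes a waiter
        }
    }
}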
Condition Variables: Condition variables are used for signaling between threads. They
allow threads to wait until a certain condition is met before proceeding. Condition
variables are often used in conjunction with locks to coordinate the activities of multiple
threads.
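A minimal sketch of this pattern using Java's Condition paired with a ReentrantLock: a consumer waits until a producer signals that shared data is available (the one-slot buffer is a hypothetical example):

import java.util.concurrent.locks.Condition;
import java.util.concurrent.locks.ReentrantLock;

public class OneSlotBuffer {
    private final ReentrantLock lock = new ReentrantLock();
    private final Condition notEmpty = lock.newCondition();
    private Integer slot = null;   // shared data guarded by the lock

    public void put(int value) {
        lock.lock();
        try {
            slot = value;
            notEmpty.signal();     // wake a thread waiting for data
        } finally {
            lock.unlock();
        }
    }

    public int take() throws InterruptedException {
        lock.lock();
        try {
            while (slot == null) {
                notEmpty.await();  // release the lock and wait until signaled
            }
            int value = slot;
            slot = null;
            return value;
        } finally {
            lock.unlock();
        }
    }
}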
Deadlocks: Deadlocks can occur when two or more threads are blocked forever, each
waiting for the other to release a resource. Careful design and use of synchronization
mechanisms help prevent deadlocks.
Thread Safety: A piece of code or a data structure is considered thread safe if it can be
safely accessed and manipulated by multiple threads without causing data corruption or
unexpected behavior. Synchronization mechanisms are employed to achieve thread
safety.
Atomic Operations: Atomic operations are operations that are performed in a single,
uninterruptible step. In multithreading, atomic operations can be used to ensure that
certain operations are executed without interruption.
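Java exposes such operations through classes like java.util.concurrent.atomic.AtomicInteger, whose incrementAndGet() performs the whole read-modify-write as one indivisible step; a small sketch:

import java.util.concurrent.atomic.AtomicInteger;

public class AtomicCounter {
    private final AtomicInteger count = new AtomicInteger(0);

    public void increment() {
        count.incrementAndGet();   // atomic read-modify-write, no lock needed
    }

    public int get() {
        return count.get();
    }
}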