Set B PART B - C Answer Key
S.No. Course Outcome PO1 PO2 PO3 PO4 PO5 PO6 PO7 PO8 PO9 PO10 PO11 PO12 PSO1 PSO2 PSO3
1 CO1 3 2 - - - - - - - - - - - 2 -
2 CO2 3 2 - 1 - - - - - - - - - 2 -
3 CO3 3 - 2 - 2 - - - - 1 - - - 2 -
4 CO4 3 2 - 1 - - - - - - - - - 2 -
5 CO5 3 - 2 1 2 - - - - 1 - - - 2 -
Part – B
(4 x 5 = 20 Marks)
Answer All 4 Questions
21 a You are an agriculture researcher trying to diagnose a rare disease based on a plant's symptoms. How would you use the digital image processing steps to predict the disease? (5 Marks, L3, CO 1, PO 1, PI 1.3.1)
Image Sensors:
An image sensor or imager is a sensor that detects and conveys the information used to make an image. Image sensors sense the intensity, amplitude, coordinates and other features of the image and pass the result to the image processing hardware. The sensor's input comes from the problem domain, i.e., the physical scene or object being imaged.
Image Processing Software:
The software includes all the mechanisms and algorithms that are used in the image processing system. Common processing tools include:
- Standard filters and DIY (custom) filters
- GPU filters (the GPU, or Graphics Processing Unit, is the main IC on a graphics adapter)
- OpenCV filters and ImageJ filters
- Python tools: PIL, scikit-image, SimpleCV
- Dataflow tools: Filter Forge
Image Processing Hardware:
It processes the instructions obtained from the image sensors and passes the result to a general-purpose computer. The three most common choices of image processing platform in machine vision applications are:
- Central Processing Unit (CPU)
- Graphics Processing Unit (GPU)
- Field Programmable Gate Array (FPGA)
Once the image is processed, it is stored in a mass storage device, which can be a pen drive or any other external storage device. The image display (monitor or display screen) shows the processed images. The network is the connection between all the above elements of the image processing system.
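As a brief illustration of how these steps chain together, here is a minimal, hedged Python sketch of a plant-disease pipeline (the file name, HSV lesion range, and 5% threshold are assumptions for the example, not values from the answer key):

import cv2
import numpy as np

# 1. Image acquisition: read the leaf image captured by the image sensor.
leaf = cv2.imread("leaf.jpg")  # hypothetical input file

# 2. Preprocessing: smooth the image to suppress sensor noise.
leaf = cv2.GaussianBlur(leaf, (5, 5), 0)

# 3. Segmentation: isolate diseased regions by their color signature.
hsv = cv2.cvtColor(leaf, cv2.COLOR_BGR2HSV)
lesion_mask = cv2.inRange(hsv, np.array([10, 60, 60]), np.array([30, 255, 255]))

# 4. Feature extraction: fraction of the image covered by lesion-colored pixels.
lesion_ratio = cv2.countNonZero(lesion_mask) / lesion_mask.size

# 5. Recognition: a toy rule-based prediction; a real system would feed many
#    such features into a trained classifier.
print("Disease suspected" if lesion_ratio > 0.05 else "Leaf looks healthy")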
OR
21 b Illustrate the key stages of image processing with a clear and structured block diagram. (5 Marks, L2, CO 1, PO 1, PI 1.3.1)
The distance between the center of the lens and the retina along the visual axis is approximately 17 mm. The range of focal lengths is approximately 14 mm to 17 mm, the latter taking place when the eye is relaxed and focused at distances greater than about 3 m.
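A worked example in the spirit of this passage (the 15 m tree and 100 m viewing distance are assumed for illustration, not taken from the question): with the relaxed-eye lens-to-retina distance of 17 mm, similar triangles give the height h of the retinal image as

\frac{15}{100} = \frac{h}{17} \quad\Rightarrow\quad h = \frac{15 \times 17}{100} \approx 2.55\ \text{mm}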
OR
22 b Provide an in-depth analysis of the following: (5 Marks, L3, CO 1, PO 1, PI 1.3.1)
i) Brightness Adaptation and Discrimination
Brightness adaptation, also known as light adaptation, is the process of adjusting the sensitivity of our eyes to changes in light levels. It allows us to see effectively in different lighting conditions, from dim to bright environments.
Brightness discrimination, also known as lightness discrimination, is the ability of our visual system to perceive and differentiate between different levels of brightness in a scene.
ii) Pixel Path
A path from pixel p with coordinates (x, y) to pixel q with coordinates (s, t) is a sequence of distinct pixels with coordinates (x0, y0), (x1, y1), ..., (xn, yn), where (x0, y0) = (x, y), (xn, yn) = (s, t), and pixels (xi, yi) and (xi-1, yi-1) are adjacent for 1 ≤ i ≤ n. Here n is the length of the path.
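A small hedged Python sketch of this definition (the helper name and test coordinates are assumptions for illustration, not from the answer key):

def is_4_path(pixels, p, q):
    # pixels: list of (x, y) tuples; p and q are the endpoint coordinates.
    if not pixels or pixels[0] != p or pixels[-1] != q:
        return False
    if len(set(pixels)) != len(pixels):  # the pixels in a path must be distinct
        return False
    for (x0, y0), (x1, y1) in zip(pixels, pixels[1:]):
        # 4-adjacency: consecutive pixels differ by 1 in exactly one coordinate.
        if abs(x0 - x1) + abs(y0 - y1) != 1:
            return False
    return True

print(is_4_path([(0, 0), (0, 1), (1, 1)], (0, 0), (1, 1)))  # True, length n = 2
print(is_4_path([(0, 0), (1, 1)], (0, 0), (1, 1)))          # False: diagonal step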
OR
23 b Assume you are a graphic designer; discuss how you would explain the relationship between pixels in an image and its representation for better understanding. (5 Marks, L2, CO 1, PO 1, PI 1.3.1)
OR
24 b Describe a step-by-step process, along with the rationale behind each step, for how you would use local histogram processing and adaptive filters to enhance the quality of an old photograph that has degraded over time and contains significant historical information. (5 Marks, L2, CO 2, PO 1, PI 1.3.1)
- A histogram is a graphical representation of the intensity distribution of an image, i.e., it represents the number of pixels for each intensity value considered.
- It provides information about the distribution of pixel intensities, helping to understand the overall characteristics of an image's brightness and contrast.
- In a histogram, the x-axis represents the range of possible intensity values, usually spanning from 0 (black) to 255 (white) in grayscale images. The y-axis represents the frequency or count of pixels that have a specific intensity value.
- It helps guide decisions on how to adjust or manipulate an image's intensity values to achieve desired visual effects or prepare it for further processing.
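A minimal hedged sketch of local (adaptive) histogram equalization using OpenCV's CLAHE; the file names, clip limit, and tile size below are assumptions for illustration, not values from the answer key:

import cv2

gray = cv2.imread("old_photo.jpg", cv2.IMREAD_GRAYSCALE)

# Global equalization uses one histogram mapping for the whole image ...
global_eq = cv2.equalizeHist(gray)

# ... while CLAHE equalizes each 8x8 tile separately, so regions with
# different lighting each get their own mapping; the clip limit bounds
# noise amplification in nearly uniform tiles.
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
local_eq = clahe.apply(gray)

cv2.imwrite("restored.jpg", local_eq)

Tile-based equalization is exactly what helps when different regions of the old photograph have degraded under different lighting, since a single global mapping cannot serve all regions at once.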
Part – C
(1 x 10 = 10 Marks)
25 a A computer graphics designer is creating a 2D animation with rotating objects. To optimize the animation, the designer decides to use the Discrete Fourier Transform (DFT) algorithm. Illustrate how the DFT works for the same. (10 Marks, L2, CO 1, PO 2, PI 2.4.1)
The Discrete Fourier Transform (DFT) of a 1-D signal f(x), x = 0, 1, ..., N-1, is defined by

F(u) = \sum_{x=0}^{N-1} f(x)\, e^{-j 2\pi u x / N}, \quad u = 0, 1, \ldots, N-1,

and, for an M x N image f(x, y), by

F(u, v) = \sum_{x=0}^{M-1} \sum_{y=0}^{N-1} f(x, y)\, e^{-j 2\pi (u x / M + v y / N)}.

Rotating f(x, y) by an angle rotates F(u, v) by the same angle, which is why the DFT is a natural tool for analyzing the rotating objects in the animation.
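A short hedged sketch (not part of the original answer) verifying the 1-D definition above against NumPy's FFT on a toy signal:

import numpy as np

N = 8
f = np.arange(N, dtype=float)  # toy signal f(x), x = 0 .. N-1

# Direct evaluation of F(u) = sum_x f(x) * exp(-j * 2 * pi * u * x / N).
u = np.arange(N).reshape(-1, 1)
x = np.arange(N).reshape(1, -1)
F_direct = (f * np.exp(-2j * np.pi * u * x / N)).sum(axis=1)

# NumPy's fft implements the same definition, so the results agree.
print(np.allclose(F_direct, np.fft.fft(f)))  # True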
OR
25 b You are a photo restoration specialist tasked with reviving an old, faded photograph that holds sentimental value to a client. The image lacks contrast, making details hard to discern. How would you employ spatial domain methods to enhance the contrast and revive the old photograph? Also outline the steps necessary to implement local histogram equalization to improve contrast and enhance details in an image characterized by varying lighting conditions across different regions. (10 Marks, L3, CO 2, PO 1, PI 1.3.1)
Spatial filtering is an image operation in which each pixel value I(u, v) is replaced by a function of the values of the pixel and its neighbors. The process consists of simply moving the filter mask from point to point in the image; at each point (x, y), the response of the filter is calculated using a predefined relationship.
Types of spatial filtering:
- Smoothing filters
- Sharpening filters
Two types of smoothing spatial filters:
- Linear filters (mean filters)
- Order-statistics (non-linear) filters
- Smoothing (also called averaging) spatial filters are used to reduce sharp transitions in intensity.
- Because random noise typically consists of sharp transitions in intensity, an obvious application of smoothing is noise reduction.
- Smoothing is used to reduce irrelevant detail in an image, where "irrelevant" refers to pixel regions that are small with respect to the size of the filter kernel.
Linear spatial filtering consists of convolving an image with a filter kernel. Convolving a smoothing kernel with an image blurs the image, with the degree of blurring determined by the size of the kernel and the values of its coefficients, as the sketch below shows.
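A tiny hedged sketch of this operation (SciPy-based; the image and kernel here are stand-ins, not from the answer key):

import numpy as np
from scipy.signal import convolve2d

image = np.random.rand(6, 6)         # stand-in for a grayscale image
kernel = np.full((3, 3), 1.0 / 9.0)  # 3x3 averaging (box) kernel

# mode="same" keeps the output the size of the input;
# boundary="symm" mirrors the border pixels instead of zero-padding.
smoothed = convolve2d(image, kernel, mode="same", boundary="symm")
print(smoothed.shape)                # (6, 6)

A larger kernel averages over a wider neighborhood and therefore blurs more strongly.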
Smoothing filters are used for:
- Blurring
- Noise reduction
Blurring is used in preprocessing steps for the removal of small details from an image prior to object extraction and for bridging small gaps in lines or curves. Noise reduction can be accomplished by blurring.
If the operation performed on the image pixels is linear, the filter is called a linear spatial filter; otherwise, it is a nonlinear spatial filter.
Linear (Mean) Filters:
A linear (mean) spatial filter simply computes the average of the pixels contained in the neighborhood of the filter mask. The idea is to replace the value of every pixel in an image by the average of the gray levels in the neighborhood defined by the filter mask. This process results in an image with reduced sharp transitions in intensities.
Two common masks, contrasted in the sketch below:
- Averaging (box) filter
- Weighted averaging filter (Gaussian smoothing)
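A hedged sketch contrasting the two masks with OpenCV; the file names and 5x5 kernel size are assumptions for illustration:

import cv2

img = cv2.imread("photo.jpg", cv2.IMREAD_GRAYSCALE)

box = cv2.blur(img, (5, 5))                  # every neighbor weighted equally
gaussian = cv2.GaussianBlur(img, (5, 5), 0)  # center pixels weighted more

# For the same kernel size, the Gaussian's center-weighted kernel preserves
# edges somewhat better, which is why it is often preferred for noise reduction.
cv2.imwrite("box.jpg", box)
cv2.imwrite("gaussian.jpg", gaussian)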
*Program Indicators are available separately for Computer Science and Engineering in the AICTE examination reforms policy.