
UNIT II Notes

UNIT II:
1. Explain the histogram process in detail.
In digital image processing, the histogram is used for graphical representation
of a digital image. The histogram of an image records the number of times each
grey level occurs in the image. It is a graph with the grey levels on the
x-axis and the number of times each grey level occurs in the image on the
y-axis.
The histogram of an image is defined as h(rk) = nk, where rk is the kth grey
level and nk is the number of pixels in the image having grey level rk.
The normalized histogram is obtained by dividing the frequency of occurrence
of each grey level rk by the total number of pixels n in the image:
P(rk) = nk/n
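These definitions can be illustrated with a small hypothetical image (the pixel values below are made up for the example):

```python
import numpy as np

# Hypothetical 4x4 3-bit image (grey levels 0-7)
img = np.array([[0, 1, 1, 2],
                [2, 2, 3, 3],
                [3, 3, 3, 4],
                [4, 5, 6, 7]])

# h(rk) = nk: count of pixels at each grey level
hist = np.bincount(img.ravel(), minlength=8)

# P(rk) = nk / n: normalized histogram (sums to 1)
norm_hist = hist / img.size

print(hist)             # [1 2 3 5 2 1 1 1]
print(norm_hist.sum())  # 1.0
```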
Histogram Equalization:
Histogram equalization is a widely used contrast-enhancement technique in
image processing because of its efficiency and simplicity. It modifies the
dynamic range and contrast of an image by transforming it so that its
intensity histogram has a desired (typically approximately uniform) shape.
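A minimal sketch of equalization via the cumulative distribution function (CDF), using the standard mapping s = (L-1) * CDF(r); the tiny input image is illustrative:

```python
import numpy as np

def equalize(img, levels=256):
    """Histogram equalization via the CDF mapping s = (L-1) * CDF(r)."""
    hist = np.bincount(img.ravel(), minlength=levels)
    cdf = hist.cumsum() / img.size                       # cumulative distribution
    mapping = np.round((levels - 1) * cdf).astype(img.dtype)
    return mapping[img]

# Low-contrast image squeezed into grey levels 100-103
img = np.array([[100, 100, 101, 102],
                [101, 102, 102, 103]], dtype=np.uint8)
out = equalize(img)
print(out.min(), out.max())  # the levels are spread out toward 0 and 255
```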
Histogram stretching:
Histogram stretching, also known as contrast stretching, is a technique used in
image processing to enhance the contrast of an image. The aim of histogram
stretching is to increase the dynamic range of the image by stretching the
intensity values to cover the entire range of possible values. This results in an
image with improved contrast, making it easier to distinguish between
different objects or features in the image.
The process of histogram stretching involves the following steps:
1. Calculate the histogram of the image
2. Identify the minimum and maximum intensity values: The minimum
intensity value is the lowest intensity value in the image, and the maximum
intensity value is the highest intensity value in the image.
3. Calculate the new intensity values: The new intensity values are calculated
using a formula that stretches the intensity range of the image. One common
linear formula is:
new = (old - min) / (max - min) x (L - 1)
where min and max are the values found in step 2 and L is the number of grey
levels (256 for an 8-bit image).
4. Replace the old intensity values with the new values: Each pixel in the image
is replaced with its new intensity value.
5. Display the stretched image: The resulting image has a higher contrast, with
more distinguishable details and features.
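The five steps above can be sketched as a linear stretch of [min, max] onto [0, 255], assuming an 8-bit image:

```python
import numpy as np

def stretch(img, new_min=0, new_max=255):
    """Linear contrast stretch: map [old_min, old_max] onto [new_min, new_max]."""
    old_min, old_max = int(img.min()), int(img.max())          # step 2
    scaled = (img.astype(float) - old_min) / (old_max - old_min)  # step 3
    return np.round(scaled * (new_max - new_min) + new_min).astype(np.uint8)

# Low-contrast image occupying only grey levels 50-100
img = np.array([[50, 60], [70, 100]], dtype=np.uint8)
print(stretch(img))  # 50 maps to 0, 100 maps to 255
```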
Histogram Matching:
In image processing, histogram matching (also called histogram specification)
is the transformation of an image so that its histogram matches a specified
histogram.
Procedure for histogram matching:
1. Compute the cumulative distribution function (CDF) of the input image's
histogram.
2. Compute the CDF of the specified (target) histogram.
3. For each input grey level, map it to the grey level at which the target
CDF is closest to the input CDF value.
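A common CDF-based matching procedure can be sketched as follows; the arrays are illustrative and the CDF inversion is done with a nearest-index search:

```python
import numpy as np

def match_histogram(source, reference, levels=256):
    """Remap source grey levels so its histogram approximates the reference's."""
    src_cdf = np.bincount(source.ravel(), minlength=levels).cumsum() / source.size
    ref_cdf = np.bincount(reference.ravel(), minlength=levels).cumsum() / reference.size
    # For each source level, pick the first reference level whose CDF reaches it
    mapping = np.searchsorted(ref_cdf, src_cdf).clip(0, levels - 1)
    return mapping.astype(source.dtype)[source]

src = np.array([[0, 0, 1, 2]], dtype=np.uint8)
ref = np.array([[0, 2, 2, 3]], dtype=np.uint8)
print(match_histogram(src, ref))  # [[2 2 2 3]]
```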
2. Explain bit plane slicing and gray level slicing in detail.
Bit plane slicing is a method of representing an image with one or more bits of
the byte used for each pixel. One can use only the MSB to represent each pixel,
which reduces the original gray-level image to a binary image. The three main
goals of bit plane slicing are:
1. Converting a gray-level image to a binary image.
2. Representing an image with fewer bits, compressing it to a smaller size.
3. Enhancing the image by focusing on the most significant bit planes.
Bit plane slicing:
For example, if an image has a maximum grey level of 7, it is a 3-bit image.
We convert each pixel value to binary and separate the bits, obtaining one bit
plane per bit position.
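Bit plane extraction can be sketched with shifts and masks; the 2x2 3-bit image below is a made-up example:

```python
import numpy as np

# Hypothetical 3-bit image (maximum grey level 7)
img = np.array([[7, 5],
                [3, 1]])

# Extract bit plane k by shifting right k places and masking the lowest bit
planes = [(img >> k) & 1 for k in range(3)]  # planes[2] is the MSB plane

for k, p in enumerate(planes):
    print(f"bit plane {k}:\n{p}")
```

Keeping only `planes[2]` (the MSB) gives the binary image mentioned above; summing `planes[k] << k` over k reconstructs the original.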
Gray Level Slicing:
Highlighting a specific range of gray levels in an image is often desired.
Applications include enhancing features such as masses of water in satellite
imagery and enhancing flaws in X-ray images.
There are several ways of doing level slicing, but most of them are variations
of two basic themes. One approach is to display a high value for all gray
levels in the range of interest and a low value for all other gray levels;
this transformation produces a binary image. The second approach brightens the
desired range of gray levels but preserves the background and gray-level
tonalities in the image.
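Both slicing approaches can be sketched as follows; the image values and the range of interest [100, 200] are illustrative choices:

```python
import numpy as np

img = np.array([[10, 120],
                [150, 240]], dtype=np.uint8)
lo, hi = 100, 200  # range of interest

# Approach 1: binary output - high value inside the range, low elsewhere
binary = np.where((img >= lo) & (img <= hi), 255, 0).astype(np.uint8)

# Approach 2: brighten the range, preserve all other tonalities
brightened = img.copy()
brightened[(img >= lo) & (img <= hi)] = 255

print(binary)      # [[  0 255] [255   0]]
print(brightened)  # [[ 10 255] [255 240]]
```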
3. Explain histogram equalization and its advantages
Advantages:
The advantages of using a histogram equalizer include:
1. Enhanced contrast: By redistributing the intensity values of an image,
histogram equalization enhances the contrast between different parts of the
image. This makes it easier to distinguish between different objects or features
in the image.
2. Improved visibility: Histogram equalization can make the details in an image
more visible by enhancing its brightness and contrast.
3. Little loss of information: Histogram equalization only remaps grey levels,
so unlike techniques such as filtering or lossy compression it preserves most
of the information in the image (although the discrete mapping can merge some
nearby grey levels).
4. Simple to implement: Histogram equalization is a simple and effective
technique that can be implemented with basic image processing tools. It does
not require complex algorithms or machine learning models.
5. Applicable to different types of images: Histogram equalization can be
applied to a wide range of images, including grayscale, color, and digital
images. It is also useful in medical imaging, satellite imaging, and other fields
where image contrast is important.
4. Discuss the RGB colour model and YIQ colour model.
RGB Colour Model:
RGB is the most widely used colour space for hardware-oriented applications
such as monitors, cameras, graphics boards, etc. In this model, each colour is
represented as three values R, G, B indicating the amounts of red, green and
blue which make up the colour. The colour model is based on a Cartesian
co-ordinate system. The colour space is represented by a cube of unit length.
The origin (0,0,0) represents black, and white (1,1,1) is at the corner
diagonally opposite the origin.
All the values are assumed to be normalized to the range [0,1]
Primary colours R(1,0,0), G(0,1,0), B(0,0,1) are at the three corners.
Secondary colours Cyan (0,1,1), Magenta (1,0,1) and Yellow (1,1,0) are at the
three other corners.
The line joining black and white represents all the shades of grey. The model
uses additive colour mixing to produce a given colour, and quantization
determines the colour depth.
The RGB colour model is used in monitors, where three separate red, green and
blue guns allow visual perception of all colours. On the monitor there is one
dot for each RGB colour, and the dots are grouped in patterns close together.
Acquiring a colour image is the reverse process: a colour image is acquired by
using three filters, sensitive to red, green and blue respectively.
CMY and CMYK Colour Models:
Cyan, magenta and yellow are the primary colours of pigments. Pigments subtract
light; thus a surface coated with cyan pigment absorbs the red component of light.
This is a subtractive based colour space and is mainly used in printing and hard
copy output. The fourth black(K) component is included to improve both the
density range and the available colour gamut. Symbol ‘K’ is chosen for black
instead of ‘b’ to avoid confusion with blue. CMYK refers to ‘four colour printing’
by publishers.
Very simple transform is required to translate RGB colours displayed on screen
to CMYK values for printing.
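The simple RGB-to-CMYK translation mentioned above can be sketched as follows. This is a common idealized formulation; real print conversions use device colour profiles, so treat it as illustrative:

```python
def rgb_to_cmyk(r, g, b):
    """Convert normalized RGB in [0,1] to CMYK (subtractive model)."""
    c, m, y = 1 - r, 1 - g, 1 - b   # CMY is the complement of RGB
    k = min(c, m, y)                # pull out the shared black component
    if k == 1:                      # pure black: avoid division by zero
        return 0.0, 0.0, 0.0, 1.0
    return (c - k) / (1 - k), (m - k) / (1 - k), (y - k) / (1 - k), k

print(rgb_to_cmyk(1.0, 0.0, 0.0))  # red -> (0.0, 1.0, 1.0, 0.0)
print(rgb_to_cmyk(0.0, 0.0, 0.0))  # black -> (0.0, 0.0, 0.0, 1.0)
```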
YIQ Colour model:
The YIQ colour space is defined by the National Television System Committee
(NTSC), where Y describes luminance and I and Q describe the chrominance. This
colour model is derived from the characteristics of the human visual system:
humans are more sensitive to brightness (luminance) than to chrominance
components. Thus the YIQ model separates colour into luminance (Y) and
chrominance (I & Q). For black-and-white TV, I & Q are ignored. YUV is the
corresponding colour coordinate system used in PAL television.
YIQ and YUV are good representation for compression because some of the
chrominance information can be thrown out without loss of quality in picture
as human eyes are less sensitive to chrominance than luminance. This saves
bandwidth and storage space requirements.
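A sketch of the RGB-to-YIQ transform using approximate NTSC coefficients (the exact I and Q coefficients vary slightly between references):

```python
import numpy as np

# Approximate NTSC RGB -> YIQ transform matrix
RGB_TO_YIQ = np.array([[0.299,  0.587,  0.114],
                       [0.596, -0.275, -0.321],
                       [0.212, -0.523,  0.311]])

def rgb_to_yiq(rgb):
    """rgb: array of normalized [0,1] values, shape (..., 3)."""
    return rgb @ RGB_TO_YIQ.T

# White has full luminance and zero chrominance
y, i, q = rgb_to_yiq(np.array([1.0, 1.0, 1.0]))
print(round(y, 3), round(i, 3), round(q, 3))
```

Note that the Y row alone gives the grey-scale (black-and-white TV) signal, which is why I and Q can simply be dropped or coarsely quantized.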
5. Explain smoothing linear filters in brief.
Smoothing filters are used for blurring and for noise reduction. Blurring is used in
preprocessing steps, such as removal of small details from an image prior to (large)
object extraction, and bridging of small gaps in lines or curves. Noise reduction can
be accomplished by blurring with a linear filter and also by non-linear filtering.
(1) Smoothing Linear Filters:
The output (response) of a smoothing, linear spatial filter is simply the average of
the pixels contained in the neighborhood of the filter mask. These filters sometimes
are called averaging filters. The idea behind smoothing filters is
straightforward. By replacing the value of every pixel in an image by the
average of the gray levels in the neighborhood defined by the filter mask,
this process results in an image with reduced "sharp" transitions in gray
levels. Because random noise typically consists of sharp transitions in gray
levels, the most obvious application of smoothing is noise reduction.
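The averaging operation described above can be sketched directly, without a filtering library; this hypothetical k x k box filter uses zero padding at the borders:

```python
import numpy as np

def average_filter(img, k=3):
    """Apply a k x k averaging (box) filter with zero padding."""
    pad = k // 2
    padded = np.pad(img.astype(float), pad)   # zero-pad all four borders
    out = np.zeros(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = padded[i:i + k, j:j + k].mean()  # neighborhood average
    return out

# A single bright "noise" pixel is smeared out by the filter
img = np.array([[0, 0, 0],
                [0, 9, 0],
                [0, 0, 0]])
print(average_filter(img))  # every 3x3 window contains the 9, so all outputs are 1.0
```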