
Principles of Remote Sensing

Chapter 7

Digital Image Processing and Image Interpretation

By: Kinfe W.
Outlines
• Introduction
- Digital image processing
- Transmission and storage of data
- Rectification and restoration
- Radiometric correction
- Geometric correction
- Common imagery file formats
Digital Image and Digital Image Processing

• Image – a two-dimensional signal that can be observed by the human visual system.

• Digital image – a representation of images by sampling in time and space.

• Digital image processing – performing digital signal processing operations on digital images.
Cont`d
• A digital image is a representation of a two-dimensional image as a finite set of digital values, called picture elements or pixels.
Cont`d

• Digital image processing focuses on two major tasks:

- Improvement of pictorial information for human interpretation
- Processing of image data for storage, transmission and representation for autonomous machine perception
Image Processing

• Low-Level Process
- Reduce noise
- Contrast enhancement
- Image sharpening

• Mid-Level Process
- Segmentation
- Classification

• High-Level Process
- Making sense of an ensemble of recognized objects
Cont`d
• Image processing techniques include:
- analog image processing
- digital image processing
• Visual/analog processing techniques are applied to hard copy data (photographs or printouts).
• Digital image processing is the manipulation and interpretation of digital images with the aid of a computer.
• General steps in digital image processing:
- pre-processing
- image enhancement
- image classification and analysis
Pre-processing operations
• Sometimes referred to as image restoration and rectification; intended to correct for sensor- and platform-specific radiometric and geometric distortions of the data.

• The initial stage of data processing, in which the image is corrected for various errors and degradations.

• Raw imagery → corrections → pre-processed imagery → further enhancements and analysis

• Pre-processing operations are grouped into two:
- radiometric corrections
- geometric corrections
Radiometric corrections
• Address variations in pixel intensities that are not caused by the object or scene being scanned.

• Applied to correct for radiometric distortions/errors.

• A radiometric distortion is an error that influences the radiance or radiometric value of a scene element (pixel).

• Radiometric corrections are grouped as:
- cosmetic
- atmospheric

• Sources of pixel variation are:
- scene illumination and viewing geometry
- atmospheric conditions (scattering of radiation)
- sensor noise and response
a. Variation in illumination and viewing geometry: variations between images (for optical sensors) can be corrected by modeling the geometric relationship and distance between the imaged areas of the Earth's surface, the sun, and the sensor.
• Position of the sun
- Sun elevation (sun angle)
- Sun–earth distance
• Elevation correction
- Division of each pixel value by the sine of the solar elevation angle for the particular time and location, per spectral band
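The sun-elevation correction above can be sketched in a few lines of NumPy; the band values and the 30° elevation angle here are invented purely for illustration:

```python
import numpy as np

# Hypothetical band array and sun elevation; values are illustrative only.
band = np.array([[40.0, 52.0], [61.0, 75.0]])  # at-sensor DN/radiance values
sun_elevation_deg = 30.0                       # solar elevation at acquisition time

# Divide each pixel by sin(solar elevation) to normalize illumination.
corrected = band / np.sin(np.radians(sun_elevation_deg))

print(corrected)  # each value roughly doubles, since sin(30°) = 0.5
```

In practice the elevation angle comes from the image metadata, and the division is applied independently to every spectral band.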
b. Scattering of radiation: occurs as radiation passes through and interacts with the atmosphere.
• Due to particles in the atmosphere:
- dust, pollen, smoke, clouds
• Correction algorithms:
- dark pixel subtraction
- atmospheric modelling

c. Noise in an image: may be due to irregularities or errors in the sensor response and/or in data recording and transmission.
• Common forms of noise include:
- line striping
- periodic line dropouts
- random noise or spike noise
• Line striping or banding: due to the non-identical response of one or more detectors, resulting from drift in response after calibration of the detectors.
• Correction method:
- Compute the histogram of one detector as the standard
- Match the histograms of the other detectors to the histogram of the standard detector
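As a simplified sketch of the destriping idea, the snippet below matches the drifted detector's lines to the standard detector by mean and standard deviation rather than by full histogram matching; the toy image and the two-detector interleaving are invented assumptions:

```python
import numpy as np

# Toy image: 6 scan lines produced by 2 detectors in alternation.
rng = np.random.default_rng(0)
img = rng.normal(100, 10, (6, 8))
img[1::2] = img[1::2] * 1.3 + 15  # simulate a drifted (brighter) detector

ref = img[0::2]  # lines from the detector chosen as the standard
bad = img[1::2]  # lines from the drifted detector

# Simplified matching: rescale the drifted detector's lines so their
# mean and standard deviation equal those of the standard detector.
fixed = (bad - bad.mean()) / bad.std() * ref.std() + ref.mean()

out = img.copy()
out[1::2] = fixed
print(abs(out[1::2].mean() - out[0::2].mean()))  # near zero after matching
```

Full histogram matching equalizes the entire brightness distribution, not just its first two moments, but the structure of the correction is the same.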
• Periodic line dropouts: occur when a detector either completely fails to function or becomes temporarily saturated during a scan.

• Caused by erroneous radiance values for pixels, lines or areas, or by a defective scanner, transmission, receiving or media system.

• Correction method:
- Compare actual and computed line values to detect dropouts
- Correct by repeating neighbouring values or by taking the average of the lines above and below, which shows little divergence from the actual values
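The averaging correction for a dropped line can be sketched as follows; the band values and the location of the dropout are invented for illustration:

```python
import numpy as np

# Toy band with a dropped scan line (row of zeros) at row 2.
band = np.array([
    [10, 12, 14],
    [11, 13, 15],
    [ 0,  0,  0],   # dropout: detector returned no signal
    [13, 15, 17],
], dtype=float)

dropped = 2
# Replace the dropped line with the average of the lines above and below.
band[dropped] = (band[dropped - 1] + band[dropped + 1]) / 2.0

print(band[dropped])  # [12. 14. 16.]
```

A real implementation would first detect dropout lines automatically, e.g. by flagging lines whose mean differs sharply from that of their neighbours.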
• Random noise or spikes: caused by transmission errors or temporary disturbances.
• Correction method:
- Detect a spike by comparing its DN with the DNs of its surrounding pixels (neighbours)
- Replace the DN with a value interpolated from the surrounding pixels

(Figure: image with spike vs. de-spiked image)
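The detect-and-replace steps above can be sketched with a local median: a pixel far from its 3x3 neighbourhood median is flagged as a spike and replaced by that median (a common stand-in for interpolation from the surrounding pixels). The image and the threshold of 100 DN are illustrative assumptions:

```python
import numpy as np
from scipy.ndimage import median_filter

# Toy band with one spike pixel far above its neighbours.
band = np.full((5, 5), 50.0)
band[2, 2] = 255.0  # spike

med = median_filter(band, size=3)   # 3x3 neighbourhood median
spikes = np.abs(band - med) > 100   # threshold is an illustrative choice
band[spikes] = med[spikes]          # replace flagged spikes with the local median

print(band[2, 2])  # 50.0
```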
Geometric corrections
• Both maps and images provide a representation of the earth's surface.
• However, raw, unprocessed images are not maps!
• Raw imagery contains geometric errors from multiple sources.
• Why is there geometric distortion in imagery?
- Perspective of the sensor optics
- Motion of the scanning system
- Motion and instability of the platform
- Platform attitude, altitude and velocity
- Terrain relief
- Curvature and rotation of the earth
Cont`d
• Geometric corrections include correcting for geometric distortions due to sensor–earth geometry variations, and converting the data to real-world coordinates (latitude and longitude) on the earth's surface.

• Types of distortions:
- systematic distortions due to orbital variations (mostly corrected at the ground station after image capture)
- distortions due to relief variation
- distortions due to different projection systems

• Systematic geometric distortions include:
- s-shaped orbit
- tilt movements
Cont`d
• S-shaped orbit
- due to the non-polar orbit
- images have an oblique orientation
- earth rotation amplifies the effect

• Tilt movements
- the platform makes small rotational movements along three axes
Cont`d
• Perspective effects
- off-nadir pixel size is greater than the size of nadir pixels
- the earth's curvature amplifies the effect

• Distortions due to relief
- terrain height differences induce scale variations through the image
- this leads to positional errors or terrain displacements
- height differences also result in differences in resolution (higher pixels are closer to the sensor)
Cont`d
• Distortions due to different projection systems
- projecting the 3D earth onto a 2D map always causes distortions
- different projections and coordinate systems exist (e.g. UTM)
- two maps of the same area and scale will not be the same if they follow different projection systems
Geometric correction procedures
i. Image-to-image registration

• The process of geometrically aligning two or more sets of image data so that they conform geometrically.

• It does not involve assigning a coordinate system.

• The data being registered may be of the same type, from very different kinds of sensors, or collected at different times.

ii. Image rectification

• A process by which the geometry of an image area is made planimetric.

• Assigning map coordinates to image data is called georeferencing.

• Orthorectification is a specific form of rectification that also corrects for terrain displacements.
iii. Image resampling

• Resampling is the process of calculating the pixel values for the rectified image and creating a new file.

• Through a process of interpolation, the output pixel values are derived as functions of the input pixel values.

• Commonly used resampling techniques are:
- Nearest neighbour
- Bilinear interpolation
- Cubic convolution

• Resampling always follows the registration or rectification procedure.
• Resampling
- Calculate output pixel values
- First, a geometrically correct 'empty' output grid is calculated.
- Then, the output pixels are calculated from the input pixels.
a. Nearest neighbour resampling

• Uses the value of the closest input pixel as the output pixel value.

• Easy and fast to calculate

• Better for classification

• Can produce blocky output

• Preserves the original values in the altered scene
b. Bilinear interpolation
• Uses the pixel values of the four pixels in a 2x2 window to calculate an output value with a bilinear function.
• Each of the four pixel values is weighted more heavily the closer that pixel is to the output pixel.
• Produces smoother output.
• Edges are smoothed out.
• The output image looks natural because each output value is based on several input values.
c. Cubic convolution
• Uses the data values of 16 pixels in a 4x4 window to calculate an output value with a cubic function.
• The pixels farther from the output pixel have exponentially less weight than those close to the output pixel.
• Best preserves the mean and standard deviation of the input pixels.
• Computationally intensive
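The three techniques can be compared side by side with SciPy's `zoom`; note that `zoom` uses spline interpolation, so `order=1` and `order=3` only approximate the bilinear and cubic-convolution resamplers described above, and the toy band is invented for illustration:

```python
import numpy as np
from scipy.ndimage import zoom

# Toy 4x4 band resampled to 8x8 with the three techniques above.
band = np.arange(16, dtype=float).reshape(4, 4)

nearest  = zoom(band, 2, order=0)  # nearest neighbour: repeats values, blocky
bilinear = zoom(band, 2, order=1)  # linear: weighted average of nearby pixels
cubic    = zoom(band, 2, order=3)  # cubic spline: larger neighbourhood, smoothest

# Nearest neighbour preserves the original DNs; interpolation introduces new ones.
print(nearest.shape, bilinear.shape)  # (8, 8) (8, 8)
```

This mirrors the trade-off on the slides: `order=0` keeps original values (good for classified data), while higher orders produce smoother but altered values.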
Image enhancement
• Image enhancement is the process of making an image more interpretable for a particular application.
• Enhancement can improve the appearance of the imagery to assist in visual interpretation and analysis.
• After pre-processing, the image needs to be enhanced.
• The image enhancement techniques available to facilitate visual interpretation are:
- Radiometric enhancement (individual pixel, 1 band)
- Spatial enhancement (individual pixel + surrounding pixels, 1 band)
- Spectral enhancement (multiple bands, see Image Transformations)
Cont`d
i. Radiometric enhancement: enhancing images based on the values of individual pixels within each band.
• Also referred to as contrast stretching.
• Radiometric enhancement is built on the concept of the image histogram.
• An image histogram is a graphical representation of the brightness values that comprise an image.
• Input brightness values (small range) → stretching → output brightness values (broad range).
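The small-range-to-broad-range stretch can be sketched as a linear contrast stretch; the band values are invented for illustration:

```python
import numpy as np

# Toy 8-bit band occupying only a narrow range of brightness values.
band = np.array([[60, 70, 80], [90, 100, 110]], dtype=float)

lo, hi = band.min(), band.max()
# Linear stretch: map [min, max] onto the full 0-255 display range.
stretched = (band - lo) / (hi - lo) * 255.0

print(stretched.min(), stretched.max())  # 0.0 255.0
```

Real implementations often stretch between low/high percentiles of the histogram instead of the absolute min/max, so a few outlier pixels do not waste the display range.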
Cont`d
ii. Spatial enhancement: enhancing images based on the values of neighboring pixels within each band.

• Also referred to as a neighborhood operation.

• Spatial enhancement deals largely with spatial frequency: the difference between the highest and lowest values of a contiguous set of pixels.
- zero spatial frequency
- low spatial frequency
- high spatial frequency
Cont`d
• Spatial filtering encompasses a set of digital processing functions used to enhance the appearance of an image.

• Spatial filters are designed to highlight or suppress specific features in an image based on their spatial frequency.

a. Convolution filtering:
- Low-pass filters: reduce detail, smooth the image
- High-pass filters: sharpen the appearance of detail
- Edge detection filters: highlight linear features, such as roads or field boundaries

• A convolution kernel is a matrix of numbers used to calculate the value of the central pixel from the values of the surrounding pixels.
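A minimal example of a convolution kernel is the 3x3 mean (low-pass) filter, sketched below on an invented toy band:

```python
import numpy as np
from scipy.ndimage import convolve

# 3x3 low-pass (mean) kernel: each output pixel becomes the average
# of itself and its eight neighbours, smoothing the image.
kernel = np.full((3, 3), 1.0 / 9.0)

band = np.zeros((5, 5))
band[2, 2] = 9.0  # a single bright pixel

smoothed = convolve(band, kernel, mode="nearest")
print(smoothed[2, 2])  # ≈ 1.0 — the bright pixel is spread over its neighbourhood
```

A high-pass kernel works the same way with different weights, e.g. a large positive centre surrounded by negative values.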
Image Enhancement
• Reduce noise

• Emphasize certain image features

• Techniques
- Contrast enhancement
- Edge enhancement
- Noise filtering
- Sharpening
- Magnifying
Haze Reduction
• One means of haze compensation in multispectral data is to observe the radiance recorded over target areas of zero reflectance.

• For example, the reflectance of deep clear water is zero in the NIR region of the spectrum.
- Therefore, any signal observed over such an area represents the path radiance.

• This value can be subtracted from all the pixels in that band.
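This dark-object subtraction can be sketched as follows; the NIR values and the choice of "water" pixels are invented for illustration:

```python
import numpy as np

# Hypothetical NIR band; the low values in the corner stand in for
# deep clear water, whose true NIR reflectance is zero.
nir = np.array([[12.0, 13.0, 80.0],
                [12.0, 95.0, 88.0],
                [90.0, 92.0, 97.0]])

# Signal over the zero-reflectance target is taken as the path radiance (haze).
path_radiance = nir[:2, :2].min()   # water pixels; selection is illustrative

# Subtract the haze offset from the whole band, clipping at zero.
corrected = np.clip(nir - path_radiance, 0, None)
print(corrected.min())  # 0.0 — haze offset removed
```

In practice the dark target is identified from the band histogram or a water mask, and the subtraction is repeated per band with each band's own dark-object value.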
Haze Reduction

• The apparent radiance recorded at the sensor can be modelled as

Lapp = ρTE/π + Lp

where
Lapp = apparent radiance at sensor
ρ = target reflectance
T = atmospheric transmittance
E = incident solar irradiance
Lp = path radiance (haze)

(Figures: the aerial image before and after haze removal)
Coordinate Systems: Geographic vs. Projected

• Geographic Coordinate Systems (GCS)
- Locations measured on the curved surface of the earth
- Measurement units: latitude and longitude
- Degrees-minutes-seconds (DMS)
- Decimal degrees (DD) or radians (rad)

• Projected Coordinate Systems (PCS)
- Flat surface
- Units can be meters, feet, inches
- Distortions will occur, except on very fine scale maps
Map projections …
• The transformation of a three-dimensional object onto a two-dimensional plane

• Relate locations on the earth to their relative locations on a flat map

• Are mathematical expressions

• Cause distortion of one or more map properties (scale, distance, direction, shape)
Classifications of Map Projections
• Conformal – local shapes are preserved

• Equal-Area – areas are preserved

• Equidistant – distances from a single location to all other locations are preserved

• Azimuthal – directions from a single location to all other locations are preserved
Why project data?
• Data often comes in geographic, or spherical, coordinates (latitude and longitude), which cannot be used directly in the mathematical models of most GIS software applications.

• Some projections work better for different parts of the globe, giving more accurate calculations.

• To represent the curved surface of the earth on a flat map.


Map Projection Types
Cylindrical projections
• Shapes are preserved

• But not area!

• Mercator projection

Conic projections

• Best for mid-latitude zones or small regions

• Area and shape only slightly distorted

Planar projections

• Equidistant; good for navigation

• Only good for one hemisphere

• Distorts area, not shape


Map projections
• Three properties to consider

• Area (equal-area or equivalent)

• Shape (conformal)

• Distance (equidistant)

• Choose two out of three

• How large an area?

• Purpose of the map

• Ulterior motives?
Geographic Coordinate System

• Parallels – run east to west – 0° at the Equator (0°–90°)

• Meridians – run north to south – 0° at the Prime Meridian (0°–180°)

• Latitude and longitude are angular measurements made from the center of the earth to a point on the surface of the earth.
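Converting between the two angular notations mentioned above is simple arithmetic; the example coordinate is invented for illustration:

```python
# Convert a DMS (degrees-minutes-seconds) angle to decimal degrees (DD).
def dms_to_dd(degrees: int, minutes: int, seconds: float) -> float:
    """1 degree = 60 minutes = 3600 seconds; sign is carried by the degrees."""
    sign = -1.0 if degrees < 0 else 1.0
    return sign * (abs(degrees) + minutes / 60.0 + seconds / 3600.0)

# Example: 38° 45' 36" = 38 + 45/60 + 36/3600
print(dms_to_dd(38, 45, 36))  # 38.76
```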
Datum
• Reference frame for locating points on Earth's surface

• Defines the origin and orientation of latitude/longitude lines

• Defined by a spheroid and the spheroid's position relative to Earth's center
Image file formats
• JPG, TIFF, GIF, PNG and EPS

What's a JPG?
• Capable of featuring millions of colors

• Compresses really well – perfect for online use

• Minor color and detail loss through compression

• Not good for sharp edges or text

• Recommended for online pictures, some print


What's a TIFF?

• Capable of featuring millions of colors

• No color loss

• No or little compression – large file size

• Not compatible with all applications

• Recommended for highly detailed images for print


What's a GIF?
• Capable of featuring 256 colors only

• Compresses really well – perfect for online use

• Great for sharp edges or text

• Recommended for online diagrams, charts and text


What's a PNG?

• Capable of featuring millions of colors

• Compresses really well

• Great for sharp edges, text or transparency

• Mainly used online

• Not compatible with all applications

• Recommended as an online standard


What's an EPS?

• Scalable to any size

• Can only be opened by graphics software

• Does not lose color or detail

• Recommended for print signage and merchandise applications