
Remote Sensing : Concepts

Recommended Readings
• Remote Sensing of the Environment
by J.R. Jensen (Publication: Springer)
• Remote Sensing and Image Interpretation
by T.M. Lillesand, R.W. Kiefer (Publication: Wiley)
• Fundamentals of Remote Sensing
by George Joseph

www.ccrs.nrcan.gc.ca
Discussion Topics
• What is Remote Sensing?
• How & when did it start?
• How is it today?
• What is Remote Sensing Process?
• What is Electromagnetic Radiation (EMR)?
• What is Electromagnetic Spectrum?
• Radiation interactions (with atmosphere & target)
• Spectral Signature
• Passive & Active Remote Sensing
• Satellite image characteristics
What is Remote Sensing?
• It is the science of acquiring information about the Earth's surface without actually being in contact with it.

• This is done by sensing and recording reflected or emitted energy, and processing, analyzing, and applying that information.
How & when did it start?

1860 : Paris

1839 : Paris
How is it today?

Abu Dhabi : PAN image


How is it today?

Mumbai (LISS III + PAN image)


How is it today?

Statue of Liberty (IKONOS)


What is Remote Sensing Process?
• Energy source / illumination
• Radiation and the atmosphere
• Interaction with the target
• Recording of energy by the sensor
• Transmission, reception and processing
• Interpretation and analysis
• Application
What is Electromagnetic Radiation (EMR)?
• The first requirement for remote sensing is to have an energy source to illuminate the target.
• This energy is in the form of electromagnetic radiation.
• Two characteristics of EMR are wavelength and frequency.
What is Electromagnetic Radiation (EMR)?

Electromagnetic radiation consists of an electrical field (E) which varies in magnitude in a direction perpendicular to the direction in which the radiation is traveling, and a magnetic field (M) oriented at right angles to the electrical field. Both these fields travel at the speed of light (c).
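
A standard relation (textbook physics, not stated explicitly on the slide) links the two characteristics of EMR through the speed of light:

    c = λ × ν

where c ≈ 3 × 10^8 m/s, λ is the wavelength (in metres) and ν is the frequency (in hertz). For example, green light with λ = 0.55 µm has ν ≈ 5.5 × 10^14 Hz.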
What is Electromagnetic Spectrum?
• It ranges from the shorter wavelengths (including gamma and x-rays) to the longer wavelengths (including microwaves and broadcast radio waves).
What is Electromagnetic Spectrum?
• The ultraviolet or UV portion of the spectrum has the shortest wavelengths which are practical for remote sensing.
• This radiation is just beyond the violet portion of the visible wavelengths, hence its name.
• Some Earth surface materials, primarily rocks and minerals, fluoresce or emit visible light when illuminated by UV radiation.
What is Electromagnetic Spectrum?
• The light which our eyes - our "remote sensors" - can detect is part of the visible spectrum.

Violet: 0.4 - 0.446 µm
Blue: 0.446 - 0.500 µm
Green: 0.500 - 0.578 µm
Yellow: 0.578 - 0.592 µm
Orange: 0.592 - 0.620 µm
Red: 0.620 - 0.7 µm
Radiation interaction: with atmosphere
• Radiation before reaching the Earth's surface has to travel through some distance of the Earth's atmosphere.

• Atmosphere contains particles & gases.

• Particles and gases in the atmosphere can affect the incoming light and radiation.

• These effects are caused by the mechanisms of scattering and absorption.
Radiation interaction: with atmosphere
• Scattering occurs when particles or large gas molecules present in the atmosphere interact with and cause the electromagnetic radiation to be redirected from its original path.

• How much scattering takes place depends on several factors including the wavelength of the radiation, the abundance of particles or gases, and the distance the radiation travels through the atmosphere.
Radiation interaction: with atmosphere
• Absorption is the other main mechanism at work when electromagnetic radiation interacts with the atmosphere.

• This phenomenon causes molecules in the atmosphere to absorb energy at various wavelengths.

• Because these gases absorb electromagnetic energy in very specific regions of the spectrum, they influence where (in the spectrum) we can "look" for remote sensing purposes.

• Those areas of the spectrum which are not severely influenced by atmospheric absorption, and thus are useful to remote sensors, are called atmospheric windows.
Atmospheric window
Radiation interaction: with target

• Radiation that is not absorbed or scattered in the atmosphere reaches and interacts with the Earth's surface and different targets.

• 3 forms of interaction (when incident energy strikes the surface):
Absorption (A): when radiation is absorbed into the target
Transmission (T): when radiation passes through a target
Reflection (R): when radiation "bounces" off the target
Radiation interaction: with target
• We are most interested in measuring the radiation reflected from targets.
• Two types of reflection:
• Specular reflection and
• Diffuse reflection.

Specular reflection (mirror-like)    Diffuse reflection (rough surface)


Spectral Signature
• Spectral Signature or spectral response refers to the response of a target to radiation in terms of absorption, transmission, and reflection.

• It depends on the complex make-up of the target that is being looked at (material composition, surface properties, etc.), and the wavelengths of radiation involved.

• Thus, by comparing the response patterns of different features we may be able to distinguish between them; this may not be achieved if we only compared them at one wavelength.

• Spectral response can be quite variable, even for the same target type, and can also vary with time.
Spectral Signature : Leaves
• Leaves: Chlorophyll strongly absorbs radiation in the red and blue wavelengths but reflects green wavelengths.

• Leaves appear "greenest" to us in the summer, when chlorophyll content is at its maximum.

• In autumn, the reduction in chlorophyll in the leaves results in less absorption and proportionately more reflection of the red wavelengths, making the leaves appear red or yellow.

• The internal structure of healthy leaves acts as an excellent diffuse reflector of near-infrared wavelengths.
Spectral Signature : Water
• Longer wavelength visible (red) and near infrared radiation is absorbed more by water than shorter visible wavelengths (violet).

• Thus water typically looks blue or blue-green due to stronger reflectance at these shorter wavelengths, and darker if viewed at red or near infrared wavelengths.

• If there is suspended sediment present in the upper layers of the water body, then this will allow better reflectivity and a brighter appearance of the water.
Spectral Signature : Leaves vs Water
• Water and vegetation may reflect somewhat similarly in the visible wavelengths but are almost always separable in the infrared.
Passive & Active Remote Sensing
• Remote sensing that uses naturally available energy is termed Passive Remote Sensing.

• Active Remote Sensing uses energy provided by the sensor itself (i.e. its own energy source for illumination).
Passive & Active Remote Sensing
• Passive sensors measure reflected energy; this can only take place during the time when the sun is illuminating the Earth.

• Energy that is naturally emitted (such as thermal infrared) can be detected day or night, as long as the amount of energy is large enough to be recorded.

• Active sensors can obtain measurements anytime, regardless of the time of day or season.

• Active sensors can be used for examining wavelengths that are not sufficiently provided by the sun, such as microwaves, or to better control the way a target is illuminated.
Satellite image characteristics
• Electromagnetic energy may be detected either photographically or electronically.

• An image refers to any pictorial representation, regardless of what wavelengths or remote sensing device has been used to detect and record the electromagnetic energy.

• A photograph refers specifically to images that have been detected as well as recorded on photographic film.

• Photos are normally recorded over the wavelength range from 0.3 µm to 0.9 µm - the visible and reflected infrared. Based on these definitions, we can say that all photographs are images, but not all images are photographs.
Satellite image characteristics
• An image or photograph displayed in a digital format can be subdivided into small equal-sized and shaped areas, called picture elements or pixels (cells).

• Each cell represents the brightness of each area with a numeric value called Brightness Value (BV) or Digital Number (DN).
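
• As a minimal sketch of this representation (assuming the numpy library, which the slides do not mention, and made-up DN values), a single band can be handled as a 2-D array of digital numbers:

import numpy as np

# A hypothetical 4 x 5 single-band image, 8-bit digital numbers (0-255).
band = np.array([
    [ 12,  40,  40,  38, 200],
    [ 15,  42,  45, 180, 210],
    [ 18,  44, 170, 190, 215],
    [ 20, 160, 175, 195, 220],
], dtype=np.uint8)

print(band.shape)              # (rows, columns) -> (4, 5)
print(band[0, 4])              # DN (brightness value) of the pixel in row 0, column 4 -> 200
print(band.min(), band.max())  # darkest and brightest DN in the band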
Satellite image characteristics
• The information from different wavelength ranges is gathered and stored in separate channels, referred to as bands.

• Information from different channels can be combined and displayed digitally using the three primary colours (blue, green, and red).

• The data from each channel is represented as one of the primary colours and, depending on the relative brightness (i.e. the digital value) of each pixel in each channel, the primary colours combine in different proportions to represent different colours.
Satellite image characteristics
• Thus, when data from a single channel or band is displayed, it shows variation in brightness value as different shades of gray ranging from white to black.

• When two or more bands or channels are combined and displayed, a colour image is produced.
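
• A minimal sketch of such a colour composite, assuming numpy and three already-loaded 8-bit band arrays (the random values below are placeholders only):

import numpy as np

def composite(red_band, green_band, blue_band):
    # Stack three 8-bit band arrays into an RGB colour composite.
    # Which physical bands go to the red, green and blue guns is the
    # analyst's choice (e.g. a false colour composite puts near-infrared
    # into the red gun); the mechanics are the same either way.
    return np.dstack([red_band, green_band, blue_band]).astype(np.uint8)

# Hypothetical 100 x 100 bands filled with random DNs, for illustration only.
rng = np.random.default_rng(0)
b1, b2, b3 = (rng.integers(0, 256, (100, 100), dtype=np.uint8) for _ in range(3))
rgb = composite(b1, b2, b3)
print(rgb.shape)   # (100, 100, 3) - rows, columns, colour guns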
Satellite image : Bands
Discussion Topics
• Platforms & Sensors
• Orbits & Swath
• Data Reception
• Data Processing
• Data Products
• Image Resolutions (4)
• Multispectral Scanning
Platforms
• It refers to a surface on which the sensor rests.
• It may be
• Ground Borne
• Air Borne
• Space Borne
Platforms : Types
Ground Borne

Air Borne

Space Borne
Orbits
• The path followed by a satellite is referred to as its orbit.
• Satellite orbits are matched to the capability and objective of the sensor(s) they carry.
• Orbit selection can vary in terms of altitude and their orientation and rotation relative to the Earth.
• Types :
• Geostationary
• Polar (Sun synchronous)
Orbits

• Satellites following a Geostationary orbit view the same portion of the Earth's surface at all times.

• These are at an altitude of approximately 36,000 kilometres.

• They revolve around the Earth with a speed equal to the rotation speed of the Earth (hence they appear stationary relative to the Earth).

Examples: Weather and communications satellites


Orbits
• About 800-1000 km above the Earth's surface.

• Some satellites follow orbits (basically north-south) which, in conjunction with the Earth's rotation (west-east), allow them to cover most of the Earth's surface over a certain period of time.

• Many of these satellite orbits are also sun-synchronous, such that they cover each area of the world at a constant local time of day, called local sun time.
Orbits

• A satellite's travel towards the North pole is termed the 'ascending pass', while its travel towards the South pole is termed the 'descending pass'.

• In sun-synchronous orbits, the ascending pass is most likely on the shadowed side of the Earth while the descending pass is on the sunlit side.
Orbits
• Sensors recording reflected solar energy only image the surface on a descending pass, when solar illumination is available.

• Active sensors which provide their own illumination, or passive sensors that record emitted (e.g. thermal) radiation, can also image the surface on ascending passes.
Swath
• As a satellite revolves around the Earth, the sensor captures a certain portion (a strip of constant width) of the Earth's surface. The area imaged on the surface is referred to as the swath.

• Imaging swaths for spaceborne sensors generally vary between tens and hundreds of kilometres wide.

• As the satellite orbits the Earth from pole to pole, its east-west position wouldn't change if the Earth didn't rotate.
Satellite Passes

• As the satellite orbits the Earth from pole to pole, its east-west position wouldn't change if the Earth didn't rotate.

• However, as seen from the Earth, it seems that the satellite is shifting westward because the Earth is rotating (from west to east) beneath it.

• This apparent movement allows the satellite swath to cover a new area with each consecutive pass.
Sensors
• Devices or instruments which sense the energy and produce a photograph or image from the energy recorded from the object.
• Mounted on the platform.

PAN LISS III


Some Popular Sensors & their Swath
Sensor        Orbit (altitude)    Swath

PAN           817 km              23 km
LISS III      817 km              141 km
LISS IV
Cartosat-1
Cartosat-2
Data Reception
• The data captured by the satellite and stored in its storage medium is downloaded at the Ground Control Station.

• The Indian Ground Control Station is located at Shadnagar, 55 km south of Hyderabad city, to receive data from various remote sensing satellites of both Indian and foreign origin.

• It has three Antenna Receive systems (Terminals) to support multi-mission operations.

• They are capable of tracking and receiving data from any remote sensing satellite which is in a sun-synchronous, polar, near-circular orbit.
Data Reception
• The Ground Station basically consists of
(1) Antenna & Receive System
(2) Archival & Quick Look Browse System
(3) Test Facility and
(4) Support systems (Communication Networks; UPS and Standby DG power systems etc.)

• Data reception is followed by Data Processing for generation of products.
Data Processing
• As per user requirements, data processing caters to the generation of satellite data products for all IRS series satellites and some foreign satellites.

• The raw data recorded at the ground station is corrected for geometric and radiometric distortions. Data products are categorized as Standard products and Value Added products.

• Raw data is archived on media. A catalog of all the archived media is maintained. This media forms the input for data product generation.
Data Processing

Data Processing Facility

Archival facility
Data Processing

Photo Processing facility


Data Products
Photographic Products
• Black & White (B/W) Photographs
• Colour Photographs (Natural & False Colour Composites)
• Negative & Positive Transparencies
• Paper prints

Digital Products
• LGSOWG (Landsat Ground Station Operators
Working Group) or Super Structure Format
• Fast Format
• GeoTIFF (Geographic Tagged Image File Format)
• Hierarchical Data Format (HDF)
• CDs or Digital Audio Tape (DAT)
Satellite image procurement
• Satellite images are usually procured through national agencies or their authorised channel partners.

• In India, the purchaser needs to send the request to NRSC (National Remote Sensing Centre, Hyderabad).

• The archive images may be browsed through the 'image search' option on www.nrsc.gov.in. The data price list is also available on the website.
Image Resolutions (4)
• Resolution = the ability to distinguish two spatially close or spectrally similar objects.

• This refers to the size of the smallest possible feature that can be detected.

• Image resolution refers to:
Spatial Resolution
Spectral Resolution
Radiometric Resolution
Temporal Resolution
Spatial Resolution
• Images are composed of a matrix of picture elements, or pixels, which are the smallest units of an image.

• Image pixels are normally square and represent a certain area on an image.

• When an image is displayed at full resolution, each pixel represents a ground area corresponding to the spatial resolution of the sensor.

• It is expressed in terms of linear dimension (usually in meters).

Example: the spatial resolution of IKONOS is 1 m and the spatial resolution of QuickBird is 0.60 m.
Spatial Resolution

Fine Resolution

Coarse Resolution
Spatial Resolution of some of the popular images
Sensor          Spatial Resolution

WiFS            188 m
LISS III        23.5 m
LISS IV         5.8 m (Colour)
PAN             5.8 m (Grayscale)
Cartosat-1      2.5 m
Cartosat-2      1.0 m

IKONOS          1.0 m
QuickBird       0.60 m
WorldView-01    0.50 m
Spectral Resolution
• Spectral resolution describes the ability of a sensor to define fine wavelength intervals.

• As learnt earlier, spectral response changes from object to object based on its material and surface properties.

• Thus different classes of features in an image can often be distinguished by comparing their responses over distinct wavelength ranges.

• Broad classes, such as water and vegetation, can usually be separated using very broad wavelength ranges - the visible and near infrared. However, rock types may require higher spectral resolution to distinguish them.
Spectral Resolution
• Black and white film records wavelengths extending over much, or all, of the visible portion of the electromagnetic spectrum. Its spectral resolution is fairly coarse.

• Colour film has higher spectral resolution, as it is individually sensitive to the reflected energy at the blue, green, and red wavelengths of the spectrum.

• Thus, it can represent features of various colours based on their reflectance in each of these distinct wavelength ranges.
Spectral Resolution
• Many remote sensing systems record energy over several separate wavelength ranges at various spectral resolutions.

• Hyperspectral sensors: they detect hundreds of very narrow spectral bands throughout the visible, near-infrared, and mid-infrared portions of the electromagnetic spectrum.

• Their very high spectral resolution facilitates fine discrimination between different targets based on their spectral response in each of the narrow bands.
Radiometric Resolution

• The radiometric resolution of an imaging system describes its ability to discriminate very slight differences in energy.

• It is also called the Quantization Level.

• The finer the radiometric resolution of a sensor, the more sensitive it is to detecting small differences in reflected or emitted energy.
Radiometric Resolution
• Imagery data are represented by positive digital numbers which vary from 0 to (one less than) a selected power of 2.

• This range corresponds to the number of bits used for coding numbers in binary format. Each bit records an exponent of power 2.

• The maximum number of brightness levels available depends on the number of bits used in representing the energy recorded.

• Thus, for an 8-bit sensor digital values will range from 0 to 255, while those for a 4-bit sensor will range from 0 to 15.
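
• A small sketch of this arithmetic (assuming numpy; the DN values are made up): the number of brightness levels is 2 raised to the number of bits, and 8-bit DNs can be requantized to 4 bits by integer division:

import numpy as np

def brightness_levels(bits):
    # Number of distinct DN values for a given bit depth: 2**bits
    return 2 ** bits

print(brightness_levels(8), brightness_levels(4))   # 256 16

# Requantizing 8-bit DNs (0-255) to 4-bit DNs (0-15): 256 / 16 = 16 input
# levels collapse into each output level.
dn_8bit = np.array([0, 15, 16, 127, 255], dtype=np.uint8)
dn_4bit = dn_8bit // 16
print(dn_4bit)   # [ 0  0  1  7 15]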
Temporal Resolution
• It refers to the length of time it takes for a satellite to complete one entire orbit cycle. (It is the time period required to image the exact same area at the same viewing angle a second time.)

• The revisit period of a satellite sensor is usually several days.

• The actual temporal resolution of a sensor depends on a variety of factors, including the satellite/sensor capabilities, the swath overlap, and latitude.

• Spectral characteristics of features may change over time and these changes can be detected by collecting and comparing multi-temporal (multi-date) imagery.
Temporal Resolution
• Temporal resolution is important when:
• Imaging short-lived phenomena (floods, oil slicks, etc.)
• Multi-temporal comparisons are required (e.g. the spread of a forest disease from one year to the next)
• The changing appearance of a feature over time can be used to distinguish it from near-similar features (wheat / maize)
Multispectral Scanning
• A scanning system used to collect data over a variety of different wavelength ranges is called a multispectral scanner (MSS).

Across Track Scanning    Along Track Scanning


Across Track Scanning
• Across-track scanners scan the Earth in a series of lines.
• The lines are oriented perpendicular to the direction of motion of the sensor platform (i.e. across the swath).
• Each line is scanned from one side of the sensor to the other, using a rotating mirror (A).
• As the platform moves forward over the Earth, successive scans build up a two-dimensional image of the Earth's surface.
Across Track Scanning
• Airborne scanners typically sweep large angles (between 90º and 120º), while satellites, because of their higher altitude, need only to sweep fairly small angles (10-20º) to cover a broad region.

• Because the distance from the sensor to the target increases towards the edges of the swath, the ground resolution cells also become larger and introduce geometric distortions to the images.
Along Track Scanning
• Along-track scanners also use the forward motion of the platform to record successive scan lines and build up a two-dimensional image, perpendicular to the flight direction.

• However, instead of a scanning mirror, they use a linear array of detectors.

• These systems are also referred to as pushbroom scanners, as the motion of the detector array is analogous to the bristles of a broom being pushed along a floor.
Advantages of Along Track Scanning
• The array of detectors combined with the pushbroom motion allows each detector to "see" and measure the energy from each ground resolution cell for a longer period of time (dwell time).

• This allows more energy to be detected and improves the radiometric resolution.

• Detectors are usually solid-state microelectronic devices; they are generally smaller, lighter, require less power, and are more reliable and last longer because they have no moving parts.

• Challenge: cross-calibration of thousands of detectors to achieve uniform sensitivity across the array is necessary and complicated.
Discussion Topics
• Elements of visual interpretation
• Tone
• Shape
• Size
• Pattern
• Texture
• Shadow
• Association.
Tone
• Tone refers to the relative brightness or colour of objects in an image.

• Tone is the fundamental element for distinguishing between different targets or features.

• Variation in tone also allows the elements of shape, texture, and pattern of objects to be distinguished.
Shape
• Shape refers to the general form, structure, or outline of individual objects.

• Shape can be a very distinctive clue for interpretation.

E.g. Forest versus farms


Size
• Size of objects in an image is a function of scale.

• It is important to assess the size of a target relative to other objects in a scene, as well as the absolute size, to aid in the interpretation of that target.

• E.g.: Large buildings such as factories or warehouses would suggest commercial property, whereas small buildings would indicate residential use.
Pattern
• Pattern refers to the spatial arrangement of visibly discernible objects.
Texture
• Texture refers to the arrangement and frequency of tonal variation in particular areas of an image.

• Rough textures would consist of a mottled tone where the grey levels change abruptly in a small area, whereas smooth textures would have very little tonal variation.

E.g. Fields or grasslands usually have smooth texture while forests tend to have rough texture.
Shadow
• Shadow is also helpful in interpretation as it may provide an idea of the profile and relative height of a target or targets, which may make identification easier.

• However, shadows can also reduce or eliminate interpretation in their area of influence, since targets within shadows are much less (or not at all) discernible from their surroundings.
Association
• Association takes into account the relationship between other recognizable objects or features in proximity to the target of interest.

• The identification of features that one would expect to associate with other features may provide information to facilitate identification.
Discussion Topics
• Digital Image Processing (DIP)
• Image Correction (Pre-processing)
• Image Enhancement
• Image Classification
Image Correction
• These are operations that are normally required prior to the main data analysis and extraction of information.

• Includes :
• Atmospheric corrections
• Radiometric corrections
• Geometric corrections
Image Correction : Atmospheric
• Any sensor that records EMR from the Earth's surface using visible or near-visible radiation will typically register a mixture of two kinds of energy.

• Molecular & aerosol scattering & absorption by gases (water vapor, ozone, oxygen, & aerosols)

• The value recorded at any pixel location on a remotely sensed image does not represent the true ground-leaving radiance at that point.

• Part of the brightness is due to the reflectance of the target of interest and the remainder is derived from the brightness of the atmosphere itself.

• It is intended to correct for sensor- and platform-specific radiometric and geometric distortions of data.
Image Correction : Atmospheric
• The process of retrieving surface reflectance from remotely sensed imagery is called atmospheric correction.

• It is the process that converts the top-of-atmosphere (TOA) radiance to surface reflectance.

• Difficulties – variations of concentrations in time & space: aerosols & water vapor
• Aerosols – affect the shortwave bands
• Water vapor – affects the near IR bands

• It involves:
• Atmospheric parameter estimation
• Surface reflectance retrieval
Image Correction : Atmospheric
• Atmospheric Correction methods
• Invariant-object method (single viewing angle imagery)
• Look-up table method (single viewing angle imagery)
• Histogram matching methods
• Dark object methods
• Contrast reduction methods
• Cluster matching method
Image Correction : Atmospheric
Dark Object Method (Dark Pixel Subtraction Method)

• The brightness values across the image are observed for each band.

• The minimum brightness value is determined for each band (since scattering is wavelength dependent, the minimum values will vary from band to band).

• The correction is applied by subtracting the minimum observed value, determined for each specific band, from all pixel values in each respective band.
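
• A minimal sketch of this correction, assuming numpy and an image stacked as (bands, rows, cols); the DN values are made up:

import numpy as np

def dark_object_subtraction(image):
    # `image` is assumed to be a numpy array shaped (bands, rows, cols).
    # For each band the minimum observed DN is taken as the contribution
    # of atmospheric scattering and subtracted from every pixel in that band.
    corrected = image.astype(np.int32)          # avoid unsigned underflow
    for b in range(corrected.shape[0]):
        corrected[b] -= corrected[b].min()      # per-band minimum, since scattering varies with wavelength
    return corrected

# Illustration with a tiny made-up 2-band, 2 x 3 image.
img = np.array([[[60, 62, 70],
                 [65, 80, 90]],
                [[30, 31, 40],
                 [33, 55, 60]]], dtype=np.uint8)
print(dark_object_subtraction(img)[:, 0, 0])    # [0 0] - the darkest pixel in each band becomes 0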
Image Correction : Radiometric
• Radiometric corrections include correcting the data for sensor irregularities and unwanted sensor or atmospheric noise, and converting the data so they accurately represent the reflected or emitted radiation measured by the sensor.
Image Correction : Radiometric
• Noise in an image may be due to irregularities or errors that occur in the sensor response and/or data recording and transmission.

• Common forms of noise include systematic striping or banding and dropped lines.

• Both of these effects should be corrected before further enhancement or classification is performed.
Image Correction : Radiometric
• Dropped lines occur when there are system errors which result in missing or defective data along a scan line.

• Dropped lines are normally 'corrected' by replacing the line with the pixel values in the line above or below, or with the average of the two.
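
• A minimal sketch of this line-replacement fix, assuming numpy; the toy band below has one dropped (all-zero) scan line:

import numpy as np

def fix_dropped_line(band, row):
    # Replace a dropped (missing/defective) scan line with the average of
    # the lines immediately above and below it, as described above.
    # `band` is a 2-D numpy array of DNs; `row` is the index of the bad line.
    # Edge rows fall back to copying the single neighbouring line.
    fixed = band.astype(np.float64)
    if row == 0:
        fixed[row] = fixed[row + 1]
    elif row == band.shape[0] - 1:
        fixed[row] = fixed[row - 1]
    else:
        fixed[row] = (fixed[row - 1] + fixed[row + 1]) / 2.0
    return np.clip(fixed, 0, 255).astype(np.uint8)

band = np.array([[100, 102, 104],
                 [  0,   0,   0],
                 [110, 108, 106]], dtype=np.uint8)
print(fix_dropped_line(band, 1))   # row 1 becomes [105 105 105]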
Image Correction : Geometric
• Remote sensing data are affected by geometric distortions due to
– sensor geometry,
– platform instabilities,
– earth rotation,
– earth curvature etc.

• Some of these distortions are corrected by the image supplier; others have to be corrected by referencing images to existing maps or other images.
Image Correction : Geometric
• Geometric corrections are intended to compensate for these distortions so that the geometric representation of the imagery will be as close as possible to the real world.

• Many of these variations are systematic, or predictable in nature, and can be accounted for by accurate modeling of the sensor and platform motion and the geometric relationship of the platform with the Earth.

• Other unsystematic, or random, errors cannot be modeled this way, and correction through geometric registration of the imagery to a known ground coordinate system must be performed.
Image Correction : Geometric
Sources of geometric distortions: Systematic
• Scan skew: ground swath is not normal to the polar axis

• Mirror-scan velocity and panoramic distortion: along-scan distortion (pixels at the edge are slightly larger)

• Earth rotation: the earth rotates during scanning

Sources of geometric distortions: Non-systematic
• Altitude and attitude variations of the satellite
Image Correction : Geometric

• The geometric registration process involves identifying the image coordinates (i.e. row, column) of several clearly discernible points, called ground control points (or GCPs), in the distorted image (A - A1 to A4), and matching them to their true positions in ground coordinates (e.g. latitude, longitude).
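
• A minimal sketch of fitting a first-order (affine) transformation to GCPs by least squares, assuming numpy; the GCP coordinates below are invented for illustration, and this is only one of several possible transformation models:

import numpy as np

# Hypothetical GCPs: image coordinates (column, row) and their true
# ground coordinates (easting, northing). Values are made up.
image_xy  = np.array([[ 10,  12], [480,  25], [ 30, 500], [470, 490]], dtype=float)
ground_xy = np.array([[1000, 5000], [1940, 4970], [1040, 4020], [1920, 4040]], dtype=float)

# First-order (affine) model: ground = a0 + a1*col + a2*row,
# fitted separately for easting and northing by least squares.
A = np.column_stack([np.ones(len(image_xy)), image_xy])      # [1, col, row]
coeff_e, *_ = np.linalg.lstsq(A, ground_xy[:, 0], rcond=None)
coeff_n, *_ = np.linalg.lstsq(A, ground_xy[:, 1], rcond=None)

def image_to_ground(col, row):
    # Map an image pixel position to ground coordinates with the fitted model.
    return (coeff_e[0] + coeff_e[1] * col + coeff_e[2] * row,
            coeff_n[0] + coeff_n[1] * col + coeff_n[2] * row)

print(image_to_ground(240, 250))   # approximate ground position of the image centre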
Image Correction : Geometric
• The true ground coordinates are typically measured either in the field using GPS or DGPS, or measured from a map (the latitude-longitude grid may already be printed, or the common features in the raw map and the reference map are identifiable).

• If there is a single raw map, and the coordinates are applied either through field observation or through the lat-long grid printed on it, this is known as 'map registration'.

• If there is a raw map and a reference map or image (already with coordinates), geometric correction can be done through features common to them. This is known as 'map to map registration'.
Image Correction : Geometric
• It includes:
Rectification (applying coordinates to make the image planimetric)
Resampling (interpolating data onto a new grid)

• Rectification is through marking GCPs and assigning them coordinates
• Map registration
• Map-to-map registration

• Resampling
• Nearest neighbour
• Bilinear interpolation
• Cubic convolution
Image Correction : Geometric
• Nearest neighbour resampling uses the digital value from the pixel in the original image which is nearest to the new pixel location in the corrected image.

• This is the simplest method and does not alter the original values, but may result in some pixel values being duplicated while others are lost.
Image Correction : Geometric
• Bilinear interpolation resampling takes a weighted average of the four pixels in the original image nearest to the new pixel location.

• The averaging process alters the original pixel values and creates entirely new digital values in the output image.

• This may be undesirable if further processing and analysis, such as classification based on spectral response, is to be done.
Image Correction : Geometric
• Cubic convolution resampling calculates a distance-weighted average of a block of sixteen pixels from the original image which surround the new output pixel location.

• This method also results in completely new pixel values.
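
• A minimal sketch of these three options, assuming numpy and scipy (the slides do not name a library); scipy's spline order 0 behaves like nearest neighbour, order 1 like bilinear interpolation, and order 3 is a cubic spline, a close analogue of cubic convolution:

import numpy as np
from scipy import ndimage

# A small made-up band, resampled onto a grid twice as dense.
band = np.array([[ 10,  20,  30,  40],
                 [ 20,  40,  60,  80],
                 [ 30,  60,  90, 120],
                 [ 40,  80, 120, 160]], dtype=np.float64)

nearest  = ndimage.zoom(band, 2, order=0)   # nearest neighbour: original DNs only, some duplicated
bilinear = ndimage.zoom(band, 2, order=1)   # bilinear: weighted average of the 4 nearest input pixels
cubic    = ndimage.zoom(band, 2, order=3)   # cubic spline: analogue of cubic convolution (16-pixel neighbourhood)

for name, out in [("nearest", nearest), ("bilinear", bilinear), ("cubic", cubic)]:
    print(name, out.shape, out[:2, :2].round(1))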
Image Enhancement
• Image processing functions that are used to improve the appearance of the imagery, to assist in visual interpretation and image analysis, are termed Image Enhancement.

Image Enhancement includes:

• Contrast enhancement
linear stretching
histogram equalization
piecewise stretching

• Spatial enhancement
low pass filters
high pass filters
Image Histogram
• A histogram is a graphical representation of the brightness values that comprise an image.

• The brightness values (i.e. 0-255) are displayed along the x-axis of the graph.

• The frequency of occurrence of each of these values in the image is shown on the y-axis.
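
• A minimal sketch of computing such a histogram, assuming numpy and a made-up 8-bit band:

import numpy as np

# Histogram of an 8-bit band: how many pixels take each DN (0-255).
# The band here is random, purely for illustration.
rng = np.random.default_rng(1)
band = rng.integers(0, 256, (200, 200), dtype=np.uint8)

counts, bin_edges = np.histogram(band, bins=256, range=(0, 256))
print(counts.sum())   # 40000 = total number of pixels
print(counts[:5])     # number of pixels with DN 0, 1, 2, 3, 4
# Plotting (x-axis: DN, y-axis: frequency) could be done with matplotlib,
# e.g. plt.bar(range(256), counts), if that library is available.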
Image Histogram
Image Enhancement : Contrast enhancement
• In raw imagery, the useful data often populates only a small portion of the available range of digital values (commonly 8 bits or 256 levels).

• Contrast enhancement involves changing the original values so that more of the available range is used, thereby increasing the contrast between targets and their backgrounds.
Image Enhancement : Linear contrast stretch
• This involves identifying lower and upper bounds from the histogram (usually the minimum and maximum brightness values in the image) and applying a transformation to stretch this range to fill the full range.

• Example: a linear stretch uniformly expands this small range to cover the full range of values from 0 to 255.

• This enhances the contrast in the image, with light toned areas appearing lighter and dark areas appearing darker, making visual interpretation much easier.
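
• A minimal sketch of a min-max linear stretch, assuming numpy and a single low-contrast 8-bit band:

import numpy as np

def linear_stretch(band):
    # Linearly stretch a band so its minimum maps to 0 and its maximum to 255.
    band = band.astype(np.float64)
    lo, hi = band.min(), band.max()
    stretched = (band - lo) / (hi - lo) * 255.0
    return stretched.round().astype(np.uint8)

# A low-contrast band whose DNs occupy only 84-153 of the 0-255 range.
band = np.array([[ 84,  90, 100],
                 [120, 140, 153]], dtype=np.uint8)
print(linear_stretch(band))   # values now span the full 0-255 range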
Image Enhancement : Linear contrast stretch
Image Enhancement : Histogram Equalization
• This stretch assigns more display values (range) to the frequently occurring portions of the histogram.

• In this way, the detail in these areas will be better enhanced relative to those areas of the original histogram where values occur less frequently.
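
• A minimal sketch of histogram equalization via a cumulative-histogram lookup table, assuming numpy (the slides do not prescribe a particular implementation):

import numpy as np

def histogram_equalize(band):
    # The cumulative histogram is used as a lookup table so that frequently
    # occurring DN ranges receive a larger share of the 0-255 display range.
    counts, _ = np.histogram(band, bins=256, range=(0, 256))
    cdf = counts.cumsum().astype(np.float64)
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min()) * 255.0   # normalize CDF to 0-255
    lookup = cdf.round().astype(np.uint8)
    return lookup[band]                                         # map every pixel through the table

rng = np.random.default_rng(2)
band = rng.integers(90, 140, (100, 100), dtype=np.uint8)        # narrow, low-contrast DN range
eq = histogram_equalize(band)
print(eq.min(), eq.max())   # values now spread over roughly the full 0-255 range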
Image Enhancement : Histogram Equalization
Image Enhancement : Piecewise stretching
• A piecewise linear contrast stretch allows for the enhancement of a specific portion of data by dividing the lookup table into three sections: low, middle, and high.

• It enables you to create a number of straight line segments that can simulate a curve.

• You can enhance the contrast or brightness of any section in a single color gun at a time.
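
• A minimal sketch of a piecewise linear stretch using straight-line segments between breakpoints, assuming numpy; the breakpoints chosen below are illustrative only:

import numpy as np

def piecewise_stretch(band, breakpoints_in, breakpoints_out):
    # Piecewise linear stretch: straight-line segments between breakpoints.
    # breakpoints_in / breakpoints_out define the lookup table for the
    # low / middle / high sections.
    band = band.astype(np.float64)
    return np.interp(band, breakpoints_in, breakpoints_out).round().astype(np.uint8)

band = np.arange(0, 256, dtype=np.uint8).reshape(16, 16)
# Expand the middle section (DN 100-150) to occupy 50-230 of the display range,
# compressing the low and high sections.
out = piecewise_stretch(band, [0, 100, 150, 255], [0, 50, 230, 255])
print(out[6, 4], out[9, 6])   # input DN 100 -> 50, input DN 150 -> 230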
Image Enhancement : Spatial Filtering
• Spatial filters are designed to highlight or suppress specific features in an image based on their spatial frequency.

• Spatial frequency is related to image texture and refers to the frequency of the variations in tone that appear in an image.

• In an image, "rough" textured areas have high spatial frequencies and the tone changes abruptly over a small area, while "smooth" areas have low spatial frequencies, with little variation in tone over several pixels.
Image Enhancement : Spatial Filtering
• A common filtering procedure involves moving a 'window' of a few pixels in dimension (e.g. 3x3, 5x5, etc.) over each pixel in the image, applying a mathematical calculation using the pixel values under that window, and replacing the central pixel with the new value.

• The window is moved along in both the row and column dimensions one pixel at a time and the calculation is repeated until the entire image has been filtered and a "new" image has been generated.
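
• A minimal, deliberately literal sketch of this moving-window procedure, assuming numpy; with an averaging function it acts as the low pass filter discussed on the next slide:

import numpy as np

def window_filter_3x3(band, func=np.mean):
    # Move a 3x3 window over every interior pixel, apply `func` to the nine
    # values under the window, and write the result to the centre pixel.
    # Border pixels are simply copied unchanged in this simplified version.
    out = band.astype(np.float64).copy()
    rows, cols = band.shape
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            out[r, c] = func(band[r - 1:r + 2, c - 1:c + 2])
    return out.round().astype(np.uint8)

band = np.array([[10, 10, 10, 10],
                 [10, 90, 10, 10],
                 [10, 10, 10, 10],
                 [10, 10, 10, 10]], dtype=np.uint8)
# With func=np.mean this is a low pass (averaging) filter: the bright
# speck at (1, 1) is smoothed into its neighbourhood.
print(window_filter_3x3(band))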
Image Enhancement : Low Pass Filter
• A low-pass filter is designed to emphasize larger, homogeneous areas of similar tone and reduce the smaller detail in an image.

• Thus, low-pass filters generally serve to smooth the appearance of an image.

• A low pass filter passes low-frequency signals but attenuates (reduces the amplitude of) signals with frequencies higher than the cutoff frequency.

• Types: average & median filters


Image Enhancement : Low Pass Filter
Image Enhancement : High Pass Filter
• High-pass filters do the opposite of low pass filters.

• They sharpen the appearance of fine detail in an image.

• A high pass filter passes high frequencies well but attenuates (i.e., reduces the amplitude of) frequencies lower than the filter's cutoff frequency.

• Type: directional or edge detection filters


Image Enhancement : High Pass Filter
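
• A minimal sketch of an edge-enhancing high pass filter, assuming numpy and scipy (neither is named in the slides); the 3x3 kernel below is one common choice, not the only one:

import numpy as np
from scipy import ndimage

# A 3x3 high pass (edge enhancement) kernel: the weights sum to zero, so
# uniform areas go towards 0 while abrupt tonal changes are emphasized.
high_pass_kernel = np.array([[-1, -1, -1],
                             [-1,  8, -1],
                             [-1, -1, -1]], dtype=np.float64)

band = np.array([[10, 10, 80, 80],
                 [10, 10, 80, 80],
                 [10, 10, 80, 80],
                 [10, 10, 80, 80]], dtype=np.float64)

edges = ndimage.convolve(band, high_pass_kernel, mode='reflect')
print(edges)   # large magnitudes along the vertical edge, near 0 in flat areas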
Image Classification
• Image classification is used to digitally identify and classify pixels in the data.

• Classification is usually performed on multi-channel data sets, and this process assigns each pixel in an image to a particular class or theme based on statistical characteristics of the pixel brightness values.

• Classification can be done by supervised and unsupervised techniques.
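
• A minimal unsupervised-classification sketch, clustering per-pixel brightness vectors with k-means; numpy and scikit-learn are assumed (the slides do not name an algorithm or library), and the image below is random placeholder data:

import numpy as np
from sklearn.cluster import KMeans

# Hypothetical 3-band image, shaped (bands, rows, cols), with random DNs.
rng = np.random.default_rng(3)
image = rng.integers(0, 256, (3, 50, 50), dtype=np.uint8)

# Each pixel becomes a feature vector of its brightness values in every band.
bands, rows, cols = image.shape
pixels = image.reshape(bands, rows * cols).T.astype(np.float64)   # (n_pixels, n_bands)

# Unsupervised classification: cluster the pixel vectors into 5 spectral classes.
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(pixels)
classified = labels.reshape(rows, cols)
print(classified.shape, np.unique(classified))   # (50, 50) and class labels 0-4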
Aerial Photography
• Airborne remote sensing
• Uses a camera mounted in an aircraft
• The aircraft with the camera mounted on it flies over the area to be captured and captures the photographs
• Photo capture, now-a-days, is automatic
• Aerial photographs may be in digital or hard copy form
• Aerial photographs may be grayscale or coloured (natural colour or false colour composite)
Types of aerial photographs

Vertical Oblique
Types of aerial photographs

High Oblique Low Oblique


Basics of aerial photography
