

FUNDAMENTALS OF REMOTE SENSING


PRACTICAL FILE
SUBMITTED TO THE UNIVERSITY OF
DELHI
FOR THE PARTIAL FULFILMENT OF THE
REQUIREMENTS FOR
THE AWARD OF THE DEGREE OF
BACHELOR OF ARTS (GEOGRAPHY HONS.)
PAPER CODE –

SUBMITTED BY KUSHA FAUJDAR


ROLL NUMBER –
SEMESTER III, AUGUST-DECEMBER, 2023

SUPERVISED BY-
DR. BRATATI BARIK
DEPARTMENT OF GEOGRAPHY
KAMALA NEHRU COLLEGE
UNIVERSITY OF DELHI

CERTIFICATE

This is to certify that this practical file, "Fundamentals of Remote Sensing", is my original work. It is submitted to the Department of Geography, Kamala Nehru College, University of Delhi, in fulfilment of the requirements for the paper Fundamentals of Remote Sensing. This original work was undertaken under the supervision of the faculty in the academic year 2023-24, in the third semester, in pursuance of the degree of Geography (Hons.).

Name- Kusha Faujdar


Roll No.- 22/502
University Roll No. –
Department of Geography
Kamala Nehru College
University of Delhi

Signature of the Teacher-----------------------------------------



ACKNOWLEDGEMENT

I would like to express my sincere gratitude and appreciation as


I present this report on Fundamentals of Remote Sensing. It
gives me a great sense of pleasure to acknowledge the
contributions and support of several individuals who have been
instrumental in the completion of this project.
First and foremost, my deepest thanks go to my teacher- Dr.
Bratati Barik, for their unwavering guidance, mentorship, and
valuable insights throughout the project. I would like to thank
our principal Dr. Kalpana Bhakuni and the Department of
Geography of Kamala Nehru College for giving me this
opportunity through which I got a chance to learn new things
and enhance my knowledge. Their expertise and
encouragement were invaluable in shaping the direction of this
work and played a part in making this project a reality.
I would also like to extend my gratitude to all those who offered help, feedback and encouragement during this research. Your input was essential in shaping this report.

Name of Supervisor: (DR. BRATATI BARIK)
Name of Student: (KUSHA FAUJDAR)

CONTENTS

S.NO  TOPIC                                         PAGE NO.
1.    INTRODUCTION TO REMOTE SENSING                5-13
2.    AERIAL PHOTOS                                 14-22
       CALCULATING THE SCALE OF AERIAL PHOTOGRAPH
       ANNOTATION
       INTERPRETATION
3.    SATELLITE REMOTE SENSING                      23-27
       EMR
       INTERACTION OF EMR WITH EARTH ATMOSPHERE
       MAJOR SATELLITES AND SENSORS
4.    SATELLITE IMAGE PROCESSING                    28-34
       SUPERVISED CLASSIFICATION
       UNSUPERVISED CLASSIFICATION
5.    APPLICATIONS OF REMOTE SENSING                35-38
       NORMALISED DIFFERENCE VEGETATION INDEX

CHAPTER 1: INTRODUCTION TO REMOTE SENSING

REMOTE SENSING
Remote sensing is a technical term coined by Evelyn Pruitt, a geographer, during the space age of the 1960s, combining the words "remote" and "sensing" to describe what it is and what it does. The term refers to techniques used to analyse objects that are far away, for example determining what these objects are or what states they are in.
Remote sensing is a technology for acquiring information about the Earth's surface without actually being in contact with it. This is done by sensing and recording reflected or emitted energy and by processing, analysing and applying that information.
The United Nations in their annex Principles Relating to
Remote Sensing of the Earth from Space defines it as:
"The term Remote Sensing means the sensing of the Earth's
surface from space by making use of the properties of
electromagnetic waves emitted, reflected or diffracted by the
sensed objects, for the purpose of improving natural resources
management, land use and the protection of the environment."
Lillesand and Kiefer in their book "Remote Sensing and
Image Interpretation" even define it as an art: "Remote Sensing
is the science and art of obtaining information about an object,
area, or phenomenon through the analysis of data acquired by
a device that is not in contact with the object, area, or
phenomenon under investigation."
India's National Remote Sensing Centre (NRSC), in 1995, defined remote sensing as: "Remote Sensing is the technique of deriving information about objects on the surface of the earth without physically coming into contact with them."
Thus, remote sensing provides a means of observing large areas at fine spatial and temporal frequencies. It finds extensive applications in civil engineering, including watershed studies, simulation of hydrological states and fluxes, hydrological modelling, disaster management services such as flood and drought warning and monitoring, damage assessment after natural calamities, environmental monitoring, urban planning, etc.
Humans apply remote sensing in their day-to-day business,
through vision, hearing and sense of smell. The data collected
can be of many forms: variations in acoustic wave distributions (e.g., sonar), variations in force distributions (e.g., a gravity meter), variations in electromagnetic energy distributions (e.g., the eye), etc. These remotely collected data from various sensors may be analysed to obtain information about the objects or features under investigation.

Fig.1. ELEMENTS OF REMOTE SENSING

PLATFORMS
Platform is a stage where sensor or camera is mounted to
acquire information about a target under investigation. There
are different types of platforms and based on its altitude above
earth surface, these may be classified as:

Fig.2. PLATFORMS
a. GROUND BASED PLATFORM
Instruments that are ground based are often used to measure
the quantity and quality of light coming from the sun or for
close range characterization of objects. For example, to study
properties of a single plant or a small patch of grass, it would
make sense to use a ground-based instrument.
b. AIR BORNE PLATFORM
Airborne platforms are used to collect very detailed images and facilitate the collection of data over virtually any portion of the Earth's surface at any time. Airborne platforms were the sole non-ground-based platforms for early remote sensing work. Balloons, drones and aircraft are examples of airborne platforms.
c. SPACE BORNE PLATFORM
In space-borne remote sensing, sensors are mounted on board a spacecraft (space shuttle or satellite) orbiting the Earth. Space-borne or satellite platforms involve a high one-time cost but a relatively low cost per unit area of coverage, and they can acquire imagery of the entire Earth without requiring permission. Space-borne imaging ranges from altitudes of about 250 km to 36,000 km.

SENSORS
Sensor is a device that gathers energy (EMR or other), converts
it into a signal and presents it in a suitable form for obtaining
information about the target under investigation. Depending on
the source of energy, sensors are categorized as active or
passive:
a. ACTIVE SENSORS
Active sensors are those, which have their own source of EMR
for illuminating the objects. Radar (Radio Detection and
Ranging) and Lidar (Light Detection and Ranging) are some
examples of active sensor.

Fig.3. SENSORS TYPES

b. PASSIVE SENSORS
Passive sensors do not have their own source of energy. These
sensors receive solar electromagnetic energy reflected from the
surface or energy emitted by the surface itself. Therefore,
except for thermal sensors they cannot be used at night time.
Thus, in passive sensing, there is no control over the source of
electromagnetic radiation. The Thematic Mapper (TM) sensor
system on the Landsat satellite is a passive sensor.

RESOLUTION
Resolution refers to the amount of information available in a
satellite imagery. There are four types of resolution in satellite
imageries i.e., Spatial, Spectral, Radiometric and
Temporal resolutions. These four types of resolution in
remote sensing determine the amount and quality of
information in an imagery.

1. SPATIAL RESOLUTION
Spatial resolution is the detail in pixels of an image. High spatial
resolution means more detail and smaller pixel size. Whereas,
lower spatial resolution means less detail and larger pixel size.
Typically, drones like DJ I capture images with one of the
highest spatial resolutions. Even though satellites are highest in
the atmosphere, they are capable of 50cm pixel size or greater.

Fig.4. SPATIAL RESOLUTION

2. SPECTRAL RESOLUTION
Spectral resolution is the amount of spectral detail in a band. High spectral resolution means the bands are narrower, whereas low spectral resolution means broader bands covering more of the spectrum.
Fig.5. SPECTRAL RESOLUTION

3. TEMPORAL RESOLUTION
Temporal resolution is the time it takes for a satellite to revisit and image the same area. UAVs, airplanes and helicopters are completely flexible, but satellites orbit the Earth in set paths. Different types of orbits are required to achieve continuous monitoring (meteorology), global mapping (land cover mapping) or selective imaging (urban areas). The following orbit types are common for remote sensing missions:
a. Sun-synchronous orbit: also referred to as a near-polar orbit, with an inclination angle between 98° and 99° relative to a line running between the North and South Poles, enabling the satellite to always pass overhead at the same local time. The platforms are designed to adopt an orbit in a north-south direction. Examples: Landsat, IRS and SPOT.


Fig.6. SUN-SYNCHRONOUS ORBIT

b. Geostationary orbit: has an inclination angle of zero degrees, i.e., the satellite is placed above the equator at an altitude of about 36,000 km. The orbital period is kept equal to the rotational period of the Earth, which keeps the satellite fixed relative to a point on the Earth. This orbit is most commonly used for meteorological and telecommunication satellites.
Fig.7. GEOSTATIONARY ORBIT
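The 36,000 km figure can be checked from Kepler's third law: choosing the orbital radius so that the period matches one sidereal day fixes the altitude. A quick sketch in Python (the constants are standard approximate values, not taken from this practical):

```python
import math

GM_EARTH = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_378_000        # equatorial radius, m (approximate)
T_SIDEREAL = 86_164        # one sidereal day, s

# Kepler's third law: T^2 = 4*pi^2*a^3/GM  =>  a = (GM*T^2 / (4*pi^2))**(1/3)
a = (GM_EARTH * T_SIDEREAL**2 / (4 * math.pi**2)) ** (1 / 3)
altitude_km = (a - R_EARTH) / 1000
print(f"Geostationary altitude ≈ {altitude_km:,.0f} km")  # ≈ 35,786 km
```

The result, roughly 35,786 km above the equator, is the "36,000 km" quoted in the text.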

4. RADIOMETRIC RESOLUTION
Radiometric resolution relates to how much information is
perceived by a satellite’s sensor. The finer the radiometric
resolution of a sensor, the more sensitive it is to detecting small

differences in reflected or emitted energy.


Fig.8. RADIOMETRIC RESOLUTION
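Radiometric resolution is usually quoted in bits: an n-bit sensor can distinguish 2**n brightness levels. A small illustration (the bit depths shown are common examples, e.g. 8-bit for older Landsat sensors and 12-bit for Landsat 8's OLI):

```python
# An n-bit sensor can record 2**n distinct brightness (grey) levels.
levels = {bits: 2**bits for bits in (8, 11, 12, 16)}
for bits, n in levels.items():
    print(f"{bits}-bit sensor -> {n:,} grey levels")
# 8-bit -> 256, 11-bit -> 2,048, 12-bit -> 4,096, 16-bit -> 65,536
```

A 12-bit sensor can therefore record sixteen times as many distinct energy levels as an 8-bit sensor, which is what "finer radiometric resolution" means in practice.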

SEARCH ENGINES
EARTHDATA
Earthdata Search is a search and discovery application for NASA's Earth Observing System Data and Information System (EOSDIS). It provides the only means of data discovery, filtering, visualization and access across all of NASA's Earth science data holdings. Earthdata Search offers users the ability to select locations on a map to find corresponding data. Through the Earthdata Forum, users can interact with subject matter experts from several NASA Distributed Active Archive Centers (DAACs) to discuss general questions, research needs and data applications.

Fig.9. EARTHDATA
EARTH EXPLORER
USGS is a federal agency that provides science-based
information on Earth-system interactions, natural hazards and
water resources. USGS Earth Explorer is a web-based
platform that allows you to search for and view various types of
data sets from the USGS, such as geocoded addresses,
features, and dates. The U.S. Geological Survey's (USGS) Earth
Explorer Web site provides access to millions of land-related
products, including the following: Satellite images from
Landsat, advanced very high-resolution radiometer (AVHRR),
and Corona data sets.

Fig.10. EARTH EXPLORER

LP-DAAC
The Land Processes Distributed Active Archive Center (LP
DAAC) is one of several discipline-specific data centers within
the NASA Earth Observing System Data and Information
System (EOSDIS). LP DAAC is a distributed archive center that
processes, archives, and distributes NASA data products related
to land processes, such as land cover, surface reflectance,
temperature, and topography.

SOFTWARE
QGIS is a free and open-source cross-platform desktop geographic information system (GIS) application that supports viewing, editing, printing and analysis of geospatial data. QGIS functions as GIS software, allowing users to analyse and edit spatial information, in addition to composing and exporting graphical maps. QGIS supports raster and vector data. Vector data are stored as point, line or polygon features. Multiple formats of raster images are supported, and the software can geo-reference images. QGIS enables users to visualize their data using maps, charts and diagrams while customizing the presentation with a variety of symbology choices.

Fig.11. QGIS

ARCGIS

ArcMap is Esri's classic desktop geographic information system (GIS) software for creating, analyzing and visualizing spatial data and maps, for example by building layered maps of geographical statistics such as climate data or trade flows. ArcGIS is geospatial software used to view, edit, manage and analyze geographic data. Esri (Environmental Systems Research Institute) develops ArcGIS for mapping on desktop, mobile and web.

Fig.12. ARCGIS

ERDAS
ERDAS (Earth Resource Data Analysis System) is a mapping
software company specializing in Geographic Imaging solutions.
Software functions include importing, viewing, altering, and
analysing raster and vector data sets.

Fig.13. ERDAS

IDRISI
IDRISI is an integrated geographic information system (GIS)
and remote sensing software developed by Clark Labs at Clark
University for the analysis and display of digital geospatial
information. IDRISI is a PC grid-based system that offers tools for
researchers and scientists engaged in analysing earth system
dynamics for effective and responsible decision making for
environmental management, sustainable resource development and

equitable resource allocation.


Fig.14. IDRISI

CHAPTER 2: AERIAL PHOTOS

AERIAL PHOTOGRAPH
The scale of an aerial photograph refers to the proportion between the distance between two points on the photograph and the distance between the same points on the ground. It can be expressed in unit equivalents, e.g. 1 cm = 1 km (equivalently, 1 inch = 100,000 inches), or as a representative fraction (1:100,000).
Generally, aerial photos are classified into the following three
types based on the scale.
LARGE SCALE: Larger-scale photos (e.g., 1:25,000) cover small areas in greater detail. A large-scale photo simply means that ground features appear at a larger, more detailed size. The area of ground coverage seen on the photo is less than at smaller scales.
MEDIUM SCALE: Aerial photographs with a scale ranging between 1:15,000 and 1:30,000 are usually treated as medium-scale photographs.
SMALL SCALE: Smaller-scale photos (e.g., 1:50,000) cover large areas in less detail. A small-scale photo simply means that ground features appear at a smaller, less detailed size. The area of ground coverage seen on the photo is greater than at larger scales.

Fig.15. GEOMETRY OF AERIAL PHOTOGRAPH

CALCULATING THE SCALE OF AERIAL PHOTOGRAPH
The concept of scale for aerial photography is much the same as that for a map. Scale is the ratio of a distance on an aerial photograph to the distance between the same two places on the ground in the real world. It can be expressed in unit equivalents, e.g. 1 cm = 1 km, or as a representative fraction (1:100,000).
Scale determines which objects will be visible, the accuracy of estimates and how certain features will appear. When conducting an analysis based on air photos, it will sometimes be necessary to estimate the number of objects or the area covered by a certain amount of material, or it may be possible to identify certain features based on their length. To determine these dimensions during air photo interpretation, it is necessary to estimate lengths and areas, which requires knowledge of the photo scale.
There are three methods to compute the scale of an aerial
photograph using different sets of information.
METHOD 1: By Establishing Relationship Between Photo
Distance and Ground Distance.
Ques: The distance between two points on an aerial photograph
is measured as 2 centimetres. The known distance between the
same two points on the ground is 1 km. Compute the scale of
the aerial photograph (Sp).
Formula used: Photo scale (Sp) = Photo distance (Dp) / Ground distance (Dg)
Solution: Given: Dp (distance on aerial photograph) = 2 cm
Dg (corresponding ground distance) = 1 km = 100,000 cm
Find: scale of the aerial photograph (Sp)
Sp = Dp/Dg = 2/100,000 = 1/50,000
Thus, the scale is calculated as 1:50,000, i.e. 1 unit on the photo represents 50,000 units on the ground.
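The calculation above can be sketched as a small Python helper (an illustrative function of my own naming, not part of the practical exercise):

```python
# Illustrative helper (my own naming, not part of the practical exercise).
def photo_scale_from_ground(photo_dist_cm, ground_dist_cm):
    """Method 1: return N such that the photo scale is 1:N."""
    return ground_dist_cm / photo_dist_cm

# Worked example from the text: 2 cm on the photo = 1 km (100,000 cm) on the ground.
denom = photo_scale_from_ground(2, 100_000)
print(f"Scale = 1:{denom:,.0f}")  # Scale = 1:50,000
```

Both distances must be in the same unit before dividing, which is why the 1 km ground distance is first converted to centimetres.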

Method 2: By Establishing Relationship Between Photo


Distance and Map Distance.
As we know, the distances between different points on the
ground are not always known. However, if a reliable map is
available for the area shown on an aerial photograph, it can be
used to determine the photo scale. In other words, the
distances between two points identifiable both on a map and
the aerial photograph enable us to compute the scale of the
aerial photograph (Sp). The relationship between the two
distances may be expressed as under :
(Photo scale : Map scale) = (Photo distance : Map
distance)
We can derive,
Photo scale (Sp) = Photo distance (Dp) : Map distance
(Dm) x Map scale factor (MSF)

Ques: The distance measured between two points on a map is


2cm. The corresponding distance on an aerial photograph is 10
cm. Calculate the scale of the photograph when the scale of the
map is 1:50,000.
Solution: Map distance (Dm) = 2 cm
Photo distance (Dp) = 10 cm
Map scale factor (MSF) = 50,000
Therefore, Scale = 10 : (2 × 50,000) = 10 : 100,000
Scale of the photograph = 1:10,000, i.e. 1 unit on the photo represents 10,000 units on the ground.
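Method 2 can be checked the same way with a short Python sketch (the function name is my own, for illustration only):

```python
# Illustrative helper (my own naming, not part of the practical exercise).
def photo_scale_from_map(photo_dist_cm, map_dist_cm, map_scale_denom):
    """Method 2: return N such that the photo scale is 1:N."""
    ground_dist_cm = map_dist_cm * map_scale_denom  # map distance -> ground distance
    return ground_dist_cm / photo_dist_cm

# Worked example: 10 cm on the photo corresponds to 2 cm on a 1:50,000 map.
denom = photo_scale_from_map(10, 2, 50_000)
print(f"Scale = 1:{denom:,.0f}")  # Scale = 1:10,000
```

The map distance is first converted to a ground distance using the map scale, reducing the problem to Method 1.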

Method 3: By Establishing Relationship Between Focal Length (f) and Flying Height (H) of the Aircraft.
If no additional information is available about the relative distances on the photograph and the ground/map, we can determine the photo scale provided the focal length of the camera (f) and the flying height of the aircraft (H) are known. The photo scale so determined is more reliable if the given aerial photograph is truly vertical or near vertical and the terrain photographed is flat. The focal length of the camera (f) and the flying height of the aircraft (H) are provided as marginal information on most vertical photographs.
Focal length (f) : Flying height (H) = Photo distance (Dp) : Ground distance (Dg)

Ques: A camera with a 152 mm focal length takes an aerial photograph from a flying height of 2,780 m above sea level; the average elevation of the terrain above sea level is 500 m. What is the scale of the photo?
Given: focal length (f) = 152 mm
Flying height (H) = 2,780 m = 2,780,000 mm [1 m = 1,000 mm]
Terrain elevation (h) = 500 m = 500,000 mm
Therefore, Scale = f/(H − h) = 152/(2,780,000 − 500,000) = 152/2,280,000 = 1/15,000
Scale of the photograph = 1:15,000, i.e. 1 unit on the photo represents 15,000 units on the ground.

Fig.16. Focal Length of the Camera (f) and Flying Height of the Aircraft (H)
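Method 3 can likewise be sketched in Python (an illustrative helper of my own naming, using the worked example's numbers):

```python
# Illustrative helper (my own naming, not part of the practical exercise).
def photo_scale_from_flight(focal_len_mm, flying_height_m, terrain_elev_m=0):
    """Method 3: Sp = f / (H - h); return N such that the photo scale is 1:N."""
    height_above_ground_mm = (flying_height_m - terrain_elev_m) * 1000
    return height_above_ground_mm / focal_len_mm

# Worked example: f = 152 mm, H = 2,780 m above sea level, terrain at 500 m.
denom = photo_scale_from_flight(152, 2780, 500)
print(f"Scale = 1:{denom:,.0f}")  # Scale = 1:15,000
```

The flying height is converted to millimetres so that both sides of f/(H − h) are in the same unit.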

ANNOTATION OF AN AERIAL PHOTOGRAPH


Table .1. ANNOTATION OF AN AERIAL PHOTOGRAPH



INTERPRETATION OF ANNOTATION

S.NO  Annotation                                              Interpretation
1.    IRSD ID: 1296                                           Job number / specification number
2.    Air Survey Company: B                                   Name of the agency
3.    Strip Number: A4                                        Strip number of the aerial photograph
4.    Photograph Number: 4                                    Photo number in the strip
5.    Agency Number: 5832                                     Number given by the agency
6.    Secret                                                  Indicates that the data is secret
7.    Altimeter:                                              The height at which the aircraft is flying
8.    Time:                                                   The time at which the photograph is taken
9.    Atmospheric Pressure and Focal Length: UAg479, 151.80   The atmospheric pressure and the focal length of the camera lens
10.   Fiducial Marks:                                         Crosses on the four corners of the photo
11.   Principal Points:                                       These marks help to locate the precise centre of a photograph, i.e., the principal point, by joining the opposite fiducial marks
Table .2. INTERPRETATION OF ANNOTATION

VISUAL INTERPRETATION OF AERIAL PHOTOGRAPH

SETTLEMENTS: size small; shape square & rectangle; tone/colour dark; texture rough; resolution good; shadow NA; association near roads; pattern dispersed.
ROADWAYS: size narrow; shape regular with smooth curves; tone/colour light; texture smooth; resolution good; shadow NA; association amidst built-up area; pattern linear & Y-shaped.
CROPLAND: size medium to large; shape square & rectangle; tone/colour mixed; texture smooth; resolution good; shadow NA; association near built-up area, roads & ridge; pattern clustered.
FALLOW LAND: size small to medium; shape random; tone/colour light; texture rough; resolution good; shadow NA; association near the cropland; pattern dispersed.
BARREN LAND: size small; shape irregular; tone/colour light; texture rough; resolution good; shadow NA; association near cropland & built-up area; pattern dispersed.
VEGETATION COVER: size medium to large; shape irregular; tone/colour dark; texture rough; resolution good; shadow NA; association near the built-up area & cropland; pattern clustered.
SHRUBS: size small; shape random; tone/colour dark; texture rough; resolution good; shadow NA; association near the ridge; pattern clustered.
STREAM: size small; shape meandering; tone/colour dark; texture smooth; resolution good; shadow NA; association close to the agricultural fields; pattern continuous.
LAKE/POND: size small; shape rectangular; tone/colour dark; texture smooth; resolution good; shadow NA; association close to the agricultural fields; pattern continuous.
RIDGE: size large; shape regular; tone/colour dark; texture rough; resolution good; shadow NA; association amidst vegetation cover & cropland; pattern continuous.

Table .3. VISUAL INTERPRETATION OF AERIAL PHOTOGRAPH

INTERPRETATION KEYS:

S.NO  FEATURES            INTERPRETATION KEYS
1.    SETTLEMENT
2.    ROADWAYS
3.    CROPLAND
4.    FALLOWLAND
5.    BARRENLAND
6.    VEGETATION COVER
7.    SHRUBS
8.    STREAM
9.    LAKE/POND
10.   RIDGE
Table .4. INTERPRETATION KEYS

1.SETTLEMENTS
Most of the settlements are found in the right portion of the aerial photograph. The settlements are dispersed, as they are not concentrated in one area and are widely spread apart. They are surrounded by agricultural land and small shrubs, and are connected to the roads that run alongside the village.

2. ROADWAYS
A road is visible with a light tone in this aerial imagery. It follows a linear and Y-shaped pattern, close to the cropland and settlements. Additionally, a road can be seen running across the ridge, and roads are fairly well spread over the whole region.

3. CROPLAND
Cropland consists of fertile land used for farming and
production of food, fodder and commercial crops. In this aerial
photograph, cropland can be seen almost in every direction
with varying size and shape. A large patch of agriculture land is
in the upper portion of the imagery near the pond for the
purpose of irrigation.

4. FALLOWLAND

Fallow land is arable land left without sowing for one or more vegetation cycles. In this imagery, fallow land can be seen in square shapes adjacent to croplands.
5. WASTELAND
Wasteland is unproductive land which has been left uncultivated for more than five years; hilly terrains, deserts, ravines, etc. are included in this category. It can be brought under cultivation after improving its fertility. This imagery shows wasteland with asymmetrical shapes and sizes.

6. VEGETATION COVER
It refers to all plants and trees collectively. In the current satellite
view, there are a few small and medium-sized patches of plants that
are irregularly shaped, have a rough texture, and lack a clear pattern.
The plants along the river and farmed land are visible in the satellite
imagery larger portion of this category can be seen in the
western and eastern part of the imagery.

7. SHRUBS
It refers to scattered groups of small trees and plants across the landscape, rather than being concentrated in one area. They are visible near the ridge in circular shapes and along the boundaries of the cropland. They appear in small agglomerations within open areas such as grasslands.

8. STREAM
It is a continuous body of surface water flowing within a narrow channel. It runs in the western part of the aerial photograph and is surrounded by vegetation cover, agricultural land and a little forest area.

9. LAKE/POND
It is a freshwater body surrounded on all sides by land. In this imagery, the lake can be seen in the northern part as an important source of irrigation for the surrounding cropland.

10. RIDGE
It is a long, narrow elevation of land. The ridge is seen as a large, regular uplifted landform amidst vegetation cover and cropland, and it is divided into two parts by a road in the north. Another small ridge with a rough texture runs from west to east in the centre of the imagery.

CHAPTER 3: SATELLITE REMOTE SENSING

EMR (Electromagnetic Radiation)
Electromagnetic radiation is a form of energy produced by oscillating electric and magnetic disturbances, or by the movement of electrically charged particles travelling through a vacuum or matter. The electric and magnetic fields are at right angles to each other, and the combined wave moves perpendicular to both oscillating fields. The radiation is released as photons, which are bundles of light energy that travel at the speed of light as quantized harmonic waves. This energy is grouped into categories based on wavelength, forming the electromagnetic spectrum. These electric and magnetic waves travel perpendicular to each other and have certain characteristics, including amplitude, wavelength and frequency.

Fig .17. ELECTROMAGNETIC SPECTRUM



The interaction between electromagnetic radiation and the


Earth’s atmosphere:
 INTERACTION WITH ATMOSPHERE
1. Absorption: Absorption is the process by which radiant
energy is absorbed and converted into other forms of energy.
The absorption of the incident radiant energy may take place in
the atmosphere and on the terrain. An absorption band is a
range of wavelength or frequency in the electromagnetic
spectrum within which radiant energy is absorbed by a
substance.
2. Scattering: Atmospheric scattering is the unpredictable diffusion of radiation by particles in the atmosphere. It occurs when particles, gases or molecules present in the atmosphere interact with the EMR and cause it to be redirected from its original path. How much scattering takes place depends on several factors, including the wavelength of the radiation, the diameter of the particles or gaseous molecules, and the distance the radiation travels through the atmosphere. For Rayleigh scattering, the amount of scattering is inversely proportional to the 4th power of the wavelength of the radiation.
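The inverse fourth-power relationship can be illustrated with a one-line computation (450 nm for blue and 700 nm for red are representative wavelengths, chosen here for illustration):

```python
# Rayleigh scattering intensity varies as 1/wavelength**4, so shorter (blue)
# wavelengths scatter far more strongly than longer (red) ones.
blue_nm, red_nm = 450, 700  # representative wavelengths
ratio = (red_nm / blue_nm) ** 4
print(f"Blue light scatters about {ratio:.1f}x more than red light")  # about 5.9x
```

This strong wavelength dependence is why the cloudless sky appears blue.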
3. Refraction: When EMR encounters substances of different densities, like air and water, refraction takes place. Refraction refers to the bending of light when it passes from one medium to another. It occurs because the media are of different densities, and the speed of EMR differs in each medium.

 INTERACTION WITH EARTH SURFACE

1. Reflection: Reflection is the process whereby radiation 'bounces off' an object, such as the top of a cloud or the terrestrial surface, and it is predictable. Electromagnetic radiation is returned either at the boundary between two media (surface reflection) or from the interior of a medium (volume reflection), whereas transmission is the passage of electromagnetic radiation through a medium.

2. Transmission: Transmission of radiation takes place when


radiation passes through an object without significant
attenuation. For a given thickness or depth of an object, the
ability of a medium to transmit energy can be measured as
transmittance.

MAJOR SATELLITES AND SENSORS

LANDSAT
The Landsat program is the longest-running enterprise for acquisition of satellite imagery of Earth. It is a joint NASA/USGS program. On July 23, 1972 the Earth Resources Technology Satellite was launched; it was eventually renamed Landsat. Landsat 8 was launched on February 11, 2013, and was followed by Landsat 9 on September 27, 2021. The instruments on the Landsat satellites have acquired millions of images. The program was initially called the Earth Resources Technology Satellites Program, a name used from 1966 to 1975; in 1975 the name was changed to Landsat.
Landsat 1 through 5 carried the Landsat Multispectral Scanner (MSS). Landsat 4 and 5 carried both the MSS and Thematic Mapper (TM) instruments. Landsat 7 uses the Enhanced Thematic Mapper Plus (ETM+) scanner. Landsat 8 uses two instruments, the Operational Land Imager (OLI) for optical bands and the Thermal Infrared Sensor (TIRS) for thermal bands.

IRS (Indian Remote Sensing)


The Indian Remote Sensing (IRS) program is a series of Earth
observation satellites. The Indian Remote Sensing (IRS) satellite
system was launched in 1988 by the Indian Space Research
Organization (ISRO). The first satellite, IRS-1A, was launched on
March 17, 1988. The IRS program currently has more than a
dozen satellites in operation. The IRS is the largest civilian
remote-sensing satellite constellation in the world. Following
the successful demonstration flights of Bhaskara-1 and
Bhaskara-2 satellites launched in 1979 and 1981, respectively,
India began to develop the indigenous Indian Remote Sensing
(IRS) satellite program to support the national economy in the
areas of agriculture, water resources, forestry and ecology,
geology, water sheds, marine fisheries and coastal
management.
It has sensors like LISS-I, which had a spatial resolution of 72.5 metres with a swath of 148 km on the ground. LISS-II had two separate imaging sensors, LISS-IIA and LISS-IIB, each with a spatial resolution of 36.25 metres, mounted on the spacecraft in such a way as to provide a composite swath of 146.98 km on the ground.

DOWNLOADING BHUVAN DATA

Fig.18. RESOURCESAT-1 META DATA



DOWNLOADING LANDSAT DATA

Fig.19. LANDSAT METADATA



CHAPTER 4: SATELLITE IMAGE PROCESSING

Two major categories of image classification techniques are unsupervised (calculated by software) and supervised (human-guided) classification.

SUPERVISED CLASSIFICATION
AREA OF INTEREST: GOBIND SAGAR LAKE, HIMACHAL PRADESH

Supervised image classification is a method where the user


provides a set of labelled training samples for each class of
interest. The classifier algorithms use these training samples to
learn the characteristics of each class and then apply this
knowledge to classify the entire image into the specified
categories. This method relies on the user’s expertise and
understanding of the study area and typically results in higher
accuracy compared to unsupervised classification. The main
advantage of Supervised Classification is that it allows for
accurate and efficient classification of large areas. It is also a
flexible technique that can be used for a variety of applications.

There are Three Steps for Performing Supervised Classification:


1. Create training set
2. Develop signature file
3. Classify image
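As an illustration of this workflow (training set, signature file, classification), the following Python sketch uses synthetic reflectance values and a simple minimum-distance-to-means classifier rather than the Maximum Likelihood algorithm used in SCP; all class names and numbers are invented for the example:

```python
import numpy as np

# Synthetic training samples: 8 pixels per class, 3 spectral bands each.
# All reflectance values are invented for illustration.
rng = np.random.default_rng(0)
water = rng.normal(0.05, 0.01, size=(8, 3))
crop = rng.normal(0.40, 0.05, size=(8, 3))
urban = rng.normal(0.25, 0.03, size=(8, 3))

# Steps 1-2: training set -> "signature file" (one mean vector per class).
signatures = {"water": water.mean(axis=0),
              "crop": crop.mean(axis=0),
              "urban": urban.mean(axis=0)}

# Step 3: assign each pixel to the class with the nearest mean signature.
def classify(pixels, signatures):
    names = list(signatures)
    means = np.stack([signatures[n] for n in names])  # (classes, bands)
    dist = np.linalg.norm(pixels[:, None, :] - means[None], axis=2)
    return [names[i] for i in dist.argmin(axis=1)]

print(classify(np.array([[0.05, 0.06, 0.04], [0.41, 0.38, 0.44]]), signatures))
# -> ['water', 'crop']
```

Maximum Likelihood works on the same training data but also models each class's covariance, which is why it generally outperforms this simple distance rule.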

METHODOLOGY:
• Open QGIS software.
• From Layer, click on Add Raster Layer.
• Now select the files from the folder; we have to select 7 consecutive bands from the folder.
• Now add those 7 raster files.
• Now go to SCP in the toolbar and click on Band set. In the Band set, select all 7 layers by clicking on the "+" sign and select the wavelength as Landsat 8 and the wavelength unit given below.
• Now go to Preprocessing in SCP, click on Clip raster bands, click "+" and clip the area from the raster image taken. After that, run the process and save the files for future reference.
• Now add the virtual Band set layer from the RGB box (type 5-4-3 in RGB).
• Add the SCP Dock in the layer panel, select Create a new Training Input from the SCP Dock and save the file.
• Now make macro classes and classes using the MC and C options.
• Click on Create a ROI Polygon.
• Make 4-5 macro classes and sub-classes and save all class signatures.
• Now go to SCP, then Band Processing, then Classification.
• If we want to use macro classes, select MC ID and, in algorithm, select Maximum Likelihood.
• Then select Save signature raster, click Save classifier and then click RUN.
• Now we can see the macro class classification.
• FOR SPECTRAL SIGNATURE, select all Macro IDs.
• Click on the highlighted signatures to add them to the spectral signature plot in the SCP Dock.
• Create a map layout for the layers generated and then export the map as placed on the page that follows.

Fig.20. SUPERVISED CLASSIFICATION



Fig.21. SPECTRAL SIGNATURE



UNSUPERVISED CLASSIFICATION
AREA OF INTEREST: GOBIND SAGAR LAKE, HIMACHAL PRADESH
Unsupervised classification is a technique in remote sensing in which the classification algorithm automatically groups pixels with similar spectral properties into clusters. The user does not need to provide any training data; instead, the algorithm analyses the spectral properties of the image and identifies clusters of spectrally similar pixels.

Unsupervised classification is faster and easier than supervised classification because no training data are required. However, it is generally less accurate, because the algorithm has no prior information about the land-cover types.
There are two steps for performing unsupervised classification:

1. Generate clusters
2. Assign classes
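The two steps above can be sketched as a toy K-means in Python. The 3-band pixel spectra are synthetic, invented for the demo; in QGIS the equivalent iteration is performed by the SAGA tool on the stacked Landsat bands.

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Group n-band pixel vectors into k spectral clusters."""
    X = np.asarray(X, dtype=float)
    rng = np.random.default_rng(seed)
    # initialise cluster centres with k distinct randomly chosen pixels
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # assign every pixel to its nearest centre (Euclidean distance)
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each centre to the mean of its assigned pixels
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

# Two well-separated synthetic spectral groups ("dark" and "bright") in 3 bands.
rng = np.random.default_rng(42)
dark = rng.normal(10, 1, size=(20, 3))
bright = rng.normal(80, 1, size=(20, 3))
labels, centers = kmeans(np.vstack([dark, bright]), k=2)
print(labels)
```

Note that the resulting clusters carry arbitrary IDs (0, 1, ...); attaching meaningful land-cover names to them is the manual "assign classes" step.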

METHODOLOGY:
• Open QGIS and, from the Layer menu, click Add Raster Layer.
• Select the 7 consecutive clipped bands from the folder and add them.
• Go to SCP in the toolbar and click Band set. Add all 7 layers with the "+" sign, set the wavelength to Landsat 8 and choose the wavelength unit given below.
• Click Processing in the toolbar, open the Toolbox, expand SAGA and, under Imagery - Classification, select K-means clustering for grids.
• In the dialog box that appears, select all the clips in the Grids option and enter 5 in the Clusters option.
• Double-click the Clusters layer in the Layers panel and go to Properties.
• Select Singleband pseudocolor as the render type.
• Set Interpolation to Discrete and change the mode to Equal Interval.
• Create a map layout for the layers generated and then export the map as placed on the page that follows.

Fig.22. UNSUPERVISED CLASSIFICATION



CHAPTER : 5
APPLICATIONS OF REMOTE SENSING

• Land Use Land Cover: Remote sensing is extensively used for the assessment of land use and land cover. With satellite imagery viewed in different band combinations, settlements can easily be differentiated from other land covers. By comparing multi-temporal data, the growth of settlements, the reduction of forest cover and the rise in human intervention in nature can be readily assessed.

• Vegetation monitoring: Traditional ground surveys of forests are not always feasible or viable. In these cases, remote sensing helps greatly to map, identify and delineate various forest types. Using remote sensing, a large forest area can be studied and species can be identified in specific detail, including tree type, height and density. With high-resolution data and satellite imagery, texture and leaf density can also be assessed to detect infestations or stresses on the trees. Clear-cut mapping of forested areas allows the extent of deforestation and its implications to be understood better. Remote sensing data also supports forest-fire assessment: the colour of burnt leaves leaves a distinct trace in the satellite data compared with healthy green leaves, so the affected area can be delineated in detail. Likewise, settlement of areas cleared of forest can be traced clearly in subsequent imagery.

• Urban sprawl: Remote sensing and geographic information systems (GIS) are common tools for analysing urban sprawl at different spatial scales. Remote sensing allows the change and distribution of urban areas to be delineated effectively, while GIS enables the resulting spatial information to be analysed. In a typical study, urban sprawl is monitored and evaluated by extracting built-up regions from classified Landsat images of several distinct periods.

NORMALIZED DIFFERENCE VEGETATION INDEX


NDVI, or Normalized Difference Vegetation Index, is a remote sensing method that uses the reflectance of light in the visible and near-infrared (NIR) wavelengths to determine the amount and health of vegetation in an area. NDVI is widely used in agriculture, forestry and ecology to monitor the growth and health of vegetation and to identify areas of stress or damage. NDVI values can also be used to map and classify vegetation types, and to detect changes in vegetation cover over time. NDVI quantifies vegetation by measuring the difference between near-infrared light (which vegetation strongly reflects) and red light (which vegetation absorbs). The formula to calculate NDVI is:

NDVI = (NIR − Red) / (NIR + Red)

• NDVI always ranges from −1 to +1.
• Values near −1 most likely indicate water.
• Values near +1 indicate dense green leaves.
• Values close to 0 mean there are likely no green leaves; the area could even be urbanised.

Healthy vegetation (chlorophyll) reflects more near-infrared (NIR) and green light than other wavelengths, but it absorbs more red and blue light.
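As a quick check of the NDVI formula, the index can be computed directly with NumPy. The red and NIR reflectance grids below are made-up illustrative values, not real Landsat data.

```python
import numpy as np

def ndvi(red, nir):
    """Compute NDVI = (NIR - Red) / (NIR + Red) per pixel."""
    red = np.asarray(red, dtype=float)
    nir = np.asarray(nir, dtype=float)
    denom = nir + red
    # guard against division by zero where both bands are 0 (e.g. nodata)
    return np.where(denom == 0, 0.0,
                    (nir - red) / np.where(denom == 0, 1.0, denom))

red = np.array([[0.60, 0.05],
                [0.30, 0.10]])  # high red, low NIR -> water / bare surfaces
nir = np.array([[0.30, 0.55],
                [0.30, 0.50]])  # high NIR, low red -> healthy vegetation
print(ndvi(red, nir))
```

This is exactly the per-pixel expression the QGIS raster calculator evaluates when band 4 (red) and band 5 (NIR) of Landsat 8 are plugged into the formula.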

METHODOLOGY:
• Open QGIS and, from the Layer menu, click Add Raster Layer.
• Select the 4th (red) and 5th (near-infrared) clipped bands from the folder and add them.
• Go to SCP in the toolbar and click Band set. Add both clips with the "+" sign, set the wavelength to Landsat 8 and choose the wavelength unit given below.
• Go to Raster in the toolbar and click Raster Calculator.
• In the dialog box that appears, select a location to save the output file.
• Enter the NDVI formula, (NIR − Red) / (NIR + Red), using band 5 as NIR and band 4 as red.
• Double-click the NDVI layer in the Layers panel and go to Properties.
• Select Singleband pseudocolor as the render type.
• Set Interpolation to Discrete and change the mode to Equal Interval.
• Create a map layout for the layers generated and then export the map as placed on the page that follows.

Fig.23. NORMALISED DIFFERENCE VEGETATION INDEX
