Remote Sensing Notes Class XII – ENG

Remote Sensing Chapter 1

Class XII — Geospatial Technology

Chapter 1: Remote Sensing (RS)

Introduction to Remote Sensing
Core Definition:

Remote Sensing (RS) is the observation of an object, surface, or phenomenon through the use of recording devices that are not in direct physical contact with the object. RS deals with inventory, monitoring, and assessment of natural resources through analysis of data obtained from remote sensing platforms such as aircraft, spacecraft, satellites, or ships equipped with cameras, lasers, radar, sonar, and other sensors.

Remote sensing is the process of acquiring information about Earth features without being in direct contact with them. It enables continuous acquisition of up-to-date information about Earth's surface and atmosphere using electromagnetic radiation, including wavelengths well beyond what human vision alone can detect.

Why Remote Sensing Matters:
  • Assessing and observing vegetation types and health
  • Conducting soil surveys and understanding soil properties
  • Carrying out mineral exploration and geological studies
  • Creating and updating maps and thematic layers
  • Planning and monitoring water resources
  • Urban and regional planning
  • Assessing crop yields and agricultural management
  • Assessing and managing natural disasters
  • Studying spatial relationships between features and delineating regional trends
  • Multi-disciplinary resource monitoring and inventory
Key Advantage:

Remote sensing data offers multidisciplinary applications—the same RS data can be used by researchers in different disciplines such as geology, forestry, land use, agriculture, hydrology, and environmental science. It offers wide regional coverage combined with good spectral resolution, making it economical and efficient.

Electromagnetic Radiation (EMR) Fundamentals
What is Electromagnetic Radiation?

Electromagnetic Radiation (EMR) is energy that travels as waves through space at the speed of light. All objects with temperatures higher than absolute zero (−273°C or 0 K) emit EMR continuously. The intensity of emitted radiation depends on the composition and temperature of the body.

Wien's Displacement Law:

This fundamental law relates the temperature of an object to the wavelength of peak radiation it emits:

λmax = b / T

Where:

  • λmax = wavelength (in micrometers) at which highest radiation occurs
  • b = Wien's constant = 0.29 cm·K (or 2898 μm·K)
  • T = absolute temperature (in Kelvin)
Application:

Using this law, you can estimate the temperature of objects by measuring the wavelength of peak radiation.

Examples:
  • Sun: λmax = 0.48 μm (temperature ≈ 6,000 K) — peaks in the visible (blue-green) region
  • Fire: λmax = 0.58 μm (temperature ≈ 5,000 K)
  • Incandescent lamp: λmax = 0.72 μm (temperature ≈ 4,000 K)
  • Earth (ambient): λmax = 9.7 μm (temperature ≈ 300 K) — peaks in thermal infrared
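The examples above follow directly from the formula; a minimal Python sketch (using b = 2898 μm·K, function names ours) reproduces them:

```python
WIEN_B_UM_K = 2898.0  # Wien's constant in micrometer-kelvin

def peak_wavelength_um(temperature_k):
    """Wavelength of peak blackbody emission, in micrometers."""
    return WIEN_B_UM_K / temperature_k

def temperature_from_peak(peak_um):
    """Inverse use: estimate temperature (K) from the observed peak wavelength."""
    return WIEN_B_UM_K / peak_um

print(round(peak_wavelength_um(6000), 2))  # Sun: 0.48 um
print(round(peak_wavelength_um(300), 1))   # Earth: 9.7 um
print(round(temperature_from_peak(9.66)))  # back to ~300 K
```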
Blackbody Radiation

A blackbody is an ideal theoretical object that absorbs 100% of incident electromagnetic radiation at all wavelengths and emits radiation at maximum efficiency. The Sun is a close approximation to a blackbody radiator. The amount and wavelength of radiation emitted by a blackbody depends solely on its temperature.

Key Principle: Hotter objects emit more total energy and peak at shorter wavelengths (blue/visible light), while cooler objects emit less energy and peak at longer wavelengths (infrared).

The Electromagnetic Spectrum

The electromagnetic spectrum encompasses all types of electromagnetic radiation arranged by wavelength or frequency. Remote sensing utilizes various regions of this spectrum.

| Region | Wavelength Range | Frequency Range | Primary Applications in RS |
|---|---|---|---|
| Gamma Rays | < 0.03 nm | > 10^19 Hz | Radioactive material detection |
| X-Rays | 0.03 nm - 3 nm | 10^17 - 10^19 Hz | Material analysis, astronomy |
| Ultraviolet | 3 nm - 0.4 μm | 7.5×10^14 - 10^17 Hz | Ozone monitoring, mineral identification |
| Visible Light | 0.4 - 0.7 μm | 4.3×10^14 - 7.5×10^14 Hz | Photography, multispectral imaging |
| • Violet | 0.40 - 0.45 μm | - | Coastal/aerosol monitoring |
| • Blue | 0.45 - 0.50 μm | - | Water penetration, bathymetry |
| • Green | 0.50 - 0.58 μm | - | Vegetation vigor, water turbidity |
| • Yellow | 0.58 - 0.60 μm | - | Sediment detection |
| • Orange | 0.60 - 0.65 μm | - | Soil/vegetation discrimination |
| • Red | 0.65 - 0.70 μm | - | Chlorophyll absorption, vegetation health |
| Near Infrared (NIR) | 0.7 - 1.3 μm | 2.3×10^14 - 4.3×10^14 Hz | Vegetation analysis, biomass estimation |
| Short-wave Infrared (SWIR) | 1.3 - 3.0 μm | 10^14 - 2.3×10^14 Hz | Soil moisture, mineral mapping |
| Mid Infrared (MIR) | 3.0 - 8.0 μm | 3.75×10^13 - 10^14 Hz | Fire detection, thermal analysis |
| Thermal Infrared (TIR) | 8.0 - 14.0 μm | 2.1×10^13 - 3.75×10^13 Hz | Surface temperature, heat mapping |
| Microwave | 1 mm - 1 m | 3×10^8 - 3×10^11 Hz | All-weather imaging, RADAR, soil moisture |
| Radio Waves | > 1 m | < 3×10^8 Hz | Communication, ionosphere studies |
Energy Interactions with Earth's Surface

When electromagnetic energy reaches Earth's surface or atmosphere, it can undergo several interactions:

Types of Interactions:
  1. Absorption: Energy is absorbed by the object and converted to heat
  2. Transmission: Energy passes through the object
  3. Reflection: Energy bounces off the surface (most important for RS)
    • Specular reflection: Mirror-like reflection from smooth surfaces (water, glass)
    • Diffuse reflection: Scattered reflection from rough surfaces (soil, vegetation)
  4. Scattering: Energy is redirected by particles in the atmosphere
    • Rayleigh scattering: By particles smaller than wavelength (causes blue sky)
    • Mie scattering: By particles similar to wavelength (fog, haze)
    • Non-selective scattering: By particles larger than wavelength (clouds)

Energy Balance Equation:

E_I = E_R + E_A + E_T

Where:

  • E_I = Incident energy
  • E_R = Reflected energy (measured by sensors)
  • E_A = Absorbed energy
  • E_T = Transmitted energy
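Because the three components must sum to the incident energy, any one of them can be recovered from the others. A small illustrative sketch (the function name and energy figures are hypothetical):

```python
def reflected_energy(incident, absorbed, transmitted):
    """Rearranged balance: E_R = E_I - E_A - E_T."""
    return incident - absorbed - transmitted

# hypothetical figures: 100 units arrive, 30 are absorbed, 20 transmitted
print(reflected_energy(100.0, 30.0, 20.0))  # 50.0
```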
Spectral Reflectance Signatures

Different Earth surface features reflect electromagnetic radiation differently across wavelengths, creating unique spectral signatures that help identify and classify objects.

1. Vegetation Spectral Signature
  • Visible (0.4-0.7 μm): Low reflectance due to chlorophyll absorption (especially red and blue light for photosynthesis). Healthy vegetation appears green because green light is reflected.
  • Near Infrared (0.7-1.3 μm): Very high reflectance (40-50%) due to leaf cellular structure (spongy mesophyll). This is the most distinctive feature of healthy vegetation.
  • SWIR (1.3-3.0 μm): Moderate to low reflectance, influenced by water content in leaves. More water = lower reflectance.

Key Point: The NIR/Red ratio is used extensively to assess vegetation health (NDVI - Normalized Difference Vegetation Index).

2. Water Spectral Signature
  • Visible Blue (0.45-0.50 μm): Maximum penetration and reflection (water appears blue)
  • Visible Green-Red (0.50-0.70 μm): Decreasing reflectance
  • NIR and beyond (>0.7 μm): Almost complete absorption (near-zero reflectance). Water appears very dark in infrared images.
  • Turbidity effect: Sediment-laden water shows higher reflectance in visible bands
  • Depth effect: Shallow water shows mixed signature of water + bottom substrate
3. Soil Spectral Signature
  • General pattern: Gradual increase in reflectance from visible to SWIR
  • Moisture content: Wet soil has lower reflectance than dry soil across all wavelengths
  • Organic matter: Higher organic content = lower reflectance (darker soil)
  • Texture: Fine-textured soils reflect more than coarse-textured soils
  • Iron oxide content: High iron = reddish color, characteristic absorption in blue region
  • Roughness: Rough soil surface = lower reflectance due to shadowing
4. Rock/Mineral Spectral Signature
  • Highly variable depending on mineral composition
  • Iron minerals: Absorption features in visible and NIR (0.5-1.0 μm)
  • Carbonates: Strong absorption around 2.3 μm
  • Clay minerals: Absorption features at 1.4, 1.9, and 2.2 μm (water and hydroxyl bonds)
  • Overall: Generally higher reflectance than vegetation, lower than bright soil
| Feature Type | Visible (0.4-0.7 μm) | NIR (0.7-1.3 μm) | SWIR (1.3-3.0 μm) | Key Characteristic |
|---|---|---|---|---|
| Healthy Vegetation | Low (5-10%) | Very High (40-50%) | Moderate (10-20%) | Strong NIR peak |
| Stressed Vegetation | Moderate (10-20%) | Moderate (20-30%) | Low (5-15%) | Reduced NIR reflectance |
| Clear Water | Low (2-5%) | Near Zero (<1%) | Zero | NIR absorption |
| Turbid Water | Moderate (10-20%) | Very Low (1-3%) | Near Zero | Increased visible reflectance |
| Dry Soil | Moderate (15-25%) | High (25-35%) | High (25-40%) | Gradual increase |
| Wet Soil | Low (5-15%) | Moderate (10-20%) | Low (10-20%) | Overall low reflectance |
| Urban/Concrete | High (20-35%) | High (30-45%) | High (30-50%) | Uniform high reflectance |
| Snow/Ice | Very High (80-95%) | High (60-80%) | Low (10-30%) | Highest visible reflectance |
Resolution in Remote Sensing

Resolution refers to the ability of a sensor to distinguish and record information. There are four types of resolution that determine the quality and applicability of remote sensing data:

1. Spatial Resolution

Definition: The smallest object or area on the ground that can be distinguished as separate from its surroundings. It is determined by the size of a single pixel in the image.

Ground Sample Distance (GSD): The physical size on the ground represented by one pixel.

| Resolution Category | Pixel Size | Example Satellites | Typical Applications |
|---|---|---|---|
| Very High Resolution | < 1 m | WorldView-3 (0.31 m), GeoEye-1 (0.41 m), Pleiades (0.5 m) | Urban planning, detailed infrastructure mapping, precision agriculture, military intelligence |
| High Resolution | 1 - 10 m | SPOT-6/7 (1.5 m), Sentinel-2 (10 m), Cartosat-1 (2.5 m) | Land use/land cover mapping, agricultural monitoring, disaster assessment |
| Medium Resolution | 10 - 100 m | Landsat 8/9 (30 m), IRS-1C/1D LISS-III (23.5 m), Sentinel-2 (20 m) | Regional resource monitoring, forest inventory, watershed management |
| Low Resolution | > 100 m | MODIS (250-1000 m), AVHRR (1.1 km), INSAT (1 km) | Weather forecasting, climate studies, global vegetation monitoring, ocean studies |

Trade-off: Higher spatial resolution provides more detail but covers smaller areas and generates larger data volumes. Lower resolution covers larger areas but with less detail.

2. Spectral Resolution

Definition: The ability of a sensor to distinguish between different wavelengths of electromagnetic radiation. It refers to the number and width of spectral bands a sensor can record.

| Type | Number of Bands | Band Width | Examples | Applications |
|---|---|---|---|---|
| Panchromatic | 1 band | Very broad (0.4-0.9 μm) | IRS-1C/1D PAN (5.8 m), Cartosat-1 (2.5 m) | High-resolution mapping, stereo imaging, cartography |
| Multispectral | 3-10 bands | Broad (50-200 nm) | Landsat 8 (11 bands), Sentinel-2 (13 bands), SPOT (4 bands) | General land cover classification, vegetation monitoring, water quality |
| Superspectral | 10-50 bands | Narrow (10-50 nm) | MODIS (36 bands), ASTER (14 bands) | Detailed vegetation analysis, mineral identification, atmospheric studies |
| Hyperspectral | >50 bands (often 100-250) | Very narrow (5-10 nm) | Hyperion (220 bands), AVIRIS (224 bands), PRISMA (250 bands) | Precision agriculture, detailed mineral mapping, species identification, chemical detection |
| Ultraspectral | >250 bands | Extremely narrow (<5 nm) | Advanced research sensors | Specialized research, atmospheric chemistry, detailed material analysis |

Key Principle: Narrower bands (higher spectral resolution) provide more detailed spectral information, enabling better discrimination between similar materials. However, they require more data storage and processing capacity.

3. Radiometric Resolution

Definition: The ability of a sensor to detect and record subtle differences in energy intensity (brightness). It is expressed as the number of digital levels (bits) used to represent the brightness values.

| Bit Depth | Number of Levels | Brightness Range | Examples | Characteristics |
|---|---|---|---|---|
| 8-bit | 2^8 = 256 | 0-255 | Landsat 7 ETM+, basic cameras | Limited sensitivity, smaller file size, adequate for basic applications |
| 10-bit | 2^10 = 1,024 | 0-1023 | SPOT-5, IRS-P6 LISS-IV | Better sensitivity, improved contrast detection |
| 11-bit | 2^11 = 2,048 | 0-2047 | WorldView-3 | Good sensitivity for most applications |
| 12-bit | 2^12 = 4,096 | 0-4095 | Landsat 8 OLI, Sentinel-2 MSI | High sensitivity, excellent contrast detection |
| 14-bit | 2^14 = 16,384 | 0-16383 | Advanced multispectral sensors | Very high sensitivity, detailed tonal variations |
| 16-bit | 2^16 = 65,536 | 0-65535 | Hyperspectral sensors, scientific instruments | Extremely high sensitivity, maximum tonal detail |

Practical Impact: Higher radiometric resolution (more bits) means the sensor can detect smaller differences in brightness, crucial for identifying subtle changes in vegetation health, water quality, or atmospheric conditions.
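The level counts in the table are simply powers of two; a quick sketch (the function name is ours):

```python
def brightness_levels(bit_depth):
    """A sensor with n bits records 2**n discrete brightness levels."""
    return 2 ** bit_depth

for bits in (8, 10, 11, 12, 14, 16):
    levels = brightness_levels(bits)
    print(f"{bits}-bit: {levels} levels (0-{levels - 1})")
```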

4. Temporal Resolution

Definition: The frequency at which a sensor revisits the same area on Earth's surface. Also called revisit time or repeat cycle.

| Temporal Resolution | Revisit Time | Examples | Typical Applications |
|---|---|---|---|
| Very High | Minutes to hours | Geostationary satellites (INSAT-3D, GOES), UAVs/drones | Weather monitoring, disaster response (floods, fires), real-time surveillance |
| High | Daily | MODIS (Terra/Aqua), NOAA AVHRR, Sentinel-3 | Daily weather, ocean monitoring, rapid change detection, crop monitoring |
| Medium | 2-5 days | Sentinel-2 (5 days with both satellites), SPOT-6/7 (1-4 days off-nadir) | Agricultural monitoring, vegetation phenology, seasonal change detection |
| Moderate | 1-2 weeks | IRS series (5-24 days), WorldView (1-4.5 days off-nadir) | Land use change, forest monitoring, urban growth studies |
| Low | > 2 weeks | Landsat 8/9 (16 days), Cartosat series (5-26 days) | Long-term change detection, geological mapping, multi-temporal analysis |

Important Considerations:

  • Trade-off with spatial resolution: Sensors with higher spatial resolution typically have longer revisit times
  • Cloud cover: Even with frequent revisits, cloud cover can limit usable data acquisition
  • Off-nadir viewing: Some satellites can point their sensors at angles to improve revisit time, but this affects geometric accuracy
  • Constellation approach: Multiple satellites in coordination can improve temporal resolution (e.g., Sentinel-2A and 2B together provide 5-day revisit)
Stages in the Remote Sensing Process

Remote sensing involves a systematic sequence of steps from energy source to final application:

  1. Energy Source/Illumination: The process begins with an energy source—either the Sun (passive RS) or an artificial source like radar (active RS) that emits electromagnetic radiation.
  2. Atmospheric Interaction (Path 1): As energy travels from the source to Earth's surface, it interacts with the atmosphere through absorption and scattering, which can modify the signal.
  3. Interaction with Target: Energy reaches Earth's surface and interacts with various features (vegetation, water, soil, etc.) through reflection, absorption, and transmission. Each feature has a unique spectral signature.
  4. Recording of Energy by Sensor (Path 2): The reflected/emitted energy travels back through the atmosphere (more interaction) and is detected and recorded by sensors on satellites or aircraft.
  5. Transmission, Reception, and Processing: The recorded data is transmitted to ground receiving stations, where it undergoes initial processing (radiometric and geometric corrections).
  6. Interpretation and Analysis: Processed data is analyzed using visual interpretation or digital image processing techniques to extract meaningful information.
  7. Application: The interpreted information is applied to solve real-world problems in various fields (agriculture, forestry, urban planning, disaster management, etc.).

Two Types of Remote Sensing Systems:

  • Passive Remote Sensing: Uses natural energy source (Sun). Examples: optical satellites (Landsat, Sentinel-2), aerial photography. Limited to daytime and clear weather conditions.
  • Active Remote Sensing: Provides its own energy source. Examples: RADAR, LiDAR, SONAR. Can operate day/night and through clouds.
Visual Image Interpretation

Visual interpretation is the act of examining remote sensing images to identify objects and assess their significance through systematic analysis.

Elements of Visual Image Interpretation (STOP CHATS)

These are the fundamental characteristics used to identify features in imagery:

  1. Size: The physical dimensions of an object provide clues to its identity. Compare object size with known features. Large buildings, small cars, tree crown diameter, field size, etc.
  2. Tone/Color: The brightness or color of an object in an image. Different features have different tones in different spectral bands (vegetation appears dark in visible, bright in NIR).
  3. Shape: The general form or outline of an object. Regular geometric shapes often indicate human-made features (rectangular buildings, circular water tanks), while irregular shapes suggest natural features.
  4. Pattern: The spatial arrangement of objects. Examples: orchards have regular spacing, natural forests are irregular, residential areas have specific road patterns.
  5. Texture: The frequency of tonal change in an image—the arrangement of fine detail. Smooth texture (water, paved surfaces) vs. rough texture (forests, rocky terrain).
  6. Shadow: Can help determine height of objects and provide profile information. Important for identifying trees, buildings, and topographic features. Can also obscure features.
  7. Association: The relationship between objects. Features are often found together (schools near residential areas, certain vegetation near water sources).
  8. Site/Location: The topographic or geographic position. Helps identify features based on where they're typically found (docks near water, ski resorts on mountains).

Interpretation Approach: Use multiple elements together for accurate identification. A single element may be ambiguous, but combined evidence improves accuracy.

Aerial Photography

Aerial photography is one of the oldest forms of remote sensing, using cameras mounted on aircraft to capture images of Earth's surface.

Photo Scale and Scale Formula

Photo Scale: The ratio between a distance on the photograph and the corresponding distance on the ground.

Photo Scale Formula:

Scale = f / H = Photo Distance / Ground Distance

Where:

  • f = focal length of the camera (mm or cm)
  • H = flying height above ground level (m or km)

Note: convert f and H to the same unit before dividing.

Alternative form: Scale = 1 : (H/f)

Example Calculation:

Given: Focal length (f) = 150 mm = 0.15 m; Flying height (H) = 3,000 m above ground

Scale: 1 : (H/f) = 1 : (3000/0.15) = 1 : 20,000

Meaning: 1 cm on the photo represents 20,000 cm (200 m) on the ground.
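The worked example can be reproduced in a couple of lines (the helper name is ours, not standard):

```python
def photo_scale_denominator(focal_length_m, flying_height_m):
    """Scale = 1 : (H / f); both inputs must be in the same unit (meters here)."""
    return flying_height_m / focal_length_m

# worked example from the text: f = 150 mm = 0.15 m, H = 3,000 m
denominator = photo_scale_denominator(0.15, 3000.0)
print(f"Scale = 1 : {denominator:.0f}")  # Scale = 1 : 20000
```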

Types of Aerial Photographs
| Type | Camera Angle | Characteristics | Advantages | Disadvantages | Applications |
|---|---|---|---|---|---|
| Vertical (Nadir) | Camera pointed straight down (perpendicular to ground) | Uniform scale throughout (for flat terrain), minimal distortion, overhead view | Accurate measurements, easy to create mosaics, consistent scale | Limited perspective, some features hidden | Mapping, surveying, resource inventory, topographic map creation |
| Oblique - Low | Tilted less than 30° from vertical | Horizon not visible, moderate perspective view, variable scale | Better visualization of features, some perspective | Scale variations, difficult for accurate measurements | Reconnaissance, visualization, preliminary surveys |
| Oblique - High | Tilted more than 30° from vertical | Horizon visible, strong perspective view, large scale variations | Excellent visualization, panoramic view, dramatic perspective | Significant scale distortion, not suitable for measurement, difficult to interpret | Presentations, public communication, scenic documentation, general reconnaissance |
Aerial Photography Overlap

Overlap refers to the common area covered by successive photographs. It's essential for stereoscopic viewing and complete coverage.

  • Forward Overlap (End Lap):
    • Overlap between consecutive photos along the flight line
    • Standard: 60-65% overlap
    • Purpose: Enables stereoscopic (3D) viewing for height/depth perception
    • Minimum 50% needed for stereo pairs
  • Side Overlap (Side Lap):
    • Overlap between adjacent flight strips
    • Standard: 20-30% overlap
    • Purpose: Ensures complete coverage without gaps, compensates for aircraft drift
    • Higher overlap (30-40%) used in hilly/mountainous terrain

Why Overlap Matters:

  • Stereoscopic Vision: 60% forward overlap creates stereo pairs that allow 3D viewing when viewed with proper instruments (stereoscopes)
  • Height Measurement: Parallax differences in overlapping areas enable calculation of object heights and terrain elevation
  • Complete Coverage: Side overlap ensures no area is missed between flight lines
  • Quality Control: Overlap areas provide reference points for mosaicking and geometric correction
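Flight planning turns these percentages into exposure spacing along the strip and spacing between strips. A minimal sketch using the standard coverage-times-(100 minus overlap) relations, assuming a hypothetical photo footprint 2,000 m on a side:

```python
def air_base(ground_coverage_m, forward_overlap_pct):
    """Distance the aircraft flies between exposures so that consecutive
    photos share the given forward overlap (percent)."""
    return ground_coverage_m * (100 - forward_overlap_pct) / 100

def strip_spacing(ground_coverage_m, side_overlap_pct):
    """Spacing between adjacent flight lines for the given side overlap (percent)."""
    return ground_coverage_m * (100 - side_overlap_pct) / 100

# hypothetical footprint of 2,000 m on a side, standard 60% / 25% overlaps
print(air_base(2000.0, 60))       # 800.0 m between exposures
print(strip_spacing(2000.0, 25))  # 1500.0 m between flight lines
```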
Types of Photographic Film
| Film Type | Spectral Range | Characteristics | Output | Applications |
|---|---|---|---|---|
| Black & White (Panchromatic) | 0.4 - 0.7 μm (entire visible spectrum) | Records all visible light as grayscale tones, excellent spatial detail, high contrast | Grayscale image (black, white, and shades of gray) | Topographic mapping, general interpretation, high-resolution detail work, historical aerial photography |
| True Color (Natural Color) | Blue (0.4-0.5 μm), Green (0.5-0.6 μm), Red (0.6-0.7 μm) | Records colors as the human eye sees them, intuitive interpretation, familiar appearance | Color image matching human vision | Urban planning, land use mapping, public presentations, general resource assessment, environmental monitoring |
| Color Infrared (CIR) / False Color | Green (0.5-0.6 μm), Red (0.6-0.7 μm), NIR (0.7-0.9 μm) | NIR recorded as red, red as green, green as blue; healthy vegetation appears bright red/magenta | False color image: vegetation = red, water = black/dark blue, soil = blue/brown | Vegetation analysis, forestry, agriculture, wetland mapping, plant stress detection, camouflage detection |

CIR Film Advantage: Since healthy vegetation has very high NIR reflectance, it appears bright red in CIR imagery, making it easy to distinguish vegetation from other features and to assess vegetation health. Stressed vegetation appears less red or pinkish.

Factors Affecting Aerial Photography Quality
  1. Flying Height:
    • Higher altitude = smaller scale, larger area coverage, less detail
    • Lower altitude = larger scale, smaller area coverage, more detail
    • Affects atmospheric interference and image clarity
  2. Atmospheric Conditions:
    • Haze, smoke, and pollution reduce image clarity
    • Cloud cover blocks ground features
    • Best photography: clear, stable atmospheric conditions
    • Optimal time: 10 AM - 2 PM (sun angle reduces shadows)
  3. Sun Angle and Illumination:
    • Low sun angle creates long shadows (good for relief but obscures features)
    • High sun angle minimizes shadows (better for mapping)
    • Season affects sun angle and vegetation condition
  4. Ground Conditions:
    • Vegetation phenology (leaf-on vs. leaf-off conditions)
    • Soil moisture (affects reflectance and contrast)
    • Snow cover (can be advantageous or problematic depending on purpose)
  5. Aircraft Stability:
    • Pitch, roll, and yaw cause geometric distortions
    • Wind and turbulence affect image quality
    • Modern systems use GPS/INS for correction
  6. Camera and Lens Quality:
    • Focal length determines scale and field of view
    • Lens distortion must be corrected
    • Film resolution or digital sensor quality affects detail
Digital Image Processing

Digital image processing involves computer-based manipulation and analysis of remotely sensed imagery to extract information. Digital images consist of discrete picture elements (pixels), each with a numerical value representing brightness.

Major Categories of Digital Image Processing
1. Image Restoration and Rectification

Purpose: Correct distortions and degradation in the original image data to create a more faithful representation of the scene.

Types of Corrections:

  • Radiometric Correction:
    • Corrects sensor irregularities and atmospheric effects
    • Removes sensor noise and detector calibration errors
    • Atmospheric correction to remove haze and scattering effects
    • Converts digital numbers (DN) to physical units (radiance/reflectance)
  • Geometric Correction:
    • Removes geometric distortions caused by sensor viewing angle, Earth's curvature, terrain relief, and platform motion
    • Geo-referencing: assigns real-world coordinates to image pixels
    • Orthorectification: removes terrain-induced distortions using DEM
    • Image registration: aligns multiple images to common coordinate system
2. Image Enhancement

Purpose: Improve visual appearance of images to facilitate human interpretation or subsequent computer processing.

Enhancement Techniques:

  • Contrast Enhancement:
    • Linear stretch: expands narrow range of brightness values to full dynamic range
    • Histogram equalization: redistributes pixel values for better visual separation
    • Useful when original image has low contrast
  • Spatial Filtering:
    • Low-pass filters: smooth image, reduce noise, emphasize large features
    • High-pass filters: enhance edges and fine details, emphasize boundaries
    • Edge detection: identify boundaries between features
  • Multi-image Operations:
    • Band ratioing: dividing one band by another (e.g., NIR/Red for NDVI)
    • Principal Component Analysis (PCA): reduces data redundancy
    • False color composites: combines bands in non-natural ways for analysis
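A min-max linear stretch like the one described can be sketched in a few lines of NumPy (the toy band values are hypothetical):

```python
import numpy as np

def linear_stretch(band, out_min=0, out_max=255):
    """Min-max linear contrast stretch: map the band's darkest pixel to
    out_min and its brightest to out_max, scaling everything in between."""
    band = np.asarray(band, dtype=float)
    lo, hi = band.min(), band.max()
    scaled = (band - lo) / (hi - lo) * (out_max - out_min) + out_min
    return np.round(scaled).astype(np.uint8)

# toy low-contrast band: values squeezed into the narrow 100-120 range
band = np.array([[100, 105], [110, 120]])
print(linear_stretch(band))  # values now span the full 0-255 range
```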
3. Image Classification

Purpose: Automatically categorize pixels into information classes or themes (land cover types).

Classification Approaches:

  • Unsupervised Classification:
    • Computer automatically groups pixels with similar spectral characteristics
    • No prior knowledge required
    • Algorithms: K-means, ISODATA
    • User assigns meaning to classes after classification
    • Fast, objective, but may not match desired categories
  • Supervised Classification:
    • User selects representative samples (training areas) for each class
    • Computer learns spectral signatures from training data
    • Applies learned signatures to classify entire image
    • Algorithms: Maximum Likelihood, Minimum Distance, Neural Networks, Support Vector Machines
    • More accurate but requires expertise and training data
  • Object-Based Classification:
    • Groups pixels into objects based on spectral and spatial properties
    • Considers shape, texture, context in addition to spectral values
    • Better for high-resolution imagery
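The unsupervised approach can be illustrated with a bare-bones K-means in NumPy — a teaching sketch, not a production classifier; the toy pixel spectra are hypothetical:

```python
import numpy as np

def kmeans_classify(pixels, k, n_iter=20, seed=0):
    """Bare-bones K-means: cluster pixel spectra into k spectral classes.

    pixels: (n_pixels, n_bands) array of reflectances.
    Returns an integer label per pixel."""
    rng = np.random.default_rng(seed)
    centres = pixels[rng.choice(len(pixels), size=k, replace=False)].astype(float)
    for _ in range(n_iter):
        # assign every pixel to its nearest cluster centre (Euclidean distance)
        dists = np.linalg.norm(pixels[:, None, :] - centres[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # move each centre to the mean spectrum of the pixels assigned to it
        for c in range(k):
            if np.any(labels == c):
                centres[c] = pixels[labels == c].mean(axis=0)
    return labels

# toy scene, bands = (red, NIR): two water-like and two vegetation-like pixels
pixels = np.array([[0.05, 0.02], [0.06, 0.03], [0.08, 0.45], [0.07, 0.50]])
labels = kmeans_classify(pixels, k=2)
print(labels)  # water pixels share one label, vegetation pixels the other
```

As in real unsupervised classification, the algorithm only finds spectral groups; the analyst must still decide that one cluster means "water" and the other "vegetation".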

Accuracy Assessment: After classification, accuracy must be evaluated using ground truth data or reference imagery. Common metrics: Overall Accuracy, Producer's Accuracy, User's Accuracy, Kappa Coefficient.

Vegetation Indices

Vegetation indices are mathematical combinations of spectral bands designed to enhance vegetation signal and minimize other influences.

NDVI (Normalized Difference Vegetation Index)

The most widely used vegetation index, based on the contrast between NIR (high vegetation reflectance) and Red (chlorophyll absorption).

NDVI = (NIR - Red) / (NIR + Red)

Where:

  • NIR = Near Infrared band reflectance (0.7-1.3 μm)
  • Red = Red band reflectance (0.6-0.7 μm)

Value Range: -1 to +1

| NDVI Range | Interpretation | Typical Features |
|---|---|---|
| -1 to 0 | No vegetation, non-vegetated surfaces | Water bodies, clouds, snow, bare rock, built-up areas |
| 0 to 0.2 | Sparse or no vegetation | Bare soil, sand, desert, urban areas, dead/dying vegetation |
| 0.2 to 0.4 | Low vegetation density or stressed vegetation | Grasslands, shrublands, crops with low biomass, stressed vegetation |
| 0.4 to 0.6 | Moderate vegetation | Temperate grasslands, croplands, moderate vegetation cover |
| 0.6 to 0.8 | Dense vegetation | Dense forests, healthy crops at peak growth, wetland vegetation |
| 0.8 to 1.0 | Very dense, healthy vegetation | Tropical rainforests, very dense vegetation with high chlorophyll content |

NDVI Applications:

  • Monitoring crop health and predicting yields
  • Assessing vegetation phenology (seasonal changes)
  • Drought detection and monitoring
  • Forest health assessment
  • Desertification and land degradation studies
  • Biomass estimation
  • Habitat mapping and biodiversity studies
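The NDVI formula translates directly into code; a small NumPy sketch with hypothetical reflectance values:

```python
import numpy as np

def ndvi(nir, red, eps=1e-10):
    """NDVI = (NIR - Red) / (NIR + Red); eps avoids division by zero."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

# hypothetical reflectances: dense vegetation, bare soil, clear water
nir = np.array([0.50, 0.30, 0.01])
red = np.array([0.08, 0.25, 0.04])
print(ndvi(nir, red).round(2))  # roughly [0.72, 0.09, -0.6]
```

Note how the high NIR/low Red contrast of vegetation drives its NDVI toward +1, while water's NIR absorption pushes it negative, matching the interpretation table above.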
Other Important Vegetation Indices:
  • EVI (Enhanced Vegetation Index): Improves NDVI by reducing atmospheric and soil background effects
  • SAVI (Soil Adjusted Vegetation Index): Reduces soil brightness influence
  • GNDVI (Green NDVI): Uses Green band instead of Red, more sensitive to chlorophyll concentration
  • LAI (Leaf Area Index): Estimates total leaf area per unit ground area
Advanced Remote Sensing Technologies
Hyperspectral Remote Sensing

Definition: Imaging in hundreds of narrow, contiguous spectral bands (typically 5-10 nm wide), providing detailed spectral information for each pixel.

Key Characteristics:

  • 100-250+ spectral bands across visible, NIR, and SWIR regions
  • Creates a continuous spectrum for each pixel (spectral signature)
  • Can identify specific materials, chemicals, and species
  • Generates large data volumes requiring specialized processing

Applications:

  • Precision agriculture: crop species identification, disease detection, nutrient status
  • Mineral exploration: detailed lithological mapping, alteration detection
  • Environmental monitoring: water quality assessment, pollution detection
  • Defense: camouflage detection, target identification
  • Forestry: tree species classification, forest health assessment

Examples: Hyperion (NASA), PRISMA (Italy), EnMAP (Germany), AVIRIS (airborne)

Thermal Remote Sensing

Definition: Detection and measurement of electromagnetic radiation emitted by objects in the thermal infrared region (3-14 μm), related to object temperature.

Key Characteristics:

  • Measures emitted (not reflected) radiation from Earth's surface
  • Can operate day and night (independent of solar illumination)
  • Two atmospheric windows: 3-5 μm (MIR) and 8-14 μm (TIR)
  • Temperature resolution typically 0.1-1.0°C

Applications:

  • Urban heat island studies and climate research
  • Volcanic activity monitoring and geothermal exploration
  • Forest fire detection and monitoring
  • Soil moisture estimation (evapotranspiration)
  • Water temperature mapping (ocean currents, thermal pollution)
  • Building energy efficiency assessment
  • Irrigation management and crop water stress

Examples: Landsat 8/9 TIRS, ASTER TIR, MODIS thermal bands, ECOSTRESS

Microwave Remote Sensing

Definition: Remote sensing using microwave radiation (wavelength 1 mm - 1 m), including both passive and active systems.

Key Advantages:

  • All-weather capability: Penetrates clouds, fog, rain, and smoke
  • Day/night operation: Active systems provide own illumination
  • Surface penetration: Can penetrate dry soil, sand, vegetation canopy
  • Sensitive to: Surface roughness, moisture content, dielectric properties

Types:

  • Passive Microwave: Detects natural microwave emission (e.g., AMSR, SSM/I for soil moisture, sea ice)
  • Active Microwave: RADAR systems that transmit pulses and measure backscatter
RADAR (Radio Detection and Ranging)

Definition: Active microwave remote sensing system that transmits microwave pulses toward the target and records the backscattered energy.

Key Characteristics:

  • Measures intensity (backscatter strength) and time delay (distance)
  • Backscatter depends on surface roughness, moisture, and dielectric properties
  • Different wavelengths (bands): X-band, C-band, L-band, P-band (longer wavelengths penetrate deeper)
  • Side-looking geometry for optimal imaging

RADAR Bands:

  • X-band (2.4-3.8 cm): High resolution, minimal penetration, sensitive to surface roughness
  • C-band (3.8-7.5 cm): Moderate penetration, all-weather monitoring, crop monitoring
  • L-band (15-30 cm): Deep penetration, forest biomass, soil moisture
  • P-band (30-100 cm): Maximum penetration, subsurface features, forest structure

Applications:

  • Flood mapping and disaster monitoring (through clouds)
  • Ship detection and ocean wave monitoring
  • Forest structure and biomass estimation
  • Soil moisture mapping
  • Archaeological site detection (subsurface features)
  • Ice sheet monitoring and glacier movement
  • Land deformation and subsidence (InSAR technique)

Examples: Sentinel-1 (C-band), RADARSAT-2 (C-band), ALOS PALSAR (L-band), TerraSAR-X (X-band)

LiDAR (Light Detection and Ranging)

Definition: Active remote sensing technique using laser pulses to measure distances to Earth's surface, creating precise 3D elevation data.

How It Works:

  • Emits rapid laser pulses (typically 10,000-500,000 pulses/second)
  • Measures time for each pulse to return after reflecting from surface
  • GPS and IMU determine precise sensor position and orientation
  • Calculates exact 3D coordinates of reflection points (point cloud)
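The core of the time-of-flight measurement above is a single formula: the pulse travels to the surface and back, so range = c × t / 2. A minimal sketch (illustrative only, not tied to any real LiDAR processing library):

```python
# Minimal sketch: LiDAR range from pulse time-of-flight.
# The pulse travels out and back, so distance = c * t / 2.

C = 299_792_458.0  # speed of light in m/s

def pulse_range(round_trip_time_s: float) -> float:
    """Distance from sensor to reflecting surface, in metres."""
    return C * round_trip_time_s / 2.0

# A return delay of ~6.67 microseconds corresponds to roughly 1 km:
print(round(pulse_range(6.67e-6)))  # ~1000 m
```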

Types of LiDAR:

  • Topographic LiDAR: Uses NIR laser (1064 nm), maps land surfaces and vegetation structure
  • Bathymetric LiDAR: Uses green laser (532 nm) that penetrates water, maps underwater topography in clear water

Key Products:

  • DSM (Digital Surface Model): Elevation of all surfaces including buildings and vegetation
  • DEM (Digital Elevation Model): Bare earth elevation (vegetation and structures removed)
  • CHM (Canopy Height Model): Height of vegetation above ground (DSM minus DEM)
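The CHM relationship (DSM minus DEM) is a simple per-pixel subtraction. A toy NumPy sketch, with invented 2×2 elevation values purely for illustration:

```python
import numpy as np

# Hypothetical 2x2 elevation grids in metres (values invented for illustration).
dsm = np.array([[120.0, 135.0],
                [118.0, 142.0]])  # Digital Surface Model: top of canopy/buildings
dem = np.array([[100.0, 110.0],
                [101.0, 112.0]])  # Digital Elevation Model: bare earth

# Canopy Height Model: height of vegetation/structures above the ground.
chm = dsm - dem
print(chm)  # [[20. 25.], [17. 30.]]
```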

Applications:

  • High-accuracy topographic mapping
  • Forest inventory: tree height, canopy structure, biomass
  • Urban 3D modeling and building mapping
  • Flood modeling and hydrological studies
  • Power line and infrastructure corridor mapping
  • Archaeological site detection (penetrates vegetation)
  • Coastal zone mapping and erosion monitoring
  • Transportation planning and engineering

Advantages:

  • Extremely high accuracy (vertical: 5-15 cm; horizontal: 30-50 cm)
  • Penetrates vegetation canopy to ground (multiple returns)
  • Direct 3D measurements without stereoscopic processing
  • Can operate day or night

Examples: ICESat-2 (satellite), GEDI (aboard the International Space Station), numerous airborne and UAV systems

Image Processing Software

Various software packages are available for processing and analyzing remote sensing data:

| Software | Type | Key Features | Best For |
|---|---|---|---|
| ERDAS Imagine | Commercial | Comprehensive image processing, photogrammetry, advanced classification | Professional remote sensing analysis, large projects |
| ENVI | Commercial | Advanced spectral analysis, hyperspectral processing, extensive algorithms | Research, hyperspectral analysis, specialized applications |
| ArcGIS (Spatial Analyst, Image Analyst) | Commercial | Integrated GIS and RS, comprehensive spatial analysis, user-friendly | GIS-integrated projects, general mapping, analysis |
| PCI Geomatica | Commercial | Orthorectification, SAR processing, comprehensive toolset | Professional image processing, SAR analysis |
| eCognition | Commercial | Object-based image analysis (OBIA), machine learning | High-resolution imagery, object-based classification |
| Google Earth Engine | Free (cloud) | Massive satellite data archive, cloud processing, JavaScript/Python API | Large-scale analysis, time series, rapid prototyping |
| QGIS (with Semi-Automatic Classification Plugin) | Free/Open Source | GIS with RS capabilities, classification, change detection | Budget-conscious projects, education, general analysis |
| SNAP (Sentinel Application Platform) | Free (ESA) | Optimized for Sentinel satellites, SAR processing, optical processing | Sentinel data processing, education, SAR analysis |
| GRASS GIS | Free/Open Source | Powerful RS and GIS tools, terrain analysis, extensive algorithms | Research, advanced users, scientific applications |
| Orfeo ToolBox | Free/Open Source | Large-scale processing, machine learning, comprehensive algorithms | Automated processing pipelines, research |
| MATLAB (with Image Processing Toolbox) | Commercial | Custom algorithm development, advanced mathematics, scripting | Research, algorithm development, scientific computing |
| Python (Rasterio, GDAL, scikit-learn, TensorFlow) | Free/Open Source | Flexible scripting, machine learning integration, automation | Custom workflows, machine learning, automated processing |

Advantages and Benefits of Remote Sensing
  1. Large Area Coverage: Can map extensive regions quickly and efficiently (entire countries or continents)
  2. Inaccessible Areas: Provides data from remote, dangerous, or inaccessible locations (dense forests, mountains, disaster zones)
  3. Synoptic View: Offers complete overview of large areas simultaneously, revealing spatial patterns and relationships
  4. Multitemporal Analysis: Regular revisits enable monitoring of changes over time (land use change, deforestation, urban growth)
  5. Cost-Effective: More economical than traditional ground surveys for large areas
  6. Multispectral Capability: Captures information beyond visible spectrum (IR, thermal, microwave), revealing features invisible to human eye
  7. Digital Format: Data readily available for computer processing, quantitative analysis, and integration with GIS
  8. Permanent Record: Creates archived data for historical analysis and comparison
  9. Objective Data: Provides consistent, repeatable measurements less subject to human bias
  10. Rapid Data Acquisition: Especially important for disaster response and emergency management
  11. Non-Invasive: Observes without disturbing the environment or phenomena being studied
  12. Multidisciplinary Applications: Same data useful across multiple fields (agriculture, forestry, geology, hydrology, urban planning)
Key Definitions - Quick Reference
| Term | Definition |
|---|---|
| Remote Sensing | Acquiring information about objects or phenomena without direct physical contact, using sensors on platforms like satellites or aircraft |
| EMR | Electromagnetic Radiation - energy that travels as waves at the speed of light |
| Wavelength (λ) | Distance between successive wave crests, measured in micrometers (μm) or nanometers (nm) |
| Frequency (ν) | Number of wave crests passing a point per unit time, measured in Hertz (Hz) |
| Spectral Signature | Unique pattern of reflectance/emission across wavelengths that characterizes a particular material or feature |
| Pixel | Picture element - smallest unit in a digital image with a single brightness value |
| GSD | Ground Sample Distance - physical size on the ground represented by one pixel |
| Band | Specific range of wavelengths in the electromagnetic spectrum recorded by a sensor |
| Panchromatic | Single broad band covering the entire visible spectrum (black-and-white imagery) |
| Multispectral | 3-10 relatively broad spectral bands |
| Hyperspectral | Many (50-250+) narrow, contiguous spectral bands |
| Passive RS | Sensors that detect natural radiation emitted or reflected by objects (depends on an external energy source like the Sun) |
| Active RS | Sensors that provide their own energy source (e.g., RADAR, LiDAR) |
| NDVI | Normalized Difference Vegetation Index - (NIR - Red)/(NIR + Red) - measures vegetation health and density |
| DEM | Digital Elevation Model - 3D representation of terrain surface elevations |
| Orthorectification | Process of removing geometric distortions from imagery to create an accurate, map-like representation |
| Classification | Process of categorizing pixels into thematic classes (land cover types) |
| Supervised Classification | Classification using training samples selected by the analyst |
| Unsupervised Classification | Automatic grouping of pixels by the computer without training data |
| Ground Truth | Field-collected reference data used to train or validate remote sensing analysis |
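The supervised-classification idea (assigning each pixel to the class whose training statistics it most resembles) can be illustrated with a toy minimum-distance classifier. The class means and pixel values below are invented purely for illustration:

```python
import numpy as np

# Hypothetical training means per class: mean (Red, NIR) reflectance,
# derived from analyst-selected training samples in a real workflow.
class_means = {
    "water":      np.array([0.05, 0.03]),
    "vegetation": np.array([0.08, 0.45]),
    "bare_soil":  np.array([0.25, 0.30]),
}

def classify(pixel):
    """Minimum-distance supervised classification: assign the pixel
    to the class whose training mean is nearest in spectral space."""
    return min(class_means, key=lambda c: np.linalg.norm(pixel - class_means[c]))

print(classify(np.array([0.07, 0.50])))  # vegetation (high NIR, low Red)
print(classify(np.array([0.04, 0.02])))  # water (dark in both bands)
```

Unsupervised methods differ only in that the cluster centres are found automatically (e.g., by k-means) rather than taken from analyst-chosen training samples.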
Common Student Pitfalls and Exam Tips

Common Mistakes to Avoid:

  1. Confusing wavelength and frequency: Remember: shorter wavelength = higher frequency (inverse relationship: λ = c/ν)
  2. Mixing up NIR behavior: Vegetation reflects HIGH in NIR (not absorbs). Water absorbs in NIR (appears dark).
  3. Photo scale confusion: A larger denominator (1:50,000) means a SMALLER scale (less detail); a smaller denominator (1:5,000) means a LARGER scale (more detail)
  4. Resolution types: Don't confuse spatial (pixel size), spectral (number/width of bands), radiometric (brightness levels), and temporal (revisit time)
  5. NDVI interpretation: Higher positive values (closer to +1) = healthier, denser vegetation. Negative or near-zero = water, bare soil, built-up
  6. Active vs Passive: RADAR and LiDAR are ACTIVE (provide own energy). Optical/multispectral satellites are PASSIVE (use sunlight).
  7. Wien's Law application: Higher temperature objects peak at SHORTER wavelengths (Sun peaks in visible; Earth peaks in thermal IR)
  8. Overlap percentages: Forward overlap (60-65%) ≠ Side overlap (20-30%). Don't mix these up!
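Pitfalls 1 and 7 can be checked numerically. A quick sanity-check sketch of the wavelength-frequency relation and Wien's Law:

```python
C = 3.0e8    # speed of light, m/s (approximate)
B = 2898.0   # Wien's constant, in micrometre-kelvin

def frequency(wavelength_m):
    """nu = c / lambda: shorter wavelength means higher frequency."""
    return C / wavelength_m

def peak_wavelength_um(temp_k):
    """Wien's Law: hotter bodies peak at shorter wavelengths."""
    return B / temp_k

# Red light (0.7 um) has a LOWER frequency than blue light (0.4 um):
assert frequency(0.7e-6) < frequency(0.4e-6)

# The Sun (~6000 K) peaks in the visible; Earth (~300 K) in the thermal IR:
print(round(peak_wavelength_um(6000), 2))  # ~0.48 um (visible)
print(round(peak_wavelength_um(300), 2))   # ~9.66 um (thermal IR)
```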

Exam Success Tips:

  • Memorize key formulas: Wien's Law, Photo Scale, NDVI, Energy Balance
  • Know the EM spectrum regions, wavelength ranges, and typical RS applications for each
  • Understand spectral signatures: can you sketch vegetation, water, soil curves?
  • Practice calculations: photo scale, temperature from wavelength using Wien's Law
  • Remember specific examples: satellite names, resolution values, band numbers
  • Be clear about resolution types - know examples of each with actual numbers
  • Understand the complete RS process flow from energy source to application
  • Know visual interpretation elements (STOP CHATS mnemonic)
  • Compare and contrast: active vs passive, supervised vs unsupervised, vertical vs oblique photos
  • For long answers, structure with: definition → principles → examples → applications
Practice Questions
Very Short Answer Questions (1-2 marks)
Q1. Define Remote Sensing.
Q2. What is electromagnetic radiation?
Q3. State Wien's Displacement Law.
Q4. What is a spectral signature?
Q5. Define spatial resolution.
Q6. What does NDVI stand for?
Q7. Differentiate between active and passive remote sensing.
Q8. What is a pixel?
Q9. Name two Indian remote sensing satellites.
Q10. What is the purpose of forward overlap in aerial photography?
Q11. Define GSD (Ground Sample Distance).
Q12. What is hyperspectral remote sensing?
Q13. Give one application of thermal remote sensing.
Q14. What does LiDAR stand for?
Q15. Name the two types of classification in image processing.
Short Answer Questions (3-5 marks)
Q1. Explain the electromagnetic spectrum with reference to remote sensing applications.
Q2. Describe the spectral reflectance signature of healthy vegetation.
Q3. Explain the four types of resolution in remote sensing.
Q4. What are the main interactions of electromagnetic energy with Earth's surface?
Q5. Describe the elements of visual image interpretation (STOP CHATS).
Q6. Explain NDVI formula, range, and interpretation.
Q7. Compare vertical and oblique aerial photographs.
Q8. Explain the concept of photo scale and its calculation.
Q9. What is the significance of overlap in aerial photography?
Q10. Describe the main categories of digital image processing.
Q11. Differentiate between supervised and unsupervised classification.
Q12. Explain RADAR remote sensing and its advantages.
Q13. What is LiDAR? Explain its working principle and applications.
Q14. Compare multispectral and hyperspectral remote sensing.
Q15. Describe the factors affecting aerial photography quality.
Long Answer Questions (8-10 marks)
Q1. Describe the complete process of remote sensing from energy source to application. Include a diagram.
Q2. Explain Wien's Displacement Law in detail with examples. How is it applied in remote sensing?
Q3. Discuss the spectral reflectance signatures of vegetation, water, and soil. How do these signatures help in feature identification?
Q4. Explain all four types of resolution in detail with examples of satellites for each category. How do they affect data quality and applications?
Q5. Describe the major categories of digital image processing: restoration, enhancement, and classification. Provide examples of techniques in each category.
Q6. Explain aerial photography in detail including scale calculation, types of photographs, overlap requirements, and film types.
Q7. Discuss advanced remote sensing technologies: hyperspectral, thermal, RADAR, and LiDAR. Compare their characteristics and applications.
Q8. What are vegetation indices? Explain NDVI in detail including formula, interpretation, and applications.
Q9. Describe the advantages and limitations of remote sensing. Provide examples of applications in different fields.
Q10. Explain the electromagnetic spectrum in detail. Discuss the different regions and their specific applications in remote sensing.

One-Page Revision Sheet

Key Formulas:
  • Wien's Law: λmax = b / T (b = 2898 μm·K)
  • Photo Scale: Scale = f / H or 1 : (H/f), where f = camera focal length and H = flying height above terrain (same units)
  • NDVI: (NIR - Red) / (NIR + Red)
  • Energy Balance: E_I = E_R + E_A + E_T (incident = reflected + absorbed + transmitted)
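The photo-scale and NDVI formulas can be worked directly. The focal length, flying height, and band reflectances below are hypothetical example values:

```python
# Photo scale: Scale = f / H (f = focal length, H = flying height above
# terrain, same units). A 150 mm lens at 3000 m gives 1:20,000.
f = 0.15     # focal length in metres
H = 3000.0   # flying height above terrain in metres
scale_denominator = H / f
print(f"1:{scale_denominator:,.0f}")  # 1:20,000

# NDVI = (NIR - Red) / (NIR + Red); healthy vegetation: high NIR, low Red.
nir, red = 0.45, 0.08
ndvi = (nir - red) / (nir + red)
print(round(ndvi, 2))  # ~0.7 (dense, healthy vegetation)
```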
EM Spectrum Quick Reference:
  • UV: 0.003-0.4 μm (ozone, minerals)
  • Visible: 0.4-0.7 μm (photography, multispectral)
  • NIR: 0.7-1.3 μm (vegetation analysis)
  • SWIR: 1.3-3.0 μm (soil moisture, minerals)
  • Thermal IR: 8-14 μm (temperature mapping)
  • Microwave: 1 mm-1 m (all-weather RADAR)
Spectral Signatures:
  • Vegetation: Low visible (chlorophyll), HIGH NIR (leaf structure), moderate SWIR (water)
  • Water: Blue reflection, NIR/SWIR absorption (very dark)
  • Soil: Gradual increase visible→SWIR (moisture lowers reflectance)
Resolution Types:
  • Spatial: Pixel size (VHR: <1m, HR: 1-10m, MR: 10-100m, LR: >100m)
  • Spectral: Band number/width (Pan: 1, Multi: 3-10, Hyper: >50)
  • Radiometric: Brightness levels (8-bit: 256, 12-bit: 4096, 16-bit: 65536)
  • Temporal: Revisit time (Daily: MODIS, 5-day: Sentinel-2, 16-day: Landsat)
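Radiometric resolution follows directly from bit depth: levels = 2^bits, which reproduces the 256/4096/65536 values above:

```python
def grey_levels(bits: int) -> int:
    """Number of distinguishable brightness levels for a given bit depth."""
    return 2 ** bits

for bits in (8, 12, 16):
    print(f"{bits}-bit -> {grey_levels(bits)} levels")
# 8-bit -> 256, 12-bit -> 4096, 16-bit -> 65536
```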
Visual Interpretation - STOP CHATS:

Size, Tone/Color, Shape, Pattern, Texture, Shadow, Association, Site

Aerial Photography:
  • Forward overlap: 60-65% (for stereo viewing)
  • Side overlap: 20-30% (complete coverage)
  • Types: Vertical (mapping), Low oblique, High oblique (visualization)
  • Films: B&W (detail), True color (natural), CIR (vegetation=red)
Image Processing:
  • Restoration: Radiometric/geometric correction
  • Enhancement: Contrast stretch, filtering, ratios
  • Classification: Supervised (training samples) vs Unsupervised (automatic)
NDVI Ranges:
  • -1 to 0: Water, snow, clouds
  • 0 to 0.2: Bare soil, urban
  • 0.2-0.4: Sparse/stressed vegetation
  • 0.4-0.6: Moderate vegetation
  • 0.6-1.0: Dense, healthy vegetation
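These ranges map directly onto a small classification helper (thresholds taken from the list above; treating each boundary as belonging to the higher class is a simplifying assumption):

```python
def ndvi_class(ndvi: float) -> str:
    """Label an NDVI value using the ranges listed above."""
    if ndvi < 0:
        return "water/snow/clouds"
    if ndvi < 0.2:
        return "bare soil/urban"
    if ndvi < 0.4:
        return "sparse/stressed vegetation"
    if ndvi < 0.6:
        return "moderate vegetation"
    return "dense, healthy vegetation"

print(ndvi_class(-0.3))  # water/snow/clouds
print(ndvi_class(0.75))  # dense, healthy vegetation
```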
Advanced Technologies:
  • Hyperspectral: 50-250+ narrow bands (mineral ID, precision ag)
  • Thermal: 8-14 μm (temperature, fires, urban heat)
  • RADAR: Active microwave (all-weather, day/night, penetration)
  • LiDAR: Laser pulses (3D elevation, high accuracy 5-15cm)
Important Satellites:
  • Landsat 8/9: 30m multispectral, 16-day revisit
  • Sentinel-2: 10m multispectral, 5-day revisit (with both satellites combined)
  • MODIS: 250-1000m, daily (climate, weather)
  • WorldView-3: 0.31m, very high resolution
  • IRS/Cartosat: Indian satellites (various resolutions)
Key Advantages:

Large area coverage, inaccessible areas, synoptic view, multitemporal, cost-effective, multispectral, digital format, permanent record

Study Strategy:
  • Focus on understanding concepts, not just memorization
  • Practice drawing and labeling diagrams (RS process, spectral curves, EM spectrum)
  • Work through numerical problems multiple times
  • Create comparison tables for different resolution types, satellites, classification methods
  • Review this revision sheet regularly, especially formulas and key values
  • Practice explaining concepts to others - teaching reinforces learning
  • Connect theoretical knowledge with real-world applications

Best wishes for your exam! 🚀🛰️

Master these concepts and apply them confidently.
