Remote sensing
from Wikipedia

Synthetic aperture radar image of Death Valley colored using polarimetry

Remote sensing is the acquisition of information about an object or phenomenon without making physical contact with the object, in contrast to in situ or on-site observation. The term is applied especially to acquiring information about Earth and other planets. Remote sensing is used in numerous fields, including geophysics, geography, land surveying and most Earth science disciplines (e.g. exploration geophysics, hydrology, ecology, meteorology, oceanography, glaciology, geology). It also has military, intelligence, commercial, economic, planning, and humanitarian applications, among others.

In current usage, the term remote sensing generally refers to the use of satellite- or airborne-based sensor technologies to detect and classify objects on Earth. It covers the surface as well as the atmosphere and oceans, based on propagated signals (e.g. electromagnetic radiation). It may be split into "active" remote sensing (when a signal is emitted by a sensor mounted on a satellite or aircraft to the object and its reflection is detected by the sensor) and "passive" remote sensing (when the reflection of sunlight is detected by the sensor).[1][2][3][4][5]

Overview

This video is about how Landsat was used to identify areas of conservation in the Democratic Republic of the Congo, and how it was used to help map an area called MLW in the north.

Remote sensing can be divided into two types of methods: Passive remote sensing and Active remote sensing. Passive sensors gather radiation that is emitted or reflected by the object or surrounding areas. Reflected sunlight is the most common source of radiation measured by passive sensors. Examples of passive remote sensors include film photography, infrared, charge-coupled devices, and radiometers. Active collection, on the other hand, emits energy in order to scan objects and areas whereupon a sensor then detects and measures the radiation that is reflected or backscattered from the target. RADAR and LiDAR are examples of active remote sensing where the time delay between emission and return is measured, establishing the location, speed and direction of an object.

Illustration of remote sensing

Remote sensing makes it possible to collect data of dangerous or inaccessible areas. Remote sensing applications include monitoring deforestation in areas such as the Amazon Basin, glacial features in Arctic and Antarctic regions, and depth sounding of coastal and ocean depths. Military collection during the Cold War made use of stand-off collection of data about dangerous border areas. Remote sensing also replaces costly and slow data collection on the ground, ensuring in the process that areas or objects are not disturbed.

Orbital platforms collect and transmit data from different parts of the electromagnetic spectrum, which in conjunction with larger scale aerial or ground-based sensing and analysis, provides researchers with enough information to monitor trends such as El Niño and other natural long and short term phenomena. Other uses include different areas of the earth sciences such as natural resource management, agricultural fields such as land usage and conservation,[6][7] greenhouse gas monitoring,[8] oil spill detection and monitoring,[9] and national security and overhead, ground-based and stand-off collection on border areas.[10]

Types of data acquisition techniques


The basis for multispectral collection and analysis is that examined areas or objects reflect or emit radiation that stands out from surrounding areas. For a summary of major remote sensing satellite systems, see the overview table.

Applications of remote sensing

Radar image of Aswan Dam, Egypt taken by Umbra

Conventional radar is mostly associated with air traffic control, early warning, and certain large-scale meteorological data. Doppler radar is used by local law enforcement to monitor speed limits and in enhanced meteorological collection such as wind speed and direction within weather systems, in addition to precipitation location and intensity. Other types of active collection include measurements of plasmas in the ionosphere. Interferometric synthetic aperture radar is used to produce precise digital elevation models of large-scale terrain (see RADARSAT, TerraSAR-X, Magellan). Laser and radar altimeters on satellites have provided a wide range of data. By measuring the bulges of water caused by gravity, they map features on the seafloor to a resolution of a mile or so. By measuring the height and wavelength of ocean waves, the altimeters measure wind speeds and direction, and surface ocean currents and directions. Ultrasound (acoustic) and radar tide gauges measure sea level, tides and wave direction in coastal and offshore settings.

Light detection and ranging (LiDAR) is used for weapon ranging, laser illuminated homing of projectiles, and to detect and measure the concentration of various chemicals in the atmosphere while airborne LiDAR can be used to measure the heights of objects and features on the ground more accurately than radar technology. LiDAR can be used to detect ground surface changes typically by creating Digital Surface Models (DSMs) or Digital Elevation Models (DEMs).[11] Vegetation remote sensing is a principal application of LIDAR.[12]

The most common instruments in use are radiometers and photometers, which collect reflected and emitted radiation in a wide range of frequencies. The most prevalent of these are visible and infrared sensors, followed by microwave, gamma-ray, and, rarely, ultraviolet. They may also be used to detect the emission spectra of various chemicals, providing data on chemical concentrations in the atmosphere. Radiometers are also used at night, as artificial light emissions are a key signature of human activity.[13] Applications include remote sensing of population, GDP, and damage to infrastructure from war or disasters. Radiometers and radar onboard satellites can also be used to monitor volcanic eruptions.[14][15]

Examples of remote sensing equipment deployed by
or interfaced with oceanographic research vessels.[16]

Spectropolarimetric Imaging has been reported to be useful for target tracking purposes by researchers at the U.S. Army Research Laboratory. They determined that manmade items possess polarimetric signatures that are not found in natural objects. These conclusions were drawn from the imaging of military trucks, like the Humvee, and trailers with their acousto-optic tunable filter dual hyperspectral and spectropolarimetric VNIR Spectropolarimetric Imager.[17][18]

Stereographic pairs of aerial photographs have often been used to make topographic maps by imagery and terrain analysts in trafficability and highway departments for potential routes, in addition to modelling terrestrial habitat features.[19][20][21]

Simultaneous multi-spectral platforms such as Landsat have been in use since the early 1970s. These thematic mappers take images in multiple wavelengths and are usually found on Earth observation satellites, including (for example) the Landsat program or the IKONOS satellite. Maps of land cover and land use from thematic mapping can be used to prospect for minerals, detect or monitor land usage, detect invasive vegetation, deforestation, and examine the health of indigenous plants and crops (satellite crop monitoring), including entire farming regions or forests.[22] Prominent scientists using remote sensing for this purpose include Janet Franklin and Ruth DeFries. Landsat images are used by regulatory agencies such as KYDOW to indicate water quality parameters including Secchi depth, chlorophyll density, and total phosphorus content. Weather satellites are used in meteorology and climatology.

Hyperspectral imaging produces image cubes in which each pixel has full spectral information, imaging narrow spectral bands over a contiguous spectral range. Hyperspectral imagers are used in various applications including mineralogy, biology, defence, and environmental measurements. Within the scope of the combat against desertification, remote sensing allows researchers to follow up and monitor risk areas in the long term, to determine desertification factors, to support decision-makers in defining relevant measures of environmental management, and to assess their impacts.[23] Remotely sensed multi- and hyperspectral images can be used for assessing biodiversity at different spatial scales. Since the spectral properties of different plant species are unique, it is possible to obtain information about properties that relate to biodiversity, such as habitat heterogeneity, spectral diversity and plant functional traits.[24][25][26] Remote sensing has also been used to detect rare plants to aid in conservation efforts. Prediction, detection, and the ability to record biophysical conditions were possible from medium to very high resolutions.[27] Remote sensing is often utilized in the collection of agricultural and environmental statistics, usually combining classified satellite images with ground truth data collected on a sample selected from an area sampling frame.[28]

Geodetic


Geodetic remote sensing can be gravimetric or geometric. Overhead gravity data collection was first used in aerial submarine detection. This data revealed minute perturbations in the Earth's gravitational field that may be used to determine changes in the mass distribution of the Earth, which in turn may be used for geophysical studies, as in GRACE. Geometric remote sensing includes position and deformation imaging using InSAR, LIDAR, etc.[29]

Acoustic and near-acoustic


Three main types of acoustic and near-acoustic remote sensing exist. Sonar: passive sonar, listening for the sound made by another object (a vessel, a whale, etc.), and active sonar, emitting pulses of sound and listening for echoes, used for detecting, ranging and measuring underwater objects and terrain. Seismograms taken at different locations can locate and measure earthquakes (after they occur) by comparing the relative intensity and precise timings. Ultrasound acoustic sensing uses sensors that emit high-frequency pulses and listen for echoes, used for detecting water waves and water level, as in tide gauges or for towing tanks.

To coordinate a series of large-scale observations, most sensing systems depend on the following: platform location and the orientation of the sensor. High-end instruments now often use positional information from satellite navigation systems. The rotation and orientation are often provided within a degree or two by electronic compasses. Compasses can measure not just azimuth (i.e. degrees to magnetic north), but also altitude (degrees above the horizon), since the magnetic field curves into the Earth at different angles at different latitudes. More exact orientations require gyroscopically aided orientation, periodically realigned by different methods including navigation from stars or known benchmarks.

Data characteristics


The quality of remote sensing data consists of its spatial, spectral, radiometric and temporal resolutions.

Spatial resolution
The size of a pixel that is recorded in a raster image – typically pixels may correspond to square areas ranging in side length from 1 to 1,000 metres (3.3 to 3,280.8 ft).
Spectral resolution
The bandwidth of the different frequency bands recorded – usually, this is related to the number of frequency bands recorded by the platform. Current Landsat collection is that of seven bands, including several in the infrared spectrum, ranging from a spectral resolution of 0.7 to 2.1 μm. The Hyperion sensor on Earth Observing-1 resolves 220 bands from 0.4 to 2.5 μm, with a spectral resolution of 0.10 to 0.11 μm per band.
Radiometric resolution
The number of different intensities of radiation the sensor is able to distinguish. Typically, this ranges from 8 to 14 bits, corresponding to 256 levels of the gray scale and up to 16,384 intensities or "shades" of colour, in each band. It also depends on the instrument noise.
Temporal resolution
The frequency of flyovers by the satellite or plane, relevant only in time-series studies or those requiring an averaged or mosaic image, as in deforestation monitoring. This was first used by the intelligence community, where repeated coverage revealed changes in infrastructure, the deployment of units or the modification/introduction of equipment. Cloud cover over a given area or object makes it necessary to repeat the collection of said location.
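
The arithmetic behind these figures is straightforward. A minimal sketch, with illustrative values not tied to any particular sensor:

```python
# Sketch of the arithmetic behind the resolution figures above.
# Values are illustrative and not tied to a specific sensor.

def gray_levels(bit_depth: int) -> int:
    """Distinguishable intensity levels for a given radiometric bit depth."""
    return 2 ** bit_depth

print(gray_levels(8))   # 256 gray levels for 8-bit data
print(gray_levels(14))  # 16384 "shades" for 14-bit data

# Spatial resolution: pixels needed to cover a 185 km swath at 30 m/pixel
print(185_000 // 30)    # ~6166 pixels per scan line
```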

Data processing


In order to create sensor-based maps, most remote sensing systems expect to extrapolate sensor data in relation to a reference point, including distances between known points on the ground. This depends on the type of sensor used. For example, in conventional photographs, distances are accurate in the center of the image, with the distortion of measurements increasing the farther you get from the center. Another factor is that the platen against which the film is pressed can cause severe errors when photographs are used to measure ground distances. The step in which this problem is resolved is called georeferencing and involves computer-aided matching of points in the image (typically 30 or more points per image), which is extrapolated with the use of an established benchmark, "warping" the image to produce accurate spatial data. As of the early 1990s, most satellite images are sold fully georeferenced.
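
A minimal sketch of the warping step, assuming a simple affine model fitted to matched control points; the coordinates are hypothetical, and real georeferencing typically uses many more points and higher-order or rational polynomial models.

```python
# Fit an affine transform from image (col, row) to map (x, y) coordinates
# using matched control points, then "warp" any pixel position into map
# space. All point values are hypothetical.
import numpy as np

pixels = np.array([[10, 12], [500, 40], [480, 510], [30, 490]], dtype=float)
ground = np.array([[3001.0, 9950.2], [17650.5, 9123.1],
                   [17011.0, -4880.7], [2505.3, -4101.9]])

# Design matrix for x' = a*col + b*row + c (and the same form for y')
A = np.hstack([pixels, np.ones((len(pixels), 1))])
coef_x, *_ = np.linalg.lstsq(A, ground[:, 0], rcond=None)
coef_y, *_ = np.linalg.lstsq(A, ground[:, 1], rcond=None)

def pixel_to_map(col, row):
    p = np.array([col, row, 1.0])
    return p @ coef_x, p @ coef_y

print(pixel_to_map(250, 250))  # map coordinates of an interior pixel
```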

In addition, images may need to be radiometrically and atmospherically corrected.

Radiometric correction
Allows avoidance of radiometric errors and distortions. The illumination of objects on the Earth's surface is uneven because of different properties of the relief. This factor is taken into account in the method of radiometric distortion correction.[30] Radiometric correction gives a scale to the pixel values, e.g. the monochromatic scale of 0 to 255 will be converted to actual radiance values.
Topographic correction (also called terrain correction)
In rugged mountains, as a result of terrain, the effective illumination of pixels varies considerably. In a remote sensing image, a pixel on a shady slope receives weak illumination and has a low radiance value, while a pixel on a sunny slope receives strong illumination and has a high radiance value. For the same object, the pixel radiance value on the shady slope will therefore differ from that on the sunny slope, and different objects may have similar radiance values. These ambiguities seriously affect the accuracy of information extraction from remote sensing images of mountainous areas and are a main obstacle to their further application. The purpose of topographic correction is to eliminate this effect, recovering the true reflectivity or radiance of objects as they would appear under horizontal conditions. It is a premise of quantitative remote sensing applications.
Atmospheric correction
Elimination of atmospheric haze by rescaling each frequency band so that its minimum value (usually realised in water bodies) corresponds to a pixel value of 0. The digitizing of data also makes it possible to manipulate the data by changing gray-scale values.
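
A minimal sketch combining the radiometric and atmospheric steps above, with made-up gain and offset coefficients: digital numbers are rescaled to radiance, then each band is shifted so its darkest pixel (assumed to be haze over dark water) maps to zero.

```python
# (1) Radiometric rescaling of 0-255 digital numbers to radiance via a
#     linear gain/offset; (2) dark-object subtraction, shifting the band
#     so its minimum value corresponds to 0. Coefficients are hypothetical.
import numpy as np

dn = np.array([[30, 120, 255], [28, 64, 190]], dtype=float)  # toy 8-bit band
gain, offset = 0.76, -1.5   # hypothetical sensor calibration coefficients

radiance = gain * dn + offset          # radiometric correction
haze = radiance.min()                  # darkest pixel ~ atmospheric path term
surface = radiance - haze              # crude atmospheric (haze) correction

print(radiance.min(), surface.min())   # the band minimum now maps to 0.0
```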

Interpretation is the critical process of making sense of the data. The first application was aerial photographic collection, which used the following process: spatial measurement with a light table in both conventional single and stereographic coverage; added skills such as photogrammetry; the use of photomosaics and repeat coverage; and making use of objects' known dimensions in order to detect modifications. Image analysis is a more recently developed, automated, computer-aided application that is in increasing use.

Object-Based Image Analysis (OBIA) is a sub-discipline of GI-Science devoted to partitioning remote sensing (RS) imagery into meaningful image-objects, and assessing their characteristics through spatial, spectral and temporal scale.

Old data from remote sensing is often valuable because it may provide the only long-term data for a large extent of geography. At the same time, the data is often complex to interpret, and bulky to store. Modern systems tend to store the data digitally, often with lossless compression. The difficulty with this approach is that the data is fragile, the format may be archaic, and the data may be easy to falsify. One of the best systems for archiving data series is as computer-generated machine-readable ultrafiche, usually in typefonts such as OCR-B, or as digitized half-tone images. Ultrafiches survive well in standard libraries, with lifetimes of several centuries. They can be created, copied, filed and retrieved by automated systems. They are about as compact as archival magnetic media, and yet can be read by human beings with minimal, standardized equipment.

Generally speaking, remote sensing works on the principle of the inverse problem: while the object or phenomenon of interest (the state) may not be directly measured, there exists some other variable that can be detected and measured (the observation) which may be related to the object of interest through a calculation. The common analogy given to describe this is trying to determine the type of animal from its footprints. For example, while it is impossible to directly measure temperatures in the upper atmosphere, it is possible to measure the spectral emissions from a known chemical species (such as carbon dioxide) in that region. The frequency of the emissions may then be related via thermodynamics to the temperature in that region.
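
A worked sketch of this inversion for the temperature example, assuming a pure blackbody and ignoring emissivity and atmospheric effects: Planck's law is evaluated forward to simulate a measurement, then inverted for the brightness temperature.

```python
# Forward model: Planck's law gives spectral radiance from temperature.
# Inverse problem: recover the temperature that explains a measured radiance.
import math

h = 6.62607015e-34   # Planck constant, J*s
c = 2.99792458e8     # speed of light, m/s
k = 1.380649e-23     # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Blackbody spectral radiance, W * m^-2 * sr^-1 * m^-1."""
    a = 2 * h * c**2 / wavelength_m**5
    return a / (math.exp(h * c / (wavelength_m * k * temp_k)) - 1)

def brightness_temperature(wavelength_m, radiance):
    """Invert Planck's law for the temperature matching a measured radiance."""
    a = 2 * h * c**2 / wavelength_m**5
    return h * c / (wavelength_m * k * math.log(1 + a / radiance))

wl = 15e-6                                   # 15 um CO2 emission band
measured = planck_radiance(wl, 220.0)        # simulate an observation
print(brightness_temperature(wl, measured))  # recovers ~220 K
```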

Data processing levels


To facilitate the discussion of data processing in practice, several processing "levels" were first defined in 1986 by NASA as part of its Earth Observing System[31] and have been steadily adopted since then, both internally at NASA[32] and elsewhere;[33] these definitions are:

Level 0
Reconstructed, unprocessed instrument and payload data at full resolution, with any and all communications artifacts (e.g. synchronization frames, communications headers, duplicate data) removed.
Level 1a
Reconstructed, unprocessed instrument data at full resolution, time-referenced, and annotated with ancillary information, including radiometric and geometric calibration coefficients and georeferencing parameters (e.g. platform ephemeris) computed and appended but not applied to the Level 0 data (or, if applied, in a manner such that Level 0 is fully recoverable from Level 1a data).
Level 1b
Level 1a data that have been processed to sensor units (e.g. radar backscatter cross section, brightness temperature); not all instruments have Level 1b data, and Level 0 data is not recoverable from Level 1b data.
Level 2
Derived geophysical variables (e.g. ocean wave height, soil moisture, ice concentration) at the same resolution and location as the Level 1 source data.
Level 3
Variables mapped on uniform space-time grid scales, usually with some completeness and consistency (e.g. missing points interpolated, complete regions mosaicked together from multiple orbits).
Level 4
Model output or results from analyses of lower-level data (i.e. variables that were not measured by the instruments but are instead derived from those measurements).

A Level 1 data record is the most fundamental (i.e. highest reversible level) data record that has significant scientific utility, and is the foundation upon which all subsequent data sets are produced. Level 2 is the first level that is directly usable for most scientific applications; its value is much greater than that of the lower levels. Level 2 data sets tend to be less voluminous than Level 1 data because they have been reduced temporally, spatially, or spectrally. Level 3 data sets are generally smaller than lower-level data sets and thus can be dealt with without incurring a great deal of data-handling overhead, which tends to make them more useful for many applications. The regular spatial and temporal organization of Level 3 datasets makes it feasible to readily combine data from different sources.

While these processing levels are particularly suitable for typical satellite data processing pipelines, other data level vocabularies have been defined and may be appropriate for more heterogeneous workflows.

Applications


Satellite images provide very useful information to produce statistics on topics closely related to the territory, such as agriculture, forestry or land cover in general. The first large project to apply Landsat 1 images for statistics was LACIE (Large Area Crop Inventory Experiment), run by NASA, NOAA and the USDA in 1974–77.[34][35] Many other application projects on crop area estimation have followed, including the Italian AGRIT project and the MARS project of the Joint Research Centre (JRC) of the European Commission.[36] Forest area and deforestation estimation have also been a frequent target of remote sensing projects,[37][38] as have land cover and land use.[39]

Ground truth or reference data to train and validate image classification require a field survey if we are targeting annual crops or individual forest species, but may be substituted by photointerpretation if we look at wider classes that can be reliably identified on aerial photos or satellite images. It is relevant to highlight that probabilistic sampling is not critical for the selection of training pixels for image classification, but it is necessary for accuracy assessment of the classified images and area estimation.[40][41][42] Additional care is recommended to ensure that training and validation datasets are not spatially correlated.[43]

Suppose now that we have classified images, or a land cover map produced by visual photo-interpretation, with a legend of mapped classes that suits our purpose, taking again the example of wheat. The straightforward approach is counting the number of pixels classified as wheat and multiplying by the area of each pixel. Many authors have noted that this estimator is generally biased, because commission and omission errors in a confusion matrix do not compensate for each other.[44][45][46]
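
A toy illustration of this bias, using a hypothetical confusion matrix: the classified pixel count differs from the true area whenever omission and commission errors are unequal.

```python
# Rows = ground truth, columns = classification, classes [wheat, other].
# The numbers are synthetic, chosen so the errors do not cancel.
import numpy as np

pixel_area_ha = 0.09   # e.g. a 30 m pixel covers 0.09 ha
conf = np.array([[800, 200],    # 200 wheat pixels missed (omission)
                 [ 50, 950]])   # 50 non-wheat pixels called wheat (commission)

true_wheat = conf[0].sum()          # 1000 pixels of actual wheat
counted_wheat = conf[:, 0].sum()    # 850 pixels classified as wheat

print(true_wheat * pixel_area_ha)     # 90.0 ha, the target quantity
print(counted_wheat * pixel_area_ha)  # 76.5 ha: biased pixel-counting estimate
```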

The main strength of classified satellite images, or of other indicators computed on satellite images, is providing cheap information on the whole target area or most of it. This information usually correlates well with the target variable (ground truth), which is usually expensive to observe in an unbiased and accurate way and can therefore only be observed on a probabilistic sample selected on an area sampling frame. Traditional survey methodology provides different methods to combine accurate information on a sample with less accurate, but exhaustive, data for a covariable or proxy that is cheaper to collect. For agricultural statistics, field surveys are usually required, while photo-interpretation may be better for land cover classes that can be reliably identified on aerial photographs or high-resolution satellite images. Additional uncertainty can appear because of imperfect reference data (ground truth or similar).[47][48]

Some options for combining the sample and image data are the ratio estimator, the regression estimator,[49] calibration estimators[50] and small area estimators.[39]
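
As a sketch of the regression estimator on synthetic data (the other estimators follow the same pattern of combining a small ground sample with the exhaustive image covariable):

```python
# Ground truth y is observed only on a small probabilistic sample, while
# the image covariable x (e.g. classified wheat fraction per frame unit)
# is known everywhere. Data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
x_all = rng.uniform(0, 1, 10_000)                       # covariable, exhaustive
y_all = 0.8 * x_all + rng.normal(0, 0.05, x_all.size)   # unknown ground truth

sample = rng.choice(x_all.size, 100, replace=False)
x_s, y_s = x_all[sample], y_all[sample]                 # only these y observed

b = np.cov(x_s, y_s)[0, 1] / np.var(x_s, ddof=1)        # regression slope
y_reg = y_s.mean() + b * (x_all.mean() - x_s.mean())    # regression estimator

print(y_s.mean())    # sample-only estimate
print(y_reg)         # regression estimate using the exhaustive covariable
print(y_all.mean())  # the (normally unknown) population value
```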

If we target other variables, such as crop yield or leaf area, we may need different indicators to be computed from the images, such as the NDVI, a good proxy for chlorophyll activity.[28]
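
A minimal sketch of the NDVI computation from red and near-infrared reflectance, with illustrative values:

```python
# NDVI = (NIR - RED) / (NIR + RED), ranging from -1 to 1; healthy
# vegetation reflects strongly in NIR and absorbs in red, giving high values.
import numpy as np

red = np.array([[0.08, 0.10], [0.30, 0.25]])
nir = np.array([[0.50, 0.45], [0.32, 0.28]])

ndvi = (nir - red) / (nir + red)
print(ndvi)   # high over vegetated pixels, near zero over bare surfaces
```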

History

The TR-1 reconnaissance/surveillance aircraft
The 2001 Mars Odyssey used spectrometers and imagers to hunt for evidence of past or present water and volcanic activity on Mars.

The modern discipline of remote sensing arose with the development of flight. The balloonist G. Tournachon (alias Nadar) made photographs of Paris from his balloon in 1858.[51] Messenger pigeons, kites, rockets and unmanned balloons were also used for early images. With the exception of balloons, these first, individual images were not particularly useful for map making or for scientific purposes.

Systematic aerial photography was developed for military surveillance and reconnaissance purposes beginning in World War I.[52] After WWI, remote sensing technology was quickly adapted to civilian applications.[53] This is demonstrated by the first line of a 1941 textbook titled "Aerophotography and Aerosurveying," which stated the following:

"There is no longer any need to preach for aerial photography-not in the United States- for so widespread has become its use and so great its value that even the farmer who plants his fields in a remote corner of the country knows its value."

— James Bagley[53]

The development of remote sensing technology reached a climax during the Cold War with the use of modified combat aircraft such as the P-51, P-38, RB-66 and the F-4C, or specifically designed collection platforms such as the U2/TR-1, SR-71, A-5 and the OV-1 series both in overhead and stand-off collection.[54] A more recent development is that of increasingly smaller sensor pods such as those used by law enforcement and the military, in both manned and unmanned platforms. The advantage of this approach is that this requires minimal modification to a given airframe. Later imaging technologies would include infrared, conventional, Doppler and synthetic aperture radar.[55]

The development of artificial satellites in the latter half of the 20th century allowed remote sensing to progress to a global scale as of the end of the Cold War.[56] Instrumentation aboard various Earth observing and weather satellites such as Landsat, the Nimbus series and more recent missions such as RADARSAT and UARS provided global measurements of various data for civil, research, and military purposes. Space probes to other planets have also provided the opportunity to conduct remote sensing studies in extraterrestrial environments: synthetic aperture radar aboard the Magellan spacecraft provided detailed topographic maps of Venus, while instruments aboard SOHO allowed studies to be performed on the Sun and the solar wind, to name just two examples.[57][58]

Recent developments include, beginning in the 1960s and 1970s, the development of image processing of satellite imagery. The use of the term "remote sensing" began in the early 1960s when Evelyn Pruitt realized that advances in science meant that aerial photography was no longer an adequate term to describe the data streams being generated by new technologies.[59][60] With assistance from her fellow staff member at the Office of Naval Research, Walter Bailey, she coined the term "remote sensing".[61][62] Several research groups in Silicon Valley including NASA Ames Research Center, GTE, and ESL Inc. developed Fourier transform techniques leading to the first notable enhancement of imagery data. In 1999 the first commercial satellite (IKONOS) collecting very high resolution imagery was launched.[63]

Training and education


Remote sensing has a growing relevance in the modern information society. It represents a key technology within the aerospace industry and bears increasing economic relevance; new sensors such as TerraSAR-X and RapidEye are developed constantly, and the demand for skilled labour is increasing steadily. Furthermore, remote sensing increasingly influences everyday life, ranging from weather forecasts to reports on climate change or natural disasters. As an example, 80% of German students use the services of Google Earth; in 2006 alone the software was downloaded 100 million times. But studies have shown that only a fraction of them know much about the data they are working with.[64] There exists a huge knowledge gap between the application and the understanding of satellite images. Remote sensing plays only a tangential role in schools, regardless of the political claims to strengthen support for teaching the subject.[65] Much of the computer software explicitly developed for school lessons has not yet been implemented due to its complexity. As a result, the subject is either not integrated into the curriculum at all or does not pass beyond the interpretation of analogue images. In fact, the subject of remote sensing requires a consolidation of physics and mathematics, as well as competences in the fields of media and methods, beyond mere visual interpretation of satellite images.

Many teachers have great interest in the subject of remote sensing and are motivated to integrate the topic into teaching, provided that the curriculum allows for it. In many cases, this encouragement fails because of confusing information.[66] In order to integrate remote sensing in a sustainable manner, organizations like the EGU or Digital Earth[67] encourage the development of learning modules and learning portals. Examples include FIS – Remote Sensing in School Lessons,[68] Geospektiv,[69] Ychange,[70] and Spatial Discovery,[71] which promote media and method qualifications as well as independent learning.

Software


Remote sensing data are processed and analyzed with computer software, known as a remote sensing application. A large number of proprietary and open source applications exist to process remote sensing data.

Remote sensing with gamma rays


There are applications of gamma rays to mineral exploration through remote sensing. In 1972, more than $2 million was spent on remote sensing applications of gamma rays to mineral exploration. Gamma rays are used to search for deposits of uranium. By observing radioactivity from potassium, porphyry copper deposits can be located. A high ratio of uranium to thorium has been found to be related to the presence of hydrothermal copper deposits. Radiation patterns have also been known to occur above oil and gas fields, but some of these patterns were thought to be due to surface soils rather than oil and gas.[72]

Satellites

Six Earth observation satellites comprising the A-train satellite constellation as of 2014.

An Earth observation satellite or Earth remote sensing satellite is a satellite used or designed for Earth observation (EO) from orbit, including spy satellites and similar ones intended for non-military uses such as environmental monitoring, meteorology and cartography. The most common type is the Earth-imaging satellite, which takes satellite images analogous to aerial photographs; some EO satellites may perform remote sensing without forming pictures, as in GNSS radio occultation.

The first occurrence of satellite remote sensing can be dated to the launch of the first artificial satellite, Sputnik 1, by the Soviet Union on October 4, 1957.[73] Sputnik 1 sent back radio signals, which scientists used to study the ionosphere.[74] The United States Army Ballistic Missile Agency launched the first American satellite, Explorer 1, for NASA's Jet Propulsion Laboratory on January 31, 1958. The information sent back from its radiation detector led to the discovery of the Earth's Van Allen radiation belts.[75] The TIROS-1 spacecraft, launched on April 1, 1960, as part of NASA's Television Infrared Observation Satellite (TIROS) program, sent back the first television footage of weather patterns to be taken from space.[73]

In 2008, more than 150 Earth observation satellites were in orbit, recording data with both passive and active sensors and acquiring more than 10 terabits of data daily.[73] By 2021, that total had grown to over 950, with the largest number of satellites operated by US-based company Planet Labs.[76]

Most Earth observation satellites carry instruments that should be operated at a relatively low altitude. Most orbit at altitudes above 500 to 600 kilometers (310 to 370 mi), as lower orbits have significant air drag, which makes frequent orbit-reboost maneuvers necessary. The Earth observation satellites ERS-1, ERS-2 and Envisat of the European Space Agency, as well as the MetOp spacecraft of EUMETSAT, are all operated at altitudes of about 800 km (500 mi). The Proba-1, Proba-2 and SMOS spacecraft of the European Space Agency observe the Earth from an altitude of about 700 km (430 mi). The UAE's Earth observation satellites, DubaiSat-1 and DubaiSat-2, are also placed in low Earth orbit (LEO) and provide satellite imagery of various parts of the Earth.[77][78]

To get global coverage with a low orbit, a polar orbit is used. A low orbit will have an orbital period of roughly 100 minutes and the Earth will rotate around its polar axis about 25° between successive orbits. The ground track moves towards the west 25° each orbit, allowing a different section of the globe to be scanned with each orbit. Most are in Sun-synchronous orbits.

A geostationary orbit, at 36,000 km (22,000 mi), allows a satellite to hover over a constant spot on the Earth, since the orbital period at this altitude is 24 hours. This allows uninterrupted coverage of more than 1/3 of the Earth per satellite, so three satellites, spaced 120° apart, can cover the whole Earth. This type of orbit is mainly used for meteorological satellites.
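
The numbers in the last two paragraphs follow from Kepler's third law; a sketch using the standard gravitational parameter of the Earth:

```python
# Period from altitude via Kepler's third law, plus the Earth's rotation
# during one period, i.e. the westward ground-track shift per orbit.
import math

MU = 3.986004418e14     # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_371e3       # mean Earth radius, m
SIDEREAL_DAY = 86_164   # seconds

def period_s(altitude_m):
    a = R_EARTH + altitude_m          # semi-major axis of a circular orbit
    return 2 * math.pi * math.sqrt(a**3 / MU)

for alt_km in (560, 35_786):
    t = period_s(alt_km * 1e3)
    shift = 360 * t / SIDEREAL_DAY
    print(f"{alt_km} km: period {t/60:.0f} min, track shift {shift:.0f} deg/orbit")
# ~96 min and ~24 deg/orbit in LEO; ~360 deg/orbit (i.e. stationary relative
# to the ground) at the 35,786 km geostationary altitude
```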

from Grokipedia
Remote sensing is the science of acquiring information about objects, areas, or phenomena without physical contact, by detecting and measuring their reflected and emitted electromagnetic radiation from a distance, typically using sensors aboard aircraft, satellites, or ground-based platforms. This process relies on the interaction of electromagnetic energy with matter, enabling the inference of surface properties such as composition, structure, and health through spectral analysis. Originating from early 20th-century aerial photography, remote sensing evolved significantly with the launch of Sputnik 1 in 1957 and subsequent satellites like TIROS-1 in 1960, which pioneered space-based data collection for atmospheric and surface monitoring. Key milestones include the deployment of multispectral scanners in the 1970s, facilitating global programs such as Landsat, which have provided continuous data for land cover mapping and change detection since 1972. The technology encompasses passive systems, which measure naturally emitted or reflected energy like sunlight, and active systems, such as radar, which transmit signals and analyze returns to penetrate clouds or darkness. Applications span agriculture, for crop yield prediction, disaster management, via flood and wildfire mapping, meteorology, and oceanography, with achievements including precise sea level tracking and climate variability assessment through long-term datasets. Despite these advances, limitations persist, including atmospheric interference like cloud cover restricting optical sensors, resolution constraints in distinguishing fine-scale features, and the need for ground validation to ensure data accuracy, underscoring the field's reliance on complementary in-situ measurements.

Fundamentals

Definition and Principles

Remote sensing constitutes the acquisition of information about physical objects, areas, or phenomena by measuring reflected or emitted electromagnetic radiation from a distance, without physical contact between the sensor and the target. This process fundamentally depends on the interaction of electromagnetic waves with matter, where incident radiation from sources such as the Sun or artificial emitters interacts with atmospheric constituents and surface materials through mechanisms including reflection, absorption, transmission, and emission, altering the wave's properties based on the target's composition, structure, and state. Central principles encompass multiple dimensions of resolution that govern data quality and interpretability. Spatial resolution defines the finest resolvable detail, typically expressed as the ground area corresponding to one pixel, enabling detection of features from meters to kilometers depending on altitude and optics. Spectral resolution specifies the sensor's capacity to distinguish wavelengths, quantified by the number and bandwidth of spectral bands, which allows differentiation of materials based on unique reflectance signatures across the electromagnetic spectrum. Temporal resolution measures revisit frequency, critical for monitoring dynamic processes like vegetation growth or urban expansion, often constrained by orbital mechanics or flight schedules. Radiometric resolution quantifies the number of detectable intensity levels, typically in bits per pixel, influencing sensitivity to subtle variations in radiance. Retrieving target properties from observed signals poses an inverse problem, wherein forward models simulate radiance from assumed surface states, but ill-posedness arises as multiple configurations, such as varying moisture or aerosol loads, can yield indistinguishable measurements, necessitating regularization techniques and prior knowledge for unique solutions like estimating land cover fractions. Causal factors including sensor-target distance and atmospheric path introduce signal degradation; for example, gaseous absorption by water vapor and oxygen attenuates microwave signals, with empirical models showing losses of about 0.01 dB/km in dry air at 22 GHz, accumulating to 1 dB over a 100 km slant path, thereby reducing signal-to-noise ratios and biasing retrievals without correction.

Physical Basis and Electromagnetic Interactions

Remote sensing operates on the principle that electromagnetic radiation interacts with atmospheric constituents and surface materials through absorption, reflection, transmission, and emission, altering the radiation's intensity, direction, and spectral composition before detection by sensors. All objects with temperatures above absolute zero emit electromagnetic radiation, with the emitted spectrum approximating a blackbody curve shifted by emissivity, which measures radiative efficiency and varies by wavelength and material (ranging from 0 to 1). For opaque targets, the relationship between absorption (A) and reflection (R) follows A + R = 1, while transmission (T) is negligible; Kirchhoff's law equates absorptivity to emissivity at thermal equilibrium, enabling passive thermal infrared sensing of surface temperatures. The portion of the electromagnetic spectrum relevant to remote sensing spans ultraviolet (below 0.4 μm), visible (0.4–0.7 μm), near-infrared (0.7–1.3 μm), shortwave infrared (1.3–3 μm), thermal infrared (3–100 μm), and microwave (above 1 mm), with interactions determined by molecular structure, electronic transitions, and geometry. In the visible and near-infrared, vegetation exhibits strong absorption at red wavelengths (around 0.65 μm) due to chlorophyll pigments capturing photons for photosynthesis, contrasted by high reflection (up to 50–60%) in the near-infrared from internal scattering in mesophyll cells, which lack absorption centers at those wavelengths. Water and soil show opposite patterns, with water absorbing strongly beyond 0.7 μm due to molecular vibrations, while bare soils reflect more uniformly but with lower near-infrared values than vegetation. Microwave interactions involve dielectric properties, where surface roughness and moisture content influence backscattering via volume and surface mechanisms. Atmospheric effects modify upwelling radiation through gaseous absorption by species like water vapor, carbon dioxide, and ozone (peaking at specific bands, e.g., 9.6 μm for O3), and scattering: Rayleigh scattering by molecules dominates shorter wavelengths (intensity proportional to λ^-4, explaining blue sky dominance), while Mie scattering by aerosols and cloud droplets affects visible to infrared wavelengths with less wavelength dependence. These processes attenuate signals and add path radiance, necessitating corrections via radiative transfer models like MODTRAN, which simulates layered atmospheric transmission, molecular/particle absorption-emission, and multiple scattering for wavelengths from ultraviolet to far-infrared. Longer wavelengths like microwaves penetrate clouds effectively because cloud droplet diameters (typically 10–50 μm) are much smaller than microwave wavelengths (centimeters), rendering scattering cross-sections negligible (σ ∝ (2πa/λ)^4 a^2, where a is the droplet radius), with minimal attenuation compared to optical bands, where droplet sizes approximate wavelengths, causing strong forward scattering and obscuration. This contrasts with visible/near-infrared limitations, where cumulative scattering and absorption by hydrometeors block surface signals, underscoring wavelength-scale dependencies in propagation physics.
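
A short sketch of these wavelength dependences, using the λ^-4 Rayleigh ratio and the Rayleigh-regime cross-section scale quoted above (illustrative magnitudes only):

```python
# Rayleigh scattering intensity scales as lambda^-4, so blue light scatters
# several times more than red. For cloud droplets, the Rayleigh-regime
# cross-section scale sigma ~ (2*pi*a/lambda)^4 * a^2 applies at microwave
# wavelengths (a << lambda) and shows why radar sees through clouds.
import math

print((0.65 / 0.45) ** 4)   # ~4.4: blue/red molecular scattering ratio

a = 20e-6        # 20 um droplet radius, within the 10-50 um range quoted
lam = 0.056      # C-band radar wavelength, 5.6 cm
sigma_scale = (2 * math.pi * a / lam) ** 4 * a ** 2
print(sigma_scale)   # ~1e-20 m^2: clouds are nearly transparent at C-band
```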

Platforms

Spaceborne Platforms

Spaceborne platforms enable remote sensing at global scales through satellites in low Earth orbit (LEO) and geostationary orbit (GEO), leveraging orbital mechanics for extensive coverage independent of terrestrial constraints. LEO altitudes, typically 500-800 km for Earth observation missions, position satellites close enough for detailed imaging while allowing sun-synchronous paths to minimize illumination variability across revisits. However, the rapid orbital velocity, approximately 7.8 km/s, necessitates multiple passes or constellations to achieve practical revisit frequency, as single satellites cover only narrow swaths per pass. GEO platforms, stationed at 35,786 km, match Earth's rotation for stationary viewpoints over equatorial regions, providing uninterrupted hemispheric views but with inherent resolution limits due to greater distance. The Landsat series illustrates LEO capabilities, with Landsat 9, launched September 27, 2021, operating at 705 km in a near-polar, sun-synchronous orbit, yielding a 185 km swath width and a 16-day revisit interval that halves to 8 days when paired with Landsat 8's offset phasing. The Copernicus Sentinel constellation, commencing with Sentinel-1A on April 3, 2014, deploys pairs in 693-700 km LEO orbits 180° apart, enabling 6-12 day global revisits scalable with additional units for enhanced temporal density. Commercial LEO fleets, such as Planet Labs' Dove nanosatellites, amplify coverage via large constellations exceeding 150 units at varied altitudes under 600 km, delivering near-daily imaging of all land surfaces since achieving full deployment around 2017. These systems exploit orbital trade-offs: proximity in LEO boosts ground resolution for fixed apertures but constrains the instantaneous field-of-view to tens of kilometers, demanding high orbital inclination for pole-to-pole access and increasing vulnerability to atmospheric drag that shortens mission life without propulsion. Elevated altitudes expand swaths for broader synoptic data but dilute resolution proportionally to distance, elevating requirements for larger apertures or enhanced detector sensitivity to counter the diminished received power from inverse-square falloff. GEO satellites like the GOES-R series prioritize persistence, imaging full Western Hemisphere disks every 5-15 minutes from fixed positions, ideal for real-time monitoring of transient events such as storms, though pixel scales degrade to 0.5-4 km owing to the 36,000 km vantage. This configuration avoids revisit gaps but limits utility for fine-scale terrestrial features, underscoring the causal interplay where altitude inversely scales resolution and power budgets while directly enhancing coverage continuity.

Airborne Platforms

USAF U-2 aircraft, precursor to NASA's ER-2

Airborne platforms encompass manned aircraft and unmanned aerial vehicles (UAVs) deployed for remote sensing, operating at altitudes from tens of meters to over 20 km to deliver enhanced spatial resolution and deployment flexibility relative to spaceborne systems. These platforms facilitate rapid response missions and repeated observations over targeted regions, with manned high-altitude aircraft like NASA's ER-2 flying at approximately 21 km (70,000 feet) to simulate satellite perspectives while minimizing atmospheric interference, as the aircraft operates above 95% of the Earth's atmosphere. The ER-2 supports up to 12-hour flights equipped with diverse sensors for Earth observation, including multispectral and hyperspectral instruments. UAVs, such as the DJI Matrice series, enable low-altitude targeted surveys, integrating hyperspectral sensors for detailed spectral analysis in applications like precision agriculture. For instance, the Matrice 300 RTK has been used to acquire hyperspectral data over protected areas, offering precise control over flight paths and sensor orientation. At altitudes around 100 m, these systems achieve ground sample distances (GSD) in the sub-meter range, far surpassing typical satellite resolutions for fine-scale features. Key advantages of airborne platforms include superior revisit frequency for dynamic regional monitoring and reduced costs compared to satellite operations for localized tasks, allowing on-demand deployment without orbital constraints. Recent developments in 2024-2025 emphasize UAV-satellite fusion techniques, such as pixel-based and feature-based integration, to combine high-resolution UAV imagery with broader satellite coverage for multi-scale monitoring in areas like crop stress detection. This fusion enhances temporal and spatial complementarity, addressing limitations in individual platform revisit times and coverage.
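
A sketch of the ground sample distance relation implied here, GSD = altitude × pixel pitch / focal length, with hypothetical camera parameters comparing a low UAV flight to a satellite orbit:

```python
# GSD scales linearly with altitude for fixed optics, which is why a 100 m
# UAV flight resolves centimeters while the same (toy) camera in orbit
# resolves tens of meters. Pixel pitch and focal length are hypothetical.
def gsd_m(altitude_m, pixel_pitch_m=5e-6, focal_length_m=0.05):
    return altitude_m * pixel_pitch_m / focal_length_m

print(gsd_m(100))      # 0.01 m: centimeter-scale detail from a UAV
print(gsd_m(705_000))  # 70.5 m from a 705 km orbit with the same optics
```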

Ground-Based and Proximal Platforms

Ground-based remote sensing employs sensors mounted on static structures like tripods, towers, or scaffolds, or mobile platforms such as vehicles, to collect data directly from terrestrial surfaces at short ranges, typically enabling resolutions down to centimeters. These platforms facilitate detailed measurements of surface properties, including spectral reflectance via tripod-mounted spectrometers and structural features through terrestrial laser scanning (TLS) systems, which emit laser pulses to map vegetation height or terrain topography with sub-centimeter precision. Proximal sensing, a variant operating at distances under a few meters, often integrates optical sensors like hyperspectral radiometers or proximal spectrometers to capture near-field data on soils, crop canopies, or atmospheric profiles, minimizing the path-length effects inherent in elevated or orbital systems. In practice, these platforms serve as critical tools for ground truthing, where proximal spectrometers measure in-situ reflectance spectra to calibrate and validate models derived from airborne or spaceborne imagery, ensuring spectral signatures align with empirical surface interactions rather than distorted proxies. For instance, vehicle-mounted proximal sensors combining multiple sensing modalities provide simultaneous soil property profiles during field campaigns, correlating proximal data with laboratory analyses to refine remote sensing algorithms for variables like organic carbon content. Acoustic sensors, deployed ground-based for near-surface applications, detect subsurface features via sound wave propagation, complementing optical methods in environments with high particulate interference. The causal advantage of ground-based and proximal approaches lies in their negligible atmospheric traversal, which empirically reduces signal degradation from absorption and scattering, effects quantified in proximal soil sensing studies as lowering error variances by up to 20-30% compared to aerial equivalents due to direct surface-to-sensor coupling. This proximity preserves raw electromagnetic or acoustic signatures, enabling higher fidelity in retrievals of local phenomena, such as vegetation water content via proximal fluorescence measurements, without the confounding variables of tropospheric water vapor or aerosols prevalent in longer-range acquisitions. Consequently, these methods underpin calibration and site-specific validation, where empirical datasets from proximal platforms anchor broader remote sensing interpretations against overgeneralized atmospheric models.

Sensing Technologies

Passive Sensing Methods

Passive remote sensing methods detect electromagnetic radiation emitted or reflected by natural sources, such as solar illumination on Earth's surface or thermal emissions from terrestrial objects, without the sensor providing its own illumination source. These techniques rely on the physical principles of radiometry, where radiometers measure radiance arriving from the target scene after interaction with the atmosphere. Common implementations include optical systems for reflected sunlight and thermal radiometers for emitted radiation, operating primarily in the visible to shortwave infrared (0.4–2.5 μm) and thermal infrared (8–14 μm) spectral regions, respectively. Optical passive sensors, such as multispectral cameras, capture reflected solar radiation in discrete bands to quantify surface properties, enabling material identification through spectral contrast. For instance, the Thematic Mapper instrument on Landsat 5, launched on March 1, 1984, acquired data in seven bands with 30-meter spatial resolution, supporting long-term monitoring of land cover changes despite its decommissioning in January 2013. Advanced hyperspectral variants extend this to hundreds of contiguous narrow bands for finer spectral discrimination; the PRISMA satellite, launched March 22, 2019, by the Italian Space Agency, images in over 200 bands from 400 to 2500 nm at 30-meter resolution, enhancing discrimination of subtle biochemical signatures in vegetation and minerals. Signal-to-noise ratio (SNR) in these systems is fundamentally limited by photon arrival statistics, detector noise, and atmospheric scattering, with empirical data showing SNR degradation at high solar zenith angles due to reduced incident flux. Thermal radiometers measure blackbody-like emissions from surfaces, governed by Planck's law and the Stefan-Boltzmann relation, where radiance correlates with kinetic temperature raised to the fourth power, modulated by emissivity. These sensors detect heat contrasts day or night, independent of sunlight, but remain constrained by atmospheric absorption in water vapor bands and cloud opacity, which blocks surface emissions entirely. SNR in thermal systems varies with target temperature differential and integration time, often achieving 100–300 in clear conditions for mid-resolution sensors, though empirical tests reveal drops below 50 under partial cloud interference from scattered path noise. Overall, passive methods' efficacy hinges on external illumination or emission strength, imposing inherent temporal and weather dependencies absent in active counterparts, as validated by field-calibrated datasets showing null returns in darkness for reflective bands.

Active Sensing Methods

Active remote sensing methods employ sensors that actively transmit electromagnetic energy toward a target and detect the backscattered signal to derive information about the target's properties, distance, and motion. Unlike passive methods reliant on natural illumination, active techniques operate independently of solar or ambient light, enabling continuous observation during darkness or in shadowed areas. Microwave-based systems, such as radar, additionally penetrate atmospheric clouds, precipitation, and vegetation to varying degrees depending on wavelength, providing all-weather capabilities essential for consistent monitoring. Radar systems, operating in the microwave portion of the electromagnetic spectrum (wavelengths from millimeters to meters), transmit pulses or continuous waves and measure the time delay and phase shift of echoes for ranging and imaging. Synthetic aperture radar (SAR) enhances resolution by simulating a large antenna aperture through platform motion, achieving ground resolutions down to meters from spaceborne platforms. For instance, the European Space Agency's Sentinel-1 satellites, equipped with C-band SAR (wavelength approximately 5.6 cm), provide interferometric wide-swath imaging at resolutions of 5 m by 20 m, supporting applications requiring high temporal revisit rates of 6-12 days. Longer wavelengths, such as L-band (around 23 cm), exhibit greater penetration into vegetation canopies, with empirical studies demonstrating signal interaction with underlying terrain in forested areas up to several meters deep, as evidenced by backscatter analyses from spaceborne missions. Interferometric SAR (InSAR) exploits phase differences between multiple acquisitions to generate digital elevation models with centimeter-level accuracy over large areas. Doppler radar variants utilize the frequency shift in returned signals caused by relative motion between the sensor and the target, enabling velocity measurements with precisions on the order of millimeters per second. This effect arises from the compression or extension of wavefronts, directly quantifying radial speeds for detecting dynamic phenomena like surface deformation or fluid flows. In remote sensing contexts, Doppler processing in SAR modes supports moving-target indication, complementing amplitude-based imaging. Light Detection and Ranging (LiDAR) systems transmit short laser pulses, typically in the near-infrared spectrum (e.g., 1064 nm), and compute distances from the round-trip travel time, yielding high-precision three-dimensional point clouds. Spaceborne lidar, such as NASA's ICESat-2 mission launched on September 15, 2018, employs the Advanced Topographic Laser Altimeter System (ATLAS) to measure surface elevations with vertical accuracies better than 10 cm along strong beam tracks spaced 17 m apart. While lidar offers sub-meter horizontal resolutions and dense sampling for topographic mapping, its penetration is limited to translucent media like sparse vegetation or shallow water, unlike radar's broader subsurface access in certain bands. Active methods' self-illumination principle ensures direct causal measurement of target response, minimizing dependencies on external variables like solar geometry.
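
Two of the ranging relations above reduce to one-line formulas. A sketch with illustrative inputs: lidar range from the round-trip pulse time, and radial velocity from a Doppler shift at an assumed C-band carrier.

```python
# Active ranging arithmetic: range = c * t / 2 (out and back), and
# radial velocity = c * df / (2 * f) for a monostatic radar. Inputs are
# illustrative, not tied to a specific instrument.
C = 2.99792458e8  # speed of light, m/s

def lidar_range_m(round_trip_s):
    return C * round_trip_s / 2

def doppler_velocity_ms(freq_shift_hz, carrier_hz):
    return C * freq_shift_hz / (2 * carrier_hz)

print(lidar_range_m(3.34e-6))               # ~500 m target
print(doppler_velocity_ms(100.0, 5.405e9))  # ~2.8 m/s at a 5.405 GHz carrier
```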

Multispectral, Hyperspectral, and Radar Techniques

Multispectral remote sensing acquires reflectance data across a limited number of discrete, relatively broad spectral bands, typically 3 to 10, enabling differentiation of surface materials by exploiting distinct reflectance patterns in visible, near-infrared, and sometimes thermal wavelengths. The Moderate Resolution Imaging Spectroradiometer (MODIS), deployed on NASA's Terra and Aqua satellites since 1999 and 2002 respectively, exemplifies this approach with 36 bands spanning 0.4 to 14.5 μm and nadir resolutions of 250 m (bands 1-2), 500 m (bands 3-7), and 1 km (bands 8-36). This configuration balances coverage and computational feasibility but limits fine-grained material identification due to coarser spectral sampling. Hyperspectral remote sensing advances material discrimination by capturing data in hundreds of contiguous narrow bands, often 200 or more, yielding continuous spectra that reveal subtle absorption features tied to molecular composition. The EnMAP satellite, launched April 1, 2022, by the German Aerospace Center (DLR), delivers 246 bands from 420 to 2450 nm at 30 m spatial resolution, calibrated for quantitative spectroscopic analysis. Assessments as recent as October 2025 confirm EnMAP's utility in deriving detailed endmember libraries for sub-pixel material mapping, with innovations in preprocessing enhancing signal-to-noise ratios for low-reflectance targets. Techniques like spectral unmixing further exploit this density by linearly decomposing mixed pixels into pure endmember spectra and abundance fractions, assuming pixels comprise convex combinations of spectrally distinct components, thus enabling resolution of heterogeneity below the native pixel scale. Radar techniques complement optical methods through active microwave illumination, penetrating clouds and operating day or night to probe surface geometry via backscattering. Synthetic aperture radar (SAR) polarimetry quantifies surface roughness by transmitting and receiving in orthogonal polarizations (e.g., HH, VV, HV), yielding a covariance matrix decomposable into surface, volume, and double-bounce scattering contributions per the Pauli or Freeman-Durden models. Entropy-alpha decomposition, for instance, parameterizes scattering behavior via the alpha angle derived from eigenvector analysis of the coherency matrix, with higher entropy indicating diffuse scattering from irregular surfaces. Fully polarimetric data at X-band (8-12 GHz), as in airborne systems, resolve roughness variations on scales comparable to the wavelength, distinguishing smooth from corrugated terrains through cross-polarization ratios exceeding -20 dB for rough interfaces.
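
A minimal sketch of linear unmixing on synthetic endmembers; real workflows add non-negativity and sum-to-one constraints rather than plain least squares.

```python
# A mixed pixel is modeled as a convex combination of endmember spectra;
# the abundances are recovered by least squares. Endmember spectra are
# synthetic (two materials over five hypothetical bands).
import numpy as np

E = np.array([[0.30, 0.05],     # columns: "soil" and "vegetation" spectra
              [0.35, 0.04],
              [0.40, 0.06],
              [0.45, 0.50],
              [0.50, 0.55]])

true_abundance = np.array([0.3, 0.7])
pixel = E @ true_abundance + np.random.default_rng(1).normal(0, 0.002, 5)

est, *_ = np.linalg.lstsq(E, pixel, rcond=None)
print(est)   # ~[0.3, 0.7]: recovered sub-pixel material fractions
```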

Data Management

Data Acquisition Characteristics

Remote sensing data acquisition yields raw datasets characterized primarily by four resolution types: spatial, spectral, radiometric, and temporal. Spatial resolution determines the smallest discernible feature on the ground, typically measured in meters per pixel, with values ranging from sub-meter for high-end commercial satellites to hundreds of meters for coarse sensors like MODIS. Spectral resolution specifies the number and width of electromagnetic bands captured, enabling differentiation of materials based on reflectance signatures, as in multispectral systems with 4-10 bands or hyperspectral systems with hundreds. Radiometric resolution, quantified by bit depth (e.g., 8-bit yielding 256 gray levels or 12-bit offering 4096), governs the sensor's ability to distinguish subtle intensity variations, with higher depths preserving fidelity in low-contrast scenes but increasing data size. Temporal resolution reflects revisit frequency, often 1-16 days for sun-synchronous orbits like Landsat's, constrained by orbital mechanics and swath width. Raw data is commonly stored in self-describing formats like HDF5, which supports hierarchical structures for multidimensional arrays, metadata, and extensibility, as used in Earth observation satellites for efficient handling of petabyte-scale archives. Accompanying metadata includes geolocation coordinates, acquisition timestamps, sensor orientation, and platform ephemeris, with absolute geolocation accuracy varying from meters in optical systems to sub-meter in SAR due to precise range-azimuth measurements. Geometric distortions inherent to raw acquisitions arise from platform motion, off-nadir viewing, and terrain relief, manifesting as relief displacement in optical imagery or foreshortening and layover in side-looking radar, independent of post-acquisition correction. These effects scale with altitude and incidence angle, potentially shifting features by tens of pixels in uncorrected data from airborne or agile satellites. Acquisition from satellite constellations generates vast volumes, often exceeding petabytes annually (e.g., NASA's Earth science archive stood at roughly 40 PB as of 2020), balancing extensive coverage against per-scene quality trade-offs like reduced signal-to-noise in miniaturized CubeSats versus dedicated platforms. Higher resolution amplifies volume exponentially, necessitating onboard compression or selective downlinking to manage bandwidth limits.
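
A back-of-envelope sketch of how resolution and bit depth drive volume, with representative (not mission-specific) parameters:

```python
# Raw scene size = pixels * bands * bit depth; yearly volume then scales
# with acquisition rate. All parameters are representative placeholders.
def scene_gb(width_px, height_px, bands, bit_depth):
    bits = width_px * height_px * bands * bit_depth
    return bits / 8 / 1e9

one_scene = scene_gb(6_000, 6_000, 8, 12)
print(one_scene)               # ~0.43 GB per 12-bit, 8-band scene
print(one_scene * 700 * 365)   # ~110,000 GB/yr at 700 scenes per day
```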

Preprocessing and Calibration

Preprocessing in remote sensing involves initial corrections to raw sensor data to mitigate distortions arising from instrumental, environmental, and platform-specific factors, enabling conversion of digital numbers (DN) to physically meaningful quantities such as radiance or reflectance. Key error sources include sensor calibration inaccuracies and drift due to degradation over time, which can introduce systematic biases if not addressed from first principles of radiative transfer. These steps precede higher-level analysis and rely on empirical validation against ground truth to achieve sub-pixel accuracy where feasible.

Radiometric calibration standardizes sensor response by transforming raw DN values into at-sensor radiance or top-of-atmosphere reflectance, often using pre-launch laboratory measurements adjusted via in-flight vicarious methods. Vicarious calibration employs stable ground sites, such as the Dunhuang test site in China's Gobi Desert, where simultaneous surface reflectance measurements from field instruments validate satellite data; for instance, experiments on December 14, 2021, at the site assessed multispectral imager accuracy to within 5% for select bands. Networks like RadCalNet provide automated, global vicarious references for absolute radiometric calibration, reducing reliance on manufacturer coefficients prone to post-launch drift. Destriping addresses striping artifacts from detector non-uniformity or calibration errors in pushbroom scanners, employing variational models that minimize stripe directionality while preserving edges, as demonstrated in hyperspectral data where stripe noise arises from sensor response variations.

Geometric rectification corrects spatial distortions from viewing geometry, platform motion, and terrain relief, typically through orthorectification that projects imagery onto a map grid using digital elevation models (DEMs) and ground control points (GCPs). This removes relief displacement, achieving accuracies often below 1 pixel RMSE when validated against independent GCPs; for example, assessments of orthorectified products report RMSE values of 0.5-2 pixels depending on DEM resolution and terrain type. Empirical validation via RMSE quantifies residual errors, with lower values indicating effective tie-point distribution and model fidelity, though unmodeled platform variations can propagate if not accounted for in the geometric model.

Atmospheric correction compensates for scattering and absorption effects that attenuate and alter signals, converting at-sensor radiance to surface reflectance via radiative transfer models. The FLAASH algorithm, based on MODTRAN4, performs this correction for visible-to-shortwave-infrared hyperspectral and multispectral imagery by inverting path radiance and transmittance along the line-of-sight, incorporating aerosol and water vapor estimates from image histograms or ancillary data; it handles adjacency effects and nonuniform atmospheres, yielding corrections accurate to 2-5% in validation against in-situ spectra. Such methods prioritize causal error propagation from molecular and particulate scattering, validated empirically rather than assumed negligible, to ensure downstream usability.
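The DN-to-reflectance chain can be sketched as two short functions; the gain, offset, and solar irradiance values here are hypothetical placeholders for the per-band coefficients that a sensor's metadata would actually supply.

```python
import numpy as np

def dn_to_radiance(dn, gain, offset):
    """Linear radiometric calibration: DN -> at-sensor radiance (W/m^2/sr/um)."""
    return gain * dn + offset

def radiance_to_toa_reflectance(radiance, esun, d_au, sun_elev_deg):
    """Convert at-sensor radiance to top-of-atmosphere reflectance."""
    theta_s = np.radians(90.0 - sun_elev_deg)   # solar zenith from elevation
    return np.pi * radiance * d_au**2 / (esun * np.cos(theta_s))

# Hypothetical per-band coefficients, as would come from scene metadata.
dn = np.array([1023, 2100, 3480], dtype=float)
radiance = dn_to_radiance(dn, gain=0.05, offset=-0.1)
rho = radiance_to_toa_reflectance(radiance, esun=1536.0, d_au=1.0, sun_elev_deg=45.0)
print(np.round(rho, 4))
```

Full atmospheric correction to surface reflectance would then invert path radiance and transmittance with a radiative transfer code, as FLAASH does; the sketch above stops at the top of the atmosphere.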

Analysis Pipelines and Levels

Remote sensing pipelines follow a hierarchical structure, transforming raw observations into actionable insights through sequential processing stages. These pipelines typically adhere to standardized levels defined by agencies like NASA, where Level 0 (L0) consists of reconstructed, unprocessed instrument data at full resolution, including both signal and instrument noise, without corrections applied. Level 1 (L1) data incorporate radiometric and geometric corrections, yielding calibrated and geolocated instrument measurements suitable for initial analysis. Higher levels build upon these: Level 2 (L2) derives specific geophysical variables, such as surface reflectance or vegetation indices, from L1 inputs using algorithms tailored to sensor characteristics; Level 3 (L3) aggregates L2 data onto uniform spatiotemporal grids for statistical analysis; and Level 4 (L4) integrates model assimilations or simulations, producing synthesized outputs like forecasts that combine remote sensing with in-situ data or numerical models.

Core methods within these pipelines include pixel-based or object-based classification to categorize land cover or features, employing supervised techniques (e.g., maximum likelihood or support vector machines trained on labeled datasets) or unsupervised approaches (e.g., clustering via k-means) to partition imagery. Change-detection pipelines compare multi-temporal datasets to identify alterations, using methods such as post-classification comparison or spectral differencing, often benchmarked by metrics like the kappa coefficient, which measures agreement between classified maps beyond chance, with values above 0.8 indicating strong performance in validated studies. These methods propagate through levels, ensuring derived products at L2 and above retain traceability to raw inputs via metadata on processing history and algorithmic parameters.

Recent advancements incorporate machine learning to automate analysis and enhance efficiency, particularly in onboard processing for real-time applications; for instance, zero-shot AI models enable automated classification without extensive retraining, reducing computational demands for large-scale remote sensing datasets, as demonstrated in 2025 frameworks. Deep learning architectures, such as convolutional neural networks, have been integrated into classification and change detection at L2 stages, improving accuracy in complex scenes like urban expansion monitoring by handling nonlinear feature interactions that traditional methods overlook. Uncertainty propagation remains integral, employing first-order error analysis or Monte Carlo simulations to quantify how radiometric noise or geometric distortions at L0 amplify into L4 model outputs, thereby supporting uncertainty-aware decision-making in downstream applications like environmental modeling. This rigorous handling ensures derived products include error bounds, with peer-reviewed benchmarks showing propagated uncertainties typically under 5-10% for well-calibrated sensors in L2 vegetation indices.
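The kappa coefficient mentioned above is straightforward to compute from a confusion matrix, as in this minimal sketch with hypothetical three-class counts.

```python
import numpy as np

def kappa(confusion):
    """Cohen's kappa from a classification confusion matrix."""
    confusion = np.asarray(confusion, dtype=float)
    n = confusion.sum()
    p_o = np.trace(confusion) / n                              # observed agreement
    p_e = (confusion.sum(0) * confusion.sum(1)).sum() / n**2   # chance agreement
    return (p_o - p_e) / (1 - p_e)

# Hypothetical 3-class map vs. reference data (rows: mapped, cols: reference).
cm = [[50,  3,  2],
      [ 4, 45,  6],
      [ 1,  5, 44]]
print(round(kappa(cm), 3))  # ~0.80; values above 0.8 indicate strong agreement
```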

Applications

Environmental Monitoring and Earth Science

Remote sensing provides empirical observations of Earth's dynamic environmental systems, enabling the quantification of changes in land, ocean, and atmospheric variables over decadal scales. Satellite platforms deliver repeatable, global coverage that surpasses ground-based networks in spatial extent, supporting causal analyses of phenomena like vegetation dynamics and hydrological cycles. For instance, time-series data from missions such as Landsat have documented forest cover losses, while altimetry and ocean color sensors track sea level and productivity shifts, offering baselines for validating process-based models.

In climate tracking, radar altimetry from the TOPEX/Poseidon and Jason series satellites has measured a global mean sea-level rise of 111 mm from 1993 to 2023, with the rate doubling from 2.1 mm per year initially to 4.5 mm per year by 2024, attributed to thermal expansion and ice melt contributions discernible through precise orbit and instrument calibrations. Landsat-derived indices, such as normalized difference vegetation index (NDVI) time-series, have quantified deforestation rates in the Amazon, where annual losses exceeded 10,000 km² between 2019 and 2022, informing policy responses despite variability from seasonal clouding and limits on detecting selective logging. These datasets underpin IPCC reports by providing observational constraints on essential climate variables, though integration requires cross-validation with in-situ measurements to mitigate algorithmic assumptions.

Oceanographic applications leverage passive sensors like MODIS on the Aqua satellite to estimate surface chlorophyll-a concentrations via bio-optical algorithms, proxying phytoplankton biomass and revealing spatiotemporal patterns in marine productivity linked to nutrient upwelling and temperature stratification. Such data establish global baselines for biodiversity hotspots, yet optical methods suffer from cloud cover biases that skew tropical and high-latitude sampling, potentially underestimating variability by up to 15-20% in discharge or productivity estimates without active radar supplementation. Empirical strengths lie in long-term consistency, as evidenced by multi-decadal archives, but causal inferences demand caution against overreliance, given propagation of preprocessing errors into downstream analyses and the need for ground-truthed calibration to distinguish signal from noise in heterogeneous terrains.
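As a concrete example of such indices, NDVI reduces to a simple band ratio; the reflectance values below are hypothetical.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index from NIR and red reflectance."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red + 1e-10)  # epsilon guards divide-by-zero

# Hypothetical reflectance triplets: dense forest, sparse shrub, bare soil.
nir = np.array([0.45, 0.30, 0.25])
red = np.array([0.05, 0.15, 0.22])
print(np.round(ndvi(nir, red), 2))  # high values flag vigorous vegetation;
# a pixel's decline across a time series is one signal of forest loss
```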

Military and Intelligence Operations

Remote sensing has been integral to military reconnaissance since the Cold War era, enabling surveillance over denied territories without risking personnel. The Corona program, initiated by the U.S. in 1959, launched its first successful mission on August 18, 1960, from Vandenberg Air Force Base, capturing photographic imagery via film-return satellites that produced over 800,000 images across 145 missions until 1972, providing critical intelligence on Soviet capabilities. Declassified in 1995, these images demonstrated remote sensing's capacity for strategic monitoring, filling gaps left by U-2 overflights after the 1960 U-2 incident. The U-2 aircraft, operational since 1956, conducted high-altitude missions up to 70,000 feet, employing optical and radar sensors for signals intelligence and imagery in operations like the 1991 Gulf War, where it delivered near-real-time data to commanders.

Synthetic aperture radar (SAR) enhances military operations by providing all-weather, day-night imaging capable of penetrating cloud cover and foliage to detect concealed targets, such as vehicle movements or underground structures. Deployed on platforms from aircraft to satellites, SAR supports reconnaissance, targeting, and battle damage assessment, as evidenced by its use in tracking enemy positions and movements in modern conflicts. While susceptible to jamming, empirical successes in combat operations underscore its strategic value, offering superior situational awareness over optical methods limited by weather.

In contemporary intelligence, remote sensing verifies treaties through national technical means, including satellite monitoring of nuclear sites and missile deployments, as protected under agreements like the 1972 SALT I treaty. During the 2022 Russia-Ukraine conflict, commercial providers like Maxar supplied high-resolution imagery under U.S. government contracts, enabling real-time analysis of troop movements and infrastructure damage and the debunking of disinformation, with datasets confirming widespread destruction via change-detection algorithms. These integrations highlight hybrid commercial-military models, where firms secure multimillion-dollar defense deals for imagery and analytics, augmenting national assets despite vulnerabilities like signal interference.

Agriculture, Resource Management, and Disaster Response

Remote sensing enables precision agriculture by providing data for site-specific crop management, such as using the normalized difference vegetation index (NDVI) derived from satellite imagery to predict yields. For instance, time-integrated NDVI from Landsat imagery has been modeled to forecast yields through linear mixed-effects approaches, correlating accumulated canopy greenness over growing seasons with harvest outcomes. Empirical studies demonstrate that integrating multispectral remote sensing with machine learning improves yield estimation accuracy, allowing farmers to optimize irrigation and fertilizer application, thereby reducing input costs by up to 20-30% while maintaining or increasing yields. Variable rate technology guided by these data minimizes nutrient losses and runoff, enhancing long-term soil resilience without yield penalties.

In mining and resource management, remote sensing supports geological surveys by mapping surface alterations and recovery post-extraction. Satellite and aerial imagery assess land disturbance, with hyperspectral data identifying mineral compositions for exploration efficiency. Case studies from the U.S. Geological Survey illustrate how multi-temporal remote sensing tracks mine site reclamation, quantifying vegetation regrowth rates and erosion risks to inform remediation and sustainable practices. This approach reduces exploratory drilling needs by prioritizing high-potential areas, though accuracy depends on sensor resolution matching site variability.

For disaster response, remote sensing facilitates rapid damage assessment and early warnings, as seen in the Copernicus Emergency Management Service's activation for the February 2023 Turkey-Syria earthquakes (magnitudes 7.8 and 7.5), where satellite-based analyses generated damage and displacement maps across affected zones within days. The Famine Early Warning Systems Network (FEWS NET) employs satellite-derived vegetation indices to monitor drought impacts on crops, enabling predictions of food insecurity phases that guide aid distribution in chronically food-insecure regions. However, limitations include data latency from processing delays, which can hinder real-time response to acute events, and reduced efficacy in rugged terrains where cloud cover or topographic shadows obscure optical sensors. These constraints underscore the need for complementary active sensing methods like radar to ensure reliable coverage.
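The time-integrated NDVI yield forecasting described at the start of this section can be sketched numerically; the dates, NDVI values, and yields below are hypothetical, and ordinary least squares stands in for the mixed-effects models used in published studies.

```python
import numpy as np

# Hypothetical per-field data: NDVI sampled at several dates in one season.
days = np.array([0, 16, 32, 48, 64, 80])            # days since emergence
ndvi_series = np.array([
    [0.20, 0.45, 0.70, 0.75, 0.60, 0.40],           # field 1
    [0.18, 0.35, 0.55, 0.60, 0.50, 0.30],           # field 2
    [0.22, 0.50, 0.78, 0.82, 0.70, 0.45],           # field 3
])
yields = np.array([9.1, 6.8, 10.4])                 # t/ha, hypothetical

# Time-integrated NDVI: trapezoidal area under each seasonal curve.
ti_ndvi = np.trapz(ndvi_series, days, axis=1)

# Fit yield as a linear function of TI-NDVI (OLS sketch).
A = np.column_stack([np.ones_like(ti_ndvi), ti_ndvi])
intercept, slope = np.linalg.lstsq(A, yields, rcond=None)[0]
print(f"yield ~ {intercept:.2f} + {slope:.3f} * TI-NDVI")
```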

Urban Planning, Infrastructure, and Commercial Uses

Remote sensing technologies, including satellite imagery and LiDAR, enable detailed mapping of urban land use and expansion patterns, supporting planners in assessing growth trajectories and zoning decisions. For instance, satellite-derived data processed through platforms like Google Earth Engine have been used to quantify urban expansion in regions such as Ambon City, Indonesia, by analyzing Landsat imagery from 1990 to 2020 to detect built-up area increases and inform development strategies. Similar Earth Engine applications have mapped sprawl over three decades in other rapidly urbanizing regions, revealing annual expansion rates exceeding 5% in peri-urban areas through classification of built-up versus vegetated lands. These tools provide repeatable, large-scale analyses that traditional ground surveys cannot match in scope or frequency, though initial setup requires computational expertise.

In infrastructure management, LiDAR systems facilitate non-contact inspection of critical assets like bridges, generating high-resolution 3D point clouds to detect deformations, cracks, and displacements without halting traffic. A Transportation Research Board study demonstrated mobile LiDAR's efficacy in capturing structural geometries during routine scans, achieving sub-centimeter accuracy for change detection over multiple surveys. For example, drone-mounted LiDAR has been applied to assess bridge decks and towers, reducing inspection times from days to hours while minimizing worker exposure to hazards, as evidenced in U.S. pilot programs. Such applications enhance safety and efficiency but face limitations from high equipment costs, often exceeding $100,000 per system, and atmospheric interference in adverse weather.

Traffic analysis in urban settings benefits from remote sensing via satellite and aerial imagery, allowing extraction of vehicle counts, speeds, and flow patterns across entire cities. High-resolution data, combined with detection algorithms, has enabled monitoring of traffic volumes at scales beyond fixed sensor networks, as shown in studies of urban intersections where daily vehicle densities were estimated with 85-90% accuracy. Thermal and optical remote sensing further quantifies congestion impacts, such as heat island effects from roadways during low-traffic periods, aiding in mitigation planning. These methods offer efficiency gains over manual counts but are constrained by cloud cover obscuring optical sensors and the need for ground-truth validation to mitigate algorithmic errors in complex scenes.

Commercially, remote sensing drives revenue through services like hyperspectral oil detection for hydrocarbon exploration and spill response, where spectral signatures distinguish oil from water with over 90% classification accuracy in controlled tests. The global remote sensing technology market, encompassing these applications, is projected to reach $21.11 billion in 2025, fueled by demand in urban and industrial sectors for efficient, scalable data over labor-intensive alternatives. Despite advantages in rapid deployment—such as UAV hyperspectral surveys covering kilometers in minutes—adoption barriers include data processing expenses and regulatory hurdles for commercial satellite operations.

Historical Development

Pre-20th Century Origins

The conceptual precursors to remote sensing emerged from 17th- through 19th-century advancements in optics and aerial observation, enabling distant acquisition of environmental data without physical contact. Isaac Newton's 1672 experiments with prisms demonstrated that white light disperses into a spectrum of colors, revealing the heterogeneous nature of white light and establishing foundational principles of refraction and spectral decomposition that underpin later spectroscopic identification of materials via reflected or emitted radiation. These optical insights, grounded in empirical measurements, facilitated causal understanding of how electromagnetic interactions with matter produce detectable signatures, a core mechanism in remote sensing.

By the mid-19th century, the invention of photography intersected with ballooning to produce the first elevated imagery. In 1858, French photographer Gaspard-Félix Tournachon (Nadar) captured the earliest known aerial photograph from a tethered balloon over the Bièvre Valley near Paris at an altitude of about 1,200 feet (365 meters), using wet-collodion plates to record landscape features from afar. This marked an initial application of non-contact imaging for topographic depiction, though limited by long exposure times and platform instability, with Nadar's subsequent tethered ascents in 1859-1860 aimed at systematic land surveying despite technical challenges like motion blur.

Military contexts adapted these elevation techniques for reconnaissance during conflicts. In the American Civil War, starting in 1861, Union aeronaut Thaddeus S. C. Lowe conducted balloon ascents—such as his June 18 demonstration over Washington, D.C., at 500 feet (152 meters)—transmitting real-time visual observations of terrain and troop movements via telegraph to ground commanders, providing strategic overviews unattainable from surface positions. Lowe's balloons, inflated with hydrogen and tethered for controlled observation, supported over 3,000 ascents by war's end, emphasizing causal advantages in visibility for artillery spotting and enemy positioning without direct exposure, though reliant on human visual interpretation rather than recorded imagery. These efforts highlighted remote sensing's potential for operational intelligence, predating photographic integration in warfare.

Mid-20th Century Advancements

During World War II, military demands accelerated remote sensing through enhanced camera and film systems for aerial reconnaissance, with Allied forces employing oblique and vertical photography to map enemy positions and infrastructure. Postwar, the U.S. utilized captured German V-2 rockets for suborbital sounding missions from White Sands Proving Ground starting in 1946, equipping them with 35mm motion picture cameras to capture the first ground images from altitudes exceeding 100 km, demonstrating the feasibility of space-based observation. In the 1950s, the U.S. military developed side-looking airborne radar (SLAR) systems, such as those pioneered by Westinghouse, enabling all-weather terrain imaging from high-altitude aircraft for mapping and surveillance, with operational tests occurring by the mid-decade. The U-2 aircraft, operational from 1956, extended these capabilities with high-resolution photography from 70,000 feet, proving pivotal in the 1962 Cuban Missile Crisis, when imagery from October 14 missions revealed Soviet missile sites in western Cuba, informing U.S. naval blockade decisions and averting escalation.

The Corona satellite program, initiated under CIA auspices in the late 1950s, introduced orbital remote sensing with film-return capsules, successfully recovering the first images on August 19, 1960, and producing over 800,000 photographs by its 1972 conclusion, primarily for strategic intelligence during the Cold War; the imagery remained classified until 1995. Paralleling military advances, civilian applications emerged in the 1960s through NASA and USGS aircraft-based multispectral scanning experiments, which tested wavelength-specific sensors for resource identification, directly informing the design of the Earth Resources Technology Satellite-1 (ERTS-1), launched July 23, 1972, as the first civilian satellite multispectral imager.

Late 20th to Early 21st Century Expansion

The launch of the IKONOS satellite on September 24, 1999, marked the advent of commercial high-resolution remote sensing, delivering panchromatic imagery at 1-meter resolution and multispectral data at 4 meters globally. This development privatized access to sub-meter detail previously limited to government programs, spurring applications in mapping and urban analysis while challenging regulatory frameworks on data export. Concurrently, the 1990s saw widespread integration of GPS with remote sensing for precise georeferencing, enabling overlay of satellite imagery with ground-truthed coordinates to correct distortions and enhance feature extraction accuracy in GIS environments.

In 2008, the U.S. Geological Survey opened the Landsat archive to free public access, releasing over 2 million scenes from Landsat 1 through 7 dating back to 1972, which democratized petabyte-scale datasets for global users and accelerated longitudinal studies of land change. This policy shift, effective by December 2008, reduced access costs and fostered international collaboration, with download volumes surging from thousands to millions of scenes annually.

The 2010s witnessed explosive growth in small-satellite constellations, exemplified by CubeSat deployments for frequent Earth revisits, such as Planet Labs' Dove fleet providing daily global coverage at 3-meter resolution starting around 2014. These low-cost, proliferated systems—numbering hundreds by mid-decade—enabled near-real-time monitoring, contrasting with the infrequent revisits of earlier missions. Remote sensing data volumes escalated from terabytes toward exabytes cumulatively by the late 2010s, driven by higher-resolution sensors and denser orbital networks, necessitating advances in cloud-based processing. During the 2020 COVID-19 outbreak, such capabilities facilitated rapid mapping of mobility patterns and urban density shifts via integrated satellite and ancillary datasets. This era's globalization extended to multinational missions, including Europe's Sentinel series from 2014, enhancing data interoperability and coverage equity beyond U.S.-centric archives.

Challenges and Limitations

Technical and Operational Constraints

Remote sensing systems face fundamental physical constraints on spatial resolution due to wave diffraction, where the minimum resolvable angle is approximated by the Rayleigh criterion, θ ≈ 1.22 λ / D, with λ the wavelength and D the aperture diameter. For visible-light sensors (λ ≈ 500 nm) on satellites with apertures of 0.5–2 m, this yields diffraction-limited angular resolutions of roughly 0.06–0.25 arcseconds, translating to ground resolutions of about 0.15–1 m at low Earth orbit altitudes of 500–800 km, though practical limits are often coarser—several meters—due to pixel sampling and atmospheric turbulence.

Atmospheric interference severely limits optical remote sensing, as clouds, aerosols, and water vapor attenuate or scatter signals, rendering passive visible and near-infrared imagery unusable over 50–70% of Earth's surface on average, with tropical regions experiencing persistent cloud cover exceeding 80% during certain seasons. Synthetic aperture radar (SAR) mitigates some weather effects but suffers from signal decorrelation over vegetated or dynamic surfaces and from speckle noise, reducing effective resolution. Empirical studies report classification error rates for land cover mapping from optical data at 10–30%, depending on vegetation heterogeneity and resolution, with finer classes like shrubs or crops often misclassified due to spectral similarities and mixed pixels.

Inverting remote sensing measurements to retrieve geophysical parameters—such as surface reflectance or soil moisture from radiance data—constitutes an ill-posed inverse problem, where multiple surface states can produce identical observations due to non-uniqueness and sensitivity to noise, necessitating prior assumptions or regularization that introduce model-dependent biases. Atmospheric path radiance and bidirectional reflectance effects exacerbate this, with retrieval uncertainties often exceeding 20% for key variables without ground validation.

Operational logistics impose additional constraints, including high costs for satellite deployment and maintenance: full remote sensing missions can exceed $50–100 million including launches, while data downlink and processing add recurring expenses of millions annually. Low Earth orbit platforms, essential for high-resolution imaging, experience atmospheric drag-induced orbital decay, with satellites below 600 km altitude deorbiting within 1–5 years absent propulsion, limiting mission lifetimes and requiring frequent replacements.
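The diffraction limit can be evaluated directly from the Rayleigh criterion; this sketch uses the wavelength and apertures quoted above with a representative 700 km altitude, and deliberately ignores the sampling and turbulence effects that coarsen practical systems.

```python
import numpy as np

def ground_resolution(wavelength_m, aperture_m, altitude_m):
    """Diffraction-limited ground resolution via the Rayleigh criterion."""
    theta = 1.22 * wavelength_m / aperture_m   # minimum resolvable angle (rad)
    return theta * altitude_m                  # small-angle ground distance

# Visible-light sensor (500 nm) at 700 km altitude for two aperture sizes.
for d in (0.5, 2.0):
    r = ground_resolution(500e-9, d, 700e3)
    print(f"D = {d} m -> ~{r:.2f} m diffraction-limited ground resolution")
```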

Ethical, Privacy, and Surveillance Controversies

Remote sensing technologies, particularly high-resolution commercial satellite imagery, have sparked significant ethical debates over privacy erosion, as persistent monitoring capabilities enable the tracking of individual movements over time. A 2023 study surveying 99 participants highlighted public concerns that commercial satellites' high temporal and spatial resolution—such as daily imaging from large commercial constellations—could facilitate granular monitoring of private activities, including vehicle tracking and behavioral pattern analysis, potentially conflicting with expectations of privacy in yards or homes. This capability raises legal and ethical challenges, as unfettered access to such data by private entities or governments could exacerbate security threats or enable misuse, though few respondents favored unrestricted availability despite its utility. Balancing these risks, proponents argue that anonymization and regulatory frameworks could mitigate harms while preserving societal benefits from Earth observation.

In surveillance applications, remote sensing for ceasefire monitoring has been critiqued for inadvertently incentivizing noncompliance, as lower detection costs for minor violations may encourage parties to test boundaries or escalate subtly. A September 2025 analysis in Surveillance & Society examined how remote sensing technology (RST) in monitored ceasefires—intended to enhance compliance—can motivate new violations through mechanisms like cheaper probing actions, devaluing traditional verification methods, and creating informational asymmetries that provoke retaliation. Evidence from conflict zones suggests that while RST augments observational power, it often fails to deter behavioral changes, potentially undermining fragile truces rather than ensuring compliance. This challenges overly optimistic views of monitoring as a deterrent, emphasizing causal pathways where monitoring alters incentives in ways that amplify rather than suppress violations, though gains in verified compliance persist in select cases.

Counterbalancing these concerns, remote sensing has demonstrably advanced human rights accountability by exposing atrocities that ground access might obscure. For instance, the Australian Strategic Policy Institute's 2018 report utilized commercial satellite imagery to map over 380 suspected internment facilities in Xinjiang, China, revealing the scale of Uyghur detention camps through structural analysis and temporal change, corroborated by open-source intelligence. Organizations like Amnesty International have employed such data since 2007 to document abuses, integrating imagery with witness testimony to validate mass graves and conflict incidents, thereby providing verifiable evidence for international tribunals. These applications underscore remote sensing's role in causal realism for justice—enabling empirical verification of hidden violations—while ethical guidelines for data use in investigations address veracity risks from private providers. Despite institutional biases in some advocacy sources, the technology's evidentiary value holds when grounded in multi-sourced analysis.

Geopolitical and Accessibility Barriers

Geopolitical barriers to remote sensing arise from national assertions of sovereignty, which often restrict the collection, dissemination, or use of imagery over sensitive territories. Under the Outer Space Treaty, no state can claim sovereignty over space itself, yet nations impose domestic regulations limiting foreign remote sensing activities; for instance, the United States enforces the Kyl-Bingaman Amendment, restricting licenses for high-resolution commercial imagery of Israel to protect allied security interests. Similarly, export controls under the International Traffic in Arms Regulations (ITAR) and Export Administration Regulations (EAR) classify high-resolution imaging technologies as dual-use items, constraining transfers to non-allied nations and maintaining U.S. strategic advantages in imaging capabilities. These measures, while aimed at preventing proliferation, can hinder global scientific collaboration and data sharing for non-military applications.

A pronounced North-South divide exacerbates accessibility issues, with developing countries in the Global South experiencing empirical gaps in remote sensing coverage despite acute needs for monitoring climate, disasters, and resources. Studies indicate that intergovernmental factors, including limited technical capacity and the high costs of data and infrastructure, impede adoption in these regions, where local institutions often lack the computing capacity or expertise to utilize advanced sensors effectively. For example, while Northern nations dominate satellite constellations and analysis, Southern counterparts rely heavily on imported data, facing delays and incomplete datasets that widen disparities in applications like environmental management. This divide persists amid uneven global orbital coverage and licensing, leaving vast areas underserved and perpetuating reliance on foreign providers subject to geopolitical strings.

Military-commercial entanglements further complicate access, as private satellite firms increasingly supply data for defense purposes, blurring lines between civilian and strategic uses. During the 2022 invasion of Ukraine, Ukraine's government requested and received high-resolution imagery from at least eight commercial providers, including Maxar, which aided targeting and situational awareness but raised concerns over data weaponization and potential retaliatory restrictions from adversaries. Such integrations demonstrate how commercial remote sensing supports military operations, prompting nations like Russia to jam signals or develop countermeasures, thereby indirectly limiting peacetime data flows and heightening tensions over dual-use technologies. These dynamics underscore causal risks where strategic dependencies on private actors can politicize ostensibly civilian markets.

Future Directions

Technological Innovations

Recent advancements in hyperspectral imaging emphasize sensor miniaturization to enable deployment on smaller platforms, reducing component costs and facilitating broader applications in remote sensing. Developments as of late 2024 target compact designs suitable for unmanned aerial systems (UAS) and low-Earth orbit satellites, improving spectral resolution for material identification without sacrificing portability.

Quantum sensors represent a nascent hardware frontier, leveraging atomic-level precision for enhanced remote sensing measurements, including Rydberg-atom-based systems for hyperspectral detection. NASA's exploratory efforts demonstrate prototypes integrating these sensors to achieve finer detection limits in environmental and atmospheric monitoring, outperforming classical detectors in sensitivity under varying conditions. Market analyses project quantum sensor adoption in remote platforms growing significantly by 2035, driven by sensitivity gains in magnetic and gravitational field mapping.

Multi-sensor fusion techniques have advanced to streamline UAS-satellite pipelines, combining high-resolution aerial imagery with orbital multispectral inputs for pixel- and feature-level integration. A 2025 review highlights optimized workflows yielding improved temporal coverage and accuracy in land-use mapping, with fusion algorithms processing complementary datasets to mitigate individual gaps like cloud interference in satellite imagery or limited swath in UAS surveys.

The trend toward smaller satellites, as outlined in Lockheed Martin's 2025 space technology outlook, supports proliferated constellations for persistent remote sensing coverage. Platforms like the LM 50 and LM 400 series enable rapid deployment of payloads, with production scaling to meet demands for frequent revisits in monitoring and reconnaissance. Satellite swarms offer empirical pathways to sub-meter resolutions, approaching centimeter scale through coordinated multi-view imaging and interferometric synthesis. Conceptual designs project swarms achieving 30 cm ground sampling distance via dense orbital arrays, enhancing 3D reconstruction for topographic mapping and change-detection tasks beyond single-satellite limits.
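Pixel-level fusion can be reduced to a toy demonstration of high-pass detail injection, one common building block of such pipelines; the arrays below are synthetic stand-ins for perfectly co-registered fine (UAS) and coarse (satellite) scenes.

```python
import numpy as np

rng = np.random.default_rng(0)
fine = rng.random((8, 8))                       # fine-resolution band (e.g., UAS)

def block_mean(a, f):
    """Downsample by factor f via block averaging (simulated coarse sensor)."""
    return a.reshape(a.shape[0] // f, f, a.shape[1] // f, f).mean((1, 3))

coarse = block_mean(fine, 2)                    # coarse band (e.g., satellite)
up = np.kron(coarse, np.ones((2, 2)))           # upsample coarse to the fine grid
detail = fine - np.kron(block_mean(fine, 2), np.ones((2, 2)))  # high-pass detail
fused = up + detail                             # inject spatial detail

# Sanity check: with a perfectly consistent simulated pair, fusion
# reconstructs the fine scene; real sensor pairs differ radiometrically,
# so operational methods add co-registration and radiometric matching.
print(np.allclose(fused, fine))                 # True
```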

Integration with AI and Emerging Systems

Artificial intelligence enhances remote sensing by automating anomaly detection and multisource data fusion, enabling the identification of subtle patterns in large datasets that exceed human capabilities. In mineral exploration, unsupervised AI methods applied to Landsat-8 imagery have successfully pinpointed mineral deposits by isolating deviations from baseline spectral signatures, demonstrating superior performance over traditional statistical detectors such as the RX algorithm in empirical tests on hyperspectral data. Data fusion integrates complementary remote sensing modalities—e.g., optical and SAR—for improved inference, as reviewed in studies showing AI models achieving over 90% accuracy in flood mapping by combining SAR with optical data, outperforming threshold-based approaches reliant on manual tuning.

NASA's Dynamic Targeting technology exemplifies AI-driven autonomy in remote sensing, allowing Earth-observing satellites to analyze lookahead data in under 90 seconds and reorient primary instruments toward high-value targets without ground intervention. Tested successfully in orbit in July 2025, this system processes real-time imagery to prioritize dynamic events like wildfires or storms, enhancing observational yield by focusing acquisitions causally linked to observed precursors rather than predefined schedules.

Integration with unmanned aerial vehicles (UAVs) via AI-enabled communications supports real-time remote sensing for applications requiring low-latency processing, such as urban monitoring or disaster response. AI optimizes UAV trajectories and spectrum allocation in satellite-UAV networks, enabling edge-computed fusion of onboard multispectral data with orbital feeds to achieve near-instantaneous anomaly alerts, though gains depend on predictive interference mitigation models. Empirical evaluations report accuracies improving by 15-20% over non-AI baselines in land-use mapping when AI handles UAV-satellite data streams, attributed to reduced noise from adaptive fusion rather than raw sensor upgrades.

Challenges persist in AI opacity, where deep learning models function as "black boxes," obscuring causal pathways from inputs to outputs and complicating validation in geoscientific contexts like remote sensing interpretation. Despite this, explainable AI techniques, such as attention mechanisms in convolutional networks, mitigate risks by highlighting influential spectral bands, fostering trust through verifiable decision traces. Emerging systems prioritize causal models to supplant correlative patterns, potentially diminishing interpretive biases inherent in human-led analysis by enforcing physically grounded priors over data-driven approximations alone.
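The RX detector referenced above scores each pixel by its Mahalanobis distance from the scene background; this minimal sketch applies it to a synthetic cube with one implanted anomaly.

```python
import numpy as np

def rx_scores(cube):
    """Global RX anomaly detector: Mahalanobis distance of each pixel
    spectrum from the scene's mean background spectrum."""
    h, w, b = cube.shape
    X = cube.reshape(-1, b)
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(b)   # regularized covariance
    diff = X - mu
    scores = np.einsum("ij,jk,ik->i", diff, np.linalg.inv(cov), diff)
    return scores.reshape(h, w)

# Synthetic 32x32 scene with 20 bands and one spectrally anomalous pixel.
rng = np.random.default_rng(1)
cube = rng.normal(0.3, 0.02, size=(32, 32, 20))
cube[5, 7] += 0.2                         # implant deviation from background
scores = rx_scores(cube)
print(np.unravel_index(scores.argmax(), scores.shape))  # locates pixel (5, 7)
```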
