
      Notice how water’s DN values are significantly lower in the infrared band than those of the other land-cover types. Also, urban has high values in all bands relative to the other classes. Riparian vegetation and water are similar in the red, green, and blue bands but significantly different in the infrared band, indicating that without the infrared band it might be difficult to distinguish the greenish water from the green vegetation.

      At this point, we can begin to see how variations in land-cover types can be related to variations in spectral responses, and it becomes straightforward to group the similar pixels of the image sample in figure 3.14 together into land-cover classes, as depicted in figure 3.16. Of course, it is never quite this straightforward to turn image data into map information, which is why chapters 7 to 9 thoroughly examine the methods and tools for image interpretation and classification.


      Figure 3.16. Infrared DN values from figure 3.13 combined into land-cover classes
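
      The grouping depicted in figure 3.16 can be mimicked with a few simple rules on band DN values. The following Python sketch is purely illustrative: the band arrays, thresholds, and class names are hypothetical and are not taken from the figure.

```python
import numpy as np

# Hypothetical 8-bit DN arrays for one small image chip (values are invented).
blue  = np.array([[60, 62, 110], [58, 61, 115], [55, 90, 120]])
green = np.array([[70, 72, 118], [68, 71, 122], [66, 95, 130]])
red   = np.array([[55, 58, 125], [52, 57, 128], [50, 92, 135]])
nir   = np.array([[15, 18, 140], [14, 17, 145], [95, 150, 150]])

# Simple rule-based grouping, echoing the patterns described in the text:
#   water    -> very low infrared response
#   urban    -> high values in all bands
#   riparian -> similar to water in the visible bands but bright in the infrared
classes = np.full(blue.shape, "riparian", dtype=object)
classes[nir < 40] = "water"                    # water absorbs infrared energy
classes[(red > 100) & (nir > 100)] = "urban"   # bright in every band

print(classes)
```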

       Radiometric Resolution

      Radiometric resolution is the minimum variation in electromagnetic energy that a sensor can detect, and therefore determines the information content of an image. Like spectral resolution, radiometric resolution is determined by the sensor.

      In film systems, radiometric resolution is determined by the contrast of the film: higher-contrast films have higher radiometric resolutions than low-contrast films. In digital sensors, the potential range of DN values that can be recorded for each band determines the sensor’s radiometric resolution. The larger the number of bits, or intensities discernible by the sensor, the higher its radiometric resolution and the better the sensor can detect small differences in energy. In general, higher radiometric resolution makes it possible to distinguish features on the imagery more finely, and it is particularly valuable for discerning objects within shadowed or extremely bright areas.

      Digital data is built with binary machine code; therefore, each bit location has only two possible values (one or zero, on or off), and radiometric resolution is measured as a power of 2. One-bit data would result in image pixels being either black or white, so no shades of gray would be possible. The first digital sensors were 6 bit, allowing 64 levels of intensity. More recent sensors such as Landsat 8, Sentinel-2, and WorldView-3 have 11- to 14-bit radiometric resolutions (ranging from 2,048 to 16,384 levels of intensity).
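
      Because each additional bit doubles the number of recordable intensity levels, the number of levels for any bit depth is simply 2 raised to that power. The short calculation below confirms the figures quoted above.

```python
# Number of discernible intensity levels for a given radiometric resolution.
for bits in (1, 6, 8, 11, 12, 14):
    print(f"{bits:>2}-bit sensor: {2**bits:,} levels")

# 1-bit:      2 levels (black or white)
# 6-bit:     64 levels (early digital sensors)
# 11-bit:  2,048 levels
# 14-bit: 16,384 levels
```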

      The range of electromagnetic energy intensities that a sensor actually detects is termed its dynamic range. Specifically, dynamic range is defined as the ratio of the maximum intensity that a device can measure to the lowest intensity level it can discern. It is important to note the difference between radiometric resolution and dynamic range. Radiometric resolution defines the potential range of values a digital remote sensing device can record; dynamic range is calculated from the actual values of a particular image and spans the interval between the lowest detectable level and the brightest capturable level within that image. It is governed by the noise floor (the minimal detectable signal) and the saturation, or overflow, level of the sensor cell.
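
      As a rough illustration of the distinction, the sketch below compares the potential range implied by a sensor’s bit depth with the dynamic range actually present in one image. The bit depth and DN values are assumptions chosen only for the example.

```python
import numpy as np

bit_depth = 12                        # assumed radiometric resolution
potential_levels = 2 ** bit_depth     # what the sensor could record (4,096 levels)

# Hypothetical DN values from a single captured image.
image = np.array([[312, 845, 2950],
                  [298, 910, 3105],
                  [305, 780, 3020]])

darkest   = image[image > 0].min()    # lowest detectable (non-zero) level
brightest = image.max()               # brightest captured level

dynamic_range = brightest / darkest   # ratio form used in the text
print(f"Potential levels (radiometric resolution): {potential_levels}")
print(f"Dynamic range of this image: {dynamic_range:.1f} : 1")
```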

      The sensor originally used to capture an image determines the radiometric resolution of the image. Thus, scanning a film image to create a digital version results in a digital image with the radiometric resolution of the film sensor, not of the digital scanner, even though the radiometric resolution of the scanner may be better than that of the film image.

       Spatial Resolution

      An image’s spatial resolution is determined by the altitude of the platform and by the viewing angle, lens focal length, and resolving power of the sensor. Spatial resolution has two different definitions:

       The smallest spatial element on the ground that is discernible on the image captured by the remote sensing system. The definition of “discernible” can refer to the ability to detect an element as separate from another, or to both detect and label the different elements. This definition was commonly used when remotely sensed images were collected primarily on film.

       The smallest spatial unit on the ground that the sensor is able to image. This is the more common meaning and is the one relied upon by makers and users of digital remote sensing systems. Usually, it is expressed as the ground sample distance (GSD), which is the length on the ground of one side of a pixel.

      GSD is a function of sensor pixel size, height above terrain, and focal length, as expressed in the following equation:

      GSD = (sensor pixel size × distance to ground) / focal length

      The distance to ground is a function of platform altitude and sensor viewing angle. If focal length and sensor resolving power are held constant (as they are in most airborne systems), then the lower the altitude of the system, the smaller the GSD and the higher the spatial resolution of the resulting imagery. If focal length and distance to ground are held constant (as they are in satellite systems), then the higher the sensor resolving power, the higher the spatial resolution. If sensor resolving power and distance to ground are held constant, then the longer the focal length, the higher the spatial resolution of the sensor. Because the sensor and the altitude of satellite remote sensing systems are constant over the usable life of the system, their spatial resolutions are also fairly constant for each satellite system and change only when the viewing angle is changed.
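
      These relationships can be verified directly with the GSD equation. The following calculator is a minimal sketch; the pixel pitch, focal length, and altitudes are hypothetical values chosen only to show how GSD scales.

```python
def ground_sample_distance(pixel_size_m: float,
                           distance_to_ground_m: float,
                           focal_length_m: float) -> float:
    """GSD = sensor pixel size * distance to ground / focal length."""
    return pixel_size_m * distance_to_ground_m / focal_length_m

# Hypothetical airborne example: 5-micrometer pixels behind a 100 mm lens.
pixel = 5e-6    # sensor pixel size in meters
focal = 0.100   # focal length in meters

for altitude in (1000, 2000, 4000):   # meters above terrain
    gsd = ground_sample_distance(pixel, altitude, focal)
    print(f"altitude {altitude:>4} m -> GSD {gsd * 100:.0f} cm")

# Halving the altitude halves the GSD; doubling the focal length has the same effect.
```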

      Airborne systems have varying spatial resolutions depending on the sensor flown and the altitude of the aircraft platform. Spatial resolution is also affected by whether the sensor has a stabilized mount, a forward motion compensation unit, or both. These devices minimize the blur caused by the motion of the platform relative to the ground by moving the sensor, during exposure, in the direction opposite to the platform’s motion and at the platform’s ground speed. Figure 3.17 compares the spatial resolution of 15-meter pan-sharpened Landsat imagery to that of airborne 1-meter National Agriculture Imagery Program (NAIP) imagery over a portion of Sonoma County, California. Figure 3.18 compares the NAIP imagery to 6-inch multispectral imagery over a subset of the same area.
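
      To see why forward motion compensation matters, consider how far the aircraft travels across the ground during a single exposure relative to the GSD. The ground speed, shutter speed, and GSD below are hypothetical values used only for illustration.

```python
# Forward motion of the platform during one exposure, expressed in pixels of blur.
ground_speed = 60.0     # m/s, hypothetical aircraft ground speed (~120 knots)
exposure     = 1 / 250  # s, hypothetical shutter speed
gsd          = 0.05     # m, hypothetical 5 cm ground sample distance

smear = ground_speed * exposure   # meters the image shifts during the exposure
print(f"Forward motion during exposure: {smear * 100:.0f} cm "
      f"(~{smear / gsd:.1f} pixels of blur without compensation)")
```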


      Figure 3.17. Comparison of Landsat 15-meter pan-sharpened satellite imagery to 1-meter National Agriculture Imagery Program (NAIP) airborne imagery over a portion of Sonoma County, California. Color differences are due to sensor differences and the imagery being collected in different seasons. (esriurl.com/IG317)


      Figure 3.18. Comparison of 1-meter National Agriculture Imagery Program (NAIP) imagery to 6-inch airborne imagery over a subset of the area of figure 3.17. Color and shadow differences are due to sensor differences and the imagery being collected in different seasons. (esriurl.com/IG318)

      The highest spatial resolution obtainable from a civilian satellite is WorldView-4’s 30 centimeters (11.8 inches). High-resolution airborne multispectral sensors (e.g., the UltraCam Eagle) can achieve spatial resolutions of 2 to 3 centimeters at an altitude of 500 feet. Because they can fly lower than manned aircraft, UASs can obtain even higher spatial resolutions.