Imagery and GIS. Kass Green.

Author: Kass Green
Publisher: Ingram
Genre: Geography
ISBN: 9781589484894
without being in contact with the source of the stimulus” (ASCE, 1994). Examples of remote sensing systems include our eyes, ears, and noses; the camera in your phone; a video camera recording traffic or ATM activity; sensors on satellites; and cameras on UASs, helicopters, or airplanes.


      Figure 3.1. Comparison of example percent reflectance of different types of objects across the electromagnetic spectrum (esriurl.com/IG31)

      The type of sensor used to capture energy determines which portions of the electromagnetic spectrum the sensor can measure (the imagery’s spectral resolution) and how finely it can discriminate between different levels of energy (its radiometric resolution). The type of platform employed influences where the sensor can travel, which will affect the temporal resolution of the imagery. The remote sensing system—the combination of the sensor and the platform—impacts the detail perceivable by the system, the imagery’s spatial resolution, the viewing angle of the imagery, and the extent of landscape viewable in each image.
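To make the notion of radiometric resolution concrete: it is usually quoted as a bit depth, and an n-bit sensor can discriminate 2^n discrete energy levels. The sketch below is illustrative only; the specific bit depths are typical values, not taken from the text.

```python
# Radiometric resolution: an n-bit sensor records 2**n discrete brightness levels.
def radiometric_levels(bits):
    """Number of distinguishable energy levels for a sensor quantized at `bits` bits."""
    return 2 ** bits

# Illustrative bit depths (common in practice, not tied to any sensor named in the text)
for bits in (8, 11, 16):
    print(f"{bits}-bit sensor: {radiometric_levels(bits)} levels")
```

An 8-bit sensor distinguishes 256 levels, while an 11-bit sensor distinguishes 2,048, which is why higher bit depths reveal subtler differences in reflected or emitted energy.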

       Sensors

      This section provides an understanding of remote sensors by examining their components and explaining how different sensors work. As mentioned in chapter 1, a wide variety of remote sensors have been developed over the last century. Starting with glass-plate cameras and evolving into complex active and passive digital systems, remote sensors have allowed us to “see” the world from a superior viewpoint.

      All remote sensors are composed of the following components, as shown in figure 3.2:

       Devices that capture electromagnetic energy or sound, whether chemically, electronically, or biologically. The devices may be imaging surfaces (used mostly in electro-optical imaging) or antennas (used in the creation of radar and sonar images).

       Lenses that focus the electromagnetic energy onto the imaging surface.

       Openings that manage the amount of electromagnetic energy reaching the imaging surface.

       Bodies that hold the other components relative to one another.


      Figure 3.2. The similar components of the human eye and a remote sensor

      Our eyes, cameras, and the most advanced passive and active digital sensors fundamentally all work the same way. Electromagnetic energy passes through the opening of the sensor body where it reaches a lens that focuses the energy onto the imaging surface. Our brains turn the data captured by our retinas into information. Similarly, we convert remotely sensed image data into information through either manual interpretation or semi-automated image classification.

       Imaging Surfaces

      Imaging surfaces measure the electromagnetic energy captured by digital sensors such as a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) array. The wavelengths of energy measured are determined by either filters or dispersing elements placed between the sensor opening and the imaging surface. The energy is generated either passively by a source (such as the sun) other than the sensor, or actively by the sensor itself.

       The Electromagnetic Spectrum

      Most remote sensing imaging surfaces work by responding to photons of electromagnetic energy. When photons strike certain materials, they free electrons from atoms, a phenomenon termed the photoelectric effect. Albert Einstein was the first to explain it, earning him the Nobel Prize in Physics in 1921.

      Electromagnetic energy occurs in many forms, including gamma rays, x-rays, ultraviolet radiation, visible light, infrared radiation, microwaves, and radio waves. It is characterized by three important variables: 1) speed, 2) wavelength, and 3) frequency. The speed of electromagnetic energy is a constant 186,000 miles per second (3 × 10⁸ meters per second), which is the speed of light. Wavelength is the distance between the same two points on consecutive waves and is commonly depicted as the distance from the peak of one wave to the peak of the next, as shown in figure 3.3. Frequency is the number of wave cycles passing a fixed point per unit of time.


      Figure 3.3. Diagram demonstrating the concepts of electromagnetic wavelength and frequency

      The relationship between wavelength, wave speed, and frequency is expressed as

      wave speed = wavelength × frequency, or c = λν

      where c is the speed of light, λ is the wavelength, and ν is the frequency.

      Because electromagnetic energy travels at the constant speed of light, when wavelengths increase, frequencies decrease, and vice versa (i.e., they are inversely proportional to each other). Photons with shorter wavelengths carry more energy than those with longer wavelengths. Remote sensing systems capture electromagnetic energy emitted or reflected from objects whose temperature is above 0 kelvins (absolute zero).
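A quick numeric check of the wavelength-frequency relationship and of the statement that shorter wavelengths carry more energy. Photon energy follows Planck's relation, E = hc/λ, which is a standard physics result rather than a formula stated in the text; the wavelength values below are illustrative.

```python
# c = wavelength * frequency, so frequency = c / wavelength.
# Photon energy follows Planck's relation E = h * c / wavelength (standard physics,
# assumed here; the text only states the inverse relationship qualitatively).
C = 3.0e8       # speed of light, m/s (the constant cited in the text)
H = 6.626e-34   # Planck's constant, J*s

def frequency(wavelength_m):
    """Frequency (Hz) of electromagnetic energy with the given wavelength (m)."""
    return C / wavelength_m

def photon_energy(wavelength_m):
    """Energy (J) of a single photon at the given wavelength (m)."""
    return H * C / wavelength_m

blue = 0.45e-6   # 0.45 micrometers, visible blue light
nir  = 0.90e-6   # 0.90 micrometers, near infrared

# Doubling the wavelength halves both the frequency and the photon energy:
print(frequency(blue) / frequency(nir))          # ratio of frequencies, about 2
print(photon_energy(blue) / photon_energy(nir))  # ratio of energies, about 2
```

This mirrors the inverse proportionality in the text: the blue photon, with half the wavelength of the near-infrared photon, has twice its frequency and twice its energy.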

      Electromagnetic energy is typically expressed as either wavelengths or frequencies. For most remote sensing applications, it is expressed in wavelengths. Some electrical engineering applications such as robotics and artificial intelligence express it in frequencies. The entire range of electromagnetic wavelengths or frequencies is called the electromagnetic spectrum and is shown in figure 3.4.


      Figure 3.4. The electromagnetic spectrum

      The most significant difference between our eyes and digital cameras is how their imaging surfaces react to the energy of photons. As shown in figure 3.4, the retinas in human eyes sense only the limited visible light portion of the electromagnetic spectrum. While able to capture more of the spectrum than human eyes, film is limited to wavelengths from 0.3 to 0.9 micrometers (i.e., the ultraviolet, visible, and near infrared). CCD or CMOS arrays in digital sensors are sensitive to electromagnetic wavelengths from 0.2 to 1,400 micrometers. Because remote sensors can measure portions of the electromagnetic spectrum that our eyes cannot sense, they extend our ability to "see."
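The sensitivity ranges just quoted can be sketched as a simple lookup. The 0.4–0.7 micrometer bounds for the human eye are the conventional visible-light limits, assumed here rather than stated in the text; the film and CCD/CMOS ranges come from the passage above.

```python
# Sensitivity ranges in micrometers. Film and CCD/CMOS bounds are from the text;
# the human-eye bounds (0.4-0.7 um visible range) are conventional values assumed here.
RANGES = {
    "human eye": (0.4, 0.7),
    "film": (0.3, 0.9),
    "CCD/CMOS array": (0.2, 1400.0),
}

def sensitive_to(wavelength_um):
    """Return the imaging surfaces whose sensitivity range covers the wavelength."""
    return [name for name, (lo, hi) in RANGES.items() if lo <= wavelength_um <= hi]

print(sensitive_to(0.55))   # green visible light: all three surfaces
print(sensitive_to(0.85))   # near infrared: film and CCD/CMOS only
print(sensitive_to(10.0))   # thermal infrared: CCD/CMOS only
```

The last case illustrates the chapter's point: a digital array can "see" thermal infrared energy that neither our eyes nor film can record.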

       Film versus Digital Array Imaging Surfaces

      The imaging surfaces of our eyes are our retinas. Cameras once used only film but now primarily use digital (CCD or CMOS) arrays. From remote sensing's beginnings in the late 1800s through the 1990s, most sensors relied on film to sense the electromagnetic energy being reflected or emitted from an object. Classifying the resulting photographs into information required manual interpretation.