How Do Satellite Images Work?

Welcome to a new science capsule! Today we will discuss the wondrous world of satellite images. There is a lot that goes into this topic: from its history to the various imaging methods and resolutions, satellite photography is fascinating … and a bit complex. The focus of this article will be on getting a general grasp of the science behind it. Hopefully, by the end, we will all understand some of the intricacies of how satellites take photos of Earth.


The History

It was October 24th, 1946, when the United States launched a V-2 rocket on a suborbital flight carrying a camera that took a picture every 1.5 seconds. The apogee of this flight was 105 km, almost five times the previous altitude record of 22 km, set by the Explorer II balloon in 1935. These types of suborbital flights ended up being the precursors of the satellite photography we know today.

The first true example of satellite photography ended up being the Explorer 6 satellite, launched on August 14, 1959. Through this mission, we were able to photograph Earth from space for the first time. The pictures taken from space kept increasing in quality, leading to the iconic “Blue Marble” photo of Earth, taken by the Apollo 17 crew in 1972 on its way to the Moon.

During that same year, the United States started the Landsat Program, the longest-running program for acquiring satellite imagery of Earth. Then, in 1977, the first real-time satellite imagery of Earth was produced by the KH-11 system. The most recent Landsat satellite launched on September 27th, 2021. This type of large-scale imagery gathering is also what led to modern programs, such as Google Earth. All of NASA’s satellite images are in the public domain and, for anyone interested, readily accessible via the NASA Earth Observatory.

Types of Resolution in Satellite Images

As I alluded to at the beginning, there are different types of resolution when it comes to the photos taken by satellites. These can be split into five: spatial, geometric, spectral, temporal, and radiometric. So, it is time to break them down one by one.

Spatial

When it comes to spatial resolution, the defining characteristic is the distance on the ground represented by a single pixel. The smaller the distance each pixel corresponds to, the higher the resolution. The standard range for commercial images of this kind is 2-5 meters per pixel. However, the more advanced satellites can capture photos with much finer ratios; some can go as far as 30-70 cm per pixel.

The main difference that these higher-resolution satellites provide is the ability to capture objects like cars and people, as opposed to just large shapes and/or bodies of water. Of course, the technology for these is more expensive, which is why the vast majority of satellites fall in the 2-5 meters-per-pixel range.
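
To make that pixel-to-distance trade-off concrete, here is a minimal Python sketch (the 2,000 × 2,000-pixel frame size is just an illustrative assumption) of how much ground the same image covers at different spatial resolutions:

```python
# Ground footprint of the same image frame at different spatial resolutions.
# The 2000 x 2000 pixel frame size is an illustrative assumption.
width_px, height_px = 2000, 2000

for m_per_px in (5.0, 2.0, 0.3):  # commercial range down to the high end
    width_km = width_px * m_per_px / 1000
    height_km = height_px * m_per_px / 1000
    print(f"{m_per_px:4.1f} m/px -> frame covers {width_km:.1f} x {height_km:.1f} km")
```

At 5 m per pixel the frame spans 10 × 10 km; at 30 cm it shrinks to 0.6 × 0.6 km, which is the coverage price paid for seeing cars and people.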

Geometric

This is one of the more unique resolution types. Geometric resolution refers to the ability of a satellite to accurately represent a portion of Earth’s surface in a single pixel. While this may sound similar to spatial resolution, geometric resolution is not expressed as a simple distance-to-pixel ratio. In fact, it is a lot less straightforward to compare one satellite’s geometric resolution to another’s.

The way this resolution is measured is in GSD (ground sample distance) units. This term describes the satellite’s ability to “see” an object on the ground in a single pixel. Landsat, for example, has a GSD of 30 m, indicating that the satellite sees a 30 × 30 m area in a single pixel. Some of the more advanced satellites, such as GeoEye-1 and WorldView-3, can map 0.41- and 0.31-meter areas in a single pixel. This kind of resolution would previously have been found only on military instruments, such as the ones from the Corona project.
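
As a quick sanity check on those numbers, here is a sketch comparing the ground area a single pixel covers at each GSD mentioned above:

```python
# Ground area represented by one pixel: a GSD x GSD square.
for name, gsd_m in (("Landsat", 30.0), ("GeoEye-1", 0.41), ("WorldView-3", 0.31)):
    area_m2 = gsd_m ** 2
    print(f"{name:>12}: {gsd_m:5.2f} m GSD -> {area_m2:8.3f} m^2 per pixel")
```

That works out to 900 m² per Landsat pixel versus roughly 0.1 m² for WorldView-3, about the footprint of a large pizza box.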

Spectral

This type of resolution is defined by the ability of a sensor to resolve finer wavelength intervals. The more wavelength bands a sensor has, and the narrower they are, the finer its spectral resolution. Most sensors have 3-10 bands they can detect, making them multispectral; however, some have hundreds, or even thousands, of bands, placing them in the hyperspectral category. The latter are, of course, able to capture even more information than their multispectral counterparts, including the ability to distinguish between similar types of objects, such as two different minerals.
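
To illustrate that split, here is a tiny sketch that labels a sensor by its band count (the thresholds simply mirror the rough ranges in the text):

```python
def spectral_category(num_bands: int) -> str:
    """Label a sensor by band count, per the rough ranges above."""
    if num_bands >= 100:
        return "hyperspectral"
    if num_bands >= 3:
        return "multispectral"
    return "panchromatic / few-band"

# e.g. a single-band imager, a typical multispectral sensor,
# and a hyperspectral instrument with a few hundred bands
for bands in (1, 8, 224):
    print(f"{bands:3d} bands -> {spectral_category(bands)}")
```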

This resolution can also apply to machines not observing Earth. A good example is the James Webb Space Telescope, which has both multi- and hyperspectral sensors. The multispectral ones should provide the highest angular resolution, while the hyperspectral sensors will, of course, provide the highest spectral resolution. If you would like to learn more about James Webb, you can check out our previous capsules on it here and here.

Temporal

This resolution is strictly connected to a satellite’s orbit and speed. It is measured by the time required for a satellite to complete one orbit and observe the same area again. The swath width also affects this type of resolution: the wider the strip of ground a satellite images in a single pass, the sooner it can revisit the same spot.

To put this in more concrete terms, a satellite in geostationary orbit will have the finest temporal resolution, because it matches the rate of Earth’s rotation and can observe the same area continuously. A polar-orbit satellite, on the other hand, can have temporal resolutions that vary from 1 to 16 days. Different temporal resolutions enable the gathering of different types of data: the shorter ones of 1-2 days can show daily changes on Earth, while the longer ones of 15-16 days capture changes on a roughly two-week scale.
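
To see what those revisit times mean in practice, here is a small sketch counting how many looks at the same spot each interval yields in a 30-day window (clouds ignored for simplicity):

```python
# Looks at the same area per 30-day window for different revisit intervals.
for revisit_days in (1, 2, 16):
    looks = 30 / revisit_days
    print(f"revisit every {revisit_days:2d} days -> ~{looks:.1f} looks per month")
```

A 16-day revisit works out to just under two looks per month, which is where the roughly two-week figure above comes from.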

Radiometric

The final type of resolution left to discuss is radiometric. This refers to how much information a sensor’s pixels can store. This resolution is expressed in bits, a term that is probably familiar to anyone working with computers. In this case, the more bits a sensor’s pixel can store, the higher its radiometric resolution.

The number of available values grows as a power of 2, meaning that an 8-bit resolution offers 2⁸, or 256, digital values, ranging from 0 to 255. Given this exponential nature, increasing from an 8- to a 12-bit resolution results in a massive jump, as the available values go from 256 to 4096.
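
The jump is easy to verify:

```python
# Digital values available at a given radiometric resolution (bit depth).
for bits in (8, 12, 16):
    levels = 2 ** bits
    print(f"{bits:2d}-bit -> {levels:6d} values (0 to {levels - 1})")
```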

Potential Obstacles

Now that we have a grasp of how satellites capture images, it is time to go over some of the issues that are common when working with these machines. A big one is the sheer volume of data that this type of imaging entails. Not only are there five different resolution types to account for (all useful in their own right), but Earth is a large object to survey. This leads to massive data files, which make processing these images quite time-consuming. In fact, preprocessing may be required in the form of image destriping. The objective of this is exactly what it sounds like: removing stripes from images.

However, there are multiple ways to go about it. Full disclosure: I do not know the exact mechanisms behind these processes, so I will only list them here, with a toy sketch of the first one below. If you would like us to delve deeper into the topic, let us know in the chatbox, and we will do our best to oblige. The most common method is Fourier filtering, which is used across almost all industries. There are also statistical methods, which are usually adopted for satellites using multiple-sensor imaging systems. Finally, there is compressed sensing, a more recent method that frames destriping as a regularized optimization problem to recover stripe-free images. Unfortunately, all of these risk removing important data, at least for the moment.
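
For the curious, here is a toy illustration of the general idea behind Fourier filtering, not any specific tool’s algorithm: vertical stripes are (nearly) constant along image rows, so their energy piles up on a single row of the 2-D frequency spectrum, and damping that row suppresses them.

```python
import numpy as np

def destripe_fourier(img: np.ndarray, keep_low: int = 3) -> np.ndarray:
    """Toy notch filter: zero the zero-vertical-frequency row of the
    spectrum, sparing the lowest horizontal frequencies, which carry
    the scene's genuine large-scale brightness."""
    spectrum = np.fft.fftshift(np.fft.fft2(img))
    cy, cx = spectrum.shape[0] // 2, spectrum.shape[1] // 2
    spectrum[cy, : cx - keep_low] = 0
    spectrum[cy, cx + keep_low + 1 :] = 0
    return np.real(np.fft.ifft2(np.fft.ifftshift(spectrum)))

# Demo: a smooth stripe-free scene plus random per-column offsets.
rng = np.random.default_rng(0)
scene = np.repeat(np.linspace(0.0, 1.0, 256)[:, None], 256, axis=1)
striped = scene + 0.2 * rng.standard_normal(256)[None, :]
cleaned = destripe_fourier(striped)

rms = lambda a: float(np.sqrt((a ** 2).mean()))
print("stripe error before:", round(rms(striped - scene), 3))
print("stripe error after: ", round(rms(cleaned - scene), 3))
```

This also hints at the data-loss risk mentioned above: any real scene detail that happens to share those frequencies gets thrown out along with the stripes.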

The other potential problem satellites encounter is the weather. Depending on the imaging method, places that are frequently overcast are harder to get clear pictures of. I guess the United Kingdom must have been quite the pain to capture via satellites.

Until Next Week 

Even with these obstacles, the data gathered by satellites has been crucial for many scientific endeavors and will continue to be so for as long as mankind is around. And this goes for observing objects outside of Earth as well, including the topic of next week’s capsule: the (now dwarf) planet Pluto. If you want to learn more about what used to be the farthest planet in our Solar System, I suggest checking back here, at impulso.space, next week. “See” you then.
