The integration of various satellite images refines our image of activity on Earth (editorial)
Amanda Ziemann is a remote sensing scientist at Los Alamos National Laboratory in New Mexico. She contributed this article to Space.com’s Expert Voices: Op-Ed & Insights.
Being able to accurately detect changes to the Earth’s surface using satellite imagery can help in everything from climate change research and agriculture to human migration patterns and nuclear non-proliferation. But until recently, it was impossible to flexibly integrate images from multiple types of sensors – for example, those that show changes in surface structure (like the construction of new buildings) versus those that show changes in materials (like water turning to sand). Now, thanks to a new algorithmic capability, we can – and by doing so, get a more frequent and complete picture of what is happening on the ground.
At Los Alamos National Laboratory, we have developed a flexible mathematical approach to identify changes in pairs of satellite images collected by different modalities – that is, types of sensors using different detection technologies – allowing faster and more complete analysis. It is easy to assume that all satellite images are alike and that comparing them is therefore straightforward. But the reality is very different. Hundreds of different imaging sensors are orbiting Earth right now, and nearly every one of them images the ground differently from the others.
Take, for example, multispectral imaging sensors. These are among the most common types of sensors and give us the images most of us think of when we hear “satellite images.” Multispectral imaging sensors stand out in that they capture color information beyond what the human eye can see, making them extremely sensitive to material changes. For example, they can clearly capture a grass field that, a few weeks later, is replaced by synthetic turf.
But how they capture those changes varies greatly from one multispectral sensor to another. One sensor may measure four different colors, or bands, of light, for example, while another measures six. And each sensor may measure the color red differently.
Add to this the fact that multispectral imaging is not the only modality of satellite imagery. There is also synthetic aperture radar, or SAR, which captures radar images of the Earth’s surface structure at fine spatial resolution. These SAR images are sensitive to surface changes and deformations and are commonly used for applications such as monitoring volcanoes and geothermal energy. So, again, we have an imaging sensor that picks up information in a completely different way than the others.
Comparing these images is a real challenge. When the signals come from two different remote sensing techniques, traditional change detection approaches fail because the underlying math and physics no longer make sense. But there is information to be gained, because these sensors all image the same scenes, just in different ways. So how can you look at all of these images – multispectral images captured by different types of sensors, as well as SAR images – in a way that automatically identifies changes over time?
Our mathematical approach makes this possible by creating a framework that not only compares images from different sensing modalities, but also effectively “normalizes” the different types of imagery, while retaining the original signal information.
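To make the idea of “normalizing” imagery concrete, here is a minimal sketch in Python. It is not the Laboratory’s actual algorithm – the function names, the per-band standardization, and the simple difference-magnitude change map are all illustrative assumptions – but it shows the basic pattern: put two co-registered images from different sensors onto a common statistical footing, then look for pixels where they disagree.

```python
import numpy as np

def normalize(img):
    # Standardize each band to zero mean and unit variance, so images
    # from sensors with different band counts and radiometric scales
    # become roughly comparable. (A crude stand-in for the more
    # sophisticated normalization described in the article.)
    flat = img.reshape(-1, img.shape[-1]).astype(float)
    mu = flat.mean(axis=0)
    sigma = flat.std(axis=0) + 1e-9  # avoid division by zero
    return ((flat - mu) / sigma).reshape(img.shape)

def change_map(img_a, img_b):
    # Per-pixel magnitude of the difference between the two
    # normalized images; large values flag likely changes.
    diff = normalize(img_a) - normalize(img_b)
    return np.linalg.norm(diff, axis=-1)
```

In this toy version, a pixel that changed between the two acquisitions stands out as a large value in the change map, while the rest of the scene stays near zero. A real heterogeneous change detector must also handle the fact that the two sensors respond to physically different phenomena, which is exactly the problem the framework above is designed to address.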
But the most important advantage of this image integration is that we can see changes as frequently as a few minutes apart. Previously, the time elapsed between images captured by the same sensor could be days or weeks. Being able to integrate images from various sensing modalities means we can draw on data from more sensors more quickly, and thus see changes sooner, allowing for more rigorous analysis.
To test our method, we looked at images of the construction of SoFi Stadium in Los Angeles, beginning in 2016. We started by comparing multispectral images from different multispectral sensors, as well as SAR images, over the same date range to see which modalities picked up which changes. For example, in one case the roof of a building next to the stadium was replaced, changing from beige to white over the course of several months. The multispectral imaging sensors detected this change because it involved color and material. SAR, as we expected, did not. The SAR, however, was very sensitive to the surface deformation caused by moving piles of soil, whereas the multispectral imagery was not.
When we integrated the images using our new algorithmic capability, we were able to see both kinds of change – surface and material – at a much faster rate than if we had focused on a single satellite. This had never been done at scale before, and it signals a potential fundamental shift in how satellite imagery is analyzed.
We were also able to demonstrate how much faster changes can be detected than before. In one case, we compared multispectral images collected just 12 minutes apart. In fact, the revisit was so fast that we detected a plane flying over the scene.
As remote sensing from space becomes more accessible – particularly with the explosive growth of CubeSats and small satellites in the government and commercial sectors – more satellite images will be available. In theory this is good news, because it means more data to fuel a complete analysis. In practice, however, that analysis is challenged by the overwhelming volume of data, the diversity of sensor designs and modalities, and the siloed nature of the image repositories maintained by different satellite vendors. And as image analysts are inundated with this tidal wave of imagery, developing automated change detection algorithms that “know where to look” becomes paramount.
This new approach to change detection will not solve all of these challenges, but it will help us leverage the strengths of the various satellite modalities – and give us more clarity on our world’s changing landscape in the process.
Follow us on Twitter @Spacedotcom or Facebook.
Follow all of Expert Voices’ issues and debates – and join the discussion – on Facebook and Twitter. The opinions expressed are those of the author and do not necessarily reflect those of the publisher.