By Philipp Arndt \ Scripps Institution of Oceanography, University of California San Diego \ Github: \@fliphilipp \ Contact: parndt@ucsd.edu
**Note from June 2024:** OpenAltimetry has become extremely slow since it transitioned to being hosted at NSIDC, which sometimes makes the request time out. Also, some of the data products have become unavailable through the API (this prints as "request failed"). When this code runs, it is now unfortunately painfully slow. I am told that the OpenAltimetry/NSIDC teams are working on fixing this.
We'll be using the following Python libraries in this notebook:
%matplotlib widget
%load_ext autoreload
%autoreload 2
import os
import ee
import geemap
import requests
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from datetime import datetime
from datetime import timedelta
from ipywidgets import Layout
import rasterio as rio
from rasterio import plot as rioplot
from rasterio import warp
The import below is a class that I wrote myself. It helps us read and store data from the OpenAltimetry API.
If you are interested in how this works, you can find the code in `utils/oa.py`.
from utils.oa import dataCollector
GEE requires you to authenticate your access, so if `ee.Initialize()` does not work, you first need to run `ee.Authenticate()`. This gives you a link at which you can use the Google account that is associated with GEE to get an authorization code. Copy the authorization code into the input field and hit enter to complete authentication.
try:
    ee.Initialize()
except Exception:
    ee.Authenticate()
    ee.Initialize()
Let's say we have found some data that looks weird to us, and we don't know what's going on.
I went to openaltimetry.earthdatacloud.nasa.gov and selected BROWSE ICESAT-2 DATA. Then I selected ATL06 (Land Ice Height) on the top right, and switched the projection🌎 to Arctic. Then I selected August 22, 2021 in the calendar📅 on the bottom left, and toggled the cloud☁️ button to show MODIS imagery of that date. I then zoomed in on my region of interest.
To find out what ICESat-2 has to offer here, I clicked on SELECT A REGION on the top left, and drew a rectangle around that mysterious cloud. When right-clicking on that rectangle, I could select View Elevation profile. This opened a new tab, and showed me ATL06 and ATL08 elevations.
It looks like ATL06 can't decide where the surface is, and ATL08 tells me that there's a forest canopy on the Greenland Ice Sheet? To get to the bottom of this, I scrolled all the way down and selected 🛈Get API URL. The website confirms that the "API endpoint was copied to clipboard." Now I can use this to access the data myself.
Note: Instead of trying to find this region yourself, you can access the OpenAltimetry page shown above by going to this annotation🏷️. Just left-click on the red box and select "View Elevation Profile".
All we need to do is to paste the API URL that we copied from the webpage into a string. We also need to specify which beam we would like to look at. The GT1R ground track looks funky, so let's look at that one!
# paste the API URL from OpenAltimetry below, and specify the beam you are interested in
oa_api_url = 'https://openaltimetry.earthdatacloud.nasa.gov/data/api/icesat2/atl03?date=2021-08-22&minx=-24.03772653601395&miny=77.53753839308544&maxx=-23.86438315078483&maxy=77.57604313591165&trackId=909&beamNames=gt1r&beamNames=gt1l&outputFormat=json'
gtx = 'gt1r'
We can now initialize a `dataCollector` object, using the copy-pasted OpenAltimetry API URL and the beam we would like to look at. \
(Again, I defined this class in `utils/oa.py` to do some work for us in the background.)
is2data = dataCollector(oaurl=oa_api_url,beam=gtx, verbose=True)
OpenAltimetry API URL: https://openaltimetry.earthdatacloud.nasa.gov/data/api/icesat2/atlXX?date=2021-08-22&minx=-24.03772653601395&miny=77.53753839308544&maxx=-23.86438315078483&maxy=77.57604313591165&trackId=909&outputFormat=json&beamNames=gt1r&client=jupyter Date: 2021-08-22 Track: 909 Beam: gt1r Latitude limits: [77.53753839308544, 77.57604313591165] Longitude limits: [-24.03772653601395, -23.86438315078483]
Alternatively, we could use a date, track number, beam, and lat/lon bounding box as input to the `dataCollector`.
date = '2021-08-22'
rgt = 909
beam = 'gt1r'
latlims = [77.5326, 77.5722]
lonlims = [-23.9891, -23.9503]
is2data = dataCollector(date=date, latlims=latlims, lonlims=lonlims, track=rgt, beam=beam, verbose=True)
OpenAltimetry API URL: https://openaltimetry.earthdatacloud.nasa.gov/data/api/icesat2/atlXX?date=2021-08-22&minx=-23.9891&miny=77.5326&maxx=-23.9503&maxy=77.5722&trackId=909&beamNames=gt1r&outputFormat=json&client=jupyter Date: 2021-08-22 Track: 909 Beam: gt1r Latitude limits: [77.5326, 77.5722] Longitude limits: [-23.9891, -23.9503]
Note that this also constructs the API URL for us.
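The construction follows a simple pattern: the bounding box, date, and track number are dropped into the query string of an OpenAltimetry-style URL. A rough sketch of that assembly (the real logic lives in `utils/oa.py` and may differ in detail):

```python
def build_oa_url(date, track, beam, latlims, lonlims):
    # assemble an OpenAltimetry-style API URL from its components (illustrative sketch)
    base = 'https://openaltimetry.earthdatacloud.nasa.gov/data/api/icesat2/atlXX'
    return (f'{base}?date={date}&minx={lonlims[0]}&miny={latlims[0]}'
            f'&maxx={lonlims[1]}&maxy={latlims[1]}&trackId={track}'
            f'&beamNames={beam}&outputFormat=json&client=jupyter')

url = build_oa_url('2021-08-22', 909, 'gt1r', [77.5326, 77.5722], [-23.9891, -23.9503])
print(url)
```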
Here we use the `requestData()` method of the `dataCollector` class, which is defined in `utils/oa.py`. It downloads all data products that are available on OpenAltimetry based on the inputs with which we initialized our `dataCollector`, and writes them to pandas dataframes.
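Conceptually, this swaps the `atlXX` placeholder in the URL for each product name and fetches the returned JSON, e.g. with `requests.get(url).json()`. A simplified sketch of that substitution step (the actual implementation in `utils/oa.py` also parses the responses and handles failures):

```python
products = ['atl03', 'atl06', 'atl07', 'atl08', 'atl10', 'atl12', 'atl13']
url_template = ('https://openaltimetry.earthdatacloud.nasa.gov/data/api/icesat2/atlXX'
                '?date=2021-08-22&trackId=909&beamNames=gt1r&outputFormat=json')
# one request URL per data product
product_urls = {prod: url_template.replace('atlXX', prod) for prod in products}
print(product_urls['atl06'])
```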
Note from June 2024: OpenAltimetry has become extremely slow since it transitioned to being hosted at NSIDC, which sometimes makes the request time out. Also, some of the data products have become unavailable through the API (this prints as "request failed"). I am told that the team is working on fixing this.
is2data.requestData(verbose=True)
---> requesting ATL03 data... 6314 data points. ---> requesting ATL06 data... 194 data points. ---> requesting ATL07 data... request failed. ---> requesting ATL08 data... 26 data points. ---> requesting ATL10 data... request failed. ---> requesting ATL12 data... No data. ---> requesting ATL13 data... No data.
The data are now stored as data frames in our `dataCollector` object. To verify this, we can run the cell below.
vars(is2data)
{'url': 'https://openaltimetry.earthdatacloud.nasa.gov/data/api/icesat2/atlXX?date=2021-08-22&minx=-23.9891&miny=77.5326&maxx=-23.9503&maxy=77.5722&trackId=909&beamNames=gt1r&outputFormat=json&client=jupyter', 'date': '2021-08-22', 'track': 909, 'beam': 'gt1r', 'latlims': [77.5326, 77.5722], 'lonlims': [-23.9891, -23.9503], 'atl03': lat lon h conf 0 77.572273 -23.953864 748.409241 Noise 1 77.572246 -23.953913 847.594727 Noise 2 77.572247 -23.953895 785.343201 Noise 3 77.572232 -23.953941 906.424866 Noise 4 77.572222 -23.953903 739.359497 Noise ... ... ... ... ... 6309 77.532606 -23.986983 788.247559 High 6310 77.532600 -23.986988 787.683777 High 6311 77.532593 -23.986994 788.859985 High 6312 77.532594 -23.986993 786.800476 High 6313 77.532594 -23.986993 787.658264 High [6314 rows x 4 columns], 'atl06': lat lon h 0 77.572104 -23.954022 810.224304 1 77.571928 -23.954170 809.449646 2 77.571751 -23.954318 808.569153 3 77.571575 -23.954466 808.484558 4 77.571398 -23.954613 808.140320 .. ... ... ... 
189 77.533473 -23.986257 785.033386 190 77.533297 -23.986406 785.808105 191 77.533120 -23.986555 786.543274 192 77.532944 -23.986702 787.244507 193 77.532768 -23.986849 787.765320 [194 rows x 3 columns], 'atl08': lat lon h canopy 0 77.571838 -23.954245 809.255493 3.402823e+38 1 77.570961 -23.954979 807.246216 3.402823e+38 2 77.570076 -23.955711 804.799744 3.402823e+38 3 77.569191 -23.956451 801.567993 3.410217e+00 4 77.568314 -23.957197 797.294250 3.402823e+38 5 77.567429 -23.957932 793.571960 3.175049e+00 6 77.566551 -23.958658 789.192017 2.079102e+00 7 77.565666 -23.959396 784.972778 1.745239e+00 8 77.564781 -23.960142 781.505920 3.402823e+38 9 77.563904 -23.960869 776.662964 2.394409e+00 10 77.563019 -23.961609 771.332397 6.974670e+00 11 77.562141 -23.962358 761.032227 1.180969e+01 12 77.544502 -23.977083 763.093262 8.813538e+00 13 77.543617 -23.977800 764.617065 8.257019e+00 14 77.542732 -23.978540 764.995605 7.054932e+00 15 77.541855 -23.979282 769.079590 4.145813e+00 16 77.540970 -23.980009 771.878052 3.402823e+38 17 77.540085 -23.980747 773.208191 3.402823e+38 18 77.539207 -23.981489 774.369629 3.402823e+38 19 77.538322 -23.982227 776.913086 3.402823e+38 20 77.537445 -23.982954 777.269653 3.402823e+38 21 77.536560 -23.983694 779.454041 3.402823e+38 22 77.535675 -23.984425 780.175171 3.590332e+00 23 77.534798 -23.985142 781.494263 3.402823e+38 24 77.533913 -23.985882 783.230103 3.402823e+38 25 77.533035 -23.986628 785.877747 3.402823e+38, 'atl12': Empty DataFrame Columns: [lat, lon, h] Index: [], 'atl13': Empty DataFrame Columns: [lat, lon, h] Index: []}
Now let's plot this data. We create a figure `fig` with axes `ax`, and then plot the ATL06 and ATL08 data products on it.
# create the figure and axis
fig, ax = plt.subplots(figsize=[9,5])
# plot the data products
atl06, = ax.plot(is2data.atl06.lat, is2data.atl06.h, c='C0', linestyle='-', label='ATL06')
atl08, = ax.plot(is2data.atl08.lat, is2data.atl08.h, c='C2', linestyle=':', label='ATL08')
if np.sum(~np.isnan(is2data.atl08.canopy)) > 0:
    atl08canopy = ax.scatter(is2data.atl08.lat, is2data.atl08.h+is2data.atl08.canopy, s=2, c='C2', label='ATL08 canopy')
# add labels, title and legend
ax.set_xlabel('latitude')
ax.set_ylabel('elevation in meters')
ax.set_title('Some ICESat-2 data I found on OpenAltimetry!')
ax.legend(loc='upper left')
# add some text to provide info on what is plotted
info = 'ICESat-2 track {track:d}-{beam:s} on {date:s}\n({lon:.4f}E, {lat:.4f}N)'.format(track=is2data.track,
                                                                                        beam=is2data.beam.upper(),
                                                                                        date=is2data.date,
                                                                                        lon=np.mean(is2data.lonlims),
                                                                                        lat=np.mean(is2data.latlims))
infotext = ax.text(0.01, -0.08, info,
                   horizontalalignment='left',
                   verticalalignment='top',
                   transform=ax.transAxes,
                   fontsize=7,
                   bbox=dict(edgecolor=None, facecolor='white', alpha=0.9, linewidth=0))
# set the axis limits
ax.set_xlim((is2data.atl03.lat.min(), is2data.atl03.lat.max()))
ax.set_ylim((741, 818));
Let's add the ATL03 photons to better understand what might be going on here.
atl03 = ax.scatter(is2data.atl03.lat, is2data.atl03.h, s=1, color='black', label='ATL03', zorder=-1)
ax.legend(loc='upper left')
fig.tight_layout()
if not os.path.isdir('plots'):
    os.makedirs('plots')
fig.savefig('plots/my-plot.jpg', dpi=300)
To make plots easier to produce, the `dataCollector` class also has a method to plot the data that we downloaded.
fig = is2data.plotData();
display(fig)
So far we have only seen the data in elevation vs. latitude space. It's probably good to know what the scale on the x-axis is here in units that we're familiar with.
def dist_latlon2meters(lat1, lon1, lat2, lon2):
    # returns the distance between two points (lat1, lon1) and (lat2, lon2) along the earth's surface, in meters
    R = 6371000  # mean earth radius in meters
    def deg2rad(deg):
        return deg * (np.pi/180)
    dlat = deg2rad(lat2-lat1)
    dlon = deg2rad(lon2-lon1)
    a = np.sin(dlat/2) * np.sin(dlat/2) + np.cos(deg2rad(lat1)) * np.cos(deg2rad(lat2)) * np.sin(dlon/2) * np.sin(dlon/2)
    c = 2 * np.arctan2(np.sqrt(a), np.sqrt(1-a))
    return R * c
lat1, lat2 = is2data.atl08.lat.iloc[0], is2data.atl08.lat.iloc[-1]
lon1, lon2 = is2data.atl08.lon.iloc[0], is2data.atl08.lon.iloc[-1]
ground_track_length = dist_latlon2meters(lat1, lon1, lat2, lon2)
print('The ground track is about %.1f kilometers long.' % (ground_track_length/1e3))
The ground track is about 4.4 kilometers long.
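As a sanity check on the haversine formula above: two points one degree of latitude apart (at the same longitude) should be roughly 111 km apart on a sphere of radius 6371 km. A self-contained version of the same computation:

```python
import numpy as np

def haversine_meters(lat1, lon1, lat2, lon2):
    # same great-circle distance computation as dist_latlon2meters above
    R = 6371000  # mean earth radius in meters
    dlat, dlon = np.radians(lat2 - lat1), np.radians(lon2 - lon1)
    a = (np.sin(dlat/2)**2
         + np.cos(np.radians(lat1)) * np.cos(np.radians(lat2)) * np.sin(dlon/2)**2)
    return R * 2 * np.arctan2(np.sqrt(a), np.sqrt(1 - a))

print(haversine_meters(77.0, -24.0, 78.0, -24.0) / 1e3)  # approximately 111.2 km
```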
Google Earth Engine (GEE) has a large catalog of geospatial raster data, which is ready for analysis in the cloud. It also comes with an online JavaScript code editor.
But since we all seem to be using Python, it would be nice to have these capabilities available in our Jupyter comfort zone...
Thankfully, there is a Python API for GEE, which we imported earlier using `import ee`. It doesn't come with an interactive map, but the Python package geemap has us covered!
We can start working on our map by calling `geemap.Map()`. This just gives us a world map with a standard basemap.
from ipywidgets import Layout
Map = geemap.Map(layout=Layout(width='70%', max_height='450px'))
Map
Map(center=[0, 0], controls=(WidgetControl(options=['position', 'transparent_bg'], widget=SearchDataGUI(childr…
Now we need to add our ICESat-2 ground track to that map. Let's use the lon/lat coordinates of the ATL08 data product for this.
We also need to specify which Coordinate Reference System (CRS) our data is in. The longitude/latitude system that we are all quite familiar with is referenced by `EPSG:4326`. To add the ground track to the map, we need to turn it into an Earth Engine "Feature Collection".
ground_track_coordinates = list(zip(is2data.atl08.lon, is2data.atl08.lat))
ground_track_projection = 'EPSG:4326'  # <-- this specifies that our data is longitude/latitude in degrees [https://epsg.io/4326]
gtx_feature = ee.FeatureCollection(ee.Geometry.LineString(coords=ground_track_coordinates,
                                                          proj=ground_track_projection,
                                                          geodesic=True))
gtx_feature
Now that we have it in the right format, we can add it as a layer to the map.
Map.addLayer(gtx_feature, {'color': 'red'}, 'ground track')
According to the cell above this should be a red line. But we still can't see it, because we first need to tell the map where to look for it.
Let's calculate the center longitude and latitude, and center the map on it.
center_lon = (lon1 + lon2) / 2
center_lat = (lat1 + lat2) / 2
Map.setCenter(center_lon, center_lat, zoom=3);
Map
Map(center=[77.55243682861328, -23.970436096191406], controls=(WidgetControl(options=['position', 'transparent…
So we actually couldn't see it because it was in Greenland.
Unfortunately the basemap here doesn't give us much more information. Let's add a satellite imagery basemap.
This is a good time to look at the layer control on the top right.
Map.add_basemap('SATELLITE', opacity=0.5) # <-- this adds a layer called 'Google Satellite'
Map.layer_opacity(name='Google Satellite', opacity=0.5)
Map.setCenter(center_lon, center_lat, zoom=7);
Map.addLayer(gtx_feature,{'color': 'red'},'ground track')
Map
Map(center=[77.55243682861328, -23.970436096191406], controls=(WidgetControl(options=['position', 'transparent…
...okay, so this is near the margin of the ice sheet and there are some lakes. This might be important.
Let's see if we can find any more clues about the nature of this weird ICESat-2 data. Let's dig deeper.
The two Sentinel-2 satellites take images of most places on our planet at least every week or so. Maybe these images can tell us what was happening here around the same time that ICESat-2 acquired our data?
The imagery scenes live in image collections on Google Earth Engine.
You can find all collections here: https://developers.google.com/earth-engine/datasets/catalog/
The above link tells us we can find some images under `'COPERNICUS/S2_SR_HARMONIZED'`.
collection_name1 = 'COPERNICUS/S2_SR_HARMONIZED' # Sentinel-2 Level-2A surface reflectance earth engine collection
# https://developers.google.com/earth-engine/datasets/catalog/COPERNICUS_S2_SR_HARMONIZED
To access the collection, we call `ee.ImageCollection`:
collection = ee.ImageCollection(collection_name1)
collection
<ee.imagecollection.ImageCollection object at 0x7f6ba6bbc610>
Can we find out how many images there are in total?
number_of_scenes = collection.size()
print(number_of_scenes)
ee.Number({ "functionInvocationValue": { "functionName": "Collection.size", "arguments": { "collection": { "functionInvocationValue": { "functionName": "ImageCollection.load", "arguments": { "id": { "constantValue": "COPERNICUS/S2_SR_HARMONIZED" } } } } } } })
Actually, asking for the size of the collection does not compute anything! 🤔
It just tells Earth Engine on the server side that this variable refers to the size of the collection, which we may need later to do some analysis on the server. As long as this number is not needed, Earth Engine will not go through the trouble of actually computing it.
To force Earth Engine to compute and return information on the client side (our local machine / Cryocloud), we need to call `.getInfo()`. In this case that would be `number_of_scenes = collection.size().getInfo()`.
Because this would ask Earth Engine to count every single Sentinel-2 scene that exists, the command would take a really long time to execute. I will avoid this here and just give you the answer from when I wrote this tutorial.
# number_of_scenes = collection.size().getInfo()
number_of_scenes = 19323842
print('There are %i scenes in the image collection' % number_of_scenes)
There are 19323842 scenes in the image collection
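This lazy behavior can be mimicked in plain Python: an object records what to compute, and only a `getInfo()`-style call actually evaluates it. A toy analogy (not the ee library itself):

```python
class LazyValue:
    # records a computation without running it, like an Earth Engine server-side object
    def __init__(self, func, *args):
        self.func, self.args = func, args

    def getInfo(self):
        # only now is the computation actually performed
        return self.func(*self.args)

size = LazyValue(len, list(range(1000)))  # nothing is counted yet
print(size.getInfo())  # forces evaluation, like collection.size().getInfo()
```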
Who wants to look at almost 20 million pictures? I don't. So let's try to narrow it down.
Let's start with only images that overlap with the center of our ground track.
# the point of interest (center of the track) as an Earth Engine Geometry
point_of_interest = ee.Geometry.Point(center_lon, center_lat)
collection = collection.filterBounds(point_of_interest)
print('There are {number:d} images in the spatially filtered collection.'.format(number=collection.size().getInfo()))
There are 3208 images in the spatially filtered collection.
Much better! Now let's only look at images that were taken soon before or after ICESat-2 passed over this spot.
days_buffer_imagery = 6
dateformat = '%Y-%m-%d'
datetime_requested = datetime.strptime(is2data.date, dateformat)
search_start = (datetime_requested - timedelta(days=days_buffer_imagery)).strftime(dateformat)
search_end = (datetime_requested + timedelta(days=days_buffer_imagery)).strftime(dateformat)
print('Search for imagery from {start:s} to {end:s}.'.format(start=search_start, end=search_end))
Search for imagery from 2021-08-16 to 2021-08-28.
collection = collection.filterDate(search_start, search_end)
print('There are {number:d} images in the spatially and temporally filtered collection.'.format(number=collection.size().getInfo()))
There are 34 images in the spatially and temporally filtered collection.
We can also sort the collection by date (`'system:time_start'`) to order the images by acquisition time.
collection = collection.sort('system:time_start')
Again, we need to use `.getInfo()` to actually see any information on our end. This returns a Python dictionary.
info = collection.getInfo()
type(info)
dict
Let's see what's inside!
info.keys()
dict_keys(['type', 'bands', 'version', 'id', 'properties', 'features'])
`'features'` sounds like it could hold information about the images we are trying to find...
len(info['features'])
34
A list of 34 things! Those are probably the 34 images in the collection. Let's pick the first one and dig deeper!
feature_number = 0
info['features'][feature_number].keys()
dict_keys(['type', 'bands', 'version', 'id', 'properties'])
info['features'][feature_number]['id']
'COPERNICUS/S2_SR_HARMONIZED/20210816T150759_20210816T150915_T27XVG'
Looks like we found a reference to a Sentinel-2 image! Let's look at the `'bands'`.
for band in info['features'][feature_number]['bands']:
    print(band['id'], end=', ')
B1, B2, B3, B4, B5, B6, B7, B8, B8A, B9, B11, B12, AOT, WVP, SCL, TCI_R, TCI_G, TCI_B, MSK_CLDPRB, MSK_SNWPRB, QA10, QA20, QA60,
`'properties'` could be useful too!
info['features'][0]['properties'].keys()
dict_keys(['DATATAKE_IDENTIFIER', 'AOT_RETRIEVAL_ACCURACY', 'SPACECRAFT_NAME', 'SATURATED_DEFECTIVE_PIXEL_PERCENTAGE', 'MEAN_INCIDENCE_AZIMUTH_ANGLE_B8A', 'CLOUD_SHADOW_PERCENTAGE', 'MEAN_SOLAR_AZIMUTH_ANGLE', 'system:footprint', 'VEGETATION_PERCENTAGE', 'SOLAR_IRRADIANCE_B12', 'SOLAR_IRRADIANCE_B10', 'SENSOR_QUALITY', 'SOLAR_IRRADIANCE_B11', 'GENERATION_TIME', 'SOLAR_IRRADIANCE_B8A', 'FORMAT_CORRECTNESS', 'CLOUD_COVERAGE_ASSESSMENT', 'THIN_CIRRUS_PERCENTAGE', 'system:time_end', 'WATER_VAPOUR_RETRIEVAL_ACCURACY', 'system:time_start', 'DATASTRIP_ID', 'PROCESSING_BASELINE', 'SENSING_ORBIT_NUMBER', 'NODATA_PIXEL_PERCENTAGE', 'SENSING_ORBIT_DIRECTION', 'GENERAL_QUALITY', 'GRANULE_ID', 'REFLECTANCE_CONVERSION_CORRECTION', 'MEDIUM_PROBA_CLOUDS_PERCENTAGE', 'MEAN_INCIDENCE_AZIMUTH_ANGLE_B8', 'DATATAKE_TYPE', 'MEAN_INCIDENCE_AZIMUTH_ANGLE_B9', 'MEAN_INCIDENCE_AZIMUTH_ANGLE_B6', 'MEAN_INCIDENCE_AZIMUTH_ANGLE_B7', 'MEAN_INCIDENCE_AZIMUTH_ANGLE_B4', 'MEAN_INCIDENCE_ZENITH_ANGLE_B1', 'NOT_VEGETATED_PERCENTAGE', 'MEAN_INCIDENCE_AZIMUTH_ANGLE_B5', 'RADIOMETRIC_QUALITY', 'MEAN_INCIDENCE_AZIMUTH_ANGLE_B2', 'MEAN_INCIDENCE_AZIMUTH_ANGLE_B3', 'MEAN_INCIDENCE_ZENITH_ANGLE_B5', 'MEAN_INCIDENCE_AZIMUTH_ANGLE_B1', 'MEAN_INCIDENCE_ZENITH_ANGLE_B4', 'MEAN_INCIDENCE_ZENITH_ANGLE_B3', 'MEAN_INCIDENCE_ZENITH_ANGLE_B2', 'MEAN_INCIDENCE_ZENITH_ANGLE_B9', 'MEAN_INCIDENCE_ZENITH_ANGLE_B8', 'MEAN_INCIDENCE_ZENITH_ANGLE_B7', 'DARK_FEATURES_PERCENTAGE', 'HIGH_PROBA_CLOUDS_PERCENTAGE', 'MEAN_INCIDENCE_ZENITH_ANGLE_B6', 'UNCLASSIFIED_PERCENTAGE', 'MEAN_SOLAR_ZENITH_ANGLE', 'MEAN_INCIDENCE_ZENITH_ANGLE_B8A', 'RADIATIVE_TRANSFER_ACCURACY', 'MGRS_TILE', 'CLOUDY_PIXEL_PERCENTAGE', 'PRODUCT_ID', 'MEAN_INCIDENCE_ZENITH_ANGLE_B10', 'SOLAR_IRRADIANCE_B9', 'SNOW_ICE_PERCENTAGE', 'DEGRADED_MSI_DATA_PERCENTAGE', 'MEAN_INCIDENCE_ZENITH_ANGLE_B11', 'MEAN_INCIDENCE_ZENITH_ANGLE_B12', 'SOLAR_IRRADIANCE_B6', 'MEAN_INCIDENCE_AZIMUTH_ANGLE_B10', 'SOLAR_IRRADIANCE_B5', 'MEAN_INCIDENCE_AZIMUTH_ANGLE_B11', 
'SOLAR_IRRADIANCE_B8', 'MEAN_INCIDENCE_AZIMUTH_ANGLE_B12', 'SOLAR_IRRADIANCE_B7', 'SOLAR_IRRADIANCE_B2', 'SOLAR_IRRADIANCE_B1', 'SOLAR_IRRADIANCE_B4', 'GEOMETRIC_QUALITY', 'SOLAR_IRRADIANCE_B3', 'system:asset_size', 'WATER_PERCENTAGE', 'system:index'])
That's a lot going on right there! But `'GRANULE_ID'` is probably useful. Let's go through all our features and print the granule ID.
for feature in info['features']:
    print(feature['properties']['GRANULE_ID'])
L2A_T27XVG_A023217_20210816T150915 L2A_T26XNM_A023217_20210816T150915 L2A_T27XVG_A023231_20210817T143747 L2A_T26XNM_A023231_20210817T143747 L2A_T27XVG_A032140_20210817T152909 L2A_T26XNM_A032140_20210817T152909 L2A_T27XVG_A032154_20210818T145938 L2A_T26XNM_A032154_20210818T145938 L2A_T27XVG_A032168_20210819T142935 L2A_T26XNM_A032168_20210819T142935 L2A_T27XVG_A023260_20210819T151803 L2A_T26XNM_A023260_20210819T151803 L2A_T27XVG_A023274_20210820T144750 L2A_T26XNM_A023274_20210820T144750 L2A_T27XVG_A032197_20210821T150913 L2A_T26XNM_A032197_20210821T150913 L2A_T27XVG_A032211_20210822T143925 L2A_T26XNM_A032211_20210822T143925 L2A_T27XVG_A023303_20210822T152807 L2A_T26XNM_A023303_20210822T152807 L2A_T27XVG_A023317_20210823T145754 L2A_T26XNM_A023317_20210823T145754 L2A_T27XVG_A023331_20210824T142741 L2A_T26XNM_A023331_20210824T142741 L2A_T27XVG_A032240_20210824T151911 L2A_T26XNM_A032240_20210824T151911 L2A_T27XVG_A032254_20210825T144920 L2A_T26XNM_A032254_20210825T144920 L2A_T27XVG_A023360_20210826T150906 L2A_T26XNM_A023360_20210826T150906 L2A_T27XVG_A023374_20210827T143745 L2A_T26XNM_A023374_20210827T143745 L2A_T27XVG_A032283_20210827T152908 L2A_T26XNM_A032283_20210827T152908
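The granule IDs printed above follow a fixed pattern: processing level, MGRS tile, orbit number, and sensing datetime, separated by underscores. A small sketch for picking one apart:

```python
from datetime import datetime

def parse_granule_id(granule_id):
    # e.g. 'L2A_T27XVG_A023217_20210816T150915'
    level, tile, orbit, sensing = granule_id.split('_')
    return {'level': level,
            'mgrs_tile': tile[1:],    # strip the leading 'T'
            'orbit': int(orbit[1:]),  # strip the leading 'A'
            'datetime': datetime.strptime(sensing, '%Y%m%dT%H%M%S')}

parsed = parse_granule_id('L2A_T27XVG_A023217_20210816T150915')
print(parsed['mgrs_tile'], parsed['datetime'])
```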
The visible bands in Sentinel-2 are `'B2'`: blue, `'B3'`: green, `'B4'`: red.
So to show a "true color" RGB composite image on the map, we need to select these bands in R-G-B order:
myImage = collection.first()
myImage_RGB = myImage.select('B4', 'B3', 'B2')
vis_params = {'min': 0.0, 'max': 10000, 'opacity': 1.0, 'gamma': 1.5}
Map.addLayer(myImage_RGB, vis_params, name='my image')
Map.addLayer(gtx_feature,{'color': 'red'},'ground track')
Map
Map(center=[77.55243682861328, -23.970436096191406], controls=(WidgetControl(options=['position', 'transparent…
This seems to have worked. But there are clouds everywhere.
We need a better approach to get anywhere here. To do this, we use not only the Sentinel-2 surface reflectance image collection, but also merge it with the Sentinel-2 cloud probability collection, which can be accessed under `COPERNICUS/S2_CLOUD_PROBABILITY`.
Let's write a function that adds the cloud probability band to each Sentinel-2 image and calculates the mean cloud probability in the neighborhood of the ICESat-2 ground track, then map this function over our location/date-filtered collection.
def get_sentinel2_cloud_collection(is2data, days_buffer=6, gt_buffer=100):
    # create the area of interest for cloud likelihood assessment
    ground_track_coordinates = list(zip(is2data.atl08.lon, is2data.atl08.lat))
    ground_track_projection = 'EPSG:4326'  # our data is lon/lat in degrees [https://epsg.io/4326]
    gtx_feature = ee.Geometry.LineString(coords=ground_track_coordinates,
                                         proj=ground_track_projection,
                                         geodesic=True)
    area_of_interest = gtx_feature.buffer(gt_buffer)
    datetime_requested = datetime.strptime(is2data.date, '%Y-%m-%d')
    start_date = (datetime_requested - timedelta(days=days_buffer)).strftime('%Y-%m-%dT%H:%M:%S')
    end_date = (datetime_requested + timedelta(days=days_buffer)).strftime('%Y-%m-%dT%H:%M:%S')
    print('Search for imagery from {start:s} to {end:s}.'.format(start=start_date, end=end_date))

    # import and filter S2 SR HARMONIZED
    s2_sr_collection = (ee.ImageCollection('COPERNICUS/S2_SR_HARMONIZED')
                        .filterBounds(area_of_interest)
                        .filterDate(start_date, end_date))

    # import and filter s2cloudless
    s2_cloudless_collection = (ee.ImageCollection('COPERNICUS/S2_CLOUD_PROBABILITY')
                               .filterBounds(area_of_interest)
                               .filterDate(start_date, end_date))

    # join the filtered s2cloudless collection to the SR collection by the 'system:index' property
    cloud_collection = ee.ImageCollection(ee.Join.saveFirst('s2cloudless').apply(**{
        'primary': s2_sr_collection, 'secondary': s2_cloudless_collection,
        'condition': ee.Filter.equals(**{'leftField': 'system:index', 'rightField': 'system:index'})}))

    # add the cloud probability band to each image
    cloud_collection = cloud_collection.map(lambda img: img.addBands(ee.Image(img.get('s2cloudless')).select('probability')))

    # compute the mean cloud probability over the buffered ground track, set it as an image property
    def set_is2_cloudiness(img, aoi=area_of_interest):
        cloudprob = img.select(['probability']).reduceRegion(reducer=ee.Reducer.mean(),
                                                             geometry=aoi,
                                                             bestEffort=True,
                                                             maxPixels=1e6)
        return img.set('ground_track_cloud_prob', cloudprob.get('probability'))

    return cloud_collection.map(set_is2_cloudiness)
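The `ee.Join.saveFirst(...)` call in the function above is essentially a keyed join: each surface reflectance image is paired with the cloud probability image that shares its `'system:index'`. The same idea in plain Python with dictionaries (the values here are made up for illustration):

```python
sr_images = [{'system:index': 'a', 'bands': ['B2', 'B3', 'B4']},
             {'system:index': 'b', 'bands': ['B2', 'B3', 'B4']}]
cloud_images = [{'system:index': 'a', 'probability': 12.1},
                {'system:index': 'b', 'probability': 4.5}]

# index the secondary collection by the join key, then attach matches to the primary
by_index = {img['system:index']: img for img in cloud_images}
joined = [{**sr, 's2cloudless': by_index[sr['system:index']]}
          for sr in sr_images if sr['system:index'] in by_index]
print(joined[1]['s2cloudless']['probability'])
```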
Let's get this collection for our ICESat-2 data, and print all the granule IDs along with the associated cloudiness over the ground track.
collection = get_sentinel2_cloud_collection(is2data)
info = collection.getInfo()
for feature in info['features']:
    print('%s --> along-track cloud probability: %5.1f %%' % (feature['properties']['GRANULE_ID'],
                                                              feature['properties']['ground_track_cloud_prob']))
Search for imagery from 2021-08-16T00:00:00 to 2021-08-28T00:00:00. L2A_T26XNM_A023217_20210816T150915 --> along-track cloud probability: 89.7 % L2A_T27XVG_A023217_20210816T150915 --> along-track cloud probability: 89.7 % L2A_T26XNM_A023231_20210817T143747 --> along-track cloud probability: 4.5 % L2A_T27XVG_A023231_20210817T143747 --> along-track cloud probability: 4.6 % L2A_T26XNM_A032140_20210817T152909 --> along-track cloud probability: 12.1 % L2A_T27XVG_A032140_20210817T152909 --> along-track cloud probability: 12.1 % L2A_T26XNM_A032154_20210818T145938 --> along-track cloud probability: 10.3 % L2A_T27XVG_A032154_20210818T145938 --> along-track cloud probability: 10.1 % L2A_T26XNM_A032168_20210819T142935 --> along-track cloud probability: 12.9 % L2A_T27XVG_A032168_20210819T142935 --> along-track cloud probability: 12.8 % L2A_T26XNM_A023260_20210819T151803 --> along-track cloud probability: 9.8 % L2A_T27XVG_A023260_20210819T151803 --> along-track cloud probability: 9.9 % L2A_T26XNM_A023274_20210820T144750 --> along-track cloud probability: 1.3 % L2A_T27XVG_A023274_20210820T144750 --> along-track cloud probability: 1.3 % L2A_T26XNM_A032197_20210821T150913 --> along-track cloud probability: 20.2 % L2A_T27XVG_A032197_20210821T150913 --> along-track cloud probability: 20.3 % L2A_T26XNM_A032211_20210822T143925 --> along-track cloud probability: 14.8 % L2A_T27XVG_A032211_20210822T143925 --> along-track cloud probability: 14.8 % L2A_T26XNM_A023303_20210822T152807 --> along-track cloud probability: 15.7 % L2A_T27XVG_A023303_20210822T152807 --> along-track cloud probability: 15.8 % L2A_T26XNM_A023317_20210823T145754 --> along-track cloud probability: 5.6 % L2A_T27XVG_A023317_20210823T145754 --> along-track cloud probability: 5.5 % L2A_T26XNM_A023331_20210824T142741 --> along-track cloud probability: 8.0 % L2A_T27XVG_A023331_20210824T142741 --> along-track cloud probability: 7.9 % L2A_T26XNM_A032240_20210824T151911 --> along-track cloud probability: 7.5 % 
L2A_T27XVG_A032240_20210824T151911 --> along-track cloud probability: 7.5 % L2A_T26XNM_A032254_20210825T144920 --> along-track cloud probability: 7.7 % L2A_T27XVG_A032254_20210825T144920 --> along-track cloud probability: 7.8 % L2A_T26XNM_A023360_20210826T150906 --> along-track cloud probability: 15.8 % L2A_T27XVG_A023360_20210826T150906 --> along-track cloud probability: 15.8 % L2A_T26XNM_A023374_20210827T143745 --> along-track cloud probability: 10.3 % L2A_T27XVG_A023374_20210827T143745 --> along-track cloud probability: 10.1 % L2A_T26XNM_A032283_20210827T152908 --> along-track cloud probability: 9.8 % L2A_T27XVG_A032283_20210827T152908 --> along-track cloud probability: 9.8 %
We specify a cloud probability threshold, and then only keep the images that fall below it. Here we choose a quite aggressive maximum of 5% cloud probability...
# filter by maximum allowable cloud probability (in percent)
MAX_CLOUD_PROB_ALONG_TRACK = 5
cloudfree_collection = collection.filter(ee.Filter.lt('ground_track_cloud_prob', MAX_CLOUD_PROB_ALONG_TRACK))
print('There are %i cloud-free scenes.' % cloudfree_collection.size().getInfo())
There are 4 cloud-free scenes.
Using the image property `'system:time_start'`, we can calculate the time difference from the ICESat-2 overpass and set it as a property. This allows us to sort the collection by it and to make sure that the first image in the collection is the cloud-free image closest in time to the ICESat-2 overpass.
# get the time difference between ICESat-2 overpass and Sentinel-2 acquisitions, set it as an image property
is2time = is2data.date + 'T12:00:00'
def set_time_difference(img, is2time=is2time):
    timediff = ee.Date(is2time).difference(img.get('system:time_start'), 'second').abs()
    return img.set('timediff', timediff)
cloudfree_collection = cloudfree_collection.map(set_time_difference).sort('timediff')
Print some stats for the final collection to make sure everything looks alright.
info = cloudfree_collection.getInfo()
for feature in info['features']:
    s2datetime = datetime.fromtimestamp(feature['properties']['system:time_start']/1e3)
    is2datetime = datetime.strptime(is2time, '%Y-%m-%dT%H:%M:%S')
    timediff = s2datetime - is2datetime
    timediff -= timedelta(microseconds=timediff.microseconds)
    diffsign = 'before' if timediff < timedelta(0) else 'after'
    print('%s --> along-track cloud probability: %5.1f %%, %s %7s ICESat-2' % (feature['properties']['GRANULE_ID'],
                                                                               feature['properties']['ground_track_cloud_prob'],
                                                                               np.abs(timediff), diffsign))
L2A_T26XNM_A023274_20210820T144750 --> along-track cloud probability: 1.3 %, 1 day, 21:09:34 before ICESat-2 L2A_T27XVG_A023274_20210820T144750 --> along-track cloud probability: 1.3 %, 1 day, 21:09:38 before ICESat-2 L2A_T26XNM_A023231_20210817T143747 --> along-track cloud probability: 4.5 %, 4 days, 21:19:32 before ICESat-2 L2A_T27XVG_A023231_20210817T143747 --> along-track cloud probability: 4.6 %, 4 days, 21:19:36 before ICESat-2
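The sort key here is just the absolute time difference to the ICESat-2 overpass. The same selection in plain Python, with hypothetical acquisition times:

```python
from datetime import datetime

is2time = datetime(2021, 8, 22, 12, 0, 0)  # assumed overpass time (midday on the ICESat-2 date)
# hypothetical Sentinel-2 acquisition times
acquisitions = [datetime(2021, 8, 17, 14, 37),
                datetime(2021, 8, 20, 14, 47),
                datetime(2021, 8, 24, 15, 19)]
# sort by absolute difference in seconds, like the 'timediff' image property
closest_first = sorted(acquisitions, key=lambda t: abs((t - is2time).total_seconds()))
print(closest_first[0])
```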
first_image_rgb = cloudfree_collection.first().select('B4', 'B3', 'B2')
Map = geemap.Map(layout=Layout(width='70%', max_height='450px'))
Map.add_basemap('SATELLITE')
Map.addLayer(first_image_rgb, vis_params, name='my image')
Map.addLayer(gtx_feature,{'color': 'red'},'ground track')
Map.centerObject(gtx_feature, zoom=12)
Map
Map(center=[77.55243721126286, -23.970458522024117], controls=(WidgetControl(options=['position', 'transparent…
We can use `.getDownloadUrl()` on an Earth Engine image.
It asks for a scale, which is just the pixel size in meters (10 m for the Sentinel-2 visible bands). It also asks for the region we would like to export; here we use a `.buffer` around the center of the ground track.
(Note: This function can only be used for small download jobs because there is a request size limit. Here, we only download a small region around the ground track, and convert the image to an 8-bit RGB composite to keep the file size low. For larger jobs you should use `Export.image.toDrive`.)
# create a region around the ground track over which to download data
point_of_interest = ee.Geometry.Point(center_lon, center_lat)
buffer_around_center_meters = ground_track_length*0.52  # slightly more than half the track length
region_of_interest = point_of_interest.buffer(buffer_around_center_meters)
# make the image 8-bit RGB
s2rgb = first_image_rgb.unitScale(ee.Number(0), ee.Number(10000)).clamp(0.0, 1.0).multiply(255.0).uint8()
# get the download URL
downloadURL = s2rgb.getDownloadUrl({'name': 'mySatelliteImage',
'crs': s2rgb.projection().crs(),
'scale': 10,
'region': region_of_interest,
'filePerBand': False,
'format': 'GEO_TIFF'})
downloadURL
'https://earthengine.googleapis.com/v1/projects/earthengine-legacy/thumbnails/a69f85a839d8212b4311789c51be1432-1fcb1753952838f942607d736fe298a6:getPixels'
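The `unitScale`/`clamp`/`multiply`/`uint8` chain applied above maps raw reflectance values (on a 0 to 10000 scale) to 8-bit pixel values. The same operation can be written in NumPy; the pixel values here are made up for illustration:

```python
import numpy as np

# made-up raw Sentinel-2 reflectance values (0-10000 scale)
raw = np.array([0, 2500, 10000, 12000])

# unitScale(0, 10000) -> clamp(0, 1) -> multiply(255) -> uint8
rgb8 = (np.clip(raw / 10000.0, 0.0, 1.0) * 255.0).astype(np.uint8)
print(rgb8)  # -> [  0  63 255 255]
```

Note that values above 10000 are clamped to 255 rather than wrapping around, which is why the clamp step matters before the cast to `uint8`.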
We can save the content of the download URL with the `requests` library.
response = requests.get(downloadURL)
if not os.path.isdir('imagery'):
os.makedirs('imagery')
filename = 'imagery/my-satellite-image.tif'
with open(filename, 'wb') as f:
f.write(response.content)
print('Downloaded %s' % filename)
Downloaded imagery/my-satellite-image.tif
myImage = rio.open(filename)
myImage
<open DatasetReader name='imagery/my-satellite-image.tif' mode='r'>
Now we can easily plot the image in a matplotlib figure, using the `rasterio.plot` module.
fig, ax = plt.subplots(figsize=[4,4])
rioplot.show(myImage, ax=ax);
display(fig)
Because our plot is now in the projected Coordinate Reference System of the Sentinel-2 image, we need to project the coordinates of the ground track from their lon/lat values. The `rasterio.warp.transform` function has us covered. From there on, it's just plotting in matplotlib.
gtx_x, gtx_y = warp.transform(src_crs='epsg:4326', dst_crs=myImage.crs, xs=is2data.atl08.lon, ys=is2data.atl08.lat)
ax.plot(gtx_x, gtx_y, color='red', linestyle='-')
ax.axis('off')
display(fig)
All of the code above is wrapped up more concisely in one more method: `dataCollector.visualize_sentinel2()`.
This method has some default parameters, which you can change when calling it:
- max_cloud_prob = 20
- days_buffer = 10
- gamma_value = 1.8
- title = 'ICESat-2 data'
- imagery_filename = 'imagery/my-satellite-image.tif'
- plot_filename = 'plots/my-plot.jpg'
We can now do everything we did in this tutorial in just three lines!
url = 'https://openaltimetry.earthdatacloud.nasa.gov/data/api/icesat2/atl03?date=2021-08-22&minx=-24.03772653601395&miny=77.53753839308544&maxx=-23.86438315078483&maxy=77.57604313591165&trackId=909&beamNames=gt1r&beamNames=gt1l&outputFormat=json'
is2data = dataCollector(oaurl=url,beam='gt1r')
fig = is2data.visualize_sentinel2(max_cloud_prob=5, title='ICESat-2 visualization tutorial data', plot_filename='plots/tutorial_data.jpg')
display(fig)
--> Getting data from OpenAltimetry.
---> requesting ATL03 data... 6127 data points.
---> requesting ATL06 data... 188 data points.
---> requesting ATL07 data... request failed.
---> requesting ATL08 data... 24 data points.
---> requesting ATL10 data... request failed.
---> requesting ATL12 data... No data.
---> requesting ATL13 data... No data.
The ground track is 4.3 km long.
Looking for Sentinel-2 images from 2021-08-12T12:00:00 to 2021-09-01T12:00:00 --> there are 4 cloud-free images.
Looking for Sentinel-2 images from 2021-08-02T12:00:00 to 2021-09-11T12:00:00 --> there are 8 cloud-free images.
Looking for Sentinel-2 images from 2021-07-23T12:00:00 to 2021-09-21T12:00:00 --> there are 24 cloud-free images.
--> Closest cloud-free Sentinel-2 image to ICESat:
- product_id: S2B_MSIL2A_20210820T144749_N0301_R082_T26XNM_20210820T173503
- time difference: 2 days before ICESat-2
- mean along-track cloud probability: 1.3
--> Downloaded the 8-bit RGB image as imagery/my-satellite-image.tif.
--> Saved plot as plots/tutorial_data.jpg.
And now we can also easily run this on some totally different data! First, with an OpenAltimetry API URL...
url = 'https://openaltimetry.earthdatacloud.nasa.gov/data/api/icesat2/atl03?date=2020-12-15&minx=-77.858681&miny=25.728091&maxx=-77.831461&maxy=25.832559&trackId=1254&beamNames=gt1r&beamNames=gt1l&outputFormat=json'
mydata = dataCollector(oaurl=url, beam='gt1r')
fig = mydata.visualize_sentinel2(max_cloud_prob=20,
title='Nearshore Bathymetry ICESat-2 Data',
imagery_filename='imagery/my-other-satellite-image.tif',
plot_filename='plots/nearshore-bathymetry.jpg')
display(fig)
--> Getting data from OpenAltimetry.
---> requesting ATL03 data... 34237 data points.
---> requesting ATL06 data... 519 data points.
---> requesting ATL07 data... request failed.
---> requesting ATL08 data... 115 data points.
---> requesting ATL10 data... request failed.
---> requesting ATL12 data... 2 data points.
---> requesting ATL13 data... 316 data points.
The ground track is 11.7 km long.
Looking for Sentinel-2 images from 2020-12-05T12:00:00 to 2020-12-25T12:00:00 --> there are 4 cloud-free images.
Looking for Sentinel-2 images from 2020-11-25T12:00:00 to 2021-01-04T12:00:00 --> there are 10 cloud-free images.
--> Closest cloud-free Sentinel-2 image to ICESat:
- product_id: S2A_MSIL2A_20201213T155521_N0214_R011_T17RRJ_20201213T195958
- time difference: 2 days before ICESat-2
- mean along-track cloud probability: 5.7
--> Downloaded the 8-bit RGB image as imagery/my-other-satellite-image.tif.
--> Saved plot as plots/nearshore-bathymetry.jpg.
...or with input parameters (track, date, beam, and bounding box)
# weird lake detected in EAIS
kwargs = {'track': 842,
'date': '2020-11-18',
'beam': 'gt2l',
'lonlims': [95.50, 95.55],
'latlims': [-65.635, -65.615]
}
mydata = dataCollector(**kwargs)
fig = mydata.visualize_sentinel2(max_cloud_prob=20, title='An Ice Shelf Bench', plot_filename='plots/weird-lake-detection_bench.jpg')
display(fig)
--> Getting data from OpenAltimetry.
---> requesting ATL03 data... 60763 data points.
---> requesting ATL06 data... 113 data points.
---> requesting ATL07 data... No data.
---> requesting ATL08 data... 20 data points.
---> requesting ATL10 data... No data.
---> requesting ATL12 data... No data.
---> requesting ATL13 data... No data.
The ground track is 2.2 km long.
Looking for Sentinel-2 images from 2020-11-08T12:00:00 to 2020-11-28T12:00:00 --> there are 6 cloud-free images.
Looking for Sentinel-2 images from 2020-10-29T12:00:00 to 2020-12-08T12:00:00 --> there are 9 cloud-free images.
Looking for Sentinel-2 images from 2020-10-19T12:00:00 to 2020-12-18T12:00:00 --> there are 10 cloud-free images.
--> Closest cloud-free Sentinel-2 image to ICESat:
- product_id: S2B_MSIL2A_20201118T025609_N0214_R003_T46DFN_20201118T052915
- time difference: Same day as ICESat-2
- mean along-track cloud probability: 3.6
--> Downloaded the 8-bit RGB image as imagery/my-satellite-image.tif.
--> Saved plot as plots/weird-lake-detection_bench.jpg.
url = 'https://openaltimetry.earthdatacloud.nasa.gov/data/api/icesat2/atl06?date=2021-01-10&minx=-119.587886&miny=37.730067&maxx=-119.488871&maxy=37.775582&trackId=265&beamNames=gt3r&beamNames=gt3l&beamNames=gt2r&beamNames=gt2l&beamNames=gt1r&beamNames=gt1l&outputFormat=json'
mydata = dataCollector(oaurl=url, beam='gt2l')
fig = mydata.visualize_sentinel2(title='Half Dome and Yosemite Valley', plot_filename='plots/half_dome_yosemite.jpg')
display(fig)
--> Getting data from OpenAltimetry.
---> requesting ATL03 data... 5798 data points.
---> requesting ATL06 data... 193 data points.
---> requesting ATL07 data... request failed.
---> requesting ATL08 data... 19 data points.
---> requesting ATL10 data... request failed.
---> requesting ATL12 data... No data.
---> requesting ATL13 data... No data.
The ground track is 5.0 km long.
Looking for Sentinel-2 images from 2020-12-31T12:00:00 to 2021-01-20T12:00:00 --> there are 8 cloud-free images.
Looking for Sentinel-2 images from 2020-12-21T12:00:00 to 2021-01-30T12:00:00 --> there are 14 cloud-free images.
--> Closest cloud-free Sentinel-2 image to ICESat:
- product_id: S2B_MSIL2A_20210111T184739_N0214_R070_T11SKB_20210111T210436
- time difference: 1 day after ICESat-2
- mean along-track cloud probability: 16.1
--> Downloaded the 8-bit RGB image as imagery/my-satellite-image.tif.
--> Saved plot as plots/half_dome_yosemite.jpg.
kwargs = {'track': 721,
'date': '2019-05-15',
'beam': 'gt3r',
'latlims': [-71.35, -71.26],
'lonlims': [71.41, 71.59]
}
mydata = dataCollector(**kwargs)
fig = mydata.visualize_sentinel2(max_cloud_prob=5, title='Ice-Covered Lake on Ice Shelf', plot_filename='plots/ice_covered_lake.jpg')
display(fig)
--> Getting data from OpenAltimetry.
---> requesting ATL03 data... 29323 data points.
---> requesting ATL06 data... 509 data points.
---> requesting ATL07 data... No data.
---> requesting ATL08 data... 102 data points.
---> requesting ATL10 data... No data.
---> requesting ATL12 data... No data.
---> requesting ATL13 data... No data.
The ground track is 10.1 km long.
Looking for Sentinel-2 images from 2019-05-05T12:00:00 to 2019-05-25T12:00:00 --> there are not enough cloud-free images: widening date range...
Looking for Sentinel-2 images from 2019-04-25T12:00:00 to 2019-06-04T12:00:00 --> there are not enough cloud-free images: widening date range...
Looking for Sentinel-2 images from 2019-04-15T12:00:00 to 2019-06-14T12:00:00 --> there are not enough cloud-free images: widening date range...
Looking for Sentinel-2 images from 2019-04-05T12:00:00 to 2019-06-24T12:00:00 --> there are not enough cloud-free images: widening date range...
Looking for Sentinel-2 images from 2019-03-26T12:00:00 to 2019-07-04T12:00:00 --> there are not enough cloud-free images: widening date range...
Looking for Sentinel-2 images from 2019-03-16T12:00:00 to 2019-07-14T12:00:00 --> there are not enough cloud-free images: widening date range...
Looking for Sentinel-2 images from 2019-03-06T12:00:00 to 2019-07-24T12:00:00 --> there are 2 cloud-free images.
Looking for Sentinel-2 images from 2019-02-24T12:00:00 to 2019-08-03T12:00:00 --> there are 4 cloud-free images.
Looking for Sentinel-2 images from 2019-02-14T12:00:00 to 2019-08-13T12:00:00 --> there are 6 cloud-free images.
Looking for Sentinel-2 images from 2019-02-04T12:00:00 to 2019-08-23T12:00:00 --> there are 8 cloud-free images.
Looking for Sentinel-2 images from 2019-01-25T12:00:00 to 2019-09-02T12:00:00 --> there are 8 cloud-free images.
Looking for Sentinel-2 images from 2019-01-15T12:00:00 to 2019-09-12T12:00:00 --> there are 10 cloud-free images.
--> Closest cloud-free Sentinel-2 image to ICESat:
- product_id: S2B_MSIL2A_20190314T034629_N0211_R075_T42DWF_20190314T084528
- time difference: 62 days before ICESat-2
- mean along-track cloud probability: 0.1
--> Downloaded the 8-bit RGB image as imagery/my-satellite-image.tif.
--> Saved plot as plots/ice_covered_lake.jpg.
kwargs['date'] = '2019-11-13'
mydata = dataCollector(**kwargs)
fig = mydata.visualize_sentinel2(max_cloud_prob=20, title='Where did the lake go?', plot_filename='plots/ice_covered_lake_drained.jpg')
display(fig)
--> Getting data from OpenAltimetry.
---> requesting ATL03 data... 47229 data points.
---> requesting ATL06 data... 509 data points.
---> requesting ATL07 data... No data.
---> requesting ATL08 data... 102 data points.
---> requesting ATL10 data... No data.
---> requesting ATL12 data... No data.
---> requesting ATL13 data... No data.
The ground track is 10.1 km long.
Looking for Sentinel-2 images from 2019-11-03T12:00:00 to 2019-11-23T12:00:00 --> there are 2 cloud-free images.
Looking for Sentinel-2 images from 2019-10-24T12:00:00 to 2019-12-03T12:00:00 --> there are 4 cloud-free images.
Looking for Sentinel-2 images from 2019-10-14T12:00:00 to 2019-12-13T12:00:00 --> there are 6 cloud-free images.
Looking for Sentinel-2 images from 2019-10-04T12:00:00 to 2019-12-23T12:00:00 --> there are 8 cloud-free images.
Looking for Sentinel-2 images from 2019-09-24T12:00:00 to 2020-01-02T12:00:00 --> there are 10 cloud-free images.
--> Closest cloud-free Sentinel-2 image to ICESat:
- product_id: S2B_MSIL2A_20191119T034629_N0213_R075_T42DWG_20191119T070644
- time difference: 6 days after ICESat-2
- mean along-track cloud probability: 0.7
--> Downloaded the 8-bit RGB image as imagery/my-satellite-image.tif.
--> Saved plot as plots/ice_covered_lake_drained.jpg.
Find some data from OpenAltimetry, and plot it with a cloud-free satellite image.
If you don't know where to start with OpenAltimetry, you can look here.
##### YOUR CODE GOES HERE - uncomment and edit it!
# url = 'http://???.org/??'
# gtx = 'gt??'
# is2data = dataCollector(oaurl=url, beam=gtx)
# fig = is2data.visualize_sentinel2(title='<your figure title goes here>',
# imagery_filename='imagery/your-satellite-image.tif',
# plot_filename='plots/your-plot.jpg')
🎉 Congratulations! You've completed this tutorial and have seen how we can put ICESat-2 photon-level data into context using Google Earth Engine and the OpenAltimetry API.
You can explore a few more use cases for this code (and possible solutions for Exercise 2) in this notebook.
To further explore the topics of this tutorial see the following detailed documentation:
The following is based on JavaScript code in this example.
# a function for buffering ICESat-2 along-track depth measurement locations by a footprint radius
def bufferPoints(radius, bounds=False):
def buffer(pt):
pt = ee.Feature(pt)
return pt.buffer(radius).bounds() if bounds else pt.buffer(radius)
return buffer
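`bufferPoints` returns a function, which is why it can be passed straight to `.map()` over a feature collection. The closure pattern itself has nothing to do with Earth Engine; here it is with plain numbers (toy function names, for illustration only):

```python
def scalePoints(factor, offset=0):
    # returns a function that can be mapped over a collection,
    # with 'factor' and 'offset' captured in the closure
    def scale(pt):
        return pt * factor + offset
    return scale

# analogous to pts.map(bufferPoints(radius)):
scaled = list(map(scalePoints(2, offset=1), [1, 2, 3]))
print(scaled)  # -> [3, 5, 7]
```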
# a function for extracting Sentinel-2 band data for ICESat-2 along-track depth measurement locations
def zonalStats(ic, fc, reducer=ee.Reducer.mean(), scale=None, crs=None, bands=None, bandsRename=None,
imgProps=None, imgPropsRename=None, datetimeName='datetime', datetimeFormat='YYYY-MM-dd HH:mm:ss'):
# Set default parameters based on an image representative.
imgRep = ic.first()
nonSystemImgProps = ee.Feature(None).copyProperties(imgRep).propertyNames()
if not bands:
bands = imgRep.bandNames()
if not bandsRename:
bandsRename = bands
if not imgProps:
imgProps = nonSystemImgProps
if not imgPropsRename:
imgPropsRename = imgProps
# Map the reduceRegions function over the image collection.
results = ic.map(lambda img:
img.select(bands, bandsRename)
.set(datetimeName, img.date().format(datetimeFormat))
.set('timestamp', img.get('system:time_start'))
.reduceRegions(collection=fc.filterBounds(img.geometry()),reducer=reducer,scale=scale,crs=crs)
.map(lambda f: f.set(img.toDictionary(imgProps).rename(imgProps,imgPropsRename)))
).flatten().filter(ee.Filter.notNull(bandsRename))
return results
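Conceptually, `zonalStats` reduces all pixels that fall inside each buffered point to one value per band. A client-side analogue of that reduction step with pandas, using synthetic per-pixel samples and hypothetical column names:

```python
import pandas as pd

# synthetic per-pixel samples, tagged with the buffered point they fall in
pixels = pd.DataFrame({
    'plot_id': [0, 0, 0, 1, 1],
    'B2':      [8000, 8100, 8060, 7900, 7940],
    'B4':      [6600, 6580, 6620, 6500, 6520],
})

# the ee.Reducer.mean() step: one mean value per band per zone
zonal = pixels.groupby('plot_id').mean()
print(zonal)
```

The difference on the server side is that Earth Engine does this spatially (pixels are assigned to zones by geometry via `reduceRegions`), but the reduction itself is the same group-and-aggregate operation.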
ground_track_buffer = 7.5 # radius in meters, for a 15-m diameter footprint
ground_track_coordinates = list(zip(is2data.gt.lon, is2data.gt.lat))
gtx_feature = ee.Geometry.LineString(coords=ground_track_coordinates, proj='EPSG:4326', geodesic=True)
aoi = gtx_feature.buffer(ground_track_buffer)
# clip to the ground track for along-track querying
img_gt = cloudfree_collection.first().clip(aoi)
# get the footprint-buffered query points from the lake object's depth_data dataframe
pts = ee.FeatureCollection([
ee.Feature(ee.Geometry.Point([r.lon, r.lat]), {'plot_id': i}) for i,r in is2data.gt.iterrows()
])
ptsS2 = pts.map(bufferPoints(ground_track_buffer))
# query the Sentinel-2 bands at the ground track locations where lake depths are posted
thiscoll = ee.ImageCollection([img_gt])
bandNames = ['B1', 'B2', 'B3', 'B4', 'B5', 'B6', 'B7', 'B8', 'B8A', 'B9', 'B11', 'B12', 'probability']
bandIdxs = list(range(len(bandNames)-1)) + [23]
ptsData = zonalStats(thiscoll, ptsS2, bands=bandIdxs, bandsRename=bandNames, imgProps=['PRODUCT_ID'], scale=5)
# get the data to the client side and merge into the lake depth data frame
features_select = ['plot_id'] + bandNames
results = ptsData.select(features_select, retainGeometry=False).getInfo()['features']
data = [x["properties"] for x in results]
dfS2 = pd.DataFrame(data).set_index('plot_id')
df_bands = is2data.gt.join(dfS2, how='left')
# calculate the Normalized Difference Water Index (NDWI)
df_bands['ndwi'] = (df_bands.B2-df_bands.B4) / (df_bands.B2+df_bands.B4)
df_bands
| | index | lat | lon | xatc | B1 | B11 | B12 | B2 | B3 | B4 | B5 | B6 | B7 | B8 | B8A | B9 | probability | ndwi |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0 | 499 | 77.575984 | -23.950755 | 0.000000 | 7805.000000 | 174.496656 | 120.102007 | 7971.206243 | 7477.810479 | 6570.976589 | 6385.264214 | 5722.692308 | 4950.603679 | 4591.400223 | 4282.834448 | 4309.000000 | 0.0 | 0.096287 |
1 | 498 | 77.575907 | -23.950819 | 8.666588 | 7809.721973 | 175.095852 | 121.388453 | 7907.470852 | 7459.334081 | 6539.421525 | 6362.034753 | 5699.419283 | 4924.241031 | 4508.647982 | 4249.598094 | 4307.600897 | 0.0 | 0.094695 |
2 | 497 | 77.575830 | -23.950883 | 17.333177 | 7828.404029 | 175.104645 | 126.960269 | 7900.264130 | 7383.585898 | 6459.908226 | 6302.698937 | 5653.784555 | 4872.342473 | 4480.991606 | 4187.358142 | 4302.065473 | 0.0 | 0.100302 |
3 | 496 | 77.575754 | -23.950947 | 25.999765 | 7832.000000 | 174.848384 | 127.294872 | 7836.829431 | 7354.361204 | 6391.734671 | 6289.449275 | 5643.166667 | 4859.569119 | 4514.595318 | 4171.269231 | 4301.000000 | 0.0 | 0.101563 |
4 | 495 | 77.575677 | -23.951011 | 34.666353 | 7832.000000 | 180.040578 | 117.342968 | 7886.403558 | 7362.290161 | 6407.406337 | 6297.875486 | 5695.351862 | 4884.257921 | 4552.197888 | 4147.739300 | 4301.000000 | 0.0 | 0.103471 |
... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... |
495 | 4 | 77.538014 | -23.982483 | 4289.961227 | 7934.040044 | 190.987764 | 139.721913 | 8243.434928 | 7803.492770 | 7010.246941 | 6933.573415 | 6266.519466 | 5602.593993 | 5345.003337 | 4844.038376 | 4688.687430 | 0.0 | 0.080845 |
496 | 3 | 77.537937 | -23.982546 | 4298.627816 | 7928.000000 | 186.282294 | 150.310134 | 8169.710468 | 7785.692650 | 7035.095768 | 7021.868597 | 6326.749443 | 5705.568486 | 5387.273942 | 4982.011693 | 4766.000000 | 0.0 | 0.074622 |
497 | 2 | 77.537860 | -23.982610 | 4307.294404 | 7928.000000 | 188.880872 | 150.717562 | 8295.310962 | 7911.302013 | 7148.988814 | 6999.876957 | 6320.019575 | 5681.105705 | 5445.953020 | 4969.465884 | 4766.000000 | 0.0 | 0.074223 |
498 | 1 | 77.537783 | -23.982673 | 4315.960992 | 7928.000000 | 196.303641 | 151.083473 | 8282.308123 | 7897.145098 | 7167.764706 | 6872.538375 | 6280.672269 | 5563.534454 | 5361.434174 | 4868.732773 | 4766.000000 | 0.0 | 0.072138 |
499 | 0 | 77.537707 | -23.982736 | 4324.627581 | 7928.000000 | 196.805804 | 151.053571 | 8031.430804 | 7693.513393 | 6898.491071 | 6836.042411 | 6268.083705 | 5528.127232 | 5196.671875 | 4845.238839 | 4766.000000 | 0.0 | 0.075884 |
500 rows × 18 columns
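The `ndwi` column is just a normalized difference of the blue (B2) and red (B4) band values. A quick numeric check against the first row of the table above:

```python
# B2 and B4 values from the first row of the data frame above
b2, b4 = 7971.206243, 6570.976589

# normalized difference water index, as computed in the code above
ndwi = (b2 - b4) / (b2 + b4)
print(round(ndwi, 6))  # -> 0.096287, matching the table's ndwi column
```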
fig, axs = plt.subplots(figsize=[9,5], nrows=3, sharex=True)
ax = axs[0]
ax.scatter(is2data.atl03.lat, is2data.atl03.h, s=1, c='k', label='ATL03')
ax.set_ylim((741, 818))
ax.legend()
ax = axs[1]
ax.plot(df_bands.lat,df_bands.ndwi, c='b', label='NDWI')
ax.set_ylim((0, 1))
ax.legend()
ax = axs[2]
scale = 1e-4
ax.plot(df_bands.lat,df_bands.B4*scale, c='r', label='B4')
ax.plot(df_bands.lat,df_bands.B3*scale, c='g', label='B3')
ax.plot(df_bands.lat,df_bands.B2*scale, c='b', label='B2')
ax.set_ylim((0, 1))
ax.legend(fontsize=8)
ax.set_xlim((is2data.gt.lat.min(), is2data.gt.lat.max()))
ax.set_xlabel('latitude')
plt.close(fig)
display(fig)