
Terrestrial Laser Scanning

This notebook takes terrestrial laser scanner (TLS) data from the SnowEx Alaska Campaigns and derives snow depth. The TLS data is provided both as raw point clouds and as processed DEMs. For this example, we will focus on the TLS DEMs.

The TLS data is available in the cloud through NSIDC, so we will use the earthaccess package. The first example works with a single TLS scan for simplicity; a second example then examines multiple TLS scans from the campaigns.

!pip install --upgrade earthaccess
import earthaccess
import matplotlib.pyplot as plt
import numpy as np
import re
import rioxarray as rxr
import shutil
import tempfile
import xarray as xr

The TLS data was gathered in Bonanza Creek near Fairbanks, AK in two months: October 2022 and March 2023. These months correspond to the snow-off and snow-on seasons, respectively. We will start by getting some sample snow-on TLS data from a single day.

# Authenticate with Earthdata Login servers
auth = earthaccess.login(strategy="interactive")

# Search for snow-on granules
results = earthaccess.search_data(
    # short_name="SNEX23_BCEF_TLS",
    doi="10.5067/R466GRXNA61S",
    temporal=('2023-03-15', '2023-03-15'),
)

Because the TLS data is available on-demand through the cloud, we do not need to download it. Instead, we can stream it directly with rioxarray!

# Load a single TLS scan (here, the second granule returned by the search)
files = earthaccess.open(results)
snow_on = rxr.open_rasterio(files[1])
snow_on.rio.width  # raster width in pixels
# Visualize the snow-on data
fig, ax = plt.subplots()
snow_on.plot(ax=ax, vmin=123, vmax=126,
             cbar_kwargs={'label': "Elevation [m]"})
ax.set_xlabel("Easting [m]")
ax.set_ylabel("Northing [m]")
ax.set_title(" ")  # blank out xarray's default title

Two things are noticeable from this TLS data:

  1. It has a very high resolution (0.15 m).
  2. The signal attenuates after ~60 m, so we have a small field of view.

This suggests that we will be able to obtain very fine-scale measurements of snow depth, but we will need scans from multiple locations to better characterize snow in Bonanza Creek.
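
Both properties are easy to confirm from the raster metadata itself. Here is a minimal check using rioxarray's standard rio.resolution() and rio.bounds() accessors:

# Sanity-check the pixel size and scan extent from the raster metadata
xres, yres = snow_on.rio.resolution()          # pixel size in map units (m); yres is negative
xmin, ymin, xmax, ymax = snow_on.rio.bounds()  # bounding box of the scan
print(f"Resolution: {abs(xres):.2f} m x {abs(yres):.2f} m")
print(f"Extent: {xmax - xmin:.0f} m x {ymax - ymin:.0f} m")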

In any case, let’s grab the snow-off data from the same location, and try to derive snow depth.

# Now search for snow-off granules
results = earthaccess.search_data(
    # short_name="SNEX23_BCEF_TLS",
    doi="10.5067/R466GRXNA61S",
    temporal=('2022-10-25', '2022-10-25'),
)
display(results)
# Again, load a single snow-off TLS scan
files = earthaccess.open(results)
snow_off = rxr.open_rasterio(files[1])
fig, ax = plt.subplots()
snow_off.plot(ax=ax, vmin=123, vmax=126,
              cbar_kwargs={'label': "Elevation [m]"})
ax.set_xlabel("Easting [m]")
ax.set_ylabel("Northing [m]")
ax.set_title(" ")

Although the snow-on and snow-off rasters look similar, their grids do not align exactly, so we cannot difference them right away. We must first interpolate the snow-on data onto the snow-off grid, making sure fill values are accounted for, and then take the difference.

# Interpolate snow-on data onto the x/y grid of snow-off data
snow_on_interp = snow_on.interp(
    x=snow_off.x,
    y=snow_off.y,
    kwargs={"fill_value": snow_on.attrs.get('_FillValue', np.nan)}
)

# Calculate the difference (snow depth)
difference = snow_on_interp - snow_off

# Identify the fill value used in the data (defaulting to -9999.0)
fill = snow_off.attrs.get('_FillValue', -9999.0)

# Include only data that is not equal to the fill value
difference = difference.where((snow_off != fill) & (snow_on_interp != fill))
# Plot snow depth over the TLS scene
fig, ax = plt.subplots()
difference.plot(ax=ax, vmin=0, vmax=1.5,
                cbar_kwargs={'label': "Snow depth [m]"})
ax.set_xlabel("Easting [m]")
ax.set_ylabel("Northing [m]")
ax.set_title(" ")
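
As an aside, if we want to keep this derived product, rioxarray can write the DataArray back out as a GeoTIFF. A minimal sketch (the output filename is arbitrary, and we assume the CRS metadata carried through the subtraction, which rioxarray normally preserves via the spatial_ref coordinate):

# Optionally, save the derived snow depth raster as a GeoTIFF
difference.rio.to_raster("bcef_tls_snow_depth.tif")  # example filename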

Although not perfect, this provides a very reasonable snow depth map for the TLS data gathered at this location. If we want, we can compute basic statistics on the derived snow depths.

# Calculate median snow depth over the scene (.item() yields a plain float for formatting)
median_depth = difference.where(difference >= 0).median().item()

# Make histogram plot of snow depth
fig, ax = plt.subplots()
difference.where(difference >= 0).plot.hist(ax=ax, bins=50)
ax.axvline(x=median_depth, color='black', linewidth=2, linestyle='--') # Median depth line
ax.set_xlim([0, 2.5])
ax.set_ylabel("Counts")
ax.set_xlabel("Snow depth [m]")
ax.set_title(' ')
ax.text(1, 8000, f'Median depth = {median_depth:.2f} m', fontsize=12)
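
Beyond the median, other summary statistics come straight from the same masked DataArray using standard xarray reductions, for example:

# Additional summary statistics over the valid (non-negative) depths
valid = difference.where(difference >= 0)
print(f"Mean depth: {valid.mean().item():.2f} m")
print(f"Std. dev.: {valid.std().item():.2f} m")
print(f"90th percentile: {valid.quantile(0.9).item():.2f} m")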

Multiple Scans Example

Because we can stream the TLS data through the cloud, this example closely follows the code above. The main difference is that we will build lists of DataArrays, from which we derive snow depth for every TLS scanning site that has both snow-on and snow-off coverage.

# Search for snow-on granules
snow_on_results = earthaccess.search_data(
    # short_name="SNEX23_BCEF_TLS",
    doi="10.5067/R466GRXNA61S",
    temporal=('2023-03-01', '2023-03-31'),
)

snow_off_results = earthaccess.search_data(
    # short_name="SNEX23_BCEF_TLS",
    doi="10.5067/R466GRXNA61S",
    temporal=('2022-10-01', '2022-10-31'),
)
# Create list of snow-on DataArrays
snow_on_files = earthaccess.open(snow_on_results)
snow_on_rasters = [rxr.open_rasterio(f) for f in snow_on_files]

# Create list of snow-off DataArrays
snow_off_files = earthaccess.open(snow_off_results)
snow_off_rasters = [rxr.open_rasterio(f) for f in snow_off_files]

To make the final plot of this example cleaner, we will assign each TLS scan a label based on the site ID at Bonanza Creek.

snon_site_ids = []
snoff_site_ids = []
# Get site IDs for each snow-on DataArray
for f in snow_on_files:
    # Get path from file name
    path = f.path
    # Use regex to extract the site ID from the file path, given the pattern _<SW|N>_YYYYMMDD_SITEID_V
    m = re.search(r'_(SW|N)_\d{8}_(.*?)_V', path)
    if m:
        snon_site_ids.append(m.group(2))
    else:
        snon_site_ids.append("unknown")

# Get site IDs for each snow-off DataArray
for f in snow_off_files:
    # Get path from file name
    path = f.path
    # Use regex to extract the site ID, given the pattern _<SW|N|NE>_YYYYMMDD_SITEID_V
    m = re.search(r'_(SW|N|NE)_\d{8}_(.*?)_V', path)
    if m:
        snoff_site_ids.append(m.group(2))
    else:
        snoff_site_ids.append("unknown")

print(snon_site_ids)
print(snoff_site_ids)
# Add site ID to attributes of DataArrays
for r, site in zip(snow_on_rasters, snon_site_ids):
    r.attrs['site_id'] = site

for r, site in zip(snow_off_rasters, snoff_site_ids):
    r.attrs['site_id'] = site
# Create dictionaries linking each DataArray to a site ID
snow_on_dict = {r.attrs['site_id']: r for r in snow_on_rasters}
snow_off_dict = {r.attrs['site_id']: r for r in snow_off_rasters}

Now each TLS scan is linked to a site ID. However, the snow-on campaign produced many more scans than the snow-off campaign. Because deriving snow depth is our priority, we will only consider snow-on scans that share a site ID with the snow-off data.

# Determine site IDs with recorded data for both snow-off and snow-on season
common_site_ids = sorted(set(snow_on_dict).intersection(snow_off_dict))
print("Common site IDs:", common_site_ids)
# Create lists of DataArrays for the common sites only
snow_on_paired = [snow_on_dict[sid] for sid in common_site_ids]
snow_off_paired = [snow_off_dict[sid] for sid in common_site_ids]

Now that the site IDs are matched, deriving snow depth works just as in the first example; we simply loop over the paired scans to make the calculation (and plotting) easier.

snow_depths = []
# Interpolate DataArrays and derive snow depth, as before
for so, soff, site in zip(snow_on_paired, snow_off_paired, common_site_ids):
    # Interpolate snow-on data onto the x/y grid of snow-off data
    tmp_interp = so.interp(
        x=soff.x,
        y=soff.y,
    )

    tmp_diff = tmp_interp - soff
    tmp_diff.attrs['site_id'] = site

    # Keep only plausible depths (0-2 m); tmp_diff[0] selects the single band
    tmp_diff = tmp_diff.where((tmp_diff[0] > 0) & (tmp_diff[0] <= 2))
    snow_depths.append(tmp_diff)
# Plot the derived snow depths in a 3x3 figure
fig, axes = plt.subplots(3, 3, figsize=(12, 12))
axes = axes.flatten()

for idx, data_array in enumerate(snow_depths):
    data_array.plot(ax=axes[idx], vmin=0, vmax=2)
    axes[idx].set_title(data_array.attrs['site_id'])

plt.tight_layout()
plt.show()

That’s all there is to it! Some of the coverage is a bit sparse, and the depths over site DEC look rather high, but we otherwise have reasonable snow depths over nine sites in Bonanza Creek. These could then be compared with other ground-based efforts or airborne data to cross-calibrate observation methods.
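
As a starting point for such comparisons, a per-site summary of the derived depths takes only a few lines, reusing the site_id attribute we attached earlier:

# Per-site median snow depth, handy when cross-comparing with other datasets
for d in snow_depths:
    med = d.median().item()  # depths were already masked to the (0, 2] m range
    print(f"{d.attrs['site_id']}: median depth = {med:.2f} m")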