Coregistration#

xDEM implements a wide range of coregistration algorithms and pipelines from the peer-reviewed literature for three-dimensional alignment, often tailored specifically to elevation data and aimed at correcting systematic elevation errors.

Two categories of alignment are generally differentiated: 3D affine transformations described below, and other alignments that possibly rely on external variables, described in Bias correction.

Affine transformations can include vertical and horizontal translations, rotations and reflections, and scalings.

More reading

Coregistration heavily relies on the use of static surfaces, which you can read more about on the Static surfaces as error proxy guide page.

Quick use#

Coregistration pipelines are defined by combining Coreg objects:

import xdem

# Create a coregistration pipeline
my_coreg_pipeline = xdem.coreg.ICP() + xdem.coreg.NuthKaab()

# Or use a single method
my_coreg_pipeline = xdem.coreg.NuthKaab()

Then, a pair of elevation datasets can be coregistered by calling xdem.coreg.workflows.dem_coregistration(), or xdem.DEM.coregister_3d() from the DEM that should be aligned.

import geoutils as gu
import numpy as np
import matplotlib.pyplot as plt
from xdem.coreg.workflows import dem_coregistration

# Open a reference and to-be-aligned DEM
ref_dem = xdem.DEM(xdem.examples.get_path("longyearbyen_ref_dem"))
tba_dem = xdem.DEM(xdem.examples.get_path("longyearbyen_tba_dem"))
# Coregister by calling the dem_coregistration function
aligned_dem = dem_coregistration(tba_dem, ref_dem, coreg_method=my_coreg_pipeline)
# (Equivalent) Or by calling the DEM method of the to-be-aligned DEM
aligned_dem = tba_dem.coregister_3d(ref_dem, my_coreg_pipeline)

Alternatively, the coregistration can be applied by calling fit_and_apply(), or by sequentially calling the fit() and apply() steps, which allows a broader variety of arguments at each step and re-using the same transformation for several objects (e.g., applying the horizontal shift of a stereo DEM to its ortho-image as well).

# (Equivalent) Or use fit and apply
aligned_dem = my_coreg_pipeline.fit_and_apply(ref_dem, tba_dem)
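
For instance, the same estimated transformation can be re-applied to another raster on the same grid, such as an ortho-image (a minimal sketch; the ortho-image file name is hypothetical, and we assume apply() accepts any raster sharing the DEM's georeferencing):

import geoutils as gu

# Estimate the transformation on the DEM pair
my_coreg_pipeline.fit(ref_dem, tba_dem)
# Apply it to the DEM to align...
aligned_dem = my_coreg_pipeline.apply(tba_dem)
# ...and to another raster on the same grid (hypothetical file name)
ortho_img = gu.Raster("myortho.tif")
aligned_ortho = my_coreg_pipeline.apply(ortho_img)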

Information about the coregistration inputs and outputs is summarized in info().

Tip

Often, an inlier_mask has to be passed to fit() to isolate static surfaces to use during coregistration (for instance, masking out vegetation, snow and glaciers). This mask can be easily derived using create_mask().
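
For example, using the glacier outlines included in the xDEM example data to mask out moving glacier surfaces (a minimal sketch):

import geoutils as gu

# Open glacier outlines and rasterize them onto the reference DEM grid
glacier_outlines = gu.Vector(xdem.examples.get_path("longyearbyen_glacier_outlines"))
# Static surfaces are everything outside the glaciers
inlier_mask = ~glacier_outlines.create_mask(ref_dem)
# Pass the mask when fitting
aligned_dem = my_coreg_pipeline.fit_and_apply(ref_dem, tba_dem, inlier_mask=inlier_mask)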

Summary of supported methods#

Affine method              Description                            Reference
Nuth and Kääb (2011)       Horizontal and vertical translations   Nuth and Kääb (2011)
Minimization of dh         Horizontal and vertical translations   N/A
Iterative closest point    Translation and rotations              Besl and McKay (1992)
Vertical shift             Vertical translation                   N/A

Using a coregistration#

The Coreg object#

Each coregistration method implemented in xDEM inherits its interface from the Coreg class, and has the following methods:

  • fit_and_apply() for estimating the transformation and applying it in one step,

  • info() for printing the metadata, including inputs and outputs of the coregistration.

For more details on the input and output metadata stored by coregistration methods, see the Coregistration metadata section.

The two above methods cover most uses. More specific methods are also available:

  • fit() for estimating the transformation without applying it,

  • apply() for applying an estimated transformation,

  • to_matrix() to convert the transform to a 4x4 transformation matrix, if possible,

  • from_matrix() to create a coregistration from a 4x4 transformation matrix.
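
For example, the fitted transformation can be exported and re-applied manually (a sketch re-using the pipeline fitted above; apply_matrix is the function used further below to create misaligned data):

# Export the estimated transform as a 4x4 affine matrix
matrix = my_coreg_pipeline.to_matrix()
# Re-apply that matrix to the to-be-aligned DEM
realigned_dem = xdem.coreg.apply_matrix(tba_dem, matrix)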

Inheritance diagram of implemented coregistrations: [diagram of xdem.coreg.base.Coreg, xdem.coreg.affine and xdem.coreg.biascorr]

See Bias correction for more information on non-rigid transformations.

Coregistration methods#

Important

Below, we create misaligned elevation data to exemplify the different methods in relation to the type of affine transformation they correct.

See coregistration on real data in the Basic and Advanced gallery examples!

Nuth and Kääb (2011)#

xdem.coreg.NuthKaab

  • Performs: Horizontal and vertical shifts.

  • Supports weights: Planned.

  • Pros: Refines sub-pixel horizontal shifts accurately, with fast convergence.

  • Cons: Diverges on flat terrain, as landforms are required to constrain the fit with aspect and slope.

The Nuth and Kääb (2011) coregistration approach estimates a horizontal translation iteratively by solving a cosine equation between the terrain slope, aspect and the elevation differences. The iteration stops when it reaches the maximum number of iterations, or when the iterative shift amplitude falls below a specified tolerance.
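
At each iteration, the horizontal shift is estimated by fitting the relation of Nuth and Kääb (2011) between the slope-normalized elevation differences and the terrain aspect (in the notation of the original paper):

$$\frac{dh}{\tan(\alpha)} = a \cdot \cos(b - \psi) + \frac{\overline{dh}}{\tan(\alpha)}$$

where dh are the elevation differences, α the terrain slope, ψ the terrain aspect, a the magnitude and b the direction of the horizontal shift, and dh̄ the mean vertical bias.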

x_shift = 30
y_shift = 30
z_shift = 10
# Affine matrix for 3D transformation
matrix = np.array(
    [
        [1, 0, 0, x_shift],
        [0, 1, 0, y_shift],
        [0, 0, 1, z_shift],
        [0, 0, 0, 1],
    ]
)
# We create misaligned elevation data
tba_dem_shifted = xdem.coreg.apply_matrix(ref_dem, matrix)
# Define a coregistration based on the Nuth and Kääb (2011) method
nuth_kaab = xdem.coreg.NuthKaab()
# Fit to data and apply
aligned_dem = nuth_kaab.fit_and_apply(ref_dem, tba_dem_shifted)
# Plot before and after
f, ax = plt.subplots(1, 2)
ax[0].set_title("Before NK")
(tba_dem_shifted - ref_dem).plot(cmap='RdYlBu', vmin=-30, vmax=30, ax=ax[0])
ax[1].set_title("After NK")
(aligned_dem - ref_dem).plot(cmap='RdYlBu', vmin=-30, vmax=30, ax=ax[1], cbar_title="Elevation differences (m)")
_ = ax[1].set_yticklabels([])
plt.tight_layout()

Minimization of dh#

xdem.coreg.DhMinimize

  • Performs: Horizontal and vertical shifts.

  • Supports weights: Planned.

  • Pros: Can be used to perform both local and global registration by tuning the minimizer.

  • Cons: Sensitive to noise.

The minimization of elevation differences (or dh) coregistration approach estimates a horizontal translation by minimizing any statistic on the elevation differences, typically their spread (defaults to the NMAD).

# Define a coregistration based on the dh minimization approach
dh_minimize = xdem.coreg.DhMinimize()
# Fit to data and apply
aligned_dem = dh_minimize.fit_and_apply(ref_dem, tba_dem_shifted)
# Plot before and after
f, ax = plt.subplots(1, 2)
ax[0].set_title("Before dh\nminimize")
(tba_dem_shifted - ref_dem).plot(cmap='RdYlBu', vmin=-30, vmax=30, ax=ax[0])
ax[1].set_title("After dh\nminimize")
(aligned_dem - ref_dem).plot(cmap='RdYlBu', vmin=-30, vmax=30, ax=ax[1], cbar_title="Elevation differences (m)")
_ = ax[1].set_yticklabels([])
plt.tight_layout()

Vertical shift#

xdem.coreg.VerticalShift

  • Performs: Vertical shifting using any custom function (mean, median, percentile).

  • Supports weights: Planned.

  • Pros: Useful as an independent step to precisely refine the vertical alignment, which is the most sensitive to outliers, by adjusting the inliers and the central estimator function.

  • Cons: Always needs to be combined with another approach.

The vertical shift coregistration is simply a vertical translation based on a central estimate of the elevation differences (the mean by default), with a customizable estimator function.

# Apply a vertical shift of 10 meters
tba_dem_vshifted = ref_dem + 10
# Define a coregistration object based on a vertical shift correction
vshift = xdem.coreg.VerticalShift(vshift_reduc_func=np.median)
# Fit and apply
aligned_dem = vshift.fit_and_apply(ref_dem, tba_dem_vshifted)
# Plot before and after
f, ax = plt.subplots(1, 2)
ax[0].set_title("Before vertical\nshift")
(tba_dem_vshifted - ref_dem).plot(cmap='RdYlBu', vmin=-30, vmax=30, ax=ax[0])
ax[1].set_title("After vertical\nshift")
(aligned_dem - ref_dem).plot(cmap='RdYlBu', vmin=-30, vmax=30, ax=ax[1], cbar_title="Elevation differences (m)")
_ = ax[1].set_yticklabels([])
plt.tight_layout()

Iterative closest point#

xdem.coreg.ICP

  • Performs: Rigid transformation (3D translation + 3D rotation).

  • Supports weights: No.

  • Pros: Efficient at estimating rotation and shifts simultaneously.

  • Cons: Poor sub-pixel accuracy for horizontal shifts, sensitive to outliers, and runs slowly with large samples.

Iterative Closest Point (ICP) coregistration is an iterative point cloud registration method from Besl and McKay (1992). It aims at iteratively minimizing the distance between closest neighbours by applying sequential rigid transformations. If DEMs are used as inputs, they are converted to point clouds. As for Nuth and Kääb (2011), the iteration stops when it reaches the maximum number of iterations or when the iterative transformation amplitude falls below a specified tolerance.

ICP is currently based on OpenCV’s implementation (an optional dependency), which includes outlier removal arguments. This may improve results significantly on outlier-prone data, but care should still be taken, as the risk of landing in local minima increases.
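
For instance, the rejection threshold could be tightened at initialization (a sketch; max_iterations and tolerance are the iterative inputs described in the metadata section below, while the name rejection_scale mirrors OpenCV's ICP options and is an assumption that may differ between versions):

# Hypothetical outlier-rejection tuning; parameter name may differ
icp_robust = xdem.coreg.ICP(max_iterations=100, tolerance=0.05, rejection_scale=2.5)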

# Apply a rotation of 0.2 degrees in X, 0.1 in Y and 0 in Z
e = np.deg2rad([0.2, 0.1, 0])
# Add X/Y/Z shifts in meters
shifts = np.array([10, 20, 5])
# Affine matrix for 3D transformation
import pytransform3d.rotations
matrix_rot = pytransform3d.rotations.matrix_from_euler(e, i=0, j=1, k=2, extrinsic=True)
matrix = np.diag(np.ones(4, dtype=float))
matrix[:3, :3] = matrix_rot
matrix[:3, 3] = shifts

centroid = [ref_dem.bounds.left + 5000, ref_dem.bounds.top - 2000, np.median(ref_dem)]
# We create misaligned elevation data
tba_dem_shifted_rotated = xdem.coreg.apply_matrix(ref_dem, matrix, centroid=centroid)
# Define a coregistration based on ICP
icp = xdem.coreg.ICP()
# Fit to data and apply
aligned_dem = icp.fit_and_apply(ref_dem, tba_dem_shifted_rotated)
# Plot before and after
f, ax = plt.subplots(1, 2)
ax[0].set_title("Before ICP")
(tba_dem_shifted_rotated - ref_dem).plot(cmap='RdYlBu', vmin=-30, vmax=30, ax=ax[0])
ax[1].set_title("After ICP")
(aligned_dem - ref_dem).plot(cmap='RdYlBu', vmin=-30, vmax=30, ax=ax[1], cbar_title="Elevation differences (m)")
_ = ax[1].set_yticklabels([])
plt.tight_layout()

Building coregistration pipelines#

The CoregPipeline object#

Often, more than one coregistration approach is necessary to obtain the best results, and several need to be combined sequentially. A CoregPipeline can be constructed for this:

# We can list sequential coregistration methods to apply
pipeline = xdem.coreg.CoregPipeline([xdem.coreg.ICP(), xdem.coreg.NuthKaab()])

# Or sum them, which works identically to the syntax above
pipeline = xdem.coreg.ICP() + xdem.coreg.NuthKaab()

The CoregPipeline object exposes the same interface as the Coreg object. The results of a pipeline can be used in other programs by exporting the combined transformation matrix using to_matrix().

# Fit to data and apply the pipeline of ICP + Nuth and Kääb
aligned_dem = pipeline.fit_and_apply(ref_dem, tba_dem_shifted_rotated)
# Plot before and after
f, ax = plt.subplots(1, 2)
ax[0].set_title("Before ICP + NK")
(tba_dem_shifted_rotated - ref_dem).plot(cmap='RdYlBu', vmin=-30, vmax=30, ax=ax[0])
ax[1].set_title("After ICP + NK")
(aligned_dem - ref_dem).plot(cmap='RdYlBu', vmin=-30, vmax=30, ax=ax[1], cbar_title="Elevation differences (m)")
_ = ax[1].set_yticklabels([])
plt.tight_layout()
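
Once fitted, the combined matrix of the pipeline can be exported for use in other software (a minimal sketch; the output file name is arbitrary):

# Save the combined 4x4 matrix of the fitted ICP + Nuth and Kääb pipeline
np.savetxt("pipeline_matrix.txt", pipeline.to_matrix())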

Coregistration metadata#

The metadata surrounding a coregistration method, which can be displayed by info(), is stored in the meta nested dictionary. This metadata is divided into inputs and outputs. Input metadata corresponds to what arguments are used when initializing a Coreg object, while output metadata are created during the call to fit(). Together, they make it possible to apply the transformation during the apply() step of the coregistration.

# Example of metadata info after fitting
my_coreg_pipeline.info()
Generic coregistration information 
  Method:       NuthKaab 
  Is affine?    True 
  Fit called?   True 
Inputs
  Randomization
    Subsample size requested:                     500000.0
    Random generator:                             None
  Fitting and binning
    Fit, bin or bin+fit:                          bin_and_fit
    Function to fit:                              _nuth_kaab_fit_func
    Optimizer for fitting:                        curve_fit
    Bin sizes or edges:                           72
    Binning statistic:                            nanmedian
    Number of dimensions of binning and fitting:  1
    Names of bias variables:                      ['aspect']
  Iterative
    Maximum number of iterations:                 10
    Tolerance to reach (pixel size):              0.001
Outputs
  Randomization
    Subsample size drawn from valid values:       500000
  Affine
    Eastward shift estimated (georeferenced unit):8.6
    Northward shift estimated (georeferenced unit):1.3
    Vertical shift estimated (elevation unit):    -2.43
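
The same metadata can also be accessed programmatically through the meta dictionary (a sketch; the exact category key names are indicative and may differ between versions):

# Inputs and outputs are stored as nested dictionaries
print(my_coreg_pipeline.meta["inputs"])
# For instance, the estimated eastward shift of an affine method (assumed key names)
print(my_coreg_pipeline.meta["outputs"]["affine"]["shift_x"])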

For both inputs and outputs, four consistent categories of metadata are defined, along with a fifth category of method-specific metadata.

Note: Some metadata, such as the parameters fit_or_bin and fit_func described below, are pre-defined for affine coregistration methods and cannot be modified. They only take user-specified values for Bias correction methods.

1. Randomization metadata (common to all):

  • An input subsample to define the subsample size of valid data to use in all methods (recommended for performance),

  • An input random_state to define the random seed for reproducibility of the subsampling (and potentially other random aspects such as optimizers),

  • An output subsample_final that stores the final subsample size used, which can be smaller than requested depending on the amount of valid data intersecting the two elevation datasets.
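
Following the definition of inputs above, these can be set when initializing a coregistration object (a minimal sketch):

# Use a smaller subsample and a fixed seed for reproducibility
nk = xdem.coreg.NuthKaab(subsample=100000, random_state=42)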

2. Fitting and binning metadata (common to nearly all methods):

  • An input fit_or_bin to either fit a parametric model by passing “fit”, perform an empirical binning by passing “bin”, or to fit a parametric model to the binning with “bin_and_fit” (only “fit” or “bin_and_fit” possible for affine methods),

  • An input fit_func to pass any parametric function to fit to the bias (pre-defined for affine methods),

  • An input fit_optimizer to pass any optimizer function to perform the fit minimization,

  • An input bin_sizes to pass the size or edges of the bins for each variable,

  • An input bin_statistic to pass the statistic to compute in each bin,

  • An input bin_apply_method to pass the method to apply the binning for correction,

  • An output fit_params that stores the optimized parameters for fit_func,

  • An output fit_perr that stores the error of optimized parameters (only for default fit_optimizer),

  • An output bin_dataframe that stores the dataframe of binned statistics.

3. Iteration metadata (common to all iterative methods):

  • An input max_iterations to define the maximum number of iterations at which to stop the method,

  • An input tolerance to define the tolerance at which to stop iterations (tolerance unit defined in method description),

  • An output last_iteration that stores the last iteration of the method,

  • An output all_tolerances that stores the tolerances computed at each iteration.
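
For example, the stopping criteria of an iterative method can be adjusted at initialization (a sketch using the input names above; the tolerance unit for Nuth and Kääb is in pixel size):

# Allow more iterations and require a tighter convergence tolerance
nk_strict = xdem.coreg.NuthKaab(max_iterations=20, tolerance=0.0005)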

4. Affine metadata (common to all affine methods):

  • An output matrix that stores the estimated affine matrix,

  • An output centroid that stores the centroid coordinates with which to apply the affine transformation,

  • Outputs shift_x, shift_y and shift_z that store the easting, northing and vertical offsets, respectively.

Tip

In xDEM, you can extract the translations and rotations of an affine matrix using xdem.coreg.AffineCoreg.to_translations and xdem.coreg.AffineCoreg.to_rotations.

To further manipulate affine matrices, see the documentation of pytransform3d.
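
A minimal sketch using the ICP object fitted above (assuming both methods return the three per-axis components):

# Translations along X, Y, Z and rotations around each axis (assumed return format)
shift_x, shift_y, shift_z = icp.to_translations()
rot_x, rot_y, rot_z = icp.to_rotations()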

5. Specific metadata (only for certain methods):

These metadata are only inputs specific to a given method, outlined in the method description.

For instance, for xdem.coreg.Deramp, an input poly_order to define the polynomial order used for the fit, and for xdem.coreg.DirectionalBias, an input angle to define the angle at which to do the directional correction.
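
For example (a sketch using the two inputs named above; the values are arbitrary):

# Polynomial order for a deramping, and angle for a directional bias correction
deramp = xdem.coreg.Deramp(poly_order=2)
dir_bias = xdem.coreg.DirectionalBias(angle=65)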

Dividing coregistration in blocks#

The BlockwiseCoreg object#

Caution

The BlockwiseCoreg feature is still experimental: it might not support all coregistration methods and may create edge artefacts.

Sometimes, we want to split a coregistration across different spatial subsets of an elevation dataset, running that method independently in each subset. A BlockwiseCoreg can be constructed for this:

blockwise = xdem.coreg.BlockwiseCoreg(xdem.coreg.NuthKaab(), subdivision=16)

The subdivision corresponds to an equal-length block division across the extent of the elevation dataset. It needs to be a number of the form 2ⁿ (such as 4 or 256).

It is run the same way as other coregistrations:

# Run 16 block coregistrations
aligned_dem = blockwise.fit_and_apply(ref_dem, tba_dem_shifted)
# Plot before and after
f, ax = plt.subplots(1, 2)
ax[0].set_title("Before block NK")
(tba_dem_shifted - ref_dem).plot(cmap='RdYlBu', vmin=-30, vmax=30, ax=ax[0])
ax[1].set_title("After block NK")
(aligned_dem - ref_dem).plot(cmap='RdYlBu', vmin=-30, vmax=30, ax=ax[1], cbar_title="Elevation differences (m)")
_ = ax[1].set_yticklabels([])