
Add equivalent layer for harmonic functions #78

Merged · 60 commits · Sep 20, 2019
c714a85
Start experimenting with EQL
santisoler Jun 28, 2019
a08efbe
Replace 2D for 3D
santisoler Jul 4, 2019
4a0751a
Change gridder for interpolator
santisoler Jul 4, 2019
1a1b3e5
Continue experimenting with EQL
santisoler Jul 4, 2019
01938f0
Initialize data array with zeros instead of empty
santisoler Jul 5, 2019
d2920ef
Improve some styling
santisoler Jul 5, 2019
a7a1255
Add docstrings for functions and methods
santisoler Jul 5, 2019
1a0fe92
Simplify arguments for jacobian_numba
santisoler Jul 5, 2019
3db62d4
Split distance computation on independent function
santisoler Jul 5, 2019
cd005be
Define points at depth 3 times min distance
santisoler Jul 5, 2019
17df1e7
Change default depth to 3x dist to nearest point
santisoler Jul 5, 2019
e56c8d7
Add depth_factor parameter
santisoler Jul 16, 2019
512056f
Move points and depth_factor arguments to __init__
santisoler Jul 21, 2019
2f798df
Rename point masses to sources and masses to coefs
santisoler Jul 21, 2019
3d0e8a7
Rename gridder to EQLHarmonic
santisoler Jul 21, 2019
2138f4e
Incorporate verde.median_distance
santisoler Jul 21, 2019
4293aa1
Import EQLHarmonic on harmonica/__init__.py
santisoler Jul 21, 2019
3f69ede
Use verde.base.n_1d_arrays on input coordinates
santisoler Jul 21, 2019
7aa8881
Change coordinates from list to tuple in predict
santisoler Jul 21, 2019
a4af096
Start writing an example for EQLHarmonic
santisoler Jul 21, 2019
c46f62f
Add cross-validation
santisoler Jul 21, 2019
3516889
Improve how grid is generated on example
santisoler Jul 22, 2019
ec7834a
Start drafting tests for EQLHarmonic
santisoler Jul 22, 2019
6d1f2f7
Add test function for jacobian_numba
santisoler Jul 22, 2019
c5a009f
Merge branch 'master' into eql
santisoler Jul 22, 2019
8519d78
Add test function for EQLHarmonic with few sources
santisoler Jul 22, 2019
81bac5e
Merge branch 'master' into eql
santisoler Jul 22, 2019
8cf45e4
Change sign when setting default point sources
santisoler Jul 22, 2019
91e55eb
Merge branch 'master' into eql
santisoler Jul 23, 2019
1ecbd7c
Merge branch 'master' into eql
santisoler Aug 1, 2019
ec89987
Merge branch 'master' into eql
santisoler Aug 5, 2019
819bff3
Merge branch 'master' into eql
santisoler Aug 8, 2019
308f659
Rename vertical coordinate to upward
santisoler Aug 8, 2019
bc3076c
Make use of forward.utils.distance_cartesian func
santisoler Aug 8, 2019
16ec998
Run black
santisoler Aug 8, 2019
ffcab6f
Fix wrong call of distance_cartesian function
santisoler Aug 8, 2019
df00fb2
Minor changes for fixing pylint errors
santisoler Aug 8, 2019
63860a1
Update eql_harmonic example
santisoler Aug 8, 2019
f92fda1
Add text on eql_harmonic example
santisoler Aug 8, 2019
22d08ca
Start improving example figure
santisoler Aug 8, 2019
e3479fa
Fix typo
santisoler Aug 14, 2019
444bed7
Add citation of Cooper for default depth_factor
santisoler Aug 14, 2019
62946ec
Improve docstring
santisoler Aug 14, 2019
951dfad
Merge branch 'master' into eql
santisoler Aug 14, 2019
4e7ee68
Change sign of relative depth of point sources
santisoler Aug 14, 2019
a3da7f1
Improve gallery example for EQLHarmonic
santisoler Aug 14, 2019
ea08f08
Add EQLHarmonic to api/index.rst
santisoler Aug 14, 2019
4ddf0f1
Remove unused cartopy from example
santisoler Aug 15, 2019
348c25b
Merge branch 'master' into eql
santisoler Aug 16, 2019
6df8c5e
Fix comment on setting the depth of point sources
santisoler Aug 16, 2019
855c469
Retitle API section where EQLHarmonic lives
santisoler Aug 28, 2019
93b6c92
Use a single depth for now and reword example
leouieda Sep 13, 2019
1680a37
Add information to class docstring
leouieda Sep 14, 2019
63b54f8
Separate EQL examples and add more text
leouieda Sep 14, 2019
9f9b9d2
Rework tests to check the actual interpolation
leouieda Sep 14, 2019
89ed643
Take ownership of azure conda path
leouieda Sep 14, 2019
12f38aa
Test using custom sources as well
leouieda Sep 14, 2019
5bad26e
Merge branch 'master' into eql
santisoler Sep 20, 2019
b1c9344
Merge branch 'master' into eql
santisoler Sep 20, 2019
5129904
Rename depth variable to relative_depth
santisoler Sep 20, 2019
6 changes: 6 additions & 0 deletions .azure-pipelines.yml
@@ -92,6 +92,12 @@ jobs:
- bash: echo "##vso[task.prependpath]$CONDA/bin"
displayName: Add conda to PATH

# On Hosted macOS, the agent user doesn't have ownership of Miniconda's installation
# directory. We need to take ownership if we want to update conda or install packages
# globally
- bash: sudo chown -R $USER $CONDA
displayName: Take ownership of conda installation

Review comment (Member Author): Shouldn't we add these lines on a different PR?

# Get the Fatiando CI scripts
- bash: git clone --branch=1.1.1 --depth=1 https:/fatiando/continuous-integration.git
displayName: Fetch the Fatiando CI scripts
8 changes: 8 additions & 0 deletions doc/api/index.rst
@@ -16,6 +16,14 @@ Gravity Corrections
normal_gravity
bouguer_correction

Equivalent Layers
-----------------

.. autosummary::
:toctree: generated/

EQLHarmonic

Forward modelling
-----------------

2 changes: 2 additions & 0 deletions doc/references.rst
@@ -4,6 +4,8 @@ References
.. [AmanteEakins2009] Amante, C. and B.W. Eakins, 2009. ETOPO1 1 Arc-Minute Global Relief Model: Procedures, Data Sources and Analysis. NOAA Technical Memorandum NESDIS NGDC-24. National Geophysical Data Center, NOAA. doi:`10.7289/V5C8276M <https://doi.org/10.7289/V5C8276M>`__
.. [BarthelmesKohler2016] Barthelmes, F. and Kohler, W. (2016), International Centre for Global Earth Models (ICGEM), in: Drewes, H., Kuglitsch, F., Adam, J. et al., The Geodesists Handbook 2016, Journal of Geodesy (2016), 90(10), pp 907-1205, doi:`10.1007/s00190-016-0948-z <https://doi.org/10.1007/s00190-016-0948-z>`__
.. [Blakely1995] Blakely, R. (1995). Potential Theory in Gravity and Magnetic Applications. Cambridge: Cambridge University Press. doi:`10.1017/CBO9780511549816 <https://doi.org/10.1017/CBO9780511549816>`__
.. [Cooper2000] Cooper, G.R.J. (2000), Gridding gravity data using an equivalent layer, Computers & Geosciences, doi:`10.1016/S0098-3004(99)00089-8 <https://doi.org/10.1016/S0098-3004(99)00089-8>`__
.. [Dampney1969] Dampney, C. N. G. (1969). The equivalent source technique. Geophysics, 34(1), 39–53. doi:`10.1190/1.1439996 <https://doi.org/10.1190/1.1439996>`__
.. [Forste_etal2014] Förste, Christoph; Bruinsma, Sean.L.; Abrikosov, Oleg; Lemoine, Jean-Michel; Marty, Jean Charles; Flechtner, Frank; Balmino, G.; Barthelmes, F.; Biancale, R. (2014): EIGEN-6C4 The latest combined global gravity field model including GOCE data up to degree and order 2190 of GFZ Potsdam and GRGS Toulouse. GFZ Data Services. doi:`10.5880/icgem.2015.1 <http://doi.org/10.5880/icgem.2015.1>`__
.. [Grombein2013] Grombein, T., Seitz, K., Heck, B. (2013), Optimized formulas for the gravitational field of a tesseroid, Journal of Geodesy. doi:`10.1007/s00190-013-0636-1 <https://doi.org/10.1007/s00190-013-0636-1>`__
.. [Hofmann-WellenhofMoritz2006] Hofmann-Wellenhof, B., & Moritz, H. (2006). Physical Geodesy (2nd, corr. ed. 2006 edition ed.). Wien ; New York: Springer.
2 changes: 2 additions & 0 deletions examples/eql/README.txt
@@ -0,0 +1,2 @@
Equivalent Layers
-----------------
96 changes: 96 additions & 0 deletions examples/eql/harmonic.py
@@ -0,0 +1,96 @@
"""
Gridding and upward continuation
================================

Most potential-field surveys gather data along scattered and uneven flight lines or at
irregular ground stations. For many applications we need to interpolate these data
points onto a regular grid at a constant altitude. Upward continuation is also a
routine task for smoothing, noise attenuation, source separation, etc.

Both tasks can be done simultaneously through an *equivalent layer* [Dampney1969]_. We
will use :class:`harmonica.EQLHarmonic` to estimate the coefficients of a set of point
sources (the equivalent layer) that fit the observed data. The fitted layer can then be
used to predict data values wherever we want, like on a grid at a certain altitude. The
sources for :class:`~harmonica.EQLHarmonic` in particular are placed one beneath each
data point at a constant depth following [Cooper2000]_.

The advantage of using an equivalent layer is that it takes into account the 3D nature
of the observations, not just their horizontal positions. It also allows data
uncertainty to be taken into account and noise to be suppressed through the least-squares
fitting process. The main disadvantage is the increased computational load (both in
terms of time and memory).
"""
import matplotlib.pyplot as plt
import numpy as np
import pyproj
import verde as vd
import harmonica as hm


# Fetch the sample total-field magnetic anomaly data from Rio de Janeiro
data = hm.datasets.fetch_rio_magnetic()

# Slice a smaller portion of the survey data to speed-up calculations for this example
region = [-42.35, -42.10, -22.35, -22.15]
inside = vd.inside((data.longitude, data.latitude), region)
data = data[inside]
print("Number of data points:", data.shape[0])

# Since this is a small area, we'll project our data and use Cartesian coordinates
projection = pyproj.Proj(proj="merc", lat_ts=data.latitude.mean())
easting, northing = projection(data.longitude.values, data.latitude.values)
coordinates = (easting, northing, data.altitude_m)

# Create the equivalent layer. We'll use the default point source configuration at a
# constant depth beneath each observation point. The damping parameter helps smooth the
# predicted data and ensure stability.
eql = hm.EQLHarmonic(depth=1000, damping=10)

# Fit the layer coefficients to the observed magnetic anomaly.
eql.fit(coordinates, data.total_field_anomaly_nt)

# Evaluate the data fit by calculating an R² score against the observed data.
# This is a measure of how well the layer fits the data, NOT how good the
# interpolation will be.
print("R² score:", eql.score(coordinates, data.total_field_anomaly_nt))

# Interpolate data on a regular grid with 400 m spacing. The interpolation requires an
# extra coordinate (upward height). By passing in 500 m, we're effectively
# upward-continuing the data (mean flight height is 200 m).
grid = eql.grid(spacing=400, data_names=["magnetic_anomaly"], extra_coords=500)

# The grid is a xarray.Dataset with values, coordinates, and metadata
print("\nGenerated grid:\n", grid)

# Plot original magnetic anomaly and the gridded and upward-continued version
fig, (ax1, ax2) = plt.subplots(nrows=1, ncols=2, figsize=(12, 6), sharey=True)

# Get the maximum absolute value between the original and gridded data so we can use the
# same color scale for both plots and have 0 centered at the white color.
maxabs = vd.maxabs(data.total_field_anomaly_nt, grid.magnetic_anomaly.values)

ax1.set_title("Observed magnetic anomaly data")
tmp = ax1.scatter(
easting,
northing,
c=data.total_field_anomaly_nt,
s=20,
vmin=-maxabs,
vmax=maxabs,
cmap="seismic",
)
plt.colorbar(tmp, ax=ax1, label="nT", pad=0.05, aspect=40, orientation="horizontal")

ax2.set_title("Gridded and upward-continued")
tmp = grid.magnetic_anomaly.plot.pcolormesh(
ax=ax2,
add_colorbar=False,
add_labels=False,
vmin=-maxabs,
vmax=maxabs,
cmap="seismic",
)
plt.colorbar(tmp, ax=ax2, label="nT", pad=0.05, aspect=40, orientation="horizontal")

plt.tight_layout(pad=0)
plt.show()
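The gridding done above boils down to a damped least-squares fit of point-source
coefficients against a Green's matrix. A minimal NumPy-only sketch of the same idea,
using hypothetical synthetic data and no harmonica dependency (`greens_matrix`, the
coordinates, and the source coefficient are all made up for illustration):

```python
import numpy as np


def greens_matrix(coordinates, points):
    # G[i, j] = 1 / distance(data point i, source j): the same kernel EQLHarmonic uses
    diffs = [
        (np.asarray(c, float)[:, None] - np.asarray(p, float)[None, :]) ** 2
        for c, p in zip(coordinates, points)
    ]
    return 1.0 / np.sqrt(sum(diffs))


rng = np.random.default_rng(42)
n = 100
# Hypothetical scattered survey: uneven horizontal positions and observation heights
easting = rng.uniform(0, 5000, n)
northing = rng.uniform(0, 5000, n)
upward = rng.uniform(100, 300, n)

# Synthetic "truth": the field of a single deep point source
data = greens_matrix(
    (easting, northing, upward),
    (np.array([2500.0]), np.array([2500.0]), np.array([-2000.0])),
) @ np.array([5e6])

# Equivalent layer: one source beneath each datum at a constant relative depth
sources = (easting, northing, upward - 500.0)
jac = greens_matrix((easting, northing, upward), sources)

# Damped (Tikhonov 0th-order) least squares for the source coefficients
damping = 1e-8
coefs = np.linalg.solve(jac.T @ jac + damping * np.identity(n), jac.T @ data)

# Predicting back on the observation points should closely reproduce the data
predicted = jac @ coefs
print("data reproduced:", np.allclose(predicted, data, rtol=0.05))
```

Once the coefficients are estimated, the same matrix construction evaluated on grid
coordinates (at any height) gives the interpolated and upward-continued values, which
is what `eql.grid` does under the hood.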
2 changes: 1 addition & 1 deletion harmonica/__init__.py
@@ -14,7 +14,7 @@
from .coordinates import geodetic_to_spherical, spherical_to_geodetic
from .forward.point_mass import point_mass_gravity
from .forward.tesseroid import tesseroid_gravity

from .equivalent_layer.harmonic import EQLHarmonic

# Get the version number through versioneer
__version__ = version.full_version
Empty file.
230 changes: 230 additions & 0 deletions harmonica/equivalent_layer/harmonic.py
@@ -0,0 +1,230 @@
"""
Equivalent layer for generic harmonic functions
"""
import numpy as np
from numba import jit
from sklearn.utils.validation import check_is_fitted
import verde as vd
import verde.base as vdb

from ..forward.utils import distance_cartesian


class EQLHarmonic(vdb.BaseGridder):
r"""
Equivalent-layer for generic harmonic functions (gravity, magnetics, etc).

This equivalent layer can be used for:

* Cartesian coordinates (geographic coordinates must be projected before use)
* Gravity and magnetic data (including derivatives)
* Single data types
* Interpolation
* Upward continuation
* Finite-difference based derivative calculations

It cannot be used for:

* Joint inversion of multiple data types (e.g., gravity + gravity gradients)
* Reduction to the pole of magnetic total field anomaly data
* Analytical derivative calculations

Point sources are located beneath the observed potential-field measurement points by
default [Cooper2000]_. Custom source locations can be used by specifying the
*points* argument. Coefficients associated with each point source are estimated
through linear least-squares with damping (Tikhonov 0th order) regularization.

The Green's function for point mass effects used is the inverse Cartesian distance
between the grid coordinates and the point source:

.. math::

\phi(\bar{x}, \bar{x}') = \frac{1}{||\bar{x} - \bar{x}'||}

where :math:`\bar{x}` and :math:`\bar{x}'` are the coordinate vectors of the
observation point and the source, respectively.

Parameters
----------
damping : None or float
The positive damping regularization parameter. Controls how much smoothness is
imposed on the estimated coefficients. If None, no regularization is used.
points : None or list of arrays (optional)
List containing the coordinates of the point sources used as the equivalent
layer. Coordinates are assumed to be in the following order: (``easting``,
``northing``, ``upward``). If None, one point source is placed beneath each
observation point at a fixed relative depth [Cooper2000]_. Defaults to None.
depth : float
Depth at which the point sources are placed beneath the observation points. Use
positive numbers (negative numbers would mean point sources are above the data
points). Ignored if *points* is specified.

Attributes
----------
points_ : 2d-array
Coordinates of the point sources used to build the equivalent layer.
coefs_ : array
Estimated coefficients of every point source.
region_ : tuple
The boundaries (``[W, E, S, N]``) of the data used to fit the interpolator.
Used as the default region for the :meth:`~harmonica.EQLHarmonic.grid` and
:meth:`~harmonica.EQLHarmonic.scatter` methods.
"""

def __init__(self, damping=None, points=None, depth=500):
self.damping = damping
self.points = points
self.depth = depth

def fit(self, coordinates, data, weights=None):
"""
Fit the coefficients of the equivalent layer.

The data region is captured and used as default for the
:meth:`~harmonica.EQLHarmonic.grid` and :meth:`~harmonica.EQLHarmonic.scatter`
methods.

All input arrays must have the same shape.

Parameters
----------
coordinates : tuple of arrays
Arrays with the coordinates of each data point. Should be in the
following order: (easting, northing, upward, ...). Only easting,
northing, and upward will be used, all subsequent coordinates will be
ignored.
data : array
The data values of each data point.
weights : None or array
If not None, then the weights assigned to each data point.
Typically, this should be 1 over the data uncertainty squared.

Returns
-------
self
Returns this estimator instance for chaining operations.
"""
coordinates, data, weights = vdb.check_fit_input(coordinates, data, weights)
# Capture the data region to use as a default when gridding.
self.region_ = vd.get_region(coordinates[:2])
coordinates = vdb.n_1d_arrays(coordinates, 3)
if self.points is None:
self.points_ = (coordinates[0], coordinates[1], coordinates[2] - self.depth)
else:
self.points_ = vdb.n_1d_arrays(self.points, 3)
jacobian = self.jacobian(coordinates, self.points_)
self.coefs_ = vdb.least_squares(jacobian, data, weights, self.damping)
return self

def predict(self, coordinates):
"""
Evaluate the estimated equivalent layer on the given set of points.

Requires a fitted estimator (see :meth:`~harmonica.EQLHarmonic.fit`).

Parameters
----------
coordinates : tuple of arrays
Arrays with the coordinates of each data point. Should be in the
following order: (``easting``, ``northing``, ``upward``, ...). Only
``easting``, ``northing`` and ``upward`` will be used, all subsequent
coordinates will be ignored.

Returns
-------
data : array
The data values evaluated on the given points.
"""
# We know the gridder has been fitted if it has the coefs_
check_is_fitted(self, ["coefs_"])
shape = np.broadcast(*coordinates[:3]).shape
size = np.broadcast(*coordinates[:3]).size
dtype = coordinates[0].dtype
coordinates = tuple(np.atleast_1d(i).ravel() for i in coordinates[:3])
data = np.zeros(size, dtype=dtype)
predict_numba(coordinates, self.points_, self.coefs_, data)
return data.reshape(shape)

def jacobian(
self, coordinates, points, dtype="float64"
): # pylint: disable=no-self-use
"""
Make the Jacobian matrix for the equivalent layer.

Each column of the Jacobian is the Green's function for a single point source
evaluated on all observation points.

Parameters
----------
coordinates : tuple of arrays
Arrays with the coordinates of each data point. Should be in the
following order: (``easting``, ``northing``, ``upward``, ...). Only
``easting``, ``northing`` and ``upward`` will be used, all subsequent
coordinates will be ignored.
points : tuple of arrays
Tuple of arrays containing the coordinates of the point sources used as
equivalent layer in the following order: (``easting``, ``northing``,
``upward``).
dtype : str or numpy dtype
The type of the Jacobian array.

Returns
-------
jacobian : 2D array
The (n_data, n_points) Jacobian matrix.
"""
n_data = coordinates[0].size
n_points = points[0].size
jac = np.zeros((n_data, n_points), dtype=dtype)
jacobian_numba(coordinates, points, jac)
return jac


@jit(nopython=True)
def predict_numba(coordinates, points, coeffs, result):
"""
Calculate the predicted data using numba for speeding things up.
"""
east, north, upward = coordinates[:]
point_east, point_north, point_upward = points[:]
for i in range(east.size):
for j in range(point_east.size):
result[i] += coeffs[j] * greens_func(
east[i],
north[i],
upward[i],
point_east[j],
point_north[j],
point_upward[j],
)


@jit(nopython=True)
def greens_func(east, north, upward, point_east, point_north, point_upward):
"""
Calculate the Green's function for the equivalent layer using Numba.
"""
distance = distance_cartesian(
(east, north, upward), (point_east, point_north, point_upward)
)
return 1 / distance


@jit(nopython=True)
def jacobian_numba(coordinates, points, jac):
"""
Calculate the Jacobian matrix using numba to speed things up.
"""
east, north, upward = coordinates[:]
point_east, point_north, point_upward = points[:]
for i in range(east.size):
for j in range(point_east.size):
jac[i, j] = greens_func(
east[i],
north[i],
upward[i],
point_east[j],
point_north[j],
point_upward[j],
)
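The jitted double loop in `jacobian_numba` fills the same matrix that a vectorized
NumPy broadcast would produce. A quick cross-check sketch, using plain-Python stand-ins
for the jitted functions and hypothetical random coordinates (nothing here is part of
the PR's API):

```python
import numpy as np


def jacobian_loops(coordinates, points, jac):
    # Pure-Python mirror of jacobian_numba: jac[i, j] = 1 / distance(datum i, source j)
    east, north, upward = coordinates
    point_east, point_north, point_upward = points
    for i in range(east.size):
        for j in range(point_east.size):
            jac[i, j] = 1.0 / np.sqrt(
                (east[i] - point_east[j]) ** 2
                + (north[i] - point_north[j]) ** 2
                + (upward[i] - point_upward[j]) ** 2
            )


def jacobian_broadcast(coordinates, points):
    # Same matrix via broadcasting: shape (n_data, 1) minus shape (1, n_points)
    diffs = [(c[:, None] - p[None, :]) ** 2 for c, p in zip(coordinates, points)]
    return 1.0 / np.sqrt(sum(diffs))


rng = np.random.default_rng(0)
coords = tuple(rng.uniform(-1e3, 1e3, size=50) for _ in range(3))
points = tuple(rng.uniform(-1e3, 1e3, size=30) for _ in range(3))
points = (points[0], points[1], points[2] - 5e3)  # keep sources well below the data

jac_a = np.zeros((50, 30))
jacobian_loops(coords, points, jac_a)
jac_b = jacobian_broadcast(coords, points)
print("matrices agree:", np.allclose(jac_a, jac_b))
```

The explicit loops are the numba-friendly form: broadcasting allocates the full
temporary difference arrays, while the jitted loops fill `jac` in place with no
intermediate allocations.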