2d interpolation (#176)
* fix: adding json exports

* fix: moving and renaming conversion functions

also adding some type hints and a generic numeric type

* tests: adding tests for conversions

need to add some better tests, not just 0s and 1s

* fix: remove process_map2loop

not required, can be done using processor

* fix: adding synthetic horizontal layers model

* fix: adding corners property to bounding box

* fix: substituting pli for p1

same interpolator, but this implementation uses numpy instead of cython (see the P1 sketch after this list)

* tests: test that horizontal layers are working

* fix: making structured tetra compatible with p1 interp

* fix: making p1 interpolator work

* fix: cast input to np array

* fix: adding type enums

* fix: adding sparse matrix neighbour calculator

this means no more cython code (see the neighbour-calculator sketch after this list)

* fix: adding some type hints

* refactor: ➖ removing model builder

there is no real advantage to using a builder design pattern for loopstructural

* fix: ✨ adding ability to calculate normalised value of a feature

this just makes it easier to scale the scalar field between 0 and 1 (see the normalisation sketch after this list)

* fix: adding map for interpolator to support type

* fix: enabling support builder with interpolator factory

* fix: updating imports for maths functions

* fix: adding support creator from bbox

* fix: when reset clear the constraints dictionary

* fix: reset when setting up interpolator

* fix: removing old pli interpolator

* fix: fixing tetra indexing and masks

* fix: removing unused imports

* fix: speeding up element getter

* fix: adding onGeometryChange function for supports

* fix: reset change number of constraints

* fix: adding gradient orthogonal to p1interpolator

* fix: changing intrusion example to rescale points

* fix: setting model tolerance using new bounding box

* fix: updating fold to use p1 interpolator

* fix: renaming variables

* fix: adding name and vector args to min edge jumps

* fix: removing old imports

* fix: fixing orthogonal constraints for p1

* fix: moving fault setup into fault builder from model

* fix: flake8 error

* fix: add bb origin to surfaces

* fix: fold constraints added in setup interpolator

this allows the interpolator to be reset
without removing the fold constraints

* fix: temp fix for issue with np types

* Replace reqs with correct theme

The theme being loaded by requirements.txt was wrong (an old one?), so it has been updated to pydata_sphinx_theme and the docs now build correctly.

* Adding PyPI custom link to navigation

* Replacing square icon with (nicer) round icon

* Adding navbar options and metadata

The image is a placeholder for now, but this is an (obviously opinionated) view of a nicer format. Adds a limit to the number of nav items, adds a logo, changes the title to look cleaner (so it doesn't say "documentation" at the end), and adds metadata.

* Fixing svg img

* Updating the contributors docs

* removing accidental tab

* docs: fixing missing automodule

* fix: adding length to bbox

* fix: removing dof and name from interpolator json

* docs: adding explicit imports for submodules

Without explicit imports of the submodules (relying only on __all__), the majority of the code isn't imported, so Sphinx doesn't document it.

* docs: adding citations to documentation

* fix: removing reference to dtm creator and using get interpolator correctly

* fix: face table resize when geometry reset

* fix: elements property for base structured support

* fix: adding imports to api

* fix: changing from interpolator type string to enum

* fix: removing cython

* fix: only axial foliation is constrained by fold frame

* Update release-please.yml

* Update release-please.yml

* Update release-please.yml

* ci: updating to use pip wheel .

* Update release-please.yml

* fix: disable progress bar for model update

* ci: remove numpy version for conda

* fix: changing to interpolator factory

required changing how the fold interpolator is set up. Rather than
initialising with a fold, the fold is added to the interpolator.
Also added a map between string interpolator names and the interpolator type enum.

* fix: identify points that fall on face of shared tetras. Automatically chooses the second one.

* fix: rename surface vtk property to vtk from pyvista

* fix: add docstring and cast isinside to an array

* docs: adding docs to surface datatype

* fix: removing cython code

* fix: removing model api for time being

* build: adding conda builds to git ignore

* typo

* ci: using conda convert to create osx

* ci: removing cython

* ci: typo in yml

* ci: again

* ci: again

* fix: removing analysis module, will create separate python library

* style: style fixes by ruff and autoformatting by black

* fix: changing import of bounding box

* fix: migrating np.random.shuffle to use generator object

* fix: removing axis/angle from apply faults

* fix: calculate gradient of faulted feature returns true gradient

* fix: bb import update and some formatting

* style: style fixes by ruff and autoformatting by black

* fix: adding a method to rotate vectors that are faulted

* fix: store constraints as matrix rather than dict

the matrices can then be stacked together; there was no real advantage to the previous approach.
It is useful if you want to use the matrix for a constraint outside of
the typical LoopStructural workflow (see the stacking sketch after this list).

* fix: removing unused variable

* fix: adding --user flag and removing cython from test suite

* fix: changing discrete support to return element index

previously there was a mix of returning the element index or the
cells at that index; this makes the supports consistent

* fix: global indexes for elements

* fix: linting

* style: style fixes by ruff and autoformatting by black

* fix: changing bounding box import

* ci: updating action version numbers

* fix: moving face table and aabb to separate file

* fix: adding abc for base support

* fix: updating 2d grid to follow same pattern as 3d

* fix: updating 2d tri mesh to same pattern as 3d tetra

* fix: typo vaue to value

* fix: adding abstract methods for interpolation functions

* fix: changing add methods to abstract method names

* fix: use abstract base class for supports

* fix: don't swap axes for dn, not consistent with 3d

* docs: adding type hints

* fix: adding get element gradient for location

* removing prints

* fix: updating p2 2d support

* fix: linting

* docs: type hints

* fix: adding implementation of required abstract methods

* tests: updating 2d local coords to use array

* fix: adding gradient orthogonal abstract method

* fix: adding methods for setting inequality constraints

* fix: adding element size abstract method

* fix: removing unused operator code

* fix: adding faults setter/getter

* fix: making intrusion feature compatible with base feature

* fix: linting

* fix: count number of constraints before adding

previously the constraints were not counted before trying to add them

* style: style fixes by ruff and autoformatting by black

* tests: don't test base feature anymore

* style: style fixes by ruff and autoformatting by black

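Since the P1 commit above references swapping cython for numpy, here is a minimal sketch of how P1 (linear tetrahedral) shape functions and their constant per-element gradients can be evaluated with plain numpy. The function name, argument shapes, and the assumption that the points lie inside the element are illustrative only, not LoopStructural's actual implementation.

```python
# Sketch only: P1 (linear tetrahedral) basis evaluated with plain numpy.
# `vertices` is a (4, 3) array of tetrahedron corners; `points` is an (N, 3)
# array of locations assumed to lie inside that element.
import numpy as np

def p1_shape_functions(vertices: np.ndarray, points: np.ndarray):
    """Return (N, 4) barycentric/P1 basis values and the constant (4, 3) basis gradients."""
    # Barycentric coordinates solve M @ lambda = [1, x, y, z]^T
    M = np.vstack([np.ones(4), vertices.T])                 # (4, 4)
    rhs = np.vstack([np.ones(points.shape[0]), points.T])   # (4, N)
    values = np.linalg.solve(M, rhs).T                      # (N, 4)
    # d(lambda_i)/d(x, y, z) is constant over the element: columns 1: of inv(M)
    gradients = np.linalg.inv(M)[:, 1:]                     # (4, 3)
    return values, gradients

# A scalar field with nodal values f interpolates as values @ f, and its
# (constant) gradient inside the element is gradients.T @ f.
```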
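A hedged sketch of the sparse-matrix neighbour calculation referenced above: two tetrahedra share a face exactly when they share three vertices, which the product of a node-element incidence matrix exposes without any cython loops. The function name and array shapes are assumptions for illustration, not the library's API.

```python
# Sketch only: face-sharing tetra neighbours via a sparse incidence matrix.
import numpy as np
from scipy.sparse import coo_matrix

def tetra_neighbour_pairs(elements: np.ndarray, n_nodes: int) -> np.ndarray:
    """elements is (n_elements, 4) node indices; returns (n_pairs, 2) neighbouring element indices."""
    n_elements = elements.shape[0]
    rows = elements.ravel()                                   # node index of each corner
    cols = np.repeat(np.arange(n_elements), 4)                # owning element index
    incidence = coo_matrix(
        (np.ones(rows.shape[0]), (rows, cols)), shape=(n_nodes, n_elements)
    ).tocsr()
    shared = (incidence.T @ incidence).tocoo()                # entry = number of shared nodes
    face = (shared.data == 3) & (shared.row != shared.col)    # a shared face means 3 shared nodes
    # note: each neighbouring pair appears twice, as (i, j) and (j, i)
    return np.column_stack([shared.row[face], shared.col[face]])
```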
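The normalised feature value mentioned above amounts to min-max scaling of the evaluated scalar field. A sketch under the assumption that a feature exposes an evaluate_value(locations) method; the exact method name and signature added by this commit may differ.

```python
# Sketch only: rescale a feature's scalar field to the range [0, 1].
import numpy as np

def normalised_value(feature, locations: np.ndarray) -> np.ndarray:
    values = feature.evaluate_value(locations)     # evaluate the scalar field (assumed method)
    vmin, vmax = np.nanmin(values), np.nanmax(values)
    if np.isclose(vmin, vmax):
        return np.zeros_like(values)               # constant field: nothing to scale
    return (values - vmin) / (vmax - vmin)
```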
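Storing constraints as matrices rather than dicts (see the commit above) means each constraint block keeps its own coefficient matrix, right-hand side, and weights, and the blocks can simply be stacked into one least-squares system. A sketch under assumed names; the interpolator's real bookkeeping differs in detail.

```python
# Sketch only: constraint blocks stored as matrices and stacked for solving
# min ||W (A x - b)||^2. Names are illustrative.
import numpy as np
from scipy.sparse import csr_matrix, vstack

constraints = {}  # name -> {"A": sparse matrix, "b": rhs, "w": row weights}

def add_constraint(name, A, b, w):
    constraints[name] = {"A": csr_matrix(A), "b": np.asarray(b), "w": np.asarray(w)}

def assemble():
    # weight each block's rows, then stack everything into one system
    A = vstack([c["A"].multiply(c["w"][:, None]) for c in constraints.values()]).tocsr()
    b = np.hstack([c["b"] * c["w"] for c in constraints.values()])
    return A, b  # e.g. solve with scipy.sparse.linalg.lsqr(A, b)
```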
---------

Co-authored-by: Sam <samuel.joseph.roberts@hotmail.co.uk>
Co-authored-by: lachlangrose <lachlangrose@users.noreply.github.com>
3 people authored Mar 4, 2024
1 parent a76e718 commit 5701100
Showing 23 changed files with 896 additions and 372 deletions.
21 changes: 11 additions & 10 deletions .github/workflows/release-please.yml
@@ -17,7 +17,7 @@ jobs:
- name: Lint with ruff
run: |
ruff check LoopStructural --fix
- uses: stefanzweifel/git-auto-commit-action@v4
- uses: stefanzweifel/git-auto-commit-action@v5
with:
commit_message: "style: style fixes by ruff and autoformatting by black"

@@ -31,7 +31,8 @@ jobs:
python-version: ${{ fromJSON(vars.PYTHON_VERSIONS)}}
steps:
- uses: actions/checkout@v4
- uses: conda-incubator/setup-miniconda@v2
- uses: conda-incubator/setup-miniconda@v3

with:
python-version: ${{ matrix.python }}
- name: Installing dependencies
@@ -100,7 +101,7 @@ jobs:
os: ["ubuntu-latest"]
python-version: ${{ fromJSON(vars.PYTHON_VERSIONS)}}
steps:
- uses: conda-incubator/setup-miniconda@v2
- uses: conda-incubator/setup-miniconda@v3
with:
auto-update-conda: true
python-version: ${{ matrix.python-version }}
@@ -120,7 +121,7 @@ jobs:
conda convert -p all conda/linux-64/*.tar.bz2 -f -o conda
conda install anaconda-client -y
- name: upload artifacts
uses: actions/upload-artifact@v3
uses: actions/upload-artifact@v4
with:
name: conda-${{ matrix.os }}-${{ matrix.python-version }}
path: conda
@@ -138,11 +139,11 @@ jobs:
python setup.py sdist
pip wheel -w wheelhouse .
- uses: actions/upload-artifact@v3
- uses: actions/upload-artifact@v4
with:
name: dist
path: dist/*.tar.gz
- uses: actions/upload-artifact@v3
- uses: actions/upload-artifact@v4
with:
name: dist
path: wheelhouse/*.whl
@@ -152,20 +153,20 @@ jobs:
runs-on: ubuntu-latest
if: ${{ needs.release-please.outputs.release_created }}
steps:
- uses: actions/download-artifact@v3
- uses: actions/download-artifact@v4
with:
name: dist
path: dist
- uses: actions/download-artifact@v3
- uses: actions/download-artifact@v4
with:
path: conda
- uses: pypa/gh-action-pypi-publish@v1.6.4
- uses: pypa/gh-action-pypi-publish@v1
with:
skip_existing: true
verbose: true
user: ${{ secrets.PYPI_USERNAME }}
password: ${{ secrets.PYPI_PASSWORD }}
- uses: conda-incubator/setup-miniconda@v2
- uses: conda-incubator/setup-miniconda@v3
- name: upload all files to conda-forge
shell: bash -l {0}
env:
10 changes: 4 additions & 6 deletions LoopStructural/interpolators/_builders.py
@@ -10,7 +10,7 @@
StructuredGrid,
TetMesh,
)
from LoopStructural.utils import BoundingBox
from LoopStructural.datatypes import BoundingBox
from LoopStructural.utils.logging import getLogger

logger = getLogger(__name__)
@@ -61,6 +61,7 @@ def get_interpolator(
)

return P1Interpolator(support)
return P1Interpolator(support)
if interpolatortype == "P2":
if support is not None:
logger.info(
@@ -69,9 +70,7 @@
)
return P2Interpolator(support)
else:
raise ValueError(
"Cannot create P2 interpolator without support, try using PLI"
)
raise ValueError("Cannot create P2 interpolator without support, try using PLI")

if interpolatortype == "FDI":
# find the volume of one element
@@ -96,8 +95,7 @@

grid = StructuredGrid(origin=origin, nsteps=nsteps, step_vector=step_vector)
logger.info(
f"Creating regular grid with {grid.n_elements} elements \n"
"for modelling using FDI"
f"Creating regular grid with {grid.n_elements} elements \n" "for modelling using FDI"
)
return FiniteDifferenceInterpolator(grid)
if interpolatortype == "DFI":
Expand Down
82 changes: 9 additions & 73 deletions LoopStructural/interpolators/_discrete_interpolator.py
@@ -2,6 +2,7 @@
Discrete interpolator base for least squares
"""

from abc import abstractmethod
import logging

from time import time
@@ -39,16 +40,12 @@ def __init__(self, support, data={}, c=None, up_to_date=False):
else np.zeros(self.support.n_nodes)
)
self.region_function = lambda xyz: np.ones(xyz.shape[0], dtype=bool)
# self.region_map[self.region] = np.array(range(0,
# len(self.region_map[self.region])))

self.shape = "rectangular"
if self.shape == "square":
self.B = np.zeros(self.nx)
self.c_ = 0
# self.A = [] # sparse matrix storage coo format
# self.col = []
# self.row = [] # sparse matrix storage
# self.w = []

self.solver = "cg"

self.eq_const_C = []
@@ -221,6 +218,12 @@ def add_constraints_to_least_squares(self, A, B, idc, w=1.0, name="undefined"):
'w': w,
}

@abstractmethod
def add_gradient_orthogonal_constraints(
self, points: np.ndarray, vectors: np.ndarray, w: float = 1.0
):
pass

def calculate_residual_for_constraints(self):
"""Calculates Ax-B for all constraints added to the interpolator
This could be a proxy to identify which constraints are controlling the model
@@ -252,7 +255,6 @@ def remove_constraints_from_least_squares(self, name="undefined", constraint_ids
-------
"""

pass

def add_equality_constraints(self, node_idx, values, name="undefined"):
@@ -288,72 +290,6 @@ def add_equality_constraints(self, node_idx, values, name="undefined"):
}
self.eq_const_c += idc[outside].shape[0]

def add_inequality_constraints_to_matrix(self, A, l, u, idc, name="undefined"):
"""Adds constraints for a matrix where the linear function
l < Ax > u constrains the objective function
Parameters
----------
A : numpy array
matrix of coefficients
l : numpy array
lower bounds
u : numpy array
upper bounds
idc : numpy array
index of constraints
Returns
-------
"""
# map from mesh node index to region node index
gi = np.zeros(self.support.n_nodes, dtype=int)
gi[:] = -1
gi[self.region] = np.arange(0, self.nx, dtype=int)
idc = gi[idc]
rows = np.arange(self.ineq_const_c, self.ineq_const_c + idc.shape[0])
rows = np.tile(rows, (A.shape[-1], 1)).T
self.ineq_constraints[name] = {"A": A, "l": l, "col": idc, "u": u, "row": rows}
self.ineq_const_c += idc.shape[0]

def add_inequality_feature(self, feature, lower=True, mask=None):
"""Add an inequality constraint to the interpolator using an existing feature.
This will make the interpolator greater than or less than the exising feature.
Evaluate the feature at the interpolation nodes.
Can provide a boolean mask to restrict to only some parts
Parameters
----------
feature : BaseFeature
the feature that will be used to constraint the interpolator
lower : bool, optional
lower or upper constraint, by default True
mask : np.ndarray, optional
restrict the nodes to evaluate on, by default None
"""
# add inequality value for the nodes of the mesh
# flag lower determines whether the feature is a lower bound or upper bound
# mask is just a boolean array determining which nodes to apply it to

value = feature(self.support.nodes)
if mask is None:
mask = np.ones(value.shape[0], dtype=bool)
l = np.zeros(value.shape[0]) - np.inf
u = np.zeros(value.shape[0]) + np.inf
mask = np.logical_and(mask, ~np.isnan(value))
if lower:
l[mask] = value[mask]
if not lower:
u[mask] = value[mask]

self.add_inequality_constraints_to_matrix(
np.ones((value.shape[0], 1)),
l,
u,
np.arange(0, self.nx, dtype=int),
)

def add_tangent_constraints(self, w=1.0):
"""Adds the constraints :math:`f(X)\cdotT=0`
30 changes: 17 additions & 13 deletions LoopStructural/interpolators/_finite_difference_interpolator.py
@@ -121,7 +121,7 @@ def setup_interpolator(self, **kwargs):
self.assemble_inner(operator, weight)
self.add_norm_constraints(self.interpolation_weights["npw"])
self.add_gradient_constraints(self.interpolation_weights["gpw"])
self.add_vaue_constraints(self.interpolation_weights["cpw"])
self.add_value_constraints(self.interpolation_weights["cpw"])
self.add_tangent_constraints(self.interpolation_weights["tpw"])
self.add_interface_constraints(self.interpolation_weights["ipw"])
self.add_inequality_constraints()
@@ -136,7 +136,7 @@ def copy(self):
"""
return FiniteDifferenceInterpolator(self.support)

def add_vaue_constraints(self, w=1.0):
def add_value_constraints(self, w=1.0):
"""
Parameters
@@ -151,7 +151,9 @@ def add_vaue_constraints(self, w=1.0):
points = self.get_value_constraints()
# check that we have added some points
if points.shape[0] > 0:
node_idx, inside = self.support.position_to_cell_corners(points[:, :3])
node_idx, inside = self.support.position_to_cell_corners(
points[:, : self.support.dimension]
)
# print(points[inside,:].shape)
gi = np.zeros(self.support.n_nodes, dtype=int)
gi[:] = -1
@@ -160,14 +162,14 @@
idc[:] = -1
idc[inside, :] = gi[node_idx[inside, :]]
inside = np.logical_and(~np.any(idc == -1, axis=1), inside)
a = self.support.position_to_dof_coefs(points[inside, :3])
# a*=w
# a/=np.product(self.support.step_vector)
a = self.support.position_to_dof_coefs(points[inside, : self.support.dimension])
# a *= w
# a/=self.support.enp.product(self.support.step_vector)
self.add_constraints_to_least_squares(
a,
points[inside, 3],
idc[inside, :],
w=w * points[inside, 4],
w=w * points[inside, 4] * self.support.element_size,
name="value",
)
if np.sum(inside) <= 0:
@@ -233,7 +235,9 @@ def add_interface_constraints(self, w=1.0):
# get elements for points
points = self.get_interface_constraints()
if points.shape[0] > 1:
vertices, c, tetras, inside = self.support.get_element_for_location(points[:, :3])
vertices, c, tetras, inside = self.support.get_element_for_location(
points[:, : self.support.dimension]
)
# calculate volume of tetras
# vecs = vertices[inside, 1:, :] - vertices[inside, 0, None, :]
# vol = np.abs(np.linalg.det(vecs)) / 6
@@ -391,7 +395,7 @@ def add_norm_constraints(self, w=1.0):
self.up_to_date = False

def add_gradient_orthogonal_constraints(
self, points: np.ndarray, vector: np.ndarray, w: float = 1.0, B: float = 0
self, points: np.ndarray, vectors: np.ndarray, w: float = 1.0, B: float = 0
):
"""
constraints scalar field to be orthogonal to a given vector
@@ -424,8 +428,8 @@ def add_gradient_orthogonal_constraints(
idc[inside, :] = gi[node_idx[inside, :]]
inside = np.logical_and(~np.any(idc == -1, axis=1), inside)
# normalise vector and scale element gradient matrix by norm as well
norm = np.linalg.norm(vector, axis=1)
vector[norm > 0, :] /= norm[norm > 0, None]
norm = np.linalg.norm(vectors, axis=1)
vectors[norm > 0, :] /= norm[norm > 0, None]

# normalise element vector to unit vector for dot product
(
@@ -437,7 +441,7 @@ def add_gradient_orthogonal_constraints(
T[norm > 0, :, :] /= norm[norm > 0, None, None]

# dot product of vector and element gradient = 0
A = np.einsum("ij,ijk->ik", vector[inside, :3], T)
A = np.einsum("ij,ijk->ik", vectors[inside, :3], T)
B = np.zeros(points[inside, :].shape[0]) + B
self.add_constraints_to_least_squares(
A, B, idc[inside, :], w=w * self.vol, name="gradient orthogonal"
@@ -461,7 +465,7 @@ def add_regularisation(self, operator, w=0.1):
-------
"""
self.assemble_inner(operator)
self.assemble_inner(operator, w)
# self.assemble_borders()

# def assemble_borders(self, operator, w):
36 changes: 33 additions & 3 deletions LoopStructural/interpolators/_geological_interpolator.py
@@ -106,8 +106,8 @@ def set_value_constraints(self, points: np.ndarray):
"""
points = self.check_array(points)
if points.shape[1] < 4:
raise ValueError("Value points must at least have X,Y,Z,val")
if points.shape[1] < 5:
raise ValueError("Value points must at least have X,Y,Z,val,w")
self.data["value"] = points
self.n_i = points.shape[0]
self.up_to_date = False
@@ -172,10 +172,20 @@ def set_interface_constraints(self, points: np.ndarray):
self.data["interface"] = points
self.up_to_date = False

def set_inequality_constraints(self, points: np.ndarray):
def set_value_inequality_constraints(self, points: np.ndarray):
if points.shape[1] < 5:
raise ValueError("Inequality constraints must at least have X,Y,Z,lower,upper")
self.data["inequality"] = points
self.up_to_date = False

def set_inequality_pairs_constraints(self, pointsa: np.ndarray, pointsb: np.ndarray):
if pointsa.shape[1] < 3:
raise ValueError("Inequality pairs constraints must at least have X,Y,Z")
if pointsb.shape[1] < 3:
raise ValueError("Inequality pairs constraints must at least have X,Y,Z")
self.data["inequality_pairs"] = {"a": pointsa, "b": pointsb}
self.up_to_date = False

def get_value_constraints(self):
"""
@@ -274,6 +284,26 @@ def evaluate_gradient(self, locations: np.ndarray):
def reset(self):
pass

@abstractmethod
def add_value_constraints(self, w: float = 1.0):
pass

@abstractmethod
def add_gradient_constraints(self, w: float = 1.0):
pass

@abstractmethod
def add_norm_constraints(self, w: float = 1.0):
pass

@abstractmethod
def add_tangent_constraints(self, w: float = 1.0):
pass

@abstractmethod
def add_interface_constraints(self, w: float = 1.0):
pass

def to_dict(self):
return {
"type": self.type,