Commit 30bfc89
deploy: 1f77167
punkduckable committed Nov 9, 2024
1 parent 8db11ce commit 30bfc89
Showing 37 changed files with 2,009 additions and 616 deletions.
Binary file modified .doctrees/autoapi/lasdi/fd/index.doctree
Binary file modified .doctrees/autoapi/lasdi/gp/index.doctree
Binary file modified .doctrees/autoapi/lasdi/gplasdi/index.doctree
Binary file modified .doctrees/autoapi/lasdi/inputs/index.doctree
Binary file modified .doctrees/autoapi/lasdi/latent_dynamics/index.doctree
Binary file modified .doctrees/autoapi/lasdi/latent_dynamics/sindy/index.doctree
Binary file modified .doctrees/autoapi/lasdi/latent_space/index.doctree
Binary file modified .doctrees/autoapi/lasdi/param/index.doctree
Binary file modified .doctrees/autoapi/lasdi/physics/burgers1d/index.doctree
Binary file modified .doctrees/autoapi/lasdi/physics/index.doctree
Binary file modified .doctrees/autoapi/lasdi/timing/index.doctree
Binary file modified .doctrees/environment.pickle
131 changes: 129 additions & 2 deletions _sources/autoapi/lasdi/fd/index.rst.txt
@@ -30,37 +30,104 @@ Module Contents
.. py:class:: Stencil
.. py:attribute:: leftBdrDepth
:type: int
:value: 0



.. py:attribute:: leftBdrWidth
:type: list[int]
:value: []



.. py:attribute:: leftBdrStencils
:type: list[list[float]]
:value: [[]]



.. py:attribute:: leftBdrNorm
:type: list[float]
:value: []


Suppose that interiorStencils is an array of length Ns and interiorIndexes is a list of
length Ns. Further suppose the underlying time series contains Nx points, x(t_0), ... ,
x(t_{Nx - 1}). Assuming index i is an "interior index" (not too close to 0 or Nx - 1), we
approximate the time derivative of z at time t_i as follows:
z'(t_i) \approx c_0 z(t_{i + i(0)}) + ... + c_{Ns - 1} z(t_{i + i(Ns - 1)})
where c_k = interiorStencils[k] and i(k) = interiorIndexes[k]. Note that the indices may be
negative or positive. Thus, interiorStencils and interiorIndexes tell us how to construct the
finite difference scheme away from the boundary.

For instance, the central difference scheme corresponds to interiorStencils = [-1/2, 1/2],
interiorIndexes = [-1, 1] and
z'(t_i) \approx (1/2)(-z(t_{i - 1}) + z(t_{i + 1}))

Note: We assume that interiorIndexes is in ASCENDING order.

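The interior scheme described above can be sketched directly in numpy. This is an illustrative example of the stencil/index convention, not the lasdi implementation; the division by dt (which the formulas above leave implicit) is made explicit here.

```python
import numpy as np

# Central difference scheme, as described above:
# interiorStencils = [-1/2, 1/2], interiorIndexes = [-1, 1].
stencil = np.array([-0.5, 0.5])
indexes = [-1, 1]

dt = 0.1
t = np.arange(0.0, 1.0, dt)
z = t**2                                 # z(t) = t^2, so z'(t) = 2t

# Approximate z'(t_i) at every interior index i (not adjacent to a boundary).
dz = np.zeros_like(z)
for i in range(1, len(z) - 1):
    dz[i] = sum(c * z[i + k] for c, k in zip(stencil, indexes)) / dt

# Central differences are exact for quadratics at interior points,
# so dz[i] == 2 * t[i] there.
```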

.. py:attribute:: interiorStencils
:type: numpy.ndarray


.. py:attribute:: interiorIndexes
:type: list[int]
:value: []



.. py:method:: getOperators(Nx: int, periodic: bool = False) -> tuple[scipy.sparse.spmatrix, torch.Tensor, torch.Tensor]
The Stencil class acts as an abstract base class for finite difference schemes. We assume
that the user has a time series, x(t_0), ... , x(t_{Nx - 1}) \in \mathbb{R}^d. We will
further assume that the time stamps are evenly spaced, t_0, ... , t_{Nx - 1}. That is,
there is some \Delta t such that t_k = t_0 + \Delta t * k for each k. We also assume the
user wants to approximate the time derivative of x at t_0, ... , t_{Nx - 1} using a finite
difference scheme.

The getOperators method builds a sparse tensor housing the "operator matrix". This is
an Nx x Nx matrix that we use to apply the finite difference scheme to one component of
the time series. How does this work? Suppose the time series is x(t_0), ... ,
x(t_{Nx - 1}) \in \mathbb{R}^d. Then, for each i \in \{1, 2, ... , d\}, we construct Dxi
such that Dxi * xi is the vector whose j'th entry holds the approximation (using the
selected finite difference scheme) of x_i'(t_j). Here, xi = [x_i(t_0), ... ,
x_i(t_{Nx - 1})]. To build Dxi, we first set up a matrix that gives the correct
approximation of x_i'(t_j) whenever j is an interior index. The rest of this function
adjusts the first/last few rows of Dxi so that it also gives the correct approximation at
the boundaries (j close to 0 or Nx - 1).

The Stencil class is not a standalone class; you shouldn't use it directly. Rather, you
should use one of the sub-classes defined below. Each one implements a specific finite
difference scheme to approximate the time derivatives at t_0, ... , t_{Nx - 1} (they may
use different rules on the left and right boundary). Each sub-class should set the
interiorStencils and interiorIndexes attributes.


-------------------------------------------------------------------------------------------
:Parameters: * **Nx** (*An integer specifying the number of points in the time series to which we want to apply the finite difference scheme.*)
             * **periodic** (*A boolean specifying whether we should treat the time series as periodic.*)

-------------------------------------------------------------------------------------------
:returns: * *Three elements. The first is Dxi, the "operator matrix" described above. The second holds the "norm" tensor and the third holds the "PeriodicOffset" tensor.*
          * **TODO** (*what are the last two of these used for? And what are they?*)


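The operator-matrix idea can be sketched with scipy. This is a hedged illustration of the construction described above, not the lasdi implementation: it uses central differences in the interior and simple first-order one-sided rows at the two boundaries (the actual subclasses use their own boundary stencils), and the function name is assumed.

```python
import numpy as np
import scipy.sparse as sp

def operator_matrix(Nx: int) -> sp.spmatrix:
    """Build an Nx x Nx matrix D such that (D @ xi) / dt approximates xi'."""
    D = sp.lil_matrix((Nx, Nx))
    for j in range(1, Nx - 1):              # interior rows: (-1/2, 0, 1/2)
        D[j, j - 1] = -0.5
        D[j, j + 1] = 0.5
    D[0, 0], D[0, 1] = -1.0, 1.0            # forward difference at left edge
    D[Nx - 1, Nx - 2] = -1.0                # backward difference at right edge
    D[Nx - 1, Nx - 1] = 1.0
    return D.tocoo()

dt = 0.1
x = np.arange(10) * dt                      # x(t) = t, so x' = 1 everywhere
dx = operator_matrix(10) @ x / dt           # every entry approximates 1.0
```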

.. py:method:: convert(scipy_coo: scipy.sparse.spmatrix) -> torch.Tensor
Converts scipy_coo, a sparse scipy matrix in COO format, to a sparse torch Tensor.


-------------------------------------------------------------------------------------------
:Parameters: **scipy_coo** (*A sparse scipy matrix in COO format.*)

-------------------------------------------------------------------------------------------
:rtype: A tensor version of scipy_coo.

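A typical scipy-COO-to-torch conversion looks like the following. This is a hedged sketch of what such a helper usually does, not necessarily the lasdi code: it repacks the (row, col, data) triplets into a torch sparse COO tensor.

```python
import numpy as np
import scipy.sparse as sp
import torch

def convert(scipy_coo: sp.coo_matrix) -> torch.Tensor:
    """Repack a scipy COO matrix into a torch sparse COO tensor."""
    indices = torch.tensor(np.vstack((scipy_coo.row, scipy_coo.col)),
                           dtype=torch.int64)   # shape (2, nnz)
    values = torch.tensor(scipy_coo.data, dtype=torch.float64)
    return torch.sparse_coo_tensor(indices, values, scipy_coo.shape)

A = sp.coo_matrix(np.array([[1.0, 0.0], [0.0, 2.0]]))
T = convert(A)          # T.to_dense() recovers the original matrix
```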

.. py:class:: SBP12
@@ -85,6 +152,21 @@ Module Contents
:value: [0.5]


Suppose that interiorStencils is an array of length Ns and interiorIndexes is a list of
length Ns. Further suppose the underlying time series contains Nx points, x(t_0), ... ,
x(t_{Nx - 1}). Assuming index i is an "interior index" (not too close to 0 or Nx - 1), we
approximate the time derivative of z at time t_i as follows:
z'(t_i) \approx c_0 z(t_{i + i(0)}) + ... + c_{Ns - 1} z(t_{i + i(Ns - 1)})
where c_k = interiorStencils[k] and i(k) = interiorIndexes[k]. Note that the indices may be
negative or positive. Thus, interiorStencils and interiorIndexes tell us how to construct the
finite difference scheme away from the boundary.

For instance, the central difference scheme corresponds to interiorStencils = [-1/2, 1/2],
interiorIndexes = [-1, 1] and
z'(t_i) \approx (1/2)(-z(t_{i - 1}) + z(t_{i + 1}))

Note: We assume that interiorIndexes is in ASCENDING order.


.. py:attribute:: interiorStencils
@@ -112,6 +194,21 @@ Module Contents
.. py:attribute:: leftBdrNorm
Suppose that interiorStencils is an array of length Ns and interiorIndexes is a list of
length Ns. Further suppose the underlying time series contains Nx points, x(t_0), ... ,
x(t_{Nx - 1}). Assuming index i is an "interior index" (not too close to 0 or Nx - 1), we
approximate the time derivative of z at time t_i as follows:
z'(t_i) \approx c_0 z(t_{i + i(0)}) + ... + c_{Ns - 1} z(t_{i + i(Ns - 1)})
where c_k = interiorStencils[k] and i(k) = interiorIndexes[k]. Note that the indices may be
negative or positive. Thus, interiorStencils and interiorIndexes tell us how to construct the
finite difference scheme away from the boundary.

For instance, the central difference scheme corresponds to interiorStencils = [-1/2, 1/2],
interiorIndexes = [-1, 1] and
z'(t_i) \approx (1/2)(-z(t_{i - 1}) + z(t_{i + 1}))

Note: We assume that interiorIndexes is in ASCENDING order.


.. py:attribute:: interiorStencils
@@ -139,6 +236,21 @@ Module Contents
.. py:attribute:: leftBdrNorm
Suppose that interiorStencils is an array of length Ns and interiorIndexes is a list of
length Ns. Further suppose the underlying time series contains Nx points, x(t_0), ... ,
x(t_{Nx - 1}). Assuming index i is an "interior index" (not too close to 0 or Nx - 1), we
approximate the time derivative of z at time t_i as follows:
z'(t_i) \approx c_0 z(t_{i + i(0)}) + ... + c_{Ns - 1} z(t_{i + i(Ns - 1)})
where c_k = interiorStencils[k] and i(k) = interiorIndexes[k]. Note that the indices may be
negative or positive. Thus, interiorStencils and interiorIndexes tell us how to construct the
finite difference scheme away from the boundary.

For instance, the central difference scheme corresponds to interiorStencils = [-1/2, 1/2],
interiorIndexes = [-1, 1] and
z'(t_i) \approx (1/2)(-z(t_{i - 1}) + z(t_{i + 1}))

Note: We assume that interiorIndexes is in ASCENDING order.


.. py:attribute:: interiorStencils
@@ -163,6 +275,21 @@ Module Contents

.. py:attribute:: leftBdrNorm
Suppose that interiorStencils is an array of length Ns and interiorIndexes is a list of
length Ns. Further suppose the underlying time series contains Nx points, x(t_0), ... ,
x(t_{Nx - 1}). Assuming index i is an "interior index" (not too close to 0 or Nx - 1), we
approximate the time derivative of z at time t_i as follows:
z'(t_i) \approx c_0 z(t_{i + i(0)}) + ... + c_{Ns - 1} z(t_{i + i(Ns - 1)})
where c_k = interiorStencils[k] and i(k) = interiorIndexes[k]. Note that the indices may be
negative or positive. Thus, interiorStencils and interiorIndexes tell us how to construct the
finite difference scheme away from the boundary.

For instance, the central difference scheme corresponds to interiorStencils = [-1/2, 1/2],
interiorIndexes = [-1, 1] and
z'(t_i) \approx (1/2)(-z(t_{i - 1}) + z(t_{i + 1}))

Note: We assume that interiorIndexes is in ASCENDING order.


.. py:attribute:: leftBdrStencils
70 changes: 58 additions & 12 deletions _sources/autoapi/lasdi/gp/index.rst.txt
@@ -17,27 +17,73 @@ Functions
Module Contents
---------------

.. py:function:: fit_gps(X: numpy.ndarray, Y: numpy.ndarray) -> list[sklearn.gaussian_process.GaussianProcessRegressor]

Trains a GP for each column of Y. If Y has shape N x k, then we train k GP regressors. In
this case, we assume that X has shape N x M, so the input to each GP lies in \mathbb{R}^M.
For each k, we train a GP where the i'th row of X is the input and the i,k component of Y
is the corresponding target. Thus, we return a list of k GP regressor objects, the k'th of
which makes predictions for the k'th coefficient in the latent dynamics.

We assume the target coefficients are independent of one another.


-----------------------------------------------------------------------------------------------
:Parameters: * **X** (*A 2d numpy array of shape (n_train, input_dim), where n_train is the number of training examples and input_dim is the number of components in each input (e.g., the number of parameters).*)
             * **Y** (*A 2d numpy array of shape (n_train, n_coef), where n_train is the number of training examples and n_coef is the number of coefficients in the latent dynamics.*)

-----------------------------------------------------------------------------------------------
:returns: * *A list of trained GP regressor objects. If Y has k columns, then the returned list has k elements. Its i'th element holds a trained GP regressor whose training inputs are the rows of X and whose corresponding targets are the entries of the i'th column of Y.*

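The one-GP-per-coefficient scheme can be sketched with sklearn. This is a hedged illustration of the idea, not the lasdi code: the function name and the RBF kernel choice here are assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def fit_gps_sketch(X: np.ndarray, Y: np.ndarray) -> list:
    """Train one GP regressor per column of Y (per latent-dynamics coefficient)."""
    gps = []
    for k in range(Y.shape[1]):
        gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF())
        gp.fit(X, Y[:, k])          # inputs: rows of X; targets: column k of Y
        gps.append(gp)
    return gps

rng = np.random.default_rng(0)
X = rng.uniform(size=(20, 2))                               # 20 examples, 2 parameters
Y = np.column_stack((X.sum(axis=1), X[:, 0] - X[:, 1]))     # 2 coefficients
gps = fit_gps_sketch(X, Y)                                  # list of 2 regressors
```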

.. py:function:: eval_gp(gp_list: list[sklearn.gaussian_process.GaussianProcessRegressor], param_grid: numpy.ndarray) -> tuple[numpy.ndarray, numpy.ndarray]
Computes each GP's predictive mean and standard deviation at the points of the parameter
space grid.


-----------------------------------------------------------------------------------------------
:Parameters: * **gp_list** (*A list of trained GP regressor objects, one per latent-dynamics coefficient. The i'th element of this list is a GP regressor object that predicts the i'th coefficient.*)
             * **param_grid** (*A 2d numpy.ndarray object of shape (number of parameter combinations, number of parameters). The i,j element of this array specifies the value of the j'th parameter in the i'th combination of parameters. We use this as the testing set for the GP evaluation.*)

-----------------------------------------------------------------------------------------------
:returns: * *A two element tuple. Both elements are 2d numpy arrays of shape (number of parameter combinations, number of coefficients), holding the predicted means and standard deviations, respectively.*
          * *Thus, the i,j element of the first return variable holds the predicted mean of the j'th coefficient in the latent dynamics at the i'th combination of parameter values, and the i,j element of the second holds the standard deviation of the predictive distribution for that coefficient at that combination.*

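Evaluating a list of GPs over a grid is a short exercise with sklearn. A hedged sketch of the mean/std evaluation described above (function name assumed, not the lasdi code):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def eval_gp_sketch(gp_list, param_grid):
    """Query every GP at every row of param_grid; stack results column-wise."""
    means = np.column_stack([gp.predict(param_grid) for gp in gp_list])
    stds = np.column_stack(
        [gp.predict(param_grid, return_std=True)[1] for gp in gp_list])
    return means, stds          # both: (n_combinations, n_coef)

X = np.linspace(0.0, 1.0, 10).reshape(-1, 1)
gp = GaussianProcessRegressor().fit(X, np.sin(X.ravel()))
mean, std = eval_gp_sketch([gp], X)     # shapes (10, 1) and (10, 1)
```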

.. py:function:: sample_coefs(gp_list: list[sklearn.gaussian_process.GaussianProcessRegressor], param: numpy.ndarray, n_samples: int)
Generates sets of ODE (SINDy) coefficients sampled from the predictive distribution for
those coefficients at the specified parameter value (param). Specifically, for the k'th
SINDy coefficient, we draw n_samples samples from the predictive distribution for the k'th
coefficient when param is the parameter.


-----------------------------------------------------------------------------------------------
:Parameters: * **gp_list** (*A list of trained GP regressor objects, one per latent-dynamics coefficient. The i'th element of this list is a GP regressor object that predicts the i'th coefficient.*)
             * **param** (*A combination of parameter values, i.e., a single test example. We evaluate each GP in gp_list at this parameter value, getting a prediction for each coefficient.*)
             * **n_samples** (*The number of samples of the predicted latent dynamics used to build the ensemble of FOM predictions; N_s in the paper.*)

-----------------------------------------------------------------------------------------------
:returns: * *A 2d numpy ndarray, coef_samples, of shape (n_samples, n_coef), where n_coef is the number of coefficients (the length of gp_list). The i,j element of this array is the i'th sample of the j'th SINDy coefficient.*

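The sampling step amounts to drawing from each coefficient's predictive Gaussian at a single parameter combination. A hedged sketch (function name and the explicit seed are assumptions, not the lasdi code):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def sample_coefs_sketch(gp_list, param, n_samples, seed=0):
    """Draw n_samples from each GP's predictive Gaussian at one parameter point."""
    rng = np.random.default_rng(seed)
    point = np.atleast_2d(param)                    # single test example
    coef_samples = np.empty((n_samples, len(gp_list)))
    for k, gp in enumerate(gp_list):
        mean, std = gp.predict(point, return_std=True)
        coef_samples[:, k] = rng.normal(mean[0], std[0], size=n_samples)
    return coef_samples                             # shape (n_samples, n_coef)

X = np.linspace(0.0, 1.0, 10).reshape(-1, 1)
gp = GaussianProcessRegressor().fit(X, np.cos(X.ravel()))
samples = sample_coefs_sketch([gp], np.array([0.5]), n_samples=100)
```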
