Commit

Updates to documentation
msainsburydale committed Dec 2, 2023
1 parent 97e6e32 commit c1c6b90
Showing 6 changed files with 103 additions and 103 deletions.
README.md (12 changes: 7 additions & 5 deletions)

@@ -18,7 +18,7 @@ A convenient interface for `R` users is available [here](https://github.com/msai

### Supporting and citing

- This software was developed as part of academic research. If you would like to support it, please star the repository. If you use it in your research or other activities, please use the following citation.
+ This software was developed as part of academic research. If you would like to support it, please star the repository. If you use it in your research or other activities, please also use the following citation.

```
@article{SZH_2023_neural_Bayes_estimators,
  ...
}
```

@@ -34,12 +34,14 @@

### Papers using NeuralEstimators

- - **Likelihood-Free Parameter Estimation with Neural Bayes Estimators** [[paper]](https://www.tandfonline.com/doi/full/10.1080/00031305.2023.2249522)\
+ - **Likelihood-free parameter estimation with neural Bayes estimators** [[paper]](https://www.tandfonline.com/doi/full/10.1080/00031305.2023.2249522)\
Matthew Sainsbury-Dale, Andrew Zammit-Mangion, Raphaël Huser (2023)


- - **Neural Bayes Estimators for Censored Inference with Peaks-Over-Threshold Models** [[paper]](https://arxiv.org/abs/2306.15642)\
+ - **Neural Bayes estimators for censored inference with peaks-over-threshold models** [[paper]](https://arxiv.org/abs/2306.15642)\
Jordan Richards, Matthew Sainsbury-Dale, Andrew Zammit-Mangion, Raphaël Huser (2023+)

- - **Neural Bayes Estimators for Irregular Spatial Data using Graph Neural Networks** [[paper]](https://arxiv.org/abs/2310.02600)\
+ - **Neural Bayes estimators for irregular spatial data using graph neural networks** [[paper]](https://arxiv.org/abs/2310.02600)\
Matthew Sainsbury-Dale, Jordan Richards, Andrew Zammit-Mangion, Raphaël Huser (2023+)

- **Modern extreme value statistics for Utopian extremes** [[paper]](https://arxiv.org/abs/2311.11054)\
Jordan Richards, Noura Alotaibi, Daniela Cisneros, Yan Gong, Matheus B. Guerrero, Paolo Redondo, Xuanjie Shao
docs/src/API/core.md (2 changes: 1 addition & 1 deletion)

@@ -42,7 +42,7 @@ PiecewiseEstimator

## Training

- The function `train` is used to train a single neural estimator, while the wrapper function `trainx` is useful for training multiple neural estimators over a range of sample sizes, making using of the technique known as pre-training.
+ The function [`train`](@ref) is used to train a single neural estimator, while the wrapper function [`trainx`](@ref) is useful for training multiple neural estimators over a range of sample sizes, making use of the technique known as pre-training.

```@docs
train
...
```
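
As context for the pre-training technique mentioned above, here is a minimal sketch of training estimators over a range of sample sizes with `trainx`. The toy sampler, simulator, architecture, and argument conventions are illustrative assumptions based on the package documentation, not a definitive usage:

```julia
using NeuralEstimators, Flux

# Toy model (assumed for illustration): θ ~ U(0, 1), and Z | θ contains m
# independent replicates of N(θ, 1). The sampler returns a p × K matrix of
# parameter vectors; the simulator returns one data matrix per parameter vector.
sampler(K) = rand(Float32, 1, K)
simulator(θ, m) = [θ[:, k] .+ randn(Float32, 1, m) for k in 1:size(θ, 2)]

# DeepSet architecture: ψ summarises each replicate; ϕ maps the aggregated
# summary to the parameter estimate.
ψ = Chain(Dense(1, 32, relu), Dense(32, 32, relu))
ϕ = Chain(Dense(32, 32, relu), Dense(32, 1))
θ̂ = DeepSet(ψ, ϕ)

# Train estimators for m = 1, 10, and 30 replicates; with pre-training, each
# estimator after the first is initialised with the weights of the previous one.
estimators = trainx(θ̂, sampler, simulator, [1, 10, 30])
```

Pre-training in this sense means that only the first estimator is trained from scratch, which typically reduces the total training time across sample sizes.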
docs/src/index.md (10 changes: 6 additions & 4 deletions)

@@ -17,7 +17,7 @@ Once familiar with the details of the [Framework](@ref), see the [Examples](@ref

### Supporting and citing

- This software was developed as part of academic research. If you would like to support it, please star the [repository](https://github.com/msainsburydale/NeuralEstimators.jl). If you use it in your research or other activities, please use the following citation.
+ This software was developed as part of academic research. If you would like to support it, please star the [repository](https://github.com/msainsburydale/NeuralEstimators.jl). If you use it in your research or other activities, please also use the following citation.

```
@article{SZH_2023_neural_Bayes_estimators,
  ...
}
```

@@ -33,8 +33,10 @@

### Papers using NeuralEstimators

- - **Likelihood-Free Parameter Estimation with Neural Bayes Estimators** [[paper]](https://www.tandfonline.com/doi/full/10.1080/00031305.2023.2249522)
+ - **Likelihood-free parameter estimation with neural Bayes estimators** [[paper]](https://www.tandfonline.com/doi/full/10.1080/00031305.2023.2249522)

- - **Neural Bayes Estimators for Censored Inference with Peaks-Over-Threshold Models** [[paper]](https://arxiv.org/abs/2306.15642)
+ - **Neural Bayes estimators for censored inference with peaks-over-threshold models** [[paper]](https://arxiv.org/abs/2306.15642)

- - **Neural Bayes Estimators for Irregular Spatial Data using Graph Neural Networks** [[paper]](https://arxiv.org/abs/2310.02600)
+ - **Neural Bayes estimators for irregular spatial data using graph neural networks** [[paper]](https://arxiv.org/abs/2310.02600)

- **Modern extreme value statistics for Utopian extremes** [[paper]](https://arxiv.org/abs/2311.11054)
docs/src/workflow/overview.md (3 changes: 2 additions & 1 deletion)
@@ -1,3 +1,4 @@

# Overview

To develop a neural estimator with `NeuralEstimators.jl`,
@@ -7,6 +8,6 @@ To develop a neural estimator with `NeuralEstimators.jl`,
- Initialise a neural network, `θ̂`, that will be trained into a neural Bayes estimator.
- Train `θ̂` under the chosen loss function using [`train`](@ref).
- Assess `θ̂` using [`assess`](@ref). The resulting object of class [`Assessment`](@ref) can be used to assess the estimator with respect to the entire parameter space by estimating the risk function with [`risk`](@ref), or used to plot the empirical sampling distribution of the estimator.
- Apply `θ̂` to observed data (once its performance has been checked in the above step). Bootstrap-based uncertainty quantification is facilitated with [`bootstrap`](@ref) and [`interval`](@ref).

See the [Examples](@ref) and, once familiar with the basic workflow, see [Advanced usage](@ref) for practical considerations on how to most effectively construct neural estimators.
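
To make these steps concrete, here is a minimal sketch continuing the example from the Training section; the exact `assess`, `risk`, `bootstrap`, and `interval` signatures are assumptions based on the surrounding prose:

```julia
# Assess the estimator trained with the largest sample size over many
# parameter vectors drawn from the prior.
θ̂ = estimators[end]
θ_test = sampler(1000)
Z_test = simulator(θ_test, 30)
assessment = assess([θ̂], θ_test, Z_test)  # object of class Assessment
risk(assessment)                           # estimate of the Bayes risk

# Apply the estimator to (stand-in) observed data, with bootstrap-based
# uncertainty quantification.
Z = simulator(sampler(1), 30)
θ̃ = bootstrap(θ̂, Z)  # bootstrap estimates
interval(θ̃)           # interval estimates from the bootstrap distribution
```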
src/NeuralEstimators.jl (25 changes: 18 additions & 7 deletions)

@@ -75,18 +75,29 @@ include("missingdata.jl")

end

- #TODO Add helper functions for censoring and missing data (take these from EM paper, and ask Jordan if he has any code he would like to share)
+ #TODO
+ # - Add helper functions for censored data and write an example in the documentation.
+ # - Plotting from Julia (acting directly on objects of type Assessment).
+ # - Examples:
+ #    o Add some figures to the examples in the documentation (e.g., show the sampling distribution in the univariate example).
+ #    o Give the formula for computing the input-channels dimension in the gridded example.


# ---- long term:
# - README.md
# - turn some document examples into "doctests"
# - plotrisk and plotdistribution (wait until the R interface is finished)
# - Add "AR(k) time series" and "Irregular spatial data" examples. (The former will be an example using partially exchangeable neural networks and the latter will be an example using GNNs.)
# - Add "AR(k) time series" example. (An example using partially exchangeable neural networks.)
# - Precompile NeuralEstimators.jl to reduce latency: See https://julialang.org/blog/2021/01/precompile_tutorial/. It seems very easy, just need to add precompile(f, (arg_types…)) to whatever methods I want to precompile.
# - Get DeepSetExpert working optimally on the GPU (leaving this for now as we don't need it for the paper).
# - See if DeepSet.jl is something that the Flux people would like to include. (They may also improve the code.)
- # - With the fixed parameters method of train, there seems to be substantial overhead with my current implementation of simulation on the fly. When epochs_per_Z_refresh = 1, the run-time increases by a factor of 4 for the Gaussian process with nu varied and with m = 1. For now, I’ve added an argument simulate_on_the_fly::Bool, which allows us not to switch off on-the-fly simulation even when epochs_per_Z_refresh = 1. However, it would be good to reduce this overhead.
# - Callback function for plotting during training. See https://www.youtube.com/watch?v=ObYDHi_jJXk&ab_channel=TheJuliaProgrammingLanguage. Also, I know there is a specific module for call backs while training Flux models, so may this is already possible in Julia too. In either case, I think train() should have an additional argument, callback. See also the example at: https://github.com/stefan-m-lenz/JuliaConnectoR.
+ # - With the fixed parameters method of train, there seems to be overhead with my current implementation of just-in-time simulation. When epochs_per_Z_refresh = 1, the run-time increases by a factor of 4 for the Gaussian process with m = 1. For now, I’ve added an argument simulate_on_the_fly::Bool, which allows us to switch off just-in-time simulation.
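
The trade-off described in this item can be made explicit. A hypothetical call disabling just-in-time simulation follows; the keyword names `epochs_per_Z_refresh` and `simulate_on_the_fly` are taken from the comment above, while the positional-argument shape is an assumption:

```julia
# Hypothetical fixed-parameters training call: data would normally be
# refreshed every epoch (epochs_per_Z_refresh = 1), but here the data are
# simulated once up front, avoiding the per-epoch simulation overhead
# described above at the cost of reusing the same training sets.
θ_train = sampler(10_000)
estimator = train(θ̂, θ_train, simulator; m = 1,
                  epochs_per_Z_refresh = 1, simulate_on_the_fly = false)
```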

- # ---- once I've made the project public:
+ # ---- once the software is properly polished:
# - Add NeuralEstimators.jl to the list of packages that use Documenter: see https://documenter.juliadocs.org/stable/man/examples/
# - Add NeuralEstimators.jl to https://github.com/smsharma/awesome-neural-sbi#code-packages-and-benchmarks.
# - Once NeuralEstimators is on the Julia package manager, add the following to index.md:
#
# Install `NeuralEstimators` from [Julia](https://julialang.org/)'s package manager using the following command inside Julia:
#
# ```
# using Pkg; Pkg.add("NeuralEstimators")
# ```
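
As a concrete pointer for the precompilation item above, the mechanism in the linked tutorial amounts to directives like the following; the chosen methods and argument types are illustrative assumptions, not a vetted precompile set for this package:

```julia
# precompile(f, argtypes) compiles a specialised method instance ahead of
# time and returns true on success. Hypothetical directives:
precompile(DeepSet, (Chain, Chain))
precompile(train, (DeepSet, Function, Function))
```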