
Commit 69d52f9

update installation guide
Tinggong committed Oct 1, 2024
1 parent 2220b3c commit 69d52f9
Showing 4 changed files with 10 additions and 13 deletions.
1 change: 1 addition & 0 deletions Project.toml
@@ -22,6 +22,7 @@ Statistics = "10745b16-79ce-11e8-11f9-7d13ad32a3b2"

[compat]
Distributions = "0.25"
+Fibers = "1.0"
Flux = "0.14"
JLD2 = "0.4"
ProgressMeter = "1.10"
12 changes: 3 additions & 9 deletions README.md
@@ -11,24 +11,18 @@ Microstructure.jl is under active development, testing and optimization and upda
Gong, T., & Yendiki, A. (2024). Microstructure. jl: a Julia Package for Probabilistic Microstructure Model Fitting with Diffusion MRI. arXiv preprint arXiv:2407.06379.

### Installation
-To install Microstructure.jl, open Julia and enter the package mode by typing `]`, then add the package:
+To install Microstructure.jl, open Julia and enter the package mode by typing `]`, then add the package, which will install the latest released version:

```julia
julia> ]
(@v1.8) pkg> add Microstructure
```

-You can check if your installation is the latest version by typing `status` in the package mode and upgrade to the latest version using `up` in the package mode:
-
-```julia
-(@v1.8) pkg> up Microstructure
-```
-
-If a newer version isn't being installed using `up`, you can remove current installation and add the latest version by (replace `0.1.4` with latest version number):
+If you want to keep up to date with the development version I am working on, remove the current installation and add the repository directly:

```julia
(@v1.8) pkg> rm Microstructure
-(@v1.8) pkg> add Microstructure@0.1.4
+(@v1.8) pkg> add https://github.com/Tinggong/Microstructure.jl.git
```
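
Once installed by either route, the package is loaded with the standard `using` call:

```julia
julia> using Microstructure
```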

### Relationship to Other Packages
4 changes: 2 additions & 2 deletions docs/src/manual/estimators.md
@@ -1,6 +1,6 @@
# Estimators

-This page introduces two types of estimators implemented in Microstructure.jl for estimating parameters and uncertainties: the Markov Chain Monte Carlo (MCMC) sampling method and Monte Carlo dropout using neural networks. While MCMC estimator will take longer computation time, it is recommended for more accurate parameter estimation comparing with the neural network estimator currently implemented. The performance of neural network estimator will be closely linked to the parameter distributions in the training samples. Currently, function to generate uniform parameter distributions is provided, which may not be the optimized solutions for every model. However, if you are interested in studying how training samples affect estimation accuracy, you are welcome to try it out and you can also generate samples use other distributions.
+This page introduces two types of estimators in Microstructure.jl for estimating parameters and quantifying uncertainties: the Markov Chain Monte Carlo (MCMC) sampling method and Monte Carlo dropout using neural networks. These two types of estimators are flexibly parametrized, allowing you to specify sampling options for MCMC and training options for neural networks.
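
As a generic aside on the second estimator type: Monte Carlo dropout keeps dropout layers active at inference time and summarizes repeated stochastic forward passes. A minimal Flux sketch of the idea (the layer sizes, dropout rate, and input are illustrative placeholders, not the package's architecture):

```julia
using Flux, Statistics

# Toy MLP: 10 measurements in, 3 parameters out (sizes are placeholders).
mlp = Chain(Dense(10 => 64, relu), Dropout(0.2), Dense(64 => 3))

x = rand(Float32, 10)     # a placeholder measurement vector
Flux.trainmode!(mlp)      # keep Dropout active at inference time

# 200 stochastic forward passes; each column is one draw.
draws = reduce(hcat, [mlp(x) for _ in 1:200])
estimate    = mean(draws; dims=2)   # point estimate per parameter
uncertainty = std(draws; dims=2)    # spread of the draws = uncertainty
```

In the general technique, the mean of the draws is the estimate and their spread is the uncertainty.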

## MCMC

@@ -28,7 +28,7 @@ Function mcmc! runs on single thread and suitable for testing sampler parameters
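
For intuition about what an MCMC estimator such as `mcmc!` does, here is a self-contained random-walk Metropolis toy for a single parameter under Gaussian noise. The forward model and all settings below are made up for illustration and are unrelated to the package's actual sampler interface:

```julia
using Statistics

# Made-up forward model: a mono-exponential signal decay.
forward(θ) = exp(-θ)
σ = 0.02
data = forward(0.7) .+ σ .* randn(50)     # synthetic noisy measurements

loglike(θ) = -sum(abs2, data .- forward(θ)) / (2σ^2)

# Random-walk Metropolis: propose a jitter, accept with probability
# min(1, exp(llp - ll)); otherwise stay at the current value.
function metropolis(loglike, θ0; n=10_000, step=0.05)
    chain = Vector{Float64}(undef, n)
    θ, ll = θ0, loglike(θ0)
    for i in 1:n
        θp = θ + step * randn()
        llp = loglike(θp)
        if log(rand()) < llp - ll
            θ, ll = θp, llp
        end
        chain[i] = θ
    end
    return chain
end

posterior = metropolis(loglike, 0.5)[2_001:end]   # discard burn-in
mean(posterior), std(posterior)                   # estimate and uncertainty
```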

## Neural Networks

-This module currently includes simple multi-layer perceptrons and training data generation function, which allows supervised training of the MLPs on synthesised data with uniform parameter distributions.
+This module currently includes simple multi-layer perceptrons and a training data generation function, which allows supervised training of the MLPs on synthesised data with given training parameter distributions.
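
Schematically, supervised training on synthesised data means: draw parameters from the chosen training distributions, push them through the forward model, add noise, and use the resulting (noisy signal, parameters) pairs as inputs and labels. A sketch under assumed placeholder names (the toy protocol, forward model, and prior ranges are not the package's generator):

```julia
using Distributions

# Toy protocol and forward model: two parameters -> a 10-measurement signal.
bvals = range(0, 3; length=10)
simulate(f, d) = f .* exp.(-bvals .* d)

prior_f = Uniform(0.0, 1.0)   # training parameter distributions
prior_d = Uniform(0.1, 3.0)

n = 10_000
params = [rand(prior_f, n) rand(prior_d, n)]'   # 2 × n labels
# Noisy simulated signals: a 10 × n matrix of network inputs.
signals = reduce(hcat, [simulate(p...) .+ 0.01 .* randn(10) for p in eachcol(params)])
```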

### Specify a network model for your task

6 changes: 4 additions & 2 deletions src/estimators_nn.jl
@@ -34,8 +34,10 @@ export NetworkArg,
)
Return a `NetworkArg` object with necessary parameters to construct a neural network model
-and generate training samples for specifc biophysical model. A test Network architecture and training
-samples can be automaticlly determined from the modelling task by using function
+and generate training samples for a specific biophysical model.
+A test network architecture and training samples can be automatically determined from the modelling task by using the function
NetworkArg(model, protocol, params, prior_range, prior_dist, paralinks, noisetype, sigma_range, sigma_dist)
"""
