
Commit 63e4c0b: fix 2 dead links

janosh committed Sep 21, 2024 (parent: 8333b3f)
Showing 4 changed files with 5 additions and 6 deletions.
.pre-commit-config.yaml (2 changes: 1 addition and 1 deletion)

@@ -7,7 +7,7 @@ default_install_hook_types: [pre-commit, commit-msg]

 repos:
   - repo: https://github.com/astral-sh/ruff-pre-commit
-    rev: v0.4.10
+    rev: v0.6.6
     hooks:
       - id: ruff
         args:
data/packages.yml (3 changes: 1 addition and 2 deletions)

@@ -20,8 +20,7 @@
   authors: Facebook / Meta
   authors_url: https://opensource.fb.com
   lang: PyTorch
-  description: |
-    [FlowTorch Docs](https://flowtorch.ai) is a PyTorch library for learning and sampling from complex probability distributions using a class of methods called Normalizing Flows.
+  description: FlowTorch is a PyTorch library for learning and sampling from complex probability distributions using Normalizing Flows.

 - title: TensorFlow Probability
   date: 2018-06-22
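As context for the FlowTorch entry above: a normalizing flow learns an invertible map from a simple base distribution to the data distribution and is trained by maximum likelihood through the change-of-variables formula. Below is a minimal, generic PyTorch sketch of that idea; it deliberately does not use FlowTorch's API, and every name in it is illustrative.

```python
# Minimal normalizing-flow sketch (generic PyTorch, not FlowTorch's API).
# A flow learns an invertible map f with x = f(z), z ~ base, and
#   log p(x) = log p_z(f^{-1}(x)) + log |det J_{f^{-1}}(x)|.
import torch
import torch.nn as nn

class AffineFlow(nn.Module):
    """Single learnable affine bijection: x = z * exp(log_scale) + shift."""

    def __init__(self, dim: int):
        super().__init__()
        self.log_scale = nn.Parameter(torch.zeros(dim))
        self.shift = nn.Parameter(torch.zeros(dim))

    def inverse(self, x: torch.Tensor) -> tuple[torch.Tensor, torch.Tensor]:
        z = (x - self.shift) * torch.exp(-self.log_scale)
        # log |det J| of the inverse map, summed over dimensions
        log_det = -self.log_scale.sum().expand(x.shape[0])
        return z, log_det

    def sample(self, n: int) -> torch.Tensor:
        z = torch.randn(n, self.shift.numel())
        return z * torch.exp(self.log_scale) + self.shift

dim = 2
flow = AffineFlow(dim)
base = torch.distributions.Independent(
    torch.distributions.Normal(torch.zeros(dim), torch.ones(dim)), 1
)
data = torch.randn(512, dim) * 2.0 + 1.0  # toy target: shifted/scaled Gaussian

optim = torch.optim.Adam(flow.parameters(), lr=1e-2)
for _ in range(200):
    z, log_det = flow.inverse(data)
    loss = -(base.log_prob(z) + log_det).mean()  # negative log-likelihood
    optim.zero_grad()
    loss.backward()
    optim.step()

new_points = flow.sample(10)  # draw from the learned distribution
```

A real library stacks many such bijections (coupling, autoregressive, or spline layers) behind one distribution-like object; the single affine layer here is just the smallest member of that family.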
data/publications.yml (2 changes: 1 addition and 1 deletion)

@@ -399,7 +399,7 @@
   description: Normalizing flows have potential in Bayesian statistics as a complementary or alternative method to MCMC for sampling posteriors. However, training them via the reverse KL divergence may be inadequate for complex posteriors. This research proposes a new training approach based on the direct KL divergence: a local MCMC algorithm is augmented with a normalizing flow to enhance its mixing rate, and the resulting samples are used to train the flow. The method requires minimal prior knowledge of the posterior and can be applied for model validation and evidence estimation, offering a promising strategy for efficient posterior sampling.

 - title: Adaptive Monte Carlo augmented with normalizing flows
-  url: https://pnas.org/doi/10.1073/pnas.2109420119
+  url: https://doi.org/10.1073/pnas.2109420119
   date: 2022-03-02
   authors: Marylou Gabrié, Grant M. Rotskoff, Eric Vanden-Eijnden
   description: Markov Chain Monte Carlo (MCMC) algorithms struggle with sampling from high-dimensional, multimodal distributions, requiring extensive computational effort or specialized importance sampling strategies. To address this, an adaptive MCMC approach is proposed, combining local updates with nonlocal transitions via normalizing flows. This method blends standard transition kernels with generative-model moves, adapting the generative model on the data it generates to improve sampling efficiency. Theoretical analysis and numerical experiments demonstrate the algorithm's ability to equilibrate quickly between metastable modes, sampling effectively across large free energy barriers and achieving significant accelerations over traditional MCMC methods.
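The two descriptions above share one mechanism, which a short sketch makes concrete: a local MCMC kernel is interleaved with nonlocal independence proposals drawn from a learned model, and the chain's own samples are used to adapt that model by maximum likelihood (the direct/forward KL). This is a hedged illustration, not the papers' code; for brevity a learnable diagonal Gaussian stands in for an expressive normalizing flow, and all names are invented for the example.

```python
# Hedged sketch of flow-augmented adaptive MCMC (illustrative names only).
# Even steps: local random-walk Metropolis. Odd steps: nonlocal independence
# proposals from a learned model, whose acceptance ratio needs the model's
# log-density. The model is adapted on the chain's own samples.
import torch

def log_target(x: torch.Tensor) -> torch.Tensor:
    """Toy bimodal target (stand-in for a hard posterior): two Gaussian modes."""
    return torch.logaddexp(
        -0.5 * ((x - 3.0) ** 2).sum(-1),
        -0.5 * ((x + 3.0) ** 2).sum(-1),
    )

dim = 2
loc = torch.zeros(dim, requires_grad=True)        # "flow" parameters: here just
log_scale = torch.zeros(dim, requires_grad=True)  # a diagonal Gaussian
optim = torch.optim.Adam([loc, log_scale], lr=1e-2)

def model() -> torch.distributions.Distribution:
    return torch.distributions.Independent(
        torch.distributions.Normal(loc, log_scale.exp()), 1
    )

x = torch.zeros(1, dim)  # current chain state
samples = []
for step in range(2000):
    if step % 2 == 0:  # local symmetric random-walk move
        prop = x + 0.5 * torch.randn_like(x)
        log_alpha = log_target(prop) - log_target(x)
    else:  # nonlocal independence move proposed by the model
        q = model()
        prop = q.sample((1,))
        log_alpha = (log_target(prop) - log_target(x)
                     + q.log_prob(x) - q.log_prob(prop)).detach()
    if torch.rand(()).log() < log_alpha:  # Metropolis-Hastings accept/reject
        x = prop
    samples.append(x.squeeze(0))
    if step >= 100 and step % 10 == 0:  # adapt by maximum likelihood (forward KL)
        batch = torch.stack(samples[-100:])
        loss = -model().log_prob(batch).mean()
        optim.zero_grad()
        loss.backward()
        optim.step()
```

Because the independence proposals come from a model that increasingly resembles the target, the chain can jump between the two modes at ±3 without waiting for the random walk to cross the barrier, which is the acceleration both entries describe.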
readme.md (4 changes: 2 additions and 2 deletions)

@@ -65,7 +65,7 @@ A list of awesome resources for understanding and applying normalizing flows (NF
 1. 2022-05-16 - [Multi-scale Attention Flow for Probabilistic Time Series Forecasting](https://arxiv.org/abs/2205.07493) by Feng, Xu et al.<br>
    Proposes a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF), which integrates multi-scale attention with relative position information and represents the multivariate data distribution by a conditioned normalizing flow.

-1. 2022-03-02 - [Adaptive Monte Carlo augmented with normalizing flows](https://pnas.org/doi/10.1073/pnas.2109420119) by Gabrié, Rotskoff et al.<br>
+1. 2022-03-02 - [Adaptive Monte Carlo augmented with normalizing flows](https://doi.org/10.1073/pnas.2109420119) by Gabrié, Rotskoff et al.<br>
    Markov Chain Monte Carlo (MCMC) algorithms struggle with sampling from high-dimensional, multimodal distributions, requiring extensive computational effort or specialized importance sampling strategies. To address this, an adaptive MCMC approach is proposed, combining local updates with nonlocal transitions via normalizing flows. This method blends standard transition kernels with generative-model moves, adapting the generative model on the data it generates to improve sampling efficiency. Theoretical analysis and numerical experiments demonstrate the algorithm's ability to equilibrate quickly between metastable modes, sampling effectively across large free energy barriers and achieving significant accelerations over traditional MCMC methods. [[Code](https://zenodo.org/records/4783701#.Yfv53urMJD8)]

 1. 2022-01-14 - [E(n) Equivariant Normalizing Flows](https://arxiv.org/abs/2105.09016) by Satorras, Hoogeboom et al.<br>

@@ -329,7 +329,7 @@ Zuko is used in [LAMPE](https://github.com/francois-rozet/lampe) to enable Likel
 1. 2020-12-07 - [flowtorch](https://github.com/facebookincubator/flowtorch) by [Facebook / Meta](https://opensource.fb.com)
    &ensp;
    <img src="https://img.shields.io/github/stars/facebookincubator/flowtorch" alt="GitHub repo stars" valign="middle" /><br>
-   [FlowTorch Docs](https://flowtorch.ai) is a PyTorch library for learning and sampling from complex probability distributions using a class of methods called Normalizing Flows.
+   FlowTorch is a PyTorch library for learning and sampling from complex probability distributions using Normalizing Flows.

 1. 2020-02-09 - [nflows](https://github.com/bayesiains/nflows) by [Bayesiains](https://homepages.inf.ed.ac.uk/imurray2/group)
    &ensp;
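Relatedly, the MANF entry in the readme diff above describes a conditioned normalizing flow: the density of future values is modeled by a flow whose parameters depend on an encoding of the observed history. Below is a deliberately simplified sketch of that pattern, using a GRU encoder and a single conditional affine layer instead of MANF's multi-scale attention; all names are illustrative, not the paper's code.

```python
# Hedged sketch of a conditioned normalizing flow for probabilistic
# forecasting: an encoder summarizes the history into a context vector,
# and the flow's shift/scale are functions of that context, so the model
# represents p(future | past).
import torch
import torch.nn as nn

class ConditionalAffineFlow(nn.Module):
    """Affine flow whose shift/scale are predicted from a context vector."""

    def __init__(self, target_dim: int, context_dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(context_dim, 64), nn.ReLU(), nn.Linear(64, 2 * target_dim)
        )

    def _params(self, context: torch.Tensor):
        shift, log_scale = self.net(context).chunk(2, dim=-1)
        return shift, log_scale.clamp(-5, 5)  # keep scales numerically sane

    def log_prob(self, y: torch.Tensor, context: torch.Tensor) -> torch.Tensor:
        shift, log_scale = self._params(context)
        z = (y - shift) * torch.exp(-log_scale)  # inverse map
        base = -0.5 * (z**2 + torch.log(torch.tensor(2 * torch.pi))).sum(-1)
        return base - log_scale.sum(-1)  # change of variables

    def sample(self, context: torch.Tensor) -> torch.Tensor:
        shift, log_scale = self._params(context)
        z = torch.randn_like(shift)
        return z * torch.exp(log_scale) + shift

# Toy usage: encode a length-24 history of 5 series, predict the next step.
history = torch.randn(32, 24, 5)          # (batch, time, series)
future = torch.randn(32, 5)               # next-step targets
encoder = nn.GRU(input_size=5, hidden_size=32, batch_first=True)
flow = ConditionalAffineFlow(target_dim=5, context_dim=32)

_, h = encoder(history)                   # h: (1, batch, 32)
context = h.squeeze(0)
loss = -flow.log_prob(future, context).mean()  # maximum likelihood
loss.backward()
forecast_samples = flow.sample(context)   # one draw per batch element
```

Sampling from the conditioned flow repeatedly yields an empirical predictive distribution, which is what makes the forecast probabilistic rather than a point estimate.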
