Home
Articles to understand theory better:
Other articles not discussed in mainstream ML:
- Bayesian Committee Machine
- Predictive State Representations: A New Theory for Modeling Dynamical Systems
- Learning by transduction
Misc
- NeRF
- Fourier Features Let Networks Learn High Frequency Functions in Low Dimensional Domains
Some background work:
- An introduction to probabilistic programming
- Automating Inference, Learning, and Design using Probabilistic Programming (a more approachable resource)
- A rational analysis of rule-based concept learning
- Learning overhypotheses with hierarchical Bayesian models
- Noise contrastive estimation
- Notes on noise contrastive estimation and negative sampling
- Density ratio trick (see the short sketch after this list)
- Density-ratio matching under the Bregman divergence: a unified framework of density-ratio estimation
- Putting an end to end-to-end: Gradient-Isolated Learning of Representations
- Brain-Like Object Recognition with High-Performing Shallow Recurrent ANNs
- Legendre Memory Units: Continuous-Time Representation in Recurrent Neural Networks
- Probabilistic Matrix Factorization for Automated Machine Learning
- Estimators for Multivariate Information Measures in General Probability Spaces
- Generalized Sliced Wasserstein Distances
- Deep ReLU Networks Have Surprisingly Few Activation Patterns
- Learning Conditional Deformable Templates with Convolutional Networks
- SGD on Neural Networks Learns Functions of Increasing Complexity
- Full-Gradient Representation for Neural Network Visualization
- Understanding and Improving Layer Normalization
- Neural Similarity Learning
- A Primal Dual Formulation for Deep Learning with Constraints
- Integer Discrete Flows and Lossless Compression
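
As a quick illustration of the density ratio trick listed above (this is not code from any of the papers, just a minimal sketch): a binary classifier trained to separate samples of p(x) from samples of q(x) implicitly estimates the ratio p(x)/q(x), because for a well-calibrated classifier with equal class priors the ratio equals P(y=1|x) / P(y=0|x). The sketch below uses scikit-learn's LogisticRegression on two 1-D Gaussians, where the true ratio is exp(x - 0.5).

```python
# Density ratio trick: estimate p(x)/q(x) with a binary classifier.
# Samples from p are labeled 1, samples from q are labeled 0; with equal
# class priors, p(x)/q(x) ~= P(y=1|x) / P(y=0|x).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Two 1-D Gaussians standing in for p(x) = N(1, 1) and q(x) = N(0, 1).
x_p = rng.normal(loc=1.0, scale=1.0, size=(5000, 1))  # samples from p
x_q = rng.normal(loc=0.0, scale=1.0, size=(5000, 1))  # samples from q

# Stack the samples and fit a classifier to tell them apart.
X = np.vstack([x_p, x_q])
y = np.concatenate([np.ones(len(x_p)), np.zeros(len(x_q))])
clf = LogisticRegression().fit(X, y)

# Estimated ratio at a few query points vs. the analytic ratio exp(x - 0.5).
x_test = np.array([[-1.0], [0.0], [1.0], [2.0]])
proba = clf.predict_proba(x_test)          # columns: P(y=0|x), P(y=1|x)
ratio_est = proba[:, 1] / proba[:, 0]
ratio_true = np.exp(x_test - 0.5).ravel()

print("estimated:", np.round(ratio_est, 3))
print("true:     ", np.round(ratio_true, 3))
```

The same discriminate-two-distributions idea underlies noise contrastive estimation and the density-ratio matching paper listed above, which is why those entries sit together.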