Understanding Noise-Augmented Training for Randomized Smoothing

This repository provides code and visualizations for our TMLR paper "Understanding Noise-Augmented Training for Randomized Smoothing", Ambar Pal, Jeremias Sulam, Transactions on Machine Learning Research, 2023. We demonstrate that noise-augmented training is not always beneficial for randomized smoothing, and identify a key theoretical property that determines when it is.
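
To make the setting concrete, here is a minimal sketch of Gaussian noise-augmented training, i.e., training the base classifier on inputs perturbed with isotropic Gaussian noise of standard deviation sigma. This is a generic illustration, not the code in this repository; `model`, `train_loader`, and the hyperparameters are placeholders.

```python
import torch
import torch.nn.functional as F

def train_noise_augmented(model, train_loader, sigma=0.25, epochs=10, lr=1e-3, device="cpu"):
    """Train a classifier on inputs perturbed with N(0, sigma^2 I) noise."""
    model = model.to(device)
    optimizer = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)
    for _ in range(epochs):
        for x, y in train_loader:
            x, y = x.to(device), y.to(device)
            # Core of noise augmentation: perturb each input with Gaussian noise
            # drawn independently at every training step.
            x_noisy = x + sigma * torch.randn_like(x)
            loss = F.cross_entropy(model(x_noisy), y)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    return model
```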

Synthetic Experiments

The synthetic experiments in Section 5 can be reproduced with the code in Synthetic_Experiments.ipynb.

Real Data Experiments

The plots for the experiments on MNIST and CIFAR-10 can be reproduced with MNIST_CIFAR10_ExperimentsPlot.ipynb. Code for training the noise-augmented MNIST models is given in C1_train_noise_augmented_MNIST.py. The trained models can then be used with C2_evaluate_risk_after_noise_augmentation_MNIST.py to evaluate the risk of the resulting randomized-smoothing classifiers.
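
For intuition about the evaluation step, the sketch below estimates the 0-1 risk of the Gaussian-smoothed classifier by Monte Carlo: for each test input, the base classifier votes over noisy copies and the smoothed prediction is the majority vote. This is a simplified stand-in for the repository's evaluation script, assuming a PyTorch setup with placeholder `model`, `test_loader`, `sigma`, and `num_samples`.

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def smoothed_risk(model, test_loader, sigma=0.25, num_samples=100, device="cpu"):
    """Monte Carlo estimate of the 0-1 risk of the Gaussian-smoothed classifier."""
    model = model.to(device).eval()
    errors, total = 0, 0
    for x, y in test_loader:
        x, y = x.to(device), y.to(device)
        # First noisy sample also tells us the number of classes.
        logits = model(x + sigma * torch.randn_like(x))
        num_classes = logits.shape[1]
        counts = F.one_hot(logits.argmax(dim=1), num_classes)
        for _ in range(num_samples - 1):
            # Each noisy copy of x casts one vote for a class.
            preds = model(x + sigma * torch.randn_like(x)).argmax(dim=1)
            counts = counts + F.one_hot(preds, num_classes)
        smoothed_pred = counts.argmax(dim=1)  # majority vote = smoothed prediction
        errors += (smoothed_pred != y).sum().item()
        total += y.numel()
    return errors / total
```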