Multiple-Adversarial_Examples_attack

"The rise and fall of six dynasties pass like a dream; the fleeting months and years startle me with time's passage. Even if the road is long and the year grows cold, this resolve shall not be taken away."

This repository contains adversarial example attacks against typical deep learning models: image classifiers, Faster R-CNN, and YOLO.

For a set of selected image classification and object detection networks, we designed targeted adversarial-example attack methods to generate adversarial samples. By perturbing the test samples without noticeably changing their visual appearance, we can deceive the recognition models and cause them to output completely incorrect results.
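To illustrate the general principle, the sketch below applies a single FGSM-style gradient-sign step in PyTorch. This is a minimal example of the idea of an imperceptible, prediction-flipping perturbation, not the specific attack methods implemented in this repository (see the references below); `model`, `image`, and `label` are assumed to be a trained classifier, an input tensor with pixel values in [0, 1], and the true class index.

```python
# Minimal sketch of an adversarial perturbation (FGSM-style step).
# Illustrative only; not the specific method used in this repository.
import torch
import torch.nn.functional as F

def fgsm_perturb(model, image, label, epsilon=2.0 / 255):
    """Return an adversarial copy of `image` after one gradient-sign step."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), label)
    loss.backward()
    # Step in the direction that increases the loss, keep pixels in [0, 1].
    adv = image + epsilon * image.grad.sign()
    return adv.clamp(0.0, 1.0).detach()
```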

Dependencies

  • TensorFlow, Keras
  • PyTorch
  • NumPy
  • Matplotlib
  • Jupyter Notebook, Google Colab

Reference

[1] Su, Jiawei, Danilo Vasconcellos Vargas, and Kouichi Sakurai. "One pixel attack for fooling deep neural networks." IEEE Transactions on Evolutionary Computation 23.5 (2019): 828-841.
[2] Wang, Derui, et al. "Daedalus: Breaking non-maximum suppression in object detection via adversarial examples." arXiv preprint (2019).
[3] Wei, Xingxing, et al. "Transferable adversarial attacks for image and video object detection." arXiv preprint arXiv:1811.12641 (2018).
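As a concrete illustration of the attack in reference [1], the one-pixel attack searches for a single (x, y, r, g, b) perturbation with differential evolution rather than gradients. The following is a hedged sketch of that idea using scipy; `predict_fn`, the HxWx3 float image layout, and the hyperparameters are illustrative assumptions, not this repository's actual implementation.

```python
# Hedged sketch of the one-pixel attack idea from reference [1].
import numpy as np
from scipy.optimize import differential_evolution

def one_pixel_attack(predict_fn, image, true_label, max_iter=75):
    """`predict_fn(img)` returns class probabilities for an HxWx3 image in [0, 1]."""
    h, w, _ = image.shape
    bounds = [(0, h - 1), (0, w - 1), (0, 1), (0, 1), (0, 1)]  # x, y, r, g, b

    def apply_pixel(params, img):
        x, y, r, g, b = params
        out = img.copy()
        out[int(x), int(y)] = (r, g, b)
        return out

    def objective(params):
        # Minimize the probability the model assigns to the true class.
        return predict_fn(apply_pixel(params, image))[true_label]

    result = differential_evolution(objective, bounds, maxiter=max_iter,
                                    popsize=10, tol=1e-5)
    return apply_pixel(result.x, image)
```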
