The Definition of Adversarial Samples (adversarial sample)

 An adversarial sample specifically refers to a malicious input that attacks a deep neural network model while remaining imperceptible to the human eye. The definitions given in various papers are collected below; a minimal code sketch of how such a sample is typically crafted follows the list.

  • Adversarial examples are a type of attack on machine learning (ML) systems which cause misclassification of inputs. 《Adversarial Examples and Metrics》
  • An adversarial sample is an input crafted to cause deep learning algorithms to misclassify. 《The Limitations of Deep Learning in Adversarial Settings》
  • Deep neural networks (DNNs) are challenged by their vulnerability to adversarial examples, which are crafted by adding small, human-imperceptible noises to legitimate examples, but make a model output attacker-desired inaccurate predictions. Adversarial attacks serve as an important surrogate to evaluate the robustness of deep learning models before they are deployed. 《Boosting Adversarial Attacks with Momentum》
  • Recent work has demonstrated that deep neural networks are vulnerable to adversarial examples—inputs that are almost indistinguishable from natural data and yet classified incorrectly by the network. In fact, some of the latest findings suggest that the existence of adversarial attacks may be an inherent weakness of deep learning models. 《Towards Deep Learning Models Resistant to Adversarial Attacks》
  • Convolutional neural networks can be easily attacked by adversarial examples, which are computed by adding small perturbations to clean inputs. 《Smooth Adversarial Training》
  • An adversarial sample can be defined as one which appears to be drawn from a particular class by humans (or advanced cognitive systems) but falls into a different class in the feature space. Adversarial samples are strategically modified samples, which are crafted with the purpose of fooling a classifier at hand. 《Towards Crafting Text Adversarial Samples》
  • The adversarial examples are always constructed by adding vicious perturbations to the original images, and the added perturbations are not easily perceived by humans. 《Generative Networks for Adversarial Examples with Weighted Perturbations》
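
The definitions above share one recipe: take a clean input, add a small perturbation that increases the model's loss, and keep the change below the threshold of human perception. The sketch below illustrates this with the Fast Gradient Sign Method (FGSM), one of the simplest such attacks. It assumes a PyTorch image classifier `model` and inputs scaled to [0, 1]; the function name and the `epsilon` value are illustrative choices, not taken from any of the cited papers.

```python
import torch
import torch.nn.functional as F

def fgsm_attack(model, x, y, epsilon=8 / 255):
    """Craft adversarial examples with the Fast Gradient Sign Method.

    Assumes `model` is a PyTorch classifier and `x` is a batch of images
    scaled to [0, 1]; `epsilon` bounds the per-pixel change so the
    perturbation stays hard for a human to notice.
    """
    x = x.clone().detach().requires_grad_(True)

    # Loss of the model on the *true* labels; the attack wants to raise it.
    loss = F.cross_entropy(model(x), y)
    loss.backward()

    # Step each pixel by +/- epsilon in the direction that increases the
    # loss, then clamp back to the valid image range.
    x_adv = x + epsilon * x.grad.sign()
    return x_adv.clamp(0.0, 1.0).detach()
```

Feeding `x_adv` back through `model` will often flip the prediction even though `x_adv` differs from `x` by at most `epsilon` per pixel, which is exactly the "small, human-imperceptible noise" the definitions describe.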