Cao, H., Si, C., Sun, Q. et al. (3 more authors) (2022) ABCAttack: a gradient-free optimization black-box attack for fooling deep image classifiers. Entropy, 24(3), 412.
Abstract
The vulnerability of deep neural network (DNN)-based systems to adversarial perturbations can cause classification tasks to fail. In this work, we propose an adversarial attack model based on the Artificial Bee Colony (ABC) algorithm that generates adversarial samples without gradient evaluation or training of a substitute model, further increasing the likelihood of task failure under adversarial perturbation. In untargeted attacks, the proposed method achieved success rates of 100%, 98.6%, and 90.00% on the MNIST, CIFAR-10, and ImageNet datasets, respectively. The experimental results show that the proposed ABCAttack not only achieves a high attack success rate with fewer queries in the black-box setting, but also breaks some existing defenses to a large extent, and is not limited by model structure or size, providing further research directions for deep learning evasion attacks and defenses.
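The abstract's core idea — a query-only attack that searches for an adversarial perturbation with the Artificial Bee Colony heuristic instead of gradients — can be sketched roughly as follows. This is a simplified illustration, not the paper's exact algorithm: `toy_model`, the fitness definition (true-class probability), and all hyperparameters are assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_model(x):
    """Stand-in black-box classifier over 3 classes; the attack only sees its output."""
    logits = np.array([x.sum(), (x ** 2).sum(), 3.0 * np.abs(x - 0.5).mean()])
    e = np.exp(logits - logits.max())
    return e / e.sum()

def _neighbor(foods, i, x, eps):
    """ABC neighbor step: perturb food source i toward/away from a random partner,
    then clip back into the L-infinity ball around x and the valid pixel range."""
    k = rng.integers(len(foods))
    phi = rng.uniform(-1, 1, foods[i].shape)
    cand = foods[i] + phi * (foods[i] - foods[k])
    return np.clip(np.clip(cand, x - eps, x + eps), 0.0, 1.0)

def abc_attack(x, true_label, model, eps=0.3, n_food=10, limit=10, max_iter=100):
    """Untargeted, gradient-free ABC-style attack: minimise the true-class
    probability using only model queries. Returns the best perturbed input."""
    foods = np.clip(x + rng.uniform(-eps, eps, (n_food,) + x.shape), 0.0, 1.0)
    fit = np.array([model(f)[true_label] for f in foods])  # lower is better
    trials = np.zeros(n_food, dtype=int)
    for _ in range(max_iter):
        # Employed + onlooker phases: greedy local search, with onlookers
        # revisiting sources in proportion to how promising they look.
        onlookers = rng.choice(n_food, n_food, p=(1 - fit) / (1 - fit).sum())
        for i in list(range(n_food)) + list(onlookers):
            cand = _neighbor(foods, i, x, eps)
            f = model(cand)[true_label]
            if f < fit[i]:
                foods[i], fit[i], trials[i] = cand, f, 0
            else:
                trials[i] += 1
        # Scout phase: abandon stagnant sources and re-sample randomly.
        for i in np.where(trials > limit)[0]:
            foods[i] = np.clip(x + rng.uniform(-eps, eps, x.shape), 0.0, 1.0)
            fit[i] = model(foods[i])[true_label]
            trials[i] = 0
        best = int(fit.argmin())
        if model(foods[best]).argmax() != true_label:
            break  # success: the model no longer predicts the true class
    return foods[int(fit.argmin())]

# Usage on a toy 8-dimensional "image"
x = rng.uniform(0, 1, 8)
y = int(toy_model(x).argmax())
adv = abc_attack(x, y, toy_model)
```

The clipping in `_neighbor` keeps every candidate inside the epsilon-bounded L-infinity neighbourhood of the clean input, so the perturbation stays small; only forward queries to `model` are ever made, matching the black-box, gradient-free setting the abstract describes.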
Metadata
| Field | Value |
| --- | --- |
| Item Type | Article |
| Authors/Creators | |
| Copyright, Publisher and Additional Information | © 2022 The Authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
| Keywords | deep neural networks; adversarial examples; image classification; information security; black-box attack |
| Dates | |
| Institution | The University of Sheffield |
| Academic Units | The University of Sheffield > Faculty of Engineering (Sheffield) > Department of Computer Science (Sheffield) |
| Depositing User | Symplectic Sheffield |
| Date Deposited | 23 Mar 2022 10:33 |
| Last Modified | 23 Mar 2022 10:33 |
| Status | Published |
| Publisher | MDPI AG |
| Refereed | Yes |
| Identification Number (DOI) | 10.3390/e24030412 |
| Open Archives Initiative ID (OAI ID) | oai:eprints.whiterose.ac.uk:185002 |