Diverse Knowledge Distillation (DKD): A Solution for Improving the Robustness of Ensemble Models Against Adversarial Attacks

dc.contributor.author: Mirzaeian, Ali
dc.contributor.author: Kosecka, Jana
dc.contributor.author: Homayoun, Houman
dc.contributor.author: Mohsenin, Tinoosh
dc.contributor.author: Sasan, Avesta
dc.date.accessioned: 2021-02-03T19:21:55Z
dc.date.available: 2021-02-03T19:21:55Z
dc.date.issued: 2021
dc.description.abstract: This paper proposes an ensemble learning model that is resistant to adversarial attacks. To build resilience, we introduce a training process in which each member learns a radically distinct latent space. Member models are added to the ensemble one at a time; as each is trained, the loss function is regulated by reverse knowledge distillation (see the sketch following this record), forcing the new member to learn different features and to map to a latent space safely distanced from those of the existing members. We assessed the security and performance of the proposed solution on image-classification tasks using the CIFAR10 and MNIST datasets, and showed improvements in both security and performance over state-of-the-art defense methods.
dc.identifier.citation: Ali Mirzaeian, Jana Kosecka, Houman Homayoun, Tinoosh Mohsenin, and Avesta Sasan. "Diverse Knowledge Distillation (DKD): A Solution for Improving the Robustness of Ensemble Models Against Adversarial Attacks." ISQED 2021, California.
dc.identifier.uri: https://hdl.handle.net/1920/11947
dc.language.iso: en_US
dc.rights: Attribution-ShareAlike 3.0 United States
dc.rights.uri: https://creativecommons.org/licenses/by-sa/3.0/us/
dc.subject: Ensemble learning
dc.subject: Neural Networks
dc.title: Diverse Knowledge Distillation (DKD): A Solution for Improving the Robustness of Ensemble Models Against Adversarial Attacks
dc.type: Working Paper
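The training procedure summarized in the abstract can be made concrete as a diversity-regularized loss for the newest ensemble member. Below is a minimal PyTorch sketch, not the paper's exact formulation: the function name `dkd_loss`, the diversity weight `lambda_div`, the temperature `T`, and the choice to measure diversity as KL divergence between softened output distributions (rather than, say, a distance in the latent feature space) are all illustrative assumptions.

```python
import torch.nn.functional as F

def dkd_loss(logits_new, logits_existing, targets, T=4.0, lambda_div=0.5):
    """Sketch of a reverse-distillation loss for a new ensemble member.

    Ordinary knowledge distillation pulls a student toward a teacher's
    softened outputs; here the KL term is subtracted instead, pushing the
    new member away from the members already in the ensemble.
    """
    # Task loss: standard cross-entropy against the ground-truth labels.
    ce = F.cross_entropy(logits_new, targets)

    # Reverse-KD term: mean KL divergence between each frozen existing
    # member's temperature-softened distribution and the new member's.
    kl = 0.0
    for logits_old in logits_existing:
        p_old = F.softmax(logits_old.detach() / T, dim=1)
        log_p_new = F.log_softmax(logits_new / T, dim=1)
        kl = kl + F.kl_div(log_p_new, p_old, reduction="batchmean") * (T * T)
    kl = kl / max(len(logits_existing), 1)

    # Subtracting the KL term rewards divergence from existing members;
    # lambda_div trades classification accuracy against ensemble diversity.
    return ce - lambda_div * kl
```

Under this reading, members are trained sequentially: the first with plain cross-entropy, and each subsequent member with the loss above computed against all frozen, already-trained predecessors, so that every addition is steered toward a distinct latent space.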

Files

Original bundle
Name: isqed_2021_diverse_knowledge_distillation-7.pdf
Size: 2.31 MB
Format: Adobe Portable Document Format
License bundle
Name: license.txt
Size: 2.52 KB
Format: Item-specific license agreed to upon submission