
Author:

Du, Boyuan [1] | Yu, Yuanlong [2] | Liu, Huaping [3]

Indexed by:

CPCI-S, EI, Scopus

Abstract:

Artificial neural networks suffer from catastrophic forgetting when knowledge must be learned from multi-batch or streaming data. In response to this problem, researchers have proposed a variety of lifelong learning methods to avoid catastrophic forgetting. However, current methods usually do not consider the possibility of malicious attacks, even though in real lifelong learning scenarios batch or streaming data usually come from an incompletely trusted environment. Attackers can easily manipulate data or inject malicious samples into the training data set, which reduces the reliability of the neural network. Recent research on lifelong learning attacks, whether based on backdoor attacks or data poisoning attacks, requires real samples of the attacked classes. In this paper, we focus on an attack setting that is better suited to the lifelong learning scenario. This setting has two main features: first, it does not require real samples of the attacked classes; second, it allows attacks to be performed on tasks that exclude the attacked classes. For this scenario, we propose a lifelong learning attack model based on deep inversion. With EWC as the benchmark lifelong learning model, our experiments show that 1) in the data poisoning attack, the target accuracy can be significantly decreased by adding 0.5% of poisoned samples, and 2) a backdoor attack with high accuracy can be achieved by adding 1% of backdoor samples.
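
The key ingredient named in the abstract, deep inversion, synthesizes stand-in images for a class directly from an already-trained network, so the attacker never needs real samples of the attacked class. The sketch below (Python / PyTorch) is only a minimal illustration of that general idea under assumed settings, not the authors' implementation: the function name deep_invert, the 32x32 RGB image size, the step count, the learning rate, and the batch-norm regularizer weight bn_weight are all hypothetical choices made for the example.

    # Minimal deep-inversion sketch (assumptions: CIFAR-like 3x32x32 inputs,
    # a frozen classifier with BatchNorm2d layers; hyper-parameters are illustrative).
    import torch
    import torch.nn.functional as F

    def deep_invert(model, target_class, num_images=32, steps=500, lr=0.05, bn_weight=1e-2):
        """Optimize random noise toward images the frozen model assigns to target_class,
        while matching the running statistics stored in its BatchNorm layers."""
        model.eval()
        device = next(model.parameters()).device
        images = torch.randn(num_images, 3, 32, 32, device=device, requires_grad=True)
        labels = torch.full((num_images,), target_class, device=device, dtype=torch.long)

        # Hook every BatchNorm2d layer to penalize the gap between the batch statistics
        # of the synthetic images and the layer's stored running mean/variance.
        bn_losses = []
        def bn_hook(module, inputs, _output):
            x = inputs[0]
            mean = x.mean(dim=(0, 2, 3))
            var = x.var(dim=(0, 2, 3), unbiased=False)
            bn_losses.append(F.mse_loss(mean, module.running_mean) +
                             F.mse_loss(var, module.running_var))
        hooks = [m.register_forward_hook(bn_hook)
                 for m in model.modules() if isinstance(m, torch.nn.BatchNorm2d)]

        opt = torch.optim.Adam([images], lr=lr)
        for _ in range(steps):
            bn_losses.clear()
            opt.zero_grad()
            logits = model(images)
            # Classification term drives images toward the target class;
            # the BN term keeps them close to the training-data statistics.
            loss = F.cross_entropy(logits, labels) + bn_weight * sum(bn_losses)
            loss.backward()
            opt.step()

        for h in hooks:
            h.remove()
        return images.detach()

One plausible way such synthesized images feed the two attacks the abstract reports: for data poisoning, a small fraction of them (on the order of the 0.5% figure) is mislabeled and mixed into a later task's training data; for the backdoor attack, a trigger pattern is stamped onto roughly 1% of the training samples built this way and they are labeled as the target class. The exact construction is described in the paper itself, not in this record.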

Keyword:

backdoor attack; data poisoning attack; deep inversion; lifelong learning

Community:

  • [ 1 ] [Du, Boyuan]Fuzhou Univ, Coll Comp & Data Sci, Fuzhou, Peoples R China
  • [ 2 ] [Yu, Yuanlong]Fuzhou Univ, Coll Comp & Data Sci, Fuzhou, Peoples R China
  • [ 3 ] [Liu, Huaping]Tsinghua Univ, Dept Comp Sci & Technol, Beijing, Peoples R China

Reprint Address:


Related Keywords:

Source:

2023 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN

ISSN: 2161-4393

Year: 2023

Cited Count:

WoS CC Cited Count:

SCOPUS Cited Count:

ESI Highly Cited Papers on the List: 0

WanFang Cited Count:

Chinese Cited Count:

