Title: | Data Poisoning Attack by Label Flipping on SplitFed Learning |
Authors: | Gajbhiye, S.; Singh, P.; Gupta, S. |
Keywords: | Adversarial machine learning; Data poisoning; Federated Learning; Label flipping; SplitFed Learning |
Issue Date: | 2023 |
Publisher: | Springer Science and Business Media Deutschland GmbH |
Abstract: | In the distributed machine learning scenario, Split Learning (SL) and Federated Learning (FL) are the popular techniques. In SL, the model is split between the clients and the server, and clients train sequentially, whereas in FL, clients train in parallel. The model splitting in SL provides better overall privacy than FL. SplitFed Learning (SFL) combines these two popular techniques, incorporating the model-splitting approach from SL to improve privacy and the generic FL approach for faster training. Despite these advantages, the distributed nature of SFL makes it vulnerable to data poisoning attacks by malicious participants. This vulnerability prompted us to study the robustness of SFL under such attacks. The outcomes of this study would provide valuable insights to organizations and researchers who wish to deploy or study SFL. In this paper, we conduct three experiments. Our first experiment demonstrates that data poisoning attacks seriously threaten SFL systems: even the presence of 10% malicious participants can cause a drastic drop in the accuracy of the global model. We further perform a second experiment to study the robustness of two variants of SFL under targeted data poisoning attacks. The results of experiment two demonstrate that SFLV1 is more robust than SFLV2 in the majority of cases. In our third experiment, we studied untargeted data poisoning attacks on SFL. We found that untargeted attacks cause a more significant loss in the global model’s accuracy than targeted attacks. © 2023, The Author(s), under exclusive license to Springer Nature Switzerland AG. |
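The abstract distinguishes targeted label flipping (a chosen source class is relabeled as a chosen target class) from untargeted flipping (labels are corrupted arbitrarily). As a rough illustration of what a malicious client would do to its local data before training, here is a minimal Python sketch; the class numbers (7 → 1), function names, and seeding are illustrative assumptions, not taken from the paper.

```python
import random

def flip_labels(labels, source=7, target=1, fraction=1.0, seed=0):
    """Targeted label flipping: a malicious client relabels a fraction
    of its samples of class `source` as class `target` before training."""
    rng = random.Random(seed)
    flipped = list(labels)
    idxs = [i for i, y in enumerate(flipped) if y == source]
    rng.shuffle(idxs)
    for i in idxs[: int(len(idxs) * fraction)]:
        flipped[i] = target
    return flipped

def untargeted_flip(labels, num_classes=10, seed=0):
    """Untargeted variant: every label is replaced by a uniformly random
    *different* class, corrupting the client's entire local dataset."""
    rng = random.Random(seed)
    return [rng.choice([c for c in range(num_classes) if c != y])
            for y in labels]

clean = [7, 1, 7, 3, 7, 0]
poisoned = flip_labels(clean)  # every 7 becomes a 1
print(poisoned)                # [1, 1, 1, 3, 1, 0]
```

In an SFL setting, the poisoned labels would then drive the client's local gradient updates, which the server aggregates into the global model.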
URI: | https://dx.doi.org/10.1007/978-3-031-23599-3_30 |
ISBN: | 978-3-031-23598-6 |
ISSN: | 1865-0929 |
Appears in Collections: | Conference Paper |