Detail View

FedNN: Federated learning on concept drift data using weight and adaptive group normalizations

DC Field Value
dc.contributor.author Kang, Myeongkyun
dc.contributor.author Kim, Soopil
dc.contributor.author Jin, Kyong Hwan
dc.contributor.author Adeli, Ehsan
dc.contributor.author Pohl, Kilian M.
dc.contributor.author Park, Sang Hyun
dc.date.accessioned 2025-07-15T15:10:12Z
dc.date.available 2025-07-15T15:10:12Z
dc.date.created 2024-02-20
dc.date.issued 2024-05
dc.identifier.issn 0031-3203
dc.identifier.uri https://scholar.dgist.ac.kr/handle/20.500.11750/58643
dc.description.abstract Federated Learning (FL) allows a global model to be trained without sharing private raw data. The major challenge in FL is client-wise data heterogeneity, which leads to differences in model convergence speed and accuracy. Despite recent progress in FL, most methods verify their accuracy on prior probability shift (label distribution skew) datasets, while the concept drift problem (i.e., where each client has distinct styles of input while sharing the same labels) has not been explored. In real scenarios, concept drift is of paramount concern in FL since each client's data is collected under very different conditions, making FL optimization more challenging. Significant differences in inputs among clients exacerbate the heterogeneity of clients' parameters compared to prior probability shift, ultimately causing previous FL approaches to fail. To address the challenge of concept drift, we use Weight Normalization (WN) and Adaptive Group Normalization (AGN) to alleviate conflicts during global model updates. WN re-parameterizes weights to have zero mean and unit variance, while AGN adaptively selects the optimal mean and standard deviation for feature normalization based on the dataset. These two components contribute significantly to keeping activations consistent after global model updates, reducing heterogeneity on concept drift data. Comprehensive experiments on seven datasets (with concept drift) demonstrate that our method outperforms five state-of-the-art FL methods and converges faster than previous FL methods. © 2024 Elsevier Ltd
dc.language English
dc.publisher Elsevier
dc.title FedNN: Federated learning on concept drift data using weight and adaptive group normalizations
dc.type Article
dc.identifier.doi 10.1016/j.patcog.2023.110230
dc.identifier.wosid 001154871600001
dc.identifier.scopusid 2-s2.0-85183339359
dc.identifier.bibliographicCitation Kang, Myeongkyun. (2024-05). FedNN: Federated learning on concept drift data using weight and adaptive group normalizations. Pattern Recognition, 149. doi: 10.1016/j.patcog.2023.110230
dc.description.isOpenAccess FALSE
dc.subject.keywordAuthor Federated learning
dc.subject.keywordAuthor Concept drift
dc.subject.keywordAuthor Weight normalization
dc.subject.keywordAuthor Adaptive group normalization
dc.citation.title Pattern Recognition
dc.citation.volume 149
dc.description.journalRegisteredClass scie
dc.description.journalRegisteredClass scopus
dc.relation.journalResearchArea Computer Science; Engineering
dc.relation.journalWebOfScienceCategory Computer Science, Artificial Intelligence; Engineering, Electrical & Electronic
dc.type.docType Article
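
The abstract describes the two normalization components only at a high level. Purely as an illustration of that description (not the paper's actual implementation), the PyTorch sketch below standardizes convolution weights to zero mean and unit variance (WN) and blends group-normalized features with adaptively chosen statistics (AGN); the class names WNConv2d and AGN, the sigmoid gate, and the blending scheme are assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class WNConv2d(nn.Conv2d):
    """Convolution whose weights are standardized to zero mean and unit
    variance before every forward pass (the abstract's description of WN)."""

    def forward(self, x):
        w = self.weight
        # Standardize each output filter over its fan-in (in_ch, kh, kw).
        mean = w.mean(dim=(1, 2, 3), keepdim=True)
        std = w.std(dim=(1, 2, 3), keepdim=True) + 1e-5
        return F.conv2d(x, (w - mean) / std, self.bias,
                        self.stride, self.padding, self.dilation, self.groups)

class AGN(nn.Module):
    """Adaptive Group Normalization sketch: group-normalize, then let a
    learnable gate blend the data's own per-channel statistics back in.
    The gating/blending design here is a guess, not the paper's method."""

    def __init__(self, num_groups, num_channels, eps=1e-5):
        super().__init__()
        self.gn = nn.GroupNorm(num_groups, num_channels, eps=eps, affine=False)
        self.gamma = nn.Parameter(torch.ones(1, num_channels, 1, 1))
        self.beta = nn.Parameter(torch.zeros(1, num_channels, 1, 1))
        self.gate = nn.Parameter(torch.zeros(1, num_channels, 1, 1))

    def forward(self, x):
        normed = self.gn(x)                       # zero mean, unit variance
        mu = x.mean(dim=(0, 2, 3), keepdim=True)  # per-channel data statistics
        sigma = x.std(dim=(0, 2, 3), keepdim=True)
        g = torch.sigmoid(self.gate)
        # Output statistics interpolate between (0, 1) and (mu, sigma).
        return self.gamma * ((1 - g) * normed + g * (normed * sigma + mu)) + self.beta

Under these assumptions, a client model would swap such modules in for nn.Conv2d and nn.BatchNorm2d before federated averaging, so that activations remain consistent after global model updates.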

File Downloads

  • There are no files associated with this item.


Related Researcher

Kim, Soopil (김수필)

Division of Intelligent Robotics

