
Full metadata record

DC Field Value Language
dc.contributor.advisor 서대원 (Daewon Seo) -
dc.contributor.author Seonhyeong Kim -
dc.date.accessioned 2023-03-22T19:57:20Z -
dc.date.available 2023-03-23T06:00:21Z -
dc.date.issued 2023 -
dc.identifier.uri http://hdl.handle.net/20.500.11750/45748 -
dc.identifier.uri http://dgist.dcollection.net/common/orgView/200000653511 -
dc.description Federated learning, Machine learning, Optimization -
dc.description.abstract Federated learning (FL) is a distributed method to train a global model over a set of local clients while keeping data localized. It reduces privacy and security risks but faces important challenges, including expensive communication costs and the client drift problem. To address these issues, we propose FedElasticNet, a communication-efficient and drift-robust FL framework leveraging the elastic net. It repurposes two types of elastic net regularizers (i.e., ℓ1 and ℓ2 penalties on the local model updates): (1) the ℓ1-norm regularizer sparsifies the local updates to reduce communication costs, and (2) the ℓ2-norm regularizer resolves the client drift problem by limiting the impact of drifting local updates due to data heterogeneity. FedElasticNet is a general framework for FL; hence, without additional costs, it can be integrated into prior FL techniques, e.g., FedAvg, FedProx, SCAFFOLD, and FedDyn. We show that our framework effectively resolves the communication cost and client drift problems simultaneously. -
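As an illustrative reading only (a sketch under assumed notation, not quoted from the thesis; F_i, θ, λ_1, λ_2 are hypothetical symbols), the two penalties can be viewed as an elastic-net term added to each client's local objective,

    \min_{\theta_i} \; F_i(\theta_i) + \lambda_1 \lVert \theta_i - \theta^t \rVert_1 + \frac{\lambda_2}{2} \lVert \theta_i - \theta^t \rVert_2^2 ,

where F_i is client i's local loss and θ^t is the global model at round t, so θ_i − θ^t is the local update: the ℓ1 term zeroes many of its coordinates (cutting communication), while the ℓ2 term keeps θ_i close to θ^t (limiting drift).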
dc.description.abstract Federated learning is a distributed learning method that trains a global model over local clients' data without sharing the data that the clients hold. Federated learning reduces the risks of privacy leakage and hacking, but important challenges remain, including expensive communication costs and the client drift problem. To address these problems, we propose FedElasticNet, a communication-efficient and drift-robust federated learning framework that leverages the elastic net. It repurposes two types of elastic net regularizer terms (i.e., ℓ1 and ℓ2 penalties on the local model updates): the ℓ1-norm regularizer sparsifies the local updates to reduce communication costs, and the ℓ2-norm regularizer resolves the client drift problem by limiting the impact of local update drift caused by data heterogeneity. FedElasticNet is a general framework usable in federated learning; hence, without additional modification, it can be integrated into prior federated learning techniques such as FedAvg, FedProx, SCAFFOLD, and FedDyn. We show that the proposed framework effectively resolves the communication cost and client drift problems simultaneously. -
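To make the communication saving concrete, the following minimal Python sketch (hypothetical names elastic_net_prox, lam1, lam2; assumed mechanics, not the thesis's actual code) applies the elastic-net proximal map to a raw local update so that only the coordinates surviving soft-thresholding would need to be transmitted:

import numpy as np

def elastic_net_prox(update, lam1, lam2):
    """Proximal map of lam1*||u||_1 + (lam2/2)*||u||_2^2 applied to a local update."""
    # l1 part: soft-thresholding zeroes out small coordinates (sparsifies the update).
    soft = np.sign(update) * np.maximum(np.abs(update) - lam1, 0.0)
    # l2 part: uniform shrinkage keeps the update small, i.e., the local model
    # stays close to the global model (mitigating client drift).
    return soft / (1.0 + lam2)

rng = np.random.default_rng(0)
u = rng.normal(scale=0.1, size=10_000)               # dense raw local update
u_sparse = elastic_net_prox(u, lam1=0.15, lam2=1.0)
print(f"fraction of coordinates sent: {np.count_nonzero(u_sparse) / u.size:.3f}")

Only the nonzero entries of u_sparse (as index-value pairs) would be uploaded to the server, which is where the communication saving comes from.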
dc.description.tableofcontents Ⅰ. Introduction 1
    Contributions 1
Ⅱ. Related Work 1
    Related Work 2
Ⅲ. Proposed Method 2
    FedElasticNet 3
    3.1 3
    3.2 3
    3.3 4
Ⅳ. Experiments 5
    Experimental Setup 6
    Evaluation of Methods 6
Ⅴ. Conclusion 9
Ⅵ. Appendix 10
    6.1 10
    6.2 10
    6.3 12
Ⅶ. Proof 14
-
dc.format.extent 27 -
dc.language eng -
dc.publisher DGIST -
dc.title Communication-Efficient and Drift-Robust Federated Learning via Elastic Net -
dc.type Thesis -
dc.identifier.doi 10.22677/THESIS.200000653511 -
dc.description.degree Master -
dc.contributor.department Department of Electrical Engineering and Computer Science -
dc.contributor.coadvisor Yongjune Kim -
dc.date.awarded 2023-02-01 -
dc.publisher.location Daegu -
dc.description.database dCollection -
dc.citation XT.IM 김54 202302 -
dc.date.accepted 2023-03-21 -
dc.contributor.alternativeDepartment 전기전자컴퓨터공학과 -
dc.subject.keyword Federated learning -
dc.subject.keyword Machine learning -
dc.subject.keyword Optimization -
dc.contributor.affiliatedAuthor Seonhyeong Kim -
dc.contributor.affiliatedAuthor Daewon Seo -
dc.contributor.affiliatedAuthor Yongjune Kim -
dc.contributor.alternativeName 김선형 -
dc.contributor.alternativeName Daewon Seo -
dc.contributor.alternativeName 김용준 -
Files in This Item:

There are no files associated with this item.

Appears in Collections:
Department of Electrical Engineering and Computer Science Theses Master
