
Boosting for Straggling and Flipping Classifiers

Title
Boosting for Straggling and Flipping Classifiers
Author(s)
Cassuto, Yuval; Kim, Yongjune
Issued Date
2021-07-12
Citation
2021 IEEE International Symposium on Information Theory, ISIT 2021, pp. 2441-2446
Type
Conference Paper
ISBN
9781538682098
ISSN
2157-8095
Abstract
Boosting is a well-known method in machine learning for combining multiple weak classifiers into one strong classifier. When used in a distributed setting, accuracy is hurt by classifiers that flip or straggle due to communication and/or computation unreliability. While unreliability in the form of noisy data is well treated by the boosting literature, the unreliability of the classifier outputs has not been explicitly addressed. Protecting the classifier outputs with an error/erasure-correcting code requires reliable encoding of multiple classifier outputs, which is not feasible in common distributed settings. In this paper, we address the problem of training boosted classifiers subject to straggling or flips at classification time. We propose two approaches: one based on minimizing the usual exponential loss, but in expectation over the classifier errors, and one based on defining and minimizing a new worst-case loss for a specified bound on the number of unreliable classifiers. © 2021 IEEE.
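The record contains no code, but the first approach in the abstract can be illustrated with a minimal sketch. Under the assumption that each weak classifier's output flips sign independently with probability eps, the expected exponential loss factors across classifiers; the ensemble, weights, and accuracy below are hypothetical placeholders, not the paper's actual construction:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: n samples with labels in {-1, +1}, T weak-classifier outputs each.
n, T = 200, 5
y = rng.choice([-1, 1], size=n)
# Hypothetical weak classifiers, each correct on ~70% of samples.
H = np.where(rng.random((n, T)) < 0.7, y[:, None], -y[:, None])

alpha = np.ones(T)  # placeholder ensemble weights
eps = 0.1           # assumed i.i.d. per-classifier flip probability

# Noiseless exponential loss: mean of exp(-y * sum_t alpha_t h_t(x)).
margin = (H * alpha).sum(axis=1) * y
loss_clean = np.exp(-margin).mean()

# Expected loss under independent sign flips: for each classifier t,
#   E[exp(-y alpha_t h~_t)] = (1-eps) exp(-y alpha_t h_t) + eps exp(+y alpha_t h_t),
# and independence lets the expectation factor as a product over t.
per_term = ((1 - eps) * np.exp(-y[:, None] * alpha * H)
            + eps * np.exp(y[:, None] * alpha * H))
loss_flip = per_term.prod(axis=1).mean()

print(loss_clean, loss_flip)
```

Minimizing `loss_flip` instead of `loss_clean` with respect to `alpha` is the spirit of the "expectation over classifier errors" approach; at `eps = 0` the two losses coincide.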
URI
http://hdl.handle.net/20.500.11750/46915
DOI
10.1109/isit45174.2021.9517745
Publisher
Institute of Electrical and Electronics Engineers Inc.
Files in This Item: none.

Appears in Collections:
Department of Electrical Engineering and Computer Science > Information, Computing, and Intelligence Laboratory > 2. Conference Papers


Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.
