Title
Boosting for Straggling and Flipping Classifiers
Issued Date
2021-07-17
Citation
Cassuto, Yuval. (2021-07-17). Boosting for Straggling and Flipping Classifiers. 2021 IEEE International Symposium on Information Theory, ISIT 2021, 2441–2446. doi: 10.1109/isit45174.2021.9517745
Type
Conference Paper
ISBN
9781538682098
ISSN
2157-8117
Abstract
Boosting is a well-known method in machine learning for combining multiple weak classifiers into one strong classifier. When used in a distributed setting, accuracy is hurt by classifiers that flip or straggle due to communication and/or computation unreliability. While unreliability in the form of noisy data is well treated in the boosting literature, unreliability of the classifier outputs has not been explicitly addressed. Protecting the classifier outputs with an error/erasure-correcting code requires reliable encoding of multiple classifier outputs, which is not feasible in common distributed settings. In this paper we address the problem of training boosted classifiers subject to straggling or flips at classification time. We propose two approaches: one based on minimizing the usual exponential loss in expectation over the classifier errors, and one based on defining and minimizing a new worst-case loss for a specified bound on the number of unreliable classifiers. © 2021 IEEE.
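
A minimal sketch of the first approach named in the abstract, minimizing the exponential loss in expectation over the classifier errors. The unreliability model below is an assumption for illustration only: each weak classifier's output is independently flipped with probability p or erased (straggling, read as 0) with probability q; none of the symbols (alpha_t, h_t, p, q) are taken from the paper.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Ensemble of $T$ weak classifiers $h_t(x) \in \{-1,+1\}$ with weights $\alpha_t$.
% Hypothetical channel (not from the paper): each output is flipped w.p. $p$,
% erased (straggler, read as $0$) w.p. $q$, and delivered intact w.p. $1-p-q$.
The expected exponential loss over a training set $\{(x_i, y_i)\}_{i=1}^{n}$ is
\begin{equation*}
\bar{L}(\alpha) = \sum_{i=1}^{n}
\mathbb{E}\!\left[\exp\!\Big(-y_i \sum_{t=1}^{T} \alpha_t \tilde{h}_t(x_i)\Big)\right],
\end{equation*}
which, by independence across the $T$ classifiers, factorizes into
\begin{equation*}
\bar{L}(\alpha) = \sum_{i=1}^{n} \prod_{t=1}^{T}
\Big[(1-p-q)\,e^{-y_i \alpha_t h_t(x_i)} + p\,e^{\,y_i \alpha_t h_t(x_i)} + q\Big].
\end{equation*}
Setting $p = q = 0$ recovers the usual AdaBoost exponential loss.
\end{document}

The second approach named in the abstract would instead replace the expectation with a maximum over error patterns affecting at most a specified number of classifiers; the paper's precise definition of that worst-case loss may differ from this reading.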
URI
http://hdl.handle.net/20.500.11750/46915
DOI
10.1109/isit45174.2021.9517745
Publisher
IEEE Information Theory Society

File Downloads

  • There are no files associated with this item.
