Efficient Byzantine-robust distributed inference with regularization: a trade-off between compression and adversary
From MaRDI portal
Publication: 6571196
DOI: 10.1016/j.ins.2024.121010
MaRDI QID: Q6571196
Le Chang, Guang Yang, Xing-Cai Zhou, Shao-Gao Lv
Publication date: 11 July 2024
Published in: Information Sciences
Keywords: compression, distributed learning, adversary, communication-efficient, Byzantine-robust, statistical error rate
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- Distributed testing and estimation under sparse high dimensional models
- The landscape of empirical risk for nonconvex losses
- Distributed secure state estimation for cyber-physical systems under sensor attacks
- Byzantine-resilient distributed state estimation: a min-switching approach
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Communication-Efficient Distributed Statistical Inference
- Fault-Tolerant Multi-Agent Optimization