Securing Federated Learning Against Attacks

A Communication-Efficient Approach for Byzantine-Resilient Optimization

CyBeR-0 introduces a novel zero-order optimization method that protects federated learning systems from Byzantine attacks while significantly reducing communication costs.

  • Provides robust aggregation for heterogeneous client data
  • Maintains stable performance while transmitting only a few scalars
  • Offers convergence guarantees for non-convex objectives
  • Successfully tested on standard learning tasks and LLM fine-tuning
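The paper's exact algorithm is not reproduced here, but the two ingredients the summary names, zero-order estimates that cost only a few scalars of communication and a Byzantine-robust aggregator, can be sketched as follows. This is an illustrative simulation under stated assumptions: clients and server share a per-round random seed (so the perturbation direction need not be transmitted), each client sends one two-point finite-difference scalar, and the server uses a median aggregator; the client losses, learning rate, and the single Byzantine client are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 10
w = np.zeros(dim)  # global model parameters

# Hypothetical heterogeneous clients: client k minimizes ||w - target_k||^2
targets = [rng.normal(k, 0.5, dim) for k in range(5)]

def client_loss(w, target):
    return float(np.sum((w - target) ** 2))

def client_scalar(w, target, direction, mu=1e-3):
    # Two-point zero-order estimate of the directional derivative.
    # Only this single scalar is sent to the server.
    return (client_loss(w + mu * direction, target)
            - client_loss(w - mu * direction, target)) / (2 * mu)

lr = 0.01
initial_loss = float(np.mean([client_loss(w, t) for t in targets]))

for step in range(300):
    # Shared seed: server and clients derive the same random direction,
    # so no model-sized vector is ever communicated.
    direction = np.random.default_rng(step).standard_normal(dim)
    scalars = [client_scalar(w, t, direction) for t in targets]
    scalars.append(1e6)  # one Byzantine client reports an arbitrary value
    # Median aggregation bounds the attacker's influence on the update.
    g = float(np.median(scalars))
    w -= lr * g * direction

final_loss = float(np.mean([client_loss(w, t) for t in targets]))
```

Despite the Byzantine scalar being six orders of magnitude larger than honest reports, the median ignores it and the average honest loss drops; replacing `np.median` with `np.mean` lets the single attacker destroy training, which illustrates why robust aggregation is essential.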

This research matters for distributed learning environments where malicious actors may compromise client nodes; by defending against such attacks, it enables safer deployment of federated learning in sensitive applications.

Byzantine-Resilient Zero-Order Optimization for Communication-Efficient Heterogeneous Federated Learning