The main premise of federated learning (FL) is that machine learning model
updates are computed locally to preserve user data privacy. By design, this
approach ensures that user data never leaves the perimeter of the users'
devices.
FLAME is a defense framework for federated learning that minimizes the amount of noise required to eliminate adversarial backdoors. It combines model clustering with weight clipping, preserving the benign performance of the aggregated model while effectively removing backdoored contributions.
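The pipeline described above can be sketched as follows. This is a minimal, simplified illustration, not the reference implementation: the helper name `flame_aggregate`, the use of median-based cosine-distance filtering in place of FLAME's actual density-based clustering (HDBSCAN), and the noise scale `lam` are all assumptions made for this sketch.

```python
import numpy as np

def flame_aggregate(global_w, client_ws, lam=0.001, rng=None):
    """Simplified FLAME-style aggregation sketch.

    Assumed simplification: majority filtering by median cosine
    distance stands in for the density-based clustering (HDBSCAN)
    used in the actual FLAME framework.
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    updates = [w - global_w for w in client_ws]

    # 1. Filtering: keep clients whose update direction is
    #    cosine-close to the majority of other updates.
    dirs = [u / (np.linalg.norm(u) + 1e-12) for u in updates]
    n = len(dirs)
    dist = np.array([[1.0 - dirs[i] @ dirs[j] for j in range(n)]
                     for i in range(n)])
    med = np.median(dist, axis=1)
    accepted = [i for i in range(n) if med[i] <= np.median(med)]

    # 2. Clipping: scale each accepted update down to the
    #    median L2 norm of the accepted updates.
    s = np.median([np.linalg.norm(updates[i]) for i in accepted])
    clipped = [updates[i] * min(1.0, s / (np.linalg.norm(updates[i]) + 1e-12))
               for i in accepted]

    # 3. Aggregation with Gaussian noise scaled to the clipping
    #    bound, so the injected noise is no larger than needed.
    agg = global_w + np.mean(clipped, axis=0)
    return agg + rng.normal(0.0, lam * s, size=agg.shape)
```

In this toy setting, a single backdoored update pointing away from the benign majority is filtered out before aggregation, and the clipping bound caps the influence of any update that survives filtering.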