
What Is Federated Learning?

Definition

Federated learning is a distributed machine learning approach in which a model is trained across multiple decentralized devices or servers. Each participant holds local data that never leaves its source, preserving data privacy while still producing a capable global model.

How Federated Learning Works

Traditional AI training requires centralizing all data in one place, which raises privacy, legal, and security concerns. Federated learning solves this by training local models on each device or institution's data, then only sharing the model updates (gradients) — not the raw data — with a central server. The server aggregates these updates to improve the global model, which is then sent back to all participants. This approach is crucial in healthcare (training on hospital data without sharing patient records), finance (learning from bank data without exposing transactions), and mobile devices (improving keyboard prediction without uploading what you type).
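The train-locally, aggregate-centrally loop described above can be sketched in a few lines. The following is a minimal, self-contained illustration of FedAvg-style aggregation using simple linear regression as each client's local task; the function names, data, and hyperparameters are illustrative, not from any particular federated learning framework:

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's local training step: gradient descent on its private data."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        w -= lr * grad
    return w  # only the updated weights leave the client, never (X, y)

def federated_round(global_w, clients):
    """Server broadcasts global weights, then averages the returned updates."""
    updates = [local_update(global_w, X, y) for X, y in clients]
    sizes = [len(y) for _, y in clients]
    # Weight each client's update by its local dataset size (as in FedAvg)
    return np.average(updates, axis=0, weights=sizes)

# Three simulated clients, each with private data generated from the same
# underlying relationship y = X @ [2, -1] plus noise
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for n in (50, 80, 120):
    X = rng.normal(size=(n, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=n)
    clients.append((X, y))

w = np.zeros(2)
for _ in range(20):  # 20 communication rounds
    w = federated_round(w, clients)
print(w)  # converges toward [2.0, -1.0] without any client sharing raw data
```

Note that only model parameters cross the network; each client's `(X, y)` stays local, which is the entire privacy premise of the technique. Real systems add secure aggregation and differential privacy on top of this basic loop.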

Real-World Examples

1. Apple training its predictive keyboard model across millions of iPhones without collecting users' typed text

2. Multiple hospitals collaboratively training an AI diagnostic model without sharing patient records across institutions

3. Banks training a fraud detection model across their networks without exposing individual transaction data to competitors
