
LERNA: Secure Single-Server Aggregation via Key-Homomorphic Masking
Explore secure single-server aggregation through key-homomorphic masking, addressing leakage in federated learning, with a focus on client-server interactions, system assumptions, robustness, and privacy guarantees. Prior works and baseline comparisons are also discussed.
Presentation Transcript
LERNA: Secure Single-Server Aggregation via Key-Homomorphic Masking
Hanjun Li¹, Huijia (Rachel) Lin¹, Antigoni Polychroniadou², Stefano Tessaro¹
¹University of Washington  ²J.P. Morgan AI Research
Motivation: Federated Learning [Bonawitz et al. 2019], [Kairouz et al. 2021]
Clients train local models x_1, …, x_n. The server aggregates the global model Aggregate(x_1, …, x_n). This can take many iterations.
Problem: the local models x_1, …, x_n are still leaked to the server!
Abstraction: Single Server Aggregation [Bonawitz et al. 2017]
Many clients with inputs x_1, …, x_n (high-dimensional integer vectors). Clients only interact with the server.
The server learns x_Σ = sum(x_1, …, x_n), and nothing else.
This Work: Single-Server Repeated Aggregation
Clients hold new inputs x_1^t, …, x_n^t at each iteration t; the server learns their sum at every iteration.
The same set of clients participates throughout, which allows for a reusable setup round.
System Model and Assumptions
Assumption: < δ fraction dropout across all iterations, < γ fraction corruption, with δ + γ < 1.
Client dropouts: can happen at any point, clients can rejoin later, statically chosen by the adversary.
Corrupted server with colluding clients: the same corruption set throughout, statically chosen.
Assume a PKI setup.
Robustness and Privacy Guarantees
Robustness: if a client drops out during the 1st round, its input is not included in the aggregation; if it drops out after the 1st round, its input is included.
Privacy: the adversary learns a single sum over at least a (1 − δ − γ) fraction of the inputs, and nothing else.
Prior Works and This Work
(n: # clients; ℓ: input dimension; R: modulus; log n, log R < λ. The full table also compares online communication, server computation, and client computation.)
[BBGLR20]: 4 rounds per aggregation, threshold δ + 2γ < 1.
MicroFedML [GPSBB22]: 3 rounds per aggregation, threshold δ + 2γ < 1.
LERNA (this work): 3 rounds per aggregation, threshold δ + γ < 1.
Baseline: repeat the state-of-the-art single-session protocol [BBGLR20]. 4 rounds per aggregation (malicious setting); leaks multiple non-overlapping sums, each over only O(log n + λ) inputs.
MicroFedML [GPSBB22]: reusable setup + 3 rounds per aggregation; the server must compute a discrete log, so only a small modulus R is supported.
This work, LERNA: reusable setup + 3 rounds per aggregation; fast server computation.
Other Related Works
SASH+ [LCYFLL22]: improves computation over [BBGLR20] by reducing it with a seed-homomorphic PRG; semi-honest only; more rounds.
ACORN [BGLLMRY23]: adds input validation to [BBGLR20]; still 4 rounds.
Strawman: w/ linear secret sharing
A warm-up to our actual protocol; it is inefficient as is and doesn't utilize the setup round.
It is based on linear secret sharing, which is different from the pairwise-masking paradigm [Bonawitz et al. 2017, BBGLR20].
We focus on the semi-honest setting.
Strawman: w/ linear secret sharing — O(n²) total communication!
1. Every client C_i secret-shares a one-time pad r_i^t with all other clients: {r_{i,j}^t}_j ← Share(r_i^t), with each share r_{i,j}^t encrypted under C_j's public key; the server forwards the encrypted shares.
2. Every client C_i sends the masked input y_i^t = x_i^t + r_i^t. The server computes z^t = Σ_i y_i^t = Σ_i x_i^t + Σ_i r_i^t.
3. The server sends the set of online clients U; each client C_j replies with the aggregated share s_j^t = Σ_{i∈U} r_{i,j}^t.
4. (Locally) the server recovers the aggregation result: x_U^t = Σ_{i∈U} y_i^t − Recon({s_j^t}_j). By linearity, Recon({s_j^t}_j) = Recon({Σ_{i∈U} r_{i,j}^t}_j) = Σ_{i∈U} r_i^t.
Main ingredient of our protocol: a key-homomorphic masking scheme, to avoid secret sharing anew for every iteration t. A sketch of the strawman follows below.
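To make the strawman concrete, here is a minimal Python sketch of one iteration under simplifying assumptions: semi-honest parties, no dropouts, and a toy n-out-of-n additive sharing of the one-time pads (the protocol above would use a threshold linear scheme such as Shamir so that dropouts can be tolerated). All names and parameters are illustrative, not from the paper.

```python
# Minimal sketch of the strawman (semi-honest, no dropouts), using toy
# n-out-of-n additive sharing over Z_q for the one-time pads.
import secrets

q = 2**32          # modulus (assumed large enough to hold the sum)
n, ell = 5, 4      # number of clients, input dimension

def rand_vec():
    return [secrets.randbelow(q) for _ in range(ell)]

def add(u, v):
    return [(a + b) % q for a, b in zip(u, v)]

def sub(u, v):
    return [(a - b) % q for a, b in zip(u, v)]

# Clients' inputs for this iteration.
x = [rand_vec() for _ in range(n)]

# Step 1: every client i samples a one-time pad r_i and secret-shares it.
r = [rand_vec() for _ in range(n)]
shares = [[rand_vec() for _ in range(n - 1)] for _ in range(n)]
for i in range(n):                       # last share fixes the sum to r_i
    partial = [0] * ell
    for s in shares[i]:
        partial = add(partial, s)
    shares[i].append(sub(r[i], partial))

# Step 2: every client sends its masked input y_i = x_i + r_i.
y = [add(x[i], r[i]) for i in range(n)]

# Step 3: server announces the online set U; client j replies the
# aggregated share s_j = sum_{i in U} (i's share held by j).
U = list(range(n))                       # everyone online in this toy run
s = [[0] * ell for _ in range(n)]
for j in range(n):
    for i in U:
        s[j] = add(s[j], shares[i][j])

# Step 4: server reconstructs the sum of pads (by linearity) and unmasks.
pad_sum = [0] * ell
for j in range(n):
    pad_sum = add(pad_sum, s[j])
masked_sum = [0] * ell
for i in U:
    masked_sum = add(masked_sum, y[i])
result = sub(masked_sum, pad_sum)

expected = [sum(x[i][d] for i in U) % q for d in range(ell)]
assert result == expected
```

The point of step 4 is the linearity of the sharing: summing the per-holder shares of different pads yields shares of the summed pad, so a single reconstruction suffices.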
Key-homomorphic Masking (Simplified)
Secret key k, distinct tag t (one per iteration).
Compute the masked input: y = Mask(k, t) + x. Unmask the result: x = y − Mask(k, t).
Additive key-homomorphism: Mask(k_1, t) + Mask(k_2, t) = Mask(k_1 + k_2, t).
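The following toy (and insecure) instantiation satisfies this interface exactly and is only meant to illustrate the syntax and the homomorphism: derive a public per-tag vector a_t by hashing the tag and set Mask(k, t) = k · a_t mod q. The names (a_of_tag, mask, keygen), the modulus, and the dimension are our own illustrative choices; the paper instead builds the mask from LWR or DCR.

```python
# Toy key-homomorphic mask: Mask(k, t) = k * a_t (mod q), where a_t is
# derived from the tag t. This only shows the interface and the additive
# key-homomorphism; it is NOT a secure construction.
import hashlib, secrets

q, ell = 2**61 - 1, 4   # modulus and output dimension (illustrative)

def a_of_tag(t):
    """Public per-tag vector a_t, derived by hashing (tag, coordinate)."""
    return [int.from_bytes(hashlib.sha256(f"{t}|{d}".encode()).digest(), "big") % q
            for d in range(ell)]

def mask(k, t):
    return [(k * a) % q for a in a_of_tag(t)]

def keygen():
    return secrets.randbelow(q)

k1, k2, t = keygen(), keygen(), "iteration-7"
x1 = [1, 2, 3, 4]

# Masked input and unmasking with the same key and tag.
y1 = [(m + v) % q for m, v in zip(mask(k1, t), x1)]
assert [(a - b) % q for a, b in zip(y1, mask(k1, t))] == [v % q for v in x1]

# Additive key-homomorphism: Mask(k1, t) + Mask(k2, t) = Mask(k1 + k2, t).
lhs = [(a + b) % q for a, b in zip(mask(k1, t), mask(k2, t))]
assert lhs == mask((k1 + k2) % q, t)
```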
LERNA w/ Key-homomorphic Masking
Compared to the strawman, the fresh one-time pad r_i^t is replaced by Mask(k_i, t) under a long-term masking key k_i, so only the keys need to be secret-shared.
Setup (reusable): every client C_i secret-shares its masking key k_i: {k_{i,j}}_j ← Share(k_i).
1. Every client C_i sends the masked input y_i^t = Mask(k_i, t) + x_i^t. The server computes z^t = Σ_i y_i^t = Σ_i Mask(k_i, t) + Σ_i x_i^t.
2. The server sends the set of online clients U; each client C_j computes the aggregated key share k_{U,j} = Σ_{i∈U} k_{i,j} and replies h_j^t = Mask(k_{U,j}, t).
3. (Locally) the server recovers the aggregation result: x_U^t = Σ_{i∈U} y_i^t − Recon({h_j^t}_j). By key-homomorphism, Recon({Mask(k_{U,j}, t)}_j) = Mask(Recon({k_{U,j}}_j), t) = Mask(Σ_{i∈U} k_i, t). A sketch of one iteration follows below.
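Below is a self-contained Python sketch of one such iteration, reusing the toy exactly-homomorphic mask from above and a toy n-of-n additive sharing of the keys to a small committee, with every party online. It illustrates the message flow only (all identifiers are ours); the actual construction uses an approximately homomorphic LWR-based mask and a flat threshold sharing, as described next.

```python
# One LERNA-style aggregation iteration with a toy (insecure, exactly
# homomorphic) mask and a toy additive sharing of the masking keys to a
# small committee. Illustrative sketch only.
import hashlib, secrets

q, ell = 2**61 - 1, 4
n, m = 6, 3                       # clients, committee size

def a_of_tag(t):
    return [int.from_bytes(hashlib.sha256(f"{t}|{d}".encode()).digest(), "big") % q
            for d in range(ell)]

def mask(k, t):
    return [(k * a) % q for a in a_of_tag(t)]

def vadd(u, v): return [(a + b) % q for a, b in zip(u, v)]
def vsub(u, v): return [(a - b) % q for a, b in zip(u, v)]

# --- Setup (reusable): every client shares its masking key with the committee.
keys = [secrets.randbelow(q) for _ in range(n)]
key_shares = []                   # key_shares[i][j] = client i's share at member j
for k in keys:
    parts = [secrets.randbelow(q) for _ in range(m - 1)]
    parts.append((k - sum(parts)) % q)
    key_shares.append(parts)

# --- Iteration t.
t = "iteration-42"
x = [[secrets.randbelow(1000) for _ in range(ell)] for _ in range(n)]

# Round 1: clients send masked inputs y_i = Mask(k_i, t) + x_i.
y = [vadd(mask(keys[i], t), x[i]) for i in range(n)]

# Round 2: server announces the online set U; committee member j replies
# h_j = Mask(sum_{i in U} k_{i,j}, t), the mask under its aggregated key share.
U = list(range(n))
h = [mask(sum(key_shares[i][j] for i in U) % q, t) for j in range(m)]

# Round 3: server sums masked inputs, reconstructs Mask(sum of keys, t)
# (here a plain sum of the h_j, i.e. all-ones reconstruction), and unmasks.
z = [0] * ell
for i in U:
    z = vadd(z, y[i])
recon_mask = [0] * ell
for hj in h:
    recon_mask = vadd(recon_mask, hj)
result = vsub(z, recon_mask)

assert result == [sum(x[i][d] for i in U) % q for d in range(ell)]
```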
Key-homomorphic Masking from LWR [BLMR13]
Compute the masked input: y = Mask(k, t) + x, with Mask(k, t) = ⌊A_t · k⌉_p, where the tag t determines a public matrix A_t and ⌊·⌉_p denotes rounding from Z_q down to Z_p.
Unmask the result: x = y − Mask(k, t).
LWR Assumption [BPR12]: for a random matrix A over Z_q and a random secret s, ⌊A · s⌉_p is indistinguishable from uniform.
Approximate additive homomorphism: Mask(k_1, t) + Mask(k_2, t) = ⌊A_t · k_1⌉_p + ⌊A_t · k_2⌉_p = ⌊A_t · (k_1 + k_2)⌉_p + e = Mask(k_1 + k_2, t) + e, with rounding error |e| ≤ 1.
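Here is a minimal sketch of such a rounded mask with toy parameters (the moduli and dimensions are chosen only for illustration, far from a secure parameterization), checking that the sum of two masks differs from the mask under the summed key by at most 1 per coordinate.

```python
# Toy LWR-style mask: Mask(k, t) = round(A_t @ k) from Z_q down to Z_p,
# where A_t is derived from the tag. Shows the approximate additive
# key-homomorphism; parameters are NOT secure.
import hashlib, secrets

q, p = 2**32, 2**16       # "big" and "small" moduli
dim, ell = 8, 4           # key dimension, output dimension

def A_of_tag(t):
    """Public tag-derived matrix A_t in Z_q^{ell x dim}."""
    def elem(r, c):
        h = hashlib.sha256(f"{t}|{r}|{c}".encode()).digest()
        return int.from_bytes(h, "big") % q
    return [[elem(r, c) for c in range(dim)] for r in range(ell)]

def rnd(v):
    """Rounding Z_q -> Z_p: scale by p/q and round to nearest."""
    return ((v * p + q // 2) // q) % p

def mask(k, t):
    A = A_of_tag(t)
    return [rnd(sum(A[r][c] * k[c] for c in range(dim)) % q) for r in range(ell)]

k1 = [secrets.randbelow(q) for _ in range(dim)]
k2 = [secrets.randbelow(q) for _ in range(dim)]
t = "iteration-3"

lhs = [(a + b) % p for a, b in zip(mask(k1, t), mask(k2, t))]
rhs = mask([(a + b) % q for a, b in zip(k1, k2)], t)

# Approximate homomorphism: the two sides differ by a rounding error of
# absolute value at most 1 in each coordinate.
for a, b in zip(lhs, rhs):
    assert min((a - b) % p, (b - a) % p) <= 1
```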
LERNA w/ Approx. Key-homomorphic Masking
The protocol is as before. Setup: every client C_i secret-shares its masking key k_i. 1. Every client C_i sends the masked input y_i^t. 2. The server sends the online set U and each committee client C_j replies h_j^t. 3. (Locally) the server recovers the aggregation result x_U^t.
With the approximate mask, the server computes x_U^t = Σ_{i∈U} y_i^t − Recon({h_j^t}_j) = Σ_{i∈U} x_i^t + e, where the error |e| grows with the size of the coefficients of Recon!
⇒ We need flat secret sharing with 0-1 reconstruction coefficients (see the numeric illustration below).
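The following small numeric sketch (ours, not from the paper) shows why coefficient size matters: each committee response carries an error of absolute value at most 1, and reconstruction multiplies it by the reconstruction coefficients. Shamir-style Lagrange coefficients mod q typically blow the error up to a huge field element, while 0/1 coefficients keep it bounded by the committee size.

```python
# Why 0/1 reconstruction coefficients matter: the per-response rounding
# error e_j (|e_j| <= 1) gets multiplied by the reconstruction coefficient
# c_j. With Lagrange coefficients mod q the error term becomes a
# random-looking element of Z_q; with 0/1 coefficients it stays small.
import random

q = 2**61 - 1                      # prime modulus (so inverses exist)
points = [1, 2, 3, 4, 5]           # committee members' evaluation points

def lagrange_coeff(j):
    """Lagrange coefficient for interpolating at 0 from the given points."""
    num, den = 1, 1
    for m in points:
        if m != points[j]:
            num = (num * (-m)) % q
            den = (den * (points[j] - m)) % q
    return (num * pow(den, q - 2, q)) % q

errors = [random.choice([-1, 0, 1]) for _ in points]   # per-response error

shamir_error = sum(lagrange_coeff(j) * e for j, e in enumerate(errors)) % q
flat_error = sum(1 * e for e in errors)                # 0/1 coefficients

print("Shamir-style error term mod q:", shamir_error)  # typically astronomically large
print("Flat (0/1) error term:        ", flat_error)    # bounded by committee size
```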
Flat Secret Sharing
Setup: every client C_i distributes shares of its key, {k_{i,j}}_j, to the share holders.
The existing scheme [DT06] has total share size O(n^5.3); sharing with all other clients makes the communication cost for the setup prohibitive (for large n).
[DT06] with a random O(λ) committee: total share size O(λ^5.3), so the communication cost for the setup is reduced. This introduces a negligible failure probability and relies on the static corruption and dropout model.
Our paper: an O(λ²) committee with total share size O(λ²), exploiting a gap (constant fraction) between the reconstruction and privacy thresholds. It is essentially the same scheme, with a more careful analysis.
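As a point of reference for what "flat" means, here is a toy t-out-of-m replicated (CNF-style) secret sharing, in which any qualified set reconstructs by simply adding up one copy of each additive piece, so all reconstruction coefficients are 0 or 1. This is not the [DT06] scheme nor the paper's committee construction (replicated sharing has share size exponential in the committee size); it is only a small self-contained illustration, and all names are ours.

```python
# Toy replicated (CNF) t-out-of-m secret sharing: the secret is split into
# additive pieces, one per maximal unqualified set; any t parties
# reconstruct by *adding* pieces, i.e. with 0/1 coefficients only.
from itertools import combinations
import secrets

q = 2**61 - 1
m, t = 5, 3                        # committee size, reconstruction threshold

def share(secret):
    """Return shares[j] = dict of additive pieces held by committee member j."""
    bad_sets = list(combinations(range(m), t - 1))   # maximal unqualified sets
    pieces = {B: secrets.randbelow(q) for B in bad_sets[:-1]}
    pieces[bad_sets[-1]] = (secret - sum(pieces.values())) % q
    return [{B: v for B, v in pieces.items() if j not in B} for j in range(m)]

def reconstruct(shares, who):
    """Any t members can add up one copy of every piece (0/1 coefficients)."""
    seen, total = set(), 0
    for j in who:
        for B, v in shares[j].items():
            if B not in seen:
                seen.add(B)
                total = (total + v) % q
    return total

secret = 123456789
shares = share(secret)
assert reconstruct(shares, [0, 2, 4]) == secret        # any 3 members suffice
assert reconstruct(shares, [1, 2, 3]) == secret
```

Because every piece enters the sum with coefficient 0 or 1, the error carried by each response is never amplified during reconstruction, which is exactly the property the approximate mask needs.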
LERNA Under LWR
Approximate key-homomorphic masking (from LWR) + flat linear secret sharing ⇒ the LERNA protocol.
LERNA Under DCR (See Paper)
Key-homomorphic masking (from DCR)* + integer linear secret sharing ⇒ the LERNA protocol.
*Requires a trusted setup to generate the public parameters.
LERNA: Recap
Setup: every client C_i secret-shares its masking key k_i with the committee.
1. Every client C_i sends the masked input y_i^t.
2. The server sends the online set U; committee clients reply h_j^t.
3. (Locally) the server recovers the aggregation result x_U^t.
Features: low round complexity per aggregation; fast server computation (0-1 reconstruction of a single mask); lightweight non-committee clients.
Bottleneck: the committee clients. They have a large storage cost (n shares of secret keys) and an additional computation cost (aggregating O(n) key shares).
Follow-up Work: Flamingo [MWAPR23]
LERNA: 3 rounds; static corruption and dropout; δ + γ < 1; faster computation.
Flamingo: 3 rounds; static corruption and dropout; δ + 2γ < 1/3; much smaller storage for the committee.
Open questions: Can we achieve the best of both worlds? Can we avoid the static assumptions?