Private Information Retrieval with Client Preprocessing: Construction and Bounds

An overview of Private Information Retrieval (PIR) with client-side preprocessing: information-theoretic constructions, lower bounds, and constructions under assumptions such as DDH, LWE, and QR. The presentation covers the trade-offs among client efficiency, server efficiency, hint size, communication, and computation, together with known results (negative and positive) and open questions.

  • Private Information Retrieval
  • Client Preprocessing
  • Information-Theoretic Constructions
  • Bounds
  • Communication

Presentation Transcript


  1. PIR with Client-Side Preprocessing: Information-Theoretic Constructions and Lower Bounds Yuval Ishai, Elaine Shi, Daniel Wichs

  2. Private Information Retrieval (PIR) [CGKS95,KO00]. The server holds DB ∈ {0,1}^N; the client holds an index i ∈ [N] and wants to retrieve DB[i] without revealing i. Trivial solution: the server sends the entire DB. Goal: communication << N.
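
A minimal sketch of the single-server PIR syntax, for orientation only; the function names (query, answer, recover) are illustrative, not the paper's notation. The trivial instantiation below achieves privacy by shipping the whole database, which is exactly what the communication goal rules out.

    # Trivial single-server PIR: the query reveals nothing because it is empty,
    # but the answer is the entire database, i.e. N bits of communication.
    from typing import List, Tuple

    def query(i: int) -> Tuple[bytes, int]:
        """Client: produce a message for the server plus local state."""
        return b"", i                       # send nothing, remember i locally

    def answer(db: List[int], msg: bytes) -> List[int]:
        """Server: respond to the client's message (here: send all of DB)."""
        return list(db)

    def recover(state: int, ans: List[int]) -> int:
        """Client: decode DB[i] from the server's answer."""
        return ans[state]

    db = [0, 1, 1, 0, 1, 0, 0, 1]
    msg, st = query(5)
    assert recover(st, answer(db, msg)) == db[5]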

  3. Private Information Retrieval (PIR) [CGKS95,KO00]. Constructions exist under many assumptions (DDH, LWE, QR, ...), and there is much recent interest from both theory and practice. Two downsides of PIR: 1. The server needs to read the entire DB during each protocol execution. 2. It requires public-key crypto. Relax the model by allowing preprocessing!

  4. PIR with Client Preprocessing [CK20, CHK22, ..., ZPSZ24, ...]. Offline, the client runs Preprocess on DB ∈ {0,1}^N (a streaming algorithm) and stores a small hint. Online, the stateful client can run arbitrarily many PIR protocols, each retrieving DB[i] for an index i ∈ [N] of its choice. Client efficiency: sublinear hint size and communication. Server efficiency: sublinear server computation.
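
A sketch of the client-preprocessing interface implied by this slide; the class and method names are illustrative, not the paper's formal syntax. The point is that Preprocess is a single streaming pass producing a sublinear-size hint, after which the client answers arbitrarily many queries statefully.

    # Abstract interface for PIR with client preprocessing (illustrative names).
    from typing import Iterable, List

    class PreprocessingPIRClient:
        def preprocess(self, db_stream: Iterable[int]) -> None:
            """One streaming pass over DB; store only a sublinear-size hint."""
            raise NotImplementedError

        def query(self, i: int) -> bytes:
            """Produce the next query message (may consume/update the hint)."""
            raise NotImplementedError

        def recover(self, ans: bytes) -> int:
            """Decode DB[i] from the server's answer."""
            raise NotImplementedError

    class PIRServer:
        def __init__(self, db: List[int]):
            self.db = db

        def answer(self, msg: bytes) -> bytes:
            """Ideally runs in time sublinear in len(self.db)."""
            raise NotImplementedError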

  5. PIR with Client Preprocessing [CK20, CHK22, ..., ZPSZ24, ...]. Known results: Negative: max(hint size, server computation) = Ω(√N) (a client-vs-server tradeoff). Positive: Õ(√N) hint size, communication, and computation from OWFs; communication can be reduced to polylog(N) under stronger assumptions (LWE).
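
One way to read the client-vs-server tradeoff, assuming the product-form lower bound of [CK20] (client hint size S times online server time T must be at least roughly N):

    \[
      S \cdot T \;\gtrsim\; N
      \quad\Longrightarrow\quad
      \max(S, T) \;\ge\; \sqrt{S \cdot T} \;\gtrsim\; \sqrt{N},
    \]

so, up to logarithmic factors, no scheme in this model can make both the hint and the per-query server work o(√N).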

  6. Our Work. Can we get PIR with client preprocessing information-theoretically? Negative results: max(hint size, communication) = Ω(√N) (a limit on client efficiency); overcoming this implies OWFs (or even hard-on-average problems in SZK)*. Positive results: Scheme 1: N^{1/2+o(1)} hint, Õ(N^{1/2}) communication, N^{1+o(1)} server/client computation. Scheme 2: Õ(N^{2/3}) hint, Õ(N^{1/2}) comm/client computation, Õ(N^{2/3}) server computation. Open questions: An optimal i.t. scheme with Õ(N^{1/2}) hint, Õ(N^{1/2}) communication, and Õ(N^{1/2}) server/client computation? Better computational schemes under weaker assumptions than PIR/PKE?

  7. Scheme 1: optimal hint vs. communication. Starting point: a 2-server i.t. PIR scheme with N^{o(1)} communication [Dvir-Gopi 16]. The client sends q1 to server 1 and q2 to server 2, receives a1 = Resp1(q1, DB) and a2 = Resp2(q2, DB), and recovers DB[i] from (a1, a2); each q_b individually hides i. Two properties we use: (1) one can efficiently sample q1 ← $ uniformly and then q2 ← (query i | q1); (2) Resp1(q1, ·) is linear in DB.
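
For concreteness, here is a toy 2-server linear PIR with the two properties just listed; it is the classic XOR scheme, not Dvir-Gopi, so its communication is N bits per query rather than N^{o(1)}, but q1 is uniform on its own, q2 is determined by (i, q1), and each response is linear in DB.

    # Toy 2-server linear PIR (classic XOR scheme), for illustration only.
    import secrets

    def sample_q1(n: int) -> list:
        # q1 is a uniformly random bit vector, so it reveals nothing about i.
        return [secrets.randbelow(2) for _ in range(n)]

    def q2_from(i: int, q1: list) -> list:
        # q2 = q1 XOR e_i; marginally q2 is also uniform, so it hides i too.
        q2 = list(q1)
        q2[i] ^= 1
        return q2

    def resp(q: list, db: list) -> int:
        # Resp_b(q, DB) = <q, DB> mod 2: linear in DB, streaming-friendly.
        return sum(qj & dj for qj, dj in zip(q, db)) & 1

    db = [1, 0, 0, 1, 1, 0, 1, 0]
    i = 3
    q1 = sample_q1(len(db))
    q2 = q2_from(i, q1)
    assert resp(q1, db) ^ resp(q2, db) == db[i]   # <e_i, DB> = DB[i]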

  8. Scheme 1: optimal hint vs. communication. Preprocessing: hint := { (q1^1, a1^1), ..., (q1^L, a1^L) } with q1^j ← $ and a1^j = Resp1(q1^j, DB) for j ∈ [L] (L ≈ √N); since Resp1(q1, ·) is linear in DB, the hint can be computed via a streaming algorithm, and |hint| = N^{1/2+o(1)}. PIR protocol for query i: take a fresh (q1, a1) from the hint, send q2 ← (query i | q1), receive a2 = Resp2(q2, DB), and recover DB[i] from (a1, a2); this costs N^{o(1)} communication per query. During any epoch of L queries, the server streams DB and the client computes the updated hint for the next L queries, giving Õ(N^{1/2}) amortized communication.
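
A runnable sketch of Scheme 1's structure, built on the toy XOR scheme above rather than Dvir-Gopi (so the per-query message is N bits, not N^{o(1)}, and the parameters do not match the slide); the class names and the epoch_len parameter are illustrative. The client precomputes L (q1, a1) pairs in one streaming pass, spends one pair per query, and refreshes during the next streaming epoch.

    # Scheme 1 skeleton over the toy XOR 2-server PIR (illustrative names).
    import secrets
    from typing import Iterable, List, Tuple

    class Scheme1Client:
        def __init__(self, n: int, epoch_len: int):
            self.n, self.L = n, epoch_len
            self.hint: List[Tuple[List[int], int]] = []    # pairs (q1, a1)

        def preprocess(self, db_stream: Iterable[int]) -> None:
            """One streaming pass: the client plays 'server 1' on its own
            precomputed queries, which works because Resp1 is linear in DB."""
            q1s = [[secrets.randbelow(2) for _ in range(self.n)]
                   for _ in range(self.L)]
            a1s = [0] * self.L
            for j, bit in enumerate(db_stream):
                for t in range(self.L):
                    a1s[t] ^= q1s[t][j] & bit
            self.hint = list(zip(q1s, a1s))

        def query(self, i: int) -> List[int]:
            q1, a1 = self.hint.pop()        # each pair is used exactly once
            self._a1 = a1
            q2 = list(q1)
            q2[i] ^= 1                      # q2 = (query i | q1), uniform alone
            return q2

        def recover(self, a2: int) -> int:
            return self._a1 ^ a2            # DB[i]

    class Scheme1Server:
        def __init__(self, db: List[int]):
            self.db = db

        def answer(self, q2: List[int]) -> int:
            return sum(q & d for q, d in zip(q2, self.db)) & 1

        def stream_db(self) -> Iterable[int]:
            # Sent once per epoch of L queries so the client can refresh its
            # hint; amortized over the epoch this is |DB|/L bits per query.
            return iter(self.db)

    db = [secrets.randbelow(2) for _ in range(16)]
    client, server = Scheme1Client(len(db), epoch_len=4), Scheme1Server(db)
    client.preprocess(server.stream_db())
    for i in (3, 7, 11, 14):
        assert client.recover(server.answer(client.query(i))) == db[i]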

  9. Scheme 2: Sublinear server computation. Paper: Õ(N^{2/3}) hint, Õ(N^{1/2}) comm/client comp, Õ(N^{2/3}) server comp. This talk: Õ(N^{2/3}) hint, Õ(N^{2/3}) comm/client comp, Õ(N^{2/3}) server comp. Starting point: the PIANO scheme from OWFs [ZPSZ24], with Õ(N^{1/2}) hint, Õ(N^{1/2}) comm/client comp, and Õ(N^{1/2}) server comp. Component: IT-PIANO, which has hint = (R, P) with a large random part R of size Õ(N) and a small data-dependent part P of size Õ(N^{1/2}); client queries depend only on R.

  10. Scheme 2: Sublinear server computation. Rebalancing trick: convert IT-PIANO into a scheme with a small overall hint. 1. Parse DB as K = N^{1/3} sub-databases DB_1, ..., DB_K of size N^{2/3} each. 2. Run IT-PIANO on all DB_k in parallel, reusing the same random part R, so the hint is (R, P_1, ..., P_K). To query index j, the client computes q ← query(j, R) and the server returns a_k = Resp(q, DB_k) for every k ∈ [K]. Hint size and comm/comp = Õ(N^{2/3}).
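
A back-of-the-envelope accounting for the Õ(N^{2/3}) figures, plugging the IT-PIANO sizes from the previous slide into the rebalanced parameters (log factors suppressed):

    \[
    \begin{aligned}
    n &= N^{2/3} \text{ (sub-database size)}, \qquad K = N^{1/3} \text{ (number of sub-databases)},\\
    |R| &= \tilde{O}(n) = \tilde{O}(N^{2/3}) \quad \text{(shared across all $K$ instances)},\\
    \sum_{k \in [K]} |P_k| &= K \cdot \tilde{O}(\sqrt{n}) = N^{1/3} \cdot \tilde{O}(N^{1/3}) = \tilde{O}(N^{2/3}),\\
    \text{server work per query} &= K \cdot \tilde{O}(\sqrt{n}) = \tilde{O}(N^{2/3}).
    \end{aligned}
    \]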

  11. Negative Result. Max(hint size, communication) = Ω(√N) for i.t. constructions. Doing better gives mutual information amplification (MIA): Alice and Bob start with k bits of mutual information, talk over a public channel, and agree on a shared key with Shannon pseudoentropy > k; this implies OWFs. Proof sketch: DB ← {0,1}^N; Bob (the server) holds DB, and Alice (the client) holds hint ← Preprocess(DB) with |hint| = o(√N). Alice chooses a random chunk i of √N bits and uses PIR to retrieve all the bits of DB[chunk i], which becomes the shared key. By client privacy, Transcript(chunk i) ≈ Transcript($), and |Transcript| = o(N), so H(DB[chunk i] | Transcript($)) = Ω(√N).
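
A rough version of the entropy accounting behind this sketch (the precise pseudoentropy notion and log factors are glossed over): suppose both the hint and the per-query communication are o(√N), and split DB into √N chunks of √N bits each. Then

    \[
    \begin{aligned}
    |\mathsf{Transcript}| &\le \sqrt{N} \cdot o(\sqrt{N}) = o(N)
    \;\Longrightarrow\; H(\mathrm{DB} \mid \mathsf{Transcript}) \ge N - o(N),\\
    \text{so for an average chunk:} \quad
    H(\mathrm{DB}[\text{chunk } i] \mid \mathsf{Transcript}(\$)) &\ge \sqrt{N} - o(\sqrt{N}) = \Omega(\sqrt{N}),
    \end{aligned}
    \]

while Alice and Bob started with at most |hint| = o(√N) bits of mutual information, so agreeing on DB[chunk i] amplifies it.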

  12. Negative Result. Max(hint size, communication) = Ω(√N) for i.t. constructions; doing better gives mutual information amplification (MIA), where Alice and Bob start with k bits of mutual information, talk over a public channel, and agree on a shared key with Shannon pseudoentropy > k, which implies OWFs. The result above is more general: it allows server preprocessing / state, and it rules out truly information-theoretic read-only ORAM with o(√N) efficiency. For 1-round / data-oblivious schemes, doing better gives hard-on-average problems in SZK.

  13. Conclusion. Can we get PIR with client preprocessing information-theoretically? Negative results: max(hint size, communication) = Ω(√N); overcoming this implies OWFs (*or even hard-on-average problems in SZK). Positive results: Scheme 1: N^{1/2+o(1)} hint, Õ(N^{1/2}) communication, N^{1+o(1)} server/client computation. Scheme 2: Õ(N^{2/3}) hint, Õ(N^{1/2}) comm/client computation, Õ(N^{2/3}) server computation. Open questions: An optimal i.t. scheme with Õ(N^{1/2}) hint, Õ(N^{1/2}) communication, and Õ(N^{1/2}) server/client computation? Better computational schemes under weaker assumptions than PIR/PKE?
