
Time-Bounded Kolmogorov Complexity and One-Way Functions
Explore the concept of time-bounded Kolmogorov complexity, one-way functions, and their relevance in complexity theory and cryptography. Understand the definitions, properties, and significance of one-way functions in various cryptographic applications like private-key encryption and digital signatures. Delve into the relationship between Kolmogorov complexity and cryptographic constructs such as pseudorandom generators and commitment schemes, and the implications for the existence of one-way functions.
Presentation Transcript
One-Way Functions and pKt Complexity: a new notion of time-bounded Kolmogorov complexity
Shuichi Hirahara (National Institute of Informatics), Igor C. Oliveira (University of Warwick), Zhenjian Lu (University of Warwick)
Slides by Zhenjian Lu
One-Way Functions
Easy: given x, compute f(x). Hard: given f(x), recover x.
One-Way Functions (against uniform adversaries)
Definition (One-Way Functions [Diffie-Hellman 76]): A family of efficiently computable functions f = { f_n : {0,1}^n → {0,1}^{m(n)} } is one-way if for every probabilistic polynomial-time algorithm A and every sufficiently large n,
Pr_{x ∼ {0,1}^n}[ A(f_n(x)) ∈ f_n^{-1}(f_n(x)) ] ≤ 1/n^{ω(1)}.
f is infinitely-often (i.o.) one-way if the above holds for infinitely many n.
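As a concrete illustration of the inversion game (added here; it is not part of the slides), the following Python sketch uses modular exponentiation with toy parameters as a stand-in candidate one-way function and estimates an adversary's success probability. The constants and the brute-force adversary are hypothetical choices made only to show the shape of the security experiment.

```python
import random

# Toy stand-in for a candidate one-way function: x -> G^x mod P.
# (Illustrative parameters only; real candidates use much larger moduli.)
P, G = 2**61 - 1, 3

def f(x: int) -> int:
    """The easy direction: computing f(x) takes polynomial time."""
    return pow(G, x, P)

def brute_force_adversary(y: int, n_bits: int):
    """A (hypothetical) inverter that tries every n-bit preimage.
    It always succeeds, but its running time grows like 2^n."""
    for x in range(2 ** n_bits):
        if f(x) == y:
            return x
    return None

def inversion_success_rate(n_bits: int, trials: int = 100) -> float:
    """Estimate Pr_{x ~ {0,1}^n}[ adversary outputs some preimage of f(x) ]."""
    wins = 0
    for _ in range(trials):
        x = random.randrange(2 ** n_bits)
        guess = brute_force_adversary(f(x), n_bits)
        if guess is not None and f(guess) == f(x):
            wins += 1
    return wins / trials

if __name__ == "__main__":
    # One-wayness only restricts *polynomial-time* adversaries: this
    # exponential-time inverter wins with probability 1, which does not
    # contradict the definition.
    print(inversion_success_rate(n_bits=12))
```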
One-Way Functions
One-way functions are both sufficient and necessary for: pseudorandom functions [GGM84], private-key encryption [GM84], digital signatures [Rompel 90], zero-knowledge proofs [GMW89], authentication schemes [FS90], commitment schemes [Naor 90], pseudorandom generators [HILL99], ...
Whether one-way functions exist is the most important open question in cryptography.
Meta-Complexity: the complexity of complexity.
Complexity theory and cryptography (NP vs BPP, DistNP vs AvgBPP, the existence of OWFs) versus the theory of Kolmogorov complexity.
Why? Import methods and perspectives from Kolmogorov complexity in order to investigate the main open problems of complexity theory and cryptography.
Kolmogorov Complexity
A string x and its shortest encoding.
Definition (Kolmogorov Complexity): K(x) = minimum length of a program p ∈ {0,1}* such that p outputs x.
Definition (Conditional Kolmogorov Complexity): K(x | y) = minimum length of a program p ∈ {0,1}* such that p outputs x given access to y.
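A few standard examples (added for intuition; they are not on the slides) show how these measures behave:

```latex
\begin{align*}
K(0^n) &\le \log n + O(1) && \text{(a program can store $n$ and print $n$ zeros)}\\
K(x \mid x) &= O(1)        && \text{(the program simply copies its auxiliary input)}\\
\max_{x \in \{0,1\}^n} K(x) &\ge n && \text{(there are fewer than $2^n$ programs of length $< n$)}
\end{align*}
```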
Time-Bounded Kolmogorov Complexity
Two notions of time-bounded Kolmogorov complexity: a fixed time bound, and Levin's notion.
Definition (t-time-bounded Kolmogorov complexity): K^t(x) = min{ |p| : p ∈ {0,1}*, p outputs x within t steps }.
Definition (Levin's time-bounded Kolmogorov complexity): Kt(x) = min{ |p| + log TIME(p) : p ∈ {0,1}*, p outputs x }.
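A brute-force search makes both definitions concrete. The sketch below is my own illustration, not from the slides: it assumes a hypothetical interpreter run_program(p, max_steps, aux) for some fixed universal machine and enumerates programs in order of length. It is exponential-time, which is exactly why the average-case complexity of these measures is the interesting question.

```python
import math
from itertools import product

def run_program(p, max_steps, aux=""):
    """Hypothetical universal machine (assumed for illustration only): run the
    binary program p for at most max_steps steps with auxiliary input aux and
    return (output, steps_used), or (None, max_steps) if it does not halt."""
    raise NotImplementedError("plug in a concrete universal machine here")

def K_t(x, t):
    """Brute-force K^t(x): length of the shortest program printing x within t
    steps. This exemplifies the definition; it is not a practical algorithm."""
    length = 0
    while True:
        for bits in product("01", repeat=length):
            out, _ = run_program("".join(bits), t)
            if out == x:
                return length
        length += 1

def Kt(x, cost_budget):
    """Brute-force Levin complexity: minimum of |p| + ceil(log2 TIME(p)) over
    programs p that print x, restricted to total cost at most cost_budget."""
    best = None
    for length in range(cost_budget + 1):
        step_budget = 2 ** (cost_budget - length)  # so |p| + log2(time) <= cost_budget
        for bits in product("01", repeat=length):
            out, steps = run_program("".join(bits), step_budget)
            if out == x:
                cost = length + max(0, math.ceil(math.log2(max(steps, 1))))
                best = cost if best is None else min(best, cost)
    return best
```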
One-Way Functions and Average-Case Hardness of Time-Bounded Kolmogorov Complexity
One-Way Functions and Average-Case Hardness of Kt
Theorem [Liu-Pass 21]: (i.o.) one-way functions exist ⟺ Kt is hard to compute on average over the uniform distribution.
Average-Case Complexity
"Easy on average" over a family of distributions D = { D_n } means: for every polynomial p(·), there is a probabilistic polynomial-time algorithm such that, for every n, the algorithm succeeds with probability at least 1 − 1/p(n) over an input x ∼ D_n.
Two-Sided Error (error-prone): the algorithm outputs the correct answer for almost all x ∼ D_n; on the remaining x, it may output a wrong answer.
Zero-Error (errorless): the algorithm outputs the correct answer for almost all x ∼ D_n; on the remaining x, it outputs ⊥.
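To make the two error models operational, here is a small Python sketch (my own illustration, not from the slides): `truth` is a hypothetical stand-in for the hard function, and the two toy heuristics show the difference between answering wrongly and answering ⊥. Neither actually succeeds with probability 1 − 1/p(n); they only exercise the framework.

```python
import random

FAIL = None  # plays the role of the "don't know" symbol ⊥

def success_probability(heuristic, truth, n, samples=10_000, errorless=False):
    """Estimate Pr_{x ~ {0,1}^n}[heuristic(x) = truth(x)].  In the errorless
    model, outputting FAIL is allowed but a wrong answer disqualifies."""
    correct = 0
    for _ in range(samples):
        x = tuple(random.randint(0, 1) for _ in range(n))
        ans = heuristic(x)
        if ans == truth(x):
            correct += 1
        elif errorless and ans is not FAIL:
            return 0.0  # an errorless heuristic must never lie
    return correct / samples

def truth(x):
    """Hypothetical stand-in for the hard function (here: parity of the bits)."""
    return sum(x) % 2

def two_sided_heuristic(x):
    """Error-prone: always answers, but is wrong whenever x starts with a 1."""
    return truth(x) if x[0] == 0 else 1 - truth(x)

def errorless_heuristic(x):
    """Errorless: refuses (outputs FAIL) on the inputs it cannot handle."""
    return truth(x) if x[0] == 0 else FAIL

if __name__ == "__main__":
    print(success_probability(two_sided_heuristic, truth, n=20))                   # ~0.5
    print(success_probability(errorless_heuristic, truth, n=20, errorless=True))   # ~0.5
```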
One-Way Functions and Average-Case Hardness of Kt
Theorem [Liu-Pass 21]: (i.o.) one-way functions exist ⟺ Kt is hard to compute on average with two-sided error over the uniform distribution.
State of the art: Kt cannot be computed in deterministic linear time in the worst case [Brandt 24].
Theorem [Liu-Pass 21, ABKvanMR 02]: EXP ≠ BPP ⟺ Kt is hard to compute on average with zero error over the uniform distribution.
One-Way Functions and Average-Case Hardness of Kt
Corollary [Liu-Pass 21]:
(i.o.) one-way functions exist ⟺ Kt is hard to compute on average with two-sided error over the uniform distribution.
EXP ≠ BPP ⟺ Kt is hard to compute on average with zero error over the uniform distribution.
If there is a reduction from computing Kt on average with two-sided error to computing it on average with zero error, then one can base OWFs on EXP ≠ BPP.
Probabilistic Kolmogorov Complexity
Definition (pKt): pKt(x) = min{ k : Pr_w[ Kt(x | w) ≤ k ] ≥ 2/3 }, where w is a uniformly random string.
Intuition: pKt can be viewed as Kt in the presence of a random string.
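The definition can be read as a sampling procedure. The sketch below is my illustration; it assumes a conditional-Kt oracle `kt_cond` and an auxiliary-string length `aux_len`, neither of which is specified on the slide, and estimates pKt(x) as the 2/3-quantile of Kt(x | w) over random w.

```python
import math
import random

def kt_cond(x, w):
    """Hypothetical oracle for Levin's conditional complexity Kt(x | w),
    e.g., a conditional variant of the brute-force Kt sketched earlier."""
    raise NotImplementedError

def estimate_pKt(x, aux_len, samples=99):
    """Empirical pKt(x) = min{ k : Pr_w[ Kt(x | w) <= k ] >= 2/3 },
    with w drawn uniformly from {0,1}^aux_len (aux_len is an assumed parameter)."""
    values = sorted(
        kt_cond(x, "".join(random.choice("01") for _ in range(aux_len)))
        for _ in range(samples)
    )
    # The smallest threshold reached by at least a 2/3 fraction of the sampled
    # w's is the value sitting at the 2/3 quantile of the sorted list.
    return values[math.ceil(2 * samples / 3) - 1]
```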
One-Way Functions and Average-Case Hardness of pKt
Theorem [This work]: (i.o.) one-way functions exist ⟺ pKt is hard to approximate on average with two-sided error over some samplable distribution.
In contrast, we can show unconditionally that pKt is hard to approximate on average with zero error over the uniform distribution.
One-Way Functions and Average-Case Hardness of pKt
Corollary [This work]:
(i.o.) one-way functions exist ⟺ pKt is hard to approximate on average with two-sided error over some samplable distribution.
Unconditionally, pKt is hard to approximate on average with zero error over the uniform distribution.
If there is a reduction from approximating pKt on average with two-sided error to approximating it on average with zero error, then OWFs exist.
One-Way Functions and Time-Bounded Symmetry of Information
Symmetry of Information
Symmetry of information for time-unbounded Kolmogorov complexity: for x, y ∈ {0,1}^n,
K(x, y) ≥ K(x) + K(y | x) − O(log n).
Combined with the easy upper bound K(x, y) ≤ K(x) + K(y | x) + O(log n), this gives K(x) + K(y | x) ≈ K(x, y) ≈ K(y) + K(x | y), mirroring Shannon's information theory, where H(X) + H(Y | X) = H(Y) + H(X | Y).
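For comparison, the Shannon identity invoked above is an immediate consequence of the chain rule for entropy (a standard fact, spelled out here):

```latex
\begin{align*}
H(X, Y) &= H(X) + H(Y \mid X) && \text{(chain rule)}\\
H(X, Y) &= H(Y) + H(X \mid Y) && \text{(chain rule with the roles swapped)}\\
\Longrightarrow\;\; H(X) + H(Y \mid X) &= H(Y) + H(X \mid Y).
\end{align*}
```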
Symmetry of Information
Does symmetry of information hold in the time-bounded setting, i.e., with respect to K^t? Concretely, does
K^t(x, y) ≥ K^{poly(t)}(x) + K^{poly(t)}(y | x) − O(log n)
hold for polynomial time bounds t?
One-Way Functions and Symmetry of Information
Theorem [This work]: One-way functions exist ⟺ symmetry of information fails w.r.t. pKt in the average case, i.e., for some poly-time samplable distribution D,
Pr_{(x, y) ∼ D}[ pKt(x, y) ≪ pKt(y) + pKt(x | y) ] ≥ 1/poly(n).
Cf. a similar characterization for pK^t by [H.-Ilango-Lu-Nanashima-Oliveira 23].
Theorem [This work]: If NEXP ⊄ i.o.-P/poly, then symmetry of information fails w.r.t. pKt in the worst case.
Cf. symmetry of information fails w.r.t. Kt in the worst case [Ronneburger 04].
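To spell out what "fails in the average case" means operationally, here is an illustrative sampling experiment (my own sketch; the oracles `pkt`, `pkt_cond`, `pkt_pair`, the sampler for D, and the gap parameter are all assumptions, since the slides do not fix them):

```python
def pkt(x):
    """Hypothetical oracle for pKt(x)."""
    raise NotImplementedError

def pkt_cond(x, y):
    """Hypothetical oracle for the conditional complexity pKt(x | y)."""
    raise NotImplementedError

def pkt_pair(x, y):
    """Hypothetical oracle for pKt(x, y), the complexity of an encoding of the pair."""
    raise NotImplementedError

def soi_failure_rate(sample_pair, gap, trials=1000):
    """Estimate Pr_{(x, y) ~ D}[ pKt(x, y) < pKt(y) + pKt(x | y) - gap ].
    Symmetry of information fails on average for D if, for a suitable
    (super-logarithmic) gap, this probability is at least 1/poly(n)."""
    hits = 0
    for _ in range(trials):
        x, y = sample_pair()  # (x, y) drawn from a poly-time samplable distribution D
        if pkt_pair(x, y) < pkt(y) + pkt_cond(x, y) - gap:
            hits += 1
    return hits / trials
```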
One-Way Functions and Symmetry of Information
Corollary [This work]:
One-way functions exist ⟺ symmetry of information fails w.r.t. pKt in the average case.
NEXP ⊄ i.o.-P/poly ⟹ symmetry of information fails w.r.t. pKt in the worst case.
If failure of SoI w.r.t. pKt in the worst case implies its failure in the average case (i.e., the existence of one asymmetric pair implies many asymmetric pairs), then one can base OWFs on NEXP ⊄ i.o.-P/poly.
Open Problems
Can we close the gap between two-sided error and zero error average-case hardness of approximating pKt?
Can we show that failure of SoI w.r.t. pKt in the worst case implies its failure in the average case?
Can we refute SoI w.r.t. pKt unconditionally, without assuming NEXP ⊄ i.o.-P/poly?