
Pseudo-random generation from one-way functions
(Extended Abstract)

Russell Impagliazzo*
Department of Mathematics
U. C. Berkeley
russell@ernie.berkeley.edu

Leonid A. Levin†
Computer Science Department
Boston University
Lnd@bu-cs.bu.edu

Michael Luby‡
International Computer Science Institute
Berkeley, California
Luby@icsi.berkeley.edu

Abstract

We show that the existence of one-way functions is necessary and sufficient for the existence of pseudo-random generators in the following sense. Let f be an easily computable function such that when x is chosen randomly: (1) from f(x) it is hard to recover an x' with f(x') = f(x) by a small circuit, or; (2) f has small degeneracy and from f(x) it is hard to recover x by a fast algorithm. From one-way functions of type (1) or (2) we show how to construct pseudo-random generators secure against small circuits or fast algorithms, respectively, and vice-versa. Previous results show how to construct pseudo-random generators from one-way functions that have special properties ([Blum, Micali 82], [Yao 82], [Levin 85], [Goldreich, Krawczyk, Luby 88]).

We use the results of [Goldreich, Levin 89] in an essential way.

1 Introduction

One of the basic primitives in cryptography and other areas of computer science is a pseudo-random generator. A pseudo-random generator can be used to build secure private key encryption protocols ([Goldwasser, Micali 82], [Goldreich, Goldwasser, Micali 84], [Luby, Rackoff 86]). [Goldreich, Micali, Wigderson 86] shows that any problem in NP has a zero-knowledge proof system if bit commitment is possible, and [Naor 88] shows how to construct a bit commitment protocol based on a pseudo-random generator. [Yao 82] shows that the existence of pseudo-random generators implies that BPP ⊆ DTime(2^{n^ε}) for every ε > 0.

On the other hand, there are many natural problems that are conjectured to be one-way functions, whereas it is hard to think of a natural example of a conjectured (perfect) pseudo-random generator. Thus, it is desirable to convert what seems to arise naturally (one-way functions) into a valuable commodity (a pseudo-random generator).

The first construction of a pseudo-random generator [Blum, Micali 82] is based on the intractability of the discrete log problem. [Yao 82] generalizes this by showing that a pseudo-random generator can be constructed from any one-way permutation. [Levin 85] shows that the existence of functions that are one-way on a quadratic number of iterates is necessary and sufficient for the existence of pseudo-random generators. [Goldreich, Krawczyk, Luby 88] show that any one-way function for which the preimage sizes of all elements in the range are roughly equal is sufficient. (The actual condition is slightly weaker.)

* Research partially supported by NSF grant CCR 88-13632
† Supported by NSF grant DCR-8607492
‡ On leave of absence from the University of Toronto, research partially supported by NSERC operating grant A8092

Permission to copy without fee all or part of this material is granted provided that the copies are not made or distributed for direct commercial advantage, the ACM copyright notice and the title of the publication and its date appear, and notice is given that copying is by permission of the Association for Computing Machinery. To copy otherwise, or to republish, requires a fee and/or specific permission.

© 1989 ACM 0-89791-307-8/89/0005/0012 $1.50
We show that the existence of one-way functions is necessary and sufficient for the existence of pseudo-random generators in the following sense. Let f be an easily computable function such that when x is chosen randomly: (1) from f(x) it is hard to recover an x' with f(x') = f(x) by a small circuit, or; (2) f has small degeneracy and from f(x) it is hard to recover x by a fast algorithm. From one-way functions of type (1) or (2) we show how to construct pseudo-random generators secure against small circuits or fast algorithms, respectively, and vice-versa.

Notation : Let x and y be bit strings. Then, |x| is the length of x, x ∘ y is the concatenation of x and y, x_i is the ith bit of x, and x↾i is the first i bits of x. If α is a number, then |α| is the absolute value of α. Let x and y be two equal length bit strings. x ⊙ y is the inner product mod 2 of x and y, and x ⊕ y is the vector sum mod 2 (i.e. bitwise parity) of x and y.

1.1 Security

In this paper we consider both uniform and non-uniform models of security. The difference between the two models of security is that, in the uniform model, the adversary is a fast algorithm, whereas, in the non-uniform model, the adversary is a small circuit. At first glance, the non-uniform notion of security seems too strong a requirement to place on a cryptographic protocol. However, a protocol that is only secure in the uniform sense is susceptible to the following type of attack. The time allowed for computation by an adversary before the protocol begins may be much greater than the allowable time during the protocol. The result of the preprocessing can then be used by the adversary to break the protocol within the allowable time. Security in the non-uniform model is equivalent to immunity from this type of attack (see [Karp, Lipton 80]). Also, the existence of a pseudo-random generator with non-uniform security is used to prove that BPP ⊆ DTime(2^{n^ε}) for every ε > 0 [Yao 82].

Definition (resources) : A resource class R is a class of functions from N to N that includes the identity function I(n) = n. If r'(n) ≤ r(n) ∈ R then r'(n) + 1 ∈ R. Finally, each r ∈ R is monotone and log₂(r(n)) < n.

We use resource classes to parameterize the resources allowed for adversaries in breaking one-way functions, pseudo-random generators and other cryptographic tasks. For example, R might be all polynomially bounded functions, or all functions bounded by 2^{O(log^c n)} for some constant c > 1.

Definition (uniform and non-uniform) : A uniform algorithm is a probabilistic Turing machine. The time bound T(n) for a uniform algorithm is the maximum running time on inputs of length n. A non-uniform algorithm is a pair of Turing machines A and M. A is a preprocessing algorithm that on input of length n produces a string A(n). The running time of A(n) is not limited. Algorithm M accepts as input x and A(|x|). The time bound T(n) for a non-uniform algorithm is the maximum running time of M over all inputs x and A(|x|). From the results of [Karp, Lipton 80], a non-uniform algorithm with time bound in resource class R is equivalent to a (recursive) family of circuits with the size of the circuit for inputs of length n bounded by a function r(n), where r is in resource class R.

Definition (feasible) : A uniform (non-uniform) algorithm is feasible with respect to resource class R if the time bound function is in R.

Definition (negligible) : A function p(n) is negligible with respect to resource class R if for all r ∈ R, for almost all n, p(n) ≤ 1/r(n).

For the remainder of the paper, unless stated otherwise, a feasible adversary algorithm is always with respect to an arbitrary but fixed resource class R. For each cryptographic task that we define, there are implicitly two definitions being made simultaneously, one with respect to the uniform and the other with respect to the non-uniform model of security. Unless otherwise stated, each definition, lemma, proposition and theorem has two versions, one in each model.

1.2 One-way functions

Intuitively, a function f is one-way if it is easy to compute but hard to invert, i.e. given x the value of f(x) can be computed in polynomial time, but every feasible algorithm that receives as input f(x) (when x is a randomly chosen string of length n) can output a y such that f(y) = f(x) with only negligible probability. It has not yet been proven that one-way functions exist (if P = NP then they certainly do not exist, but even if P ≠ NP it is not clear if they exist), but there are many examples of functions that seem to be one-way in practice and that are conjectured to be provably one-way. Some examples of conjectured one-way functions are factoring a composite number N that is the product of large randomly chosen primes, square roots modulo such an N, discrete log modulo a large randomly chosen prime, problems from coding theory and the subset sum problem.
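As a concrete (toy) illustration of the last example above, the subset-sum function can be sketched in Python. This is an added sketch, not a construction from the paper; the tiny parameters are for illustration only and carry no security.

```python
# Sketch of the subset-sum function, one of the conjectured one-way functions
# listed above: f maps (weights, S) to (weights, sum of the selected weights).
# Computing f is easy; recovering some subset with the given sum is believed
# hard for suitable parameters (toy parameters here).

def subset_sum(weights, subset):
    """f(weights, S) = (weights, sum of weights[i] for i in S)."""
    return weights, sum(weights[i] for i in subset)

weights = [13, 7, 22, 5, 9]
image = subset_sum(weights, {0, 2, 3})   # (weights, 13 + 22 + 5) = (weights, 40)
```

Note that the weights are published as part of the output; only the subset S is meant to be hard to recover.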
Notation (functions and probability ensembles) : A length (function) l(n) is a monotone increasing function from N to N such that l(n) is computable in time polynomial in n. A function f with input length m(n) and output length l(n) specifies for each n ∈ N a function f_n : {0,1}^{m(n)} → {0,1}^{l(n)}. For simplicity, we write f(x) in place of f_n(x). We say that f is polynomial-time computable if there is a polynomial-time Turing machine that on input x ∈ {0,1}^{m(n)} computes f(x). A probability ensemble D with length m(n) assigns to each positive integer n a probability distribution D_n on bit strings of length m(n). The uniform ensemble U assigns to each positive integer n the uniform probability distribution U_n on strings of length n. For X ⊆ {0,1}^{m(n)}, D[X] is the sum over all x ∈ X of the probability of x with respect to D_n. We use the notation x ∈_D {0,1}^{m(n)} to mean that x is randomly chosen from {0,1}^{m(n)} according to D_n. We say that D is polynomial-samplable if there is a polynomial-time Turing machine M with input length k(n) and output length m(n) such that, for each n ∈ N, M(x) ∈_D {0,1}^{m(n)} when x ∈_U {0,1}^{k(n)}. Define f(D) to be the probability ensemble with length l(n), where f(D_n) is the probability distribution defined by the random variable f(x) when x ∈_D {0,1}^{m(n)}. For a random variable X defined with respect to D_n, Exp[X] is the expected value of X. Pr is used for probability.

Note : Hereafter, unless stated otherwise, f is always a polynomial-time computable function with input length n and output length l(n), and D and E are always probability ensembles with length n.

Definition (one-way function) : We say that f is weakly one-way on D if, for every feasible algorithm M, the inverting probability Pr[x = M(f(x))] when x ∈_D {0,1}^n is negligible. We say that f is somewhat one-way on D if, for some constant c > 0, for every feasible algorithm M, the inverting probability Pr[f(x) = f(M(f(x)))] when x ∈_D {0,1}^n is at most 1 − 1/n^c. We say that f is one-way on D if, for every feasible algorithm M, the inverting probability Pr[f(x) = f(M(f(x)))] when x ∈_D {0,1}^n is negligible. We say that f is one-way if f is one-way on U.

If a function is one-way then it is both weakly one-way and somewhat one-way.

Proposition : If there is a polynomial-samplable D such that f is one-way on D then there is a one-way function g.

The function g is simply the composition of f and the polynomial-time computable sampling function for D. This proposition allows us to state most of our results in terms of one-way functions on the uniform ensemble as opposed to other probability ensembles.

Notation (copies of functions and ensembles) : Let q(n) be a length function. Let D^q be the ensemble with length n · q(n) such that D^q_n is the distribution obtained by independently sampling q(n) times from D_n and concatenating the results. Similarly, let f^q be the function with input and output lengths n · q(n) and l(n) · q(n), respectively, given by:

f^q(x_1 ∘ ... ∘ x_{q(n)}) = f(x_1) ∘ ... ∘ f(x_{q(n)}),

where x_1, ..., x_{q(n)} ∈ {0,1}^n.

The following is implicitly used in [Yao 82].

Proposition 1.1 (somewhat one-way → one-way) : If f is somewhat one-way on D with associated constant c > 0, then f^q is one-way on D^q, where q(n) = n^{c+1}.

1.3 Pseudo-random generators

Informally, a polynomial-time computable function f is pseudo-random if f(x) is strictly longer than x and if every feasible algorithm can distinguish f(x) from a truly random string of the same length (when x is chosen randomly) with only negligible probability. Intuitively, f(x) "looks" just like a random string to any feasible algorithm, even though it is generated from a string x that is strictly shorter. This intuition is captured in the following definition of [Blum, Micali 82], [Yao 82].

Definition (prg) : f is a generator if l(n) > n for all n ∈ N. The distinguishing probability p(n) of algorithm M for f is |Pr[M(f(x)) = 1] − Pr[M(y) = 1]| when x ∈_U {0,1}^n and y ∈_U {0,1}^{l(n)}. We say that f is pseudo-random if every feasible algorithm has negligible distinguishing probability for f.
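The distinguishing probability in the definition above can be estimated empirically. The sketch below is an added toy illustration (f and M here are stand-ins, not objects from the paper): it measures p(n) for a transparently bad "generator" f(x) = x ∘ x against the statistical test that checks whether the two halves of its input agree.

```python
import random

# Estimate p(n) = |Pr[M(f(x)) = 1] - Pr[M(y) = 1]| from the prg definition,
# for a toy generator and a toy test (both stand-ins, for illustration only).

def f(x):                   # toy "generator": n bits -> 2n bits (NOT pseudo-random)
    return x + x

def M(y):                   # toy statistical test: 1 iff first half equals second half
    h = len(y) // 2
    return 1 if y[:h] == y[h:] else 0

def estimate_distinguishing_probability(n, trials=2000, rng=random):
    hits_f = hits_u = 0
    for _ in range(trials):
        x = "".join(str(rng.randrange(2)) for _ in range(n))
        y = "".join(str(rng.randrange(2)) for _ in range(2 * n))
        hits_f += M(f(x))        # M on generator outputs
        hits_u += M(y)           # M on truly random strings
    return abs(hits_f - hits_u) / trials

random.seed(0)
p = estimate_distinguishing_probability(16)
```

Since M(f(x)) = 1 always, while a random 32-bit string has matching halves only with probability 2^{-16}, the estimate comes out close to 1: this f is maximally distinguishable, hence not pseudo-random.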
The normal definition of a pseudo-random generator insists that the generator can stretch the input by any polynomial amount. The following shows the definition above is equivalent, and follows from [Goldreich, Goldwasser, Micali 84].

Proposition 1.2 (polynomial stretching) : If there is a pseudo-random generator f then, for every constant c > 1, there is a pseudo-random generator with input length n and output length n^c.

We can now state one of our main theorems.

THEOREM A (one-way → prg) : If there is a one-way function in the non-uniform model of security then there is a pseudo-random generator in the non-uniform model of security.

The proof of Theorem A can be found in Section 5. The following definitions are necessary to state our next main theorem. The following definition is from [Shannon].

Definition (Shannon entropy) : The (Shannon) entropy of D_n is given by

Ent(D_n) = − Σ_{x ∈ {0,1}^n} D[{x}] · log(D[{x}]).

The entropy function Ent(D) assigns to each n ∈ N the value Ent(D_n). For a function f, we call Ent(f(D)) the (Shannon) entropy of f on D, and Ent(D) − Ent(f(D)) the degeneracy of f on D.

THEOREM B : The following are equivalent:

• There is a pseudo-random generator.

• There is a polynomial-samplable D and f so that f is weakly one-way and has degeneracy O(1) on D.

Proof : (→) Without loss of generality, let f be a pseudo-random generator such that on input length n the output length is 2n (see Proposition 1.2). Let g(x) = f(x)↾|x| and let g_i(x) be the ith iterate of g on x, i.e. g_0(x) = x and g_{i+1}(x) = g(g_i(x)). Let D_n be the distribution given by g_i(x) when x ∈_U {0,1}^n and i ∈_U {0, ..., n}. Note that D is polynomial-samplable. The degeneracy of g on D is the average over i of the degeneracy of g on g_i(x) when x ∈_U {0,1}^n, which is at most (n − 0)/n = 1. The degeneracy of f on D is thus at most 1, because it is at most the degeneracy of g on D.

We now show that f is one-way on D. Since f is a pseudo-random generator, every feasible algorithm has negligible probability of distinguishing D from U (as in the definition of pseudo-random generator, where D takes the role of f(x)). This follows using an argument similar to that used in [Goldreich, Goldwasser, Micali 84]. We claim that f is one-way on U. Let M be any feasible algorithm that has inverting probability p(n) for f on U that is non-negligible. Then, we can use M to distinguish between f(U_n) and U_{2n} with probability at least p(n) − 1/2^n. The distinguisher simply outputs 1 on input y ∈ {0,1}^{2n} if f(M(y)) = y. The probability that for y ∈_U {0,1}^{2n} there exists an x ∈ {0,1}^n with f(x) = y is at most 1/2^n, from which the claim follows. Any feasible algorithm to invert f on D must either distinguish D from U or invert f on U. Since both of these are impossible from the above, f is one-way on D.

(←) Use Proposition 1.4, Proposition 1.5 and Theorem C below. □

Let r(n) be any function in the resource class. Theorem B remains true if: (1) in the → direction of the theorem, O(1) degeneracy is replaced with 1/r(n) and (as in the proof) "weakly one-way" is replaced with "one-way"; (2) in the ← direction of the theorem, O(1) degeneracy is replaced with log(r(n)). Here, log(r(n)) cannot be replaced with anything asymptotically bigger, for the following reason. No algorithm on input f(x) can output y such that y = x with probability greater than 1/|f^{-1}(f(x))| when x ∈_U {0,1}^n. Let s(n) be such that for all r ∈ R, lim_{n→∞} r(n)/s(n) = 0. Then, f(x ∘ y) = x, where |y| = log(s(|x|)), has degeneracy log(s(|x|)) and is weakly one-way. On the other hand, f is useless for constructing a pseudo-random generator.

We now define a type of one-way function whose existence is equivalent to that of pseudo-random generators. This characterization is useful in proving Theorem B.

Definition (hidden bit) : A bit (function) b for f is a polynomial-time computable function with input length n that outputs a single bit. The distinguishing probability p(n) of algorithm M for f and b on D is |Pr[M(f(x)) = b(x)] − Pr[M(f(x)) ≠ b(x)]| when x ∈_D {0,1}^n. b is hidden for f on D if every feasible algorithm has negligible distinguishing probability for f and b on D. b is hidden for f if it is hidden for f on U.

Notation (inner product bit) : Let x, y ∈ {0,1}^n. Let f'(x ∘ y) = f(x) ∘ y and let D'_n = D_n ∘ U_n. It is easy to see that if f is one-way on D then f' is one-way on D'. The inner product bit is b(x ∘ y) = x ⊙ y.
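For concreteness, the inner product bit notation can be rendered as a small Python sketch. This is an added toy illustration; f here is a stand-in for an arbitrary polynomial-time function and is of course not one-way.

```python
# Toy rendering of the inner-product-bit notation:
# f'(x o y) = f(x) o y, with hidden-bit candidate b(x o y) = x (.) y (mod 2).

def inner_product(x, y):
    """x (.) y: inner product mod 2 of two equal-length bit strings."""
    return sum(int(a) & int(b) for a, b in zip(x, y)) % 2

def f(x):                               # stand-in for f; NOT one-way
    return x[::-1]

def f_prime(x, y):
    """f'(x o y) = f(x) o y: y is revealed, x enters only through f(x)."""
    return f(x) + y

def b(x, y):
    """b(x o y) = x (.) y, the inner product bit."""
    return inner_product(x, y)

image = f_prime("1011", "0110")         # f("1011") o "0110" = "1101" + "0110"
bit = b("1011", "0110")                 # 1*0 + 0*1 + 1*1 + 1*0 = 1 (mod 2)
```

The point of the construction is that y appears in the clear in f'(x ∘ y), while the bit x ⊙ y remains computationally hidden whenever f is (weakly) one-way.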
The following is from [Goldreich, Levin 89].

Proposition 1.3 (weakly one-way → inner product bit is hidden) : Assume that f is weakly one-way on D. Let f', D' and bit b for f' be defined in terms of f and D as in the inner product bit definition. Then, b is hidden for f' on D'.

The idea of a function that hides a bit was introduced in the original construction of a pseudo-random generator [Blum, Micali 82] and has been central to all such constructions since that time. Proposition 1.3 presents an elegant, simple and general method of obtaining a hidden bit from a one-way function. However, naive use of Proposition 1.3 with an arbitrary one-way function has difficulties because of the following. Let f be any one-way function. Define one-way function f' as f'(x ∘ y) = f(x), where |y| = |x|. From Proposition 1.3, f''(x ∘ y ∘ r) = f(x) ∘ r, where |r| = |x ∘ y|, hides the inner product bit. However, for cryptographic purposes this is useless, as the hidden bit is informationally impossible to recover. The hidden bit is only cryptographically useful if it is also meaningful, as defined below. (We later show how to use Proposition 1.3 to extract hidden and meaningful bits from one-way functions.)

Definition (meaningful bit) : Let b be a bit for f. An unbounded adversary M for b and f is an algorithm with unbounded time and space resources. The distinguishing probability p(n) of algorithm M on D is |Pr[M(f(x)) = b(x)] − Pr[M(f(x)) ≠ b(x)]| when x ∈_D {0,1}^n. b is meaningful for f on D if there is a constant c > 0 and an unbounded adversary M such that p(n) ≥ 1/n^c. The bit b is meaningful for f if it is meaningful for f on U.

Theorem C (hidden and meaningful → prg) : If there is a bit b that is both hidden and meaningful for f then there is a pseudo-random generator.

The proof of Theorem C can be found in Section 4.

Proposition 1.4 : Suppose that D is polynomial-samplable and that f is weakly one-way and has degeneracy O(1) on D. Then there is a polynomial-time computable f' and a polynomial-samplable D' and a bit b for f' such that b is both meaningful and hidden for f' on D'.

Proof Sketch : Construct f', D' and b from f and D as in the definition of the inner product bit. Then, by Proposition 1.3, b is hidden for f' on D'. Because f has degeneracy O(1) on D, it can be shown that b is meaningful for f' on D'. □

Proposition 1.5 : Let D be polynomial-samplable and let bit b be both hidden and meaningful for f on D. Then there is a polynomial-time computable function f' and a bit b' for f' such that b' is both hidden and meaningful for f'.

Proof Sketch : Let M be the sampling algorithm for D. The construction is to let f'(x) = f(M(x)) and let b'(x) = b(M(x)). □

2 Background

Definition (statistically indistinguishable and quasi-random) : D_n and E_n are statistically indistinguishable within δ if for every X ⊆ {0,1}^n, |D[X] − E[X]| ≤ δ. D_n is quasi-random within δ if D_n is statistically indistinguishable within δ from U_n.

We need to use a variant definition of entropy used in [Chor, Goldreich 85].

Definition (min-entropy) : The min-entropy of D_n is min_{x ∈ {0,1}^n}(− log(D[{x}])).

Intuitively, if a distribution has min-entropy k, it is "at least as random" as the uniform distribution on k bit strings. There are distributions that have arbitrarily large entropy but have only one bit of min-entropy.

Some of the definitions given in this subsection are computational analogues of the statistical definitions given in the previous subsection. The following definition appears in [Goldwasser, Micali 82], [Yao 82] and [Goldwasser, Micali, Rackoff 85].

Definition (comp. indistinguishable) : The distinguishing probability function p(n) of algorithm M for D and E is |Pr[M(x) = 1] − Pr[M(x') = 1]| when x ∈_D {0,1}^n and x' ∈_E {0,1}^n. D is computationally indistinguishable from E if every feasible algorithm has negligible distinguishing probability.
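The statistical definitions above, and the remark that Shannon entropy and min-entropy can diverge, can be checked numerically. The following added toy sketch represents distributions as dictionaries mapping strings to probabilities.

```python
import math

def statistical_distance(p, q):
    """max over events X of |P[X] - Q[X]|, i.e. half the L1 distance."""
    keys = set(p) | set(q)
    return sum(abs(p.get(k, 0.0) - q.get(k, 0.0)) for k in keys) / 2

def shannon_entropy(p):
    return -sum(v * math.log2(v) for v in p.values() if v > 0)

def min_entropy(p):
    return -math.log2(max(p.values()))

# One fixed 10-bit string gets probability 1/2; the rest of the mass is spread
# uniformly over the remaining 2^10 - 1 strings.
n = 10
rest = 0.5 / (2**n - 1)
D = {format(i, "010b"): rest for i in range(1, 2**n)}
D["0" * n] = 0.5

uniform = {format(i, "010b"): 1 / 2**n for i in range(2**n)}

h = shannon_entropy(D)              # close to 6 bits: large
k = min_entropy(D)                  # exactly 1 bit, set by the heavy element
d = statistical_distance(D, uniform)
```

Scaling the same construction up in n makes the Shannon entropy arbitrarily large while the min-entropy stays at one bit, matching the remark above.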
The following two propositions are the crucial point in this paper where constructions involving the uniform and non-uniform models of security diverge. Although analogs of each other, these two propositions have a subtle difference in addition to the notions of security involved. Both say, intuitively, that if two ensembles are computationally indistinguishable, then many samples from one ensemble are indistinguishable from many samples of the other. If the ensembles in question are both polynomial-samplable, then Proposition 2.3 says this is true with respect to both models of security. However, Proposition 2.4 says that in the non-uniform model it is also true for arbitrary ensembles. The reason for this difference is as follows. In the uniform model of security, receiving several samples from an ensemble that is not polynomial-samplable might give an adversary information that it could not compute itself. This extra information might allow the adversary to distinguish between the two ensembles. However, this kind of information can never be helpful to a non-uniform adversary, since it is succinctly describable (being extracted from a polynomial number of polynomial length samples) and hence could be included in the non-uniform "advice" string. This distinction has repercussions throughout the paper and is ultimately the reason why our proof of Theorem A does not hold in the uniform model of security. The following propositions appear in [Goldwasser, Micali, Rackoff 85]. Let q(n) be a length function.

Proposition 2.3 : If D and E are polynomial-samplable probability ensembles that are computationally indistinguishable (in either model of security), then D^q and E^q are computationally indistinguishable (in the same model of security as before).

Proposition 2.4 : If D and E are arbitrary probability ensembles that are computationally indistinguishable in the non-uniform model of security, then D^q and E^q are computationally indistinguishable in the non-uniform model of security.

The concept of a universal hash function, introduced in [Carter, Wegman], has proved to have a far-reaching and broad spectrum of applications in the theory of computation.

Definition (hash functions) : Let H_{n,m} be a family of functions from n bit strings to m bit strings. We say H_{n,m} is a family of pairwise independent universal hash functions if, for all x, y ∈ {0,1}^n with x ≠ y, h(x) ∘ h(y) ∈_U {0,1}^{2m} when h ∈_U H_{n,m}. A system of hash functions consists of one such family for all pairs n and m. The following system of pairwise independent universal hash functions has several nice properties. Let H_{n,m} be the set of all m by n + 1 matrices over the field with two elements. We think of a hash function from this system as h = (M, b), where M is an m by n bit matrix and b is a bit vector of length m. Then, h(x) = (M · x) ⊕ b. We can choose h ∈_U H_{n,m} by choosing h ∈_U {0,1}^{(n+1)m}. Hereafter, whenever we refer to a family or system of hash functions, we mean the family defined here.

3 Combinatorial Lemmas

Due to its importance in such basic algorithms as primality testing, randomness has become an interesting computational resource in its own right. Recently, various studies have been made of extracting good random bits from biased "slightly-random" sources that nevertheless possess a certain amount of entropy; these sources model the imperfect physical sources of randomness, such as Geiger counter noise and Zener diodes, that would have to actually be utilized in real life. (See [Blum 84], [Santha, Vazirani 84], [Vazirani 85], [Vazirani, Vazirani 85], [Chor, Goldreich 85], [McInnes 87].)

The following lemma is very useful in many of our constructions of various kinds of one-way functions and pseudo-random generators. However, it is probably best thought of as a result in the theory of slight randomness, rather than cryptography. Intuitively, it can be thought of as a method for extracting "good" random bits from a slightly-random source using real random bits as a "catalyst". In more detail, the various components of the lemma should be interpreted as follows. Suppose we have a slightly-random source that yields a distribution on strings of length n with min-entropy greater than m. A fair coin is used to generate a random hash function mapping n bits to m − 4e bits, where e is a small integer. We then sample from the slightly-random source and apply our hash function to the result. The lemma states that the resulting bits are essentially randomly distributed and almost uncorrelated with the bits used to generate the hash function. Thus, we have managed to convert almost all the entropy of the slightly-random source into uniform random bits while maintaining our original supply of uniform random bits. Previously, [McInnes 87] proved a related lemma.

In this extended abstract, we prove a version of this lemma that suffices for the purposes of subsequent constructions. Many generalizations are possible, including much weaker restrictions on the hash functions used, and the substitution of Renyi entropy for min-entropy (see [Renyi 70]). These generalizations will appear in the full paper; some of them can be used to make our constructions more efficient.
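The matrix-based hash family H_{n,m} defined above is straightforward to implement. The following added sketch samples h = (M, b) from (n + 1)·m coin flips and evaluates h(x) = (M · x) ⊕ b over GF(2); bit strings are modeled as lists of 0/1 integers.

```python
import random

# Sketch of the pairwise-independent hash family H_{n,m}:
# h = (M, b), M an m-by-n bit matrix, b an m-bit vector, h(x) = (M . x) (+) b.

def sample_hash(n, m, rng=random):
    """Choose h uniformly from H_{n,m} via (n + 1) * m fair coin flips."""
    M = [[rng.randrange(2) for _ in range(n)] for _ in range(m)]
    b = [rng.randrange(2) for _ in range(m)]
    return M, b

def apply_hash(h, x):
    """h(x) = (M . x) XOR b, all arithmetic mod 2; x is a list of bits."""
    M, b = h
    return [(sum(row[j] & x[j] for j in range(len(x))) + bi) % 2
            for row, bi in zip(M, b)]

h = ([[1, 0], [1, 1]], [0, 1])          # a fixed member of H_{2,2}
y = apply_hash(h, [1, 1])               # rows: (1 + 0) + 0 = 1, (1 + 1) + 1 = 1 (mod 2)
```

Pairwise independence follows from the matrix-plus-shift structure: for x ≠ x', M·(x ⊕ x') is uniform over the choice of M, and b makes each single output uniform.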
Lemma 3.1 (smoothing min-entropy) : Let D_n have min-entropy at least m and let l = m − 4e. Then the distribution h ∘ h(x), where h ∈_U H_{n,l} and x ∈_D {0,1}^n, is quasi-random within δ = 3/2^e.

Proof : We need to show that no statistical test can distinguish between h ∘ h(x) and a random string of length |h| + l with probability greater than δ. Each h ∈ H_{n,l} partitions {0,1}^n into 2^l equivalence classes, where class i is X_i(h) = {x ∈ {0,1}^n : h(x) = i}. Let Y_i(h) = D[X_i(h)]. In the following, probabilities and expected values are with respect to h ∈_U H_{n,l}. By the properties of H_{n,l}, Exp[Y_i(h)] = 1/2^l. Let disc_i(h) = 1/2^l − Y_i(h). Let small(h) = {i ∈ {0,1}^l : disc_i(h) > 0}, i.e. the set of i such that Y_i(h) is smaller than average. Let p(h) be the distinguishing probability for a statistical test. Each input h ∘ i to the test such that the output is 1 adds exactly disc_i(h) to p(h). From this, it is clear that the best test is when the output is 1 for all i ∈ small(h), in which case p(h) = Σ_{i ∈ small(h)} disc_i(h). Let toosmall(h) = {i ∈ {0,1}^l : disc_i(h) > 1/2^{l+e}}. Using Chebychev's Inequality, the pairwise independence properties of H_{n,l} and the fact that the min-entropy of D_n is at least m, it is straightforward to show that Pr[i ∈ toosmall(h)] ≤ 1/2^{2e} for each fixed i ∈ {0,1}^l. We say h is bad if |toosmall(h)| > 2^l/2^e. From the bound on Pr[i ∈ toosmall(h)], it is easy to see that Pr[h is bad] ≤ 1/2^e using Markov's Inequality. For every h, p(h) ≤ 1. For each good h, p(h) ≤ |toosmall(h)|/2^l + 2^l/2^{l+e} ≤ 2/2^e. Thus, the overall distinguishing probability for the best test is at most 3/2^e. □

Can we replace the condition that D has high min-entropy in Lemma 3.1 by the weaker and more natural condition that D have high entropy in the usual (Shannon) sense? Not directly. For example, a distribution can have high Shannon entropy yet still output one element with probability 1/2; thus, any function computed based on one sample from this distribution generates some output with probability at least 1/2, and is therefore highly non-random. This problem hints at a partial solution: take multiple independent samples from the distribution.

Lemma 3.2 (entropy → min-entropy) : Let k(n) be a length function. For every probability ensemble D there is a probability ensemble E with length function n · k(n) satisfying:

• The min-entropy of E_n is at least k(n) · Ent(D_n) − n · k(n)^{2/3} − k(n) · 2^{−n}.

• E_n is statistically indistinguishable from D^k_n within 2^{−k(n)^c} + k(n) · 2^{−n} for a fixed c > 0.

Proof Sketch : Let Y ⊆ {0,1}^n be the set of elements with probability at least 2^{−2n} with respect to D_n, and let Y' = {0,1}^n − Y. Let D'_n be the distribution on {0,1}^n described as: (1) for all x ∈ Y, D'[{x}] = D[{x}]/D[Y]; (2) for all x ∈ Y', D'[{x}] = 0. It is easy to show that D[Y'] ≤ 2^{−n}, and from this it follows that Ent(D'_n) ≥ Ent(D_n) − 2^{−n}.

Consider the random variable X with values in the interval [0, 2n] defined by X(d) = − log(D'[{d}]), where d ∈ {0,1}^n. When d_i ∈_{D'} {0,1}^n independently for i = 1, ..., k(n), Z = Σ_{i=1,...,k(n)} X(d_i) is the sum of independent random variables on the interval [0, 2n] with expected value k(n) · Ent(D'_n). Hence, by an elementary extension of Chernoff bounds, with exponentially high probability (in k(n)) Z has value within an additive factor of n · k(n)^{2/3} of its expectation. Thus, with exponentially high probability, Z ≥ k(n) · Ent(D'_n) − n · k(n)^{2/3}. This means that only with exponentially small probability does the sequence d_1, ..., d_{k(n)} have probability greater than 2^{−k(n)·Ent(D'_n) + n·k(n)^{2/3}}. Restricting D'^k_n to the complement of this exponentially small in probability set of sequences, and renormalizing the distribution as before, we obtain E_n. □

Corollary 3.3 : Let k(n) = n^c for any constant c > 0. Let h be uniformly and randomly chosen from H_{n·k(n), k(n)·Ent(D_n) − 2n·k(n)^{2/3}}, and let d_i ∈_D {0,1}^n independently for i = 1, ..., k(n). Then the distribution h ∘ h(d_1 ∘ ... ∘ d_{k(n)}) is quasi-random within an exponentially small in n amount.

Proof : Combine Lemma 3.1 and Lemma 3.2. □
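Lemma 3.1 can be sanity-checked empirically. The sketch below is an added toy experiment, not a proof: the source is uniform over a random subset S of {0,1}^n with |S| = 2^m, so its min-entropy is exactly m; we hash down to l = m − 4e bits with random GF(2) matrices and estimate the statistical distance of h ∘ h(x) from uniform, which equals the average over h of the distance of h(x) from U_l. The affine shift b is omitted, since a uniform independent shift does not change the distance to uniform.

```python
import itertools
import random

random.seed(0)
n, m, e = 12, 10, 2
l = m - 4 * e                                    # output length from Lemma 3.1

S = random.sample(range(2**n), 2**m)             # a min-entropy-m source

def hash_bits(M, x):
    """h(x) = M . x over GF(2); M is given as l row masks of n bits each."""
    return tuple(bin(row & x).count("1") % 2 for row in M)

trials, total = 20, 0.0
for _ in range(trials):
    M = [random.randrange(2**n) for _ in range(l)]
    counts = {}
    for x in S:
        y = hash_bits(M, x)
        counts[y] = counts.get(y, 0) + 1
    # statistical distance of h(x) (for this h) from the uniform l-bit string
    total += sum(abs(counts.get(y, 0) / len(S) - 1 / 2**l)
                 for y in itertools.product((0, 1), repeat=l)) / 2

avg_distance = total / trials
```

With these toy parameters the observed average distance is tiny, comfortably within the lemma's 3/2^e bound.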
Thus, in some sense, any pseudo-random generator determines an ensemble that (asymptotically) has a greater computational entropy than it has Shannon entropy. In this section, we formalize this intuitive notion of the computational entropy of an ensemble. This definition provides one of the main conceptual tools in our paper.

Just as Shannon entropy quantifies the amount of randomness in a distribution, computational entropy quantifies the amount of "apparent" randomness (to feasible algorithms) of a distribution. For example, we can relax the notion of a generator f being pseudo-random by allowing its output to be computationally indistinguishable from an ensemble D that is not necessarily the uniform ensemble, where D has more Shannon entropy than the input to f. We call such an f a pseudo-entropy generator. In this case, we say that the computational entropy of f is at least the Shannon entropy of D.

The notion of computational entropy is also useful in the case when the Shannon entropy of D is not necessarily greater than that of the input to f. We say that f has false entropy if the computational entropy of f exceeds the Shannon entropy of f (but not necessarily the Shannon entropy of the input to f).

We use computational entropy in constructions of pseudo-random generators as follows. The first step is to show how to use a pseudo-entropy generator to construct a pseudo-random generator. The next step is to show how to convert any function with false entropy into a pseudo-entropy generator. We obtain Theorem C as a direct consequence of these constructions. In Section 5, we prove that any one-way function in the non-uniform sense can be used to construct a function with false entropy in the non-uniform sense, thus completing the proof of Theorem A.

In the following, s(n) is a function from N to positive reals and the probability ensemble for the inputs of f is U.

Definition (uniform computational entropy) : We say f has uniform computational entropy at least s(n) if there is a polynomial-samplable ensemble D such that D is computationally indistinguishable (in the uniform sense) from f(U) and Ent(D_n) ≥ s(n).

Definition (non-uniform computational entropy) : We say f has non-uniform computational entropy at least s(n) if there is an (arbitrary) ensemble D such that D is computationally indistinguishable (in the non-uniform sense) from f(U) and Ent(D_n) ≥ s(n).

The difference between these two definitions is necessary so that the following proposition holds in both models of security.

Proposition 4.1 (additivity of computational entropy) : Let q(n) be a length function. If f has computational entropy at least s(n) then f^q has computational entropy at least s(n) · q(n).

Proof : The uniform case follows from Proposition 2.3 and the non-uniform case follows from Proposition 2.4. □

Definition (pseudo-entropy generator) : We say f is a pseudo-entropy generator if there is a constant c > 0 such that f has computational entropy at least n + 1/n^c.

Definition (false entropy) : We say f has false entropy if there is a constant c > 0 such that f has computational entropy at least Ent(f(U)) + 1/n^c.

Lemma 4.2 (pseudo-entropy → pseudo-random) : If there is a pseudo-entropy generator then there is a pseudo-random generator.

Proof : Let f be the (uniform/non-uniform) pseudo-entropy generator, and let D be the (polynomial-samplable/arbitrary) probability ensemble with Ent(D_n) ≥ n + 1/n^c that is computationally indistinguishable from f(U) (with length l(n)). Let k(n) = n^{3c+2}. By Proposition 2.3/2.4, f^k(U) is computationally indistinguishable from D^k. Note that Ent(D^k_n) = k(n) · Ent(D_n) ≥ k(n) · (n + n^{-c}). Let j(n) = k(n) · (n + n^{-c}) − 2n · k(n)^{2/3}. Let h ∈_U H_{k(n)·l(n),j(n)}. Let D'_n be the probability distribution defined by h ∘ h(y_1 ∘ ... ∘ y_{k(n)}), where y_i ∈_D {0,1}^{l(n)} independently for all i = 1, ..., k(n). By Corollary 3.3, D'_n is quasi-random within an exponentially small in n amount. Let the generator be defined by g(h ∘ x_1 ∘ ... ∘ x_{k(n)}) = h ∘ h(f(x_1) ∘ ... ∘ f(x_{k(n)})). Let E'_n be the probability distribution defined by the output of g when the input is h ∈_U H_{k(n)·l(n),j(n)} and, for all i = 1, ..., k(n), x_i ∈_U {0,1}^n independently. Then, since f^k(U) is computationally indistinguishable from D^k, it follows that E' is computationally indistinguishable from D'. Because D' is quasi-random within an exponentially small in n amount, and because by choice of k(n) the output of g is longer than the input, g is a pseudo-random generator. □
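The hash families H_{m,j} used throughout can be instantiated, for example, by affine maps h(x) = Ax ⊕ b over GF(2) in the style of [Carter, Wegman 79]; the particular instantiation and tiny parameters below are our own illustration. For small m and j the whole family can be enumerated, confirming the exact pairwise-collision probability 2^{-j} that Lemma 3.1 builds on.

```python
from itertools import product

m, j = 3, 2  # input bits, output bits (tiny, so we can enumerate)

def apply_hash(A, b, x):
    # h(x) = A x  XOR  b over GF(2); A is j rows of m bits each.
    return tuple((sum(A[r][c] & x[c] for c in range(m)) % 2) ^ b[r]
                 for r in range(j))

def all_hashes():
    # Enumerate every (A, b) pair: 2^(j*m) matrices times 2^j shifts.
    for A in product(product((0, 1), repeat=m), repeat=j):
        for b in product((0, 1), repeat=j):
            yield A, b

x, y = (0, 0, 1), (1, 0, 1)  # two fixed distinct inputs
family = list(all_hashes())
collisions = sum(apply_hash(A, b, x) == apply_hash(A, b, y)
                 for A, b in family)

# 2-universality: Pr_h[h(x) = h(y)] is exactly 2^-j for x != y.
print(collisions / len(family))  # → 0.25
```

The same exhaustive check passes for any pair of distinct inputs, since h(x) = h(y) exactly when A annihilates the nonzero vector x ⊕ y, which happens for a 2^{-j} fraction of matrices.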
Our present goal is to transform a function f with false entropy into a pseudo-entropy generator g. The major obstacle is that f could be many-to-one. In this case, even though the output of f seemingly has more entropy than it really has, the Shannon entropy of the output of f may be much less than the length of the input; intuitively, the application of f to the input may cause more of a loss in Shannon entropy than the corresponding gain in false entropy. For example, if f is 16-to-1, the probability distribution f(U_n) has n − 4 bits of Shannon entropy, four bits less than the input length. Even if f has three bits of false entropy, f(U_n) still has one less bit of computational entropy than the input length. We need a method of recovering this loss in Shannon entropy without affecting the false entropy.

We do this in two steps. First, we let f' = f^q for a suitable length function q(n). This has two effects: the false entropy in f' is q times that of f and, for a randomly chosen input x to f', the size of the preimage of f'(x) is, with high probability, relatively close to the expected preimage size.

We then apply the technique of outputting, in addition to f'(x), the output of a randomly chosen hash function applied to x that produces a string of length roughly |x| minus the Shannon entropy of the probability distribution determined by f'. This technique, which we refer to as "hashing the preimages of a function", is also used in proving Theorem A (see Section 5) and in many of the applications in [Impagliazzo, Luby 89]. We now give a highly intuitive presentation of this technique. Let f be a one-way function. Let rank(x) = |{y ≤ x : f(y) = f(x)}|, i.e., the rank of x among all preimages of f(x). For now, we make the highly unreasonable assumption that rank(x) is easily computable. Consider the function g(x) = f(x) ∘ rank(x). g(x) is one-to-one and so Ent(g(U_n)) = n, i.e., g has degeneracy zero. Furthermore, the task of inverting g is at least as hard as that of inverting f: on input f(x) ∘ r, an adversary doesn't just have to find some preimage of f(x), it has to be able to find the r-th smallest preimage. (Similarly, if f has false entropy, g has at least as much.) rank(x) is not in general computable. However, the value of a random hash function on x frequently serves the same purpose. Let g(x ∘ h) = f(x) ∘ h ∘ h(x). For a fixed value of f(x), and a random hash function h that outputs log(|f^{-1}(f(x))|) bits, the distribution h ∘ h(x) is almost uniform (see Lemma 3.1), and thus could have been generated by the adversary. Consequently, g(x ∘ h) is as hard to invert as f(x). On the other hand, x is usually uniquely determined by f(x) ∘ h ∘ h(x) and thus g has little degeneracy.

Lemma 4.3 and Lemma 4.4 show how to construct a pseudo-entropy generator from a function f with false entropy that satisfies a technical condition: Ent(f(U_n)) can be approximated fairly well in time polynomial in n. This condition is not essential; we later sketch a slightly more complicated construction of a pseudo-random generator without assuming this condition for f.

Lemma 4.3 : Consider the probability ensembles D and E defined as follows. Fix c > 0, k(n) = n^c and j(n) = (n − Ent(f(U_n))) · k(n) − 2n · k(n)^{2/3}. D_n is given by

f(x_1) ∘ ... ∘ f(x_{k(n)}) ∘ h ∘ h(x_1 ∘ ... ∘ x_{k(n)}),

where x_i ∈_U {0,1}^n independently for i = 1, ..., k(n) and where h ∈_U H_{n·k(n),j(n)}. E_n is given by

f(x_1) ∘ ... ∘ f(x_{k(n)}) ∘ r,

where the x_i are randomly chosen as above and r ∈_U {0,1}^{|h|+j(n)}. Then, D_n is statistically indistinguishable from E_n within an exponentially small in n amount.

Proof : We claim that, with high probability, if for i = 1, ..., k(n) we independently choose x_i ∈_U {0,1}^n and fix y_i = f(x_i), then the following distribution is quasi-random within an exponentially small in n amount. Let S_{y_1,...,y_{k(n)}} be the set of all sequences x'_1 ∘ ... ∘ x'_{k(n)} where x'_i ∈ f^{-1}(y_i). Randomly and uniformly choose a sequence x'_1 ∘ ... ∘ x'_{k(n)} ∈ S_{y_1,...,y_{k(n)}} and a random h. The distribution is defined as h ∘ h(x'_1 ∘ ... ∘ x'_{k(n)}). From Lemma 3.1, this distribution is quasi-random within an exponentially small in n amount if the min-entropy of the uniform distribution on S_{y_1,...,y_{k(n)}} is substantially greater than j(n). Define X(y) = log(|f^{-1}(y)|). Then, the min-entropy of the uniform distribution on S_{y_1,...,y_{k(n)}} is simply

log(|S_{y_1,...,y_{k(n)}}|) = Σ_{i=1,...,k(n)} X(y_i).

This is the sum of k(n) independent random samples, where each possible value lies in the range [0, n]. Thus, using Chernoff bounds, the sum is within an additive factor of n · k(n)^{2/3} of its expected value with probability exponentially close to 1. The expected value of X(y_i), when y_i is chosen as described above, is n − Ent(f(U_n)).
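As an aside, the identity just used — that the expected value of X(y) = log |f^{-1}(y)|, for y = f(x) with x uniform, equals n − Ent(f(U_n)) — holds exactly and can be verified for any small f. The toy function below is our own example, not from the paper.

```python
import math
from collections import Counter

n = 4
def f(x):
    return x % 5  # toy many-to-one function with uneven preimage sizes

# |f^-1(y)| for each image point y.
sizes = Counter(f(x) for x in range(2 ** n))

# Shannon entropy of f(U_n): each y has probability |f^-1(y)| / 2^n.
ent_fU = -sum((s / 2 ** n) * math.log2(s / 2 ** n) for s in sizes.values())

# E[X(y)] with y = f(x) for uniform x, where X(y) = log2 |f^-1(y)|.
expected_X = sum((s / 2 ** n) * math.log2(s) for s in sizes.values())

# The identity: E[X] + Ent(f(U_n)) = n exactly, since
# log2 |f^-1(y)| = log2 Pr[y] + n for every y.
print(expected_X + ent_fU)
```

The printed value is n up to floating-point rounding, for any choice of f on n-bit inputs.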
Thus, the expected value of the sum is (n − Ent(f(U_n))) · k(n), from which the claim follows. □

Lemma 4.4 : Assume there is a polynomial-time computable function f with at least n^{-c} bits of false entropy. Assume that we can approximate Ent(f(U_n)) to within an additive factor of n^{-(c+1)} in time polynomial in n. Then there is a pseudo-entropy generator g.

Proof : Let k(n) = n^{3c+2}, let a(n) be the approximation of Ent(f(U_n)) and let j(n) = (n − a(n)) · k(n) − 2n · k(n)^{2/3}. Let g(h ∘ x_1 ∘ ... ∘ x_{k(n)}) = f(x_1) ∘ ... ∘ f(x_{k(n)}) ∘ h ∘ h(x_1 ∘ ... ∘ x_{k(n)}), where h ∈ H_{n·k(n),j(n)} and x_1, ..., x_{k(n)} ∈ {0,1}^n. We claim that g(U) has computational entropy at least n · k(n) + |h| + 1. Let D be the probability ensemble that is computationally indistinguishable from f(U) in the definition of false entropy. (In the uniform model of security, D is polynomial-samplable.) Then,

Ent(D_n) ≥ Ent(f(U_n)) + n^{-c} ≥ a(n) + n^{-c} − n^{-(c+1)}.

(The last term is the round-off error involved in approximating Ent(f(U_n)) by a(n).) From Lemma 4.3, g(h ∘ x_1 ∘ ... ∘ x_{k(n)}) = f(x_1) ∘ ... ∘ f(x_{k(n)}) ∘ h ∘ h(x_1 ∘ ... ∘ x_{k(n)}) is statistically indistinguishable from the distribution f(x_1) ∘ ... ∘ f(x_{k(n)}) ∘ r, when h ∈_U H_{n·k(n),j(n)}, r ∈_U {0,1}^{|h|+j(n)} and, for i = 1, ..., k(n), x_i ∈_U {0,1}^n, independently. Since D is computationally indistinguishable from f(U), this last ensemble is computationally indistinguishable from D^k ∘ r by Proposition 2.3/2.4 (depending on the model of security). D^k ∘ r has Shannon entropy

k(n) · Ent(D_n) + |r| = k(n) · Ent(D_n) + |h| + j(n).

Since Ent(D_n) ≥ a(n) + n^{-c} − n^{-(c+1)}, this quantity is at least n · k(n) + |h| + 1 for our choice of k(n). □

Lemma 4.5 (false entropy → prg) : If there is an f that has false entropy at least n^{-c} for some constant c ≥ 0 then there is a pseudo-random generator.

Proof Sketch : There are only n^{c+1} possible values for Ent(f(U_n)) to within an additive factor of n^{-c}. For each of the n^{c+1} possible values of Ent(f(U_n)), combining the constructions given in Lemmas 4.4, 4.2, and Proposition 1.2 yields a candidate for a pseudo-random generator stretching a bit string of length n into one of length n^{c+3}. For each of the possible values, we uniformly and randomly choose a string of length n as the input; thus the total length of all of our input is n^{c+2}. At least one of the candidates uses the correct value for Ent(f(U_n)) and hence its output is pseudo-random. We "exclusive-or" all n^{c+1} outputs to produce the final output of length n^{c+3}. Because at least one of the outputs from the candidates is pseudo-random, the final output is also pseudo-random. Thus, we have stretched an input of length n^{c+2} to a pseudo-random output of length n^{c+3}. □

Lemma 4.6 (hidden and meaningful bit → false entropy) : If there is a function f that hides a meaningful bit, then there is a function g and a constant d > 0 such that g has false entropy at least 1/n^d.

Proof Sketch : Let b be the hidden and meaningful bit for f and let c > 0 be such that an information adversary has distinguishing probability p(n) ≥ 1/n^c. Let g(x) = f(x) ∘ b(x). Let D be the probability ensemble such that D_n is given by f(x) ∘ β when x ∈_U {0,1}^n and β ∈_U {0,1}. Since f hides the bit b, g(U) is computationally indistinguishable from D. (Also, D is polynomial-samplable, which is needed in the uniform model of security.) Since b(x) has correlation at least 1/n^c with f(x) and since β is independent of f(x), it can be shown that Ent(D_n) ≥ Ent(g(U_n)) + 1/(16n^{3c}). □

Theorem C : If there is a bit b that is both hidden and meaningful for f then there is a pseudo-random generator.

Proof : Combine Lemmas 4.6 and 4.5. □

5 Pseudo-Random Generators from One-Way Functions

In the previous section we reduced the problem of constructing a pseudo-random generator to the problem of finding a function with false entropy. In this section we show, in the non-uniform model, how to construct a function g with false entropy from any one-way function f. Let D be a probability ensemble that is computationally indistinguishable from g(U). The construction has the property that if there is a (uniform/non-uniform) feasible algorithm for distinguishing D from g(U) then there is a (uniform/non-uniform) feasible algorithm for inverting f. The only problem is that D is not polynomial-samplable. In particular, even if g(U) is computationally indistinguishable from D in the uniform model, we see no way to prove that g^q(U) is computationally indistinguishable from D^q in the uniform model (see Proposition 2.3).
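The "exclusive-or" step in the proof sketch of Lemma 4.5 above rests on an information-theoretic core: the XOR of independent strings is uniform as long as at least one of them is uniform, no matter how the other candidates are distributed. A small exhaustive check (the distributions are chosen arbitrarily for illustration):

```python
from itertools import product
from collections import Counter

BITS = 3

# Candidate outputs: two arbitrary (even adversarial) distributions
# and one truly uniform candidate, all mutually independent.
biased1 = {0b101: 1.0}                      # constant output
biased2 = {0b001: 0.5, 0b111: 0.5}          # only two possible values
uniform = {v: 1 / 2 ** BITS for v in range(2 ** BITS)}

out = Counter()
for (a, pa), (b, pb), (c, pc) in product(
        biased1.items(), biased2.items(), uniform.items()):
    out[a ^ b ^ c] += pa * pb * pc

# As long as one independent candidate is uniform, the XOR is uniform.
print(all(abs(p - 1 / 2 ** BITS) < 1e-12 for p in out.values()))  # → True
```

In the lemma the guarantee on the good candidate is pseudo-randomness rather than uniformity, and the same argument goes through because XOR with an independent string preserves computational indistinguishability.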
Lemma 5.1 (one-way → non-uniform false entropy) : If there is a one-way function f (in the non-uniform sense) then there is a polynomial-time computable function g with false entropy at least 1/3n (in the non-uniform sense).

Proof : Let l(n) = ⌈log(n+1)⌉ and let k(n) = 2^{l(n)} − 1. Define g by

g(r ∘ i ∘ h ∘ x) = r ∘ i ∘ h ∘ f(x) ∘ (h(x) ↾ (i + l(n))) ∘ (r ⊙ x),

where r, x ∈ {0,1}^n, i ∈ {0,1}^{l(n)} (i is to be thought of as a number between 0 and k(n)), h ∈ H_{n,k(n)+l(n)}, h(x) ↾ m denotes the first m bits of h(x), and r ⊙ x is the inner product of r and x mod 2. Let m(n) = |r ∘ i ∘ h ∘ x|. Let U'_n be the uniform probability distribution on strings of length m(n). We describe a probability ensemble D that is computationally indistinguishable from g(U') such that Ent(D_n) ≥ Ent(g(U'_n)) + 1/3n. Although D is not going to be polynomial-samplable, it is easiest to think of D as being generated from r ∘ i ∘ h ∘ x ∈_U {0,1}^{m(n)} and from an independently chosen bit β ∈_U {0,1}. Define i(x) = ⌊log(|f^{-1}(f(x))|)⌋. D(r ∘ i ∘ h ∘ x ∘ β) is exactly the same as g(r ∘ i ∘ h ∘ x) except for possibly the last bit, which in g is always r ⊙ x. The last bit of D(r ∘ i ∘ h ∘ x ∘ β) is r ⊙ x unless i = i(x), in which case it is β.

Let ensemble E be such that E_n is the distribution on i(x) ∘ h ∘ x when x ∈_U {0,1}^n and h ∈_U H_{n,k(n)+l(n)}. We define

g'(i ∘ h ∘ x) = i ∘ h ∘ f(x) ∘ (h(x) ↾ (i + l(n))).

Claim : g' is one-way on E.

Proof of Claim : Assume there is an algorithm M with time bound T(n) that inverts g'(i ∘ h ∘ x) with probability at least p(n) = 1/T(n) for some function T(n) in resource class R when i ∘ h ∘ x is randomly chosen according to E_n, i.e., on input i(x) ∘ h ∘ f(x) ∘ (h(x) ↾ (i(x) + l(n))), M finds y such that f(y) = f(x) and h(y) ↾ (i(x) + l(n)) = h(x) ↾ (i(x) + l(n)). We use only the fact that f(y) = f(x) in the proof of the claim. We construct an algorithm M' with time bound polynomial in T(n) and n that inverts f(x) with probability at least p(n)/2. Let j(n) = 8 log(T(n)). On input f(x), M' runs through all possible i = 0, ..., k(n). For each i, M' chooses h ∈_U H_{n,k(n)+l(n)} and s ∈_U {0,1}^{i−j(n)}. For each t ∈ {0,1}^{j(n)+l(n)}, M' forms u = s ∘ t and simulates M on input i ∘ h ∘ f(x) ∘ u. We claim that with probability at least p(n)/2 there is a round, i.e., an i and a t, where this simulation yields a y with f(y) = f(x). The round in question is when i = i(x). We know that the probability that M, on input i(x) ∘ h ∘ f(x) ∘ (h(x) ↾ (i(x) + l(n))), for x and h randomly selected, yields such a y is at least p(n). If instead of trying the one value h(x) ↾ (i(x) + l(n)) we try (h(x) ↾ (i(x) − j(n))) ∘ t for all possible extensions t ∈ {0,1}^{j(n)+l(n)}, the chance that M finds such a y can only increase. Fix f(x) and consider the distribution h ∘ (h(x') ↾ (i(x) − j(n))) for x' randomly and uniformly chosen from f^{-1}(f(x)). The distribution on x' has min-entropy log(|f^{-1}(f(x))|) ≥ i(x). Hence, by Lemma 3.1, the distribution h ∘ (h(x') ↾ (i(x) − j(n))) is statistically indistinguishable within 3p(n)^2 from the distribution h ∘ s when h ∈_U H_{n,k(n)+l(n)} and s ∈_U {0,1}^{i(x)−j(n)}. Thus the probability that, for at least one t, M on input i(x) ∘ h ∘ f(x) ∘ s ∘ t finds a preimage of f(x) for s chosen at random differs from the probability when s is given by h(x) ↾ (i(x) − j(n)) by at most 3p(n)^2. Thus, this probability for a random s is at least p(n) − 3p(n)^2 ≥ p(n)/2. From this contradiction of the one-wayness of f, we conclude that g' is one-way on E. End of Claim Proof

We now conclude the proof of the lemma. To distinguish D from g(U') is equivalent to being able to predict r ⊙ x, for i(x) ∘ h ∘ x randomly chosen according to E and r ∈_U {0,1}^n, given r ∘ g'(i(x) ∘ h ∘ x). From the above claim and from Proposition 1.3 it follows that this task is computationally infeasible, and thus g(U') and D are computationally indistinguishable.

All that remains to be shown is that Ent(D_n) ≥ 1/3n + Ent(g(U'_n)). This follows from the fact that, with probability at least 1 − 1/n, when x ∈_U {0,1}^n and h ∈_U H_{n,k(n)+l(n)}, x is uniquely determined by f(x) and h(x) ↾ (i(x) + l(n)). When x is fixed and i ∈_U {0,1}^{l(n)}, i = i(x) with probability at least 1/2n. Thus, with probability at least (1 − 1/n)/2n ≥ 1/3n the last bit of g(r ∘ i ∘ h ∘ x) is completely determined by the preceding bits, whereas in D(r ∘ i ∘ h ∘ x ∘ β) the last bit β is always independent of the preceding bits, and thus there is one extra bit of entropy in D_n. Furthermore, in all cases, r ⊙ x adds at most one bit of entropy to the preceding bits of g(r ∘ i ∘ h ∘ x). □
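Two ingredients of Lemma 5.1 are easy to compute for toy parameters: the degeneracy measure i(x) = ⌊log2 |f^{-1}(f(x))|⌋, which the construction guesses via the input component i, and the inner-product bit r ⊙ x. The function and parameters below are our own illustration, not the paper's.

```python
import math

n = 6
def f(x):
    # Toy function on n-bit inputs with preimage sets of varying size.
    return x % 13

# Tabulate f^-1(y) for every image point y.
preimage = {}
for x in range(2 ** n):
    preimage.setdefault(f(x), []).append(x)

def i_of(x):
    # i(x) = floor(log2 |f^-1(f(x))|): roughly how many bits of x
    # remain undetermined given f(x).
    return int(math.floor(math.log2(len(preimage[f(x)]))))

def ip_bit(r, x):
    # The hidden bit r (.) x: inner product of r and x over GF(2).
    return bin(r & x).count("1") % 2

print(i_of(0), ip_bit(0b101011, 0b110001))  # → 2 0
```

For this f, residues 0 through 11 have five preimages each and residue 12 has four, so i(x) is 2 for every x; in the lemma, outputting i(x) hash bits of x alongside f(x) pins x down while leaving the inner-product bit r ⊙ x unpredictable.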
Theorem A : If there is a one-way function in the non-uniform model of security then there is a pseudo-random generator in the non-uniform model of security.

Proof : Combine Lemma 5.1 and Lemma 4.5. □

6 Open Problems

The results presented in this paper unify different concepts in theoretical cryptography. When combined with other work ([Goldreich, Goldwasser, Micali 84], [Luby, Rackoff 86], [Goldreich, Micali, Wigderson 86], [Naor 88]), they show that applications ranging from private key encryption to zero-knowledge proofs can be based on any one-way function. [Impagliazzo, Luby 89] shows that most cryptographic applications that are impossible in a world where anything that is informationally possible is computationally possible must be implicitly based on a one-way function.

Several very interesting open questions remain. We have shown that in the non-uniform model of security any one-way function yields a pseudo-random generator. Is this also true in the uniform model of security? A more general problem is to characterize the conditions under which cryptographic applications are possible. Although there are characterizations for some applications in this paper combined with [Impagliazzo, Luby 89], many others remain open. [Naor, Yung 89] give a signature scheme that can be based on any one-way permutation. Can this assumption be weakened to give a signature scheme based on any one-way function? Some applications seem unlikely to be shown possible based on any one-way function; e.g., [Impagliazzo, Rudich 89] give strong evidence that secret exchange is an application of this kind.

Another important issue is that of efficiency; the construction we give here for a pseudo-random generator based on any one-way function increases the size of the seed by a large polynomial amount. For practical applications, it would be nice to have a much more parsimonious construction. We would like to develop constructions that are more efficient than the existing ones, with the goal being a private key cryptosystem based on the intractability of some natural problem that is as fast as any cryptosystem used in practice.

7 Acknowledgements

This research evolved over a long period of time and was greatly influenced by many people. We thank Amos Fiat, Moni Naor, Ronitt Rubinfeld, Manuel Blum, Steven Rudich, Noam Nisan, Lance Fortnow, Umesh Vazirani, Charlie Rackoff, Oded Goldreich and Hugo Krawczyk for their insights and contributions to this work. We in particular thank Charlie, Umesh and Manuel for their advice and enthusiasm, Lance for suggesting a version of Lemma 3.2 and Oded and Hugo for introducing the third author to this question.

References

[1] Blum, M., "Independent Unbiased Coin Flips From a Correlated Biased Source: A Finite State Markov Chain", 25th FOCS, 1984, pp. 425-433.

[2] Blum, M., and Micali, S., "How to Generate Cryptographically Strong Sequences of Pseudo-Random Bits", SIAM J. on Computing, Vol. 13, 1984, pp. 850-864, FOCS 1982.

[3] Carter, J., and Wegman, M., "Universal Classes of Hash Functions", JCSS, Vol. 18, 1979, pp. 143-154.

[4] Chor, B., and Goldreich, O., "Unbiased Bits from Sources of Weak Randomness and Probabilistic Communication Complexity", SIAM J. on Computing, Vol. 17, 1988, FOCS 1985.

[5] Goldreich, O., Goldwasser, S., and Micali, S., "How to Construct Random Functions", J. of ACM, Vol. 33, No. 4, 1986, pp. 792-807, FOCS 1984.

[6] Goldreich, O., Krawczyk, H., and Luby, M., "On the Existence of Pseudorandom Generators", 29th FOCS, 1988, pp. 12-24.

[7] Goldreich, O., and Levin, L.A., "A Hard-Core Predicate for any One-Way Function", these proceedings.

[8] Goldreich, O., Micali, S., and Wigderson, A., "Proofs that Yield Nothing but Their Validity and a Methodology of Cryptographic Protocol Design", 27th FOCS, 1986, pp. 174-187, Tech. Report TR498, Comp. Sci. Dept., Technion, submitted to JACM.

[9] Goldwasser, S., and Micali, S., "Probabilistic Encryption," JCSS, Vol. 28, No. 2, April 1984, pp. 270-299, STOC 1982.
[10] Goldwasser, S., Micali, S., and Rackoff, C., "The Knowledge Complexity of Interactive Proof Systems," SIAM J. on Computing, Vol. 18, No. 1, 1989, pp. 186-208, STOC 1985.

[11] Impagliazzo, R., and Luby, M., "One-way functions are essential for information based cryptography," submitted to CRYPTO 1989.

[12] Impagliazzo, R., and Rudich, S., "Limits on the Provable Consequences of One-way Functions", these proceedings.

[13] Karp, R., and Lipton, R., "Turing Machines that Take Advice," L'Enseignement Mathematique, Vol. 28, 1982, pp. 191-209, STOC 1980.

[14] Levin, L.A., "One-Way Functions and Pseudorandom Generators", Combinatorics, Vol. 7, No. 4, 1987, pp. 357-363, STOC 1985.

[15] Luby, M., and Rackoff, C., "How to Construct Pseudorandom Permutations From Pseudorandom Functions", SIAM J. on Computing, Vol. 17, 1988, pp. 373-386, STOC 1986.

[16] McInnes, J., "Cryptography Using Weak Sources of Randomness," Tech. Report 194/87, U. of Toronto, 1987.

[17] Naor, M., personal communication, 1988.

[18] Naor, M., and Yung, M., "Universal One-way Hash Functions and Their Applications", these proceedings.

[19] Renyi, A., Probability Theory, North-Holland, Amsterdam, 1970.

[20] Shannon, C., "A Mathematical Theory of Communication", Bell Systems Technical Journal, 27, 1948, pp. 379-423 and pp. 623-656.

[21] Santha, M., and Vazirani, U., "Generating Quasi-random Sequences from Slightly-random Sources", 25th FOCS, 1984, pp. 434-440, JCSS, Vol. 33, No. 1, 1986.

[22] Vazirani, U., "Towards a Strong Communication Complexity Theory or Generating Quasi-random Sequences from Two Communicating Slightly-random Sources", 17th STOC, 1985, pp. 366-378, Combinatorics, Vol. 7, No. 4, 1987.

[23] Vazirani, U., and Vazirani, V., "Random Polynomial Time is Equal to Slightly-random Polynomial Time", 26th FOCS, 1985, pp. 417-428, submitted to JACM.

[24] Yao, A.C., "Theory and Applications of Trapdoor Functions", 23rd FOCS, 1982, pp. 80-91.
