
Modern Cryptography
www.dziembowski.net/Studenti/BISS09

Lecture 1
Introduction to Cryptography

Stefan Dziembowski
University of Rome La Sapienza

BiSS 2009
Bertinoro International Spring School
2-6 March 2009
Plan
1. Introduction
2. Historical ciphers
3. Information-theoretic security
4. Computational security
Cryptography

In the past:
the art of encrypting messages (mostly for military applications).

Now:
the science of securing digital communication and transactions
(encryption, authentication, digital signatures, e-cash, auctions, etc.).
Terminology

Cryptology = cryptography (constructing secure systems)
           + cryptanalysis (breaking the systems)

This convention is slightly artificial and often ignored.

Common usage:
"cryptanalysis of X" = "breaking X"
Cryptography – general picture

Plan of the course:

                 encryption     authentication
private key      1              2
public key       3              4 (signatures)

5. advanced cryptographic protocols
Encryption schemes
(a very general picture)

Encryption scheme (cipher) = encryption & decryption

Alice -- plaintext m --> [encryption] -- ciphertext c --> [decryption] --> m -- Bob

Eve observes the ciphertext but should not learn m.

In the past: m was a text in natural language.
Now: m is a string of bits.
Art vs. science

In the past:
lack of precise definitions, ad-hoc design, usually insecure.

Nowadays:
formal definitions, systematic design, very secure constructions.
Provable security

We want to construct schemes that are provably secure.

But...
• why do we want to do it?
• how to define it?
• and is it possible to achieve it?
Provable security – the motivation

In many areas of computer science formal proofs are not essential.
For example, instead of proving that an algorithm is efficient,
we can just simulate it on a "typical input".

In cryptography this does not work, because
there cannot exist an experimental proof that a scheme is secure.

Why? Because the notion of a "typical adversary" does not make sense.

Security definitions are useful also because they allow us to construct
schemes in a modular way...
Kerckhoffs' principle

Auguste Kerckhoffs (1883):
The enemy knows the system.

The cipher should remain secure even if the adversary knows the
specification of the cipher.

The only thing that is secret is a short key k,
which is usually chosen uniformly at random.
A more refined picture

Alice -- plaintext m --> [encryption] -- ciphertext c --> [decryption] --> m -- Bob
         key k                                            key k

Eve doesn't know k and should not learn m.

(Of course Bob can use the same method to send messages to Alice.)
(That's why it's called the symmetric setting.)

Let us assume that k is uniformly random.
Kerckhoffs' principle – the motivation

1. In commercial products it is unrealistic to assume that the design
   details remain secret (reverse-engineering!).
2. Short keys are easier to protect, generate and replace.
3. The design details can be discussed and analyzed in public.

Not respecting this principle = "security by obscurity".
A mathematical view

K – key space
M – plaintext space
C – ciphertext space

An encryption scheme is a pair (Enc,Dec), where
• Enc : K × M → C is an encryption algorithm,
• Dec : K × C → M is a decryption algorithm.

We will sometimes write Enck(m) and Deck(c) instead of Enc(k,m) and Dec(k,c).

Correctness
For every k and every m we should have Deck(Enck(m)) = m.
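To make the (Enc,Dec) formalism concrete, here is a minimal Python sketch (not part of the original slides; the XOR-based scheme is just a placeholder) together with a check of the correctness condition:

```python
import os

def enc(k: bytes, m: bytes) -> bytes:
    """Enc : K x M -> C (placeholder scheme: byte-wise XOR with the key)."""
    assert len(k) == len(m)
    return bytes(ki ^ mi for ki, mi in zip(k, m))

def dec(k: bytes, c: bytes) -> bytes:
    """Dec : K x C -> M (XOR is its own inverse)."""
    assert len(k) == len(c)
    return bytes(ki ^ ci for ki, ci in zip(k, c))

# Correctness: for every k and m, Dec_k(Enc_k(m)) = m.
k = os.urandom(16)
m = b"attack at dawn!!"
assert dec(k, enc(k, m)) == m
```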
Plan
1. Introduction
2. Historical ciphers
3. Information-theoretic security
4. Computational security
Shift cipher

M = words over alphabet {A,...,Z} ≈ {0,...,25}
K = {0,...,25}

Enck(m0,...,mn) = (m0 + k mod 26, ..., mn + k mod 26)
Deck(c0,...,cn) = (c0 - k mod 26, ..., cn - k mod 26)

Caesar: k = 3
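A minimal Python sketch of the shift cipher (an illustration added to this write-up; note that the arithmetic is mod 26, the size of the alphabet):

```python
def shift_enc(k: int, m: str) -> str:
    """Shift each letter of an uppercase string by k positions mod 26."""
    return "".join(chr((ord(ch) - ord('A') + k) % 26 + ord('A')) for ch in m)

def shift_dec(k: int, c: str) -> str:
    """Invert the shift: subtract k mod 26."""
    return "".join(chr((ord(ch) - ord('A') - k) % 26 + ord('A')) for ch in c)

print(shift_enc(3, "VENIVIDIVICI"))  # Caesar, k = 3 -> "YHQLYLGLYLFL"
```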
Security of the shift cipher

How to break the shift cipher?
Check all possible keys!

Let c be a ciphertext.
For every k Є {0,...,25} check if Deck(c) "makes sense".
Most probably only one such k exists.
Thus Deck(c) is the message.

This is called a brute-force attack.

Moral: the key space needs to be large!
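A Python sketch of the brute-force attack (the "makes sense" test is approximated here by a toy word list, which is an assumption of this illustration, not part of the lecture):

```python
def shift_dec(k: int, c: str) -> str:
    return "".join(chr((ord(ch) - ord('A') - k) % 26 + ord('A')) for ch in c)

# Toy stand-in for "Dec_k(c) makes sense": look for common words.
COMMON_WORDS = {"THE", "AND", "ATTACK", "DAWN"}

def brute_force(c: str) -> list[tuple[int, str]]:
    """Try all 26 keys; keep the candidates that look like English."""
    candidates = []
    for k in range(26):
        m = shift_dec(k, c)
        if any(w in m for w in COMMON_WORDS):
            candidates.append((k, m))
    return candidates

# Most probably only the correct key k = 3 survives the filter:
print(brute_force("DWWDFNDWGDZQ"))  # -> [(3, 'ATTACKATDAWN')]
```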
Substitution cipher

M = words over alphabet {A,...,Z} ≈ {0,...,25}
K = the set of permutations π of {0,...,25}

A B C D E F G H I J K L M N O P Q R S T U V W X Y Z
                      ↓ π
(a permutation of the alphabet)

Encπ(m0,...,mn) = (π(m0),..., π(mn))
Decπ(c0,...,cn) = (π^-1(c0),..., π^-1(cn))
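A Python sketch of the substitution cipher (illustrative only; random.shuffle is not a cryptographic generator, but it suffices to demonstrate the scheme):

```python
import random
import string

def keygen() -> dict[str, str]:
    """The key is a uniformly random permutation pi of the alphabet."""
    letters = list(string.ascii_uppercase)
    shuffled = letters[:]
    random.shuffle(shuffled)
    return dict(zip(letters, shuffled))

def subst_enc(pi: dict[str, str], m: str) -> str:
    return "".join(pi[ch] for ch in m)

def subst_dec(pi: dict[str, str], c: str) -> str:
    inv = {v: k for k, v in pi.items()}  # pi^-1
    return "".join(inv[ch] for ch in c)

pi = keygen()
assert subst_dec(pi, subst_enc(pi, "HELLOWORLD")) == "HELLOWORLD"
```

Note the key space now has 26! ≈ 4·10^26 elements, so the brute-force attack from the previous slide is infeasible; a large key space is necessary but, as the next slide shows, not sufficient.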
How to break the substitution cipher?

Use statistical patterns of the language.
For example: the frequency tables.

Texts of 50 characters can usually be broken this way.
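A sketch of the first step of such an attack in Python (the frequency values below are approximate English letter frequencies, included only for illustration):

```python
from collections import Counter

# Approximate frequencies (%) of the most common English letters.
ENGLISH_FREQ = {"E": 12.7, "T": 9.1, "A": 8.2, "O": 7.5, "I": 7.0, "N": 6.7}

def letter_frequencies(c: str) -> list[tuple[str, float]]:
    """Empirical letter frequencies of a ciphertext, most common first."""
    counts = Counter(ch for ch in c if ch.isalpha())
    total = sum(counts.values())
    return [(ch, 100.0 * n / total) for ch, n in counts.most_common()]

# Matching the most frequent ciphertext letter to 'E', the next to 'T',
# etc. recovers large parts of pi for moderately long texts.
```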
Other famous historical ciphers

Vigenère cipher:
Blaise de Vigenère (1523–1596), Leon Battista Alberti (1404–1472)

Enigma:
Marian Rejewski (1905–1980), Alan Turing (1912–1954)
In the past, ciphers were designed in an ad-hoc manner.

In contemporary cryptography the ciphers are designed in a systematic way.

Main goals:
1. define security
2. construct schemes that are "provably secure"
Plan
1. Introduction
2. Historical ciphers
3. Information-theoretic security
4. Computational security
Defining "security of an encryption scheme" is not trivial.

Consider the following experiment (m – a message):

1. the key K is chosen uniformly at random
2. C := EncK(m) is given to the adversary

How to define security?
Idea 1

(m – a message)
1. the key K is chosen uniformly at random
2. C := EncK(m) is given to the adversary

An idea
"The adversary should not be able to compute K."

A problem
The encryption scheme that "doesn't encrypt":
EncK(m) = m
satisfies this definition!
Idea 2

(m – a message)
1. the key K is chosen uniformly at random
2. C := EncK(m) is given to the adversary

An idea
"The adversary should not be able to compute m."

A problem
What if the adversary can compute, e.g., the first half of m?

m1 ... m|m|/2 ? ... ?
Idea 3

(m – a message)
1. the key K is chosen uniformly at random
2. c := Enck(m) is given to the adversary

An idea
"The adversary should not learn any information about m."

A problem
But he may already have some a priori information about m!
For example, he may know that m is a sentence in English...
Idea 4

(m – a message)
1. the key K is chosen uniformly at random
2. c := EncK(m) is given to the adversary

An idea
"The adversary should not learn any additional information about m."

This makes much more sense. But how to formalize it?
We will use the language of probability theory.
Notation

Let A : Ω → A be a random variable. Then
PA : A → [0,1] denotes the distribution of A:
PA(a) = P(A = a).

For two distributions PA and PB we write
PA = PB
if they are equal (as functions). This is the same as saying:
"for every a: P(A = a) = P(B = a)".

If X is an event, then PA|X denotes the distribution of A conditioned on X:
PA|X(a) = P(A = a | X).
Notation

Two (discrete) random variables


A : Ω → A and B : Ω → B

are independent if

for every a and b:


P(A = a and B = b) = P(A = a) ∙ P(B = b).
Independence: equivalent formulations

(1) for every a and b:
    P(A = a and B = b) = P(A = a) ∙ P(B = b)

(2) for every a and b (technical assumption: P(B = b) ≠ 0):
    P(A = a) = P(A = a | B = b)
    [since P(A = a | B = b) = P(A = a and B = b) / P(B = b)]

(3) for every a and b0, b1 (technical assumptions: P(B = b0) ≠ 0, P(B = b1) ≠ 0):
    P(A = a | B = b0) = P(A = a | B = b1)

i.e., for every b0, b1:
    PA|B=b0 = PA|B=b1
More notation

If
• A : Ω → A is a random variable, and
• f : A → B is a function,
then f(A) denotes a random variable Ω → B, defined as
f(A)(ω) = f(A(ω)).

For example, if A has a uniform distribution over {1,2,3,4,5},
then A² has a uniform distribution over {1,4,9,16,25}.
Notation

If A is a set then
Y←A
means that Y is chosen uniformly at random
from the set A.
How to formalize "Idea 4"?

"The adversary should not learn any additional information about m."

An encryption scheme is perfectly secret
(also called: information-theoretically secret) if

for every random variable M,
and every m Є M and c Є C such that P(C = c) > 0:
P(M = m) = P(M = m | Enc(K,M) = c)

Equivalently: M and Enc(K,M) are independent.
Equivalently:

for every M we have that: M and Enc(K,M) are independent

⇔ for every m0 and m1 we have:
  PEnc(K,M)|M=m0 = PEnc(K,M)|M=m1

⇔ for every m0 and m1 we have:
  PEnc(K,m0) = PEnc(K,m1)

This last formulation is the most intuitive...
A perfectly secret scheme: one-time pad
(Vernam's cipher; Gilbert Vernam, 1890–1960)

t – a parameter
K = M = {0,1}^t

Enck(m) = k xor m        (component-wise xor)
Deck(c) = k xor c

Correctness is trivial:
Deck(Enck(m)) = k xor (k xor m) = m
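A Python sketch of the one-time pad (an illustration added here; bytes stand in for bit strings, so t is a multiple of 8):

```python
import os

def otp_keygen(t_bytes: int) -> bytes:
    """k chosen uniformly at random from {0,1}^t."""
    return os.urandom(t_bytes)

def otp_enc(k: bytes, m: bytes) -> bytes:
    assert len(k) == len(m)  # the key must be as long as the message
    return bytes(ki ^ mi for ki, mi in zip(k, m))

otp_dec = otp_enc  # Dec_k(c) = k xor c, the same operation as Enc

m = b"top secret"
k = otp_keygen(len(m))
assert otp_dec(k, otp_enc(k, m)) == m  # k xor (k xor m) = m
```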
Perfect secrecy of the one-time pad

Perfect secrecy of the one-time pad is also trivial.

This is because for every m the distribution PEnc(K,m) is uniform
(and hence does not depend on m):

for every c:
P(Enc(K,m) = c) = P(K = m xor c) = 2^-t
Observation

The one-time pad can be generalized as follows.

Let (G,+) be a group. Let K = M = C = G.

The following is a perfectly secret encryption scheme:
• Enc(k,m) = m + k
• Dec(k,c) = c − k
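A Python sketch of this generalization, taking G = (Z26, +) applied position-by-position with an independent key element per position (this per-letter choice is an illustrative assumption, not from the slides; formally it is the group G^n):

```python
import random

def group_enc(k: list[int], m: list[int]) -> list[int]:
    """Enc(k,m) = m + k in Z_26, component-wise."""
    return [(mi + ki) % 26 for mi, ki in zip(m, k)]

def group_dec(k: list[int], c: list[int]) -> list[int]:
    """Dec(k,c) = c - k in Z_26, component-wise."""
    return [(ci - ki) % 26 for ci, ki in zip(c, k)]

m = [7, 4, 11, 11, 14]                  # "HELLO" as elements of Z_26
k = [random.randrange(26) for _ in m]   # uniform key, one element per position
assert group_dec(k, group_enc(k, m)) == m
```

With G = ({0,1}^t, xor) this is exactly the one-time pad.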
Why is the one-time pad not practical?

1. The key has to be as long as the message.
2. The key cannot be reused.

This is because:
Enck(m0) xor Enck(m1) = (k xor m0) xor (k xor m1) = m0 xor m1
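A short Python demonstration of this key-reuse problem (the sample plaintexts are made up for the illustration):

```python
import os

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

m0, m1 = b"attack at dawn", b"retreat now!!!"
k = os.urandom(len(m0))
c0, c1 = xor(k, m0), xor(k, m1)  # the same key used twice

# The key cancels out, leaking m0 xor m1 without any knowledge of k:
assert xor(c0, c1) == xor(m0, m1)
```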
The one-time pad is optimal in the class of perfectly secret schemes

Theorem (Shannon 1949)
In every perfectly secret encryption scheme
Enc : K × M → C, Dec : K × C → M
we have |K| ≥ |M|.

Proof
Perfect secrecy implies that the distribution of Enc(K,m) does not
depend on m. Without loss of generality we can assume that (for each m)
C = {Enc(k,m)}kЄK.
Hence: |K| ≥ |C|.

Fact: we always have |C| ≥ |M|.
This is because for every k the function Enck : M → C is an injection
(otherwise we wouldn't be able to decrypt).

Combining: |K| ≥ |C| ≥ |M|. □
Practicality?

Generally, the one-time pad is not very practical, since:
• the key has to be as long as the total length of the encrypted
  messages,
• it is hard to generate truly random strings.

However, it is sometimes used (e.g. in military applications),
because of the following advantages:
• perfect secrecy,
• short messages can be encrypted using pencil and paper.

[photo: a KGB one-time pad hidden in a walnut shell]

In the 1960s the Americans and the Soviets established a hotline
that was encrypted using the one-time pad. (Additional advantage:
they didn't need to share their secret encryption methods.)
Venona project (1946 – 1980)

The American National Security Agency decrypted Soviet messages
that were transmitted in the 1940s.

That was possible because the Soviets reused the keys in the
one-time pad scheme.

[photo: Ethel and Julius Rosenberg]
Outlook

We constructed a perfectly secret encryption scheme.

Our scheme has certain drawbacks (|K| ≥ |M|).

But by Shannon's theorem this is unavoidable.

Can we go home and relax?
What to do?

Idea
Use a model where the power of the adversary is limited.

How?

Classical (computationally-secure) cryptography:
bound his computational power.

Alternative options (not too practical):
quantum cryptography, bounded-storage model, ...
Quantum cryptography
Stephen Wiesner (1970s), Charles H. Bennett and Gilles Brassard (1984)

Alice <--- quantum link ---> Bob

Quantum indeterminacy: quantum states cannot be measured without
disturbing the original state.

Hence Eve cannot read the bits in an unnoticeable way.
Quantum cryptography

Advantage:
security is based on the laws of quantum physics.

Disadvantage:
needs dedicated equipment.

Practicality?
Currently: successful transmissions for distances of around 150 km.
Commercial products are available.

Warning:
Quantum cryptography should not be confused with quantum computing.
A satellite scenario

A third party (a satellite) is broadcasting random bits:

000110100111010010011010111001110111
111010011101010101010010010100111100
001001111111100010101001000101010010
001010010100101011010101001010010101

Does it help Alice and Bob against Eve?
No... (Shannon's theorem of course also holds in this case.)
Ueli Maurer (1993): noisy channel.

[figure: the broadcast bit string as received by Alice, Bob and Eve;
some bits get flipped because of the noise]

Assumption: the data that the adversary receives is noisy.
(The data that Alice and Bob receive may be even more noisy.)
Bounded-Storage Model

Another idea: bound the size of the adversary's memory.

000110100111010010011010111001110111
111010011101010101010010010100111100
001001111111100010101001000101010010
001010010100101011010101001010010101

The broadcast is too large to fit in Eve's memory.
Plan
1. Introduction
2. Historical ciphers
3. Information-theoretic security
4. Computational security
Computing power of the adversary

In practice, the adversary always has limited computing power.

Therefore, for real-life applications, it is enough if the schemes
are secure against computationally-limited adversaries.

How to model this?
How to reason about bounded computing power?
We required that
M and EncK(M)
are independent.

It is enough to require that
M and EncK(M)
are independent
"from the point of view of a computationally-limited adversary".

How can this be formalized?
We will use complexity theory!
Practical cryptography starts here:

Eve is computationally-bounded.

We will construct schemes that in principle can be broken if the
adversary has a huge computing power.

For example, the adversary will be able to break the scheme by
enumerating all possible secret keys.
(This is called a "brute-force attack".)
Computationally-bounded adversary

Eve is computationally-bounded. But what does that mean?

Ideas:
1. "She can use at most 1000 Intel Core 2 Extreme X6800 Dual Core
   Processors for at most 100 years..."
2. "She can buy equipment worth 1 million euro and use it for 30 years..."

It's hard to reason formally about this.
A better idea

"The adversary has access to a Turing Machine that can make at most
10^30 steps."

More generally, we could have definitions of the type:
"a system X is (t,ε)-secure if every Turing Machine that operates in
time t can break it with probability at most ε."

This would be quite precise, but...
We would need to specify exactly what we mean by a "Turing Machine":
• how many tapes does it have?
• how does it access these tapes (maybe a "random access memory" is
  a more realistic model...)?
• ...

Moreover, this approach often leads to ugly formulas...
What to do?

Idea:
t steps of a Turing Machine → "efficient computation"
ε → a value "very close to zero"

How to formalize this?
Use the asymptotics!
Efficiently computable?

"efficiently computable" = "polynomial-time computable on a
Probabilistic Turing Machine"

that is: running in time O(n^c) (for some c).

Here we assume that poly-time Turing Machines are the right model
for real-life computation.
Not true if a quantum computer is built...
Probabilistic Turing Machines

A standard Turing Machine has some number of tapes.

A probabilistic Turing Machine has an additional tape with random bits:
0 1 1 0 1 0 1 1 0 1 ...
Some notation

If M is a Turing Machine, then
M(X)
is a random variable denoting the output of M on input X, assuming
that the contents of the random tape were chosen uniformly at random.
More notation

Y ← M(X)
means that the variable Y takes the value that M
outputs on input X (assuming the random
input is chosen uniformly).
Interactive Turing Machines

Two machines A and B interact:
A has read-only access to B's output tape, and
B has read-only access to A's output tape.
Interactive Turing Machines

Of course, we can generalize this to a group of n machines interacting
with each other.

(We can also model: private channels, broadcast channels, partial
broadcast channels, etc.)

Usually, we consider only poly-time, probabilistic machines.

A group of interactive Turing Machines is sometimes called a protocol.
Very small?

"very small"
= "negligible"
= approaches 0 faster than the inverse of any polynomial.

Formally:

A function µ : N → R is negligible if
for every c there exists n0 such that for every n > n0:
|µ(n)| ≤ 1/n^c
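A numeric illustration in Python (added here as an example, not from the slides): µ(n) = 2^-n is negligible, since for every fixed c it eventually stays below 1/n^c, i.e. n^c ≤ 2^n for all n beyond some threshold n0:

```python
def threshold(c: int, limit: int = 1000) -> int:
    """Smallest n0 (found by scanning down from `limit`) such that
    n^c <= 2^n holds for every n in [n0, limit]."""
    n0 = limit
    for n in range(limit, 0, -1):
        if n ** c <= 2 ** n:
            n0 = n
        else:
            break  # the condition fails below this point
    return n0

for c in (1, 2, 5, 10):
    print(f"c = {c}: 2^-n <= 1/n^{c} for all n >= {threshold(c)}")
# The threshold n0 grows with c, but it always exists.
```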
Nice properties of these notions

• A sum of two polynomials is a polynomial: poly + poly = poly
• A product of two polynomials is a polynomial: poly * poly = poly
• A sum of two negligible functions is a negligible function:
  negl + negl = negl

Moreover:
• A negligible function multiplied by a polynomial is negligible:
  negl * poly = negl
Security parameter

Typically, we will say that a scheme X is secure if

for every polynomial-time Turing Machine M:
P(M breaks the scheme X) is negligible.

The terms "negligible" and "polynomial" make sense only if X (and the
adversary) take an additional input 1^n called a security parameter.

In other words: we consider an infinite sequence X(1), X(2), ...
of schemes.
Example

security parameter n = the length of the secret key k
in other words: k is always a random element of {0,1}^n

The adversary can always guess k with probability 2^-n.
This probability is negligible.

He can also enumerate all possible keys k in time 2^n
(the "brute-force" attack).
This time is exponential.
Is this the right approach?

Advantages
1. All types of Turing Machines are "equivalent" up to a "polynomial
   reduction". Therefore we do not need to specify the details of the
   model.
2. The formulas get much simpler.

Disadvantage
Asymptotic results don't tell us anything about the security of
concrete systems.

However
Usually one can prove formally an asymptotic result and then argue
informally that "the constants are reasonable" (and can be calculated
if one really wants).
How to change the security definition?

Recall: an encryption scheme is perfectly secret if for every m0,m1 Є M
PEnc(K, m0) = PEnc(K, m1)

We will relax this in two ways:
• m0,m1 are chosen by a poly-time adversary,
• we only require that no poly-time adversary can distinguish
  Enc(K, m0) from Enc(K, m1).
A game

(Enc,Dec) – an encryption scheme; security parameter 1^n

adversary                            oracle
(polynomial-time probabilistic
Turing machine)

1. chooses m0,m1 such that
   |m0|=|m1| and sends them  --->
                                     2. selects k randomly from {0,1}^n
                                     3. chooses a random bit b Є {0,1}
                             <---    4. calculates c := Enc(k,mb)
5. has to guess b

Security definition:
We say that (Enc,Dec) is semantically-secure if every polynomial-time
adversary guesses b correctly with probability at most 0.5 + ε(n),
where ε is negligible.

Alternative name: has indistinguishable encryptions
(sometimes we will say: "is computationally-secure", if the context is clear).
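A Python sketch of this game (the choice of the one-time pad as the scheme, the random message pair, and the coin-flipping adversary are assumptions of this illustration):

```python
import os
import random

def game(adversary_guess, n: int = 16) -> bool:
    """One round of the indistinguishability game; returns True if won."""
    m0, m1 = os.urandom(n), os.urandom(n)  # adversary's choice, |m0| = |m1|
    k = os.urandom(n)                      # oracle: k uniform in {0,1}^n
    b = random.randrange(2)                # oracle: random bit b
    c = bytes(ki ^ mi for ki, mi in zip(k, (m0, m1)[b]))  # c = Enc(k, mb)
    return adversary_guess(m0, m1, c) == b

# Against the one-time pad, a guessing adversary wins ~half the time,
# and no adversary (even unbounded) can do better:
wins = sum(game(lambda m0, m1, c: random.randrange(2)) for _ in range(10_000))
print(wins / 10_000)  # ~0.5: no advantage
```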
Testing the definition

Suppose the adversary can compute k from Enc(k,m). Can he win the game?
YES!

Suppose the adversary can compute some bit of m from Enc(k,m). Can he
win the game?
YES!
Multiple messages

In real-life applications we need to encrypt multiple messages with
one key.

The adversary may learn something about the key by looking at
ciphertexts c1,...,ct of some messages m1,...,mt.

How are these messages chosen?
Let's say: the adversary can choose them!
(Good tradition: be as pessimistic as possible.)
A chosen-plaintext attack (CPA)

security parameter 1^n

oracle: 1. selects random k Є {0,1}^n
        2. chooses a random bit b Є {0,1}

adversary                    oracle
chooses m'1      --- m'1 -->
                 <-- c1 ---  c1 = Enc(k,m'1)
...
chooses m't      --- m't -->
                 <-- ct ---  ct = Enc(k,m't)

challenge phase:
chooses m0,m1    -- m0,m1 ->
                 <--- c ---  c = Enc(k,mb)

the interaction continues . . .

has to guess b
CPA-security

Security definition
We say that (Enc,Dec) has indistinguishable encryptions under a
chosen-plaintext attack (alternative name: is CPA-secure) if

every randomized polynomial-time adversary guesses b correctly with
probability at most 0.5 + ε(n), where ε is negligible.

Observation
Every CPA-secure encryption has to be
• randomized, or
• "have a state".
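A Python sketch of why a deterministic, stateless scheme cannot be CPA-secure (the XOR scheme and the message pair are hypothetical placeholders): the adversary queries the oracle on m0 during the attack phase and then just compares ciphertexts in the challenge phase.

```python
import os
import random

def det_enc(k: bytes, m: bytes) -> bytes:
    # placeholder deterministic, stateless scheme (here: plain XOR with k)
    return bytes(ki ^ mi for ki, mi in zip(k, m))

k = os.urandom(16)
m0 = b"YES" + bytes(13)  # both messages padded to 16 bytes
m1 = b"NO!" + bytes(13)

c0 = det_enc(k, m0)           # CPA query: ask the oracle to encrypt m0
b = random.randrange(2)       # oracle picks the challenge bit
c = det_enc(k, (m0, m1)[b])   # challenge ciphertext

guess = 0 if c == c0 else 1   # deterministic scheme: ciphertexts repeat
assert guess == b             # the adversary wins with probability 1
```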
CPA in real life

Q: Aren't we too pessimistic?
A: No! CPA can be implemented in practice.

Example: routing.
[figure: a message m chosen by the adversary is routed over a link
encrypted with key k, so the adversary gets to see Enck(m) for a
plaintext of his choice]
Is it possible to prove security?

Bad news:

Theorem
If semantically-secure encryption exists (with |k| < |m|), then
P ≠ NP.

Intuition: if P = NP then the adversary can guess the key...
Proof [1/3]

(Enc,Dec) – an encryption scheme. For simplicity suppose that Enc is
deterministic.

Consider the following language:
L = ∪n { (Enc(k,m), m) : k Є {0,1}^n, m Є {0,1}^(n+1) }

Clearly L is in NP (k is the NP-witness).
Suppose P = NP; hence L is poly-time decidable.

The adversary:
1. chooses random m0,m1 such that |mi| = n+1 and sends them to the oracle
2. the oracle selects k randomly, chooses a random bit b = 0,1, and
   calculates c := Enc(k,mb)
3. on receiving c: if (c,m0) Є L then output 0, else output 1.
Proof [2/3]

The adversary guesses b incorrectly only if b = 1 and (c,m0) Є L.
In other words:
"there exists k' such that Enck(m1) = Enck'(m0)".

What is the probability that this happens?
Proof [3/3]

[figure: a table of ciphertexts Enck(m), with rows indexed by messages
m and columns indexed by keys k]

From the correctness of encryption: a fixed ciphertext c can appear in
each column at most once.

Hence the probability that c = Enck(m1) also appears in the row of a
randomly chosen m0 is at most |K| / |M| = 1/2.

So, the adversary wins with probability at least 3/4.
Moral:
"If P = NP, then semantically-secure encryption is broken."

Is this 100% true?
Not really...

This is because even if P = NP, we do not know what the constants are.
Maybe P = NP in a very "inefficient way"...

To prove security of a cryptographic scheme we need to show a lower
bound on the computational complexity of some problem.

In the "asymptotic setting" that would mean that, at the very least,
we show that P ≠ NP.

Does the implication in the other direction hold?
(That is: does P ≠ NP imply anything for cryptography?)

No! (At least as far as we know.)

Therefore, proving that an encryption scheme is secure is probably
much harder than proving that P ≠ NP.
What can we prove?

We can prove conditional results. That is, we can show theorems of
the type:

  Suppose that some "computational assumption A" holds
  (or: suppose that some scheme Y is secure);
  then scheme X is secure.
Research program in cryptography

Base the security of cryptographic schemes on a small number of
well-specified "computational assumptions".

Examples of A:
"decisional Diffie-Hellman assumption"
"strong RSA assumption"

  Some "computational assumption A" holds
  — in this we have to "believe";

  then scheme X is secure
  — the rest is provable
  (interesting only if this is "far from trivial").
©2009 by Stefan Dziembowski. Permission to make digital or hard copies
of part or all of this material is currently granted without fee
provided that copies are made only for personal or classroom use, are
not distributed for profit or commercial advantage, and that new copies
bear this notice and the full citation.
