
Image Compression

• Data: the means by which information is conveyed.
• Information: the more unpredictable an event, the more information it carries; information is associated with unpredictability.
• Various amounts of data may be used to represent the same information.
• Often some of the data is non-essential or redundant.
• Data compression removes this redundancy to reduce the amount of data.

Redundancy and Compression Ratio:


Let n1 and n2 denote the number of information-carrying units in two data sets representing the same information.
The relative data redundancy of the first set is defined as

R_D = 1 − 1/C_R,  where C_R = n1/n2

is called the compression ratio.
When n1 = n2: C_R = 1, R_D = 0, and there is no redundancy.
When n2 << n1: C_R → ∞ and R_D → 1, showing the highest redundancy and the greatest compression.
When n2 >> n1: C_R → 0 and R_D → −∞, showing data expansion rather than compression.

Coding redundancy:
We assume grey levels are random quantities.
The values of these levels are represented by suitably encoding them.
Let r_k in the interval [0, 1] represent the grey levels.
Let each r_k occur with probability p_r(r_k) = n_k / n, for k = 0, 1, ..., L − 1, where L is the number of grey levels, n_k is the number of pixels with level r_k, and n is the total number of pixels.
Let l(r_k) bits be used to represent the value r_k.
Then the average number of bits required to represent each pixel is

L_avg = Σ_{k=0}^{L−1} l(r_k) p_r(r_k)

and the total number of bits in an image of dimension M × N is M N L_avg.

For a natural m-bit binary code, L_avg = m.
r_k         p_r(r_k)   Code 1   l_1(r_k)   Code 2   l_2(r_k)
r_0 = 0     0.19       000      3          11       2
r_1 = 1/7   0.25       001      3          01       2
r_2 = 2/7   0.21       010      3          10       2
r_3 = 3/7   0.16       011      3          001      3
r_4 = 4/7   0.08       100      3          0001     4
r_5 = 5/7   0.06       101      3          00001    5
r_6 = 6/7   0.03       110      3          000001   6
r_7 = 1     0.02       111      3          000000   6

For code 2: L_avg = 2(0.19) + 2(0.25) + 2(0.21) + 3(0.16) + 4(0.08) + 5(0.06) + 6(0.03) + 6(0.02) = 2.7 bits

C_R = 3/2.7 = 1.11,  R_D = 1 − 1/1.11 = 0.099
• Here variable-length coding has been used to minimize the number of bits; code 1 carries redundant code symbols. This is called removal of coding redundancy (see the sketch below).
• It exploits the fact that images generally contain regular, predictable shapes whose areas are much larger than the individual elements.
• Hence some grey values are more probable than others, and those are coded with shorter-length codes.
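
To make the computation concrete, here is a minimal sketch in Python (the notes themselves contain no code, and the variable names are mine) that reproduces L_avg, C_R and R_D for the two codes in the table:

    # Probabilities and code lengths from the table above.
    probs     = [0.19, 0.25, 0.21, 0.16, 0.08, 0.06, 0.03, 0.02]
    len_code1 = [3] * 8                      # natural 3-bit binary code
    len_code2 = [2, 2, 2, 3, 4, 5, 6, 6]     # variable-length code 2

    L_avg1 = sum(p * l for p, l in zip(probs, len_code1))   # 3.0 bits/pixel
    L_avg2 = sum(p * l for p, l in zip(probs, len_code2))   # 2.7 bits/pixel

    C_R = L_avg1 / L_avg2     # compression ratio, about 1.11
    R_D = 1 - 1 / C_R         # relative data redundancy, about 0.099
    print(L_avg2, C_R, R_D)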

Interpixel redundancy:
A picture of 5 randomly arranged pencils and a picture of 5 parallel pencils have the same histogram, and hence the same probability for each grey value.

However, in one case the picture shows a strong spatial correlation, so the value of the next pixel can be predicted reasonably well from its neighbours. All the values are then not necessary, and many are redundant.
These interpixel redundancies can be removed by using a non-pictorial mapping such as the difference between adjacent pixels, or the value and length of grey-level runs. Usually this leads to a non-pictorial representation.
E.g., a line of 1024 binary pixels can be run-length coded as
(1,63)(0,87)(1,37)(0,5)(1,4)(0,556)(1,62)(0,210), requiring 8 pairs × 11 bits = 88 bits.
The complete image requires 12166 runs, with 11 bits for each (value, run-length) pair.

Then the compression ratio is C_R = (1024 × 343 × 1) / (12166 × 11) = 2.63, and R_D = 1 − 1/2.63 = 0.62.
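
Run-length coding of a binary line is easy to sketch. The following minimal Python example (the notes contain no code; the function name is mine) reproduces the (value, run-length) pairs above:

    # Encode one line of binary pixels as (value, run-length) pairs.
    def run_length_encode(line):
        runs = []
        value, length = line[0], 0
        for pixel in line:
            if pixel == value:
                length += 1
            else:
                runs.append((value, length))
                value, length = pixel, 1
        runs.append((value, length))
        return runs

    # The 1024-pixel example line: 63 ones, 87 zeros, 37 ones, ...
    line = [1]*63 + [0]*87 + [1]*37 + [0]*5 + [1]*4 + [0]*556 + [1]*62 + [0]*210
    print(run_length_encode(line))
    # [(1, 63), (0, 87), (1, 37), (0, 5), (1, 4), (0, 556), (1, 62), (0, 210)]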
Psychovisual Redundancy:
The eye and the brain do not respond to all visual information with the same sensitivity.
Some information is neglected during processing by the brain. Eliminating this information does not affect the interpretation of the image by the brain.
Edges and textural regions are interpreted as important features; the brain groups and correlates them to produce its perception of an object.
Psychovisual redundancy is distinctly vision-related, and its elimination does result in a loss of information. Quantization is an example.
When 256 levels are reduced by grouping to 16 levels, objects are still recognizable. The compression is 2:1, but an objectionable graininess and contouring effect results.

IGS Quantization:
The graininess from quantization can be reduced by Improved Grey Scale (IGS) quantization, which breaks up the edges and contours and makes them less apparent. IGS adds to each pixel a pseudo-random number generated from the low-order bits of neighbouring pixels before quantizing it: the four low-order bits of the previous sum are added to the current pixel. If the pixel's most significant bits are all 1s, zeros are added instead. The most significant bits of the sum become the coded value.

Pixel   Grey level   Sum         IGS code
i − 1   N/A          0000 0000   N/A
i       0110 1100    0110 1100   0110
i + 1   1000 1011    1001 0111   1001
i + 2   1000 0111    1000 1110   1000
i + 3   1111 0100    1111 0100   1111
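
The table can be reproduced with a short sketch (Python assumed; this mirrors the scheme described above rather than any official implementation):

    # 4-bit IGS quantization of 8-bit pixels.
    def igs_quantize(pixels):
        codes, prev_sum = [], 0
        for p in pixels:
            if p & 0xF0 == 0xF0:           # four MSBs all 1s: add zeros
                s = p
            else:                          # else add low 4 bits of previous sum
                s = p + (prev_sum & 0x0F)
            codes.append(s >> 4)           # four MSBs of the sum form the code
            prev_sum = s
        return codes

    pixels = [0b01101100, 0b10001011, 0b10000111, 0b11110100]
    print([format(c, '04b') for c in igs_quantize(pixels)])
    # ['0110', '1001', '1000', '1111'], matching the table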

Fidelity criteria:
Objective:
Let the pixel error be given as e(x, y) = f̂(x, y) − f(x, y).
Then the total error between the two images is

Σ_{x=0}^{M−1} Σ_{y=0}^{N−1} [ f̂(x, y) − f(x, y) ]

The RMS error is

e_rms = [ (1/MN) Σ_{x=0}^{M−1} Σ_{y=0}^{N−1} [ f̂(x, y) − f(x, y) ]² ]^{1/2}

The mean-square SNR is

SNR_ms = ( Σ_{x=0}^{M−1} Σ_{y=0}^{N−1} [ f̂(x, y) ]² ) / ( Σ_{x=0}^{M−1} Σ_{y=0}^{N−1} [ f̂(x, y) − f(x, y) ]² )

The RMS SNR is the square root of the above ratio.
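
These objective criteria are straightforward to compute; a sketch using Python/NumPy (an assumption, not the notes' own code) follows:

    import numpy as np

    # RMS error between an image f and its reconstruction f_hat.
    def rms_error(f, f_hat):
        return np.sqrt(np.mean((f_hat.astype(float) - f.astype(float)) ** 2))

    # Mean-square signal-to-noise ratio.
    def snr_ms(f, f_hat):
        err = f_hat.astype(float) - f.astype(float)
        return np.sum(f_hat.astype(float) ** 2) / np.sum(err ** 2)

    f     = np.random.randint(0, 256, (64, 64))
    f_hat = f + np.random.randint(-2, 3, (64, 64))    # slightly distorted copy
    print(rms_error(f, f_hat), np.sqrt(snr_ms(f, f_hat)))   # e_rms, RMS SNR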


Subjective:
As rated by a cross-section of viewers, on a scale from 1 for excellent to 6 for unusable,
i.e. excellent, fine, passable, marginal, inferior, unusable.

Image compression models:

The three redundancy-removal methods are typically combined to obtain maximum compression.
The input image goes into an encoder, which creates a set of symbols from the input data. The encoder consists of a source encoder followed by a channel encoder.
After transmission through the channel, the data enters a decoder, consisting of a channel decoder and a source decoder, from which the reconstructed image is given out.
The source encoder removes most of the redundancies, and the channel encoder increases noise immunity.
The two distinct coding processes are reversed at the respective decoders at the receiver.
If the channel is noise-free, the channel encoder-decoder pair may be omitted.

Source Encoder – Decoder:

The source encoder starts with a mapper that transforms the data into a non-visual format, reducing interpixel redundancies (e.g., run-length encoding).
This is followed by a quantizer (psychovisual redundancy removal), and a symbol encoder then generates a variable- or fixed-length code to minimize coding redundancy.
At the receiver, the symbol decoder is followed by an inverse mapper. (Quantization cannot be undone, hence there is no de-quantizer block.)

Channel Encoder and Decoder:


These are used to reduce the effect of noise during transmission. The channel encoding process re-introduces some controlled redundancy, such as redundant error-correcting bits, to overcome the noise.
An example is the (7,4) Hamming code with a distance of 3, which can correct all single-bit errors resulting from noise. To a 4-bit word, 3 redundant bits are added to get a resulting 7-bit Hamming codeword.

The binary number is b3 b2 b1 b0.

The Hamming codeword is h1 h2 h3 h4 h5 h6 h7, where

h1 = b3 ⊕ b2 ⊕ b0,  h2 = b3 ⊕ b1 ⊕ b0,  h4 = b2 ⊕ b1 ⊕ b0

are even-parity bits for the respective groups of bits, and h3 = b3, h5 = b2, h6 = b1, h7 = b0.

The decoder forms

c1 = h1 ⊕ h3 ⊕ h5 ⊕ h7,  c2 = h2 ⊕ h3 ⊕ h6 ⊕ h7,  c4 = h4 ⊕ h5 ⊕ h6 ⊕ h7

If the parity word c4 c2 c1 is non-zero, its value indicates the location of the erroneous bit, which is inverted to correct the error. The corrected binary word is then given by h3 h5 h6 h7, and thus a single error is corrected.

More errors can be corrected by defining more redundant bits, making the channel-encoded word longer. (This moves away from our intended compression and instead increases the data length; for this reason longer Hamming codes are not preferred.)
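
The (7,4) encoder and decoder described above fit in a few lines; this is a sketch (Python assumed), with bits ordered h1..h7:

    def hamming_encode(b3, b2, b1, b0):
        h1 = b3 ^ b2 ^ b0                    # even-parity bits
        h2 = b3 ^ b1 ^ b0
        h4 = b2 ^ b1 ^ b0
        return [h1, h2, b3, h4, b2, b1, b0]  # h1 h2 h3 h4 h5 h6 h7

    def hamming_decode(h):
        h1, h2, h3, h4, h5, h6, h7 = h
        c1 = h1 ^ h3 ^ h5 ^ h7
        c2 = h2 ^ h3 ^ h6 ^ h7
        c4 = h4 ^ h5 ^ h6 ^ h7
        pos = 4 * c4 + 2 * c2 + c1           # parity word c4 c2 c1
        if pos:                              # non-zero: flip the erroneous bit
            h[pos - 1] ^= 1
        return h[2], h[4], h[5], h[6]        # b3 b2 b1 b0 = h3 h5 h6 h7

    word = hamming_encode(1, 0, 1, 1)
    word[3] ^= 1                             # inject a single-bit error
    print(hamming_decode(word))              # (1, 0, 1, 1): error corrected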
Information Theory
Entropy:
Information is modeled as a probabilistic process.
If an event E occurs with probability P(E), it contains

I(E) = log 1/P(E) = −log P(E)

units of information, called the self-information of E.
When the base of the log is 2, the unit of information is the bit.
Assume the source produces a random sequence of symbols from a finite source alphabet A = {a1, a2, ..., aJ}.
Let the probability of a_j be P(a_j), and let the vector z = [P(a1), P(a2), ..., P(aJ)]^T give the set of probabilities of the alphabet symbols.

The finite ensemble (A, z) then describes the source completely.
The self-information in a_j is

I(a_j) = −log P(a_j)

If a large number k of symbols is generated, a_j will occur kP(a_j) times, contributing information −kP(a_j) log P(a_j).
Thus all k symbols together give information

−k Σ_{j=1}^{J} P(a_j) log P(a_j)

The average information per symbol is defined as the entropy of the source, and is given by

H(z) = −Σ_{j=1}^{J} P(a_j) log P(a_j)

Source entropy is maximum when all symbols are equally likely.
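
For instance, a short sketch (Python assumed) of the entropy computation:

    from math import log2

    # Entropy in bits/symbol of a source with the given symbol probabilities.
    def entropy(probs):
        return -sum(p * log2(p) for p in probs if p > 0)

    print(entropy([0.25] * 4))    # equally likely symbols: maximal, 2.0 bits
    print(entropy([0.4, 0.3, 0.1, 0.1, 0.06, 0.04]))   # about 2.14 bits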

The Channel Matrix:


Let the output of the channel belong to an alphabet B = {b1, b2, ..., bK} with probability vector

v = [P(b1), P(b2), ..., P(bK)]^T

The finite ensemble (B, v) then describes the received information completely.

P(b_k) is related to P(a_j) by

P(b_k) = Σ_{j=1}^{J} P(b_k | a_j) P(a_j)

Then we can write the matrix equation v = Q z, where

Q = | P(b1|a1)  P(b1|a2)  ...  P(b1|aJ) |
    | P(b2|a1)  P(b2|a2)  ...  P(b2|aJ) |
    |   ...       ...     ...    ...    |
    | P(bK|a1)  P(bK|a2)  ...  P(bK|aJ) |

is called the channel matrix.

Conditional Entropy and Equivocation:


Now, when the observed output is b_k, we can estimate the entropy of the source using the probability relation

P(a_j) = Σ_{k=1}^{K} P(a_j | b_k) P(b_k)

If we define a conditional entropy function

H(z | b_k) = −Σ_{j=1}^{J} P(a_j | b_k) log P(a_j | b_k)

then, averaging over all the b_k's, we have

H(z | v) = Σ_{k=1}^{K} H(z | b_k) P(b_k),  or

H(z | v) = −Σ_{j=1}^{J} Σ_{k=1}^{K} P(a_j, b_k) log P(a_j | b_k)

which is called the equivocation of z with respect to v, and gives the estimate of the information in one source symbol based on observation of an output symbol.

Mutual Information and Channel Capacity:

The difference between H(z) and H(z | v) is the information transmitted by the channel (received by the observer), estimated by observing one output symbol, and is called the mutual information:

I(z, v) = H(z) − H(z | v)

Upon substituting the equivocation equation, and noting that P(a_j) = Σ_{k=1}^{K} P(a_j, b_k), this becomes

I(z, v) = Σ_{j=1}^{J} Σ_{k=1}^{K} P(a_j, b_k) log [ P(a_j, b_k) / (P(a_j) P(b_k)) ]
        = Σ_{j=1}^{J} Σ_{k=1}^{K} P(a_j) q_kj log [ q_kj / Σ_{i=1}^{J} P(a_i) q_ki ]

where q_kj = P(b_k | a_j) is an element of the channel matrix Q.

The maximum value of I(z, v) over all possible choices of source probabilities (all possible z) is the channel capacity C:

C = max_z I(z, v)
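
For a binary symmetric channel this maximization can be checked numerically; a sketch (Python assumed, names mine) follows:

    from math import log2

    # I(z, v) for source probabilities z and channel matrix Q[k][j] = P(b_k|a_j).
    def mutual_information(z, Q):
        v = [sum(Q[k][j] * z[j] for j in range(len(z))) for k in range(len(Q))]
        I = 0.0
        for k in range(len(Q)):
            for j in range(len(z)):
                if Q[k][j] > 0 and z[j] > 0:
                    I += z[j] * Q[k][j] * log2(Q[k][j] / v[k])
        return I

    pe = 0.01
    Q = [[1 - pe, pe], [pe, 1 - pe]]          # binary symmetric channel
    print(mutual_information([0.5, 0.5], Q))  # ~0.919 bits: the capacity, since
                                              # equally likely symbols maximize I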
Shannon’s Theorem for noiseless channels:

n-th extension of a source:


The source considered earlier generates individual symbols.
If the symbols are statistically independent (do not depend on other symbols), it is a zero-memory source.
Now assume that the source generates words of n symbols rather than single symbols.
Its output is then an n-tuple over the symbol alphabet.
Such a source is said to be the nth extension of the basic source.

Properties of n-tuples:
The source output now has J^n = M possible n-element sequences α_i, called block variables.
A new dictionary now consists of a set A′ of n-element sequences with

A′ = {α1, α2, ..., αM} and P(α_i) = P(a_{j1}) P(a_{j2}) ... P(a_{jn})

Also,

z′ = [P(α1), P(α2), ..., P(αM)]^T

Hence the corresponding entropy is

H(z′) = −Σ_{i=1}^{M} P(α_i) log P(α_i)

and it follows that H(z′) = n H(z).

Shannon’s First Theorem (Theorem for noiseless coding):


The self-information of the output α_i is log 1/P(α_i), so it is reasonable to code it with a codeword of the smallest integer length l(α_i) not shorter than the self-information in bits. Thus

log 1/P(α_i) ≤ l(α_i) < log 1/P(α_i) + 1

Multiplying each term by its probability and summing over the ensemble,

Σ_{i=1}^{M} P(α_i) log 1/P(α_i) ≤ Σ_{i=1}^{M} P(α_i) l(α_i) < Σ_{i=1}^{M} P(α_i) log 1/P(α_i) + 1

or

H(z′) ≤ L′_avg < H(z′) + 1,  where L′_avg = Σ_{i=1}^{M} P(α_i) l(α_i)

Dividing by n, and using H(z′) = n H(z),

H(z) ≤ L′_avg / n < H(z) + 1/n

In the limit as n becomes large, this becomes

lim_{n→∞} (L′_avg / n) = H(z)

Shannon's first theorem states that it is possible to make the ratio L′_avg / n approach arbitrarily close to H(z) by coding infinitely long extensions of the source.

The coding efficiency can now be defined as η = n H(z) / L′_avg.
In the ideal case the efficiency becomes 1 as n becomes infinite.
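
The theorem can be illustrated numerically by assigning each block α of the nth extension the integer length l(α) = ⌈log2 1/P(α)⌉ and watching L′_avg/n fall towards H(z); a sketch (Python assumed) follows:

    from math import ceil, log2
    from itertools import product

    z = [0.9, 0.1]                           # a binary zero-memory source
    H = -sum(p * log2(p) for p in z)         # H(z), about 0.469 bits/symbol

    for n in (1, 2, 4, 8):
        L_avg = 0.0
        for block in product(range(len(z)), repeat=n):
            P = 1.0
            for s in block:                  # P(alpha) = product of P(a_j)
                P *= z[s]
            L_avg += P * ceil(log2(1 / P))   # P(alpha) * l(alpha)
        print(n, L_avg / n)                  # approaches H(z) from above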

Noisy coding Theorem:


Suppose a binary symmetric channel has an error probability Pe = 0.01.
To increase its reliability, we decide to transmit each symbol (1 or 0) three times (000 or 111), and take a majority decision on receiving.
Our majority decision will be wrong if there are two or three errors in the received triplet, which occurs with probability Pe³ + 3Pe²(1 − Pe) ≈ 0.0003.
In general, we can encode the nth extension of the source using K-ary code sequences of length r, with K^r ≥ J^n, and select only φ of the possible K^r sequences as valid code words; we also formulate rules for deciding correctly when errors arise (in the example above, 2 of the 8 possible sequences are valid, and a majority decision handles the rest).

The zero-memory source generates information at a rate of H(z) units/symbol.

The nth extension generates it at a rate of H(z′)/n.

The maximum rate after coding is R = (log φ)/r, since not all words are valid code words.
Thus a code of size φ and block length r has the maximum rate R given above.
Shannon’s second theorem (noisy coding theorem) states that:

For any R < C where C is the capacity of a zero memory channel with channel matrix

Q , there exists an integer r and a code of block length r and rate R such that the
probability of a block decoding error is less than or equal to an arbitrary value ε for
ε > 0. .

This implies that probability of error in the noisy channel can be made arbitrarily small,
as long as coded message rate is less than the channel capacity.
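
The triple-repetition example is easy to verify by simulation; the following sketch (Python assumed) estimates the post-decoding error rate over a BSC with Pe = 0.01:

    import random

    def send_triplet(bit, pe):
        # Each of the three copies is flipped independently with probability pe.
        return [bit ^ (random.random() < pe) for _ in range(3)]

    def majority(triplet):
        return 1 if sum(triplet) >= 2 else 0

    pe, trials = 0.01, 1_000_000
    errors = sum(majority(send_triplet(0, pe)) != 0 for _ in range(trials))
    print(errors / trials)    # about 0.0003 = Pe^3 + 3*Pe^2*(1 - Pe)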
Huffman Coding:
Huffman coding removes coding redundancy by using a variable-length coding procedure:

Symbols are arranged in order of decreasing probability, with the probability values stated alongside.
The two symbols with the smallest probabilities are combined (considered as one), and
the probability table is reordered using the combined value.
The two smallest probabilities in the new table are combined, and the table is again reordered.
This continues until only two probability values remain.
These are assigned the code values 0 and 1.
Each coded probability value is then examined to determine its original constituents; if it is a combination, 0 and 1 code digits are appended to make the code assignment unique.
This is repeated backwards until all the original probability values have associated code words.

This procedure creates a near-optimal code and generates a look-up table for subsequent coding requirements. The code is an instantaneous (each code word can be decoded without reference to succeeding symbols), uniquely decodable, variable-length block code (see the example below).

P(a1) = 0.1, P(a2) = 0.4, P(a3) = 0.06, P(a4) = 0.1, P(a5) = 0.04, P(a6) = 0.3

The average length of this code in the example is 2.2 bits/symbol.

The entropy of the source is 2.14 bits/symbol, so the code efficiency is 2.14/2.2 = 0.973.
The coded output for the sequence a3 a1 a2 a2 a6 is 010100111100.
The output 0100100010001011 decodes to a4 a2 a6 a4 a5 unambiguously.

The coding method is difficult to apply when there is a large number of symbols to be coded. In such cases, truncated Huffman, B2, binary shift, Huffman shift and other computationally more efficient codes are used, with some sacrifice in average length.
For details of these codes, refer to the text book.
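
As a concrete illustration, here is a minimal Huffman coder (Python assumed, not the notes' own code; tie-breaking may yield a different but equally optimal assignment) that reproduces the 2.2 bits/symbol of the example:

    import heapq

    def huffman(probs):
        # Heap items: (probability, tie-breaker, {symbol: code so far}).
        heap = [(p, i, {s: ''}) for i, (s, p) in enumerate(probs.items())]
        heapq.heapify(heap)
        count = len(heap)
        while len(heap) > 1:
            p1, _, c1 = heapq.heappop(heap)      # two smallest probabilities
            p2, _, c2 = heapq.heappop(heap)
            merged = {s: '0' + c for s, c in c1.items()}
            merged.update({s: '1' + c for s, c in c2.items()})
            heapq.heappush(heap, (p1 + p2, count, merged))
            count += 1
        return heap[0][2]

    probs = {'a1': 0.1, 'a2': 0.4, 'a3': 0.06, 'a4': 0.1, 'a5': 0.04, 'a6': 0.3}
    code = huffman(probs)
    print(code)
    print(sum(probs[s] * len(code[s]) for s in probs))   # 2.2 bits/symbol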
[The remaining pages of the original notes were garbled beyond recovery during text extraction. The surviving fragments indicate that they continued with further compression topics, including LZW coding (illustrated with a 4 × 4 image whose rows are 39 39 126 126, and the note that LZW is used in GIF, TIFF and PDF) and JPEG compression.]
Image Restoration:

(Refer to chapter 5, articles 5.1, 5.2 and 5.3 up to 5.3.2.)
