Instead, this paper is primarily focused on a comparative study and analysis among Kittler and Illingworth's MET, the three co-occurrence matrix-based entropy thresholding techniques, and three relative entropy thresholding methods, plus Otsu's method [22]. While the methods of Wong and Sahoo [23] and Pal and Pal [24], [25] incorporate some spatial image information, the others are mostly histogram-based techniques. Although researchers have used spatial image information in several non-thresholding image segmentation methods [26-28], thresholding is a fundamentally different and simpler operation. Further, the final segmentations produced by these methods depend on initial segmentations, which thresholding does not need. One common trait of all histogram-based approaches is that they do not utilize the considerable amount of information captured in the spatial distribution of intensities and in image morphology. It is obvious that, in real-life imaging applications, it is very difficult to select a threshold from the histogram alone without seeing the image, whereas the image itself has a clear object morphology. Yin [29] developed a recursive programming technique to reduce the order of magnitude of the computation of multilevel thresholds and further used the PSO algorithm to minimize the cross entropy.

Since the theory of entropy was brought into threshold selection, many methods have been proposed to deal with the problem of image segmentation. The maximum entropy method has been shown to give good results for infrared image segmentation, but it needs to compute the entropy of every gray value. Over the last decade, modeling the behavior of social insects such as ants and bees for the purpose of search and problem solving has been the context of the emerging area of swarm intelligence. The ant colony algorithm [30] and particle swarm optimization [31] are the two most popular approaches in swarm intelligence. Honey bee mating optimization [32] may also be considered a typical swarm-based approach to optimization, in which the search algorithm is inspired by the mating process of real honey bees. In the literature, the honey bee mating optimization algorithm has been adopted to search for the optimal solution in many applications, such as clustering [33], market segmentation [34], and benchmark mathematical problems [35]. Taking into consideration the complexity of this computation, we propose a new heuristic optimization algorithm, called the bee algorithm, to search for the infrared image segmentation result. The bee algorithm may also be considered a typical swarm-based approach to optimization, in which the search algorithm is inspired by real bees. The behavior of bees results from the interaction of (1) their genetic potentiality, (2) their ecological and physiological environments, and (3) the social conditions of the colony, as well as various prior and ongoing interactions among these three factors. This paper introduces a new approach for optimization of the maximum entropy based on the bee algorithm. The rest of the paper is organized as follows. In Section 2 the theory of maximum entropy is presented. In Section 3 the bee algorithm is explained. In Section 4 the simulation results are presented. Section 5 presents the conclusions.

2. The Theory of Maximum Entropy

2.1. The Shannon Entropy

The most generally accepted form of entropy was derived by Shannon (Shannon and Weaver, 1949) [36] in connection with information theory. Given a discrete probability distribution p_i, i = 1, 2, ..., N, the entropy is given by

H = -\sum_{i=1}^{N} p_i \log(p_i), \quad \sum_{i=1}^{N} p_i = 1, \quad 0 \le p_i \le 1,        (1)

where i = 1, 2, ..., N indexes the possible outcomes or states of a discrete information source modelled as a Markov process. Shannon's measure is used as a measure of information gain, choice, and uncertainty. Shannon points out a number of interesting properties of (1), including:

i) H = 0 if and only if p_i = 0 for all i ≠ j and p_j = 1, where j can be any position in the distribution; otherwise H is positive. This makes intuitive sense, since p_j = 1 indicates certainty of the outcome, and the information gained by the occurrence of event j is thus zero.

ii) For a given number of discrete states N, H is a maximum when all the p_i are equal, i.e. p_i = 1/N, i = 1, 2, ..., N. Intuitively this is the most uncertain situation: all outcomes are equally likely, making accurate prediction impossible.
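Not part of the original paper: the short Python sketch below simply evaluates (1) for a degenerate and a uniform distribution to make properties (i) and (ii) concrete; the function name shannon_entropy is ours.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H = -sum(p_i * log(p_i)) of a discrete distribution.
    Terms with p_i = 0 contribute nothing, by the convention 0*log(0) = 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # drop zero-probability states
    return -np.sum(p * np.log(p))

# Property (i): a certain outcome carries no information.
print(shannon_entropy([0.0, 1.0, 0.0, 0.0]))   # 0.0

# Property (ii): the uniform distribution maximizes H for N states.
N = 4
print(shannon_entropy(np.full(N, 1.0 / N)))    # log(N), about 1.386
print(shannon_entropy([0.7, 0.1, 0.1, 0.1]))   # smaller than log(N)
```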
2.2. Image Thresholding Based on the Maximum Entropy

In general, an image can be described by a discrete function. For discrete values we deal with probabilities and summations. The probability of occurrence of gray level i in an image is approximated by

p_i = n_i / n, \quad i = 0, 1, 2, ..., L-1,        (2)
where, as noted at the beginning of this section, n is the total number of pixels in the image, n_i is the number of pixels that have gray level i, and L is the total number of possible gray levels in the image (L = 256). If we consider a threshold T (0 < T < L-1), we obtain two regions, C_O and C_B, where C_O is the object region and C_B is the background region. Shannon defines the entropy of a system with n states as

H = -\sum_{i=1}^{n} p_i \log(p_i),        (3)

where p_i is the probability of occurrence of event i, and the entropy H is a measure of the information in the system; the amount of information obtained from an event is inversely related to its probability of occurrence. These quantities are now interpreted in the context of an image: the event i can be regarded as a gray value of the image, while p_i is the probability of a pixel having gray value i. The maximum entropy principle is to select the gray value for which H is maximal [37]. Let p_O and p_B denote the probabilities of the gray levels of the object and background regions, respectively:

p_O = \sum_{i \in C_O} p_i, \quad i = 0, 1, 2, ..., T-1,        (4-1)

p_B = \sum_{i \in C_B} p_i, \quad i = T, T+1, T+2, ..., 255.        (4-2)

The entropies of the object region and the background region can then be written as

H_O(T) = -\sum_{i=0}^{T-1} (p_i / p_O) \log(p_i / p_O),        (5-1)

H_B(T) = -\sum_{i=T}^{255} (p_i / p_B) \log(p_i / p_B).        (5-2)

Thus, the entropy-based fitness function is

Fitness = H(t) = H_O(t) + H_B(t).        (6)
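As an illustration of equations (2) and (4)-(6), the following Python sketch evaluates the fitness H_O(t) + H_B(t) for a candidate threshold t from a 256-bin gray-level histogram. It is a minimal sketch under our own naming (entropy_fitness, region_entropy), not code from the paper.

```python
import numpy as np

def entropy_fitness(hist, t):
    """Fitness = H_O(t) + H_B(t) of equations (5-1), (5-2), (6).
    hist : length-256 array of gray-level counts n_i.
    t    : candidate threshold, 0 < t < 256."""
    p = hist.astype(float) / hist.sum()          # equation (2): p_i = n_i / n
    p_obj, p_bkg = p[:t], p[t:]                  # object C_O and background C_B
    P_O, P_B = p_obj.sum(), p_bkg.sum()          # equations (4-1), (4-2)

    def region_entropy(q, Q):
        q = q[q > 0]                             # 0*log(0) taken as 0
        return -np.sum((q / Q) * np.log(q / Q)) if Q > 0 else 0.0

    return region_entropy(p_obj, P_O) + region_entropy(p_bkg, P_B)   # equation (6)
```

Given an image's gray-level histogram, entropy_fitness(hist, t) returns the value that the threshold search discussed next tries to maximize.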
The maximum entropy formalism attempts to maximize equation (6), subject to known constraints on the parameters we are trying to estimate, p_i, i = 1, 2, ..., N. Effectively, by maximizing the entropy we are attempting to maximize our information gain or, equivalently, to arrive at a solution which fits our prior knowledge but makes no assumptions beyond what is known. In the absence of any prior information, the maximum entropy distribution is simply the uniform distribution, as indicated by Shannon's property (ii) above. It can thus be aptly viewed as an application of Laplace's principle of insufficient reason, which states that the uniform distribution is the most unbiased choice when one has no prior knowledge regarding a probabilistic event. We solve equation (6) to obtain the optimized threshold T = t^*, i.e. the value that makes H(t) maximal. The threshold T is selected as the one which maximizes H(t):

T = t^* = \arg\max H(t), \quad 0 \le t \le 255.        (7)

Solving equation (7) directly, by evaluating every gray level, costs a great deal of time. In order to enhance the speed and accuracy of the proposed algorithm, the bee algorithm is used to extract the optimized threshold.
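For reference, "directly solving" equation (7) amounts to the brute-force scan over all 256 gray levels sketched below (reusing the hypothetical entropy_fitness helper from the previous listing); the bee algorithm of Section 3 replaces this exhaustive scan with a guided search.

```python
def threshold_exhaustive(hist):
    """Equation (7) by brute force: evaluate H(t) for every candidate threshold."""
    best_t, best_h = 1, float("-inf")
    for t in range(1, 256):                  # 0 < t <= 255
        h = entropy_fitness(hist, t)
        if h > best_h:
            best_t, best_h = t, h
    return best_t
```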
3. The Bee Algorithm

A colony of honey bees can extend itself over long distances (more than 10 km) and in multiple directions simultaneously to exploit a large number of food sources [38-39]. A colony prospers by deploying its foragers to good fields. In principle, flower patches with plentiful amounts of nectar or pollen that can be collected with less effort should be visited by more bees, whereas patches with less nectar or pollen should receive fewer bees [40-41]. The foraging process begins in a colony with scout bees being sent out to search for promising flower patches. Scout bees move randomly from one patch to another. During the harvesting season, a colony continues its exploration, keeping a percentage of the population as scout bees [39]. When they return to the hive, those scout bees that found a patch rated above a certain quality threshold (measured as a combination of some constituents, such as sugar content) deposit their nectar or pollen and go to the "dance floor" to perform a dance known as the "waggle dance" [38].

The bee algorithm is an optimization algorithm inspired by this natural foraging behavior of honey bees to find the optimal solution [42]. Table 1 lists the parameters of the algorithm in its simplest form. The algorithm requires a number of parameters to be set, namely: the number of scout bees (n), the number of sites selected out of the n visited sites (m), the number of best sites out of the m selected sites (e), the number of bees recruited for the other (m-e) selected sites (n1), the number of bees recruited for the best e sites (n2), the initial size of the patches (ngh), which includes a site and its neighborhood, and
the stopping criterion.

Table 1. Parameters of the Bee Algorithm

Parameters                                               Symbol   Value
Number of scout bees                                     n        30
Number of best selected sites                            m        25
Site radius for neighborhood search                      ngh      1
Number of elite sites out of m selected sites            e        20
Number of recruited bees around best selected sites      n1       15
Number of recruited bees around elite selected sites     n2       20

The algorithm starts with the n scout bees being placed randomly in the search space. The fitnesses of the sites visited by the scout bees are evaluated. In the step "select m sites for neighborhood search", the bees that have the highest fitnesses are chosen as "selected bees" and the sites visited by them are chosen for neighborhood search. Then, in the steps "recruit bees for selected sites" and "select the fittest bee from each patch", the algorithm conducts searches in the neighborhood of the selected sites, assigning more bees to search near the best e sites. The bees can be chosen directly according to the fitnesses associated with the sites they are visiting. Alternatively, the fitness values are used to determine the probability of the bees being selected. Searches in the neighborhood of the best e sites, which represent the more promising solutions, are made more detailed by recruiting more bees to follow them than for the other selected sites. Together with scouting, this differential recruitment is a key operation of the bee algorithm. However, in the step "select the fittest bee from each patch", for each patch only the bee with the highest fitness is selected to form the next bee population. In nature, there is

[Figure: block diagram of the process, Input Image → Thresholding & Segmentation → Segmented Image]
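To tie the steps above to the thresholding problem of equation (7), here is a compact Python sketch of the basic bee algorithm as we read it from this section, using the parameter symbols of Table 1 and the hypothetical entropy_fitness helper from Section 2.2. It is an illustrative interpretation, not the authors' implementation.

```python
import numpy as np

def bee_algorithm_threshold(hist, n=30, m=25, e=20, n1=15, n2=20, ngh=1, iters=50):
    """Basic bee algorithm searching for the threshold t that maximizes H(t).
    Our illustrative reading of the steps in Section 3 (not the authors' code)."""
    rng = np.random.default_rng()
    fitness = lambda t: entropy_fitness(hist, int(t))

    # Initial population: n scout bees placed randomly in the search space.
    scouts = rng.integers(1, 256, size=n)

    for _ in range(iters):
        # Rank sites by fitness and keep the m best for neighborhood search.
        ranked = sorted(scouts, key=fitness, reverse=True)
        selected, next_gen = ranked[:m], []

        for rank, site in enumerate(selected):
            # Elite sites (the best e) get n2 recruits, the others get n1.
            recruits = n2 if rank < e else n1
            nbrs = np.clip(site + rng.integers(-ngh, ngh + 1, size=recruits), 1, 255)
            # Keep only the fittest bee from each patch.
            next_gen.append(max([site, *nbrs], key=fitness))

        # The remaining bees scout new random sites.
        scouts = np.array(next_gen + list(rng.integers(1, 256, size=n - m)))

    return int(max(scouts, key=fitness))
```

For a single threshold over only 256 gray levels the exhaustive scan of equation (7) is of course cheap; the sketch is mainly meant to show the recruitment logic, which pays off when the fitness is expensive to evaluate or the search space is larger, as in multilevel thresholding.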
Figure 2. (a) Original Image of FLOWERS (b) Segmented Image of FLOWERS (c) Histogram of FLOWERS Image (d) Original Image of STEALER (e) Segmented Image of STEALER (f) Histogram of STEALER Image (g) Original Image of DOGS (h) Segmented Image of DOGS (i) Histogram of DOGS Image
Figure 3. (a) Original Image of AIRPLANE (b) Segmented Image of AIRPLANE (c) Histogram of AIRPLANE Image (d) Original Image of SHIP (e) Segmented Image of SHIP (f) Histogram of SHIP Image (g) Original Image of TANK (h) Segmented Image of TANK (i) Histogram of TANK Image
Fig. 2 shows the segmentation results obtained on the FLOWERS, STEALER and DOGS images by the proposed algorithm. For the original images in the first column of Fig. 2, the segmentation results are shown in the second column and the histograms of the original images are shown in the third column. In the first row, the flowers in the image were clearly segmented in black and white, as shown in Fig. 2(b), and the background information was segmented as black. As seen from the first row of Fig. 2, the main features of the image were clearly detected by the proposed approach. In the second row, the image "STEALER" mainly consists of one principal object, the stealer man, which we decided to detect against the background objects. It is seen from the segmented image that the proposed algorithm succeeded in detecting the stealer man in black and segmenting the background in white, as shown in Fig. 2(e). In the third row, the image "DOGS" includes two noticeable objects, two dogs, that should be segmented and extracted from the background. According to the segmented image in Fig.
[23] A.K.C. Wong and P.K. Sahoo, "A Gray-Level Threshold Selection Method Based on Maximum Entropy Principle," IEEE Trans. Systems, Man, and Cybernetics, vol. 19, pp. 866-871, 1989.
[24] N.R. Pal and S.K. Pal, "Entropy Thresholding," Signal Processing, vol. 16, pp. 97-108, 1989.
[25] N.R. Pal and S.K. Pal, "Entropy: A New Definition and Its Applications," IEEE Trans. Systems, Man, and Cybernetics, vol. 21, pp. 1260-1270, 1991.
[26] F.C. Cheng and J.W. Woods, "Compound Gauss-Markov Random Fields for Image Segmentation," IEEE Trans. Signal Processing, vol. 39, pp. 683-697, 1991.
[27] A. Sarkar, M.K. Biswas, and K.M.S. Sharma, "A Simple Unsupervised MRF Model Based Image Segmentation Approach," IEEE Trans. Image Processing, vol. 9, pp. 801-812, 2000.
[28] M. Kass, A. Witkin, and D. Terzopoulos, "Snakes: Active Contour Models," Int'l J. Computer Vision, pp. 321-339, 1988.
[29] P.Y. Yin, "Multilevel Minimum Cross Entropy Threshold Selection Based on Particle Swarm Optimization," Applied Mathematics and Computation, vol. 184, pp. 503-513, 2007.
[30] Y.F. Han and P.F. Shi, "An Improved Ant Colony Algorithm for Fuzzy Clustering in Image Segmentation," Neurocomputing, vol. 70, pp. 665-671, 2007.
[31] J. Kennedy and R.C. Eberhart, "Particle Swarm Optimization," Proc. IEEE Int'l Conf. Neural Networks, vol. IV, pp. 1942-1948, 1995.
[32] H.A. Abbass, "Marriage in Honey Bee Optimization (HBO): A Haplometrosis Polygynous Swarming Approach," Proc. Congress on Evolutionary Computation (CEC2001), pp. 207-214, 2001.
[33] M. Fathian, B. Amiri, and A. Maroosi, "Application of Honey Bee Mating Optimization Algorithm on Clustering," Applied Mathematics and Computation, pp. 1502-1513, 2007.
[34] B. Amiri and M. Fathian, "Integration of Self Organizing Feature Maps and Honey Bee Mating Optimization Algorithm for Market Segmentation," Journal of Theoretical and Applied Information Technology, pp. 70-86, 2007.
[35] D. Karaboga and B. Basturk, "On the Performance of Artificial Bee Colony Algorithm," Applied Soft Computing, pp. 687-697, 2008.
[36] C.E. Shannon and W. Weaver, The Mathematical Theory of Communication, Univ. of Illinois Press, Urbana, IL, 1949.
[37] R.C. Gonzalez and R.E. Woods, Digital Image Processing, 3rd ed., 2006.
[38] K. von Frisch, Bees: Their Vision, Chemical Senses and Language, revised ed., Cornell University Press, Ithaca, NY, 1976.
[39] T.D. Seeley, The Wisdom of the Hive: The Social Physiology of Honey Bee Colonies, Harvard University Press, Cambridge, MA, 1996.
[40] E. Bonabeau, M. Dorigo, and G. Theraulaz, Swarm Intelligence: From Natural to Artificial Systems, Oxford University Press, New York, 1999.
[41] S. Camazine, J. Deneubourg, N.R. Franks, J. Sneyd, G. Theraulaz, and E. Bonabeau, Self-Organization in Biological Systems, Princeton University Press, Princeton, 2003.
[42] R. Eberhart, Y. Shi, and J. Kennedy, Swarm Intelligence, Morgan Kaufmann, San Francisco, 2001.

Authors' information

1. Faculty of Electrical and Computer Engineering, BUT.
2. Faculty of Electrical and Computer Engineering, BUT.
3. Faculty of Electrical and Computer Engineering, BUT.

Milad Azarbad was born in Amol. He received his B.S. degree in electrical engineering from Mazandaran in 2010. His areas of research are: the general area of signal processing, biomedical engineering, biomedical image processing, artificial intelligence, statistical pattern recognition, digital communications, and soft computing.

Ataollah Ebrahimzadeh was born in Babol. He received his PhD degree in electrical engineering. He is now a professor in the Faculty of Electrical and Computer Engineering at the University of Mazandaran. His research interests are: the general area of signal processing, wireless communications, biomedical engineering, statistical pattern recognition, artificial intelligence, and digital communications.