Jacob Bernoulli's Ars Conjectandi was published in Basel in 1713, eight years after his death. The work was incomplete at the time of his death, but it was nonetheless a work of the greatest significance in the development of the theory of probability.
In 1718 the French mathematician Abraham de Moivre, in his book The Doctrine of Chances: A Method of Calculating the Probabilities of Events in Play, presented the definition of statistical independence for the first time. In the 1756 edition of this book he also presented the first central limit theorem, which is considered his most significant work.
The 19th century saw the development and generalization of the early theory. In 1812 Pierre-Simon, marquis de Laplace published Théorie Analytique des Probabilités, the first fundamental book devoted to probability.

In 1933 the monograph Grundbegriffe der Wahrscheinlichkeitsrechnung by the great Russian mathematician Andrey Nikolaevich Kolmogorov outlined an axiomatic approach that forms the basis of modern probability theory.
According to these axioms, a probability measure P defined on the events of a sample space S satisfies:
1. It is nonnegative: $P(A) \ge 0$ for every event A.
2. It is unity for a certain event: $P(S) = 1$.
3. It is additive over the union of an infinite number of pairwise disjoint events: if $A_1, A_2, \dots$ form a sequence of pairwise mutually exclusive events (that is, $A_i \cap A_j = \emptyset$ for $i \ne j$) in S, then $P\left(\bigcup_{i=1}^{\infty} A_i\right) = \sum_{i=1}^{\infty} P(A_i)$.
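As a concrete illustration, here is a minimal Python sketch (the outcomes and weights are hypothetical illustration values) that models a finite probability space and checks the three axioms in their finite form.

```python
from fractions import Fraction

# A finite sample space with nonnegative outcome weights summing to one.
# The outcomes and weights below are hypothetical illustration values.
S = {"a": Fraction(1, 2), "b": Fraction(1, 3), "c": Fraction(1, 6)}

def P(event):
    """Probability of an event, i.e., a subset of the sample space."""
    return sum(S[outcome] for outcome in event)

assert P({"a", "b"}) >= 0                              # Axiom 1: nonnegativity
assert P(set(S)) == 1                                  # Axiom 2: certain event
assert P({"a"}) + P({"b", "c"}) == P({"a", "b", "c"})  # Axiom 3 (finite form)
```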
2.5 Conditional Probability and Bayes' Theorem:
P(B|A) is the probability of the occurrence of event B given that A occurred, and is given as

$$P(B \mid A) = \frac{P(A \cap B)}{P(A)}. \qquad (1)$$

Knowing that A occurred reduces the sample space to A, and the part of it where B also occurred is $A \cap B$. Note that equation (1) is well-defined only if $P(A) > 0$. Because intersection is commutative, we have

$$P(A \cap B) = P(B \mid A)\,P(A) = P(A \mid B)\,P(B). \qquad (2)$$

If the events $B_1, B_2, \dots, B_n$ form a partition of the sample space, then

$$P(A) = \sum_{i=1}^{n} P(A \mid B_i)\,P(B_i). \qquad (3)$$

Equation (3) is sometimes called the rule of total probability or the rule of elimination. Therefore Bayes' theorem can be written as

$$P(B_r \mid A) = \frac{P(A \mid B_r)\,P(B_r)}{\sum_{i=1}^{n} P(A \mid B_i)\,P(B_i)} \qquad \text{for } r = 1, 2, \dots, n.$$
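To make the mechanics concrete, here is a minimal Python sketch of the rule of total probability and Bayes' theorem for a finite partition; the prior and likelihood values are hypothetical illustration numbers.

```python
# Bayes' theorem over a finite partition B_1, ..., B_n.
prior = [0.5, 0.3, 0.2]          # P(B_r), hypothetical values
likelihood = [0.9, 0.5, 0.1]     # P(A | B_r), hypothetical values

# Rule of total probability: P(A) = sum_i P(A | B_i) P(B_i).
p_a = sum(l * p for l, p in zip(likelihood, prior))

# Bayes' theorem: P(B_r | A) = P(A | B_r) P(B_r) / P(A).
posterior = [l * p / p_a for l, p in zip(likelihood, prior)]

print(p_a)        # 0.62
print(posterior)  # the posteriors sum to one
```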
Detection and pattern recognition problems in radar are mostly modeled using probability, stochastic processes, and statistics. Monte Carlo computer simulations are usually performed in practice to estimate the probability of error of a digital communication system, especially in cases where an analysis of the detector performance is difficult to carry out.
Let us consider the situation shown in the figure below. To determine the presence of an aircraft, a known signal v(t) is sent out by the radar transmitter. If there is no object in the range of the radar, the receiver picks up only noise, whose waveform is represented by X_t. If there is an object in the range, the reflected radar signal plus noise is received. The overall goal is to decide whether the received signal is noise only or noise plus the reflected signal. The figure shows the signal at the input and output of a matched filter.
[Figure: signal at the input of the matched filter]
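As a rough sketch of this decision problem, the following minimal Python example correlates the received waveform with the known signal v(t) and compares the matched-filter output against a threshold; the pulse shape, noise level, and threshold are hypothetical illustration values.

```python
import numpy as np

rng = np.random.default_rng(1)
v = np.ones(64)                          # known transmitted pulse v(t)

received_present = v + rng.normal(0, 2.0, 64)   # object in range: signal + noise
received_absent = rng.normal(0, 2.0, 64)        # no object: noise only

# Matched-filter (correlator) output sampled at the decision instant.
stat_present = np.dot(received_present, v)
stat_absent = np.dot(received_absent, v)

threshold = 0.5 * np.dot(v, v)           # midpoint between the two hypotheses
print(stat_present > threshold)          # likely True  (decide "signal present")
print(stat_absent > threshold)           # likely False (decide "noise only")
```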
Suppose we are designing a computer memory to hold n-bit words. To increase reliability, we can employ an error-correcting code: instead of storing just the n data bits, we store an additional m bits (which are functions of the data bits). When reading back the (n+m)-bit word, if at least x bits are read out correctly, then all n data bits can be recovered (the value of x depends on the code). To characterize the quality of the computer memory, we compute the probability that at least x bits are correctly read back. We can do this with the help of a binomial random variable, as sketched below.
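A minimal Python sketch of this computation follows; the word length, number of check bits, per-bit reliability, and threshold x are hypothetical illustration values.

```python
from math import comb

def p_at_least_x_correct(n_total, p_bit, x):
    """P(at least x of n_total bits read correctly), with independent bits
    and per-bit success probability p_bit (a binomial random variable)."""
    return sum(comb(n_total, k) * p_bit**k * (1 - p_bit)**(n_total - k)
               for k in range(x, n_total + 1))

# Hypothetical values: 32 data bits, 6 check bits, and a code that
# recovers the data whenever at least 35 of the 38 bits are correct.
print(p_at_least_x_correct(38, 0.999, 35))
```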
3.4 Wireless communication systems:
In order to enhance weak signals and maximize the range of communication systems, it is
necessary to use amplifiers. Unfortunately, amplifiers always generate thermal noise, which is
added to the desired signal. As a consequence of the underlying physics, the noise is Gaussian.
Hence, the Gaussian density function plays a prominent role in the analysis and design of
communication systems. When noncoherent receivers are used, e.g., for noncoherent frequency-shift keying, the analysis naturally leads to the Rayleigh, chi-squared, noncentral chi-squared, and Rice density functions.
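As a small illustration of the last point, the sketch below (the noise level is a hypothetical illustration value) generates the envelope of two independent Gaussian noise components, as seen by a noncoherent receiver, and checks its mean against the Rayleigh prediction.

```python
import numpy as np

rng = np.random.default_rng(2)
sigma = 1.0
i_noise = rng.normal(0, sigma, 100_000)   # in-phase Gaussian component
q_noise = rng.normal(0, sigma, 100_000)   # quadrature Gaussian component

envelope = np.sqrt(i_noise**2 + q_noise**2)          # Rayleigh(sigma) samples
print(envelope.mean(), sigma * np.sqrt(np.pi / 2))   # both about 1.2533
```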
3.5 Variability in electronic circuits:
Although circuit manufacturing processes attempt to ensure that all items have nominal
parameter values, there is always some variation among items. How can we estimate the average
values in a batch of items without testing all of them? How good is our estimate?
We will find the answers to these questions when we study parameter estimation and confidence intervals. Incidentally, the same concepts apply to the prediction of presidential elections by surveying only a few voters.
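As a preview, here is a minimal Python sketch of such an estimate under a normal approximation; the sample values are hypothetical illustration data.

```python
import numpy as np

# Estimate the average parameter value of a batch from a small sample and
# attach a normal-approximation 95% confidence interval.
rng = np.random.default_rng(3)
sample = rng.normal(100.0, 5.0, 30)   # e.g., 30 measured resistances (ohms)

mean = sample.mean()
std_err = sample.std(ddof=1) / np.sqrt(len(sample))
lo, hi = mean - 1.96 * std_err, mean + 1.96 * std_err
print(f"estimate {mean:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```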
3.6 Computer network traffic:
Prior to the 1990s, network analysis and design was carried out using long-established
Markovian models, which we can study with the help of Markov chains. After self-similarity was observed in the traffic of local-area networks, wide-area networks, and the World Wide Web, a great research effort began to examine the impact of self-similarity on network analysis and design.
This research has yielded some surprising insights into questions about buffer size vs.
bandwidth, multiple time-scale congestion control, connection duration prediction, and other
issues.
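For illustration, the following minimal Python sketch simulates a hypothetical two-state (idle/busy) Markov chain of the kind used in classical traffic models and estimates its stationary distribution; the transition probabilities are illustration values only.

```python
import numpy as np

rng = np.random.default_rng(4)
P = np.array([[0.9, 0.1],    # P(next state | current = idle)
              [0.4, 0.6]])   # P(next state | current = busy)

state, counts = 0, np.zeros(2)
for _ in range(100_000):
    counts[state] += 1
    state = rng.choice(2, p=P[state])   # draw the next state

print(counts / counts.sum())  # approaches the stationary distribution [0.8, 0.2]
```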
3.7 Bayesian Filters:
Given a stream of observations z and action data u, $d_t = \{u_1, z_1, \dots, u_t, z_t\}$, a sensor model P(z|x), an action model P(x|u,x'), and the prior probability of the system state P(x), we can estimate the state of a dynamic system by using Bayesian filtering techniques; a minimal sketch of the basic recursion follows the list below. The popular Bayesian filters are:
1) Kalman Filters
2) Particle Filters
3) Hidden Markov Models
4) Dynamic Bayesian Networks
5) Partially Observable Markov Decision Processes (POMDPs)
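All of these filters share the same predict/update recursion. The following minimal Python sketch illustrates it on a discrete three-state problem (a histogram filter); the motion and sensor probabilities are hypothetical illustration values.

```python
import numpy as np

belief = np.array([1/3, 1/3, 1/3])       # prior P(x) over a 3-cell world

motion = np.array([[0.8, 0.1, 0.1],      # P(x | u, x'): row = previous state
                   [0.1, 0.8, 0.1],
                   [0.1, 0.1, 0.8]])
likelihood = np.array([0.9, 0.2, 0.2])   # P(z | x) for one observation z

# Predict step: bel'(x) = sum_{x'} P(x | u, x') bel(x').
belief = motion.T @ belief
# Update step: bel(x) is proportional to P(z | x) bel'(x); then normalize.
belief = likelihood * belief
belief /= belief.sum()
print(belief)  # posterior concentrated on the state matching z
```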
These filters and other complete models are described in the reference books. Beyond the foregoing applications, probability and statistics are used in many other fields, such as economics, finance, medical treatment and drug studies, manufacturing quality control, public opinion surveys, games, etc.
4. Practical Examples related to Electrical Engineering
Example 1: Three switches connected in parallel operate independently. Each switch remains closed with probability p. (a) Find the probability of receiving an input signal at the output. (b) Find the probability that switch S_1 is open given that an input signal is received at the output.
[Figure: switches S_1, S_2, and S_3 connected in parallel between the input and the output]
Solution. (a) Let A_i denote the event "switch S_i is closed," so that P(A_i) = p, and let R denote the event "an input signal is received at the output." The signal is received whenever at least one switch is closed, so

$$P(R) = P(A_1 \cup A_2 \cup A_3) = 1 - (1-p)^3 = 3p - 3p^2 + p^3.$$

We can also derive this equation in a different manner. Since any event and its complement form a trivial partition, we can always write

$$P(R) = P(R \mid A_1)\,P(A_1) + P(R \mid \bar{A}_1)\,P(\bar{A}_1).$$

But $P(R \mid A_1) = 1$ and $P(R \mid \bar{A}_1) = P(A_2 \cup A_3) = 2p - p^2$, and using these in the above equation we obtain

$$P(R) = p + (2p - p^2)(1-p) = 3p - 3p^2 + p^3,$$

in agreement with the first derivation.
Note that the events A_1, A_2, A_3 do not form a partition, since they are not mutually exclusive. Obviously any two or all three switches can be closed (or open) simultaneously. Moreover, $P(A_1) + P(A_2) + P(A_3) \ne 1$.
(b) We need $P(\bar{A}_1 \mid R)$. From Bayes' theorem,

$$P(\bar{A}_1 \mid R) = \frac{P(R \mid \bar{A}_1)\,P(\bar{A}_1)}{P(R)} = \frac{(2p - p^2)(1-p)}{3p - 3p^2 + p^3} = \frac{(2-p)(1-p)}{3 - 3p + p^2}.$$
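For a quick sanity check of these formulas, the following Python sketch simulates the circuit by Monte Carlo for a hypothetical value p = 0.3.

```python
import numpy as np

rng = np.random.default_rng(5)
p, n_trials = 0.3, 1_000_000

closed = rng.random((n_trials, 3)) < p         # switch states, True = closed
received = closed.any(axis=1)                  # event R: at least one closed

print(received.mean(), 3*p - 3*p**2 + p**3)    # P(R): both about 0.657
s1_open_and_r = (~closed[:, 0]) & received
print(s1_open_and_r.sum() / received.sum(),    # P(S1 open | R)
      (2*p - p**2) * (1 - p) / (3*p - 3*p**2 + p**3))
```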
Example 2
Three bits are transmitted across a noisy channel and the number of correct receptions is noted.
Find the probability that the number of correctly received bits is two, assuming bit errors are mutually independent and that on each bit transmission the probability of correct reception is λ for some fixed 0 ≤ λ ≤ 1.
Solution. When the problem talks about the event that two bits are correctly received, we
interpret this as meaning exactly two bits are received correctly; i.e., the other bit is received in
error. Hence, there are three ways this can happen: the single error can be in the first bit, the
second bit, or the third bit. To put this into the language of events, let C_i denote the event that the i-th bit is received correctly (so P(C_i) = λ), and let S_2 denote the event that two of the three bits sent are correctly received. Then

$$S_2 = (C_1^c \cap C_2 \cap C_3) \cup (C_1 \cap C_2^c \cap C_3) \cup (C_1 \cap C_2 \cap C_3^c),$$

and since these three events are mutually exclusive,

$$P(S_2) = P(C_1^c \cap C_2 \cap C_3) + P(C_1 \cap C_2^c \cap C_3) + P(C_1 \cap C_2 \cap C_3^c).$$

Next, since C_1, C_2, and C_3 are mutually independent, so are C_1 and (C_2 ∩ C_3). Hence, $C_1^c$ and $(C_2 \cap C_3)$ are also independent. Thus,

$$P(C_1^c \cap C_2 \cap C_3) = P(C_1^c)\,P(C_2 \cap C_3) = (1-\lambda)\lambda^2.$$
Treating the last two terms similarly, we have $P(S_2) = 3(1-\lambda)\lambda^2$. If bits are as likely to be received correctly as incorrectly, i.e., λ = 1/2, then $P(S_2) = 3/8$.
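As a check, the following short Python sketch enumerates all eight reception patterns for λ = 1/2 and recovers P(S_2) = 3/8 exactly.

```python
from itertools import product
from fractions import Fraction

lam = Fraction(1, 2)   # probability of correct reception per bit
p_s2 = sum(
    (lam if c1 else 1 - lam) * (lam if c2 else 1 - lam) * (lam if c3 else 1 - lam)
    for c1, c2, c3 in product([True, False], repeat=3)
    if c1 + c2 + c3 == 2   # exactly two bits received correctly
)
print(p_s2)  # 3/8
```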