
Published on ResearchGate, August 2015. DOI: 10.13140/RG.2.1.4043.9520
Author: Tariq Rahim, Northwestern Polytechnical University


A Report on Probability Theory and its Applications
to Electrical Engineering
Tariq Rahim
PhD student at School of Electronics & Information
Student ID: 2013410001

1. A Brief History of Probability Theory:


How old is the search for random patterns? Archaeologists have found prehistoric artifacts that
appear as if they could have been used in the same way that we use dice today. Bits of bone and
carefully marked stones that have been unearthed at prehistoric sites were clearly created or at
least put aside for a purpose. These objects evidently had meaning to the user, and they resemble
objects that were later used in board games by, for example, the ancient Egyptians. This evidence
is, however, difficult to interpret. Without a written record it is difficult to know what the
artifacts meant to the user.
One of the earliest devices for producing random patterns for which there is direct evidence is
the astragalus, a bone found in the heels of deer, sheep, dogs, and other mammals. When thrown,
the astragalus can land on any of four easy-to-distinguish sides. Many astragali have been found
at prehistoric sites, and it is certain that they were used in ancient Egypt 5,000 years ago in
games of chance. There are pictures of Egyptians throwing astragali while playing board games.
Unfortunately, the only record of these early games is a pictorial one. We do not know how the
game was played or how the patterns produced by throwing the astragali were interpreted.
The Italian mathematician Girolamo Cardano (1501–76), also known as Jerome Cardan, was the
first to write in a somewhat modern way about the odds of throwing dice. His interest in rolling
dice is understandable. He loved to gamble. He loved to play chess and bet on the outcome. He
was also a prominent physician as well as a mathematician, astrologer, and scientist. He lived in
Italy during the Renaissance and contributed to knowledge in a variety of fields. Cardano was a
Renaissance man—smart, self-confident, and self-absorbed. He wrote at length about himself,
and he enjoyed describing and praising his own accomplishments.
The origin of formal probability theory can be traced to the modeling of games of chance, such as dealing from a deck of cards or spinning a roulette wheel. The 17th century holds the first documented evidence of the use of probability theory. More precisely, in 1654 Antoine Gombaud, Chevalier de Méré, a French nobleman with an interest in gaming and gambling questions, called the attention of the famous mathematicians Blaise Pascal and Pierre de
Fermat to a problem about games of chance. The exchange of letters between these two mathematicians became the first documented
evidence of the fundamental principles of the theory of probability. In 1657, the Dutch scientist
Christiaan Huygens wrote a small work, De Ratiociniis in Ludo Aleae, the first printed work on
the calculus of probabilities. The most original work of the Swiss mathematician Jacob Bernoulli
was Ars Conjectandi, published in Basel in 1713, eight years after his death. The work was
incomplete at the time of his death, but it was still a work of the greatest significance in the
development of the theory of probability.
In 1718 the French mathematician Abraham de Moivre, in his book The Doctrine of Chances: A
Method of Calculating the Probabilities of Events in Play, presented the definition of statistical
independence for the first time. In the 1756 edition of this book he also presented the first central
limit theorem, which is considered his most significant work.
The 19th century saw the development and generalization of the early theory. In 1812 Pierre-
Simon, marquis de Laplace, published Théorie Analytique des Probabilités, the
first fundamental book on probability ever published.
In 1933 a monograph, Grundbegriffe der Wahrscheinlichkeitsrechnung, by the great Russian
mathematician Andrey Nikolaevich Kolmogorov outlined an axiomatic approach that
forms the basis for modern probability theory.

2. Probability, Conditional Probability and Bayes' Theorem:


In this section we shall present different types of probability definitions with a little discussion. At
the end we shall present a very important theorem of probability, namely Bayes' theorem.

2.1 Probability – Informal Definition:


Probability is a way of describing uncertainty in numerical terms. The probability of an
event is a number from 0 to 1, inclusive, that indicates the likelihood that the event occurs when
the experiment is performed. The greater the number, the more likely the event.

2.2 Probability – The Classical Definition:


If there are n equally likely possibilities, of which one must occur, and m of these are
regarded as favorable to an event, or as “success,” then the probability of the event or a
“success” is given by m/n.

2.3 Probability – The Frequency Definition:


The probability of a future event is the proportion of times the same kind of event has
occurred under the same conditions in a long run of repeated experiments in the past.

2.4 Probability – Axiomatic Definition:


Let S be a sample space of an experiment. Probability P(.) is a real-valued function that
assigns to each event A in the sample space S a number P(A), called the probability of A, with
the following conditions satisfied:

1. It is nonnegative: P(A) ≥ 0 for every event A.
2. It is unity for a certain event: P(S) = 1.
3. It is additive over the union of an infinite number of pairwise disjoint events: if
A1, A2, … form a sequence of pairwise mutually exclusive events (that is,
Ai ∩ Aj = ∅ for all i ≠ j) in S, then P(A1 ∪ A2 ∪ …) = P(A1) + P(A2) + …

2.5 Conditional Probability and Bayes' Theorem:

P(B|A) is the probability of the occurrence of event B given that A occurred and is given
as:

P(B|A) = P(A ∩ B) / P(A)          (1)

Knowing that A occurred reduces the sample space to A, and the part of it where B also
occurred is A ∩ B. Note that equation (1) is well-defined only if P(A) > 0. Because the intersection is
commutative, we have

P(A ∩ B) = P(A|B) P(B) = P(B|A) P(A)          (2)

which gives us Bayes' theorem:

P(A|B) = P(B|A) P(A) / P(B)

When B1, B2, …, Bn are mutually exclusive and exhaustive, namely S = B1 ∪ B2 ∪ … ∪ Bn, therefore

P(A) = P(A|B1) P(B1) + P(A|B2) P(B2) + … + P(A|Bn) P(Bn)

The above equation is sometimes called the rule of total probability or the rule of
elimination. Therefore Bayes' theorem can be re-written as

P(Br|A) = P(A|Br) P(Br) / [P(A|B1) P(B1) + … + P(A|Bn) P(Bn)]

for r = 1, 2, …, n.
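As a quick numerical illustration of the rule of total probability and Bayes' theorem above, the following Python sketch uses a hypothetical two-event partition (the priors and likelihoods are assumed values, not from the text):

```python
# Hypothetical binary source: B1 = "bit 0 sent", B2 = "bit 1 sent" form a
# partition of the sample space; A = "receiver outputs 0".
priors = {"B1": 0.6, "B2": 0.4}        # P(B_r), assumed values
likelihoods = {"B1": 0.9, "B2": 0.2}   # P(A | B_r), assumed values

# Rule of total probability: P(A) = sum over r of P(A | B_r) P(B_r)
p_a = sum(likelihoods[b] * priors[b] for b in priors)

# Bayes' theorem: P(B_r | A) = P(A | B_r) P(B_r) / P(A)
posteriors = {b: likelihoods[b] * priors[b] / p_a for b in priors}

print(p_a)               # 0.9*0.6 + 0.2*0.4 = 0.62
print(posteriors["B1"])  # 0.54 / 0.62 ≈ 0.871
```

Note that the posteriors always sum to one, since the B_r partition the sample space.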

3. Applications of Probability and Statistics to Electrical Engineering:

Probability theory provides powerful tools to explain, model, analyze, and design technology
developed by electrical and computer engineers. In the field of communication engineering,
digital communication, source coding, channel coding, filter design for noise cancellation,
and pattern recognition in radar are mostly modeled with probability, stochastic processes, and
statistics. Monte Carlo computer simulations are usually performed in practice to estimate the
probability of error of a digital communication system, especially in cases where the analysis of
the detector performance is difficult to perform.
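A minimal sketch of such a Monte Carlo error-probability estimate, assuming antipodal signaling (a hypothetical ±A pulse in additive Gaussian noise with assumed amplitude and noise level), might look like this:

```python
import math
import random

def monte_carlo_ber(amplitude=1.0, sigma=0.5, trials=200_000, seed=1):
    """Estimate the bit-error probability of antipodal signaling (+A / -A)
    in additive Gaussian noise by simulation (all parameters hypothetical)."""
    rng = random.Random(seed)
    errors = 0
    for _ in range(trials):
        bit = rng.choice([-1, 1])
        received = amplitude * bit + rng.gauss(0.0, sigma)  # channel + noise
        decision = 1 if received >= 0 else -1               # threshold detector
        if decision != bit:
            errors += 1
    return errors / trials

# Exact error probability for comparison: Q(A/sigma) = 0.5*erfc(A/(sigma*sqrt(2)))
exact = 0.5 * math.erfc(1.0 / (0.5 * math.sqrt(2)))
print(monte_carlo_ber(), exact)
```

The simulated estimate should agree with the closed-form Q-function value to within the Monte Carlo sampling error.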

3.1 Signal processing:

Let us consider the situation shown in the figure below. To determine the presence of an aircraft, a
known signal v(t) is sent out by the radar transmitter. If there is no object in the range of the
radar, the radar receives only noise, whose waveform is represented by Xt. If there is an object in
the range, the reflected radar signal plus noise is received. The overall goal is to decide whether
the received signal is noise only or noise plus the reflected signal. The figure shows the signal
at the input and at the output of a matched filter.

[Figure: signal at the input of the matched filter and at the output of the matched filter]

3.2 Computer memories:

Suppose we are designing a computer memory to hold n-bit words. To increase the reliability,
we can employ an error-correcting code system. With this system, instead of storing just n bits,
we store an additional m bits (which are functions of the data bits). When reading back the
(n+m)-bit word, if at least x bits are read out correctly, then all n data bits can be recovered (the
value of x depends on the code). To characterize the quality of the computer memory, we
compute the probability that at least x bits are correctly read back. We can do this with the help
of a binomial random variable.
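The computation above is a tail sum of the binomial distribution. A short Python sketch, with hypothetical values for n, m, x, and the per-bit reliability:

```python
from math import comb

def prob_at_least_x_correct(n_plus_m, x, p_bit):
    """P(at least x of the n+m stored bits are read back correctly), assuming
    independent bit read-outs with per-bit success probability p_bit.
    This is the upper tail of a binomial(n+m, p_bit) random variable."""
    return sum(comb(n_plus_m, k) * p_bit**k * (1 - p_bit)**(n_plus_m - k)
               for k in range(x, n_plus_m + 1))

# Hypothetical numbers: 16 data bits plus 5 check bits, a code that recovers
# the data whenever at least 19 of the 21 bits read correctly, and a per-bit
# read reliability of 0.99.
print(prob_at_least_x_correct(21, 19, 0.99))
```

For a sanity check, with 3 bits, p = 1/2, and x = 2, the tail sum gives (3 + 1)/8 = 1/2.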

3.3 Optical Communication:


Optical communication systems use photodetectors to interface between optical and electronic
subsystems. When these systems are at the limits of their operating capabilities, the number of
photoelectrons produced by the photodetector is well modeled by a Poisson random variable. In
deciding whether a transmitted bit is a zero or a one, the receiver counts the number of
photoelectrons and compares it to a threshold. System performance is determined by computing
the probability that the threshold is exceeded.
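The threshold-exceedance probability is one minus a Poisson cumulative sum. A sketch with hypothetical photoelectron means and threshold:

```python
import math

def prob_count_exceeds(mean_count, threshold):
    """P(N > threshold) for N ~ Poisson(mean_count):
    1 - sum_{k=0}^{threshold} exp(-m) m^k / k!"""
    cdf = sum(math.exp(-mean_count) * mean_count**k / math.factorial(k)
              for k in range(threshold + 1))
    return 1.0 - cdf

# Hypothetical link: a mean of 20 photoelectrons when a "one" is sent, a mean
# of 2 when a "zero" is sent, and a decision threshold of 9 counts.
print(prob_count_exceeds(20, 9))  # P(decide "one" | one sent)
print(prob_count_exceeds(2, 9))   # P(decide "one" | zero sent), a false alarm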

3.4 Wireless communication systems:
In order to enhance weak signals and maximize the range of communication systems, it is
necessary to use amplifiers. Unfortunately, amplifiers always generate thermal noise, which is
added to the desired signal. As a consequence of the underlying physics, the noise is Gaussian.
Hence, the Gaussian density function plays a prominent role in the analysis and design of
communication systems. When noncoherent receivers are used, e.g., noncoherent frequency shift
keying, this naturally leads to the Rayleigh, chi-squared, noncentral chi-squared, and Rice
density functions.
3.5 Variability in electronic circuits:
Although circuit manufacturing processes attempt to ensure that all items have nominal
parameter values, there is always some variation among items. How can we estimate the average
values in a batch of items without testing all of them? How good is our estimate?
We will find the answers to these questions when we study parameter estimation and
confidence intervals. Incidentally, the same concepts apply to the prediction of presidential
elections by surveying only a few voters.
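As a sketch of what such an estimate looks like, the following uses the normal approximation to build a confidence interval for the batch mean from a small sample (the resistor batch, its nominal value, and its spread are all hypothetical):

```python
import math
import random

def mean_confidence_interval(samples, z=1.96):
    """Approximate 95% confidence interval for the population mean,
    using the sample mean, the sample variance, and the normal
    approximation (z = 1.96 for 95% coverage)."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)  # sample variance
    half_width = z * math.sqrt(var / n)
    return mean - half_width, mean + half_width

# Hypothetical batch: nominal 100-ohm resistors with Gaussian variation.
rng = random.Random(42)
sample = [rng.gauss(100.0, 2.0) for _ in range(50)]
low, high = mean_confidence_interval(sample)
print(low, high)  # an interval that should bracket a value near 100
```

Testing 50 items out of a batch gives an interval a fraction of an ohm wide, which quantifies "how good is our estimate" without measuring every item.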
3.6 Computer network traffic:
Prior to the 1990s, network analysis and design were carried out using long-established
Markovian models, which we can study through the theory of Markov chains. After the
discovery of self-similarity in the traffic of local-area networks, wide-area networks, and the
World Wide Web, a great research effort began to examine the impact of self-similarity on
network analysis and design. This research has yielded some surprising insights into questions
about buffer size versus bandwidth, multiple-time-scale congestion control, connection duration
prediction, and other issues.
3.7 Bayesian Filters:
Given a stream of observations z and action data u, a sensor
model P(z|x), an action model P(x|u,x'), and the prior probability of the system state P(x), we can
estimate the state of a dynamic system by using Bayesian filtering techniques. The popular
Bayesian filters are:

1) Kalman Filters
2) Particle Filters
3) Hidden Markov Models
4) Dynamic Bayesian Networks
5) Partially Observable Markov Decision Processes (POMDPs)

These filters and other complete models are given in the reference books.
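To give a flavor of the first item on the list, here is a minimal one-dimensional Kalman filter sketch. All parameters (process noise, measurement noise, initial state) are illustrative assumptions, and no control input is modeled:

```python
def kalman_1d(mu, var, measurements, meas_var=1.0, process_var=0.5):
    """Minimal 1-D Kalman filter: alternate a predict step (uncertainty grows
    by process_var) with an update step that fuses each measurement z using
    the Kalman gain k. All parameters are illustrative assumptions."""
    for z in measurements:
        # Predict: no control input assumed; uncertainty grows.
        var = var + process_var
        # Update: the gain weighs the measurement against the prediction.
        k = var / (var + meas_var)
        mu = mu + k * (z - mu)
        var = (1 - k) * var
    return mu, var

# Starting from a very uncertain prior, noisy readings near 5 pull the
# estimate toward 5 while the variance shrinks.
mu, var = kalman_1d(mu=0.0, var=100.0, measurements=[5.1, 4.9, 5.0, 5.2])
print(mu, var)
```

After only a few measurements the posterior mean sits near the true value and the posterior variance is far below the prior's, which is the essence of Bayesian state estimation.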

Beyond the foregoing applications, probability and statistics are used in many other fields,
such as economics, finance, medical treatment and drug studies, manufacturing quality control,
public opinion surveys, and games.

4. Practical Examples related to Electrical Engineering
Example 1: Three switches connected in parallel operate independently. Each switch remains
closed with probability p. (a) Find the probability of receiving an input signal at the output.
(b) Find the probability that switch S1 is open given that an input signal is received at the
output.

[Figure: switches S1, S2, and S3 connected in parallel between the input and the output]
Solution: (a) Let Ai = "switch Si is closed," i = 1, 2, 3. Then P(Ai) = p. Since the switches
operate independently, we have

P(Ai ∩ Aj) = P(Ai) P(Aj),    P(A1 ∩ A2 ∩ A3) = P(A1) P(A2) P(A3).

Let R = "input signal is received at the output." For the event R to occur, either switch 1 or
switch 2 or switch 3 must remain closed, i.e.,

R = A1 ∪ A2 ∪ A3.

P(R) = P(A1 ∪ A2 ∪ A3) = 1 − (1 − p)^3 = 3p − 3p^2 + p^3.

We can also derive this equation in a different manner. Since any event and its complement
form a trivial partition, we can always write

P(R) = P(R | A1) P(A1) + P(R | A1') P(A1').

But P(R | A1) = 1 and P(R | A1') = P(A2 ∪ A3) = 2p − p^2, and using these in the above
equation we obtain

P(R) = p + (2p − p^2)(1 − p) = 3p − 3p^2 + p^3.

Note that the events A1, A2, A3 do not form a partition, since they are not mutually exclusive.
Obviously any two or all three switches can be closed (or open) simultaneously. Moreover,
P(A1) + P(A2) + P(A3) = 3p ≠ 1 in general.

(b) We need P(A1' | R). From Bayes' theorem,

P(A1' | R) = P(R | A1') P(A1') / P(R)
           = (2p − p^2)(1 − p) / (3p − 3p^2 + p^3)
           = (2 − p)(1 − p) / (3 − 3p + p^2).

Because of the symmetry of the switches, we also have

P(A1' | R) = P(A2' | R) = P(A3' | R).
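The closed-form answer in part (a) can be checked by simulation. The sketch below picks a hypothetical value p = 0.3 and compares a Monte Carlo estimate against 3p − 3p² + p³:

```python
import random

def simulate_parallel_switches(p, trials=200_000, seed=7):
    """Monte Carlo check of Example 1: three independent parallel switches,
    each closed with probability p; the signal gets through if any is closed."""
    rng = random.Random(seed)
    received = sum(
        any(rng.random() < p for _ in range(3)) for _ in range(trials)
    )
    return received / trials

p = 0.3
estimate = simulate_parallel_switches(p)
exact = 3 * p - 3 * p**2 + p**3   # equals 1 - (1 - p)^3 from the solution
print(estimate, exact)            # exact = 0.657 for p = 0.3
```

The simulated frequency converges to the closed-form probability as the number of trials grows, exactly as the frequency definition of probability in Section 2.3 suggests.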

Example 2

Three bits are transmitted across a noisy channel and the number of correct receptions is noted.
Find the probability that the number of correctly received bits is two, assuming bit errors are
mutually independent and that on each bit transmission the probability of correct reception is λ
for some fixed 0 ≤ λ ≤ 1.

Solution: When the problem talks about the event that two bits are correctly received, we
interpret this as meaning exactly two bits are received correctly; i.e., the other bit is received in
error. Hence, there are three ways this can happen: the single error can be in the first bit, the
second bit, or the third bit. To put this into the language of events, let Ci denote the event that the
i-th bit is received correctly (so P(Ci) = λ), and let S2 denote the event that two of the three bits
sent are correctly received. Then

S2 = (C1' ∩ C2 ∩ C3) ∪ (C1 ∩ C2' ∩ C3) ∪ (C1 ∩ C2 ∩ C3').

This is a disjoint union, and so P(S2) is equal to

P(S2) = P(C1' ∩ C2 ∩ C3) + P(C1 ∩ C2' ∩ C3) + P(C1 ∩ C2 ∩ C3').

Next, since C1, C2, and C3 are mutually independent, so are C1' and (C2 ∩ C3). Thus,

P(C1' ∩ C2 ∩ C3) = P(C1') P(C2) P(C3) = (1 − λ) λ^2.

Treating the last two terms similarly, we have P(S2) = 3(1 − λ)λ^2. If bits are as likely to be
received correctly as incorrectly, i.e., λ = 1/2, then P(S2) = 3/8.
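The answer 3(1 − λ)λ² can also be verified by brute force, enumerating all 2³ reception outcomes and summing the probabilities of those with exactly two correct bits:

```python
from itertools import product

def prob_exactly_two_correct(lam):
    """Enumerate all 2^3 outcomes of (correct, incorrect) receptions and sum
    the probabilities of those with exactly two correct bits. The result
    should equal 3 * (1 - lam) * lam**2 from Example 2."""
    total = 0.0
    for outcome in product([True, False], repeat=3):
        if sum(outcome) == 2:       # exactly two bits received correctly
            prob = 1.0
            for correct in outcome:
                prob *= lam if correct else (1 - lam)
            total += prob
    return total

print(prob_exactly_two_correct(0.5))  # 3/8 = 0.375
```

This direct enumeration is only feasible because the sample space is tiny; for longer words one would use the binomial formula from Section 3.2 instead.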

5. References:

 Han-Fu Chen, "Stochastic Approximation and Its Applications," Kluwer Academic Publishers, 2003.
 I. Florescu, "Probability: A (Very) Brief History," Lecture Notes on Probability, Stevens Institute of Technology, New York, 2008.
 John Tabak, "Probability and Statistics: The Science of Uncertainty" (The History of Mathematics series), 2004.
 Serguei Primak, "Stochastic Methods and Their Applications to Communications: Stochastic Differential Equation Approach," John Wiley & Sons Ltd, 2004.
 K. Stordahl, "The History Behind the Probability Theory and the Queuing Theory," Telektronikk 2, Oslo, 2007.
 John A. Gubner, "Probability and Random Processes for Electrical and Computer Engineers," Cambridge University Press, 2006.
 Y. Suhov and M. Kelbert, "Probability and Statistics by Example," Cambridge University Press, 2005.
 I. Miller and M. Miller, "J. E. Freund's Mathematical Statistics with Applications," 7th ed.
 K. M. Ramachandran, "Mathematical Statistics with Applications," Elsevier, 2009.
 E. Alpaydin, "Introduction to Machine Learning," Prentice Hall of India, New Delhi, 2005.
 W. Burgard, "A Probability Primer for Robotics Students," University of Freiburg, Germany, 2002.
