Rohini 82317868804
(Source: https://www.researchgate.net/figure/Block-diagram-of-a-typical-communication-system-doi101371-journalpone0082935g001_fig15_259457178)
1. The encoding process takes k information bits at a time and maps each k-bit
sequence into a unique n-bit sequence. Such an n-bit sequence is called a code
word.
2. The code rate is defined as k/n.
3. If the transmitted symbols are M-ary (i.e., they take M levels), and at the
receiver the detector, which follows the demodulator, produces an estimate of
the transmitted data symbol with
(a). M levels, the same as that of the transmitted symbols, then we say the
detector has made a hard decision;
(b). Q levels, Q being greater than M, then we say the detector has made a soft
decision.
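The distinction between hard and soft decisions can be sketched numerically. The snippet below is a minimal illustration, assuming binary antipodal signalling (M = 2), a decision threshold at zero, and a uniform Q-level quantizer over an assumed range [-1, 1]; the function names and the choice Q = 8 are illustrative, not from the notes.

```python
import numpy as np

def hard_decision(r):
    # Hard decision: M = 2 output levels, decided by the sign of the sample
    return np.where(r >= 0, 1, 0)

def soft_decision(r, Q=8, limit=1.0):
    # Soft decision: uniformly quantize the sample into Q > M levels
    # over the assumed range [-limit, limit]
    clipped = np.clip(r, -limit, limit - 1e-12)
    return np.floor((clipped + limit) / (2 * limit) * Q).astype(int)

r = np.array([-0.9, -0.2, 0.1, 0.7])   # noisy received samples
print(hard_decision(r))                # 2 levels:  [0 0 1 1]
print(soft_decision(r))                # 8 levels:  [0 3 4 6]
```

The soft decision retains reliability information (how far the sample lies from the threshold) that the hard decision discards, which is why soft-decision decoding generally performs better.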
Channel models:
1. Binary symmetric channel (BSC):
Suppose (a) the channel is an additive noise channel, and (b) the modulator and
demodulator/detector are included as parts of the channel. If, furthermore, the
modulator employs binary waveforms and the detector makes hard decisions, then
the channel has a discrete-time binary input sequence and a discrete-time binary
output sequence.
Note that if the channel noise and other interferences cause statistically
independent errors in the transmitted binary sequence with average probability
p, the channel is called a BSC. Besides, since each output bit from the channel
depends only upon the corresponding input bit, the channel is also memoryless.
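A BSC is straightforward to simulate: each transmitted bit is flipped independently with probability p. The sketch below assumes NumPy and a fixed random seed for reproducibility; the function name is illustrative.

```python
import numpy as np

def bsc(bits, p, rng=None):
    # Binary symmetric channel: flip each bit independently with
    # crossover probability p (memoryless, statistically independent errors)
    rng = rng or np.random.default_rng(0)
    flips = rng.random(len(bits)) < p
    return np.bitwise_xor(bits, flips.astype(int))

bits = np.zeros(10000, dtype=int)       # all-zero input sequence
out = bsc(bits, p=0.1)
print(out.mean())                       # empirical error rate, close to p = 0.1
```

Because the flips are drawn independently per bit, the empirical error rate converges to p as the block length grows, which matches the memoryless property stated above.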
2. Discrete memoryless channel (DMC):
A channel is the same as above, but with q-ary symbols at the output of the
channel encoder and Q-ary symbols at the output of the detector, where Q ≥ q.
If the channel and the modulator are memoryless, then the channel can be
described by a set of qQ conditional probabilities
P(Y = y_i | X = x_j) ≡ P(y_i | x_j), i = 0, 1, ..., Q − 1; j = 0, 1, ..., q − 1.
Such a channel is called a discrete memoryless channel (DMC).
If the input to a DMC is a sequence of n symbols u_1, u_2, ..., u_n selected from the
alphabet X and the corresponding output is the sequence v_1, v_2, ..., v_n of
symbols from the alphabet Y, then, by memorylessness, the joint conditional
probability factors as
P(Y_1 = v_1, ..., Y_n = v_n | X_1 = u_1, ..., X_n = u_n) = P(v_1 | u_1) P(v_2 | u_2) ... P(v_n | u_n).
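This product form is easy to verify numerically. The sketch below assumes a binary DMC (a BSC with p = 0.1) whose transition matrix is an illustrative choice, not from the notes; symbols are represented as integer indices into the alphabets.

```python
import numpy as np

# Assumed transition matrix: P_mat[j][i] = P(Y = y_i | X = x_j)
# for a BSC with crossover probability p = 0.1
P_mat = np.array([[0.9, 0.1],
                  [0.1, 0.9]])

def joint_conditional(u, v, P_mat):
    # Memoryless channel: P(v_1,...,v_n | u_1,...,u_n) = product of P(v_k | u_k)
    prob = 1.0
    for uk, vk in zip(u, v):
        prob *= P_mat[uk][vk]
    return prob

# Send [0, 1, 1], receive [0, 1, 0]: probability is 0.9 * 0.9 * 0.1
print(joint_conditional([0, 1, 1], [0, 1, 0], P_mat))
```

Each factor depends only on the corresponding input symbol, which is exactly the memoryless property defined above.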
Channel Capacity:
Channel model: DMC
Input alphabet: X = {x_0, x_1, x_2, ..., x_{q−1}}
Output alphabet: Y = {y_0, y_1, y_2, ..., y_{Q−1}}
Suppose x_j is transmitted and y_i is received. The mutual information (MI)
provided about the event {X = x_j} by the occurrence of the event {Y = y_i} is
I(x_j; y_i) = log [ P(Y = y_i | X = x_j) / P(Y = y_i) ] = log [ P(y_i | x_j) / P(y_i) ].
Hence, the average mutual information (AMI) provided by the output Y about
the input X is
I(X; Y) = Σ_{j=0}^{q−1} Σ_{i=0}^{Q−1} P(x_j) P(y_i | x_j) log [ P(y_i | x_j) / P(y_i) ].
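The double sum above can be computed directly for any DMC once the input distribution and the transition matrix are given. The sketch below assumes a BSC with p = 0.1 and equiprobable inputs as an illustrative example; for that case the AMI equals 1 − H(p), the BSC capacity.

```python
import numpy as np

def average_mutual_info(Px, P_mat):
    # I(X;Y) = sum_j sum_i P(x_j) P(y_i|x_j) log2( P(y_i|x_j) / P(y_i) )
    Py = Px @ P_mat                     # output distribution: P(y_i) = sum_j P(x_j) P(y_i|x_j)
    I = 0.0
    for j in range(len(Px)):
        for i in range(len(Py)):
            if P_mat[j][i] > 0:         # skip zero-probability transitions
                I += Px[j] * P_mat[j][i] * np.log2(P_mat[j][i] / Py[i])
    return I

# BSC with p = 0.1 and equiprobable inputs: I(X;Y) = 1 - H(0.1) ≈ 0.531 bits
p = 0.1
P_mat = np.array([[1 - p, p],
                  [p, 1 - p]])
print(average_mutual_info(np.array([0.5, 0.5]), P_mat))
```

Using log base 2 gives the AMI in bits per channel use; maximizing it over the input distribution P(x_j) yields the channel capacity.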