... is a measure of uncertainty
Encoding
Entropy
Information
Redundancy
{1,2,3,4,5,6} is the sample space of ...
one coin toss
one die roll
sum of two dice
drawing a card from a standard deck
The redundancy of a code S = ...
1 - I_avr/I_max
I_avr/I_max
1 + I_avr/I_max
I_max/I_avr
The average codeword length q_avr = ...
∑ (p_i * q_i)
∑ (p_i / q_i)
∑ p_i
∑ q_i
The efficiency of a code E = ...
I_avr/100
I_max - I_avr
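A minimal Python sketch tying the formulas above together, with made-up probabilities and codeword lengths (and assuming E = I_avr/I_max for the efficiency, as the options suggest):

```python
import math

p = [0.5, 0.25, 0.125, 0.125]   # made-up symbol probabilities
q = [1, 2, 3, 3]                # made-up codeword lengths

I_avr = -sum(pi * math.log2(pi) for pi in p)   # average information (entropy)
I_max = math.log2(len(p))                      # maximum information, log2 m
S = 1 - I_avr / I_max                          # redundancy
E = I_avr / I_max                              # efficiency (assumed definition)
q_avr = sum(pi * qi for pi, qi in zip(p, q))   # average codeword length

print(f"S = {S:.3f}, E = {E:.3f}, q_avr = {q_avr}")
# Here q_avr = I_avr = 1.75 bits, i.e. this made-up code is optimal.
```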
ASCII code is a
Variable length code
Fixed length code
Error-correction code
None of the given
By Bayes' rule for conditional entropy, H(Y|X) = ...
H(X|Y) - H(X) + H(Y)
P(B|A) * P(A) / P(B)
H(X|Y) - H(X)
H(X|Y)+ H(Y)
By Bayes' theorem ...
P(B|A) = P(A and B)/P(A)
P(A|B) = P(B|A) * P(A) / P(B)
P(B|A) = P(A and B) * P(A)
P(A|B) = P(B|A) * P(A) * P(B)
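A quick numeric check of both identities, with made-up probabilities:

```python
P_A = 0.3                # made-up P(A)
P_B_given_A = 0.8        # made-up P(B|A)
P_B = 0.5                # made-up P(B)

P_A_and_B = P_B_given_A * P_A           # since P(B|A) = P(A and B)/P(A)
P_A_given_B = P_B_given_A * P_A / P_B   # Bayes' theorem
print(round(P_A_and_B, 2), round(P_A_given_B, 2))   # 0.24 0.48
```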
By the chain rule, H(X,Y) = H(Y|X) + ...
H(X)
H(Y)
H(Y|X)
H(X|Y)
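A small sketch verifying the chain rule (and the joint-entropy bound from the questions above) on a made-up joint distribution:

```python
import math

p_xy = {('x1', 'y1'): 0.4, ('x1', 'y2'): 0.1,
        ('x2', 'y1'): 0.2, ('x2', 'y2'): 0.3}   # made-up p(x, y)

def H(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

p_x = {'x1': 0.5, 'x2': 0.5}   # marginals of p_xy
p_y = {'y1': 0.6, 'y2': 0.4}

H_XY = H(p_xy.values())
# H(Y|X) = sum over x of p(x) * H(Y | X = x)
H_Y_given_X = sum(p_x[x] * H([p_xy[(x, y)] / p_x[x] for y in p_y])
                  for x in p_x)

print(abs(H_XY - (H(p_x.values()) + H_Y_given_X)) < 1e-9)   # chain rule: True
print(H_XY <= H(p_x.values()) + H(p_y.values()))            # bound: True
```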
By Hartley's formula, the amount of information I = ...
I = n*log m
I = m*n
I = log (m/n)
I = log (m*n)
By Hartley's formula, the entropy H = ...
H = - ∑ (p_i * log p_i)
H = - ∑ (log p_i)
H = log m
H = - ∑ (p_i / log p_i)
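Hartley's two formulas side by side, with made-up m and n:

```python
import math

m = 32   # made-up alphabet size (equiprobable symbols)
n = 10   # made-up message length

H = math.log2(m)       # entropy per symbol: 5 bits
I = n * math.log2(m)   # information in the message: 50 bits
print(H, I)
```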
By the property of joint entropy H(X,Y) <= ...
H(X) + H(Y)
By the property of joint entropy H(X,Y) ...
H(X,Y) >= H(X) and H(X,Y) <= H(Y)
H(X,Y) <= H(X) and H(X,Y) >= H(Y)
H(X,Y) >= H(X) and H(X,Y) >= H(Y)
H(X,Y) >= H(X) + H(Y)
By Shannon's formula, the amount of information I = ...
I = - n * ∑ (p_i * log p_i)
I = - n * ∑ (log p_i)
I = - n * ∑ p_i
I = - n * ∑ (p_i / log p_i)
By Shannon's formula, the entropy H = ...
H = - ∑ (p_i * log p_i)
H = - ∑ p_i
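And Shannon's formulas for unequal probabilities, again with made-up values:

```python
import math

p = [0.5, 0.3, 0.2]   # made-up symbol probabilities
n = 100               # made-up message length

H = -sum(pi * math.log2(pi) for pi in p)   # entropy per symbol
I = n * H                                  # information in the message
print(f"H = {H:.4f} bits/symbol, I = {I:.2f} bits")
```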
Calculate the code rate for Hamming (15,11) code
1
0.733
0.571
0.839
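The code rate R = k/n for the three Hamming codes that appear in this section:

```python
for n, k in [(7, 4), (15, 11), (31, 26)]:
    print(f"Hamming ({n},{k}): R = {k/n:.3f}")
# Hamming (7,4): R = 0.571
# Hamming (15,11): R = 0.733
# Hamming (31,26): R = 0.839
```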
Calculate the code rate for Hamming (31,26) code
Choose the formula for the number N of possible messages of length n, given a source alphabet of m characters, each of which can appear at any position of the message.
N = m^n
N = n^m
N = m*n
N = log m
A code is optimal when ...
q_avr = H
q_avr ≠ H
q_avr < H
q_avr > H
Code rate R (k information bits and n total bits) is defined as
k = n/R
R = k * n
n = R * k
R = k/n
Conditional entropy H(Y|X) lies between
- H(Y) and 0
0 and H(Y)
- H(Y) and H(Y)
0 and 1
Conditional probability P(B|A) = ...
P(A and B)/P(A)
P(A and B)*P(A)
P(B|A) * P(A) * P(B)
Find the amount of information carried by one symbol of a language with n = 18 symbols in total.
I = log_2 18
I = log_18 2
I = 18 * log_2 18
I = 18 * log_18 2
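A one-line check of the answer, I = log_2 18:

```python
import math
print(math.log2(18))   # ≈ 4.17 bits per symbol
```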
For a Hamming (15, 11) code, 15 is the total number of bits and 11 is the number of ...
redundant bits
data bits
parity bits
none of the given
Which of the following symbols will get the shortest codeword after Shannon-Fano coding if probabilities are p(a) = 0.05, p(b) = 0.6, p(c) = 0.2 and p(d) = 0.15?
c
a
d
b
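A minimal Shannon-Fano sketch for this question. It uses the simple split rule "cut at the first point where the running probability reaches half of the total", which is enough for this example; b, the most probable symbol, gets the shortest codeword:

```python
def shannon_fano(symbols):
    """symbols: list of (symbol, probability), sorted by descending probability."""
    if len(symbols) == 1:
        return {symbols[0][0]: ''}
    total, acc, split = sum(p for _, p in symbols), 0.0, 1
    for i, (_, p) in enumerate(symbols[:-1], start=1):
        acc += p
        if acc >= total / 2:
            split = i
            break
    codes = {s: '0' + c for s, c in shannon_fano(symbols[:split]).items()}
    codes.update({s: '1' + c for s, c in shannon_fano(symbols[split:]).items()})
    return codes

probs = sorted({'a': 0.05, 'b': 0.6, 'c': 0.2, 'd': 0.15}.items(),
               key=lambda kv: -kv[1])
print(shannon_fano(probs))   # {'b': '0', 'c': '10', 'd': '110', 'a': '111'}
```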
Which of the following is not a correct statement about probability?
It must have a value between 0 and 1
It is the collection of several experiments
A value near 0 means that the event is not likely to occur
It can be reported as a decimal or a fraction
Which of the following is a part of channel coding?
Huffman code
Hamming code
Shannon-Fano code
RLE code
For a Hamming (31, 26) code, 31 is the total number of bits and 26 is the number of ...
For Hamming distance d_min and s errors in the received word, the condition to be able to correct the errors is
d_min >= s + 1
d_min >= 2s + 1
d_min >= 2s + 2
d_min >= s + 2
Hamming distance can easily be found with ...
XNOR operation
XOR operation
OR operation
AND operation
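Hamming distance via XOR, as the answer says: differing bit positions show up as ones in a ^ b. The example words come from the questions below:

```python
def hamming_distance_bits(a: int, b: int) -> int:
    return bin(a ^ b).count('1')

print(hamming_distance_bits(0b001111, 0b010011))   # 3
print(hamming_distance_bits(0b010111, 0b010011))   # 1
print(hamming_distance_bits(0b101001, 0b010011))   # 4
```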
In a throw of a coin, what is the probability of getting heads?
1/2
2
0
Specify the formula that determines the numbers n and k of bits of a Hamming code
(n, k) = (2^r - 1, 2^r - 1 - r)
(n, k) = (2^r, 2^r - 1 - r)
(n, k) = (2^r - 1, 2^r - r)
(n, k) = (2^r - 1, 2^r - 1 + r)
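Plugging r = 3, 4, 5 into (n, k) = (2^r - 1, 2^r - 1 - r) reproduces the three codes used throughout this section:

```python
for r in (3, 4, 5):
    n = 2**r - 1   # total number of bits
    k = n - r      # number of data bits
    print(f"r = {r}: Hamming ({n},{k})")
# r = 3: Hamming (7,4)
# r = 4: Hamming (15,11)
# r = 5: Hamming (31,26)
```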
In a throw of a coin, what is the probability of getting tails?
Specify the formula to find the amount of information if events have different probabilities.
Hartley's formula
Shannon's formula
Fano's formula
Bayes' formula
Specify the formula to find the amount of information if events have the same probabilities.
Specify the most effective type of code when an alphabet consists of 2 symbols with probabilities p(x1) = 0.05 and p(x2) = 0.95.
ASCII code
Shannon-Fano's code
Shannon-Fano's code by blocks
Hartley's code
In a digital communication system, the smaller the code rate, the ... the redundant bits.
less
equal
more
unpredictable
Specify the right formula if d_min is the Hamming distance, s the number of correctable errors, and r the number of detectable errors.
d_min >= s + r + 1
d_min >= 2s + r + 1
d_min >= s + 2r + 1
d_min >= s + r + 2
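A small helper deriving both capabilities from d_min (detect r errors when d_min >= r + 1, correct s errors when d_min >= 2s + 1):

```python
def capability(d_min: int):
    r = d_min - 1           # maximum number of detectable errors
    s = (d_min - 1) // 2    # maximum number of correctable errors
    return r, s

print(capability(3))   # (2, 1): a d_min = 3 code detects 2 errors, corrects 1
```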
Specify two types of error control algorithms
block and linear
linear and nonlinear
block and convolutional
Noise affects ...
information source
receiver
channel
transmitter
The basic idea behind Shannon-Fano coding is to
compress data by using more bits to encode more frequently occurring characters
compress data by using fewer bits to encode more frequently occurring characters
compress data by using fewer bits to encode less frequently occurring characters
expand data by using fewer bits to encode more frequently occurring characters
Probability of occurrence of an event lies between
-1 and 0
-1 and 1
exactly 1
The Hamming distance between "client" and "server" is
6
impossible to detect
The Hamming distance between "make" and "made" is
4
3
The Hamming distance between "push" and "pull" is
The probability of a second event, given that a first event has occurred, is classified as
conditional probability
joint entropy
conditional entropy
The Hamming distance between "starting" and "finishing" is
5
The Hamming distance between 001111 and 010011 is
Shannon-Fano and Huffman codes are encoding algorithms used for
lossy data compression
lossless data compression
error correction
error detection
The Hamming distance between 010111 and 010011 is
The Hamming distance between 011111 and 010011 is
Specify parts of the receiver side
Source encoder, channel encoder, digital modulator
Source decoder, channel decoder, digital demodulator
Source decoder, channel encoder, digital modulator
Source encoder, channel decoder, digital modulator
The Hamming distance between 101001 and 010011 is
The Hamming distance between two strings with equal length is ...
the number of positions at which the corresponding symbols are different
the number of positions at which the corresponding symbols are equal
the number of identical symbols in the first string
the number of identical symbols in the second string
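The definition translates directly into code for the string questions above (a sketch; by this definition the distance is undefined for words of different length, such as "starting" and "finishing"):

```python
def hamming_distance(a: str, b: str) -> int:
    if len(a) != len(b):
        raise ValueError("defined only for strings of equal length")
    return sum(x != y for x, y in zip(a, b))

print(hamming_distance("make", "made"))      # 1
print(hamming_distance("push", "pull"))      # 2
print(hamming_distance("client", "server"))  # 6
```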
Specify parts of the transmitter side
The number of digits by which any two binary sequences differ is called the ...
Hamming weight
Hamming distance
Hamming length
Specify the case when entropy is maximum
p1 = 0.5 and p2 = 0.5
p1 = 1 and p2 = 0
p1 = 0 and p2 = 1
p1 = 0.9 and p2 = 0.1
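A quick scan of the binary entropy H(p) = -p*log2(p) - (1-p)*log2(1-p) confirms the maximum at p1 = p2 = 0.5:

```python
import math

def H2(p: float) -> float:
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in (0.1, 0.3, 0.5, 0.9):
    print(p, round(H2(p), 4))   # the maximum, 1 bit, is at p = 0.5
```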
A prefix code is also known as ...
block code
uniquely decodable code
convolutional code
parity bit
In a throw of a die, what is the probability of getting a number greater than 5?
1/3
1/6
1/5
1
A string was encoded with the Hamming (15,11) code using the transformation matrix. Specify the positions of the parity bits.
12,13,14,15
1,2,3,4
1,2,4,8
2,3,4,5
A string was encoded with the Hamming (31,26) code using the transformation matrix. Specify the positions of the parity bits.
27,28,29,30,31
1,2,3,4,5
1,2,4,8,16
2,3,4,5,6
For a Hamming (7, 4) code, 7 is the total number of bits and 4 is the number of ...
When data is compressed, the goal is to reduce
noise
redundancy
channel capacity
When the base of the logarithm is 10, then the unit of measure of information is
bytes
dits
nits
bits
A code has d_min = 3. How many errors can be detected by this code?
When the base of the logarithm is 2, then the unit of measure of information is
When the base of the logarithm is e, then the unit of measure of information is
Which block or device performs data compression?
Channel encoder
Source encoder
Modulator
Which letter will get the shortest codeword after Huffman coding of the word "abracadabra"?
r
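A compact Huffman sketch (heap-based; tie-breaking details vary between implementations, but the most frequent letter always ends up with the shortest codeword):

```python
import heapq
from collections import Counter

def huffman_codes(text: str) -> dict:
    heap = [(freq, i, {sym: ''})
            for i, (sym, freq) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    count = len(heap)   # unique tie-breaker so dicts are never compared
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: '0' + c for s, c in c1.items()}
        merged.update({s: '1' + c for s, c in c2.items()})
        heapq.heappush(heap, (f1 + f2, count, merged))
        count += 1
    return heap[0][2]

codes = huffman_codes("abracadabra")
print(min(codes, key=lambda s: len(codes[s])))   # 'a' (5 of the 11 letters)
```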
Which of the following codes has the highest code rate?
code rate is constant for all of the Hamming codes
Hamming (31,26)
Hamming (15,11)
Hamming (7,4)
Which of the following codes has the highest redundancy?
redundancy is constant for all of the Hamming codes
Which of the following codes is non-uniform?
Shannon-Fano
ASCII
Hamming
Which of the following codes is a prefix code?
0, 111, 11
0, 111, 10
0, 101, 10
00, 10, 101
0, 01, 11
0, 10, 11
0, 10, 1
0, 01, 001
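A brute-force check of the prefix property (no codeword may be a prefix of another), applied to two of the option sets above:

```python
def is_prefix_code(codewords) -> bool:
    return not any(a != b and b.startswith(a)
                   for a in codewords for b in codewords)

print(is_prefix_code(['0', '111', '10']))   # True: a prefix code
print(is_prefix_code(['0', '111', '11']))   # False: '11' is a prefix of '111'
```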
Which of the following codes is uniform?
Huffman
10,011,11,001,010
0,10,110,1110,1111
10,01,0001,100,1010
100,110,001,000,010
Which of the following indicate(s) an error in a received combination?
Parity bits
Error syndrome
Data bits
A string was encoded with the Hamming (7,4) code using the transformation matrix. Specify the positions of the parity bits.
5,6,7
1,2,3
1,2,4
2,3,4
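A Hamming (7,4) encoding sketch (even parity, the usual bit numbering): the data bits land in positions 3, 5, 6, 7 and the parity bits in positions 1, 2, 4, the powers of two:

```python
def hamming74_encode(d1: int, d2: int, d3: int, d4: int) -> list:
    p1 = d1 ^ d2 ^ d4   # checks positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # checks positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4   # checks positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]   # positions 1..7

print(hamming74_encode(1, 0, 1, 1))   # [0, 1, 1, 0, 0, 1, 1]
```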