IT - Theory Part

Description

Created with love for you by the best group
Quiz by Csse 1502, created over 6 years ago

Resource summary

Question 1

Question
... is a measure of uncertainty
Answer
  • Encoding
  • Entropy
  • Information
  • Redundancy

Question 2

Question
{1,2,3,4,5,6} is the sample space of ...
Answer
  • one coin toss
  • one dice roll
  • sum of two dice
  • removing a card from the standard deck

Question 3

Question
The redundancy of a code S = ...
Answer
  • 1 - Iavr/Imax
  • Iavr/Imax
  • 1 + Iavr/Imax
  • Imax/Iavr

Question 4

Question
The average length of codewords qavr = ...
Answer
  • ∑ (pi * qi)
  • ∑ (pi / qi)
  • ∑ pi
  • ∑ qi

Question 5

Question
The efficiency of a code E = ...
Answer
  • Iavr/Imax
  • Imax/Iavr
  • Iavr/100
  • Imax - Iavr

Question 6

Question
ASCII code is a
Answer
  • Variable length code
  • Fixed length code
  • Error-correction code
  • None of the given

Question 7

Question
By Bayes' rule for conditional entropy, H(Y|X) = ...
Answer
  • H(X|Y) - H(X) + H(Y)
  • [P(B|A)][P(A)] /P(B)
  • H(X|Y) - H(X)
  • H(X|Y)+ H(Y)

Question 8

Question
By the Bayes' theorem ...
Answer
  • P(B|A) = P(A and B)/P(A)
  • P(A|B) = [P(B|A)][P(A)] /P(B)
  • P(B|A) = P(A and B)*P(A)
  • P(A|B) = [P(B|A)][P(A)] * P(B)
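
A quick numeric check of Bayes' theorem, P(A|B) = P(B|A) * P(A) / P(B). This is a minimal Python sketch; the probability values are made up purely for illustration.

```python
# Illustrative values only (not from the quiz): assumed P(A), P(B|A) and P(B).
p_a = 0.3          # prior P(A)
p_b_given_a = 0.8  # likelihood P(B|A)
p_b = 0.5          # evidence P(B)

# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)
p_a_given_b = p_b_given_a * p_a / p_b
print(p_a_given_b)  # 0.48
```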

Question 9

Question
By the chain rule, H(X,Y) = H(Y|X) + ...
Answer
  • H(X)
  • H(Y)
  • H(Y|X)
  • H(X|Y)

Question 10

Question
By Hartley's formula, the amount of information I = ...
Answer
  • I = n*log m
  • I = m*n
  • I = log (m/n)
  • I = log (m*n)

Question 11

Question
By Hartley's formula, the entropy H = ...
Answer
  • H = - ∑ (pi * log pi)
  • H = - ∑ (log pi)
  • H = log m
  • H = - ∑ (pi / log pi)

Question 12

Question
By the property of joint entropy H(X,Y) <= ...
Answer
  • H(X)
  • H(Y)
  • H(X) + H(Y)
  • None of the given

Question 13

Question
By the property of joint entropy H(X,Y) ...
Answer
  • H(X,Y) >= H(X) and H(X,Y) <= H(Y)
  • H(X,Y) <= H(X) and H(X,Y) >= H(Y)
  • H(X,Y) >= H(X) and H(X,Y) >= H(Y)
  • H(X,Y) >= H(X) + H(Y)

Question 14

Question
By Shannon's formula, the amount of information I = ...
Answer
  • I = - n * ∑ (pi * log pi)
  • I = - n * ∑ (log pi)
  • I = - n * ∑ pi
  • I = - n * ∑ (pi / log pi)

Question 15

Question
By Shannon's formula, the entropy H = ...
Answer
  • H = - ∑ (pi * log pi)
  • H = - ∑ (log pi)
  • H = - ∑ pi
  • H = - ∑ (pi / log pi)
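
A minimal Python sketch of Shannon's entropy H = -∑(pi * log pi), assuming base-2 logarithms so the result is in bits; the function name and example distributions are illustrative.

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # 1.0    - fair coin, maximum uncertainty for 2 outcomes
print(entropy([0.9, 0.1]))  # ~0.469 - skewed source, lower uncertainty
```

For a message of n symbols from such a source, the amount of information is n times this value, which is the Shannon formula asked about above.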

Question 16

Question
Calculate the code rate for the Hamming (15,11) code
Answer
  • 1
  • 0.733
  • 0.571
  • 0.839

Question 17

Question
Calculate the code rate for the Hamming (31,26) code
Answer
  • 1
  • 0.839
  • 0.733
  • 0.571
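
The two answers above follow from the code rate R = k/n (see Question 20); a small Python check for the Hamming codes mentioned in this quiz:

```python
# Code rate R = k / n for the Hamming codes used in the questions.
for n, k in [(7, 4), (15, 11), (31, 26)]:
    print(f"Hamming ({n},{k}): R = {k / n:.3f}")
# Hamming (7,4): R = 0.571
# Hamming (15,11): R = 0.733
# Hamming (31,26): R = 0.839
```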

Question 18

Question
Choose the formula to determine the number N of possible messages of length n, if the message source alphabet consists of m characters, each of which can be an element of the message.
Answer
  • N = m^n
  • N = n^m
  • N = m*n
  • N = log m
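
A worked check of the two formulas behind Questions 10, 11 and 18: with an alphabet of m characters and messages of length n there are N = m^n possible messages, and Hartley's amount of information is I = log2 N = n * log2 m. The values of m and n below are illustrative.

```python
import math

m, n = 2, 8              # assumed example: binary alphabet, messages of length 8
N = m ** n               # number of possible messages: 2^8 = 256
I = n * math.log2(m)     # Hartley's formula: I = n * log2(m) = log2(N)
print(N, I)              # 256 8.0
```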

Question 19

Question
A code is optimal when ...
Answer
  • qavr = H
  • qavr ≠ H
  • qavr < H
  • qavr > H

Question 20

Question
Code rate R (k information bits and n total bits) is defined as
Answer
  • k = n/R
  • R = k * n
  • n = R * k
  • R = k/n

Question 21

Question
Conditional entropy H(Y|X) lies between
Answer
  • - H(Y) and 0
  • 0 and H(Y)
  • - H(Y) and H(Y)
  • 0 and 1

Question 22

Question
Conditional probability P(B|A) = ...
Answer
  • P(A and B)/P(A)
  • [P(B|A)][P(A)] /P(B)
  • P(A and B)*P(A)
  • [P(B|A)][P(A)] * P(B)

Question 23

Question
Find the amount of information of a symbol from a language with a total number of symbols n = 18.
Answer
  • I = log₂ 18
  • I = log₁₈ 2
  • I = 18 * log₂ 18
  • I = 18 * log₁₈ 2
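
For 18 equiprobable symbols, Hartley's formula gives I = log₂ 18 per symbol; a one-line check in Python:

```python
import math

print(math.log2(18))  # ≈ 4.17 bits per symbol
```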

Question 24

Question
For a Hamming (15, 11) code, 15 is the total number of bits and 11 is the number of ...
Answer
  • redundant bits
  • data bits
  • parity bits
  • none of the given

Question 25

Question
Which of the following symbols will get the shortest codeword after Shannon-Fano coding if probabilities are p(a) = 0.05, p(b) = 0.6, p(c) = 0.2 and p(d) = 0.15?
Answer
  • c
  • a
  • d
  • b
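
Shannon-Fano coding gives shorter codewords to more probable symbols, so the most probable symbol ends up with the shortest code. Below is a minimal recursive sketch, assuming the common formulation that splits the sorted symbols into two groups of roughly equal total probability; the function name is illustrative.

```python
def shannon_fano(symbols):
    """symbols: list of (symbol, probability) pairs sorted by descending probability.
    Returns a dict mapping each symbol to its codeword."""
    if len(symbols) == 1:
        return {symbols[0][0]: ""}
    total = sum(p for _, p in symbols)
    # Choose the split point that makes the two groups' probabilities as equal as possible.
    acc, split, best_diff = 0.0, 1, float("inf")
    for i in range(1, len(symbols)):
        acc += symbols[i - 1][1]
        diff = abs(2 * acc - total)
        if diff < best_diff:
            best_diff, split = diff, i
    codes = {}
    for sym, code in shannon_fano(symbols[:split]).items():
        codes[sym] = "0" + code        # first group gets a leading 0
    for sym, code in shannon_fano(symbols[split:]).items():
        codes[sym] = "1" + code        # second group gets a leading 1
    return codes

probs = sorted({"a": 0.05, "b": 0.6, "c": 0.2, "d": 0.15}.items(),
               key=lambda kv: kv[1], reverse=True)
print(shannon_fano(probs))  # 'b', the most probable symbol, gets the shortest codeword
```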

Question 26

Question
Which of the following is not a correct statement about probability?
Answer
  • It must have a value between 0 and 1
  • It is the collection of several experiments
  • A value near 0 means that the event is not likely to occur
  • It can be reported as a decimal or a fraction

Question 27

Question
Which of the following is a part of channel coding?
Answer
  • Huffman code
  • Hamming code
  • Shannon-Fano code
  • RLE code

Question 28

Question
For a Hamming (31, 26) code, 31 is the total number of bits and 26 is the number of ...
Answer
  • redundant bits
  • data bits
  • parity bits
  • none of the given

Question 29

Question
For Hamming distance dmin and s errors in the received word, the condition to be able to correct the errors is
Answer
  • dmin >= s + 1
  • dmin >= 2s + 1
  • dmin >= 2s + 2
  • dmin >= s + 2

Question 30

Question
Hamming distance can easily be found with ...
Answer
  • XNOR operation
  • XOR operation
  • OR operation
  • AND operation
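
For binary words the Hamming distance is simply the number of 1 bits in the XOR of the two words; a minimal Python sketch (the function name is illustrative):

```python
def hamming_distance_bits(x, y):
    """Hamming distance between two binary words: count of 1 bits in x XOR y."""
    return bin(x ^ y).count("1")

print(hamming_distance_bits(0b001111, 0b010011))  # 3 - number of differing bit positions
```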

Question 31

Question
In a toss of a coin, what is the probability of getting heads?
Answer
  • 1
  • 1/2
  • 2
  • 0

Question 32

Question
Specify the formula to calculate the numbers of bits n and k needed to create a Hamming code.
Answer
  • (n, k) = (2^r - 1, 2^r - 1 - r)
  • (n, k) = (2^r, 2^r - 1 - r)
  • (n, k) = (2^r - 1, 2^r - r)
  • (n, k) = (2^r - 1, 2^r - 1 + r)
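
With r parity bits the Hamming construction gives n = 2^r - 1 total bits and k = 2^r - 1 - r data bits; a quick sketch generating the codes that appear in this quiz:

```python
# (n, k) = (2^r - 1, 2^r - 1 - r) for r parity bits.
for r in (3, 4, 5):
    n = 2 ** r - 1
    k = n - r
    print(f"r = {r}: Hamming ({n},{k})")
# r = 3: Hamming (7,4)
# r = 4: Hamming (15,11)
# r = 5: Hamming (31,26)
```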

Question 33

Question
In a toss of a coin, what is the probability of getting tails?
Answer
  • 1
  • 1/2
  • 2
  • 0

Question 34

Question
Specify the formula to find the amount of information if events have different probabilities.
Answer
  • Hartley's formula
  • Shannon's formula
  • Fano's formula
  • Bayes' formula

Question 35

Question
Specify the formula to find the amount of information if events have the same probabilities.
Answer
  • Shannon's formula
  • Hartley's formula
  • Fano's formula
  • Bayes' formula

Question 36

Question
Specify the most effective type of code when an alphabet consists of 2 symbols with probabilities p(x1) = 0.05 and p(x2) = 0.95.
Answer
  • ASCII code
  • Shannon-Fano's code
  • Shannon-Fano's code by blocks
  • Hartley's code

Question 37

Question
In a digital communication system, the smaller the code rate, the ... the redundant bits.
Answer
  • less
  • equal
  • more
  • unpredictable

Question 38

Question
Specify the right formula if dmin is the Hamming distance, s is the number of correctable errors, and r is the number of detectable errors.
Answer
  • dmin >= s + r + 1
  • dmin >= 2s + r + 1
  • dmin >= s + 2r + 1
  • dmin >= s + r + 2

Question 39

Question
Specify two types of error control algorithms
Answer
  • block and linear
  • linear and nonlinear
  • block and convolutional
  • none of the given

Question 40

Question
Noise affects ...
Answer
  • information source
  • receiver
  • channel
  • transmitter

Question 41

Question
The basic idea behind Shannon-Fano coding is to
Answer
  • compress data by using more bits to encode more frequently occurring characters
  • compress data by using fewer bits to encode more frequently occurring characters
  • compress data by using fewer bits to encode less frequently occurring characters
  • expand data by using fewer bits to encode more frequently occurring characters

Question 42

Question
Probability of occurrence of an event lies between
Answer
  • -1 and 0
  • 0 and 1
  • -1 and 1
  • exactly 1

Question 43

Question
The Hamming distance between "client" and "server" is
Answer
  • 0
  • 1
  • 6
  • impossible to detect

Question 44

Question
The Hamming distance between "make" and "made" is
Answer
  • 4
  • 3
  • 1
  • impossible to detect

Question 45

Question
The Hamming distance between "push" and "pull" is
Answer
  • 0
  • 4
  • 2
  • impossible to detect

Question 46

Question
The probability of a second event, given that the first event has occurred, is classified as
Answer
  • conditional probability
  • joint entropy
  • conditional entropy
  • none of the given

Question 47

Question
The Hamming distance between "starting" and "finishing" is
Answer
  • 4
  • 3
  • impossible to detect
  • 5

Question 48

Question
The Hamming distance between 001111 and 010011 is
Answer
  • 1
  • 2
  • 3
  • 4

Question 49

Question
Shannon-Fano and Huffman codes are encoding algorithms used for
Answer
  • lossy data compression
  • lossless data compression
  • error correction
  • error detection

Question 50

Question
The Hamming distance between 010111 and 010011 is
Answer
  • 2
  • 3
  • 1
  • 4

Question 51

Question
The Hamming distance between 011111 and 010011 is
Answer
  • 1
  • 3
  • 2
  • 4

Question 52

Question
Specify parts of the receiver side
Answer
  • Source encoder, channel encoder, digital modulator
  • Source decoder, channel decoder, digital demodulator
  • Source decoder, channel encoder, digital modulator
  • Source encoder, channel decoder, digital modulator

Question 53

Question
The Hamming distance between 101001 and 010011 is
Answer
  • 1
  • 2
  • 4
  • 3

Question 54

Question
The Hamming distance between two strings with equal length is ...
Answer
  • the number of positions at which the corresponding symbols are different
  • the number of positions at which the corresponding symbols are equal
  • the number of identical symbols in the first string
  • the number of identical symbols in the second string
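
The definition above translates directly into code: count the positions where the corresponding symbols differ. A minimal sketch covering the string examples from the questions (the function name is illustrative):

```python
def hamming_distance(s1, s2):
    """Number of positions at which two equal-length strings differ."""
    if len(s1) != len(s2):
        raise ValueError("Hamming distance is defined only for strings of equal length")
    return sum(a != b for a, b in zip(s1, s2))

print(hamming_distance("make", "made"))      # 1
print(hamming_distance("push", "pull"))      # 2
print(hamming_distance("010111", "010011"))  # 1
```

For strings of different lengths, such as "starting" and "finishing", the distance is undefined, which is why those questions list "impossible to detect" among the options.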

Question 55

Question
Specify parts of the transmitter side
Answer
  • Source decoder, channel decoder, digital demodulator
  • Source encoder, channel encoder, digital modulator
  • Source decoder, channel encoder, digital modulator
  • Source encoder, channel decoder, digital modulator

Question 56

Question
The number of digits by which any two binary sequences differ is called the ...
Answer
  • Hamming weight
  • Hamming distance
  • Hamming code
  • Hamming length

Question 57

Question
Specify the case when entropy is maximum
Answer
  • p1 = 0.5 and p2 = 0.5
  • p1 = 1 and p2 = 0
  • p1 = 0 and p2 = 1
  • p1 = 0.9 and p2 = 0.1

Question 58

Question
A prefix code is also known as ...
Answer
  • block code
  • uniquely decodable code
  • convolutional code
  • parity bit

Question 59

Question
In a throw of a die, what is the probability of getting a number greater than 5?
Answer
  • 1/3
  • 1/6
  • 1/5
  • 1

Question 60

Question
A string was encoded with the Hamming (15,11) code using the transformation matrix. Specify the positions of the parity bits.
Answer
  • 12,13,14,15
  • 1,2,3,4
  • 1,2,4,8
  • 2,3,4,5
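
In the standard Hamming layout the parity bits occupy the power-of-two positions of the codeword; a small sketch, assuming 1-based numbering of positions (the function name is illustrative):

```python
def parity_positions(n):
    """1-based positions of the parity bits in an n-bit Hamming codeword: the powers of two."""
    positions, p = [], 1
    while p <= n:
        positions.append(p)
        p *= 2
    return positions

print(parity_positions(7))   # [1, 2, 4]
print(parity_positions(15))  # [1, 2, 4, 8]
print(parity_positions(31))  # [1, 2, 4, 8, 16]
```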

Question 61

Question
For a Hamming (31, 26) code, 31 is the total number of bits and 26 is the number of ...
Answer
  • redundant bits
  • data bits
  • parity bits
  • none of the given

Question 62

Question
A string was encoded with the Hamming (31,26) code using the transformation matrix. Specify the positions of the parity bits.
Answer
  • 27,28,29,30,31
  • 1,2,3,4,5
  • 1,2,4,8,16
  • 2,3,4,5,6

Question 63

Question
For a Hamming (7, 4) code, 7 is the total number of bits and 4 is the number of ...
Answer
  • redundant bits
  • data bits
  • parity bits
  • none of the given

Question 64

Question
When data is compressed, the goal is to reduce
Answer
  • noise
  • redundancy
  • channel capacity
  • none of the given

Question 65

Question
When the base of the logarithm is 10, then the unit of measure of information is
Answer
  • bytes
  • dits
  • nits
  • bits

Question 66

Question
A code has dmin = 3. How many errors can be detected by this code?
Answer
  • 1
  • 3
  • 2
  • 4
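
The detection and correction limits follow from the conditions quoted earlier: a code with minimum distance dmin detects up to r = dmin - 1 errors and corrects up to s = ⌊(dmin - 1)/2⌋ errors. A quick check for dmin = 3 (the function name is illustrative):

```python
def error_capabilities(d_min):
    """Detectable (r) and correctable (s) error counts for minimum distance d_min."""
    detectable = d_min - 1           # from d_min >= r + 1
    correctable = (d_min - 1) // 2   # from d_min >= 2s + 1
    return detectable, correctable

print(error_capabilities(3))  # (2, 1): detects 2 errors, corrects 1
```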

Question 67

Question
When the base of the logarithm is 2, then the unit of measure of information is
Answer
  • bytes
  • bits
  • nits
  • dits

Question 68

Question
When the base of the logarithm is e, then the unit of measure of information is
Answer
  • bytes
  • nits
  • dits
  • bits

Question 69

Question
Which block or device performs data compression?
Answer
  • Channel encoder
  • Source encoder
  • Modulator
  • None of the given

Question 70

Question
Which letter will get the shortest codeword after Huffman coding of the word "abracadabra"?
Answer
  • c
  • r
  • d
  • a
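
In Huffman coding the most frequent character receives the shortest codeword, so the letter occurring most often in "abracadabra" wins. A compact heap-based sketch (the function name is illustrative):

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build Huffman codewords for the characters of text (shorter codes for frequent characters)."""
    # Each heap entry: (frequency, tie_breaker, {char: codeword_so_far}).
    heap = [(f, i, {ch: ""}) for i, (ch, f) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)          # two least frequent subtrees
        f2, _, c2 = heapq.heappop(heap)
        merged = {ch: "0" + code for ch, code in c1.items()}
        merged.update({ch: "1" + code for ch, code in c2.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

print(huffman_codes("abracadabra"))  # the most frequent letter gets the shortest codeword
```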

Question 71

Question
Which of the following codes has the highest code rate?
Answer
  • code rate is constant for all of the Hamming codes
  • Hamming (31,26)
  • Hamming (15,11)
  • Hamming (7,4)

Question 72

Question
Which of the following codes has the highest redundancy?
Answer
  • redundancy is constant for all of the Hamming codes
  • Hamming (7,4)
  • Hamming (15,11)
  • Hamming (31,26)

Question 73

Question
Which of the following codes is non-uniform?
Answer
  • Shannon-Fano
  • ASCII
  • Hamming
  • None of the given

Question 74

Question
Which of the following codes is a prefix code?
Answer
  • 0, 111, 11
  • 0, 111, 10
  • 0, 101, 10
  • 00, 10, 101

Question 75

Question
Which of the following codes is a prefix code?
Answer
  • 0, 01, 11
  • 0, 10, 11
  • 0, 10, 1
  • 0, 01, 001
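
A set of codewords is a prefix code when no codeword is a prefix of another; a short sketch that can be used to test the candidate answers above (the function name is illustrative):

```python
def is_prefix_code(codewords):
    """True if no codeword is a proper prefix of another codeword."""
    return not any(a != b and b.startswith(a)
                   for a in codewords for b in codewords)

print(is_prefix_code(["0", "111", "10"]))  # True  - a valid prefix code
print(is_prefix_code(["0", "01", "11"]))   # False - "0" is a prefix of "01"
print(is_prefix_code(["0", "10", "11"]))   # True  - a valid prefix code
```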

Question 76

Question
Which of the following codes is uniform?
Answer
  • ASCII
  • Shannon-Fano
  • Huffman
  • None of the given

Question 77

Question
Which of the following codes is uniform?
Answer
  • 10,011,11,001,010
  • 0,10,110,1110,1111
  • 10,01,0001,100,1010
  • 100,110,001,000,010

Question 78

Question
Which of the following indicate(s) an error in a received combination?
Answer
  • Parity bits
  • Error syndrome
  • Data bits
  • None of the given

Question 79

Question
A string was encoded with the Hamming (7,4) code using the transformation matrix. Specify the positions of the parity bits.
Answer
  • 5,6,7
  • 1,2,3
  • 1,2,4
  • 2,3,4