IT - Theory Part

Description

Created with love for you by the best group.
Test by Csse 1502, updated more than 1 year ago.
Created by Csse 1502 more than 6 years ago.

Resource Summary

Question 1

Question
... is a measure of uncertainty
Answer
  • Encoding
  • Entropy
  • Information
  • Redundancy

Question 2

Question
{1,2,3,4,5,6} is the sample space of ...
Answer
  • one coin toss
  • one die roll
  • the sum of two dice
  • drawing a card from a standard deck

Question 3

Question
The redundancy of a code S = ...
Answer
  • 1 - I_avr/I_max
  • I_avr/I_max
  • 1 + I_avr/I_max
  • I_max/I_avr

Question 4

Question
The average length of codewords q_avr = ...
Answer
  • ∑ (p_i * q_i)
  • ∑ (p_i / q_i)
  • ∑ p_i
  • ∑ q_i
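
A minimal Python sketch of the average-length formula q_avr = ∑ (p_i * q_i); the probabilities and codeword lengths below are hypothetical examples, not data from the test:

    # Hypothetical probabilities p_i and codeword lengths q_i
    p = [0.5, 0.25, 0.25]
    q = [1, 2, 2]
    q_avr = sum(pi * qi for pi, qi in zip(p, q))
    print(q_avr)  # 0.5*1 + 0.25*2 + 0.25*2 = 1.5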

Question 5

Question
The efficiency of a code E = ...
Answer
  • I_avr/I_max
  • I_max/I_avr
  • I_avr/100
  • I_max - I_avr

Question 6

Question
ASCII code is a ...
Answer
  • Variable-length code
  • Fixed-length code
  • Error-correction code
  • None of the given

Question 7

Question
By Bayes' rule for conditional entropy, H(Y|X) = ...
Answer
  • H(X|Y) - H(X) + H(Y)
  • [P(B|A)][P(A)] / P(B)
  • H(X|Y) - H(X)
  • H(X|Y) + H(Y)

Question 8

Question
By Bayes' theorem, ...
Answer
  • P(B|A) = P(A and B)/P(A)
  • P(A|B) = [P(B|A)][P(A)] / P(B)
  • P(B|A) = P(A and B)*P(A)
  • P(A|B) = [P(B|A)][P(A)] * P(B)

Question 9

Question
By the chain rule, H(X,Y) = H(Y|X) + ...
Answer
  • H(X)
  • H(Y)
  • H(Y|X)
  • H(X|Y)

Question 10

Question
By Hartley's formula, the amount of information I = ...
Answer
  • I = n*log m
  • I = m*n
  • I = log (m/n)
  • I = log (m*n)

Question 11

Question
By Hartley's formula, the entropy H = ...
Answer
  • H = - ∑ (p_i * log p_i)
  • H = - ∑ (log p_i)
  • H = log m
  • H = - ∑ (p_i / log p_i)

Question 12

Question
By the property of joint entropy, H(X,Y) <= ...
Answer
  • H(X)
  • H(Y)
  • H(X) + H(Y)
  • None of the given

Question 13

Question
By the property of joint entropy, H(X,Y) ...
Answer
  • H(X,Y) >= H(X) and H(X,Y) <= H(Y)
  • H(X,Y) <= H(X) and H(X,Y) >= H(Y)
  • H(X,Y) >= H(X) and H(X,Y) >= H(Y)
  • H(X,Y) >= H(X) + H(Y)

Question 14

Question
By Shannon's formula, the amount of information I = ...
Answer
  • I = - n * ∑ (p_i * log p_i)
  • I = - n * ∑ (log p_i)
  • I = - n * ∑ p_i
  • I = - n * ∑ (p_i / log p_i)

Question 15

Question
By Shannon's formula, the entropy H = ...
Answer
  • H = - ∑ (p_i * log p_i)
  • H = - ∑ (log p_i)
  • H = - ∑ p_i
  • H = - ∑ (p_i / log p_i)
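
A short Python sketch of Shannon's formula H = - ∑ (p_i * log p_i), using base-2 logarithms; the probability values are illustrative:

    import math

    def entropy(probs):
        # Shannon's formula: terms with p_i = 0 contribute nothing
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(entropy([0.5, 0.5]))   # 1.0 bit - the maximum for two outcomes (cf. Question 57)
    print(entropy([0.9, 0.1]))   # ~0.47 bits
    print(entropy([1/6] * 6))    # ~2.585 bits - equal probabilities reduce to Hartley's log2(6)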

Question 16

Question
Calculate the code rate for the Hamming (15,11) code.
Answer
  • 1
  • 0.733
  • 0.571
  • 0.839

Question 17

Question
Calculate the code rate for the Hamming (31,26) code.
Answer
  • 1
  • 0.839
  • 0.733
  • 0.571
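
As a worked check for the two questions above (and for the definition R = k/n in Question 20):

    # Code rate R = k / n for the standard Hamming codes
    print(4 / 7)      # ~0.571 for Hamming (7,4)
    print(11 / 15)    # ~0.733 for Hamming (15,11)
    print(26 / 31)    # ~0.839 for Hamming (31,26)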

Question 18

Question
Choose the formula that determines the number N of possible messages of length n if the message source alphabet consists of m characters, each of which can be an element of the message.
Answer
  • N = m^n
  • N = n^m
  • N = m*n
  • N = log m
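
A quick worked example of N = m^n: with m = 2 characters and messages of length n = 3 there are 2^3 = 8 possible messages.

    m, n = 2, 3
    print(m ** n)  # 8 possible messages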

Question 19

Question
A code is optimal when ...
Answer
  • q_avr = H
  • q_avr ≠ H
  • q_avr < H
  • q_avr > H

Question 20

Question
The code rate R (k information bits and n total bits) is defined as
Answer
  • k = n/R
  • R = k * n
  • n = R * k
  • R = k/n

Question 21

Question
Conditional entropy H(Y|X) lies between
Answer
  • - H(Y) and 0
  • 0 and H(Y)
  • - H(Y) and H(Y)
  • 0 and 1

Question 22

Question
Conditional probability P(B|A) = ...
Answer
  • P(A and B)/P(A)
  • [P(B|A)][P(A)] / P(B)
  • P(A and B)*P(A)
  • [P(B|A)][P(A)] * P(B)

Question 23

Question
Find the amount of information of a symbol from a language with a total number of symbols n = 18.
Answer
  • I = log_2 18
  • I = log_18 2
  • I = 18 * log_2 18
  • I = 18 * log_18 2
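
A quick numeric check of the answer I = log_2 18, assuming all 18 symbols are equally likely (Hartley's formula):

    import math
    print(math.log2(18))  # ~4.17 bits per symbol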

Question 24

Question
For a Hamming (15, 11) code, 15 is the total number of bits and 11 is the number of ...
Answer
  • redundant bits
  • data bits
  • parity bits
  • none of the given

Question 25

Question
Which of the following symbols will get the shortest codeword after Shannon-Fano coding if the probabilities are p(a) = 0.05, p(b) = 0.6, p(c) = 0.2 and p(d) = 0.15?
Answer
  • c
  • a
  • d
  • b

Question 26

Question
Which of the following is not a correct statement about probability?
Answer
  • It must have a value between 0 and 1
  • It is the collection of several experiments
  • A value near 0 means that the event is not likely to occur
  • It can be reported as a decimal or a fraction

Question 27

Question
Which of the following is part of channel coding?
Answer
  • Huffman code
  • Hamming code
  • Shannon-Fano code
  • RLE code

Question 28

Question
For a Hamming (31, 26) code, 31 is the total number of bits and 26 is the number of ...
Answer
  • redundant bits
  • data bits
  • parity bits
  • none of the given

Question 29

Question
For Hamming distance d_min and s errors in the received word, the condition to be able to correct the errors is
Answer
  • d_min >= s + 1
  • d_min >= 2s + 1
  • d_min >= 2s + 2
  • d_min >= s + 2
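
A brief worked example of these bounds for d_min = 3 (see also Questions 38 and 66):

    # Detection:  d_min >= r + 1   ->  r = d_min - 1 = 2 detectable errors
    # Correction: d_min >= 2s + 1  ->  s = (d_min - 1) // 2 = 1 correctable error
    d_min = 3
    print(d_min - 1)         # 2
    print((d_min - 1) // 2)  # 1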

Question 30

Question
Hamming distance can easily be found with the ...
Answer
  • XNOR operation
  • XOR operation
  • OR operation
  • AND operation
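
A small Python sketch of the XOR approach for binary words; the sample words are taken from the questions below:

    def hamming_distance(a, b):
        # XOR the two binary words; the number of 1s in the result is the distance
        x = int(a, 2) ^ int(b, 2)
        return bin(x).count("1")

    print(hamming_distance("001111", "010011"))  # 3
    print(hamming_distance("010111", "010011"))  # 1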

Question 31

Question
In a throw of a coin, what is the probability of getting heads?
Answer
  • 1
  • 1/2
  • 2
  • 0

Question 32

Question
Specify the formula to calculate the numbers n and k of bits to create a Hamming code
Answer
  • (n, k) = (2^r - 1, 2^r - 1 - r)
  • (n, k) = (2^r, 2^r - 1 - r)
  • (n, k) = (2^r - 1, 2^r - r)
  • (n, k) = (2^r - 1, 2^r - 1 + r)
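
A short check of (n, k) = (2^r - 1, 2^r - 1 - r) for r parity bits:

    for r in (3, 4, 5):
        n = 2 ** r - 1
        k = n - r
        print(r, (n, k))  # 3 -> (7, 4), 4 -> (15, 11), 5 -> (31, 26)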

Question 33

Question
In a throw of a coin, what is the probability of getting tails?
Answer
  • 1
  • 1/2
  • 2
  • 0

Question 34

Question
Specify the formula to find the amount of information if events have different probabilities.
Answer
  • Hartley's formula
  • Shannon's formula
  • Fano's formula
  • Bayes' formula

Question 35

Question
Specify the formula to find the amount of information if events have the same probabilities.
Answer
  • Shannon's formula
  • Hartley's formula
  • Fano's formula
  • Bayes' formula

Question 36

Question
Specify the most effective type of code when an alphabet consists of 2 symbols with probabilities p(x_1) = 0.05 and p(x_2) = 0.95.
Answer
  • ASCII code
  • Shannon-Fano's code
  • Shannon-Fano's code by blocks
  • Hartley's code

Question 37

Question
In a digital communication system, the smaller the code rate, the ... the redundant bits.
Answer
  • less
  • equal
  • more
  • unpredictable

Question 38

Question
Specify the right formula if d_min is the Hamming distance, s is the number of correctable errors, and r is the number of detectable errors.
Answer
  • d_min >= s + r + 1
  • d_min >= 2s + r + 1
  • d_min >= s + 2r + 1
  • d_min >= s + r + 2

Question 39

Question
Specify two types of error control algorithms
Answer
  • block and linear
  • linear and nonlinear
  • block and convolutional
  • none of the given

Question 40

Question
Noise affects ...
Answer
  • information source
  • receiver
  • channel
  • transmitter

Question 41

Question
The basic idea behind Shannon-Fano coding is to
Answer
  • compress data by using more bits to encode more frequently occurring characters
  • compress data by using fewer bits to encode more frequently occurring characters
  • compress data by using fewer bits to encode less frequently occurring characters
  • expand data by using fewer bits to encode more frequently occurring characters

Question 42

Question
The probability of occurrence of an event lies between
Answer
  • -1 and 0
  • 0 and 1
  • -1 and 1
  • exactly 1

Question 43

Question
The Hamming distance between "client" and "server" is
Answer
  • 0
  • 1
  • 6
  • impossible to determine

Question 44

Question
The Hamming distance between "make" and "made" is
Answer
  • 4
  • 3
  • 1
  • impossible to determine

Question 45

Question
The Hamming distance between "push" and "pull" is
Answer
  • 0
  • 4
  • 2
  • impossible to determine

Question 46

Question
The probability of the second event given that the first event has occurred is classified as
Answer
  • conditional probability
  • joint entropy
  • conditional entropy
  • none of the given

Question 47

Question
The Hamming distance between "starting" and "finishing" is
Answer
  • 4
  • 3
  • impossible to determine
  • 5

Question 48

Question
The Hamming distance between 001111 and 010011 is
Answer
  • 1
  • 2
  • 3
  • 4

Question 49

Question
Shannon-Fano and Huffman codes are encoding algorithms used for
Answer
  • lossy data compression
  • lossless data compression
  • error correction
  • error detection

Question 50

Question
The Hamming distance between 010111 and 010011 is
Answer
  • 2
  • 3
  • 1
  • 4

Question 51

Question
The Hamming distance between 011111 and 010011 is
Answer
  • 1
  • 3
  • 2
  • 4

Question 52

Question
Specify the parts of the receiver side
Answer
  • Source encoder, channel encoder, digital modulator
  • Source decoder, channel decoder, digital demodulator
  • Source decoder, channel encoder, digital modulator
  • Source encoder, channel decoder, digital modulator

Question 53

Question
The Hamming distance between 101001 and 010011 is
Answer
  • 1
  • 2
  • 4
  • 3

Question 54

Question
The Hamming distance between two strings with equal length is ...
Answer
  • the number of positions at which the corresponding symbols are different
  • the number of positions at which the corresponding symbols are equal
  • the number of identical symbols in the first string
  • the number of identical symbols in the second string

Question 55

Question
Specify the parts of the transmitter side
Answer
  • Source decoder, channel decoder, digital demodulator
  • Source encoder, channel encoder, digital modulator
  • Source decoder, channel encoder, digital modulator
  • Source encoder, channel decoder, digital modulator

Question 56

Question
The number of digits by which any two binary sequences differ is called the ...
Answer
  • Hamming weight
  • Hamming distance
  • Hamming code
  • Hamming length

Question 57

Question
Specify the case when the entropy is maximum
Answer
  • p_1 = 0.5 and p_2 = 0.5
  • p_1 = 1 and p_2 = 0
  • p_1 = 0 and p_2 = 1
  • p_1 = 0.9 and p_2 = 0.1

Question 58

Question
A prefix code is also known as a ...
Answer
  • block code
  • uniquely decodable code
  • convolutional code
  • parity bit

Question 59

Question
In a throw of a die, what is the probability of getting a number greater than 5?
Answer
  • 1/3
  • 1/6
  • 1/5
  • 1

Question 60

Question
A string was encoded with the Hamming (15,11) code using the transformation matrix. Specify the positions of the parity bits.
Answer
  • 12,13,14,15
  • 1,2,3,4
  • 1,2,4,8
  • 2,3,4,5
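
In the standard Hamming layout the parity bits occupy the power-of-two positions; a brief sketch listing them for a given code length n:

    def parity_positions(n):
        # Positions 1, 2, 4, 8, ... up to n
        p, positions = 1, []
        while p <= n:
            positions.append(p)
            p *= 2
        return positions

    print(parity_positions(15))  # [1, 2, 4, 8]
    print(parity_positions(31))  # [1, 2, 4, 8, 16]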

Question 61

Question
For a Hamming (31, 26) code, 31 is the total number of bits and 26 is the number of ...
Answer
  • redundant bits
  • data bits
  • parity bits
  • none of the given

Question 62

Question
A string was encoded with the Hamming (31,26) code using the transformation matrix. Specify the positions of the parity bits.
Answer
  • 27,28,29,30,31
  • 1,2,3,4,5
  • 1,2,4,8,16
  • 2,3,4,5,6

Question 63

Question
For a Hamming (7, 4) code, 7 is the total number of bits and 4 is the number of ...
Answer
  • redundant bits
  • data bits
  • parity bits
  • none of the given

Question 64

Question
When data is compressed, the goal is to reduce
Answer
  • noise
  • redundancy
  • channel capacity
  • none of the given

Question 65

Question
When the base of the logarithm is 10, the unit of measure of information is
Answer
  • bytes
  • dits
  • nits
  • bits

Question 66

Question
A code has d_min = 3. How many errors can be detected by this code?
Answer
  • 1
  • 3
  • 2
  • 4

Question 67

Question
When the base of the logarithm is 2, the unit of measure of information is
Answer
  • bytes
  • bits
  • nits
  • dits

Question 68

Question
When the base of the logarithm is e, the unit of measure of information is
Answer
  • bytes
  • nits
  • dits
  • bits

Question 69

Question
Which block or device performs data compression?
Answer
  • Channel encoder
  • Source encoder
  • Modulator
  • None of the given

Question 70

Question
Which letter will get the shortest codeword after Huffman coding of the word "abracadabra"?
Answer
  • c
  • r
  • d
  • a
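
A quick frequency count for "abracadabra"; the most frequent letter receives the shortest Huffman codeword:

    from collections import Counter
    print(Counter("abracadabra"))
    # Counter({'a': 5, 'b': 2, 'r': 2, 'c': 1, 'd': 1}) -> 'a' gets the shortest codeword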

Question 71

Question
Which of the following codes has the highest code rate?
Answer
  • code rate is constant for all of the Hamming codes
  • Hamming (31,26)
  • Hamming (15,11)
  • Hamming (7,4)

Question 72

Question
Which of the following codes has the highest redundancy?
Answer
  • redundancy is constant for all of the Hamming codes
  • Hamming (7,4)
  • Hamming (15,11)
  • Hamming (31,26)

Question 73

Question
Which of the following codes is non-uniform?
Answer
  • Shannon-Fano
  • ASCII
  • Hamming
  • None of the given

Question 74

Question
Which of the following codes is a prefix code?
Answer
  • 0, 111, 11
  • 0, 111, 10
  • 0, 101, 10
  • 00, 10, 101

Question 75

Question
Which of the following codes is a prefix code?
Answer
  • 0, 01, 11
  • 0, 10, 11
  • 0, 10, 1
  • 0, 01, 001
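
A minimal sketch of the prefix-property check (no codeword may be a prefix of another), applied to two of the option sets above:

    def is_prefix_code(words):
        # A prefix code: no codeword is a prefix of a different codeword
        return not any(a != b and b.startswith(a) for a in words for b in words)

    print(is_prefix_code(["0", "10", "11"]))   # True  - a prefix code
    print(is_prefix_code(["0", "111", "11"]))  # False - "11" is a prefix of "111"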

Question 76

Question
Which of the following codes is uniform?
Answer
  • ASCII
  • Shannon-Fano
  • Huffman
  • None of the given

Question 77

Question
Which of the following codes is uniform?
Answer
  • 10,011,11,001,010
  • 0,10,110,1110,1111
  • 10,01,0001,100,1010
  • 100,110,001,000,010

Question 78

Question
Which of the following indicate(s) an error in a received combination?
Answer
  • Parity bits
  • Error syndrome
  • Data bits
  • None of the given

Question 79

Question
A string was encoded with the Hamming (7,4) code using the transformation matrix. Specify the positions of the parity bits.
Answer
  • 5,6,7
  • 1,2,3
  • 1,2,4
  • 2,3,4
