Information and Coding


This set of Digital Communications Multiple Choice Questions & Answers (MCQs) focuses on “Information and Coding”.

1. Self-information should be
a) Positive
b) Negative
c) Positive & Negative
d) None of the mentioned
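
A quick numeric check of question 1: since a probability p satisfies 0 < p ≤ 1, the self-information −log₂(p) can never be negative. The function name below is illustrative, not from any standard library:

```python
import math

def self_information(p):
    """Self-information in bits: I(p) = -log2(p) for 0 < p <= 1."""
    return -math.log2(p)

# -log2(p) >= 0 whenever 0 < p <= 1, so self-information is always positive
# (and exactly 0 only for a certain event, p = 1).
for p in (0.5, 0.1, 0.99, 1.0):
    assert self_information(p) >= 0
```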

2. The unit of average mutual information is
a) Bits
b) Bytes
c) Bits per symbol
d) Bytes per symbol

3. When the probability of error during transmission is 0.5, it indicates that
a) Channel is very noisy
b) No information is received
c) Channel is very noisy & No information is received
d) None of the mentioned
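
Question 3 can be verified with the capacity of a binary symmetric channel, C = 1 − H(p): at crossover probability p = 0.5 the capacity drops to zero, i.e. the channel is so noisy that no information gets through. A minimal sketch (function names are my own):

```python
import math

def binary_entropy(p):
    """Binary entropy H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity C = 1 - H(p) of a binary symmetric channel."""
    return 1 - binary_entropy(p)

print(bsc_capacity(0.5))  # 0.0 -- maximally noisy, no information received
print(bsc_capacity(0.0))  # 1.0 -- noiseless channel, 1 bit per use
```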

4. Binary Huffman coding is a
a) Prefix condition code
b) Suffix condition code
c) Prefix & Suffix condition code
d) None of the mentioned

5. The event with the minimum probability is assigned the least number of bits.
a) True
b) False
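
Questions 4 and 5 can both be illustrated with a small Huffman coder: the resulting code satisfies the prefix condition (no codeword is a prefix of another), and the least probable symbol receives the longest codeword, not the shortest. This is a minimal sketch using a heap, not a production encoder:

```python
import heapq

def huffman_code(freqs):
    """Build a binary Huffman code; returns {symbol: codeword}."""
    heap = [[w, [sym, ""]] for sym, w in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)
        hi = heapq.heappop(heap)
        for pair in lo[1:]:
            pair[1] = "0" + pair[1]   # left branch
        for pair in hi[1:]:
            pair[1] = "1" + pair[1]   # right branch
        heapq.heappush(heap, [lo[0] + hi[0]] + lo[1:] + hi[1:])
    return {sym: code for sym, code in heap[0][1:]}

codes = huffman_code({"a": 0.5, "b": 0.25, "c": 0.15, "d": 0.10})
words = list(codes.values())
# Prefix condition: no codeword is a prefix of another (question 4, answer a)
assert not any(u != v and v.startswith(u) for u in words for v in words)
# Rarest symbol gets the LONGEST codeword (question 5 is False)
assert len(codes["d"]) == max(len(c) for c in codes.values())
```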

6. The method of converting a word to a stream of bits is called
a) Binary coding
b) Source coding
c) Bit coding
d) Cipher coding

7. When the base of the logarithm is 2, then the unit of measure of information is
a) Bits
b) Bytes
c) Nats
d) None of the mentioned
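
On question 7: the base of the logarithm only fixes the unit. Base 2 gives bits, the natural logarithm gives nats, and the two differ by the constant factor log₂(e). A quick check:

```python
import math

p = 0.25
bits = -math.log2(p)   # base-2 logarithm -> information in bits
nats = -math.log(p)    # natural logarithm -> information in nats

# 1 nat = log2(e) bits, so the two measures differ by a constant factor.
assert abs(bits - nats * math.log2(math.e)) < 1e-12
print(bits, nats)  # 2.0 bits vs ~1.386 nats for the same event
```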

8. When X and Y are statistically independent, I(X;Y) is
a) 1
b) 0
c) ln 2
d) Cannot be determined
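
Question 8 follows from the definition I(X;Y) = Σ p(x,y) log₂[p(x,y)/(p(x)p(y))]: under independence the joint pmf equals the product of the marginals, so every log term is zero. A minimal sketch (the joint pmf is passed as a dict, an assumed convention):

```python
import math

def mutual_information(joint):
    """I(X;Y) in bits from a joint pmf given as {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0) + p   # marginal of X
        py[y] = py.get(y, 0) + p   # marginal of Y
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Independent fair bits: joint = product of marginals -> I(X;Y) = 0
indep = {(x, y): 0.5 * 0.5 for x in (0, 1) for y in (0, 1)}
print(mutual_information(indep))  # 0.0
```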

9. The self-information of a random variable is
a) 0
b) 1
c) Infinite
d) Cannot be determined

10. The entropy of a random variable is
a) 0
b) 1
c) Infinite
d) Cannot be determined
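
The point behind question 10 is that entropy depends entirely on the distribution, so without the pmf it cannot be determined. A short sketch showing three distributions with three different entropies:

```python
import math

def entropy(pmf):
    """Shannon entropy H(X) in bits for a pmf given as a list of probabilities."""
    return -sum(p * math.log2(p) for p in pmf if p > 0)

print(entropy([0.5, 0.5]))   # 1.0 bit  (fair coin)
print(entropy([1.0]))        # 0.0 bits (deterministic outcome)
print(entropy([0.25] * 4))   # 2.0 bits (uniform over 4 outcomes)
```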

11. Which is the more efficient method?
a) Encoding each symbol of a block
b) Encoding block of symbols
c) Encoding each symbol of a block & Encoding block of symbols
d) None of the mentioned

12. Lempel-Ziv algorithm is
a) Variable to fixed length algorithm
b) Fixed to variable length algorithm
c) Fixed to fixed length algorithm
d) Variable to variable length algorithm
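
Question 12 refers to Lempel-Ziv being variable-to-fixed: it parses the input into variable-length phrases, each emitted as a fixed-size (dictionary index, next symbol) pair. A toy LZ78-style parser, a sketch rather than a full codec:

```python
def lz78_parse(s):
    """LZ78 parsing: variable-length phrases -> fixed-size (index, symbol) pairs."""
    dictionary = {"": 0}
    phrase, output = "", []
    for ch in s:
        if phrase + ch in dictionary:
            phrase += ch                      # keep extending the current phrase
        else:
            output.append((dictionary[phrase], ch))
            dictionary[phrase + ch] = len(dictionary)
            phrase = ""
    if phrase:
        output.append((dictionary[phrase], ""))  # flush any trailing phrase
    return output

print(lz78_parse("aabaaab"))  # [(0, 'a'), (1, 'b'), (1, 'a'), (2, '')]
```

Note how later phrases ("ab", "aa") are longer than earlier ones, yet each is represented by one pair of fixed width.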

13. Coded systems are inherently capable of better transmission efficiency than uncoded systems.
a) True
b) False
