Explore topic-wise MCQs in Electronics & Communication Engineering.

This section includes 19 curated multiple-choice questions to sharpen your Electronics & Communication Engineering knowledge and support exam preparation. Work through the questions below to get started.

1.

For any binary (n, h) linear code with minimum distance (2t + 1) or greater, \(n - h \ge {\log _2}\left[ {\sum\limits_{i = 0}^{\alpha} \binom{n}{i}} \right]\), where α is:

A. 2t + 1
B. t + 1
C. t - 1
D. t
Answer» D. t
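A quick check: this is the Hamming (sphere-packing) bound. A t-error-correcting code needs a distinct syndrome for every error pattern of weight 0 up to t, so \(2^{n-h} \ge \sum\limits_{i=0}^{t} \binom{n}{i}\), i.e. \(n - h \ge \log_2\left[\sum\limits_{i=0}^{t} \binom{n}{i}\right]\), giving α = t.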
2.

If the probability of a message is 1/4, then the information in bits is:

A. 8 bit
B. 4 bit
C. 2 bit
D. 1 bit
Answer» C. 2 bit
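One way to see this: the self-information of a message with probability p is \(I = \log_2(1/p)\), so here \(I = \log_2 4 = 2\) bits.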
3.

A fair die is rolled once. Find the entropy of the outcome.

A. 4.564 bits
B. 2.585 bits
C. 3.256 bits
D. 2.654 bits
Answer» B. 2.585 bits
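A quick check: a fair die has six equally likely outcomes, so \(H = \log_2 6 \approx 2.585\) bits.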
4.

In a binary source, 0s occur three times as often as 1s. What is the information contained in the 1s?

A. 0.415 bit
B. 0.333 bit
C. 3 bit
D. 2 bit
Answer» D. 2 bit
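Working sketch: if 0s occur three times as often as 1s, then \(P(1) = 1/4\), and the information carried by a 1 is \(I(1) = \log_2(1/P(1)) = \log_2 4 = 2\) bits.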
5.

Four sources are generating information as given below:
(a) Source 1: \({p_1} = \frac{1}{4},{p_2} = \frac{1}{4},{p_3} = \frac{1}{4},{p_4} = \frac{1}{4}\)
(b) Source 2: \({p_1} = \frac{1}{2},{p_2} = \frac{1}{4},{p_3} = \frac{1}{8},{p_4} = \frac{1}{8}\)
(c) Source 3: \({p_1} = \frac{1}{2},{p_2} = \frac{1}{2},{p_3} = \frac{1}{4},{p_4} = \frac{1}{8}\)
(d) Source 4: \({p_1} = \frac{1}{2},{p_2} = \frac{1}{4},{p_3} = \frac{1}{4},{p_4} = \frac{1}{8}\)
Arrange these sources in the descending order of their entropy (H).

A. (c), (d), (a), (b)
B. (a), (d), (c), (b)
C. (d), (c), (a), (b)
D. (b), (a), (c), (d)
Answer» C. (d), (c), (a), (b)
6.

Consider a binary source in which 0s occur 3 times as often as 1s. Its entropy in bits/symbol will be:

A. 0.75 bits/symbol
B. 0.25 bits/symbol
C. 0.81 bits/symbol
D. 0.85 bits/symbol
Answer» C. 0.81 bits/symbol
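A quick check: with \(P(0) = 3/4\) and \(P(1) = 1/4\), \(H = \frac{3}{4}\log_2\frac{4}{3} + \frac{1}{4}\log_2 4 \approx 0.311 + 0.5 = 0.811\) bits/symbol.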
7.

If the SNR of an 8 kHz bandlimited white Gaussian channel is 25 dB, the channel capacity is:

A. 2.40 kbps
B. 53.26 kbps
C. 66.47 kbps
D. 26.84 kbps
Answer» C. 66.47 kbps
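Working sketch using the standard Shannon–Hartley formula: 25 dB corresponds to \(S/N = 10^{2.5} \approx 316.2\), so \(C = B\log_2(1 + S/N) = 8000 \times \log_2(317.2) \approx 8000 \times 8.31 \approx 66.47\) kbps.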
8.

For an additive white Gaussian noise channel, the channel bandwidth is 100 MHz and the S/N power ratio is 40 dB. Find the channel capacity in bits/sec.

A. 1328.786 × 10⁶ bits/sec
B. 1248.687 × 10⁶ bits/sec
C. 1245.687 × 10⁶ bits/sec
D. 2245.687 × 10⁶ bits/sec
Answer» A. 1328.786 × 10⁶ bits/sec
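A quick check: 40 dB corresponds to \(S/N = 10^4\), so \(C = B\log_2(1 + S/N) = 10^8 \times \log_2(1 + 10^4) \approx 10^8 \times 13.288 \approx 1328.8 \times 10^6\) bits/sec.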
9.

Information is:

A. the synonym of probability
B. not related to the probability of information
C. inversely proportional to the probability of information
D. directly proportional to the probability of information
Answer» C. inversely proportional to the probability of information
10.

For a system having 16 distinct symbols, maximum entropy is obtained when probabilities are:

A. 1/8
B. 1/4
C. 1/3
D. 1/16
Answer» D. 1/16
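One way to see this: the entropy of a 16-symbol source is maximum when all symbols are equally likely, i.e. each probability is 1/16, giving \(H_{max} = \log_2 16 = 4\) bits/symbol.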
11.

Channel capacity of a noise-free channel having m symbols is given by:

A. m²
B. 2m
C. log₂ m
D. m
Answer» C. log₂ m
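A quick check: an error-free channel with m distinct symbols can convey at most \(\log_2 m\) bits per channel use (achieved with equiprobable symbols), so \(C = \log_2 m\).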
12.

A random experiment has 64 equally likely outcomes. Find the information associated with each outcome.

A. 3 bits
B. 2 bits
C. 6 bits
D. 5 bits
Answer» C. 6 bits
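Working sketch: each outcome has probability 1/64, so \(I = \log_2 64 = 6\) bits.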
13.

Calculate the capacity of a standard 4 kHz telephone channel with a 32 dB signal-to-noise ratio.

A. 16428 bps
B. 1586 bps
C. 3100 bps
D. 42524 bps
Answer» D. 42524 bps
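A quick check using the standard Shannon–Hartley formula: 32 dB corresponds to \(S/N = 10^{3.2} \approx 1584.9\), so \(C = 4000 \times \log_2(1 + 1584.9) \approx 4000 \times 10.63 \approx 42524\) bps.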
14.

A source generates four messages m1, m2, m3, and m4 with probabilities 0.5, 0.25, 0.125 and 0.125 respectively. The messages are generated independently of each other. A source coder assigns binary code to each message. Which of the following codes has minimum average length and is also uniquely decodable (sequence as per m1, m2, m3, m4)?

A. 00, 01, 10, 11
B. 0, 1, 10, 11
C. 110, 111, 10, 0
D. 0, 10, 110, 111
Answer» D. 0, 10, 110, 111
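One way to verify: the code 0, 10, 110, 111 is a prefix code (hence uniquely decodable) with average length \(\bar{L} = 0.5(1) + 0.25(2) + 0.125(3) + 0.125(3) = 1.75\) bits, which equals the source entropy, so no shorter uniquely decodable code exists. Option B (0, 1, 10, 11) is not uniquely decodable, while options A and C have larger average lengths (2 and 2.625 bits respectively).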
15.

Find the channel capacity of a noiseless discrete channel with n symbols x1, x2, x3, …, xn.

A. C = log2 4n
B. C = log2 2n
C. C = log2 n2
D. C = log2 n
Answer» D. C = log2 n
16.

For additive white Gaussian channel noise, the capacity of a low-pass channel with a usable bandwidth of 3000 Hz and S/N = 10³ at the channel output will be:

A. 15000 bits/s
B. 20000 bits/s
C. 25000 bits/s
D. 30000 bits/s
Answer» D. 30000 bits/s
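Working sketch: \(C = B\log_2(1 + S/N) = 3000 \times \log_2(1001) \approx 3000 \times 9.97 \approx 29900 \approx 30000\) bits/s.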
17.

A source produces three symbols A, B and C with probabilities P(A) = ½, P(B) = ¼ and P(C) = ¼. The source entropy is:

A. ½ bit/symbol
B. 1 bit/symbol
C. 1 ¼ bits/symbol
D. 1 ½ bits/symbol
Answer» D. 1 ½ bits/symbol
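A quick check: \(H = \frac{1}{2}\log_2 2 + \frac{1}{4}\log_2 4 + \frac{1}{4}\log_2 4 = 0.5 + 0.5 + 0.5 = 1.5\) bits/symbol.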
18.

Given below are two statements: one is labelled as Assertion (A) and the other is labelled as Reason (R).
Assertion (A): The syndrome depends on both the error pattern and the transmitted code word.
Reason (R): All error patterns that differ by a code word have the same syndrome.
In the light of the above statements, choose the correct answer from the options given below:

A. Both (A) and (R) are true and (R) is the correct explanation of (A)
B. Both (A) and (R) are true but (R) is not the correct explanation of (A)
C. (A) is true but (R) is false
D. (A) is false but (R) is true
Answer» D. (A) is false but (R) is true
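One way to see this: for a received word \(r = c + e\), the syndrome is \(s = rH^T = cH^T + eH^T = eH^T\), since every code word satisfies \(cH^T = 0\). The syndrome therefore depends only on the error pattern, making (A) false, while two error patterns that differ by a code word give the same syndrome, making (R) true.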
19.

A discrete source emits one of four symbols with probabilities 1/3, 1/3, 1/4 and 1/12 every 100 μs. The information rate is:

A. 9795.5 symbols/sec
B. 1.855 symbols/sec
C. 18.55 kbits/sec
D. 18.55 bits/sec
Answer» C. 18.55 kbits/sec
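Working sketch: the per-symbol entropy is \(H = \frac{1}{3}\log_2 3 + \frac{1}{3}\log_2 3 + \frac{1}{4}\log_2 4 + \frac{1}{12}\log_2 12 \approx 1.855\) bits/symbol; with one symbol every 100 μs the symbol rate is \(10^4\) symbols/sec, so \(R \approx 1.855 \times 10^4 \approx 18.55\) kbits/sec.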