MCQOPTIONS
1. Channel capacity of a noise-free channel having m symbols is given by:
A. m²
B. 2^m
C. log₂ m
D. m
Answer» C. log₂ m
(For a noiseless channel, capacity is the maximum mutual information, which equals the maximum source entropy, attained when all m symbols are equally likely: C = log₂ m bits/symbol.)
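As a quick numerical check, a minimal sketch (the helper name is my own, not from the source) of the noiseless-channel capacity C = log₂ m:

```python
import math

def noiseless_capacity(m: int) -> float:
    """Capacity of a noise-free channel with m symbols, in bits/symbol.

    With no noise, capacity equals the maximum source entropy,
    achieved when all m symbols are equiprobable: C = log2(m).
    """
    return math.log2(m)

# A 4-symbol noiseless channel carries log2(4) = 2 bits per symbol.
print(noiseless_capacity(4))  # 2.0
```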
Related MCQs
For any binary (n, h) linear code with minimum distance (2t + 1) or greater, \(n - h \ge \log_2\left[\sum_{i=0}^{\alpha}\binom{n}{i}\right]\), where α is:
If the probability of a message is 1/4, then the information in bits is:
A fair die is rolled once. Find the entropy of the outcomes.
In a binary source, 0s occur three times as often as 1s. What is the information contained in the 1s?
Four sources are generating information as given below:
(a) Source 1: \(p_1 = \frac{1}{4}, p_2 = \frac{1}{4}, p_3 = \frac{1}{4}, p_4 = \frac{1}{4}\)
(b) Source 2: \(p_1 = \frac{1}{2}, p_2 = \frac{1}{4}, p_3 = \frac{1}{8}, p_4 = \frac{1}{8}\)
(c) Source 3: \(p_1 = \frac{1}{2}, p_2 = \frac{1}{2}, p_3 = \frac{1}{4}, p_4 = \frac{1}{8}\)
(d) Source 4: \(p_1 = \frac{1}{2}, p_2 = \frac{1}{4}, p_3 = \frac{1}{4}, p_4 = \frac{1}{8}\)
Arrange these sources in descending order of their entropy (H).
In a binary source, 0s occur 3 times as often as 1s. Its entropy in bits/symbol will be:
If the SNR of an 8 kHz bandlimited white Gaussian channel is 25 dB, the channel capacity is:
For an additive white Gaussian channel, the channel bandwidth is 100 MHz and the S/N power ratio is 40 dB. Find the channel capacity in bits/sec.
Information is:
For a system having 16 distinct symbols, maximum entropy is obtained when probabilities are:
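Several of the capacity questions above rest on the Shannon–Hartley formula, C = B·log₂(1 + S/N), with the SNR converted from dB to a linear ratio first. A minimal sketch (the function name is my own) plugging in the figures quoted above:

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon-Hartley capacity C = B * log2(1 + S/N), in bits/second."""
    snr_linear = 10 ** (snr_db / 10)  # convert dB to a linear power ratio
    return bandwidth_hz * math.log2(1 + snr_linear)

# 8 kHz channel at 25 dB SNR -> roughly 66.5 kbit/s
print(shannon_capacity(8e3, 25))
# 100 MHz channel at 40 dB SNR -> roughly 1.33 Gbit/s
print(shannon_capacity(100e6, 40))
```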