1.

If there are M messages and each message has probability p = 1/M, the entropy is

A. 0
B. 1
C. \({\log _2}M\)
D. \(M{\log _2}M\)
Answer» C. \({\log _2}M\)

Explanation: For M equiprobable messages, \(H = -\sum_{i=1}^{M} \frac{1}{M}{\log _2}\frac{1}{M} = {\log _2}M\) bits, so option C is correct (not D).
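As a quick numerical check (a minimal sketch, not part of the original question; the function name entropy_equiprobable is illustrative), the following Python snippet computes the entropy of M equiprobable messages and confirms it matches \({\log _2}M\):

    import math

    def entropy_equiprobable(M: int) -> float:
        """Entropy in bits of M messages, each with probability p = 1/M."""
        p = 1.0 / M
        # H = -sum over all M messages of p * log2(p)
        return -sum(p * math.log2(p) for _ in range(M))

    for M in (2, 4, 8, 16):
        # Both columns agree: entropy equals log2(M)
        print(M, entropy_equiprobable(M), math.log2(M))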

