1.

In Markov analysis, state probabilities must

A. Sum to one
B. Be less than one
C. Be greater than one
D. None of the above
Answer» A. In a Markov chain, the state probabilities at any step form a probability distribution, so they must sum to one.
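As a quick illustration, the following sketch uses a hypothetical 2-state transition matrix (not from the source) to show that the state probability vector still sums to one after any number of steps:

```python
import numpy as np

# Hypothetical 2-state Markov chain: each row of P sums to 1 (row-stochastic).
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# Initial state probabilities also sum to 1.
pi = np.array([0.5, 0.5])

# Propagate the distribution forward several steps; the total stays 1.
for _ in range(5):
    pi = pi @ P

total = pi.sum()  # equals 1.0 up to floating-point rounding
```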
