MCQOPTIONS
Neural Networks → Hopfield Model
1.
If two classes are linearly inseparable, can perceptron convergence theorem be applied?
A. yes
B. no
Answer» B. no
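The convergence theorem guarantees a perceptron finds a separating hyperplane in finitely many updates only when one exists; for linearly inseparable classes the weights cycle forever. A minimal sketch illustrating this (the function name `train_perceptron` and the epoch cap are my own, not from the source): training on AND (separable) reaches an error-free pass, while training on XOR (inseparable) never does.

```python
# The perceptron update rule w <- w + eta * (target - output) * x converges
# only on linearly separable data; on XOR it keeps making mistakes forever.

def train_perceptron(samples, epochs=100, eta=1.0):
    """Train a threshold perceptron with a bias input; return (weights, converged)."""
    w = [0.0, 0.0, 0.0]  # [bias, w1, w2]
    for _ in range(epochs):
        errors = 0
        for (x1, x2), target in samples:
            x = [1.0, x1, x2]  # prepend constant bias input
            output = 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else 0
            if output != target:
                errors += 1
                w = [wi + eta * (target - output) * xi for wi, xi in zip(w, x)]
        if errors == 0:
            return w, True  # a full pass with no mistakes: converged
    return w, False  # epoch cap hit without an error-free pass

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]  # linearly separable
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]  # linearly inseparable

_, ok_and = train_perceptron(AND)
_, ok_xor = train_perceptron(XOR)
print(ok_and, ok_xor)  # AND converges; XOR does not
```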
Related MCQs
w(m + 1) = w(m) + n(b(m) – s(m)) a(m), where b(m) is desired output, s(m) is actual output, a(m) is input vector and ‘w’ denotes weight; can this model be used for perceptron learning?
If e(m) denotes the error for correction of weight, then what is the formula for error in the perceptron learning model w(m + 1) = w(m) + n(b(m) – s(m)) a(m), where b(m) is desired output, s(m) is actual output, a(m) is input vector and ‘w’ denotes weight?
The perceptron convergence theorem is applicable for what kind of data?
Is it necessary to set initial weights in the perceptron convergence theorem to zero?
Two classes are said to be inseparable when?
If two classes are linearly inseparable, can perceptron convergence theorem be applied?
When two classes can be separated by a straight line, they are known as?
In perceptron learning, what happens when input vector is correctly classified?
On what factor the number of outputs depends?
What is the objective of perceptron learning?
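Several of the related questions above revolve around the update rule w(m + 1) = w(m) + n(b(m) – s(m)) a(m) and its error term e(m) = b(m) – s(m). A minimal single-step sketch of that rule (the helper name `perceptron_step` and the sample numbers are illustrative, not from the source):

```python
# One perceptron learning step: w(m+1) = w(m) + n * e(m) * a(m),
# where the error e(m) = b(m) - s(m) is desired minus actual output.

def perceptron_step(w, a, b, n=0.1):
    """w: weight vector, a: input vector, b: desired output, n: learning rate.
    Returns the updated weights and the error term e(m)."""
    s = 1 if sum(wi * ai for wi, ai in zip(w, a)) > 0 else 0  # actual output s(m)
    e = b - s  # error e(m); zero when the input is already correctly classified
    return [wi + n * e * ai for wi, ai in zip(w, a)], e

# A misclassified input (weighted sum is negative, desired output is 1),
# so the weights get nudged toward the input vector:
w_new, e = perceptron_step([0.2, -0.5], [1.0, 1.0], 1)
print(w_new, e)
```

Note that when the input is correctly classified, e(m) = 0 and the weights are left unchanged, which answers one of the related questions above.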