1.
Is it necessary to set the initial weights in the perceptron convergence theorem to zero?
A.
yes
B.
no
Answer» B. no
Explanation: The perceptron convergence theorem holds for any choice of initial weights on linearly separable data; initializing to zero is only a common convenience, and the choice affects how many update steps are needed, not whether the algorithm converges.
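A minimal sketch, assuming a standard single-neuron perceptron with bipolar (±1) targets; the dataset, learning rate, and random initialization below are illustrative assumptions, not part of the MCQ. It shows training converging to a correct classifier whether the weights start at zero or at random values:

```python
import numpy as np

def train_perceptron(a, b, w, eta=1.0, max_epochs=100):
    """Perceptron rule: w(m+1) = w(m) + eta * (b(m) - s(m)) * a(m)."""
    for _ in range(max_epochs):
        updated = False
        for x, target in zip(a, b):
            s = 1 if np.dot(w, x) >= 0 else -1  # actual output s(m)
            if s != target:                     # misclassified: apply update
                w = w + eta * (target - s) * x
                updated = True
        if not updated:                         # a full pass with no errors
            return w                            # means convergence
    return w

# AND-like linearly separable data; last component is a bias input.
a = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=float)
b = np.array([-1, -1, -1, 1])

w_zero = train_perceptron(a, b, w=np.zeros(3))         # zero initialization
w_rand = train_perceptron(a, b, w=np.random.randn(3))  # random initialization

for w in (w_zero, w_rand):
    preds = np.where(a @ w >= 0, 1, -1)
    assert (preds == b).all()  # both initializations classify all points
print("zero init ->", w_zero, " random init ->", w_rand)
```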
Related MCQs
If e(m) denotes the error used to correct the weights in the perceptron learning model w(m + 1) = w(m) + η e(m) a(m), where b(m) is the desired output, s(m) is the actual output, a(m) is the input vector and w denotes the weights, what is the formula for the error? (A worked one-step sketch of this rule appears after this list.)
Can the model w(m + 1) = w(m) + η (b(m) − s(m)) a(m), where b(m) is the desired output, s(m) is the actual output, a(m) is the input vector and w denotes the weights, be used for perceptron learning?
The perceptron convergence theorem is applicable for what kind of data?
Is it necessary to set the initial weights in the perceptron convergence theorem to zero?
When are two classes said to be inseparable?
If two classes are linearly inseparable, can the perceptron convergence theorem be applied?
When two classes can be separated by a straight line, what are they known as?
In perceptron learning, what happens when an input vector is correctly classified?
On what factor does the number of outputs depend?
What is the objective of perceptron learning?
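For the weight-update rule quoted in the first two related questions, a one-step numerical sketch (the learning rate and vectors are illustrative assumptions); it makes explicit that the error term is e(m) = b(m) − s(m):

```python
import numpy as np

# One update step of: w(m+1) = w(m) + eta * e(m) * a(m),
# with e(m) = b(m) - s(m). All values are illustrative.
eta = 0.5                         # learning rate (the 'n'/eta in the MCQ)
a_m = np.array([1.0, -2.0, 0.5])  # input vector a(m)
w_m = np.array([0.2, 0.4, -0.1])  # current weights w(m)

s_m = 1 if np.dot(w_m, a_m) >= 0 else -1  # actual output s(m) = -1 here
b_m = 1                                   # desired output b(m)

e_m = b_m - s_m                  # error e(m) = 1 - (-1) = 2
w_next = w_m + eta * e_m * a_m   # w(m+1) = [1.2, -1.6, 0.4]

print("s(m) =", s_m, " e(m) =", e_m, " w(m+1) =", w_next)
```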