Machine learning - simple perceptron model and XOR
Sorry to keep asking questions here. I'm studying hard and I'm ready to answer follow-up questions too!
Many papers and articles claim that there is no restriction on which activation function an MLP can use. It seems it doesn't matter which one you choose, as long as it fits the given conditions. At the same time, articles have mathematically proven that a simple perceptron cannot solve the XOR problem.
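(The standard impossibility argument assumes a threshold unit that outputs 1 exactly when w1*x1 + w2*x2 + b > 0. XOR would then require all of the following at once:)

```latex
\begin{align*}
(0,0)\mapsto 0 &: \quad b \le 0 \\
(1,0)\mapsto 1 &: \quad w_1 + b > 0 \\
(0,1)\mapsto 1 &: \quad w_2 + b > 0 \\
(1,1)\mapsto 0 &: \quad w_1 + w_2 + b \le 0
\end{align*}
```

Adding the two strict inequalities gives w_1 + w_2 + 2b > 0, and since b <= 0 this already forces w_1 + w_2 + b > 0, contradicting the last condition.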
I know the simple perceptron model traditionally uses the step function as its activation function. But if it doesn't matter which activation function we use, then using

f(x) = 1 if |x - a| < b
f(x) = 0 if |x - a| > b

as the activation function solves the XOR problem (for a 2-input, 1-output perceptron model with no hidden layer).
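For concreteness, here is a minimal sketch of that claim. The weights w1 = w2 = 1 and the parameters a = 1, b = 0.5 are hand-picked for illustration, not learned values:

```python
# One unit, no hidden layer, with the proposed activation
# f(s) = 1 if |s - a| < b else 0.
def bump(s, a=1.0, b=0.5):
    return 1 if abs(s - a) < b else 0

def perceptron(x1, x2, w1=1.0, w2=1.0):
    # weighted sum, then the bump activation
    return bump(w1 * x1 + w2 * x2)

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, "->", perceptron(x1, x2))  # prints 0, 1, 1, 0, i.e. XOR
```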
I know that hand-picking such an artificial function is not the same as a learning model. But if it works anyway, why do the articles prove that it doesn't? Do those articles mean by "simple perceptron" specifically the model that uses the step function? Or is it that the simple perceptron's activation function is fixed to the step function, unlike an MLP's? Or am I wrong somewhere?
In general, the problem is that non-differentiable activation functions (like the one you proposed) cannot be used with back-propagation and other gradient-based techniques. Back-propagation is a convenient way to estimate the right threshold values (a and b in your example). Popular activation functions are selected so that they approximate the step behaviour while remaining differentiable.
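As a rough illustration (a sketch of the point above, not code from any of the articles), compare the gradient signal of a sigmoid with that of your bump function: the bump is piecewise constant, so its slope is zero almost everywhere and gradient descent gets no signal for adjusting a and b.

```python
import numpy as np

def sigmoid(s):
    # smooth approximation of the step; its slope is non-zero everywhere
    return 1.0 / (1.0 + np.exp(-s))

def bump(s, a=1.0, b=0.5):
    # the proposed activation: piecewise constant, slope 0 wherever it is defined
    return np.where(np.abs(s - a) < b, 1.0, 0.0)

def numeric_grad(f, s, eps=1e-4):
    # central-difference estimate of df/ds
    return (f(s + eps) - f(s - eps)) / (2.0 * eps)

s = np.linspace(-2.0, 4.0, 7)
print("sigmoid d/ds:", numeric_grad(sigmoid, s))  # non-zero at every point
print("bump    d/ds:", numeric_grad(bump, s))     # all zeros here: nothing to follow
```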
machine-learning neural-network xor perceptron