Getting confused with the NeuroRule algorithm
I'm doing my assignment to extract rules using the NeuroRule algorithm (basically a neural network), which is explained in a journal paper entitled "NeuroRule: A Connectionist Approach to Data Mining".
It has three major steps:
1. Network training
2. Network Pruning
3. Rule extracting
I'm still on the first step, which is a kind of backpropagation. The paper explains that two activation functions are used: the log-sigmoid and the hyperbolic tangent sigmoid. They are driving me mad. They produce values that aren't clearly 0 or 1, so I read more about activation functions and found that applying them requires the derivative of the activation function.
But the paper gives no explanation about that. Should I add it or not? And does adding it have any noticeable impact?
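To show what I mean, here is a small sketch of the two activation functions, their derivatives, and where the derivative enters a single backprop weight update. The function and variable names are my own, not from the paper; this is just how I currently understand the training step:

```python
import math

# The two activation functions mentioned in the paper (names are mine):
def log_sigmoid(x):
    # squashes any input into (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def tanh_sigmoid(x):
    # squashes any input into (-1, 1)
    return math.tanh(x)

# Their derivatives, which backpropagation needs for the gradient:
def log_sigmoid_prime(x):
    s = log_sigmoid(x)
    return s * (1.0 - s)

def tanh_sigmoid_prime(x):
    t = math.tanh(x)
    return 1.0 - t * t

# One gradient-descent step for a single weight w on one example (x, target),
# using squared error E = (y - target)^2 / 2:
def update_weight(w, x, target, lr=0.1):
    z = w * x
    y = log_sigmoid(z)
    err = y - target
    # chain rule: dE/dw = err * sigma'(z) * x -- this is where the derivative enters
    return w - lr * err * log_sigmoid_prime(z) * x
```

So as far as I can tell, the derivative isn't an optional add-on: without it there is no gradient, and without a gradient the weights can't be updated at all. Is that right?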
Any help would be appreciated.