Neural Network general question
I'm experimenting with backprop NNs and have a couple of questions regarding implementation and training:
1. Since the usual method of training a BPNN means normalizing inputs to accommodate the sigmoidal function limitations, how does one accurately model a continuous function with solutions that fall outside these ranges? I've got a few answers in my mind, but it's not really clear to me just yet. That is, suppose I want to model the relationship between, say, product demand and price. Do I have to normalize the NN input / target values for training and then "scale" the outputs after I run the NN so the outputs are meaningful in their original context (i.e., they look like real prices, not normalized values)?
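The usual answer is yes: fit a scaling on the training targets, train on the scaled values, and invert the scaling on the network's outputs. A minimal sketch of that round trip, assuming simple min-max scaling into a sigmoid-friendly range (all names here are illustrative, not from any particular library):

```python
# Min-max scaling into [0.1, 0.9] (kept away from the sigmoid's
# saturated ends at 0 and 1), plus the inverse transform that maps
# network outputs back to meaningful units (e.g. real prices).

def fit_minmax(values, lo=0.1, hi=0.9):
    """Return (scale, offset) mapping min(values) -> lo, max(values) -> hi."""
    vmin, vmax = min(values), max(values)
    scale = (hi - lo) / (vmax - vmin)
    offset = lo - vmin * scale
    return scale, offset

def scale(v, params):
    """Original units -> normalized network range."""
    s, o = params
    return v * s + o

def unscale(y, params):
    """Normalized network output -> original units."""
    s, o = params
    return (y - o) / s

prices = [10.0, 25.0, 40.0]            # hypothetical training targets
params = fit_minmax(prices)
scaled = [scale(p, params) for p in prices]      # feed these to the NN
restored = [unscale(y, params) for y in scaled]  # apply to NN outputs
```

One caveat worth noting: the fitted min/max come from the training data, so outputs for inputs outside that range will still be clamped by the sigmoid; leaving headroom (here, the [0.1, 0.9] range) only partially mitigates that.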
2. Is it legitimate to mix real and discrete inputs? Citing the product demand example, suppose the price of a product in a particular area is dependent upon two things: the month of the year and the local demand in terms of quantity. It would seem most sensible to normalize the quantities as a series of real (continuous) values, and the months as discrete integers (vectors in a binary or one-of-N format). Thus, the inputs would consist of a real-value input for demand, and a series of inputs for a vector of 1's and 0's to represent the month.
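Concatenating the two encodings into one input vector, as described above, might look like this (a hypothetical sketch; the function and parameter names are made up for illustration):

```python
# Build a 13-element NN input vector: a 12-element one-of-N (one-hot)
# encoding of the month, followed by one continuous, scaled demand value.

def encode_input(month, demand, demand_params):
    """month in 1..12; demand_params is (scale, offset) fitted on training data."""
    one_hot = [0.0] * 12
    one_hot[month - 1] = 1.0
    s, o = demand_params
    return one_hot + [demand * s + o]

# March, demand of 500 units, with an assumed pre-fitted scaling.
x = encode_input(3, 500.0, (0.001, 0.1))
```

Mixing the two input types is legitimate; the main practical point is that the continuous input should be scaled to roughly the same magnitude as the 0/1 one-hot inputs so that no single input dominates the weighted sums early in training.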
Any comments / links would be appreciated.