I'm experimenting with backprop NNs and have a couple of questions regarding implementation / training:
1. Since training a BPNN usually means normalizing inputs and targets to fit the sigmoid's limited output range, how does one accurately model a continuous function whose values fall outside that range? I've got a few answers in mind, but it's not really clear to me yet. Suppose I want to model the relationship between, say, product demand and price. Do I have to normalize the NN input / target values for training and then "scale" the outputs after I run the NN so they are meaningful in their original context (i.e., they look like real prices, not normalized values)?
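For what it's worth, a minimal sketch of the scale-then-unscale idea described above, using simple min-max scaling (the 0.1–0.9 margin is just a common choice to keep targets off the sigmoid's saturated tails, not a requirement; the price values are made up):

```python
import numpy as np

def make_scaler(values, lo=0.1, hi=0.9):
    """Build paired scale/unscale functions from the training data's range."""
    vmin, vmax = values.min(), values.max()
    def scale(x):
        # map [vmin, vmax] into [lo, hi] for training targets
        return lo + (hi - lo) * (x - vmin) / (vmax - vmin)
    def unscale(y):
        # invert the mapping so network outputs read as real prices again
        return vmin + (y - lo) * (vmax - vmin) / (hi - lo)
    return scale, unscale

prices = np.array([19.99, 24.50, 31.00, 27.25])   # hypothetical targets
scale, unscale = make_scaler(prices)
targets = scale(prices)        # train the network against these
predicted = unscale(targets)   # map network outputs back to price space
```

One caveat with this approach: the scaler is fixed by the training data's min/max, so outputs for unseen inputs can never fall outside that range — which is exactly the limitation the question is asking about.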
2. Is it legitimate to mix real and discrete inputs? Citing the product demand example, suppose the price of a product in a particular area depends on two things: the month of the year and the local demand in terms of quantity. It would seem most sensible to normalize the quantities as real (continuous) values, and to encode the months as discrete values in a binary or one-of-N format. Thus, the inputs would consist of a single real-valued input for demand, plus a vector of 1's and 0's to represent the month.
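The mixed encoding described in (2) might look like this in code: one normalized real input for demand concatenated with a 12-element one-of-N (one-hot) vector for the month. The demand range used for scaling is an assumed example value:

```python
import numpy as np

def encode(demand, month, demand_min=0.0, demand_max=500.0):
    """month is 1..12; demand is a raw quantity. Range bounds are assumptions."""
    # scale the continuous demand into [0, 1]
    d = (demand - demand_min) / (demand_max - demand_min)
    # one-of-N encoding for the month
    one_hot = np.zeros(12)
    one_hot[month - 1] = 1.0
    # final input vector: 1 real input + 12 binary inputs = 13 inputs
    return np.concatenate(([d], one_hot))

x = encode(demand=250.0, month=3)
# x[0] is the scaled demand; x[3] == 1.0 marks March
```

This is generally considered legitimate: to the network every input is just a number, so mixing a continuous input with one-hot inputs is fine as long as they are all kept on comparable scales.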
Any comments / links would be appreciated.