Set up the inputs in MATLAB. Open the Netlab toolbox and define the input parameters: a matrix of input vectors, a matrix of target values, the number of units in the hidden layer, the maximum number of training iterations (so that non-convergent networks still terminate), and the weight decay parameters.
Apply a Gaussian regularization prior to each network layer. Build the prior with mlpprior: prior = mlpprior(nin, nhidden, nout, aw1, ab1, aw2, ab2), where "aw1" is the weight decay parameter for the first-layer weights, "ab1" for the first-layer biases, "aw2" for the second-layer weights, and "ab2" for the second-layer biases. Create the network with net = mlp(nin, nhidden, nout, function, prior), then train it on the input and target data with [net, options] = netopt(net, options, trainIn, trainOut, method).
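The calls above can be combined into a short script. This is a minimal sketch: the layer sizes, decay values, iteration count, and the 'linear' output function are illustrative placeholders rather than values from the original, and trainIn/trainOut are assumed to already exist in the workspace.

```matlab
% Illustrative Netlab set-up; all numeric values are placeholder assumptions.
nin = 4; nhidden = 8; nout = 1;          % input, hidden and output unit counts
aw1 = 0.01; ab1 = 0.01;                  % decay for first-layer weights and biases
aw2 = 0.01; ab2 = 0.01;                  % decay for second-layer weights and biases

% Gaussian prior (weight decay) for each group of network parameters
prior = mlpprior(nin, nhidden, nout, aw1, ab1, aw2, ab2);

% Create the network; 'linear' output activation is typical for regression
net = mlp(nin, nhidden, nout, 'linear', prior);

% Netlab optimisers take a 1x18 options vector
options = zeros(1, 18);
options(1)  = 1;    % display error values during training
options(14) = 100;  % maximum number of iterations

% Train with scaled conjugate gradients ('scg'); trainIn holds the input
% vectors and trainOut the targets, one example per row
[net, options] = netopt(net, options, trainIn, trainOut, 'scg');
```

'scg' is one of several optimisers Netlab's netopt accepts as the method argument; the choice here is an example, not a recommendation from the original.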
Standardize the inputs and targets, or else adjust the penalty term to account for their differing standard deviations. Generalization tends to improve when at least three separate decay constants are used, so that the input-to-hidden weights, the hidden-to-output weights, and the biases are each regularized on their own scale.
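Standardization can be sketched as follows; the variable names trainIn and trainOut are assumptions carried over from the training call above, with one example per row.

```matlab
% Standardize inputs and targets to zero mean and unit standard deviation.
% trainIn and trainOut are assumed to hold one example per row.
n = size(trainIn, 1);

muIn  = mean(trainIn);   sigIn  = std(trainIn);
muOut = mean(trainOut);  sigOut = std(trainOut);

trainInStd  = (trainIn  - repmat(muIn,  n, 1)) ./ repmat(sigIn,  n, 1);
trainOutStd = (trainOut - repmat(muOut, n, 1)) ./ repmat(sigOut, n, 1);

% After training on the standardized data, undo the target scaling
% when interpreting a prediction predStd for a single example:
% pred = predStd .* sigOut + muOut;
```

With standardized data, a single decay constant penalizes all weights on a comparable scale; without it, the per-group decay constants must absorb the differing input and target scales.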