ProbSpec.m

The network definition file. It is run as a module to start training on the specified network. Since it resides in a problem subdirectory (see Section 1.3 for more on the directory hierarchy), the first thing it does is add the Base/ and Tools/ directories to the default search path in the environment.

The rest of the file consists of variable declarations and ends with a call to the training module (see Section 3.1 for the definition and Section 4.1.5 for details). The variables at the time of writing this manual are as follows:

inputfilename 
The name of the input file for the problem, as defined in Section 2.
outputfilename 
The name of the output file for the problem, as defined in Section 2.
nOfSamples 
The number of patterns (samples) in the data set (see Section 6 for possible amendments).
numInputs 
Number of units in the input layer of the network. This value should be consistent with the patterns in the data set (see Section 6 for possible amendments).
numHids 
Number of units in the hidden layer of the network. This value is independent of the data set, although it should be chosen with care to minimize training time [Reed98, pp. 68-69].
numOuts 
Number of units in the output layer of the network. This value should be consistent with the patterns in the data set (see Section 6 for possible amendments).
weightLim 
The \( \pm \) limit for the weight initialization procedure. See the file Base/RndWtsBs.m for the interpretation of values (see Section 4.1.4 for the implementation).
rate 
Learning rate parameter for the back-propagation algorithm. This value is independent of the data set, although it should be chosen with care to minimize training time [Reed98, pp. 68-69], [Hassoun95, pp. 211-213].
epochs 
Maximum number of epochs that the network is allowed to train. If this limit is reached before the convergence criterion is met, the run is deemed a failure to converge (see Section 3.1.2).
pSS 
The state of the network is displayed once every this many epochs. Useful for keeping the information output limited and readable. It also sets the step size used when plotting the error results (see Section 3.1.3.1).
checkType 
Convergence check procedure explained in Section 3.1.2.1. For the possible values, see Section 4.1.10.
back-propagation 
If set to 1, back-propagation is used; otherwise only the feed-forward results are calculated, without employing a learning procedure. The implementation is given in the weight update rules (see Section 4.1.9).
unitType 
Activation function to be used in all units of the network. Possible values are `val'  and `sig' ; see the file Base/MainLoop.m (Section 4.1.5) for the usage of this variable.
trainType 
Training style: either 'incremental'  or 'batch'  update of the network weights. See the file Base/MainLoop.m for the interpretation of values (see Section 4.1.5).
trainMethod 
Training method selection. Can be either 'Standard'  for standard back-propagation or 'Newton'  for the pseudo-Newton (approximate Newton) weight update rule [Hassoun95, pp. 215-216]. See the file Base/MainLoop.m for the interpretation of values (see Section 4.1.5).
patternRandomization 
Boolean variable that decides whether to shuffle the order of patterns within an epoch. Useful with the incremental training style. See the file Base/MainLoop.m for the interpretation of values (see Section 4.1.5).
criterion , missesCriterion , rejectionCriterion 
Criterion values used by the convergence check procedure (see Section 3.1.2.1). See the file Base/ConvChk.m for the interpretation of values (see Section 4.1.10).
continue 
Boolean value that determines whether to reuse previous weights. Useful when continuing from intentionally stopped training sessions. If this value is set to 1, the network continues learning from where it left off last time; otherwise the weights are initialized to random values.
muNewton 
A training parameter used only when the pseudo-Newton rule is chosen as trainMethod. It prevents infinite step sizes in the derivative calculations. The implementation is given in the weight update rules (see the files Base/batchModeNewton.m and Base/incrementalModeNewton.m, explained in Section 4.1.9) [Hassoun95, pp. 215-216].
normalizeInputs , normalizeWeights 
Boolean variables for activating the normalization procedures, which are devised to ease training. For data sets containing input values larger than 1, setting normalizeInputs normalizes all inputs to 1 prior to training. For networks with many input or hidden units, setting normalizeWeights divides the initial weight values by the fan-in of the target unit [Hassoun95, p. 242]. normalizeInputs is implemented in Base/LoadIO.m (see Section 4.1.3) and normalizeWeights in Base/RndWtsBs.m (see Section 4.1.4).
randomseed 
If defined, it seeds the random number generator used to initialize the network weights. This is useful when one needs to create the exact same network twice, or on two different computers: giving the same seed value accomplishes this.
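As background for the muNewton parameter above, the pseudo-Newton rule cited from [Hassoun95, pp. 215-216] divides each gradient component by an estimate of the corresponding second derivative; the \( \mu \) term keeps that denominator from vanishing. The notation below is illustrative and not taken from the toolkit's code:

```latex
\Delta w_{i} \;=\; -\,\eta\,
  \frac{\partial E / \partial w_{i}}
       {\left| \partial^{2} E / \partial w_{i}^{2} \right| \,+\, \mu}
```

When \( \left| \partial^{2} E / \partial w_{i}^{2} \right| \) approaches zero, the update degrades gracefully to a scaled gradient step of size \( \eta / \mu \) instead of diverging.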
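To make the list above concrete, a minimal ProbSpec.m might look like the sketch below. The variable names are the ones documented above, but every value shown, the relative paths, and the closing MainLoop call (assumed here to be the training module's entry point, per Section 4.1.5) are illustrative assumptions, not toolkit defaults:

```matlab
% ProbSpec.m -- illustrative sketch only; all values are assumptions.
path(path, '../Base');             % make the shared modules reachable
path(path, '../Tools');

inputfilename  = 'myproblem.in';   % data files (Section 2)
outputfilename = 'myproblem.out';
nOfSamples = 100;                  % patterns in the data set

numInputs = 4;                     % network topology
numHids   = 3;
numOuts   = 1;

weightLim = 0.5;                   % initialize weights in [-0.5, 0.5]
rate      = 0.1;                   % back-propagation learning rate
epochs    = 5000;                  % epoch limit before declaring failure
pSS       = 100;                   % display/plot the state every 100 epochs

unitType    = 'sig';               % activation function ('val' or 'sig')
trainType   = 'batch';             % or 'incremental'
trainMethod = 'Standard';          % or 'Newton' (pseudo-Newton)
patternRandomization = 0;          % shuffle patterns within each epoch?
normalizeInputs  = 0;              % see Section 4.1.3
normalizeWeights = 0;              % see Section 4.1.4
randomseed = 42;                   % reproducible weight initialization

% checkType, criterion, missesCriterion, rejectionCriterion, etc. would be
% set here as well; their legal values are given in Sections 3.1.2.1 and 4.1.10.

MainLoop;                          % start training (Section 4.1.5)
```
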


Cengiz Gunay
2000-06-25