Back-propagation learning algorithm

The weights of the BP network are stored in a database, and the back-propagation learning algorithm corrects the error between the network's actual output and its expected output. Hecht-Nielsen et al. proved that a three-layer feedforward network can approximate any nonlinear mapping, so the feedforward network used for defect prediction generally has an m-k-n structure: m neurons in the input layer, k neurons in the hidden layer, and n neurons in the output layer. Here m and n are the numbers of influencing factors and defect types, respectively, and are determined by the actual situation: once the defects to be predicted are fixed, n is determined, and summarizing the main influencing factors of those n defect types determines m. The number of hidden units k can be determined by empirical formulas or by growing-and-pruning principles; parameters such as the number of samples, the learning step size, and the momentum factor can be found in the corresponding literature. For a given defect-prediction problem, the corresponding parameters can be obtained through analysis, for example when establishing a neural network model for diagnosing defects in ductile iron castings.
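One commonly cited empirical rule for the hidden-layer size is k = sqrt(m + n) + a, with a constant a between 1 and 10. The source does not say which formula it uses, so the short MATLAB sketch below, with assumed values for m, n, and a, is purely illustrative.

    % Illustrative only: pick a candidate hidden-layer size k from the
    % empirical rule k = sqrt(m + n) + a, a in [1, 10].
    m = 8;                        % assumed number of influencing factors
    n = 4;                        % assumed number of defect types
    a = 3;                        % assumed additive constant
    k = round(sqrt(m + n) + a);   % candidate number of hidden neurons
    fprintf('m-k-n structure: %d-%d-%d\n', m, k, n);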

Establishment and training of the MATLAB-based BP network. Using a general-purpose high-level language to complete the learning and training of a neural network is relatively complicated and time-consuming; writing the simulation program in the MATLAB language is more convenient, fast, and efficient.
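As a minimal sketch of such a simulation program, assuming the legacy NNbox functions newff, train, and sim, the following builds and trains an m-k-n network; the data matrices P and T and every training parameter are placeholder assumptions, not values from the source.

    % Build and train a BP network with the legacy toolbox API.
    % Each column of P is one sample of m influencing factors; each
    % column of T is the corresponding vector of n defect indicators.
    P = rand(8, 50);                  % assumed: 8 inputs, 50 samples
    T = rand(4, 50);                  % assumed: 4 defect types
    net = newff(minmax(P), [6 4], {'tansig', 'purelin'}, 'traingdm');
    net.trainParam.epochs = 1000;     % maximum training epochs
    net.trainParam.goal   = 1e-3;     % target mean squared error
    net.trainParam.lr     = 0.05;     % learning step size
    net.trainParam.mc     = 0.9;      % momentum factor
    net = train(net, P, T);           % back-propagation training
    Y = sim(net, P);                  % network output on the samples

The traingdm training function is chosen here because it exposes both the learning step size (lr) and the momentum factor (mc) mentioned above.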

Determining the transfer function. NNbox provides multiple transfer functions. For defect prediction, the hidden-layer transfer function is generally the tansig function, which maps neuron inputs from (-∞, +∞) to (-1, +1); it is differentiable and therefore well suited to neurons trained with BP. If the last layer of the BP network uses sigmoid-type neurons, the output range of the entire network is limited; if purelin-type neurons are used, the output of the entire network can take any value, so the output-layer transfer function is generally of purelin type.
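A quick way to see the difference between the two output-layer choices is to evaluate the toolbox's transfer functions directly; the sample inputs below are arbitrary.

    % tansig squashes any real input into (-1, +1); purelin is the
    % identity, so the network output can take any value.
    x = [-100 -1 0 1 100];
    tansig(x)     % approx. [-1.0000 -0.7616 0 0.7616 1.0000]
    purelin(x)    % exactly [-100 -1 0 1 100]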
