DOI: 10.14704/nq.2018.16.6.1588

Training Feedforward Neural Networks Using Social Learning Particle Swarm Optimization: A Case Comparison Study on an Electrical System

Yuansheng Huang, Ying Qiao, Yuelin Gao

Abstract


Training feedforward neural networks (FNNs) is an important and challenging problem in the supervised learning field. During learning, the performance of an FNN depends strongly on parameters such as the connection weights and biases. In this paper, a recently developed meta-heuristic, social learning particle swarm optimization (SLPSO), is applied to search for the optimal combination of connection weights and biases for FNNs used in power load forecasting. In the numerical experiments, a power load forecasting case is employed to verify the effectiveness of SLPSO. The results indicate that SLPSO outperforms six other state-of-the-art intelligent optimization algorithms in both training accuracy and testing accuracy.
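As a minimal sketch of this approach (not the authors' implementation), the Python fragment below flattens an FNN's connection weights and biases into particle vectors and tunes them with a simplified SLPSO-style social-learning update. The 4-8-1 network size, the synthetic load-like data, the swarm size, and the social influence factor are illustrative assumptions, and the per-dimension random coefficients and learning-probability schedule of the original algorithm are omitted for brevity.

import numpy as np

# Minimal sketch (assumptions throughout): encode all FNN weights and biases
# as one flat vector per particle and let an SLPSO-style swarm minimize the
# training mean squared error.

rng = np.random.default_rng(0)

n_in, n_hid, n_out = 4, 8, 1                          # assumed 4-8-1 network
dim = n_in * n_hid + n_hid + n_hid * n_out + n_out    # total weights + biases

def decode(v):
    # Split a flat particle vector into the network's weight matrices and biases.
    i = 0
    W1 = v[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = v[i:i + n_hid];                              i += n_hid
    W2 = v[i:i + n_hid * n_out].reshape(n_hid, n_out); i += n_hid * n_out
    b2 = v[i:i + n_out]
    return W1, b1, W2, b2

def mse(v, X, y):
    # Fitness: training MSE of the FNN defined by particle v.
    W1, b1, W2, b2 = decode(v)
    h = np.tanh(X @ W1 + b1)            # hidden layer with tanh activation
    pred = h @ W2 + b2                  # linear output for load forecasting
    return np.mean((pred - y) ** 2)

# Toy load-like data (illustrative stand-in for the paper's load dataset).
X = rng.uniform(-1, 1, (200, n_in))
y = np.sin(X.sum(axis=1, keepdims=True)) + 0.05 * rng.standard_normal((200, 1))

m = 40                                  # swarm size (assumed)
swarm = rng.uniform(-1, 1, (m, dim))
vel = np.zeros((m, dim))
eps = 0.01                              # social influence factor (assumed)

for it in range(300):
    fit = np.array([mse(p, X, y) for p in swarm])
    order = np.argsort(-fit)            # sort worst first, best last
    swarm, vel = swarm[order], vel[order]
    mean_pos = swarm.mean(axis=0)
    for i in range(m - 1):              # the best particle (index m-1) is kept
        k = rng.integers(i + 1, m)      # imitate a randomly chosen better particle
        r1, r2, r3 = rng.random(3)
        vel[i] = r1 * vel[i] + r2 * (swarm[k] - swarm[i]) + r3 * eps * (mean_pos - swarm[i])
        swarm[i] = swarm[i] + vel[i]

best = swarm[np.argmin([mse(p, X, y) for p in swarm])]
print("final training MSE:", mse(best, X, y))

In the paper's setting, the fitness would instead be the training error on the actual power load data, and the best particle found by the swarm would supply the trained network's connection weights and biases.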

Keywords


Training Feedforward Neural Networks, Social Learning Particle Swarm Optimization, Particle Swarm Optimization, Electrical System


