[1]LIU Wei,GUO Zhiqing,LIU Guangwei,et al.Improved atom search optimization by combining amplitude random compensation and step size evolution mechanism[J].CAAI Transactions on Intelligent Systems,2022,17(3):602-616.[doi:10.11992/tis.202103033]
CAAI Transactions on Intelligent Systems [ISSN 1673-4785/CN 23-1538/TP]
Volume:
17
Issue:
2022, No. 3
Pages:
602-616
Column:
Academic Papers: Fundamentals of Artificial Intelligence
Publication date:
2022-05-05
- Title:
-
Improved atom search optimization by combining amplitude random compensation and step size evolution mechanism
- Author(s):
-
LIU Wei1,2,3; GUO Zhiqing1,2,3; LIU Guangwei4; JIN Bao1,2,3; WANG Dong4
-
1. College of Science, Liaoning Technical University, Fuxin 123000, China;
2. Institute of Intelligent Engineering and Mathematics, Liaoning Technical University, Fuxin 123000, China;
3. Institute of Intelligent Engineering and Mathematics, Liaoning Technical University, Fuxin 123000, China;
4. College of Mines, Liaoning Technical University, Fuxin 123000, China
-
- Keywords:
-
meta-heuristic algorithms; atom search optimization; tent chaos optimization; amplitude random compensation; step size evolution mechanism; parameter optimization of BP neural network; classification; machine learning
- CLC:
-
TP18
- DOI:
-
10.11992/tis.202103033
- Abstract:
-
The atom search optimization (ASO) algorithm suffers from weak optimization accuracy and easily falls into local extrema owing to limitations in population diversity, parameter adaptability, and position dynamics. To overcome these challenges, we propose an improved ASO algorithm (IASO) that integrates chaos optimization, amplitude random compensation, and a step size evolution mechanism, and we successfully apply it to classification tasks. First, the tent chaotic map is introduced to make the distribution of the atomic population in the search space more uniform. Then, an amplitude function is constructed to randomly perturb the algorithm parameters, and a step size evolution factor is added to the atomic position update to enhance the global search ability and convergence of the algorithm. Finally, the improved algorithm is applied to the parameter optimization of the error back-propagation (BP) neural network. Compared with six metaheuristic algorithms on 20 benchmark functions, the experimental results indicate that IASO not only shows good optimization performance in solving multidimensional benchmark functions but also achieves higher classification accuracy than two comparison algorithms when optimizing the BP neural network parameters.
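The tent chaos initialization mentioned in the abstract can be sketched as follows. This is a minimal illustration of the general technique, not the paper's exact formulation: the function name, the map parameter `mu`, and the scaling into search bounds are assumptions, since the article's own equations are not reproduced here.

```python
import numpy as np

def tent_chaos_population(pop_size, dim, lb, ub, mu=0.5, seed=0):
    """Hypothetical helper: initialize a population via the tent chaotic map.

    The tent map  x -> x/mu if x < mu else (1-x)/(1-mu)  generates a
    sequence that covers (0, 1) more uniformly than independent random
    draws, which is the rationale for chaos-based initialization.
    """
    rng = np.random.default_rng(seed)
    # Start each dimension from a random point away from the fixed points 0 and 1.
    x = rng.uniform(0.01, 0.99, size=dim)
    pop = np.empty((pop_size, dim))
    for i in range(pop_size):
        # One tent-map iteration per individual, applied elementwise.
        x = np.where(x < mu, x / mu, (1.0 - x) / (1.0 - mu))
        # Scale the chaotic values from (0, 1) into the search bounds [lb, ub].
        pop[i] = lb + x * (ub - lb)
    return pop

pop = tent_chaos_population(pop_size=30, dim=5, lb=-10.0, ub=10.0)
```

Each row of `pop` is one atom's initial position; the same idea applies to any bounded search space by changing `lb` and `ub`.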