Abstract
Many parameter-updating techniques have been proposed to help the Hopfield neural network (HNN) escape from local minima. In this paper, we propose a new model, the objective-function-adjustment HNN, which performs better than other HNN variants. The proposed algorithm adjusts the multipliers of the energy function in the gradient-ascent direction when the HNN is trapped in a local minimum. A second technique restarts the network. Together, these two techniques help the network escape from local minima. The proposed method was analyzed theoretically and evaluated experimentally through simulations of the traveling salesman problem. Results on several TSPLIB benchmark problems show that the proposed algorithm outperformed other methods and found 100% valid solutions that were optimal or near-optimal.
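The core idea in the abstract can be sketched in code. The following is a minimal illustrative sketch, not the paper's exact algorithm: it replaces the continuous HNN dynamics with a simple discrete bit-flip descent on a penalized TSP energy, raises the penalty multiplier by a gradient-ascent step on the constraint violation whenever the descent stalls at an invalid local minimum, and restarts with a small perturbation. All function names (`energy`, `descend`, `solve`) and the parameter choices are my own assumptions for illustration.

```python
import itertools
import random


def energy(V, D, lam):
    """Penalized tour energy: path length plus lam-weighted constraint violations.

    V is an n x n 0/1 matrix; V[x][i] = 1 means city x occupies tour position i.
    """
    n = len(V)
    length = sum(
        D[x][y] * V[x][i] * V[y][(i + 1) % n]
        for x in range(n) for y in range(n) for i in range(n)
    )
    return length + lam * violation(V)


def violation(V):
    """Total squared constraint violation (zero iff V is a permutation matrix)."""
    n = len(V)
    return (sum((sum(row) - 1) ** 2 for row in V)
            + sum((sum(V[x][i] for x in range(n)) - 1) ** 2 for i in range(n)))


def descend(V, D, lam):
    """Single-bit-flip greedy descent to a local minimum of the energy."""
    n = len(V)
    improved = True
    while improved:
        improved = False
        for x, i in itertools.product(range(n), repeat=2):
            before = energy(V, D, lam)
            V[x][i] ^= 1
            if energy(V, D, lam) < before - 1e-12:
                improved = True          # keep the improving flip
            else:
                V[x][i] ^= 1             # undo the flip
    return V


def solve(D, lam0=1.0, eta=2.0, max_restarts=300, seed=0):
    """Alternate descent with multiplier ascent and perturbed restarts."""
    rng = random.Random(seed)
    n = len(D)
    lam = lam0
    V = [[rng.randint(0, 1) for _ in range(n)] for _ in range(n)]
    for _ in range(max_restarts):
        V = descend(V, D, lam)
        v = violation(V)
        if v == 0:
            return V                     # valid permutation matrix found
        lam += eta * v                   # gradient-ascent step on the multiplier
        # Restart: flip a few random bits to leave the local minimum.
        for _ in range(n):
            x, i = rng.randrange(n), rng.randrange(n)
            V[x][i] ^= 1
    return None
```

The multiplier update `lam += eta * v` is the gradient-ascent step mentioned in the abstract: the violation `v` is the (sub)gradient of the penalized energy with respect to the multiplier, so raising `lam` while the constraints are violated progressively makes the invalid local minimum unattractive.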
Original language | English |
---|---|
Pages (from-to) | 4555-4564 |
Number of pages | 10 |
Journal | International Journal of Innovative Computing, Information and Control |
Volume | 6 |
Issue number | 10 |
State | Published - 2010/10 |
Keywords
- Hopfield neural network
- Lagrange relaxation
- Local minimum
- Restart technique
- Traveling salesman problem
ASJC Scopus subject areas
- Software
- Theoretical Computer Science
- Information Systems
- Computational Theory and Mathematics