Objective function adjustment Hopfield-type neural network with restart technique

Xiaofei Wang*, Hiroki Tamura, Wei Wang, Zheng Tang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

1 Scopus citation

Abstract

Many parameter-updating techniques have been proposed to help the Hopfield neural network (HNN) escape from local minima. In this paper, we propose a new model, the objective function adjustment HNN, which performs better than other HNN variants. The proposed algorithm adjusts the multipliers of the energy function in the gradient ascent direction when the HNN is trapped in a local minimum. In addition, a restart technique is applied. Together, these two techniques help the network escape from local minima. The proposed method was analyzed theoretically and evaluated experimentally through simulations of the traveling salesman problem. The results on several TSPLIB benchmark problems showed that the proposed algorithm outperformed the others and found 100% valid solutions that were optimal or near-optimal.
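The abstract describes two escape mechanisms: raising the energy-function multipliers in the gradient ascent direction when the network is trapped, and restarting the search. The following is a minimal sketch of that idea on a toy discrete Hopfield-style TSP search; the function name `hopfield_tsp`, the trap-detection threshold, and the multiplier update factor are illustrative assumptions, not the paper's actual formulation.

```python
import random
import math

def hopfield_tsp(dist, a=1.0, b=1.0, steps=2000, restarts=5, seed=0):
    """Toy discrete Hopfield-style TSP search sketching the two escape
    techniques from the abstract (all constants here are assumptions):
      1) when trapped in a local minimum, the constraint multiplier `a`
         is increased (a gradient-ascent-style multiplier adjustment);
      2) the search is restarted several times from random states.
    `dist` is an n x n symmetric distance matrix."""
    n = len(dist)
    rng = random.Random(seed)

    def energy(v, a_cur):
        # Constraint term: each row and column of the city/position
        # permutation matrix should sum to exactly 1.
        rows = sum((sum(v[i]) - 1) ** 2 for i in range(n))
        cols = sum((sum(v[i][j] for i in range(n)) - 1) ** 2 for j in range(n))
        # Tour-length term: distance between cities in adjacent positions.
        tour = sum(dist[i][k] * v[i][j] * v[k][(j + 1) % n]
                   for i in range(n) for k in range(n) if i != k
                   for j in range(n))
        return a_cur * (rows + cols) + b * tour

    best_v, best_e = None, math.inf
    for _ in range(restarts):                   # restart technique
        v = [[rng.randint(0, 1) for _ in range(n)] for _ in range(n)]
        a_cur, stuck = a, 0
        for _ in range(steps):
            # Greedy single-neuron flip (discrete Hopfield-style update).
            i, j = rng.randrange(n), rng.randrange(n)
            e0 = energy(v, a_cur)
            v[i][j] ^= 1
            if energy(v, a_cur) >= e0:
                v[i][j] ^= 1                    # reject: no improvement
                stuck += 1
            else:
                stuck = 0
            if stuck > 50:                      # treat as a local minimum
                a_cur *= 1.5                    # raise multiplier (ascent)
                stuck = 0
        e = energy(v, a)
        if e < best_e:
            best_v, best_e = v, e
    return best_v, best_e
```

This sketch only illustrates the control flow (multiplier ascent on stagnation, plus restarts); the paper's actual network uses continuous Hopfield dynamics and a specific multiplier update rule not reproduced here.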

Original language: English
Pages (from-to): 4555-4564
Number of pages: 10
Journal: International Journal of Innovative Computing, Information and Control
Volume: 6
Issue number: 10
State: Published - October 2010

Keywords

  • Hopfield neural network
  • Lagrange relaxation
  • Local minimum
  • Restart technique
  • The traveling salesman problem

ASJC Scopus subject areas

  • Software
  • Theoretical Computer Science
  • Information Systems
  • Computational Theory and Mathematics
