TY - JOUR
T1 - An approximate logic neuron model with a dendritic structure
AU - Ji, Junkai
AU - Gao, Shangce
AU - Cheng, Jiujun
AU - Tang, Zheng
AU - Todo, Yuki
N1 - Publisher Copyright:
© 2015 Elsevier B.V.
PY - 2016/1/15
Y1 - 2016/1/15
N2 - An approximate logic neuron model (ALNM) based on the interaction of dendrites and a dendritic plasticity mechanism is proposed. The model consists of four layers: a synaptic layer, a dendritic layer, a membrane layer, and a soma. ALNM has a neuronal-pruning function that forms a unique dendritic topology for a particular task by screening out useless synapses and unnecessary dendrites during training. In addition, corresponding to the mature dendritic morphology, the trained ALNM can be replaced by an equivalent logic circuit built from the logic NOT, AND, and OR operations, which possesses powerful computing capacity and can be implemented simply in hardware. Since ALNM is a feed-forward model, an error back-propagation algorithm is used to train it. To verify the effectiveness of the proposed model, we apply it to the Iris, Glass, and Cancer datasets. The classification accuracy and convergence speed are analyzed, discussed, and compared with those of a standard back-propagation neural network. Simulation results show that ALNM can serve as an effective pattern classification method: it reduces the number of dataset features through learning without losing any essential information, and the interaction between features can be observed in the dendritic morphology. Moreover, the logic circuit can be used as a single classifier to handle big data accurately and efficiently.
AB - An approximate logic neuron model (ALNM) based on the interaction of dendrites and a dendritic plasticity mechanism is proposed. The model consists of four layers: a synaptic layer, a dendritic layer, a membrane layer, and a soma. ALNM has a neuronal-pruning function that forms a unique dendritic topology for a particular task by screening out useless synapses and unnecessary dendrites during training. In addition, corresponding to the mature dendritic morphology, the trained ALNM can be replaced by an equivalent logic circuit built from the logic NOT, AND, and OR operations, which possesses powerful computing capacity and can be implemented simply in hardware. Since ALNM is a feed-forward model, an error back-propagation algorithm is used to train it. To verify the effectiveness of the proposed model, we apply it to the Iris, Glass, and Cancer datasets. The classification accuracy and convergence speed are analyzed, discussed, and compared with those of a standard back-propagation neural network. Simulation results show that ALNM can serve as an effective pattern classification method: it reduces the number of dataset features through learning without losing any essential information, and the interaction between features can be observed in the dendritic morphology. Moreover, the logic circuit can be used as a single classifier to handle big data accurately and efficiently.
KW - Back propagation
KW - Dendrite
KW - Logic circuit
KW - Pattern classification
KW - Pruning
UR - http://www.scopus.com/inward/record.url?scp=84955138935&partnerID=8YFLogxK
U2 - 10.1016/j.neucom.2015.09.052
DO - 10.1016/j.neucom.2015.09.052
M3 - Journal article
AN - SCOPUS:84955138935
SN - 0925-2312
VL - 173
SP - 1775
EP - 1783
JO - Neurocomputing
JF - Neurocomputing
ER -