
Dendrite Morphological Neural Networks

Trained by Differential Evolution


Fernando Arce1, Erik Zamora2, Humberto Sossa1, Ricardo Barrón1
Instituto Politécnico Nacional
CIC1, UPIITA2, Mexico City
ezamorag@ipn.mx

Grants:
SIP-IPN [20160945, 20161116]
CONACYT [155014, 65]

Goal

To develop a theoretical and practical framework for morphological neurons

Classical Perceptrons

Monolayer

Multilayer

Dendrite Morphological Neuron (DMN)

Decision Boundary for DMN
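One common formulation (following Ritter & Urcid [4] and Sossa & Guevara [8]) represents each dendrite as a hyperbox: the dendrite response is the minimum, over all input dimensions, of the distances to the lower and upper bounds, so it is positive only inside the box, and the class of the most active dendrite wins. The Python sketch below illustrates this decision rule; the function names (`dendrite_activation`, `dmn_predict`) and the data layout are illustrative assumptions, not the notation used in the slides.

```python
import numpy as np

def dendrite_activation(x, w_min, w_max):
    """Morphological dendrite response: positive only inside the hyperbox
    [w_min, w_max]; computed with min operations rather than weighted sums."""
    return np.min(np.minimum(x - w_min, w_max - x))

def dmn_predict(x, boxes, labels):
    """Predict the class whose most active dendrite (hyperbox) wins.
    `boxes` is a list of (w_min, w_max) pairs; `labels` gives each box's class."""
    activations = [dendrite_activation(x, lo, hi) for lo, hi in boxes]
    return labels[int(np.argmax(activations))]

# Toy usage: two 2-D hyperboxes, one per class
boxes = [(np.array([0.0, 0.0]), np.array([1.0, 1.0])),
         (np.array([2.0, 2.0]), np.array([3.0, 3.0]))]
labels = [0, 1]
print(dmn_predict(np.array([0.4, 0.7]), boxes, labels))  # -> 0
```

In this formulation the decision boundary is piecewise linear, determined by where the strongest dendrite responses of different classes tie.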

Motivation

Elimination method, Ritter et al. 2003

Motivation

Sossa & Guevara, 2014

Motivation: Typical Cost Functions for Morphological Neurons

These cost functions are non-differentiable, non-continuous, and have many local minima.

DMN-DE: Training

https://github.com/FernandoArce/DE_for_DMNN
Fitness function

HBd initialization: K-means++ initialization.
Mutation: select one parent out of every ten and permute 30% of the k rows in order to change the class order.
Crossover: not reported, since the experimental results were similar with and without it (a sketch of the training loop follows below).
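A minimal sketch of how these ingredients could fit together is shown below, assuming each individual encodes the hyperbox matrix HBd (one row of bounds per dendrite), that k-means++ centroids seed the initial boxes, and that the fitness is the classification error of the decoded DMN. The helper names (`fitness`, `mutate_rows`, `evolve`) and the simple survivor selection are illustrative assumptions, not the authors' exact differential evolution scheme; see the repository linked above for the actual implementation.

```python
import numpy as np

def fitness(hbd, X, y, predict):
    """Classification error of the DMN decoded from the hyperbox matrix `hbd`."""
    preds = np.array([predict(x, hbd) for x in X])
    return np.mean(preds != y)

def mutate_rows(hbd, frac=0.3, rng=np.random):
    """Permute roughly `frac` of the k rows of a parent to change the class order."""
    child = hbd.copy()
    k = hbd.shape[0]
    idx = rng.choice(k, size=max(2, int(frac * k)), replace=False)
    child[idx] = child[rng.permutation(idx)]
    return child

def evolve(population, X, y, predict, generations=100, rng=np.random):
    """Evolution loop: from each group of ten parents, pick one at random,
    mutate it, and keep the child only if it lowers the error (no crossover)."""
    for _ in range(generations):
        for start in range(0, len(population), 10):
            group = list(range(start, min(start + 10, len(population))))
            j = int(rng.choice(group))
            child = mutate_rows(population[j], rng=rng)
            if fitness(child, X, y, predict) < fitness(population[j], X, y, predict):
                population[j] = child
    return min(population, key=lambda h: fitness(h, X, y, predict))
```

Because the fitness is evaluated directly on classification error, the loop needs no gradients, which is what makes an evolutionary search attractive for the non-differentiable cost functions mentioned earlier.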

DMN-DE: Training

https://github.com/FernandoArce/DE_for_DMNN

DMN-DE: Training

DMN-DE: Comparison with Other DMN Training Methods

DMN-DE: Comparison with Other Machine Learning Methods

Conclusions and Future Work

DMN-DE achieved the smallest classification error, and it required far fewer dendrites than the training methods based solely on heuristics.
K-means++ and HBd methods can be useful to initialize dendrite parameters before optimizing them.
DMN-DE showed competitive performance compared with the MLP, SVM, and RBN classifiers.

The algorithm is time-consuming. For this reason, future work will implement it on a Graphics Processing Unit (GPU).

References
[1] Ritter, G. X., Sussner, P., Aug 1996. An introduction to morphological neural networks. In: Pattern Recognition, 1996, Proceedings of the 13th International Conference on, Vol. 4, pp. 709-717.
[2] Sussner, P., Sep 1998. Morphological perceptron learning. In: Intelligent Control (ISIC), 1998, Held jointly with IEEE International Symposium on Computational Intelligence in Robotics and Automation (CIRA), Intelligent Systems and Semiotics (ISAS), Proceedings, pp. 477-482.
[3] Pessoa, L. F., Maragos, P., 2000. Neural networks with hybrid morphological/rank/linear nodes: a unifying framework with applications to handwritten character recognition. Pattern Recognition 33 (6), 945-960.
[4] Ritter, G. X., Urcid, G., Mar 2003. Lattice algebra approach to single neuron computation. IEEE Transactions on Neural Networks 14 (2), 282-295.
[5] Barmpoutis, A., Ritter, G. X., 2006. Orthonormal basis lattice neural networks. In: Fuzzy Systems, 2006 IEEE International Conference on, pp. 331-336.
[6] Ritter, G. X., Urcid, G., Juan-Carlos, V. N., July 2014. Two lattice metrics dendritic computing for pattern recognition. In: Fuzzy Systems (FUZZ-IEEE), 2014 IEEE International Conference on, pp. 45-52.
[7] Sussner, P., Esmi, E. L., 2011. Morphological perceptrons with competitive learning: lattice-theoretical framework and constructive learning algorithm. Information Sciences 181 (10), 1929-1950, Special Issue on Information Engineering Applications Based on Lattices.
[8] Sossa, H., Guevara, E., 2014. Efficient training for dendrite morphological neural networks. Neurocomputing 131, 132-142.
[9] Ritter, G. X., Urcid, G., 2007. Computational Intelligence Based on Lattice Theory. Springer Berlin Heidelberg, Berlin, Heidelberg, Ch. Learning in Lattice Neural Networks that Employ Dendritic Computing, pp. 25-44.
[10] Zamora, E., Sossa, H., 2017. Regularized divide and conquer training for dendrite morphological neurons. Mecatrónica y Robótica de Servicio: Teoría y Aplicaciones.
[11] Zamora, E., Sossa, H., Dec 2016. Dendrite morphological neurons trained by stochastic gradient descent. IEEE Symposium Series on Computational Intelligence, Greece.
[12] Arce, F., Zamora, E., Barrón, R., Sossa, H., Dec 2016. Dendrite morphological neurons trained by differential evolution. IEEE Symposium Series on Computational Intelligence, Greece.
[13] Zamora, E., Sossa, H. Dendrite morphological neurons trained by stochastic gradient descent (submitted to Neurocomputing).

Thanks

Comments or Questions?
Erik Zamora

ezamora1981@gmail.com
