Marius Crisan
Department of Computer and Software Engineering, Polytechnic University of Timisoara
marius.crisan@cs.upt.ro
w_{t+1} = k_1 + k_2 (l_{1t} w_t + l_{2t} w_t^2 + \dots + l_{nt} w_t^n),   (2)

where k_1 and k_2 are scaling parameters.
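To make the dynamics of map (2) concrete, the iteration can be sketched in a few lines of Python. The values chosen for k_1, k_2, the letter coefficients l_i, and the initial condition are illustrative assumptions, not values taken from the paper:

```python
# One iteration of the word map (2): w_{t+1} = k1 + k2 * sum_i l_i * w_t^i.
# k1, k2 and the letter coefficients below are illustrative placeholders.

def word_map_step(w, k1, k2, letter_coeffs):
    """Apply Eq. (2) once; letter_coeffs[i] is l_{i+1}, the coefficient of w^(i+1)."""
    poly = sum(l * w ** (i + 1) for i, l in enumerate(letter_coeffs))
    return k1 + k2 * poly

def iterate(w0, k1, k2, letter_coeffs, steps):
    """Return the trajectory [w_0, w_1, ..., w_steps]."""
    traj = [w0]
    w = w0
    for _ in range(steps):
        w = word_map_step(w, k1, k2, letter_coeffs)
        traj.append(w)
    return traj

# Hypothetical three-letter word with coefficients l_1..l_3.
traj = iterate(w0=0.1, k1=0.05, k2=0.9,
               letter_coeffs=[0.4, -0.7, 0.3], steps=50)
```

Depending on the chosen parameters, the trajectory settles on a fixed point, a cycle, or a chaotic attractor; the settled pattern is what the model treats as the word's representation.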
w_{d,t+1} = k_1 + k_2 (d_t w_{d,t} + r_t w_{d,t}^2 + i_t w_{d,t}^3 + v_t w_{d,t}^4 + e_t w_{d,t}^5 + s_t w_{d,t}^6),   (11)

w_{c,t+1} = k_1 + k_2 (c_t w_{c,t} + a_t w_{c,t}^2 + r_t w_{c,t}^3),   (12)

w_{r,t+1} = k_1 + k_2 (r_t w_{r,t} + e_t w_{r,t}^2 + d_t w_{r,t}^3).   (13)
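Equations (11)-(13) are instances of the general map (2) in which the coefficient of w^i is the value attached to the word's i-th letter. A minimal Python sketch of this construction follows; the letter values and parameters are illustrative placeholders (the paper's letter coefficients are time-varying), and the words "drives", "car", and "red" are inferred from the coefficient letters in (11)-(13):

```python
# Word maps built from per-letter coefficients, as in Eqs. (11)-(13):
# the i-th letter of a word supplies the coefficient of w^i.
# All numeric values below are illustrative assumptions.

K1, K2 = 0.05, 0.9

# Hypothetical letter values (held constant here for simplicity).
letter_value = {'a': 0.21, 'c': 0.33, 'd': 0.14, 'e': 0.25,
                'i': 0.39, 'r': 0.48, 's': 0.59, 'v': 0.62}

def word_step(w, word):
    """One iteration of the word map: the i-th letter scales w^i."""
    poly = sum(letter_value[ch] * w ** (i + 1) for i, ch in enumerate(word))
    return K1 + K2 * poly

# Iterate the maps for "drives" (11), "car" (12), and "red" (13) in parallel.
w = {word: 0.1 for word in ("drives", "car", "red")}
for _ in range(100):
    w = {word: word_step(val, word) for word, val in w.items()}
```

With these placeholder values each trajectory remains bounded, so every word settles into its own attractor pattern.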
Similarly, the chaotic attractors for the letters c, d, e, i, and v can be constructed in the same way.

Figure 6. Chaotic attractor that captures the meaning of the entire sentence (4).
Of course, the variety of letter attractors is not very rich in our examples, because we used only simple quadratic maps. The resulting word patterns are therefore somewhat less intricate, but they are suggestive enough to illustrate the principles of the proposed model. In future work, more sophisticated word attractors can be constructed using, for instance, ordinary differential equations.

5. Conclusions