
FORMAL LINGUISTICS AND THE DEDUCTIVE GRAMMAR

Yingxu Wang, Prof., PhD, P.Eng, FWIF, SMIEEE, MACM
International Center for Cognitive Informatics (ICfCI)
Theoretical and Empirical Software Engineering Research Centre
Dept. of Electrical and Computer Engineering, Schulich School of Engineering, University of Calgary
2500 University Drive, NW, Calgary, Alberta, Canada T2N 1N4
Tel: (403) 220 6141, Fax: (403) 282 6855, Email: yingxu@ucalgary.ca

Abstract
This paper presents a comparative study on fundamental theories of natural and artificial languages by contrasting their morphologies, syntaxes, semantics, and grammars. Formal syntaxes and semantics of natural languages are analyzed. The abstract syntaxes of English and their formal manipulations are described. A universal language processing model and the deductive grammar of English are developed toward the formalization of the universal grammar proposed in linguistics. Comparative analyses of natural and programming languages, as well as the linguistic perception on software engineering, are discussed. A wide range of applications of the deductive grammar of English have been explored in language acquisition, comprehension, generation, and processing in intelligent systems and cognitive informatics.
Keywords: Cognitive informatics, comparative linguistics, formal languages, the universal grammar, deductive grammar, syntax, semantics, EBNF, RTPA, programming

1. INTRODUCTION

Linguistics is the discipline that studies human or natural languages. Language is an oral and/or written symbolic system for thought, self-expression, and communication. Lewis Thomas highlighted that "the gift of language is the single human trait that marks us all genetically, setting us apart from the rest of life" [Thomas, 1974]. This is because the functions of languages can be identified as those for memory, instruction, communication, modeling, thought, reasoning, problem-solving, prediction, and planning [Pattee, 1986; Casti and Karlqvist, 1986].

Linguists commonly agree that there is a universal language structure, or the universal grammar [Chomsky, 1956/57/59/62/65/82; Pattee, 1986; O'Grady and Archibald, 2000]. However, the grammar may be precise and explicit as in formal languages, or ambiguous and implied as in natural languages. Although a language string is symbolically constructed and read sequentially, all natural languages have the so-called metalinguistic ability to reference themselves out of the sequences, that is, the ability to construct strings that refer to other strings in the language.

From a linguistic point of view, software science is the application of information technologies in communicating between a variety of stakeholders in computing, such as professionals/customers, architects/software engineers, programmers/computers, and computing systems/application environments. Therefore, linguistics and formal language theories play important roles in computing theories; without them, computing and software engineering theories would not be complete. It is noteworthy that, historically, language-centered programming had been the dominant methodology in computing and software engineering. However, this should not be taken for granted as the only approach to software engineering, because the expressive power of programming languages is inadequate to deal with complicated software systems. In addition, the level of abstraction and the extent of rigorousness of programming languages are too low in modeling the architectures and behaviors of software systems. This is why bridges in mechanical engineering or buildings in civil engineering are not modeled or described by natural or artificial languages. This observation leads to the recognition of the need for mathematical modeling of both software system architectures and static/dynamic behaviors, supplemented with the support of automatic code generation systems.

This paper comparatively studies fundamental theories of natural and artificial languages. Formal languages and applications of mathematics in computational linguistics are investigated. This paper analyzes not only how linguistics may improve the understanding of programming languages and their work products - software, but also how formal language theories extend the study of natural languages. In the remainder of this paper, formal syntaxes and semantics of natural languages are explored, and the abstract syntax of English and its formal manipulations are described in Section 2, which results in a universal language processing model. Based on formal language theories, a deductive grammar of English is developed in Section 3 toward the formalization of the universal grammar proposed by Chomsky [Chomsky, 1957/65]. Comparative analyses of natural and programming languages, as well as the linguistic perceptions on software engineering, are discussed in Section 4.

Proc. 6th IEEE Int. Conf. on Cognitive Informatics (ICCI'07), D. Zhang, Y. Wang, and W. Kinsner (Eds.), 1-4244-1327-3/07/$25.00, 2007 IEEE

2. FORMAL SYNTAXES AND SEMANTICS OF NATURAL LANGUAGES

Syntaxes deal with relations and combinational rules of words in sentences. Semantics embody the meanings of words and sentences. This section presents a formal treatment of syntaxes and semantics of natural languages, which forms a foundation of the universal language processing model and the deductive grammar of English.

2.1 Formal Syntactic Models of Natural Languages

Theorem 1. Syntaxes of natural languages, Syn, are two-dimensionally describable and are recursive. However, the semantics of languages implied by the sequential syntaxes can be more complicated, i.e., nonsequential and multi-dimensional in most cases, such as in the branch, parallel, embedded, concurrent, interleaved, and interrupt structures as shown in Table 1.

Table 1. Semantic Relations of Sentences

No.  Relation     Connectors
1    Sequential   and, then
2    Branch       or
3    Parallel     and, simultaneously (action by the same subjects)
4    Embedded     that, which, if, whether
5    Concurrent   and, simultaneously (action by different subjects)
6    Interleave   while, during
7    Interrupt    when
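As an illustration of how Table 1 might be used mechanically, the sketch below (not part of the paper) maps connector words found in a sentence to their candidate semantic relations. The mapping is deliberately one-to-many because connectors such as "and" are ambiguous among the sequential, parallel, and concurrent relations:

```python
# Candidate connector-to-relation lookup based on Table 1 (illustrative).
SEMANTIC_RELATIONS = {
    "and": ["sequential", "parallel", "concurrent"],
    "then": ["sequential"],
    "or": ["branch"],
    "that": ["embedded"],
    "which": ["embedded"],
    "if": ["embedded"],
    "whether": ["embedded"],
    "while": ["interleave"],
    "during": ["interleave"],
    "when": ["interrupt"],
}

def candidate_relations(sentence: str) -> dict:
    """Return, for each connector found in the sentence, the semantic
    relations from Table 1 that the connector may denote."""
    words = [w.strip(".,;!?").lower() for w in sentence.split()]
    return {w: SEMANTIC_RELATIONS[w] for w in words if w in SEMANTIC_RELATIONS}

print(candidate_relations("He read the report and then left while she waited."))
```

Resolving which candidate actually applies requires semantic analysis, e.g., whether the clauses share the same subject (parallel) or different subjects (concurrent), as noted in Table 1.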

Much of our understanding of the syntactic rules of languages has come from linguists, who have studied and elicited the common rules that underlie natural languages. One of the most influential linguistic frameworks, known as the theory of universal grammar (UG), was proposed by Noam Chomsky [Chomsky, 1957/65]. UG and its modern version, the Government and Binding Theory [Chomsky, 1982], have become a linguistic premise of grammatical analyses in linguistics.

Theorem 2. The semantic relations of sentences, ℛ, are a set of seven connectors as given in Table 1, i.e.:

ℛ = {sequential, branch, parallel, embedded, concurrent, interleave, interrupt}    (2)
Definition 1. A syntax is a domain of linguistics that studies sentence formation and structures.
Definition 2. An abstract syntax is the abstract description of a syntax system where concrete strings of tokens and their grammatical relations are represented and analyzed symbolically.

In Theorem 2, the seven semantic relations are adopted as a subset of the 17 process relations as defined in Real-Time Process Algebra (RTPA) [Wang, 2002, 2007a].

Definition 3. Syntactic elements S in natural languages can be classified into the categories of lexical (L), functional (F), phrasal (P), and relational (ℛ), i.e.:

S ≜ (L, F, P, ℛ)
  = {N, V, A, Λ, P}
  ∥ {τ, δ, α, χ, γ, ¬}
  ∥ {NP, VP, AP, ΛP, PP, CP}
  ∥ {sequential, branch, parallel, embedded, concurrent, interleave, interrupt}    (3)

Linguistic studies are accustomed to the convention of hierarchical tree schemas for denoting sentence structures in syntactic analyses. In a syntactic perspective, any human language, natural or artificial, is a sequential or one-dimensional (1-D) symbol stream of syntactic blocks, which can be decomposed into paragraphs, sentences, phrases, words, and letters from the top down. Although the syntax of a language is 1-D, its grammar is recursively structured in a 2-D space. For instance, the abstract productions A → (a | Aa | B) and B → b can be denoted as the following tree:

        A
      / |  \
     a  Aa  B
            |
            b        (1)
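The 1-D/2-D distinction can be made concrete: the recursive productions above derive a growing set of 1-D terminal strings. A minimal Python sketch (illustrative, not from the paper) enumerates the strings derivable from A within a bounded number of expansions:

```python
# Enumerate terminal strings derivable from the productions
#   A -> a | Aa | B   and   B -> b
# by expanding A recursively up to a depth bound.

def derive_A(depth: int) -> set:
    """All terminal strings derivable from A within `depth` expansions."""
    if depth == 0:
        return set()
    results = {"a", "b"}                 # A -> a, and A -> B -> b
    for s in derive_A(depth - 1):        # A -> Aa appends an 'a'
        results.add(s + "a")
    return results

print(sorted(derive_A(3)))  # ['a', 'aa', 'aaa', 'b', 'ba', 'baa']
```

The recursion in the code mirrors the recursion in the 2-D grammar, while each derived string remains a flat 1-D symbol stream.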

A summary of the definitions of the syntactic elements of languages in Eq. 3 is provided in Table 2, where an element in square brackets is optional. In Table 2, there is a special category of lexical components known as complement phrases (CPs). CPs can be a supplemental part of N/NP, V/VP, A/AP, or P/PP. The rules defining the relations between CPs and the other lexical categories of sentences may be found in O'Grady and Archibald (2000).


2.2 Formal Means for Syntactic and Semantic Analyses

A Backus-Naur form is a recursive notation for describing the productions of a context-free grammar. It was developed based on the work of John Backus with contributions by Peter Naur [Naur, 1963, 1978].

Definition 4. A Backus-Naur Form (BNF) is defined by a 5-tuple:

BNF ≜ (Σ, T, V, P, S)    (4)

where
(i) Σ is a finite nonempty set of alphabet;
(ii) T is a finite set of terminals, T ⊆ Σ;
(iii) V is a finite set of nonterminals, V ⊆ Σ;
(iv) P is a finite set of production rules denoted by α ::= β;
(v) S is a finite set of metasymbols that denote relations of the multiple derived products β, separated by the alternative selection |.

For example, the BNF counterparts of Eq. 1 can be recursively denoted by:

A ::= a | Aa | B
B ::= b    (5)
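The language defined by Eq. 5 can be recognized mechanically. Note that the production A ::= Aa is left-recursive, so a naive recursive-descent parser would not terminate; the illustrative sketch below (not code from the paper) rewrites the recursion as iteration, since the derivable strings are exactly an 'a' or 'b' followed by zero or more 'a's:

```python
# Recognizer for the grammar of Eq. 5:  A ::= a | Aa | B,  B ::= b.
# The left-recursive rule A ::= Aa is rewritten as iteration, giving
# a linear-time check equivalent to the pattern (a|b)a*.

def accepts(s: str) -> bool:
    """True iff s is derivable from A, i.e., s matches (a|b)a*."""
    if not s or s[0] not in "ab":
        return False
    return all(c == "a" for c in s[1:])

for s in ["a", "b", "baa", "ab", ""]:
    print(s, accepts(s))
```

Eliminating left recursion in this way is the standard transformation applied before building recursive-descent parsers for BNF grammars.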

Table 2. Definition of Lexical and Syntactic Categories of Languages

Category          Subcategory            Symbol  Description / Examples
Lexical word      Noun                   N       Entities and abstract objects
                  Verb                   V       Actions, states, and possessions
                  Adjective              A       Properties of a noun
                  Adverb                 Λ       Properties of a verb
                  Preposition            P       Designated relations in space or time
Functional word   Determiner             τ       the, a, this, these, etc.
                  Degree word            δ       too, so, very, more, quite, etc.
                  Qualifier              α       almost, always, often, perhaps, never, etc.
                  Auxiliary              χ       will, can, may, must, should, could, etc.
                  Conjunction            γ       and, or, that, which, if, whether, etc.
                  Negative               ¬       not
Phrase            Noun phrase            NP      [τ] [AP] N [PP], etc. (a syntactic unit with one or more words as a lexical category)
                  Verb phrase            VP      [χ] [¬] V [NP], etc.
                  Adjective phrase       AP      [δ] A
                  Adverb phrase          ΛP      [δ] Λ
                  Prepositional phrase   PP      [ΛP] P [NP]
                  Complement phrase      CP      Supplemental part of N/NP, V/VP, A/AP, or P/PP
Relational        Sequential                     and, then (a set of connectors)
                  Branch                         or
                  Parallel                       and, simultaneously (action by the same subjects)
                  Embedded                       that, which, if, whether
                  Concurrent                     and, simultaneously (action by different subjects)
                  Interleave                     while, during
                  Interrupt                      when


BNF has been found useful for defining the context-free grammars of programming languages because of its simple notation, highly recursive structures, and the available support of many compiler generation tools such as YACC [Johnson, 1975], LEX [Lesk, 1975], and ANTLR [Parr, 2000].

It has been realized in applications that the descriptive power of BNF may be greatly improved by introducing a few extended metasymbols, particularly those for the repetitive and optional structures of grammar rules. A variety of extended BNFs have been proposed for grammar description and analysis [Wirth, 1976]. A typical EBNF is given below.
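The mapping from these extended metasymbols to parsing code is direct: a repetition becomes a loop and an option becomes a conditional. The Python sketch below (illustrative only; the rule and token shapes are assumptions, not from the paper) recognizes the EBNF rule list ::= item { "," item } [ ";" ]:

```python
# Illustrative hand-written parser for the EBNF rule
#   list ::= item { "," item } [ ";" ]
# The repetition { x } becomes a while-loop; the option [ x ]
# becomes a conditional.

def parse_list(tokens: list) -> list:
    """Parse a comma-separated list of items with an optional trailing ';'."""
    pos = 0
    items = []

    def expect_item():
        nonlocal pos
        if pos >= len(tokens) or tokens[pos] in (",", ";"):
            raise SyntaxError("item expected")
        items.append(tokens[pos])
        pos += 1

    expect_item()
    while pos < len(tokens) and tokens[pos] == ",":   # { "," item }
        pos += 1
        expect_item()
    if pos < len(tokens) and tokens[pos] == ";":      # [ ";" ]
        pos += 1
    if pos != len(tokens):
        raise SyntaxError("unexpected trailing input")
    return items

print(parse_list(["x", ",", "y", ",", "z", ";"]))  # ['x', 'y', 'z']
```

The same rule written in pure BNF would need an auxiliary recursive nonterminal for the repetition, which is exactly the notational burden the extended metasymbols remove.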

Definition 5. An extended Backus-Naur Form (EBNF) is defined by a similar 5-tuple as given in Eq. 4, i.e.:

EBNF ≜ (Σ, T, V, P, S')    (6)

with an extended set of metasymbols S', where:

(i) The metasymbols { and } are adopted to denote the repetitive structures of derived products, which can be described by the big-R notation [Wang, 2002, 2006c], i.e.:

{β} ≜ R[i=1..n] β_i    (7)

where the big-R notation denotes the repetitive concatenation of the derived product β for i = 1 to n, i.e.:

R[i=1..n] β_i = β_1 β_2 ... β_n    (8)

(ii) The metasymbol [ ] is adopted to denote optional structures of a derived product.

Definition 6. Semantics is a domain of linguistics that studies the interpretation of words and sentences, and the analysis of their meanings.

Semantics deals with how the meaning of a sentence in a language is obtained and comprehended. Studies on semantics explore the mechanisms of language understanding and the nature of meaning, where syntactic structures play an important role in the interpretation of sentences and in the intension and extension of word meanings [Tarski, 1944; Chomsky, 1956/57/59/62/65/82].

Theorem 3. The mathematical model of the semantics of natural languages, Sem, is a 5-tuple, i.e.:

Sem ≜ (J, B, O, T, S)    (9)

where
* J is the subject of the sentence;
* B is a behavior or action;
* O is the object of the sentence;
* T is the time when the action is occurring;
* S is the space where the action is occurring.

According to Theorem 1 (Syn) and Theorem 3 (Sem), the relationship between a language and its syntaxes and semantics can be illustrated by the Universal Language Processing (ULP) model as shown in Fig. 1. Fig. 1 explains that linguistic analyses are a deductive process that maps the 1-D language into the 5-D semantics via the 2-D syntactic analyses [Wang, 2007a].

Fig. 1 The Universal Language Processing (ULP) model: L (1-D) → Syn (2-D) → Sem (5-D) = (J, B, O, T, S)

As given in Table 1, the semantic relations of sentences ℛ are a set of important connectors, which formally model phrase and sentence compositions and their joint meanings in complex sentences on the basis of RTPA [Wang, 2002].

Semantic analysis and comprehension are a deductive cognition process. According to the Object-Attribute-Relation (OAR) model [Wang, 2003/2007b/2007c], the semantics of a sentence may be considered to have been understood when: a) the logical relations of the parts of the sentence are clarified; and b) all parts of the sentence are reduced to terminal entities, which are either a real-world image or a primitive abstract concept. The theoretical foundations of language cognition and comprehension can be found in [Wang, 2006a, 2006b, 2007a].

Syntactic and semantic analyses in linguistics rely on a set of explicitly described rules known as the grammar of a language. Therefore, contemporary linguistic analyses focus on the study of grammars, which is centered in language acquisition, understanding, and interpretation.

3. THE DEDUCTIVE GRAMMAR TOWARD THE FORMALIZATION OF UG


Definition 7. The grammar of a language is a set of common rules that integrates the phonetics, phonology, morphology, syntax, and semantics of the language. The grammar governs the articulation, perception, and patterning of speech sounds, the formation of phrases and sentences, and the interpretation of utterances.

3.1 Properties of Grammars

O'Grady and Archibald (2000) identified five basic properties of grammars as follows:

* Generality - All languages have a grammar;
* Parity - All grammars are equivalent in terms of their expressive capacity;
* Universality - Grammars are commonly alike, or basic principles and properties are shared in all languages;
* Mutability - Grammars of all languages are constantly changing over time;
* Inaccessibility - Grammatical knowledge of the mother tongue is built at the subconscious layer of the brain.

The above basic properties of grammars form an important part of the foundations of human intelligence. The most interesting property of grammars of natural languages is their expressive parity, as formalized below.

Theorem 4. All grammars of natural languages are equivalent.

Based on Theorem 4, it is perceived that, in computing and software engineering, all programming languages are equivalent. In other words, no language may claim a primitive status over others as long as they implement the 17 meta processes and 17 process relations as identified in RTPA [Wang, 2002], which form the essential set of fundamental operations in computing [Wang, 2007a].

An important discovery in modern linguistics is the existence of the universal grammar among human languages.

Definition 8. The Universal Grammar (UG) is a system of categories, mechanisms, and constraints shared by all human languages.

UG is perceived as innate based on recent neurolinguistic and psycholinguistic studies [Chomsky, 1982; O'Grady and Archibald, 2000; Wang, 2007a]. UG treats all languages with the same generic type of syntactic mechanisms, which include the merge and transformation operations. The former is a syntactic operation that combines words in accordance with their syntactic categories and properties, while the latter is a syntactic operation that puts words and phrases into an appropriate structure.

3.2 The Deductive Grammar of English

An instance of UG is the English grammar. Formal language theories in computing science and software engineering perceive that the grammar of any programming language or professional notation system may be rigorously defined by the EBNF notation [Naur, 1963/78]. The author found that formal language theory can be extended to describe and analyze the grammars of natural languages such as that of English.

Definition 9. A deductive grammar is an abstract grammar that formally denotes the syntactic rules of a language, based on which, as a generic formula, valid language sentences can be deductively derived.

On the basis of the definitions of the syntactic elements as given in Table 2, the English grammar can be formally described in EBNF, known as the Deductive Grammar of English (DGE) [Wang, 2007a]. A rigorous definition of DGE at the sentence level is given in Fig. 2. Some aspects of DGE are simplified at the bottom level, particularly on the person rules of nouns, the time rules of verbs, and the matching of nouns and verbs in sentences.

S ::= [Subject] Predicate
Subject ::= NP
Predicate ::= VP
Object ::= NP
NP ::= [τ] [AP] N [PP] | NP γ NP
VP ::= [χ] [¬] V [Object]* | [ΛP] V [¬] [Object]* | VP γ VP
AP ::= [δ] A
ΛP ::= [δ] Λ
PP ::= [ΛP] P [NP]
N ::= <nouns>
V ::= <be> | <do> | <have>
P ::= <prepositions>
A ::= <adjectives>
Λ ::= <adverbs>
τ ::= <determiner words>
δ ::= <degree words>
α ::= <qualifier words>
χ ::= <auxiliary words>
γ ::= <conjunction words> | <,> | <;>

Fig. 2 The Deductive Grammar of English (DGE)
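To suggest how DGE-style productions could drive a parser, the sketch below implements a recognizer for a deliberately simplified DGE-like fragment (S ::= NP VP; NP ::= [det] adj* noun; VP ::= [aux] [not] verb [NP]). The tiny word lists are hypothetical stand-ins for the lexical categories of Table 2, and the fragment omits most of Fig. 2:

```python
# Recognizer for a simplified DGE-like fragment (illustrative only;
# not the full grammar of Fig. 2). Word lists are hypothetical.

DET  = {"the", "a"}
ADJ  = {"new", "unregistered", "expected"}
NOUN = {"student", "handbook", "teacher"}
AUX  = {"will", "can"}
VERB = {"get", "reads"}

def parse_np(words, pos):
    """NP ::= [det] adj* noun; return position after the NP, or None."""
    if pos < len(words) and words[pos] in DET:
        pos += 1
    while pos < len(words) and words[pos] in ADJ:
        pos += 1
    if pos < len(words) and words[pos] in NOUN:
        return pos + 1
    return None

def parse_vp(words, pos):
    """VP ::= [aux] [not] verb [NP]; return end position, or None."""
    if pos < len(words) and words[pos] in AUX:
        pos += 1
    if pos < len(words) and words[pos] == "not":
        pos += 1
    if pos < len(words) and words[pos] in VERB:
        pos += 1
        end = parse_np(words, pos)      # optional object NP
        return end if end is not None else pos
    return None

def accepts_sentence(sentence: str) -> bool:
    """S ::= NP VP, consuming the whole word stream."""
    words = sentence.lower().strip(".").split()
    pos = parse_np(words, 0)
    if pos is None:
        return False
    return parse_vp(words, pos) == len(words)

print(accepts_sentence(
    "The unregistered new student will not get the expected handbook."))
```

Each optional element of the production becomes a conditional and the adjective repetition becomes a loop, mirroring the EBNF metasymbols [ ] and { } directly in control flow.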


According to DGE, the schema of the most complicated sentence in English, which consists of all possible and legal syntactic components of DGE, is shown in Fig. 3. The generic schema of DGE can be used as a universal formula to deductively derive any sentence in English. For example, the shortest possible sentence is given in Example 1 in Fig. 3, while the longest possible sentence is presented in Example 3, i.e.:

"The unregistered new student all in the class [and another phrase] will not get the expected comprehensive handbook directly from the teacher [or another sentence]."

The above example provided in Fig. 3 is an instance that uses almost all possible syntactic components. Obviously, natural sentences in practical usage are always a subset of the DGE schema. Therefore, they are rather simple and short, as shown in the first two examples in Fig. 3.

The 1-D structured sentences as shown in Fig. 3 can be modeled in a 2-D graphical form as shown in Fig. 4. Observing Figs. 2 through 4, it is noteworthy that the syntactic structure of the DGE schema is highly recursive. The recursive characteristics in Fig. 4 are repetitively represented by the noun phrases (NP) and verb phrases (VP).

Note: Words in Sentence 3 of Fig. 3 are defined as follows: a - The, b - unregistered, c - new, d - student, e - all, f - in, g - the, h - class, i - and, j - ..., k - will, l - not, m - get, n - the, o - expected, p - comprehensive, q - handbook, r - directly, s - from, t - the, u - teacher, v - or, w - another sentence.

Fig. 3 The schema of a generic sentence based on DGE

Fig. 4 The syntax structure of the generic sentence schema in DGE

4. COMPARATIVE ANALYSIS OF NATURAL AND PROGRAMMING LANGUAGE THEORIES

Theorem 5. The tradeoff between syntaxes and semantics states that, in the DGE system, the complexities of the syntactic rules (or grammar), C_syn, and of the semantic rules, C_sem, are inversely proportional, i.e.:

C_sem ∝ 1 / C_syn    (10)

Theorem 5 indicates that the simpler the syntactic rules, the richer or more complicated the semantics, and vice versa. According to Theorem 5, because UG or DGE as defined in Fig. 2 is relatively simple, its semantics are much richer, more complicated, and more ambiguous. On the contrary, because programming languages adopt very detailed and complicated grammars, their semantics are relatively concise, simple, and rigorous.


The fundamental elements of natural languages can be classified as shown in Table 3 [Wang, 2005]. Observing Table 3, it can be seen that although natural languages can be rich, complex, and powerfully descriptive, they share only three common basic structures: 'to be,' 'to have,' and 'to do.'
Table 3. Fundamental Elements in Natural Languages

No.  Category         Notation  Function                                Example
1    To be            |=        Identify objects and attributes;        A |= B (A is B)
                                describe relations
2    To have          |⊂        Describe possessions                    A |⊂ B (A has B)
3    To do            |>        Describe status and behaviors           A |> B (A does B)
4    Indirect to do   |>>       Describe indirect status and behaviors  A |>> B |> C (A has B to do C)
5    Not              ¬         Describe negative facts                 A ¬|= B (A is not B); A ¬|⊂ B (A has not B); A ¬|> B (A does not B)

Table 4. Comparative Analysis of Natural and Programming Language Theories

No.  Category             Natural languages                              Programming languages
1    Phonetics            Complex                                        N/A
2    Phonology            Complex                                        N/A
3    Morphology (lexis)   Very large (> 60,000 words)                    Small (< 1,000 instructions; < 100 reserved words)
4    Syntax               Simple (e.g., Fig. 2)                          Very complicated (> 1,000 rules)
5    Semantics            Very complex (5-D, Fig. 1), context sensitive  Simple (2-D), context free
6    Applications         Thought, communications                        Computing, system control
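The three basic structures of Table 3 lend themselves to a simple machine representation. The following sketch (an illustration, not the paper's notation) encodes elementary statements as subject-relation-object triples with optional negation:

```python
# Encode the basic structures of Table 3 - to be, to have, to do,
# and their negations - as (subject, relation, object) triples.

from dataclasses import dataclass

@dataclass(frozen=True)
class Statement:
    subject: str
    relation: str        # "is" (to be) | "has" (to have) | "does" (to do)
    obj: str
    negated: bool = False

    def render(self) -> str:
        """Render the triple back into an elementary English clause."""
        if self.negated:
            verb = {"is": "is not", "has": "has no", "does": "does not"}[self.relation]
        else:
            verb = self.relation
        return f"{self.subject} {verb} {self.obj}"

print(Statement("A", "is", "B").render())                  # A is B
print(Statement("A", "does", "B", negated=True).render())  # A does not B
```

That three relations plus negation suffice for elementary statements is precisely the observation the table formalizes; richer sentences are compositions of such triples via the connectors of Table 1.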

The formal models of UG and DGE provide linguists, particularly language analyzers, implementers, and recognizers, with a powerful tool to formally describe and process natural language documents. Prospective applications of DGE may be found in the development of Internet search engines, the semantic analysis of natural languages, speech recognition, and intelligent systems for natural language parsing and word processing.

The findings of the comparative analysis of programming and natural languages are summarized in Table 4 [Wang, 2007a]. Intuitively, it would be expected that a programming language is a small subset of natural languages. Surprisingly, this hypothesis is only partially true at the morphology (lexicon) and semantic levels. However, the syntax of programming languages is far more complicated than that of natural languages.

It can also be seen in Table 4 that the semantics of programming languages is much simpler than that of natural languages, which is determined by the basic objectives of applications that should be suitable for limited machine intelligence. However, to achieve such simple and precise semantics in programming languages, a set of very complex and rigorous syntactic and grammatical rules has to be adopted.

It is noteworthy that a natural language is usually context sensitive. However, almost all programming languages, whether at machine level or higher levels, are supposed to be context free. Therefore, it is interesting to query whether a real-world problem and its solution(s), in a context-dependent manner, can be described by a context-free programming language without losing any information. Automata and compiler theories [Aho et al., 1985; Lewis and Papadimitriou, 1998] indicate that a context-sensitive language may be transformed into a corresponding context-free language. But the costs of doing so are really expensive, because the context cannot be freely removed. A common trick is to hide (imply) the context of software in data objects and intermediate data structures in programming. However, the drawbacks of this convention, or the limitations of conventional compiling technologies, make programming hard and complicated, because the computational behaviors and their data objects are separated or incoherent in the language's descriptive power. This is an indication that a more natural and context-dependent programming language and related compiling technology are yet to be sought. We may consider that abstract data types (ADTs) and object-oriented programming technologies are context-dependent, because the context (in the form of a set of data objects) has been encapsulated into a set of individual classes and the whole class hierarchy of a software system.
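The closing observation about ADTs can be made concrete: when the context lives inside a class, each behavior is interpreted against the data it belongs to, rather than against context hidden in separate intermediate structures. A minimal Python sketch (illustrative; the Account example is an assumption, not from the paper):

```python
# An ADT whose behaviors carry their own context: the balance is
# encapsulated in the class, so deposit/withdraw are always interpreted
# against the data they depend on.

class Account:
    def __init__(self, balance: int = 0):
        self._balance = balance          # the context, encapsulated

    def deposit(self, amount: int) -> None:
        self._balance += amount

    def withdraw(self, amount: int) -> None:
        if amount > self._balance:       # behavior checked in context
            raise ValueError("insufficient funds")
        self._balance -= amount

    @property
    def balance(self) -> int:
        return self._balance

acct = Account(100)
acct.deposit(50)
acct.withdraw(30)
print(acct.balance)  # 120
```

Whether a withdraw is legal cannot be decided from the call alone; it depends on the encapsulated state, which is the sense in which the class hierarchy carries the context of the software.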
More generally, it is recognized that there is no clear cut between syntax and semantics in both natural and programming languages, as stated in Theorem 5. In other words, syntactic and semantic rules are equivalent and interchangeable in linguistics. A simple syntax will require a complex semantics, while a complex syntax will result in a simple semantics.

5. CONCLUSIONS

This paper has comparatively analyzed fundamental theories and formal models of natural and artificial languages. It has explained not only how linguistics may improve the understanding of programming languages and their work products - software, but also how formal language theories extend the study of natural languages. The findings on the features of natural and programming languages in morphology, syntax, semantics, and grammar have been formally described, which resulted in the development of the Universal Language Processing (ULP) model and the Deductive Grammar of English (DGE) as a formalized model of the universal grammar. Applications of DGE have been identified in language acquisition, comprehension, generation, and processing in intelligent systems and cognitive informatics.

ACKNOWLEDGEMENTS

The author would like to acknowledge the Natural Science and Engineering Council of Canada (NSERC) for its partial support to this work. The author would like to thank the reviewers and colleagues for their valuable comments and suggestions on this work.

REFERENCES

[1] Aho, A.V., R. Sethi, and J.D. Ullman (1985), Compilers: Principles, Techniques, and Tools, Addison-Wesley, New York.
[2] Casti, J.L. and A. Karlqvist eds. (1986), Complexity, Language, and Life: Mathematical Approaches, International Institute for Applied Systems Analysis, Laxenburg, Austria.
[3] Chomsky, N. (1956), Three Models for the Description of Language, I.R.E. Transactions on Information Theory, Vol.2, No.3, pp.113-124.
[4] Chomsky, N. (1957), Syntactic Structures, Mouton, The Hague.
[5] Chomsky, N. (1959), On Certain Formal Properties of Grammars, Information and Control, Vol.2, pp.137-167.
[6] Chomsky, N. (1962), Context-Free Grammar and Pushdown Storage, Quarterly Progress Report, MIT Research Laboratory, No.65, pp.187-194.
[7] Chomsky, N. (1965), Aspects of the Theory of Syntax, MIT Press, Cambridge, MA.
[8] Chomsky, N. (1982), Some Concepts and Consequences of the Theory of Government and Binding, MIT Press, Cambridge, MA.
[9] Johnson, S.C. (1975), Yacc - Yet Another Compiler Compiler, AT&T Bell Laboratories, Computing Science Technical Report No.32, Murray Hill, NJ.
[10] Lesk, M.E. (1975), Lex - A Lexical Analyzer Generator, AT&T Bell Laboratories, Computing Science Technical Report No.39, Murray Hill, NJ.
[11] Lewis, H.R. and C.H. Papadimitriou (1998), Elements of the Theory of Computation, 2nd ed., Prentice-Hall International, Englewood Cliffs, NJ.
[12] Naur, P. ed. (1963), Revised Report on the Algorithmic Language ALGOL 60, Communications of the ACM, Vol.6, No.1, pp.1-17.
[13] Naur, P. (1978), The European Side of the Last Phase of the Development of ALGOL, ACM SIGPLAN Notices, Vol.13, pp.15-44.
[14] O'Grady, W. and J. Archibald (2000), Contemporary Linguistic Analysis: An Introduction, 4th ed., Pearson Education Canada Inc., Toronto.
[15] Parr, T. (2000), ANTLR Reference Manual, http://www.antlr.org/.
[16] Pattee, H.H. (1986), Universal Principles of Measurement and Language Functions in Evolving Systems, in J.L. Casti and A. Karlqvist eds., Complexity, Language, and Life: Mathematical Approaches, Springer-Verlag, Berlin, pp.268-281.
[17] Tarski, A. (1944), The Semantic Conception of Truth, Philosophy and Phenomenological Research, Vol.4, pp.13-47.
[18] Thomas, L. (1974), The Lives of a Cell: Notes of a Biology Watcher, Viking Press, New York.
[19] Wang, Y. (2002), The Real-Time Process Algebra (RTPA), Annals of Software Engineering, Vol.14, pp.235-274.
[20] Wang, Y. (2003), On Cognitive Informatics, Brain and Mind: A Transdisciplinary Journal of Neuroscience and Neurophilosophy, Vol.4, No.2, pp.151-167.
[21] Wang, Y. (2006a), Cognitive Informatics - Towards the Future Generation Computers that Think and Feel, Keynote speech, Proc. 5th IEEE International Conference on Cognitive Informatics (ICCI'06), IEEE CS Press, Beijing, China, July, pp.3-7.
[22] Wang, Y. (2006b), On Concept Algebra and Knowledge Representation, Proc. 5th IEEE International Conference on Cognitive Informatics (ICCI'06), IEEE CS Press, Beijing, China, July, pp.320-331.
[23] Wang, Y. (2006c), On the Big-R Notation for Describing Iterative and Recursive Behaviors, Proc. 5th IEEE International Conference on Cognitive Informatics (ICCI'06), IEEE CS Press, Beijing, China, July, pp.132-140.
[24] Wang, Y. (2007a), Software Engineering Foundations: A Software Science Perspective, CRC Book Series in Software Engineering, Vol. II, CRC Press, USA, July, 1480pp.
[25] Wang, Y. (2007b), The Theoretical Framework of Cognitive Informatics, The International Journal of Cognitive Informatics and Natural Intelligence (IJCINI), 1(1), Jan., pp.1-27.
[26] Wang, Y. (2007c), The OAR Model of Neural Informatics for Internal Knowledge Representation in the Brain, The International Journal of Cognitive Informatics and Natural Intelligence (IJCINI), 1(3), July, pp.64-75.
[27] Wirth, N. (1976), Algorithms + Data Structures = Programs, Prentice Hall, Englewood Cliffs, NJ.
