
HOWARD LASNIK AND JOSEPH J. KUPIN

A RESTRICTIVE THEORY OF TRANSFORMATIONAL GRAMMAR*

A set-theoretic formalization of a transformational theory in the spirit of Chomsky's LSLT is presented. The theory differs from Chomsky's, and more markedly from most current theories, in the extent to which restrictions are imposed on descriptive power. Many well-justified and linguistically significant limitations on structural description and structural change are embodied in the present formalization. We give particular attention to the constructs Phrase Marker and Transformational Cycle, providing modifications which offer increases in both simplicity and explanatory power.

1. Introduction
This is a paper on grammatical formalism. However, it differs in purpose
from two important papers on grammatical formalism: Peters and Ritchie (1973)
and Ginsburg and Partee (1969). Each of these papers presented a formalism that
would be wide enough in scope to permit most then countenanced syntactic
theories to be represented. In effect, these papers were presenting a precise
scientific language for syntactic theories to make use of. This is not our purpose.
In this paper we are attempting to present a particular theory of syntax in a precise
way. Many of the operations describable within other theories cannot be expressed
within this theory. However, the converse does not appear to be true. In this way
our theory is very restrictive. The class of formal objects that can be transformational rules is more narrow than that in any theory we know of. It is vastly more narrow than that in the above-mentioned formalizations, particularly with
respect to allowable structural descriptions.
There are several reasons for attempting to construct a very restrictive
theory. The first is, simply, that the "best" theory is the most falsifiable theory
(Popper (1959)). This means that in the absence of strong evidence falsifying
a particular linguistic theory, if that theory predicts the occurrence of fewer
grammar-like formal objects than another theory, the former must be preferred
to the latter. The first theory is making claims that are easier to prove false, and

* We wish to acknowledge the helpful suggestions of Noam Chomsky, Janet Dean Fodor, and Helmut Schnelle.

0301-4428/77/0402-0002 $ 2.00
Copyright by Walter de Gruyter & Co.

as long as those claims are not falsified, it is a better theory. Appropriate counter-
evidence would consist of well-documented, highly productive phenomena which
cannot be accounted for within the theory. Even such counterevidence should
not lead to the abandonment of all restrictions, however, but to the search for a
well-motivated minimal "enrichment" of the theory to allow description of the
phenomenon. The more restrictive theory should not be abandoned until that
minimally more powerful theory is found.
The second reason for positing a restrictive theory confronts the question
of language acquisition. We follow Chomsky (1965) in the belief that children
acquire their grammar from an environment that seriously underdetermines it, and
that some evaluation metric is employed to select the appropriate grammar for any
particular language. Certainly if the class of possible grammars is smaller, the
evaluation task becomes simpler. By restricting the class of allowable grammars,
we thus approach an explanation of how language can be acquired.
There is a second, less important, difference between this paper and the two
works mentioned above. Each of these treated equally all of the following phe-
nomena, discussed in Emonds (1970): root transformations, cyclic rules, minor
movement rules and agreement (feature copying) rules. Here we will further the
general program of Chomsky (1972) distinguishing rules by their formal and
functional characteristics and positing as many grammatical components as neces-
sary to account for the formal constraints. As Emonds suggests, the above classes
of phenomena are formally distinct, and we feel that each should be assigned to a
different component of the grammar. In this paper we will be concerned only with
the cyclic transformational phenomena. We will attempt to present the most
restrictive theory that has any hope of accounting for these phenomena.
Section two of this paper explains the definitions and constructs that are
needed for our analysis. Section three presents a detailed example from English
in which the definitions are used to construct a derivation, and our concluding
comments are in section four.

2. Definitions and Constructs


2.0. Vocabulary
This analysis makes use of two universally defined vocabularies. The first consists of the non-terminals, N, and the terminals, Σ, of the base component. The non-terminals are ordered pairs. The first element of each pair is an integer between zero and three that represents the number of "bars." We assume an "X-bar" system of roughly the sort presented in Chomsky (1972) and Jackendoff (1976). The second element is a set of syntactic features drawn from a finite universal repertoire.¹

¹ By the notation {±A₁, ±A₂, ..., ±Aₙ} we mean a collection of sets in which each set contains +Aᵢ or −Aᵢ (but not both) for each i, 1 ≤ i ≤ n. That is, {±A₁, ±A₂, ...} =def {{+A₁, +A₂, ...}, {+A₁, −A₂, ...}, {−A₁, +A₂, ...}, {−A₁, −A₂, ...}, ...}.


(1) Formally: A ∈ N if A = (i, 𝒮) where 0 ≤ i ≤ 3 and 𝒮 ⊆ {± noun, ± verb, ...}

The terminals consist of feature matrices in some formal phonological system, as for example Chomsky & Halle (1968). The terminals also include two distinguished symbols: Δ and t. We assume, as is conventional, that there are a finite number of terminals in any particular language.
The structure of N and Σ is not the main concern of this paper. Throughout the rest of the paper they will be represented according to the conventions presented in Chomsky (1959), which are as follows:
(2) a, b, c, ...    single terminals (elements of Σ)
    ..., x, y, z    strings of terminals (elements of Σ*)
    A, B, C, ...    single non-terminals (elements of N)
    ..., X, Y, Z    strings of non-terminals (elements of N*)
    α, β, γ, ...    single symbols (elements of Σ ∪ N)
    ..., φ, χ, ψ    strings of symbols (elements of (Σ ∪ N)*)
To this we add the script capitals 𝒜, ℬ, 𝒞, etc. to represent arbitrary sets (ordered or unordered). The context will make it clear what the sets consist of. We shall also use NP, VP, adj., etc. as convenient shorthand for whatever turns out to be the appropriate X-bar representation of these commonly used notions. T and T′ will stand for arbitrary transformations. 0 represents the null string.
In section 2.1, we will use this first vocabulary to define the class of formal objects upon which transformations will work. However, for the description of transformations themselves, a second vocabulary is needed. The 'structural description' portion of a transformation needs access not to specific non-terminals, but to natural classes of such non-terminals. We must define a vocabulary Vn, based upon N, to provide these natural classes. Vn is composed in the following way:
(3) Vn = {(i, 𝒮′) | 𝒮′ ⊆ 𝒮 and (i, 𝒮) ∈ N}
Vn will be called the set of non-terminal classes.² The "structural change" portion of a transformation requires a vocabulary consisting of numerals, elements from Σ, and certain operators such as "/" (substitution), "+r" (right adjunction) and "+l" (left adjunction). The elements in this set will be described in section 2.2. We will use f to stand for an arbitrary structural change.

² Vn is not a non-terminal vocabulary as defined in Chomsky (1959), p. 129, "axiom #2: A ∈ Vn iff there are φ, ψ, ω such that ..." Vn as we use it here is the closest analog in the transformational component to the set defined by that axiom, which is appropriate for a base component. We will extend the conventions of Chomsky (1959) to Vn as if it were a non-terminal vocabulary.


2.1. Reduced Phrase Markers

The theory of grammar we are formalizing follows Chomsky (1955) in many


respects. There transformations are conceived of as mappings on phrase markers.
These phrase markers are meant to capture all and only the is a relationships of the
terminal string. In these descriptions we will use a notation that is a closer
approximation to that goal. Reduced phrase markers (RPM's) are closely related
to full phrase markers but are less redundantly specified and contain slightly less
extraneous information. It will be recalled from Chomsky (1955, 1956) that the
phrase marker for a sentence S is the set of strings occurring as lines in any of
the equivalent phrase structure derivations of S. Such a formal object has many,
but not all, of the characteristics normally associated with phrase structure trees.
To describe the relationship between phrase markers and RPM's we introduce the following notation (see fn. 9 below).
(4) φ is a monostring with respect to the sets Σ and N if φ ∈ Σ*NΣ*
Intuitively, a reduced phrase marker can be thought of as the subset of a phrase
marker which contains the terminal string of the phrase marker and all and only
the monostrings of the phrase marker. We note that in addition to the terminal
string, there is one and only one element in the reduced phrase marker for each
distinct non-terminal in the phrase marker. One qualification of this general
principle will be discussed below.
We will not have much more to say about the relationship between phrase
markers and RPM's. Well-formedness conditions for RPM's are not phrased in
terms of reductions of well-formed phrase markers, but rather in terms of certain
relationships that must exist within pairs of elements in RPM's. These relationships
embody certain useful linguistic notions, namely "is a", "dominates", and
"precedes". In other theories, these relationships have been defined on the nodes
of a tree, each node representing a particular occurrence of a non-terminal. In our
theory, these relationships are defined on monostrings in a set of strings, each
monostring representing a particular occurrence of a non-terminal.
We will use monostrings as identifiers for particular non-terminals under discussion in the following way. By comparing a monostring in some set of strings with a string of terminals in that set, we can immediately ascertain what portion of the terminal string bears the is a relationship to the non-terminal in the monostring. Our predicate is a* incorporates this intuitive algorithm. Our precedes predicate will be true of a monostring φ and a string ψ in some set of strings 𝒫, just in case the non-terminals of φ and ψ have the obvious relationship to portions of a string of terminals in 𝒫. Our dominates predicate has a similarly extended meaning.
In definitions (5)-(8) let φ = xAz be a monostring, and let φ, ψ ∈ 𝒫.
(5) y is a* A in 𝒫 if xyz ∈ 𝒫.
(6) φ dominates ψ in 𝒫 if ψ = xχz, χ ≠ A, χ ≠ 0.


(7) φ precedes ψ in 𝒫 if y is a* A in 𝒫, and ψ = xyχ, χ ≠ z.
(8) φ is cyclic if A is a cyclic non-terminal.³
Using these predicates, which are defined on arbitrary sets of strings, we can define the notion Reduced Phrase Marker. We require that every pair of elements of an RPM satisfy dominates or precedes. We also require that every RPM have at least two elements: a single non-terminal, and a string of terminals.
(9) 𝒫 is an RPM if there exist A and z such that
    A ∈ 𝒫 and z ∈ 𝒫; and if {φ, ψ} ⊆ 𝒫, φ ≠ ψ, then
    either φ dominates ψ in 𝒫
    or ψ dominates φ in 𝒫
    or φ precedes ψ in 𝒫
    or ψ precedes φ in 𝒫.
This definition guarantees that RPM's will have all of the following four
necessary properties.4
A) The RPM provides a consistent analysis of the terminal string (in the sense of Chomsky (1955)). This requires that no two non-terminals "partially overlap" in their terminal expansion. For example, {S, abcde, aBde, abDe} cannot be an RPM since B dominates bc, and D dominates cd. Here the partial overlap is c.
B) The terminal string and the terminal portions of each monostring must
"agree". Obviously {S, abc, dBe} cannot be allowed as an RPM since two of
the elements do not agree about what terminals the sentence consists of.
C) An RPM can only be the counterpart of a "rooted" tree and never of a
"forest". In {aB, Ab, ab} there is no non-terminal that dominates the entire
terminal string, so it cannot be an RPM.
D) Every element of an RPM (except a single non-null string of terminals)
must be a monostring. {S, AB, ab} is not an RPM since AB cannot be related
to ab. Only monostrings can precede or dominate other strings and neither AB nor
ab is a monostring.
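These four properties can be checked mechanically. The following sketch encodes definitions (4)-(9) under simplifying assumptions of ours: each non-terminal is a single uppercase letter, each terminal a single lowercase letter, and the feature-based vocabularies of section 2.0 are abstracted away. The renderings of (6) and (7) follow the readings given above.

```python
def split_monostring(s):
    """Split a monostring (definition (4)) into (x, A, z); None if s is not one."""
    nts = [i for i, c in enumerate(s) if c.isupper()]
    if len(nts) != 1:
        return None
    i = nts[0]
    return s[:i], s[i], s[i + 1:]

def dominates(phi, psi, P):
    """Definition (6): phi = xAz dominates psi if psi = x chi z, chi != A, chi != null."""
    x, A, z = split_monostring(phi)
    if not (psi.startswith(x) and psi.endswith(z)):
        return False
    if len(psi) <= len(x) + len(z):           # chi would be null
        return False
    chi = psi[len(x):len(psi) - len(z)]
    return chi != A

def precedes(phi, psi, P):
    """Definition (7): some terminal y is a* A (i.e. xyz in P) and psi = xy chi, chi != z."""
    x, A, z = split_monostring(phi)
    for s in P:                               # find candidate yields y of A
        if not (s.startswith(x) and s.endswith(z) and len(s) >= len(x) + len(z)):
            continue
        y = s[len(x):len(s) - len(z)]
        if any(c.isupper() for c in y):
            continue
        if psi.startswith(x + y) and psi[len(x + y):] != z:
            return True
    return False

def is_rpm(P):
    """Definition (9), including the two required elements."""
    if not any(len(s) == 1 and s.isupper() for s in P):
        return False                          # no single non-terminal (cf. property C)
    if not any(s and not any(c.isupper() for c in s) for s in P):
        return False                          # no terminal string
    for phi in P:
        for psi in P:
            if phi == psi:
                continue
            if not any(split_monostring(a) and
                       (dominates(a, b, P) or precedes(a, b, P))
                       for a, b in ((phi, psi), (psi, phi))):
                return False
    return True

assert is_rpm({"S", "Ab", "Cb", "aB", "ab"})          # RPM (11) below
assert not is_rpm({"S", "abcde", "aBde", "abDe"})     # property A fails
assert not is_rpm({"S", "abc", "dBe"})                # property B fails
assert not is_rpm({"aB", "Ab", "ab"})                 # property C fails
assert not is_rpm({"S", "AB", "ab"})                  # property D fails
```

The four counterexamples A)-D) above are exactly the sets rejected by the check.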
Just as there is a sub-tree associated with each node of a derivation tree,
there is a sub-RPM associated with each element of an RPM. The subP function
provides a way of referring to these "embedded" RPM's. In definition (10) let 𝒫 be an RPM such that xAz ∈ 𝒫.
(10) subP(xAz, 𝒫) = {φ such that xφz ∈ 𝒫}
Note that, as with phrase markers, it is not always possible to "reconstruct"
the ancestry of the terminal string or "tree diagram" associated with some reduced
³ We assume that S and NP are the only cyclic non-terminals. However, amending this list would cause no difficulties in the formalism.
⁴ Since these four properties taken together are both necessary and sufficient, an alternative definition of RPM might make these four properties primitive, and in this case the properties we definitionally assign to RPM's would follow as major theorems.

Brought to you by | Brown University Rockefeller Library


Authenticated | 128.148.252.35
Download Date | 9/5/12 5:34 AM
178 Howard Lasnik and Joseph J. Kupin

phrase marker.5 Trees (12) and (13) (among others discussed below) would both
be associated with RPM (11).
(11) {S, Ab, Cb, aB, ab}
(12) [S [A [C a]] [B b]]

(13) [S [C [A a]] [B b]]
For this special case of domination, we have constructed the dominates
predicate so that Ab dominates Cb and Cb dominates Ab. We could have defined
the predicate so that neither of the two was true, but no definition could make
one of them true and the other false.
The choice of this representation, then, constitutes an empirical claim about
human language. All grammars in this theory will necessarily treat (12) and (13)
identically since they have identical representations, namely, (11).
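The identity of representation can be verified directly: computing the monostrings of trees (12) and (13) yields the same set, (11). This is a sketch under representational assumptions of ours (trees as nested tuples, leaves as terminal strings):

```python
# Compute the RPM associated with a tree: its terminal yield plus one
# monostring per node, formed from the node label and its terminal context.

def tyield(t):
    return t if isinstance(t, str) else "".join(tyield(k) for k in t[1:])

def monostrings(t, left="", right=""):
    if isinstance(t, str):
        return set()
    label, kids = t[0], t[1:]
    out = {left + label + right}
    ys = [tyield(k) for k in kids]
    for i, k in enumerate(kids):
        out |= monostrings(k, left + "".join(ys[:i]), "".join(ys[i + 1:]) + right)
    return out

def rpm(t):
    return monostrings(t) | {tyield(t)}

t12 = ("S", ("A", ("C", "a")), ("B", "b"))   # tree (12)
t13 = ("S", ("C", ("A", "a")), ("B", "b"))   # tree (13)
assert rpm(t12) == rpm(t13) == {"S", "Ab", "Cb", "aB", "ab"}   # = (11)
```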
A second consequence of this choice of representation is "pruning" of the
strongest possible sort. Both the following trees, and many others besides, would
be associated with (11).
(14), (15) [tree diagrams omitted: each contains a repeated non-terminal along a single path (in (14), the terminal a is a C twice), and both are associated with (11)]
As we noted above, an RPM is essentially a collection of is a statements.


An is a statement concerns only the relationship between a portion of the terminal
⁵ We will say a tree is associated with an RPM if the RPM is the maximal subset of the phrase marker related to that tree. For a discussion of the relationship between trees and phrase markers, see Chomsky (1955).


string and a non-terminal. In that view there is no point in saying a particular


occurrence of the terminal a is a C twice (as (14) apparently does).⁶ Formally, we note:
(16) {S, Cb, Ab, Cb, aB, ab} = {S, Ab, Cb, aB, aB, ab} = (11)
It is important to note that in principle a base component could distinguish be-
tween (14) and (15). Thus, we are making the claim that a transformational com-
ponent does not require access to all of the information inherent in a base com-
ponent.
There is at least one difference between the trees that can be distinctly
represented with phrase markers and with reduced phrase markers. Phrase markers
can distinguish between (17) and (18) even though they would be associated with
the same reduced phrase marker; namely, (11).

(17), (18) [tree diagrams omitted: the two trees differ only in that one contains a node D dominating no terminal material between a and b, corresponding to the element aDb]

No reduced phrase marker can have both aDb and ab in it since neither
precedes nor dominates the other. Assuming that in the base every non-terminal
introduces a terminal, this difference in descriptive power is only relevant under
one particular definition of deletion. Our definition, which obviates this difference,
is given below.

2.2. Definition of Transformation

In this section, we describe in formal terms the notion transformation.


A transformational component is a set of such transformations. The child's task
in learning the aspect of his language that concerns us here is to discover which
transformations constitute the transformational component of the target language.
All of the definitions and all of the principles of application described below are
assumed to be part of general linguistic theory, i.e., to be biologically based.
(19) (A₁ ... Aₙ, f) is a transformation iff

⁶ In this theory, pruning thus becomes a non-issue, since the repeated nodes never exist to be pruned. There is never a conversion to more tree-like objects, so the issue never comes up. Thus, the effects of pruning, if indeed there are any, are unavoidable.


a. n ∈ {2, 3} and f ∈ {(i/j), (i+r j), (i+l j)}
   where 1 ≤ i ≤ n, 1 ≤ j ≤ n, i ≠ j.
   i is called the source index,
   j is called the goal index.
or a′. n ∈ {1, 2} and f ∈ {(0/j), (b/j)}
   where b ∈ I and 1 ≤ j ≤ n.
   j is called the goal index.
   There is no source index.
(i/j) indicates a movement transformation;⁷ (i+l j), left (Chomsky-)adjunction; (i+r j), right (Chomsky-)adjunction; (0/j), deletion; and (b/j), insertion of a specified terminal.⁸
As will be formally spelled out in section 2.3, transformations are interpreted in the following way. The indices in f represent subscripts in A₁ ... Aₙ, and each Aᵢ is associated with a sub-RPM of the RPM to which the transformation is being applied. For example, (A₁A₂A₃, (3/1)) indicates the movement of the sub-RPM given index 3 into the position now occupied by the sub-RPM given index 1. This is therefore a preposing transformation. The sub-RPM with index 2 is required to be present in the RPM but is not altered in the transformational mapping. Such a sub-RPM is called a catalyst. Note that a transformation need not have a catalyst.
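The shape restrictions of (19) are simple enough to state as a predicate. In this sketch the operation codes are a representational choice of ours, not the paper's notation; the insertable set I follows footnote 8 for English:

```python
# Definition (19), sketched: a transformation is a pair (X, f) with X a tuple
# of non-terminal classes (here just strings) and f an operation code:
# ("move", i, j), ("adjL", i, j), ("adjR", i, j), ("del", j), ("ins", b, j).

def is_transformation(X, f, insertable=frozenset({"DO", "THERE"})):
    n = len(X)
    kind = f[0]
    if kind in ("move", "adjL", "adjR"):      # (19a): source and goal indices
        i, j = f[1], f[2]
        return n in (2, 3) and 1 <= i <= n and 1 <= j <= n and i != j
    if kind == "del":                          # (19a'): goal index only
        return n in (1, 2) and 1 <= f[1] <= n
    if kind == "ins":
        b, j = f[1], f[2]
        return n in (1, 2) and b in insertable and 1 <= j <= n
    return False

assert is_transformation(("H", "D"), ("move", 1, 2))               # like (29) below
assert is_transformation(("NP", "NP"), ("del", 2))
assert not is_transformation(("A", "B", "C", "D"), ("move", 4, 1)) # n > 3
assert not is_transformation(("A", "B"), ("move", 1, 1))           # i = j
```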
There are several weighty restrictions that are captured in this formalization.
The major differences between this formalization and current theories are detailed
below.
A) There are no explicit variables. Instead, there are implicit variables between consecutive elements in A₁ ... Aₙ (as we will see in the discussion of the mapping function). The presence of these implicit variables means no transformation can specify that two elements must be adjacent.
B) Each T specifies one string condition, without the facility for Boolean
combinations of string conditions, as in Ginsburg and Partee (1969), or quanti-

⁷ It seems that, in general, movement is restricted to cases where source and goal have identical specifications in the transformation. For example, NP movement is into an NP position. This is one version of the structure preserving hypothesis, cf. Emonds (1970). This could be captured in our formalism by stipulating that if f = (i/j) then Aᵢ = Aⱼ. Since there
are a number of unresolved issues pertaining to movement, we will not pursue this question
here.
⁸ I is the language-specific set of "insertable elements". In English, I apparently includes DO (for do-support) and THERE (for there-insertion). We follow Chomsky (1976a) in the view that transformations do not insert lexical material. Note that the lexically inserted homophones to the DO and THERE of I are of different syntactic categories. Lexical DO is a
main verb (while do-support DO is an auxiliary) and lexical THERE is an adverb (while there-
insertion THERE is an NP).


ficational combinations. Also excluded are string conditions consisting of strings


of Boolean statements, as in Peters and Ritchie (1973). One effect of this is that
no dominance relations in the phrase marker can be stated within a T. For example,
no provision is made in the definition of transformation for specifying an operation
that involves only "clause-mates" or that applies to a particular NP only if that NP
is immediately dominated by S (e.g. subject raising a la Postal (1974)). Another
commonly assumed type of structural condition imposes multiple requirements on
portions of the terminal string; for example, that some portion be simultaneously
NP and NP S. This type of condition also is unavailable in the present framework.
C) There can be, at most, one catalyst in a transformation. This severely restricts
the statement of "environments" of transformational alterations.
D) Only non-terminals can be indexed in T rules. This prevents T rules from
deleting particular lexical elements, and also their being "keyed" by the presence
of particular terminals, for example, forcing subject-aux. inversion by the presence
of a terminal Q marker in the front of the sentence.
E) Since every index represents a single element of the RPM, and since any-
thing affected is represented by a single index, it follows that a transformation
will be an operation on constituents. For example, only a constituent can be moved
or deleted.
F) Transformations are not marked optional or obligatory. The certainty of
application of a transformation is decided by general principles to be described in
the definition of derivation.
G) There can be at most two affected constituents. For example, one stateable
generalization is that "an NP moves leftwards replacing another NP (leaving
behind a trace)". The following is unstateable: "An NP moves leftwards replacing
another NP (leaving behind a trace); additionally, be + en is inserted before the
main verb."
One last comment we might make is that this theory entails that there are
only a finite number of transformations. This certainly would seem to improve
the prospects for explaining learnability.

2.3. The Mapping Function

This section describes the function that induces a mapping on reduced


phrase markers effecting the transformation represented by the (X, f) pair described
above. Implicit understanding of this function is assumed to be part of the language
acquisition mechanism, while specific transformations are presumably learned.
There are two major parts to the description of the mapping function. First
to be presented is a formalization of the notion structural description. Following this,
the second part of the mapping function, embodying the notion structural change,
will be presented.


2.3.1. The Structural Description Function


This function isolates a set of elements in a reduced phrase marker. This
set represents an appropriate analysis of the RPM for a given transformation and
cyclic domain within the RPM. The function therefore requires three arguments:
& (the RPM under consideration), T (the transformation to be applied) and
(a cyclic element of & that represents the cyclic domain).
The structural description is a partial function, since not every (T, α, 𝒫) triple is in its domain. Only triples in which T can apply to 𝒫 in cycle α will evaluate to a set of monostrings.
In definitions (20)-(27), let T = (A₁ ... Aₙ, f), and let 𝒫 be an RPM and α a cyclic monostring such that α ∈ 𝒫. Let Ψ = (x₁B₁z₁, ..., xₙBₙzₙ).

(20) SD(T, α, 𝒫) = Ψ if
a. for all i, 1 ≤ i ≤ n, α dominates xᵢBᵢzᵢ in 𝒫
and b. Ψ satisfies the conditions of
   i. basic analyzability
   ii. subjacency
   iii. tensed sentence
   iv. COMP island
and c. if f = (b/j) or f = (i/j) then Ψ satisfies the condition of lexical conservation.
and d. there is no set Ψ′ which satisfies a, b, and c and which is more prominent than Ψ.

(21) basic analyzability
Ψ satisfies the condition of basic analyzability for the triple (T, α, 𝒫) if
a. for all i, 1 ≤ i < n, xᵢBᵢzᵢ precedes xᵢ₊₁Bᵢ₊₁zᵢ₊₁ in 𝒫,
and b. for all j, 1 ≤ j ≤ n, Bⱼ is more specific than Aⱼ.

(22) B is more specific than A if there exists an index i, and sets of features σ and τ, such that B = (i, σ) and A = (i, τ) and σ ⊇ τ.

(23) subjacency
Ψ satisfies the condition of subjacency for the triple (T, α, 𝒫) if there is at most one string α′ such that α′ is cyclic and such that for some i and j, α′ dominates xᵢBᵢzᵢ in 𝒫 and not: α′ dominates xⱼBⱼzⱼ in 𝒫.

(24) tensed sentence
Ψ satisfies the condition of tensed sentence for the triple (T, α, 𝒫) if either there is no α′ as defined in (23), or for all v and w, if α′ dominates v tense w in 𝒫 then there is an α″ such that α″ is cyclic and α″ dominates v tense w in 𝒫 and α′ dominates α″ in 𝒫.


(25) lexical conservation
Ψ satisfies the lexical conservation condition with respect to (T, α, 𝒫) if xⱼBⱼzⱼ dominates either xⱼtzⱼ or xⱼΔzⱼ in 𝒫, where j is the goal index of T.
(26) COMP island
Ψ satisfies the COMP island condition with respect to (T, α, 𝒫) if there is no index i or strings v and w such that vCOMPw dominates xᵢBᵢzᵢ in 𝒫.
In definition (27) let Ψ′ = (x₁′B₁′z₁′, ..., xₙ′Bₙ′zₙ′).
(27) Ψ′ is more prominent than Ψ in 𝒫 if there exists an index i such that
a. either xᵢ′Bᵢ′zᵢ′ dominates xᵢBᵢzᵢ in 𝒫
   or xᵢ′Bᵢ′zᵢ′ precedes xᵢBᵢzᵢ in 𝒫,
and b. for all k, 1 ≤ k < i, xₖ′Bₖ′zₖ′ = xₖBₖzₖ.
The lexical conservation condition is our only requirement approaching


what is usually called the recoverability of deletions constraint. In our theory,
true deletion, (0/j), is unconstrained by the syntax. Instead it is constrained by a
semantic condition on surface structures (cf. Fiengo (1974)).
Requirement (20) insures that no transformation will apply ambiguously within a given cyclic domain. That is, there is at most one analysis for each (T, α, 𝒫) triple. This very stringent requirement may have to be weakened slightly
in favor of a condition based upon the predicate "superior" (Chomsky 1973). If
this were done, the structural description function would have to be modified so
as to produce a set of analyses rather than a single analysis as it does here. Minor
modifications in other functions would also be required.

2.3.2. The Structural Change


The second part of the formalization of the mapping function is a description
of the notion structural change. Before describing the function in formal terms,
we will show by example what changes the function will have to be capable of. We do this using a simple RPM, namely (28), and several simple transformations (29)-(33). The results of applying (29)-(33) to (28) are illustrated in (34)-(38). The trees drawn along with RPMs (28) and (34)-(38) are for the reader's convenience. They have no part in the computation of the mapping.
(28) {S, AΔ, HcΔ, hCΔ, hcD, hcΔ}
     [S [A [H h] [C c]] [D Δ]]


(29) (HD, (1/2))
(30) (HD, (b/2))
(31) (HD, (0/2))
(32) (HD, (1+r 2))
(33) (HD, (1+l 2))
We will assume that (HcΔ, hcD) is the analysis that is produced as the structural description of (28) with respect to S and each of the transformations listed. In this case, (34) through (38) represent the RPM's that the structural change function must be able to produce from (28).
(34) SC((29), S, (28)) =
{S, Ah, Hch, tCh, tcD, tcH, tch}
     [S [A [H t] [C c]] [D [H h]]]
(35) SC((30), S, (28)) =
{S, Ab, Hcb, hCb, hcD, hcb}
     [S [A [H h] [C c]] [D b]]

(36) SC((31), S, (28)) =
{S, A, Hc, hC, hc}
     [S [A [H h] [C c]]]

(37) SC((32), S, (28)) =
{S, AΔh, HcΔh, tCΔh, tcD, tcDh, tcΔH, tcΔh}
     [S [A [H t] [C c]] [D [D Δ] [H h]]]


(38) SC((33), S, (28)) =
{S, AhΔ, HchΔ, tChΔ, tcD, tcHΔ, tchD, tchΔ}
     [S [A [H t] [C c]] [D [H h] [D Δ]]]

It is important to notice that very few elements of (28) remain unchanged in the mappings. For example, the elements representing the non-terminal C in (34) through (38) are each distinct from the element representing C in (28), even though hCΔ is not part of the analysis of (28) and C is not mentioned in any of the transformations applied. This change in the elements representing C is due to the redundancy inherent in RPM's.
RPM's, while less redundantly specified than full phrase markers, still "over-specify" the terminal string. For example, the fact that Δ is the "final" terminal symbol in (28) is represented in four of its elements. So when we apply a transformation affecting that "final" symbol in (28), the elements AΔ, HcΔ, hCΔ and hcΔ all have to be modified in concert.
In the examples above, the change in hCΔ only "reflects" a change that is actually occurring in other elements of the RPM (such as hcD). In our formalism these two cases are treated very differently. To compute the mapping we first divide the RPM into two parts: the part which is "intimately" involved in the change (in the examples above, hcD would always be in this part) and the remainder of the RPM, which will be changed only when this is necessary to reflect the changes being made. (In the examples above, hCΔ would always be in this part.)
A separate function will be associated with each of these parts and the union of the
two sets of strings produced by these functions will be the structural change.
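For the movement case, the two-part computation just described can be carried through mechanically on (28), reproducing (34). The sketch below uses conventions of our own (uppercase Latin letters as non-terminals; lowercase letters, t, and Δ as terminals) and takes the analysis (HcΔ, hcD) as given rather than computing it with the SD function:

```python
P28 = {"S", "AΔ", "HcΔ", "hCΔ", "hcD", "hcΔ"}           # RPM (28)

def split(m):
    """Monostring xAz -> (x, A, z); Δ and t count as terminals."""
    i = next(i for i, c in enumerate(m) if c.isupper() and c != "Δ")
    return m[:i], m[i], m[i + 1:]

def terminal(s):
    return all(not (c.isupper() and c != "Δ") for c in s)

def subP(m, P):
    """Definition (10): subP(xAz, P) = {phi : x phi z in P}."""
    x, A, z = split(m)
    return {s[len(x):len(s) - len(z)] for s in P
            if s.startswith(x) and s.endswith(z) and len(s) > len(x) + len(z)}

def move(src, goal, P):
    """Primary and secondary change for f = (1/2) with analysis (src, goal)."""
    (x1, A1, z1), (x2, A2, z2) = split(src), split(goal)
    y1 = next(y for y in subP(src, P) if terminal(y))    # yield of the source
    y2 = next(y for y in subP(goal, P) if terminal(y))   # yield of the goal
    ts = next(s for s in P if terminal(s))               # the terminal string
    v = ts[len(x1 + y1):len(ts) - len(y2 + z2)]          # material in between
    res1 = {A1, "t"}                                     # source slot: category + trace
    res2 = {A2} | subP(src, P)                           # goal slot, per (41) for (i/j)
    w1 = next(w for w in res1 if terminal(w))
    w2 = next(w for w in res2 if terminal(w))
    primary = ({x1 + r + v + w2 + z2 for r in res1} |
               {x1 + w1 + v + r + z2 for r in res2})
    Q = P - ({x1 + p + z1 for p in subP(src, P)} |       # the "remainder" part
             {x2 + p + z2 for p in subP(goal, P)})
    def g(phi):                                          # definition (43), first match wins
        if phi.startswith(x1 + y1) and phi.endswith(y2 + z2):
            mid = phi[len(x1 + y1):len(phi) - len(y2 + z2)]
            return x1 + w1 + mid + w2 + z2
        if phi.endswith(y1 + v + y2 + z2):
            return phi[:len(phi) - len(y1 + v + y2 + z2)] + w1 + v + w2 + z2
        if phi.startswith(x1 + y1):
            return x1 + w1 + phi[len(x1 + y1):]
        if phi.endswith(y2 + z2):
            return phi[:len(phi) - len(y2 + z2)] + w2 + z2
        return phi
    return primary | {g(phi) for phi in Q}

# SC((29), S, (28)) = (34)
assert move("HcΔ", "hcD", P28) == {"S", "Ah", "Hch", "tCh", "tcD", "tcH", "tch"}
```

Here hCΔ falls into the remainder set and is rewritten to tCh by g, while hcD is "intimately" involved and handled by the primary change, just as described above.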
We now proceed with a formal definition of SC(T, α, 𝒫) which will induce the changes suggested by the above examples.
In definitions (39)-(43) let T = (X, f).
Let "·" represent the concatenation product.⁹
Let i be the source index of T. If there is no source index let i = 0. Let any string or symbol with a subscript of zero represent the null string: 0.
Let j be the goal index of T.
Let SD(T, α, 𝒫) = (x₁A₁z₁, ..., xₙAₙzₙ) for some n, 1 ≤ n; and for all i, 1 ≤ i ≤ n, let yᵢ be such that yᵢ is a* Aᵢ in 𝒫.

⁹ 𝒮₁ · 𝒮₂ =def {x | x = vw and v ∈ 𝒮₁ and w ∈ 𝒮₂}. By convention, this operation has precedence over set union. For example {a, b} · {c} ∪ {d, e} = {ac, bc} ∪ {d, e}.


(39) SC(T, α, 𝒫) = primary change(T, α, 𝒫) ∪ secondary change(T, α, 𝒫)


In definitions (40)-(43)
let k be the smaller of i and j,
and let k′ be the larger of i and j,
and let v be the string such that xₖyₖvyₖ′zₖ′ ∈ 𝒫.
(40) primary change(T, α, 𝒫) = {xₖ} · res(k, T, α, 𝒫) · {vwₖ′zₖ′} ∪ {xₖwₖv} · res(k′, T, α, 𝒫) · {zₖ′}
where wₖ ∈ res(k, T, α, 𝒫) and wₖ′ ∈ res(k′, T, α, 𝒫).
(41) a. For source index:

b. For goal index:


if f = (i +J), res(j, , , ^) = {Aj} u subP(XiAi2i, ^) {yj} u

if f = (i + r j), res(j, , , ^) = {Aj} u subP( Xj A j2j> ^) {yj u


{yjl-subPfoA,*,^);
if f = (i/j), res(j, , , 9) = {Aj} u su

In definitions (42)-(43) let wₖ and wₖ′ be as in (40) above, and
let 𝒬 = 𝒫 − ({xₖ} · subP(xₖAₖzₖ, 𝒫) · {zₖ} ∪ {xₖ′} · subP(xₖ′Aₖ′zₖ′, 𝒫) · {zₖ′}).
(42) secondary change(T, α, 𝒫) = {ψ | φ ∈ 𝒬 and ψ = g(φ, T, α, 𝒫)}
(43) g(φ, T, α, 𝒫) = xₖwₖφ′wₖ′zₖ′  if φ = xₖyₖφ′yₖ′zₖ′
 = φ′wₖvwₖ′zₖ′  if φ = φ′yₖvyₖ′zₖ′
 = xₖwₖφ′  if φ = xₖyₖφ′
 = φ′wₖ′zₖ′  if φ = φ′yₖ′zₖ′
 = φ  otherwise
As is conventional in definition by parts, we require that the first condition of (43)
that is applicable is selected.

2.4. Transformational Derivation

A derivation with respect to a transformational component in this theory


is a strictly ordered set of RPM's. The first RPM must be in the language¹⁰ of the

¹⁰ There is a very direct relationship between derivations in a base component ℬ and RPM's in ℒ(ℬ). See Kupin (In press) for further discussion.

A restrictive theory of transformational grammar 187

base component ℒ(ℬ). Each later RPM must be the result of a transformational
mapping from the immediately preceding RPM. Certain other conditions must
hold. Among them are our version of the cycle, (45), a modified form of Kiparsky's
(1973) Elsewhere Condition, (46), and one filtering function precluding Δ's from
the last RPM in a derivation, (47).

(44) (𝒫₁, ..., 𝒫ₙ) is a derivation with respect to 𝒯 and ℬ iff
a. 𝒫₁ is in ℒ(ℬ),
and b. for all i, 1 ≤ i < n, 𝒫ᵢ₊₁ = SC(T, σ, 𝒫ᵢ), for some T ∈ 𝒯 and some σ,
and c. 𝒫ᵢ = 𝒫ⱼ implies i = j;
and (𝒫₁, ..., 𝒫ₙ) obeys the following conditions with respect to 𝒯:
d. strict cycle condition
e. specificity condition
f. surface filter condition
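Definition (44) can be paraphrased procedurally. The following sketch is ours, not the paper's: `in_base_language` and `successors` are hypothetical stand-ins for membership in ℒ(ℬ) and for the available SC mappings, and conditions (d)–(f) are checked separately.

```python
def is_derivation(rpms, in_base_language, successors):
    """Check clauses (a)-(c) of (44): the first RPM is base-generated,
    each later RPM arises by one transformational mapping from its
    predecessor, and no RPM occurs twice (no vacuous subderivations)."""
    if not rpms or not in_base_language(rpms[0]):
        return False                         # clause (a)
    for prev, nxt in zip(rpms, rpms[1:]):
        if nxt not in successors(prev):      # nxt = SC(T, sigma, prev)
            return False                     # clause (b)
    return len(set(rpms)) == len(rpms)       # clause (c): no repeats
```

Representing each RPM as a frozenset of its strings makes the no-repeat check in clause (c) a one-liner.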
In definition (45), for all k, 1 ≤ k < n,
let σₖ be the string such that 𝒫ₖ₊₁ = SC(Tₖ, σₖ, 𝒫ₖ).

(45) strict cycle
(𝒫₁, ..., 𝒫ₙ) satisfies the strict cycle condition with respect to ℬ if for all i
and j, if CN(σᵢ, 𝒫ᵢ) ⊃ CN(σⱼ, 𝒫ⱼ), then i < j.

(46) specificity
(𝒫₁, ..., 𝒫ₙ) satisfies the specificity condition with respect to 𝒯 if for all i,
if SC(T, σ, 𝒫ᵢ) = 𝒫′ and SC(T′, σ, 𝒫ᵢ) = 𝒫″
and T = (X, t) and T′ = (X′, t) and X ≠ X′ and spec(X, X′), then 𝒫ᵢ₊₁ ≠ 𝒫″.

(47) surface filter
(𝒫₁, ..., 𝒫ₙ) satisfies the surface filter condition if there are
no v and w such that vΔw ∈ 𝒫ₙ.

(48) CN(σ, 𝒫) = {φ | φ dominates σ and φ is cyclic} ∪ {σ, if σ is cyclic}

(49) spec(X, Y) is true iff
a. X = A and Y = B
and A is more specific than B (see definition (22)),
or b. X = X′X″ and Y = Y′Y″
and spec(X′, Y′) and spec(X″, Y″).
It follows from definition (44) that all transformations are optional and
unordered. The two types of intrinsic "ordering" are outlined in (45) and (46)


above.¹¹ (45) is a partial ordering of the application of rules that says only that
given a particular set of cyclic non-terminals that cover one mapping, no later
mapping can have a set of covering cyclic non-terminals that properly includes the
first. That is, one cannot use a "lower S node" as a domain once a rule has been
done "higher up in the tree". This is parallel to what Chomsky (1973) has called
the strict cycle. (45) says nothing about two mappings whose covering sets of non-
terminals are not in the subset relation. This is the case in which transformations
are done in two S's in two different places in the sentence, as in "Bill knows S₁
and S₂". This theory makes no claim about whether transformations within S₁ or S₂
need be applied earlier. Conceptually, (45) is somewhat different from many other
statements of the principle of the cycle. The principle is often taken to be a
requirement that rule applications begin on the most deeply embedded cyclic
domain, and from there proceed to the "next domain up", and so on. Chomsky
(1973) proposed that the notion "transformational cycle" be sharpened by the
addition of the "strict cycle condition" (our (45)). What we suggest is that the
"strict cycle condition" is not merely a part of the cyclic principle, but rather
that it exhausts that principle. It should be noted that though the principle of the
cycle is related to the subjacency condition (23) in that both have to do with
cyclic domains, the two can not be collapsed. The subjacency condition is strictly
Markovian, depending like everything else in structural descriptions only on
"current" structure. The strict cycle, on the other hand, is properly part of the
definition of derivation, since it depends on all earlier stages of the derivation.
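On this construal the condition reduces to a comparison of covering sets. A minimal sketch (ours; the paper states the condition only set-theoretically):

```python
def satisfies_strict_cycle(cn):
    """(45), sketched: cn[k] is the set of cyclic non-terminal
    occurrences covering the k-th mapping of a derivation.  A properly
    larger covering set (a more deeply embedded domain) may only
    occur earlier."""
    for i, ci in enumerate(cn):
        for j, cj in enumerate(cn):
            if ci > cj and not i < j:   # ci properly includes cj
                return False
    return True

# Lower-S work before higher-S work is permitted ...
satisfies_strict_cycle([{"S1", "S2"}, {"S1"}])    # True
# ... but returning to the lower domain afterwards is not.
satisfies_strict_cycle([{"S1"}, {"S1", "S2"}])    # False
```

Covering sets that stand in no subset relation, as with the two conjoined S's just discussed, impose no ordering on one another.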
Condition (46) says that if two transformations are applicable and one is
more general than the other, the more general one may not be chosen for
application.
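The recursion in (49) can be sketched as follows (ours, fixed to the two-term case; the atomic "more specific than" relation of definition (22) lies outside this excerpt, so it is passed in as a hypothetical parameter):

```python
def spec(X, Y, more_specific):
    """(49), sketched: X is more specific than Y iff corresponding
    symbols are pairwise related by the atomic relation of (22)."""
    if len(X) == 1 and len(Y) == 1:
        return more_specific(X[0], Y[0])           # clause (a)
    return (len(X) == len(Y) == 2                  # clause (b)
            and spec(X[:1], Y[:1], more_specific)
            and spec(X[1:], Y[1:], more_specific))

# Hypothetical atomic relation: every symbol is at least as specific
# as itself, and NP is more specific than an unspecified 3-bar phrase X3.
ms = lambda a, b: a == b or (a, b) == ("NP", "X3")
```

Under this toy relation, a rule mentioning (COMP, NP) is more specific than one mentioning (COMP, X3), so by (46) only the former could effect the next step.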
Condition (44c) entails that no derivation includes any vacuous subderivations
(cf. Levine (1976)). This requirement gives the ordered set constituting a derivation
one of the properties of an unordered set. We find in this a potentially interesting
similarity to the case of phrase markers vis-a-vis phrase structure derivations.
Condition (47) allows us to "soften" the effects of the optionality of all
transformations in the following way. The effect of condition (47) is that if Δ is
introduced somewhere in a structure, no particular transformation becomes obligatory, but
rather it is obligatory that something be done somewhere along the line to remove
that Δ; otherwise, the derivation must be "thrown out". This seems to be the
proper generalization. What is obligatory is not the means used, but the end
achieved.
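The filter itself is trivial to state over the final RPM. A one-line Python rendering (ours, with the undeleted element written literally as Δ):

```python
def passes_surface_filter(final_rpm):
    """(47): a derivation survives only if no string of its final
    RPM contains an undeleted delta."""
    return not any("Δ" in phi for phi in final_rpm)

# A derivation whose last RPM still shows a delta is thrown out:
passes_surface_filter({"wh pres know COMP J. past VP"})   # True
passes_surface_filter({"Δ wh pres VP"})                   # False
```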

¹¹ There is also an ordering inherent in the "feeding" or "bleeding" action of T's. That
is, the application of a transformation sometimes creates a situation where another becomes
applicable (feeding) or creates a situation where another cannot apply within the same
sentence (bleeding).


3. Example
In this section a fairly complicated structure and three transformations are
presented and part of a derivation is constructed to illustrate the definitions given
above. In what follows, certain details irrelevant to the present investigation have
been omitted. We believe that, to the level of detail we can attempt here, the
structure and the transformations will be part of any adequate analysis of American
English. Lasnik (forthcoming) presents a detailed analysis of the English auxiliary
essentially within this same framework.
The transformations to be considered are:
(50) T₁: (COMP WH, (2/1)) WH fronting¹², where WH = (3, {+WH})
(51) T₂: (NP NP, (2/1)) NP preposing
(52) T₃: (NP NP, (1/2)) NP postposing
We will discuss their application in the derivation of the sentence:
(53) Who knows which gifts Paul and Bill were given by John?
The RPM below, labeled 𝒫₁, is assumed to be the initial RPM in this derivation.
We will use the line letters in this listing to refer to elements of 𝒫₁ in the dis-
cussion below. For the reader's convenience, one of the phrase structure trees
associated with 𝒫₁ is given. The nodes in trees (54′) and (64′) are labelled with
superscripts a, b, c, ... in correspondence with the elements a, b, c, ... in RPM's (54)
and (64), respectively.
(54) 𝒫₁:
a. S
b. COMP wh pres know J. past be en give P. and B. wh gifts by
c. S
d. NP pres know J. past be en give P. and B. wh gifts by
e. AwhpresVP
/. wh pres V J. past be en give P. and B. wh gifts by
g. wh pres know S
b. wh pres know COMP J. past be en give P. and B. wh gifts by
/. wh pres know S
j. wh pres know NP past be en give P. and B. wh gifts by
k. wh pres know J. past VP
/. wh pres know J. past PASS give P. and B. wh gifts by

¹² As is well known, WH fronting applies to NP's, adverb phrases, adjective phrases
and quantifier phrases. We conclude that all of these phrases have the same number of bars.
It is not totally clear what this number should be. For concreteness we have chosen 3 as
the number. We assume that the lowest "phrase-level" (3-bar) non-terminal dominating
a WH word is specified +WH in its phrase structure derivation. In this example we will not
explicitly mark the difference between NP's with feature +WH and other WH NP's.


m. wh pres know J. past be en V P. and B. wh gifts by
n. wh pres know J. past be en give NP wh gifts by
o. wh pres know J. past be en give NP and B. wh gifts by
p. wh pres know J. past be en give P. and NP wh gifts by
q. wh pres know J. past be en give P. and B. NP by
r. wh pres know J. past be en give P. and B. wh gifts PP
s. wh pres know J. past be en give P. and B. wh gifts by NP
t. wh pres know J. past be en give P. and B. wh gifts by

(54′) [Phrase structure tree for (54); nodes bear superscripts a, b, c, ... matching the elements of (54). Terminal string: wh pres know J. past be en give P. and B. wh gifts by]

To see that 𝒫₁ is an RPM, consider that each element in the listing either
dominates or precedes each of the following elements in the listing. For example,
(54n) dominates (54o) and (54p), since each of these is of the form:
wh pres know J. past be en give y wh gifts by, and in no case is y NP
or the null string. (54q) precedes (54r) and (54s), since each of these is of the form:
wh pres know J. past be en give P. and B. wh gifts y, where y is non-null.
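The domination half of this pairwise check can be sketched in Python (ours, and simplified: monostrings are token lists, and xAz dominates ψ just in case ψ = x·g·z for some non-null g distinct from the single non-terminal A):

```python
def dominates(x, A, z, psi):
    """Does the monostring x A z dominate the string psi?
    True iff psi = x + g + z with g non-null and g != [A]."""
    if len(psi) <= len(x) + len(z):
        return False
    if psi[:len(x)] != x or psi[len(psi) - len(z):] != z:
        return False
    return psi[len(x):len(psi) - len(z)] != [A]

# (54n) dominates (54o): n's NP covers the substring "NP and B." of o.
x = "wh pres know J. past be en give".split()
z = "wh gifts by".split()
dominates(x, "NP", z, x + ["NP", "and", "B."] + z)   # True
```

A full RPM check would run this (together with the corresponding precedence test) over every ordered pair of elements in the set.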
To illustrate the SubP function, we have listed below some sub-RPM's in 𝒫₁,
some of which will be used later in the discussion.

(55) SubP(a, 𝒫₁) = 𝒫₁
(56) SubP(h, 𝒫₁) = {COMP, Δ}
(57) SubP(n, 𝒫₁) = {NP, NP and B., P. and NP, P. and B.}
(58)
(59)
(60) SubP(q, 𝒫₁) = {NP, wh gifts}
Each of these can be seen to follow the definition given in section 2.1.
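Read this way, SubP admits a direct rendering (ours, on the same token-list conventions as our other sketches): for a monostring xAz of 𝒫, collect A together with every g such that x·g·z is also in 𝒫.

```python
def subP(x, A, z, P):
    """Sketch of SubP: {A} plus every middle g with x + g + z in P."""
    out = {(A,)}
    for psi in P:
        if (len(psi) > len(x) + len(z)
                and psi[:len(x)] == x
                and psi[len(psi) - len(z):] == z):
            out.add(tuple(psi[len(x):len(psi) - len(z)]))
    return out

# Reproducing (60): the object NP of (54q) dominates wh gifts.
x = "wh pres know J. past be en give P. and B.".split()
z = ["by"]
P = [x + ["NP"] + z,              # like (54q)
     x + ["wh", "gifts"] + z]     # like (54t)
subP(x, "NP", z, P)               # {("NP",), ("wh", "gifts")}
```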
Now we are prepared to consider how the three transformations can apply
to 𝒫₁. We will begin by illustrating certain ordered pairs of elements and explaining


why they do or do not qualify as proper analyses of 𝒫₁ for the transformation under
consideration.
(61) For T₁: SD(T₁, σ, 𝒫₁) ≠ (b, q) because of tensed sentence, prominence, and covering cycle (20)
SD(T₁, σ, 𝒫₁) ≠ (h, d) because of prominence
SD(T₁, α, 𝒫₁) = (b, d)
SD(T₁, β, 𝒫₁) = (h, q)
(62) For T₂: SD(T₂, σ, 𝒫₁) ≠ (l, n) because of conservation
SD(T₂, σ, 𝒫₁) ≠ (s, j) because "s precedes j" is false (see analyzability)
(63) For T₃: SD(T₃, σ, 𝒫₁) ≠ (d, o) because of subjacency and conservation
SD(T₃, σ, 𝒫₁) ≠ (l, n) because of conservation
SD(T₃, σ, 𝒫₁) ≠ (n, s) because of prominence
SD(T₃, σ, 𝒫₁) ≠ (o, p) because of conservation
SD(T₃, γ, 𝒫₁) = (j, s)
We have isolated three structural descriptions and thus we can speak of
three new RPM's: SC(T₁, α, 𝒫₁), SC(T₁, β, 𝒫₁), and SC(T₃, γ, 𝒫₁). Each of these
could potentially form a second step in a derivation from 𝒫₁. However, using
SC(T₁, α, 𝒫₁) as the second step ultimately results in Δ's which cannot be removed,
due to the strict cycle, and so no good derivation will result. We will not carry
this derivation further. Using either of the others will produce the intended result.
In the interest of brevity, we will pursue only the derivation (𝒫₁, SC(T₁, β, 𝒫₁), ...).
We will now describe the computation of 𝒫₂ = SC(T₁, β, 𝒫₁). The choice of
symbols follows that in the explication of SC presented in section 2.3.
1 = goal index = j = k
x₁ = wh pres know
A₁ = COMP
z₁ = J. past be en give P. and B. wh gifts by
2 = source index = i = k′
x₂ = wh pres know J. past be en give P. and B.
A₂ = NP
z₂ = by
y₁ = Δ
v = J. past be en give P. and B.
y₂ = wh gifts
res(1, T₁, β, 𝒫₁) = {COMP, NP, wh gifts}
res(2, T₁, β, 𝒫₁) = {NP, t}
w₁ = wh gifts
w₂ = t
primary change(T₁, β, 𝒫₁) =



{wh pres know} · res(1, T₁, β, 𝒫₁) · {J. past be en give P. and B. t by}
∪ {wh pres know wh gifts J. past be en give P. and B.} · res(2, T₁, β, 𝒫₁) · {by}
= {wh pres know COMP J. past be en give P. and B. t by,
wh pres know NP J. past be en give P. and B. t by,
wh pres know wh gifts J. past be en give P. and B. NP by,
wh pres know wh gifts J. past be en give P. and B. t by}
Having found primary change(T₁, β, 𝒫₁), we turn our attention to secondary
change(T₁, β, 𝒫₁). We first find Σ.
Σ = 𝒫₁ − ({wh pres know} · subP(h, 𝒫₁) · {J. past be en give P. and B. wh gifts by}
∪ {wh pres know J. past be en give P. and B.} · subP(q, 𝒫₁) · {by})
= 𝒫₁ − {h, q, t}
= {a, ..., g, i, ..., p, r, s}.
secondary change(T₁, β, 𝒫₁) is the result of applying the g function to each
member of Σ. Here only a small number of these g computations are illus-
trated. g locates the "involved" terminals (y₁ and y₂) if they occur in the
string, and replaces them with w₁ and w₂, respectively, thereby reflecting
the change induced by T₁.

Here and below, Φ's with superscripts are strings taking the place of φ′ in
the definition of g.
g(b, T₁, β, 𝒫₁) = Φᵇw₁vw₂z₂ =
COMP wh pres know wh gifts J. past be en give P. and B. t by
g(n, T₁, β, 𝒫₁) = x₁w₁Φⁿw₂z₂ =
wh pres know wh gifts J. past be en give NP t by
g(s, T₁, β, 𝒫₁) = x₁w₁vw₂Φˢ =
wh pres know wh gifts J. past be en give P. and B. t by NP
g(k, T₁, β, 𝒫₁) = x₁w₁Φᵏ =
wh pres know wh gifts J. past VP


We leave it to the reader to verify that primary change(T₁, β, 𝒫₁) ∪ secondary
change(T₁, β, 𝒫₁) is exactly the set 𝒫₂ below.
(64) 𝒫₂:
a. S
b. COMP wh pres know wh gifts J. past be en give P. and B. t by
c. S
d. NP pres know wh gifts J. past be en give P. and B. t by
e. Δ wh pres VP
f. wh pres V wh gifts J. past be en give P. and B. t by


g. wh pres know S
h. wh pres know COMP J. past be en give P. and B. t by
i. wh pres know NP J. past be en give P. and B. t by
j. wh pres know wh gifts S
k. wh pres know wh gifts NP past be en give P. and B. t by
l. wh pres know wh gifts J. past VP
m. wh pres know wh gifts J. past PASS give P. and B. t by
n. wh pres know wh gifts J. past be en V P. and B. t by
o. wh pres know wh gifts J. past be en give NP t by
p. wh pres know wh gifts J. past be en give NP and B. t by
q. wh pres know wh gifts J. past be en give P. and NP t by
r. wh pres know wh gifts J. past be en give P. and B. NP by
s. wh pres know wh gifts J. past be en give P. and B. t PP
t. wh pres know wh gifts J. past be en give P. and B. t by NP
u. wh pres know wh gifts J. past be en give P. and B. t by

(64′) [Phrase structure tree for (64); nodes bear superscripts a, b, c, ... matching the elements of (64). Terminal string: wh pres know wh gifts J. past be en give P. and B. t by]

We have now completed the first step in one possible derivation of the
sentence "Who knows which gifts Paul and Bill were given by John." Rather
than proceed with a second step in the same detail, we will sketch in the remaining
steps. From 𝒫₂ there are only two possible moves: SC(T₁, α, 𝒫₂) on the analysis
(b, d), and SC(T₃, γ, 𝒫₂) on the analysis (k, t). Note that SD(T₁, σ, 𝒫₂) is not


(b, i) due to the COMP island constraint.¹³ No new structural descriptions have
resulted from the change from 𝒫₁ to 𝒫₂. If we, as we eventually must, remove
the last Δ in the "lower" sentence by applying T₃ to 𝒫₂, we will begin the
"passive" chain of transformations. The second half of this chain, NP preposing,
is not forced to apply by any syntactic requirement on derivations. It is allowed
to apply optionally, and the derivation in which it does not apply is discarded on
semantic grounds. For semantic interpretation, movement traces (t's) must be
"properly bound" by the moved item (Fiengo (1974, 1977)), and in SC(T₃, γ, 𝒫₂)
the trace of J. is not properly bound. The application of T₂ will replace that t with
(57), leaving behind a trace that is properly bound. NP preposing creates 𝒫₄, and
finally SC(T₁, α, 𝒫₄) will end the derivation.
We applied the transformations in the following order:
lower S cycle: T₁, T₃, T₂; higher S cycle: T₁.
They also could have been applied in either of the following orders:
lower S cycle: T₃, T₁, T₂; higher S cycle: T₁.
lower S cycle: T₃, T₂, T₁; higher S cycle: T₁.
These are the only possible successful derivations from 𝒫₁, and all produce the
desired result.

4. Conclusion
4.0. Some Consequences
It has been our intention to present a restrictive transformational theory in a
revealing formalism. For this reason, it would have been inappropriate to begin
with an all encompassing notation of roughly the Peters-Ritchie sort, and then tack
on the necessary restrictions. Instead, we have attempted to develop a formalism
in which the constraints follow from prior definitions and in this way form part
of a coherent whole. Thus our choice of representation has empirical consequences.
If any of the central constraints are shown to be invalid, our theory of grammar
will be falsified. For example, the straightforward definition of transformation
given in 2.2 transparently embodies most of the generalizations listed in that
section.
¹³ Note that there is an apparent difficulty in that movement out of COMP even into
another COMP will be blocked quite generally by our tensed S condition, preventing the
derivation of "Who do you think Bob saw?". Movement of the WH word into the COMP
of the embedded sentence is permitted, but movement from this COMP into the higher
COMP is blocked by (24). There are a number of possible modifications that will allow
COMP to function as an "escape hatch" as in Chomsky (1973). For example (20) could be
changed in such a way that when f = (i/j), (24) must be satisfied only when one of the non-
terminals indexed by i and j is not COMP. We might also mention that recent work (see in
particular Huang (1977)) indicates that COMP has internal structure: one substructure
for sentence introducers such as English THAT and FOR, and another for WH phrases. Clearly
it is only the latter that is relevant to COMP to COMP movement and to (26).


We have thus far ignored one of the major research questions of recent work
on grammatical formalism, namely, that of weak generative capacity. It seems clear
to us that our theory shares the defect of the Aspects theory noted by Peters and
Ritchie (1973). Our deletion operation presumably results in grammars that lack
the survivor property of Peters (1973); hence, our theory provides a grammar for
every r. e. set. Nonetheless, on one level, it makes sense to say that our theory is
better than the Aspects theory (as articulated by Peters and Ritchie). Peters and
Ritchie's proof depends upon the presence in grammars of a deletion operation of
a particular sort, and is virtually independent of all other grammatical properties,
many of which are important to linguistic investigation. It is with respect to these
other properties that the theories in question diverge. In comparing two theories,
it is reasonable to abstract away from their common virtues and shortcomings.
In the present instance, such an abstraction leaves our theory much less powerful.
Notice that we use the term "powerful" not with respect to the character
of the languages generated but rather with respect to the relative size of the classes
of grammars allowed. In 1., we argued that the theory is best that allows the
smallest subset of grammars consistent with empirical evidence. From this point
of view, the fact that some of the languages generated may be non-recursive is of
subsidiary importance. The relevant consequence of the Peters-Ritchie proof is
that a grammar is available for every r. e. set.
To state this in another way, our concern throughout this paper has been
with restricting the class of grammars compatible with reasonably limited data,
and not with resolving the decidability problem for the sentences of particular
grammars. We have not considered this second problem and are not convinced that
it is of any inherent linguistic import.

4.1. Current and Future Syntactic Research

We have argued that limiting syntactic theory in the way we have is


methodologically sound. That is, our theory is a priori better than most existing
alternatives. However, a theory is "too good" if it does not allow for descriptions
of all phenomena that properly belong in its domain. There is good reason to
believe that our theory or some close variant may be empirically adequate. Much
early syntactic work presented analyses that our theory does not countenance.
More recently, many of the phenomena motivating such analyses have been re-
examined with results quite compatible with our central proposals. Examples
include NP movement, Fiengo (1974, 1977); the English auxiliary verb system,
Lasnik (forthcoming); pronominal reference, Jackendoff (1972), Lasnik (1976);
WH movement and related phenomena, Chomsky (1976a, 1976b); crosscategorial
transformations, Bresnan (1976). We feel reasonably confident that close examination
of additional recalcitrant phenomena will have similar positive results. To the
extent that we can continue to construct empirically adequate yet narrow theories
we move from description toward explanation.


REFERENCES
ANDERSON, S. R., and P. KIPARSKY, eds. (1973), A Festschrift for Morris Halle. Holt, Rinehart and Winston, New York.
BRESNAN, J. (1976), On the Form and Functioning of Transformations. Linguistic Inquiry 7, 3–40.
CHOMSKY, N. (1955), The Logical Structure of Linguistic Theory (Plenum, New York 1975).
CHOMSKY, N. (1956), Three Models for the Description of Language. I.R.E. Transactions on Information Theory IT-2, 113–124.
CHOMSKY, N. (1959), On Certain Formal Properties of Grammars. Information & Control 2, 137–167.
CHOMSKY, N. (1965), Aspects of the Theory of Syntax. MIT Press, Cambridge, Massachusetts.
CHOMSKY, N. (1972), Studies on Semantics in Generative Grammar. Mouton, The Hague.
CHOMSKY, N. (1973), Conditions on Transformations. In Anderson and Kiparsky.
CHOMSKY, N. (1976a), On Wh-Movement. Presented at the Irvine Conference on Formal Syntax.
CHOMSKY, N. (1976b), Conditions on Rules of Grammar. Linguistic Analysis 2, 303–351.
CHOMSKY, N., and M. HALLE (1968), The Sound Pattern of English. Harper & Row, New York.
EMONDS, J. (1970), Root and Structure-Preserving Transformations. Unpublished MIT diss.
FIENGO, R. (1974), Semantic Conditions on Surface Structure. Unpublished MIT diss.
FIENGO, R. (1977), On Trace Theory. Linguistic Inquiry 8, 35–61.
FIENGO, R., and H. LASNIK (1976), Some Issues in the Theory of Transformations. Linguistic Inquiry 7, 182–191.
GINSBURG, S., and B. PARTEE (1969), A Mathematical Model of Transformational Grammars. Information & Control 15, 297–334.
HUANG, P. (1977), Wh-fronting and Related Processes. Unpublished University of Connecticut diss.
JACKENDOFF, R. (1972), Semantic Interpretation in Generative Grammar. MIT Press, Cambridge, Massachusetts.
JACKENDOFF, R. (1976), X̄ Syntax. To appear as Linguistic Inquiry Monograph No. 2.
KIPARSKY, P. (1973), Elsewhere in Phonology. In Anderson and Kiparsky.
KUPIN, J. (In press), A Motivated Alternative to Phrase Markers. Linguistic Inquiry 9, 2.
LASNIK, H. (1976), Remarks on Coreference. Linguistic Analysis 2, 1–22.
LASNIK, H. (forthcoming), Restricting the Theory of Transformations: A Case Study.
LEVINE, A. (1976), Why Argue about Rule Ordering? Linguistic Analysis 2, 115–124.
PETERS, S. (1973), On Restricting Deletion Transformations. In M. Gross et al., eds., The Formal Analysis of Natural Language. Mouton, The Hague.
PETERS, S., and R. W. RITCHIE (1973), On the Generative Power of Transformational Grammars. Information Sciences 6, 49–83.
POPPER, K. (1959), The Logic of Scientific Discovery. Basic Books, New York.
POSTAL, P. (1974), On Raising. MIT Press, Cambridge, Massachusetts.
