The notion of derivations in linguistics: Syntax - UMD Linguistics [PDF]


50 years of Linguistics at MIT

December 11, 2011

The notion of derivations in linguistics: Syntax
Howard Lasnik
U. of Maryland (MIT 1972)

The LSLT model: Phrase structure component (context-free; non-recursive); includes lexical insertion. Creates P-markers (which are set-theoretic objects).

P(hrase)-markers: Given a particular Σ,F (PS) grammar and a particular terminal string (i.e., string of terminal symbols) (TS):
a) Construct all of the equivalent PS derivations of TS.
b) Collect all of the lines occurring in any of those equivalent derivations into a set.
This set is the phrase marker (PM), a complete representation of the PS of TS, as far as the grammar is concerned. (Two PS derivations are equivalent if and only if they involve the same rewrite rules the same number of times, but not necessarily in the same order.)


An artificial illustrative example:

Σ: S
F: S → NP VP
   NP → he, Mary
   VP → V NP
   V → saw

Derivation 1: S; NP VP; he VP; he V NP; he saw NP; he saw Mary
Derivation 2: S; NP VP; he VP; he V NP; he V Mary; he saw Mary
Derivation 3: S; NP VP; NP V NP; he V NP; he V Mary; he saw Mary
Derivation 4: S; NP VP; NP V NP; he V NP; he saw NP; he saw Mary
Derivation 5: S; NP VP; NP V NP; NP saw NP; NP saw Mary; he saw Mary
Derivation 6: S; NP VP; NP V NP; NP saw NP; he saw NP; he saw Mary
Derivation 7: S; NP VP; NP V NP; NP V Mary; he V Mary; he saw Mary
Derivation 8: S; NP VP; NP V NP; NP V Mary; NP saw Mary; he saw Mary

PM = {S, NP VP, he VP, he V NP, he V Mary, he saw NP, NP V NP, NP saw NP, NP saw Mary, NP V Mary, he saw Mary}
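The construction just illustrated is mechanical enough to sketch in code. The following is a minimal sketch (the grammar encoding and all names are mine, not from LSLT) that enumerates every PS derivation of 'he saw Mary' under the toy Σ,F grammar, rewriting one nonterminal occurrence per step, and collects all the lines into the PM:

```python
# Hypothetical encoding of the toy grammar; names are illustrative.
RULES = [
    ("S", ("NP", "VP")),
    ("NP", ("he",)),
    ("NP", ("Mary",)),
    ("VP", ("V", "NP")),
    ("V", ("saw",)),
]
NONTERMINALS = {lhs for lhs, _ in RULES}

def derivations(line, target):
    """Yield every PS derivation (list of lines) from `line` to `target`,
    rewriting exactly one nonterminal occurrence per step."""
    if line == target:
        yield [line]
        return
    if len(line) > len(target):
        return  # this grammar never shrinks a line, so prune
    for i, sym in enumerate(line):
        if sym in NONTERMINALS:
            for lhs, rhs in RULES:
                if lhs == sym:
                    nxt = line[:i] + rhs + line[i + 1:]
                    for rest in derivations(nxt, target):
                        yield [line] + rest

TS = ("he", "saw", "Mary")
all_derivs = list(derivations(("S",), TS))
# The P-marker: every line occurring in any equivalent derivation.
PM = {" ".join(l) for d in all_derivs for l in d}
print(len(all_derivs))  # 8 equivalent derivations
print(sorted(PM))       # the 11-member set on the slide
```

Since the toy grammar generates 'he saw Mary' unambiguously, every complete derivation of it uses the same rules the same number of times, so all eight derivations are equivalent and their lines jointly form the PM.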

A P-marker must indicate what substrings of the terminal string are constituents, and for the ones that are, what the 'labels' of the constituents are. In LSLT, the fundamental predicate characterizing these relations is the 'is a' relation.

In our simplified example, the following are the 'is a' relations:

he is an NP
Mary is an NP
saw is a V
saw Mary is a VP
he saw Mary is an S

By comparing certain members of the PM with the terminal string one by one, one can compute all the ‘is a’ relations in the following fashion.


Compare ‘he saw Mary’ with ‘he VP’:

he saw Mary
he VP

From this we deduce that ‘saw Mary’ is a VP. Now, compare ‘he saw Mary’ with ‘he V Mary’:

he saw Mary
he V Mary

This determines that ‘saw’ is a V. And so on.

In general, no one derivation will have all of the strings necessary for this algorithm to yield all of the is a relations. That’s why LSLT specified that the PM contain all the strings in all of the derivations. In this regard, the LSLT theory of phrase structure was intensely derivational.
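The comparison procedure can be stated as a small algorithm. Here is a sketch in Python (the function name and the string encoding are mine): the terminal symbols flanking a monostring's single nonterminal are matched against the two ends of the terminal string, and whatever the nonterminal spans 'is a' that nonterminal.

```python
def is_a_relations(pm, terminal_string, nonterminals):
    """Read off the 'is a' relations by comparing each monostring in the
    phrase marker with the terminal string, as in the slides."""
    ts = terminal_string.split()
    relations = set()
    for line in pm:
        syms = line.split()
        nts = [s for s in syms if s in nonterminals]
        if len(nts) != 1:
            continue  # only monostrings are usable by the algorithm
        i = syms.index(nts[0])
        left, right = syms[:i], syms[i + 1:]
        # Terminals flanking the nonterminal must match the ends of the
        # terminal string; the residue in the middle is the constituent.
        if ts[:len(left)] != left:
            continue
        if right and ts[-len(right):] != right:
            continue
        middle = ts[len(left):len(ts) - len(right)]
        relations.add((" ".join(middle), nts[0]))
    return relations

PM = {"S", "NP VP", "he VP", "he V NP", "he V Mary", "he saw NP",
      "NP V NP", "NP saw NP", "NP saw Mary", "NP V Mary", "he saw Mary"}
rels = is_a_relations(PM, "he saw Mary", {"S", "NP", "VP", "V"})
# rels: he/NP, Mary/NP, saw/V, saw Mary/VP, he saw Mary/S
```

Run on the example PM, this yields exactly the five 'is a' relations listed above, and no single derivation's lines would suffice to produce all of them.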


We don’t need the entire P-marker to determine the ‘is a’ relations. Some members of the set, the ‘monostrings’, consist of exactly one non-terminal symbol and any number of terminal symbols. Those are the only ones the algorithm uses. (They are marked with * below): {S*, NP VP, he VP*, he V NP, he V Mary*, he saw NP*, NP V NP, NP saw NP, NP saw Mary*, NP V Mary, he saw Mary}

For this reason, Lasnik and Kupin (1977) proposed a revised construct to represent phrase structure: the Reduced Phrase Marker, the set consisting just of the terminal string and all the monostrings. [There was also a more significant change – to a non-derivational approach to phrase structure. More on this momentarily.]
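The Reduced Phrase Marker can be computed directly from the PM by a simple filter; a sketch (function names are mine, the definition is Lasnik and Kupin's):

```python
def reduced_phrase_marker(pm, terminal_string, nonterminals):
    """Lasnik and Kupin (1977): keep just the terminal string and the
    monostrings (exactly one nonterminal, any number of terminals)."""
    def is_monostring(line):
        return sum(s in nonterminals for s in line.split()) == 1
    return {terminal_string} | {l for l in pm if is_monostring(l)}

PM = {"S", "NP VP", "he VP", "he V NP", "he V Mary", "he saw NP",
      "NP V NP", "NP saw NP", "NP saw Mary", "NP V Mary", "he saw Mary"}
RPM = reduced_phrase_marker(PM, "he saw Mary", {"S", "NP", "VP", "V"})
# RPM: {S, he VP, he V Mary, he saw NP, NP saw Mary, he saw Mary}
```

For the running example, the eleven-member PM reduces to a six-member RPM with no loss of 'is a' information.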


McCawley (1968) argued against the LSLT derivational theory of phrase structure. One of his arguments was, roughly, that we could simplify the theory by creating constituent structure trees directly, rather than, he implies, by first going through all the equivalent derivations, etc. However, the trees in Chomsky’s presentations were almost entirely for expository purposes. Chomsky’s theory was set theoretic, not graph theoretic, so no conversion to trees was necessary, or even relevant. Another argument was more telling, and directly relevant to my concerns today: Chomsky’s phrase structure rules were ordered, hence demanding derivations. But McCawley showed that the ordering wasn’t necessary. (The ordering was invariably to capture contextual restrictions on particular lexical items, especially verbs. But once the model added a lexicon, as in Chomsky (1965), those ordering restrictions on PS rules became irrelevant.)


McCawley attributed to Richard Stanley a tree based theory that lacked derivations. Instead, it had “node admissibility conditions”. But nothing in principle precludes a set-theoretic model without derivations. Lasnik and Kupin (1977) presented just such a model. In that model, PMs are simply sets of (mono)strings that satisfy certain conditions.


The transformational component of the Chomskian model is, of course, highly derivational. It includes an initial PM (or set thereof). The source of the initial PM(s), derivational or not, is irrelevant. This is one among many Markovian aspects of transformational derivations.

In the original LSLT framework, the phrase structure derivations produced only simple mono-clausal structures, which could then be merged together by generalized transformations conjoining two (or possibly more) separate structures, or embedding one into another. The generalized transformations constituted the recursive component of the grammar. The GTs could interact with the singulary Ts (those operating on a single ‘tree’ rather than on a ‘forest’) in various ways. A major level of representation in this model was the T-marker, the record of all the Ts that applied and exactly how they interacted.

Chomsky’s (1965) innovation: Allow the phrase structure rule component itself to have a recursive character.

Chomsky's major arguments for this new organization were that it resulted in a simpler overall theory (eliminating one kind of operation), and at the same time it explained the absence of certain kinds of derivations that seemed not to be needed. The second of these points is our concern here.


Following Fillmore (1963), Chomsky argued that while there is extensive ordering among singulary transformations "... there are no known cases of ordering among generalized transformation although such ordering is permitted by the theory of Transformation-markers." Chomsky (1965, p.133)

Further, Chomsky claimed that while there are many cases of singulary transformations that must apply to a constituent sentence before it is embedded, or that must apply to a 'matrix' sentence after another sentence is embedded in it, "... there are no really convincing cases of singulary transformations that must apply to a matrix sentence before a sentence transform is embedded in it...."


The Aspects theory, with its elimination of generalized transformations, was a response to these concerns. As in the LSLT theory, there is still extensive ordering among singulary transformations. In both frameworks, the set of singulary transformations was seen as a linear sequence: an ordered list. (And, to the extent that the orderings were crucial, there was motivation for a derivational model, rather analogous to the opacity arguments for phonological derivations.) But, as they say in the infomercials, there’s more:


D-Structure → (cyclic transformations) → S-Structure

Given the full Aspects modification, this list of rules applies cyclically, first operating on the most deeply embedded clause, then the next most deeply embedded, and so on, working up the tree until they apply on the highest clause, the entire generalized P-marker.

Thus, singulary transformations apply to constituent sentences as if 'before' they are embedded, and to matrix sentences as if 'after' embedding has taken place. "The ordering possibilities that are permitted by the theory of Transformation-markers but apparently never put to use are now excluded in principle." Chomsky (1965, p.135)
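The cyclic regime can be illustrated schematically. In this sketch (the clause encoding, the dummy rule, and all names are invented for illustration), an ordered list of 'transformations' applies first on the most deeply embedded clause, then on each containing clause in turn:

```python
def apply_cyclically(clause, transformations, log=None):
    """Apply the ordered rule list bottom-up: recurse into embedded
    clauses first, then run every rule, in order, on this clause."""
    if log is None:
        log = []
    for sub in clause.get("embedded", []):
        apply_cyclically(sub, transformations, log)
    for t in transformations:
        clause["string"] = t(clause["string"])
        log.append((clause["label"], t.__name__))
    return log

def t1(s):  # a dummy singulary transformation (identity, for illustration)
    return s

inner = {"label": "S2", "string": "that Mary left", "embedded": []}
outer = {"label": "S1", "string": "John thinks S2", "embedded": [inner]}
order = apply_cyclically(outer, [t1])
# order: [('S2', 't1'), ('S1', 't1')] -- the embedded clause cycles first
```

Because the recursion reaches the bottom of the embedding before any rule applies, no rule can touch a matrix clause before its constituent clause has been fully cycled, which is exactly the ordering restriction the text describes.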

In passing, I note that the cyclic principle was one of two major syntactic innovations of the mid 1960s borrowed from phonology.

The principle was first formulated in Chomsky et al. (1956), and applied in the phonological analysis of a variety of languages from the early 1960s on, most notably in SPE (Chomsky and Halle (1968)).

"... it is natural to suppose that in general the phonetic shape of a complex unit ... will be determined by the inherent properties of its parts and the manner in which these parts are combined, and that similar rules will apply to units of different levels of complexity. These observations suggest a general principle for the application of rules of the phonological component, namely, what we will call the principle of the 'transformational cycle.'" [SPE p.15]


Interestingly, it was almost three decades before it became clear that the absence of certain kinds of derivations was not a real argument against generalized transformations. For it was not recursion in the base that excluded the unwanted derivations: It was the cyclic principle. And there are a variety of ways that a version of the cyclic principle could be grafted onto a theory with generalized transformations, as work of the last decade and a half has amply demonstrated.


Derivations and interfaces

In LSLT the T-marker is the interface with semantic interpretation, and the final derived syntactic representation is the interface with morphophonemics. In Aspects, we have:

[Slide diagram: D-Structure → semantic interpretation; transformations → S-Structure → phonological interpretation]


Already in Aspects, it was pointed out that some aspects of semantic interpretation (basically all except thematic relations) depend on derived structure. With the introduction of trace theory in the early 1970s, D-structure was no longer crucial even for thematic relations. That led to the ‘Extended Standard Theory’ and GB model:

[Slide diagram: D-Structure → S-Structure, branching to PF and LF]


BUT, there had already been suggestions in the late 1960s and very early 1970s for links to semantic and phonological interpretation not just at levels but internal to the derivation. Particularly important was the work of Bresnan (1971) wrt phonology (specifically sentence intonation) and Jackendoff (1969, 1972) wrt semantics (specifically anaphora). For both, the derivation-internal points were end-of-cycle structures. The picture then would look like:


D-Structure → (some) semantic interpretation
  each end-of-cycle structure → (some) semantic and phonological interpretation
  …
S-Structure → (some) semantic and phonological interpretation

Going further back in time, recall the Aspects arguments against GTs. The one based on non-occurring derivations turned out to miss the mark, as it was the cyclic principle, not recursion in the base, that blocked those derivations. The simplicity argument took for granted that PS rules exist, and that the Aspects modification thus reduced the class of basic operations from 3 to 2 (just singulary Ts and PS rules). But if there are no PS rules (as suggested by McCawley and again by Lasnik and Kupin) …


An early, or middle-aged, Minimalist model of structure building where everything is done by transformations. GTs don’t just merge clausal structures with clausal structures, but even lexical items with lexical items. The derivation begins with a selection of lexical items (the ‘numeration’), which are then combined; these combinations are then combined with other lexical items or other combinations, and so on. Meanwhile, interspersed with these instances of ‘external merge’ we also have ‘internal merge’, the classic singulary movement transformation. All these operations are constrained by some version of cyclicity (for example, always merge at the ‘root’ of the ‘tree’). Eventually (the point of ‘spellout’), the derivation branches to PF and LF.
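As a schematic sketch of this structure-building regime (the tuple encoding and function names are mine, and the lower copy left behind by movement is ignored here), the two flavors of merge might look like:

```python
def external_merge(a, b):
    """Combine two independent objects: lexical items or phrases."""
    return (a, b)

def internal_merge(tree, sub):
    """'Movement' as remerge of a subpart of `tree` at the root,
    respecting cyclicity (always extend the structure at the root).
    The copy/trace in the original position is omitted for simplicity."""
    return (sub, tree)

# Numeration: {he, saw, Mary}; build bottom-up by external merge ...
vp = external_merge("saw", "Mary")   # V merges with its object
tp = external_merge("he", vp)        # the subject merges with the VP
# ... then internal merge remerges the object at the root.
moved = internal_merge(tp, "Mary")
# moved == ('Mary', ('he', ('saw', 'Mary')))
```

On this view there is no separate phrase structure component at all: lexical items, phrases, and clauses are all combined by the same generalized operation, with movement as the special case of remerging material already in the tree.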


Numeration
   ↓
Spellout
  ↙   ↘
PF     LF

There are, though, a number of reasons for thinking that Bresnan and Jackendoff were fundamentally correct about the derivational nature of the interface relations. Among many examples, I could mention Barss (1986) on ‘reconstruction’ into intermediate positions; McCloskey (1991) and Torrego (1984) on intermediate footprints of wh-movement; Uriagereka (1999) on deriving certain island constraints; Fox and Pesetsky (2003) on forcing successive cyclic movement; and Merchant (2001), based on Perlmutter (1971), on distinctions between representational and derivational constraints. Grafting Bresnan’s and Jackendoff’s ideas onto the early minimalist model, we wind up with a ‘Multiple Spellout’ (Uriagereka (1999)) or Derivation by Phase (Chomsky (2001)) picture.


Numeration
   ↓  (each spellout point) → semantic and phonological interpretation
   ↓  …
End of derivation

What I find fascinating about all these developments is that the same ideas (and sometimes even arguments) keep coming back in slightly revised forms, and combining in slightly different ways. Does this mean these ideas must be right? Or is it just a poverty of imagination situation? I hope I’ll last long enough to find out. Maybe at the 100th anniversary?


Barss, Andrew. 1986. Chains and anaphoric dependence: On reconstruction and its implications. Doctoral dissertation, MIT, Cambridge, Mass.
Bresnan, Joan W. 1971. Sentence stress and syntactic transformations. Language 47: 257-281.
Chomsky, Noam. 1955. The logical structure of linguistic theory. Ms. Harvard University, Cambridge, Mass. and MIT, Cambridge, Mass. [Revised 1956 version published in part by Plenum, New York, 1975; University of Chicago Press, Chicago, 1985].
Chomsky, Noam. 1965. Aspects of the theory of syntax. Cambridge, Mass.: MIT Press.
Chomsky, Noam. 2001. Derivation by phase. In Ken Hale: A life in language, ed. Michael Kenstowicz, 1-52. Cambridge, Mass.: MIT Press.
Chomsky, Noam and Morris Halle. 1968. The sound pattern of English. New York: Harper and Row.
Chomsky, Noam, Morris Halle and Fred Lukoff. 1956. On accent and juncture in English. In For Roman Jakobson, ed. M. Halle, H. Lunt, and H. MacLean, 65-80. The Hague: Mouton & Co.
Fillmore, Charles J. 1963. The position of embedding transformations in a grammar. Word 19: 208-231.
Fox, Danny and David Pesetsky. 2003. Cyclic linearization and the typology of movement. Ms. MIT, Cambridge, Mass.
Jackendoff, Ray. 1969. Some rules of semantic interpretation for English. Doctoral dissertation, MIT, Cambridge, Mass.
Jackendoff, Ray. 1972. Semantic interpretation in generative grammar. Cambridge, Mass.: MIT Press.
Lasnik, Howard and Joseph J. Kupin. 1977. A restrictive theory of transformational grammar. Theoretical Linguistics 4: 173-196. [Reprinted in Essays on restrictiveness and learnability, Howard Lasnik, 17-41. Dordrecht: Kluwer, 1990]
McCawley, James. 1968. Concerning the base component of a transformational grammar. Foundations of Language 4: 243-269.
McCloskey, James. 1991. Resumptive pronouns, A'-binding, and levels of representation in Irish. In Syntax and Semantics 23: The Syntax of the Modern Celtic Languages, ed. Randall Hendrick, 199-248. New York: Academic Press.
Merchant, Jason. 2001. The syntax of silence: Sluicing, islands, and the theory of ellipsis. Oxford: Oxford University Press.
Perlmutter, David. 1971. Deep and surface constraints in syntax. New York: Holt, Rinehart and Winston.
Torrego, Esther. 1984. On inversion in Spanish and some of its effects. Linguistic Inquiry 15: 103-129.
Uriagereka, Juan. 1999. Multiple spell-out. In Working minimalism, ed. Samuel David Epstein and Norbert Hornstein, 251-282. Cambridge, Mass.: MIT Press.
