![A syntax tree in which the sentence (S) breaks down into a noun phrase (NP) and a verb phrase (VP), both of which break down into smaller constituents](https://upload.wikimedia.org/wikipedia/commons/thumb/2/28/Cgisf-tgg.png/1600px-Cgisf-tgg.png)
Generative grammar is a research tradition in linguistics that aims to explain the cognitive basis of language by formulating and testing explicit models of humans' subconscious grammatical knowledge. Generative linguists, or generativists (/ˈdʒɛnərətɪvɪsts/), tend to share certain working assumptions such as the competence–performance distinction and the notion that some domain-specific aspects of grammar are partly innate in humans. These assumptions are rejected in non-generative approaches such as usage-based models of language. Generative linguistics includes work in core areas such as syntax, semantics, phonology, psycholinguistics, and language acquisition, with additional extensions to topics including biolinguistics and music cognition.
Generative grammar began in the late 1950s with the work of Noam Chomsky, having roots in earlier approaches such as structural linguistics. The earliest version of Chomsky's model was called Transformational grammar, with subsequent iterations known as Government and binding theory and the Minimalist program. Other present-day generative models include Optimality theory, Categorial grammar, and Tree-adjoining grammar.
Principles
Generative grammar is an umbrella term for a variety of approaches to linguistics. What unites these approaches is the goal of uncovering the cognitive basis of language by formulating and testing explicit models of humans' subconscious grammatical knowledge.
Cognitive science
Generative grammar studies language as part of cognitive science. Thus, research in the generative tradition involves formulating and testing hypotheses about the mental processes that allow humans to use language.
Like other approaches in linguistics, generative grammar engages in linguistic description rather than linguistic prescription.
Explicitness and generality
Generative grammar proposes models of language consisting of explicit rule systems, which make testable, falsifiable predictions. This differs from traditional grammar, where grammatical patterns are often described more loosely. These models are intended to be parsimonious, capturing generalizations in the data with as few rules as possible. For example, because English imperative tag questions obey the same restrictions as second-person future declarative tags, Paul Postal proposed that the two constructions are derived from the same underlying structure. By adopting this hypothesis, he was able to capture the restrictions on tags with a single rule. This kind of reasoning is commonplace in generative research.
Particular theories within generative grammar have been expressed using a variety of formal systems, many of which are modifications or extensions of context free grammars.
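As a minimal illustration of the kind of formal system involved, the sketch below implements a toy context-free grammar in Python and enumerates the sentences it generates. The rules and vocabulary are invented for illustration and are not drawn from any particular generative analysis.

```python
import itertools

# A toy context-free grammar: each nonterminal maps to a list of possible
# expansions (sequences of nonterminals and terminal words).
GRAMMAR = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V", "NP"], ["V"]],
    "Det": [["the"], ["a"]],
    "N":   [["cat"], ["dog"]],
    "V":   [["meowed"], ["chased"]],
}

def generate(symbol="S"):
    """Yield every terminal string the grammar derives from `symbol`."""
    if symbol not in GRAMMAR:          # terminal word
        yield [symbol]
        return
    for expansion in GRAMMAR[symbol]:
        # Generate each daughter independently, then combine the results.
        daughters = [list(generate(s)) for s in expansion]
        for combo in itertools.product(*daughters):
            yield [word for part in combo for word in part]

for sentence in generate():
    print(" ".join(sentence))
# e.g. "the cat meowed", "a dog chased the cat", ...
```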
Competence versus performance
Generative grammar generally distinguishes linguistic competence and linguistic performance. Competence is the collection of subconscious rules that one knows when one knows a language; performance is the system which puts these rules to use. This distinction is related to the broader notion of Marr's levels used in other cognitive sciences, with competence corresponding to Marr's computational level.
For example, generative theories generally provide competence-based explanations for why English speakers would judge the sentence in (1) as odd. In these explanations, the sentence would be ungrammatical because the rules of English only generate sentences where demonstratives agree with the grammatical number of their associated noun.
- (1) *That cats is eating the mouse.
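A minimal sketch of how such an agreement requirement could be stated as an explicit rule is given below; the feature labels and toy lexicon are assumptions made for illustration rather than part of any published analysis.

```python
# Toy lexicon pairing demonstratives and nouns with a number feature.
DEMONSTRATIVES = {"this": "sg", "that": "sg", "these": "pl", "those": "pl"}
NOUNS = {"cat": "sg", "cats": "pl", "mouse": "sg", "mice": "pl"}

def demonstrative_np_ok(det, noun):
    """A demonstrative NP is generated only if the number features match."""
    return DEMONSTRATIVES[det] == NOUNS[noun]

print(demonstrative_np_ok("that", "cat"))   # True  -> "that cat" is generated
print(demonstrative_np_ok("that", "cats"))  # False -> "*that cats" is not
```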
By contrast, generative theories generally provide performance-based explanations for the oddness of center-embedding sentences like the one in (2). According to such explanations, the grammar of English could in principle generate such sentences, but doing so in practice is so taxing on working memory that the sentence ends up being unparsable.
- (2) *The cat that the dog that the man fed chased meowed.
In general, performance-based explanations deliver a simpler theory of grammar at the cost of additional assumptions about memory and parsing. As a result, the choice between a competence-based explanation and a performance-based explanation for a given phenomenon is not always obvious and can require investigating whether the additional assumptions are supported by independent evidence. For example, while many generative models of syntax explain island effects by positing constraints within the grammar, it has also been argued that some or all of these constraints are in fact the result of limitations on performance.
Non-generative approaches often do not posit any distinction between competence and performance. For instance, usage-based models of language assume that grammatical patterns arise as the result of usage.
Innateness and universality
A major goal of generative research is to figure out which aspects of linguistic competence are innate and which are not. Within generative grammar, it is generally accepted that at least some domain-specific aspects are innate, and the term "universal grammar" is often used as a placeholder for whichever those turn out to be.
The idea that at least some aspects are innate is motivated by poverty of the stimulus arguments. For example, one famous poverty of the stimulus argument concerns the acquisition of yes-no questions in English. This argument starts from the observation that children only make mistakes compatible with rules targeting hierarchical structure, even though the examples they encounter could have been generated by a simpler rule that targets linear order. In other words, children seem to ignore the possibility that the question rule is as simple as "switch the order of the first two words" and immediately jump to alternatives that rearrange constituents in tree structures. This is taken as evidence that children are born knowing that grammatical rules involve hierarchical structure, even though they have to figure out what those rules are. The empirical basis of poverty of the stimulus arguments has been challenged by Geoffrey Pullum and others, leading to back-and-forth debate in the language acquisition literature. Recent work has also suggested that some recurrent neural network architectures are able to learn hierarchical structure without an explicit constraint.
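The contrast between the two candidate rules can be made concrete with a small sketch. Assuming the classic test sentence "the man who is tall is happy" (chosen here only for illustration; the representation below is not taken from the acquisition literature), a purely linear rule fronts the wrong auxiliary, while a structure-dependent rule produces the attested question.

```python
# Declarative: "the man who is tall is happy".  The subject NP contains a
# relative clause with its own auxiliary "is"; the main clause has another.
subject = ["the", "man", "who", "is", "tall"]   # one constituent
main_aux = "is"
predicate = ["happy"]
words = subject + [main_aux] + predicate

def linear_rule(words):
    """Front the linearly first auxiliary, ignoring constituent structure."""
    ws = list(words)
    i = ws.index("is")                  # first "is" in the string
    return [ws.pop(i)] + ws

def structure_dependent_rule(subject, main_aux, predicate):
    """Front the main-clause auxiliary, treating the subject NP as a unit."""
    return [main_aux] + subject + predicate

print(" ".join(linear_rule(words)))
# -> "is the man who tall is happy"   (an error children do not make)
print(" ".join(structure_dependent_rule(subject, main_aux, predicate)))
# -> "is the man who is tall happy"   (the attested question)
```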
Within generative grammar, there are a variety of theories about what universal grammar consists of. One notable hypothesis proposed by Hagit Borer holds that the fundamental syntactic operations are universal and that all variation arises from different feature-specifications in the lexicon. On the other hand, a strong hypothesis adopted in some variants of Optimality Theory holds that humans are born with a universal set of constraints, and that all variation arises from differences in how these constraints are ranked. In a 2002 paper, Noam Chomsky, Marc Hauser and W. Tecumseh Fitch proposed that universal grammar consists solely of the capacity for hierarchical phrase structure.
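A minimal sketch of the ranking idea behind such Optimality-Theoretic proposals is given below; the two constraints (simplified versions of NoCoda and Max) and the candidate set are drastically reduced toy assumptions, not an analysis of any actual language.

```python
# Two toy OT-style constraints, each counting violations for a candidate.
def no_coda(inp, cand):
    """One violation if the candidate ends in a consonant (toy version)."""
    return 1 if cand and cand[-1] not in "aeiou" else 0

def max_io(inp, cand):
    """One violation per input segment deleted in the candidate."""
    return max(0, len(inp) - len(cand))

CONSTRAINTS = {"NoCoda": no_coda, "Max": max_io}

def optimal(inp, candidates, ranking):
    """Pick the candidate whose violation profile is best under the ranking:
    violations of higher-ranked constraints matter before lower-ranked ones."""
    def profile(cand):
        return [CONSTRAINTS[name](inp, cand) for name in ranking]
    return min(candidates, key=profile)

candidates = ["kat", "ka"]
print(optimal("kat", candidates, ["NoCoda", "Max"]))  # -> "ka"  (codas banned)
print(optimal("kat", candidates, ["Max", "NoCoda"]))  # -> "kat" (deletion banned)
```

With the same universal constraint set, reranking alone changes which candidate wins, which is the mechanism this hypothesis uses to model cross-linguistic variation.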
In day-to-day research, the notion that universal grammar exists motivates analyses in terms of general principles. As much as possible, facts about particular languages are derived from these general principles rather than from language-specific stipulations.
Subfields
Research in generative grammar spans a number of subfields. These subfields are also studied in non-generative approaches.
Syntax
Syntax studies the rule systems which combine smaller units such as morphemes into larger units such as phrases and sentences. Within generative syntax, prominent approaches include Minimalism, Government and binding theory, Lexical-functional grammar (LFG), and Head-driven phrase structure grammar (HPSG).
Phonology
Phonology studies the rule systems which organize linguistic sounds. For example, research in phonology includes work on phonotactic rules which govern which phonemes can be combined, as well as those that determine the placement of stress, tone, and other suprasegmental elements. Within generative grammar, a prominent approach to phonology is Optimality Theory.
Semantics
Semantics studies the rule systems that determine expressions' meanings. Within generative grammar, semantics is a species of formal semantics, providing compositional models of how the denotations of sentences are computed on the basis of the meanings of the individual morphemes and their syntactic structure.
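The sketch below illustrates the compositional idea in Python: word denotations are stored in a toy lexicon, and the meaning of a sentence is computed by applying functions to arguments as dictated by its binary-branching structure. The model, lexicon, and tree encoding are assumptions made for illustration, loosely in the style of textbook formal-semantics fragments.

```python
# A toy model: which individuals meow, and which ordered pairs stand in the
# chasing relation.  Denotations are Python values and functions.
MODEL = {"meows": {"garfield"}, "chases": {("odie", "garfield")}}

LEXICON = {
    "Garfield": "garfield",                                   # type e
    "Odie":     "odie",                                       # type e
    "meows":  lambda x: x in MODEL["meows"],                  # type <e,t>
    "chases": lambda y: lambda x: (x, y) in MODEL["chases"],  # type <e,<e,t>>
}

def interpret(node):
    """A leaf is a word; a branching node is a (function, argument) pair."""
    if isinstance(node, str):
        return LEXICON[node]
    fn, arg = node
    return interpret(fn)(interpret(arg))

# [S Garfield [VP meows]]        -> True
print(interpret(("meows", "Garfield")))
# [S Odie [VP chases Garfield]]  -> True
print(interpret((("chases", "Garfield"), "Odie")))
```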
Extensions
Music
Generative grammar has been applied to music theory and analysis since the 1980s. One notable approach is Fred Lerdahl and Ray Jackendoff's Generative theory of tonal music, which formalized and extended ideas from Schenkerian analysis.
Biolinguistics
Recent work in generative-inspired biolinguistics has proposed that universal grammar consists solely of syntactic recursion, and that it arose recently in humans as the result of a random genetic mutation. Generative-inspired biolinguistics has not uncovered any particular genes responsible for language. While some prospects were raised at the discovery of the FOXP2 gene, there is not enough support for the idea that it is 'the grammar gene' or that it had much to do with the relatively recent emergence of syntactical speech.
History
As a distinct research tradition, generative grammar began in the late 1950s with the work of Noam Chomsky. However, its roots include earlier structuralist approaches such as glossematics, which themselves had older roots, for instance in the work of the ancient Indian grammarian Pāṇini. Military funding for generative research was an important factor in its early spread in the 1960s.
The initial version of generative syntax was called transformational grammar. In transformational grammar, rules called transformations mapped a level of representation called deep structure to another level of representation called surface structure. The semantic interpretation of a sentence was represented by its deep structure, while the surface structure provided its pronunciation. For example, an active sentence such as "The doctor examined the patient" and its passive counterpart "The patient was examined by the doctor" had the same deep structure. The difference in surface structures arises from the application of the passivization transformation, which was assumed not to affect meaning. This assumption was challenged in the 1960s by the discovery of examples such as "Everyone in the room knows two languages" and "Two languages are known by everyone in the room".[citation needed]
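A highly simplified sketch of how such a transformation might be modeled is shown below; the tuple-based tree encoding and the passivization function are illustrative assumptions and greatly compress the machinery of actual transformational analyses.

```python
# Deep structure of "The doctor examined the patient", as a nested tuple.
DEEP = ("S",
        ("NP", "the doctor"),
        ("VP", ("V", "examined"), ("NP", "the patient")))

def passivize(deep):
    """Toy passivization transformation: promote the object, demote the
    subject into a by-phrase, and add passive morphology to the verb."""
    _, (_, agent), (_, (_, verb), (_, patient)) = deep
    return ("S",
            ("NP", patient),
            ("VP", ("V", "was " + verb), ("PP", "by " + agent)))

def spell_out(tree):
    """Read the terminal words off a tree, left to right."""
    if isinstance(tree, str):
        return tree
    return " ".join(spell_out(t) for t in tree[1:])

print(spell_out(DEEP))             # the doctor examined the patient
print(spell_out(passivize(DEEP)))  # the patient was examined by the doctor
```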
After the Linguistics wars of the late 1960s and early 1970s, Chomsky developed a revised model of syntax called Government and binding theory, which eventually grew into Minimalism. In the aftermath of those disputes, a variety of other generative models of syntax were proposed including relational grammar, Lexical-functional grammar (LFG), and Head-driven phrase structure grammar (HPSG).[citation needed]
Generative phonology originally focused on rewrite rules, in a system commonly known as SPE Phonology after the 1968 book The Sound Pattern of English by Chomsky and Morris Halle. In the 1990s, this approach was largely replaced by Optimality theory, which was able to capture generalizations called conspiracies which needed to be stipulated in SPE phonology.
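For a sense of what an SPE-style rewrite rule looks like in practice, the sketch below applies a rule of the schematic form A → B / C _ D to a string of segments; the specific rule (intervocalic /t/ voicing) and the forms are invented for illustration.

```python
import re

# A toy SPE-style rewrite rule stated over segment strings:
# t -> d / V _ V   (a /t/ becomes [d] between vowels).
def apply_rule(form, target="t", change="d", left="[aeiou]", right="[aeiou]"):
    # Lookbehind/lookahead encode the left and right contexts of the rule.
    pattern = f"(?<={left}){target}(?={right})"
    return re.sub(pattern, change, form)

print(apply_rule("pata"))   # -> "pada"
print(apply_rule("pat"))    # -> "pat"  (no following vowel, rule does not apply)
```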
Semantics emerged as a subfield of generative linguistics during the late 1970s, with the pioneering work of Richard Montague. Montague proposed a system called Montague grammar which consisted of interpretation rules mapping expressions from a bespoke model of syntax to formulas of intensional logic. Subsequent work by Barbara Partee, Irene Heim, Tanya Reinhart, and others showed that the key insights of Montague Grammar could be incorporated into more syntactically plausible systems.
See also
- Cognitive linguistics
- Cognitive revolution
- Digital infinity
- Formal grammar
- Functional theories of grammar
- Generative lexicon
- Generative metrics
- Generative principle
- Generative semantics
- Generative systems
- Parsing
- Phrase structure rules
- Syntactic Structures
References
- "Generativist". Dictionary.com Unabridged (Online). August 2024.
- Wasow, Thomas (2003). "Generative Grammar" (PDF). In Aronoff, Mark; Rees-Miller, Janie (eds.). The Handbook of Linguistics. Blackwell. pp. 296, 311. doi:10.1002/9780470756409.ch12.
...generative grammar is not so much a theory as a family of theories, or a school of thought... [having] shared assumptions and goals, widely used formal devices, and generally accepted empirical results
- Carnie, Andrew (2002). Syntax: A Generative Introduction. Wiley-Blackwell. p. 5. ISBN 978-0-631-22543-0.
- Carnie, Andrew (2002). Syntax: A Generative Introduction. Wiley-Blackwell. pp. 4–6, 8. ISBN 978-0-631-22543-0.
- Wasow, Thomas (2003). "Generative Grammar" (PDF). In Aronoff, Mark; Rees-Miller, Janie (eds.). The Handbook of Linguistics. Blackwell. pp. 295–296, 299–300. doi:10.1002/9780470756409.ch12.
- Adger, David (2003). Core syntax: A minimalist approach. Oxford University Press. p. 14. ISBN 978-0199243709.
- Carnie, Andrew (2002). Syntax: A Generative Introduction. Wiley-Blackwell. p. 8. ISBN 978-0-631-22543-0.
- Wasow, Thomas (2003). "Generative Grammar" (PDF). In Aronoff, Mark; Rees-Miller, Janie (eds.). The Handbook of Linguistics. Blackwell. pp. 295, 297. doi:10.1002/9780470756409.ch12.
- Wasow, Thomas (2003). "Generative Grammar" (PDF). In Aronoff, Mark; Rees-Miller, Janie (eds.). The Handbook of Linguistics. Blackwell. pp. 298–300. doi:10.1002/9780470756409.ch12.
- Adger, David (2003). Core syntax: A minimalist approach. Oxford University Press. pp. 14–15. ISBN 978-0199243709.
- Wasow, Thomas (2003). "Generative Grammar" (PDF). In Aronoff, Mark; Rees-Miller, Janie (eds.). The Handbook of Linguistics. Blackwell. pp. 297–298. doi:10.1002/9780470756409.ch12.
- Pritchett, Bradley (1992). Grammatical competence and parsing performance. University of Chicago Press. p. 2. ISBN 0-226-68442-3.
- Marr, David (1982). Vision. MIT Press. p. 28. ISBN 978-0262514620.
- Adger, David (2003). Core syntax: A minimalist approach. Oxford University Press. pp. 4–7, 17. ISBN 978-0199243709.
- Dillon, Brian; Momma, Shota (2021), Psychological background to linguistic theories (PDF) (Course notes)
- Sprouse, Jon; Wagers, Matt; Phillips, Colin (2013). "Deriving competing predictions from grammatical approaches and reductionist approaches to island effects". In Sprouse, Jon; Hornstein, Norbert (eds.). Experimental syntax and island effects. Cambridge University Press. doi:10.1017/CBO9781139035309.002.
- Phillips, Colin (2013). "On the nature of island constraints I: Language processing and reductionist accounts" (PDF). In Sprouse, Jon; Hornstein, Norbert (eds.). Experimental syntax and island effects. Cambridge University Press. doi:10.1017/CBO9781139035309.005.
- Hofmeister, Philip; Staum Casasanto, Laura; Sag, Ivan (2013). "Islands in the grammar? Standards of evidence". In Sprouse, Jon; Hornstein, Norbert (eds.). Experimental syntax and island effects. Cambridge University Press. doi:10.1017/CBO9781139035309.004.
- Vyvyan, Evans; Green, Melanie (2006). Cognitive Linguistics: An Introduction. Edinburgh University Press. pp. 108–111. ISBN 0-7486-1832-5.
- Wasow, Thomas (2003). "Generative Grammar" (PDF). In Aronoff, Mark; Rees-Miller, Janie (eds.). The Handbook of Linguistics. Blackwell. p. 299. doi:10.1002/9780470756409.ch12.
- Pesetsky, David (1999). "Linguistic universals and universal grammar". In Wilson, Robert; Keil, Frank (eds.). The MIT encyclopedia of the cognitive sciences. MIT Press. pp. 476–478. doi:10.7551/mitpress/4660.001.0001.
- Adger, David (2003). Core syntax: A minimalist approach. Oxford University Press. pp. 8–11. ISBN 978-0199243709.
- Lasnik, Howard; Lidz, Jeffrey (2017). "The Argument from the Poverty of the Stimulus" (PDF). In Roberts, Ian (ed.). The Oxford Handbook of Universal Grammar. Oxford University Press.
- Crain, Stephen; Nakayama, Mineharu (1987). "Structure dependence in grammar formation". Language. 63 (3). doi:10.2307/415004.
- Pullum, Geoff; Scholz, Barbara (2002). "Empirical assessment of stimulus poverty arguments". The Linguistic Review. 19 (1–2): 9–50. doi:10.1515/tlir.19.1-2.9.
- Legate, Julie Anne; Yang, Charles (2002). "Empirical re-assessment of stimulus poverty arguments" (PDF). The Linguistic Review. 19 (1–2): 151–162. doi:10.1515/tlir.19.1-2.9.
- McCoy, R. Thomas; Frank, Robert; Linzen, Tal (2018). "Revisiting the poverty of the stimulus: hierarchical generalization without a hierarchical bias in recurrent neural networks" (PDF). Proceedings of the 40th Annual Conference of the Cognitive Science Society: 2093–2098.
- Gallego, Ángel (2012). "Parameters". In Boeckx, Cedric (ed.). The Oxford Handbook of Linguistic Minimalism. Oxford University Press. doi:10.1093/oxfordhb/9780199549368.013.0023.
- McCarthy, John (1992). Doing optimality theory. Wiley. pp. 1–3. ISBN 978-1-4051-5136-8.
- Hauser, Marc; Chomsky, Noam; Fitch, W. Tecumseh (2002). "The faculty of language: what is it, who has it, and how did it evolve". Science. 298: 1569–1579. doi:10.1126/science.298.5598.1569.
- Carnie, Andrew (2002). Syntax: A Generative Introduction. Wiley-Blackwell. p. 25. ISBN 978-0-631-22543-0.
- Clements, Nick (1999). "Phonology". In Wilson, Robert; Keil, Frank (eds.). The MIT encyclopedia of the cognitive sciences. MIT Press. pp. 639–641. doi:10.7551/mitpress/4660.003.0026.
- Irene Heim; Angelika Kratzer (1998). Semantics in generative grammar. Wiley-Blackwell. ISBN 978-0-631-19713-3.
- Baroni, Mario; Maguire, Simon; Drabkin, William (1983). "The Concept of Musical Grammar". Music Analysis. 2 (2): 175–208. doi:10.2307/854248.
- Lerdahl, Fred; Ray Jackendoff (1983). A Generative Theory of Tonal Music. MIT Press. ISBN 978-0-262-62107-6.
- Berwick, Robert; Chomsky, Noam (2015). Why Only Us: Language and Evolution. MIT Press. ISBN 9780262034241.
- Scharff C, Haesler S (December 2005). "An evolutionary perspective on FoxP2: strictly for the birds?". Curr. Opin. Neurobiol. 15 (6): 694–703. doi:10.1016/j.conb.2005.10.004. PMID 16266802. S2CID 11350165.
- Scharff C, Petri J (July 2011). "Evo-devo, deep homology and FoxP2: implications for the evolution of speech and language". Philos. Trans. R. Soc. Lond. B Biol. Sci. 366 (1574): 2124–40. doi:10.1098/rstb.2011.0001. PMC 3130369. PMID 21690130.
- Diller, Karl C.; Cann, Rebecca L. (2009). Rudolf Botha; Chris Knight (eds.). Evidence Against a Genetic-Based Revolution in Language 50,000 Years Ago. Oxford Series in the Evolution of Language. Oxford: Oxford University Press. pp. 135–149. ISBN 978-0-19-954586-5.
- Newmeyer, Frederick (1986). Linguistic Theory in America. Academic Press. pp. 17–18. ISBN 0-12-517152-8.
- Koerner, E. F. K. (1978). "Towards a historiography of linguistics". Toward a Historiography of Linguistics: Selected Essays. John Benjamins. pp. 21–54.
- Bloomfield, Leonard, 1929, 274; cited in Rogers, David, 1987, 88
- Hockett, Charles, 1987, 41
- Newmeyer, F. J. (1986). Has there been a 'Chomskyan revolution' in linguistics?. Language, 62(1), p.13
- Partee, Barbara (2011). "Formal semantics: Origins, issues, early impact". The Baltic International Yearbook of Cognition, Logic and Communication. 6. CiteSeerX 10.1.1.826.5720.
- Crnič, Luka; Pesetsky, David; Sauerland, Uli (2014). "Introduction: Biographical Notes" (PDF). In Crnič, Luka; Sauerland, Uli (eds.). The art and craft of semantics: A Festschrift for Irene Heim.
Further reading
- Chomsky, Noam. 1965. Aspects of the theory of syntax. Cambridge, Massachusetts: MIT Press.
- Hurford, J. (1990) Nativist and functional explanations in language acquisition. In I. M. Roca (ed.), Logical Issues in Language Acquisition, 85–136. Foris, Dordrecht.
- Marantz, Alec (2019). "What do linguists do?" (PDF).
- Isac, Daniela; Charles Reiss (2013). I-language: An Introduction to Linguistics as Cognitive Science, 2nd edition. Oxford University Press. ISBN 978-0-19-953420-3.