Chomsky hierarchy
The Chomsky hierarchy (occasionally referred to as the Chomsky–Schützenberger hierarchy^{[1]}), in the fields of formal language theory, computer science, and linguistics, is a containment hierarchy of classes of formal grammars.
History
The general idea of a hierarchy of grammars was first described by Noam Chomsky in Chomsky 1956. Marcel-Paul Schützenberger also played a role in the development of the theory of formal languages; the paper Chomsky & Schützenberger 1963 describes the modern hierarchy including context-free grammars.^{[1]}
Formal grammars
A formal grammar of this type consists of a finite set of production rules (left-hand side → right-hand side), where each side consists of a finite sequence of the following symbols:
 a finite set of nonterminal symbols (indicating that some production rule can yet be applied)
 a finite set of terminal symbols (indicating that no production rule can be applied)
 a start symbol (a distinguished nonterminal symbol)
A formal grammar provides an axiom schema for (or generates) a formal language, which is a (usually infinite) set of finite-length sequences of symbols that may be constructed by applying production rules to another sequence of symbols (which initially contains just the start symbol). A rule may be applied by replacing an occurrence of the symbols on its left-hand side with those that appear on its right-hand side. A sequence of rule applications is called a derivation. Such a grammar defines the formal language: all words consisting solely of terminal symbols which can be reached by a derivation from the start symbol.
Nonterminals are often represented by uppercase letters, terminals by lowercase letters, and the start symbol by S. For example, the grammar with terminals {a, b}, nonterminals {S, A, B}, production rules
 S → AB
 S → ε (where ε is the empty string)
 A → aS
 B → b
and start symbol S, defines the language of all words of the form [math]\displaystyle{ a^n b^n }[/math] (i.e. n copies of a followed by n copies of b).
The following is a simpler grammar that defines the same language:
Terminals {a, b}, Nonterminals {S}, Start symbol S, Production rules
 S → aSb
 S → ε
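The derivation process described above can be sketched in code. The following is a minimal illustrative sketch (not part of the formal theory): it repeatedly rewrites the leftmost nonterminal of each sentential form and collects every all-terminal word up to a length bound. It assumes a grammar, like the one above, in which every derivation makes progress.

```python
def words(rules, start, max_len):
    """Collect every word of length <= max_len derivable from `start`.

    `rules` maps each nonterminal to a list of right-hand sides, each a
    tuple of symbols; symbols absent from `rules` are terminals.
    """
    results, frontier = set(), [(start,)]
    while frontier:
        form = frontier.pop()
        # find the leftmost nonterminal, if any
        i = next((k for k, s in enumerate(form) if s in rules), None)
        if i is None:                      # all terminals: a finished word
            if len(form) <= max_len:
                results.add("".join(form))
            continue
        # terminals are never rewritten, so over-long forms can be pruned
        if sum(1 for s in form if s not in rules) > max_len:
            continue
        for rhs in rules[form[i]]:         # apply each rule for that nonterminal
            frontier.append(form[:i] + rhs + form[i + 1:])
    return results

# The simpler grammar above: S -> aSb | epsilon
words({"S": [("a", "S", "b"), ()]}, "S", 6)   # {'', 'ab', 'aabb', 'aaabbb'}
```

Running it on the grammar S → aSb, S → ε yields exactly the words of the form a^n b^n up to the length bound.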
As another example, a grammar for a toy subset of the English language is given by:
 terminals
 {generate, hate, great, green, ideas, linguists}
 nonterminals
 {S, NP, VP, N, V, Adj}
 production rules
 S → NP VP
 NP → Adj NP
 NP → N
 VP → V NP
 VP → V
 N → ideas
 N → linguists
 V → generate
 V → hate
 Adj → great
 Adj → green
and start symbol S. An example derivation is
 S → NP VP → Adj NP VP → Adj N VP → Adj N V NP → Adj N V Adj NP → Adj N V Adj Adj NP → Adj N V Adj Adj N → great N V Adj Adj N → great linguists V Adj Adj N → great linguists generate Adj Adj N → great linguists generate great Adj N → great linguists generate great green N → great linguists generate great green ideas.
Other sequences that can be derived from this grammar are: "ideas hate great linguists", and "ideas generate". While these sentences are nonsensical, they are syntactically correct. A syntactically incorrect sentence (e.g. "ideas ideas great hate") cannot be derived from this grammar. See "Colorless green ideas sleep furiously" for a similar example given by Chomsky in 1957; see Phrase structure grammar and Phrase structure rules for more natural language examples and the problems of formal grammar in that area.
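The toy English grammar above lends itself to a short sketch (the dictionary encoding is my own, purely illustrative): each nonterminal maps to its alternatives, and a random derivation expands symbols until only terminals remain.

```python
import random

# The toy grammar above, encoded as nonterminal -> list of right-hand sides.
RULES = {
    "S":   [["NP", "VP"]],
    "NP":  [["Adj", "NP"], ["N"]],
    "VP":  [["V", "NP"], ["V"]],
    "N":   [["ideas"], ["linguists"]],
    "V":   [["generate"], ["hate"]],
    "Adj": [["great"], ["green"]],
}

def derive(symbol="S"):
    """Expand `symbol` by randomly chosen rules; terminals pass through."""
    if symbol not in RULES:            # terminal symbol: emit as-is
        return [symbol]
    rhs = random.choice(RULES[symbol]) # one derivation step
    out = []
    for s in rhs:
        out.extend(derive(s))
    return out

print(" ".join(derive()))  # e.g. a sentence like "great linguists hate ideas"
```

Every output is syntactically well-formed under the grammar, even when, as the article notes, it is semantically nonsensical.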
The hierarchy
The following table summarizes each of Chomsky's four types of grammars, the class of language it generates, the type of automaton that recognizes it, and the form its rules must have.
Grammar | Languages | Automaton | Production rules (constraints)* | Examples^{[2]}
--- | --- | --- | --- | ---
Type-0 | Recursively enumerable | Turing machine | [math]\displaystyle{ \gamma \rightarrow \alpha }[/math] ([math]\displaystyle{ \gamma }[/math] non-empty) | [math]\displaystyle{ L = \{w \mid w }[/math] describes a terminating Turing machine[math]\displaystyle{ \} }[/math]
Type-1 | Context-sensitive | Linear-bounded non-deterministic Turing machine | [math]\displaystyle{ \alpha A \beta \rightarrow \alpha \gamma \beta }[/math] | [math]\displaystyle{ L = \{a^nb^nc^n \mid n \gt 0\} }[/math]
Type-2 | Context-free | Non-deterministic pushdown automaton | [math]\displaystyle{ A \rightarrow \alpha }[/math] | [math]\displaystyle{ L = \{a^nb^n \mid n \gt 0\} }[/math]
Type-3 | Regular | Finite-state automaton | [math]\displaystyle{ A \rightarrow \text{a} }[/math] and [math]\displaystyle{ A \rightarrow \text{a}B }[/math] | [math]\displaystyle{ L = \{a^n \mid n \geq 0\} }[/math]
* Meaning of symbols: a is a terminal; A and B are nonterminals; α, β and γ are strings of terminals and/or nonterminals, where α and β may be empty but γ must be non-empty.
Note that the set of grammars corresponding to recursive languages is not a member of this hierarchy; these would be properly between Type-0 and Type-1.
Every regular language is context-free, every context-free language is context-sensitive, every context-sensitive language is recursive and every recursive language is recursively enumerable. These are all proper inclusions, meaning that there exist recursively enumerable languages that are not context-sensitive, context-sensitive languages that are not context-free and context-free languages that are not regular.^{[3]}
Type-0 grammars
Type-0 grammars include all formal grammars. They generate exactly all languages that can be recognized by a Turing machine. These languages are also known as the recursively enumerable or Turing-recognizable languages.^{[4]} Note that this is different from the recursive languages, which can be decided by an always-halting Turing machine.
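The difference can be illustrated with a sketch (my own framing, not from the article's sources): if some program can enumerate the words of a language, membership is semi-decidable, meaning the test halts on members but may run forever on non-members, whereas a decider must halt on every input.

```python
def semi_decide(word, enumerate_language):
    """Halts with True iff `word` appears in the enumeration.

    For a non-member that never appears, this loop runs forever:
    recognition without decision.
    """
    for w in enumerate_language():
        if w == word:
            return True

def anbn():
    """An illustrative enumerator for the language {a^n b^n | n >= 0}."""
    n = 0
    while True:
        yield "a" * n + "b" * n
        n += 1

semi_decide("aabb", anbn)           # halts and returns True
# semi_decide("aba", anbn) would loop forever: "aba" is never enumerated
```

For a recursive language one can instead bound the search and answer False, which is exactly the extra power of an always-halting Turing machine.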
Type-1 grammars
Type-1 grammars generate context-sensitive languages. These grammars have rules of the form [math]\displaystyle{ \alpha A\beta \rightarrow \alpha\gamma\beta }[/math] with [math]\displaystyle{ A }[/math] a nonterminal and [math]\displaystyle{ \alpha }[/math], [math]\displaystyle{ \beta }[/math] and [math]\displaystyle{ \gamma }[/math] strings of terminals and/or nonterminals. The strings [math]\displaystyle{ \alpha }[/math] and [math]\displaystyle{ \beta }[/math] may be empty, but [math]\displaystyle{ \gamma }[/math] must be non-empty. The rule [math]\displaystyle{ S \rightarrow \varepsilon }[/math] is allowed if [math]\displaystyle{ S }[/math] does not appear on the right side of any rule. The languages described by these grammars are exactly all languages that can be recognized by a linear bounded automaton (a non-deterministic Turing machine whose tape is bounded by a constant times the length of the input).
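As a concrete illustration (my own sketch), the table's context-sensitive example, the language of all a^n b^n c^n with n > 0, can be decided with a simple linear-space check, even though no pushdown automaton can recognize it:

```python
def in_anbncn(w):
    """Decide membership in {a^n b^n c^n | n > 0} by direct comparison."""
    n = len(w) // 3
    # the word must split into three equal blocks of a's, b's and c's
    return n > 0 and w == "a" * n + "b" * n + "c" * n

in_anbncn("aabbcc")  # True
in_anbncn("aabbc")   # False: block lengths differ
```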
Type-2 grammars
Type-2 grammars generate the context-free languages. These are defined by rules of the form [math]\displaystyle{ A \rightarrow \alpha }[/math] with [math]\displaystyle{ A }[/math] being a nonterminal and [math]\displaystyle{ \alpha }[/math] being a string of terminals and/or nonterminals. These languages are exactly all languages that can be recognized by a non-deterministic pushdown automaton. Context-free languages, or rather their subset of deterministic context-free languages, are the theoretical basis for the phrase structure of most programming languages, though their syntax also includes context-sensitive name resolution due to declarations and scope. Often a subset of grammars is used to make parsing easier, such as by an LL parser.
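A pushdown automaton's single stack is what lets it match one kind of nesting. The following is an illustrative sketch (my own encoding) of a deterministic pushdown-style recognizer for the table's context-free example {a^n b^n | n > 0}:

```python
def pda_anbn(w):
    """Recognize {a^n b^n | n > 0}: push on 'a', pop on 'b'."""
    stack = []
    seen_b = False
    for ch in w:
        if ch == "a" and not seen_b:
            stack.append("A")        # push one marker per leading 'a'
        elif ch == "b" and stack:
            seen_b = True
            stack.pop()              # pop one marker per 'b'
        else:
            return False             # wrong symbol order, or stack underflow
    return seen_b and not stack      # n > 0 and the counts matched exactly
```

Accepting requires an empty stack at the end, which is why the a's and b's must balance; a finite automaton, having no stack, cannot perform this unbounded count.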
Type-3 grammars
Type-3 grammars generate the regular languages. Such a grammar restricts its rules to a single nonterminal on the left-hand side and a right-hand side consisting of a single terminal, possibly followed by a single nonterminal (right-regular). Alternatively, the right-hand side of the grammar can consist of a single terminal, possibly preceded by a single nonterminal (left-regular). These generate the same languages. However, if left-regular rules and right-regular rules are combined, the language need no longer be regular. The rule [math]\displaystyle{ S \rightarrow \varepsilon }[/math] is also allowed here if [math]\displaystyle{ S }[/math] does not appear on the right side of any rule. These languages are exactly all languages that can be decided by a finite-state automaton. Additionally, this family of formal languages can be obtained by regular expressions. Regular languages are commonly used to define search patterns and the lexical structure of programming languages.
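As an illustration (my own encoding), the table's regular example {a^n | n ≥ 0} is recognized by a one-state-plus-sink deterministic finite automaton, or equivalently by the regular expression a*:

```python
import re

# Transition table of the DFA: stay in the accepting state on 'a';
# any missing transition falls into an implicit non-accepting sink.
DFA = {
    ("ok", "a"): "ok",
}

def dfa_accepts(w):
    """Run the DFA over `w`; accept iff it ends in the accepting state."""
    state = "ok"
    for ch in w:
        state = DFA.get((state, ch), "dead")   # undefined transition: reject
    return state == "ok"

dfa_accepts("aaa")                      # True
dfa_accepts("ab")                       # False
bool(re.fullmatch("a*", "aaa"))         # True: the equivalent regex
```

The two mechanisms agree on every input, which is the content of the equivalence between finite automata and regular expressions.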
References
 ↑ ^{1.0} ^{1.1} Allott, Nicholas; Lohndal, Terje; Rey, Georges (27 April 2021). "Synoptic Introduction". A Companion to Chomsky: 1–17. doi:10.1002/9781119598732.ch1. https://www.researchgate.net/profile/NicholasAllott/publication/351812216_Synoptic_Introduction/links/641ff75c66f8522c38d42fd4/SynopticIntroduction.pdf.
 ↑ Geuvers, H.; Rot, J. (2016). "Applications, Chomsky hierarchy, and Recap". Regular Languages. https://www.cs.ru.nl/~herman/onderwijs/2016TnA/lecture7.pdf.
 ↑ Chomsky, Noam (1963). "Chapter 12: Formal Properties of Grammars". in Luce, R. Duncan; Bush, Robert R.; Galanter, Eugene. Handbook of Mathematical Psychology. II. John Wiley and Sons, Inc.. pp. 323–418.
 ↑ Sipser, Michael (1997). Introduction to the Theory of Computation (1st ed.). Cengage Learning. p. 130. ISBN 0-534-94728-X. https://archive.org/details/introductiontoth00sips/page/130. "The Church–Turing Thesis"
 Chomsky, Noam (1956). "Three models for the description of language". IRE Transactions on Information Theory 2 (3): 113–124. doi:10.1109/TIT.1956.1056813. https://chomsky.info/wpcontent/uploads/195609.pdf.
 Chomsky, Noam (1959). "On certain formal properties of grammars". Information and Control 2 (2): 137–167. doi:10.1016/S0019-9958(59)90362-6. https://www.sciencedirect.com/science/article/pii/S0019995859903626/pdf?md5=9d466f851651bd592afa5ee561b7a0b0&pid=1s2.0S0019995859903626main.pdf.
 Chomsky, Noam; Schützenberger, Marcel P. (1963). "The algebraic theory of context-free languages". in Braffort, P.; Hirschberg, D.. Computer Programming and Formal Systems. Amsterdam: North Holland. pp. 118–161. http://wwwigm.univmlv.fr/~berstel/Mps/Travaux/A/19637ChomskyAlgebraic.pdf.
 Davis, Martin D.; Sigal, Ron (1994). Computability, Complexity, and Languages: Fundamentals of Theoretical Computer Science (2nd ed.). Boston: Academic Press, Harcourt, Brace. p. 327. ISBN 0-12-206382-1. https://archive.org/details/computabilitycom00davi_405.
Original source: https://en.wikipedia.org/wiki/Chomsky_hierarchy.