DisCoCat


DisCoCat (Categorical Compositional Distributional) is a mathematical framework for natural language processing which uses category theory to unify distributional semantics with the principle of compositionality. The grammatical derivations in a categorial grammar (usually a pregroup grammar) are interpreted as linear maps acting on the tensor product of word vectors to produce the meaning of a sentence or a piece of text. String diagrams are used to visualise information flow and reason about natural language semantics.

History

The framework was first introduced by Bob Coecke, Mehrnoosh Sadrzadeh, and Stephen Clark[1] as an application of categorical quantum mechanics to natural language processing. It started with the observation that pregroup grammars and quantum processes share a common mathematical structure: both form a rigid category (also known as a non-symmetric compact closed category). As such, both admit a graphical calculus that allows purely diagrammatic reasoning. Although the analogy with quantum mechanics was kept informal at first, it eventually led to the development of quantum natural language processing.[2][3]

Definition

There are multiple definitions of DisCoCat in the literature, depending on the choice made for the compositional aspect of the model. What all existing versions have in common is the categorical definition of DisCoCat as a structure-preserving functor from a category of grammar to a category of semantics, the latter usually encoding the distributional hypothesis.

The original paper[1] used the categorical product of FinVect with a pregroup seen as a posetal category. This approach has a shortcoming: all parallel arrows of a posetal category are equal, so pregroups cannot distinguish between different grammatical derivations of the same syntactically ambiguous sentence.[4] Put more intuitively, one wants to describe grammar with diagrams rather than with partial orders.

This problem is overcome when one considers the free rigid category [math]\displaystyle{ \mathbf{G} }[/math] generated by the pregroup grammar.[5] That is, [math]\displaystyle{ \mathbf{G} }[/math] has generating objects for the words and the basic types of the grammar, and generating arrows [math]\displaystyle{ w \to t }[/math] for the dictionary entries which assign a pregroup type [math]\displaystyle{ t }[/math] to a word [math]\displaystyle{ w }[/math]. The arrows [math]\displaystyle{ f: w_1 \dots w_n \to s }[/math] are grammatical derivations for the sentence [math]\displaystyle{ w_1 \dots w_n }[/math] which can be represented as string diagrams with cups and caps, i.e. adjunction units and counits.[6]
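
For illustration, a pregroup reduction of this kind can be sketched in a few lines of Python. The representation below is a toy one (basic types with an integer counting adjoints) and is not taken from any DisCoCat implementation; it greedily contracts adjacent adjoint pairs, which suffices for the example sentence but is not a full parser.

from dataclasses import dataclass

@dataclass(frozen=True)
class Simple:
    base: str      # basic type, e.g. "n" (noun) or "s" (sentence)
    adj: int = 0   # 0 = plain type, +1 = right adjoint, -1 = left adjoint

def reduce_once(types):
    # Contract one adjacent pair x^(z) · x^(z+1); this covers both x·x^r and x^l·x.
    for i in range(len(types) - 1):
        a, b = types[i], types[i + 1]
        if a.base == b.base and b.adj == a.adj + 1:
            return types[:i] + types[i + 2:]
    return None

def reduce_fully(types):
    # Greedy left-to-right contraction: enough for this example, not a general parser.
    while (nxt := reduce_once(types)) is not None:
        types = nxt
    return types

# "Alice loves Bob": n · (n^r s n^l) · n
alice = [Simple("n")]
loves = [Simple("n", +1), Simple("s"), Simple("n", -1)]
bob = [Simple("n")]
print(reduce_fully(alice + loves + bob))  # [Simple(base='s', adj=0)], i.e. the sentence type s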

With this definition of pregroup grammars as free rigid categories, DisCoCat models can be defined as strong monoidal functors [math]\displaystyle{ F : \mathbf{G} \to \mathbf{FinVect} }[/math]. Spelling things out in detail, they assign a finite dimensional vector space [math]\displaystyle{ F(x) }[/math] to each basic type [math]\displaystyle{ x }[/math] and a vector [math]\displaystyle{ F(w) \in F(t) = F(t_1) \otimes \dots \otimes F(t_n) }[/math] in the appropriate tensor product space to each dictionary entry [math]\displaystyle{ w \to t }[/math] where [math]\displaystyle{ t = t_1 \dots t_n }[/math] (objects for words are sent to the monoidal unit, i.e. [math]\displaystyle{ F(w) = 1 }[/math]). The meaning of a sentence [math]\displaystyle{ f: w_1 \dots w_n \to s }[/math] is then given by a vector [math]\displaystyle{ F(f) \in F(s) }[/math] which can be computed as the contraction of a tensor network.[7]
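
A minimal NumPy sketch of such a functor, with toy two-dimensional noun and sentence spaces and made-up verb entries, shows how the cups of the derivation become tensor contractions; the dimensions and numbers below are purely illustrative.

import numpy as np

# Toy semantics: F(n) and F(s) are both 2-dimensional.
alice = np.array([1.0, 0.0])        # F(Alice), a vector in F(n)
bob = np.array([0.0, 1.0])          # F(Bob), a vector in F(n)
loves = np.random.rand(2, 2, 2)     # F(loves), a tensor in F(n) ⊗ F(s) ⊗ F(n)

# The derivation n · n^r s n^l · n -> s contracts the subject with the verb's
# first index and the object with its last index, leaving a vector in F(s).
sentence = np.einsum("i,isj,j->s", alice, loves, bob)
print(sentence)                     # the meaning vector of "Alice loves Bob"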

The reason behind the choice of [math]\displaystyle{ \mathbf{FinVect} }[/math] as the category of semantics is that vector spaces are the usual setting of distributional semantics in computational linguistics and natural language processing. The underlying idea of the distributional hypothesis, that "a word is characterized by the company it keeps", is particularly relevant when assigning meaning to words such as adjectives or verbs, whose semantic connotation depends strongly on context.
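
As a toy illustration of this idea (independent of DisCoCat itself), distributional word vectors can be built from co-occurrence counts over a small corpus; the corpus and window size below are invented for the example.

from collections import Counter, defaultdict

# Count, for every word, which words appear within a +-2 token window around it.
corpus = "cats chase mice . dogs chase cats . mice fear cats".split()
window = 2
vectors = defaultdict(Counter)
for i, word in enumerate(corpus):
    for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
        if j != i:
            vectors[word][corpus[j]] += 1

print(vectors["cats"])  # the "company" that the word "cats" keeps, as raw counts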

Variations

Variations of DisCoCat have been proposed with different choices for the grammar category. The main motivation is that pregroup grammars have been proved to be weakly equivalent to context-free grammars.[8] One variation[9] uses combinatory categorial grammar as the grammar category.

List of linguistic phenomena

The DisCoCat framework has been used to study linguistic phenomena such as sentence entailment,[10] coordination,[11] graded hyponymy,[12] lexical ambiguity,[13] discourse,[14] ellipsis and anaphora,[15] and language change.[16]

Applications in NLP

The DisCoCat framework has been applied to natural language processing tasks such as measuring sentence similarity,[17] word-sense disambiguation,[18] learning word and sentence representations by regression,[19] question answering,[20] machine translation,[21] and discourse analysis.[22]

References

  1. Coecke, Bob; Sadrzadeh, Mehrnoosh; Clark, Stephen (2010-03-23). "Mathematical Foundations for a Compositional Distributional Model of Meaning". arXiv:1003.4394 [cs.CL].
  2. Zeng, William; Coecke, Bob (2016-08-02). "Quantum Algorithms for Compositional Natural Language Processing". Electronic Proceedings in Theoretical Computer Science 221: 67–75. doi:10.4204/EPTCS.221.8. ISSN 2075-2180. 
  3. Coecke, Bob; de Felice, Giovanni; Meichanetzidis, Konstantinos; Toumi, Alexis (2020-12-07). "Foundations for Near-Term Quantum Natural Language Processing". arXiv:2012.03755 [quant-ph].
  4. Preller, Anne (2014-12-27). "From Logical to Distributional Models". Electronic Proceedings in Theoretical Computer Science 171: 113–131. doi:10.4204/EPTCS.171.11. ISSN 2075-2180. 
  5. Preller, Anne; Lambek, Joachim (2007-01-18). "Free Compact 2-Categories". Mathematical Structures in Computer Science 17: 309. doi:10.1017/S0960129506005901. https://hal-lirmm.ccsd.cnrs.fr/lirmm-00137681. 
  6. Selinger, Peter (2010). "A survey of graphical languages for monoidal categories". New Structures for Physics. Lecture Notes in Physics. 813. pp. 289–355. doi:10.1007/978-3-642-12821-9_4. ISBN 978-3-642-12820-2. 
  7. de Felice, Giovanni; Meichanetzidis, Konstantinos; Toumi, Alexis (2020-09-15). "Functorial Question Answering". Electronic Proceedings in Theoretical Computer Science 323: 84–94. doi:10.4204/EPTCS.323.6. ISSN 2075-2180. 
  8. Buszkowski, Wojciech (2001). "Lambek grammars based on pregroups". International Conference on Logical Aspects of Computational Linguistics. 
  9. Yeung, Richie; Kartsaklis, Dimitri (2021). "A CCG-based version of the DisCoCat framework". arXiv:2105.07720 [cs.CL].
  10. Sadrzadeh, Mehrnoosh; Kartsaklis, Dimitri; Balkır, Esma (2018). "Sentence entailment in compositional distributional semantics". Annals of Mathematics and Artificial Intelligence 82 (4): 189–218. doi:10.1007/s10472-017-9570-x. 
  11. Kartsaklis, Dimitri (2016). "Coordination in Categorical Compositional Distributional Semantics". Electronic Proceedings in Theoretical Computer Science 221: 29–38. doi:10.4204/EPTCS.221.4. 
  12. Bankova, Dea; Coecke, Bob; Lewis, Martha; Marsden, Dan (2018). "Graded hyponymy for compositional distributional semantics". Journal of Language Modelling 6 (2): 225–260. 
  13. Meyer, Francois; Lewis, Martha (2020-10-12). "Modelling Lexical Ambiguity with Density Matrices". arXiv:2010.05670 [cs.CL].
  14. Coecke, Bob; de Felice, Giovanni; Marsden, Dan; Toumi, Alexis (2018-11-08). "Towards Compositional Distributional Discourse Analysis". Electronic Proceedings in Theoretical Computer Science 283: 1–12. doi:10.4204/EPTCS.283.1. ISSN 2075-2180. 
  15. Wijnholds, Gijs; Sadrzadeh, Mehrnoosh (2019). "A type-driven vector semantics for ellipsis with anaphora using lambek calculus with limited contraction". Journal of Logic, Language and Information 28 (2): 331–358. doi:10.1007/s10849-019-09293-4. 
  16. Bradley, Tai-Danae; Lewis, Martha; Master, Jade; Theilman, Brad (2018). "Translating and Evolving: Towards a Model of Language Change in DisCoCat". Electronic Proceedings in Theoretical Computer Science 283: 50–61. doi:10.4204/EPTCS.283.4. 
  17. Grefenstette, Edward; Sadrzadeh, Mehrnoosh (2011-06-20). "Experimental Support for a Categorical Compositional Distributional Model of Meaning". arXiv:1106.4058 [cs.CL].
  18. Kartsaklis, Dimitri; Sadrzadeh, Mehrnoosh (2013). "Prior disambiguation of word tensors for constructing sentence vectors". https://www.researchgate.net/publication/262373873. 
  19. Grefenstette, Edward; Dinu, Georgiana; Zhang, Yao-Zhong; Sadrzadeh, Mehrnoosh; Baroni, Marco (2013-01-30). "Multi-Step Regression Learning for Compositional Distributional Semantics". arXiv:1301.6939 [cs.CL].
  20. de Felice, Giovanni; Meichanetzidis, Konstantinos; Toumi, Alexis (2019). "Functorial Question Answering". Electronic Proceedings in Theoretical Computer Science 323: 84–94. doi:10.4204/EPTCS.323.6. 
  21. Tyrrell, Brian (2018-11-08). "Applying Distributional Compositional Categorical Models of Meaning to Language Translation". Electronic Proceedings in Theoretical Computer Science 283: 28–49. doi:10.4204/EPTCS.283.3. ISSN 2075-2180. 
  22. Coecke, Bob; de Felice, Giovanni; Marsden, Dan; Toumi, Alexis (2018-11-08). "Towards Compositional Distributional Discourse Analysis". Electronic Proceedings in Theoretical Computer Science 283: 1–12. doi:10.4204/EPTCS.283.1. ISSN 2075-2180.