Composition of relations

In the mathematics of binary relations, the composition of relations is the forming of a new binary relation R; S from two given binary relations R and S. In the calculus of relations, the composition of relations is called relative multiplication,[1] and its result is called a relative product.[2]:40 Function composition is the special case of composition of relations where all relations involved are functions.

The word uncle indicates a compound relation: for a person to be an uncle, he must be the brother of a parent. In algebraic logic it is said that the relation of Uncle ([math]\displaystyle{ x U z }[/math]) is the composition of relations "is a brother of" ([math]\displaystyle{ x B y }[/math]) and "is a parent of" ([math]\displaystyle{ y P z }[/math]). [math]\displaystyle{ U = BP \quad \text{ is equivalent to: } \quad xByPz \text{ if and only if } xUz. }[/math]

Beginning with Augustus De Morgan,[3] the traditional form of reasoning by syllogism has been subsumed by relational logical expressions and their composition.[4]

Definition

If [math]\displaystyle{ R \subseteq X \times Y }[/math] and [math]\displaystyle{ S \subseteq Y \times Z }[/math] are two binary relations, then their composition [math]\displaystyle{ R; S }[/math] is the relation [math]\displaystyle{ R; S = \{(x,z) \in X \times Z : \text{ there exists } y \in Y \text{ such that } (x,y) \in R \text{ and } (y,z) \in S\}. }[/math]

In other words, [math]\displaystyle{ R; S \subseteq X \times Z }[/math] is defined by the rule that says [math]\displaystyle{ (x,z) \in R; S }[/math] if and only if there is an element [math]\displaystyle{ y \in Y }[/math] such that [math]\displaystyle{ x\,R\,y\,S\,z }[/math] (that is, [math]\displaystyle{ (x,y) \in R }[/math] and [math]\displaystyle{ (y,z) \in S }[/math]).[5]:13
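
As a concrete illustration of this definition, here is a minimal sketch (not from the article) that represents relations as Python sets of ordered pairs; the helper name compose is illustrative.

```python
# Relations as sets of ordered pairs; compose(R, S) returns R ; S.
def compose(R, S):
    """R ; S = {(x, z) : there exists y with (x, y) in R and (y, z) in S}."""
    return {(x, z) for (x, y1) in R for (y2, z) in S if y1 == y2}

# The "uncle" example: "is a brother of" composed with "is a parent of".
B = {("Bob", "Alice")}    # Bob is a brother of Alice
P = {("Alice", "Carol")}  # Alice is a parent of Carol
print(compose(B, P))      # {('Bob', 'Carol')} -- Bob is an uncle of Carol
```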

Notational variations

The semicolon as an infix notation for composition of relations dates back to Ernst Schröder's textbook of 1895.[6] Gunther Schmidt has renewed the use of the semicolon, particularly in Relational Mathematics (2011).[2]:40[7] The use of the semicolon coincides with the notation for function composition used (mostly by computer scientists) in category theory,[8] as well as the notation for dynamic conjunction within linguistic dynamic semantics.[9]

A small circle [math]\displaystyle{ (R \circ S) }[/math] has been used for the infix notation of composition of relations by John M. Howie in his books on semigroups of relations.[10] However, the small circle is widely used to represent composition of functions [math]\displaystyle{ g(f(x)) = (g \circ f)(x), }[/math] which reverses the text sequence from the operation sequence. The small circle was used in the introductory pages of Relations and Graphs[5]:18 until it was dropped in favor of juxtaposition (no infix notation). Juxtaposition [math]\displaystyle{ (RS) }[/math] is commonly used in algebra to signify multiplication, so it can likewise signify relative multiplication.

Further, with the circle notation, subscripts may be used. Some authors[11] prefer to write [math]\displaystyle{ \circ_l }[/math] and [math]\displaystyle{ \circ_r }[/math] explicitly when necessary, depending on whether the left or the right relation is the first one applied. A further variation encountered in computer science is the Z notation: [math]\displaystyle{ \circ }[/math] is used to denote the traditional (right) composition, while ⨾ (U+2A3E Z NOTATION RELATIONAL COMPOSITION) denotes left composition.[12][13]

Mathematical generalizations

Binary relations [math]\displaystyle{ R \subseteq X\times Y }[/math] are morphisms [math]\displaystyle{ R : X\to Y }[/math] in the category [math]\displaystyle{ \mathsf{Rel} }[/math]. In Rel the objects are sets, the morphisms are binary relations and the composition of morphisms is exactly composition of relations as defined above. The category Set of sets and functions is a subcategory of [math]\displaystyle{ \mathsf{Rel} }[/math] where the maps [math]\displaystyle{ X\to Y }[/math] are functions [math]\displaystyle{ f:X\to Y }[/math].

Given a regular category [math]\displaystyle{ \mathbb{X} }[/math], its category of internal relations [math]\displaystyle{ \mathsf{Rel}(\mathbb{X}) }[/math] has the same objects as [math]\displaystyle{ \mathbb{X} }[/math], but now the morphisms [math]\displaystyle{ X\to Y }[/math] are given by subobjects [math]\displaystyle{ R\subseteq X\times Y }[/math] in [math]\displaystyle{ \mathbb{X} }[/math].[14] Formally, these are jointly monic spans between [math]\displaystyle{ X }[/math] and [math]\displaystyle{ Y }[/math]. Categories of internal relations are allegories. In particular [math]\displaystyle{ \mathsf{Rel}(\mathsf{Set})\cong \mathsf{Rel} }[/math]. Given a field [math]\displaystyle{ k }[/math] (or more generally a principal ideal domain), the category of relations internal to matrices over [math]\displaystyle{ k }[/math], [math]\displaystyle{ \mathsf{Rel}(\mathsf{Mat}(k)), }[/math] has as morphisms [math]\displaystyle{ n\to m }[/math] the linear subspaces [math]\displaystyle{ R \subseteq k^n\oplus k^m }[/math]. The category of linear relations over the finite field [math]\displaystyle{ \mathbb{F}_2 }[/math] is isomorphic to the phase-free qubit ZX-calculus modulo scalars.

Properties

  • Composition of relations is associative: [math]\displaystyle{ R;(S;T) = (R;S);T. }[/math] (A concrete check of this identity and of the converse rule below appears in the sketch after this list.)
  • The converse relation of [math]\displaystyle{ R \, ; S }[/math] is [math]\displaystyle{ (R \, ; S)^\textsf{T} = S^{\textsf{T}} \, ; R^{\textsf{T}}. }[/math] This property makes the set of all binary relations on a set a semigroup with involution.
  • The composition of (partial) functions (that is, functional relations) is again a (partial) function.
  • If [math]\displaystyle{ R }[/math] and [math]\displaystyle{ S }[/math] are injective, then [math]\displaystyle{ R \, ; S }[/math] is injective, which conversely implies only the injectivity of [math]\displaystyle{ R. }[/math]
  • If [math]\displaystyle{ R }[/math] and [math]\displaystyle{ S }[/math] are surjective, then [math]\displaystyle{ R \, ; S }[/math] is surjective, which conversely implies only the surjectivity of [math]\displaystyle{ S. }[/math]
  • The set of binary relations on a set [math]\displaystyle{ X }[/math] (that is, relations from [math]\displaystyle{ X }[/math] to [math]\displaystyle{ X }[/math]) together with (left or right) relation composition forms a monoid with zero, where the identity map on [math]\displaystyle{ X }[/math] is the neutral element, and the empty set is the zero element.
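
The following sketch (helper names are illustrative, not from the article) checks associativity and the converse rule on small concrete relations.

```python
# Check associativity and the converse rule (R ; S)^T = S^T ; R^T
# on small concrete relations.
def compose(R, S):
    return {(x, z) for (x, y1) in R for (y2, z) in S if y1 == y2}

def converse(R):
    return {(y, x) for (x, y) in R}

R = {(1, 2), (2, 2)}
S = {(2, 3), (3, 1)}
T = {(3, 1), (1, 1)}

assert compose(R, compose(S, T)) == compose(compose(R, S), T)        # associativity
assert converse(compose(R, S)) == compose(converse(S), converse(R))  # converse rule
```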

Composition in terms of matrices

Finite binary relations are represented by logical matrices. The entries of these matrices are either zero or one, depending on whether the relation represented is false or true for the row and column corresponding to compared objects. Working with such matrices involves the Boolean arithmetic with [math]\displaystyle{ 1 + 1 = 1 }[/math] and [math]\displaystyle{ 1 \times 1 = 1. }[/math] An entry in the matrix product of two logical matrices will be 1, then, only if the row and column multiplied have a corresponding 1. Thus the logical matrix of a composition of relations can be found by computing the matrix product of the matrices representing the factors of the composition. "Matrices constitute a method for computing the conclusions traditionally drawn by means of hypothetical syllogisms and sorites."[15]
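
The following sketch (not tied to any particular library) implements this Boolean matrix product for logical matrices given as nested Python lists.

```python
# Boolean matrix product: entry (i, j) is 1 exactly when row i of M and
# column j of N share a 1 (logical OR replaces addition, AND replaces product).
def bool_matmul(M, N):
    return [[int(any(M[i][k] and N[k][j] for k in range(len(N))))
             for j in range(len(N[0]))] for i in range(len(M))]

R = [[1, 0],
     [1, 1]]
S = [[0, 1],
     [1, 0]]
print(bool_matmul(R, S))  # [[0, 1], [1, 1]]
```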

Heterogeneous relations

Consider a heterogeneous relation [math]\displaystyle{ R \subseteq A \times B; }[/math] that is, where [math]\displaystyle{ A }[/math] and [math]\displaystyle{ B }[/math] are distinct sets. Then composing the relation [math]\displaystyle{ R }[/math] with its converse [math]\displaystyle{ R^\textsf{T} }[/math] yields the homogeneous relations [math]\displaystyle{ R R^\textsf{T} }[/math] (on [math]\displaystyle{ A }[/math]) and [math]\displaystyle{ R^\textsf{T} R }[/math] (on [math]\displaystyle{ B }[/math]).

If for all [math]\displaystyle{ x \in A }[/math] there exists some [math]\displaystyle{ y \in B }[/math] such that [math]\displaystyle{ x R y }[/math] (that is, [math]\displaystyle{ R }[/math] is a (left-)total relation), then for all [math]\displaystyle{ x }[/math] we have [math]\displaystyle{ x \, R R^\textsf{T} \, x, }[/math] so that [math]\displaystyle{ R R^\textsf{T} }[/math] is a reflexive relation, or [math]\displaystyle{ I \subseteq R R^\textsf{T}, }[/math] where [math]\displaystyle{ I }[/math] is the identity relation [math]\displaystyle{ \{x I x : x \in A\}. }[/math] Similarly, if [math]\displaystyle{ R }[/math] is a surjective relation then [math]\displaystyle{ R^\textsf{T} R \supseteq I = \{x I x : x \in B\}. }[/math] In this case [math]\displaystyle{ R \subseteq R R^\textsf{T} R. }[/math] The opposite inclusion occurs for a difunctional relation.

The composition [math]\displaystyle{ \bar{R}^\textsf{T} R }[/math] is used to distinguish relations of Ferrers type, which satisfy [math]\displaystyle{ R \bar{R}^\textsf{T} R = R. }[/math]

Example

Let [math]\displaystyle{ A = }[/math] { France, Germany, Italy, Switzerland } and [math]\displaystyle{ B = }[/math] { French, German, Italian } with the relation [math]\displaystyle{ R }[/math] given by [math]\displaystyle{ a R b }[/math] when [math]\displaystyle{ b }[/math] is a national language of [math]\displaystyle{ a. }[/math] Since both [math]\displaystyle{ A }[/math] and [math]\displaystyle{ B }[/math] are finite, [math]\displaystyle{ R }[/math] can be represented by a logical matrix, assuming rows (top to bottom) and columns (left to right) are ordered alphabetically: [math]\displaystyle{ \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \\ 1 & 1 & 1 \end{pmatrix}. }[/math]

The converse relation [math]\displaystyle{ R^\textsf{T} }[/math] corresponds to the transposed matrix, and the relation composition [math]\displaystyle{ R^\textsf{T}; R }[/math] corresponds to the matrix product [math]\displaystyle{ R^\textsf{T} R }[/math] when summation is implemented by logical disjunction. It turns out that the [math]\displaystyle{ 3 \times 3 }[/math] matrix [math]\displaystyle{ R^\textsf{T} R }[/math] contains a 1 at every position, while the reversed matrix product computes as: [math]\displaystyle{ R R^\textsf{T} = \begin{pmatrix} 1 & 0 & 0 & 1 \\ 0 & 1 & 0 & 1 \\ 0 & 0 & 1 & 1 \\ 1 & 1 & 1 & 1 \end{pmatrix}. }[/math] This matrix is symmetric, and represents a homogeneous relation on [math]\displaystyle{ A. }[/math]

Correspondingly, [math]\displaystyle{ R^\textsf{T} \, ; R }[/math] is the universal relation on [math]\displaystyle{ B, }[/math] hence any two languages share a nation where they both are spoken (in fact: Switzerland). Vice versa, the question whether two given nations share a language can be answered using [math]\displaystyle{ R \, ; R^\textsf{T}. }[/math]
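
The example can be reproduced with the Boolean matrix product; the sketch below (helper names are illustrative) uses the alphabetical row and column ordering assumed above.

```python
# Logical matrix of R: rows France, Germany, Italy, Switzerland;
# columns French, German, Italian.
def bool_matmul(M, N):
    return [[int(any(M[i][k] and N[k][j] for k in range(len(N))))
             for j in range(len(N[0]))] for i in range(len(M))]

def transpose(M):
    return [list(col) for col in zip(*M)]

R = [[1, 0, 0],   # France: French
     [0, 1, 0],   # Germany: German
     [0, 0, 1],   # Italy: Italian
     [1, 1, 1]]   # Switzerland: French, German, Italian

print(bool_matmul(transpose(R), R))  # 3x3 all-ones matrix: universal relation on B
print(bool_matmul(R, transpose(R)))  # 4x4 symmetric matrix: nations sharing a language
```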

Schröder rules

For a given set [math]\displaystyle{ V, }[/math] the collection of all binary relations on [math]\displaystyle{ V }[/math] forms a Boolean lattice ordered by inclusion [math]\displaystyle{ (\subseteq). }[/math] Recall that complementation reverses inclusion: [math]\displaystyle{ A \subseteq B \text{ implies } B^{\complement} \subseteq A^{\complement}. }[/math] In the calculus of relations[16] it is common to represent the complement of a set by an overbar: [math]\displaystyle{ \bar{A} = A^{\complement}. }[/math]

If [math]\displaystyle{ S }[/math] is a binary relation, let [math]\displaystyle{ S^\textsf{T} }[/math] represent the converse relation, also called the transpose. Then the Schröder rules are [math]\displaystyle{ Q R \subseteq S \quad \text{ is equivalent to } \quad Q^\textsf{T} \bar{S} \subseteq \bar{R} \quad \text{ is equivalent to } \quad \bar{S} R^\textsf{T} \subseteq \bar{Q}. }[/math] Verbally, one equivalence can be obtained from another: select the first or second factor and transpose it; then complement the other two relations and permute them.[5]:15–19
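
Because the rules are equivalences between inclusions, they can be verified exhaustively over a tiny universe; the following brute-force sketch (helper names are illustrative, not from the article) checks all three for every choice of Q, R, and S on a two-element set.

```python
# Exhaustively verify the Schröder equivalences over the set V = {0, 1}.
from itertools import combinations

V = [0, 1]
pairs = [(x, y) for x in V for y in V]
all_rels = [set(c) for r in range(len(pairs) + 1) for c in combinations(pairs, r)]

def compose(Q, R):
    return {(x, z) for (x, y1) in Q for (y2, z) in R if y1 == y2}

def converse(R):
    return {(y, x) for (x, y) in R}

def complement(R):
    return set(pairs) - R

for Q in all_rels:
    for R in all_rels:
        for S in all_rels:
            a = compose(Q, R) <= S                                    # Q R contained in S
            b = compose(converse(Q), complement(S)) <= complement(R)  # Q^T comp(S) in comp(R)
            c = compose(complement(S), converse(R)) <= complement(Q)  # comp(S) R^T in comp(Q)
            assert a == b == c
```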

Though this transformation of an inclusion of a composition of relations was detailed by Ernst Schröder, in fact Augustus De Morgan first articulated the transformation as Theorem K in 1860.[4] He wrote[17] [math]\displaystyle{ L M \subseteq N \text{ implies } \bar{N} M^\textsf{T} \subseteq \bar{L}. }[/math]

With Schröder rules and complementation one can solve for an unknown relation [math]\displaystyle{ X }[/math] in relation inclusions such as [math]\displaystyle{ R X \subseteq S \quad \text{and} \quad XR \subseteq S. }[/math] For instance, by Schröder rule [math]\displaystyle{ R X \subseteq S \text{ implies } R^\textsf{T} \bar{S} \subseteq \bar{X}, }[/math] and complementation gives [math]\displaystyle{ X \subseteq \overline{R^\textsf{T} \bar{S}}, }[/math] which is called the left residual of [math]\displaystyle{ S }[/math] by [math]\displaystyle{ R }[/math].

Quotients

Just as composition of relations is a type of multiplication resulting in a product, so some operations compare to division and produce quotients. Three quotients are exhibited here: left residual, right residual, and symmetric quotient. The left residual of two relations is defined presuming that they have the same domain (source), and the right residual presumes the same codomain (range, target). The symmetric quotient presumes two relations share a domain and a codomain.

Definitions:

  • Left residual: [math]\displaystyle{ A\backslash B \mathrel{:=} \overline{A^\textsf{T} \bar{B} } }[/math]
  • Right residual: [math]\displaystyle{ D/C \mathrel{:=} \overline{\bar{D} C^\textsf{T}} }[/math]
  • Symmetric quotient: [math]\displaystyle{ \operatorname{syq} (E, F) \mathrel{:=} \overline{E^\textsf{T} \bar{F} } \cap \overline{\bar{E}^\textsf{T} F} }[/math]

Using Schröder's rules, [math]\displaystyle{ A X \subseteq B }[/math] is equivalent to [math]\displaystyle{ X \subseteq A \backslash B. }[/math] Thus the left residual is the greatest relation satisfying [math]\displaystyle{ A X \subseteq B. }[/math] Similarly, the inclusion [math]\displaystyle{ Y C \subseteq D }[/math] is equivalent to [math]\displaystyle{ Y \subseteq D / C, }[/math] and the right residual is the greatest relation satisfying [math]\displaystyle{ Y C \subseteq D. }[/math][2]:43–6
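
In the same brute-force style, the following sketch (helper names are illustrative) computes the left residual from its definition and confirms that it is the greatest relation X with A ; X ⊆ B over a two-element universe.

```python
# Left residual A\B = complement(A^T ; complement(B)), checked as the
# greatest X satisfying A ; X contained in B.
from itertools import combinations

V = [0, 1]
pairs = [(x, y) for x in V for y in V]
all_rels = [set(c) for r in range(len(pairs) + 1) for c in combinations(pairs, r)]

def compose(Q, R):
    return {(x, z) for (x, y1) in Q for (y2, z) in R if y1 == y2}

def converse(R):
    return {(y, x) for (x, y) in R}

def complement(R):
    return set(pairs) - R

def left_residual(A, B):
    return complement(compose(converse(A), complement(B)))

for A in all_rels:
    for B in all_rels:
        res = left_residual(A, B)
        for X in all_rels:
            assert (compose(A, X) <= B) == (X <= res)  # A;X in B  iff  X in A\B
```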

One can practice the logic of residuals with Sudoku.

Join: another form of composition

A fork operator [math]\displaystyle{ (\lt ) }[/math] has been introduced to fuse two relations [math]\displaystyle{ c : H \to A }[/math] and [math]\displaystyle{ d : H \to B }[/math] into [math]\displaystyle{ c \,(\lt )\, d : H \to A \times B. }[/math] The construction depends on projections [math]\displaystyle{ a : A \times B \to A }[/math] and [math]\displaystyle{ b : A \times B \to B, }[/math] understood as relations, meaning that there are converse relations [math]\displaystyle{ a^{\textsf{T}} }[/math] and [math]\displaystyle{ b^{\textsf{T}}. }[/math] Then the fork of [math]\displaystyle{ c }[/math] and [math]\displaystyle{ d }[/math] is given by[18] [math]\displaystyle{ c\,(\lt )\,d ~\mathrel{:=}~ c ;a^\textsf{T} \cap\ d ;b^\textsf{T}. }[/math]
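
For relations given as sets of pairs, the formula above amounts to relating h to the pair (x, y) exactly when h c x and h d y; the following sketch (names are illustrative, not from the article) builds the fork directly from that description.

```python
# Fork of c : H -> A and d : H -> B as a relation H -> A x B:
# h is related to (x, y) when (h, x) is in c and (h, y) is in d.
def fork(c, d):
    return {(h, (x, y)) for (h, x) in c for (h2, y) in d if h == h2}

c = {("h1", "a1"), ("h2", "a2")}
d = {("h1", "b1")}
print(fork(c, d))  # {('h1', ('a1', 'b1'))}
```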

Another form of composition of relations, which applies to general [math]\displaystyle{ n }[/math]-place relations for [math]\displaystyle{ n \geq 2, }[/math] is the join operation of relational algebra. The usual composition of two binary relations as defined here can be obtained by taking their join, leading to a ternary relation, followed by a projection that removes the middle component. For example, in the query language SQL this is provided by the JOIN operation.

Notes

  1. Bjarni Jónsson (1984) "Maximal Algebras of Binary Relations", in Contributions to Group Theory, K.I. Appel editor, American Mathematical Society ISBN:978-0-8218-5035-0
  2. Gunther Schmidt (2011) Relational Mathematics, Encyclopedia of Mathematics and its Applications, vol. 132, Cambridge University Press ISBN:978-0-521-76268-7
  3. A. De Morgan (1860) "On the Syllogism: IV and on the Logic of Relations"
  4. Daniel D. Merrill (1990) Augustus De Morgan and the Logic of Relations, page 121, Kluwer Academic ISBN:9789400920477
  5. Gunther Schmidt & Thomas Ströhlein (1993) Relations and Graphs, Springer books
  6. Ernst Schröder (1895) Algebra und Logik der Relative
  7. Paul Taylor (1999). Practical Foundations of Mathematics. Cambridge University Press. p. 24. ISBN 978-0-521-63107-5. https://books.google.com/books?id=iSCqyNgzamcC&pg=PA24.  A free HTML version of the book is available at http://www.cs.man.ac.uk/~pt/Practical_Foundations/
  8. Michael Barr & Charles Wells (1998) Category Theory for Computer Scientists, page 6, from McGill University
  9. Rick Nouwen and others (2016) Dynamic Semantics §2.2, from Stanford Encyclopedia of Philosophy
  10. John M. Howie (1995) Fundamentals of Semigroup Theory, page 16, LMS Monograph #12, Clarendon Press ISBN:0-19-851194-9
  11. Kilp, Knauer & Mikhalev, p. 7
  12. ISO/IEC 13568:2002(E), p. 23
  13. Unicode character: Z Notation relational composition from FileFormat.info
  14. "internal relations". https://ncatlab.org/nlab/show/internal+relation. 
  15. Irving Copilowish (December 1948) "Matrix development of the calculus of relations", Journal of Symbolic Logic 13(4): 193–203 Jstor link, quote from page 203
  16. Vaughan Pratt, The Origins of the Calculus of Relations, from Stanford University
  17. De Morgan indicated contraries by lower case, conversion as M−1, and inclusion with )), so his notation was [math]\displaystyle{ n M^{-1} )) \ l. }[/math]
  18. Gunther Schmidt and Michael Winter (2018): Relational Topology, page 26, Lecture Notes in Mathematics vol. 2208, Springer books, ISBN:978-3-319-74451-3

References

  • M. Kilp, U. Knauer, A.V. Mikhalev (2000) Monoids, Acts and Categories with Applications to Wreath Products and Graphs, De Gruyter Expositions in Mathematics vol. 29, Walter de Gruyter, ISBN:3-11-015248-7.