Commutative property

From HandWiki
Short description: Property of some mathematical operations
[Infobox image: Commutativity of binary operations]
Statement: A binary operation is commutative if changing the order of the operands does not change the result.

In mathematics, a binary operation is commutative if changing the order of the operands does not change the result. It is a fundamental property of many binary operations, and many mathematical proofs depend on it. Perhaps most familiar as a property of arithmetic, e.g. "3 + 4 = 4 + 3" or "2 × 5 = 5 × 2", the property can also be used in more advanced settings. The name is needed because there are operations, such as division and subtraction, that do not have it (for example, "3 − 5 ≠ 5 − 3"); such operations are not commutative, and so are referred to as noncommutative operations.

The idea that simple operations, such as the multiplication and addition of numbers, are commutative was for many years implicitly assumed. Thus, this property was not named until the 19th century, when mathematics started to become formalized.[1][2]

A similar property exists for binary relations; a binary relation is said to be symmetric if the relation applies regardless of the order of its operands; for example, equality is symmetric as two equal mathematical objects are equal regardless of their order.[3]

Mathematical definitions

A binary operation [math]\displaystyle{ * }[/math] on a set S is called commutative if[4][5] [math]\displaystyle{ x * y = y * x\qquad\mbox{for all }x,y\in S. }[/math] In other words, an operation is commutative if every two elements commute. An operation that does not satisfy the above property is called noncommutative.
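On a finite set, the definition can be checked directly by testing every ordered pair of elements. A minimal Python sketch (the helper name `is_commutative` is illustrative, not from the source):

```python
from itertools import product

def is_commutative(op, S):
    """Check op(x, y) == op(y, x) for every ordered pair drawn from a finite set S."""
    return all(op(x, y) == op(y, x) for x, y in product(S, repeat=2))

S = range(5)
print(is_commutative(lambda x, y: x + y, S))   # addition commutes
print(is_commutative(lambda x, y: x - y, S))   # subtraction does not
```

Brute-force checking only works for finite sets; over an infinite set commutativity must be proved, not enumerated.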

One says that x commutes with y or that x and y commute under [math]\displaystyle{ * }[/math] if [math]\displaystyle{ x * y = y * x. }[/math] That is, a specific pair of elements may commute even if the operation is (strictly) noncommutative.


The cumulation of apples, which can be seen as an addition of natural numbers, is commutative.

Commutative operations

The addition of vectors is commutative, because [math]\displaystyle{ \vec a+\vec b=\vec b+ \vec a }[/math].
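Componentwise vector addition inherits commutativity from the addition of numbers, which a short sketch can illustrate (the helper `vec_add` is hypothetical):

```python
def vec_add(a, b):
    """Componentwise addition of two same-length vectors given as tuples."""
    return tuple(x + y for x, y in zip(a, b))

a, b = (1, 2, 3), (4, 5, 6)
assert vec_add(a, b) == vec_add(b, a) == (5, 7, 9)
```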

Noncommutative operations

Some noncommutative binary operations:[6]

Division, subtraction, and exponentiation

Division is noncommutative, since [math]\displaystyle{ 1 \div 2 \neq 2 \div 1 }[/math].

Subtraction is noncommutative, since [math]\displaystyle{ 0 - 1 \neq 1 - 0 }[/math]. However it is classified more precisely as anti-commutative, since [math]\displaystyle{ 0 - 1 = - (1 - 0) }[/math].

Exponentiation is noncommutative, since [math]\displaystyle{ 2^3\neq3^2 }[/math]. This property leads to two different "inverse" operations of exponentiation (namely, the nth-root operation and the logarithm operation), unlike multiplication, which has a single inverse operation (division).[7]
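The three counterexamples above, and the remark about the two "inverses" of exponentiation, can be checked directly (a sketch using only the standard library):

```python
import math

# The three counterexamples from the text, checked directly.
assert 1 / 2 != 2 / 1            # division is noncommutative
assert 0 - 1 != 1 - 0            # subtraction is noncommutative...
assert 0 - 1 == -(1 - 0)         # ...and anti-commutative
assert 2 ** 3 != 3 ** 2          # exponentiation: 8 != 9

# Noncommutativity gives exponentiation two distinct "inverse" operations:
assert math.isclose(8 ** (1 / 3), 2)     # nth root recovers the base
assert math.isclose(math.log(8, 2), 3)   # logarithm recovers the exponent
```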

Truth functions

Some truth functions are noncommutative, since the truth tables for the functions are different when one changes the order of the operands. For example, the truth tables for (A ⇒ B) = (¬A ∨ B) and (B ⇒ A) = (A ∨ ¬B) are

  A   B   A ⇒ B   B ⇒ A
  T   T     T       T
  T   F     F       T
  F   T     T       F
  F   F     T       T
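The same comparison can be made programmatically; here `implies` is an illustrative helper encoding A ⇒ B as ¬A ∨ B:

```python
def implies(a, b):
    """Material implication A ⇒ B, equivalent to (not A) or B."""
    return (not a) or b

# The two result columns differ on the mixed rows, so ⇒ is noncommutative.
assert any(implies(a, b) != implies(b, a)
           for a in (False, True) for b in (False, True))
```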

Function composition of linear functions

Function composition of linear functions from the real numbers to the real numbers is almost always noncommutative. For example, let [math]\displaystyle{ f(x)=2x+1 }[/math] and [math]\displaystyle{ g(x)=3x+7 }[/math]. Then [math]\displaystyle{ (f \circ g)(x) = f(g(x)) = 2(3x+7)+1 = 6x+15 }[/math] and [math]\displaystyle{ (g \circ f)(x) = g(f(x)) = 3(2x+1)+7 = 6x+10. }[/math] This also applies more generally to linear and affine transformations from a vector space to itself (see the matrix multiplication example below).
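The two compositions above can be checked numerically (a sketch; `fg` and `gf` are illustrative names):

```python
f = lambda x: 2 * x + 1
g = lambda x: 3 * x + 7

fg = lambda x: f(g(x))   # (f ∘ g)(x) = 6x + 15
gf = lambda x: g(f(x))   # (g ∘ f)(x) = 6x + 10

assert fg(0) == 15 and gf(0) == 10   # the compositions disagree already at x = 0
```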

Matrix multiplication

Matrix multiplication of square matrices is almost always noncommutative, for example: [math]\displaystyle{ \begin{bmatrix} 0 & 2 \\ 0 & 1 \end{bmatrix} = \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} 0 & 1 \\ 0 & 1 \end{bmatrix} \neq \begin{bmatrix} 0 & 1 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix} = \begin{bmatrix} 0 & 1 \\ 0 & 1 \end{bmatrix} }[/math]
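The same 2 × 2 example can be verified with a small helper (the name `matmul2` is illustrative):

```python
def matmul2(A, B):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 1], [0, 1]]
B = [[0, 1], [0, 1]]
assert matmul2(A, B) == [[0, 2], [0, 1]]   # the product AB from the text
assert matmul2(B, A) == [[0, 1], [0, 1]]   # BA differs, so AB != BA
```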

Vector product

The vector product (or cross product) of two vectors in three dimensions is anti-commutative; i.e., b × a = −(a × b).
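Anti-commutativity of the cross product can be checked on the standard basis vectors (the helper `cross` is illustrative):

```python
def cross(a, b):
    """Cross product of two 3-dimensional vectors given as tuples."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

a, b = (1, 0, 0), (0, 1, 0)
assert cross(a, b) == (0, 0, 1)
assert cross(b, a) == (0, 0, -1)   # b × a = −(a × b)
```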

History and etymology

The first known use of the term was in a French journal published in 1814.

Records of the implicit use of the commutative property go back to ancient times. The Egyptians used the commutative property of multiplication to simplify computing products.[8][9] Euclid is known to have assumed the commutative property of multiplication in his book Elements.[10] Formal uses of the commutative property arose in the late 18th and early 19th centuries, when mathematicians began to work on a theory of functions. Today the commutative property is a well-known and basic property used in most branches of mathematics.

The first recorded use of the term commutative was in a memoir by François Servois in 1814,[1][11] which used the word commutatives when describing functions that have what is now called the commutative property. The word is a combination of the French word commuter, meaning "to substitute or switch", and the suffix -ative, meaning "tending to", so the word literally means "tending to substitute or switch". The term then appeared in English in 1838[2] in Duncan Farquharson Gregory's article entitled "On the real nature of symbolical algebra", published in 1840 in the Transactions of the Royal Society of Edinburgh.[12]

Propositional logic

Rule of replacement

In truth-functional propositional logic, commutation,[13][14] or commutativity,[15] refers to two valid rules of replacement. The rules allow one to transpose propositional variables within logical expressions in logical proofs. The rules are: [math]\displaystyle{ (P \lor Q) \Leftrightarrow (Q \lor P) }[/math] and [math]\displaystyle{ (P \land Q) \Leftrightarrow (Q \land P) }[/math] where "[math]\displaystyle{ \Leftrightarrow }[/math]" is a metalogical symbol representing "can be replaced in a proof with".

Truth functional connectives

Commutativity is a property of some logical connectives of truth functional propositional logic. The following logical equivalences demonstrate that commutativity is a property of particular connectives. The following are truth-functional tautologies.

Commutativity of conjunction
[math]\displaystyle{ (P \land Q) \leftrightarrow (Q \land P) }[/math]
Commutativity of disjunction
[math]\displaystyle{ (P \lor Q) \leftrightarrow (Q \lor P) }[/math]
Commutativity of implication (also called the law of permutation)
[math]\displaystyle{ \big(P \to (Q \to R)\big) \leftrightarrow \big(Q \to (P \to R)\big) }[/math]
Commutativity of equivalence (also called the complete commutative law of equivalence)
[math]\displaystyle{ (P \leftrightarrow Q) \leftrightarrow (Q \leftrightarrow P) }[/math]

Set theory

In group and set theory, many algebraic structures are called commutative when certain operands satisfy the commutative property. In higher branches of mathematics, such as analysis and linear algebra, the commutativity of well-known operations (such as addition and multiplication on real and complex numbers) is often used (or implicitly assumed) in proofs.[16][17][18]

Mathematical structures and commutativity

  • A commutative semigroup is a set endowed with a total, associative and commutative operation.
  • If the operation additionally has an identity element, we have a commutative monoid.
  • An abelian group, or commutative group is a group whose group operation is commutative.[17]
  • A commutative ring is a ring whose multiplication is commutative. (Addition in a ring is always commutative.)[19]
  • In a field both addition and multiplication are commutative.[20]

Related properties


Main page: Associative property

The associative property is closely related to the commutative property. The associative property of an expression containing two or more occurrences of the same operator states that the order in which the operations are performed does not affect the final result, as long as the order of the terms does not change. In contrast, the commutative property states that the order of the terms does not affect the final result.

Most commutative operations encountered in practice are also associative. However, commutativity does not imply associativity. A counterexample is the function [math]\displaystyle{ f(x, y) = \frac{x + y}{2}, }[/math] which is clearly commutative (interchanging x and y does not affect the result), but it is not associative (since, for example, [math]\displaystyle{ f(-4, f(0, +4)) = -1 }[/math] but [math]\displaystyle{ f(f(-4, 0), +4) = +1 }[/math]). More such examples may be found in commutative non-associative magmas. Furthermore, associativity does not imply commutativity either; for example, multiplication of quaternions or of matrices is always associative but not always commutative.
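The averaging counterexample is easy to verify directly (a sketch of the function from the text):

```python
def f(x, y):
    """Arithmetic mean of two numbers: commutative but not associative."""
    return (x + y) / 2

assert f(3, 5) == f(5, 3)          # commutative
assert f(-4, f(0, 4)) == -1.0      # but the nesting order matters:
assert f(f(-4, 0), 4) == 1.0
```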


Main page: Distributive property


Graph showing the symmetry of the addition function

Some forms of symmetry can be directly linked to commutativity. When a commutative operation is written as a binary function [math]\displaystyle{ z=f(x,y), }[/math] then this function is called a symmetric function, and its graph in three-dimensional space is symmetric across the plane [math]\displaystyle{ y=x }[/math]. For example, if the function f is defined as [math]\displaystyle{ f(x,y)=x+y }[/math] then [math]\displaystyle{ f }[/math] is a symmetric function.

For relations, a symmetric relation is analogous to a commutative operation, in that if a relation R is symmetric, then [math]\displaystyle{ a R b \Leftrightarrow b R a }[/math].

Non-commuting operators in quantum mechanics

Main page: Physics:Canonical commutation relation

In quantum mechanics as formulated by Schrödinger, physical variables are represented by linear operators such as [math]\displaystyle{ x }[/math] (meaning multiply by [math]\displaystyle{ x }[/math]) and [math]\displaystyle{ \frac{d}{dx} }[/math]. These two operators do not commute, as may be seen by considering the effect of their compositions [math]\displaystyle{ x \frac{d}{dx} }[/math] and [math]\displaystyle{ \frac{d}{dx} x }[/math] (also called products of operators) on a one-dimensional wave function [math]\displaystyle{ \psi(x) }[/math]: [math]\displaystyle{ x\cdot {\mathrm{d}\over \mathrm{d}x}\psi = x\cdot \psi' \ \neq \ \psi + x\cdot \psi' = {\mathrm{d}\over \mathrm{d}x}\left( x\cdot \psi \right) }[/math]
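The non-commutation of these two operators can be illustrated numerically with a finite-difference derivative; this is only a sketch (the sample ψ(x) = x² and the step size are arbitrary choices, not from the source):

```python
def d_dx(f, x, h=1e-6):
    """Central finite-difference approximation of df/dx at x."""
    return (f(x + h) - f(x - h)) / (2 * h)

psi = lambda x: x ** 2    # sample wave function ψ(x)

x0 = 1.5
left  = x0 * d_dx(psi, x0)               # (x · d/dx)ψ = x·ψ'
right = d_dx(lambda x: x * psi(x), x0)   # (d/dx · x)ψ = ψ + x·ψ'
print(right - left)   # ≈ ψ(x0): the two compositions differ by ψ itself
```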

According to the uncertainty principle of Heisenberg, if the two operators representing a pair of variables do not commute, then that pair of variables are mutually complementary, which means they cannot be simultaneously measured or known precisely. For example, the position and the linear momentum in the [math]\displaystyle{ x }[/math]-direction of a particle are represented by the operators [math]\displaystyle{ x }[/math] and [math]\displaystyle{ -i \hbar \frac{\partial}{\partial x} }[/math], respectively (where [math]\displaystyle{ \hbar }[/math] is the reduced Planck constant). This is the same example except for the constant [math]\displaystyle{ -i \hbar }[/math], so again the operators do not commute and the physical meaning is that the position and linear momentum in a given direction are complementary.

References

  1. Cabillón & Miller, Commutative and Distributive
  2. Flood, Raymond; Rice, Adrian; Wilson, Robin, eds. (2011). Mathematics in Victorian Britain. Oxford University Press. p. 4. ISBN 9780191627941.
  3. Weisstein, Eric W. "Symmetric Relation".
  4. Krowne, p. 1
  5. Weisstein, Commute, p. 1
  6. Yark, p. 1
  7. "User MathematicalOrchid".
  8. Lumpkin 1997, p. 11
  9. Gay & Shute 1987
  10. O'Connor & Robertson, Real Numbers
  11. O'Connor & Robertson, Servois
  12. Gregory, D. F. (1840). "On the real nature of symbolical algebra". Transactions of the Royal Society of Edinburgh 14: 208–216.
  13. Moore and Parker
  14. Copi & Cohen 2005
  15. Hurley & Watson 2016
  16. Axler 1997, p. 2
  17. Gallian 2006, p. 34
  18. Gallian 2006, pp. 26, 87
  19. Gallian 2006, p. 236
  20. Gallian 2006, p. 250



