Bézout matrix

In mathematics, a Bézout matrix (or Bézoutian or Bezoutiant) is a special square matrix associated with two polynomials, introduced by James Joseph Sylvester (1853) and Arthur Cayley (1857) and named after Étienne Bézout. Bézoutian may also refer to the determinant of this matrix, which is equal to the resultant of the two polynomials. Bézout matrices are sometimes used to test the stability of a given polynomial.

Definition

Let [math]\displaystyle{ f(z) }[/math] and [math]\displaystyle{ g(z) }[/math] be two complex polynomials of degree at most n,

[math]\displaystyle{ f(z) = \sum_{i=0}^n u_i z^i,\qquad g(z) = \sum_{i=0}^n v_i z^i. }[/math]

(Note that any coefficient [math]\displaystyle{ u_i }[/math] or [math]\displaystyle{ v_i }[/math] could be zero.) The Bézout matrix of order n associated with the polynomials f and g is

[math]\displaystyle{ B_n(f,g)=\left(b_{ij}\right)_{i,j=0,\dots,n-1} }[/math]

where the entries [math]\displaystyle{ b_{ij} }[/math] result from the identity

[math]\displaystyle{ \frac{f(x)g(y)-f(y)g(x)}{x-y} =\sum_{i,j=0}^{n-1} b_{ij}\,x^{i}\,y^{j}. }[/math]

It is an n × n complex matrix, and its entries are such that if we let [math]\displaystyle{ m_{ij} = \min\{i,n-1-j\} }[/math] for each [math]\displaystyle{ i, j = 0, \dots, n-1 }[/math], then:

[math]\displaystyle{ b_{ij}=\sum_{k=0}^{m_{ij}}(u_{j+k+1}v_{i-k}-u_{i-k}v_{j+k+1}). }[/math]
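
The entry formula above translates directly into code. A minimal NumPy sketch, purely illustrative (the function name bezout_matrix and the ascending coefficient-list convention are assumptions, not part of the definition):

<syntaxhighlight lang="python">
import numpy as np

def bezout_matrix(u, v, n):
    """Order-n Bezout matrix of f(z) = sum u_i z^i and g(z) = sum v_i z^i."""
    # Pad the coefficient lists with zeros up to degree n.
    u = np.pad(np.asarray(u, dtype=float), (0, n + 1 - len(u)))
    v = np.pad(np.asarray(v, dtype=float), (0, n + 1 - len(v)))
    B = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            # k runs from 0 to m_ij = min(i, n - 1 - j).
            for k in range(min(i, n - 1 - j) + 1):
                B[i, j] += u[j + k + 1] * v[i - k] - u[i - k] * v[j + k + 1]
    return B

# Reproduces the matrix B_4(f, g) of the Examples section below,
# with f(x) = 3x^3 - x and g(x) = 5x^2 + 1:
print(bezout_matrix([0, -1, 0, 3], [1, 0, 5], 4))
</syntaxhighlight>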

To each Bézout matrix, one can associate the following bilinear form, called the Bézoutian:

[math]\displaystyle{ \operatorname{Bez}: \mathbb{C}^n\times\mathbb{C}^n \to \mathbb{C}: (x,y)\mapsto \operatorname{Bez}(x,y) = x^{*}B_n(f,g)\,y. }[/math]

Examples

  • For n = 3, we have for any polynomials f and g of degree (at most) 3:
[math]\displaystyle{ B_3(f,g)=\left[\begin{matrix}u_1v_0-u_0 v_1 & u_2 v_0-u_0 v_2 & u_3 v_0-u_0 v_3\\u_2 v_0-u_0 v_2 & u_2v_1-u_1v_2+u_3v_0-u_0v_3 & u_3 v_1-u_1v_3\\u_3v_0-u_0v_3 & u_3v_1-u_1v_3 & u_3v_2-u_2v_3\end{matrix}\right]\!. }[/math]
  • Let [math]\displaystyle{ f(x) = 3x^3-x }[/math] and [math]\displaystyle{ g(x) = 5x^2+1 }[/math] be the two polynomials. Then:
[math]\displaystyle{ B_4(f,g)=\left[\begin{matrix}-1 & 0 & 3 & 0\\0 &8 &0 &0 \\3 & 0 & 15 & 0\\0 & 0 & 0 & 0\end{matrix}\right]\!. }[/math]

The last row and column are all zero as f and g have degree strictly less than n (which is 4). The other zero entries arise because, for each [math]\displaystyle{ i = 0, \dots, n }[/math], either [math]\displaystyle{ u_i }[/math] or [math]\displaystyle{ v_i }[/math] is zero: f has only odd-degree terms and g only even-degree terms, so every entry [math]\displaystyle{ b_{ij} }[/math] with i + j odd vanishes.
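
The matrix above can also be recovered directly from the defining identity, by expanding (f(x)g(y) − f(y)g(x))/(x − y) and reading off the coefficient of x^i y^j. A short SymPy sketch of this cross-check (illustrative only):

<syntaxhighlight lang="python">
from sympy import symbols, cancel, expand, Matrix

x, y = symbols('x y')
f = 3*x**3 - x
g = 5*x**2 + 1
n = 4

# The quotient (f(x)g(y) - f(y)g(x)) / (x - y) is a polynomial in x and y;
# its coefficient of x**i * y**j is the entry b_ij.
quotient = expand(cancel((f * g.subs(x, y) - f.subs(x, y) * g) / (x - y)))
B = Matrix(n, n, lambda i, j: quotient.coeff(x, i).coeff(y, j))
print(B)  # matches B_4(f, g) shown above
</syntaxhighlight>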

Properties

  • [math]\displaystyle{ B_n(f,g) }[/math] is symmetric (as a matrix);
  • [math]\displaystyle{ B_n(f,g) = -B_n(g,f) }[/math];
  • [math]\displaystyle{ B_n(f,f) = 0 }[/math];
  • [math]\displaystyle{ (f, g) \mapsto B_n(f,g) }[/math] is a bilinear function;
  • [math]\displaystyle{ B_n(f,g) }[/math] is a real matrix if f and g have real coefficients;
  • [math]\displaystyle{ B_n(f,g) }[/math] with [math]\displaystyle{ n=\max(\deg(f),\deg(g)) }[/math] is nonsingular if and only if f and g have no common roots (see the sketch after this list);
  • [math]\displaystyle{ B_n(f,g) }[/math] with [math]\displaystyle{ n = \max(\deg(f),\deg(g)) }[/math] has determinant equal to the resultant of f and g.
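
The nonsingularity criterion can be checked numerically. A small SymPy sketch, reusing the identity-based construction shown in the Examples section (the second polynomial pair is a hypothetical example with a shared root):

<syntaxhighlight lang="python">
from sympy import symbols, cancel, expand, Matrix

x, y = symbols('x y')

def bez(f, g, n):
    # Order-n Bezout matrix built from the defining identity.
    q = expand(cancel((f * g.subs(x, y) - f.subs(x, y) * g) / (x - y)))
    return Matrix(n, n, lambda i, j: q.coeff(x, i).coeff(y, j))

# No common root: B_3(f, g) is nonsingular (f = 3x^3 - x, g = 5x^2 + 1).
print(bez(3*x**3 - x, 5*x**2 + 1, 3).det())            # nonzero
# Shared root x = 1: the Bezout matrix is singular.
print(bez((x - 1)*(x - 2), (x - 1)*(x + 3), 2).det())  # 0
</syntaxhighlight>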

Applications

An important application of Bézout matrices can be found in control theory. To see this, let f(z) be a complex polynomial of degree n and denote by q and p the real polynomials such that f(iy) = q(y) + ip(y) (where y is real). Write r for the rank and σ for the signature of [math]\displaystyle{ B_n(p,q) }[/math]. Then the following statements hold:

  • f(z) has n − r roots in common with its conjugate;
  • the remaining r roots of f(z) are located in such a way that:
    • (r + σ)/2 of them lie in the open left half-plane, and
    • (r − σ)/2 lie in the open right half-plane;
  • f is Hurwitz stable if and only if [math]\displaystyle{ B_n(p,q) }[/math] is positive definite.

The third statement gives a necessary and sufficient condition for stability. In addition, the first statement exhibits some similarities with a result concerning Sylvester matrices, while the second can be related to the Routh–Hurwitz theorem.
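
A minimal numerical sketch of this stability test, assuming NumPy and the coefficient formula from the Definition section (the helper names and the example polynomials are illustrative, not from the original text):

<syntaxhighlight lang="python">
import numpy as np

def bezout_matrix(u, v):
    """Bezout matrix of order n = max(deg f, deg g) from ascending coefficient lists."""
    n = max(len(u), len(v)) - 1
    u = np.pad(np.asarray(u, dtype=float), (0, n + 1 - len(u)))
    v = np.pad(np.asarray(v, dtype=float), (0, n + 1 - len(v)))
    B = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            for k in range(min(i, n - 1 - j) + 1):
                B[i, j] += u[j + k + 1] * v[i - k] - u[i - k] * v[j + k + 1]
    return B

def is_hurwitz_stable(a):
    """Test f(z) = sum a_j z^j for Hurwitz stability via positive definiteness of B_n(p, q)."""
    a = np.asarray(a, dtype=complex)
    c = a * 1j ** np.arange(len(a))   # coefficients of f(iy) as a polynomial in y
    p, q = c.imag, c.real             # f(iy) = q(y) + i p(y) for real y
    return bool(np.all(np.linalg.eigvalsh(bezout_matrix(p, q)) > 0))

print(is_hurwitz_stable([2, 3, 1]))   # z^2 + 3z + 2 = (z + 1)(z + 2): True
print(is_hurwitz_stable([2, -3, 1]))  # z^2 - 3z + 2 = (z - 1)(z - 2): False
</syntaxhighlight>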

References