Polynomial SOS

In mathematics, a form (i.e. a homogeneous polynomial) h(x) of degree 2m in the real n-dimensional vector x is a sum of squares of forms (SOS) if and only if there exist forms [math]\displaystyle{ g_1(x),\ldots,g_k(x) }[/math] of degree m such that [math]\displaystyle{ h(x) = \sum_{i=1}^k g_i(x)^2 . }[/math]

Every form that is SOS is also a nonnegative (positive semidefinite) form, and although the converse is not always true, Hilbert proved that for n = 2 (binary forms), 2m = 2 (quadratic forms), or n = 3 and 2m = 4 (ternary quartics) a form is SOS if and only if it is nonnegative.[1] The same is also valid for the analogous problem on positive symmetric forms.[2][3]

Although not every form can be represented as SOS, explicit sufficient conditions for a form to be SOS have been found.[4][5] Moreover, every real nonnegative form can be approximated as closely as desired (in the [math]\displaystyle{ l_1 }[/math]-norm of its coefficient vector) by a sequence of forms [math]\displaystyle{ \{f_\epsilon\} }[/math] that are SOS.[6]
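A classical example of a nonnegative form that is not SOS is the Motzkin form [math]\displaystyle{ x_1^4x_2^2+x_1^2x_2^4-3x_1^2x_2^2x_3^2+x_3^6, }[/math] which is nonnegative everywhere by the inequality of arithmetic and geometric means but cannot be written as a sum of squares of forms; here n = 3 and 2m = 6, outside the cases covered by Hilbert's theorem.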

Square matricial representation (SMR)

Establishing whether a form h(x) is SOS amounts to solving a convex optimization problem. Indeed, any h(x) can be written as [math]\displaystyle{ h(x) = x^{\{m\}'}\left(H+L(\alpha)\right)x^{\{m\}} }[/math] where [math]\displaystyle{ x^{\{m\}} }[/math] is a vector containing a basis for the forms of degree m in x (such as all monomials of degree m in x), the prime ′ denotes the transpose, H is any symmetric matrix satisfying [math]\displaystyle{ h(x) = x^{\left\{m\right\}'}Hx^{\{m\}} }[/math] and [math]\displaystyle{ L(\alpha) }[/math] is a linear parameterization of the linear space [math]\displaystyle{ \mathcal{L} = \left\{L=L':~x^{\{m\}'} L x^{\{m\}}=0\right\}. }[/math]

The dimension of the vector [math]\displaystyle{ x^{\{m\}} }[/math] is given by [math]\displaystyle{ \sigma(n,m) = \binom{n+m-1}{m}, }[/math] whereas the dimension of the vector [math]\displaystyle{ \alpha }[/math] is given by [math]\displaystyle{ \omega(n,2m) = \frac{1}{2}\sigma(n,m)\left(1+\sigma(n,m)\right)-\sigma(n,2m). }[/math]
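The quantities [math]\displaystyle{ \sigma(n,m) }[/math] and [math]\displaystyle{ \omega(n,2m) }[/math] are easy to compute. The following short Python sketch (added here for illustration, not part of the original article) evaluates both formulas and reproduces the sizes appearing in the examples below: [math]\displaystyle{ \sigma(2,2)=3 }[/math] with [math]\displaystyle{ \omega(2,4)=1 }[/math], and [math]\displaystyle{ \sigma(3,2)=6 }[/math] with [math]\displaystyle{ \omega(3,4)=6 }[/math].

```python
from math import comb

def sigma(n, m):
    # Number of monomials of degree m in n variables, i.e. the length of x^{m}.
    return comb(n + m - 1, m)

def omega(n, two_m):
    # Dimension of the parameter vector alpha in the SMR of a form of degree 2m.
    s = sigma(n, two_m // 2)
    return s * (s + 1) // 2 - sigma(n, two_m)

print(sigma(2, 2), omega(2, 4))  # 3 1  (first example below)
print(sigma(3, 2), omega(3, 4))  # 6 6  (second example below)
```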

Then, h(x) is SOS if and only if there exists a vector [math]\displaystyle{ \alpha }[/math] such that [math]\displaystyle{ H + L(\alpha) \ge 0, }[/math] meaning that the matrix [math]\displaystyle{ H + L(\alpha) }[/math] is positive-semidefinite. This is a linear matrix inequality (LMI) feasibility test, which is a convex optimization problem. The expression [math]\displaystyle{ h(x)=x^{\{m\}'}\left(H+L(\alpha)\right)x^{\{m\}} }[/math] was introduced in [7] with the name square matricial representation (SMR) in order to establish whether a form is SOS via an LMI. The matrix [math]\displaystyle{ H+L(\alpha) }[/math] is also known as a Gram matrix of h(x).[8]
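The LMI feasibility test can be set up with an off-the-shelf semidefinite programming modeler. The sketch below is only an assumed illustration using the CVXPY package (not mentioned in the original article), with the data of the first example of the next section, for which H can be chosen as diag(1, −1, 1) and [math]\displaystyle{ \mathcal{L} }[/math] is spanned by a single matrix.

```python
import numpy as np
import cvxpy as cp

# Data for h(x) = x1^4 - x1^2 x2^2 + x2^4 with x^{m} = (x1^2, x1 x2, x2^2)'.
H = np.diag([1.0, -1.0, 1.0])        # one symmetric matrix with x^{m}' H x^{m} = h(x)
L1 = np.array([[ 0.0, 0.0, -1.0],
               [ 0.0, 2.0,  0.0],
               [-1.0, 0.0,  0.0]])   # basis of L: x^{m}' L1 x^{m} = 0 identically

alpha = cp.Variable()
G = cp.Variable((3, 3), PSD=True)    # the candidate Gram matrix H + L(alpha)
problem = cp.Problem(cp.Minimize(0), [G == H + alpha * L1])
problem.solve()

print(problem.status)                # 'optimal' means the LMI is feasible, so h is SOS
print(alpha.value)                   # one feasible value; here any alpha in [0.5, 1] works
```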

Examples

  • Consider the form of degree 4 in two variables [math]\displaystyle{ h(x)=x_1^4-x_1^2x_2^2+x_2^4 }[/math]. We have [math]\displaystyle{ m = 2,~x^{\{m\}} = \begin{pmatrix} x_1^2\\x_1x_2\\x_2^2\end{pmatrix}\!, ~H+L(\alpha) = \begin{pmatrix} 1&0&-\alpha_1\\0&-1+2\alpha_1&0\\-\alpha_1&0&1 \end{pmatrix}\!. }[/math] Since there exists α such that [math]\displaystyle{ H+L(\alpha)\ge 0 }[/math], namely [math]\displaystyle{ \alpha=1 }[/math], it follows that h(x) is SOS (a numerical check is sketched after this list).
  • Consider the form of degree 4 in three variables [math]\displaystyle{ h(x)=2x_1^4-2.5x_1^3x_2+x_1^2x_2x_3-2x_1x_3^3+5x_2^4+x_3^4 }[/math]. We have [math]\displaystyle{ m=2,~x^{\{m\}}=\begin{pmatrix}x_1^2\\x_1x_2\\x_1x_3\\x_2^2\\x_2x_3\\x_3^2\end{pmatrix}, ~H+L(\alpha) = \begin{pmatrix} 2&-1.25&0&-\alpha_1&-\alpha_2&-\alpha_3\\ -1.25&2\alpha_1&0.5+\alpha_2&0&-\alpha_4&-\alpha_5\\ 0&0.5+\alpha_2&2\alpha_3&\alpha_4&\alpha_5&-1\\ -\alpha_1&0&\alpha_4&5&0&-\alpha_6\\ -\alpha_2&-\alpha_4&\alpha_5&0&2\alpha_6&0\\ -\alpha_3&-\alpha_5&-1&-\alpha_6&0&1 \end{pmatrix}. }[/math] Since [math]\displaystyle{ H+L(\alpha)\ge 0 }[/math] for [math]\displaystyle{ \alpha=(1.18,-0.43,0.73,1.13,-0.37,0.57) }[/math], it follows that h(x) is SOS.
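For the first example, positive semidefiniteness of [math]\displaystyle{ H+L(\alpha) }[/math] at [math]\displaystyle{ \alpha=1 }[/math] can be verified numerically, and an eigendecomposition of that matrix yields an explicit SOS decomposition. The following Python sketch (added for illustration, not part of the original article) performs this check with NumPy.

```python
import numpy as np

# Gram matrix of h(x) = x1^4 - x1^2 x2^2 + x2^4 for alpha_1 = 1,
# in the monomial basis x^{m} = (x1^2, x1 x2, x2^2)'.
G = np.array([[ 1.0, 0.0, -1.0],
              [ 0.0, 1.0,  0.0],
              [-1.0, 0.0,  1.0]])

eigvals, eigvecs = np.linalg.eigh(G)
print(eigvals)  # all eigenvalues are >= 0, so G is positive semidefinite

# Each positive eigenvalue lam with eigenvector v gives one square
# g(x) = sqrt(lam) * v' x^{m}, and h(x) is the sum of these squares.
def h_from_squares(x1, x2):
    xm = np.array([x1**2, x1 * x2, x2**2])
    return sum(lam * (v @ xm)**2
               for lam, v in zip(eigvals, eigvecs.T) if lam > 1e-12)

# Compare against h(x) at a few random points.
rng = np.random.default_rng(0)
for x1, x2 in rng.standard_normal((3, 2)):
    print(np.isclose(h_from_squares(x1, x2), x1**4 - x1**2 * x2**2 + x2**4))
```

Carrying out the eigendecomposition symbolically recovers the explicit decomposition [math]\displaystyle{ h(x)=(x_1^2-x_2^2)^2+(x_1x_2)^2 }[/math].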

Generalizations

Matrix SOS

A matrix form F(x) (i.e., a matrix whose entries are forms) of dimension r and degree 2m in the real n-dimensional vector x is SOS if and only if there exist matrix forms [math]\displaystyle{ G_1(x),\ldots,G_k(x) }[/math] of degree m such that [math]\displaystyle{ F(x)=\sum_{i=1}^k G_i(x)'G_i(x) . }[/math]

Matrix SMR

Establishing whether a matrix form F(x) is SOS amounts to solving a convex optimization problem. Indeed, similarly to the scalar case, any F(x) can be written according to the SMR as [math]\displaystyle{ F(x) = \left(x^{\{m\}}\otimes I_r\right)'\left(H+L(\alpha)\right)\left(x^{\{m\}}\otimes I_r\right) }[/math] where [math]\displaystyle{ \otimes }[/math] is the Kronecker product of matrices, H is any symmetric matrix satisfying [math]\displaystyle{ F(x) = \left(x^{\{m\}}\otimes I_r\right)'H\left(x^{\{m\}}\otimes I_r\right) }[/math] and [math]\displaystyle{ L(\alpha) }[/math] is a linear parameterization of the linear space [math]\displaystyle{ \mathcal{L}=\left\{L=L':~\left(x^{\{m\}}\otimes I_r\right)'L\left(x^{\{m\}}\otimes I_r\right)=0\right\}. }[/math]

The dimension of the vector [math]\displaystyle{ \alpha }[/math] is given by [math]\displaystyle{ \omega(n,2m,r)=\frac{1}{2}r\left(\sigma(n,m)\left(r\sigma(n,m)+1\right)-(r+1)\sigma(n,2m)\right). }[/math]
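As a quick consistency check (added here for illustration, not part of the original article), the following Python sketch evaluates this formula and verifies that for r = 1 it reduces to the scalar-case dimension [math]\displaystyle{ \omega(n,2m) }[/math] given above.

```python
from math import comb

def sigma(n, m):
    # Number of monomials of degree m in n variables.
    return comb(n + m - 1, m)

def omega_scalar(n, two_m):
    s = sigma(n, two_m // 2)
    return s * (s + 1) // 2 - sigma(n, two_m)

def omega_matrix(n, two_m, r):
    # Dimension of alpha for the matrix SMR of an r x r matrix form of degree 2m.
    s = sigma(n, two_m // 2)
    return r * (s * (r * s + 1) - (r + 1) * sigma(n, two_m)) // 2

# For r = 1 the matrix-case formula collapses to the scalar one.
print(all(omega_matrix(n, 2 * m, 1) == omega_scalar(n, 2 * m)
          for n in range(1, 6) for m in range(1, 5)))  # True
```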

Then, F(x) is SOS if and only if there exists a vector [math]\displaystyle{ \alpha }[/math] such that the following LMI holds: [math]\displaystyle{ H+L(\alpha) \ge 0. }[/math]

The expression [math]\displaystyle{ F(x) = \left(x^{\{m\}}\otimes I_r\right)'\left(H+L(\alpha)\right)\left(x^{\{m\}}\otimes I_r\right) }[/math] was introduced in [9] in order to establish whether a matrix form is SOS via an LMI.

Noncommutative polynomial SOS

Consider the free algebra R⟨X⟩ generated by the n noncommuting letters X = (X1, ..., Xn) and equipped with the involution T, such that T fixes R and X1, ..., Xn and reverses words formed by X1, ..., Xn. By analogy with the commutative case, the noncommutative symmetric polynomials f are the noncommutative polynomials of the form f = f^T. A symmetric noncommutative polynomial f is said to be matrix-positive if evaluating it at any tuple of real symmetric r × r matrices, for every dimension r, yields a positive semidefinite matrix.

A noncommutative polynomial is SOS if there exist noncommutative polynomials [math]\displaystyle{ h_1,\ldots,h_k }[/math] such that [math]\displaystyle{ f(X) = \sum_{i=1}^{k} h_i(X)^T h_i(X). }[/math]

Surprisingly, in the noncommutative scenario a noncommutative polynomial is SOS if and only if it is matrix-positive.[10] Moreover, there are algorithms available for decomposing matrix-positive polynomials into sums of squares of noncommutative polynomials.[11]
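As a small numerical illustration (added here, with an ad hoc choice of polynomial), the following sketch evaluates the SOS noncommutative polynomial [math]\displaystyle{ f=(X_1+X_2)^T(X_1+X_2) }[/math] at random symmetric matrices of several sizes and confirms that each result is positive semidefinite, as matrix-positivity requires.

```python
import numpy as np

def f(X1, X2):
    # f = (X1 + X2)^T (X1 + X2): an SOS noncommutative polynomial with k = 1, h_1 = X1 + X2.
    S = X1 + X2
    return S.T @ S

rng = np.random.default_rng(0)
for r in (2, 3, 5):
    A, B = rng.standard_normal((r, r)), rng.standard_normal((r, r))
    X1, X2 = (A + A.T) / 2, (B + B.T) / 2      # random symmetric r x r matrices
    smallest = np.linalg.eigvalsh(f(X1, X2)).min()
    print(r, smallest >= -1e-10)               # True: f(X1, X2) is positive semidefinite
```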

References

  1. Hilbert, David (September 1888). "Ueber die Darstellung definiter Formen als Summe von Formenquadraten". Mathematische Annalen 32 (3): 342–350. doi:10.1007/bf01443605. https://zenodo.org/record/1428214. 
  2. Choi, M. D.; Lam, T. Y. (1977). "An old question of Hilbert". Queen's Papers in Pure and Applied Mathematics 46: 385–405. 
  3. Goel, Charu; Kuhlmann, Salma; Reznick, Bruce (May 2016). "On the Choi–Lam analogue of Hilbert's 1888 theorem for symmetric forms". Linear Algebra and Its Applications 496: 114–120. doi:10.1016/j.laa.2016.01.024. 
  4. Lasserre, Jean B. (2007). "Sufficient conditions for a real polynomial to be a sum of squares". Archiv der Mathematik 89 (5): 390–398. doi:10.1007/s00013-007-2251-y. http://www.optimization-online.org/DB_HTML/2007/02/1587.html. 
  5. Powers, Victoria; Wörmann, Thorsten (1998). "An algorithm for sums of squares of real polynomials". Journal of Pure and Applied Algebra 127 (1): 99–104. doi:10.1016/S0022-4049(97)83827-3. http://www.mathcs.emory.edu/~vicki/pub/sos.pdf. 
  6. Lasserre, Jean B. (2007). "A Sum of Squares Approximation of Nonnegative Polynomials". SIAM Review 49 (4): 651–669. doi:10.1137/070693709. Bibcode: 2007SIAMR..49..651L. 
  7. Chesi, G.; Tesi, A.; Vicino, A.; Genesio, R. (1999). "On convexification of some minimum distance problems". Karlsruhe, Germany: IEEE. pp. 1446–1451. https://ieeexplore.ieee.org/document/7099515. 
  8. Choi, M.; Lam, T.; Reznick, B. (1995). "Sums of squares of real polynomials". pp. 103–125. https://www.researchgate.net/publication/240268385. 
  9. Chesi, G.; Garulli, A.; Tesi, A.; Vicino, A. (2003). "Robust stability for polytopic systems via polynomially parameter-dependent Lyapunov functions". Maui, Hawaii: IEEE. pp. 4670–4675. doi:10.1109/CDC.2003.1272307. 
  10. Helton, J. William (September 2002). ""Positive" Noncommutative Polynomials Are Sums of Squares". The Annals of Mathematics 156 (2): 675–694. doi:10.2307/3597203. 
  11. Burgdorf, Sabine; Cafuta, Kristijan; Klep, Igor; Povh, Janez (25 October 2012). "Algorithmic aspects of sums of Hermitian squares of noncommutative polynomials". Computational Optimization and Applications 55 (1): 137–153. doi:10.1007/s10589-012-9513-8. 
