## Generating positive polynomials that are not sums of squares

Motivation: “To prove that P=NP”. More seriously, there is a very powerful technique in optimization called Sum of Squares (SOS) of polynomials. For instance, if $f(x) = \sum_i g_i^2(x), x \in \mathbb{R}^n$, where $f(x)$ and $g_i(x)$ are polynomials, it is obvious that $f(x)$ is non-negative. Unfortunately, this does not work in the opposite direction: there are examples of positive polynomials that are not sums of squares of other polynomials. Therefore, automatic generation of non-negative polynomials that are not sums of squares would extend the applicability of the SOS technique. If we were able to generate enough such polynomials, the global minimum of any polynomial function $f(x)$ would become accessible; consequently, P would equal NP. This post has a more modest goal: to understand how to algorithmically generate a single positive polynomial that is not a sum of squares of other polynomials in arbitrary dimension. In a sense this is a formalization of the previous post.

Preliminaries

Let $x \in \mathbb{R}^n, \alpha \in \mathbb{ N }_{0}^n, \mathbb{X}^d = \{ X = \prod_k X_k^{\alpha_k} \mid \sum_k \alpha_k \leq d \}$, where $X_k$ is an arbitrary unique symbol denoting a variable. For example, $\mathbb{X}^0= \{ 1 \}, \mathbb{X}^1= \{ 1, X_1, X_2, \ldots, X_n \}$. Let $\mathbb{X}^d(x)$ denote the evaluation of the set $\mathbb{X}^d$ at the point $x$. For example, $\mathbb{X}^0(x)= \{ 1 \}$ for any point $x$, $\mathbb{X}^1( (0,1) )= \{ 1, 0, 1 \}$, and $\mathbb{X}^2( (0,1) )= \{ 1, 0, 1, 0, 1, 0 \}$, where $\mathbb{X}^2 = \{ 1, X_1, X_2, \ldots, X_n, X_1^2, X_2^2, \ldots, X_n^2, X_1 X_2, X_1 X_3, \ldots, X_{n-1} X_n \}$. Let a polynomial be a linear combination of elements of $\mathbb{X}^d$: $f(x) = \sum_k c_k \mathbb{X}^d_k$, where $\mathbb{X}^d_k$ enumerates the elements of the set.
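
Here is a minimal Python sketch of the monomial set $\mathbb{X}^d$ and its evaluation (function names are illustrative, and the internal ordering of the monomials may differ from the enumeration above):

```python
from itertools import combinations_with_replacement
from math import prod

def monomials(n, d):
    """Exponent vectors alpha in N_0^n with |alpha| <= d (the set X^d)."""
    result = []
    for deg in range(d + 1):
        for combo in combinations_with_replacement(range(n), deg):
            alpha = [0] * n
            for i in combo:
                alpha[i] += 1
            result.append(tuple(alpha))
    return result

def evaluate(monoms, x):
    """Evaluation X^d(x): each monomial prod_k x_k^alpha_k at the point x."""
    return [prod(xk ** ak for xk, ak in zip(x, a)) for a in monoms]
```

For instance, `evaluate(monomials(2, 1), (0, 1))` reproduces $\mathbb{X}^1((0,1)) = \{1, 0, 1\}$, and `monomials(n, d)` has $\binom{n+d}{d}$ elements.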

NP-complete integer partition problem

Let $a_i \in \mathbb{N}, \xi_i \in \{\pm1\}$. The integer partition problem asks whether $\exists \xi : \sum_k a_k \xi_k =0$, i.e. whether it is possible to divide a set of integers into two complementary subsets with equal sums. This problem is NP-complete.

Proposition 1 The integer partition problem is equivalent to the question whether the global minimum of the polynomial $\sum_k (X_k^2-1)^2 + \left( \sum_k a_k X_k \right)^2$ is greater than zero or not.

Proof Since the polynomial is a sum of squares, it is non-negative and can be zero only when each term is zero. The first sum is zero only at points with $x_k \in \{\pm1\}$, and at such points the second term is zero only when a partition exists $\blacksquare$
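
The proposition can be checked by brute force on small instances: the polynomial attains zero somewhere iff it attains zero at a point of $\{\pm1\}^n$, so it suffices to scan all sign vectors (exponential, of course — a sanity check, not an algorithm):

```python
from itertools import product

def partition_poly(a, x):
    """The quartic from Proposition 1: sum_k (x_k^2 - 1)^2 + (sum_k a_k x_k)^2."""
    return (sum((xk * xk - 1) ** 2 for xk in x)
            + sum(ak * xk for ak, xk in zip(a, x)) ** 2)

def min_over_signs(a):
    """Minimum over x in {-1, +1}^n; the global minimum over R^n
    is zero iff this restricted minimum is zero."""
    return min(partition_poly(a, x) for x in product((-1, 1), repeat=len(a)))
```

For example, `min_over_signs([3, 1, 2])` is `0` (since $3 = 1 + 2$), while `min_over_signs([1, 1, 3])` is `1` (no partition exists).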

Corollary To prove P=NP it is sufficient to decide in polynomial time whether the global minimum of a quartic polynomial is zero.
Proof Obvious $\blacksquare$

Here we would like to establish some properties of quadratic polynomials on the hyper-cube $\mathbb{H} = \{0, 1\}^n$.

Proposition 2 Consider a quadratic polynomial $f(x)$. If $f(x) = 0, \forall x~\in~\tilde{ \mathbb{H} }$, where $\tilde{ \mathbb{H} } = \{ x: x~\in~\mathbb{H}, \sum_k x_k \leq 2 \}$, then $f(x) =0, \forall x \in \mathbb{H}$.

Proof
$f(x)= c_1+ \sum_k \left( c_{X_k} X_k + c_{X_k^2} X_k^2 \right) + \sum \limits_{j \neq k} c_{X_j X_k} X_j X_k$
Consequently,
$x=0 \Rightarrow c_1=0$.
$\sum_k x_k =1 \Rightarrow x_k=1, x_{j \neq k}=0 \Rightarrow c_{X_k} + c_{X_k^2}=0$.
$\sum_k x_k =2 \Rightarrow x_k =1, x_j=1, x_{ l \neq k,j }=0 \Rightarrow c_{X_kX_j}=0$, taking the second case into account.

Finally, $f(x) = \sum_k c_{X_k} X_k(1-X_k)$, which is clearly $0$ for $x \in \mathbb{H}$, since in each term of the sum either the first or the second factor is $0$ $\blacksquare$
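
For $n=3$ the proposition can also be confirmed exhaustively over a small grid of integer coefficients (a numerical sanity check on top of the proof; the coefficient ordering below is an assumption of this sketch):

```python
from itertools import product

def quad_val(c, x):
    """Quadratic in n = 3 variables; coefficient order:
    (c_1, c_X1, c_X2, c_X3, c_X1^2, c_X2^2, c_X3^2, c_X1X2, c_X1X3, c_X2X3)."""
    x1, x2, x3 = x
    basis = (1, x1, x2, x3, x1 * x1, x2 * x2, x3 * x3, x1 * x2, x1 * x3, x2 * x3)
    return sum(ck * bk for ck, bk in zip(c, basis))

H = list(product((0, 1), repeat=3))
H_small = [x for x in H if sum(x) <= 2]   # H-tilde: here H minus (1, 1, 1)

# Vanishing on H-tilde forces vanishing on all of H,
# checked exhaustively over a small integer coefficient grid.
for c in product((-1, 0, 1), repeat=10):
    if all(quad_val(c, x) == 0 for x in H_small):
        assert all(quad_val(c, x) == 0 for x in H)
```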

Quartics that vanish on $\tilde{ \mathbb{H} }$
$f(x)= c_1+ \sum_k \left( c_{X_k} X_k + c_{X_k^2} X_k^2 + c_{X_k^3}X_k^3 +c_{X_k^4}X_k^4 \right) +\sum \limits_{j \neq k} \left( c_{X_j X_k} X_j X_k + c_{X_j^2 X_k} X_j^2 X_k+ c_{X_j^3 X_k} X_j^3 X_k+ c_{X_j^2 X_k^2} X_j^2 X_k^2 \right) + \sum \limits_{j \neq k \neq m} \left( c_{X_j X_k X_m} X_j X_k X_m + c_{X_j^2 X_k X_m} X_j^2 X_k X_m \right) + \sum \limits_{j \neq k \neq m \neq l} c_{X_j X_k X_m X_l} X_j X_k X_m X_l$.

Consequently,
$x=0 \Rightarrow c_1=0$.
$\sum_k x_k =1 \Rightarrow x_k=1, x_{j \neq k}=0 \Rightarrow c_{X_k} + c_{X_k^2} + c_{X_k^3} + c_{X_k^4}=0$.
$\sum_k x_k =2 \Rightarrow x_k =1, x_j=1, x_{ l \neq k,j }=0 \Rightarrow c_{X_j X_k} + c_{X_j^2 X_k}+ c_{X_j^3 X_k}+ c_{X_j^2 X_k^2}=0$, taking the second case into account.

Additionally, we want the quartics to preserve sign around $x \in \tilde{ \mathbb{H} }$, since we are interested in positive polynomials. A necessary condition is that the gradient vanishes at the corresponding points.

$\frac{ \partial f(x)}{ \partial X_q }= (c_{X_k} + 2 c_{X_k^2} X_k + 3 c_{X_k^3}X_k^2 + 4 c_{X_k^4}X_k^3) \delta_{k,q} +\sum \limits_{j \neq k} (c_{X_j X_k} X_j + c_{X_j^2 X_k} X_j^2+ c_{X_j^3 X_k} X_j^3+ 2 c_{X_j^2 X_k^2} X_j^2 X_k) \delta_{k,q}+ ( 2 c_{X_j^2 X_k} X_j X_k+ 3 c_{X_j^3 X_k} X_j^2 X_k) \delta_{j,q} + \sum \limits_{j \neq k \neq m} c_{X_j X_k X_m} X_j X_m \delta_{k,q} + c_{X_j^2 X_k X_m} (X_j^2 X_m \delta_{k,q}+2 X_j X_k X_m\delta_{j,q}) + \sum \limits_{j \neq k \neq m \neq l} c_{X_j X_k X_m X_l} X_j X_m X_l \delta_{k,q}$.

Consequently,
$x=0 \Rightarrow c_{X_k}=0$.
$\sum_k x_k =1$ splits into two cases: $x_q=1$ and $x_q=0$.
$x_q=1 \Rightarrow 2 c_{X_q^2} + 3 c_{X_q^3} + 4 c_{X_q^4} = 0$, which together with $c_{X_k^2} + c_{X_k^3} + c_{X_k^4}=0$ leads to $c_{X_k^3}= -2 c_{X_k^2}, c_{X_k^4}= c_{X_k^2}$.
$x_q=0 \Rightarrow c_{X_j X_q} + c_{X_j^2 X_q} + c_{X_j^3 X_q}=0, \forall j,q$.
$\sum_k x_k =2$ splits again into the same two cases.
$x_q=1 \Rightarrow 2 c_{X_j^2 X_q^2} + 2 c_{X_q^2 X_j } +3 c_{X_q^3 X_j }= 0$, which together with $c_{X_j X_q} + c_{X_j^2 X_q} + c_{X_j^3 X_q}=0$ and $c_{X_j X_k} + c_{X_j^2 X_k}+ c_{X_j^3 X_k}+ c_{X_j^2 X_k^2}=0$ leads to $c_{X_j^2 X_q^2}= 0, c_{X_q^2 X_j} = -3 c_{X_q X_j}, c_{X_q^3 X_j}= 2 c_{X_q X_j}$.
$x_q=0 \Rightarrow c_{X_j X_m X_q} + c_{X_j^2 X_m X_q} + c_{X_m^2 X_j X_q}= 0$. The last condition, applied to all rotations of the triplet, leads to the Robinson perturbation term: $c_{X_j^2 X_m X_q}= c_{X_j X_m^2 X_q} = c_{X_j X_m X_q^2}= - \frac{1}{2} c_{X_j X_m X_q}$.

Combining all conditions together we get
$f(x)= \sum_k c_{X_k^2} X_k^2(X_k-1)^2 +\sum \limits_{j \neq k} c_{X_j X_k} ( X_j X_k - X_j^2 X_k- X_j X_k^2+X_j^2 X_k^2)+ \sum \limits_{j \neq k \neq m} c_{X_j X_k X_m}(-2 X_j X_k X_m + X_j^2 X_k X_m +X_j X_k^2 X_m +X_j X_k X_m^2)+ \sum \limits_{j \neq k \neq m \neq l} c_{X_j X_k X_m X_l} X_j X_k X_m X_l.$
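
As a sanity check, here is a sketch for $n=4$ with the term groups rewritten in factored form (this rewriting, and summation over unordered tuples, are assumptions of the sketch): such a quartic and its gradient should vanish everywhere on $\tilde{ \mathbb{H} }$ for arbitrary coefficients. The five-point difference is exact for quartics, so the assertions are exact.

```python
from itertools import product, combinations
from fractions import Fraction
import random

random.seed(1)
n = 4
c2   = [random.randint(-3, 3) for _ in range(n)]
cjk  = {p: random.randint(-3, 3) for p in combinations(range(n), 2)}
cjkm = {t: random.randint(-3, 3) for t in combinations(range(n), 3)}
c4   = random.randint(-3, 3)

def f(x):
    """The combined quartic, term groups in factored form:
    X_k^2 (X_k - 1)^2                  per variable,
    X_j X_k (X_j - 1)(X_k - 1)         per pair,
    X_j X_k X_m (X_j + X_k + X_m - 2)  per triple (Robinson-type term),
    X_1 X_2 X_3 X_4                    for the quadruple."""
    v = sum(c2[k] * (x[k] * (x[k] - 1)) ** 2 for k in range(n))
    v += sum(cjk[(j, k)] * x[j] * x[k] * (x[j] - 1) * (x[k] - 1)
             for j, k in combinations(range(n), 2))
    v += sum(cjkm[(j, k, m)] * x[j] * x[k] * x[m] * (x[j] + x[k] + x[m] - 2)
             for j, k, m in combinations(range(n), 3))
    v += c4 * x[0] * x[1] * x[2] * x[3]
    return v

def partial(x, q, h=Fraction(1, 3)):
    """d f / d x_q via the five-point central difference,
    which is exact for polynomials of degree <= 4."""
    def g(t):
        y = list(x)
        y[q] = Fraction(y[q]) + t
        return f(y)
    return (-g(2 * h) + 8 * g(h) - 8 * g(-h) + g(-2 * h)) / (12 * h)

# f and its gradient vanish everywhere on H-tilde, for any coefficients.
H_small = [x for x in product((0, 1), repeat=n) if sum(x) <= 2]
for x in H_small:
    assert f(x) == 0
    assert all(partial(x, q) == 0 for q in range(n))
```

Note that the single-variable and pair groups in fact vanish on all of $\mathbb{H}$; only the triple (Robinson) and quadruple groups distinguish $\tilde{ \mathbb{H} }$ from $\mathbb{H}$.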

Now we need to analyze the possible values of the coefficients that guarantee positivity of the quartic polynomial.

Update Jan 6 2016

What is interesting is that
(1) if a quartic is positive and has $n+1$ zeros forming an $n$-simplex, then its form is constrained. After a linear transformation this simplex can be mapped to the standard orthogonal simplex with zeros at
(0,0,0,…, 0),
(1,0,0,…, 0),
(0,1,0,…, 0),
(0,0,0,…, 1).
This leads to constraints on the coefficients of positive quartic polynomials. For instance, the zero at the origin leads to $c_1=0$, and the zero gradient there (necessary for the quartic to be non-negative) leads to $c_{X_k}= 0$. Zeros at the other vertices lead to the constraint $c_{X_k^4}+c_{X_k^3}+c_{X_k^2}= 0$, whereas the zero gradient gives $\frac{\partial}{\partial X_k} \Rightarrow 2 c_{X_k^2} + 3 c_{X_k^3} + 4 c_{X_k^4} = 0$. The null-space of these two equations is spanned by $(1, -2, 1)$, i.e. $c_{X_k^2}= c_k, c_{X_k^3}= -2c_k, c_{X_k^4}= c_k$ for any $c_k$, which leads to the polynomial
$\sum_k c_k X_k^2(X_k-1)^2 + f(x)$, where $f(x)$ has no single-variable terms. Therefore, this simple requirement leads to a representation of a positive quartic polynomial as a sum of squares plus some perturbation.

(2) It is sufficient to spread enough zeros across the space, not necessarily on the corners of the hyper-cube, to nullify all coefficients of a quadratic. There are $n_2=\frac{(n+2)(n+1)}{2}$ coefficients in a quadratic. For the corresponding quartic we need at least $\frac{(n+2)(n+1)}{2}$ generic points where the quartic and its gradient vanish. That gives $\frac{(n+2)(n+1)^2}{2}$ constraints ($n+1$ per point: the value plus $n$ gradient components). A quartic in $n$ variables has $\binom{n+4}{4}$ coefficients, and we need more coefficients than constraints to retain some freedom, which is achievable for dimension greater than 5. This suggests that the perturbation can be very strong.
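
The counting argument can be tabulated directly (a sketch; "surplus" is the number of quartic coefficients minus the number of constraints):

```python
from math import comb

# Coefficient/constraint counting for n = 4..8. A degree-d polynomial
# in n variables has comb(n + d, d) coefficients.
for n in range(4, 9):
    n2 = comb(n + 2, 2)              # quadratic coefficients = points needed
    constraints = n2 * (n + 1)       # value + n gradient components per point
    surplus = comb(n + 4, 4) - constraints
    print(n, surplus)                # surplus: -5, 0, 14, 42, 90
```

The surplus is negative for $n=4$, exactly zero for $n=5$, and positive from $n=6$ on, matching the claim that dimension greater than 5 suffices.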

To be continued.