Quadratic unconstrained binary optimization

From HandWiki
Short description: Combinatorial optimization problem

Quadratic unconstrained binary optimization (QUBO), also known as unconstrained binary quadratic programming (UBQP), is a combinatorial optimization problem with a wide range of applications from finance and economics to machine learning.[1] QUBO is an NP-hard problem, and for many classical problems from theoretical computer science, like maximum cut, graph coloring and the partition problem, embeddings into QUBO have been formulated.[2][3] Embeddings for machine learning models include support-vector machines, clustering and probabilistic graphical models.[4] Moreover, due to its close connection to Ising models, QUBO constitutes a central problem class for adiabatic quantum computation, where it is solved through a physical process called quantum annealing.[5]

Definition

Let $\mathbb{B} = \{0, 1\}$ be the set of binary digits (or bits); then $\mathbb{B}^n$ is the set of binary vectors of fixed length $n$. Given a symmetric or upper triangular matrix $Q \in \mathbb{R}^{n \times n}$, whose entries $Q_{ij}$ define a weight for each pair of indices $i, j \in \{1, \dots, n\}$, we can define the function $f_Q : \mathbb{B}^n \to \mathbb{R}$ that assigns a value to each binary vector $x$ through

$$f_Q(x) = x^\top Q x = \sum_{i=1}^n \sum_{j=1}^n Q_{ij} x_i x_j.$$

Alternatively, the linear and quadratic parts can be separated as

$$f_{Q,q}(x) = x^\top Q x + q^\top x,$$

where $Q \in \mathbb{R}^{n \times n}$ and $q \in \mathbb{R}^n$. This is equivalent to the previous definition through $Q' = Q + \operatorname{diag}(q)$, where $\operatorname{diag}(q)$ denotes the diagonal matrix with $q$ on its diagonal, exploiting that $x^2 = x$ for all binary values $x$.

Intuitively, the weight $Q_{ij}$ is added if both $x_i = 1$ and $x_j = 1$. The QUBO problem consists of finding a binary vector $x^*$ that minimizes $f_Q$, i.e., $f_Q(x^*) \leq f_Q(x)$ for all $x \in \mathbb{B}^n$.

In general, $x^*$ is not unique, meaning there may be a set of minimizing vectors with equal value w.r.t. $f_Q$. The complexity of QUBO arises from the number of candidate binary vectors to be evaluated, as $|\mathbb{B}^n| = 2^n$ grows exponentially in $n$.

Sometimes, QUBO is defined as the problem of maximizing $f_Q$, which is equivalent to minimizing $f_{-Q} = -f_Q$.
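For small instances, the definition above can be checked directly by exhaustive search; the following minimal sketch (function and variable names are ours, not from any particular library) evaluates $f_Q$ and enumerates all $2^n$ candidates:

```python
import itertools
import numpy as np

def qubo_value(Q, x):
    """Evaluate f_Q(x) = x^T Q x for a binary vector x."""
    x = np.asarray(x)
    return x @ Q @ x

def solve_qubo_brute_force(Q):
    """Minimize f_Q by enumerating all 2^n binary vectors.
    Only feasible for small n, since the search space grows exponentially."""
    n = Q.shape[0]
    best_x, best_val = None, float("inf")
    for bits in itertools.product([0, 1], repeat=n):
        val = qubo_value(Q, bits)
        if val < best_val:
            best_x, best_val = np.array(bits), val
    return best_x, best_val

# Tiny example in upper-triangular form: negative diagonal
# weights favor setting bits, a positive coupling penalizes
# setting both bits at once.
Q = np.array([[-1.0, 2.0],
              [ 0.0, -1.0]])
x_opt, val = solve_qubo_brute_force(Q)  # optimum sets exactly one bit
```

The exponential loop is exactly why heuristics, ILP reformulations and quantum annealers (discussed below in this article) are of practical interest.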

Properties

QUBO is scale invariant for positive factors $\alpha > 0$, which leave the optimum $x^*$ unchanged:

$$f_{\alpha Q}(x) = x^\top (\alpha Q)\, x = \alpha\,(x^\top Q x) = \alpha f_Q(x).$$

In its general form, QUBO is NP-hard, so no polynomial-time algorithm is known that solves every instance.[6] However, there are polynomially-solvable special cases, where $Q$ has certain properties,[7] for example:

  • If all coefficients are positive, the optimum is trivially $x^* = (0, \dots, 0)^\top$. Similarly, if all coefficients are negative, the optimum is $x^* = (1, \dots, 1)^\top$.
  • If $Q$ is diagonal, the bits can be optimized independently, and the problem is solvable in $\mathcal{O}(n)$. The optimal variable assignments are simply $x_i^* = 1$ if $Q_{ii} < 0$, and $x_i^* = 0$ otherwise.
  • If all off-diagonal elements of $Q$ are non-positive, the corresponding QUBO problem is solvable in polynomial time.[8]
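The diagonal special case above can be sketched in a few lines (an illustrative implementation; the function name is ours):

```python
import numpy as np

def solve_diagonal_qubo(Q):
    """Solve a QUBO instance with diagonal Q in O(n):
    each bit contributes Q[i, i] * x_i independently, so set
    x_i = 1 exactly when its diagonal weight is negative."""
    d = np.diag(Q)
    x = (d < 0).astype(int)
    return x, float(d[d < 0].sum())

Q = np.diag([-3.0, 2.0, -1.0, 0.5])
x_opt, val = solve_diagonal_qubo(Q)  # x_opt = [1, 0, 1, 0], val = -4.0
```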

QUBO can be solved using integer linear programming solvers like CPLEX or Gurobi Optimizer. This is possible since QUBO can be reformulated as a linear constrained binary optimization problem. To achieve this, substitute the product $x_i x_j$ by an additional binary variable $z_{ij} \in \mathbb{B}$ and add the constraints $x_i \geq z_{ij}$, $x_j \geq z_{ij}$ and $x_i + x_j - 1 \leq z_{ij}$. Note that $z_{ij}$ can also be relaxed to a continuous variable within the bounds zero and one.
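That these three constraints force $z_{ij} = x_i x_j$ for binary $x_i, x_j$ can be verified exhaustively; the small self-contained check below (not solver code) enumerates all binary combinations:

```python
import itertools

def feasible_z(xi, xj):
    """Return all binary z satisfying the linearization constraints
    x_i >= z, x_j >= z, and x_i + x_j - 1 <= z."""
    return [z for z in (0, 1) if xi >= z and xj >= z and xi + xj - 1 <= z]

# For every binary pair, the only feasible z equals the product x_i * x_j,
# so the linearized problem is equivalent to the original QUBO.
for xi, xj in itertools.product((0, 1), repeat=2):
    assert feasible_z(xi, xj) == [xi * xj]
```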

Applications

QUBO is a structurally simple, yet computationally hard optimization problem. It can be used to encode a wide range of optimization problems from various scientific areas.[9]

Maximum Cut

Given a graph $G = (V, E)$ with vertex set $V = \{1, \dots, n\}$ and edges $E \subseteq V \times V$, the maximum cut (max-cut) problem consists of finding two subsets $S, T \subseteq V$ with $T = V \setminus S$, such that the number of edges between $S$ and $T$ is maximized.

The more general weighted max-cut problem assumes edge weights $w_{ij} \geq 0$ for all $i, j \in V$, with $w_{ij} = 0$ whenever $(i, j) \notin E$, and asks for a partition $S, T \subseteq V$ that maximizes the sum of edge weights between $S$ and $T$, i.e.,

$$\max_{S \subseteq V}\; \sum_{i \in S,\, j \notin S} w_{ij}.$$

By setting $w_{ij} = 1$ for all $(i, j) \in E$, this becomes equivalent to the original max-cut problem above, which is why we focus on this more general form in the following.

For every vertex $i \in V$ we introduce a binary variable $x_i$ with the interpretation $x_i = 0$ if $i \in S$ and $x_i = 1$ if $i \in T$. As $T = V \setminus S$, every $i$ is in exactly one set, meaning there is a one-to-one correspondence between binary vectors $x \in \mathbb{B}^n$ and partitions of $V$ into two subsets.

We observe that, for any $i, j \in V$, the expression $x_i(1 - x_j) + (1 - x_i)x_j$ evaluates to 1 if and only if $i$ and $j$ are in different subsets, equivalent to a logical XOR. Let $W \in \mathbb{R}_+^{n \times n}$ with $W_{ij} = w_{ij}$ for all $i, j \in V$. Extending the above expression to matrix-vector form, we find that

$$x^\top W (\mathbf{1} - x) + (\mathbf{1} - x)^\top W x = -2\,x^\top W x + (W\mathbf{1} + W^\top \mathbf{1})^\top x$$

is the sum of weights of all edges between $S$ and $T$, where $\mathbf{1} = (1, 1, \dots, 1)^\top \in \mathbb{R}^n$. As this is a quadratic function over $x$, it is a QUBO problem whose parameter matrix we can read from the above expression as

$$Q = 2W - \operatorname{diag}(W\mathbf{1} + W^\top \mathbf{1}),$$

after flipping the sign to make it a minimization problem.
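The construction above can be sketched in code; in this illustration (helper names are ours) the resulting QUBO matrix is checked against a direct cut computation on a small triangle graph:

```python
import itertools
import numpy as np

def maxcut_qubo(W):
    """Build the minimization QUBO matrix Q = 2W - diag(W·1 + Wᵀ·1)
    for weighted max-cut with weight matrix W."""
    r = W.sum(axis=1) + W.sum(axis=0)  # W·1 + Wᵀ·1
    return 2 * W - np.diag(r)

def cut_weight(W, x):
    """Total weight of edges crossing the cut defined by binary vector x."""
    x = np.asarray(x)
    return x @ W @ (1 - x) + (1 - x) @ W @ x

# Triangle graph with unit weights, stored in upper-triangular form
W = np.array([[0, 1, 1],
              [0, 0, 1],
              [0, 0, 0]], dtype=float)
Q = maxcut_qubo(W)

# Minimizing x^T Q x is equivalent to maximizing the cut weight;
# for a triangle, any best cut crosses 2 of the 3 edges.
best = min(itertools.product([0, 1], repeat=3),
           key=lambda x: np.array(x) @ Q @ np.array(x))
```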

Cluster Analysis

[Figure: Binary clustering with QUBO. Two panels show 20 points with a bad (random) and a good cluster assignment. Circles of the same color belong to the same cluster; each circle can be understood as a binary variable in the corresponding QUBO problem.]

Next, we consider the problem of cluster analysis, where we are given a set of $N$ points in $d$-dimensional space and want to assign each point to one of two classes or clusters, such that points in the same cluster are similar to each other. For this example we set $N = 20$ and $d = 2$. The data is given as a matrix $X \in \mathbb{R}^{20 \times 2}$, where each row contains two Cartesian coordinates. For two clusters, we can assign a binary variable $x_i \in \mathbb{B}$ to the point corresponding to the $i$-th row in $X$, indicating whether it belongs to the first ($x_i = 0$) or second cluster ($x_i = 1$). Consequently, we have 20 binary variables, which form a binary vector $x \in \mathbb{B}^{20}$ that corresponds to a cluster assignment of all points (see figure).

One way to derive a clustering is to consider the pairwise distances between points. Given a cluster assignment $x$, the expression $x_i x_j + (1 - x_i)(1 - x_j)$ evaluates to 1 if points $i$ and $j$ are in the same cluster. Similarly, $x_i(1 - x_j) + (1 - x_i)x_j = 1$ indicates that they are in different clusters. Let $d_{ij} > 0$ denote the Euclidean distance between points $i$ and $j$, i.e.,

$$d_{ij} = \lVert X_i - X_j \rVert,$$

where $X_i$ is the $i$-th row of $X$.

In order to define a cost function to minimize, we add the positive distance $d_{ij}$ when points $i$ and $j$ are in the same cluster, and subtract it when they are in different clusters. This way, an optimal solution tends to place points which are far apart into different clusters, and points that are close into the same cluster.

Let $D \in \mathbb{R}^{N \times N}$ with $D_{ij} = d_{ij}/2$ for all $i, j \in \{1, \dots, N\}$. Given an assignment $x \in \mathbb{B}^N$, such a cost function is given by

$$\begin{aligned} f(x) &= x^\top D x - x^\top D(\mathbf{1} - x) - (\mathbf{1} - x)^\top D x + (\mathbf{1} - x)^\top D(\mathbf{1} - x) \\ &= 4\,x^\top D x - 4 \cdot \mathbf{1}^\top D x + \mathbf{1}^\top D \mathbf{1}, \end{aligned}$$

where $\mathbf{1} = (1, 1, \dots, 1)^\top \in \mathbb{R}^N$.

From the second line we can see that this expression can be re-arranged to a QUBO problem by defining

$$Q = 4D - 4\operatorname{diag}(D\mathbf{1})$$

and ignoring the constant term $\mathbf{1}^\top D \mathbf{1}$. Using these parameters, a binary vector minimizing this QUBO instance $Q$ will correspond to an optimal cluster assignment w.r.t. the above cost function.
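This construction can be sketched as follows; in this illustration (function names are ours) a brute-force minimizer stands in for a real QUBO solver, and a tiny four-point data set replaces the 20-point example:

```python
import itertools
import numpy as np

def clustering_qubo(X):
    """Build Q = 4D - 4·diag(D·1) from pairwise Euclidean
    distances, with D_ij = d_ij / 2."""
    diff = X[:, None, :] - X[None, :, :]
    D = np.linalg.norm(diff, axis=-1) / 2
    return 4 * D - 4 * np.diag(D.sum(axis=1))

# Four points: two tight pairs, far apart from each other
X = np.array([[0.0, 0.0], [0.1, 0.0],
              [5.0, 5.0], [5.1, 5.0]])
Q = clustering_qubo(X)

# Brute-force minimization assigns each tight pair its own cluster.
best = min(itertools.product([0, 1], repeat=4),
           key=lambda x: np.array(x) @ Q @ np.array(x))
```

Since swapping the cluster labels leaves the cost unchanged, the optimum is only unique up to this symmetry.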

Connection to Ising models

QUBO is very closely related and computationally equivalent to the Ising model, whose Hamiltonian function is defined as

$$H(\sigma) = \sigma^\top J \sigma + h^\top \sigma = \sum_{i,j} J_{ij}\,\sigma_i \sigma_j + \sum_j h_j\,\sigma_j$$

with real-valued parameters $h_j, J_{ij}$ for all $i, j$. The spin variables $\sigma_j$ are binary with values from $\{-1, +1\}$ instead of $\mathbb{B}$. Note that this formulation is simplified, since, in a physics context, the $\sigma_i$ are typically Pauli operators, which are complex-valued matrices of size $2^n \times 2^n$, whereas here we treat them as binary variables. Many formulations of the Ising model Hamiltonian further assume that the variables are arranged in a lattice, where only neighboring pairs of variables $(i, j)$ can have non-zero coefficients; here, we simply assume that $J_{ij} = 0$ if $i$ and $j$ are not neighbors.

Applying the identity $\sigma = \mathbf{1} - 2x$, which maps $x_i = 0$ to $\sigma_i = +1$ and $x_i = 1$ to $\sigma_i = -1$, yields an equivalent QUBO problem:[10]

$$\begin{aligned} \sigma^\top J \sigma + h^\top \sigma &= (\mathbf{1} - 2x)^\top J (\mathbf{1} - 2x) + h^\top (\mathbf{1} - 2x) \\ &= 4\,x^\top J x - 4 \cdot \mathbf{1}^\top J x + \mathbf{1}^\top J \mathbf{1} - 2\,h^\top x + h^\top \mathbf{1} \\ &= x^\top (4J)\, x - (4 J \mathbf{1} + 2h)^\top x + \underbrace{\mathbf{1}^\top J \mathbf{1} + h^\top \mathbf{1}}_{\text{const.}}, \end{aligned}$$

whose weight matrix Q is given by

$$Q = 4J - \operatorname{diag}(4 J \mathbf{1} + 2h),$$

again ignoring the constant term, which does not affect the minimization. Using the inverse identity $x = (\mathbf{1} - \sigma)/2$, a QUBO problem with matrix $Q$ can be converted to an equivalent Ising model using the same technique, yielding

$$J = Q/4, \qquad h = -(Q\mathbf{1} + Q^\top \mathbf{1})/4,$$

and a constant offset of $\mathbf{1}^\top Q \mathbf{1}/4$.[10]
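The QUBO-to-Ising conversion can be checked numerically on a small instance; the sketch below (function name is ours) verifies that $f_Q(x)$ equals the Ising energy plus the constant offset for every binary vector:

```python
import itertools
import numpy as np

def qubo_to_ising(Q):
    """Convert a QUBO matrix Q to Ising parameters (J, h, offset)
    under the substitution x = (1 - sigma) / 2."""
    n = Q.shape[0]
    one = np.ones(n)
    J = Q / 4
    h = -(Q @ one + Q.T @ one) / 4
    offset = one @ Q @ one / 4
    return J, h, offset

Q = np.array([[-1.0, 2.0],
              [ 0.0, -3.0]])
J, h, offset = qubo_to_ising(Q)

# Check: f_Q(x) = H(sigma) + offset for every binary x,
# where sigma = 1 - 2x and H(s) = s^T J s + h^T s.
for x in itertools.product([0, 1], repeat=2):
    x = np.array(x)
    s = 1 - 2 * x
    assert np.isclose(x @ Q @ x, s @ J @ s + h @ s + offset)
```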

References

  1. Kochenberger, Gary; Hao, Jin-Kao; Glover, Fred; Lewis, Mark; Lu, Zhipeng; Wang, Haibo; Wang, Yang (2014). "The unconstrained binary quadratic programming problem: a survey.". Journal of Combinatorial Optimization 28: 58–81. doi:10.1007/s10878-014-9734-0. https://leeds-faculty.colorado.edu/glover/454%20-%20xQx%20survey%20article%20as%20published%202014.pdf. 
  2. Glover, Fred; Kochenberger, Gary (2019). "A Tutorial on Formulating and Using QUBO Models". arXiv:1811.11538 [cs.DS].
  3. Lucas, Andrew (2014). "Ising formulations of many NP problems". Frontiers in Physics 2: 5. doi:10.3389/fphy.2014.00005. Bibcode2014FrP.....2....5L. 
  4. Mücke, Sascha; Piatkowski, Nico; Morik, Katharina (2019). "Learning Bit by Bit: Extracting the Essence of Machine Learning". LWDA. https://pdfs.semanticscholar.org/f484/b4a789e1563b91a416a7cfabbf72f0aa3b2a.pdf. 
  5. Tom Simonite (8 May 2013). "D-Wave's Quantum Computer Goes to the Races, Wins". MIT Technology Review. http://www.technologyreview.com/view/514686/d-waves-quantum-computer-goes-to-the-races-wins/. 
  6. A. P. Punnen (editor), The Quadratic Unconstrained Binary Optimization Problem: Theory, Algorithms, and Applications, Springer, 2022.
  7. Çela, E., Punnen, A.P. (2022). Complexity and Polynomially Solvable Special Cases of QUBO. In: Punnen, A.P. (eds) The Quadratic Unconstrained Binary Optimization Problem. Springer, Cham. https://doi.org/10.1007/978-3-031-04520-2_3
  8. See Theorem 3.16 in Punnen (2022); note that the authors assume the maximization version of QUBO.
  9. Ratke, Daniel (2021-06-10). "List of QUBO formulations". https://blog.xa0.de/post/List-of-QUBO-formulations/. 
  10. Mücke, S. (2025). Quantum-Classical Optimization in Machine Learning. Shaker Verlag. https://d-nb.info/1368090214