Chebyshev center
In geometry, the Chebyshev center of a bounded set [math]\displaystyle{ Q }[/math] having non-empty interior is the center of the minimal-radius ball enclosing the entire set [math]\displaystyle{ Q }[/math], or alternatively (and non-equivalently) the center of the largest inscribed ball of [math]\displaystyle{ Q }[/math].[1]
In the field of parameter estimation, the Chebyshev center approach seeks an estimator [math]\displaystyle{ \hat x }[/math] for [math]\displaystyle{ x }[/math] given the feasibility set [math]\displaystyle{ Q }[/math], such that [math]\displaystyle{ \hat x }[/math] minimizes the worst possible estimation error for [math]\displaystyle{ x }[/math] (i.e., the best worst case).
Mathematical representation
There exist several alternative representations for the Chebyshev center. Consider the set [math]\displaystyle{ Q }[/math] and denote its Chebyshev center by [math]\displaystyle{ \hat{x} }[/math]. [math]\displaystyle{ \hat{x} }[/math] can be computed by solving:
- [math]\displaystyle{ \min_{{\hat x},r} \left\{ r:\left\| {\hat x} - x \right\|^2 \leq r, \forall x \in Q \right\} }[/math]
with respect to the Euclidean norm [math]\displaystyle{ \|\cdot\| }[/math], or alternatively by solving:
- [math]\displaystyle{ \operatorname{\underset{\mathit{\hat{x}}}{argmin}} \max_{x \in Q} \left\| x - \hat x \right\|^2. }[/math][1]
Despite its simple definition, finding the Chebyshev center may be a hard numerical optimization problem. For example, in the second representation above, the inner maximization is non-convex if the set [math]\displaystyle{ Q }[/math] is not convex.
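For a concrete special case, if [math]\displaystyle{ Q }[/math] is a finite set of points the first formulation above becomes a small convex (second-order cone) problem, here written with the radius itself rather than its square. The following is a minimal sketch using cvxpy; the point set is an illustrative assumption, not taken from the references.

```python
import numpy as np
import cvxpy as cp

# Hypothetical finite set Q, given as points in R^2 (one point per row);
# illustrative data, not taken from the references.
points = np.array([[0.0, 0.0],
                   [2.0, 0.0],
                   [1.0, 1.5]])

x_hat = cp.Variable(2)   # candidate Chebyshev center
r = cp.Variable()        # enclosing radius

# min r  subject to  ||x - x_hat|| <= r  for every x in Q
constraints = [cp.norm(p - x_hat, 2) <= r for p in points]
problem = cp.Problem(cp.Minimize(r), constraints)
problem.solve()

print(x_hat.value, r.value)  # center and radius of the minimal enclosing ball
```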
Properties
In inner product spaces and two-dimensional spaces, if [math]\displaystyle{ Q }[/math] is closed, bounded and convex, then the Chebyshev center is in [math]\displaystyle{ Q }[/math]. In other words, the search for the Chebyshev center can be conducted inside [math]\displaystyle{ Q }[/math] without loss of generality.[2]
In other spaces, the Chebyshev center may not be in [math]\displaystyle{ Q }[/math], even if [math]\displaystyle{ Q }[/math] is convex. For instance, if [math]\displaystyle{ Q }[/math] is the tetrahedron formed by the convex hull of the points (1,1,1), (-1,1,1), (1,-1,1) and (1,1,-1), then computing the Chebyshev center using the [math]\displaystyle{ \ell_{\infty} }[/math] norm yields[3]
- [math]\displaystyle{ 0 = \operatorname{\underset{\mathit{\hat{x}}}{argmin}}\max _{x\in Q}\left\|x-{\hat {x}}\right\|_{\infty}^{2}. }[/math]
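A quick numerical check of this example (a sketch in numpy; the bounding-box argument is an elementary property of the [math]\displaystyle{ \ell_{\infty} }[/math] norm and is added here only for illustration):

```python
import numpy as np

# Vertices of the tetrahedron Q from the example above.
V = np.array([[ 1.0,  1.0,  1.0],
              [-1.0,  1.0,  1.0],
              [ 1.0, -1.0,  1.0],
              [ 1.0,  1.0, -1.0]])

# ||x - c||_inf is convex in x, so its maximum over the polytope Q is attained
# at a vertex; the smallest enclosing l_inf "ball" (a cube) of a point set is
# centered at the midpoint of the coordinate-wise bounding box.
center = (V.min(axis=0) + V.max(axis=0)) / 2
radius = np.abs(V - center).max()
print(center, radius)        # [0. 0. 0.] 1.0

# The center 0 lies outside Q: every point of Q satisfies x + y + z >= 1.
print(center.sum() >= 1)     # False
```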
Relaxed Chebyshev center
Consider the case in which the set [math]\displaystyle{ Q }[/math] can be represented as the intersection of [math]\displaystyle{ k }[/math] ellipsoids; the Chebyshev center problem then takes the form
- [math]\displaystyle{ \min_{\hat x} \max_x \left\{ \left\| {\hat x} - x \right\|^2 :f_i (x) \le 0,0 \le i \le k \right\} }[/math]
with
- [math]\displaystyle{ f_i (x) = x^T Q_i x + 2g_i^T x + d_i \le 0,0 \le i \le k. \, }[/math]
By introducing an additional matrix variable [math]\displaystyle{ \Delta = x x^T }[/math], the Chebyshev center problem can be rewritten as:
- [math]\displaystyle{ \min_{\hat x} \max_{(\Delta ,x) \in G} \left\{ \left\| {\hat x} \right\|^2 - 2{\hat x}^T x + \operatorname{Tr}(\Delta ) \right\} }[/math]
where [math]\displaystyle{ \operatorname{Tr}(\cdot) }[/math] is the trace operator and
- [math]\displaystyle{ G = \left\{(\Delta ,x): f_i (\Delta ,x) \le 0, 0 \le i \le k, \Delta = xx^T \right\} }[/math]
- [math]\displaystyle{ f_i (\Delta ,x) = \operatorname{Tr}(Q_i \Delta ) + 2g_i^T x + d_i. }[/math]
Relaxing the constraint on [math]\displaystyle{ \Delta }[/math] to [math]\displaystyle{ \Delta \ge xx^T }[/math], i.e. [math]\displaystyle{ \Delta - xx^T \in S_+ }[/math], where [math]\displaystyle{ S_+ }[/math] is the set of positive semi-definite matrices, and exchanging the order of the min and max (see the references for more details), the optimization problem can be formulated as:
- [math]\displaystyle{ RCC = \max_{(\Delta ,x) \in {T}} \left\{ - \left\| x \right\|^2 + \operatorname{Tr}(\Delta ) \right\} }[/math]
with
- [math]\displaystyle{ {T} = \left\{ (\Delta ,x):f_i (\Delta ,x) \le 0,0 \le i \le k,\Delta \ge xx^T \right\}. }[/math]
This last convex optimization problem is known as the relaxed Chebyshev center (RCC). The RCC has the following important properties:
- The RCC is an upper bound for the exact Chebyshev center.
- The RCC is unique.
- The RCC is feasible.
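The RCC is a semidefinite program, since the constraint [math]\displaystyle{ \Delta \ge xx^T }[/math] can be expressed through a Schur complement. The following is a minimal sketch with cvxpy; the ellipsoid data (a single unit-disk constraint) are illustrative assumptions, not taken from the references.

```python
import numpy as np
import cvxpy as cp

n = 2
# Illustrative ellipsoid data f_i(x) = x^T Q_i x + 2 g_i^T x + d_i <= 0:
# a single constraint describing the unit disk, ||x||^2 - 1 <= 0.
ellipsoids = [(np.eye(n), np.zeros((n, 1)), -1.0)]

# Z = [[Delta, x], [x^T, 1]] >= 0 encodes Delta >= x x^T via a Schur complement.
Z = cp.Variable((n + 1, n + 1), PSD=True)
Delta = Z[:n, :n]
x = Z[:n, n:]                       # column vector of shape (n, 1)

constraints = [Z[n, n] == 1]
for Qi, gi, di in ellipsoids:
    # relaxed constraint  Tr(Q_i Delta) + 2 g_i^T x + d_i <= 0
    constraints.append(cp.trace(Qi @ Delta) + 2 * gi.T @ x + di <= 0)

# maximize  -||x||^2 + Tr(Delta)
objective = cp.Maximize(cp.trace(Delta) - cp.sum_squares(x))
problem = cp.Problem(objective, constraints)
problem.solve(solver=cp.SCS)

print(x.value)   # relaxed Chebyshev center (RCC) estimate
```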
Constrained least squares
It can be shown that the well-known constrained least squares (CLS) problem is a relaxed version of the Chebyshev center.[citation needed]
The original CLS problem can be formulated as:
- [math]\displaystyle{ {\hat x}_{CLS} = \operatorname*{\arg\min}_{x \in C} \left\| y - Ax \right\|^2 }[/math]
with
- [math]\displaystyle{ { C} = \left\{ x:f_i (x) = x^T Q_i x + 2g_i^T x + d_i \le 0,1 \le i \le k \right\} }[/math]
- [math]\displaystyle{ Q_i \ge 0,g_i \in R^m ,d_i \in R. }[/math]
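The CLS problem is itself convex, since [math]\displaystyle{ Q_i \ge 0 }[/math], and can be solved directly. The following is a minimal sketch with cvxpy; the matrix [math]\displaystyle{ A }[/math], the observations [math]\displaystyle{ y }[/math] and the single quadratic constraint are illustrative assumptions.

```python
import numpy as np
import cvxpy as cp

# Illustrative data (assumptions for this sketch): a 3x2 model matrix A,
# observations y, and one quadratic constraint, the unit ball ||x||^2 <= 1.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 2.0]])
y = np.array([1.0, 2.0, 3.0])
Q1, g1, d1 = np.eye(2), np.zeros(2), -1.0

x = cp.Variable(2)
constraints = [cp.quad_form(x, Q1) + 2 * g1 @ x + d1 <= 0]
problem = cp.Problem(cp.Minimize(cp.sum_squares(y - A @ x)), constraints)
problem.solve()

print(x.value)   # the constrained least squares estimate
```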
It can be shown that this problem is equivalent to the following optimization problem:
- [math]\displaystyle{ \max_{(\Delta ,{{x}}) \in {V}} \left\{ { - \left\| {{x}} \right\|^2 + \operatorname{Tr}(\Delta )} \right\} }[/math]
with
- [math]\displaystyle{ V = \left\{ (\Delta ,x): x \in C,\ \operatorname{Tr}(A^T A\Delta ) - 2y^T A x + \left\| y \right\|^2 - \rho \le 0,\ \Delta \ge xx^T \right\}. }[/math]
One can see that this problem is a relaxation of the Chebyshev center (though different from the RCC described above).
RCC vs. CLS
Any feasible pair [math]\displaystyle{ (x,\Delta) }[/math] of the RCC is also feasible for the CLS relaxation, and thus [math]\displaystyle{ T \subseteq V }[/math]. This means that the CLS estimate is the solution of a looser relaxation than that of the RCC. Hence the CLS is an upper bound for the RCC, which in turn is an upper bound for the exact Chebyshev center.
Modeling constraints
Since both the RCC and CLS are based upon relaxation of the real feasibility set [math]\displaystyle{ Q }[/math], the form in which [math]\displaystyle{ Q }[/math] is defined affects its relaxed versions. This of course affects the quality of the RCC and CLS estimators. As a simple example consider the linear box constraints:
- [math]\displaystyle{ l \leq a^T x \leq u }[/math]
which can alternatively be written as
- [math]\displaystyle{ (a^T x - l)(a^T x - u) \leq 0. }[/math]
It turns out that the first (linear) representation results in an upper-bound estimator for the second (quadratic) one, so using it may dramatically decrease the quality of the calculated estimator.
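One way to see this (a sketch of the reasoning, not spelled out in the references): under the lifting [math]\displaystyle{ \Delta = xx^T }[/math] used above, the pair of linear constraints does not involve [math]\displaystyle{ \Delta }[/math] at all, whereas the single quadratic constraint relaxes to
- [math]\displaystyle{ \operatorname{Tr}(a a^T \Delta ) - (l + u) a^T x + lu \le 0, }[/math]
which also restricts [math]\displaystyle{ \Delta }[/math], giving a tighter relaxed feasibility set.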
This simple example shows that great care should be taken in formulating the constraints when a relaxation of the feasibility region is used.
Linear programming problem
The Chebyshev center (in the largest-inscribed-ball sense) can be formulated as a linear programming problem, provided that the region Q is an intersection of finitely many half-spaces.[4] Given a polytope [math]\displaystyle{ Q }[/math] defined as below, its center [math]\displaystyle{ \hat x }[/math] and the radius [math]\displaystyle{ r }[/math] of the largest inscribed ball can be found by solving the following linear program, in which [math]\displaystyle{ a_i^T }[/math] denotes the [math]\displaystyle{ i }[/math]-th row of [math]\displaystyle{ A }[/math].
- [math]\displaystyle{ Q = \{x\in R^n: Ax \leq b\} }[/math]
- [math]\displaystyle{ \begin{align} & \max_{r, \hat x} && r\\ & \text{s.t.} && a_i^T \hat x + \|a_i\| r \leq b_i \quad \forall i \\ & \text{and} && r \geq 0 \end{align} }[/math]
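The following is a minimal sketch of this linear program using scipy.optimize.linprog; the polytope (a unit square) is an illustrative assumption. Note that this program computes the center of the largest inscribed ball, the second of the two definitions given in the lead.

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative polytope Q = {x : Ax <= b}: the unit square 0 <= x1, x2 <= 1.
A = np.array([[-1.0,  0.0],
              [ 0.0, -1.0],
              [ 1.0,  0.0],
              [ 0.0,  1.0]])
b = np.array([0.0, 0.0, 1.0, 1.0])

row_norms = np.linalg.norm(A, axis=1)       # ||a_i|| for each row of A

# Decision variables z = (x_hat, r); maximizing r means minimizing -r.
c = np.r_[np.zeros(A.shape[1]), -1.0]
A_ub = np.hstack([A, row_norms[:, None]])   # a_i^T x_hat + ||a_i|| r <= b_i
bounds = [(None, None)] * A.shape[1] + [(0, None)]   # x_hat free, r >= 0

res = linprog(c, A_ub=A_ub, b_ub=b, bounds=bounds)
x_hat, r = res.x[:-1], res.x[-1]
print(x_hat, r)   # expected: center [0.5, 0.5], inscribed radius 0.5
```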
See also
- Bounding sphere
- Smallest-circle problem
- Circumscribed circle (covers circumcenter)
- Centre (geometry)
- Centroid
References
- 1. Boyd, Stephen P.; Vandenberghe, Lieven (2004). Convex Optimization. Cambridge University Press. ISBN 978-0-521-83378-3. https://web.stanford.edu/~boyd/cvxbook/bv_cvxbook.pdf#page=430. Retrieved October 15, 2011.
- 2. Amir, Dan (1984). "Best Simultaneous Approximation (Chebyshev Centers)". International Series of Numerical Mathematics. Birkhäuser. pp. 19–35. ISBN 9783034862530.
- 3. Dabbene, Fabrizio; Sznaier, Mario; Tempo, Roberto (August 2014). "Probabilistic Optimal Estimation With Uniformly Distributed Noise". IEEE Transactions on Automatic Control 59 (8): 2113–2127. doi:10.1109/tac.2014.2318092.
- 4. "Archived copy". http://www.ifor.math.ethz.ch/teaching/lectures/intro_ss11/Exercises/solutionEx11-12.pdf.
- Y. C. Eldar, A. Beck, and M. Teboulle, "A Minimax Chebyshev Estimator for Bounded Error Estimation," IEEE Trans. Signal Process., 56(4): 1388–1397 (2007).
- A. Beck and Y. C. Eldar, "Regularization in Regression with Bounded Noise: A Chebyshev Center Approach," SIAM J. Matrix Anal. Appl. 29 (2): 606–625 (2007).