Convex optimization
These problems are handpicked from my projects in Convex Analysis & Optimization and Real Analysis. The solutions and proofs are my own and are linked as PDFs under the section titles.
Published: 2021-09-13
Semicontinuity
Let $f : \R^n \to \R$ be a lower semicontinuous function, and $g : \R \to \R$ be a lower semicontinuous and nondecreasing function. Show that the composition $h = g \circ f$ is lower semicontinuous. Give an example showing that the nondecreasing assumption is essential.
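A quick Python sketch of one possible counterexample for the last part (my own choice, not from the linked PDF): take $g(t) = -t$, which is continuous, hence lsc, but decreasing.

```python
# One candidate counterexample: g(t) = -t is lsc (continuous) but
# decreasing, and the lsc step function f below makes h = g ∘ f
# fail lower semicontinuity at 0.
def f(x):
    return 0.0 if x <= 0 else 1.0   # lsc: jumps *up* to the right of 0

def g(t):
    return -t                        # continuous, hence lsc, but decreasing

def h(x):
    return g(f(x))

# h(0) = 0 while h(x) = -1 for every x > 0, so
# liminf_{x -> 0} h(x) = -1 < h(0): h is not lsc.
print(h(0.0), h(1e-9))
```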
Metric Topology
In the metric space $(\R, |x-y|)$, find the closure, the interior, the accumulation points, and the isolated points of, $$ A=\left\{(-1)^{n}+\frac{1}{n} : n \in \mathbb{N}^{*}\right\}. $$
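A small numeric exploration in Python (a sketch suggesting the answers, not a proof): both subsequences of $A$ decrease, the even-indexed one toward $1$ and the odd-indexed one toward $-1$.

```python
import numpy as np

# Numeric exploration of A = {(-1)^n + 1/n : n in N*} (a sketch, not a proof).
n = np.arange(1, 2001)
a = (-1.0) ** n + 1.0 / n

even_terms = a[n % 2 == 0]   # 1 + 1/n, strictly decreasing toward 1
odd_terms = a[n % 2 == 1]    # -1 + 1/n, strictly decreasing toward -1

# The two subsequences cluster only at +1 and -1, which suggests these
# are the accumulation points, while each element of A stays isolated.
print(even_terms[-1], odd_terms[-1])
```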
Sequential Compactness
Let $(E,d)$ be a metric space and let $\left(x_{n}\right)_{n \in \mathbb{N}}$ be a sequence in $E$ that converges to $a \in E$. Prove that the set $\left\{x_{n} \mid n \in \mathbb{N}\right\} \cup \{a\}$ is compact.
Quadratic Programming
Consider the quadratic optimization problem:
$$
(P) \begin{cases}\min & f(x)=\left(x_{1}-x_{2}\right)^{2}+\left(x_{2}+x_{3}\right)^{2}+\left(x_{3}-x_{1}\right)^{2} \\ \text { s.t. } & x_{1}+2 x_{2}+3 x_{3}=10 \; \text { and }\; 3 x_{1}+2 x_{2}+x_{3}=14\end{cases}
$$
Part (a)
Find a vector $b \in \R^2$, a symmetric $3 \times 3$ matrix $W$, and a $2 \times 3$ matrix $A$ such that,
$$
f(x)=\frac{1}{2}\langle W x, x\rangle \text { and the constraint is } A x=b
$$
Part (b)
Verify that $W$ is positive definite and that the rank of $A$ is $2$.
Part (c)
Diagonalize the matrix $W$ and deduce its square root $W^\frac{1}{2}$.
Part (d)
Let $\bar{x}$ be a particular solution of the system $Ax = b$. Prove, using the transformation $y = W^\frac{1}{2} (x - \bar{x})$, that $(P)$ is equivalent to the following:
$$
\text { (Q) } \begin{cases}\min & \frac{1}{2}\Vert y\Vert^{2}+\left\langle W^\frac{1}{2} \bar{x}, y\right\rangle \\ \text { s.t. } & A W^{-\frac{1}{2}} y=0\end{cases}
$$
Part (e)
Prove that $(Q)$ has a unique solution, and compute it.
Part (f)
Deduce that $(P)$ has a unique solution, and compute it.
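A numerical cross-check in Python (a sketch with my own choice of $W$, $A$, $b$ consistent with Part (a); the analytic derivation is in the linked PDF):

```python
import numpy as np

# One choice of W, A, b consistent with Part (a): f(x) = 0.5 <Wx, x>, Ax = b.
W = np.array([[ 4.0, -2.0, -2.0],
              [-2.0,  4.0,  2.0],
              [-2.0,  2.0,  4.0]])
A = np.array([[1.0, 2.0, 3.0],
              [3.0, 2.0, 1.0]])
b = np.array([10.0, 14.0])

assert np.all(np.linalg.eigvalsh(W) > 0)   # Part (b): W positive definite
assert np.linalg.matrix_rank(A) == 2       # Part (b): rank A = 2

# Parts (e)-(f): solve the KKT system  Wx + A^T lam = 0,  Ax = b.
K = np.block([[W, A.T], [A, np.zeros((2, 2))]])
x = np.linalg.solve(K, np.concatenate([np.zeros(3), b]))[:3]
print(x)   # the unique minimizer of (P): (3, 2, 1)
```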
Convexity & Optimality
Let $S \subset \R^2$ be the set defined by,
$$
S:=\{(x, y): x \geq 0, y \geq 0,-x+y \leq 2 \text { and } 2 x+3 y \leq 11\}
$$
We define the function $f:\R^2\to\R$ by,
$$
f(x, y):=x^{2}+y^{2}-8 x-20 y+89
$$
Part (a)
Prove that $S$ and $f$ are convex.
Part (b)
Prove that $(1,3)$ is an optimal solution for minimizing $f$ over $S$.
Part (c)
Prove that $(1,3)$ is the only optimal solution.
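A brute-force check in Python (a sanity test, not a substitute for the proofs): sample $S$ on a grid and confirm no feasible point beats $f(1,3)=31$; a KKT certificate is noted in a comment.

```python
import numpy as np

def f(x, y):
    return x**2 + y**2 - 8*x - 20*y + 89

# Grid over the box [0,6] x [0,4] (which enforces x >= 0, y >= 0),
# masked by the two remaining constraints defining S.
xs, ys = np.meshgrid(np.linspace(0, 6, 601), np.linspace(0, 4, 401))
feasible = (-xs + ys <= 2) & (2*xs + 3*ys <= 11)

best = f(xs[feasible], ys[feasible]).min()
assert f(1, 3) <= best + 1e-9   # no sampled feasible point does better

# KKT certificate: -grad f(1,3) = (6, 14) = 2*(-1,1) + 4*(2,3), a
# nonnegative combination of the normals of the two active constraints.
print(f(1, 3), best)
```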
QP Optimality Condition
We consider the following quadratic programming problem: $$ (Q)\begin{cases}\min & f(x)=\left(x_{1}-x_{2}\right)^{2}+\left(x_{2}-x_{3}\right)^{2}+\left(x_{3}-x_{1}\right)^{2}+\alpha x_{1}+\beta x_{2}+\gamma x_{3} \\ \text { s.t. } & x_{1}+2 x_{2}+3 x_{3} \leq 10 \;\text { and }\; 3 x_{1}+2 x_{2}+x_{3} \leq 14\end{cases} $$ Prove that $(Q)$ has a minimal solution if and only if $\alpha + \beta + \gamma \leq 0$.
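For the "only if" direction, the direction $(1,1,1)$ kills the quadratic part while both constraints stay satisfied for all $t \leq 0$; a quick Python illustration with a sample $(\alpha, \beta, \gamma)$ of positive sum (my choice):

```python
import numpy as np

alpha, beta, gamma = 1.0, -0.3, -0.2   # sample with alpha + beta + gamma > 0

def f(x1, x2, x3):
    return ((x1 - x2)**2 + (x2 - x3)**2 + (x3 - x1)**2
            + alpha*x1 + beta*x2 + gamma*x3)

# Along x = (t, t, t) with t <= 0, both constraints 6t <= 10 and 6t <= 14
# hold, the quadratic part vanishes, and f(t,t,t) = t*(alpha+beta+gamma).
t = np.array([-1e1, -1e3, -1e5])
vals = f(t, t, t)
print(vals)   # decreases without bound, so (Q) has no minimizer here
```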
Computing Subdifferentials
Calculate the subdifferential of the following functions:
Part (a)
$f(x)=\max \{1,|x-1|\}$ on $\mathbb{R}$.
Part (b)
$f(x)=1-\sqrt{1-x^{2}}$ if $x\in[-1,1]$ and $\infty$ otherwise.
Part (c)
$f(x)= \begin{cases}0 & x \in[-1,1] \\ |x|-1 & x \in[-2,-1)\cup(1,2] \\ \infty & x \in(-\infty,-2)\cup( 2, \infty)\end{cases}$
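For Part (a), a small Python helper (my own sketch, not from the linked solutions) tabulating the expected subdifferential of $f(x)=\max \{1,|x-1|\}$, cross-checked against one-sided difference quotients:

```python
# Subdifferential of f(x) = max{1, |x-1|}: the convex hull of the
# gradients of the active pieces at each point.
def f(x):
    return max(1.0, abs(x - 1.0))

def subdiff(x):
    if x < 0:   return (-1.0, -1.0)   # only |x-1| active
    if x == 0:  return (-1.0, 0.0)    # both pieces active: conv{-1, 0}
    if x < 2:   return (0.0, 0.0)     # only the constant 1 active
    if x == 2:  return (0.0, 1.0)     # both pieces active: conv{0, 1}
    return (1.0, 1.0)                 # only |x-1| active

h = 1e-7
for x in (-1.0, 0.0, 1.0, 2.0, 3.0):
    lo, hi = subdiff(x)
    left = (f(x) - f(x - h)) / h      # left derivative ≈ min of ∂f(x)
    right = (f(x + h) - f(x)) / h     # right derivative ≈ max of ∂f(x)
    assert abs(left - lo) < 1e-5 and abs(right - hi) < 1e-5
```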
Subdifferential of Distance Function
Let $C \subset \R^n$ be a nonempty, closed and convex set. We denote by $d_C(\cdot)$ the distance function associated to $C$. Find the subdifferential $\partial d_C(x)$ for all $x \in C$.
Normal and Tangent Cones
In $\R^n$, we consider the set:
$$
\Lambda_{n}=\left\{\left(x_{1}, \ldots, x_{n}\right) \in \mathbb{R}^{n}: x_{i} \geq 0 \; \forall i \text { and } \sum_{i=1}^{n} x_{i}=1\right\}
$$
Part (a)
Verify that $\Lambda_n$ is closed and convex.
Part (b)
Sketch $\Lambda_3$, then find geometrically the normal and tangent cones $N_{\Lambda_3}(x)$ and $T_{\Lambda_3}(x)$ at the points $(\frac{1}{3}, \frac{1}{3}, \frac{1}{3})$, $(0, \frac{1}{2}, \frac{1}{2})$, and $(0, 0, 1)$.
Part (c)
For $\bar{x} \in \Lambda_n$, we denote by $I(\bar{x})$ the set of indices $i$ such that $\bar{x}_i=0$; note that $I(\bar{x})$ may be empty. Prove that:
- The tangent cone is $T_{\Lambda_{n}}(\bar{x})=\left\{\left(\alpha_{1}, \ldots, \alpha_{n}\right) \in \mathbb{R}^{n}: \alpha_{i} \geq 0 \; \forall i \in I(\bar{x}) \text { and } \sum_{i=1}^{n} \alpha_{i}=0\right\}$.
- The normal cone is $N_{\Lambda_{n}}(\bar{x})=\left\{\left(\beta_{0}, \ldots, \beta_{0}\right): \beta_{0} \in \mathbb{R}\right\}+\left\{\left(\beta_{1}, \ldots, \beta_{n}\right): \beta_{i} \leq 0 \text { for } i \in I(\bar{x}) \text { and } \beta_{i}=0 \text { otherwise}\right\}$.
Part (d)
Now let $f: \R^n \to \R$ be convex and differentiable. Prove that $\bar{x} \in \Lambda_n$ is a global minimum of $f$ over $\Lambda_n$ if and only if there is a constant $\bar{c}$ such that,
$$
\begin{cases}f_{x_{i}}(\bar{x})=\bar{c} & \forall i \notin I(\bar{x}) \\ f_{x_{i}}(\bar{x}) \geq \bar{c} & \forall i \in I(\bar{x})\end{cases}
$$
Part (e)
Application: minimize the two functions $f(x,y,z) = x^2 + 2y^2 + z^2$ and $g(x,y,z) = x + 2y + z$ over $\Lambda_3$.
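A Monte Carlo cross-check in Python (a sketch assuming the Part (d) condition, from which the candidate minimizers are derived, not a proof):

```python
import numpy as np

# Monte Carlo sanity check over Lambda_3.
rng = np.random.default_rng(0)
p = rng.dirichlet(np.ones(3), size=200_000)   # random points of Lambda_3
x, y, z = p[:, 0], p[:, 1], p[:, 2]

f_vals = x**2 + 2*y**2 + z**2
g_vals = x + 2*y + z

# Candidate for f from the Part (d) condition (grad f = (2x, 4y, 2z)
# constant on the support): (2/5, 1/5, 2/5), with value 2/5.
assert f_vals.min() >= 0.4 - 1e-9
# For g, note g = (x + y + z) + y = 1 + y >= 1, with equality iff y = 0.
assert g_vals.min() >= 1.0 - 1e-9
print(f_vals.min(), g_vals.min())
```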
Convexity of Minimization Problem
For a parameter $a \in \R$, we consider the following minimization problem in the variables $x_{1}, x_{2} \in \R$,
$$
\left(P_{a}\right)\begin{dcases}
\min_{x_{1}, x_{2}} &x_{1}^{2}+a x_{2}^{2}+x_{1} x_{2}+x_{1},
\\ \text{ s.t. } &x_{1}+x_{2}-1 \leq 0
\end{dcases}
$$
Part (a)
Prove that $(P_a)$ is a convex problem if and only if $a \geq \frac{1}{4}$.
Part (b)
Find the optimal solution and the optimal value of $(P_a)$ when $a \geq \frac{1}{4}$.
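A quick eigenvalue check in Python of the threshold in Part (a) (the constraint is affine, so convexity hinges on the objective's constant Hessian):

```python
import numpy as np

# The objective x1^2 + a*x2^2 + x1*x2 + x1 has constant Hessian
# H = [[2, 1], [1, 2a]]; (P_a) is convex iff H is PSD, i.e. 4a - 1 >= 0.
def is_convex(a):
    H = np.array([[2.0, 1.0], [1.0, 2.0 * a]])
    return bool(np.all(np.linalg.eigvalsh(H) >= -1e-12))

print(is_convex(0.25), is_convex(1.0), is_convex(0.2))
```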
Conjugate Functions and Subdifferentials
Let $f: \R \to \R \cup \{\infty\}$ be a proper, closed and convex function whose subdifferential is $$ \partial f(x)= \begin{cases}\frac{1}{2}(x-1) & x<0, \\ {\left[-\frac{1}{2}, \frac{1}{2}\right]} & x=0, \\ \frac{1}{2}(x+1) & x>0.\end{cases} $$ Find $f$, the conjugate’s subdifferential $\partial f^*$, and the conjugate function $f^*$.
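Integrating the given $\partial f$ piecewise suggests the candidate $f(x) = \frac{x^2}{4} + \frac{|x|}{2}$ (only up to an additive constant, which the subdifferential cannot pin down); a quick derivative check in Python:

```python
# Candidate obtained by integrating the subdifferential piecewise
# (defined up to an additive constant): f(x) = x^2/4 + |x|/2.
def f(x):
    return x**2 / 4 + abs(x) / 2

h = 1e-7
for x in (-3.0, -1.0, 0.5, 2.0):
    num = (f(x + h) - f(x - h)) / (2 * h)            # central difference
    ref = 0.5 * (x - 1) if x < 0 else 0.5 * (x + 1)  # stated formula
    assert abs(num - ref) < 1e-5
# At x = 0 the kink of |x|/2 produces exactly the interval [-1/2, 1/2].
```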
Duality & Infimal Convolution
For two proper functions $f_1, f_2 : \R^n \to \R\cup\{\infty\}$, the infimal convolution of $f_1$ and $f_2$, denoted by $f_1 \square f_2$, is the function defined on $\R^n$ by:
$$
\left(f_{1} \square f_{2}\right)(x):=\inf _{z}\left\{f_{1}(z)+f_{2}(x-z)\right\}
$$
Part (a)
Prove, using the definitions, that $(f_1 \square f_2)^* = f_1^* + f_2^*$.
Part (b)
Now let $C$ be a nonempty, closed, and convex set, let $d_C(\cdot)$ be the distance function to $C$, and let $\delta_C(\cdot)$ be its indicator function.
- Verify that $d_C = \delta_C \square \Vert \cdot \Vert$.
- Deduce that $(d_C)^*(\cdot) = \sigma_C(\cdot) + \delta_{\bar{B}_*}(\cdot)$, where $\sigma_C(\cdot)$ is the support function of $C$ and $\bar{B}_*$ is the closed unit ball for the dual norm.
- Deduce that $\partial\left(d_{C}\right)^{*}(y)=\partial \sigma_{C}(y)+\partial \delta_{\bar{B}_{*}}(y)$.
- Prove that $y \in \partial d_C(x)$ if and only if there exist $x_1 \in C$ and $x_2 \in \R^n$ such that $x = x_1 + x_2$, $y \in \partial \delta_C(x_1)$, and $y \in \partial \Vert \cdot \Vert (x_2)$.
- Deduce that, $$ \partial d_{C}(x)= \begin{cases}N_{C}(x) \cap \bar{B} & \text {if } x \in C, \\ \dfrac{x-c_{x}}{\left\Vert x-c_{x}\right\Vert} & \text {if } x \notin C,\end{cases} $$ where $\bar{B}$ is the closed unit ball and $c_x$ is the unique projection of $x$ onto $C$.
- Deduce that $d_C(\cdot)$ is differentiable at any $x \notin C$.
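A numeric illustration of the last two items in Python, taking $C$ to be the closed unit ball in $\R^2$ (my choice of $C$): outside $C$ the formula reduces to $\nabla d_C(x) = \frac{x - c_x}{\Vert x - c_x \Vert} = \frac{x}{\Vert x \Vert}$, matched here against finite differences.

```python
import numpy as np

# C = closed unit Euclidean ball in R^2, so d_C(x) = max(||x|| - 1, 0)
# and the projection of an outside point is c_x = x / ||x||.
def d_C(x):
    return max(np.linalg.norm(x) - 1.0, 0.0)

x = np.array([2.0, 1.0])                      # a point outside C
c_x = x / np.linalg.norm(x)                   # unique projection onto C
g = (x - c_x) / np.linalg.norm(x - c_x)       # gradient from the formula

h = 1e-6
num_grad = np.array([(d_C(x + h * e) - d_C(x - h * e)) / (2 * h)
                     for e in np.eye(2)])
assert np.allclose(g, num_grad, atol=1e-5)    # formula matches numerics
print(g)
```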