Math Review

This section develops the basic theory of convex functions that we will need later.

Vectors are considered as column vectors unless they are explicitly transposed. \(x^{T}y\) denotes the scalar product \(\sum_{i=1}^d x_i y_i\) of two vectors \(x,y \in \mathbb{R}^d\). \(\| x \|\) denotes the Euclidean norm (\(\ell_2\)-norm or 2-norm) of a vector \(x\), so that \(\| x \|^2 = x^T x = \sum_{i=1}^{d} x_i^2\). We also use \(\mathbb{N} = \{1,2,\dots\}\) and \(\mathbb{R}_{+} := \{x \in \mathbb{R} : x \geq 0\}\) to denote the natural numbers and the non-negative real numbers, respectively.
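As a quick sanity check of this notation, the following sketch computes the scalar product \(x^T y\) and the Euclidean norm \(\|x\|\) for two small example vectors (the vectors themselves are arbitrary choices for illustration):

```python
import math

# Example vectors in R^3 (arbitrary illustrative values).
x = [1.0, 2.0, 3.0]
y = [4.0, -1.0, 2.0]

# Scalar product x^T y = sum_i x_i * y_i.
dot = sum(xi * yi for xi, yi in zip(x, y))

# Euclidean norm ||x|| = sqrt(x^T x).
norm_x = math.sqrt(sum(xi * xi for xi in x))

print(dot)     # 1*4 + 2*(-1) + 3*2 = 8.0
print(norm_x)  # sqrt(1 + 4 + 9) = sqrt(14)
```

Note that \(\|x\|^2 = x^T x\) is just the scalar product of \(x\) with itself, which is why the norm computation reuses the same sum-of-products pattern.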

The Cauchy-Schwarz inequality

Let \(u,v \in \mathbb{R}^d\). Then \(|u^T v| \leq \|u\| \|v\|\). The inequality holds beyond the Euclidean norm: all we need is an inner product and the norm induced by it. Here, however, we only discuss the Euclidean case. For nonzero vectors, the Cauchy-Schwarz inequality is equivalent to \(-1 \leq \frac{u^T v}{\|u\| \|v\|} \leq 1\), and this fraction can be used to define the angle \(\alpha\) between \(u\) and \(v\): \(\cos(\alpha) = \frac{u^T v}{\|u\| \|v\|}\), where \(\alpha \in [0,\pi]\).
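The inequality and the resulting angle can be checked numerically. The sketch below uses two arbitrary example vectors; it verifies \(|u^T v| \leq \|u\| \|v\|\) and recovers \(\alpha\) via the arc-cosine:

```python
import math

def dot(u, v):
    """Scalar product u^T v."""
    return sum(a * b for a, b in zip(u, v))

def norm(u):
    """Euclidean norm ||u|| = sqrt(u^T u)."""
    return math.sqrt(dot(u, u))

# Example nonzero vectors in R^2 (arbitrary illustrative values).
u = [1.0, 0.0]
v = [1.0, 1.0]

# Cauchy-Schwarz: |u^T v| <= ||u|| ||v||.
assert abs(dot(u, v)) <= norm(u) * norm(v)

# Angle alpha in [0, pi] with cos(alpha) = u^T v / (||u|| ||v||).
cos_alpha = dot(u, v) / (norm(u) * norm(v))
# Clamp to [-1, 1] to guard against floating-point round-off.
alpha = math.acos(max(-1.0, min(1.0, cos_alpha)))

print(alpha)  # pi/4 for these vectors
```

For these particular vectors, \(\cos(\alpha) = 1/\sqrt{2}\), so \(\alpha = \pi/4\), matching the geometric picture of a 45-degree angle between them.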