# Operations

Convex.jl currently supports the following functions. These functions may be composed according to the DCP composition rules to form new convex, concave, or affine expressions. Convex.jl transforms each problem into an equivalent cone program in order to pass the problem to a specialized solver. Depending on the types of functions used in the problem, the conic constraints may include linear, second-order, exponential, or semidefinite constraints, as well as any binary or integer constraints placed on the variables. Below, we list each function available in Convex.jl organized by the (most complex) type of cone used to represent that function, and indicate which solvers may be used to solve problems with those cones. Problems mixing many different conic constraints can be solved by any solver that supports every kind of cone present in the problem.
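To make the workflow concrete, here is a minimal sketch of building and solving a problem by composing atoms under the DCP rules. It assumes Convex.jl and the ECOS solver package are installed; the `ECOSSolver()` constructor is the solver-interface convention assumed here, not something specified on this page.

```julia
# A minimal sketch, assuming Convex.jl and ECOS are installed.
using Convex, ECOS

x = Variable(4)                       # a 4-element variable
# Compose atoms under the DCP rules: an affine term plus a convex
# term (norm) is convex, so it is a valid objective for minimize.
objective = sum(x) + norm(x, 1)
problem = minimize(objective, [x >= -1, x <= 1])
solve!(problem, ECOSSolver())         # Convex.jl passes an equivalent cone program to the solver
println(problem.optval)
```

Since the problem uses only LP-representable atoms, any LP-capable conic solver could be substituted for ECOS.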

## Linear Program Representable Functions

An optimization problem using only these functions can be solved by any LP solver.

| operation | description | vexity | slope | implicit constraint / notes |
| --- | --- | --- | --- | --- |
| x+y or x.+y | addition | affine | increasing | none |
| x-y or x.-y | subtraction | affine | increasing in $$x$$, decreasing in $$y$$ | none |
| x*y | multiplication | affine | increasing if constant term $$\ge 0$$; decreasing if constant term $$\le 0$$; not monotonic otherwise | one term is constant |
| x/y | division | affine | increasing | $$y$$ is a scalar constant |
| x .* y | elementwise multiplication | affine | increasing | one term is constant |
| x[1:4, 2:3] | indexing and slicing | affine | increasing | none |
| diag(x, k) | $$k$$-th diagonal of a matrix | affine | increasing | none |
| diagm(x) | turn vector into diagonal matrix | affine | increasing | $$x$$ is a vector |
| x' | transpose | affine | increasing | none |
| x'*y or dot(x,y) | $$x^\top y$$ | affine | increasing | one term is constant |
| vec(x) | vector representation | affine | increasing | none |
| reshape(x, m, n) | reshape into $$m \times n$$ | affine | increasing | none |
| minimum(x) | $$\min(x)$$ | concave | increasing | none |
| maximum(x) | $$\max(x)$$ | convex | increasing | none |
| [x y] or [x; y]; hcat(x, y) or vcat(x, y) | stacking | affine | increasing | none |
| trace(x) | $$\mathrm{tr}(X)$$ | affine | increasing | none |
| conv(h, x) | convolution of $$h \in \mathbb{R}^m$$ with $$x \in \mathbb{R}^n$$; entry $$i$$ of $$h*x \in \mathbb{R}^{m+n-1}$$ is $$\sum_{j=1}^m h_j x_{i-j}$$ | affine | increasing if $$h \ge 0$$; decreasing if $$h \le 0$$; not monotonic otherwise | $$h$$ is constant |
| min(x,y) | $$\min(x,y)$$ | concave | increasing | none |
| max(x,y) | $$\max(x,y)$$ | convex | increasing | none |
| pos(x) | $$\max(x,0)$$ | convex | increasing | none |
| neg(x) | $$\max(-x,0)$$ | convex | decreasing | none |
| inv_pos(x) | $$1/\max(x,0)$$ | convex | decreasing | $$x > 0$$ |
| abs(x) | $$\lvert x \rvert$$ | convex | increasing on $$x \ge 0$$; decreasing on $$x \le 0$$ | none |
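A short sketch using a few of the LP-representable atoms above (max, abs, sum), assuming Convex.jl with an LP-capable solver such as ECOS; the `ECOSSolver()` constructor is an assumption of this sketch.

```julia
# Sketch: an LP-representable problem built from atoms in the table above.
using Convex, ECOS

x = Variable(3)
# max(x, 0) and abs(x - b) are convex and LP-representable,
# so their sum is a valid convex objective.
b = [1.0; 2.0; 3.0]
problem = minimize(sum(max(x, 0)) + sum(abs(x - b)), [x <= 2])
solve!(problem, ECOSSolver())
```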

## Second-Order Cone Representable Functions

An optimization problem using these functions can be solved by any SOCP solver (including ECOS, SCS, Mosek, Gurobi, and CPLEX). Of course, if an optimization problem has both LP and SOCP representable functions, then any solver that can solve both LPs and SOCPs can solve the problem.

| operation | description | vexity | slope | implicit constraint |
| --- | --- | --- | --- | --- |
| norm(x, p) | $$(\sum_i \lvert x_i \rvert^p)^{1/p}$$ | convex | increasing on $$x \ge 0$$; decreasing on $$x \le 0$$ | p >= 1 |
| vecnorm(x, p) | $$(\sum_{ij} \lvert x_{ij} \rvert^p)^{1/p}$$ | convex | increasing on $$x \ge 0$$; decreasing on $$x \le 0$$ | p >= 1 |
| quad_form(x, P) | $$x^\top P x$$ | convex in $$x$$, affine in $$P$$ | increasing on $$x \ge 0$$; decreasing on $$x \le 0$$; increasing in $$P$$ | either $$x$$ or $$P$$ must be constant |
| quad_over_lin(x, y) | $$x^\top x / y$$ | convex | increasing on $$x \ge 0$$; decreasing on $$x \le 0$$; decreasing in $$y$$ | $$y > 0$$ |
| sum_squares(x) | $$\sum_i x_i^2$$ | convex | increasing on $$x \ge 0$$; decreasing on $$x \le 0$$ | none |
| sqrt(x) | $$\sqrt{x}$$ | concave | increasing | $$x > 0$$ |
| square(x), x^2 | $$x^2$$ | convex | increasing on $$x \ge 0$$; decreasing on $$x \le 0$$ | none |
| geo_mean(x, y) | $$\sqrt{xy}$$ | concave | increasing | $$x \ge 0$$, $$y \ge 0$$ |
| huber(x) or huber(x, M) | $$\begin{cases} x^2 & \lvert x \rvert \leq M \\ 2M\lvert x \rvert - M^2 & \lvert x \rvert > M \end{cases}$$ | convex | increasing on $$x \ge 0$$; decreasing on $$x \le 0$$ | $$M \ge 1$$ |
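As an illustration of the SOCP-representable atoms, here is a sketch of a regularized least-squares problem using sum_squares and norm, assuming Convex.jl with an SOCP-capable solver such as ECOS (the `ECOSSolver()` constructor is an assumption of this sketch).

```julia
# Sketch: ridge-style regression with SOCP-representable atoms.
using Convex, ECOS

A = randn(5, 3)
b = randn(5)
x = Variable(3)
# sum_squares(A*x - b) is convex (affine composed inside a convex atom),
# and norm(x, 2) is convex, so the sum is a valid objective.
problem = minimize(sum_squares(A*x - b) + norm(x, 2))
solve!(problem, ECOSSolver())
```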

## Exponential Cone Representable Functions

An optimization problem using these functions can be solved by any exponential cone solver (SCS).

| operation | description | vexity | slope | implicit constraint |
| --- | --- | --- | --- | --- |
| logsumexp(x) | $$\log(\sum_i \exp(x_i))$$ | convex | increasing | none |
| exp(x) | $$\exp(x)$$ | convex | increasing | none |
| log(x) | $$\log(x)$$ | concave | increasing | $$x > 0$$ |
| entropy(x) | $$\sum_{ij} -x_{ij} \log(x_{ij})$$ | concave | not monotonic | $$x > 0$$ |
| logistic_loss(x) | $$\sum_i \log(1 + \exp(x_i))$$ | convex | increasing | none |
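For example, maximizing entropy over a probability simplex uses only exponential-cone atoms. A sketch, assuming Convex.jl with SCS installed (the `SCSSolver()` constructor is an assumption of this sketch):

```julia
# Sketch: maximum-entropy distribution, an exponential-cone problem.
using Convex, SCS

x = Variable(3)
# entropy(x) is concave, so it is a valid objective for maximize;
# the simplex constraints keep x a probability vector.
problem = maximize(entropy(x), [sum(x) == 1, x >= 0])
solve!(problem, SCSSolver())
```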

## Semidefinite Program Representable Functions

An optimization problem using these functions can be solved by any SDP solver (including SCS and Mosek).

| operation | description | vexity | slope | implicit constraint |
| --- | --- | --- | --- | --- |
| nuclear_norm(x) | sum of singular values of $$x$$ | convex | not monotonic | none |
| operator_norm(x) | maximum singular value of $$x$$ | convex | not monotonic | none |
| lambda_max(x) | maximum eigenvalue of $$x$$ | convex | increasing | $$x$$ is positive semidefinite |
| lambda_min(x) | minimum eigenvalue of $$x$$ | concave | increasing | $$x$$ is positive semidefinite |
| matrix_frac(x, P) | $$x^\top P^{-1} x$$ | convex | not monotonic | $$P$$ is positive semidefinite |
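A sketch of an SDP-representable problem bounding the top eigenvalue of a semidefinite matrix variable, assuming Convex.jl with SCS installed (the `Semidefinite` variable constructor and `SCSSolver()` follow the API era of this page and are assumptions of this sketch):

```julia
# Sketch: minimize the largest eigenvalue subject to linear constraints.
using Convex, SCS

X = Semidefinite(3)          # a 3x3 positive semidefinite matrix variable
# lambda_max is convex and SDP-representable; trace is affine.
problem = minimize(lambda_max(X), [trace(X) == 3, X[1, 1] == 1])
solve!(problem, SCSSolver())
```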

## Exponential + SDP Representable Functions

An optimization problem using these functions can be solved by any solver that supports exponential constraints and semidefinite constraints simultaneously (SCS).

| operation | description | vexity | slope | implicit constraint |
| --- | --- | --- | --- | --- |
| logdet(x) | $$\log(\det(x))$$ | concave | increasing | $$x$$ is positive semidefinite |
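A sketch using logdet, which requires a solver supporting both exponential and semidefinite cones, such as SCS (the `Semidefinite` constructor and `SCSSolver()` are assumptions of this sketch):

```julia
# Sketch: maximize log-determinant under a trace budget.
using Convex, SCS

X = Semidefinite(2)          # logdet implicitly requires X to be PSD
# logdet is concave, so it is a valid objective for maximize.
problem = maximize(logdet(X), [trace(X) <= 10])
solve!(problem, SCSSolver())
```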

## Promotions

When an atom or constraint is applied to a scalar and a higher-dimensional variable, the scalar is promoted to the variable's shape. For example, max(x, 0) gives an expression with the shape of x whose elements are the maximum of the corresponding element of x and 0.
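A quick sketch of promotion in action, assuming Convex.jl is installed:

```julia
# Sketch: the scalar 0 is promoted to match the shape of x.
using Convex

x = Variable(2, 3)
e = max(x, 0)    # elementwise max against a promoted 2x3 zero constant
size(e)          # same shape as x
```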