9.5 Examples

Norm and Penalty Approximation

In the first example we solve the norm approximation problems

minimize  ∥Ax - b∥∞,      minimize  ∥Ax - b∥1 ,

and the penalty approximation problem

minimize  ∑ₖ ϕ((Ax - b)ₖ),     where

ϕ(u) = 0            if |u| ≤ 3/4
       |u| - 3/4    if 3/4 ≤ |u| ≤ 3/2
       2|u| - 9/4   if |u| ≥ 3/2.

We use randomly generated data.
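Before turning to the CVXOPT code, it may help to see the penalty ϕ written out in plain Python. Note that ϕ(u) can equivalently be written as max(0, |u| - 3/4, 2|u| - 9/4), which is the max-of-affine form used in the modeling code below; the sketch checks the two forms agree.

```python
def phi(u):
    # Piecewise-linear penalty from the problem statement:
    # zero on [0, 3/4], slope 1 on [3/4, 3/2], slope 2 beyond.
    a = abs(u)
    if a <= 0.75:
        return 0.0
    elif a <= 1.5:
        return a - 0.75
    else:
        return 2*a - 2.25

# phi agrees with the max-of-affine form max(0, |u| - 3/4, 2|u| - 9/4)
for u in [-3.0, -1.5, -0.5, 0.0, 0.6, 0.75, 1.2, 1.5, 3.0]:
    assert abs(phi(u) - max(0.0, abs(u) - 0.75, 2*abs(u) - 2.25)) < 1e-12
```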

The code below uses the Matplotlib package (via the pylab interface) to plot histograms of the residual vectors for the three solutions. It generates the figure shown below.

from cvxopt import normal  
from cvxopt.modeling import variable, op, max, sum  
import pylab  
 
m, n = 500, 100  
A = normal(m,n)  
b = normal(m)  
 
# l-infinity approximation: minimize max_k |(Ax - b)_k|  
x1 = variable(n)  
op(max(abs(A*x1-b))).solve()  
 
# l1 approximation: minimize sum_k |(Ax - b)_k|  
x2 = variable(n)  
op(sum(abs(A*x2-b))).solve()  
 
# penalty approximation with the piecewise-linear phi above  
x3 = variable(n)  
op(sum(max(0, abs(A*x3-b)-0.75, 2*abs(A*x3-b)-2.25))).solve()  
 
pylab.subplot(311)  
pylab.hist(A*x1.value-b, m//5)  
pylab.subplot(312)  
pylab.hist(A*x2.value-b, m//5)  
pylab.subplot(313)  
pylab.hist(A*x3.value-b, m//5)  
pylab.show()

[Figure: histograms of the residuals Ax - b for the ℓ∞, ℓ1, and penalty approximations.]

Equivalently, we can formulate and solve the problems as LPs.

t = variable()  
x1 = variable(n)  
op(t, [-t <= A*x1-b, A*x1-b<=t]).solve()  
 
u = variable(m)  
x2 = variable(n)  
op(sum(u), [-u <= A*x2-b, A*x2-b <= u]).solve()  
 
v = variable(m)  
x3 = variable(n)  
op(sum(v), [v >= 0, v >= A*x3-b-0.75, v >= -(A*x3-b)-0.75,  
    v >= 2*(A*x3-b)-2.25, v >= -2*(A*x3-b)-2.25]).solve()

Robust Linear Programming
The robust LP

minimize   cᵀx
subject to sup{(aᵢ + v)ᵀx : ∥v∥∞ ≤ 1} ≤ bᵢ,   i = 1,…,m

is equivalent to the problem

minimize   cᵀx
subject to aᵢᵀx + ∥x∥1 ≤ bᵢ,   i = 1,…,m.
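The equivalence rests on the identity sup{vᵀx : ∥v∥∞ ≤ 1} = ∥x∥1, since the supremum is attained at vᵢ = sign(xᵢ). A brute-force check over the vertices of the unit ∞-norm ball (where the supremum of a linear function must be attained):

```python
from itertools import product

def l1_norm(x):
    return sum(abs(xi) for xi in x)

def sup_over_ball(x):
    # sup of v'x over ||v||_inf <= 1, attained at a vertex v in {-1, 1}^n
    return max(sum(vi*xi for vi, xi in zip(v, x))
               for v in product([-1.0, 1.0], repeat=len(x)))

x = [1.5, -2.0, 0.25, 0.0]
assert abs(sup_over_ball(x) - l1_norm(x)) < 1e-12
```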

The following code computes the solution of this problem, and also the solution of the equivalent LP

minimize   cᵀx
subject to aᵢᵀx + 1ᵀy ≤ bᵢ,   i = 1,…,m
           -y ≼ x ≼ y

for randomly generated data.

from cvxopt import normal, uniform  
from cvxopt.modeling import variable, dot, op, sum  
 
m, n = 500, 100  
A = normal(m,n)  
b = uniform(m)  
c = normal(n)  
 
x = variable(n)  
op(dot(c,x), A*x+sum(abs(x)) <= b).solve()  
 
x2 = variable(n)  
y = variable(n)  
op(dot(c,x2), [A*x2+sum(y) <= b, -y <= x2, x2 <= y]).solve()

1-Norm Support Vector Classifier

The following problem arises in classification:

minimize   ∥x∥1 + 1ᵀu
subject to Ax ≽ 1 - u
           u ≽ 0.

Given the data matrix A, it can be solved as follows.

x = variable(A.size[1],'x')  
u = variable(A.size[0],'u')  
op(sum(abs(x)) + sum(u), [A*x >= 1-u, u >= 0]).solve()

An equivalent unconstrained formulation is

x = variable(A.size[1],'x')  
op(sum(abs(x)) + sum(max(0,1-A*x))).solve()
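The two formulations agree because, with x fixed, the optimal slack in the constrained problem is uₖ = max(0, 1 - (Ax)ₖ): each uₖ must be at least both 0 and 1 - (Ax)ₖ, and minimizing the objective pushes it down to the larger of the two. A scalar sketch of this identity:

```python
def min_slack(r):
    # smallest u satisfying u >= 0 and u >= 1 - r, for a margin value r
    return max(0.0, 1.0 - r)

for r in [-1.0, 0.0, 0.5, 1.0, 2.0]:
    u = min_slack(r)
    assert u >= 0.0 and u >= 1.0 - r     # feasible for both constraints
    # any feasible u' is >= both bounds, hence >= max(0, 1 - r) = u
```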