
Convex Optimization in R by convexjlr

convexjlr is an R package for Disciplined Convex Programming (DCP) that provides a high-level wrapper for the Julia package Convex.jl. The aim is to deliver optimization results rapidly and reliably in R once you have formulated your problem as a convex problem. Through Convex.jl, convexjlr can solve linear programs, second-order cone programs, semidefinite programs, exponential cone programs, mixed-integer linear programs, and other DCP-compliant convex programs.

Installation:

convexjlr is on CRAN now! To use the package, you first have to install Julia (https://julialang.org/) on your computer; you can then install convexjlr just like any other R package.

We hope you use convexjlr to solve your own problems. If you would like to share your experience using convexjlr or have any questions about it, don't hesitate to contact me: cxl508@psu.edu.

Quick Example

We will show a short example of using convexjlr to solve a linear regression problem. To use the package, we first need to attach it and do the initial setup:

library(convexjlr)
#> 
#> Attaching package: 'convexjlr'
#> The following object is masked from 'package:base':
#> 
#>     norm
setup()
#> Doing initialization. It may take some time. Please wait.
#> [1] TRUE

And this is our linear regression function using convexjlr:

linear_regression <- function(x, y){
    p <- ncol(x)
    ## n is a scalar, you don't have to use J(.) to send it to Julia.
    n <- nrow(x) ## n <- J(nrow(x))
    ## x is a matrix and y is a vector, you have to use J(.) to send them to Julia.
    x <- J(x)
    y <- J(y)
    ## coefficient vector beta and intercept b.
    beta <- Variable(p)
    b <- Variable()
    ## MSE is mean square error.
    MSE <- Expr(sum((y - x %*% beta - b) ^ 2) / n)
    ## In linear regression, we want to minimize MSE.
    p1 <- minimize(MSE)
    cvx_optim(p1)
    list(coef = value(beta), intercept = value(b))
}

In the function, x is the predictor matrix and y is the response. The linear_regression function returns the coefficient vector and intercept found by cvx_optim.

Now we can see a little example using the linear_regression function we have just built.

n <- 1000
p <- 5
## Sigma, the covariance matrix of x, has an AR(1) structure.
Sigma <- outer(1:p, 1:p, function(i, j) 0.5 ^ abs(i - j))
x <- matrix(rnorm(n * p), n, p) %*% chol(Sigma)
## The true coefficient vector is zero except for the first, second, and fourth elements.
beta0 <- c(5, 1, 0, 2, 0)
y <- x %*% beta0 + 0.2 * rnorm(n)

linear_regression(x, y)$coef
#>             [,1]
#> [1,] 4.998312910
#> [2,] 1.002886430
#> [3,] 0.003071815
#> [4,] 1.989776899
#> [5,] 0.019197631
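As a quick sanity check (plain base R, no convexjlr needed), these estimates should closely match the ordinary least squares fit from lm() on the same simulated data:

```r
## Compare with base R's ordinary least squares estimates.
fit <- lm(y ~ x)
coef(fit)
```

Both approaches minimize the same mean squared error objective, so any difference should be down to solver tolerance.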

More Examples

More examples (including using convexjlr for the Lasso, logistic regression, and support vector machines) can be found in the package vignette or on the GitHub page: https://github.com/Non-Contradiction/convexjlr
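To give a flavor of those extensions, here is a minimal Lasso sketch built from the same pieces as linear_regression above. It is an assumption-laden sketch, not the vignette's version: it assumes norm(beta, 1) gives the l1 norm (as in Convex.jl), and lambda is a tuning parameter you supply.

```r
## Lasso: mean squared error plus an l1 penalty on the coefficients.
## Sketch only; see the package vignette for the authors' worked example.
lasso <- function(x, y, lambda) {
    p <- ncol(x)
    n <- nrow(x)
    x <- J(x)
    y <- J(y)
    beta <- Variable(p)
    b <- Variable()
    ## Penalized objective: MSE + lambda * ||beta||_1.
    obj <- Expr(sumsquares(y - x %*% beta - b) / n + lambda * norm(beta, 1))
    p1 <- minimize(obj)
    cvx_optim(p1)
    list(coef = value(beta), intercept = value(b))
}
```

The l1 penalty is convex, so the whole objective stays DCP-compliant and can be handed to cvx_optim unchanged.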

Install

install.packages('convexjlr')

Version

0.5.1

License

Apache License | file LICENSE

Maintainer

Changcheng Li

Last Published

June 21st, 2017

Functions in convexjlr (0.5.1)

huber: Huber loss
lambdamax: Largest eigenvalue of x
matrixfrac: x^T P^-1 x
maximum: Largest element
square: Square of x
sumlargest: Sum of the largest elements
vecdot: Inner product of the vector representations of two matrices
vecnorm: p-norm of the vector representation of x
entropy: sum(-x * log(x))
geomean: Geometric mean of x and y
lambdamin: Smallest eigenvalue of x
logdet: Log of the determinant of x
sumsmallest: Sum of the smallest elements
sumsquares: Sum of squares of x
tr: Trace of a matrix
value: Get the value of an expression at the optimum
addConstraint: Add constraints to an optimization problem
cvx_optim: Solve an optimization problem
minimum: Smallest element
neg: Negative part
Expr: Create expressions to be used in optimization problem creation
J: Send an R object to Julia so it can be used in expressions
norm: p-norm of x
nuclearnorm: Sum of the singular values of x
dot: Inner product
dotsort: Inner product of two vectors after sorting
logisticloss: log(1 + exp(x))
logsumexp: log(sum(exp(x)))
operatornorm: Largest singular value of x
quadform: x^T P x
setup: Do the setup for the package convexjlr
variable_creating: Create a variable for an optimization problem
pos: Positive part
problem_creating: Create an optimization problem
property: Get properties of an optimization problem
vec: Vector representation
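To show how several of these functions fit together, here is a hedged sketch of nonnegative least squares combining Variable, Expr, sumsquares, addConstraint, and cvx_optim. The exact constraint syntax passed to addConstraint is an assumption on our part; check ?addConstraint before relying on it.

```r
## Nonnegative least squares: minimize ||y - x %*% beta||^2 subject to beta >= 0.
## Sketch only; assumes setup() has already been run and that addConstraint
## accepts a comparison expression like beta >= 0 (verify with ?addConstraint).
nnls <- function(x, y) {
    x <- J(x)
    y <- J(y)
    beta <- Variable(ncol(x))
    p1 <- minimize(Expr(sumsquares(y - x %*% beta)))
    addConstraint(p1, beta >= 0)
    cvx_optim(p1)
    value(beta)
}
```

Adding the constraint keeps the problem convex, so the same DCP machinery solves it.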