
OptHoldoutSize (version 0.1.0.1)

grad_mincost_powerlaw: Gradient of minimum cost (power law)

Description

Compute the gradient of the minimum cost, assuming a power-law form of k2.

Assumes the cost function is l(n; k1, N, theta) = k1 n + k2(n; theta) (N - n), with k2(n; theta) = k2(n; a, b, c) = a n^(-b) + c.
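As a sketch of where the gradient comes from (not taken from the package documentation): the optimal holdout size nstar solves the first-order condition

dl/dn = k1 - k2(n; theta) + k2'(n; theta) (N - n) = 0, where k2'(n; theta) = -a b n^(-b-1).

Because dl/dn vanishes at n = nstar, the partial derivatives of the minimum cost l(nstar) can be taken holding nstar fixed (envelope theorem), giving

d/dN  l(nstar) = a nstar^(-b) + c
d/dk1 l(nstar) = nstar
d/da  l(nstar) = nstar^(-b) (N - nstar)
d/db  l(nstar) = -a nstar^(-b) log(nstar) (N - nstar)
d/dc  l(nstar) = N - nstar

In particular, the derivative with respect to k1 equals the optimal holdout size itself.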

Usage

grad_mincost_powerlaw(N, k1, theta)

Value

List/data frame of dimension (number of evaluations) x 5 containing the partial derivatives of the minimum cost l(nstar) with respect to N, k1, a, b, c respectively.

Arguments

N

Total number of samples on which the predictive score will be used/fitted. Can be a vector.

k1

Cost value in the absence of a predictive score. Can be a vector.

theta

Parameters for the function k2(n) governing the expected cost to an individual sample given a predictive score fitted to n samples. Can be a matrix of dimension (number of evaluations) x n_par, where n_par is the number of parameters of k2.

Examples

library(OptHoldoutSize)

# Evaluate minimum cost for a range of values of k1, and compute its derivative
N=10000
k1=seq(0.1,0.5,length=20)
A=3; B=1.5; C=0.15; theta=c(A,B,C)

mincost=optimal_holdout_size(N,k1,theta)
grad_mincost=grad_mincost_powerlaw(N,k1,theta)

plot(0,type="n",ylim=c(0,1560),xlim=range(k1),xlab=expression("k"[1]),
  ylab="Optimal holdout set size")
lines(mincost$k1,mincost$cost,col="black")   # minimum cost l(nstar)
lines(mincost$k1,grad_mincost[,2],col="red") # column 2: derivative of l(nstar) w.r.t. k1
legend(0.2,800,c(expression(paste("l(n"["*"],")")),
                       expression(paste(partialdiff[k1],"l(n"["*"],")"))),
    col=c("black","red"),lty=1,bty="n")
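As a quick sanity check (a sketch, reusing the variables defined above; the step size 1e-4 is an arbitrary choice), the analytic derivative with respect to k1 can be compared against a central finite difference of the minimum cost:

# Recompute the minimum cost at k1 +/- eps and form a central difference
eps <- 1e-4
cost_hi <- optimal_holdout_size(N, k1 + eps, theta)$cost
cost_lo <- optimal_holdout_size(N, k1 - eps, theta)$cost
numeric_grad <- (cost_hi - cost_lo) / (2 * eps)

# Largest discrepancy from column 2 (d/dk1) of the analytic gradient
max(abs(numeric_grad - grad_mincost[,2]))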
