observations used in quantile regression fitting
$$\min_{b \in R^{p}}\sum_{i=1}^{n}\rho_{\tau}(y_i-x_{i}^{'}b)$$
where \(\rho_{\tau}(r)=r[\tau-I(r<0)]\) for \(\tau \in (0,1)\). This
yields the modified linear program
$$\min\{\tau e^{'}u+(1-\tau)e^{'}v \mid y=Xb+u-v,\ (u,v) \in
R_{+}^{2n}\}$$
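As a concrete check on this formulation, the linear program can be handed to an off-the-shelf LP solver. The sketch below is illustrative, not code from this text: `rq_lp` is a hypothetical name, SciPy is assumed to be available, and the variables are stacked in the order \((b, u, v)\).

```python
# Sketch: solve  min tau*e'u + (1-tau)*e'v  s.t.  y = Xb + u - v,
# u, v >= 0, b free, with scipy.optimize.linprog.  Hypothetical helper.
import numpy as np
from scipy.optimize import linprog

def rq_lp(X, y, tau):
    n, p = X.shape
    # objective puts zero cost on b, tau on u, (1 - tau) on v
    c = np.concatenate([np.zeros(p), tau * np.ones(n), (1 - tau) * np.ones(n)])
    A_eq = np.hstack([X, np.eye(n), -np.eye(n)])   # Xb + u - v = y
    bounds = [(None, None)] * p + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds)
    return res.x[:p]

# Intercept-only design: the tau-th regression quantile is the tau-th
# sample quantile; for tau = 0.5 and odd n, the sample median.
y = np.array([1.0, 2.0, 3.0, 4.0, 10.0])
X = np.ones((5, 1))
b = rq_lp(X, y, 0.5)   # the sample median, 3.0
```

For an intercept-only design with an odd number of observations the solution is the unique sample quantile, which makes the output easy to verify by hand.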
The dual of this linear program is \(\max\{y^{'}a \mid
X^{'}a=(1-\tau)X^{'}e,\ a \in [0,1]^{n}\}\). Adding slack variables,
\(s\), satisfying the constraints \(a+s=e\), we
obtain the barrier function
$$B(a, s, \mu) = y^{'}a+\mu \sum_{i=1}^{n}(\log a_{i}+\log s_{i})$$
which should be maximized subject to the constraints
\(X^{'}a=(1-\tau)X^{'}e\) and \(a+s=e\). The Newton step
\(\delta_{a}\) solving
$$\max\{y^{'}\delta_{a}+\mu \delta_{a}^{'}(A^{-1}-S^{-1})e-
\frac{1}{2}\mu \delta_{a}^{'}(A^{-2}+S^{-2})\delta_{a}\}$$
subject to \(X^{'}\delta_{a}=0\), satisfies
$$y+\mu(A^{-1}-S^{-1})e-\mu(A^{-2}+S^{-2})\delta_{a}=Xb$$
for some \(b\in R^{p}\), and \(\delta_{a}\) such that
\(X^{'}\delta_{a}=0\).
Using the constraint, we can solve explicitly for the vector
\(b\),
$$b=(X^{'}WX)^{-1}X^{'}W[y+\mu(A^{-1}-S^{-1})e]$$
where \(W=(A^{-2}+S^{-2})^{-1}\). This is a form of the primal log
barrier algorithm described above. Setting \(\mu=0\) in each step
yields an affine scaling variant of the algorithm. The basic linear
algebra of each iteration is essentially unchanged; only the form
of the diagonal weighting matrix \(W\) has changed.
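The affine scaling variant can be sketched in a few lines of numpy. This is an illustrative reading of the iteration above, not the author's implementation: the function name, the feasible starting point \(a=(1-\tau)e\), the step-shrink factor \(\gamma=0.97\), and the duality-gap stopping rule are all assumptions. Each iteration reduces to one weighted least squares solve with the diagonal weights \(w_i=(a_i^{-2}+s_i^{-2})^{-1}\).

```python
# Sketch of the affine scaling (mu = 0) iteration; hypothetical helper.
import numpy as np

def rq_affine(X, y, tau, gamma=0.97, tol=1e-10, max_iter=200):
    n, p = X.shape
    a = np.full(n, 1.0 - tau)      # interior dual point: X'a = (1-tau)X'e
    for _ in range(max_iter):
        s = 1.0 - a
        w = 1.0 / (a**-2 + s**-2)  # diagonal of W = (A^-2 + S^-2)^-1
        # b = (X'WX)^-1 X'W y  (the mu = 0 form of the b equation)
        b = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
        r = y - X @ b
        d = w * r                  # step direction; X'd = 0 by construction
        # duality gap: primal check-function loss minus dual objective
        gap = np.sum(r * (tau - (r < 0))) - (y @ a - (1 - tau) * y.sum())
        if gap < tol or not d.any():
            break
        # longest step keeping 0 < a < 1, shrunk by gamma
        pos, neg = d > 0, d < 0
        lam = gamma * min(np.min(s[pos] / d[pos], initial=np.inf),
                          np.min(-a[neg] / d[neg], initial=np.inf))
        a = a + lam * d
    return b

y = np.array([1.0, 2.0, 3.0, 4.0, 10.0])
X = np.ones((5, 1))
b = rq_affine(X, y, 0.5)           # converges to the median, 3.0
```

Note that the direction \(d = W(y-Xb)\) satisfies \(X^{'}d=0\) exactly because \(b\) is the weighted least squares solution, so the dual equality constraint is preserved at every step and only the box constraint \(0<a<1\) limits the step length.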