Usage:

## S3 method for class 'formula':
blackboost(formula, data = list(), weights = NULL, ...)
## S3 method for class 'matrix':
blackboost(x, y, weights = NULL, ...)
blackboost_fit(object, tree_controls = 
    ctree_control(teststat = "max",
                  testtype = "Teststatistic",
                  mincriterion = 0,
                  maxdepth = 2),
    fitmem = ctree_memory(object, TRUE), family = GaussReg(), 
    control = boost_control(), weights = NULL)

Arguments:

object         an object of class boost_data, see boost_dpp.
tree_controls  an object of class TreeControl, which can be obtained
               using ctree_control. Defines hyperparameters for the
               trees used as base learners.
fitmem         an object of class TreeFitMemory.
family         an object of class boost_family-class, implementing the
               negative gradient corresponding to the loss function to
               be optimized; by default, squared error loss.
control        an object of class boost_control, which defines the
               hyperparameters of the boosting algorithm.

Details:

This function implements gradient boosting with regression trees as
base learners; essentially the same algorithm is implemented in gbm. The
  main difference is that arbitrary loss functions to be optimized 
  can be specified via the family argument to blackboost whereas
  gbm uses hard-coded loss functions. 
Moreover, the base learners (conditional inference trees, see ctree)
are somewhat more flexible.

The regression fit is a black-box prediction machine and is thus hardly
interpretable.
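As an illustration of the family argument, a robust fit under absolute
error (L1) loss might look as follows. This is a minimal sketch, not
part of the original page: it assumes mboost's Laplace() family is
available and that the formula interface forwards tree_controls on to
the fitting procedure.

### Hypothetical sketch: replace the default squared error loss by
### absolute error loss (Laplace() family, assumed available in mboost)
### and grow slightly deeper base-learner trees via tree_controls.
cars.l1 <- blackboost(dist ~ speed, data = cars,
                      family = Laplace(),
                      tree_controls = ctree_control(teststat = "max",
                                                    testtype = "Teststatistic",
                                                    mincriterion = 0,
                                                    maxdepth = 4),
                      control = boost_control(mstop = 50))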
Usually, the formula-based interface blackboost should be used; the
fitting procedure without data preprocessing is accessible via
blackboost_fit, for example for cross-validation.
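For instance, a simple hold-out evaluation could be sketched as follows.
This is a rough illustration only: it uses a manual random split via the
formula interface rather than blackboost_fit or the package's own
cross-validation machinery.

### Hypothetical sketch: hold-out evaluation of a blackboost fit;
### a manual 2/3 - 1/3 split of the cars data is assumed.
set.seed(290875)
idx <- sample(nrow(cars), floor(2/3 * nrow(cars)))
fit <- blackboost(dist ~ speed, data = cars[idx, ],
                  control = boost_control(mstop = 50))
### mean squared error on the held-out observations
mean((cars$dist[-idx] - predict(fit, newdata = cars[-idx, ]))^2)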
References:

Greg Ridgeway (1999), The state of boosting. Computing Science and
Statistics, 31, 172--181.

Peter Bühlmann and Torsten Hothorn (2007), Boosting algorithms:
regularization, prediction and model fitting. Statistical Science,
22(4), 477--505.
Examples:

### a simple two-dimensional example: cars data
cars.gb <- blackboost(dist ~ speed, data = cars,
                      control = boost_control(mstop = 50))
cars.gb

### plot the fit
plot(dist ~ speed, data = cars)
lines(cars$speed, predict(cars.gb), col = "red")
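The number of boosting iterations controls the flexibility of the fit.
A sketch of this effect, assuming the plot above is still open, might
be:

### Hypothetical sketch: a longer run (mstop = 500) gives a more
### flexible, wiggly fit than the 50-iteration model above
cars.gb2 <- blackboost(dist ~ speed, data = cars,
                       control = boost_control(mstop = 500))
lines(cars$speed, predict(cars.gb2), col = "blue", lty = 2)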