mlflow (version 0.7.0)

mlflow_rfunc_serve: Serve an RFunc MLflow Model

Description

Serve an RFunc MLflow Model as a local web API.

Usage

mlflow_rfunc_serve(model_path, run_uuid = NULL, host = "127.0.0.1",
  port = 8090, daemonized = FALSE, browse = !daemonized,
  restore = FALSE)

Arguments

model_path

The path to the MLflow model, as a string.

run_uuid

ID of the run from which to retrieve the model.

host

Address to use to serve model, as a string.

port

Port to use to serve model, as numeric.

daemonized

Makes the 'httpuv' server daemonized, so that interactive R sessions are not blocked while handling requests. To terminate a daemonized server, call 'httpuv::stopDaemonizedServer()' with the handle returned by this call.

browse

Launch a browser displaying the serving landing page?

restore

Should mlflow_restore_snapshot() be called before serving?
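
A short sketch of the daemonized workflow described above: serving with `daemonized = TRUE` returns a handle, which is later passed to `httpuv::stopDaemonizedServer()` to shut the server down (the model path `"mlflow_constant"` is an assumed example, matching the Examples section below).

```r
library(mlflow)

# serve without blocking the interactive session;
# the returned handle identifies the daemonized 'httpuv' server
handle <- mlflow_rfunc_serve("mlflow_constant", daemonized = TRUE)

# ... issue requests against http://127.0.0.1:8090 ...

# terminate the daemonized server using the handle
httpuv::stopDaemonizedServer(handle)
```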

Examples

# NOT RUN {
library(mlflow)

# save simple model with constant prediction
mlflow_save_model(function(df) 1, "mlflow_constant")

# serve an existing model over a web interface
mlflow_rfunc_serve("mlflow_constant")

# request a prediction from the server
# (the constant model above returns 1 for any input)
httr::POST("http://127.0.0.1:8090/predict/")
# }