mlflow (version 1.0.0)

mlflow_rfunc_serve: Serve an RFunc MLflow Model

Description

Serves an RFunc MLflow model as a local web API.

Usage

mlflow_rfunc_serve(model_uri, host = "127.0.0.1", port = 8090,
  daemonized = FALSE, browse = !daemonized, ...)

Arguments

model_uri

The location, in URI format, of the MLflow model.

host

Address to use to serve model, as a string.

port

Port to use to serve model, as numeric.

daemonized

Run the `httpuv` server daemonized so that interactive R sessions are not blocked while handling requests. To terminate a daemonized server, call `httpuv::stopDaemonizedServer()` with the handle returned from this call; see the sketch after this argument list.

browse

Launch browser with serving landing page?

...

Optional arguments passed to `mlflow_predict()`.
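
As an illustration of the daemonized workflow described under `daemonized`, here is a minimal sketch that starts a daemonized server and later stops it with the returned handle; the model path "mlflow_constant" matches the model saved in the Examples section and is otherwise a placeholder.

library(mlflow)

# start the server without blocking the interactive session
handle <- mlflow_rfunc_serve("mlflow_constant", daemonized = TRUE, browse = FALSE)

# ...issue requests against http://127.0.0.1:8090...

# stop the daemonized server via the handle returned above
httpuv::stopDaemonizedServer(handle)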

Details

The URI scheme must be supported by MLflow; that is, there must be an MLflow artifact repository corresponding to the scheme of the URI. The URI is expected to point to a directory containing an MLmodel file. The following are examples of valid model URIs:

- `file:///absolute/path/to/local/model`
- `file:relative/path/to/local/model`
- `s3://my_bucket/path/to/model`
- `runs:/<mlflow_run_id>/run-relative/path/to/model`

For more information about supported URI schemes, see the Artifacts Documentation at https://www.mlflow.org/docs/latest/tracking.html#supported-artifact-stores.
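
As a sketch only, each URI form above can be passed directly as `model_uri`; the local paths, bucket name, and run ID are hypothetical placeholders.

# the paths, bucket name, and run ID below are hypothetical placeholders
mlflow_rfunc_serve("file:///absolute/path/to/local/model")
mlflow_rfunc_serve("s3://my_bucket/path/to/model")
mlflow_rfunc_serve("runs:/<mlflow_run_id>/run-relative/path/to/model")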

Examples

# NOT RUN {
library(mlflow)

# save simple model with constant prediction
mlflow_save_model(function(df) 1, "mlflow_constant")

# serve an existing model over a web interface
# (blocks the session unless daemonized = TRUE)
mlflow_rfunc_serve("mlflow_constant")

# request prediction from server
httr::POST("http://127.0.0.1:8090/predict/")
# }
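
To send input data with a prediction request, the following hedged sketch may help; it assumes the /predict/ endpoint accepts a JSON body that deserializes into a data frame, and the column name x is a placeholder.

# assumes the endpoint parses the JSON body into a data frame;
# the column name `x` is a placeholder
httr::POST(
  "http://127.0.0.1:8090/predict/",
  body = jsonlite::toJSON(data.frame(x = 1:3)),
  httr::content_type_json()
)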
