tfdeploy v0.6.1


Deploy 'TensorFlow' Models

Tools to deploy 'TensorFlow' models across multiple services. Currently, it provides a local server for testing 'cloudml' compatible services.


Deploying TensorFlow Models from R


While TensorFlow models are typically defined and trained using R or Python code, it is possible to deploy TensorFlow models in a wide variety of environments without any runtime dependency on R or Python:

  • TensorFlow Serving is an open-source software library for serving TensorFlow models using a gRPC interface.

  • CloudML is a managed cloud service that serves TensorFlow models using a REST interface.

  • RStudio Connect provides support for serving models using the same REST API as CloudML, but on a server within your own organization.

TensorFlow models can also be deployed to mobile and embedded devices, including iOS and Android phones and Raspberry Pi computers. The tfdeploy package includes a variety of tools designed to make exporting and serving TensorFlow models straightforward. For documentation on using tfdeploy, see the package website.

Functions in tfdeploy

Name                                   Description
load_savedmodel                        Load a SavedModel
predict_savedmodel                     Predict using a SavedModel
predict_savedmodel.export_prediction   Predict using an Exported SavedModel
predict_savedmodel.graph_prediction    Predict using a Loaded SavedModel
predict_savedmodel.webapi_prediction   Predict using a Web API
serve_savedmodel                       Serve a SavedModel
reexports                              Objects exported from other packages
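As a minimal sketch of how these functions fit together, the following R code loads an exported SavedModel, runs a local prediction, and then serves the same model over a 'cloudml'-compatible REST endpoint. The "savedmodel/" directory, the input shape, and the host/port values are illustrative assumptions, not part of the package documentation above.

```r
library(tfdeploy)

# Load a previously exported SavedModel into a TensorFlow session.
# "savedmodel/" is a hypothetical export directory (e.g. one produced
# by export_savedmodel() from the tensorflow package).
sess  <- tensorflow::tf$Session()
graph <- load_savedmodel(sess, "savedmodel/")

# Run a local prediction; the instance shape (784 inputs) is an
# assumption that must match the model's input signature.
predict_savedmodel(list(rep(0, 784)), graph)

# Alternatively, serve the model as a local 'cloudml'-compatible
# REST service for testing (host and port are illustrative):
serve_savedmodel("savedmodel/", host = "127.0.0.1", port = 8089)
```

Serving locally in this way lets you exercise the same REST request format a CloudML or RStudio Connect deployment would accept, before publishing the model.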


Type Package
License Apache License 2.0
Encoding UTF-8
LazyData true
RoxygenNote 6.1.1
VignetteBuilder knitr
NeedsCompilation no
Packaged 2019-06-13 18:26:35 UTC; dfalbel
Repository CRAN
Date/Publication 2019-06-14 16:30:03 UTC
