Import a notebook or the contents of an entire directory.
db_workspace_import(
  path,
  file = NULL,
  content = NULL,
  format = c("AUTO", "SOURCE", "HTML", "JUPYTER", "DBC", "R_MARKDOWN"),
  language = NULL,
  overwrite = FALSE,
  host = db_host(),
  token = db_token(),
  perform_request = TRUE
)

Arguments:

path: Absolute path of the notebook or directory.

file: Path of a local file to upload. See the format parameter.

content: Content to upload; this will be base64-encoded and has a limit of 10MB.

format: One of AUTO, SOURCE, HTML, JUPYTER, DBC, R_MARKDOWN. Default is SOURCE.

language: One of R, PYTHON, SCALA, SQL. Required when format is SOURCE, otherwise ignored.

overwrite: Flag that specifies whether to overwrite an existing object. FALSE by default. For DBC, overwrite is not supported since the archive may contain a directory.

host: Databricks workspace URL, defaults to calling db_host().

token: Databricks workspace token, defaults to calling db_token().

perform_request: If TRUE (default) the request is performed; if FALSE the httr2 request is returned without being performed.
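A minimal sketch of a typical call, using only the documented parameters; the workspace path and local file name are hypothetical:

# Upload a local R source file as a workspace notebook
# (paths are hypothetical; language is required because format is SOURCE)
db_workspace_import(
  path = "/Users/someone@example.com/my-notebook",
  file = "my-notebook.R",
  format = "SOURCE",
  language = "R"
)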
file and content are mutually exclusive. If both are specified, content will be ignored.
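For example, a small notebook could be supplied inline via content rather than file (a sketch with a hypothetical workspace path; per the content parameter description, the base64 encoding is handled for you):

# Supply notebook source inline instead of pointing at a local file
db_workspace_import(
  path = "/Users/someone@example.com/hello-notebook",
  content = "print('hello world')",
  format = "SOURCE",
  language = "PYTHON"
)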
If path already exists and overwrite is set to FALSE, this call returns the error RESOURCE_ALREADY_EXISTS. Only the DBC format can be used to import a directory.
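A sketch of importing an exported DBC archive as a directory (paths are hypothetical; overwrite is left at its default since it is not supported for DBC):

# Import a DBC archive; the archive may expand into a directory of notebooks
db_workspace_import(
  path = "/Users/someone@example.com/imported-project",
  file = "project-export.dbc",
  format = "DBC"
)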
Other Workspace API:
db_workspace_delete(),
db_workspace_export(),
db_workspace_get_status(),
db_workspace_list(),
db_workspace_mkdirs()