brickster

Overview

{brickster} is the R toolkit for Databricks. It includes wrappers for the Databricks REST APIs (see the API coverage table below), a {DBI} backend for SQL warehouses with {dbplyr} support, helpers for moving files to and from Unity Catalog volumes, and a remote REPL for interactive clusters.

Quick Start

library(brickster)

# only requires `DATABRICKS_HOST` if using OAuth U2M
# first request will open browser window to login
Sys.setenv(DATABRICKS_HOST = "https://<workspace-prefix>.cloud.databricks.com")

# open RStudio/Positron connection pane to view Databricks resources
open_workspace()

# list all SQL warehouses
warehouses <- db_sql_warehouse_list()

Refer to the "Connect to a Databricks Workspace" article for more details on configuring authentication.
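OAuth U2M is not the only option: {brickster} can also authenticate with a personal access token read from `DATABRICKS_TOKEN`. A minimal sketch, assuming you have generated a PAT in your workspace user settings (host and token values are placeholders):

# PAT authentication instead of OAuth U2M (values are placeholders)
Sys.setenv(
  DATABRICKS_HOST = "https://<workspace-prefix>.cloud.databricks.com",
  DATABRICKS_TOKEN = "<personal-access-token>"
)

# subsequent requests use the token, no browser login required
warehouses <- db_sql_warehouse_list()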

Usage

{DBI} Backend

library(brickster)
library(DBI)

# Connect to Databricks using DBI (assumes you followed the quickstart to authenticate)
con <- dbConnect(
  DatabricksSQL(),
  warehouse_id = "<warehouse-id>"
)

# Standard {DBI} operations
tables <- dbListTables(con)
dbGetQuery(con, "SELECT * FROM samples.nyctaxi.trips LIMIT 5")

# Use with {dbplyr} for {dplyr} syntax
library(dplyr)
library(dbplyr)

nyc_taxi <- tbl(con, I("samples.nyctaxi.trips"))

result <- nyc_taxi |>
  filter(year(tpep_pickup_datetime) == 2016) |>
  group_by(pickup_zip) |>
  summarise(
    trip_count = n(),
    avg_fare = mean(fare_amount, na.rm = TRUE),
    avg_distance = mean(trip_distance, na.rm = TRUE)
  ) |>
  collect()
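
Writes and cleanup go through the same {DBI} generics. A short sketch, assuming you have write access to a schema (the three-level table name is a placeholder):

# persist a data frame as a table, then close the connection
dbWriteTable(con, I("<catalog>.<schema>.iris_copy"), iris)
dbDisconnect(con)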

Download & Upload to Volume

library(readr)
library(brickster)

# write a local CSV and upload it to a volume as `data.csv`
local_file <- tempfile(fileext = ".csv")
write_csv(x = iris, file = local_file)
db_volume_write(
  path = "/Volumes/<catalog>/<schema>/<volume>/data.csv",
  file = local_file
)

# read `data.csv` from a volume and write to a file
downloaded_file <- tempfile(fileext = ".csv")
file <- db_volume_read(
  path = "/Volumes/<catalog>/<schema>/<volume>/data.csv",
  destination = downloaded_file
)
volume_csv <- read_csv(downloaded_file)
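
To confirm the round trip, the volume can be inspected with the companion helpers `db_volume_list()` and `db_volume_file_exists()` (both listed in the function index below). A sketch, assuming they take the same `path` argument as the read/write helpers:

# list the directory and check the uploaded file is present
db_volume_list(path = "/Volumes/<catalog>/<schema>/<volume>/")
db_volume_file_exists(path = "/Volumes/<catalog>/<schema>/<volume>/data.csv")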

Databricks REPL

Run commands against an existing interactive Databricks cluster; see the accompanying article for more details.

library(brickster)

# commands after this will run on the interactive cluster
# read the vignette for more details
db_repl(cluster_id = "<interactive_cluster_id>")
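
`db_repl()` expects the cluster to be running. `get_and_start_cluster()` (see the function index below) can take care of that first; a sketch, assuming it accepts the same `cluster_id` argument:

# start the cluster if it is stopped, then attach the REPL
get_and_start_cluster(cluster_id = "<interactive_cluster_id>")
db_repl(cluster_id = "<interactive_cluster_id>")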

Installation

install.packages("brickster")

Development Version

# install.packages("pak")
pak::pak("databrickslabs/brickster")

API Coverage

{brickster} is deliberate about which APIs it wraps. It isn't intended to replace IaC tooling (e.g. Terraform) or to be used for account/workspace administration.
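
Each wrapped endpoint is exposed as a plain R function with a `db_` prefix. A sketch using defaults (these functions appear in the index below; optional arguments are omitted):

# REST API wrappers are ordinary functions
clusters <- db_cluster_list()
jobs     <- db_jobs_list()
catalogs <- db_uc_catalogs_list()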

| API | Available | Version |
|---|---|---|
| DBFS | Yes | 2.0 |
| Secrets | Yes | 2.0 |
| Repos | Yes | 2.0 |
| mlflow Model Registry | Yes | 2.0 |
| Clusters | Yes | 2.0 |
| Libraries | Yes | 2.0 |
| Workspace | Yes | 2.0 |
| Endpoints | Yes | 2.0 |
| Query History | Yes | 2.0 |
| Jobs | Yes | 2.1 |
| Volumes (Files) | Yes | 2.0 |
| SQL Statement Execution | Yes | 2.0 |
| REST 1.2 Commands | Partially | 1.2 |
| Unity Catalog - Tables | Yes | 2.1 |
| Unity Catalog - Volumes | Yes | 2.1 |
| Unity Catalog | Partially | 2.1 |
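
For example, the SQL Statement Execution API is surfaced through `db_sql_query()`. A minimal sketch against the sample dataset, assuming the argument names shown (check `?db_sql_query` for the exact signature; the warehouse ID is a placeholder):

# run a statement on a SQL warehouse and collect the result
trips <- db_sql_query(
  warehouse_id = "<warehouse-id>",
  statement = "SELECT * FROM samples.nyctaxi.trips LIMIT 5"
)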

Monthly Downloads: 518
Version: 0.2.11
License: Apache License (>= 2)

Maintainer: Zac Davies
Last Published: December 13th, 2025

Functions in brickster (0.2.11)

databricks-dbplyr

dbplyr Backend for Databricks SQL
databricks-dbi

DBI Interface for Databricks SQL Warehouses
cluster_autoscale

Cluster Autoscale
close_workspace

Close Databricks Workspace Connection
dbColumnInfo,DatabricksResult-method

Get column information from result
dbClearResult,DatabricksResult-method

Clear result set
dbIsValid,DatabricksConnection-method

Check if connection is valid
dbListFields,DatabricksConnection,AsIs-method

List column names of a Databricks table (AsIs method)
dbBegin,DatabricksConnection-method

Begin transaction (not supported)
dbQuoteIdentifier,DatabricksConnection,character-method

Quote identifiers for Databricks SQL
dbGetRowsAffected,DatabricksResult-method

Get number of rows affected (not applicable for SELECT)
dbRollback,DatabricksConnection-method

Rollback transaction (not supported)
dbSendQuery,DatabricksConnection,character-method

Send query to Databricks (asynchronous)
dbSendStatement,DatabricksConnection,character-method

Send statement to Databricks
dbGetRowCount,DatabricksResult-method

Get number of rows fetched
dbExistsTable,DatabricksConnection,Id-method

Check if table exists (Id method)
dbExistsTable,DatabricksConnection,AsIs-method

Check if table exists (AsIs method)
dbExecute,DatabricksConnection,character-method

Execute statement on Databricks
cron_schedule

Cron Schedule
dbConnect,DatabricksDriver-method

Connect to Databricks SQL Warehouse
dbExistsTable,DatabricksConnection,character-method

Check if table exists in Databricks
dbDisconnect,DatabricksConnection-method

Disconnect from Databricks
copy_to.DatabricksConnection

Copy data frame to Databricks as table or view
db_cluster_events

List Cluster Activity Events
dbGetStatement,DatabricksResult-method

Get SQL statement from result
dbDataType,DatabricksConnection-method

Map R data types to Databricks SQL types
db_cluster_list

List Clusters
dbFetch,DatabricksResult-method

Fetch results from Databricks query
dbWriteTable,DatabricksConnection,AsIs,data.frame-method

Write table to Databricks (AsIs name signature)
db_cluster_pin

Pin a Cluster
dbListFields,DatabricksConnection,character-method

List column names of a Databricks table
db_cluster_list_node_types

List Available Cluster Node Types
dbWriteTable,DatabricksConnection,Id,data.frame-method

Write a data frame to Databricks table (Id method)
dbHasCompleted,DatabricksResult-method

Check if query has completed
db_cluster_start

Start a Cluster
db_cluster_terminate

Delete/Terminate a Cluster
db_cluster_delete

Delete/Terminate a Cluster
db_cluster_edit

Edit a Cluster
db_cluster_get

Get Details of a Cluster
db_cluster_create

Create a Cluster
db_cluster_action

Cluster Action Helper Function
dbListTables,DatabricksConnection-method

List tables in Databricks catalog/schema
dbCommit,DatabricksConnection-method

Commit transaction (not supported)
dbGetInfo,DatabricksConnection-method

Get connection information
db_cluster_unpin

Unpin a Cluster
db_collect.DatabricksConnection

Collect query results with proper progress timing for Databricks
db_append_with_select_values

Append data using atomic INSERT INTO with SELECT VALUES
db_context_status

Get Information About an Execution Context
db_context_command_cancel

Cancel a Command
dbWriteTable,DatabricksConnection,character,data.frame-method

Write a data frame to Databricks table
db_current_user

Get Current User Info
db_create_table_as_select_values

Create table with explicit schema before inserting values
db_current_workspace_id

Detect Current Workspace ID
db_dbfs_put

DBFS Put
db_cluster_resize

Resize a Cluster
db_context_command_run

Run a Command
db_context_command_run_and_wait

Run a Command and Wait For Results
db_context_destroy

Delete an Execution Context
db_dbfs_read

DBFS Read
db_cluster_runtime_versions

List Available Databricks Runtime Versions
db_cluster_restart

Restart a Cluster
db_dbfs_list

DBFS List
db_context_manager

Databricks Execution Context Manager (R6 Class)
db_dbfs_get_status

DBFS Get Status
db_jobs_runs_delete

Delete Job Run
db_jobs_runs_cancel

Cancel Job Run
db_jobs_reset

Overwrite All Settings For A Job
dbGetQuery,DatabricksConnection,character-method

Execute SQL query and return results
db_dbfs_create

DBFS Create
db_dbfs_delete

DBFS Delete
db_dbfs_mkdirs

DBFS mkdirs
db_context_command_parse

Parse Command Results
db_host

Generate/Fetch Databricks Host
db_jobs_runs_get_output

Get Job Run Output
db_jobs_create

Create Job
db_create_table_from_data

Create table from data frame structure
db_jobs_get

Get Job Details
db_jobs_runs_list

List Job Runs
db_jobs_delete

Delete a Job
db_libs_install

Install Library on Cluster
db_libs_uninstall

Uninstall Library on Cluster
db_current_cloud

Detect Current Workspace Cloud
db_generate_values_sql

Generate VALUES SQL from data frame
db_jobs_repair_run

Repair A Job Run
db_libs_all_cluster_statuses

Get Status of All Libraries on All Clusters
db_jobs_list

List Jobs
db_generate_typed_values_sql_for_view

Generate typed VALUES SQL for temporary views (helper)
dbQuoteIdentifier,DatabricksConnection,SQL-method

Quote SQL objects (passthrough)
db_cluster_list_zones

List Availability Zones (AWS Only)
dbQuoteIdentifier,DatabricksConnection,Id-method

Quote complex identifiers (schema.table)
db_libs_cluster_status

Get Status of Libraries on Cluster
db_jobs_run_now

Trigger A New Job Run
db_lakebase_get

Get Database Instance
db_mlflow_model_open_transition_reqs

Get All Open Stage Transition Requests for the Model Version
db_lakebase_creds_generate

Generate Database Credential
db_mlflow_model_delete_transition_req

Delete a Model Version Stage Transition Request
db_mlflow_model_approve_transition_req

Approve Model Version Stage Transition Request
db_cluster_perm_delete

Permanently Delete a Cluster
db_dbfs_move

DBFS Move
db_lakebase_get_by_uid

Find Database Instance by UID
db_query_update

Update a SQL Query
db_read_netrc

Read .netrc File
db_mlflow_registered_model_details

Get Registered Model Details
db_mlflow_model_version_comment_edit

Edit a Comment on a Model Version
db_query_delete

Delete a SQL Query
db_query_create

Create a SQL Query
db_repl

Remote REPL to Databricks Cluster
db_repo_delete

Delete Repo
db_repo_get

Get Repo
db_secrets_scope_delete

Delete Secret Scope
db_secrets_scope_create

Create Secret Scope
db_mlflow_model_reject_transition_req

Reject Model Version Stage Transition Request
db_oauth_client

Create OAuth 2.0 Client
db_context_command_status

Get Information About a Command
db_context_create

Create an Execution Context
db_repo_create

Create Repo
db_secrets_put

Put Secret in Secret Scope
db_sql_exec_poll_for_success

Poll a Query Until Successful
db_sql_exec_cancel

Cancel SQL Query
db_secrets_list

List Secrets in Secret Scope
db_dbfs_close

DBFS Close
db_dbfs_add_block

DBFS Add Block
db_escape_string_literal

Escape string literals for inline SQL VALUES
db_lakebase_list

List Database Instances
db_mlflow_model_version_comment

Make a Comment on a Model Version
db_sql_type_to_empty_vector

Create Empty R Vector from Databricks SQL Type
db_repo_get_all

Get All Repos
db_perform_request

Perform Databricks API Request
db_repo_update

Update Repo
db_req_error_body

Propagate Databricks API Errors
db_sql_warehouse_get

Get Warehouse
db_sql_warehouse_list

List Warehouses
db_uc_tables_summaries

List Table Summaries (Unity Catalog)
db_uc_tables_list

List Tables (Unity Catalog)
db_secrets_scope_acl_get

Get Secret Scope ACL
db_secrets_scope_acl_delete

Delete Secret Scope ACL
db_uc_tables_exists

Check Table Exists (Unity Catalog)
db_uc_tables_get

Get Table (Unity Catalog)
db_sql_warehouse_create

Create Warehouse
db_sql_query

Execute query with SQL Warehouse
db_sql_exec_status

Get SQL Query Status
db_sql_query_history

List Warehouse Query History
db_sql_fetch_results

Fetch SQL Query Results from Completed Query
db_uc_volumes_get

Get Volume (Unity Catalog)
db_volume_dir_create

Volume FileSystem Create Directory
db_jobs_runs_get

Get Job Run Details
db_volume_dir_delete

Volume FileSystem Delete Directory
db_generate_typed_values_sql

Generate type-aware VALUES SQL from data frame
db_jobs_runs_export

Export Job Run Output
db_uc_volumes_update

Update Volume (Unity Catalog)
db_secrets_scope_acl_put

Put ACL on Secret Scope
db_secrets_scope_acl_list

List Secret Scope ACLs
db_sql_create_empty_result

Create Empty Data Frame from Query Manifest
db_volume_delete

Volume FileSystem Delete
db_request

Databricks Request Helper
db_jobs_runs_submit

Create And Trigger A One-Time Run
db_jobs_update

Partially Update A Job
db_volume_recursive_delete_contents

Recursively delete all contents of a volume directory
db_volume_upload_dir

Upload Directory to Volume in Parallel
db_sql_exec_and_wait

Execute SQL Query and Wait for Completion
db_volume_read

Volume FileSystem Read
db_volume_list

Volume FileSystem List Directory Contents
db_sql_warehouse_edit

Edit Warehouse
db_sql_warehouse_delete

Delete Warehouse
db_uc_catalogs_get

Get Catalog (Unity Catalog)
db_token

Fetch Databricks Token
db_uc_volumes_create

Create Volume (Unity Catalog)
db_uc_volumes_delete

Delete Volume (Unity Catalog)
db_vs_endpoints_delete

Delete a Vector Search Endpoint
db_mlflow_model_transition_stage

Transition a Model Version's Stage
db_uc_volumes_list

List Volumes (Unity Catalog)
db_volume_write

Volume FileSystem Write
db_vs_endpoints_get

Get a Vector Search Endpoint
db_vs_endpoints_create

Create a Vector Search Endpoint
db_query_get

Get a SQL Query
db_mlflow_model_transition_req

Make a Model Version Stage Transition Request
db_sql_warehouse_start

Start Warehouse
db_mlflow_model_version_comment_delete

Delete a Comment on a Model Version
db_sql_exec_query

Execute SQL Query
db_vs_indexes_scan

Scan a Vector Search Index
db_sql_exec_result

Get SQL Query Results
db_vs_indexes_get

Get a Vector Search Index
db_vs_indexes_sync

Synchronize a Vector Search Index
db_vs_indexes_list

List Vector Search Indexes
db_workspace_mkdirs

Make a Directory (Workspaces)
db_sql_warehouse_stop

Stop Warehouse
get_latest_dbr

Get Latest Databricks Runtime (DBR)
db_write_table_standard

Write table using standard SQL approach
db_uc_schemas_list

List Schemas (Unity Catalog)
db_workspace_list

List Directory Contents (Workspaces)
db_workspace_import

Import Notebook/Directory (Workspaces)
db_volume_dir_exists

Volume FileSystem Check Directory Exists
is.embedding_source_column

Test if object is of class EmbeddingSourceColumn
get_and_start_warehouse

Get and Start Warehouse
delta_sync_index_spec

Delta Sync Vector Search Index Specification
db_uc_tables_delete

Delete Table (Unity Catalog)
is.cron_schedule

Test if object is of class CronSchedule
get_and_start_cluster

Get and Start Cluster
is.dbfs_storage_info

Test if object is of class DbfsStorageInfo
default_config_profile

Returns the default config profile
db_query_list

List SQL Queries
db_vs_indexes_query

Query a Vector Search Index
is.embedding_vector_column

Test if object is of class EmbeddingVectorColumn
embedding_source_column

Embedding Source Column
db_vs_indexes_query_next_page

Query Vector Search Next Page
is.lib_pypi

Test if object is of class PyPiLibrary
dbfs_storage_info

DBFS Storage Information
is.lib_whl

Test if object is of class WhlLibrary
dbplyr_edition.DatabricksConnection

Declare dbplyr API version for Databricks connections
direct_access_index_spec

Direct Access Vector Search Index Specification
init_script_info

Init Script Info
in_databricks_nb

Detect if running within Databricks Notebook
is.delta_sync_index

Test if object is of class DeltaSyncIndex
determine_brickster_venv

Determine brickster virtualenv
is.direct_access_index

Test if object is of class DirectAccessIndex
is.pipeline_task

Test if object is of class PipelineTask
is.gcp_attributes

Test if object is of class GcpAttributes
is.lib_egg

Test if object is of class EggLibrary
lib_whl

Wheel Library (Python)
is.git_source

Test if object is of class GitSource
is.lib_cran

Test if object is of class CranLibrary
is.python_wheel_task

Test if object is of class PythonWheelTask
embedding_vector_column

Embedding Vector Column
open_workspace

Connect to Databricks Workspace
lib_maven

Maven Library (Scala)
lib_pypi

PyPI Library (Python)
libraries

Libraries
gcp_attributes

GCP Attributes
is.access_control_request

Test if object is of class AccessControlRequest
generate_temp_name

Generate unique temporary table/view name
is.aws_attributes

Test if object is of class AwsAttributes
db_vs_endpoints_list

List Vector Search Endpoints
db_volume_file_exists

Volume FileSystem File Status
is.email_notifications

Test if object is of class JobEmailNotifications
is.notebook_task

Test if object is of class NotebookTask
is.new_cluster

Test if object is of class NewCluster
is.docker_image

Test if object is of class DockerImage
git_source

Git Source for Job Notebook Tasks
is.cluster_log_conf

Test if object is of class ClusterLogConf
is.init_script_info

Test if object is of class InitScriptInfo
is.condition_task

Test if object is of class ConditionTask
sql_table_analyze.DatabricksConnection

Handle table analysis for Databricks
python_wheel_task

Python Wheel Task
read_databrickscfg

Reads Databricks CLI Config
is.spark_submit_task

Test if object is of class SparkSubmitTask
is.sql_file_task

Test if object is of class SqlFileTask
is.job_task

Test if object is of class JobTaskSettings
is.run_job_task

Test if object is of class RunJobTask
sql_translation.DatabricksConnection

SQL translation environment for Databricks SQL
sql_query_save.DatabricksConnection

Create temporary views and tables in Databricks
db_request_json

Generate Request JSON
pipeline_task

Pipeline Task
spark_submit_task

Spark Submit Task
spark_python_task

Spark Python Task
is.vector_search_index_spec

Test if object is of class VectorSearchIndexSpec
job_task

Job Task
is.s3_storage_info

Test if object is of class S3StorageInfo
db_vs_indexes_upsert_data

Upsert Data into a Vector Search Index
db_workspace_delete

Delete Object/Directory (Workspaces)
db_vs_indexes_create

Create a Vector Search Index
sql_query_task

SQL Query Task
db_should_use_volume_method

Check if volume method should be used
db_secrets_scope_list_all

List Secret Scopes
db_sql_global_warehouse_get

Get Global Warehouse Config
db_secrets_delete

Delete Secret in Secret Scope
new_cluster

New Cluster
docker_image

Docker Image
db_write_table_volume

Write table using volume-based approach
email_notifications

Email Notifications
db_wsid

Fetch Databricks Workspace ID
spark_jar_task

Spark Jar Task
show,DatabricksResult-method

Show method for DatabricksResult
notebook_task

Notebook Task
db_vs_indexes_delete

Delete a Vector Search Index
db_sql_process_inline

Process Inline SQL Query Results
is.azure_attributes

Test if object is of class AzureAttributes
is.lib_jar

Test if object is of class JarLibrary
is.lib_maven

Test if object is of class MavenLibrary
db_workspace_get_status

Get Object Status (Workspaces)
for_each_task

For Each Task
db_vs_indexes_delete_data

Delete Data from a Vector Search Index
db_uc_catalogs_list

List Catalogs (Unity Catalog)
db_uc_schemas_get

Get Schema (Unity Catalog)
is.cluster_autoscale

Test if object is of class AutoScale
file_storage_info

File Storage Information
is.access_control_req_group

Test if object is of class AccessControlRequestForGroup
db_workspace_export

Export Notebook or Directory (Workspaces)
is.access_control_req_user

Test if object is of class AccessControlRequestForUser
is.sql_query_task

Test if object is of class SqlQueryTask
is.valid_task_type

Test if object is of class JobTask
is.file_storage_info

Test if object is of class FileStorageInfo
is.for_each_task

Test if object is of class ForEachTask
is.libraries

Test if object is of class Libraries
is.library

Test if object is of class Library
read_env_var

Reads Environment Variables
remove_lib_path

Remove Library Path
lib_cran

CRAN Library (R)
job_tasks

Job Tasks
run_job_task

Run Job Task
is.spark_python_task

Test if object is of class SparkPythonTask
lib_egg

Egg Library (Python)
lib_jar

Jar Library (Scala)
is.spark_jar_task

Test if object is of class SparkJarTask
show,DatabricksConnection-method

Show method for DatabricksConnection
show,DatabricksDriver-method

Show method for DatabricksDriver
s3_storage_info

S3 Storage Info
sql_query_fields.DatabricksConnection

SQL Query Fields for Databricks connections
sql_file_task

SQL File Task
use_databricks_cfg

Returns whether or not to use a .databrickscfg file
wait_for_lib_installs

Wait for Libraries to Install on Databricks Cluster
access_control_request

Access Control Request
add_lib_path

Add Library Path
DatabricksSQL

Create Databricks SQL Driver
aws_attributes

AWS Attributes
azure_attributes

Azure Attributes
DatabricksResult-class

DBI Result for Databricks
access_control_req_user

Access Control Request For User
access_control_req_group

Access Control Request for Group
DatabricksDriver-class

DBI Driver for Databricks
DatabricksConnection-class

DBI Connection for Databricks
condition_task

Condition Task
dbAppendTable,DatabricksConnection,Id,data.frame-method

Append rows to an existing Databricks table (Id method)
dbAppendTable,DatabricksConnection,character,data.frame-method

Append rows to an existing Databricks table
cluster_log_conf

Cluster Log Configuration