
brickster

Overview

{brickster} is the R toolkit for Databricks. It includes wrappers for Databricks REST APIs, a {DBI}/{dbplyr} backend for SQL warehouses, helpers for Unity Catalog volumes, an RStudio/Positron connection pane, and a remote REPL for interactive clusters.

Quick Start

library(brickster)

# the first request opens a browser window to log in (U2M OAuth)
Sys.setenv(DATABRICKS_HOST = "https://<workspace-prefix>.cloud.databricks.com")

# open RStudio/Positron connection pane to view Databricks resources
open_workspace()

# list all SQL warehouses
warehouses <- db_sql_warehouse_list()

Refer to the "Connect to a Databricks Workspace" article for more details on configuring authentication.
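If browser-based U2M login isn't an option (e.g. on a headless server or in CI), a personal access token also works. A minimal sketch — the host and token placeholders are yours to fill in:

```r
library(brickster)

# PAT-based authentication: brickster picks up these environment
# variables (a profile in ~/.databrickscfg also works)
Sys.setenv(DATABRICKS_HOST = "https://<workspace-prefix>.cloud.databricks.com")
Sys.setenv(DATABRICKS_TOKEN = "<personal-access-token>")

# confirm the credentials resolve to a workspace user
db_current_user()
```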

Usage

{DBI} Backend

library(brickster)
library(DBI)

# Connect to Databricks using DBI (assumes you followed quickstart to authenticate)
con <- dbConnect(
  DatabricksSQL(),
  warehouse_id = "<warehouse-id>"
)

# Standard {DBI} operations
tables <- dbListTables(con)
dbGetQuery(con, "SELECT * FROM samples.nyctaxi.trips LIMIT 5")

# Use with {dbplyr} for {dplyr} syntax
library(dplyr)
library(dbplyr)

nyc_taxi <- tbl(con, I("samples.nyctaxi.trips"))

result <- nyc_taxi |>
  filter(year(tpep_pickup_datetime) == 2016) |>
  group_by(pickup_zip) |>
  summarise(
    trip_count = n(),
    avg_fare = mean(fare_amount, na.rm = TRUE),
    avg_distance = mean(trip_distance, na.rm = TRUE)
  ) |>
  collect()
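Because {dbplyr} builds SQL lazily, you can inspect the translated query before anything runs on the warehouse. Continuing from the pipeline above:

```r
# print the SQL that dbplyr generates for the pipeline; nothing
# executes on the warehouse until collect() is called
nyc_taxi |>
  filter(year(tpep_pickup_datetime) == 2016) |>
  group_by(pickup_zip) |>
  summarise(trip_count = n()) |>
  show_query()
```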

Upload & Download to a Volume

library(readr)
library(brickster)

# upload `data.csv` to a volume
local_file <- tempfile(fileext = ".csv")
write_csv(x = iris, file = local_file)
db_volume_write(
  path = "/Volumes/<catalog>/<schema>/<volume>/data.csv",
  file = local_file
)

# read `data.csv` from a volume and write to a file
downloaded_file <- tempfile(fileext = ".csv")
file <- db_volume_read(
  path = "/Volumes/<catalog>/<schema>/<volume>/data.csv",
  destination = downloaded_file
)
volume_csv <- read_csv(downloaded_file)
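To sanity-check the round trip, you can list the volume directory and compare the downloaded data against the original (the path placeholders are the same as above):

```r
# list directory contents to confirm `data.csv` landed in the volume
db_volume_list(path = "/Volumes/<catalog>/<schema>/<volume>/")

# the downloaded copy should have the same shape as what was uploaded
all.equal(dim(volume_csv), dim(iris))
```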

Databricks REPL

Run commands against an existing interactive Databricks cluster. See the associated vignette for more details.

library(brickster)

# commands after this will run on the interactive cluster
# read the vignette for more details
db_repl(cluster_id = "<interactive_cluster_id>")

Installation

install.packages("brickster")

Development Version

# install.packages("pak")
pak::pak("databrickslabs/brickster")

API Coverage

{brickster} is deliberate about which APIs it wraps. It isn't intended to replace IaC tooling (e.g. Terraform) or to be used for account/workspace administration.

| API                     | Available | Version |
|-------------------------|-----------|---------|
| DBFS                    | Yes       | 2.0     |
| Secrets                 | Yes       | 2.0     |
| Repos                   | Yes       | 2.0     |
| mlflow Model Registry   | Yes       | 2.0     |
| Clusters                | Yes       | 2.0     |
| Libraries               | Yes       | 2.0     |
| Workspace               | Yes       | 2.0     |
| Endpoints               | Yes       | 2.0     |
| Query History           | Yes       | 2.0     |
| Jobs                    | Yes       | 2.1     |
| Volumes (Files)         | Yes       | 2.0     |
| SQL Statement Execution | Yes       | 2.0     |
| REST 1.2 Commands       | Partially | 1.2     |
| Unity Catalog - Tables  | Yes       | 2.1     |
| Unity Catalog - Volumes | Yes       | 2.1     |
| Unity Catalog           | Partially | 2.1     |


Monthly Downloads

518

Version

0.2.12

License

Apache License (>= 2)


Maintainer

Zac Davies

Last Published

February 4th, 2026

Functions in brickster (0.2.12)

DatabricksConnection-class

DBI Connection for Databricks
DatabricksResult-class

DBI Result for Databricks
access_control_req_user

Access Control Request For User
access_control_request

Access Control Request
azure_attributes

Azure Attributes
DatabricksSQL

Create Databricks SQL Driver
aws_attributes

AWS Attributes
access_control_req_group

Access Control Request for Group
DatabricksDriver-class

DBI Driver for Databricks
add_lib_path

Add Library Path
cluster_autoscale

Cluster Autoscale
close_workspace

Close Databricks Workspace Connection
condition_task

Condition Task
cluster_log_conf

Cluster Log Configuration
copy_to.DatabricksConnection

Copy data frame to Databricks as table or view
cron_schedule

Cron Schedule
dbAppendTable,DatabricksConnection,character,data.frame-method

Append rows to an existing Databricks table
dbAppendTable,DatabricksConnection,Id,data.frame-method

Append rows to an existing Databricks table (Id method)
dbBegin,DatabricksConnection-method

Begin transaction (not supported)
databricks-dbi

DBI Interface for Databricks SQL Warehouses
dbClearResult,DatabricksResult-method

Clear result set
databricks-dbplyr

dbplyr Backend for Databricks SQL
dbConnect,DatabricksDriver-method

Connect to Databricks SQL Warehouse
dbCreateTable,DatabricksConnection,AsIs-method

Create an empty Databricks table (AsIs method)
dbGetInfo,DatabricksConnection-method

Get connection information
dbCreateTable,DatabricksConnection,character-method

Create an empty Databricks table
dbCreateTable,DatabricksConnection,Id-method

Create an empty Databricks table (Id method)
dbFetch,DatabricksResult-method

Fetch results from Databricks query
dbCommit,DatabricksConnection-method

Commit transaction (not supported)
dbColumnInfo,DatabricksResult-method

Get column information from result
dbGetRowCount,DatabricksResult-method

Get number of rows fetched
dbGetQuery,DatabricksConnection,character-method

Execute SQL query and return results
dbDisconnect,DatabricksConnection-method

Disconnect from Databricks
dbQuoteIdentifier,DatabricksConnection,Id-method

Quote complex identifiers (schema.table)
dbGetRowsAffected,DatabricksResult-method

Get number of rows affected (not applicable for SELECT)
dbGetStatement,DatabricksResult-method

Get SQL statement from result
dbDataType,DatabricksConnection-method

Map R data types to Databricks SQL types
dbListTables,DatabricksConnection-method

List tables in Databricks catalog/schema
dbExistsTable,DatabricksConnection,Id-method

Check if table exists (Id method)
dbQuoteIdentifier,DatabricksConnection,SQL-method

Quote SQL objects (passthrough)
dbQuoteIdentifier,DatabricksConnection,character-method

Quote identifiers for Databricks SQL
dbExistsTable,DatabricksConnection,character-method

Check if table exists in Databricks
dbSendStatement,DatabricksConnection,character-method

Send statement to Databricks
dbHasCompleted,DatabricksResult-method

Check if query has completed
dbIsValid,DatabricksConnection-method

Check if connection is valid
dbWriteTable,DatabricksConnection,AsIs,data.frame-method

Write table to Databricks (AsIs name signature)
dbSendQuery,DatabricksConnection,character-method

Send query to Databricks (asynchronous)
db_cluster_delete

Delete/Terminate a Cluster
dbRollback,DatabricksConnection-method

Rollback transaction (not supported)
dbExistsTable,DatabricksConnection,AsIs-method

Check if table exists (AsIs method)
db_assert_valid_conn

Assert that a connection is valid
dbExecute,DatabricksConnection,character-method

Execute statement on Databricks
db_cluster_edit

Edit a Cluster
db_clean_table_name

Clean table name input
dbListFields,DatabricksConnection,AsIs-method

List column names of a Databricks table (AsIs method)
db_cluster_list_zones

List Availability Zones (AWS Only)
db_cluster_perm_delete

Permanently Delete a Cluster
dbReadTable,DatabricksConnection,character-method

Read a Databricks table
db_collect.DatabricksConnection

Collect query results with proper progress timing for Databricks
db_cluster_terminate

Delete/Terminate a Cluster
db_cluster_start

Start a Cluster
db_current_user

Get Current User Info
db_cluster_unpin

Unpin a Cluster
db_current_workspace_id

Detect Current Workspace ID
dbWriteTable,DatabricksConnection,Id,data.frame-method

Write a data frame to Databricks table (Id method)
dbReadTable,DatabricksConnection,Id-method

Read a Databricks table (Id method)
db_context_command_parse

Parse Command Results
db_dbfs_add_block

DBFS Add Block
dbReadTable,DatabricksConnection,AsIs-method

Read a Databricks table (AsIs method)
db_cluster_restart

Restart a Cluster
db_context_command_cancel

Cancel a Command
db_cluster_runtime_versions

List Available Databricks Runtime Versions
db_dbfs_get_status

DBFS Get Status
db_dbfs_close

DBFS Close
db_dbfs_list

DBFS List
db_escape_string_literal

Escape string literals for inline SQL VALUES
db_generate_typed_values_sql

Generate type-aware VALUES SQL from data frame
db_cluster_events

List Cluster Activity Events
db_cluster_get

Get Details of a Cluster
db_context_command_run

Run a Command
db_context_command_run_and_wait

Run a Command and Wait For Results
dbRemoveTable,DatabricksConnection,AsIs-method

Remove a Databricks table (AsIs method)
db_jobs_list

List Jobs
db_dbfs_mkdirs

DBFS mkdirs
db_jobs_delete

Delete a Job
db_dbfs_move

DBFS Move
db_jobs_runs_get_output

Get Job Run Output
db_dbfs_put

DBFS Put
db_current_cloud

Detect Current Workspace's Cloud
db_jobs_reset

Overwrite All Settings For A Job
db_dbfs_read

DBFS Read
db_jobs_runs_list

List Job Runs
db_jobs_run_now

Trigger A New Job Run
db_create_table_from_data

Create table from data frame structure
dbListFields,DatabricksConnection,character-method

List column names of a Databricks table
db_libs_install

Install Library on Cluster
db_libs_uninstall

Uninstall Library on Cluster
dbRemoveTable,DatabricksConnection,Id-method

Remove a Databricks table (Id method)
dbRemoveTable,DatabricksConnection,character-method

Remove a Databricks table
db_jobs_get

Get Job Details
db_jobs_runs_submit

Create And Trigger A One-Time Run
db_jobs_update

Partially Update A Job
db_mlflow_model_reject_transition_req

Reject Model Version Stage Transition Request
db_mlflow_model_open_transition_reqs

Get All Open Stage Transition Requests for the Model Version
db_jobs_runs_export

Export Job Run Output
db_query_delete

Delete a SQL Query
db_jobs_runs_get

Get Job Run Details
db_jobs_repair_run

Repair A Job Run
db_lakebase_get_by_uid

Find Database Instance by UID
db_mlflow_registered_model_details

Get Registered Model Details
db_mlflow_model_approve_transition_req

Approve Model Version Stage Transition Request
db_mlflow_model_version_comment_edit

Edit a Comment on a Model Version
db_lakebase_list

List Database Instances
db_mlflow_model_delete_transition_req

Delete a Model Version Stage Transition Request
db_cluster_action

Cluster Action Helper Function
db_append_with_select_values

Append data using atomic INSERT INTO with SELECT VALUES
dbWriteTable,DatabricksConnection,character,data.frame-method

Write a data frame to Databricks table
db_repo_update

Update Repo
db_query_get

Get a SQL Query
db_perform_request

Perform Databricks API Request
db_oauth_client

Create OAuth 2.0 Client
db_query_list

List SQL Queries
db_query_update

Update a SQL Query
db_cluster_pin

Pin a Cluster
db_cluster_create

Create a Cluster
db_repo_create

Create Repo
db_repo_delete

Delete Repo
db_request

Databricks Request Helper
db_cluster_list

List Clusters
db_assert_statement

Assert that a statement is provided
db_request_json

Generate Request JSON
db_secrets_scope_delete

Delete Secret Scope
db_secrets_scope_list_all

List Secret Scopes
db_secrets_scope_acl_get

Get Secret Scope ACL
db_secrets_scope_acl_list

List Secret Scope ACLs
db_sql_global_warehouse_get

Get Global Warehouse Config
db_sql_fetch_results_parallel

Fetch SQL Query Results (Parallel Path)
db_req_error_body

Propagate Databricks API Errors
db_cluster_resize

Resize a Cluster
db_dbfs_delete

DBFS Delete
db_context_create

Create an Execution Context
db_dbfs_create

DBFS Create
db_context_command_status

Get Information About a Command
db_jobs_create

Create Job
db_lakebase_get

Get Database Instance
db_libs_all_cluster_statuses

Get Status of All Libraries on All Clusters
db_lakebase_creds_generate

Generate Database Credential
db_host

Generate/Fetch Databricks Host
db_uc_schemas_get

Get Schema (Unity Catalog)
db_uc_schemas_list

List Schemas (Unity Catalog)
db_repo_get

Get Repo
db_sql_warehouse_list

List Warehouses
db_sql_warehouse_start

Start Warehouse
db_volume_write

Volume FileSystem Write
db_volume_upload_dir

Upload Directory to Volume in Parallel
db_uc_tables_list

List Tables (Unity Catalog)
db_uc_tables_get

Get Table (Unity Catalog)
db_volume_file_exists

Volume FileSystem File Status
db_sql_query_history

List Warehouse Query History
db_sql_exec_status

Get SQL Query Status
db_sql_exec_result

Get SQL Query Results
db_sql_type_to_empty_vector

Create Empty R Vector from Databricks SQL Type
db_context_manager

Databricks Execution Context Manager (R6 Class)
db_context_destroy

Delete an Execution Context
db_cluster_list_node_types

List Available Cluster Node Types
db_volume_list

Volume FileSystem List Directory Contents
db_vs_endpoints_create

Create a Vector Search Endpoint
db_libs_cluster_status

Get Status of Libraries on Cluster
db_vs_endpoints_delete

Delete a Vector Search Endpoint
db_sql_process_inline

Process Inline SQL Query Results
db_vs_indexes_list

List Vector Search Indexes
db_repo_get_all

Get All Repos
db_secrets_scope_acl_put

Put ACL on Secret Scope
db_vs_indexes_query

Query a Vector Search Index
db_secrets_scope_create

Create Secret Scope
db_sql_exec_cancel

Cancel SQL Query
db_sql_exec_and_wait

Execute SQL Query and Wait for Completion
db_sql_query

Execute query with SQL Warehouse
db_uc_catalogs_get

Get Catalog (Unity Catalog)
db_wsid

Fetch Databricks Workspace ID
db_uc_volumes_list

List Volumes (Unity Catalog)
db_uc_catalogs_list

List Catalogs (Unity Catalog)
db_workspace_import

Import Notebook/Directory (Workspaces)
db_uc_volumes_update

Update Volume (Unity Catalog)
db_vs_indexes_get

Get a Vector Search Index
db_workspace_get_status

Get Object Status (Workspaces)
email_notifications

Email Notifications
db_vs_indexes_delete_data

Delete Data from a Vector Search Index
is.access_control_request

Test if object is of class AccessControlRequest
embedding_source_column

Embedding Source Column
is.access_control_req_user

Test if object is of class AccessControlRequestForUser
db_vs_indexes_scan

Scan a Vector Search Index
direct_access_index_spec

Direct Access Vector Search Index Specification
db_prepare_create_table_fields

Prepare fields for CREATE TABLE
db_vs_indexes_query_next_page

Query Vector Search Next Page
docker_image

Docker Image
dbfs_storage_info

DBFS Storage Information
db_context_status

Get Information About an Execution Context
is.git_source

Test if object is of class GitSource
db_create_table_as_select_values

Create table with explicit schema before inserting values
is.init_script_info

Test if object is of class InitScriptInfo
is.s3_storage_info

Test if object is of class S3StorageInfo
db_sql_warehouse_stop

Stop Warehouse
db_token

Fetch Databricks Token
is.cluster_autoscale

Test if object is of class AutoScale
db_generate_typed_values_sql_for_view

Generate typed VALUES SQL for temporary views (helper)
is.spark_jar_task

Test if object is of class SparkJarTask
is.spark_python_task

Test if object is of class SparkPythonTask
db_generate_values_sql

Generate VALUES SQL from data frame
db_secrets_delete

Delete Secret in Secret Scope
db_repl

Remote REPL to Databricks Cluster
db_query_create

Create a SQL Query
db_secrets_list

List Secrets in Secret Scope
db_read_netrc

Read .netrc File
db_uc_volumes_delete

Delete Volume (Unity Catalog)
db_sql_exec_poll_for_success

Poll a Query Until Successful
db_sql_exec_query

Execute SQL Query
db_sql_fetch_results

Fetch SQL Query Results from Completed Query
db_uc_volumes_get

Get Volume (Unity Catalog)
db_uc_tables_summaries

List Table Summaries (Unity Catalog)
db_jobs_runs_cancel

Cancel Job Run
db_volume_dir_delete

Volume FileSystem Delete Directory
db_sql_fetch_results_fast

Fetch SQL Query Results (Fast Path)
db_jobs_runs_delete

Delete Job Run
db_volume_dir_exists

Volume FileSystem Check Directory Exists
is.spark_submit_task

Test if object is of class SparkSubmitTask
db_mlflow_model_version_comment

Make a Comment on a Model Version
db_volume_recursive_delete_contents

Recursively delete all contents of a volume directory
db_mlflow_model_transition_req

Make a Model Version Stage Transition Request
db_mlflow_model_transition_stage

Transition a Model Version's Stage
db_mlflow_model_version_comment_delete

Delete a Comment on a Model Version
db_secrets_put

Put Secret in Secret Scope
db_secrets_scope_acl_delete

Delete Secret Scope ACL
db_sql_warehouse_create

Create Warehouse
db_sql_create_empty_result

Create Empty Data Frame from Query Manifest
db_uc_volumes_create

Create Volume (Unity Catalog)
db_volume_read

Volume FileSystem Read
db_vs_indexes_delete

Delete a Vector Search Index
db_should_use_volume_method

Check if volume method should be used
db_vs_indexes_create

Create a Vector Search Index
sql_query_fields.DatabricksConnection

SQL Query Fields for Databricks connections
new_cluster

New Cluster
libraries

Libraries
db_vs_indexes_upsert_data

Upsert Data into a Vector Search Index
db_vs_indexes_sync

Synchronize a Vector Search Index
db_workspace_delete

Delete Object/Directory (Workspaces)
determine_brickster_venv

Determine brickster virtualenv
db_workspace_export

Export Notebook or Directory (Workspaces)
delta_sync_index_spec

Delta Sync Vector Search Index Specification
embedding_vector_column

Embedding Vector Column
db_sql_warehouse_delete

Delete Warehouse
is.cluster_log_conf

Test if object is of class ClusterLogConf
is.embedding_vector_column

Test if object is of class EmbeddingVectorColumn
db_sql_warehouse_get

Get Warehouse
db_sql_warehouse_edit

Edit Warehouse
db_uc_tables_exists

Check Table Exists (Unity Catalog)
db_uc_tables_delete

Delete Table (Unity Catalog)
file_storage_info

File Storage Information
init_script_info

Init Script Info
is.access_control_req_group

Test if object is of class AccessControlRequestForGroup
db_volume_delete

Volume FileSystem Delete
db_vs_endpoints_get

Get a Vector Search Endpoint
db_volume_dir_create

Volume FileSystem Create Directory
is.lib_whl

Test if object is of class WhlLibrary
is.file_storage_info

Test if object is of class FileStorageInfo
is.lib_pypi

Test if object is of class PyPiLibrary
is.lib_maven

Test if object is of class MavenLibrary
db_vs_endpoints_list

List Vector Search Endpoints
db_workspace_list

List Directory Contents (Workspaces)
is.docker_image

Test if object is of class DockerImage
is.direct_access_index

Test if object is of class DirectAccessIndex
is.lib_egg

Test if object is of class EggLibrary
is.lib_jar

Test if object is of class JarLibrary
is.python_wheel_task

Test if object is of class PythonWheelTask
sql_query_save.DatabricksConnection

Create temporary views and tables in Databricks
get_and_start_cluster

Get and Start Cluster
db_write_table_standard

Write table using standard SQL approach
job_task

Job Task
job_tasks

Job Tasks
is.libraries

Test if object is of class Libraries
sql_translation.DatabricksConnection

SQL translation environment for Databricks SQL
generate_temp_name

Generate unique temporary table/view name
db_write_table_volume

Write table using volume-based approach
for_each_task

For Each Task
db_workspace_mkdirs

Make a Directory (Workspaces)
dbplyr_edition.DatabricksConnection

Declare dbplyr API version for Databricks connections
read_env_var

Reads Environment Variables
git_source

Git Source for Job Notebook Tasks
in_databricks_nb

Detect if running within Databricks Notebook
is.condition_task

Test if object is of class ConditionTask
gcp_attributes

GCP Attributes
is.aws_attributes

Test if object is of class AwsAttributes
is.delta_sync_index

Test if object is of class DeltaSyncIndex
get_latest_dbr

Get Latest Databricks Runtime (DBR)
get_and_start_warehouse

Get and Start Warehouse
is.dbfs_storage_info

Test if object is of class DbfsStorageInfo
default_config_profile

Returns the default config profile
read_databrickscfg

Reads Databricks CLI Config
is.azure_attributes

Test if object is of class AzureAttributes
use_databricks_cfg

Returns whether or not to use a .databrickscfg file
is.embedding_source_column

Test if object is of class EmbeddingSourceColumn
is.lib_cran

Test if object is of class CranLibrary
is.job_task

Test if object is of class JobTaskSettings
is.email_notifications

Test if object is of class JobEmailNotifications
show,DatabricksDriver-method

Show method for DatabricksDriver
is.new_cluster

Test if object is of class NewCluster
is.sql_file_task

Test if object is of class SqlFileTask
is.library

Test if object is of class Library
lib_jar

Jar Library (Scala)
lib_maven

Maven Library (Scala)
lib_egg

Egg Library (Python)
show,DatabricksResult-method

Show method for DatabricksResult
is.notebook_task

Test if object is of class NotebookTask
is.sql_query_task

Test if object is of class SqlQueryTask
open_workspace

Connect to Databricks Workspace
is.run_job_task

Test if object is of class RunJobTask
lib_cran

Cran Library (R)
notebook_task

Notebook Task
show,DatabricksConnection-method

Show method for DatabricksConnection
s3_storage_info

S3 Storage Info
is.cron_schedule

Test if object is of class CronSchedule
python_wheel_task

Python Wheel Task
pipeline_task

Pipeline Task
is.valid_task_type

Test if object is of class JobTask
is.pipeline_task

Test if object is of class PipelineTask
is.vector_search_index_spec

Test if object is of class VectorSearchIndexSpec
remove_lib_path

Remove Library Path
is.gcp_attributes

Test if object is of class GcpAttributes
is.for_each_task

Test if object is of class ForEachTask
lib_whl

Wheel Library (Python)
spark_jar_task

Spark Jar Task
lib_pypi

PyPi Library (Python)
wait_for_lib_installs

Wait for Libraries to Install on Databricks Cluster
warehouse_id_from_http_path

Extract warehouse ID from an http_path
run_job_task

Run Job Task
sql_file_task

SQL File Task
spark_submit_task

Spark Submit Task
sql_query_task

SQL Query Task
spark_python_task

Spark Python Task
sql_table_analyze.DatabricksConnection

Handle table analysis for Databricks