sparklyr (version 1.3.1)

stream_read_delta: Read Delta Stream

Description

Reads a Delta Lake table as a Spark DataFrame stream.

Usage

stream_read_delta(sc, path, name = NULL, options = list(), ...)

Arguments

sc

A spark_connection.

path

The path to the Delta Lake table. It must be accessible from the cluster; the "hdfs://", "s3a://", and "file://" protocols are supported.

name

The name to assign to the newly generated stream.

options

A list of strings with additional options, passed through to the Delta streaming source (see the example after this argument list).

...

Optional arguments; currently unused.
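
For instance, the options list can carry Delta streaming source options such as maxFilesPerTrigger. A minimal sketch, assuming a Delta table already exists at "delta-test" and that the Delta Lake release in use supports this rate-limiting option:

# Throttle the stream to at most one newly discovered file per micro-batch
# (maxFilesPerTrigger is a Delta streaming source option; assumed supported here)
stream <- stream_read_delta(
  sc,
  path = "delta-test",
  options = list(maxFilesPerTrigger = "1")
)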

Details

Please note that Delta Lake support requires installing the appropriate package by setting the packages parameter to "delta" in spark_connect().
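
If a specific Delta Lake build is needed rather than the "delta" shorthand, full Maven coordinates can be passed to the packages parameter instead; the coordinates below are an illustrative assumption and must match the Scala and Spark versions of the cluster:

library(sparklyr)
# A hypothetical pin to one Delta Lake release via Maven coordinates,
# in place of the "delta" shorthand used in the example below
sc <- spark_connect(
  master = "local",
  version = "2.4",
  packages = "io.delta:delta-core_2.11:0.4.0"
)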

See Also

Other Spark stream serialization: stream_read_csv(), stream_read_json(), stream_read_kafka(), stream_read_orc(), stream_read_parquet(), stream_read_socket(), stream_read_text(), stream_write_console(), stream_write_csv(), stream_write_delta(), stream_write_json(), stream_write_kafka(), stream_write_memory(), stream_write_orc(), stream_write_parquet(), stream_write_text()

Examples

# NOT RUN {
library(sparklyr)

# Delta Lake support requires the "delta" package at connection time
sc <- spark_connect(master = "local", version = "2.4", packages = "delta")

# Write a small five-row table in the Delta format
sdf_len(sc, 5) %>% spark_write_delta(path = "delta-test")

# Stream the Delta table back in, writing each micro-batch out as JSON
stream <- stream_read_delta(sc, "delta-test") %>%
  stream_write_json("json-out")

# Stop the stream once it is no longer needed
stream_stop(stream)
# }
