Find a given Spark installation by version.
Install versions of Spark for use with local Spark connections
(i.e. spark_connect(master = "local")).
Usage

spark_install_find(version = NULL, hadoop_version = NULL,
  installed_only = TRUE, latest = FALSE, hint = FALSE)

spark_install(version = NULL, hadoop_version = NULL, reset = TRUE,
  logging = "INFO", verbose = interactive())
spark_uninstall(version, hadoop_version)
spark_install_dir()
spark_install_tar(tarfile)
spark_installed_versions()
spark_available_versions(show_hadoop = FALSE)
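As a sketch of how the query helpers above fit together (assuming the sparklyr package is installed; spark_available_versions() needs network access):

```r
library(sparklyr)

# List the Spark versions sparklyr can download (returned as a data frame);
# show_hadoop = TRUE also lists the matching Hadoop builds.
spark_available_versions(show_hadoop = TRUE)

# List the Spark/Hadoop versions already installed locally.
spark_installed_versions()
```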
Arguments

version            Version of Spark to install. See spark_available_versions
                   for a list of supported versions.

hadoop_version     Version of Hadoop to install. See spark_available_versions
                   for a list of supported versions.

installed_only     Search only the locally installed versions?

latest             Check for the latest version?

hint               On failure, should the installation code be provided?

reset              Attempts to reset settings to defaults.

logging            Logging level to configure during install. Supported
                   options: "WARN", "INFO".

verbose            Report information as Spark is downloaded and installed.

tarfile            Path to a TAR file conforming to the pattern
                   spark-###-bin-(hadoop)?###, where ### references the Spark
                   and Hadoop versions respectively.

show_hadoop        Show Hadoop distributions?
Value

List with information about the installed version.
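A minimal end-to-end sketch, assuming sparklyr is installed and network access is available (the "2.4.3" version string is illustrative; pick one returned by spark_available_versions()):

```r
library(sparklyr)

# Download and install a specific Spark build under spark_install_dir().
spark_install(version = "2.4.3")

# Locate the installation that was just laid down; returns a list
# with information about the installed version.
install_info <- spark_install_find(version = "2.4.3")

# Use the local installation with a local Spark connection.
sc <- spark_connect(master = "local", version = "2.4.3")
spark_disconnect(sc)
```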