spark_install_find
Find a given Spark installation by version.
Install versions of Spark for use with local Spark connections (i.e. spark_connect(master = "local")).
Keywords: internal
Usage
spark_install_find(
  version = NULL,
  hadoop_version = NULL,
  installed_only = TRUE,
  latest = FALSE,
  hint = FALSE
)

spark_install(
  version = NULL,
  hadoop_version = NULL,
  reset = TRUE,
  logging = "INFO",
  verbose = interactive()
)

spark_uninstall(version, hadoop_version)

spark_install_dir()

spark_install_tar(tarfile)

spark_installed_versions()

spark_available_versions(
  show_hadoop = FALSE,
  show_minor = FALSE,
  show_future = FALSE
)
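A minimal sketch of a typical workflow combining these functions with spark_connect() and spark_disconnect(); the Spark and Hadoop version strings below are assumptions chosen for illustration:

library(sparklyr)

# Install a specific Spark build for local use (version strings are illustrative)
spark_install(version = "3.4", hadoop_version = "3")

# Connect to the locally installed Spark
sc <- spark_connect(master = "local", version = "3.4")

# ... work with the connection ...

spark_disconnect(sc)

# Remove the installation when it is no longer needed
spark_uninstall(version = "3.4", hadoop_version = "3")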
Arguments
- version
Version of Spark to install. See spark_available_versions for a list of supported versions.
- hadoop_version
Version of Hadoop to install. See spark_available_versions for a list of supported versions.
- installed_only
Search only the locally installed versions?
- latest
Check for latest version?
- hint
On failure, should the installation code be provided?
- reset
Attempts to reset settings to defaults.
- logging
Logging level to configure the install. Supported options: "WARN", "INFO".
- verbose
Report information as Spark is downloaded and installed.
- tarfile
Path to a TAR file conforming to the pattern spark-###-bin-(hadoop)?###, where ### references the Spark and Hadoop versions, respectively.
- show_hadoop
Show Hadoop distributions?
- show_minor
Show minor Spark versions?
- show_future
Should future versions that have not yet been released be shown?
Value
List with information about the installed version.
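As an illustration of the lookup helpers, the following sketch lists available and installed versions and then locates one installation; the version string passed to spark_install_find() is an assumption:

library(sparklyr)

# List Spark versions that can be installed (optionally with Hadoop builds)
spark_available_versions(show_hadoop = TRUE)

# Directory under which Spark versions are installed
spark_install_dir()

# List versions already installed locally
spark_installed_versions()

# Locate a specific local installation (the "3.4" version string is illustrative)
info <- spark_install_find(version = "3.4")
str(info)  # inspect the components of the returned list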