spark_install_find


Find a given Spark installation by version.

Install versions of Spark for use with local Spark connections (i.e., spark_connect(master = "local")).
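For example, a typical workflow installs a Spark distribution and then points a local connection at it. A minimal sketch (the version numbers below are illustrative; check spark_available_versions() for what is actually supported):

```r
library(sparklyr)

# Download and install a specific Spark/Hadoop combination
# into spark_install_dir() (versions shown are examples only)
spark_install(version = "2.3.0", hadoop_version = "2.7")

# Connect to a local cluster using the installed version
sc <- spark_connect(master = "local", version = "2.3.0")

spark_disconnect(sc)
```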

Keywords
internal
Usage
spark_install_find(version = NULL, hadoop_version = NULL,
  installed_only = TRUE, latest = FALSE, hint = FALSE)

spark_install(version = NULL, hadoop_version = NULL, reset = TRUE,
  logging = "INFO", verbose = interactive())

spark_uninstall(version, hadoop_version)

spark_install_dir()

spark_install_tar(tarfile)

spark_installed_versions()

spark_available_versions(show_hadoop = FALSE)

Arguments
version

Version of Spark to install. See spark_available_versions for a list of supported versions.

hadoop_version

Version of Hadoop to install. See spark_available_versions for a list of supported versions.

installed_only

Search only the locally installed versions?

latest

Check for latest version?

hint

On failure should the installation code be provided?

reset

Attempts to reset settings to defaults.

logging

Logging level to configure install. Supported options: "WARN", "INFO"

verbose

Report information as Spark is downloaded and installed.

tarfile

Path to a TAR file conforming to the pattern spark-###-bin-(hadoop)?###, where ### references the Spark and Hadoop versions, respectively.

show_hadoop

Show Hadoop distributions?

Value

List with information about the installed version.
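As a sketch, the discovery helpers can be used to inspect local installations (output depends on what is installed on the machine; the version numbers are illustrative):

```r
library(sparklyr)

# Data frame of locally installed Spark/Hadoop versions
spark_installed_versions()

# Locate a specific installation; returns a list describing
# the installed version (e.g., its installation directory)
info <- spark_install_find(version = "2.3.0", hadoop_version = "2.7")
str(info)
```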

Aliases
  • spark_install_find
  • spark_install
  • spark_uninstall
  • spark_install_dir
  • spark_install_tar
  • spark_installed_versions
  • spark_available_versions
Documentation reproduced from package sparklyr, version 0.9.2, License: Apache License 2.0 | file LICENSE
