spark_default_compilation_spec


Default Compilation Specification for Spark Extensions

The default compilation specification used when compiling Spark extensions with compile_package_jars().

Usage
spark_default_compilation_spec(pkg = infer_active_package_name(),
  locations = NULL)
Arguments
pkg

The package containing Spark extensions to be compiled.

locations

Additional locations to scan. By default, the directories /opt/scala and /usr/local/scala will be scanned.
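
Examples

A minimal usage sketch, assuming the working directory is an R package that bundles Scala sources for a Spark extension (the package name and the extra scan path below are hypothetical):

```r
library(sparklyr)

# Build a compilation spec for the package, adding one extra
# directory to scan for Scala installations beyond the defaults
# (/opt/scala and /usr/local/scala).
spec <- spark_default_compilation_spec(
  pkg = "mysparkext",               # hypothetical package name
  locations = "/opt/local/scala"    # hypothetical additional location
)

# Compile the package's Scala sources into JARs using this spec.
compile_package_jars(spec = spec)
```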

Aliases
  • spark_default_compilation_spec
Documentation reproduced from package sparklyr, version 0.7.0, License: Apache License 2.0 | file LICENSE
