spark_default_compilation_spec
From sparklyr v0.8.4
by Javier Luraschi
Default Compilation Specification for Spark Extensions
This is the default compilation specification used for Spark extensions when they are compiled with compile_package_jars().
Usage
spark_default_compilation_spec(pkg = infer_active_package_name(),
locations = NULL)
Arguments
- pkg
The package containing Spark extensions to be compiled.
- locations
Additional locations to scan. By default, the directories /opt/scala and /usr/local/scala will be scanned.
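The returned specification is typically passed to compile_package_jars(). A minimal sketch follows, assuming the package sources are the active working directory; the extra Scala location supplied below is hypothetical and only illustrates the locations argument.
library(sparklyr)
# Build the default compilation spec for the active package,
# scanning an additional (hypothetical) Scala installation directory
spec <- spark_default_compilation_spec(
  locations = "/opt/scala-2.11"
)
# Compile the package's Spark extension JARs using that spec
compile_package_jars(spec = spec)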