Define a Spark Compilation Specification
For use with compile_package_jars(). A Spark compilation specification is used when compiling Spark extension Java Archives (JARs), and defines which versions of Spark and Scala should be used for compilation.
Usage

  spark_compilation_spec(
    spark_version = NULL,
    spark_home = NULL,
    scalac_path = NULL,
    scala_filter = NULL,
    jar_name = NULL,
    jar_path = NULL,
    jar_dep = NULL,
    embedded_srcs = "embedded_sources.R"
  )
Arguments

spark_version: The Spark version to build against. This can be left unset if the path to a suitable Spark home is supplied.
spark_home: The path to a Spark home installation. This can be left unset if spark_version is supplied; in that case, sparklyr will attempt to discover the associated Spark installation using spark_home_dir().
scalac_path: The path to the scalac compiler to be used during compilation of your Spark extension. Note that you should ensure the selected version of scalac matches the Scala version used by the version of Spark you are compiling against.
scala_filter: An optional R function that can be used to filter which Scala files are used during compilation. This can be useful if you have auxiliary files that should only be included with certain versions of Spark (see the sketch after this list).
jar_name: The name to be assigned to the generated jar.
jar_path: The path to the jar tool to be used during compilation of your Spark extension.
jar_dep: An optional list of additional jar dependencies.
embedded_srcs: Embedded source file(s) under <R package root>/java to be included in the root of the resulting jar file as resources.
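As a reference for scala_filter, here is a minimal sketch. It assumes the filter receives and returns a character vector of Scala source file paths; the file-naming convention used to distinguish Spark versions is a hypothetical example, not a sparklyr requirement.

  # Hypothetical filter: exclude sources written for Spark 3.x when
  # compiling against older Spark versions. Assumes `files` is a
  # character vector of .scala file paths.
  legacy_filter <- function(files) {
    files[!grepl("spark-3", files, fixed = TRUE)]
  }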
Details

Most Spark extensions won't need to define their own compilation specification, and can instead rely on the default behavior of compile_package_jars().
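For extensions that do need a custom specification, a minimal sketch follows. The Spark version, compiler path, and jar name are illustrative placeholders, not defaults, and the sketch assumes compile_package_jars() accepts a list of specifications through its spec argument.

  library(sparklyr)

  # Build one custom specification; all values below are placeholders.
  spec <- spark_compilation_spec(
    spark_version = "2.4.0",
    scalac_path = "/usr/local/bin/scalac",
    jar_name = "sparkext-2.4-2.11.jar"
  )

  # Compile the extension jar using the custom specification.
  compile_package_jars(spec = list(spec))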