For use with compile_package_jars. The Spark compilation specification is
used when compiling Spark extension Java Archives, and defines which
versions of Spark, as well as which versions of Scala, should be used for
compilation.
spark_compilation_spec(spark_version = NULL, spark_home = NULL,
  scalac_path = NULL, scala_filter = NULL, jar_name = NULL)
spark_version: The Spark version to build against. This can be left unset
if the path to a suitable Spark home is supplied.

spark_home: The path to a Spark home installation. This can be left unset
if spark_version is supplied; in such a case, sparklyr will attempt to
discover the associated Spark installation using spark_home_dir.

scalac_path: The path to the scalac compiler to be used during
compilation of your Spark extension. Note that you should ensure the
version of scalac selected matches the version of scalac used with the
version of Spark you are compiling against.

scala_filter: An optional function used to filter which scala files are
used during compilation. This can be useful if you have auxiliary files
that should only be included with certain versions of Spark.

jar_name: The name to be assigned to the generated jar.

Most Spark extensions won't need to define their own compilation
specification, and can instead rely on the default behavior of
compile_package_jars.
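For extensions that do need a custom specification, a minimal sketch of how one might be constructed and passed to compile_package_jars is shown below; the version numbers, jar name, and filter logic are illustrative assumptions, not defaults.

```r
library(sparklyr)

# Hypothetical example: build against Spark 2.4 with Scala 2.11.
# find_scalac() locates a scalac binary for the requested Scala version.
spec <- spark_compilation_spec(
  spark_version = "2.4.0",
  scalac_path   = find_scalac("2.11"),
  # Illustrative filter: exclude auxiliary sources meant for other
  # Spark versions (file naming convention assumed here).
  scala_filter  = function(path) !grepl("spark-1.6", path),
  jar_name      = "myextension-2.4-2.11.jar"
)

# Compile the package jars using the custom specification.
compile_package_jars(spec = spec)
```

Omitting the spec argument falls back to the default compilation behavior, which is sufficient for most extensions.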