Cache a Spark Table
Force the Spark table named `name` to be loaded into memory.
Operations on cached tables should normally (although not always)
be more performant than the same operations performed on uncached tables.
tbl_cache(sc, name, force = TRUE)
- sc: A `spark_connection`.
- name: The table name.
- force: Force the data to be loaded into memory? This is accomplished
by calling the `count` API on the associated Spark DataFrame.
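A minimal sketch of how `tbl_cache` might be used, assuming a local Spark installation is available via `spark_connect(master = "local")` and using the built-in `mtcars` dataset for illustration:

```r
library(sparklyr)
library(dplyr)

# Connect to a local Spark instance (assumes Spark is installed locally)
sc <- spark_connect(master = "local")

# Copy a small dataset into Spark as a table named "mtcars"
mtcars_tbl <- copy_to(sc, mtcars, "mtcars")

# Cache the table in memory; with the default force = TRUE, a count
# is triggered on the underlying Spark DataFrame to materialize it
tbl_cache(sc, "mtcars")

# Subsequent operations read from the in-memory cache
mtcars_tbl %>% summarise(avg_mpg = mean(mpg))

# Release the cached table when it is no longer needed
tbl_uncache(sc, "mtcars")

spark_disconnect(sc)
```

Caching is most useful when the same table is queried repeatedly; for a table read only once, the extra materialization step may not pay off.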