Install drake from CRAN with install.packages('drake').

build_graph(): create the igraph dependency network of your project.
Makefile_recipe(): for make(..., parallelism = "Makefile"), see what your Makefile recipes will look like in advance.
backend(): choose the backend for make()'s "future"-based parallelism.
analyses(): deprecated; use plan_analyses().
as_file(): deprecated; use as_drake_filename().
as_drake_filename(): tell drake that you want information on a file (target or import), not an ordinary object.
check(): deprecated; use check_plan().
config(): deprecated; use drake_config().
default_parallelism(): the default parallelism argument to make() for your system.
cache_namespaces(): the storr cache namespaces used by drake.
cleaned_namespaces(): the storr namespaces that are cleaned during a call to clean().
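As a point of reference for the entries in this index, here is a minimal sketch of a drake workflow. The target names and the mtcars-based commands are invented for illustration; only drake_plan(), make(), readd(), and loadd() are assumed from the package.

```r
library(drake)

# A tiny workflow plan: two targets, the second depending on the first.
plan <- drake_plan(
  small = head(mtcars, 16),           # illustrative target
  model = lm(mpg ~ wt, data = small)  # depends on `small`
)

make(plan)    # build both targets and store them in the drake cache
readd(model)  # read a finished target back from the cache
loadd(small)  # or load a target into the calling environment
```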
The prework argument to make() runs setup code before any targets are built.
The command and args arguments to make() control how the generated Makefile is executed under make(..., parallelism = "Makefile"); default_system2_args() supplies the default args for that system2() call.
evaluate(): deprecated; use evaluate_plan().
drake_plan(): create a workflow plan data frame for the plan argument of make().
drake_session(): the sessionInfo() of the last call to make().
drake_config(): the internal configuration list of make(), computed as in a call to make() for which no targets get built and no imports get processed.
examples_drake(): deprecated; use drake_examples().
expand(): deprecated; use expand_plan().
gather(): deprecated; use gather_plan().
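The templating helpers above (evaluate_plan(), expand_plan(), gather_plan(), and their deprecated aliases) generate many similar targets from one command. A hedged sketch follows, assuming the wildcard/values interface of evaluate_plan() and the target/gather arguments of gather_plan(); simulate() is a hypothetical user function.

```r
library(drake)

simulate <- function(n) data.frame(x = rnorm(n))  # hypothetical user function

# One templated command with a wildcard in it.
template <- drake_plan(data = simulate(n__))

# Expand the wildcard into one target per value: data_16, data_32, data_64.
plan <- evaluate_plan(template, wildcard = "n__", values = c(16, 32, 64))

# Gather the expanded targets into a single combined target.
full_plan <- rbind(plan, gather_plan(plan, target = "all_data", gather = "rbind"))
full_plan
```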
message_sink_hook(): a hook argument to make() that redirects error messages to files.
expose_imports(): expose a package's internals so that make() can detect all the package's nested functions.
make_with_config(): run make() on an existing internal configuration list.
example_drake(): deprecated; use drake_example(), e.g. drake_example("basic").
parallel_stages(): make() will build your targets in successive parallelizable stages; see how targets are grouped into those stages.
max_useful_jobs(): suggest an upper bound for make(..., jobs = YOUR_CHOICE).
A Makefile is written and executed during make(..., parallelism = "Makefile"); see the example below.
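Here is a sketch of how the jobs and parallelism arguments to make() are typically combined for the staged parallelism described above. The plan is invented, and the hook call is shown commented out because it assumes the hook interface listed in this index.

```r
library(drake)

plan <- drake_plan(
  a  = rnorm(1000),
  b  = rnorm(1000),
  ab = c(a, b)  # depends on both `a` and `b`
)

# `a` and `b` share a parallelizable stage, so two jobs can build them together.
make(plan, jobs = 2)

# Makefile-based parallelism writes a Makefile and builds targets through it
# (requires a Unix-like system with GNU Make on the PATH).
# make(plan, parallelism = "Makefile", jobs = 2)

# Redirect per-target console output to files via the hook argument:
# make(plan, hook = output_sink_hook)
```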
output_sink_hook(): a hook argument to make() that redirects output messages to files.
plot_graph(): deprecated; use vis_drake_graph().
knitr_deps(): the dependencies declared in the knitr/rmarkdown source files of a workflow plan command.
possible_targets(): the legal values of the targets argument to make(), given a workflow plan data frame.
read_graph(): deprecated; use read_drake_graph().
plan(): deprecated; use drake_plan().
read_plan(): deprecated; use read_drake_plan().
read_config(): deprecated; use read_drake_config().
read_drake_config(): read the drake_config() list from the last make().
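The graphing functions above are easiest to see with a configured project. Below is a sketch assuming the drake_config()-based interface to vis_drake_graph(); the three user functions are defined inline so the example is self-contained.

```r
library(drake)

simulate_data <- function() data.frame(x = rnorm(100), y = rnorm(100))
clean_data    <- function(d) d[complete.cases(d), ]
fit_model     <- function(d) lm(y ~ x, data = d)

plan <- drake_plan(
  raw  = simulate_data(),
  tidy = clean_data(raw),
  fit  = fit_model(tidy)
)

config <- drake_config(plan)  # internal configuration list; no targets are built
vis_drake_graph(config)       # interactive dependency graph of the project

# After make(plan), the configuration of that run can be recovered:
# config_last <- read_drake_config()
```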
render_graph(): deprecated; use render_drake_graph(), which renders the graph from dataframes_graph().
silencer_hook(): a hook argument to make() that redirects output and error messages to files.
rescue_cache(): try to repair a drake cache that throws storr-related errors.
plan_drake(): deprecated; use drake_plan().
session(): deprecated; use drake_session().
shell_file(): write the shell.sh file required by make(..., parallelism = 'Makefile', prepend = 'SHELL=./shell.sh').
summaries(): deprecated; use plan_summaries().
target_namespaces(): the storr cache namespaces that store target-level information.
workflow(), workplan(): deprecated; use drake_plan().
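Finally, a sketch of the Makefile setup and cache utilities listed above, assuming shell_file() writes the shell.sh helper and that the namespace and rescue functions behave as described; the plan itself is invented.

```r
library(drake)

plan <- drake_plan(
  x = rnorm(1e6),
  y = rnorm(1e6),
  z = summary(c(x, y))
)

# shell_file() writes shell.sh, and prepend = "SHELL=./shell.sh" points the
# generated Makefile at it, one recipe per target.
shell_file()
make(plan, parallelism = "Makefile", jobs = 2, prepend = "SHELL=./shell.sh")

# Inspect the storr-backed cache.
cache_namespaces()    # all namespaces drake uses
target_namespaces()   # namespaces that store target-level information
# rescue_cache()      # attempt a repair if the cache throws storr-related errors
```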