drake is on CRAN, so you can install it with install.packages('drake'). The changes below add new functions that support make() and rename much of the existing interface.
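As a quick orientation, here is a sketch of installing drake and exploring the built-in basic example. The load_basic_example() call and the my_plan object it creates are assumptions about how the shipped example is loaded, not something documented in these notes.

```r
# Install drake from CRAN and look at the basic example.
install.packages("drake")
library(drake)

drake_example("basic")   # Write the basic example's script files to ./basic/.
load_basic_example()     # Assumption: loads the example's workflow plan, my_plan.
make(my_plan)            # Build every target in the plan.
```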
New functions help you set up, run, and inspect a workflow:

- drake_plan() creates the workflow plan data frame for the plan argument of make().
- drake_config() computes the internal configuration list of your project. It behaves like a make() for which no targets get built and no imports get processed.
- make_with_config() runs make() on an existing internal configuration list.
- read_drake_config() reads the internal configuration list from the last make().
- drake_session() returns the sessionInfo() of the last call to make().
- possible_targets() lists the choices for the targets argument to make(), given a workflow plan data frame.
- drake_example("basic") writes the files of the built-in basic example to your working directory, and drake_examples() lists all the available examples.
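Here is a sketch of how these pieces could fit together; the toy plan, its target names, and the order of the calls are illustrative assumptions rather than a prescribed workflow.

```r
library(drake)

# A tiny workflow plan: each row pairs a target with the command that builds it.
my_plan <- drake_plan(
  raw   = mtcars,
  model = lm(mpg ~ wt, data = raw),
  coefs = coef(model)
)

# Compute the internal configuration list without building anything,
# then run the build step separately.
config <- drake_config(my_plan)
make_with_config(config)

# Inspect the last run.
last_config <- read_drake_config()
drake_session()  # sessionInfo() of the last call to make()
```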
Other helpers support high-performance computing with make():

- default_parallelism() returns the default parallelism argument to make() for your system.
- max_useful_jobs() suggests an upper bound on the number of jobs worth assigning in make(..., jobs = YOUR_CHOICE).
- default_Makefile_command() returns the default command argument to make(), and default_Makefile_args() returns the default args argument, both for make(..., parallelism = "Makefile").
- Makefile_recipe() lets you see in advance what your Makefile recipes will look like during make(..., parallelism = "Makefile").
- shell_file() writes the shell.sh file required by make(..., parallelism = 'Makefile', prepend = 'SHELL=./shell.sh').
- During make(..., parallelism = "Makefile"), drake writes a Makefile to your working directory.
- The prework argument to make() supplies setup code that runs before any targets get built.
- A new "future" backend, powered by the future package, extends the choices for the parallelism argument to make().
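Below is a sketch of a Makefile-powered run. It assumes GNU Make is available, that calling Makefile_recipe() with no arguments previews a generic recipe, and that my_plan is your workflow plan data frame; the jobs value is arbitrary.

```r
library(drake)

# Check the defaults before committing to a parallel build.
default_parallelism()   # Default parallelism backend for this system.
Makefile_recipe()       # Preview what a Makefile recipe will look like.

# Write the shell.sh wrapper and run a Makefile-powered parallel make().
shell_file()
make(
  my_plan,
  parallelism = "Makefile",
  jobs = 2,
  prepend = "SHELL=./shell.sh"
)
```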
The hook argument to make() wraps the code that builds each target, and three ready-made hooks come with drake:

- silencer_hook() is a hook argument to make() that redirects output and error messages to files.
- output_sink_hook() is a hook argument to make() that redirects output messages to files.
- message_sink_hook() is a hook argument to make() that redirects error messages to files.
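A sketch of the hooks in use, assuming the hook functions are passed directly to the hook argument of make() and that my_plan is your workflow plan data frame.

```r
library(drake)

# Redirect both printed output and error messages to files while targets build.
make(my_plan, hook = silencer_hook)

# Capture only printed output.
make(my_plan, hook = output_sink_hook)

# Capture only error messages.
make(my_plan, hook = message_sink_hook)
```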
More functions cover the dependency graph, file targets, and the cache:

- vis_drake_graph() displays the dependency network of your project and shows how make() will build your targets in successive parallelizable stages.
- dataframes_graph() returns the node and edge data frames behind vis_drake_graph().
- build_drake_graph() returns the igraph dependency network of your project.
- as_drake_filename() tells drake that you want information on a file (target or import), not an ordinary object.
- Dependencies are now detected in the knitr/rmarkdown source files of a workflow plan command.
- target_namespaces() lists the storr cache namespaces that store target-level information, cleaned_namespaces() lists the storr namespaces that are cleaned during a call to clean(), and cache_namespaces() lists all the storr cache namespaces used by drake.
- When a package belongs to your workflow's dependencies, make() can detect all the package's nested functions.
- The drake cache gets extra safeguards against storr-related errors.
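A sketch of inspecting the graph and the cache. Whether vis_drake_graph() and dataframes_graph() take the configuration list rather than the plan itself is an assumption here, and my_plan again stands for your workflow plan data frame.

```r
library(drake)

config <- drake_config(my_plan)

# Interactive dependency network of the project.
vis_drake_graph(config)

# The same information as plain data structures.
graph_frames <- dataframes_graph(config)    # node and edge data frames
igraph_net   <- build_drake_graph(my_plan)  # igraph object of the dependency network

# Inspect the storr namespaces behind the cache.
cache_namespaces()    # every namespace drake uses
target_namespaces()   # namespaces holding target-level information
cleaned_namespaces()  # namespaces wiped by clean()
```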
The following functions are deprecated and renamed:

- analyses() is now plan_analyses().
- as_file() is now as_drake_filename().
- backend() is replaced by future::plan().
- build_graph() is now build_drake_graph().
- check() is now check_plan().
- config() is now drake_config().
- default_system2_args() is now default_Makefile_args().
- evaluate() is now evaluate_plan().
- example_drake() and examples_drake() are now drake_example() and drake_examples().
- expand() is now expand_plan().
- gather() is now gather_plan().
- plan(), workflow(), workplan(), and plan_drake() are now drake_plan().
- plot_graph() is now vis_drake_graph().
- read_config() is now read_drake_config().
- read_graph() is now read_drake_graph().
- read_plan() is now read_drake_plan().
- render_graph() is now render_drake_graph().
- session() is now drake_session().
- summaries() is now plan_summaries().
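To make the renaming concrete, here is a before/after sketch; simulate() and fit() are placeholder user functions, not part of drake.

```r
library(drake)

# Placeholder user functions for the sketch.
simulate <- function(n) data.frame(x = rnorm(n), y = rnorm(n))
fit      <- function(data) lm(y ~ x, data = data)

# Old interface (now deprecated):
# my_plan <- plan(data = simulate(48), model = fit(data))
# plot_graph(my_plan)

# Current interface:
my_plan <- drake_plan(
  data  = simulate(48),
  model = fit(data)
)
vis_drake_graph(drake_config(my_plan))
```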