Writing an agent to disk with agent_write() is good practice for keeping
data validation intel close at hand for later retrieval (with
agent_read()). By default, any data table associated with the agent will be
expunged before the agent is committed to disk. This behavior can be changed
by setting keep_tbl to TRUE, but that option only works when the table is not
of the tbl_dbi or tbl_spark class.
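For example, a minimal write/read round trip might look like the sketch
below. It assumes the small_table dataset bundled with pointblank; the
validation step and file name are illustrative, and the call to agent_read()
assumes it accepts the file name as its first argument.

library(pointblank)

# Create an agent for a local data table, add a validation step, and
# interrogate the target table
agent <-
  create_agent(tbl = small_table) %>%
  col_vals_gt(columns = vars(d), value = 100) %>%
  interrogate()

# Commit the agent to disk (the data table is removed by default)
agent_write(agent, filename = "agent-small_table.rds")

# Later, retrieve the agent from disk
agent <- agent_read("agent-small_table.rds")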
agent_write(agent, filename, path = NULL, keep_tbl = FALSE)

agent: An agent object of class ptblank_agent that is created with
create_agent().

filename: The file name to create on disk for the agent.

path: An optional path to which the file should be saved (combined with
filename).

keep_tbl: An option to keep a data table that is associated with the agent
(which is the case when the agent is created using
create_agent(tbl = <data table>, ...)). The default is FALSE, where the data
table is removed before writing to disk. For database tables of the class
tbl_dbi and for Spark DataFrames (tbl_spark), the table is always removed
(even if keep_tbl is set to TRUE).
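As a sketch (assuming the agent holds a local data frame rather than a
tbl_dbi or tbl_spark table, and using an illustrative path and file name),
the table can be retained on write like this:

# Keep the associated data table when committing the agent to disk
agent_write(
  agent,
  filename = "agent-with-tbl.rds",
  path = "validation_agents",
  keep_tbl = TRUE
)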
It can be helpful to set a table-reading function so that the agent can be
reused after it is read from disk with agent_read(). This can be done with
the read_fn argument of create_agent() or, later, with set_read_fn().
Alternatively, we can reintroduce the agent to a data table with the
set_tbl() function.
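The sketch below illustrates both approaches. The reading function, column
name, and file names are illustrative, and set_read_fn() and set_tbl() are
assumed to take the agent as their first argument.

# Option 1: supply a table-reading function at creation time so the table
# can be re-acquired after agent_read()
agent <-
  create_agent(read_fn = function() small_table) %>%
  col_vals_not_null(columns = vars(a)) %>%
  interrogate()

agent_write(agent, filename = "agent-small_table.rds")
agent <- agent_read("agent-small_table.rds")

# The reading function can also be set (or replaced) after the fact
agent <- set_read_fn(agent, read_fn = function() small_table)

# Option 2: reassociate a data table directly with the agent
agent <- set_tbl(agent, tbl = small_table)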