safe.write makes repeated calls to write.table to avoid
exceeding memory limits for very large data.frames.

safe.write(value, file, batch, row.names = TRUE, ..., sep = ',',
           eol = '\n', quote.string = FALSE)
- value
{a data.frame.}
- file
{a file object (connection, file name, etc.).}
- batch
{maximum number of rows to write at a time.}
- ...
{any other arguments are passed to write.table.}
- sep
{field separator passed to write.table.}
- eol
{end of line character passed to write.table.}
- quote.string
{logical value passed to write.table.}
The function has a while loop that invokes write.table on
successive subsets of batch rows of value. Since this is
a helper function for mysqlWriteTable, the other arguments
to write.table are hardcoded.
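To make the batching concrete, here is a minimal sketch of such a loop.
The helper name batched.write and its header handling are illustrative
assumptions; the real safe.write hardcodes the remaining write.table
arguments to suit mysqlWriteTable, so treat this as an illustration, not
the actual source:

# Sketch only: write `value` in chunks of at most `batch` rows so that
# write.table never materializes the character representation of the
# whole data.frame at once.
batched.write <- function(value, file, batch = 10000, row.names = TRUE,
                          ..., sep = ',', eol = '\n', quote.string = FALSE) {
  from <- 1
  while (from <= nrow(value)) {
    to <- min(from + batch - 1, nrow(value))
    write.table(value[from:to, , drop = FALSE], file = file,
                append = from > 1,       # append after the first chunk
                col.names = from == 1,   # write the header only once
                sep = sep, eol = eol, quote = quote.string,
                row.names = row.names, ...)
    from <- to + 1
  }
  invisible(NULL)
}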
Returns NULL, invisibly.
No error checking whatsoever is done.

ctr.file <- file("dump.sqloader", "w")
safe.write(big.data, file = ctr.file, batch = 25000)
close(ctr.file)
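Writing in fixed-size batches bounds the transient character
representation that write.table builds to roughly batch rows at a time
rather than the whole data.frame; a larger batch means fewer calls to
write.table, at the cost of a larger peak allocation per call.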