Write a data.frame avoiding exceeding memory limits

This function batches calls to write.table to avoid exceeding memory limits for very large data.frames.

Usage

safe.write(value, file, batch, row.names = TRUE, ...,
  sep = ',', eol = '', quote.string = FALSE)
Arguments

value: a data.frame.
file: a file object (connection, file name, etc.).
batch: maximum number of rows to write at a time.
...: any other arguments are passed to write.table.
sep: field separator passed to write.table.
eol: end-of-line character passed to write.table.
quote.string: logical value passed to write.table.
Details

The function uses a while loop to invoke write.table on successive subsets of batch rows of value. Since this is a helper function for mysqlWriteTable, the other arguments to write.table are hardcoded.
Value

NULL, invisibly. No error checking whatsoever is done.
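The batching loop described in the details above can be sketched as follows. This is a minimal illustration, not the actual RSQLite source; the function name batched.write and the specific hardcoded write.table arguments are assumptions made for the example.

```r
## Sketch only: re-implements the batching idea from the details section.
## Writing in chunks means only one chunk's worth of formatted text is
## held in memory at a time, instead of the whole data.frame.
batched.write <- function(value, file, batch = 10000, sep = ",") {
  n <- nrow(value)
  from <- 1
  while (from <= n) {
    to <- min(from + batch - 1, n)
    write.table(value[from:to, , drop = FALSE], file,
                append = from > 1,      # append every chunk after the first
                col.names = from == 1,  # write the header only once
                row.names = FALSE, sep = sep, quote = FALSE)
    from <- to + 1
  }
  invisible(NULL)                       # mirrors safe.write's return value
}
```

Each pass through the loop formats and writes at most batch rows, so peak memory use is bounded by the chunk size rather than by nrow(value).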


Examples

ctr.file <- file("dump.sqloader", "w")
safe.write(, file = ctr.file, batch = 25000)

  • safe.write
Documentation reproduced from package RSQLite, version 0.8-2, License: LGPL (>= 2)
