Usage
DFS_cat( file, con = stdout(), henv = hive() )
DFS_delete( file, recursive = FALSE, henv = hive() )
DFS_dir_create( path, henv = hive() )
DFS_dir_exists( path, henv = hive() )
DFS_dir_remove( path, recursive = TRUE, henv = hive() )
DFS_file_exists( file, henv = hive() )
DFS_get_object( file, henv = hive() )
DFS_read_lines( file, n = -1L, henv = hive() )
DFS_list( path = ".", henv = hive() )
DFS_tail( file, n = 6L, size = 1024L, henv = hive() )
DFS_put( files, path = ".", henv = hive() )
DFS_put_object( obj, file, henv = hive() )
DFS_write_lines( text, file, henv = hive() )
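For orientation, the following is a minimal sketch of the file and directory operations, assuming the package is attached and a running Hadoop/DFS installation is picked up by hive(); the DFS paths used are hypothetical.

henv <- hive()                                   # local Hadoop configuration

DFS_dir_create( "examples", henv = henv )        # create a directory on the DFS
DFS_write_lines( c("Hello", "DFS"), "examples/greeting.txt", henv = henv )
DFS_file_exists( "examples/greeting.txt", henv = henv )   # TRUE
DFS_list( "examples", henv = henv )              # list directory contents
DFS_cat( "examples/greeting.txt", henv = henv )  # print the file to stdout
DFS_dir_remove( "examples", recursive = TRUE, henv = henv )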
Arguments
henv
an object containing the local Hadoop configuration.
file
a character string representing a file on the DFS.
files
a character vector naming files on the local file system that are
to be copied to the DFS.
n
an integer specifying the number of lines to read.
obj
an R object to be serialized to the DFS.
path
a character string representing a full path name in the
DFS (without the leading hdfs://); for many functions the
default corresponds to the user's home directory in the DFS.
recursive
logical. Should directories be deleted recursively, i.e., including
their contents?
size
an integer specifying the number of bytes to be read. It must be
sufficiently large, otherwise n does not have the desired effect.
text
a (vector of) character string(s) to be written to the DFS.
con
a connection to be used for printing the output of DFS_cat. Defaults
to the standard output connection; currently this argument has no
other effect.
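The following sketch (again assuming a reachable DFS; object and path names are made up) illustrates how obj, n, size and recursive are used:

henv <- hive()

## serialize an R object to the DFS and read it back
obj <- list( a = 1:3, b = "some text" )
DFS_put_object( obj, "examples/obj.ser", henv = henv )
identical( DFS_get_object( "examples/obj.ser", henv = henv ), obj )   # TRUE

## line-oriented reading
DFS_write_lines( sprintf("line %d", 1:100), "examples/lines.txt", henv = henv )
DFS_read_lines( "examples/lines.txt", n = 5L, henv = henv )           # at most 5 lines
DFS_tail( "examples/lines.txt", n = 6L, size = 2048L, henv = henv )   # last 6 lines;
                                                                      # size must cover them
DFS_delete( "examples", recursive = TRUE, henv = henv )               # clean up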