BatchJobs (version 1.8)

makeClusterFunctionsSSH: Create an SSH cluster to execute jobs.

Description

Worker nodes must share the same file system and be accessible via ssh without manually entering passwords, e.g. through ssh-agent or a passwordless public key. Note that you can also use this function to parallelize across multiple cores on your local machine, but in that case you still have to run an ssh server and provide passwordless access to localhost.
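
For the local-machine case just described, a minimal sketch might look like the following. The core count is an assumption, and a running sshd with passwordless key authentication on localhost is a prerequisite, not something this call sets up:

library(BatchJobs)

# Sketch: treat the local machine as a single SSH worker.
# ncpus = 4 is an assumed core count; adjust to your hardware.
cluster.functions = makeClusterFunctionsSSH(
  makeSSHWorker(nodename = "localhost", ncpus = 4))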

Usage

makeClusterFunctionsSSH(..., workers)

Arguments

...

[SSHWorker] Worker objects, all created with makeSSHWorker.

workers

[list of SSHWorker] Alternative way to pass workers.
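
Both calling styles hand over the same workers; the node names below are hypothetical. A minimal sketch:

# Workers passed one by one via '...':
cf = makeClusterFunctionsSSH(
  makeSSHWorker(nodename = "node1"),
  makeSSHWorker(nodename = "node2"))

# The same workers passed as a list via 'workers':
cf = makeClusterFunctionsSSH(workers = list(
  makeSSHWorker(nodename = "node1"),
  makeSSHWorker(nodename = "node2")))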

Value

[ClusterFunctions].

See Also

makeSSHWorker

Other clusterFunctions: makeClusterFunctionsInteractive, makeClusterFunctionsLSF, makeClusterFunctionsLocal, makeClusterFunctionsMulticore, makeClusterFunctionsOpenLava, makeClusterFunctionsSGE, makeClusterFunctionsSLURM, makeClusterFunctionsTorque, makeClusterFunctions

Examples

## Not run:
# Assume you have three nodes larry, curley and moe. All have 6
# CPU cores. On curley and moe, R is installed under
# "/opt/R/R-current"; on larry, R is installed under
# "/usr/local/R". larry should not be used extensively because
# somebody else needs to compute on it as well, so we cap it at
# two concurrent jobs via 'max.jobs'. A call to
# 'makeClusterFunctionsSSH' might then look like this:

cluster.functions = makeClusterFunctionsSSH(
  makeSSHWorker(nodename = "larry", rhome = "/usr/local/R", max.jobs = 2),
  makeSSHWorker(nodename = "curley", rhome = "/opt/R/R-current"),
  makeSSHWorker(nodename = "moe", rhome = "/opt/R/R-current"))
## End(Not run)
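
The returned object takes effect once it is registered in the BatchJobs configuration, either by assigning 'cluster.functions' in your .BatchJobs.R file or via setConfig. An end-to-end sketch; the registry id and file directory are illustrative assumptions:

setConfig(cluster.functions = cluster.functions)

reg = makeRegistry(id = "ssh_demo", file.dir = "ssh_demo-files")
batchMap(reg, function(x) x^2, 1:10)  # 10 toy jobs: square each input
submitJobs(reg)                       # dispatch to larry, curley and moe
waitForJobs(reg)                      # block until all jobs finish
results = reduceResultsVector(reg)    # collect the 10 squared values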
