Rscript

require(multicore)

# Use the first trailing command-line argument as the core count,
# defaulting to 1 when the script is run without arguments.
# commandArgs() returns character strings, so coerce to integer.
args <- commandArgs(trailingOnly = TRUE)
cores <- if (length(args) > 0) as.integer(args[1]) else 1
cat(sprintf("Multicore functions running on maximum %d cores\n", cores))

## Multicore functions running on maximum 1 cores
Here cores defaults to 1, but when the file is run as an Rscript, the desired number of cores can be passed as a command-line argument.
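
Note that commandArgs() returns character strings, which is why the value is coerced with as.integer() above. A quick check from the shell:

$ Rscript -e 'str(commandArgs(trailingOnly = TRUE))' 12

## chr "12"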

Add mc.cores = cores to mclapply() calls:

# Example processor-hungry multicore operation
mats <- mclapply(1:500, function(x) {
  matrix(rnorm(x * x), ncol = x) %*% matrix(rnorm(x * x), ncol = x)
}, mc.cores = cores)
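
To sanity-check the speedup, a rough timing comparison against a serial run (the workload here is illustrative; numbers will vary by machine):

# Serial vs. parallel wall-clock time for the same workload
system.time(lapply(1:100, function(x) sum(sort(rnorm(1e6)))))
system.time(mclapply(1:100, function(x) sum(sort(rnorm(1e6))), mc.cores = cores))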

$ Rscript --vanilla R/myscript.R 12 &> logs/myscript.log &
Rscript runs the .R file as a standalone script, without starting an interactive R session. The --vanilla flag runs the script without loading your .Rprofile (which is typically set up for interactive use) and without prompting you to save a workspace image.
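
For reference, commandArgs(trailingOnly = FALSE) also returns the options Rscript itself received; trailingOnly = TRUE keeps only the user arguments after the script name. Inside the script (the paths shown are illustrative):

print(commandArgs())
## e.g. "/usr/lib/R/bin/exec/R" "--vanilla" "--file=R/myscript.R" "--args" "12"
print(commandArgs(trailingOnly = TRUE))
## "12"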

## example #! script for a Unix-alike
#! /path/to/Rscript --vanilla --default-packages=utils
args <- commandArgs(TRUE)           # package names passed on the command line
res <- try(install.packages(args))  # attempt the installation
if(inherits(res, "try-error")) q(status = 1) else q()  # exit non-zero on failure
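
To run the script directly, mark it executable and pass package names as arguments (the filename install.R is hypothetical):

$ chmod +x install.R
$ ./install.R ggplot2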

SparkR needs R and the rJava package. Install rJava from CRAN:
install.packages("rJava")
You can check whether rJava is installed correctly by running:
library(rJava)
If the command produces no output, rJava has been installed successfully. If instead you get an error message while installing rJava, you might need to do the following to configure Java with R: exit R, run the command below in a shell, and relaunch R to install rJava.
$ R CMD javareconf -e
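
Once rJava loads cleanly, a minimal sketch to confirm that R can actually talk to the JVM, using rJava's .jinit() and .jcall():

library(rJava)
.jinit()  # start the JVM from R
# Ask the running JVM which Java version it is using
.jcall("java/lang/System", "S", "getProperty", "java.version")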

The Spark API for launching R workers:

public static org.apache.spark.api.r.BufferedStreamThread createRWorker(String rLibDir, int port)
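
createRWorker is internal to Spark; from user code, R workers are launched indirectly when a SparkR context starts. A minimal sketch, assuming an older SparkR build where sparkR.init() is the entry point:

library(SparkR)
# Starting a context launches the JVM backend, which in turn
# spawns R worker processes (via createRWorker) as tasks execute
sc <- sparkR.init(master = "local[2]")
sparkR.stop()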