Merge pull request #48 from Mu-Sigma/develop
Corrected a mistake in the description regarding SparkR installation
naren1991 authored Jan 3, 2019
2 parents ab5cfd2 + 555096a commit 0ea395f
Showing 5 changed files with 5 additions and 5 deletions.
2 changes: 1 addition & 1 deletion DESCRIPTION
@@ -16,7 +16,7 @@ Description: Enables data scientists to compose pipelines of analysis which cons
Note - To enable pipelines involving Spark tasks, the package uses the 'SparkR' package.
The SparkR package needs to be installed to use Spark as an engine within a pipeline. SparkR is distributed natively with Apache Spark and is not distributed on CRAN. The SparkR version needs to directly map to the Spark version (hence the native distribution), and care needs to be taken to ensure that this is configured properly.
To install SparkR from Github, run the following command if you know the Spark version: 'devtools::install_github('apache/spark@v2.x.x', subdir='R/pkg')'.
- The other option is to install R by running the following terminal commands if Spark has already been installed: '$ export SPARK_HOME=/path/to/spark/directory && cd $SPARK_HOME/R/lib/SparkR/ && R -e "devtools::install('.')"'.
+ The other option is to install SparkR by running the following terminal commands if Spark has already been installed: '$ export SPARK_HOME=/path/to/spark/directory && cd $SPARK_HOME/R/lib/SparkR/ && R -e "devtools::install('.')"'.
Depends: R (>= 3.4.0), magrittr, pipeR, methods
Imports: ggplot2, dplyr, futile.logger, RCurl, rlang (>= 0.3.0), proto, purrr, devtools
Suggests: plotly, knitr, rmarkdown, parallel, visNetwork, rjson, DT, shiny, R.devices, corrplot, car, foreign
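For reference, the two installation routes described in the DESCRIPTION can be sketched in R as follows. This is a minimal sketch: 'v2.x.x' is a placeholder for the tag matching the locally installed Spark version, and SPARK_HOME is assumed to point to an existing Spark installation.

```r
# Minimal sketch of the two SparkR installation options described above.
# Assumes 'devtools' is installed.

# Option 1: install from GitHub when the Spark version is known
# (replace 'v2.x.x' with the actual release tag, so it stays commented out here).
# devtools::install_github("apache/spark@v2.x.x", subdir = "R/pkg")

# Option 2: install from an existing Spark installation, assuming
# SPARK_HOME points to its root directory.
devtools::install(file.path(Sys.getenv("SPARK_HOME"), "R", "lib", "SparkR"))
```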
2 changes: 1 addition & 1 deletion R/analysisPipelines_package.R
@@ -11,7 +11,7 @@
#' \itemize{
#' \item devtools::install_github('apache/spark@v2.x.x', subdir='R/pkg')
#' }
- #' The other option is to install R by running the following terminal commands if Spark has already been installed:
+ #' The other option is to install SparkR by running the following terminal commands if Spark has already been installed:
#' \itemize{
#' \item $ export SPARK_HOME=/path/to/spark/directory
#' \item $ cd $SPARK_HOME/R/lib/SparkR/
2 changes: 1 addition & 1 deletion man/analysisPipelines.Rd

Some generated files are not rendered by default.

2 changes: 1 addition & 1 deletion vignettes/Analysis_pipelines_for_working_with_sparkR.Rmd
@@ -25,7 +25,7 @@ To install from Github, run the following command, if you know the Spark version
devtools::install_github('apache/spark@v2.x.x', subdir='R/pkg')
```

- The other option is to install R by running the following *terminal* commands if Spark has already been installed.
+ The other option is to install SparkR by running the following *terminal* commands if Spark has already been installed.

```{bash eval = F}
$ export SPARK_HOME=/path/to/spark/directory
@@ -26,7 +26,7 @@ To install from Github, run the following command, if you know the Spark version
devtools::install_github('apache/spark@v2.x.x', subdir='R/pkg')
```

- The other option is to install R by running the following *terminal* commands if Spark has already been installed.
+ The other option is to install SparkR by running the following *terminal* commands if Spark has already been installed.

```{bash eval = F}
$ export SPARK_HOME=/path/to/spark/directory
