The Docker container executes a [`start-notebook.sh` script](./start-notebook.sh) by default. The `start-notebook.sh` script handles the `NB_UID`, `NB_GID`, and `GRANT_SUDO` features documented in the next section, and then executes `jupyter notebook`.
You can pass [Jupyter command line options](https://jupyter.readthedocs.io/en/latest/projects/jupyter-command.html) through the `start-notebook.sh` script when launching the container. For example, to secure the Notebook server with a custom password hashed using `IPython.lib.passwd()` instead of the default token, run the following:
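`IPython.lib.passwd()` produces a hash string of the form `sha1:<salt>:<digest>`. As an illustration of what that function computes (a simplified stdlib-only sketch, not a replacement for the real function):

```python
import hashlib
import random

def sha1_passwd(passphrase, salt_len=12):
    """Hash a passphrase as 'sha1:<salt>:<digest>', mimicking the
    format produced by IPython.lib.passwd() (simplified sketch)."""
    # Random hexadecimal salt, salt_len characters long
    salt = ('%0' + str(salt_len) + 'x') % random.getrandbits(4 * salt_len)
    # SHA-1 over the passphrase concatenated with the salt
    digest = hashlib.sha1(passphrase.encode('utf-8') + salt.encode('ascii')).hexdigest()
    return ':'.join(('sha1', salt, digest))

print(sha1_passwd('mypassword'))  # sha1:<12-char salt>:<40-char digest>
```

The resulting hash can then be handed to the server at launch, e.g. `docker run -d -p 8888:8888 jupyter/base-notebook start-notebook.sh --NotebookApp.password='sha1:...'`.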
...

You can bypass the provided scripts and specify an arbitrary start command.
## Image Specifics
### Spark and PySpark
#### Using Spark Local Mode
This configuration is convenient for working with Spark on small, local datasets.
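In local mode nothing outside the container is required; a minimal sketch from inside the notebook (assumes `pyspark` is bundled in the image, as it is in the Spark-enabled stacks):

```python
import pyspark

# 'local[*]' runs Spark in local mode with as many worker
# threads as there are cores available in the container.
sc = pyspark.SparkContext('local[*]')

# Quick smoke test: parallelize a range and sample from it.
rdd = sc.parallelize(range(1000))
print(rdd.takeSample(False, 5))

sc.stop()
```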
...

To use Python 2 in the notebook and on the workers, change the `PYSPARK_PYTHON` environment variable to point to the Python 2 interpreter.
Of course, all of this can be hidden in an [IPython kernel startup script](http://ipython.org/ipython-doc/stable/development/config.html?highlight=startup#startup-files), but "explicit is better than implicit." :)
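As a sketch, such a startup file might look like the following (the file name and interpreter path are assumptions; adjust them to your image):

```python
# Hypothetical file: ~/.ipython/profile_default/startup/00-pyspark-setup.py
# Every file in the startup directory runs when the kernel starts,
# so per-notebook environment boilerplate can move here.
import os

# Point the workers at the Python 2 interpreter
# (path is an assumption; check `which python2` in your image).
os.environ['PYSPARK_PYTHON'] = '/usr/bin/python2'
```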
#### Connecting to a Spark Cluster in Standalone Mode
Connecting to a Spark cluster in Standalone Mode requires the following steps:
0. Verify that the Docker image (check the Dockerfile) and the Spark cluster being deployed run the same version of Spark.
1. [Deploy Spark in Standalone Mode](http://spark.apache.org/docs/latest/spark-standalone.html).
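Once the versions match and the cluster is reachable from the container, connecting from a notebook comes down to pointing a `SparkContext` at the master URL; a sketch (the hostname `spark-master` is an assumption, and `7077` is the standalone master's default port):

```python
import pyspark

# spark://<host>:7077 is the standalone master's default URL;
# replace 'spark-master' with your master's hostname or IP.
sc = pyspark.SparkContext(master='spark://spark-master:7077')

# Simple job to confirm the workers respond.
print(sc.parallelize(range(100)).sum())

sc.stop()
```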
...

Jupyter Docker Stacks are a set of ready-to-run Docker images containing Jupyter applications and interactive computing tools.
:maxdepth: 1

using
configuration
contributing

Quick Start
The examples below may help you get started if you have Docker installed, know which Docker image you want to use, and want to launch a single Jupyter Notebook server in a container. The other pages in this documentation describe additional uses and features in detail.::
# Run a Jupyter Notebook server in a Docker container started
# from the jupyter/scipy-notebook image built from Git commit 2c80cf3537ca.
# All files saved in the container are lost when the notebook server exits.
# -ti: pseudo-TTY+STDIN open, so the logs appear in the terminal
# --rm: remove the container on exit
# -p: publish the container's notebook port 8888 as port 8888 on the host
docker run -ti --rm -p 8888:8888 jupyter/scipy-notebook:2c80cf3537ca
# Run a Jupyter Notebook server in a Docker container started from the
# jupyter/r-notebook image built from Git commit e5c5a7d3e52d.
# All files written to ~/work in the container are saved to the
# current working directory on the host and persist even when the
# notebook server exits.
docker run -ti --rm -p 8888:8888 -v "$PWD":/home/jovyan/work jupyter/r-notebook:e5c5a7d3e52d
# Run a Jupyter Notebook server in a background Docker container started
# from the latest jupyter/all-spark-notebook image available on the local