Commit 581534e2 authored by Romain, committed by GitHub

Merge branch 'master' into pandoc

parents 9b983ea8 31b807ec
@@ -3,7 +3,8 @@
 # Ubuntu 18.04 (bionic)
 # https://hub.docker.com/_/ubuntu/?tab=tags&name=bionic
-ARG BASE_CONTAINER=ubuntu:bionic-20200112@sha256:bc025862c3e8ec4a8754ea4756e33da6c41cba38330d7e324abd25c8e0b93300
+ARG ROOT_CONTAINER=ubuntu:bionic-20200112@sha256:bc025862c3e8ec4a8754ea4756e33da6c41cba38330d7e324abd25c8e0b93300
+ARG BASE_CONTAINER=$ROOT_CONTAINER
 FROM $BASE_CONTAINER
 LABEL maintainer="Jupyter Project <jupyter@googlegroups.com>"
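The new `ROOT_CONTAINER` argument adds one level of indirection so downstream builds can swap the pinned Ubuntu base without patching the Dockerfile: an `ARG` declared before `FROM` can be overridden with `--build-arg`. A minimal sketch of such an override (the registry, tag, and target image name below are hypothetical, not part of this commit):

```bash
# Build against a different (hypothetical) base image by overriding the
# ROOT_CONTAINER build argument; BASE_CONTAINER then inherits that value.
docker build \
    --build-arg ROOT_CONTAINER=registry.example.com/ubuntu:bionic \
    -t custom-base-notebook .
```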
...
@@ -26,7 +26,7 @@ If there's agreement that the feature belongs in one or more of the core stacks:
 1. Implement the feature in a local clone of the `jupyter/docker-stacks` project.
 2. Please build the image locally before submitting a pull request. Building the image locally shortens the debugging cycle by taking some load off [Travis CI](http://travis-ci.org/), which graciously provides free build services for open source projects like this one. If you use `make`, call:
 ```
-make image/somestack-notebook
+make build/somestack-notebook
 ```
 3. [Submit a pull request](https://github.com/PointCloudLibrary/pcl/wiki/A-step-by-step-guide-on-preparing-and-submitting-a-pull-request) (PR) with your changes.
 4. Watch for Travis to report a build success or failure for your PR on GitHub.
...
@@ -8,7 +8,7 @@ Please follow the process below to update a package version:
 2. Adjust the version number for the package. We prefer to pin the major and minor version number of packages so as to minimize rebuild side-effects when users submit pull requests (PRs). For example, you'll find the Jupyter Notebook package, `notebook`, installed using conda with `notebook=5.4.*`.
 3. Please build the image locally before submitting a pull request. Building the image locally shortens the debugging cycle by taking some load off [Travis CI](http://travis-ci.org/), which graciously provides free build services for open source projects like this one. If you use `make`, call:
 ```
-make image/somestack-notebook
+make build/somestack-notebook
 ```
 4. [Submit a pull request](https://github.com/PointCloudLibrary/pcl/wiki/A-step-by-step-guide-on-preparing-and-submitting-a-pull-request) (PR) with your changes.
 5. Watch for Travis to report a build success or failure for your PR on GitHub.
...
@@ -14,7 +14,7 @@ Please follow the process below to add new tests:
 2. If your test should run against a single image, add your test code to one of the modules in `some-notebook/test/` or create a new module.
 3. Build one or more images you intend to test and run the tests locally. If you use `make`, call:
 ```
-make image/somestack-notebook
+make build/somestack-notebook
 make test/somestack-notebook
 ```
 4. [Submit a pull request](https://github.com/PointCloudLibrary/pcl/wiki/A-step-by-step-guide-on-preparing-and-submitting-a-pull-request) (PR) with your changes.
...
@@ -15,8 +15,10 @@ RUN apt-get -y update && \
     apt-get install --no-install-recommends -y openjdk-8-jre-headless ca-certificates-java && \
     rm -rf /var/lib/apt/lists/*
 
+# Using the preferred mirror to download the file
 RUN cd /tmp && \
-    wget -q http://mirrors.ukfast.co.uk/sites/ftp.apache.org/spark/spark-${APACHE_SPARK_VERSION}/spark-${APACHE_SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz && \
+    wget -q $(wget -qO- https://www.apache.org/dyn/closer.lua/spark/spark-${APACHE_SPARK_VERSION}/spark-${APACHE_SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz\?as_json | \
+    python -c "import sys, json; content=json.load(sys.stdin); print(content['preferred']+content['path_info'])") && \
     echo "2426a20c548bdfc07df288cd1d18d1da6b3189d0b78dee76fa034c52a4e02895f0ad460720c526f163ba63a17efae4764c46a1cd8f9b04c60f9937a554db85d2 *spark-${APACHE_SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz" | sha512sum -c - && \
     tar xzf spark-${APACHE_SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz -C /usr/local --owner root --group root --no-same-owner && \
     rm spark-${APACHE_SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz
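Rather than hard-coding the single `mirrors.ukfast.co.uk` mirror, the build now queries Apache's `closer.lua` mirror-resolution service: with `?as_json` it returns a JSON document, and concatenating its `preferred` and `path_info` fields yields a download URL on the closest mirror. The same lookup as a standalone sketch (the version numbers below are placeholders; the real values come from `ARG`s elsewhere in the Dockerfile):

```bash
# Placeholder versions, not pinned by this hunk.
APACHE_SPARK_VERSION=2.4.5
HADOOP_VERSION=2.7

# Ask closer.lua for the preferred mirror and assemble the final URL.
url=$(wget -qO- "https://www.apache.org/dyn/closer.lua/spark/spark-${APACHE_SPARK_VERSION}/spark-${APACHE_SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz?as_json" \
    | python -c "import sys, json; c = json.load(sys.stdin); print(c['preferred'] + c['path_info'])")
echo "Downloading ${url}"
wget -q "${url}"
```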
@@ -36,11 +38,11 @@ RUN apt-get -y update && \
     rm -rf /var/lib/apt/lists/*
 
 # Spark and Mesos config
-ENV SPARK_HOME=/usr/local/spark \
-    PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.10.7-src.zip \
+ENV SPARK_HOME=/usr/local/spark
+ENV PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.10.7-src.zip \
     MESOS_NATIVE_LIBRARY=/usr/local/lib/libmesos.so \
     SPARK_OPTS="--driver-java-options=-Xms1024M --driver-java-options=-Xmx4096M --driver-java-options=-Dlog4j.logLevel=info" \
-    PATH=$PATH:/usr/local/spark/bin
+    PATH=$PATH:$SPARK_HOME/bin
 
 USER $NB_UID
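Splitting the single `ENV` instruction in two is a correctness fix, not a style change: in a Dockerfile, `$VAR` inside an `ENV` instruction expands to the value set by *previous* instructions, so `PYTHONPATH=$SPARK_HOME/...` only resolves once `SPARK_HOME` has been defined by an earlier `ENV`. That is also why `PATH` can now reference `$SPARK_HOME/bin` instead of repeating `/usr/local/spark/bin`. A small sketch demonstrating the behavior (file and tag names are hypothetical):

```bash
# With SPARK_HOME set in its own ENV line first, later ENV lines can expand it.
cat > Dockerfile.envdemo <<'EOF'
FROM ubuntu:bionic
ENV SPARK_HOME=/usr/local/spark
ENV PYTHONPATH=$SPARK_HOME/python
EOF
docker build -q -f Dockerfile.envdemo -t envdemo .
docker run --rm envdemo printenv PYTHONPATH   # prints /usr/local/spark/python
```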
...
@@ -16,4 +16,15 @@ def test_spark_shell(container):
     c.wait(timeout=30)
     logs = c.logs(stdout=True).decode('utf-8')
     LOGGER.debug(logs)
-    assert 'res0: Int = 2' in logs
\ No newline at end of file
+    assert 'res0: Int = 2' in logs
+
+def test_pyspark(container):
+    """PySpark should be in the Python path"""
+    c = container.run(
+        tty=True,
+        command=['start.sh', 'python', '-c', '"import pyspark"']
+    )
+    rv = c.wait(timeout=30)
+    assert rv == 0 or rv["StatusCode"] == 0
+    logs = c.logs(stdout=True).decode('utf-8')
+    LOGGER.debug(logs)
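Per the contributing docs updated in this same merge, the new `test_pyspark` check can be run locally before pushing. Assuming the Spark changes live in the `pyspark-notebook` stack (file paths are not shown in this diff):

```bash
# Build the image, then run its test suite, which now includes test_pyspark.
make build/pyspark-notebook
make test/pyspark-notebook
```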