Interactive docker image #709
Status: Open. indy-3rdman wants to merge 13 commits into dotnet:main from indy-3rdman:interactive-docker-image.
Commits (13):

- d2d621a initial docker images
- 13b4283 Merge branch 'master' of https://github.com/dotnet/spark into docker-…
- f805a97 initial interactive notebook docker image files
- 53c4c54 Merge branch 'master' of https://github.com/dotnet/spark into interac…
- 3e2de92 Dockerfile update to fix Dialog issue
- 7f85c97 removed Microsoft.dotnet-interactive version specification
- 873fa34 change default notebook directory
- d00509b Merge branch 'master' of https://github.com/dotnet/spark into interac…
- afd854e updated for dotnet-spark version 1.0.0
- 33dc9f4 Dockerfile(s) cleanup
- dbef269 Removed copy of HelloSpark project
- 7fb707f Merge branch 'master' of https://github.com/dotnet/spark into interac…
- 3e434b6 Dockerfile cleanup
```dockerfile
FROM mcr.microsoft.com/dotnet-spark:2.4.6-0.12.1-interactive
```
```markdown
# .NET for Apache Spark Interactive

This interactive notebook allows you to explore .NET for Apache Spark in your web browser.

To launch it, just click the button below:

[](https://mybinder.org/v2/gh/indy-3rdman/spark/docker_images_init?urlpath=lab/tree/nb/)
```
# .NET for Apache Spark interactive Docker image

## Description

This directory contains the source code to build an interactive Docker image, using [jupyter/base-notebook](https://hub.docker.com/r/jupyter/base-notebook) as the foundation.

## Building

To build the image, run the [build.sh](build.sh) bash script. By default, it builds an image using the latest supported versions of .NET Core, Apache Spark and .NET for Apache Spark.

You can also build for different versions by specifying one of the following options:

```bash
-a, --apache-spark
-d, --dotnet-spark
```

For more details, please run

```bash
build.sh -h
```

Please note that not all version combinations are supported.
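As a rough sketch of how such options could be handled, the snippet below parses the two version flags named above and assembles an image tag in the `<apache-spark>-<dotnet-spark>` form used elsewhere in this PR. The parsing logic and the default versions are assumptions for illustration, not the actual contents of build.sh:

```shell
#!/usr/bin/env bash
# Hypothetical sketch of version-option parsing similar to build.sh.
# Default versions below are assumptions for illustration only.
parse_versions() {
  local apache_spark=2.4.6 dotnet_spark=1.0.0
  while [ $# -gt 0 ]; do
    case "$1" in
      -a|--apache-spark) apache_spark="$2"; shift 2 ;;
      -d|--dotnet-spark) dotnet_spark="$2"; shift 2 ;;
      *) shift ;;
    esac
  done
  # Tag shape matches e.g. dotnet-spark:2.4.6-0.12.1 seen in this PR
  echo "dotnet-spark:${apache_spark}-${dotnet_spark}"
}

parse_versions --apache-spark 2.4.4 --dotnet-spark 0.12.1
```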
## The image build stages

Using separate build stages makes it possible to efficiently build multiple images that share the same base components (e.g. the .NET Core SDK) but use different .NET for Apache Spark or Apache Spark versions. That way, shared dependencies do not have to be downloaded again for each version, which saves time and bandwidth.

The three stages used in the build process are:

- ### **dotnet-interactive**

  Builds on the jupyter/base-notebook image and installs the .NET Core SDK, along with Microsoft.DotNet.Interactive.

- ### **dotnet-spark-interactive-base**

  Adds the specified .NET for Apache Spark version to the dotnet-interactive image and also copies/builds the HelloSpark example into the image. HelloSpark is also used to install the correct microsoft-spark-*.jar version that is required to start a spark-submit session in debug mode.

- ### **dotnet-spark (interactive)**

  Downloads and installs the specified Apache Spark version and adds the example notebooks.
## Docker Run Example

To start a new container based on the dotnet-spark interactive image, run the following command:

```bash
docker run --name dotnet-spark-interactive -d -p 8888:8888 mcr.microsoft.com/dotnet-spark:interactive-latest
```

After that, examine the logs of the container to get the URL, including the authentication token, that is required to connect to Jupyter:

```bash
docker logs -f dotnet-spark-interactive
```

![launch](img/dotnet-interactive-docker-launch.gif)

It is important to start the .NET for Apache Spark backend in debug mode before using it in any of the notebooks. The helper script start-spark-debug.sh can do this for you, as demonstrated below.

![debug](img/dotnet-interactive-docker-debug.gif)
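If you want the tokenized URL without following the logs interactively, one option is to grep it out of the log output. The sketch below runs the extraction against a hard-coded sample line, since the exact format of Jupyter's log message is an assumption here; in practice you would pipe `docker logs dotnet-spark-interactive 2>&1` into the same grep:

```shell
# Extract the first tokenized Jupyter URL from log output (sketch).
# The sample log line is an assumption about Jupyter's log format.
sample_log='    or http://127.0.0.1:8888/?token=abc123def456'
url=$(printf '%s\n' "$sample_log" \
  | grep -oE 'http://127\.0\.0\.1:8888/\?token=[A-Za-z0-9]+' \
  | head -n1)
echo "$url"
```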
docker/images/interactive/apache-spark/01-start-spark-debug.ipynb (50 additions, 0 deletions)
```json
{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Start .NET for Apache Spark in Debug mode\n",
    "\n",
    "Please run the cell below, before executing any .NET for Apache Spark code in an interactive .NET notebook."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "! start-spark-debug.sh"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": []
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.7.6"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 4
}
```
docker/images/interactive/apache-spark/02-run-dotnet-spark.ipynb (97 additions, 0 deletions)
```json
{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# .NET for Apache Spark example\n",
    "\n",
    "### 1) Add the Microsoft.Spark NuGet package"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "#r \"nuget: Microsoft.Spark\""
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 2) Create a new Spark Session"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "using Microsoft.Spark.Sql;\n",
    "\n",
    "var spark = SparkSession.Builder().GetOrCreate();"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 3) Create a new DataFrame of Integers and show it."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "var df = spark.CreateDataFrame(new int[] { 1, 2, 3 });\n",
    "df.Show();"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 4) Create a second new DataFrame of Strings and show it."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "var df2 = spark.CreateDataFrame(new string[] { \".NET\", \"for\", \"Apache\", \"Spark\" });\n",
    "df2.Show();"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": []
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": ".NET (C#)",
   "language": "C#",
   "name": ".net-csharp"
  },
  "language_info": {
   "file_extension": ".cs",
   "mimetype": "text/x-csharp",
   "name": "C#",
   "pygments_lexer": "csharp",
   "version": "8.0"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 4
}
```
```dockerfile
ARG DOTNET_SPARK_VERSION=0.12.1
FROM dotnet-spark-interactive-base:$DOTNET_SPARK_VERSION

ARG SPARK_VERSION=2.4.6

ENV DAEMON_RUN=true
ENV SPARK_VERSION=$SPARK_VERSION
ENV SPARK_HOME=/spark

ENV HADOOP_VERSION=2.7
ENV PATH="${SPARK_HOME}/bin:${DOTNET_WORKER_DIR}:${PATH}"
ENV DOTNETBACKEND_PORT=5567
ENV JUPYTER_ENABLE_LAB=true

USER root

COPY bin/* /usr/local/bin/
COPY *.ipynb ${HOME}/nb/

RUN cd / \
    && wget -q --show-progress --progress=bar:force:noscroll https://archive.apache.org/dist/spark/spark-${SPARK_VERSION}/spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz \
    && tar -xvzf spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz \
    && mv spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION} spark \
    && rm spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz \
    && chmod 755 /usr/local/bin/start-spark-debug.sh \
    && chown -R ${NB_UID} ${HOME} \
    && cd ${HOME}/nb

USER ${USER}
WORKDIR ${HOME}/nb/
```
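The RUN step above derives the Apache Spark download URL from the `SPARK_VERSION` and `HADOOP_VERSION` variables. The snippet below reconstructs that URL for the default versions, which can be handy for checking that a given version combination actually exists on the archive server before building:

```shell
# Reconstruct the archive URL that the Dockerfile's RUN step downloads,
# using the default versions declared in the Dockerfile above.
SPARK_VERSION=2.4.6
HADOOP_VERSION=2.7
url="https://archive.apache.org/dist/spark/spark-${SPARK_VERSION}/spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz"
echo "$url"
```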
```ini
[supervisord]
nodaemon=true
logfile=/home/jovyan/supervisord.log
pidfile=/home/jovyan/supervisord.pid

[program:notebook]
directory=/home/jovyan/nb
command=/usr/local/bin/start-notebook.sh
autorestart = unexpected
startsecs = 0
startretries = 0
exitcodes = 0
priority=1
stdout_logfile=/dev/stdout
stderr_logfile=/dev/stderr
stdout_logfile_maxbytes = 0
stderr_logfile_maxbytes = 0

[program:spark-debug]
directory=/dotnet/Debug/netcoreapp3.1
command=/usr/local/bin/start-spark-debug.sh
autorestart = unexpected
startsecs = 0
startretries = 0
exitcodes = 0
priority=2
stdout_logfile=/dev/stdout
stderr_logfile=/dev/stderr
stdout_logfile_maxbytes = 0
stderr_logfile_maxbytes = 0
```
docker/images/interactive/apache-spark/templates/scripts/start-spark-debug.sh (6 additions, 0 deletions)
```bash
#!/usr/bin/env bash

# Start the .NET for Apache Spark backend in debug mode

cd /dotnet/Debug/netcoreapp3.1 || exit
/spark/bin/spark-submit --class org.apache.spark.deploy.dotnet.DotnetRunner --jars "/dotnet/Debug/netcoreapp3.1/*.jar" --master local microsoft-spark-2.4.x-0.12.1.jar debug 5567
```
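The jar name passed to spark-submit encodes both the Spark minor version line and the .NET for Apache Spark version. The sketch below derives that name from the two version variables; the `major.minor.x-<dotnet-spark-version>` naming convention is inferred from the hard-coded name in the script above:

```shell
# Derive the microsoft-spark jar name used in the spark-submit call above.
# Naming convention inferred from this script; treat as an assumption.
SPARK_VERSION=2.4.6
DOTNET_SPARK_VERSION=0.12.1
jar="microsoft-spark-${SPARK_VERSION%.*}.x-${DOTNET_SPARK_VERSION}.jar"
echo "$jar"   # microsoft-spark-2.4.x-0.12.1.jar
```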
Review comment:

> This doesn't seem necessary.