Changes from 5 commits
2 changes: 2 additions & 0 deletions .gitignore
Original file line number Diff line number Diff line change
@@ -27,6 +27,8 @@ bld/

# Visual Studio 2015/2017 cache/options directory
.vs/
# Visual Studio Code cache/options directory
.vscode/
# Uncomment if you have tasks that create the project's static files in wwwroot
#wwwroot/

1 change: 1 addition & 0 deletions binder/Dockerfile
@@ -0,0 +1 @@
FROM mcr.microsoft.com/dotnet-spark:2.4.6-0.12.1-interactive
7 changes: 7 additions & 0 deletions binder/README.md
@@ -0,0 +1,7 @@
# .NET for Apache Spark Interactive

This interactive notebook allows you to explore .NET for Apache Spark in your web browser.

To launch it, just click the button below:

[![Binder](./dotnet-spark-binder.svg)](https://mybinder.org/v2/gh/indy-3rdman/spark/docker_images_init?urlpath=lab/tree/nb/)
1 change: 1 addition & 0 deletions binder/dotnet-spark-binder.svg
65 changes: 65 additions & 0 deletions docker/images/interactive/README.md
@@ -0,0 +1,65 @@
# .NET for Apache Spark interactive Docker image

## Description

This directory contains the source code to build an interactive Docker image, using [jupyter/base-notebook](https://hub.docker.com/r/jupyter/base-notebook) as the foundation.

## Building

To build the image, run the [build.sh](build.sh) bash script. By default, it builds an image using the latest supported versions of .NET Core, Apache Spark, and .NET for Apache Spark.

You can also build for different versions by specifying one of the following options:

```bash
-a, --apache-spark
-d, --dotnet-spark
```

For more details, run

```bash
build.sh -h
```

Please note that not all version combinations are supported.
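As an illustration only, option handling of this style is commonly implemented with a small argument loop. The following is a minimal sketch, not the actual contents of build.sh, and the default versions are assumptions taken from the image tags used elsewhere in this pull request:

```shell
# Sketch: parse build.sh-style version flags (hypothetical; actual script may differ).
parse_versions() {
  local spark=2.4.6 dotnet=0.12.1   # assumed defaults
  while [[ $# -gt 0 ]]; do
    case "$1" in
      -a|--apache-spark)  spark="$2";  shift 2 ;;
      -d|--dotnet-spark)  dotnet="$2"; shift 2 ;;
      *)                  shift ;;
    esac
  done
  echo "spark=${spark} dotnet=${dotnet}"
}

# Example: request specific Apache Spark and .NET for Apache Spark versions.
parse_versions --apache-spark 2.4.6 --dotnet-spark 0.12.1
```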

## The image build stages

Using separate stages makes it possible to efficiently build multiple images that share the same base (e.g. the .NET Core SDK) but use different .NET for Apache Spark or Apache Spark versions. That way, shared dependencies do not have to be downloaded again each time an image for a different version is built, which saves time and bandwidth.

The three stages used in the build process are:

- ### **dotnet-interactive**

Builds on the jupyter/base-notebook image and installs the .NET Core SDK, along with Microsoft.DotNet.Interactive.

- ### **dotnet-spark-interactive-base**

Adds the specified .NET for Apache Spark version to the dotnet-interactive image and also copies and builds the HelloSpark example into the image. HelloSpark is also used to install the correct microsoft-spark-*.jar version that is required to start a spark-submit session in debug mode.

- ### **dotnet-spark (interactive)**

Gets and installs the specified Apache Spark version and adds the example notebooks.
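For illustration only, the three-stage layering could be sketched as a single multi-stage Dockerfile. In this repository the stages are actually split across separate Dockerfiles tagged per stage, and the install steps indicated by comments below are filled in there:

```dockerfile
# Sketch of the stage layering (illustrative; not the repository's actual Dockerfile).
FROM jupyter/base-notebook AS dotnet-interactive
# install the .NET Core SDK and Microsoft.DotNet.Interactive here

FROM dotnet-interactive AS dotnet-spark-interactive-base
ARG DOTNET_SPARK_VERSION=0.12.1
# add the .NET for Apache Spark worker and build the HelloSpark example here

FROM dotnet-spark-interactive-base AS dotnet-spark
ARG SPARK_VERSION=2.4.6
# download and extract Apache Spark, then copy the example notebooks here
```

Because each stage builds on the previous one, rebuilding for a new Apache Spark version reuses the cached earlier layers.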

## Docker Run Example

To start a new container based on the dotnet-spark interactive image, run the following command.

```bash
docker run --name dotnet-spark-interactive -d -p 8888:8888 mcr.microsoft.com/dotnet-spark:interactive-latest
```

After that, examine the container logs to obtain the URL, including the authentication token, that is required to connect to Jupyter.

```bash
docker logs -f dotnet-spark-interactive
```
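The tokenized URL can also be extracted from the log output directly. The sketch below assumes Jupyter's usual log line containing `http://127.0.0.1:8888/?token=…`; the exact host, port, and log layout may differ for this image:

```shell
# extract_jupyter_url: print the first tokenized Jupyter URL found on stdin.
# Sketch only -- the log format depends on the Jupyter version in the image.
extract_jupyter_url() {
  grep -oE 'http://127\.0\.0\.1:8888/\?token=[A-Za-z0-9]+' | head -n 1
}

# Example with a sample log line; in practice pipe in the container logs:
#   docker logs dotnet-spark-interactive 2>&1 | extract_jupyter_url
echo '[I 12:00:00.000 NotebookApp] http://127.0.0.1:8888/?token=abc123' | extract_jupyter_url
# → http://127.0.0.1:8888/?token=abc123
```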

![launch](img/dotnet-interactive-docker-launch.gif)

It is important to start the .NET for Apache Spark backend in debug mode before using it in any of the notebooks.

The helper script `start-spark-debug.sh` can do this for you, as demonstrated below.

![example](img/dotnet-interactive-docker-example.gif)
50 changes: 50 additions & 0 deletions docker/images/interactive/apache-spark/01-start-spark-debug.ipynb
@@ -0,0 +1,50 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Start .NET for Apache Spark in Debug mode\n",
"\n",
"Please run the cell below before executing any .NET for Apache Spark code in an interactive .NET notebook."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"! start-spark-debug.sh"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.6"
}
},
"nbformat": 4,
"nbformat_minor": 4
}
97 changes: 97 additions & 0 deletions docker/images/interactive/apache-spark/02-run-dotnet-spark.ipynb
@@ -0,0 +1,97 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# .NET for Apache Spark example\n",
"\n",
"### 1) Add the Microsoft.Spark NuGet package"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#r \"nuget: Microsoft.Spark\""
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### 2) Create a new Spark Session"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"using Microsoft.Spark.Sql;\n",
"\n",
"var spark = SparkSession.Builder().GetOrCreate();"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### 3) Create a new DataFrame of Integers and show it."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"var df = spark.CreateDataFrame(new int[] { 1, 2, 3 });\n",
"df.Show();"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### 4) Create a second new DataFrame of Strings and show it."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"var df2 = spark.CreateDataFrame(new string[] { \".NET\", \"for\", \"Apache\", \"Spark\" });\n",
"df2.Show();"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": ".NET (C#)",
"language": "C#",
"name": ".net-csharp"
},
"language_info": {
"file_extension": ".cs",
"mimetype": "text/x-csharp",
"name": "C#",
"pygments_lexer": "csharp",
"version": "8.0"
}
},
"nbformat": 4,
"nbformat_minor": 4
}
30 changes: 30 additions & 0 deletions docker/images/interactive/apache-spark/Dockerfile
@@ -0,0 +1,30 @@
ARG DOTNET_SPARK_VERSION=0.12.1
FROM dotnet-spark-interactive-base:$DOTNET_SPARK_VERSION

ARG SPARK_VERSION=2.4.6

ENV DAEMON_RUN=true
ENV SPARK_VERSION=$SPARK_VERSION
ENV SPARK_HOME=/spark

ENV HADOOP_VERSION=2.7
ENV PATH="${SPARK_HOME}/bin:${DOTNET_WORKER_DIR}:${PATH}"
ENV DOTNETBACKEND_PORT=5567
ENV JUPYTER_ENABLE_LAB=true

USER root

COPY bin/* /usr/local/bin/
COPY *.ipynb ${HOME}/nb/

RUN cd / \
&& wget -q --show-progress --progress=bar:force:noscroll https://archive.apache.org/dist/spark/spark-${SPARK_VERSION}/spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz \
&& tar -xvzf spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz \
&& mv spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION} spark \
&& rm spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz \
&& chmod 755 /usr/local/bin/start-spark-debug.sh \
&& chown -R ${NB_UID} ${HOME} \
&& cd ${HOME}/nb
> **Review comment (Member):** This doesn't seem necessary.
USER ${USER}
WORKDIR ${HOME}/nb/
31 changes: 31 additions & 0 deletions docker/images/interactive/apache-spark/supervisor.conf
@@ -0,0 +1,31 @@
[supervisord]
nodaemon=true
logfile=/home/jovyan/supervisord.log
pidfile=/home/jovyan/supervisord.pid

[program:notebook]
directory=/home/jovyan/nb
command=/usr/local/bin/start-notebook.sh
autorestart = unexpected
startsecs = 0
startretries = 0
exitcodes = 0
priority=1
stdout_logfile=/dev/stdout
stderr_logfile=/dev/stderr
stdout_logfile_maxbytes = 0
stderr_logfile_maxbytes = 0

[program:spark-debug]
directory=/dotnet/Debug/netcoreapp3.1
command=/usr/local/bin/start-spark-debug.sh
autorestart = unexpected
startsecs = 0
startretries = 0
exitcodes = 0
priority=2
stdout_logfile=/dev/stdout
stderr_logfile=/dev/stderr
stdout_logfile_maxbytes = 0
stderr_logfile_maxbytes = 0

@@ -0,0 +1,6 @@
#!/usr/bin/env bash

# Start the .NET for Apache Spark backend in debug mode

cd /dotnet/Debug/netcoreapp3.1 || exit
/spark/bin/spark-submit --class org.apache.spark.deploy.dotnet.DotnetRunner --jars "/dotnet/Debug/netcoreapp3.1/*.jar" --master local microsoft-spark-2.4.x-0.12.1.jar debug 5567