Unable to use existing spark server with spylon-kernel #68

@abhinavGupta16

Description

I already have a Spark standalone server running on my machine with 1 master and 1 worker node. However, every time I run anything in Scala, spylon-kernel creates its own local Spark cluster.

How can I make it use the existing Spark server that is running? I can do this with pyspark, but not with spylon-kernel.
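For what it's worth, spylon-kernel takes its Spark configuration from the `%%init_spark` magic run in a notebook cell before any Scala code, so setting `launcher.master` to the standalone master's URL (instead of the default `local[*]`) should make the kernel attach to the running cluster. A minimal sketch, assuming the master is on localhost with the default port 7077 (check the `spark://...` URL shown in the master's web UI):

```
%%init_spark
# Point the kernel at the existing standalone master instead of
# letting it launch its own local cluster. Host and port here are
# assumptions; substitute the URL from your master's web UI.
launcher.master = "spark://localhost:7077"
# Illustrative, optional settings (names are standard Spark conf keys):
launcher.conf.spark.app.name = "spylon-notebook"
launcher.conf.spark.executor.memory = "2g"
```

If this works, subsequent Scala cells should appear as a running application in the master's web UI rather than starting a new local context.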

Failed with spylon-kernel (screenshot of the failure attached).
