
How to run multiple Scala operations on one Spark engine? #6853

Answered by turboFei
HaoYang670 asked this question in Q&A


I found a PR that adds a static lock for all operations, #4150. Can we remove it?

No. We have to disable concurrent Scala code interpretation, because all interpreters share the same spark.repl.class.outputDir, which is a static configuration that cannot be changed once the Spark context is initialized.

"-Yrepl-outdir",
s"${spark.sparkContext.getConf.get("spark.repl.class.outputDir")}",

https://github.com/apache/spark/blob/332efb2eabb2b9383cfcfb0bf633089f38cdb398/core/src/main/scala/org/apache/spark/SparkContext.scala#L497-L500
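As a quick illustration of why the directory cannot change per operation: it has to be chosen before the SparkContext starts, and reading it back afterwards always returns that initial value. A minimal local-mode sketch (the path and object name here are hypothetical):

```scala
import org.apache.spark.sql.SparkSession

object ReplOutputDirDemo {
  def main(args: Array[String]): Unit = {
    // The output dir can only be set before the context is created.
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("repl-outputdir-demo")
      .config("spark.repl.class.outputDir", "/tmp/repl-classes") // hypothetical path
      .getOrCreate()

    // From here on the value is fixed for the lifetime of the context.
    println(spark.sparkContext.getConf.get("spark.repl.class.outputDir"))

    spark.stop()
  }
}
```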

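For context on the lock itself, here is a minimal sketch (Scala 2.12) of the static-lock approach: one JVM-wide lock serializes interpret() calls across all Scala operations, since every interpreter compiles classes into the same shared directory. GlobalInterpreterLock and ScalaOperation are illustrative names, not the classes from the actual PR:

```scala
import scala.tools.nsc.Settings
import scala.tools.nsc.interpreter.IMain

// One lock for the whole JVM: every Scala operation contends on it.
object GlobalInterpreterLock

class ScalaOperation(sharedOutputDir: String) {
  private val settings = new Settings()
  settings.usejavacp.value = true
  // Mirror how the engine passes spark.repl.class.outputDir to the REPL:
  settings.processArguments(List("-Yrepl-outdir", sharedOutputDir), processAll = true)

  private val interpreter = new IMain(settings)

  def execute(code: String): Unit = GlobalInterpreterLock.synchronized {
    // Only one operation at a time may compile into the shared directory.
    interpreter.interpret(code)
  }
}
```

Serializing at this level trades throughput for correctness: concurrent compiles into a single -Yrepl-outdir could clobber each other's class files.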
