WARN SparkContext: Multiple running SparkContexts detected in the same JVM

I am writing this because I could not find an answer to this problem anywhere. I am using PySpark to run a script, but I cannot see where my other Spark sessions are or how to shut them down. What is the best way to do the following?



I am not initializing any other SparkContext within this spark-submit job, so it must be a context left behind by some previous run? Note that I do not want to allow multiple contexts.




  1. Check for running SparkContexts.

  2. Use an existing SparkContext if one exists.

  3. What is the best way to modify this code?


from src import config
import pandas as pd
import plotly.graph_objs as go
from visualize_main import app
from dash.dependencies import Input, Output, State
from pyspark.sql import SparkSession
from datetime import datetime
import dash_core_components as dcc
import dash_html_components as html
from pyspark import SparkContext, SparkConf

conf = SparkConf()
conf.set("spark.driver.allowMultipleContexts", "true")
spark = SparkSession.builder.master('local').appName("morphy_test111_dgdfgdf").config(conf=conf).getOrCreate()


I submit the job with the following:



spark-submit /app.py


I get this error:



WARN SparkContext: Multiple running SparkContexts detected in the same JVM!
org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:


Any help is much appreciated, as I have not found a proper explanation for this.

python apache-spark pyspark apache-spark-sql






asked 18 hours ago (edited 17 hours ago) by CodeGeek123

  • Why do you need to allow multiple contexts? – eliasah, 18 hours ago

  • I do not. But every time I run spark-submit again I get the error above, and I am not sure how to deal with that. – CodeGeek123, 17 hours ago

1 Answer
a) Check running SparkContexts b) Use existing SparkContexts if they exist c) Best way to modify this code




In that case remove:



SparkContext.stop("morphy_test11")
conf = SparkConf()
conf.set("spark.driver.allowMultipleContexts", "true")


and leave only



spark = SparkSession.builder.master('local').appName("morphy_test111_dgdfgdf").getOrCreate()


Note that the .config(conf=conf) call has to go as well, since conf is no longer defined once those lines are removed. If there is an active context, it will be reused.
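
Here is a minimal sketch of that reuse behaviour, assuming PySpark 2.x and local mode. SparkContext._active_spark_context is an internal PySpark attribute, shown only as one illustrative way to peek at the running context:

from pyspark import SparkContext
from pyspark.sql import SparkSession

# Inspect whether a context is already running in this process.
# _active_spark_context is internal API; used here purely for inspection.
print("Active context:", SparkContext._active_spark_context)

# getOrCreate() hands back the active session instead of building a
# second one, so calling it repeatedly never trips the warning.
spark = SparkSession.builder.master('local').appName("morphy_test111_dgdfgdf").getOrCreate()
again = SparkSession.builder.getOrCreate()
assert spark is again  # same session, same underlying SparkContext

# ... job logic ...

# Stop the context when the script finishes so nothing lingers.
spark.stop()

Also worth noting: each spark-submit starts a fresh driver JVM, so a context cannot be left behind by a previous run. If the warning fires, something inside this same job (one of the imported modules, perhaps) is creating a second context before this line runs.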






answered 18 hours ago by user10622841 (new contributor)

  • Thank you. However, using only the SparkSession.builder line did not work for me, which is why I tried adding the other three lines; that doesn't work either. – CodeGeek123, 18 hours ago