WARN SparkContext: Multiple running SparkContexts detected in the same JVM
I am writing this because I cannot find an answer to this problem. I am using PySpark and running a script, but I cannot see where my other Spark sessions are or how to close them down. I am not initializing any other SparkContext within this spark-submit job, so is it a context left behind from a previous run? Note that I do not want to allow multiple contexts. What is the best way to do the following?
- Check for running SparkContexts.
- Use an existing SparkContext if one exists.
- Modify the code below accordingly.
from src import config
import pandas as pd
import plotly.graph_objs as go
from visualize_main import app
from dash.dependencies import Input, Output, State
from pyspark.sql import SparkSession
from datetime import datetime
import dash_core_components as dcc
import dash_html_components as html
from pyspark import SparkContext, SparkConf

# conf referenced by the builder below (the answer quotes this line from the script)
conf = SparkConf()

spark = SparkSession.builder.master('local').appName("morphy_test111_dgdfgdf").config(conf=conf).getOrCreate()
I submit the job with the following:
spark-submit /app.py
I get this error
WARN SparkContext: Multiple running SparkContexts detected in the same JVM!
org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
Any help is much appreciated, as I have not found a proper explanation for this.
python apache-spark pyspark apache-spark-sql
asked 18 hours ago, edited 17 hours ago by CodeGeek123
Why do you need to allow multiple contexts?
– eliasah, 18 hours ago

I do not. But every time I run spark-submit again I get the error above, and I am not sure how to deal with that.
– CodeGeek123, 17 hours ago
1 Answer
a) Check running SparkContexts, b) use existing SparkContexts if they exist, c) best way to modify this code.

In that case remove:
SparkContext.stop("morphy_test11")
conf = SparkConf()
conf.set("spark.driver.allowMultipleContexts", "true")
and leave only
spark = SparkSession.builder.master('local').appName("morphy_test111_dgdfgdf").getOrCreate()
If there is an active SparkContext, getOrCreate() will reuse it instead of creating a new one.
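For completeness, a minimal sketch of that pattern: check whether a context is already running in this driver process, then let getOrCreate() reuse or create the session. This is not taken verbatim from the answer; SparkContext._active_spark_context is an internal PySpark attribute, and the final stop() call is an assumption about how the job should end so the next spark-submit run starts clean.

from pyspark import SparkContext
from pyspark.sql import SparkSession

# Internal attribute: the SparkContext already created in this Python process, if any.
existing_sc = SparkContext._active_spark_context
if existing_sc is not None:
    print("Reusing existing SparkContext:", existing_sc.appName)

# getOrCreate() returns the active session (and its underlying context) if one exists,
# otherwise it creates a new one with these settings.
spark = (SparkSession.builder
         .master("local")
         .appName("morphy_test111_dgdfgdf")
         .getOrCreate())

sc = spark.sparkContext  # the (possibly reused) context
print(sc.applicationId)

# ... job logic ...

# Stop the session when the job is done so nothing lingers in the driver JVM.
spark.stop()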
answered 18 hours ago by user10622841 (new contributor)
Thank you. However, using only the SparkSession.builder line did not work for me, which is why I tried adding the other three lines, and that doesn't work either.
– CodeGeek123, 18 hours ago