How to send the result of a sql statement to a for loop using pyspark?
I am trying to send the result of a SQL statement to a for loop. I am new to Spark and Python; please help.

from pyspark import SparkContext
from pyspark.sql import HiveContext

sc = SparkContext()
hive_context = HiveContext(sc)

# bank = hive_context.table("cip_utilities.file_upload_temp")
data = hive_context.sql("select * from cip_utilities.cdm_variable_dict")
hive_context.sql("describe cip_utilities.cdm_variable_dict").registerTempTable("schema_def")
temp_data = hive_context.sql("select * from schema_def")
temp_data.show()
data1 = hive_context.sql("select col_name from schema_def where data_type <> 'string'")
data1.show()

python apache-spark pyspark pyspark-sql
asked Nov 20 '18 at 6:53 by Shankar Panda (edited Nov 20 '18 at 7:20)
2 Answers
Use the DataFrame.collect() method, which aggregates the result of a Spark SQL query from all executors into the driver. collect() returns a Python list, each element of which is a Spark Row. You can then iterate over this list in a for loop.

Code snippet:
data1 = hive_context.sql("select col_name from schema_def where data_type <> 'string'")
column_names_as_python_list_of_rows = data1.collect()
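The loop itself is then plain Python. A minimal sketch (printing each name is only an illustration):

for row in column_names_as_python_list_of_rows:
    # Each element is a pyspark.sql.Row; fields are accessible by name.
    print(row.col_name)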
answered Nov 20 '18 at 8:34 by y2k-shubham
I think you need to ask yourself why you want to iterate over the data.
Are you doing an aggregation or transforming the data? If so, consider doing it with the Spark API.
Printing some text? Then use .collect() to retrieve the data back to your driver process, and loop over the result in the usual Python way.

answered Nov 20 '18 at 8:40 by ThatDataGuy
Yes, I am trying to find the maximum, minimum, and standard deviation. That's why I need to send each column name in an iteration.
– Shankar Panda, Nov 20 '18 at 9:06
You should use the built-in Spark functions to do that; it will be far more performant. spark.apache.org/docs/2.2.0/api/python/…
– ThatDataGuy, Nov 20 '18 at 14:28
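For example, here is a minimal sketch of what this comment suggests, reusing data1 from the question; the choice of aggregates and the aliases are illustrative assumptions:

from pyspark.sql import functions as F

# Sketch: compute max, min, and standard deviation for every
# non-string column in a single Spark job, instead of looping
# over the columns on the driver.
df = hive_context.table("cip_utilities.cdm_variable_dict")
numeric_cols = [row.col_name for row in data1.collect()]

aggregates = []
for c in numeric_cols:
    aggregates += [
        F.max(c).alias(c + "_max"),
        F.min(c).alias(c + "_min"),
        F.stddev(c).alias(c + "_stddev"),
    ]

df.select(aggregates).show()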