Why doesn't dataset's foreach method require an encoder, but map does?

up vote 1 down vote favorite

I have two datasets: Dataset[User] and Dataset[Book] where both User and Book are case classes. I join them like this:



val joinDS = ds1.join(ds2, "userid")



If I try to map over each element in joinDS, the compiler complains that an encoder is missing:




not enough arguments for method map: (implicit evidence$46: org.apache.spark.sql.Encoder[Unit])org.apache.spark.sql.Dataset[Unit].
Unspecified value parameter evidence$46.
Unable to find encoder for type stored in a Dataset.



But the same error does not occur if I use foreach instead of map. Why doesn't foreach require an encoder as well? I have already imported all implicits from the Spark session, so why does map require an encoder at all when the dataset is the result of joining two datasets of case classes? Also, what type of dataset do I get from that join? Is it a Dataset[Row], or something else?
  • Pretty sure you can't encode Unit.
    – erip
    Nov 8 at 18:15
scala apache-spark
asked Nov 8 at 17:51
vaer-k
1 Answer
up vote 4 down vote accepted

TL;DR An Encoder is required to transform the result into Spark SQL's internal format, and there is no need for that in the case of foreach (or any other sink).



Just take a look at the signatures. map is



def map[U](func: (T) ⇒ U)(implicit arg0: Encoder[U]): Dataset[U] 


In plain words, it transforms records from T to U and then uses the Encoder[U] to convert the result into Spark's internal representation.
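This also answers the follow-up in the question: an untyped join like ds1.join(ds2, "userid") returns a DataFrame, i.e. a Dataset[Row]. A hedged sketch of re-typing that result so map can find an encoder; it assumes an active SparkSession named spark, and the case classes and field names here are hypothetical:

```scala
// Sketch only -- assumes an active SparkSession (`spark`) and existing
// ds1: Dataset[User], ds2: Dataset[Book]; field names are illustrative.
case class User(userid: Long, name: String)
case class Book(userid: Long, title: String)
case class UserBook(userid: Long, name: String, title: String)

import spark.implicits._   // derives Encoders for case classes and primitives

val joinDF = ds1.join(ds2, "userid")   // untyped join => DataFrame (Dataset[Row])
val joinDS = joinDF.as[UserBook]       // re-typed; Encoder[UserBook] is derived
val titles = joinDS.map(_.title)       // ok: Encoder[String] is in scope
joinDS.foreach(ub => println(ub))      // ok: no encoder needed at all
```

Note that re-typing does not rescue a Unit-returning lambda: as the error message shows, there is no Encoder[Unit], so a side-effecting function belongs in foreach, not map.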



foreach, on the other hand, is



def foreach(f: (T) ⇒ Unit): Unit 


In other words, it doesn't produce any result. Since there is no result to be stored, an Encoder is simply unnecessary.
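The mechanism can be reproduced outside Spark with a toy typeclass. This is a sketch, not Spark code: Enc and MiniDataset are made-up names standing in for Encoder and Dataset, just to show why evidence is demanded for map's result type but not for foreach.

```scala
// Toy analogue (NOT Spark itself) of why Dataset.map demands an Encoder
// for its *result* type while Dataset.foreach does not. `Enc` stands in
// for org.apache.spark.sql.Encoder.
trait Enc[A] { def name: String }
object Enc {
  // Evidence exists for Int; deliberately no Enc[Unit], mirroring the
  // missing Encoder[Unit] in the question's error message.
  implicit val intEnc: Enc[Int] = new Enc[Int] { def name = "Int" }
}

class MiniDataset[T](private val data: Seq[T]) {
  // Like Dataset.map: the output is stored, so evidence Enc[U] is required.
  def map[U](f: T => U)(implicit enc: Enc[U]): MiniDataset[U] =
    new MiniDataset(data.map(f))
  // Like Dataset.foreach: nothing is stored, so no evidence is needed.
  def foreach(f: T => Unit): Unit = data.foreach(f)
  def collect(): Seq[T] = data
}

object MiniDemo {
  def main(args: Array[String]): Unit = {
    val ds = new MiniDataset(Seq(1, 2, 3))
    println(ds.map(_ * 2).collect())   // compiles: Enc[Int] is in scope
    ds.foreach(x => println(x))        // compiles with no Enc at all
    // ds.map(x => println(x))         // would NOT compile: no Enc[Unit]
  }
}
```

Uncommenting the last map call reproduces the "not enough arguments for method map ... Encoder[Unit]" shape of the question's error, because the lambda's result type is Unit and no evidence for it exists.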
  • I see. I thought it needed to encode the input
    – vaer-k
    Nov 8 at 18:20










  • @vaer-k Then it would need Encoder[T], not Encoder[U] (or both).
    – Alexey Romanov
    Nov 9 at 8:21
edited Nov 8 at 18:21
answered Nov 8 at 18:19
user10465355