sbt-assembly not found when building Spark 0.5



























I am trying to build the 0.5 branch of Spark, but the build fails with the following error:




sbt.ResolveException: unresolved dependency: com.eed3si9n#sbt-assembly;0.8.3: not found




As a workaround, I downloaded the Ivy descriptors and JARs manually from dl.bintray.com and put them into my local Ivy folder.



Specifically, I created an sbt-assembly directory under com.eed3si9n and renamed the files accordingly:

[screenshot of the renamed files]



However, this does not work. What is the correct solution?
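For context, sbt resolves plugins through Ivy-style patterns that encode the Scala and sbt versions in the path, so manually placed files are only found if the directory layout matches those patterns exactly. An illustrative layout for a hand-built local Ivy repository, assuming Scala 2.9.1 and sbt 0.11.3 (a sketch, not verified against this exact setup), would be:

```
~/.ivy2/local/com.eed3si9n/sbt-assembly/scala_2.9.1/sbt_0.11.3/0.8.3/
├── ivys/
│   └── ivy.xml
└── jars/
    └── sbt-assembly.jar
```

Renaming the downloaded files without reproducing this nesting (including the scala_* and sbt_* directories) would explain why resolution still fails.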

































  • I'm curious why you need such an old Spark version?

    – Jacek Laskowski
    Dec 25 '18 at 22:13











  • @JacekLaskowski Because I am trying to fully understand its source code, and the 0.5 branch is in its initial form. Hence, I think it is a good starting point.

    – chenzhongpu
    Dec 27 '18 at 1:44











  • Have fun! I'm a bit doubtful about the results. I'd rather start with the latest and greatest even though the codebase could be overwhelming. The base has not changed that much (and even if it did, it's more worthwhile to know how things are now not back then, isn't it?). Ping me offline to discuss it.

    – Jacek Laskowski
    Dec 27 '18 at 9:34


















scala apache-spark sbt






asked Nov 22 '18 at 6:38









chenzhongpu

1 Answer
































Spark branch-0.5 uses sbt 0.11.3 according to project/build.properties, so that's pretty old.
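For reference, the sbt launcher reads the version to use from project/build.properties, which on this branch would contain a line along these lines (illustrative):

```
sbt.version=0.11.3
```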



sbt community repository location



There's a bug in project/plugins.sbt. It's pointing to scalasbt.artifactoryonline.com, but it should point to repo.scala-sbt.org.



$ git diff
diff --git a/project/plugins.sbt b/project/plugins.sbt
index 63d789d0c1..70dcfdba00 100644
--- a/project/plugins.sbt
+++ b/project/plugins.sbt
@@ -1,7 +1,7 @@
resolvers ++= Seq(
"sbt-idea-repo" at "http://mpeltonen.github.com/maven/",
Classpaths.typesafeResolver,
- Resolver.url("sbt-plugin-releases", new URL("http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases/"))(Resolver.ivyStylePatterns)
+ Resolver.url("sbt-plugin-releases", new URL("http://repo.scala-sbt.org/scalasbt/sbt-plugin-releases/"))(Resolver.ivyStylePatterns)
)
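If you would rather not edit the checked-in file, the same fix can be sketched as an extra resolver in a separate file, e.g. project/zz-resolvers.sbt (the file name is arbitrary and this variant is untested against this exact branch):

```scala
// Hypothetical additional file: project/zz-resolvers.sbt
// Appends the relocated sbt community repository as an extra Ivy-style
// resolver, so sbt-assembly can resolve without touching project/plugins.sbt.
resolvers += Resolver.url(
  "sbt-plugin-releases-new",
  new URL("http://repo.scala-sbt.org/scalasbt/sbt-plugin-releases/")
)(Resolver.ivyStylePatterns)
```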


JDK 1.6



To run an older version of sbt, you need a correspondingly old JDK, in this case JDK 1.6. On macOS, however, JLine has an issue under JDK 1.6, so I had to disable it.



$ jenv shell 1.6
$ java -version
java version "1.6.0_65"
...
$ sbt/sbt -Djline.terminal=jline.UnsupportedTerminal


This should get the sbt shell started. Once it comes up, type in:



> package





  • Thanks. Since the source code of the 0.5 branch is much smaller in scale, I think it is good to read the source code based on that branch. BTW, how can I determine the correct JDK version for different versions of Spark? Does it mainly depend on the Scala version used?

    – chenzhongpu
    Nov 22 '18 at 8:57













  • Basically. You can sort of guess the stable JDK version based on the release date. JDK 1.7 came out in 2011, so maybe it works too, but JDK 1.6 would be a safer bet.

    – Eugene Yokota
    Nov 22 '18 at 10:18












answered Nov 22 '18 at 7:38









Eugene Yokota
