java - Error while building Spark 1.3.0 with JDK 1.6.0_45, Maven 3.0.5 on CentOS 6


When trying to build Spark 1.3.0 with added dependencies in the package, I get errors related to class mismatches:

```
[warn] /u01/spark/core/src/main/scala/org/apache/spark/ExecutorAllocationManager.scala:23: imported `Clock' is permanently hidden by definition of trait Clock in package spark
[warn] import org.apache.spark.util.{SystemClock, Clock}
[warn]                                            ^
[error] /u01/spark/core/src/main/scala/org/apache/spark/ExecutorAllocationManager.scala:127: type mismatch;
[error]  found   : org.apache.spark.util.SystemClock
[error]  required: org.apache.spark.Clock
[error]   private var clock: Clock = new SystemClock()
[error]                              ^
[error] /u01/spark/core/src/main/scala/org/apache/spark/scheduler/DAGScheduler.scala:66: reference to Clock is ambiguous;
[error] it is imported twice in the same scope by
[error] import org.apache.spark.util._
[error] and import org.apache.spark._
[error]     clock: Clock = new SystemClock())
[error]            ^
[warn] /u01/spark/core/src/main/scala/org/apache/spark/deploy/worker/DriverRunner.scala:34: imported `Clock' is permanently hidden by definition of trait Clock in package worker
[warn] import org.apache.spark.util.{Clock, SystemClock}
[warn]                               ^
[error] /u01/spark/core/src/main/scala/org/apache/spark/deploy/worker/DriverRunner.scala:61: type mismatch;
[error]  found   : org.apache.spark.util.SystemClock
[error]  required: org.apache.spark.deploy.worker.Clock
[error]   private var clock: Clock = new SystemClock()
[error]                              ^
[error] /u01/spark/core/src/main/scala/org/apache/spark/deploy/worker/DriverRunner.scala:190: value getTimeMillis is not a member of org.apache.spark.deploy.worker.Clock
[error]       val processStart = clock.getTimeMillis()
[error]                                ^
[error] /u01/spark/core/src/main/scala/org/apache/spark/deploy/worker/DriverRunner.scala:192: value getTimeMillis is not a member of org.apache.spark.deploy.worker.Clock
[error]       if (clock.getTimeMillis() - processStart > successfulRunDuration * 1000) {
[error]                 ^
[warn] /u01/spark/core/src/main/scala/org/apache/spark/executor/Executor.scala:37: imported `MutableURLClassLoader' is permanently hidden by definition of trait MutableURLClassLoader in package executor
[warn] import org.apache.spark.util.{ChildFirstURLClassLoader, MutableURLClassLoader,
[warn]                                                         ^
[error] /u01/spark/core/src/main/scala/org/apache/spark/executor/Executor.scala:319: type mismatch;
[error]  found   : org.apache.spark.util.ChildFirstURLClassLoader
[error]  required: org.apache.spark.executor.MutableURLClassLoader
[error]       new ChildFirstURLClassLoader(urls, currentLoader)
[error]       ^
[error] /u01/spark/core/src/main/scala/org/apache/spark/executor/Executor.scala:321: trait MutableURLClassLoader is abstract; cannot be instantiated
[error]       new MutableURLClassLoader(urls, currentLoader)
[error]       ^
[warn] /u01/spark/core/src/main/scala/org/apache/spark/scheduler/local/LocalBackend.scala:89: postfix operator millis should be enabled
[warn] by making the implicit value scala.language.postfixOps visible.
[warn] This can be achieved by adding the import clause 'import scala.language.postfixOps'
[warn] or by setting the compiler option -language:postfixOps.
[warn] See the Scala docs for value scala.language.postfixOps for a discussion
[warn] why the feature should be explicitly enabled.
[warn]       context.system.scheduler.scheduleOnce(1000 millis, self, ReviveOffers)
[warn]                                                  ^
[warn] /u01/spark/core/src/main/scala/org/apache/spark/util/MutableURLClassLoader.scala:26: imported `ParentClassLoader' is permanently hidden by definition of class ParentClassLoader in package util
[warn] import org.apache.spark.util.ParentClassLoader
[warn]                              ^
[warn] 5 warnings found
[error] 7 errors found
```
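As the log itself says, the ambiguity errors happen because the same trait name (e.g. `Clock`) exists both in `org.apache.spark` and in `org.apache.spark.util`, and both packages are wildcard-imported in the same scope. A minimal sketch with toy packages of my own (`demo` and `demo.util`, not the Spark sources) that reproduces the same scalac behavior:

```scala
// Toy packages reproducing the "reference to Clock is ambiguous" error above.
package demo {
  trait Clock // plays the role of org.apache.spark.Clock

  package util {
    trait Clock { def getTimeMillis(): Long } // plays org.apache.spark.util.Clock
    class SystemClock extends Clock {
      def getTimeMillis(): Long = System.currentTimeMillis()
    }
  }
}

package demo.app {
  import demo._      // brings demo.Clock into scope
  import demo.util._ // brings demo.util.Clock into scope

  object Main {
    // Does not compile: "reference to Clock is ambiguous;
    // it is imported twice in the same scope by import demo.util._ and import demo._"
    // private var clock: Clock = new SystemClock()

    // Fully qualifying the name resolves the ambiguity:
    private var clock: demo.util.Clock = new SystemClock()
  }
}
```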

I got the same errors when trying to build with the included Maven and JDK 1.7.

The full build output is on Pastebin (id i9pfevj8), and the full pom.xml is on Pastebin (id 8gegt5ee).

[UPDATE]

I have changed the Spark dependency versions to match 1.3.0 to resolve cyclic dependency errors:

```xml
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming_2.10</artifactId>
  <version>1.3.0</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming-kafka_2.10</artifactId>
  <version>1.3.0</version>
</dependency>
```

I realized that the Kafka streaming modules come pre-built with Spark 1.3.0 on MapR 3.x, so these modules and their dependencies are only needed if one needs to produce a stream in the application.
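For context, this is roughly the kind of application code that pulls in those two dependencies; a minimal sketch assuming Spark 1.3.0, with placeholder ZooKeeper quorum, group id, and topic name:

```scala
// Minimal sketch, assuming Spark 1.3.0; quorum/group/topic are placeholders.
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object KafkaStreamDemo {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("KafkaStreamDemo").setMaster("local[2]")
    val ssc = new StreamingContext(conf, Seconds(10))

    // createStream(ssc, zkQuorum, groupId, Map(topic -> numReceiverThreads))
    // yields a DStream of (key, message) pairs from Kafka.
    val messages = KafkaUtils.createStream(ssc, "localhost:2181", "demo-group", Map("demo-topic" -> 1))
    messages.map(_._2).print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```

If the jars really do ship with the cluster's Spark, marking these dependencies as `provided` in the pom should keep them out of the application jar.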

