java - Spark 0.9.1 on Hadoop 2.2.0 Maven dependency -


I set the Apache Spark Maven dependency in pom.xml as follows:

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>0.9.1</version>
    </dependency>

But I found that this dependency pulls in "hadoop-client-1.0.4.jar" and "hadoop-core-1.0.4.jar", and when I run my program I get the error "org.apache.hadoop.ipc.RemoteException: Server IPC version 9 cannot communicate with client version 4", which indicates I need to switch the Hadoop version from 1.0.4 to 2.2.0.
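One way to confirm which Hadoop version Maven actually resolves (a sketch, assuming a standard Maven setup with the maven-dependency-plugin available) is to inspect the dependency tree, filtered to Hadoop artifacts:

    mvn dependency:tree -Dincludes=org.apache.hadoop

If the output lists hadoop-client:jar:1.0.4 under spark-core, the IPC version mismatch above is expected when talking to a Hadoop 2.2.0 cluster.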

Update:

Is the following solution the correct way to solve the problem?

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>0.9.1</version>
        <exclusions>
            <exclusion>
                <groupId>org.apache.hadoop</groupId>
                <artifactId>hadoop-core</artifactId>
            </exclusion>
            <exclusion>
                <groupId>org.apache.hadoop</groupId>
                <artifactId>hadoop-client</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>2.2.0</version>
    </dependency>

Thanks for any help.

Spark 1.2.0 depends on Hadoop 2.2.0 by default. If you can update your Spark dependency to 1.2.0 (or newer), that will solve the problem.
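For example, bumping the Spark version in pom.xml would look like the following (a sketch using the 1.2.0 version mentioned above; check Maven Central for the current release):

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.2.0</version>
    </dependency>

With this version, the transitive hadoop-client dependency is already 2.2.0, so the exclusions shown earlier are no longer needed.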

