
hadoop eclipse plugin: building, installing, and troubleshooting

Big Data and Cloud Computing, 2016-02-27

Plugin name: hadoop2x-eclipse-plugin

Plugin repository: https://github.com/winghc/hadoop2x-eclipse-plugin

1. Download and extract Hadoop 2.x. Download page: http://hadoop./releases.html#Download (I used the pre-built binary package.)
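A minimal command-line sketch for fetching a pre-built release on Linux; the mirror URL, the 2.6.0 version, and the target directory are assumptions, so substitute the release and paths that match your environment:

# download and extract a pre-built Hadoop release (URL and version are assumptions)
wget http://archive.apache.org/dist/hadoop/common/hadoop-2.6.0/hadoop-2.6.0.tar.gz
tar -xzf hadoop-2.6.0.tar.gz -C /usr/share

# rename so the path matches the hadoop.home value used in step 6
mv /usr/share/hadoop-2.6.0 /usr/share/hadoop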

2. Download and extract Eclipse (mine is version 4.4.1; other versions work similarly).

3. Download the hadoop2x-eclipse-plugin source and extract it to any directory you like; for convenience I will refer to it below as "H2EP_HOME".

4. Building the plugin requires Ant. Download page: http://ant./bindownload.cgi

Set the ANT_HOME environment variable to the directory where Ant was extracted, and add %ANT_HOME%\bin to the PATH environment variable (the Linux setup is analogous).
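A minimal setup sketch for a Linux shell, assuming Ant was extracted to /opt/apache-ant-1.9.4 (the path and version are assumptions; on Windows, use set and %ANT_HOME%\bin instead, as described above):

# add to ~/.bashrc or run in the current shell; the Ant location is an assumption
export ANT_HOME=/opt/apache-ant-1.9.4
export PATH=$PATH:$ANT_HOME/bin

# verify that ant is picked up from the PATH
ant -version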

5. Open a command-line window and change into the "H2EP_HOME" directory.

6. Run: ant jar -Dversion=2.x.x -Dhadoop.version=2.x.x -Declipse.home=/opt/eclipse -Dhadoop.home=/usr/share/hadoop


For example: ant jar -Dversion=2.5.0-cdh5.3.5 -Dhadoop.version=2.5.0-cdh5.3.5 -Declipse.home=/program/eclipse -Dhadoop.home=/cloud/hadoop

eclipse.home should point to the Eclipse installation directory.

hadoop.home should point to the directory where Hadoop was extracted.

Replace 2.x.x with your actual Hadoop version number.

7. The build hangs at the ivy-resolve-common step.

The cause is that a few dependency packages cannot be found (their repository paths have probably changed); in fact the build does not need them at all.

Solution:

修改"H2EP_HOME"\src\contrib\eclipse-plugin\build.xml

Find:

<target name="compile" depends="init, ivy-retrieve-common" unless="skip.contrib">

Remove the depends attribute, changing it to:

<target name="compile" unless="skip.contrib">

8. Run the build command from step 6 again; it now fails with errors about jar files that cannot be copied.

Solution:

修改"H2EP_HOME"\ivy\libraries.properties文件,

將報(bào)錯(cuò)的jar包版本號(hào)跟換成與"HADOOP_HOME"\share\hadoop\common\lib下面jar對(duì)應(yīng)的版本號(hào)

Several jar versions may be mismatched, so this step may have to be repeated a few times; a sketch of one such edit follows.
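As an illustrative sketch: if the build reports that commons-lang-2.5.jar cannot be copied while "HADOOP_HOME"\share\hadoop\common\lib actually contains commons-lang-2.6.jar, the matching entry in "H2EP_HOME"\ivy\libraries.properties would be changed like this (the jar name and version numbers here are assumptions; use whatever your Hadoop distribution ships):

# before: the version the plugin build expects (illustrative)
commons-lang.version=2.5

# after: the version actually present under "HADOOP_HOME"\share\hadoop\common\lib
commons-lang.version=2.6

Repeat the same kind of change for every other property the error messages point at (for example commons-collections.version or commons-httpclient.version).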

9. Run the build command from step 6 once more; this time it completes successfully.

在"H2EP_HOME"\build\contrib\eclipse-plugin下生成hadoop-eclipse-plugin-2.x.x.jar插件

10. Copy hadoop-eclipse-plugin-2.x.x.jar into the Eclipse plugins directory and start Eclipse, for example:
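On Linux, reusing the example paths from step 6 (the Eclipse directory and the version string in the jar name are assumptions carried over from that example; adjust them to your setup):

# "H2EP_HOME" is the plugin source directory from step 3
cp "H2EP_HOME"/build/contrib/eclipse-plugin/hadoop-eclipse-plugin-2.5.0-cdh5.3.5.jar /program/eclipse/plugins/

# start Eclipse
/program/eclipse/eclipse &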

11. Open Window -> Preferences and find the Hadoop Map/Reduce page.

Set the "hadoop installation directory" option to point to your Hadoop installation directory.

12. Open Window -> Show View -> Other, find Map/Reduce Locations, and make that view visible.

13. Right-click inside the Map/Reduce Locations view and choose New Hadoop Location.

Nothing happens. Checking the Eclipse log (workspace directory\.metadata\.log) reveals the error:

java.lang.ClassNotFoundException: org.apache.commons.collections.map.UnmodifiableMap  

Solution:

修改"H2EP_HOME"\src\contrib\eclipse-plugin\build.xml

Add:

<copy file="${hadoop.home}/share/hadoop/common/lib/commons-collections-${commons-collections.version}.jar"  todir="${build.dir}/lib" verbose="true"/>

and, inside the <jar> element, add the following entry to the Bundle-ClassPath value in the manifest:

lib/commons-collections-${commons-collections.version}.jar,

14. Start Eclipse with eclipse.exe -clean (this clears the cache; otherwise the problem from step 13 may reappear).

The complete build.xml ("H2EP_HOME"\src\contrib\eclipse-plugin\build.xml) is shown below:

<?xml version="1.0" encoding="UTF-8" standalone="no"?>

<!--
   Licensed to the Apache Software Foundation (ASF) under one or more
   contributor license agreements.  See the NOTICE file distributed with
   this work for additional information regarding copyright ownership.
   The ASF licenses this file to You under the Apache License, Version 2.0
   (the "License"); you may not use this file except in compliance with
   the License.  You may obtain a copy of the License at

       http://www./licenses/LICENSE-2.0

   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.
-->

<project default="jar" name="eclipse-plugin">

  <import file="../build-contrib.xml"/>

  <path id="eclipse-sdk-jars">
    <fileset dir="${eclipse.home}/plugins/">
      <include name="org.eclipse.ui*.jar"/>
      <include name="org.eclipse.jdt*.jar"/>
      <include name="org.eclipse.core*.jar"/>
      <include name="org.eclipse.equinox*.jar"/>
      <include name="org.eclipse.debug*.jar"/>
      <include name="org.eclipse.osgi*.jar"/>
      <include name="org.eclipse.swt*.jar"/>
      <include name="org.eclipse.jface*.jar"/>

      <include name="org.eclipse.team.cvs.ssh2*.jar"/>
      <include name="com.jcraft.jsch*.jar"/>
    </fileset> 
  </path>

  <path id="hadoop-sdk-jars">
    <fileset dir="${hadoop.home}/share/hadoop/mapreduce">
      <include name="hadoop*.jar"/>
    </fileset> 
    <fileset dir="${hadoop.home}/share/hadoop/hdfs">
      <include name="hadoop*.jar"/>
    </fileset> 
    <fileset dir="${hadoop.home}/share/hadoop/common">
      <include name="hadoop*.jar"/>
    </fileset> 
  </path>



  <!-- Override classpath to include Eclipse SDK jars -->
  <path id="classpath">
    <pathelement location="${build.classes}"/>
    <!--pathelement location="${hadoop.root}/build/classes"/-->
    <path refid="eclipse-sdk-jars"/>
    <path refid="hadoop-sdk-jars"/>
  </path>

  <!-- Skip building if eclipse.home is unset. -->
  <target name="check-contrib" unless="eclipse.home">
    <property name="skip.contrib" value="yes"/>
    <echo message="eclipse.home unset: skipping eclipse plugin"/>
  </target>

 <!--<target name="compile" depends="init, ivy-retrieve-common" unless="skip.contrib">-->
 <!-- depends="init, ivy-retrieve-common" removed here (see step 7) -->
 <target name="compile"  unless="skip.contrib">
    <echo message="contrib: ${name}"/>
    <javac
     encoding="${build.encoding}"
     srcdir="${src.dir}"
     includes="**/*.java"
     destdir="${build.classes}"
     debug="${javac.debug}"
     deprecation="${javac.deprecation}">
     <classpath refid="classpath"/>
    </javac>
  </target>

  <!-- Override jar target to specify manifest -->
  <target name="jar" depends="compile" unless="skip.contrib">
    <mkdir dir="${build.dir}/lib"/>
    <copy  todir="${build.dir}/lib/" verbose="true">
          <fileset dir="${hadoop.home}/share/hadoop/mapreduce">
           <include name="hadoop*.jar"/>
          </fileset>
    </copy>
    <copy  todir="${build.dir}/lib/" verbose="true">
          <fileset dir="${hadoop.home}/share/hadoop/common">
           <include name="hadoop*.jar"/>
          </fileset>
    </copy>
    <copy  todir="${build.dir}/lib/" verbose="true">
          <fileset dir="${hadoop.home}/share/hadoop/hdfs">
           <include name="hadoop*.jar"/>
          </fileset>
    </copy>
    <copy  todir="${build.dir}/lib/" verbose="true">
          <fileset dir="${hadoop.home}/share/hadoop/yarn">
           <include name="hadoop*.jar"/>
          </fileset>
    </copy>

    <copy  todir="${build.dir}/classes" verbose="true">
          <fileset dir="${root}/src/java">
           <include name="*.xml"/>
          </fileset>
    </copy>



    <copy file="${hadoop.home}/share/hadoop/common/lib/protobuf-java-${protobuf.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/log4j-${log4j.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-cli-${commons-cli.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-configuration-${commons-configuration.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-lang-${commons-lang.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
  <!-- commons-collections dependency added here (see step 13) -->
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-collections-${commons-collections.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/jackson-core-asl-${jackson.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/jackson-mapper-asl-${jackson.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/slf4j-log4j12-${slf4j-log4j12.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/slf4j-api-${slf4j-api.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/guava-${guava.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/hadoop-auth-${hadoop.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/commons-cli-${commons-cli.version}.jar"  todir="${build.dir}/lib" verbose="true"/>
    <copy file="${hadoop.home}/share/hadoop/common/lib/netty-${netty.version}.jar"  todir="${build.dir}/lib" verbose="true"/>

    <jar
      jarfile="${build.dir}/hadoop-${name}-${version}.jar"
      manifest="${root}/META-INF/MANIFEST.MF">
      <manifest>
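   <!-- Note: the commons-lang entry in the Bundle-ClassPath below takes its version
        from libraries.properties, and a commons-collections entry has been added
        (see step 13). -->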
   <attribute name="Bundle-ClassPath" 
    value="classes/, 
 lib/hadoop-mapreduce-client-core-${hadoop.version}.jar,
 lib/hadoop-mapreduce-client-common-${hadoop.version}.jar,
 lib/hadoop-mapreduce-client-jobclient-${hadoop.version}.jar,
 lib/hadoop-auth-${hadoop.version}.jar,
 lib/hadoop-common-${hadoop.version}.jar,
 lib/hadoop-hdfs-${hadoop.version}.jar,
 lib/protobuf-java-${protobuf.version}.jar,
 lib/log4j-${log4j.version}.jar,
 lib/commons-cli-1.2.jar,
 lib/commons-configuration-1.6.jar,
 lib/commons-httpclient-3.1.jar,
 lib/commons-lang-${commons-lang.version}.jar,
 lib/commons-collections-${commons-collections.version}.jar,
 lib/jackson-core-asl-1.8.8.jar,
 lib/jackson-mapper-asl-1.8.8.jar,
 lib/slf4j-log4j12-1.7.5.jar,
 lib/slf4j-api-1.7.5.jar,
 lib/guava-${guava.version}.jar,
 lib/netty-${netty.version}.jar"/>
   </manifest>
      <fileset dir="${build.dir}" includes="classes/ lib/"/>
      <!--fileset dir="${build.dir}" includes="*.xml"/-->
      <fileset dir="${root}" includes="resources/ plugin.xml"/>
    </jar>
  </target>

</project>
