
Compiling the Hadoop Eclipse Plugin, in Detail

Hadoop | Author: oynyo01 | 2012-12-02 19:34:02

 

I. Compiling the Hadoop eclipse-plugin on Linux

Operating system: Ubuntu 10.04 x86

1. Download the packages:

hadoop-1.0.4.tar.gz

eclipse-jee-indigo-SR2-linux-gtk.tar.gz

Extract both archives into the /software directory and rename them eclipse3.7 and hadoop respectively, i.e.:

     HADOOP_HOME=/software/hadoop

     ECLIPSE_HOME=/software/eclipse3.7
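
If you prefer to do the extraction and renaming from the shell, a minimal sketch follows; it assumes both archives were downloaded to the current directory and that the Eclipse archive unpacks to a directory named eclipse.

$ sudo mkdir -p /software
$ sudo tar -xzf hadoop-1.0.4.tar.gz -C /software
$ sudo mv /software/hadoop-1.0.4 /software/hadoop
$ sudo tar -xzf eclipse-jee-indigo-SR2-linux-gtk.tar.gz -C /software
$ sudo mv /software/eclipse /software/eclipse3.7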

2. Install the JDK and Ant:

$ sudo apt-get install ant openjdk-6-jdk autoconf libtool
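
A quick optional check that both tools ended up on the PATH before building:

$ java -version   # should report a 1.6.x OpenJDK runtime
$ ant -version    # should print the installed Ant version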

 

3. Edit {HADOOP_HOME}/build.xml

    3.1 Change the Hadoop version at line 31 from

    <property name="version" value="1.0.4-SNAPSHOT"/>

to:

    <property name="version" value="1.0.4"/>

3.2 Comment out the ivy-download target at line 2418, since ivy.jar is already bundled with the source:

   <!--
   <target name="ivy-download" description="To download ivy" unless="offline">
     ...
   </target>
   -->


   3.3 At line 2426, remove ivy-download from the depends attribute of the target that references it and keep the rest of the target as it is.
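
The line numbers above refer to the hadoop-1.0.4 build.xml and may drift between releases; grep will show every place the ivy-download target is defined or listed as a dependency, so you can comment it out and remove it from the depends attributes it appears in:

$ grep -n "ivy-download" /software/hadoop/build.xml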

4. Modify checkReturnValue in /hadoop-1.0.4/src/core/org/apache/hadoop/fs/FileUtil.java: comment out the body of the method, leaving it empty.

(This step is important: on Windows, the hadoop-core-1.0.4.jar that ships with hadoop-1.0.4 raises a permission error, for example "12/04/24 15:32:44 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable", whereas the hadoop-core-1.0.4.jar rebuilt after patching FileUtil.java does not hit this problem on Windows.)

......
  private static void checkReturnValue(boolean rv, File p,
                                       FsPermission permission
                                       ) throws IOException {
    // body commented out so the permission check no longer throws on Windows
  }
......

5. Edit {HADOOP_HOME}/src/contrib/build-contrib.xml

Add the two lines shown below for eclipse.home and version, which supply the Eclipse path and the Hadoop version:

name="hadoopbuildcontrib" xmlns:ivy="antlib:org.apache.ivy.ant">

 

   name="eclipse.home" location="/software/eclipse3.7"/>

   name="version" value="1.0.4"/>

   name="name" value="${ant.project.name}"/>

   name="root" value="${basedir}"/>

   name="hadoop.root" location="${root}/../../../"/>

...

6. Build

6.1 Build Hadoop

$ cd /software/hadoop

$ ant compile

On Windows, using the hadoop-core-1.0.4.jar produced by this build avoids the permission problem described in step 4.

6.2 Build the eclipse-plugin

$ cd /software/hadoop/src/contrib/eclipse-plugin/

$ ant jar

When this succeeds, hadoop-eclipse-plugin-1.0.4.jar is generated in the /software/hadoop/build/contrib/eclipse-plugin directory.
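
Before installing it, you can check what the freshly built plugin actually bundles (assuming unzip is available):

$ cd /software/hadoop/build/contrib/eclipse-plugin
$ unzip -l hadoop-eclipse-plugin-1.0.4.jar | grep lib/   # at this stage only lib/commons-cli-1.2.jar and lib/hadoop-core.jar are listed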

At this point hadoop-eclipse-plugin-1.0.4.jar is still missing several jars; if you copy it into ${ECLIPSE_HOME}/plugins as-is, connecting to DFS fails with the following error:

An internal error occurred during: "Connecting to DFS Hadoop".org/apache/commons/configuration/Configuration

Fix: modify hadoop-eclipse-plugin-1.0.4.jar as follows.

(1) Open the jar with an archive manager; it currently bundles only two jars, commons-cli-1.2.jar and hadoop-core.jar. Copy commons-configuration-1.6.jar, commons-httpclient-3.0.1.jar, commons-lang-2.4.jar, jackson-core-asl-1.8.8.jar and jackson-mapper-asl-1.8.8.jar from ${HADOOP_HOME}/lib into the lib directory inside hadoop-eclipse-plugin-1.0.4.jar.

(2) Edit MANIFEST.MF under the jar's META-INF directory and change the classpath to:

Bundle-ClassPath: classes/,lib/hadoop-core.jar,lib/commons-cli-1.2.jar,lib/commons-httpclient-3.0.1.jar,lib/jackson-core-asl-1.8.8.jar,lib/jackson-mapper-asl-1.8.8.jar,lib/commons-configuration-1.6.jar,lib/commons-lang-2.4.jar

Copy the modified hadoop-eclipse-plugin-1.0.4.jar into the Eclipse plugins directory and restart Eclipse; the Hadoop Eclipse build environment on Linux is now complete.
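
If you would rather perform steps (1) and (2) from the command line than with a graphical archive manager, the following sketch works; it assumes the zip and unzip utilities are installed and uses the paths from this article.

$ cd /software/hadoop/build/contrib/eclipse-plugin
$ mkdir -p lib
$ cp /software/hadoop/lib/commons-configuration-1.6.jar \
     /software/hadoop/lib/commons-httpclient-3.0.1.jar \
     /software/hadoop/lib/commons-lang-2.4.jar \
     /software/hadoop/lib/jackson-core-asl-1.8.8.jar \
     /software/hadoop/lib/jackson-mapper-asl-1.8.8.jar lib/
$ zip hadoop-eclipse-plugin-1.0.4.jar lib/*.jar                    # add the extra jars under lib/ inside the plugin jar
$ unzip -o hadoop-eclipse-plugin-1.0.4.jar META-INF/MANIFEST.MF    # pull out the manifest for editing
$ # edit META-INF/MANIFEST.MF, set Bundle-ClassPath as shown above, then put it back:
$ zip hadoop-eclipse-plugin-1.0.4.jar META-INF/MANIFEST.MF
$ cp hadoop-eclipse-plugin-1.0.4.jar /software/eclipse3.7/plugins/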


 

II. Compiling the eclipse-plugin on Windows

Operating system: Windows 7, 32-bit

1. Download the following packages:

jdk-6u30-windows-i586.exe

apache-ant-1.8.4-bin.zip

eclipse-jee-indigo-SR2-win32.zip

hadoop-1.0.4.tar.gz

 

2. Set up the tools

2.1 Install the JDK, configure JAVA_HOME, and add %JAVA_HOME%\bin to the Path variable.

2.2 Extract apache-ant-1.8.4-bin.zip to the E:/software/ directory, configure the ANT_HOME variable, and add %ANT_HOME%\bin to the Path variable.

2.3 Extract hadoop-1.0.4.tar.gz to the E:/software/ directory.

2.4 Extract eclipse-jee-indigo-SR2-win32.zip to the E:/software/ directory and rename it eclipse3.7.
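
Optionally, open a new Command Prompt and confirm that the variables took effect before building:

>java -version
>ant -version
>echo %JAVA_HOME%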

 

3. Adjust the eclipse-plugin build files and prepare the jars

  3.1 Edit the build-contrib.xml file under ${HADOOP_HOME}/src/contrib/ and add the two lines shown below for eclipse.home and version, which supply the Eclipse path and the Hadoop version:

name="hadoopbuildcontrib" xmlns:ivy="antlib:org.apache.ivy.ant">

 

   name="eclipse.home" location="E:/software/eclipse3.7"/>

   name="version" value="1.0.4"/>

   name="name" value="${ant.project.name}"/>

   name="root" value="${basedir}"/>

   name="hadoop.root" location="${root}/../../../"/>

...

3.2 Edit the eclipse-plugin/build.xml file:

id="eclipse-sdk-jars">

     dir="${eclipse.home}/plugins/">

       name="org.eclipse.ui*.jar"/>

       name="org.eclipse.jdt*.jar"/>

       name="org.eclipse.core*.jar"/>

       name="org.eclipse.equinox*.jar"/>

       name="org.eclipse.debug*.jar"/>

       name="org.eclipse.osgi*.jar"/>

       name="org.eclipse.swt*.jar"/>

       name="org.eclipse.jface*.jar"/>

 

       name="org.eclipse.team.cvs.ssh2*.jar"/>

       name="com.jcraft.jsch*.jar"/>

    

     dir="../../../">

       name="hadoop*.jar"/>

    

  

(1) Add the second <fileset> shown above (the three lines for dir="../../../" and hadoop*.jar) so that %HADOOP_HOME%/hadoop-*.jar is referenced on the build classpath.

(2) Delete deprecation="${javac.deprecation}" from build.xml.

3.3 Create the target directory and copy the jars:

>mkdir %HADOOP_HOME%\build\ivy\lib\Hadoop\common

>copy %HADOOP_HOME%\lib\commons-cli-1.2.jar %HADOOP_HOME%\build\ivy\lib\Hadoop\common

>copy %HADOOP_HOME%\hadoop-core-1.0.4.jar %HADOOP_HOME%\build

 

4. From the command line, switch to the %HADOOP_HOME%\src\contrib\eclipse-plugin directory and run:

   ant jar

   hadoop-eclipse-plugin-1.0.4.jar is then generated under %HADOOP_HOME%\build\contrib\eclipse-plugin.

 

As on Linux, the resulting hadoop-eclipse-plugin-1.0.4.jar is still missing several jars; if you copy it into ${ECLIPSE_HOME}/plugins as-is, connecting to DFS fails with the same error:

An internal error occurred during: "Connecting to DFS Hadoop".org/apache/commons/configuration/Configuration

Apply the same fix described in section I: copy commons-configuration-1.6.jar, commons-httpclient-3.0.1.jar, commons-lang-2.4.jar, jackson-core-asl-1.8.8.jar and jackson-mapper-asl-1.8.8.jar from %HADOOP_HOME%\lib into the lib directory inside the jar, update Bundle-ClassPath in META-INF/MANIFEST.MF accordingly, then copy hadoop-eclipse-plugin-1.0.4.jar into the Eclipse plugins directory and restart Eclipse. The Hadoop Eclipse build environment on Windows is now complete.

 

<!-- 正文结束 -->

来自 “ ITPUB博客 ” ,链接:http://blog.itpub.net/23607706/viewspace-1119608/,如需转载,请注明出处,否则将追究法律责任。

上一篇: 没有了~
下一篇: 没有了~
请登录后发表评论 登录
全部评论

注册时间:2010-03-28

  • 博文量
    1
  • 访问量
    387