
Compiling, Installing and Configuring the Hadoop 2.x Eclipse Plugin

This article walks through the full process of compiling, installing and configuring the Hadoop 2.x Eclipse plugin: environment, build process, and installation/configuration.

[1] Environment

Eclipse Juno Service Release 2
Hadoop 2.2.0
Java 1.6.0_65
Mac OS X 10.9.2

[2] Building the plugin

The Hadoop 2.x plugin source is currently hosted on GitHub, so the first step is to fetch it:

git clone https://github.com/winghc/hadoop2x-eclipse-plugin.git
Compile the source:

$ cd src/contrib/eclipse-plugin
$ ant jar -Dversion=2.2.0 -Declipse.home=/Applications/eclipse -Dhadoop.home=/usr/local/share/hadoop
On my machine, however, the first build failed with errors like the following:

......
    [javac] Compiling 45 source files to /Users/micmiu/no_sync/opensource_code/hadoop/hadoop2x-eclipse-plugin/build/contrib/eclipse-plugin/classes
    [javac] /Users/micmiu/no_sync/opensource_code/hadoop/hadoop2x-eclipse-plugin/src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/MapReduceNature.java:35: package org.eclipse.jdt.core does not exist
    [javac] import org.eclipse.jdt.core.IClasspathEntry;
    [javac]                             ^
    [javac] /Users/micmiu/no_sync/opensource_code/hadoop/hadoop2x-eclipse-plugin/src/contrib/eclipse-plugin/src/java/org/apache/hadoop/eclipse/MapReduceNature.java:36: package org.eclipse.jdt.core does not exist
    [javac] import org.eclipse.jdt.core.IJavaProject;
    [javac]                             ^
......
My first guess was that my Eclipse installation was missing the org.eclipse.jdt.core* libraries, so the compiler could not find those classes. Checking the local directory /Applications/eclipse/plugins/ confirmed that no such jar was present, which bore out the guess. I copied an org.eclipse.jdt.core_3.7.3.v20120119-1537.jar from a colleague's machine into the plugin source directory src/contrib/eclipse-plugin/lib, and also had to modify src/contrib/eclipse-plugin/build.xml in two places (a rough sketch of the kind of change follows below).
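The original post's two build.xml snippets were not preserved, so the fragment below is only an unverified sketch of the kind of edit involved: making the jar dropped into src/contrib/eclipse-plugin/lib visible to the classpath the plugin is compiled against. The path id and fileset layout are assumptions, not the post's actual diff; the build.xml in the GitHub repository is the authoritative reference.

<!-- sketch only: expose the local lib directory to the compile classpath -->
<path id="eclipse-sdk-jars">
  <fileset dir="${eclipse.home}/plugins/">
    <include name="org.eclipse.*.jar"/>
  </fileset>
  <!-- added: pick up org.eclipse.jdt.core_*.jar copied into src/contrib/eclipse-plugin/lib -->
  <fileset dir="${basedir}/lib">
    <include name="*.jar"/>
  </fileset>
</path>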
Running the build command again then succeeds; BUILD SUCCESSFUL indicates the plugin was built:

micmiu-mbp:eclipse-plugin micmiu$ ant jar -Dversion=2.2.0 -Declipse.home=/Applications/eclipse -Dhadoop.home=/usr/local/share/hadoop
Buildfile: /Users/micmiu/no_sync/opensource_code/hadoop/hadoop2x-eclipse-plugin/src/contrib/eclipse-plugin/build.xml

check-contrib:

init:
     [echo] contrib: eclipse-plugin

init-contrib:

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
      [get] To: /Users/micmiu/no_sync/opensource_code/hadoop/hadoop2x-eclipse-plugin/ivy/ivy-2.1.0.jar
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
[ivy:configure] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ ::
[ivy:configure] :: loading settings :: file = /Users/micmiu/no_sync/opensource_code/hadoop/hadoop2x-eclipse-plugin/ivy/ivysettings.xml

ivy-resolve-common:

ivy-retrieve-common:
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file = /Users/micmiu/no_sync/opensource_code/hadoop/hadoop2x-eclipse-plugin/ivy/ivysettings.xml

compile:
     [echo] contrib: eclipse-plugin
......
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.
    [javac] 7 warnings

jar:
    [mkdir] Created dir: /Users/micmiu/no_sync/opensource_code/hadoop/hadoop2x-eclipse-plugin/build/contrib/eclipse-plugin/lib
     [copy] Copying 9 files to /Users/micmiu/no_sync/opensource_code/hadoop/hadoop2x-eclipse-plugin/build/contrib/eclipse-plugin/lib
......
      [jar] Building jar: /Users/micmiu/no_sync/opensource_code/hadoop/hadoop2x-eclipse-plugin/build/contrib/eclipse-plugin/hadoop-eclipse-plugin-2.2.0.jar

BUILD SUCCESSFUL
Total time: 37 seconds
Once the build succeeds, the generated jar can be found at build/contrib/eclipse-plugin/hadoop-eclipse-plugin-2.2.0.jar. Download link: http://yun.baidu.com/s/1o6k457c#dir/path=%2fsourcecode%2fbuilder

[3] Installation and configuration

1. Copy the generated hadoop-eclipse-plugin-2.2.0.jar into the eclipse/plugins directory and restart Eclipse.

2. In the Eclipse menu, click Windows → Show View → Other…; in the "Show View" dialog that opens, type "map" in the search box, select the "Map/Reduce Locations" entry and click OK.

3. A new "Map/Reduce Locations" tab now appears in the console area.

4. In the "Map/Reduce Locations" tab, click the new-location icon in its toolbar, or right-click the blank area and choose "New Hadoop location…". In the "New Hadoop location…" dialog, configure:
   Location name: any name you like, e.g. hadoop2.2.0
   Map/Reduce(V2) Master: fill in from the dfs.datanode.ipc.address value configured in hdfs-site.xml
   DFS Master: the NameNode host and port, from the fs.defaultFS value configured in core-site.xml
   (Example property entries are sketched in the P.S. at the end of this article.) Once the location is configured correctly you should see information similar to the screenshot in the original post (not reproduced here).

5. Open the "Preferences" dialog, search for "hadoop", select the "Hadoop Map/Reduce" entry, click "Browse…" and set the Hadoop installation path, for example /usr/local/share/hadoop-2.2.0.

----------- EOF @Michael Sun -----------
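P.S. For reference, the two properties mentioned in step 4 look roughly like this in a typical Hadoop 2.2.0 setup. The host and port values below are placeholders/defaults, not values taken from the original post's cluster:

<!-- core-site.xml: fs.defaultFS supplies the DFS Master host and port (placeholder value) -->
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://localhost:9000</value>
</property>

<!-- hdfs-site.xml: dfs.datanode.ipc.address supplies the Map/Reduce(V2) Master host and port (Hadoop's default shown) -->
<property>
  <name>dfs.datanode.ipc.address</name>
  <value>0.0.0.0:50020</value>
</property>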
