
Error when using Sqoop to migrate data between Hive and a MySQL database

To copy a relational table's structure into Hive, I ran:
./sqoop create-hive-table --connect jdbc:mysql://192.168.1.10:3306/ekp_11 --table job_log --username root --password 123456 --hive-table job_log
Instead of creating the table, it produced the following pile of error messages:
Warning: /usr/lib/hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
15/08/02 02:04:14 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
15/08/02 02:04:14 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
15/08/02 02:04:14 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
15/08/02 02:04:14 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
15/08/02 02:04:14 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `job_log` AS t LIMIT 1
15/08/02 02:04:14 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `job_log` AS t LIMIT 1
15/08/02 02:04:14 WARN hive.TableDefWriter: Column fd_start_time had to be cast to a less precise type in Hive
15/08/02 02:04:14 WARN hive.TableDefWriter: Column fd_end_time had to be cast to a less precise type in Hive
Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /cloud/hadoop-2.2.0/lib/native/libhadoop.so which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
15/08/02 02:04:16 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/08/02 02:04:17 INFO hive.HiveImport: Loading uploaded data into Hive
15/08/02 02:04:17 ERROR tool.CreateHiveTableTool: Encountered IOException running create table job: java.io.IOException: Cannot run program "hive": error=2, No such file or directory
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
        at java.lang.Runtime.exec(Runtime.java:617)
        at java.lang.Runtime.exec(Runtime.java:528)
        at org.apache.sqoop.util.Executor.exec(Executor.java:76)
        at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:382)
        at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:335)
        at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:239)
        at org.apache.sqoop.tool.CreateHiveTableTool.run(CreateHiveTableTool.java:58)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
Caused by: java.io.IOException: error=2, No such file or directory
        at java.lang.UNIXProcess.forkAndExec(Native Method)
        at java.lang.UNIXProcess.<init>(UNIXProcess.java:186)
        at java.lang.ProcessImpl.start(ProcessImpl.java:130)
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)
        ... 13 more
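The key lines are the last ones: Sqoop shells out to an external program named hive, and error=2 is ENOENT, the OS code for "executable not found". You can confirm the condition with command -v before re-running the import; the bogus name below stands in for a hive launcher that is missing from PATH, so the snippet runs anywhere:
```shell
# error=2 in the stack trace is ENOENT: the "hive" launcher that Sqoop
# tries to exec cannot be found. "definitely-not-hive" is a deliberately
# nonexistent name used here to demonstrate the check.
if ! command -v definitely-not-hive >/dev/null 2>&1; then
    echo "not on PATH -> exec would fail with error=2 (ENOENT)"
fi
```
On the affected machine, `command -v hive` would likewise find nothing, matching the IOException above.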
Force of habit was to blame: I had assumed Sqoop would be smart enough to find the local Hive installation on its own.
Solution: configure Sqoop with the Hive environment you are using.
The steps are:
1. In /sqoop-1.4.4/conf, find the sqoop-env-template.sh file and rename it to sqoop-env.sh;
2. Edit sqoop-env.sh and set it to your Hive installation directory, and you're done.
      For example: export HIVE_HOME=/cloud/apache-hive-1.2.1-bin
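The two steps above amount to the following commands. The sketch below runs in a throwaway temp directory so it is self-contained; in practice you would operate directly in your real sqoop-1.4.4/conf, and the HIVE_HOME path is the one from this article's setup, so substitute your own:
```shell
set -eu
# Scratch stand-in for sqoop-1.4.4/conf (replace with your real conf dir).
conf=$(mktemp -d)
printf '# template shipped with Sqoop\n' > "$conf/sqoop-env-template.sh"

# 1. Create sqoop-env.sh from the shipped template
#    (cp instead of mv keeps the original template around).
cp "$conf/sqoop-env-template.sh" "$conf/sqoop-env.sh"

# 2. Point Sqoop at the local Hive installation.
echo 'export HIVE_HOME=/cloud/apache-hive-1.2.1-bin' >> "$conf/sqoop-env.sh"

grep 'HIVE_HOME' "$conf/sqoop-env.sh"
```
After this, re-running the create-hive-table command lets Sqoop locate the hive executable under $HIVE_HOME/bin.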