
Alex's Hadoop Beginner Tutorial: Lesson 7, Sqoop2 Export Tutorial

Following on from the previous lesson, this one covers exporting.
Check the connection
First check whether there is a usable connection; if there isn't one, create it using the method from the previous lesson.
sqoop:000> show connector --all
1 connector(s) to show:
Connector with id 1:
  Name: generic-jdbc-connector
  Class: org.apache.sqoop.connector.jdbc.GenericJdbcConnector
  Version: 1.99.3-cdh5.0.1
  Supported job types: [EXPORT, IMPORT]
    Connection form 1:
There is a lot more output after this that I won't paste. If a connection is available, continue with the steps below.
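If you also want to list the connections themselves rather than the connectors, the Sqoop2 shell in this version has a show connection command; the connection id it prints is the value passed to --xid when creating the job below. A quick check, output omitted:

sqoop:000> show connection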
Prepare the data
MySQL table preparation
First create a table named employee in MySQL:
CREATE TABLE `employee` (
  `id` int(11) NOT NULL,
  `name` varchar(20) NOT NULL,
  PRIMARY KEY (`id`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;
Hadoop file preparation
Next create the data file on the Hadoop side. First create a file named part-m-00000 with this content:
1,'michael'
Then put it into Hadoop:
# hdfs dfs -mkdir /user/alex
# hdfs dfs -put part-m-00000 /user/alex/
# hdfs dfs -ls /user/alex
Found 1 items
-rw-r--r--   2 root supergroup         20 2014-11-27 18:26 /user/alex/part-m-00000
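To confirm the file landed in HDFS with the content you expect, you can read it back out. This is just a sanity check, not part of the original steps:

# hdfs dfs -cat /user/alex/part-m-00000
1,'michael'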
Export
create job --xid 1 --type export
Then fill in the values as prompted:
sqoop:000> create job --xid 1 --type export
Creating job for connection with id 1
Please fill following values to create new job object
Name: export to employee

Database configuration

Schema name:
Table name: employee
Table SQL statement:
Table column names:
Stage table name:
Clear stage table:

Input configuration

Input directory: /user/alex

Throttling resources

Extractors:
Loaders:
New job was successfully created with validation status FINE and persistent id 3
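If you want to confirm the job was saved before running it, the shell's show job command lists the jobs the server knows about; the persistent id it reports (3 in my case) is what --jid refers to in the next step. Output omitted here:

sqoop:000> show job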
Run the job:
sqoop:000> start job --jid 3
Submission details
Job ID: 3
Server URL: http://localhost:12000/sqoop/
Created by: root
Creation date: 2014-11-27 18:29:27 CST
Lastly updated by: root
External ID: job_1406097234796_0008
	http://xmseapp01:8088/proxy/application_1406097234796_0008/
2014-11-27 18:29:27 CST: BOOTING  - progress is not available
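While it runs, you can poll progress from the same shell with the status command (using the same job id as above):

sqoop:000> status job --jid 3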
Wait a moment, then look at the employee table in MySQL again; it will now contain a record for michael.
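To verify from the MySQL client, a simple query against employee should return the exported row. A sketch of what to expect (the exact value formatting depends on how Sqoop parsed the input line):

mysql> SELECT * FROM employee;
+----+---------+
| id | name    |
+----+---------+
|  1 | michael |
+----+---------+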
The next lesson will cover how Sqoop talks to HBase.