Sqoop import into Hive hangs at the Hive JDBC connection and never executes
Sqoop accessing Hive with HA mode enabled
When importing into Hive, Sqoop invokes Beeline under the hood; if no credentials are configured, the Beeline session waits for an interactive username/password prompt, which makes the job appear to hang at the JDBC connection. The fix is to store the credentials in Beeline's connection file.

Locate the Hive configuration directory: $HIVE_HOME/conf
Create a new configuration file there: beeline-hs2-connection.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <name>beeline.hs2.connection.user</name>
    <value>hive</value>
  </property>
  <property>
    <name>beeline.hs2.connection.password</name>
    <value>hive</value>
  </property>
</configuration>
beeline.hs2.connection.user: the username used to connect to Hive
beeline.hs2.connection.password: the password for that username
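With the file in place, the connection can be verified outside of Sqoop. When started with no arguments, Beeline automatically picks up beeline-hs2-connection.xml and opens a HiveServer2 session without prompting; the explicit connection string below reuses the ZooKeeper quorum from the log further down — adjust it to your cluster:

```shell
# Beeline reads $HIVE_HOME/conf/beeline-hs2-connection.xml automatically
# and connects without asking for a username or password.
beeline

# Equivalent explicit connection (hosts taken from the error log below):
beeline -u "jdbc:hive2://hdp3.node1:2181,hdp3.node2:2181,hdp3.node3:2181/default;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2" \
        -n hive -p hive
```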
Rerunning the Sqoop job now fails with a new error:
23/11/02 13:57:23 INFO hive.HiveImport: Error: Error while compiling statement: FAILED: SemanticException Unable to load data to destination table. Error: The file that you are trying to load does not match the file format of the destination table. (state=42000,code=40000)
23/11/02 13:57:23 INFO hive.HiveImport: Closing: 0: jdbc:hive2://hdp3.node1:2181,hdp3.node2:2181,hdp3.node3:2181/default;password=hive;serviceDiscoveryMode=zooKeeper;user=hive;zooKeeperNamespace=hiveserver2
23/11/02 13:57:23 ERROR tool.ImportTool: Import failed: java.io.IOException: Hive exited with status 2
at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:253)
at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:206)
at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:273)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:564)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:655)
at org.apache.sqoop.Sqoop.run(Sqoop.java:151)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:82)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:187)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:241)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:250)
at org.apache.sqoop.Sqoop.main(Sqoop.java:259)
Solution:
The SemanticException above means the target table's storage format (e.g. ORC or Parquet) does not match the plain-text files that Sqoop's load writes, so the data must be staged through a textfile table first.
1- Create a temporary staging table stored as textfile
create table hive_db.hive_01 (id string comment 'Id')
row format delimited fields terminated by '\001'
stored as textfile;
2- Load the data into the temporary table (point the Sqoop import at the temporary table instead of the target table)
3- Copy the data from the temporary table into the target table with an insert ... select
insert into hive_db.hive_table select * from hive_db.hive_01;
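For step 2, the Sqoop import targets the textfile staging table created in step 1. A minimal sketch follows; the source database, table, host, and username are placeholders for illustration, not values from the original job:

```shell
# Hypothetical source connection -- substitute your own database,
# table, and credentials.
# -P prompts for the password interactively.
# --fields-terminated-by matches the '\001' delimiter of the staging table.
# --hive-table points at the textfile staging table, not the target table.
sqoop import \
  --connect jdbc:mysql://db.example.com:3306/source_db \
  --username sqoop_user -P \
  --table source_table \
  --fields-terminated-by '\001' \
  --hive-import \
  --hive-table hive_db.hive_01 \
  -m 1
```

Once the import finishes, run the insert ... select from step 3 in Hive to move the rows into the ORC/Parquet target table, and optionally drop or truncate hive_db.hive_01 afterwards.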