【Hive】Installation and Deployment

1 Metadata

Hive metadata describes the databases, tables, partitions, and columns that Hive manages, including their schemas and the HDFS locations of the underlying data.

2 MetaStore (metadata service)

The MetaStore is the service through which Hive clients read and write that metadata; the metadata itself is stored in a relational database (Derby by default, MySQL in this guide).

3 MetaStore deployment modes

3.1 Embedded mode

The MetaStore and an embedded Derby database run inside the Hive client process; only one session can use it at a time, so it is suitable for testing only.

3.2 Local mode

The MetaStore still runs inside the Hive client process, but the metadata is stored in an external database such as MySQL.

3.3 Remote mode

The MetaStore runs as a standalone service that clients reach over Thrift (hive.metastore.uris); this is the mode installed below.
4 Pre-installation preparation

Add the following proxy-user settings to Hadoop's core-site.xml so that HiveServer2 can impersonate other users; distribute the file and restart Hadoop afterwards (a sketch follows the snippet).

xml
    <!-- Hive integration: allow the root proxy user -->
    <property>
        <name>hadoop.proxyuser.root.hosts</name>
        <value>*</value>
    </property>
    <property>
        <name>hadoop.proxyuser.root.groups</name>
        <value>*</value>
    </property>
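
After editing core-site.xml, the change must reach every Hadoop node and Hadoop must be restarted before the proxy-user settings take effect. A minimal sketch, assuming a three-node cluster hadoop102/hadoop103/hadoop104 (only hadoop102 appears elsewhere in this guide; the other hostnames are placeholders):

bash
# Distribute the updated core-site.xml to the other nodes (hypothetical hostnames).
scp /opt/module/hadoop-3.1.3/etc/hadoop/core-site.xml hadoop103:/opt/module/hadoop-3.1.3/etc/hadoop/
scp /opt/module/hadoop-3.1.3/etc/hadoop/core-site.xml hadoop104:/opt/module/hadoop-3.1.3/etc/hadoop/

# Restart HDFS and YARN so the proxy-user settings are picked up.
stop-dfs.sh && start-dfs.sh
stop-yarn.sh && start-yarn.sh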

5 Remote mode installation

5.1 Download

Download the Apache Hive 3.1.2 binary release (apache-hive-3.1.2-bin.tar.gz) from the official site:

https://hive.apache.org/
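
If you prefer downloading directly on the server, the same tarball can be fetched from the Apache archive; the URL below is a sketch and should be verified against the download page:

bash
# Download the Hive 3.1.2 binary tarball from the Apache archive.
wget https://archive.apache.org/dist/hive/hive-3.1.2/apache-hive-3.1.2-bin.tar.gz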

5.2 Extract and rename

bash
tar -zxvf apache-hive-3.1.2-bin.tar.gz -C /opt/module/
cd /opt/module/
mv apache-hive-3.1.2-bin hive
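
A quick sanity check that the rename worked and the Hive tools are in place:

bash
# Expect hive, beeline, schematool and the other launcher scripts.
ls /opt/module/hive/bin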

5.3 Resolve the guava version mismatch between Hadoop and Hive

Hive 3.1.2 ships guava 19.0 while Hadoop 3.1.3 ships guava 27.0-jre; keeping the older jar in Hive's lib causes a NoSuchMethodError on com.google.common.base.Preconditions at startup, so replace it with Hadoop's copy.

bash
cd /opt/module/hive/lib
rm -f guava-19.0.jar
cp /opt/module/hadoop-3.1.3/share/hadoop/common/lib/guava-27.0-jre.jar ./guava-27.0-jre.jar
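
To confirm that the old jar is gone and the newer guava is now in Hive's lib directory:

bash
# Expect guava-27.0-jre.jar and no guava-19.0.jar in the output.
ls /opt/module/hive/lib | grep guava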

5.4 Add environment variables

bash
vi /etc/profile.d/my_env.sh

Append the following to the file:

bash
#HIVE_HOME
export HIVE_HOME=/opt/module/hive
export PATH=$PATH:$HIVE_HOME/bin

Then reload the profile:

bash
source /etc/profile
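
A quick check that the variables are active in the current shell:

bash
echo $HIVE_HOME   # expect /opt/module/hive
which hive        # expect /opt/module/hive/bin/hive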

5.5 hive-env.sh: set Hive environment variables

bash
cd /opt/module/hive/conf
mv hive-env.sh.template hive-env.sh
vim hive-env.sh

Uncomment and set the following entries:

bash
# Set HADOOP_HOME to point to a specific hadoop install directory
export HADOOP_HOME=/opt/module/hadoop-3.1.3

# Hive Configuration Directory can be controlled by:
export HIVE_CONF_DIR=/opt/module/hive/conf

# Folder containing extra libraries required for hive compilation/execution can be controlled by:
export HIVE_AUX_JARS_PATH=/opt/module/hive/lib

5.6 hive-log4j2.properties: logging configuration

bash
mkdir -p /opt/module/hive/datas
cd /opt/module/hive/conf
mv hive-log4j2.properties.template hive-log4j2.properties
vim hive-log4j2.properties

Point the log directory at the newly created folder (the default is under /tmp/<user>):

properties
property.hive.log.dir = /opt/module/hive/datas
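
Once Hive has been started at least once (later steps), you can confirm that the logs land in the configured directory instead of /tmp:

bash
# Expect hive.log here, plus metastore.out / hiveserver2.out from the script in section 5.10.
ls /opt/module/hive/datas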

5.7 hive-site.xml: configure the MetaStore

Because hive.metastore.uris is set below (remote mode), the MetaStore service must be started manually before clients can connect (see section 5.10). Create conf/hive-site.xml with the following content:

xml
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>

  <!-- MySQL connection settings for metadata storage -->
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://hadoop102:3306/hive?createDatabaseIfNotExist=true&amp;useUnicode=true&amp;useSSL=false&amp;characterEncoding=utf8</value>
    <description>
      JDBC connect string for a JDBC metastore.
      To use SSL to encrypt/authenticate the connection, provide database-specific SSL flag in the
      connection URL.
      For example, jdbc:postgresql://myhost/db?ssl=true for postgres database.
    </description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.cj.jdbc.Driver</value>
    <description>Driver class name for a JDBC metastore</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>root</value>
    <description>Username to use against metastore database</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>123456</value>
    <description>password to use against metastore database</description>
  </property>
  <!-- Host that HiveServer2 binds to -->
  <property>
    <name>hive.server2.thrift.bind.host</name>
    <value>hadoop102</value>
    <description>Bind host on which to run the HiveServer2 Thrift service.</description>
  </property>
  <!-- Remote-mode MetaStore service address -->
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://hadoop102:9083</value>
    <description>Thrift URI for the remote metastore. Used by metastore client to connect to remote
      metastore.</description>
  </property>
  <!-- Disable metastore notification API authorization -->
  <property>
    <name>hive.metastore.event.db.notification.api.auth</name>
    <value>false</value>
    <description>
      Should metastore do authorization against database notification related APIs such as
      get_next_notification.
      If set to true, then only the superusers in proxy settings have the permission
    </description>
  </property>
  <!-- Disable metastore schema version verification -->
  <property>
    <name>hive.metastore.schema.verification</name>
    <value>false</value>
    <description>
      Enforce metastore schema version consistency.
      True: Verify that version information stored in is compatible with one from Hive jars. Also
      disable automatic
      schema migration attempt. Users are required to manually migrate schema after Hive upgrade
      which ensures
      proper metastore schema migration. (Default)
      False: Warn if the version information stored in metastore doesn't match with one from in Hive
      jars.
    </description>
  </property>
</configuration>
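
The configuration above assumes a MySQL server is already running on hadoop102 and reachable as root/123456; the hive database itself will be created automatically because of createDatabaseIfNotExist=true. A minimal connectivity check before moving on (password given inline only for brevity):

bash
# Verify that the metastore database server is reachable with the configured credentials.
mysql -h hadoop102 -uroot -p123456 -e "SELECT VERSION();"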

5.8 Upload mysql-connector-java-5.1.27-bin.jar

Upload the JDBC driver jar that matches your MySQL server version into Hive's lib directory. Note that com.mysql.cj.jdbc.Driver configured above belongs to Connector/J 8.x; if you use the 5.1.x jar named here, set javax.jdo.option.ConnectionDriverName to com.mysql.jdbc.Driver instead (or upload a Connector/J 8.x jar).

bash
/opt/module/hive/lib/mysql-connector-java-5.1.27-bin.jar
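
A sketch of actually placing the jar, assuming it has been downloaded to the current directory (the source path is hypothetical):

bash
# Copy the JDBC driver into Hive's lib directory and confirm it is there.
cp ./mysql-connector-java-5.1.27-bin.jar /opt/module/hive/lib/
ls /opt/module/hive/lib | grep mysql-connector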

5.9 Initialize the metastore schema

bash
cd /opt/module/hive/bin
./schematool -dbType mysql -initSchema --verbose
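
If initialization succeeds, schematool should finish with a "schemaTool completed" message and the metastore tables appear in the hive database; a quick check:

bash
# Confirm that the metastore tables were created in MySQL.
mysql -h hadoop102 -uroot -p123456 -e "USE hive; SHOW TABLES;"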

5.10 MetaStore and HiveServer2 startup script

bash
vim hive_metastore.sh

Script content:

bash
#!/bin/bash
if [ $# -lt 1 ]; then
    echo "No Args Input..."
    exit
fi

case $1 in
"start")
    {
        echo "----------------- MetaStore start -----------------"
        nohup /opt/module/hive/bin/hive --service metastore >> /opt/module/hive/datas/metastore.out 2>&1 &
        echo "----------------- Hiveserver2 start -----------------"
        nohup /opt/module/hive/bin/hive --service hiveserver2 >> /opt/module/hive/datas/hiveserver2.out 2>&1 &
    }
    ;;
"stop")
    {
        echo "----------------- MetaStore stop -----------------"
        pidMetaStore=$(ps -ef | grep -v grep | grep "Dproc_metastore" | awk '{printf $2" "}')
        kill -9 ${pidMetaStore}
        echo "----------------- Hiveserver2 stop -----------------"
        pidHiveserver2=$(ps -ef | grep -v grep | grep "Dproc_hiveserver2" | awk '{printf $2" "}')
        kill -9 ${pidHiveserver2}
    }
    ;;
*)
    echo "Input Args Error..."
    ;;
esac
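
Usage of the script, as a sketch: make it executable, start both services, and verify. HiveServer2 can take a minute or two to accept connections; the port 10000 and the root user below match the defaults and the configuration in this guide.

bash
chmod +x hive_metastore.sh
./hive_metastore.sh start

# Both services show up as RunJar processes.
jps | grep RunJar

# Connect through HiveServer2 once it is up (default port 10000).
beeline -u jdbc:hive2://hadoop102:10000 -n root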