Apache Hadoop Ecosystem Component Deployment Notes - Spark

Other posts in this series:

zookeeper: Apache Hadoop Ecosystem Component Deployment Notes - ZooKeeper

hadoop: Apache Hadoop Ecosystem Component Deployment Notes - Hadoop

hive: Apache Hadoop Ecosystem Component Deployment Notes - Hive

hbase: Apache Hadoop Ecosystem Component Deployment Notes - HBase

impala: Apache Hadoop Ecosystem Component Deployment Notes - Impala

1. Download Spark and extract it

Download page: https://spark.apache.org/downloads.html

tar -xf spark-3.5.7-bin-hadoop3.tgz -C /opt/apache/
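As a convenience (my own suggestion, not a step from the original write-up), you can export SPARK_HOME after extracting, so the long absolute paths used in the rest of this post can be shortened:

```shell
# Optional: export SPARK_HOME so later commands can use $SPARK_HOME/bin/...
# The install path matches the tar extraction above.
export SPARK_HOME=/opt/apache/spark-3.5.7-bin-hadoop3
export PATH="$SPARK_HOME/bin:$SPARK_HOME/sbin:$PATH"
echo "$SPARK_HOME"
```

To make this permanent, the same two export lines would go into /etc/profile.d/spark.sh or the user's shell profile.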

2. Configure spark-env.sh

cd /opt/apache/spark-3.5.7-bin-hadoop3/conf
cp spark-env.sh.template spark-env.sh
vim spark-env.sh

Add the following (note that YARN_CONF_DIR relies on HADOOP_HOME already being set in the environment):

YARN_CONF_DIR=$HADOOP_HOME/etc/hadoop
export SPARK_HISTORY_OPTS="-Dspark.history.ui.port=18080 -Dspark.history.retainedApplications=30 -Dspark.history.fs.logDirectory=hdfs://nameservice1/spark-yarn-log"

3. Configure spark-defaults.conf

cp spark-defaults.conf.template spark-defaults.conf
vim spark-defaults.conf

Add the following:

spark.eventLog.enabled           true
spark.eventLog.dir               hdfs://nameservice1/spark-yarn-log
spark.yarn.historyServer.address=apache230.hadoop.com:18080   # so the History link on the YARN RM page (port 8088) redirects to the Spark history UI
spark.history.ui.port=18080
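One prerequisite these settings assume: the event-log directory must already exist in HDFS, or event-log writes will fail at submit time. A minimal sketch (the path matches spark.eventLog.dir above; the snippet is guarded so it is a no-op on a machine without the hdfs CLI):

```shell
# Must match spark.eventLog.dir / spark.history.fs.logDirectory.
LOG_DIR="hdfs://nameservice1/spark-yarn-log"

# Create the directory only where the hdfs CLI is available.
if command -v hdfs >/dev/null 2>&1; then
  hdfs dfs -mkdir -p "$LOG_DIR"
fi
```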

4. Start the Spark history server

/opt/apache/spark-3.5.7-bin-hadoop3/sbin/start-history-server.sh

The history UI is now available at http://apache230.hadoop.com:18080

5. Verify Spark on YARN

A. Client deploy mode: compute pi

/opt/apache/spark-3.5.7-bin-hadoop3/bin/spark-submit \
  --master yarn \
  --class org.apache.spark.examples.SparkPi \
  /opt/apache/spark-3.5.7-bin-hadoop3/examples/jars/spark-examples_2.12-3.5.7.jar 10

Note: in client deploy mode (the default) the driver runs on the submitting machine, so the job's logs are printed on the client.

B. Cluster deploy mode: compute pi

/opt/apache/spark-3.5.7-bin-hadoop3/bin/spark-submit \
  --master yarn --deploy-mode cluster \
  --class org.apache.spark.examples.SparkPi \
  /opt/apache/spark-3.5.7-bin-hadoop3/examples/jars/spark-examples_2.12-3.5.7.jar 2

Note: this time the driver runs on node 231 inside the cluster; with client deploy mode, the driver runs on whichever client machine submitted the job.
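Because the driver output (including SparkPi's "Pi is roughly ..." line) no longer appears on the client in cluster mode, it has to be pulled from YARN's aggregated logs. A sketch, where the application id is a placeholder to be replaced with the real one from the spark-submit output or the RM UI on port 8088:

```shell
# Placeholder id; substitute the real application id of your job.
APP_ID="application_0000000000000_0001"

# Fetch aggregated container logs (requires YARN log aggregation to be
# enabled) and pick out the driver's result line. Guarded so the snippet
# is a no-op on machines without the yarn CLI.
if command -v yarn >/dev/null 2>&1; then
  yarn logs -applicationId "$APP_ID" | grep -i "Pi is roughly"
fi
```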

6. Verify with spark-shell

[root@apache230 bin]# ./spark-shell
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
25/09/30 10:24:22 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
25/09/30 10:24:23 WARN DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
Spark context Web UI available at http://apache230.hadoop.com:4040
Spark context available as 'sc' (master = local[*], app id = local-1759199063061).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.5.7
      /_/

Using Scala version 2.12.18 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_144)
Type in expressions to have them evaluated.
Type :help for more information.

scala> sc.textFile("/tmp/wqg.txt").flatMap(_.split(" ")).map((_,1)).reduceByKey(_ + _).collect
res0: Array[(String, Int)] = Array((16:07:50,243,2), (15:38:53,698,4), (15:20:03,258,2), (15:39:46,035,1), (15:50:34,501,4), (15:43:54,365,2), (16:12:00,567,2), (15:27:26,953,4), (16:13:23,677,4), (16:13:08,656,4), (15:36:57,946,2), (15:55:30,218,2), (15:48:41,009,4), (15:53:15,033,2), (15:53:50,076,4), (15:34:18,110,3), (15:21:56,442,4), (15:36:58,947,4), (15:08:51,130,4), (15:54:27,125,1), (16:07:38,229,2), (15:42:32,881,2), (15:58:28,461,2), (15:23:33,591,4), (15:10:53,351,2), (16:15:33,856,2), (15:12:37,531,2), (15:29:32,402,2), (16:08:03,626,1), (15:46:44,408,2), (15:55:38,227,2), (15:55:54,252,2), (15:32:41,569,1), (15:30:50,899,2), (16:12:14,584,2), (15:38:32,596,1), (15:05:54,815,3), (15:13:09,586,2), (15:17:46,039,2), (16:05:18,014,3), (16:12:02,569,2)...
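The one-liner above is the classic word count: split each line on spaces, map every token to (token, 1), and sum the counts per key. The same logic can be sanity-checked locally without a cluster; the file name and contents below are made up for illustration:

```shell
# Reproduce the word-count logic with coreutils: split tokens onto
# separate lines, then count occurrences of each distinct token.
printf 'a b a\nb c\n' > /tmp/wordcount_demo.txt
tr ' ' '\n' < /tmp/wordcount_demo.txt | sort | uniq -c | sort -rn
```

This prints each token with its count (a: 2, b: 2, c: 1 for the sample input), mirroring what reduceByKey(_ + _) computes in parallel across partitions.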