[Bigtop] Building Big Data Component RPM Packages with Bigtop 3.2.0

Preface

Original reference: Bigtop 从0开始 (Bigtop from scratch)

Following the blog post above, I tried building the components myself, hit quite a few problems along the way, and recorded them one by one here for whoever comes next.

Bigtop project site: BigTop

Main content

Changing the Maven local repository directory

When I started the Bigtop image for the build, I changed the directory that the Maven repository is mounted on, so the matching path has to be filled in in /usr/local/maven/conf/settings.xml. Be sure to do this, otherwise every build re-downloads all the dependencies:

<localRepository>/root/.m2/repository/</localRepository>
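As a quick sanity check, grep can confirm the configured value. The sketch below uses a stand-in settings.xml; on the build container the real file is /usr/local/maven/conf/settings.xml:

```shell
# Stand-in settings.xml; the real one lives at /usr/local/maven/conf/settings.xml.
cat > /tmp/settings.xml <<'EOF'
<settings>
  <localRepository>/root/.m2/repository/</localRepository>
</settings>
EOF
# print the configured local repository path
grep -o '<localRepository>[^<]*</localRepository>' /tmp/settings.xml
# → <localRepository>/root/.m2/repository/</localRepository>
```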

Flink

This is the component that cost me the most time by far. During the Flink build, Maven tries to download nodejs, but nodejs.org is blocked, so the build times out at flink-runtime-web every single time:

INFO: I/O exception (java.net.SocketException) caught when processing request to {s}-https://nodejs.org:443: Network is unreachable (connect failed)

The fix is to download the node package manually and put it in the Maven repository. There is a pitfall here, though: if you use node v16.13.2 directly, this step succeeds but the next one gets stuck and fails with a permission error:

/ws/build/flink/rpm/BUILD/flink-1.15.3/flink-runtime-web/web-dashboard/node/npm: Permission denied

So the complete fix is to change the npm-related part of flink-runtime-web/pom.xml in the Flink source package to the following:

<plugin>
               <groupId>com.github.eirslett</groupId>
               <artifactId>frontend-maven-plugin</artifactId>
               <version>1.11.0</version>
               <executions>
                   <execution>
                       <id>install node and npm</id>
                       <goals>
                           <goal>install-node-and-npm</goal>
                       </goals>
                       <configuration>
                            <!-- the node version is changed here -->
                           <nodeVersion>v12.22.1</nodeVersion>
                           <npmVersion>6.14.12</npmVersion>
                       </configuration>
                   </execution>
                   <execution>
                       <id>npm install husky</id>
                       <goals>
                           <goal>npm</goal>
                       </goals>
                       <configuration>
                           <arguments>install husky --registry=https://registry.npmmirror.com</arguments>
                       </configuration>
                   </execution>
                   <execution>
                       <id>npm install</id>
                       <goals>
                           <goal>npm</goal>
                       </goals>
                       <configuration>
                           <arguments>install --cache-max=0 --no-save --registry=https://registry.npmmirror.com</arguments>
                           <environmentVariables>
                               <HUSKY_SKIP_INSTALL>true</HUSKY_SKIP_INSTALL>
                           </environmentVariables>
                       </configuration>
                   </execution>
                   <execution>
                       <id>npm install local</id>
                       <goals>
                           <goal>npm</goal>
                       </goals>
                       <configuration>
                           <arguments>install --registry=https://registry.npmmirror.com --force</arguments>
                       </configuration>
                   </execution>
                   <execution>
                       <id>npmrun ci-check</id>
                       <goals>
                           <goal>npm</goal>
                       </goals>
                       <configuration>
                           <arguments>run ci-check</arguments>
                       </configuration>
                   </execution>
               </executions>
               <configuration>
                   <workingDirectory>web-dashboard</workingDirectory>
               </configuration>
           </plugin>

The key change is downgrading nodejs to 12.22.1; then place the nodejs package in the local Maven repository by hand:

Note that the version in the file name stored locally does not carry the leading "v", even though the file you download is named with one. Also, any component whose build fails on the nodejs download can be fixed with this same procedure.
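A sketch of the procedure, with two assumptions called out: the cache path under com/github/eirslett/node is where frontend-maven-plugin looks for previously downloaded archives, and the npmmirror URL is my substitute for nodejs.org, not something from the original article:

```shell
# Seed the local Maven repo with the Node tarball so the build skips nodejs.org.
NODE_VERSION=12.22.1     # version used by flink-runtime-web after the pom edit
M2_NODE_DIR="$HOME/.m2/repository/com/github/eirslett/node/$NODE_VERSION"
mkdir -p "$M2_NODE_DIR"
# fetch from a mirror (assumed URL); ignore failure if you already have the file
curl -fsSL -o "/tmp/node-v$NODE_VERSION-linux-x64.tar.gz" \
  "https://npmmirror.com/mirrors/node/v$NODE_VERSION/node-v$NODE_VERSION-linux-x64.tar.gz" || true
if [ -f "/tmp/node-v$NODE_VERSION-linux-x64.tar.gz" ]; then
  # note: the copy in the Maven repo drops the leading "v" from the file name
  mv "/tmp/node-v$NODE_VERSION-linux-x64.tar.gz" \
     "$M2_NODE_DIR/node-$NODE_VERSION-linux-x64.tar.gz"
fi
```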

With that done there were no further big problems with Flink; just wait for the build to pass.

Kafka

First run the Kafka build once so the pipeline downloads the Kafka source package automatically; the Bigtop 3.2.0 branch corresponds to Kafka 2.8.1:

./gradlew kafka-clean kafka-pkg -PparentDir=/usr/bigtop -PpkgSuffix -PbuildThreads=16C repo 

Then start making the changes below.

grgit version

First, the grgit version needs bumping: the default is 4.1.0, whose POM file now returns a 404. The corresponding PR is MINOR: Bump version of grgit to 4.1.1.

Change the grgit version in ${SOURCE_CODE}/gradle/dependencies.gradle:

versions += [
  activation: "1.1.1",
  apacheda: "1.0.2",
  apacheds: "2.0.0-M24",
  argparse4j: "0.7.0",
  bcpkix: "1.66",
  checkstyle: "8.36.2",
  commonsCli: "1.4",
  gradle: "6.8.1",
  gradleVersionsPlugin: "0.36.0",
  grgit: "4.1.1", // 修改这一行
  httpclient: "4.5.13",
]
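The edit can also be scripted with sed; demonstrated here on a stand-in dependencies.gradle so the before/after is easy to verify:

```shell
# Stand-in for ${SOURCE_CODE}/gradle/dependencies.gradle
cat > /tmp/dependencies.gradle <<'EOF'
versions += [
  grgit: "4.1.0",
  httpclient: "4.5.13",
]
EOF
# bump grgit 4.1.0 -> 4.1.1
sed -i 's/grgit: "4.1.0"/grgit: "4.1.1"/' /tmp/dependencies.gradle
grep grgit /tmp/dependencies.gradle   # → grgit: "4.1.1",
```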

Preparing the Gradle files by hand

First download gradle-wrapper.jar manually to ${SOURCE_CODE}/gradle/wrapper/gradle-wrapper.jar; the version used here is 6.8.1:

curl -s -S --retry 3 -L -o "gradle-wrapper.jar"  https://mirror.ghproxy.com/https://raw.githubusercontent.com/gradle/gradle/v6.8.1/gradle/wrapper/gradle-wrapper.jar

Then stage the gradle-6.8.1-all.zip file and serve it over HTTP; I simply started a Python SimpleHTTPServer. Next, edit ${SOURCE_CODE}/gradle/wrapper/gradle-wrapper.properties:

distributionBase=GRADLE_USER_HOME
distributionPath=wrapper/dists
# point this at your HTTP server
distributionUrl=http://172.18.2.31:444/gradle-6.8.1-all.zip
zipStoreBase=GRADLE_USER_HOME
zipStorePath=wrapper/dists
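A minimal sketch of the HTTP side (the directory and port 8321 are my examples; the distributionUrl above happens to use port 444 on the build host):

```shell
# Serve the pre-downloaded Gradle distribution over plain HTTP.
DIST_DIR=/tmp/gradle-dist
mkdir -p "$DIST_DIR"
# copy the zip in if you have it in the current directory (silently skipped otherwise)
cp gradle-6.8.1-all.zip "$DIST_DIR"/ 2>/dev/null || true
# background a simple file server; the wrapper will then fetch
# http://<host-ip>:8321/gradle-6.8.1-all.zip
(cd "$DIST_DIR" && python3 -m http.server 8321) &
```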

Hadoop

Bigtop 3.2.0 uses Hadoop 3.3.4. The YARN UI build also needs nodejs, and the default version is likewise 12.22.1. If the earlier Flink build already cached the package locally, nothing more is needed; otherwise seed nodejs locally as described in the Flink section.

Changing the Maven repositories

Change the Maven configuration in ${SOURCE_CODE}/pom.xml:

    <distMgmtSnapshotsId>apache.snapshots.https</distMgmtSnapshotsId>
    <distMgmtSnapshotsName>Apache Development Snapshot Repository</distMgmtSnapshotsName>
    <!-- change this line to the Aliyun mirror -->
    <distMgmtSnapshotsUrl>https://maven.aliyun.com/repository/apache-snapshots</distMgmtSnapshotsUrl>
    <distMgmtStagingId>apache.staging.https</distMgmtStagingId>
    <distMgmtStagingName>Apache Release Distribution Repository</distMgmtStagingName>
    <!-- change this line to the Aliyun mirror -->
    <distMgmtStagingUrl>https://maven.aliyun.com/repository/central</distMgmtStagingUrl>

Applying a patch

A patch has to be applied by hand here, otherwise the YARN UI build still fails with the following error:

[INFO] error triple-beam@1.4.1: The engine "node" is incompatible with this module. Expected version ">= 14.0.0". Got "12.22.1"

The corresponding issue: hadoop-deb FAILED on project hadoop-yarn-applications-catalog-webapp

Save the following content to bigtop-packages/src/common/hadoop/patch8-YARN-11528-triple-beam.diff:

diff --git a/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-applications/hadoop-yarn-applications-catalog/hadoop-yarn-applications-catalog-webapp/package.json b/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-applications/hadoop-yarn-applications-catalog/hadoop-yarn-applications-catalog-webapp/package.json
index f09442cfc4e87..59cc3da179fd0 100644
--- a/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-applications/hadoop-yarn-applications-catalog/hadoop-yarn-applications-catalog-webapp/package.json
+++ b/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-applications/hadoop-yarn-applications-catalog/hadoop-yarn-applications-catalog-webapp/package.json
@@ -19,6 +19,9 @@
         "shelljs": "^0.2.6",
         "apidoc": "0.17.7"
     },
+    "resolutions": {
+        "triple-beam": "1.3.0"
+    },
     "scripts": {
         "prestart": "npm install & mvn clean package",
         "pretest": "npm install"

Then just run the build again.

Tez

phantomjs fails to download

The build pulls the phantomjs archive from GitHub, which will most likely time out:

Saving to /tmp/phantomjs/phantomjs-2.1.1-linux-x86_64.tar.bz2
Receiving...

Error making request.
Error: socket hang up
    at TLSSocket.onHangUp (_tls_wrap.js:1097:19)
    at TLSSocket.g (events.js:273:16)
    at emitNone (events.js:85:20)
    at TLSSocket.emit (events.js:179:7)
    at endReadableNT (_stream_readable.js:913:12)
    at _combinedTickCallback (internal/process/next_tick.js:74:11)
    at process._tickCallback (internal/process/next_tick.js:98:9)
    
Please report this full log at https://github.com/Medium/phantomjs

Plenty of posts online suggest installing it manually with npm, but all you actually need to do is download the archive by hand to /tmp/phantomjs/phantomjs-2.1.1-linux-x86_64.tar.bz2; the build picks up the archive from the /tmp directory on its own.
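A sketch of pre-seeding the file (the npmmirror URL is my suggestion; any mirror hosting phantomjs 2.1.1 works):

```shell
# Put the phantomjs archive where the build expects it, skipping the GitHub fetch.
mkdir -p /tmp/phantomjs
curl -fsSL -o /tmp/phantomjs/phantomjs-2.1.1-linux-x86_64.tar.bz2 \
  "https://npmmirror.com/mirrors/phantomjs/phantomjs-2.1.1-linux-x86_64.tar.bz2" \
  || echo "download failed; place the tarball in /tmp/phantomjs/ manually"
ls /tmp/phantomjs/
```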

allow-root

The default pom.xml disables running under root (--allow-root=false), which produces the error below:

[INFO] Running 'bower install --allow-root=false' in /ws/build/tez/rpm/BUILD/apache-tez-0.10.1-src/tez-ui/src/main/webapp
[ERROR] bower ESUDO         Cannot be run with sudo
[ERROR] 
[ERROR] Additional error details:
[ERROR] Since bower is a user command, there is no need to execute it with superuser permissions.
[ERROR] If you're having permission errors when using bower without sudo, please spend a few minutes learning more about how your system should work and make any necessary repairs.
[ERROR] 
[ERROR] http://www.joyent.com/blog/installing-node-and-npm
[ERROR] https://gist.github.com/isaacs/579814
[ERROR] 
[ERROR] You can however run a command with sudo using "--allow-root" option

Manually enable allow-root by changing it to true.
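One way to script the change is a sed over the flag string. The demo below runs on a stand-in file, because the exact property wrapping --allow-root may differ across Tez versions; in the real build you would run the same loop from the unpacked Tez source root:

```shell
# Mini stand-in tree carrying the bower flag as it appears in the tez-ui config.
mkdir -p /tmp/tez-demo/tez-ui
echo '<allow-root-build>--allow-root=false</allow-root-build>' \
  > /tmp/tez-demo/tez-ui/pom.xml
# flip every occurrence under the tree
grep -rl -- '--allow-root=false' /tmp/tez-demo | while read -r f; do
  sed -i 's/--allow-root=false/--allow-root=true/' "$f"
done
grep -r 'allow-root=true' /tmp/tez-demo
```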

After that it passes in one go.

Zeppelin

Replacing download URLs

During the build, Zeppelin downloads the source packages of several components directly from the Apache site, which is very slow:

It is best to replace these with a domestic mirror. The versions involved are fairly old and the Aliyun and Tsinghua mirrors no longer carry them, but in my testing the Huawei mirror still works:

In rlang/pom.xml and spark/pom.xml:

        <spark.src.download.url>
            https://mirrors.huaweicloud.com/apache/spark/${spark.archive}/${spark.archive}.tgz
        </spark.src.download.url>
        <spark.bin.download.url>
            https://mirrors.huaweicloud.com/apache/spark/${spark.archive}/${spark.archive}-bin-without-hadoop.tgz
        </spark.bin.download.url>

flink/flink-scala-2.11/flink-scala-parent/pom.xml
flink/flink-scala-parent/pom.xml
flink/flink-scala-2.12/flink-scala-parent/pom.xml

  <properties>
    <flink.bin.download.url>https://mirrors.huaweicloud.com/apache/flink/flink-${flink.version}/flink-${flink.version}-bin-scala_${flink.scala.binary.version}.tgz</flink.bin.download.url>
  </properties>

git configuration

During the build, one front-end dependency pulls code over git:

That fetch fails as-is, so git needs the following configuration:

git config --global url.https://gh-proxy.com/https://github.com/.insteadOf git://github.com/
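You can read the rewrite rule back to confirm it took effect (gh-proxy.com is the mirror used above; any HTTPS GitHub proxy works the same way):

```shell
# Rewrite git:// GitHub URLs through the HTTPS proxy mirror.
git config --global url."https://gh-proxy.com/https://github.com/".insteadOf git://github.com/
# read the rule back; prints the matched prefix: git://github.com/
git config --global --get url."https://gh-proxy.com/https://github.com/".insteadOf
```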

OK, after this it is just a matter of waiting for the build to pass.
