Compiling Flink 1.18 with JDK 17

Modify the flink-runtime-web configuration:

```xml
<npm.proxy>--registry https://registry.npmmirror.com</npm.proxy>
```

Apply this change to flink-runtime-web (the original post showed a screenshot of the pom.xml location, omitted here); otherwise the build will fail.
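For context, a minimal sketch of how this property is typically wired inside flink-runtime-web/pom.xml, reconstructed from the `npm ci --cache-max=0 --no-save` error discussed below; the exact plugin configuration varies across Flink versions, so check your checkout:

```xml
<!-- Sketch: the property defaults to empty and is overridden to point npm at a mirror. -->
<properties>
  <npm.proxy>--registry https://registry.npmmirror.com</npm.proxy>
</properties>

<!-- The frontend-maven-plugin execution then appends it to the npm arguments,
     roughly like: <arguments>ci --cache-max=0 --no-save ${npm.proxy}</arguments> -->
```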

Build command. Use a private registry you control; otherwise the lang-tag package will also fail to resolve:

```bash
mvn clean package -DskipTests -T 20 -Dfast -Dmaven.compile.fork=true -Pjava17-target -U -X -e
```
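On the "private server" point: npm resolution is governed by the `<npm.proxy>` registry shown above, while Maven-side artifacts can additionally be routed through your own Nexus/Artifactory via a mirror entry in ~/.m2/settings.xml. A minimal sketch, where the id and URL are placeholders for your private server:

```xml
<settings>
  <mirrors>
    <mirror>
      <!-- Placeholder values: point these at your own private repository. -->
      <id>internal-mirror</id>
      <mirrorOf>central</mirrorOf>
      <url>https://nexus.example.com/repository/maven-public/</url>
    </mirror>
  </mirrors>
</settings>
```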

Reference:

https://github.com/apache/flink/pull/18860/files

Error encountered:

```text
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:testCompile (scala-test-compile) on project flink-scala_2.12: wrap: org.apache.commons.exec.ExecuteException: Process exited with an error: 240 (Exit value: 240) -> [Help 1]
```

https://issues.apache.org/jira/browse/FLINK-15736
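Exit code 240 means the forked Scala compiler JVM died rather than reporting a normal compile error. A hedged recovery sketch: give the build more memory and resume from the failing module. Whether the forked compiler inherits MAVEN_OPTS depends on the plugin configuration, so this is an assumption; if the crash persists it is likely the JDK compatibility issue tracked in the JIRA above.

```bash
# Assumption: the crash is resource-related; bump heap, then resume at flink-scala.
export MAVEN_OPTS="-Xmx4g -XX:MaxMetaspaceSize=1g"
mvn clean install -DskipTests -Dfast -Pjava17-target -rf :flink-scala_2.12
```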

  1. When compiling flink-runtime-web, the `npm ci --cache-max=0 --no-save` step frequently fails.

Solutions:

- Delete the local npm cache (`npm cache clean` or `rm -rf ~/.npm`) and rerun.

- Set `<npm.proxy>` in flink/flink-runtime-web/pom.xml to `--registry http://172.17.0.1:8888/repository/npm/` (a private mirror) or `--registry https://registry.npm.taobao.org` (the Taobao npm registry).

- Try `mvn clean install -DskipTests -Dfast -rf :flink-runtime-web` to resume the build. If the same error appears, `cd flink/flink-runtime-web` and run `npm install --cache-max=0 --no-save --registry https://registry.npm.taobao.org --force` to see whether it succeeds (a consolidated sketch follows this list).

- If the manual npm install still fails, upgrade npm (on a Mac, `brew upgrade npm`) and retry. Once npm install succeeds, return to the Flink project root and resume the build.

- If you do not need the WebUI, add `-Pskip-webui-build` to the mvn command.
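Putting the steps above together, a hedged end-to-end recovery sequence (registry URL as given above; depending on the Flink version the frontend sources may live in a web/ subdirectory):

```bash
# Clear the local npm cache, reinstall against the mirror, then resume the Maven build.
npm cache clean --force   # newer npm requires --force; or: rm -rf ~/.npm
cd flink-runtime-web
npm install --cache-max=0 --no-save --registry https://registry.npm.taobao.org --force
cd ..
mvn clean install -DskipTests -Dfast -rf :flink-runtime-web
```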

  2. Building flink-connector-hive with a newer Maven version fails: `[ERROR] Failed to execute goal on project flink-connector-hive_2.12: Could not resolve dependencies for project org.apache.flink:flink-connector-hive_2.12:jar:1.16-SNAPSHOT: Failed to collect dependencies at org.apache.hive:hive-exec:jar:2.3.9 -> org.pentaho:pentaho-aggdesigner-algorithm:jar:5.1.5-jhyde: Failed to read artifact descriptor for org.pentaho:pentaho-aggdesigner-algorithm:jar:5.1.5-jhyde: Could not transfer artifact org.pentaho:pentaho-aggdesigner-algorithm:pom:5.1.5-jhyde from/to maven-default-http-blocker (http://0.0.0.0/): Blocked mirror for repositories: [conjars (http://conjars.org/repo, default, releases+snapshots), apache.snapshots (http://repository.apache.org/snapshots, default, snapshots)] -> [Help 1]`

Solutions:

- Option 1: following Ref 3, edit that module's pom.xml to add the conjars repository (see the sketch after this list).

- Option 2: to make later code contributions easier, build with an older Maven version (e.g. 3.2.5). Flink ships an mvnw wrapper pinned to such a version, so you can resume directly with `./mvnw clean install -DskipTests -Dfast`; alternatively, keep multiple Maven versions installed and switch between them.
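For Option 1, a minimal sketch of the repository declaration. FLINK-27894 (Ref 3) has the authoritative change; the https URL below is an assumption, since Maven 3.8+ blocks the original http://conjars.org/repo endpoint:

```xml
<!-- Sketch: declare conjars in flink-connector-hive's pom.xml so the pentaho
     artifact can be resolved. The URL is an assumed https mirror; verify it
     against FLINK-27894 before relying on it. -->
<repositories>
  <repository>
    <id>conjars</id>
    <url>https://conjars.wensel.net/repo/</url>
  </repository>
</repositories>
```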

Installing version 7.2.2 (flink1.17.1-rc)

Manually install the kafka-schema-registry-client artifact into the local Maven repository:

```bash
mvn install:install-file -DgroupId=io.confluent -DartifactId=kafka-schema-registry-client -Dversion=7.5.3 -Dpackaging=jar -Dfile=kafka-schema-registry-client-7.5.3.jar -DpomFile=kafka-schema-registry-client-7.5.3.pom
```
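The jar and pom must be on disk first. A hedged sketch of fetching them, assuming Confluent's public Maven repository and its standard directory layout (verify the URLs before use):

```bash
# Assumption: packages.confluent.io hosts this artifact under the standard Maven layout.
curl -O https://packages.confluent.io/maven/io/confluent/kafka-schema-registry-client/7.5.3/kafka-schema-registry-client-7.5.3.jar
curl -O https://packages.confluent.io/maven/io/confluent/kafka-schema-registry-client/7.5.3/kafka-schema-registry-client-7.5.3.pom
```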

## Reference

  1. Building Flink from Source | Apache Flink

  2. Flink 1.13.2 源码编译 (Building Flink 1.13.2 from source) - Zhihu

  3. [FLINK-27894] Build flink-connector-hive failed using Maven@3.8.5 - ASF JIRA
