I'm trying to set up Flink and run a cluster, and although I get the following output, which makes it look like the cluster started:
$ ./bin/start-cluster.sh
Starting cluster.
Starting standalonesession daemon on host LAPTOP-HRAHBL24.
Starting taskexecutor daemon on host LAPTOP-HRAHBL24.
But when I go to localhost:8081 the connection is refused, so I checked the Flink logs. In the task executor log I see the following error:
Error: Could not find or load main class org.apache.flink.runtime.taskexecutor.TaskManagerRunner
And in the standalone session log I get this:
Error: Could not find or load main class org.apache.flink.runtime.entrypoint.StandaloneSessionClusterEntrypoint
I've been searching all over the internet and can't find anything. My Java environment and system variables are correct, since I can see the output of both the java and javac version commands. I'm using Java 8, specifically jdk1.8.0_251. I tried the above with Flink 1.10.1 and 1.5.0, and both give me the same error. Any ideas on how to fix this?
I ran into the same problem, but now I'm able to start the cluster and see the localhost:8081 UI.
Running the cluster on Windows 10 - Apache Flink 1.11.2 for Scala 2.11.
Here are the steps I took:
- Activate WSL for Windows: https://www.thewindowsclub.com/how-to-run-sh-or-shell-script-file-in-windows-10 (Part 1, executing shell script files with WSL)
- Get Ubuntu for Windows 10 to run Linux commands (to run the .sh files); you can do this by going to the Microsoft Store and downloading your preferred Linux distribution (or open PowerShell in Windows and type bash for helpful pointers)
- Install OpenJDK on Ubuntu as described here: https://askubuntu.com/questions/746413/trying-to-install-java-8-unable-to-locate-package-openjdk-8-jre (open the Linux shell first, then follow the instructions)
After these steps you should be able to open a Linux shell in the Apache Flink 1.11.2 folder and run ./bin/start-cluster.sh without any problems.
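For reference, a minimal sketch of those steps from the WSL/Ubuntu shell (the package name and the Flink folder path are examples; adjust them to your setup):

    # Install OpenJDK 8, per the askubuntu link above
    sudo apt-get update
    sudo apt-get install -y openjdk-8-jdk

    # Windows drives are mounted under /mnt in WSL; change into the extracted Flink folder
    cd /mnt/c/Users/<you>/flink-1.11.2

    # Start the cluster, then open http://localhost:8081 in a browser
    ./bin/start-cluster.sh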
This worked for me. The following line was updated in flink-daemon.sh:
"$JAVA_RUN" $JVM_ARGS ${FLINK_ENV_JAVA_OPTS} "${log_setting[@]}" -classpath "`manglePathList "$FLINK_TM_CLASSPATH"`" ${CLASS_TO_RUN} "${ARGS[@]}" > "$out" 200<&- 2>&1 < /dev/null &
I ran into the same problem and have only just solved it.
If you print the java commands that the Flink scripts try to execute, instead of executing them, you get something like:
java <some-flags> -classpath <all-of-the-jars-in-lib-folder>::: <class-to-execute> <more-flags>
Invoking those commands directly in the shell gives the same output, but now you can tweak the command at will to reproduce the bad or desired behavior. There are two problems with the java command.
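One simple way to print the command instead of executing it (an assumption on my part; any method of dumping the command works) is to temporarily prefix the JVM invocation in bin/flink-daemon.sh with echo:

    # In bin/flink-daemon.sh, temporarily change the line that launches the JVM from
    "$JAVA_RUN" $JVM_ARGS ... ${CLASS_TO_RUN} "${ARGS[@]}" ...
    # to
    echo "$JAVA_RUN" $JVM_ARGS ... ${CLASS_TO_RUN} "${ARGS[@]}"
    # then run ./bin/start-cluster.sh and copy the printed command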
The first problem is the ::: at the end of the classpath: remove it!
The second problem is the log4j.configurationFile flag, which contains a path that git bash + Windows cannot interpret. I had to replace
-Dlog4j.configurationFile=file:/c/Users/eduardo/dev/flink-1.11.2/conf/log4j.properties
with
-Dlog4j.configurationFile="\Users\eduardo\dev\flink-1.11.2\conf\log4j.properties"
That did the trick for me; give it a try and let me know how it goes!
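Putting both fixes together, the command you run by hand should look roughly like this (same placeholders as above; the path is the one from my machine):

    java <some-flags> -Dlog4j.configurationFile="\Users\eduardo\dev\flink-1.11.2\conf\log4j.properties" -classpath <all-of-the-jars-in-lib-folder> <class-to-execute> <more-flags>

That is, the trailing ::: is gone and the log4j path is in Windows style.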
This might be related to your pom.xml file. Here is an example:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>flink_examples</groupId>
    <artifactId>kafkaExample</artifactId>
    <version>1.0-SNAPSHOT</version>
    <packaging>jar</packaging>

    <name>Flink Quickstart Job</name>
    <url>http://www.myorganization.org</url>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <flink.version>1.9.1</flink.version>
        <java.version>1.8</java.version>
        <scala.binary.version>2.11</scala.binary.version>
        <jackson.version>2.9.0</jackson.version>
        <maven.compiler.source>${java.version}</maven.compiler.source>
        <maven.compiler.target>${java.version}</maven.compiler.target>
    </properties>

    <repositories>
        <repository>
            <id>apache.snapshots</id>
            <name>Apache Development Snapshot Repository</name>
            <url>https://repository.apache.org/content/repositories/snapshots/</url>
            <releases>
                <enabled>false</enabled>
            </releases>
            <snapshots>
                <enabled>true</enabled>
            </snapshots>
        </repository>
    </repositories>

    <dependencies>
        <!-- Apache Flink dependencies -->
        <!-- These dependencies are provided, because they should not be packaged into the JAR file. -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-java</artifactId>
            <version>${flink.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
            <scope>provided</scope>
        </dependency>
        <!-- Add connector dependencies here. They must be in the default scope (compile). -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-kafka_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
        </dependency>
        <!-- Add logging framework, to produce console output when running in the IDE. -->
        <!-- These dependencies are excluded from the application JAR by default. -->
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
            <version>1.7.7</version>
            <!-- <scope>runtime</scope>-->
        </dependency>
        <dependency>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
            <version>1.2.17</version>
            <!-- <scope>runtime</scope>-->
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-databind</artifactId>
            <version>${jackson.version}</version>
        </dependency>
        <dependency>
            <groupId>commons-net</groupId>
            <artifactId>commons-net</artifactId>
            <version>3.7-SNAPSHOT</version>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <!-- Java Compiler -->
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.1</version>
                <configuration>
                    <source>${java.version}</source>
                    <target>${java.version}</target>
                </configuration>
            </plugin>
            <!-- We use the maven-shade plugin to create a fat jar that contains all necessary dependencies. -->
            <!-- Change the value of <mainClass>...</mainClass> if your program entry point changes. -->
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>3.0.0</version>
                <executions>
                    <!-- Run shade goal on package phase -->
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                        <configuration>
                            <artifactSet>
                                <excludes>
                                    <exclude>org.apache.flink:force-shading</exclude>
                                    <exclude>com.google.code.findbugs:jsr305</exclude>
                                    <exclude>org.slf4j:*</exclude>
                                    <exclude>log4j:*</exclude>
                                </excludes>
                            </artifactSet>
                            <filters>
                                <filter>
                                    <!-- Do not copy the signatures in the META-INF folder.
                                         Otherwise, this might cause SecurityExceptions when using the JAR. -->
                                    <artifact>*:*</artifact>
                                    <excludes>
                                        <exclude>META-INF/*.SF</exclude>
                                        <exclude>META-INF/*.DSA</exclude>
                                        <exclude>META-INF/*.RSA</exclude>
                                    </excludes>
                                </filter>
                            </filters>
                            <transformers>
                                <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                                    <mainClass>flink_examples.StreamingJob</mainClass>
                                </transformer>
                            </transformers>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>

        <pluginManagement>
            <plugins>
                <!-- This improves the out-of-the-box experience in Eclipse by resolving some warnings. -->
                <plugin>
                    <groupId>org.eclipse.m2e</groupId>
                    <artifactId>lifecycle-mapping</artifactId>
                    <version>1.0.0</version>
                    <configuration>
                        <lifecycleMappingMetadata>
                            <pluginExecutions>
                                <pluginExecution>
                                    <pluginExecutionFilter>
                                        <groupId>org.apache.maven.plugins</groupId>
                                        <artifactId>maven-shade-plugin</artifactId>
                                        <versionRange>[3.0.0,)</versionRange>
                                        <goals>
                                            <goal>shade</goal>
                                        </goals>
                                    </pluginExecutionFilter>
                                    <action>
                                        <ignore/>
                                    </action>
                                </pluginExecution>
                                <pluginExecution>
                                    <pluginExecutionFilter>
                                        <groupId>org.apache.maven.plugins</groupId>
                                        <artifactId>maven-compiler-plugin</artifactId>
                                        <versionRange>[3.1,)</versionRange>
                                        <goals>
                                            <goal>testCompile</goal>
                                            <goal>compile</goal>
                                        </goals>
                                    </pluginExecutionFilter>
                                    <action>
                                        <ignore/>
                                    </action>
                                </pluginExecution>
                            </pluginExecutions>
                        </lifecycleMappingMetadata>
                    </configuration>
                </plugin>
            </plugins>
        </pluginManagement>
    </build>

    <!-- This profile helps to make things run out of the box in IntelliJ -->
    <!-- It adds Flink's core classes to the runtime class path. -->
    <!-- Otherwise they are missing in IntelliJ, because the dependency is 'provided' -->
    <profiles>
        <profile>
            <id>add-dependencies-for-IDEA</id>
            <activation>
                <property>
                    <name>idea.version</name>
                </property>
            </activation>
            <dependencies>
                <dependency>
                    <groupId>org.apache.flink</groupId>
                    <artifactId>flink-java</artifactId>
                    <version>${flink.version}</version>
                    <scope>compile</scope>
                </dependency>
                <dependency>
                    <groupId>org.apache.flink</groupId>
                    <artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
                    <version>${flink.version}</version>
                    <scope>compile</scope>
                </dependency>
            </dependencies>
        </profile>
    </profiles>
</project>
I ran into the same problem, but for me it crashed right after printing: Starting standalonesession daemon on host...
I think I managed to fix it too, and edu's answer here helped me a lot. I'm using this version: Apache Flink 1.11.2 for Scala 2.11, so this workaround may not apply to other versions.
So I tracked down where that ::: at the end of the classpath was coming from, and it looks like it comes from flink-daemon.sh. On line 127 of that file there is the command for the Java VM that runs the code, and it looks like this:
$JAVA_RUN $JVM_ARGS ${FLINK_ENV_JAVA_OPTS} "${log_setting[@]}" -classpath "`manglePathList "$FLINK_TM_CLASSPATH:$INTERNAL_HADOOP_CLASSPATHS"`" ${CLASS_TO_RUN} "${ARGS[@]}" > "$out"
The important part of this line is the following:
:$INTERNAL_HADOOP_CLASSPATHS
The 3 colons come from here: the first one is there explicitly, and the second and third are the value of this variable.
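A minimal illustration of that expansion (the three Hadoop/YARN sub-variables are my assumption about how config.sh builds INTERNAL_HADOOP_CLASSPATHS; the exact definition may differ between versions):

    # With no Hadoop installed, every component is empty...
    HADOOP_CLASSPATH=""
    HADOOP_CONF_DIR=""
    YARN_CONF_DIR=""
    INTERNAL_HADOOP_CLASSPATHS="${HADOOP_CLASSPATH}:${HADOOP_CONF_DIR}:${YARN_CONF_DIR}"
    # ...so the variable's value is just "::", and together with the explicit
    # colon in flink-daemon.sh the classpath ends in ":::"
    echo "lib/flink-dist.jar:$INTERNAL_HADOOP_CLASSPATHS"
    # prints: lib/flink-dist.jar:::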
So basically, just remove that part of the line, save the file, and run start-cluster.sh again; this time it should work.
It may log some Hadoop-related errors, but it won't crash and will work fine. For example, I saw this:
Hadoop is not in the classpath/dependencies. The extended set of supported File Systems via Hadoop is not available.
Or this:
Cannot install HadoopSecurityContext because Hadoop cannot be found in the Classpath.
But that doesn't seem to matter and everything works fine. By the way, installing Hadoop would probably also fix this, but since this works I'm not going to touch it. I can open localhost:8081 and the dashboard is running.
In case anyone runs into the same situation: I had a space in the path where the scripts are located -> "Apache Flink".