Apache Flink - cannot find org.apache.flink.streaming.api.scala.DataStream


Using Apache Flink 1.3.2 and Cassandra 3.11, I wrote a simple piece of code that writes data to Cassandra using the Apache Flink Cassandra connector. Here is the code:

final Collection<String> collection = new ArrayList<>(50);
for (int i = 1; i <= 50; ++i) {
    collection.add("element " + i);
}
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
DataStream<Tuple2<UUID, String>> dataStream = env
        .fromCollection(collection)
        .map(new MapFunction<String, Tuple2<UUID, String>>() {
            final String mapped = " mapped ";
            String[] splitted;

            @Override
            public Tuple2<UUID, String> map(String s) throws Exception {
                splitted = s.split("\\s+"); // the regex backslash must be escaped in Java source
                return new Tuple2<>(
                        UUID.randomUUID(),
                        splitted[0] + mapped + splitted[1]
                );
            }
        });
dataStream.print();
CassandraSink.addSink(dataStream)
        .setQuery("INSERT INTO test.phases (id, text) values (?, ?);")
        .setHost("127.0.0.1")
        .build();
env.execute();
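As an aside, the map step above can be sketched standalone, outside Flink, to check the string logic in isolation. This is a minimal sketch: the class name `MapSketch` and method `mapLine` are hypothetical names introduced here for illustration, not part of the original code. Note that in Java source the whitespace regex must be written `"\\s+"`, not `"\s+"`.

```java
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;

public class MapSketch {
    // Mirrors the body of the MapFunction: "element 1" -> "element mapped 1"
    static String mapLine(String s) {
        String[] splitted = s.split("\\s+"); // "\\s+" in Java source, not "\s+"
        return splitted[0] + " mapped " + splitted[1];
    }

    public static void main(String[] args) {
        // Same input shape as the question: "element 1" .. "element 50"
        Collection<String> collection = new ArrayList<>(50);
        for (int i = 1; i <= 50; ++i) {
            collection.add("element " + i);
        }
        List<String> mapped = collection.stream().map(MapSketch::mapLine).toList();
        System.out.println(mapped.get(0));  // prints "element mapped 1"
        System.out.println(mapped.get(49)); // prints "element mapped 50"
    }
}
```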

Trying to run the same code with Apache Flink 1.4.2 (1.4.x), I get the error:

Error:(36, 22) java: cannot access org.apache.flink.streaming.api.scala.DataStream
  class file for org.apache.flink.streaming.api.scala.DataStream not found

on the line

CassandraSink.addSink(dataStream)
                    .setQuery("INSERT INTO test.phases (id, text) values (?, ?);")
                    .setHost("127.0.0.1")
                    .build();

I think some dependency changes in Apache Flink 1.4.2 are causing the problem.

These are the imports I use in the code:

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.cassandra.CassandraSink;

How can I resolve this error with Apache Flink 1.4.2?

Update: in Flink 1.3.2 the class org.apache.flink.streaming.api.scala.DataStream<T> appears in the Javadocs, but there is no such class in the 1.4.2 release.

I also tried the code example from the Cassandra connector documentation for Flink 1.4.2 and got the same error, although that example works fine with the Flink 1.3.2 dependencies!

In addition to all your other dependencies, make sure you have the Flink Scala dependency:

Maven

<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-streaming-scala_2.11</artifactId>
  <version>1.4.2</version>
</dependency>

Gradle

dependencies {
    compile group: 'org.apache.flink', name: 'flink-streaming-scala_2.11', version: '1.4.2'
    // ...
}

I managed to get your example working with the following imports and dependencies:

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.cassandra.CassandraSink;

Maven

<dependencies>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-java</artifactId>
        <version>1.4.2</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-streaming-java_2.11</artifactId>
        <version>1.4.2</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-streaming-scala_2.11</artifactId>
        <version>1.4.2</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-clients_2.11</artifactId>
        <version>1.4.2</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-cassandra_2.11</artifactId>
        <version>1.4.2</version>
    </dependency>
</dependencies>
