Download and extract Flink
Start Flink
```bash
$ ./bin/start-cluster.sh
Starting cluster.
Starting standalonesession daemon on host DESKTOP-T4TU7JE.
Starting taskexecutor daemon on host DESKTOP-T4TU7JE.
```
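Optionally, verify that the cluster is up: the JobManager web UI listens on http://localhost:8081 by default, and `jps` should show the Flink daemon processes (the process names below assume a standalone Flink 1.x session cluster and may vary by version):

```bash
$ jps
# Among the output, expect something like:
#   StandaloneSessionClusterEntrypoint   # the JobManager
#   TaskManagerRunner                    # the TaskManager
```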
Flink releases ship with a number of example jobs. You can quickly deploy one of these applications to the running cluster.
```bash
$ ./bin/flink run examples/streaming/WordCount.jar
$ tail log/flink-*-taskexecutor-*.out
(nymph,1)
(in,3)
(thy,1)
(orisons,1)
(be,4)
(all,2)
(my,1)
(sins,1)
(remember,1)
(d,4)
```
Stop Flink
```bash
$ ./bin/stop-cluster.sh
```
Run a first Flink hello world with Java code
pom.xml
```xml
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-java</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-java_2.12</artifactId>
    <version>${flink.version}</version>
</dependency>
<!-- Since Flink 1.11, flink-clients is no longer a transitive dependency of
     flink-streaming-java; it is needed to execute the job locally (e.g. from an IDE). -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-clients_2.12</artifactId>
    <version>${flink.version}</version>
</dependency>
```
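The dependencies above reference a `flink.version` property that is not defined here. A minimal sketch of the corresponding `<properties>` block, assuming a 1.14.x release (the `_2.12` artifact suffix implies a pre-1.15, Scala-suffixed build):

```xml
<properties>
    <!-- Assumed version; any 1.11 - 1.14 release with _2.12 artifacts should work. -->
    <flink.version>1.14.6</flink.version>
</properties>
```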
Java code
```java
import org.apache.flink.api.common.functions.FlatMapFunction;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.util.Collector;

public class HelloWorld {

    public static void main(String[] args) throws Exception {
        // Set up the execution environment
        final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Create a stream of data
        DataStream<String> dataStream = env.fromElements("Hello", "World", "Flink");

        // Apply transformations: split each element into words, then count per word
        DataStream<Tuple2<String, Integer>> wordCounts = dataStream
                .flatMap(new Splitter())
                .keyBy(value -> value.f0) // key by the word; index-based keyBy(0) is deprecated
                .sum(1);

        // Print the result
        wordCounts.print();

        // Execute the Flink job
        env.execute("Hello World Example");
    }

    // Custom FlatMapFunction to split each sentence into words
    public static final class Splitter implements FlatMapFunction<String, Tuple2<String, Integer>> {
        @Override
        public void flatMap(String sentence, Collector<Tuple2<String, Integer>> out) {
            // Split the sentence into words
            for (String word : sentence.split(" ")) {
                // Emit the word with a count of 1
                out.collect(new Tuple2<>(word, 1));
            }
        }
    }
}
```
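To run the job on the cluster started earlier, package it and submit it with the flink CLI. The jar name below is a placeholder; substitute the artifact your own build produces:

```bash
$ mvn clean package
# flink-hello-world-1.0.jar is hypothetical; use your build's actual jar name
$ ./bin/flink run -c HelloWorld target/flink-hello-world-1.0.jar
```

Alternatively, with flink-clients on the classpath, you can simply run the `main()` method from your IDE and the job executes in a local mini-cluster.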
References