37. Flink WindowAssigner: Session Window Examples

1. Processing Time

A processing-time session window needs no watermark strategy or timestamp assignment; only the session gap has to be specified. A minimal setup sketch for running this snippet follows the code.

java
input.keyBy(e -> e)
                .window(ProcessingTimeSessionWindows.withGap(Time.minutes(10)))
                .apply(new WindowFunction<String, String, String, TimeWindow>() {
                    @Override
                    public void apply(String s, TimeWindow timeWindow, Iterable<String> iterable, Collector<String> collector) throws Exception {
                        for (String string : iterable) {
                            collector.collect(string);
                        }
                    }
                })
                .print();
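
The snippet assumes a StreamExecutionEnvironment and a socket source named input, exactly as in the complete example in section 4. A minimal setup sketch (host and port are placeholders; feed the socket with a tool such as nc -lk 8888):

java
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

// Text lines read from a local socket; each line becomes one element
DataStreamSource<String> input = env.socketTextStream("localhost", 8888);

// ... build the keyed session-window pipeline shown above on `input` ...

env.execute();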

2. Event Time

An event-time session window requires a watermark strategy and timestamp assignment in addition to the session gap. A note on handling idle partitions follows the snippet.

java
// Event time requires a watermark strategy and timestamp assignment
        SingleOutputStreamOperator<Tuple2<String, Long>> map = input.map(new MapFunction<String, Tuple2<String, Long>>() {
            @Override
            public Tuple2<String, Long> map(String input) throws Exception {
                String[] fields = input.split(",");
                return new Tuple2<>(fields[0], Long.parseLong(fields[1]));
            }
        });

        SingleOutputStreamOperator<Tuple2<String, Long>> watermarks = map.assignTimestampsAndWatermarks(WatermarkStrategy.<Tuple2<String, Long>>forBoundedOutOfOrderness(Duration.ofSeconds(0))
                .withTimestampAssigner(new SerializableTimestampAssigner<Tuple2<String, Long>>() {
                    @Override
                    public long extractTimestamp(Tuple2<String, Long> input, long l) {
                        return input.f1;
                    }
                }));

        // Event-time session window with a fixed gap
        watermarks.keyBy(e -> e.f0)
                .window(EventTimeSessionWindows.withGap(Time.minutes(10)))
                .apply(new WindowFunction<Tuple2<String, Long>, String, String, TimeWindow>() {
                    @Override
                    public void apply(String s, TimeWindow timeWindow, Iterable<Tuple2<String, Long>> iterable, Collector<String> collector) throws Exception {
                        for (Tuple2<String, Long> stringLongTuple2 : iterable) {
                            collector.collect(stringLongTuple2.f0);
                        }
                    }
                })
                .print();
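
With a parallelism greater than 1, a subtask that receives no data holds back the overall watermark, so the event-time session windows above may never fire; the comment in the complete example below hints at this. A hedged sketch of an alternative watermark assignment, assuming the same map stream and imports as in section 4 (the 30-second idleness timeout is an arbitrary choice), using WatermarkStrategy#withIdleness:

java
SingleOutputStreamOperator<Tuple2<String, Long>> watermarks = map.assignTimestampsAndWatermarks(
        WatermarkStrategy.<Tuple2<String, Long>>forBoundedOutOfOrderness(Duration.ofSeconds(0))
                .withTimestampAssigner((element, recordTimestamp) -> element.f1)
                // Subtasks that emit no records for 30 seconds are marked idle
                // and no longer block watermark progress downstream
                .withIdleness(Duration.ofSeconds(30)));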

3. Fixed Gap and Dynamic Gap

java
// Fixed gap: the same 10-minute inactivity timeout for every element
EventTimeSessionWindows.withGap(Time.minutes(10));

// Dynamic gap: extract() returns the gap in milliseconds, computed per element
EventTimeSessionWindows.withDynamicGap(new SessionWindowTimeGapExtractor<Tuple2<String, Long>>() {
    @Override
    public long extract(Tuple2<String, Long> element) {
        return element.f1 + 2000L;
    }
});
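
A fixed gap applies the same inactivity timeout to every element, while a dynamic gap computes the timeout per element through a SessionWindowTimeGapExtractor whose extract() returns the gap in milliseconds. The processing-time assigner offers the same two variants, as the complete example below also shows; a sketch in which the 5-second dynamic gap is an arbitrary illustration:

java
ProcessingTimeSessionWindows.withGap(Time.minutes(10));

ProcessingTimeSessionWindows.withDynamicGap(new SessionWindowTimeGapExtractor<String>() {
    @Override
    public long extract(String element) {
        // The session gap in milliseconds, derived per element; 5 seconds as a placeholder
        return 5_000L;
    }
});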

4. Complete Code Example

java
import org.apache.flink.api.common.eventtime.SerializableTimestampAssigner;
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.datastream.DataStreamSource;
import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.windowing.WindowFunction;
import org.apache.flink.streaming.api.windowing.assigners.EventTimeSessionWindows;
import org.apache.flink.streaming.api.windowing.assigners.ProcessingTimeSessionWindows;
import org.apache.flink.streaming.api.windowing.assigners.SessionWindowTimeGapExtractor;
import org.apache.flink.streaming.api.windowing.time.Time;
import org.apache.flink.streaming.api.windowing.windows.TimeWindow;
import org.apache.flink.util.Collector;

import java.time.Duration;

public class _04_WindowAssignerSession {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        DataStreamSource<String> input = env.socketTextStream("localhost", 8888);

        // Parallelism is limited for testing; in production, idle partitions must be handled
        // (e.g. WatermarkStrategy#withIdleness), otherwise the watermark cannot advance
        env.setParallelism(2);

        // Event time requires a watermark strategy and timestamp assignment
        SingleOutputStreamOperator<Tuple2<String, Long>> map = input.map(new MapFunction<String, Tuple2<String, Long>>() {
            @Override
            public Tuple2<String, Long> map(String input) throws Exception {
                String[] fields = input.split(",");
                return new Tuple2<>(fields[0], Long.parseLong(fields[1]));
            }
        });

        SingleOutputStreamOperator<Tuple2<String, Long>> watermarks = map.assignTimestampsAndWatermarks(WatermarkStrategy.<Tuple2<String, Long>>forBoundedOutOfOrderness(Duration.ofSeconds(0))
                .withTimestampAssigner(new SerializableTimestampAssigner<Tuple2<String, Long>>() {
                    @Override
                    public long extractTimestamp(Tuple2<String, Long> input, long l) {
                        return input.f1;
                    }
                }));

        // Event-time session window with a fixed gap
        watermarks.keyBy(e -> e.f0)
                .window(EventTimeSessionWindows.withGap(Time.minutes(10)))
                .apply(new WindowFunction<Tuple2<String, Long>, String, String, TimeWindow>() {
                    @Override
                    public void apply(String s, TimeWindow timeWindow, Iterable<Tuple2<String, Long>> iterable, Collector<String> collector) throws Exception {
                        for (Tuple2<String, Long> stringLongTuple2 : iterable) {
                            collector.collect(stringLongTuple2.f0);
                        }
                    }
                })
                .print();

        // Event-time session window with a dynamic gap; extract() returns the session gap
        // in milliseconds (the value below is only illustrative)
        watermarks.keyBy(e -> e.f0)
                .window(EventTimeSessionWindows.withDynamicGap(new SessionWindowTimeGapExtractor<Tuple2<String, Long>>() {
                    @Override
                    public long extract(Tuple2<String, Long> element) {
                        return element.f1 + 2000L;
                    }
                }))
                .apply(new WindowFunction<Tuple2<String, Long>, String, String, TimeWindow>() {
                    @Override
                    public void apply(String s, TimeWindow timeWindow, Iterable<Tuple2<String, Long>> iterable, Collector<String> collector) throws Exception {
                        for (Tuple2<String, Long> stringLongTuple2 : iterable) {
                            collector.collect(stringLongTuple2.f0);
                        }
                    }
                })
                .print();

        // Processing-time session window with a fixed gap
        input.keyBy(e -> e)
                .window(ProcessingTimeSessionWindows.withGap(Time.minutes(10)))
                .apply(new WindowFunction<String, String, String, TimeWindow>() {
                    @Override
                    public void apply(String s, TimeWindow timeWindow, Iterable<String> iterable, Collector<String> collector) throws Exception {
                        for (String string : iterable) {
                            collector.collect(string);
                        }
                    }
                })
                .print();

        // Processing-time session window with a dynamic gap; extract() returns the session gap
        // in milliseconds (the value below is only illustrative)
        input.keyBy(e -> e)
                .window(ProcessingTimeSessionWindows.withDynamicGap(new SessionWindowTimeGapExtractor<String>() {
                    @Override
                    public long extract(String s) {
                        return System.currentTimeMillis() / 1000;
                    }
                }))
                .apply(new WindowFunction<String, String, String, TimeWindow>() {
                    @Override
                    public void apply(String s, TimeWindow timeWindow, Iterable<String> iterable, Collector<String> collector) throws Exception {
                        for (String string : iterable) {
                            collector.collect(string);
                        }
                    }
                })
                .print();

        env.execute();
    }
}
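
To try the complete example, start nc -lk 8888 before launching the job and type comma-separated key,timestampMillis lines such as a,1000. Note that an event-time session window only fires once the watermark passes the end of the session, i.e. after a later element arrives whose timestamp lies at least the gap (10 minutes here) beyond the session's last element; the processing-time windows fire on wall-clock inactivity instead. The dynamic-gap extractors in this example derive the gap from the element timestamp and from the wall clock respectively, so the resulting gaps can be very large; they are meant only to illustrate the API.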