37. Flink WindowAssigner: Session Window Examples

1. Processing Time

No watermark strategy or timestamp assignment needs to be configured.

java
input.keyBy(e -> e)
        .window(ProcessingTimeSessionWindows.withGap(Time.minutes(10)))
        .apply(new WindowFunction<String, String, String, TimeWindow>() {
            @Override
            public void apply(String s, TimeWindow timeWindow, Iterable<String> iterable, Collector<String> collector) throws Exception {
                for (String string : iterable) {
                    collector.collect(string);
                }
            }
        })
        .print();
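
Note that with processing time the 10-minute gap is measured against the wall clock: a session only closes, and output only appears, once no element has arrived for that key for 10 real minutes. For interactive testing it can be convenient to use a much shorter gap, e.g. ProcessingTimeSessionWindows.withGap(Time.seconds(10)).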

2. Event Time

A watermark strategy and timestamp assignment must be configured.

java
// Event time requires a watermark strategy and a timestamp assigner
SingleOutputStreamOperator<Tuple2<String, Long>> map = input.map(new MapFunction<String, Tuple2<String, Long>>() {
    @Override
    public Tuple2<String, Long> map(String input) throws Exception {
        String[] fields = input.split(",");
        return new Tuple2<>(fields[0], Long.parseLong(fields[1]));
    }
});

SingleOutputStreamOperator<Tuple2<String, Long>> watermarks = map.assignTimestampsAndWatermarks(WatermarkStrategy.<Tuple2<String, Long>>forBoundedOutOfOrderness(Duration.ofSeconds(0))
        .withTimestampAssigner(new SerializableTimestampAssigner<Tuple2<String, Long>>() {
            @Override
            public long extractTimestamp(Tuple2<String, Long> input, long l) {
                return input.f1;
            }
        }));

// Event-time session window with a fixed gap
watermarks.keyBy(e -> e.f0)
        .window(EventTimeSessionWindows.withGap(Time.minutes(10)))
        .apply(new WindowFunction<Tuple2<String, Long>, String, String, TimeWindow>() {
            @Override
            public void apply(String s, TimeWindow timeWindow, Iterable<Tuple2<String, Long>> iterable, Collector<String> collector) throws Exception {
                for (Tuple2<String, Long> stringLongTuple2 : iterable) {
                    collector.collect(stringLongTuple2.f0);
                }
            }
        })
        .print();
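
For quick local testing without a socket, the source could be swapped for a bounded one; a minimal sketch, with made-up key,timestamp values in the comma-separated format the map function above expects:

java
// Hypothetical test input: "key,eventTimeMillis" lines replacing the socket source
DataStreamSource<String> input = env.fromElements(
        "a,1000",      // first element for key "a": a session starts
        "a,300000",    // 5 minutes later, still within the 10-minute gap: same session
        "a,1200000");  // 15 minutes after the previous element, gap exceeded: a new session starts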

3. Fixed Gap vs. Dynamic Gap

java
EventTimeSessionWindows.withGap(Time.minutes(10));

EventTimeSessionWindows.withDynamicGap(new SessionWindowTimeGapExtractor<Tuple2<String, Long>>() {
    @Override
    public long extract(Tuple2<String, Long> element) {
        return element.f1 + 2000L;
    }
});
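
The value returned by extract is the session gap in milliseconds for that element, so deriving it from the event timestamp as above is mainly illustrative. A more typical extractor reads a field of the element; a minimal sketch, with hypothetical key prefixes and arbitrary gap values:

java
// Hypothetical extractor: shorter sessions for keys starting with "vip", longer sessions otherwise
SessionWindowTimeGapExtractor<Tuple2<String, Long>> gapByKey = new SessionWindowTimeGapExtractor<Tuple2<String, Long>>() {
    @Override
    public long extract(Tuple2<String, Long> element) {
        // The returned gap is expressed in milliseconds
        return element.f0.startsWith("vip") ? 30_000L : 300_000L;
    }
};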

4. Complete Code Example

java
import org.apache.flink.api.common.eventtime.SerializableTimestampAssigner;
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.datastream.DataStreamSource;
import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.windowing.WindowFunction;
import org.apache.flink.streaming.api.windowing.assigners.EventTimeSessionWindows;
import org.apache.flink.streaming.api.windowing.assigners.ProcessingTimeSessionWindows;
import org.apache.flink.streaming.api.windowing.assigners.SessionWindowTimeGapExtractor;
import org.apache.flink.streaming.api.windowing.time.Time;
import org.apache.flink.streaming.api.windowing.windows.TimeWindow;
import org.apache.flink.util.Collector;

import java.time.Duration;

public class _04_WindowAssignerSession {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        DataStreamSource<String> input = env.socketTextStream("localhost", 8888);

        // Parallelism is limited to 2 for testing; in production you also need to handle idle sources (see the note after this example)
        env.setParallelism(2);

        // Event time requires a watermark strategy and a timestamp assigner
        SingleOutputStreamOperator<Tuple2<String, Long>> map = input.map(new MapFunction<String, Tuple2<String, Long>>() {
            @Override
            public Tuple2<String, Long> map(String input) throws Exception {
                String[] fields = input.split(",");
                return new Tuple2<>(fields[0], Long.parseLong(fields[1]));
            }
        });

        SingleOutputStreamOperator<Tuple2<String, Long>> watermarks = map.assignTimestampsAndWatermarks(WatermarkStrategy.<Tuple2<String, Long>>forBoundedOutOfOrderness(Duration.ofSeconds(0))
                .withTimestampAssigner(new SerializableTimestampAssigner<Tuple2<String, Long>>() {
                    @Override
                    public long extractTimestamp(Tuple2<String, Long> input, long l) {
                        return input.f1;
                    }
                }));

        // Event-time session window with a fixed gap
        watermarks.keyBy(e -> e.f0)
                .window(EventTimeSessionWindows.withGap(Time.minutes(10)))
                .apply(new WindowFunction<Tuple2<String, Long>, String, String, TimeWindow>() {
                    @Override
                    public void apply(String s, TimeWindow timeWindow, Iterable<Tuple2<String, Long>> iterable, Collector<String> collector) throws Exception {
                        for (Tuple2<String, Long> stringLongTuple2 : iterable) {
                            collector.collect(stringLongTuple2.f0);
                        }
                    }
                })
                .print();

        // Event-time session window with a dynamic gap
        watermarks.keyBy(e -> e.f0)
                .window(EventTimeSessionWindows.withDynamicGap(new SessionWindowTimeGapExtractor<Tuple2<String, Long>>() {
                    @Override
                    public long extract(Tuple2<String, Long> element) {
                        return element.f1 + 2000L;
                    }
                }))
                .apply(new WindowFunction<Tuple2<String, Long>, String, String, TimeWindow>() {
                    @Override
                    public void apply(String s, TimeWindow timeWindow, Iterable<Tuple2<String, Long>> iterable, Collector<String> collector) throws Exception {
                        for (Tuple2<String, Long> stringLongTuple2 : iterable) {
                            collector.collect(stringLongTuple2.f0);
                        }
                    }
                })
                .print();

        // Processing-time session window with a fixed gap
        input.keyBy(e -> e)
                .window(ProcessingTimeSessionWindows.withGap(Time.minutes(10)))
                .apply(new WindowFunction<String, String, String, TimeWindow>() {
                    @Override
                    public void apply(String s, TimeWindow timeWindow, Iterable<String> iterable, Collector<String> collector) throws Exception {
                        for (String string : iterable) {
                            collector.collect(string);
                        }
                    }
                })
                .print();

        // Processing-time session window with a dynamic gap
        input.keyBy(e -> e)
                .window(ProcessingTimeSessionWindows.withDynamicGap(new SessionWindowTimeGapExtractor<String>() {
                    @Override
                    public long extract(String s) {
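                        // The extractor must return the gap in milliseconds; this wall-clock-derived value is only a placeholder for demonstration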
                        return System.currentTimeMillis() / 1000;
                    }
                }))
                .apply(new WindowFunction<String, String, String, TimeWindow>() {
                    @Override
                    public void apply(String s, TimeWindow timeWindow, Iterable<String> iterable, Collector<String> collector) throws Exception {
                        for (String string : iterable) {
                            collector.collect(string);
                        }
                    }
                })
                .print();

        env.execute();
    }
}
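
The comment about idle sources refers to the fact that, at parallelism greater than 1, a subtask that receives no data holds back the overall watermark, so event-time session windows never fire. A common remedy is WatermarkStrategy.withIdleness; a minimal sketch, assuming the same Tuple2 stream and an arbitrary one-minute timeout:

java
WatermarkStrategy<Tuple2<String, Long>> strategy =
        WatermarkStrategy.<Tuple2<String, Long>>forBoundedOutOfOrderness(Duration.ofSeconds(0))
                .withTimestampAssigner((element, recordTimestamp) -> element.f1)
                // Mark a source split idle after one minute without data so the overall watermark can still advance
                .withIdleness(Duration.ofMinutes(1));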