Writing Flink SQL in IDEA (quick-start edition, no environment setup required)

Related resources

DataGen connector (test data source): https://nightlies.apache.org/flink/flink-docs-release-1.16/docs/connectors/table/datagen/
Print connector (console sink): https://nightlies.apache.org/flink/flink-docs-release-1.16/docs/connectors/table/print/

Preparation

The advantage is that you only need to download IDEA to try it out; there is no external environment to configure (such as data sources).

1. IntelliJ IDEA as the development tool

2. Create a Maven project and pick the quickstart archetype, which sets up a plain Java project (a command-line equivalent is sketched below).
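If you prefer the command line to the IDEA project wizard, a rough equivalent using the quickstart archetype (reusing the groupId and artifactId from the pom below) looks like this:

shell
mvn archetype:generate \
    -DgroupId=org.example \
    -DartifactId=flinklearn \
    -DarchetypeGroupId=org.apache.maven.archetypes \
    -DarchetypeArtifactId=maven-archetype-quickstart \
    -DinteractiveMode=false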

Java code

Code logic

1. Use the DataGen connector as the data source, generating random test rows.

2. Use the Print connector as the sink, writing the results directly to the console.

java
package org.example;

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

public class FlinkSqlDemo {
    public static void main(String[] args) throws Exception {
        // Create the stream and table execution environments
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tableEnv = StreamTableEnvironment.create(env);

        // Define the source table (DataGen connector generates random test rows)
        String sourceDDL = "CREATE TABLE user_behavior (\n" +
                "    user_id BIGINT,\n" +
                "    behavior STRING,\n" +
                "    ts TIMESTAMP(3)\n" +
                ") WITH (\n" +
                "    'connector' = 'datagen',\n" +
                "    'rows-per-second' = '1',\n" +
                "    'fields.user_id.kind' = 'random',\n" +
                "    'fields.user_id.min' = '1',\n" +
                "    'fields.user_id.max' = '2',\n" +
                "    'fields.behavior.length' = '2'\n" +
                ")";

        // Define the sink table (Print connector writes rows to the console)
        String sinkDDL = "CREATE TABLE print_table (\n" +
                "    behavior STRING,\n" +
                "    cnt BIGINT\n" +
                ") WITH (\n" +
                "    'connector' = 'print'\n" +
                ")";

        // Register the source and sink tables by executing the DDL
        tableEnv.executeSql(sourceDDL);
        tableEnv.executeSql(sinkDDL);

        // Define the aggregation query: count events per behavior
        Table resultTable = tableEnv.sqlQuery(
                "SELECT behavior, COUNT(*) AS cnt " +
                        "FROM user_behavior " +
                        "GROUP BY behavior"
        );

        // Insert the query result into the sink table. executeInsert() submits the
        // streaming job itself, so no separate env.execute() call is needed here.
        resultTable.executeInsert("print_table").await();
    }
}
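Running the main method in IDEA makes the Print sink write a changelog stream to the console. The sample below is only an illustration (the behavior strings are random and the task-id prefix depends on parallelism); -U/+U pairs appear whenever a running count is updated:

1> +I[Cy, 1]
4> +I[aF, 1]
1> -U[Cy, 1]
1> +U[Cy, 2]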

pom dependency configuration

xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.example</groupId>
<artifactId>flinklearn</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>jar</packaging>


<properties>
    <maven.compiler.source>8</maven.compiler.source>
    <maven.compiler.target>8</maven.compiler.target>
    <flink.version>1.17.1</flink.version>
</properties>

<dependencies>
    <!-- Flink Core -->
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-java</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <!-- Flink streaming runtime -->
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-streaming-java</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-clients</artifactId>
        <version>${flink.version}</version>
    </dependency>

    <!-- Flink Table -->
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-table-planner_2.12</artifactId>
        <version>${flink.version}</version>
    </dependency>
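    <!-- Note: the MySQL CDC connector below is not used by this demo (DataGen + Print only); it can be removed or kept for later experiments -->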
    <dependency>
        <groupId>com.ververica</groupId>
        <artifactId>flink-sql-connector-mysql-cdc</artifactId>
        <version>2.0.1</version>
    </dependency>
</dependencies>

<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>3.2.4</version>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                    <configuration>
                        <transformers>
                            <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                                <mainClass>org.example.FlinkSqlDemo</mainClass>
                            </transformer>
                        </transformers>
                    </configuration>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
</project>
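The shade plugin packages the project into a runnable fat jar, which is only needed if you want to run the demo outside the IDE; inside IDEA, running FlinkSqlDemo.main directly is enough. As an optional sketch, assuming a local Flink distribution provides the flink CLI:

shell
mvn clean package
flink run -c org.example.FlinkSqlDemo target/flinklearn-1.0-SNAPSHOT.jar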