Writing Flink SQL in IDEA (quick-start edition, no environment setup required)

Related resources

Document             Link
DataGen connector    https://nightlies.apache.org/flink/flink-docs-release-1.16/docs/connectors/table/datagen/
Print connector      https://nightlies.apache.org/flink/flink-docs-release-1.16/docs/connectors/table/print/

Preparation

The appeal of this setup is that all you need to install is IDEA; no extra environment (such as an external data source) has to be configured.

1. IDEA as the development tool.

2. Create a Maven project and choose the quickstart archetype, i.e. a plain Java project (a command-line equivalent is sketched below).
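If you prefer the command line, an equivalent Maven invocation looks roughly like this (the coordinates match the POM shown later; adjust as needed):

mvn archetype:generate \
  -DarchetypeGroupId=org.apache.maven.archetypes \
  -DarchetypeArtifactId=maven-archetype-quickstart \
  -DgroupId=org.example \
  -DartifactId=flinklearn \
  -Dversion=1.0-SNAPSHOT \
  -DinteractiveMode=false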

Java code

Code logic

1. Use the DataGen connector as the source that generates test data.

2. Use the Print connector as the sink, writing results straight to the console.

package org.example;

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

public class FlinkSqlDemo {
    public static void main(String[] args) throws Exception {
        // Create the stream and table execution environments
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tableEnv = StreamTableEnvironment.create(env);

        // Create the source table (DataGen produces test data)
        String sourceDDL = "CREATE TABLE user_behavior (\n" +
                "    user_id BIGINT,\n" +
                "    behavior STRING,\n" +
                "    ts TIMESTAMP(3)\n" +
                ") WITH (\n" +
                "    'connector' = 'datagen',\n" +
                "    'rows-per-second' = '1',\n" +
                "    'fields.user_id.kind' = 'random',\n" +
                "    'fields.user_id.min' = '1',\n" +
                "    'fields.user_id.max' = '2',\n" +
                "    'fields.behavior.length' = '2'\n" +
                ")";

        // Create the sink table (results are printed to the console)
        String sinkDDL = "CREATE TABLE print_table (\n" +
                "    behavior STRING,\n" +
                "    cnt BIGINT\n" +
                ") WITH (\n" +
                "    'connector' = 'print'\n" +
                ")";

        // Run the DDL statements
        tableEnv.executeSql(sourceDDL);
        tableEnv.executeSql(sinkDDL);

        // Define the aggregation query
        Table resultTable = tableEnv.sqlQuery(
                "SELECT behavior, COUNT(*) AS cnt " +
                        "FROM user_behavior " +
                        "GROUP BY behavior"
        );

        // Submit the INSERT job. The DataGen source is unbounded, so await()
        // blocks here and keeps the streaming job running. No separate
        // env.execute() call is needed: the Table API submits its own job.
        resultTable.executeInsert("print_table").await();
    }
}
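Because the query is a GROUP BY aggregation on an unbounded stream, the Print sink receives a changelog stream: an insert (+I) the first time a behavior value appears, then retract/update pairs (-U/+U) as its count grows. The console output therefore looks roughly like the lines below; the values are purely illustrative, since the behavior strings are random, and a subtask prefix such as "3> " appears when parallelism is greater than 1:

+I[a4, 1]
+I[x9, 1]
-U[a4, 1]
+U[a4, 2]

If you want the demo to terminate on its own, you can add the DataGen option 'number-of-rows' (for example '20') to the source table's WITH clause; the source then becomes bounded and await() returns once all rows have been processed.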

POM dependency configuration

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">

<modelVersion>4.0.0</modelVersion>
<groupId>org.example</groupId>
<artifactId>flinklearn</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>jar</packaging>


<properties>
    <maven.compiler.source>8</maven.compiler.source>
    <maven.compiler.target>8</maven.compiler.target>
    <flink.version>1.17.1</flink.version>
</properties>

<dependencies>
    <!-- Flink Core -->
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-java</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <!-- Flink streaming -->
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-streaming-java</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-clients</artifactId>
        <version>${flink.version}</version>
    </dependency>

    <!-- Flink Table -->
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-table-planner_2.12</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <!-- MySQL CDC connector (not used by this demo; optional) -->
    <dependency>
        <groupId>com.ververica</groupId>
        <artifactId>flink-sql-connector-mysql-cdc</artifactId>
        <version>2.0.1</version>
    </dependency>
</dependencies>

<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>3.2.4</version>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                    <configuration>
                        <transformers>
                            <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                                <mainClass>org.example.FlinkSqlDemo</mainClass>
                            </transformer>
                        </transformers>
                    </configuration>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
</project>
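The quickest way to try the demo is to run the main method of FlinkSqlDemo directly in IDEA; no local Flink cluster is required. If you later want to submit the job to a standalone Flink cluster instead, a typical (illustrative) sequence, assuming the flink CLI of a matching Flink distribution is on your PATH, would be:

mvn clean package
flink run -c org.example.FlinkSqlDemo target/flinklearn-1.0-SNAPSHOT.jar

The shade plugin configured above builds a fat jar and records the main class in the manifest, so the -c flag is technically optional; the jar name follows Maven's default artifactId-version convention.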