Exploring ClickHouse: Using a Materialized View to Store Data Delivered by Kafka

In "Exploring ClickHouse: Connecting Kafka and ClickHouse" we explained how to use the Kafka engine to connect to Kafka and read data from a topic. But we ran into a problem: the data could only be read once; even when new data was later sent to the topic, querying the table returned nothing.

To solve this problem, we introduce a materialized view.

Creating the table

The table structure is borrowed directly from "Exploring ClickHouse: Using Projections to Speed Up Queries".

sql
CREATE TABLE materialized_uk_price_paid_from_kafka
(
    price UInt32,
    date Date,
    postcode1 LowCardinality(String),
    postcode2 LowCardinality(String),
    type Enum8('terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4, 'other' = 0),
    is_new UInt8,
    duration Enum8('freehold' = 1, 'leasehold' = 2, 'unknown' = 0),
    addr1 String,
    addr2 String,
    street LowCardinality(String),
    locality LowCardinality(String),
    town LowCardinality(String),
    district LowCardinality(String),
    county LowCardinality(String)
)
ENGINE = MergeTree
ORDER BY (postcode1, postcode2, addr1, addr2);

Query id: 55b16049-a865-4d54-9333-d661c6280a09

Ok.

0 rows in set. Elapsed: 0.005 sec.

Creating the materialized view
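The view reads from the Kafka engine table uk_price_paid_from_kafka created in the previous article. For reference, the block below is only a minimal sketch of what that source table may look like, inferred from the columns the view selects and from the CSV layout of the sample message used later; the broker address, topic name, consumer group, and the placeholder column names a, b, c, d, e are assumptions and should be replaced with the definition from your own setup.

sql
-- Hypothetical sketch of the Kafka engine source table (not the authoritative definition).
-- Broker, topic, and consumer group below are placeholders.
CREATE TABLE uk_price_paid_from_kafka
(
    uuid String,
    price_string String,
    time String,
    postcode String,
    a String,
    b String,
    c String,
    addr1 String,
    addr2 String,
    street String,
    locality String,
    town String,
    district String,
    county String,
    d String,
    e String
)
ENGINE = Kafka('localhost:9092', 'uk_price_paid', 'uk_price_paid_group', 'CSV');

With the source and target tables in place, the materialized view can be created: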

sql
CREATE MATERIALIZED VIEW uk_price_paid_from_kafka_consumer_view TO materialized_uk_price_paid_from_kafka
AS SELECT
    splitByChar(' ', postcode) AS p,
    toUInt32(price_string) AS price,
    parseDateTimeBestEffortUS(time) AS date,
    p[1] AS postcode1,
    p[2] AS postcode2,
    transform(a, ['T', 'S', 'D', 'F', 'O'], ['terraced', 'semi-detached', 'detached', 'flat', 'other']) AS type,
    b = 'Y' AS is_new,
    transform(c, ['F', 'L', 'U'], ['freehold', 'leasehold', 'unknown']) AS duration,
    addr1,
    addr2,
    street,
    locality,
    town,
    district,
    county
FROM uk_price_paid_from_kafka;

With this in place, data from the Kafka topic is transformed and written into the materialized_uk_price_paid_from_kafka table.
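Once the view exists, ClickHouse runs a background consumer that keeps streaming new messages from the topic into the target table. If consumption needs to be paused (for example, to change the transformation logic), the Kafka engine documentation describes detaching and re-attaching the view:

sql
-- Stop consuming from the Kafka topic.
DETACH TABLE uk_price_paid_from_kafka_consumer_view;

-- Resume consumption; the consumer continues from the committed offsets.
ATTACH TABLE uk_price_paid_from_kafka_consumer_view;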

Querying

sql
SELECT * FROM materialized_uk_price_paid_from_kafka;

Now send the following message to the topic:

"{5FA8692E-537B-4278-8C67-5A060540506D}","19500","1995-01-27 00:00","SK10 2QW","T","N","L","38","","GARDEN STREET","MACCLESFIELD","MACCLESFIELD","MACCLESFIELD","CHESHIRE","A","A"

Then query the table again:

sql
SELECT * FROM materialized_uk_price_paid_from_kafka;
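If everything is wired up correctly, the new message appears as a cleaned row, and any further messages sent to the topic will keep showing up without recreating anything. Since materialized_uk_price_paid_from_kafka is an ordinary MergeTree table, it can also be queried like any other table; for example, an illustrative aggregation (results depend on what has been sent to the topic):

sql
-- Average price per town over everything consumed so far.
SELECT
    town,
    count() AS sales,
    round(avg(price)) AS avg_price
FROM materialized_uk_price_paid_from_kafka
GROUP BY town
ORDER BY avg_price DESC
LIMIT 10;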