Exploring ClickHouse: Using a Materialized View to Store Data Delivered via Kafka

In "Exploring ClickHouse: Connecting Kafka and ClickHouse", we explained how to use the Kafka engine to connect to Kafka and read data from a topic. But we ran into a problem: the data could only be read once. Even when new data was later sent to the topic, the table returned nothing.

To solve this problem, we introduce a materialized view.

Creating the Table

The table structure is borrowed directly from "Exploring ClickHouse: Using Projections to Speed Up Queries".

CREATE TABLE materialized_uk_price_paid_from_kafka
(
    price UInt32,
    date Date,
    postcode1 LowCardinality(String),
    postcode2 LowCardinality(String),
    type Enum8('terraced' = 1, 'semi-detached' = 2, 'detached' = 3, 'flat' = 4, 'other' = 0),
    is_new UInt8,
    duration Enum8('freehold' = 1, 'leasehold' = 2, 'unknown' = 0),
    addr1 String,
    addr2 String,
    street LowCardinality(String),
    locality LowCardinality(String),
    town LowCardinality(String),
    district LowCardinality(String),
    county LowCardinality(String)
)
ENGINE = MergeTree
ORDER BY (postcode1, postcode2, addr1, addr2);

Query id: 55b16049-a865-4d54-9333-d661c6280a09

Ok.

0 rows in set. Elapsed: 0.005 sec.
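The materialized view in the next step reads from the Kafka-engine table uk_price_paid_from_kafka created in the earlier article. For context, a sketch of that source table is shown below; the column names follow the view's SELECT list, but the broker address, topic name, and consumer group name are placeholders, not values from this article:

CREATE TABLE uk_price_paid_from_kafka
(
    -- Raw string columns matching the CSV fields of the Kafka messages
    uuid_string String,
    price_string String,
    time String,
    postcode String,
    a String,   -- property type code: T/S/D/F/O
    b String,   -- is-new flag: Y/N
    c String,   -- duration code: F/L/U
    addr1 String,
    addr2 String,
    street String,
    locality String,
    town String,
    district String,
    county String,
    d String,
    e String
)
ENGINE = Kafka
SETTINGS kafka_broker_list = 'localhost:9092',   -- placeholder broker
         kafka_topic_list = 'uk_price_paid',     -- placeholder topic
         kafka_group_name = 'uk_price_paid_group',
         kafka_format = 'CSV';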

Creating the Materialized View

CREATE MATERIALIZED VIEW uk_price_paid_from_kafka_consumer_view TO materialized_uk_price_paid_from_kafka
AS SELECT
    splitByChar(' ', postcode) AS p,
    toUInt32(price_string) AS price,
    parseDateTimeBestEffortUS(time) AS date,
    p[1] AS postcode1,
    p[2] AS postcode2,
    transform(a, ['T', 'S', 'D', 'F', 'O'], ['terraced', 'semi-detached', 'detached', 'flat', 'other']) AS type,
    b = 'Y' AS is_new,
    transform(c, ['F', 'L', 'U'], ['freehold', 'leasehold', 'unknown']) AS duration,
    addr1,
    addr2,
    street,
    locality,
    town,
    district,
    county
FROM uk_price_paid_from_kafka;

With this in place, data arriving on the Kafka topic is cleaned and written into the materialized_uk_price_paid_from_kafka table.
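To see what the view's cleaning expressions do, the fragment below evaluates them against literal values taken from the sample message used later in this article. This is just an illustration of splitByChar and transform on constants, not part of the setup:

SELECT
    splitByChar(' ', 'SK10 2QW') AS p,   -- ['SK10', '2QW']
    p[1] AS postcode1,                   -- 'SK10'
    p[2] AS postcode2,                   -- '2QW'
    transform('T', ['T', 'S', 'D', 'F', 'O'],
              ['terraced', 'semi-detached', 'detached', 'flat', 'other']) AS type,  -- 'terraced'
    'N' = 'Y' AS is_new,                 -- 0
    transform('L', ['F', 'L', 'U'],
              ['freehold', 'leasehold', 'unknown']) AS duration;  -- 'leasehold'

transform maps the single-letter codes in the raw CSV onto the Enum8 values defined in the destination table, and splitByChar splits the full postcode into the two parts used by the table's ORDER BY key.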

Querying

SELECT * FROM materialized_uk_price_paid_from_kafka;

We then send the following message to the topic:

"{5FA8692E-537B-4278-8C67-5A060540506D}","19500","1995-01-27 00:00","SK10 2QW","T","N","L","38","","GARDEN STREET","MACCLESFIELD","MACCLESFIELD","MACCLESFIELD","CHESHIRE","A","A"

Then query the table again:

SELECT * FROM materialized_uk_price_paid_from_kafka;