Load Testing Kafka with k6

1. Environment Setup

Kafka environment

For setting up Kafka itself, see the CSDN post "Docker搭建kafka环境" (building a Kafka environment with Docker).
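If you only need a local broker to test against, a minimal single-node sketch (my assumption here, not necessarily the setup from the post above) is the official apache/kafka image, which serves clients on localhost:9092 by default:

bash
docker run -d --name kafka -p 9092:9092 apache/kafka:latest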

xk6-kafka环境

bash
./xk6 build --with github.com/mostafa/xk6-kafka@latest

Verify the installation:
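The build drops a custom k6 binary into the current directory. Assuming the build above succeeded, checking the version should also list xk6-kafka among the built-in extensions:

bash
./k6 version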

2. Writing the Script

test_kafka.js

javascript
// Either import the module object
import * as kafka from "k6/x/kafka";

// Or individual classes and constants
import {
    Writer,
    Reader,
    Connection,
    SchemaRegistry,
    SCHEMA_TYPE_STRING,
} from "k6/x/kafka";

// Creates a new Writer object to produce messages to Kafka
const writer = new Writer({
    // WriterConfig object
    brokers: ["localhost:9092"],
    topic: "my-topic",
});

const reader = new Reader({
    // ReaderConfig object
    brokers: ["localhost:9092"],
    topic: "my-topic",
});

const connection = new Connection({
    // ConnectionConfig object
    address: "localhost:9092",
});

const schemaRegistry = new SchemaRegistry();
// Can accept a SchemaRegistryConfig object

if (__VU == 0) {
    // Create a topic on initialization (before producing messages)
    connection.createTopic({
        // TopicConfig object
        topic: "my-topic",
    });
}

export default function () {
    // Fetch the list of all topics
    const topics = connection.listTopics();
    console.log(topics); // list of topics

    // Produce messages to Kafka
    writer.produce({
        // ProduceConfig object
        messages: [
            // Message object(s)
            {
                key: schemaRegistry.serialize({
                    data: "my-key",
                    schemaType: SCHEMA_TYPE_STRING,
                }),
                value: schemaRegistry.serialize({
                    data: "my-value",
                    schemaType: SCHEMA_TYPE_STRING,
                }),
            },
        ],
    });

    // Consume messages from Kafka
    let messages = reader.consume({
        // ConsumeConfig object
        limit: 10,
    });

    // your messages
    console.log(messages);

    // You can use checks to verify the contents,
    // length and other properties of the message(s)

    // To deserialize the data back into a string, use the
    // deserialize method of the Schema Registry client. You
    // can use it inside a check, as shown in the example scripts.
    let deserializedValue = schemaRegistry.deserialize({
        data: messages[0].value,
        schemaType: SCHEMA_TYPE_STRING,
    });
}

export function teardown(data) {
    // Delete the topic
    connection.deleteTopic("my-topic");

    // Close all connections
    writer.close();
    reader.close();
    connection.close();
}
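As the comments above suggest, k6 checks are the natural way to assert on the consumed messages. A minimal sketch of such a check, reusing the messages and schemaRegistry objects from the script above (it would go inside the default function, after reader.consume):

javascript
import { check } from "k6";

// After reader.consume(...) in the default function:
check(messages, {
    "10 messages were consumed": (msgs) => msgs.length == 10,
    "first value is correct": (msgs) =>
        schemaRegistry.deserialize({
            data: msgs[0].value,
            schemaType: SCHEMA_TYPE_STRING,
        }) == "my-value",
});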

3. Running the Test

Before running the test, make sure the Kafka service is up, then open a terminal and run:

bash
./k6 run test_kafka.js --vus 50 --duration 10s
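The same load profile can also be declared in the script itself instead of on the command line; a minimal equivalent:

javascript
export const options = {
    vus: 50,
    duration: "10s",
};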

Test results: when the run finishes, k6 prints its end-of-test summary, which includes xk6-kafka's producer and consumer metrics alongside the standard k6 metrics.
