
Spring Boot - Apache Kafka
Apache Kafka is an open source project used to publish and subscribe messages based on a fault-tolerant messaging system. It is fast, scalable and distributed by design. If you are a beginner to Kafka, or want to gain a better understanding of it, refer to this link − www.tutorialspoint.com/apache_kafka/
In this chapter, we are going to see how to implement Apache Kafka in a Spring Boot application.
Configure Kafka
First, download the Spring Boot project from the Spring Initializr page www.start.spring.io and choose the following dependency −
- Spring for Apache Kafka

To begin with, we need to add the Spring Kafka dependency to our build configuration file.
Maven users can add the following dependency in the pom.xml file.
<dependency>
   <groupId>org.springframework.kafka</groupId>
   <artifactId>spring-kafka</artifactId>
</dependency>
Gradle users can add the following dependency in the build.gradle file.
implementation 'org.springframework.kafka:spring-kafka'
Sending Messages
To send messages to Apache Kafka, we need to define a configuration class for the producer configuration as shown below −
KafkaProducerConfig.java
package com.tutorialspoint.kafka;

import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;

@Configuration
public class KafkaProducerConfig {
   @Bean
   ProducerFactory<String, String> producerFactory() {
      // Broker address and key/value serializers used by the producer
      Map<String, Object> configProps = new HashMap<>();
      configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
      configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
      configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
      return new DefaultKafkaProducerFactory<>(configProps);
   }

   @Bean
   KafkaTemplate<String, String> kafkaTemplate() {
      // KafkaTemplate wraps the producer factory and is used to send messages
      return new KafkaTemplate<>(producerFactory());
   }
}
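As a side note, the same producer settings can also be supplied through Spring Boot's Kafka auto-configuration instead of an explicit @Configuration class. This is only a sketch of that alternative (it takes effect only if you do not define your own ProducerFactory bean); the broker address matches the example above.

# application.properties (alternative to KafkaProducerConfig.java)
spring.kafka.bootstrap-servers=localhost:9092
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=org.apache.kafka.common.serialization.StringSerializer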
To publish messages, autowire the KafkaTemplate object and produce the message as shown below.
@Autowired
private KafkaTemplate<String, String> kafkaTemplate;

public void sendMessage(String msg) {
   kafkaTemplate.send(topicName, msg);
}
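The send() call above is asynchronous. If you want confirmation that the broker actually accepted the record, note that in Spring for Apache Kafka 3.x (used by Spring Boot 3.x) KafkaTemplate.send() returns a CompletableFuture, so a callback can be attached. A minimal sketch, assuming the same autowired kafkaTemplate field and the "tutorialspoint" topic used later in this chapter; sendMessageWithCallback is just an illustrative name:

public void sendMessageWithCallback(String msg) {
   // send() returns CompletableFuture<SendResult<K, V>> in Spring for Apache Kafka 3.x
   kafkaTemplate.send("tutorialspoint", msg).whenComplete((result, ex) -> {
      if (ex == null) {
         // RecordMetadata carries the partition and offset assigned by the broker
         System.out.println("Sent to partition " + result.getRecordMetadata().partition()
            + " at offset " + result.getRecordMetadata().offset());
      } else {
         System.out.println("Failed to send message: " + ex.getMessage());
      }
   });
}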
Consuming Messages
To consume messages, we need to write a consumer configuration class file as shown below.
KafkaConsumerConfig.java
package com.tutorialspoint.kafka;

import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;

@EnableKafka
@Configuration
public class KafkaConsumerConfig {
   @Bean
   public ConsumerFactory<String, String> consumerFactory() {
      Map<String, Object> props = new HashMap<>();
      // The consumer connects to the Kafka broker (not ZooKeeper), so use port 9092
      props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
      props.put(ConsumerConfig.GROUP_ID_CONFIG, "group-id");
      props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
      props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
      return new DefaultKafkaConsumerFactory<>(props);
   }

   @Bean
   public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory() {
      // Container factory used by @KafkaListener methods
      ConcurrentKafkaListenerContainerFactory<String, String> factory =
         new ConcurrentKafkaListenerContainerFactory<>();
      factory.setConsumerFactory(consumerFactory());
      return factory;
   }
}
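As with the producer, the consumer can alternatively be configured through Spring Boot's auto-configuration in application.properties instead of the class above (again, only if you do not define your own ConsumerFactory bean). A minimal sketch, reusing the spring.kafka.bootstrap-servers property shown earlier:

# application.properties (alternative to KafkaConsumerConfig.java)
spring.kafka.consumer.group-id=group-id
spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.value-deserializer=org.apache.kafka.common.serialization.StringDeserializer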
Next, write a listener to listen to the messages.
@KafkaListener(topics = "tutorialspoint", groupId = "group-id")
public void listen(String message) {
   System.out.println("Received Message in group - group-id: " + message);
}
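A @KafkaListener method does not have to take a plain String payload. As a variation that is not part of the original example, the listener can instead accept an org.apache.kafka.clients.consumer.ConsumerRecord to inspect the key, partition and offset of each record:

@KafkaListener(topics = "tutorialspoint", groupId = "group-id")
public void listenWithMetadata(ConsumerRecord<String, String> record) {
   // ConsumerRecord exposes the record metadata alongside the payload
   System.out.println("Received " + record.value()
      + " (key=" + record.key()
      + ", partition=" + record.partition()
      + ", offset=" + record.offset() + ")");
}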
Let us call the sendMessage() method from the run() method of ApplicationRunner in the main Spring Boot application class file, and consume the messages from the same class file.
The code for your main Spring Boot application class file is given below −
KafkaDemoApplication.java
package com.tutorialspoint.kafka;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.ApplicationArguments;
import org.springframework.boot.ApplicationRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.core.KafkaTemplate;

@SpringBootApplication
public class KafkaDemoApplication implements ApplicationRunner {
   @Autowired
   private KafkaTemplate<String, String> kafkaTemplate;

   public void sendMessage(String msg) {
      kafkaTemplate.send("tutorialspoint", msg);
   }

   public static void main(String[] args) {
      SpringApplication.run(KafkaDemoApplication.class, args);
   }

   @KafkaListener(topics = "tutorialspoint", groupId = "group-id")
   public void listen(String message) {
      System.out.println("Received Message in group - group-id: " + message);
   }

   @Override
   public void run(ApplicationArguments args) throws Exception {
      // Publish a message once the application has started
      sendMessage("Hi Welcome to Spring For Apache Kafka");
   }
}
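The example assumes the tutorialspoint topic already exists on the broker (or that automatic topic creation is enabled). If you prefer the application to create it at startup, one option, not shown in the original example, is to declare a NewTopic bean; the KafkaAdmin that Spring Boot auto-configures will then create the topic if it is missing. A minimal sketch, to be placed in any @Configuration class:

import org.apache.kafka.clients.admin.NewTopic;
import org.springframework.context.annotation.Bean;
import org.springframework.kafka.config.TopicBuilder;

// KafkaAdmin creates this topic at startup if it does not already exist
@Bean
public NewTopic tutorialspointTopic() {
   return TopicBuilder.name("tutorialspoint")
      .partitions(1)
      .replicas(1)
      .build();
}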
The code for the complete build configuration file is given below.
Maven – pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
   xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
   xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
   <modelVersion>4.0.0</modelVersion>
   <parent>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-starter-parent</artifactId>
      <version>3.3.4</version>
      <relativePath/> <!-- lookup parent from repository -->
   </parent>
   <groupId>com.tutorialspoint</groupId>
   <artifactId>kafka</artifactId>
   <version>0.0.1-SNAPSHOT</version>
   <name>kafka</name>
   <description>Demo project for Spring Boot</description>
   <url/>
   <licenses>
      <license/>
   </licenses>
   <developers>
      <developer/>
   </developers>
   <scm>
      <connection/>
      <developerConnection/>
      <tag/>
      <url/>
   </scm>
   <properties>
      <java.version>21</java.version>
   </properties>
   <dependencies>
      <dependency>
         <groupId>org.springframework.boot</groupId>
         <artifactId>spring-boot-starter</artifactId>
      </dependency>
      <dependency>
         <groupId>org.springframework.kafka</groupId>
         <artifactId>spring-kafka</artifactId>
      </dependency>
      <dependency>
         <groupId>org.springframework.boot</groupId>
         <artifactId>spring-boot-starter-test</artifactId>
         <scope>test</scope>
      </dependency>
      <dependency>
         <groupId>org.springframework.kafka</groupId>
         <artifactId>spring-kafka-test</artifactId>
         <scope>test</scope>
      </dependency>
   </dependencies>
   <build>
      <plugins>
         <plugin>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-maven-plugin</artifactId>
         </plugin>
      </plugins>
   </build>
</project>
Gradle – build.gradle
plugins {
   id 'java'
   id 'eclipse'
   id 'org.springframework.boot' version '3.3.4'
   id 'io.spring.dependency-management' version '1.1.6'
}

group = 'com.tutorialspoint'
version = '0.0.1-SNAPSHOT'

java {
   sourceCompatibility = JavaVersion.VERSION_21
}

repositories {
   mavenCentral()
}

dependencies {
   implementation 'org.springframework.boot:spring-boot-starter'
   implementation 'org.springframework.kafka:spring-kafka'
   testImplementation 'org.springframework.boot:spring-boot-starter-test'
   testImplementation 'org.springframework.kafka:spring-kafka-test'
}
Now, create an executable JAR file, and run the Spring Boot application by using the Maven or Gradle commands shown below −
For Maven, use the command shown below −
mvn clean install
After “BUILD SUCCESS”, you can find the JAR file under the target directory.
For Gradle, use the command shown below −
gradle clean build
After “BUILD SUCCESSFUL”, you can find the JAR file under the build/libs directory.
Run the JAR file by using the command given here −
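Before running the JAR, make sure an Apache Kafka broker is reachable at localhost:9092, because both the producer and consumer configuration above point to that address. As a rough sketch only (exact paths and options depend on your Kafka version and installation, for example ZooKeeper mode vs. KRaft mode), a local broker can be started and the topic created with the Kafka command-line tools:

# from the Kafka installation directory (ZooKeeper-based setup)
bin/zookeeper-server-start.sh config/zookeeper.properties
bin/kafka-server-start.sh config/server.properties

# optionally create the topic used in this chapter
bin/kafka-topics.sh --create --topic tutorialspoint --bootstrap-server localhost:9092 --partitions 1 --replication-factor 1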
java -jar <JARFILE>
You can see the output in the console window.
  .   ____          _            __ _ _
 /\\ / ___'_ __ _ _(_)_ __  __ _ \ \ \ \
( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \
 \\/  ___)| |_)| | | | | || (_| |  ) ) ) )
  '  |____| .__|_| |_|_| |_\__, | / / / /
 =========|_|==============|___/=/_/_/_/
 :: Spring Boot ::                (v3.3.4)

2024-09-24T11:51:15.187+05:30  INFO 20404 --- [kafka] [           main] c.t.kafka.KafkaApplication               : Starting KafkaApplication using Java 21.0.3 with PID 20404 (E:\Dev\kafka\target\classes started by Tutorialspoint in E:\Dev\kafka)
2024-09-24T11:51:15.189+05:30  INFO 20404 --- [kafka] [           main] c.t.kafka.KafkaApplication               : No active profile set, falling back to 1 default profile: "default"
2024-09-24T11:51:16.022+05:30  INFO 20404 --- [kafka] [           main] o.a.k.clients.consumer.ConsumerConfig    : ConsumerConfig values:
	allow.auto.create.topics = true
	auto.commit.interval.ms = 5000
	auto.include.jmx.reporter = true
	auto.offset.reset = latest
	bootstrap.servers = [localhost:9092]
	...
	ssl.trustmanager.algorithm = PKIX
	ssl.truststore.certificates = null
	ssl.truststore.location = null
	ssl.truststore.password = null
	ssl.truststore.type = JKS
	value.deserializer = class org.apache.kafka.common.serialization.StringDeserializer
2024-09-24T11:51:16.080+05:30  INFO 20404 --- [kafka] [           main] o.a.k.c.t.i.KafkaMetricsCollector        : initializing Kafka metrics collector
2024-09-24T11:51:16.206+05:30  INFO 20404 --- [kafka] [           main] ...
2024-09-24T11:51:16.247+05:30  INFO 20404 --- [kafka] [           main] o.a.k.clients.producer.ProducerConfig    : ProducerConfig values:
	acks = -1
	auto.include.jmx.reporter = true
	batch.size = 16384
	bootstrap.servers = [localhost:9092]
	buffer.memory = 33554432
	client.dns.lookup = use_all_dns_ips
	client.id = kafka-producer-1
	...
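Assuming a broker is reachable at localhost:9092 and the listener's partitions have been assigned, the System.out.println in the @KafkaListener method should eventually print a line similar to the one below (the surrounding log lines will differ):

Received Message in group - group-id: Hi Welcome to Spring For Apache Kafka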