Spring WebFlux Application Integration with Reactive Messaging System Kafka
Last Updated :
21 Jun, 2024
Spring WebFlux is a framework for building reactive applications on the JVM. It supports reactive programming and makes it easier to build asynchronous, non-blocking, event-driven applications. Apache Kafka is a distributed streaming platform that lets us publish and subscribe to streams of records in a fault-tolerant way and process those streams as they occur.
Integrating Spring WebFlux with Apache Kafka allows you to build scalable, reactive applications that can handle large amounts of data in real time. This article walks through setting up a Spring WebFlux application and integrating it with Kafka for reactive messaging.
Main Concept
The integration of Spring WebFlux with Kafka uses Spring for Apache Kafka (Spring Kafka), the Spring project that simplifies Kafka-based messaging. The key components of this integration are:
- Producer: The component that sends messages to a Kafka topic.
- Consumer: The component that receives messages from a Kafka topic.
- KafkaTemplate: The Spring Kafka utility that simplifies sending messages to Kafka.
Implementation of Spring WebFlux Application Integration with Reactive Messaging System Kafka
Step 1: Setup the Kafka
Ensure Apache Kafka is installed and running on the local system (the official Kafka quickstart covers installation); this article assumes the broker is reachable at localhost:9092. If automatic topic creation is disabled on the broker, create the topic used below first, for example with bin/kafka-topics.sh --create --topic test-topic --bootstrap-server localhost:9092.
Step 2: Create the Spring Project
Create the Spring Boot project using the Spring Initializr and add the required dependencies.
- Spring Reactive Web
- Spring For Apache Kafka
- Lombok
- Spring Boot DevTools
After creating the project, the folder structure will look like the one below.

Step 3: Configure the application properties
Rename application.properties to application.yml and add the following Apache Kafka configuration to the Spring project.
spring:
  kafka:
    consumer:
      bootstrap-servers: localhost:9092
      group-id: group_id
      auto-offset-reset: earliest
    producer:
      bootstrap-servers: localhost:9092
    listener:
      type: single
Step 4: Create the Message Class
Go to src > main > java > org.example.springreactivekakfkademo > model > Message and add the code below.
Java
package org.example.springreactivekakfkademo.model;

import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;

@Data
@AllArgsConstructor
@NoArgsConstructor
public class Message {
    private String content;
}
Step 5: Configure the Kafka
We can create the Kafka configuration class in the Spring reactive project. Go to src > main > java > org.example.springreactivekakfkademo > config > KafkaConfig and add the code below.
Java
package org.example.springreactivekakfkademo.config;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;
import org.springframework.kafka.support.serializer.ErrorHandlingDeserializer;
import org.springframework.kafka.support.serializer.JsonDeserializer;
import org.springframework.kafka.support.serializer.JsonSerializer;

import java.util.HashMap;
import java.util.Map;

@EnableKafka
@Configuration
public class KafkaConfig {

    // Producer side: String keys, JSON-serialized values.
    @Bean
    public ProducerFactory<String, Object> producerFactory() {
        Map<String, Object> configProps = new HashMap<>();
        configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
        return new DefaultKafkaProducerFactory<>(configProps);
    }

    @Bean
    public KafkaTemplate<String, Object> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }

    // Consumer side: the deserializer instances passed in below are used as-is.
    // The JSON deserializer is wrapped in an ErrorHandlingDeserializer so that a
    // bad record does not kill the listener container, and it trusts all packages
    // so that the type headers written by the producer can be resolved.
    @Bean
    public ConsumerFactory<String, Object> consumerFactory() {
        Map<String, Object> configProps = new HashMap<>();
        configProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        configProps.put(ConsumerConfig.GROUP_ID_CONFIG, "group_id");

        JsonDeserializer<Object> jsonDeserializer = new JsonDeserializer<>(Object.class);
        jsonDeserializer.addTrustedPackages("*");

        return new DefaultKafkaConsumerFactory<>(configProps, new StringDeserializer(),
                new ErrorHandlingDeserializer<>(jsonDeserializer));
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, Object> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, Object> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
}
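If the application only ever consumes the Message payload defined in Step 4, a typed consumer factory is a possible alternative to the generic Object-based one above. The following is only a sketch, not part of the original configuration; the bean name messageConsumerFactory is made up for illustration, and a matching listener container factory typed to Message would be needed alongside it.
Java
// Hypothetical typed variant; assumes the Message class from Step 4 is imported.
@Bean
public ConsumerFactory<String, Message> messageConsumerFactory() {
    Map<String, Object> props = new HashMap<>();
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    props.put(ConsumerConfig.GROUP_ID_CONFIG, "group_id");

    // Deserialize values straight into Message and ignore the producer's type headers.
    JsonDeserializer<Message> valueDeserializer = new JsonDeserializer<>(Message.class, false);

    return new DefaultKafkaConsumerFactory<>(props, new StringDeserializer(),
            new ErrorHandlingDeserializer<>(valueDeserializer));
}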
Step 6: Create the Consumer Service
We can create the service that receives messages from the Kafka topic in the Spring reactive application. Go to src > main > java > org.example.springreactivekakfkademo > service > KafkaConsumerService and add the code below.
Java
package org.example.springreactivekakfkademo.service;

import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;
import reactor.core.publisher.Flux;
import reactor.core.publisher.FluxSink;

import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;

@Service
public class KafkaConsumerService {

    // One sink per subscriber; CopyOnWriteArrayList keeps registration thread-safe.
    private final List<FluxSink<Object>> sinks = new CopyOnWriteArrayList<>();

    // Each HTTP subscriber gets its own Flux; the sink is removed again when the client disconnects.
    public Flux<Object> consumeMessages() {
        return Flux.create(sink -> {
            sinks.add(sink);
            sink.onDispose(() -> sinks.remove(sink));
        });
    }

    // Pushes every record received from Kafka to all currently connected subscribers.
    @KafkaListener(topics = "test-topic", groupId = "group_id")
    public void listen(Object message) {
        sinks.forEach(sink -> sink.next(message));
    }
}
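As an alternative to keeping a manual list of FluxSink instances, the same listener-to-Flux bridge can be built with Reactor's Sinks API, which WebFlux already brings in through reactor-core. The class below is only a sketch of that variant and is not part of the original article; the name SinkBasedConsumerService is made up for illustration.
Java
package org.example.springreactivekakfkademo.service;

import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Sinks;
import reactor.util.concurrent.Queues;

// Hypothetical alternative consumer service built on Reactor's Sinks API.
@Service
public class SinkBasedConsumerService {

    // Multicast sink: every connected subscriber receives each emitted message.
    // autoCancel = false keeps the sink usable after the last subscriber disconnects.
    private final Sinks.Many<Object> sink =
            Sinks.many().multicast().onBackpressureBuffer(Queues.SMALL_BUFFER_SIZE, false);

    public Flux<Object> consumeMessages() {
        return sink.asFlux();
    }

    @KafkaListener(topics = "test-topic", groupId = "group_id")
    public void listen(Object message) {
        // tryEmitNext does not throw; the returned EmitResult could be checked if needed.
        sink.tryEmitNext(message);
    }
}
If this variant were used instead of KafkaConsumerService, the controller would simply inject it in place of the original service.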
Step 7: Create the Producer Service
We can create the service that sends messages to the Kafka topic in the Spring reactive application. Go to src > main > java > org.example.springreactivekakfkademo > service > KafkaProducerService and add the code below.
Java
package org.example.springreactivekakfkademo.service;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class KafkaProducerService {

    private final KafkaTemplate<String, Object> kafkaTemplate;

    @Autowired
    public KafkaProducerService(KafkaTemplate<String, Object> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    // Fire-and-forget publish; KafkaTemplate handles serialization and delivery.
    public void sendMessage(String topic, Object message) {
        kafkaTemplate.send(topic, message);
    }
}
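The sendMessage method above is fire-and-forget. Since KafkaTemplate.send() returns a CompletableFuture in Spring Kafka 3.x (the version pulled in by Spring Boot 3.3.0 used here), the send can also be wrapped in a Mono so that the caller completes only after the broker acknowledges the record. The class below is just a sketch of that idea, not part of the original article; the name ReactiveKafkaProducerService is made up for illustration.
Java
package org.example.springreactivekakfkademo.service;

import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;
import reactor.core.publisher.Mono;

// Hypothetical reactive variant of the producer service.
@Service
public class ReactiveKafkaProducerService {

    private final KafkaTemplate<String, Object> kafkaTemplate;

    public ReactiveKafkaProducerService(KafkaTemplate<String, Object> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    // The Mono completes only once the broker has acknowledged the record,
    // and signals an error if the send fails.
    public Mono<Void> sendMessage(String topic, Object message) {
        return Mono.fromFuture(() -> kafkaTemplate.send(topic, message)).then();
    }
}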
Step 8: Create the KafkaController class
We can create the controller that exposes the endpoints for producing and consuming messages in the Spring reactive application. Go to src > main > java > org.example.springreactivekakfkademo > controller > KafkaController and add the code below.
Java
package org.example.springreactivekakfkademo.controller;

import org.example.springreactivekakfkademo.model.Message;
import org.example.springreactivekakfkademo.service.KafkaConsumerService;
import org.example.springreactivekakfkademo.service.KafkaProducerService;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.*;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

@RestController
@RequestMapping("/kafka")
public class KafkaController {

    private final KafkaProducerService producerService;
    private final KafkaConsumerService consumerService;

    @Autowired
    public KafkaController(KafkaProducerService producerService, KafkaConsumerService consumerService) {
        this.producerService = producerService;
        this.consumerService = consumerService;
    }

    // Publishes the request body to the test-topic without blocking the event loop.
    @PostMapping("/publish")
    public Mono<Void> publishMessage(@RequestBody Message message) {
        return Mono.fromRunnable(() -> producerService.sendMessage("test-topic", message));
    }

    // Streams every message received from Kafka to the subscribing client.
    @GetMapping("/subscribe")
    public Flux<Object> subscribeMessages() {
        return consumerService.consumeMessages();
    }
}
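With an endless stream like /subscribe, clients such as browsers and curl typically only see messages arrive incrementally when the endpoint produces Server-Sent Events or NDJSON rather than plain JSON. The mapping below is a sketch of such a variant, not part of the original controller; the path /subscribe-sse and the method name are made up for illustration, and it requires an import of org.springframework.http.MediaType.
Java
// Hypothetical SSE variant: each Kafka message is pushed to the client as its own
// server-sent event as soon as it arrives.
@GetMapping(value = "/subscribe-sse", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
public Flux<Object> subscribeMessagesSse() {
    return consumerService.consumeMessages();
}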
Step 9: Main Class
No changes are required in the main class of the application.
Java
package org.example.springreactivekakfkademo;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class SpringReactiveKakfkaDemoApplication {

    public static void main(String[] args) {
        SpringApplication.run(SpringReactiveKakfkaDemoApplication.class, args);
    }
}
pom.xml
XML
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>3.3.0</version>
        <relativePath/> <!-- lookup parent from repository -->
    </parent>
    <groupId>org.example</groupId>
    <artifactId>spring-reactive-kakfka-demo</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>spring-reactive-kakfka-demo</name>
    <description>spring-reactive-kakfka-demo</description>
    <properties>
        <java.version>17</java.version>
    </properties>
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-webflux</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.kafka</groupId>
            <artifactId>spring-kafka</artifactId>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-databind</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-devtools</artifactId>
            <scope>runtime</scope>
            <optional>true</optional>
        </dependency>
        <dependency>
            <groupId>org.projectlombok</groupId>
            <artifactId>lombok</artifactId>
            <optional>true</optional>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>io.projectreactor</groupId>
            <artifactId>reactor-test</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.springframework.kafka</groupId>
            <artifactId>spring-kafka-test</artifactId>
            <scope>test</scope>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
                <configuration>
                    <excludes>
                        <exclude>
                            <groupId>org.projectlombok</groupId>
                            <artifactId>lombok</artifactId>
                        </exclude>
                    </excludes>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>
Step 10: Run the Application
Run the main class of the project; once it starts, the application listens on port 8080.

Step 11: Testing the Endpoints
Produce a message by sending a POST request with a JSON body matching the Message model, for example {"content": "hello kafka"}, and the Content-Type: application/json header:
POST http://localhost:8080/kafka/publish

Check the Kafka broker logs to confirm the topic configuration:

Consume the messages by subscribing to the stream:
GET http://localhost:8080/kafka/subscribe

By following these steps, we have integrated Spring WebFlux with Kafka for reactive messaging. This setup allows you to produce and consume messages in a non-blocking, reactive manner, leveraging the full potential of Spring WebFlux and Kafka.